Pivotal HAWQ and Hortonworks Data Platform: Modern Data Architecture for IT Transformation


Pivotal HAWQ, one of the world’s most advanced enterprise SQL-on-Hadoop technologies, coupled with the Hortonworks Data Platform, the only 100% open source Apache Hadoop data platform, can turbocharge your analytics efforts. Featuring a massively parallel processing (MPP) SQL query engine that runs directly on the compute resources of a Hadoop cluster, Pivotal HAWQ and the Hortonworks Data Platform give businesses a Modern Data Architecture for analytics and data science.

Pivotal HAWQ brings a mature implementation of SQL with the ability to run in parallel using the compute resources of a Hadoop cluster. Analysts and data scientists can use familiar tools and analytical capabilities while directly taking advantage of the scale and flexibility of Hadoop.

Attend this session to learn about this powerful technology, including:

  • A brief history of MPP SQL and Pivotal HAWQ
  • Architecture and use cases for SQL on Hadoop
  • A technical deep dive into features and integration with the Hortonworks Data Platform
  • A quick demo of analytical features

Shivaji Dutta
Developer Evangelist / Sr Partner Solutions Engineering, Hortonworks

Shivaji is a Sr. Partner Engineer with Hortonworks. He has over 18 years of software development and consulting experience, and has held a variety of solutions architect and technology consulting roles at Oracle, AmberPoint, Sun, and MarkLogic. He has worked in the NoSQL space for the past 10 years.

Parham Parvizi
Product Manager, Pivotal HDB, Pivotal

Parham Parvizi is a Product Manager at Pivotal, where he is responsible for driving the technical product roadmap of the company's flagship SQL-on-Hadoop product, Pivotal HDB. Prior to joining the product management team, he was a Lead Big Data Solution Architect and a prominent member of Pivotal's Advanced Engineering group. Parham has been involved with Hadoop since its early days (around 2009), and has continued to expand his depth of knowledge and experience in designing, architecting, and building big data solutions.