Greenplum Experts Panel, Greenplum Operations at Scale – Greenplum Summit 2019

Slides from a panel I hosted with some of Pivotal Greenplum’s largest customers.

How to Meet Enhanced Data Security Requirements with Pivotal Greenplum

Cross posted from The Pivotal Blog.

As enterprises seek to become more analytically driven, they face a balancing act: capitalizing on the proliferation of data throughout the company while simultaneously protecting sensitive data from loss, misuse, or unauthorized disclosure. However, increased regulation of data privacy is complicating how companies make data available to users.

Join Pivotal Data Engineer Alastair Turner for an interactive discussion about common vulnerabilities to data in motion and at rest. Alastair will discuss the controls available to Greenplum users, both natively and via Pivotal partner solutions, to protect sensitive data. We’ll cover the following topics:

  • Security requirements and regulations like GDPR
  • Common data security threat vectors
  • Security strategy for Greenplum
  • Native security features of Greenplum

Speakers: Alastair Turner, Data Engineer, and Greg Chase, Business Development, Pivotal

Boost Greenplum BI Performance with Heimdall Data

Jeff Kelly talking to Greg Chase and Eric Brandsberg in a Built to Adapt interview at Greenplum Summit.
Eric Brandsberg, CTO of Heimdall Data, and Greg Chase, Business Development Lead at Pivotal, chat with Jeff Kelly about how Heimdall Data can help you boost Greenplum BI performance.

Cross posted from the Pivotal Blog.

One of the more interesting startups I’ve run across recently is Heimdall Data. The company offers a “SQL-savvy” platform that slips elegantly into your application tier to improve the performance and reliability of backend data sources.

Here are some use cases for Pivotal Greenplum that we have been developing with Heimdall:

  • SQL Caching — This is especially helpful when a Pivotal Greenplum instance supports traditional BI users who issue repeated queries. Heimdall Data automatically caches SQL results in Pivotal GemFire and invalidates them as needed, all without code changes. Users experience faster response times while system capacity is freed up for the workloads your data scientists run.
  • Automated Master Failover — Typical Pivotal Greenplum deployments have a standby master node. However, actually failing over to the standby when the active master node fails is a manual process. Heimdall can automate failover to the standby master node in Greenplum. Heimdall is an improvement over Pgpool here because it handles failover behind the scenes, with no development work required.
  • SQL Traffic Manager — We also think Heimdall could be useful for customers that want to combine OLTP with their analytics, such as in an HTAP application. In this case, Heimdall determines whether each SQL operation it receives should be executed by Pivotal Greenplum, by an associated PostgreSQL database, or by both. This provides the kind of performance that transactional applications expect, and it also means the Pivotal Greenplum data warehouse has access to up-to-date data. Like most of Heimdall’s approaches, this solution slips in without requiring any changes to your SQL queries. (A rough sketch of this routing-plus-caching idea follows this list.)
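To make these ideas concrete, here is a minimal sketch of how a SQL-aware proxy layer might classify and route statements while caching read results. This is purely illustrative: the class, names, and heuristics below are our own invention for this post, not Heimdall Data’s actual implementation or API, and a production proxy would rely on real SQL parsing rather than regular expressions.

    import re
    import time

    class SqlTrafficManager:
        """Toy SQL-aware proxy: routes statements to an OLTP backend
        ("postgresql") or an analytical backend ("greenplum"), caches
        SELECT results, and invalidates cached results when a write
        touches one of the tables they read from."""

        def __init__(self, ttl_seconds=300):
            self.ttl = ttl_seconds
            self.cache = {}             # normalized SQL -> (expires_at, result)
            self.cached_by_table = {}   # table name -> set of cached SQL keys

        @staticmethod
        def _normalize(sql):
            return re.sub(r"\s+", " ", sql.strip().lower())

        @staticmethod
        def _referenced_tables(sql):
            # Naive table extraction; a real proxy would parse the SQL.
            return set(re.findall(
                r"\b(?:from|join|into|update)\s+([a-z_][a-z0-9_]*)", sql))

        def route(self, sql):
            """Decide which backend should execute this statement."""
            q = self._normalize(sql)
            if not q.startswith("select"):
                return "postgresql"     # writes go to the OLTP store
            if re.search(r"\b(?:group by|cube|rollup|over)\b", q):
                return "greenplum"      # heavy analytics go to the warehouse
            return "postgresql"

        def execute(self, sql, run_on_backend):
            """run_on_backend(backend_name, sql) does the real work;
            this wrapper adds caching and invalidation around it."""
            q = self._normalize(sql)
            if q.startswith("select"):
                hit = self.cache.get(q)
                if hit and hit[0] > time.time():
                    return hit[1]       # cache hit: no backend round trip
                result = run_on_backend(self.route(q), sql)
                self.cache[q] = (time.time() + self.ttl, result)
                for table in self._referenced_tables(q):
                    self.cached_by_table.setdefault(table, set()).add(q)
                return result
            # A write: evict every cached SELECT that reads a touched table.
            for table in self._referenced_tables(q):
                for key in self.cached_by_table.pop(table, set()):
                    self.cache.pop(key, None)
            return run_on_backend("postgresql", sql)

    if __name__ == "__main__":
        mgr = SqlTrafficManager()
        backend = lambda name, sql: f"[{name}] {sql}"
        query = "SELECT region, SUM(amt) FROM sales GROUP BY region"
        print(mgr.route(query))              # -> greenplum
        print(mgr.execute(query, backend))   # executed on the warehouse
        print(mgr.execute(query, backend))   # identical result, from cache
        mgr.execute("INSERT INTO sales VALUES (1, 'EMEA', 42)", backend)
        print(mgr.execute(query, backend))   # re-executed: cache was evicted

The design point worth noting is that routing and invalidation live in the same layer: because every write passes through the proxy, it knows exactly which cached result sets to evict, and that is what makes this style of caching possible without application code changes.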

Listen to our interview with Heimdall CTO Eric Brandsberg, recorded at Greenplum Summit 2018 in Jersey City last May.

Celebrating an Amazing 2015 for Pivotal Big Data Communities

In 2015, Pivotal released more than 6 million lines of code into open source, and we launched 4 major new open source projects.

This was a significant accomplishment by Pivotal in terms of engineering, legal, product development, and marketing efforts.

We’ve already seen quite a bit of interest in these nascent communities during 2015.

2016 is our year to build these communities to a critical mass of members by delivering first official releases, simplifying the path to adoption and contribution, and driving awareness.

Here is a Happy New Year message from Pivotal to the Pivotal Big Data Community.

Our Customers at Pivotal Recognize the Importance of Bridging Traditional Data Warehousing into a Next-Generation Platform

Cross posted from my blog at Pivotal POV:

Recently Gartner published the report, “Gartner Critical Capabilities for Data Warehouse Database Management Systems” that shares survey results of customers from a variety of Data Warehouse solution vendors.  The report ranks vendors in 4 categories of use cases in the Data Warehouse market: “Traditional Data Warehouse”, “Operational Data Warehouse”, “Logical Data Warehouse”, and “Context Independent Data Warehouse.”

Based on existing customer implementations and their experiences with data warehouse DBMS products, the report scored Pivotal in the top 2 out of 16 vendors in two use cases: “Traditional Data Warehouse” and “Logical Data Warehouse”.  In a third use case, “Context Independent Data Warehouse”, Pivotal scored in the top 3 relative to the 15 other vendors.

In the report, Gartner writes “the adoption rate for modern use cases (such as the logical data warehouse and the context independent warehouse) is increasing year over year by more than 50%—but the net percentage for the context independent and logical data warehouse combined remains below 8% of the total market.”

Modern Data Warehouse Use Cases Generate Trillions in Value

Many of Pivotal’s big data analytics customers started out as Greenplum Database customers. These customers are well established in traditional data warehousing techniques and also take advantage of modern data warehousing scenarios supported by Greenplum Database’s advanced analytics capabilities and by other products in the Pivotal Big Data Suite: Pivotal HAWQ and Pivotal HD.

Industry leaders like General Electric are using Pivotal Big Data Suite to create new solutions that cut weeks of analysis time compared with traditional data warehouse approaches. For example, a process for refining insightful analytics from sensor data streams generated by industrial machinery was compressed from 30 days to just 20 minutes.

Other companies are using these approaches to improve customer retention, target advertising, detect anomalies, improve asset utilization and more. The combined potential benefit of these opportunities is staggering. GE alone predicts its solutions will boost GDP by $10-15 trillion in the next 20 years by saving labor costs and improving energy efficiency.