A Fast Shift to Digital to Reach Kinetica’s End Users

I’d been paying attention to the news unfolding as SARS-CoV-2 began spreading around the world in February and March. It was when Mobile World Congress in Barcelona was suddenly cancelled that I really understood how quickly COVID was going to radically reshape our world.

Over the next few weeks, conference after conference was cancelled or turned into a digital-only experience. Those early digital-only conferences offered only a small subset of the usual content, with no partner exhibition or speaking opportunities. This was a problem for Kinetica, since the next six months of our marketing calendar were anchored to several important partner conferences.

The marketing team rapidly revamped Kinetica’s demand generation strategy to be 100% digital. They asked me to produce an end-user education series to help practitioners such as data scientists, data engineers, geospatial analysts, and application developers understand the kinds of problems Kinetica can help them solve.

With the help of Kinetica’s amazing technologists and our plucky demand generation team, we cranked out a huge curriculum of talks featuring Kinetica’s many subject matter experts.

Kinetica can best be thought of as a convergence of real-time data warehouses and context-independent data warehouses, with additional capabilities for deploying machine learning models at scale and for helping developers create analytics-driven applications. In Gartner’s updated view of the DBMS market, Kinetica is best suited for event stream processing use cases and for creating Augmented Transaction Processing solutions.

You’ll find that most of the talks are geared towards data engineers tasked with designing an event stream processing pipeline at scale. For example, such a data engineer will want to enable real-time analysis of data such as time-series sensor feeds from IoT devices or change-data-capture (CDC) messages from upstream transactional systems. The data engineer would collect this data with an event streaming platform such as Apache Kafka, which can rapidly feed it into Kinetica. Once Kinetica ingests the data, the data engineer would set up real-time data transformation and feature calculation, and then feed those features into a deployed ML model for scoring.
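To make that concrete, here is a minimal Python sketch of the ingestion side of such a pipeline using the kafka-python client. The broker address, topic name, and sensor payload schema are illustrative assumptions, not part of any talk; on the Kinetica side, the topic would be mapped to a target table using Kinetica’s Kafka ingestion tooling rather than application code.

```python
# Minimal sketch: publish simulated IoT sensor readings to a Kafka topic
# that a downstream analytics database (e.g., Kinetica via its Kafka
# connector) would consume. Broker, topic, and schema are assumptions.
import json
import random
import time

from kafka import KafkaProducer

# Connect to a (hypothetical) local Kafka broker and serialize records as JSON.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda record: json.dumps(record).encode("utf-8"),
)


def publish_sensor_reading(sensor_id: str) -> None:
    """Publish one simulated sensor reading to the 'sensor-readings' topic."""
    reading = {
        "sensor_id": sensor_id,
        "timestamp": int(time.time() * 1000),  # epoch milliseconds
        "temperature_c": round(random.uniform(15.0, 40.0), 2),
        "humidity_pct": round(random.uniform(20.0, 90.0), 2),
    }
    producer.send("sensor-readings", value=reading)


if __name__ == "__main__":
    for i in range(10):
        publish_sensor_reading(sensor_id=f"sensor-{i % 3}")
    producer.flush()  # ensure all buffered messages reach the broker
```

From there, the transformation, feature calculation, and model scoring steps described above would happen inside the database rather than in this producer code.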

We also have a few talks for geospatial analysts and application developers. Generally, GIS users will be quite comfortable with Kinetica’s capabilities, especially if they are already familiar with Postgres with PostGIS. Developers tend to join a Kinetica project later than the other end users; they’ll be tasked with connecting their application to Kinetica via the REST API or with displaying the map visualizations that Kinetica renders.
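As a rough illustration of what that developer hand-off can look like, here is a small Python sketch that posts a SQL statement, including a PostGIS-style geospatial filter, to a database’s HTTP endpoint and reads back JSON. The host, port, endpoint path, and request/response shapes are placeholders for illustration only, not Kinetica’s documented REST contract; the actual endpoints and parameters are described in Kinetica’s REST API reference.

```python
# Minimal sketch of an application talking to an analytics database over HTTP.
# NOTE: the host/port, endpoint path ("/execute/sql"), and payload shape below
# are illustrative placeholders, not Kinetica's documented REST API.
import requests

DB_URL = "http://localhost:9191"  # assumed host/port for a local instance


def run_query(sql: str) -> dict:
    """Send a SQL statement to a (hypothetical) SQL-over-HTTP endpoint."""
    response = requests.post(
        f"{DB_URL}/execute/sql",                # placeholder endpoint name
        json={"statement": sql, "limit": 100},  # placeholder request body
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # A PostGIS-style geofence filter of the kind a GIS analyst would recognize.
    result = run_query(
        "SELECT sensor_id, temperature_c "
        "FROM sensor_readings "
        "WHERE ST_WITHIN(location, ST_GEOMFROMTEXT("
        "'POLYGON((-74.05 40.68, -73.90 40.68, -73.90 40.82, "
        "-74.05 40.82, -74.05 40.68))'))"
    )
    print(result)
```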

I’d like to give special thanks to all my colleagues who helped us roll out so much amazing educational content in such a short period of time.
