Kinetica CTO predicts IoT and AI will finally yield returns, data warehouses will slowly die

Virginia Backaitis
Published in Digitizing Polaris
Dec 13, 2017


Yesterday’s technologies won’t solve today’s problems. We seem to learn this lesson over and over again. That is the bad news. The good news is that brilliant minds keep creating new technologies to solve those problems.

Take tracking terrorists, for example. They generally leave a big enough digital data trail to make their intentions and whereabouts known. The difficulty in tracking them is in analyzing that data quickly enough to prevent them from endangering society.

When the Army and the NSA tried using yesterday’s leading-edge technologies for the task, they kept hitting dead ends: data warehouses were too slow, NoSQL wouldn’t scale, and premium in-memory solutions promised real-time performance but didn’t deliver. So they turned to the founders of Kinetica, who were employed elsewhere at the time, to solve the problem.

The answer was an entirely new kind of database.

The Kinetica database is an in-memory database that runs on commodity hardware equipped with Nvidia GPUs to supercharge processing. It is 10x-100x faster than even the most advanced in-memory databases. The U.S. Postal Service leverages it to track more than 200,000 mail trucks that deliver more than 500 million pieces of mail to more than 154 million addresses daily. Not only can Kinetica display precise locations at any point in time, it can also overlay traffic and weather conditions to optimize routes. Last year, insights gleaned from Kinetica’s database yielded savings of more than $70 million.
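To get a feel for where that kind of speedup comes from, here is a minimal sketch, assuming an Nvidia GPU with NumPy and CuPy installed, of the same filter-and-aggregate query run on CPU and GPU. This is a generic illustration, not Kinetica’s API, and the truck-speed data is synthetic.

```python
# A generic CPU-vs-GPU aggregation sketch (NOT Kinetica's API).
# Assumes numpy, cupy, and an Nvidia GPU are available.
import time

import numpy as np
import cupy as cp

N = 50_000_000  # 50M synthetic "truck speed" readings
speeds = (np.random.rand(N) * 80.0).astype(np.float32)  # mph

# CPU: average speed of trucks exceeding 55 mph
t0 = time.perf_counter()
cpu_mean = speeds[speeds > 55.0].mean()
cpu_s = time.perf_counter() - t0

# GPU: the same filter-and-aggregate on device memory
speeds_gpu = cp.asarray(speeds)    # copy the column to the GPU
cp.cuda.Stream.null.synchronize()  # exclude the copy from timing
t0 = time.perf_counter()
gpu_mean = float(speeds_gpu[speeds_gpu > 55.0].mean())
cp.cuda.Stream.null.synchronize()  # wait for the kernel to finish
gpu_s = time.perf_counter() - t0

print(f"CPU: {cpu_mean:.2f} mph in {cpu_s:.3f}s")
print(f"GPU: {gpu_mean:.2f} mph in {gpu_s:.3f}s")
```

On typical hardware the GPU version wins by an order of magnitude or more once the data is resident in device memory, which is the in-memory, GPU-resident design described above.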

Enough about what Kinetica does. Its cofounder and CTO, Nima Negahban, has ideas about where today’s hot technologies like AI and IoT will take us tomorrow.

Here are his predictions:

Organizations will demand a return on their IoT investments

There are a lot of smart things; even a light bulb has an IP address behind it these days. Companies continued to invest in IoT initiatives in 2017, but 2018 will be the year when IoT monetization becomes critical. Collecting and storing IoT data is a good start for enterprises, but what is more meaningful is understanding it, analyzing it and leveraging the insights to improve efficiency. Think energy savings, delivery-route optimization, faster pizza deliveries. The focus on location intelligence, predictive analytics and streaming-data analysis use cases will increase dramatically to drive a return on IoT investments.
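As a concrete, if toy, illustration of that last point, here is a minimal streaming-analysis sketch; the device ID, threshold and window size are all invented for the example.

```python
# A toy streaming-analytics loop over IoT readings.
# Device IDs, thresholds, and window size are invented.
from collections import deque
from statistics import mean

WINDOW = 12          # keep the last 12 readings
ALERT_WATTS = 950.0  # hypothetical energy budget

readings = deque(maxlen=WINDOW)

def on_reading(device_id: str, watts: float) -> None:
    """Handle one smart-meter reading as it streams in."""
    readings.append(watts)
    rolling = mean(readings)
    if rolling > ALERT_WATTS:
        # A real system would act on this insight, e.g. dim the
        # bulbs or reschedule a delivery route, to save money.
        print(f"{device_id}: rolling average {rolling:.0f} W over budget")

# Simulated stream of readings
for watts in [900, 940, 980, 1010, 990, 1005]:
    on_reading("bulb-42", float(watts))
```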

Enterprises will move from AI science experiments to truly operationalizing it

Enterprises have spent the past few years educating themselves on various artificial intelligence frameworks and tools. But as AI goes mainstream, it will move beyond small-scale experiments run by data scientists in an ad hoc manner to being automated and operationalized. The complexity of the technologies used for data-driven machine and deep learning has meant that data scientists spend less time coding and building algorithms and more time configuring and administering databases and data management systems. As enterprises move forward with operationalizing AI, they will look for products and tools to automate, manage and streamline the entire machine learning and deep learning life cycle. Data scientists need to focus on the code and algorithms, not on automating and operationalizing the process. In 2018, investments in AI life-cycle management will increase, and technologies that house the data and supervise the process will mature.
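What might that life-cycle layer look like? Here is a minimal sketch, with an invented file layout and record format, of the kind of bookkeeping such tools automate: every training run is logged with its data fingerprint, parameters and metrics so it can be reproduced and audited later.

```python
# A minimal, invented sketch of ML life-cycle bookkeeping: record
# each training run so it can be reproduced, compared, and promoted.
import hashlib
import json
import time
from pathlib import Path

RUNS = Path("runs")  # hypothetical run-registry directory
RUNS.mkdir(exist_ok=True)

def log_run(train_data: bytes, params: dict, metrics: dict) -> str:
    """Register one training run and return its ID."""
    run_id = f"run-{int(time.time())}"
    record = {
        "run_id": run_id,
        "data_sha256": hashlib.sha256(train_data).hexdigest(),
        "params": params,
        "metrics": metrics,
    }
    (RUNS / f"{run_id}.json").write_text(json.dumps(record, indent=2))
    return run_id

# The data scientist writes the model; the life-cycle layer records it.
run_id = log_run(b"...training data...", {"lr": 0.01, "epochs": 5},
                 {"accuracy": 0.92})
print("registered", run_id)
```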

Beginning of the end of the traditional data warehouse

As the volume, velocity and variety of data continue to grow, and the requirements for managing and analyzing that data grow at a furious pace as well, the traditional data warehouse is struggling to keep up. While in-memory databases have alleviated the problem to some extent by providing better performance, data analytics workloads are becoming more and more compute-bound.

These workloads can run up to 100x faster on the latest advanced processors such as GPUs; however, getting there means a near-complete rewrite of the traditional data warehouse. In 2018, enterprises will start to seriously rethink their traditional data-warehousing approach and look at moving to next-generation databases that leverage memory, advanced processor architectures (GPU, SIMD), or both.
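The SIMD half of that claim is easy to demonstrate in miniature. The sketch below, on synthetic data, contrasts a row-at-a-time scan, roughly how a classic interpreter-style engine executes a query, with a vectorized columnar scan of the kind NumPy (and, at much larger scale, GPU databases) performs.

```python
# Row-at-a-time vs. vectorized columnar scan, on synthetic data.
# NumPy dispatches to SIMD-friendly loops; GPU engines push the
# same idea further. Exact timings depend on your machine.
import time
import numpy as np

N = 2_000_000
amounts = np.random.rand(N) * 100.0  # a synthetic "amount" column

# Row-at-a-time, interpreter style
t0 = time.perf_counter()
total = 0.0
for a in amounts:
    if a > 90.0:
        total += a
row_s = time.perf_counter() - t0

# Vectorized: one pass over a contiguous column
t0 = time.perf_counter()
vec_total = amounts[amounts > 90.0].sum()
col_s = time.perf_counter() - t0

print(f"row-at-a-time: {row_s:.2f}s  vectorized: {col_s:.4f}s")
print(f"speedup ~{row_s / col_s:.0f}x, results agree:",
      np.isclose(total, vec_total))
```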

Building safer artificial intelligence with audit trails

Artificial intelligence is increasingly being used for applications, such as drug discovery and the connected car, where an incorrect decision can have a detrimental impact on human life. Many AI frameworks are black boxes, with many layers of computation built up as the framework learns from various data points. In 2018, enterprises will start to look at detecting and pinpointing what caused an incorrect final decision that led to a serious problem, likely spurred by a serious AI blunder that is unfortunately bound to happen eventually. Auditing and tracking every input and every score a framework produces will help trace the failure back to the human-written code that ultimately caused it.
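In its simplest form, such an audit trail is just an append-only record of every input and every score. Here is a minimal sketch; the model, log format and version string are invented for illustration.

```python
# A minimal, invented audit-trail sketch: append every input the
# model sees and every score it returns to an append-only log.
import json
import time
import uuid

AUDIT_LOG = "decisions.jsonl"  # one JSON record per line

def score(features: dict) -> float:
    """Stand-in for a real model; returns a risk score in [0, 1]."""
    return min(1.0, 0.01 * features.get("speed_mph", 0.0))

def audited_score(features: dict) -> float:
    s = score(features)
    record = {
        "id": str(uuid.uuid4()),
        "ts": time.time(),
        "model_version": "v1.3.0",  # ties the decision to exact code
        "input": features,
        "score": s,
    }
    with open(AUDIT_LOG, "a") as f:
        f.write(json.dumps(record) + "\n")
    return s

print(audited_score({"speed_mph": 72}))
```

With every decision on disk alongside a model version, an investigator can replay the exact inputs that preceded a failure and narrow the search to the code and data that produced them.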

Can you cosign these predictions?

If Negahban is right, and let’s hope he is, then next year we may very well be living in a smarter, safer, more efficient world.

What are your predictions for 2018? If you need inspiration, MapR’s chief applications officer Ted Dunning shared his with Digitizing Polaris here.
