
When Nikita Ivanov co-founded GridGain Systems back in 2005, he envisioned in-memory computing going mainstream and becoming a massive category unto itself within a few years. That obviously didn’t pan out, but on the eve of the In-Memory Computing Summit 2020 taking place later this week, the GridGain CTO is still bullish on the future of in-memory computing, particularly for powering stream processing.

“When I started this journey close to 20 years ago, there was a general belief that in-memory computing will be a massive category, as has been with cloud compute,” Ivanov says. “And it didn’t turn out this way. In-memory computing hasn’t become a massive category. It’s still an important category, but it’s a little bit different.”

Instead, in-memory computing “kind of morphed into something else,” Ivanov says. In particular, in-memory computing technologies–specifically the in-memory data grids (IMDGs) that GridGain Systems and other vendors develop–have become a core element underlying large stream processing setups. IMDGs and stream computing are not the same thing, obviously, but they share similarities.

GridGain has found a certain degree of success with Apache Ignite, an open-source platform for storing and computing on large volumes of data across a cluster of nodes. Ignite is essentially the free and open version of GridGain’s enterprise-level in-memory computing platform, which GridGain donated to the Apache Software Foundation back in 2014.
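For readers unfamiliar with Ignite, the sketch below shows the general shape of its Java API: a node is started, a named distributed cache is created, and key-value pairs are written to and read from memory shared across the cluster. The cache name and sample values here are illustrative, not drawn from the article.

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;

public class IgniteCacheSketch {
    public static void main(String[] args) {
        // Start an Ignite node; in a real deployment multiple nodes
        // discover each other and form a cluster that shares the cache.
        try (Ignite ignite = Ignition.start()) {
            // Create (or connect to) a named in-memory key-value cache.
            IgniteCache<Long, String> orders = ignite.getOrCreateCache("orders");

            // Reads and writes go against memory on the cluster nodes.
            orders.put(1L, "order #1: pending");
            System.out.println(orders.get(1L));
        }
    }
}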

Today, Ignite is the fifth most popular open-source project at the ASF, Ivanov says, behind Spark, Kafka, Hadoop, and Cassandra. The project has matured to the point where it no longer has to prove itself and is widely considered a core technological building block when developers absolutely need the fastest transactional performance.

William Bain, CEO of fellow IMDG vendor ScaleOut Software, says “IMDGs are flourishing in massive real-world use cases, such as tracking data for a million e-commerce shoppers or a fleet of rental cars.”

Thanks to their ability to store fast-changing data in memory, IMDGs will be essential for acting upon live streams of data when the latencies involved with data lakes and other big data systems are too great, Bain says.

IMDGs aren’t ideal for all streaming or IoT use cases. But when the use case is critical and time is of the essence, IMDGs will have a role in orchestrating the data and providing fast response times.
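To illustrate how an IMDG can react to a live stream with low latency, the hedged sketch below uses Apache Ignite’s continuous query API (the product discussed earlier in this article, not ScaleOut’s product) to listen for updates to fast-changing entries as they land in the grid. The cache name, keys, and values are made up for the example.

import javax.cache.Cache;
import javax.cache.event.CacheEntryEvent;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.ContinuousQuery;
import org.apache.ignite.cache.query.QueryCursor;

public class LiveStreamSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            IgniteCache<String, Double> telemetry = ignite.getOrCreateCache("telemetry");

            // A continuous query pushes each update to a local listener
            // as soon as it hits the in-memory grid.
            ContinuousQuery<String, Double> qry = new ContinuousQuery<>();
            qry.setLocalListener(events -> {
                for (CacheEntryEvent<? extends String, ? extends Double> e : events)
                    System.out.printf("vehicle %s reported %.1f%n", e.getKey(), e.getValue());
            });

            // The cursor keeps the query active while updates stream in.
            try (QueryCursor<Cache.Entry<String, Double>> cur = telemetry.query(qry)) {
                // Stand-in for a live feed; real updates would arrive from a stream.
                telemetry.put("car-42", 61.0);
                telemetry.put("car-42", 64.5);
            }
        }
    }
}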

“The combination of memory-based storage, transparent scalability, high availability, and integrated computing offered by IMDGs ensures the most effective use of computing resources and leads to the fastest possible responses,” Bain says. “Powerful but simple APIs enable application developers to maintain a simplified view of their data and quickly analyze it without bottlenecks. IMDGs offer the combination of power and ease of use that applications managing live data need more than ever before.”