As you may have noticed, we’ve started a new series about “The New Digital Experience.” It’s meant to share the best practices companies have adopted to improve the customer experience and transform into digital businesses, with a particular focus on how in-memory computing is used alongside other technologies.
One of the most important architectural concepts that companies need to understand is why in-memory computing is so critical to improving the customer experience.
So, we’re kicking off the New Digital Experience Webinar Series with a guest speaker, Matt Aslett, a research director at the 451 Group, who’s going to cover his area of expertise, In-Memory Computing: A New Engine for Accelerating the Data Behind Digital Business and the Customer Experience.
Matt’s research includes the report “A New HOAP? Hybrid operational-analytic processing and the future of the database market.”
We’ve always talked about being more data-driven to help improve the customer experience. It’s partly what’s driven a 50x growth in data over the last decade, along with big investments in real-time analytics and automation.
There are two problems with the current approaches. First, the data is out of date or missing. It’s hard to improve an experience when you don’t know what’s happening in the moment. That’s partly because the data flows for analytics are separate from operations, and partly because the analytics take too long. Second, people often can’t take action fast enough. To improve the customer experience in real time, some actions need to be automated. That means performing the analytics on up-to-date data as the transaction, operation or interaction is happening.
Whether it’s called hybrid transactional (HTAP, coined by Gartner), operational (HOAP, coined by the 451 Group) or interactional (HOIP, which I made up and is doomed to failure) doesn’t really matter. The end goal, reaching HTOAPIA, is the same. Many innovators, from retailers like Amazon and Expedia/HomeAway to SaaS companies like Workday, have succeeded with in-memory computing.
The way to implement HTAP or HOAP is to keep operational data and analytics together, so that analytics can work against the most recent data. This means the same data infrastructure needs to support, for example, SQL and ACID transactions alongside any analytics or other computational needs, as sketched below.
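To make that concrete, here is a minimal sketch of the pattern, assuming an in-memory computing platform such as Apache Ignite (the platform choice, cache name and schema are illustrative, not something the webinar prescribes). A transactional write and an analytical SQL query run against the same in-memory data:

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.CacheAtomicityMode;
import org.apache.ignite.cache.query.SqlFieldsQuery;
import org.apache.ignite.configuration.CacheConfiguration;
import org.apache.ignite.transactions.Transaction;

public class HtapSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // One cache serves both transactions and analytics.
            // "orders" and the Long -> Double schema are hypothetical.
            CacheConfiguration<Long, Double> cfg =
                new CacheConfiguration<Long, Double>("orders")
                    .setAtomicityMode(CacheAtomicityMode.TRANSACTIONAL)
                    .setIndexedTypes(Long.class, Double.class);
            IgniteCache<Long, Double> orders = ignite.getOrCreateCache(cfg);

            // Operational side: an ACID transaction records the latest order.
            try (Transaction tx = ignite.transactions().txStart()) {
                orders.put(1L, 99.95);
                tx.commit();
            }

            // Analytical side: SQL runs against the same in-memory data,
            // so the aggregate reflects the write that just committed.
            SqlFieldsQuery qry = new SqlFieldsQuery(
                "SELECT COUNT(*), AVG(_VAL) FROM Double");
            System.out.println(orders.query(qry).getAll());
        }
    }
}
```

Because both sides share a single copy of the data, the analytics see each transaction the moment it commits, with no separate pipeline in between.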
The way many companies have implemented real-time HTAP or HOAP is to put all the data in memory, and to run the analytics against the data in memory. For Big Data analytics and other computations, the only way to achieve real-time responsiveness is to avoid network movement by collocating the processing with the data, because it takes too long to move Big Data over the network.
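Here is an equally hedged sketch of that collocation, again using Apache Ignite as the example platform (the cache name and key are hypothetical): instead of pulling the data to the computation, the computation is shipped to the node that owns the data.

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;

public class ColocationSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // "customers" is an illustrative cache name.
            ignite.getOrCreateCache("customers").put(42L, "Alice");

            // affinityRun() ships this closure to whichever node holds
            // key 42, so the read is local: no data crosses the network.
            ignite.compute().affinityRun("customers", 42L, () -> {
                Object value = Ignition.localIgnite()
                    .cache("customers").get(42L);
                System.out.println("Processed locally: " + value);
            });
        }
    }
}
```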
In order to support current transaction, query, analytics and data growth rates, the only way to scale is horizontally on low-cost hardware. Scaling horizontally while still performing real-time analytics means distributing the data to guarantee collocated processing, in what is often called massively parallel processing (MPP). An in-memory computing platform needs to provide all of these capabilities to be an effective platform for real-time HTAP or HOAP. The sketch below shows one way that distribution is typically expressed.
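As an illustration of how related data can be kept together for MPP-style processing, here is a hypothetical affinity key, once more in Apache Ignite terms (the class and field names are invented for this example). Partitioning orders by customer keeps each customer’s orders on the same node as the customer record:

```java
import org.apache.ignite.cache.affinity.AffinityKeyMapped;
import org.apache.ignite.cache.query.annotations.QuerySqlField;

// Orders are partitioned by customerId, so every order lands on the
// same node as its customer, and a join on customerId needs no shuffle.
// (A real cache key would also implement equals() and hashCode().)
public class OrderKey {
    @QuerySqlField
    private long orderId;

    @AffinityKeyMapped
    @QuerySqlField
    private long customerId;

    public OrderKey(long orderId, long customerId) {
        this.orderId = orderId;
        this.customerId = customerId;
    }
}
```

With a key like this, a distributed SQL join between orders and customers on customerId is collocated: each node computes its share of the join locally and in parallel, and only the results travel over the network.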
Watch the webinar with Matt Aslett from the 451 Group, In-Memory Computing: A New Engine for Accelerating the Data Behind Digital Business and the Customer Experience. It takes place live on Wednesday, April 18 at 10 am PT (1 pm ET). Register here.
This live, interactive webinar will give you a deeper understanding of the importance of HTAP/HOAP and the need for in-memory computing. You’ll also learn how other companies have implemented real-time HTAP/HOAP with in-memory computing, and see a roadmap for adopting in-memory computing over time.