As we move through 2023, enterprises will increasingly demand that their data “circulatory system” – data storage, movement, sharing, processing, and analytics – run on an end-to-end microsecond computing ecosystem.
In the distant past (25 years ago), latency wasn’t a big topic of conversation. Database analytics was performed in batch mode, often nightly or over weekends, and most users were happy just to be online, no matter how long it took for a static webpage to load. Today, we live in a very different real-time world, with cybersecurity, fraud detection, video meetings, logistics optimization, mobile banking, telemedicine, data-driven decision-making, online betting, most IoT use cases, and so much more, all depending on low-latency data.
And it isn’t just speed. It’s speed at scale. The amount and types of data exchanged among systems continue to soar, with the machine-to-machine (M2M) connections market expected to reach over $54 billion by the end of 2030. But if businesses are to achieve the benefits they expect from their data-driven use cases, then data access and processing cannot come at the cost of increased latency.
A traffic control system built around autonomous vehicles remains the most intuitive, accessible example of the challenge of eliminating latency end to end. Microseconds of delay in a vehicle starting to brake can be the difference between a safe stop and hitting a pedestrian in a crosswalk. Yet consider the data requirements of that vehicle, as well as every other vehicle in the system. A massive amount of data must be collected, shared, streamed, processed, and analyzed. Vehicle health. Road maps. Road and traffic conditions. Geolocation. Proximity detection. Visual and audio processing. All in real time, with 100% accuracy. There can’t be a single mistake, ever.
Certainly, not every data use case is this demanding. But imagine a delay in the ability of a remote surgeon (or a robotic surgeon) to respond to complications. Or the consequences of an online betting solution that is slow to recalculate odds. Or the cost of a logistics solution that continually delivers just-too-late guidance. Or the cumulative effect of small delays in the thousands of decisions required to ensure a flight takes off on time.
Nearly every data solution today claims to deliver low-latency response times – and most probably do, but at what scale? And what happens when these solutions have to interact with others along the data pipeline? If these low-latency solutions cannot maintain performance across the silos and boundaries of the data ecosystem, their overall value to enterprises remains constrained.
That’s why enterprises need a tightly integrated, end-to-end microsecond computing ecosystem. And that's why vendors who want to thrive over the coming years will need to figure out how to drive and support that ecosystem.
What will it take? The specifics aren’t yet clear, but I expect the following trends will start to emerge.
Vendor lock-in: Businesses dread vendor lock-in because it can limit agility and the ability to optimize for cost. However, to achieve the required speed and reliability at scale, many businesses may be willing – or at least feel compelled – to accept lock-in for their data infrastructure, minimizing latency in data movement in exchange for a more tightly coupled data flow. The vendors that stand to benefit most are the usual suspects: the hyperscalers.
“Best-of-breed” to “best fit”: Along with vendor lock-in comes another compromise: accepting whatever solution stack the vendor provides. Some components of that stack will be inferior to the best-of-breed solutions on the market. This creates a narrow window of opportunity for today’s best-of-breed providers, who must figure out how to integrate into these ecosystems. Will they sacrifice some innovation to ensure the best fit? Will they form new partnerships, or will the industry establish new standards?
Consolidation: To differentiate and ensure success, established vendors will strive to control and deliver broader solutions that span more of the microsecond ecosystem. Expect disruption in the data industry as a result, including some consolidation.
In a world that increasingly expects “real-time decisions” to become reality, we simply cannot get where we want to go without an end-to-end microsecond computing ecosystem. The technology to make this happen is available, so it’s only a matter of time and willpower. With enterprises demanding it, I expect vendors to step up and deliver new solutions for this modern data circulatory system or be left behind.
This article was originally published at Forbes.com