
Blockchain Interoperability Remains Elusive 


Early adopters of blockchain technology, particularly financial institutions that are among the first to wring out the distributed, encrypted and immutable ledgers, are being forced to analyze data offline, not “on-chain.” The primary reasons are a paucity of standard query languages, data access latency and a general lack of interoperability between analytics and data visualization tools and the multiple blockchain protocols they must work with.

That’s the conclusion of an analysis of early blockchain rollouts by capital markets firms released this week by TABB Group. Is “data—the lifeblood of capital markets—literally locked up in a new data paradigm?” the report asks.

The study found that blockchain proof-of-concept efforts are handling analytics “off-chain” after extracting data from enterprise deployments. “This is neither ideal nor sustainable. Required will be an on-chain approach to extract full value and competitive advantage of enterprise blockchain,” the authors note.
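For readers unfamiliar with the pattern, the sketch below illustrates what “off-chain” analytics typically looks like: copy blocks out of a ledger through its node API, flatten them into rows, and run queries against the local copy rather than the chain itself. This is a minimal, hypothetical example assuming an Ethereum-style node exposing the standard JSON-RPC interface on a local endpoint; permissioned enterprise ledgers expose different, non-standard APIs, which is precisely the interoperability gap the report describes.

```python
# Hypothetical sketch of the "off-chain" analytics pattern: extract ledger data
# via a node's JSON-RPC API, then analyze the local copy with ordinary tooling.
# Assumes an Ethereum-style node at a local endpoint (an assumption, not part
# of the TABB Group report).
import requests

NODE_URL = "http://localhost:8545"  # assumed local node endpoint


def rpc(method, params):
    """Minimal JSON-RPC helper."""
    resp = requests.post(
        NODE_URL,
        json={"jsonrpc": "2.0", "method": method, "params": params, "id": 1},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["result"]


def extract_blocks(start, end):
    """Copy a range of blocks off-chain as plain Python dicts."""
    rows = []
    for n in range(start, end + 1):
        # Second parameter True asks for full transaction objects, not just hashes.
        block = rpc("eth_getBlockByNumber", [hex(n), True])
        for tx in block["transactions"]:
            rows.append({
                "block": n,
                "tx_hash": tx["hash"],
                "sender": tx["from"],
                "recipient": tx["to"],
                "value_wei": int(tx["value"], 16),
            })
    return rows


if __name__ == "__main__":
    # The analysis step runs against the exported copy, not the ledger itself.
    rows = extract_blocks(0, 10)
    total = sum(r["value_wei"] for r in rows)
    print(f"{len(rows)} transactions extracted, total value {total} wei")
```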

Data analytics vendors are beginning to tackle some of these teething problems with tools for specific applications like smart contracts, but even these developers note that blockchain standards remain largely non-existent.

As HPC platforms enable experimentation with emerging financial technologies, market trackers such as TABB Group have touted the promise of blockchain deployments. While blockchain has helped overcome some data silo problems, this latest study expands on that theme as data analytics gaps surface in the blockchain ecosystem.

TABB Group’s study also examines early data analytics and visualization offerings that attempt to address emerging interoperability issues with various enterprise blockchain protocols.

Terry Roche, TABB Group’s head of fintech research, has previously noted that the next-generation database architecture “is just one piece of the puzzle, not the entire puzzle” that provides a validation mechanism, thereby eliminating the burden of having to “go in and check your data.”

As far back as 2016, the market analyst was highlighting the need for blockchain interoperability standards. In one example, Roche described the challenge of validating precisely what financial instruments traders are trading so that multiple peer-to-peer ledgers of transactions agree with one another.

“The data is all over the organization,” Roche said. “Blockchain can solve this problem, but only if the industry comes together to standardize its understanding of what’s being traded…and establish community services, utility services that moves all of the post-trading data validation to happen pre-trade.”
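To make Roche’s point concrete, the following minimal sketch shows what moving validation pre-trade could look like: a proposed trade is checked against a shared, standardized instrument reference before it is written to any ledger, so peer-to-peer copies never disagree about what was traded. The Instrument and Trade fields and the reference data are illustrative assumptions, not drawn from the report.

```python
# Hypothetical pre-trade validation against a shared instrument reference.
# A "utility service" of the kind Roche describes would hold the golden copy
# of this reference data; the fields below are illustrative assumptions.
from dataclasses import dataclass


@dataclass(frozen=True)
class Instrument:
    isin: str          # standardized identifier, e.g. an ISIN
    asset_class: str
    currency: str


# Shared reference data (golden copy) consulted before any trade is recorded.
REFERENCE_DATA = {
    "US0378331005": Instrument("US0378331005", "equity", "USD"),
}


@dataclass
class Trade:
    isin: str
    quantity: int
    currency: str


def validate_pre_trade(trade: Trade) -> list:
    """Return a list of validation errors; an empty list means the trade may proceed."""
    errors = []
    instrument = REFERENCE_DATA.get(trade.isin)
    if instrument is None:
        errors.append(f"unknown instrument {trade.isin}")
    elif trade.currency != instrument.currency:
        errors.append(f"currency {trade.currency} does not match reference {instrument.currency}")
    if trade.quantity <= 0:
        errors.append("quantity must be positive")
    return errors


print(validate_pre_trade(Trade("US0378331005", 100, "USD")))   # [] -> trade may proceed
print(validate_pre_trade(Trade("XX0000000000", 100, "EUR")))   # unknown instrument
```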

Those concerns now extend to data analysis and visualization tools needed as early adopters seek what TABB Group identifies as an “intermediate strategy” for moving prototype blockchain platforms to production.

Analytics vendors are meanwhile focusing on enterprise architectures that would boost what one vendor, Tibco Software, calls the “consumability” of blockchain technology.

Beyond that, Tibco CTO Nelson Petracek agreed that distributed ledgers are not yet optimized for high-speed query access. “How do I get data from the mainframe to the blockchain?” Petracek noted by way of explaining the challenge. “How do I trigger transactions?”
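One hedged, hypothetical way to picture Petracek’s “how do I trigger transactions?” question is the bridge pattern below: an off-chain system (standing in for the mainframe) anchors a record on a ledger by submitting a transaction carrying only the record’s hash. It assumes a local Ethereum-style development node with an unlocked account and the standard eth_sendTransaction JSON-RPC method; a production enterprise deployment would use its protocol’s own signing and SDK, and the account addresses shown are placeholders.

```python
# Hypothetical off-chain-to-on-chain bridge: hash a source-system record and
# anchor the hash on a ledger. Assumes a local Ethereum-style dev node with an
# unlocked sender account; addresses and record fields are placeholders.
import hashlib
import json
import requests

NODE_URL = "http://localhost:8545"                                # assumed local dev node
SENDER = "0x0000000000000000000000000000000000000001"             # placeholder unlocked account
RECEIVER = "0x0000000000000000000000000000000000000002"           # placeholder recipient


def rpc(method, params):
    """Minimal JSON-RPC helper."""
    resp = requests.post(
        NODE_URL,
        json={"jsonrpc": "2.0", "method": method, "params": params, "id": 1},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["result"]


def anchor_record(record: dict) -> str:
    """Hash an off-chain record and submit the hash as transaction data; returns the tx hash."""
    digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    return rpc("eth_sendTransaction", [{
        "from": SENDER,
        "to": RECEIVER,
        "value": "0x0",
        "data": "0x" + digest,   # the record itself stays off-chain; only its hash goes on-chain
    }])


tx_hash = anchor_record({"trade_id": "T-1001", "isin": "US0378331005", "quantity": 100})
print("anchored as", tx_hash)
```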

About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).
