A Few Years of Interoperability Research in Review — Hello 2023 👋

Rafael Belchior
4 min read · Jan 8, 2023

The current crypto winter has brought me back to the keyboard to reflect on my work in the field of blockchain, where a few years already feel like a long time. Looking at the technology available now, compared to 2017, there are things we already take for granted, but there is still a long way ahead. Back then, the scalability (rollups and ZK tech), security (formal methods applied to bug discovery), privacy (again ZK tech and, for example, advances in access control), and interoperability (bridges, rollups) of blockchains were still in their infancy. In the grand scheme of things they still are, but there is now more research, and more methods available, to make things work better.

I started my blockchain journey in 2017, studying information systems and blockchain access control (already collaborating with Hyperledger), and the delicate balance between off-chain and on-chain systems. While working for the Portuguese government, I quickly realized that integration alone was not enough to build more efficient processes on top of blockchains; we needed more. We needed interoperability across old and new processes, centralized regulated systems, and decentralized technologies. What is interoperability? From our (rather comprehensive) survey: “In 1996, Wegner stated that interoperability is the ability of two or more software components to cooperate despite differences in language, interface, and execution platform”. And if people were already discussing interoperability in 1996, its roots go back even further, much like the related areas of business process view integration and database view integration.

After years of research, interoperability remains a key challenge for the widespread adoption of blockchain technology. The ability of different blockchain networks to communicate and exchange data with each other is essential for the technology to grow and integrate into various industries. We took this concept and imagined a regulated organization that is physically able (it runs hardware executing the protocols) and legally able (it is a virtual asset service provider) to perform the most sought-after form of interoperability today: asset transfers. Hence the notion of a blockchain gateway, coined by Dr. Thomas Hardjono and colleagues, was born. After I read their paper, I reached out, and we started working on the SATP protocol. A few weeks later, in mid-2021, Thomas started forming a working group at the IETF. While the technicalities behind blockchain gateways might not look overly complex at first sight (a discussion in our paper that describes the technology), the devil is in the details: ensuring that gateways agree on the state of centralized and decentralized systems (via blockchain views), crash recovery, regulation, and networks of gateways are all trickier behind the scenes. Consider, for example, the happy-path flow of an asset transfer between two users represented by different organizations.
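To make that flow concrete, here is a minimal sketch of the two-gateway happy path. Everything in it is hypothetical: the Gateway interface and method names are illustrative, not the actual SATP message types or the Hyperledger Cacti API.

```typescript
// NOTE: illustrative sketch only; this interface is hypothetical and does
// not correspond to the real SATP or Hyperledger Cacti APIs.
interface Gateway {
  lockAsset(assetId: string): Promise<string>; // returns lock evidence
  mintRepresentation(assetId: string, lockProof: string): Promise<void>;
  commit(sessionId: string): Promise<void>;
  rollback(sessionId: string): Promise<void>;
}

// Happy path: lock on the source network, hand the evidence to the target
// gateway, create the representation there, then commit on both sides.
async function transferAsset(
  source: Gateway,
  target: Gateway,
  assetId: string,
  sessionId: string,
): Promise<void> {
  try {
    // 1. The source gateway locks the asset so it cannot be double-spent.
    const lockProof = await source.lockAsset(assetId);

    // 2. The target gateway verifies the evidence and mints (or unlocks)
    //    the asset representation on the destination network.
    await target.mintRepresentation(assetId, lockProof);

    // 3. Both gateways commit, making the transfer final.
    await source.commit(sessionId);
    await target.commit(sessionId);
  } catch (err) {
    // Any failure rolls back both sides; the crash-recovery part of the
    // protocol covers gateways that go down mid-transfer.
    await Promise.allSettled([
      source.rollback(sessionId),
      target.rollback(sessionId),
    ]);
    throw err;
  }
}
```

The interesting research questions begin exactly where this sketch ends: what happens if a gateway crashes between steps 2 and 3, and how do both sides later prove which state the transfer was in?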

All these problems create research opportunities, and we can see a growing number of people studying the space (and this does not even include the vast majority of grey literature, such as blog posts, news, and so on).

My gut feeling is that interconnecting off-chain systems and blockchains in a more methodical manner will bring synergetic effects similar to the rise of the Internet. For this, we need not only secure interoperability protocols, but also standards, industry pilots (hopefully with use cases beyond asset transfers), and synergies with other scientific areas. Business process management (namely process mining and view integration) and machine learning (namely data science) can be strong allies for monitoring interoperability processes, which balance a delicate mix of heterogeneous systems. In fact, monitoring the complex system-of-systems formed by a blockchain interoperability solution and its consumers might be the trick to avoiding more multi-million-dollar hacks. Apart, of course, from choosing the right interoperability solution in the first place, the topic of a paper I’m really keen on.

(We talk about how to choose a solution in this other Medium article.)
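As a toy illustration of the monitoring idea, here is a sketch that cross-checks a basic bridge invariant across two networks. The ChainClient interface and its methods are hypothetical, purely to make the idea concrete.

```typescript
// NOTE: toy example; ChainClient and its methods are hypothetical.
interface ChainClient {
  totalLocked(assetId: string): Promise<bigint>; // escrowed on the source chain
  totalMinted(assetId: string): Promise<bigint>; // wrapped supply on the target chain
}

// Cross-checks a basic bridge invariant: the wrapped supply on the target
// network must never exceed the amount escrowed on the source network.
async function checkBridgeInvariant(
  source: ChainClient,
  target: ChainClient,
  assetId: string,
): Promise<boolean> {
  const [locked, minted] = await Promise.all([
    source.totalLocked(assetId),
    target.totalMinted(assetId),
  ]);
  if (minted > locked) {
    // An alert here is a first line of defense against a bridge exploit.
    console.error(`ALERT: ${assetId}: minted ${minted} exceeds locked ${locked}`);
    return false;
  }
  return true;
}
```

A real monitor would need finality-aware reads and verifiable state (the blockchain views mentioned above), but even a naive check like this targets the failure mode behind several past bridge exploits, where the wrapped supply silently outgrew the escrow.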

Of course, none of this work would evolve without collaboration and the “open-source spirit”. The Hyperledger Foundation is an excellent example of an organization that promotes, incentivizes, and develops open-source work that delivers a great deal of value. If you have research ideas, please propose them to Hyperledger, namely through the summer internship program; if you want to learn, apply to that program. Many of our research collaborations were established within the boundaries of such programs. So, among open-source-backed projects, new innovative products and services, and more theoretical research, the field of interoperability is becoming more and more popular as people realize its importance.

My prediction is that the field will stabilize in a few years and shift focus from security and privacy (the main research areas nowadays) to more operational topics (monitoring, discovering inefficiencies, synergies with off-chain systems) and novel use cases. Before we solve all the interesting problems, many winters will pass. And after we solve them, more will follow.

Let’s conclude: for this technology to reach its full potential and the maturity it needs, it is important that we keep researching. With every crypto winter, a new train of opportunity enters the station. Let’s use this time to stay focused & build.

Cheers


Rafael Belchior

R&D Engineer at Blockdaemon. Opinions and articles are my own and do not necessarily reflect the view of my employer. https://rafaelapb.github.io