Webinar – What is Blockchain Interoperability?

Blast Team

27 min read

With Alex Smirnov – CEO and Co-Founder of deBridge & Gal Stern – Head of Business Development at deBridge

In the fast-paced world of blockchain technology, achieving seamless communication between diverse blockchain networks is crucial. Blockchain interoperability addresses this challenge, enabling different systems to share data effortlessly. This webinar explores the significance of blockchain interoperability, the technologies behind it, and its transformative impact on various industries.

Join this insightful discussion between Alex Smirnov, CEO and Co-Founder of deBridge, and Gal Stern, Head of Business Development at deBridge, moderated by Radu Enoiu – Co-founder and Head of Product at Bware Labs.

The main talking points include:

  • what is blockchain interoperability
  • how interoperability helps developers build cross-chain applications
  • why decentralization is key for interoperability

About the speakers:

Alex Smirnov – CEO and Co-Founder of deBridge

Alex is a seasoned IT professional, entrepreneur, and passionate blockchain advocate, boasting extensive expertise in software development, system design, and computer modeling. Throughout his career, he has spearheaded research and development efforts in multiple tech firms, contributing significantly to their advancements. He has also showcased his expertise by publishing eight scientific articles and presentations at various international conferences. Alex holds a Master’s in Mechanics and Mathematics from Moscow State University. His current focus lies within the realm of decentralized finance (DeFi), where he dedicates his efforts to the development of deBridge, a groundbreaking cross-chain interoperability and liquidity transfer protocol.

Gal Stern – Head of Business Development at deBridge

Gal is a dynamic professional with a strong background in the blockchain and decentralized finance (DeFi) industry. Currently serving as the Head of Business Development at deBridge, he has driven the company’s growth. With previous roles as the Community & Marketing Lead at xBacked DAO and Business Development Manager at GDA Capital, Gal has a proven track record in community engagement, marketing, and forging strategic partnerships within the blockchain space.

Radu Enoiu, Co-founder and Head of Product @ Bware Labs

With over 10 years of experience in the engineering field, Radu has explored several product management and engineering roles, working for companies like Fitbit and Google. Currently, he leverages his tech expertise and contributes to providing Web3 builders with decentralized infrastructure and high API performance.




Radu 00:00

Hello everyone, and glad to have you here again on a new educational webinar we are presenting today! We are going to talk in this edition about a quite hot topic in the Web3 space, that of blockchain interoperability, and I have here with me Gal and Alex from deBridge, who are, I think, way better equipped to talk about this subject.

Even though Bware Labs and Blast are multi-chain platforms, and we are in partnership with a lot of blockchains and new projects, we’ve also been working very closely with deBridge for about two years now, and I cannot think of anyone who would be better suited to tackle this subject.

Today, we’ll try to provide very useful information and get to the bottom of what interoperability is or, at least, what it should be.

Before getting into that, I would like you guys to make a short presentation about yourself, describing your professional experience, what is interesting to you about blockchain technology, and whatever else you think is relevant to our listeners.

Alex 01:27

Yeah, I can start.

So, hello, everyone! My name is Alex. I’m the CEO and Co-founder of deBridge. I’m a technical person myself and a developer; my background is mainly in math and mechanics. I was doing a Ph.D. in the area of satellites and navigation before I got into crypto, back in 2016.

I found out about BitShares; it was one of the first technologies I started to learn about, and I got super fascinated by blockchain tech in general. Then, getting into the BitShares community, I was helping with development for different projects and teams, and I also found out about Steemit, which used to be the very first social network based on a blockchain, the Steem blockchain. Which was super exciting because back then, I was living in the dormitory of my university, getting like a $100 scholarship per month, and then on Steemit, I just published some article about popular science, you know. And Steemit works in a way that if some whale with a big stake upvotes or likes your article, you get a payout from inflation in STEEM tokens. And some random whale liked my post, and I got more than a $1000 payout for an article I spent like 4 hours on.

It was like super, you know, exciting and incredible, and I realized like, well, guys, I’m doing something wrong. Like I should not do that much science and math; I should dive into blockchain deeper, and that’s how it all started, you know. I started to be active in the Steemit community.

And actually, the moment I got into crypto is sort of immortalized on the blockchain, on Steem. So fun, like some of our team members recently sent me a link like, ‘Look, here’s the transaction that you can use as confirmation of the date when you first got into the blockchain space.’ Yeah, even though I got in maybe a few months before that, it’s quite exciting because it’s a good example of how users are onboarded and funneled into this space.

And then I just started to be super active within crypto communities, to develop more, and to learn how different teams work. And in 2017, we formed a team with another deBridge Co-founder, Yaro, where we started to provide blockchain development services. And yeah, it was a long story, but later on, back in 2020, we started to work on deBridge.

Radu 04:06

But before we dive into that, yeah, Gal, tell us your story.

Gal 04:09

Yeah, it also… for me, it started quite organically. But yeah, I’m leading Business Development and Ecosystem for deBridge, ensuring we grow from a position of strength. For me, around 2016 is when it started.

Actually, a bit earlier. I was a bit of a gaming nerd back then. I had a nice gaming PC, and I was fascinated by the idea that I could just generate these Dogecoins from my computer. Yes, it started like that. It was just a curiosity, and then I started to dive a bit deeper into it during my studies, meeting a lot of people in this space, getting to know people, and learning about it. And yeah, I met Alex back in 2018 in Singapore at a Binance event.

Yes, that’s how these relationships start to build, and everything snowballed, and now I’m working on an incredible infrastructure.

Radu 05:06

Oh, it looks like you guys are real OGs. I don’t know many people who started in 2016 and even before that. So, I guess, in crypto years, that makes you like one of the very senior guys of blockchain technology.

Okay, thank you for your presentation. Now, I’d like to start with a short presentation of your project, deBridge. Please correct me if I’m wrong, but to my understanding and working with you guys, deBridge presents itself as both a messaging protocol and a cross-chain interoperability platform that can be used by developers, builders, to build cross-chain applications.

I’d like to know more details about what that means and how deBridge helps people build applications cross-chain because I’m a big fan of the cross-chain concept, and I think that’s what’s required for Web3 to actually get traction.

So I’m really interested to see how you guys are seeing this and how you tackle it.

Blockchain Interoperability and How deBridge Helps Developers Build Cross-Chain Apps

Alex 06:17

Yeah, so we are basically solving the problem of interoperability, right? Because it’s been super challenging, especially considering all the security incidents and hacks happening in this space, and our goal is to enable high performance and secure interoperability, right?

And deBridge… we started to work on the project in early 2021, when the team participated in the Chainlink Global Hackathon, and we won out of more than 140 teams worldwide. And yeah, after winning the hackathon, we got a lot of attention from the community and various VCs, and we started to work on bridging infrastructure even before everybody started to talk about bridges, before it became mainstream.

But yeah, from the beginning, we’ve been focused on solving the interoperability challenge. Because, in general, this problem globally consists of three main verticals, right, and we are probably the only team trying to differentiate all these verticals instead of solving them simultaneously.

The first one, which is quite important, is asset custody, right? When we have an asset on one chain, we may need to have a synthetic representation of this asset on another.

And in an ideal scenario, asset custody should be enabled by canonical bridges such as the Arbitrum Bridge or Optimism Bridge, which can be trust-minimized, right? They can be trustless, meaning that they don’t introduce any additional trust assumptions and they don’t need an additional validation layer.

But the problem with classical trustless interoperability solutions is that they are not scalable. They mainly connect only two blockchain ecosystems, let’s say a Layer 1 and a Layer 2. With custody, you should also think about who is in charge of asset issuance. When we have native USDC on different chains, we have Circle as a native custodian who is in charge of the issuance of these assets. But in many cases, native custody is not available, because there are some Layer 1s that are not interconnected with Ethereum, and in those cases, we should solve the asset custody challenge.

And at deBridge, we proposed a framework to Uniswap governance at the beginning of this year that allows the aggregation of multiple bridging infrastructures, to avoid any single-infrastructure dependency and make asset custody way more secure. So yes, custody is the first challenge: how you create or mint the assets on another chain.

The second vertical that is super important is the transfer of cross-chain messages or authenticated data, where you need to, let’s say, connect two smart contracts on different chains and allow them to exchange information.

But it’s very important to emphasize here that the word message assumes authentication: the receiver should know who the sender was, right? It’s like receiving a message on Telegram. When we receive a message from someone, we can identify the person using the handle, the Telegram ID. The same goes for the cross-chain space: when a smart contract receives a message, it should be able to identify the address of the sender and the source chain. And that’s what we are solving at deBridge.

From the very beginning, deBridge has been a cross-chain messaging infrastructure, and our focus is on high-performance interoperability. We have quite a unique design where we can process basically an unlimited number of messages per second, because we have off-chain validation: our validators don’t need to broadcast any transactions, they just produce cryptographic signatures, using which anyone can deliver or execute the message on the destination chain.
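
The off-chain validation flow Alex describes can be sketched as a toy model in Python. This is an editorial illustration only: the function names, the mock signing scheme, and the chain IDs are assumptions for the sketch, not deBridge’s actual API or cryptography.

```python
import hashlib

def message_id(src_chain: int, sender: str, dst_chain: int, payload: bytes) -> str:
    """An authenticated message commits to the sender and the source chain,
    so the receiving contract knows exactly who sent it and from where."""
    preimage = f"{src_chain}:{sender}:{dst_chain}:".encode() + payload
    return hashlib.sha256(preimage).hexdigest()

def sign(validator_key: str, msg_id: str) -> str:
    """Toy stand-in for a cryptographic signature over the message id."""
    return hashlib.sha256(f"{validator_key}|{msg_id}".encode()).hexdigest()

def can_execute(signatures: dict, validators: dict, msg_id: str) -> bool:
    """Anyone can deliver the message once >= 2/3 of validators have
    signed it off-chain; no validator ever broadcasts a transaction."""
    valid = sum(
        1 for name, key in validators.items()
        if signatures.get(name) == sign(key, msg_id)
    )
    return 3 * valid >= 2 * len(validators)

# A message from a contract on one chain to a program on another:
validators = {f"v{i}": f"secret-{i}" for i in range(12)}
mid = message_id(137, "0xSenderContract", 7565164, b"hello")
sigs = {name: sign(key, mid) for name, key in list(validators.items())[:8]}
print(can_execute(sigs, validators, mid))  # True: 8 of 12 signatures is exactly 2/3
```

The point of the sketch is that validation is just signature collection: the heavy on-chain work (delivering the message) can be done by any third party once the quorum of signatures exists.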

And messaging is one of the pillars or foundations of interoperability. But messaging itself is insufficient, because the whole DeFi space is built on principles of composable liquidity. People and projects rarely need to transfer data on its own; mainly, everyone needs to transfer liquidity together with data, or even purely liquidity.

And for this one, for cross-chain trading or value transfers, we have a third vertical and a third solution. Yeah, Gal, maybe you can tell a bit more about how we are solving this.

Gal 10:51

Yes, so that’s what DLN is for. It’s our flagship product, and it probably has the most unique approach to value transfers.

So, the idea of value transfers is moving an asset that already exists on one chain and trading it for or exchanging it for an asset that already exists on another chain in the fastest and cheapest way possible.

And that’s what DLN is solving with its high-performance design. From a high level, if we look at the whole DeFi space right now and the liquidity fragmentation issue, we see that every ecosystem is isolated from the others. This creates a lot of bottlenecks and friction when it comes to user onboarding, liquidity onboarding, etc. And that’s what DLN is really solving: creating essentially a global liquidity engine where it doesn’t matter where the user is, there’s always liquidity on demand, accessible from somewhere.

So we’re building this because we see this issue, and we noticed the biggest bottleneck is liquidity pools, where the price discovery happens on-chain. Instead, like Alex hinted, we’re taking a completely different approach: the matching happens off-chain, and the settlement happens on-chain.

So it’s like an off-chain order book with on-chain settlement, and essentially, the beauty of it is a completely zero-TVL approach, where there is no passively locked liquidity. It has many benefits, such as exact output on the destination chain, zero slippage, and near-instant settlement, and it’s much more secure and scalable.
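
Gal’s “off-chain order book with on-chain settlement” can be pictured with a minimal sketch. The field names and numbers here are hypothetical, chosen only to illustrate how takers scan orders off-chain and fulfill only the profitable ones.

```python
from dataclasses import dataclass

@dataclass
class Order:
    """A cross-chain limit order: give `amount_in` on the source chain,
    receive exactly `amount_out` on the destination chain."""
    give_chain: str
    give_token: str
    amount_in: float
    take_chain: str
    take_token: str
    amount_out: float  # exact output: the user's received amount never slips

def spread(order: Order) -> float:
    """A taker's gross profit if it fulfills the order (ignoring gas)."""
    return order.amount_in - order.amount_out

def profitable_orders(book: list, min_spread: float) -> list:
    """Off-chain matching: takers scan the book and pick orders whose
    spread covers their costs; only the settlement touches the chain."""
    return [o for o in book if spread(o) >= min_spread]

book = [
    Order("Polygon", "USDC", 100.0, "Solana", "USDC", 99.0),
    Order("Ethereum", "USDC", 1000.0, "Arbitrum", "USDC", 999.8),
]
print([spread(o) for o in profitable_orders(book, min_spread=0.5)])  # [1.0]
```

Note that no capital sits in the book itself: an unfilled order is just data, which is what makes the zero-TVL claim possible.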

So it’s secure because the attack surface is minimized; the risk is taken off these multi-million-dollar liquidity pools where we’ve seen all these bridge hacks happen over time. And it’s more scalable because, with liquidity-pool-based models, as your order size gets close to the size of the liquidity pool, slippage increases exponentially. And that’s not really favorable for capital efficiency, right?

So with us, there’s no such limitation, because liquidity is used very capital-efficiently and we’re not capped at the size of a liquidity pool. And yeah, because of that, we’ve had a lot of institutional interest, and I can touch on that a bit later in terms of alpha.

Radu 13:29

Yeah, just to clarify for our listeners, I would like to ask who the liquidity providers are in the case of DLN and how that works.

The Shift from Liquidity-Based Models to Liquidity On-Demand

Alex 13:42

Yeah, that’s a good question, actually.

So at deBridge, we are shifting the paradigm from the liquidity-pool-based model to a just-in-time, or liquidity-on-demand, approach.

So you can think about DLN as a cross-chain order book of intents, right? Intents are sort of a novel narrative in crypto, where you’re not just initiating a transaction, you’re declaring an intent of what you want to achieve.

Let’s say you want to trade 100 USDC on Polygon for 99 USDC on Solana, right? You create this intent, and you wait until someone is willing to fulfill it. So an intent is basically a limit order. And in our case, users are creating these intents, or limit orders, on the DLN infrastructure.

On the other side, we have private market makers. Basically, anyone can be a market maker, any liquidity owner, but these are mostly parties running automated bots, and if they see that your limit order is profitable for them, they’re incentivized to fulfill it as soon as possible. The first one to provide you with, let’s say, 99 USDC on Solana will be able to send a cross-chain message to unlock the 100 USDC on Polygon and earn the spread of 1 USDC in this example. And there are a few big advantages here.
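
Putting the 100 USDC → 99 USDC example into a toy settlement flow might look like this. The class and function names are invented for illustration and are not deBridge’s contract interface; the point is the ordering: lock on the source chain, pay out on the destination chain, then a cross-chain message releases the locked funds to the taker.

```python
class SourceEscrow:
    """Source-chain side: holds the maker's funds until a fulfillment
    message arrives from the destination chain."""
    def __init__(self):
        self.locked = {}       # order_id -> (maker, amount)
        self.unlocked_to = {}  # order_id -> (taker, amount)

    def create_order(self, order_id, maker, amount):
        self.locked[order_id] = (maker, amount)

    def claim(self, order_id, taker, fulfillment_proofs):
        # The cross-chain message proves the taker paid out on the
        # destination chain; only then is the locked input released.
        if fulfillment_proofs.get(order_id) == taker and order_id in self.locked:
            _, amount = self.locked.pop(order_id)
            self.unlocked_to[order_id] = (taker, amount)

# Destination-chain fulfillments, recorded as messages back to the source.
fulfillments = {}

def fulfill(order_id, taker, recipient, amount_out, balances):
    """First taker to pay `amount_out` to the user wins the order."""
    balances[recipient] = balances.get(recipient, 0) + amount_out
    balances[taker] -= amount_out
    fulfillments[order_id] = taker  # the message that unlocks the source side

escrow = SourceEscrow()
escrow.create_order("o1", maker="alice", amount=100)   # 100 USDC locked on Polygon
solana_balances = {"mm": 500}
fulfill("o1", taker="mm", recipient="alice", amount_out=99, balances=solana_balances)
escrow.claim("o1", taker="mm", fulfillment_proofs=fulfillments)
print(escrow.unlocked_to["o1"])  # ('mm', 100): the taker earns the 1 USDC spread
```

No pooled capital is involved at any step; the market maker’s own balance is committed only at the moment of fulfillment.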

First, that is just-in-time liquidity, right? You don’t need to incentivize market makers continuously over time like all the classical bridges do, because all the technologies built on liquidity pools have to pay interest to LPs. And these market interest rates are super high: imagine that US Treasuries are paying 5.5% annually, right? Bridges are super risky ventures, especially with all the security risks, so LPs would need to be paid at least 15% per year.

The capital is not used efficiently, because classical pool-based bridges distribute these incentives continuously in time when transfers, in fact, happen discontinuously, right? You can have a period when there are no transfers at all, but you still have to distribute incentives. We are changing this model drastically by introducing this sort of P2P approach with the order book of intents, where market makers provide just-in-time liquidity and are incentivized just in time as well, directly from the spread of the limit order.

And if you want to additionally incentivize them, we can also do that, deterministically, you know, because we can just rebate the spread. So, let’s say the spread was 0.1%; then we can distribute incentives based on the volume that a specific market maker settled. That makes incentivization way more efficient. We even suggested an incentivization program to the Optimism community and got these grants approved by their grant committee, because this allows the cost of transferring liquidity to and from the ecosystem to be near zero, right? That creates a big incentive for builders and users to leverage the ecosystem because it facilitates the onboarding of end users.

So yeah, this is how it’s designed in our case, with this zero-TVL approach.

Radu 17:17

This is actually really interesting. I mean, this looks like the way it should be done. It’s even tackling the security issue by working directly on the motivation of possible attackers, because if there’s no liquidity pool, there is less motivation to do anything.

It also seems like the next step forward in interoperability, having order books and just-in-time liquidity. And now I’m wondering, why doesn’t everyone do it like that?

Alex 17:53

At some point!

But liquidity-wise, we’re already there, you know, because you already see market makers settling and fulfilling 6- and 7-figure trades, and the interesting thing here is that the price discovery happens off-chain. No AMM models, no on-chain order books.

Because if there is any systemic risk happening in any of the supported ecosystems, then an asset, let’s say USDC, on that ecosystem, let’s say Fantom, can be traded at a decent discount to USDC on another chain.

So, we allow the creation of an open market between any two arbitrary assets deployed on any two different blockchains, right, with fair price discovery. This price discovery mechanism consistently takes into account any risks that may arise.

And that’s not something that would be possible with AMM models or liquidity pools. 

Radu 18:54

Yeah, it totally makes sense. Okay, thank you, guys, that was very educational for me. Now I have another question. We’ve seen that, and even discussed earlier that there’s quite a wide range of projects or platforms that are trying to provide cross-chain solutions or bridging solutions to builders out there, and we can see that some of them are decentralized, others are less decentralized. And I wanted to get your feeling about decentralization when it comes to cross-chain, the cross-chain concept.

How important would you say decentralization is, and what would be the advantages of it compared to a solution that is fully controlled?

Alex 19:49

Yeah, Gal, would you like to start? And I can add or…

The Importance of Decentralization for Cross-Chain Applications

Gal 19:54

Yeah, so, in terms of our architecture, there are different aspects to it, but the decentralization of a bridge is probably just as important as, or even more important than, the decentralization of a Layer 1, because it’s a settlement layer between chains. So from the foundations, starting with the way capital is handled, we’re taking a more decentralized approach, because we see that just the fact that a few smart contracts are holding multi-million dollars in liquidity is already a form of centralization.

That’s why we’re taking this peer-to-peer approach where market makers and takers, like order creators and the counterparty, they’re completely, you know, non-custodial, they don’t have to, you know, possibly lock their liquidity anywhere, it’s just when they need to create an order and fulfill an order, and that’s it.

And since we’re non-canonical, we require a validation layer to sign every message. So we have validators who are signing every message, making sure that it’s authenticated.

And yeah, those are my main points on that. I think Alex has some insights, too.

Alex 21:09

Yeah, because here, it’s also important to differentiate canonical bridges from interoperability solutions, or so-called non-canonical bridges. People are using the word bridge, but this word is super confusing because, as I mentioned, there are three verticals: asset custody, messaging, and cross-chain trading. They are all called bridges, even though they should be solved differently.

So decentralization is really important for messaging, right, and for asset custody. But, in general, everything is built on top of messaging, and non-canonical bridges cannot be fully trust-minimized, right? They cannot avoid trust assumptions, meaning that they cannot avoid validation, where there have to be some participants that create a consensus; these participants are validators. Some teams are calling them oracles and relayers, some are calling them guardians, but, in fact, these are some entities, right, or people, who are in charge of validation.

That’s why one of the main ways to compare cross-chain interoperability solutions or messaging layers is by comparing how their validation layer is designed, and, of course, decentralization is one of the most important aspects of it.

At deBridge, we are trying to build an infrastructure that is truly decentralized, not only in terms of validation but also in terms of governance: how the protocol and infrastructure will be governed and controlled, right? And here, we have quite a unique design of the validation layer. Right now, we have 12 validators whom we assigned to bootstrap the infrastructure, based on their performance in the testnet that we ran back in 2021, with the idea that governance, the future token holders, will be able to decide how many validators the infrastructure should have.

And second, who should be validators, right? Moreover, on top of increasing the number of validators, we’re also taking different steps to increase decentralization. One of the interesting ones is that deBridge is probably the only infrastructure that has announced delegated staking and slashing. So validators will post financial guarantees, they will have a stake, right, and that stake will act as insurance.

And the interesting thing is that this is not like classic proof of stake; it’s not a stake posted in governance tokens. It’s a stake posted in liquid assets such as ETH and USDC, right? Because as soon as validators start to receive part of the protocol fees, they will also start bearing financial guarantees and financial risks, right?

And the main assumption that deBridge users and integrators bear is that validators will never collude, right? That at least two-thirds will not decide to sign some forged message. But in fact, any attempt to forge is insured, right? If the total stake of validators is, let’s say, $40 million, then if you’re a user sending a cross-chain message, you just need to be sure that the value of your message does not exceed $40 million, right? Because, you know, that value is always insured by the collateral of the validators. And if the value of the message is bigger, it’s just split into multiple messages to be sure that each one is insured.
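
The insurance rule Alex describes — never send a single message worth more than the validators’ total slashable stake — can be applied mechanically. The helper below is a hypothetical illustration, not deBridge code; it just splits a transfer into chunks that each stay under the collateral cap.

```python
def split_by_insurance(value, total_stake):
    """Split a transfer so that each message's value is fully covered by
    the validators' slashable collateral (the insurance fund)."""
    if total_stake <= 0:
        raise ValueError("no collateral, no insurance")
    parts = []
    remaining = value
    while remaining > 0:
        chunk = min(remaining, total_stake)  # never exceed the stake per message
        parts.append(chunk)
        remaining -= chunk
    return parts

# A $100M transfer against a $40M total validator stake -> 3 insured messages
print(split_by_insurance(100_000_000, 40_000_000))  # [40000000, 40000000, 20000000]
```

Each chunk is individually insured: if validators colluded to forge one message, slashing their stake would make the affected users whole.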

So, that’s how decentralization is enabled. Moreover, on top of these financial guarantees, we’re also exploring ways to combine our validation with state proofs or storage proofs, which are developed by teams like Herodotus and Lagrange, because for some chains, for some specific directions, you can pass a message and also require validating the existence of the message on the source chain by posting a ZK proof, right?

It will not be available for any-to-any directions, but it’s something that will be available for some specific directions. And that’s quite cool because it allows us to increase security even further. I’m personally quite excited about these state and storage proofs, which provide an even better level of decentralization.

And we actually just posted an interesting article where we outlined all the steps of how we are ensuring the decentralization of the deBridge infrastructure. We’ve been super surprised that LayerZero has two validators, an Oracle and a Relayer, and they just replaced one of the validators with Google, which is a centralized entity, a US corporation, right? So basically, anyone using their default validation settings is bearing the risk of being censored, right, or of having messages forged by the company.

And I think that decentralization is super important because, if you take a look at the overall token distribution or cap tables of the cross-chain infrastructures, there are not many candidates for truly decentralized governance, right? Just as there is no rival to Ethereum in terms of decentralization if you compare it to all the other Layer 1s. Our goal at deBridge is to create technology that can be as decentralized as the most decentralized Layer 1s.

Radu 27:09

Oh, that’s a very thorough explanation, and I totally get it. So I didn’t even know about the GCP and LayerZero collaboration, but you guys seem to be thinking about all the important aspects of interoperability and even keeping the Web3 mantra alive rather than just doing partnerships.

Okay, great! So let’s… let’s talk about what you guys are working on at the moment and what are the plans for the future. I know you just announced a couple of integrations. Maybe tell us a little bit about that, and then, if you feel like giving away some of your… of your future plans or even some alphas, that would be awesome.

deBridge Alphas

Gal 28:08

Yeah, maybe I can shed some light on that. I think one of the highlights has been the Solflare integration with MetaMask that we had recently, when MetaMask Snaps came out. It’s been a really big push for EVM users to get new capabilities, and one of those is the ability to seamlessly move assets to, and interact with, Dapps on Solana directly from a MetaMask wallet.

And that’s what they have been pioneering, and they chose us as the go-to bridge for MetaMask users into Solana, which is huge; MetaMask has over 30 million users. And yeah, they really liked how we’re completely native-to-native, instant, and very capital efficient, and so far, we’ve been getting good feedback.

Also, we got featured in Jupiter’s bridge comparison tool. And in terms of things in the near pipeline, I can’t reveal too much, but in terms of liquidity, we’re getting to a point now where we’ve secured deals where we can easily fulfill very large orders, and it’s putting us in a position where we’re ready to scale very strongly.

We’re also securing and discussing deals around the areas that are being underserved in cross-chain and that we really have the capability to serve, things like payments and institutional capital.

So, because we have exact-output, instant cross-chain capabilities, we’ve gained a lot of interest from payment providers, and because we’re not limited by scalability bottlenecks, we’ve also gotten a lot of interest from institutions and OTC desks who want to do non-custodial trading. But yeah, maybe Alex, you could add some alpha around recent features.

Alex 30:12

Yeah, absolutely! First, I would like to mention what you, Gal, brought up: the ability to power institutional trades, right? Because in DLN, when you create a limit order, you can optionally specify who should be the counterparty, who should be your market maker, you know? And that allows performing fully non-custodial OTC deals. So, if you want to exchange, I don’t know, $5 million of one token on one chain for some asset on another chain, you don’t need to trust OTC providers anymore. You don’t need to bear custodial risks. You just create a DLN limit order, and you specify the address of the counterparty. And that also allows creating fully compliant institutional infrastructure, you know, because, let’s say, if you’re a big VC fund and you do a trade, you don’t want your trade settled through a liquidity pool or by some unknown third party. You want to know exactly who you’re dealing with. In this case, you can say, okay, I want Wintermute to be my counterparty, or Amber Group should settle this deal. And you will always know that the specific liquidity you get is coming from a well-known market maker or OTC provider. So that’s the first thing.

In terms of the secret features we are working on, with the MetaMask release, we’ve actually already shipped some secret features that are used by the Solflare guys, and they’re doing an amazing job, so a big shout-out to them because they’re true innovators.

The feature is that with DLN, you can not only do the trade, but you can also attach call data, instructions that should be executed together with your trade. Because, normally, when you move liquidity, you want to do something with it, right? You want to buy NFTs, or you want to supply this liquidity to Aave. With DLN, you can optionally specify these instructions to be executed, so that when the market maker fulfills your order, they are also obliged to execute this call data. It’s a super powerful primitive that no one has done yet, because it enables very high-performance cross-chain interactions, right, which abstract away the entire infrastructure stack. The user can just open the Dapp and connect any wallet they have; even if the Dapp is on Solana and the user has MetaMask with liquidity on Polygon, the user does not need to think about this anymore. They connect the wallet, click one button, and in seconds their liquidity is supplied on the Solana side. And since we don’t use AMM models or liquidity pools, we can also provide exact output. So if the user wants to receive, let’s say, 100 USDC, they just specify this exact amount, right? Our infrastructure will automatically tell them how much of the asset they should sell on the source chain to get exactly 100 USDC.
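
The two features described here, attached instructions and exact output, can be modeled with a short sketch. Everything below is illustrative: the fee structure (a taker spread plus a protocol fee) and all names are assumptions for the example, not DLN’s real parameters.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class OrderWithHook:
    amount_out: float                    # exact amount the user will receive
    on_fulfill: Callable[[float], str]   # instructions executed together with the fill

def required_input(amount_out, taker_spread, protocol_fee):
    """Work backwards from the exact desired output: the user sells just
    enough on the source chain to cover the output, the taker's spread,
    and the protocol fee (illustrative fee model)."""
    return amount_out * (1 + taker_spread) / (1 - protocol_fee)

# User wants exactly 100 USDC supplied to a lending market on the destination chain:
order = OrderWithHook(
    amount_out=100.0,
    on_fulfill=lambda amt: f"supply {amt} USDC to lending market",
)
needed = required_input(order.amount_out, taker_spread=0.001, protocol_fee=0.0004)
print(round(needed, 4))                    # 100.1401
print(order.on_fulfill(order.amount_out))  # the hook runs only when the order is filled
```

Because the output is fixed by the order rather than by a pool curve, the uncertainty is moved entirely to the input side, which the quoting infrastructure can compute up front.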

And that’s not something that is feasible with the classical bridges based on liquidity pools. This is also a super cool use case for cross-chain payments because, as you probably know, Visa chose Solana for settlements, since Solana has the best finality-to-decentralization ratio; transactions on Solana get finalized within seconds. Cross-chain infrastructure such as DLN combines very well with this sort of use case, because now any cross-chain payment can be settled in seconds on Solana.

So yeah, these are a couple of features that we are super excited about, and many projects are already integrating this functionality, so it’s getting really good adoption. Moving forward, we’ll first be doubling down on decentralization. As I mentioned, in an article we published recently on our blog, we laid out all the points on how we achieve decentralization and how we’ll improve it further.

Plus, we’re preparing something for DLN v2, which will be even more capital efficient in terms of price discovery and overall capital use, because with DLN we don’t need TVL, right? With just $5 million or $10 million of total liquidity across all the market makers and all the chains, we can actually process hundreds of millions of dollars of trading volume per day. That’s not something other bridges can do: if you look at, let’s say, Stargate, they have around $386 million locked, and the trading volume is around $35 million per day. And the interest you have to pay on that liquidity is quite high because rates are high right now, especially for bridges. In our case, we don’t need to lock this liquidity, so we are trying to build an economy that works and can scale horizontally super efficiently.
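The capital-efficiency argument above is simple turnover arithmetic; a back-of-the-envelope sketch using the round, approximate numbers from the talk (the $200M zero-TVL volume figure is an assumed placeholder for "hundreds of millions"):

```python
def daily_turnover(volume_per_day: float, working_capital: float) -> float:
    """How many times each dollar of working capital is recycled per day."""
    return volume_per_day / working_capital

# Pool-based bridge (approximate figures quoted in the talk for Stargate):
# capital sits locked in pools, so each dollar is used well under once a day.
pool_turnover = daily_turnover(35e6, 386e6)

# Zero-TVL model: fills settle in seconds to minutes and the same capital is
# reused, so a small float can back a much larger daily volume.
zero_tvl_turnover = daily_turnover(200e6, 10e6)
```

With these inputs, the pool model turns its capital over roughly 0.09 times per day, while the zero-TVL model turns its float over about 20 times per day, which is the gap the speaker is pointing at.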

So yeah, these are some of the alpha leaks for you guys.

Radu 35:40

Yeah, it definitely sounds exciting; a lot of exciting things are already happening, and a lot more seems to be coming for you guys. It’s an amazing job! I have to admit, I was not that familiar with the solution, but now I must say I’m quite impressed by what you’ve managed to build at deBridge.

And now, as we’re coming up to the end of our webinar today, I have one question that I usually ask everyone, because we’ve been talking with a lot of founders and industry experts, and I want to see if there’s a pattern in their opinions.

So, the question is more philosophical, or related to the broader space. I want to know from each of you what you think the key elements are for Web3 to reach the point where we become mainstream in terms of adoption.

Because, in my opinion, we’re still far from that. Or maybe you think we’re already there? If not, I’d like to hear some of the key elements you think would be required for Web3 to become mainstream, so that anyone who uses social media or banking also uses Web3 applications.

The Key to Web3 Becoming Mainstream

Gal 37:10

Yeah, I have some points on this. I think we’re still quite far from where we could be. The technology is amazing, but there’s so much more room. Two points come to mind.

So, one is the foundation. Building a strong, high-throughput, highly scalable foundation is important to actually open the doors and enable scalability. That’s why we’re taking a zero-TVL approach and staying highly decentralized, to ensure that we’re prepared for that shift. So that’s one aspect.

The other aspect is, I think, improving the UX, the actual experience of users interacting with Web3. That’s what Alex hinted at with one of our recent features: it doesn’t matter where an application is deployed. They can deploy on Solana, or wherever suits them best, and you can still enable this whole global-accessibility concept: it doesn’t matter where a user is, or where their assets are, they can interact with an application in one click.

And I think these kinds of user experiences are what will enable real hyper-scalability in the Web3 space.

Radu 38:33

That’s nice; UX seems to be a motif that keeps coming up, and I fully agree. To me, it shouldn’t even be visible to the end user that the technology behind it is blockchain or decentralized. It should be just as easy; the point is to abstract away all the complexities.

Yeah, but for that, exactly as you said, we need the entire stack of infrastructure services behind it to make it easier for app developers to do that. Let’s hear Alex’s opinion as well.

Alex 39:10

Yeah, I think we already have the infrastructure, right? We have so many Layer 1s, we have so many Layer 2s, and now we have finally managed to solve the interoperability challenge.

In my opinion, the infrastructure is live, and I think at deBridge we did a really great job in this regard, being one of the only infrastructures that has never had any security problems or disclosed vulnerabilities.

Plus, we are secure by design with the zero-TVL approach. And I think that to enable global adoption, it’s time to focus on building apps instead of building infrastructure, right? Because the infrastructure is quite ready, it’s already in place, and now I’m really excited to see what applications and projects will be built for end users, consumer applications, right?

And Friend Tech is a really great example; it’s the first-ever app that generates more revenue than the infrastructure it’s built on, right? Because the profit of Friend Tech is bigger than the profit of the Base Layer 2 itself.

And it’s quite exciting. The examples I want to see are not only apps deployed on one specific Layer 2 or chain, but also cross-chain apps, where people and teams can innovate to bring more users into Web3. I think this sort of paradigm shift is already happening, and we’ll gradually see institutions and big social networks switching to Web3 infrastructure, just because it’s way more efficient and transparent, right? You don’t have any third-party bottlenecks, you can actually resolve any sort of technological challenge quite easily, and you don’t need to ask permission to innovate.

That’s what excites me most, but yeah, we’ll see. I think it’s a matter of a few years before we get a new wave of really big user adoption, in about three…

Radu 41:24

Okay, so basically, the take is that right now we need to focus on the application layer of Web3, building a lot more applications that attract users and make revenue, rather than focusing on the underlying infrastructure, which is already at the level where it can be built on.

Alex 41:45

Yeah, we at deBridge and Bware should focus on supporting teams building consumer apps, right, because we’ve already built the infrastructure; we’ve been doing this for a few years already. But those who are just starting to innovate and build something, I think they should be focusing on applications.

It’s probably a bit late to build foundational infrastructure, such as blockchains or Layer 2s. It’s already in place, and very soon we’ll see that any roll-up can be deployed with a few clicks.

But what really should be built are the apps that get users excited, that make users use this entire infrastructure in a way that they don’t need to think about it. They don’t need to think about blockchains, they don’t need to think about bridges; they just interact natively with the application, and all the infrastructure interaction happens under the hood, in the background.

Yeah, so that’s what I foresee, and I’m looking forward to seeing as many apps as possible. Friend Tech is just one example, but I hope we’ll see many more in the near future.

Radu 42:59

I totally agree! Okay, guys, thank you very much for joining me today and providing a bit of clarity about what you’re doing and, even more importantly, about what interoperability means. I hope you had a good time. For me, it was very useful, and I learned a lot. So yeah, thank you for being here, and I hope to see you soon at the next conference when you’re less busy.

Alex 43:32

Yeah, absolutely! Thanks a lot for hosting us this Friday. I’m excited to work with Bware again. Big shout-out to you guys because you’ve been one of the early validators in deBridge, so we’ll tell everyone, ‘Look, the Bware guys have been here from the very beginning.’ We really appreciate your strong support.

Radu 43:52

Just happy to be able to be part of a little bit of the journey. So yeah, thank you for joining us.

Gal 44:01

Thank you!
