George Samman and Antony Lewis discuss the state of blockchain, deep diving into scalability and emerging solutions.

At the end of June, I took part in an interesting Q&A with Antony Lewis, a Singapore-based cryptocurrency and blockchain consultant. He writes a blog about bitcoin and blockchain, and I was pleased when he granted permission for me to reproduce the interview right here on BankNXT for your pleasure.

It hinges on a report I co-authored with Sigrid Seibold of KPMG, in which we explore the basics behind blockchain, our key observations on whether it’s right for your organisation, and what the road ahead looks like. The report is called Consensus: Immutable agreement for the internet of value, and you can find it right here. Without further ado, here’s the interview I did with Antony. Please leave comments below!

George, it’s a pleasure to chat with you. The KPMG report ‘Consensus: Immutable agreement for the internet of value’ you co-authored was an interesting read, and shone a light on some of the challenges facing private blockchains and distributed ledgers. How would you summarise the findings?

One of the key findings is that getting consensus right is really hard, and some of the brightest minds in the space are coming to terms with this and reexamining a lot of the work they’ve done or researched. Trying to re-model existing blockchain technology turns out not to be the answer.

When you say “getting consensus right”, what do you mean? Do you mean multiple databases all reaching the same state quickly, or do you mean something else?

Consensus has been around for as long as human beings have formed societies and needed to come to a state of agreement without necessarily trusting each other. For the purposes of this interview, we can refer to consensus computing for distributed systems. In this context, it’s a way for nodes to agree on the validity of a transaction and to update the ledger with a coherent set of confirmed facts.
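
To make that concrete, here’s a toy sketch of quorum-style agreement (purely illustrative, with invented node and ledger classes, and not any production consensus protocol) in which a transaction only enters the shared ledger once a majority of nodes judge it valid:

```python
# Illustrative sketch only: a toy majority vote, not any real consensus protocol.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    def validate(self, tx: dict) -> bool:
        # Each node applies its own validity rules (signatures, balances, etc.)
        return tx.get("amount", 0) > 0

@dataclass
class Ledger:
    confirmed: list = field(default_factory=list)

def reach_consensus(nodes: list[Node], tx: dict, ledger: Ledger) -> bool:
    """Append tx to the shared ledger only if a majority of nodes agree it is valid."""
    votes = sum(node.validate(tx) for node in nodes)
    if votes > len(nodes) // 2:          # simple majority quorum
        ledger.confirmed.append(tx)      # every honest node applies the same update
        return True
    return False

nodes = [Node("bank_a"), Node("bank_b"), Node("bank_c")]
ledger = Ledger()
print(reach_consensus(nodes, {"from": "A", "to": "B", "amount": 10}, ledger))  # True
```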

How would you describe the problems around achieving consensus?

Keeping data in sync and ordering transactions are what consensus mechanisms are supposed to do. The main problem that’s being researched is around network stability and latency.

Why is getting consensus right so difficult, and why is the consensus methodology important?

Most of the material on the subject of consensus comes from academic papers and applications in other industries, such as air traffic control or stabilising airplanes. Those challenges are very different from the consensus challenges in capital markets. This hasn’t been done before, and the issues are different.

For example, ordering becomes really important when you’re dealing with stock market order books. If multiple people are bidding for a stock at the same price, who’s the first one to get that price? An issue of fairness also comes into play, which some blockchain systems suffer from because of how they’re attempting to achieve consensus. Leader-based consensus systems have this problem because the leader selects the ordering of data, so you end up with a centralisation of control, which is what we’re trying to avoid. So depending on the use case, the consensus mechanisms themselves become extremely important.
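
A toy sketch of the fairness problem described here (hypothetical, and deliberately exaggerated): in a leader-based design, the leader alone decides the sequence that every follower then replicates, so two bids at the same price can be ordered entirely at its discretion.

```python
# Toy illustration (not any specific product): in leader-based consensus,
# the elected leader alone decides the ordering that every follower replicates.
import random

def leader_orders(pending_bids):
    """The leader picks the sequence; followers simply apply it.
    Two bids at the same price can be ordered however the leader likes,
    which is where the fairness concern comes from."""
    sequence = list(pending_bids)
    random.shuffle(sequence)             # the leader's discretion, exaggerated
    return [(i + 1, bid) for i, bid in enumerate(sequence)]

bids = [("bank_a", 100.0), ("bank_b", 100.0)]  # same price, submitted 'simultaneously'
for seq_no, bid in leader_orders(bids):
    print(seq_no, bid)
```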

Furthermore, with certain consensus systems, it turns out that there is a maximum number of nodes you can have before the system breaks. This is certainly an additional complexity if you need a lot of nodes in a network where parties don’t trust each other, yet want to have equivalent write-access to the same ledger.
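
One commonly cited reason for such limits is the cost profile of classical Byzantine fault tolerant protocols such as PBFT, which need n = 3f + 1 nodes to tolerate f faulty ones, and whose all-to-all voting phases generate on the order of n² messages per round. A quick back-of-the-envelope calculation shows how fast that grows:

```python
# Back-of-the-envelope only: why classical BFT consensus gets expensive as nodes grow.
def pbft_sizing(f: int) -> dict:
    """PBFT-style protocols tolerate f faulty nodes with n = 3f + 1 nodes,
    and their all-to-all voting phases cost roughly n**2 messages per round."""
    n = 3 * f + 1
    return {"faulty_tolerated": f, "nodes_required": n, "messages_per_round": n ** 2}

for f in (1, 10, 33):
    print(pbft_sizing(f))
# f=1  ->   4 nodes,     ~16 messages per round
# f=33 -> 100 nodes, ~10,000 messages per round: latency and stability suffer
```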

Getting consensus right is critical, particularly when nodes can be located all over the world, and network latency adds another layer of complexity to system stabilisation.

Point taken on pre-trade order books. I suspect that’s why this isn’t an area of focus any more for private blockchain vendors to financial service companies.
In terms of node distribution or decentralisation, I don’t see any reason why nodes in a high-throughput distributed ledger will end up scattered across the world. Although we currently see geographical distribution of nodes with bitcoin, I think any successful distributed ledger for the traditional financial industry will have nodes clustered in the same data centres, in the same building, where a number of banks rent hardware sitting physically next to each other, connected with short cables. This should help to reduce some of the latency issues. Of course, this will be replicated to other data centres as redundant backups in case the main one fails.
To summarise, the ‘distributed’ in ‘distributed ledger technology’ will be ownership distribution rather than geographic distribution.

That makes sense. Although, if you want true distribution of the information, geographically distributing the nodes and using different cloud providers for the nodes adds an extra layer of distribution and security.

Scalability

Moving on from consensus to the concept of scalability and transaction throughput. In financial markets, a lot of tickets are printed, from the start of the process, with orders being submitted and cancelled within milliseconds, through to matched trades and eventually settlement. Clearly, you need throughput.

The problem of consensus becomes harder by orders of magnitude when dealing with the transaction volumes financial institutions generate. It’s essential for the network to be up and running all the time. Scaling to tens of thousands of transactions per second and beyond, while keeping the network up and running, is extremely difficult. This is why there aren’t many projects in production and able to do this as of today. It’s a big challenge. A general principle worth thinking about is to run two sets of consensus mechanisms, one that runs locally and one that runs globally, and make them intersect. This could be done at intervals of time in a Baby Step Giant Step (BSGS) manner.
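
As a rough illustration of that local-plus-global principle (the structure and names below are hypothetical, not any real system’s design), local clusters could agree quickly among themselves and periodically checkpoint their batches into a single global order:

```python
# Hypothetical sketch of the 'local consensus plus global consensus' idea only;
# no real system's design is implied.
from collections import defaultdict

class TwoTierLedger:
    def __init__(self, checkpoint_every: int = 100):
        self.local_batches = defaultdict(list)   # fast agreement within a region/cluster
        self.global_ledger = []                   # slower, network-wide agreement
        self.checkpoint_every = checkpoint_every
        self._since_checkpoint = 0

    def commit_locally(self, region: str, tx: dict) -> None:
        """Small, nearby clusters agree quickly among themselves."""
        self.local_batches[region].append(tx)
        self._since_checkpoint += 1
        if self._since_checkpoint >= self.checkpoint_every:
            self.checkpoint_globally()

    def checkpoint_globally(self) -> None:
        """At intervals, the locally agreed batches are merged into one global order."""
        for region in sorted(self.local_batches):   # deterministic merge rule
            self.global_ledger.extend(self.local_batches[region])
            self.local_batches[region].clear()
        self._since_checkpoint = 0
```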

Regarding scalability, the notion that you can launch a blockchain with an endless lifetime is still premature. The reasoning for this is threefold:

  • Public blockchains are supposed to be immutable forever, but they are immature and haven’t yet figured out how to deal with unknown issues (as we’ve seen recently with The DAO).
  • Technological innovation has yet to come up with a suitable solution for the transaction volume common in the financial sector, and this in turn becomes a consensus problem.
  • Configurations – you can’t deploy a single-service solution until you’ve tested and retested the correct configurations for such a vital network railroad.

I’ve seen internal ‘proofs of concept’ where a single or double node blockchain is spun up, with a user-friendly front-end. They seem to work, at a rudimentary level. Surely it’s now a case of institutionalising the technology?

Yes, you’re right – the proofs of concept are validating that the technology “may be able to live up to its promise”. They also have great marketing value. However, this is a long way off from institutionalised technology and the inherent stability necessary for it. Institutional technology needs to be battle-tested and hardened, and has to be as antifragile as possible. Hence, I believe the cycle to get the technology up to an acceptable level to justify a switchover will be longer than people think. There can be no room for mistakes, even if there are inherent benefits in the technology.

OK, aside from consensus and scalability, what are the other challenges facing the private ‘DLT’ space?

I think one of the challenges continues to be a lack of industry standards. The longer common standards and protocols go unagreed and unwritten by industry participants, the harder it becomes to integrate different solutions, and the further we move from interoperability. Is distributed ledger technology creating the next legacy system problem?

Another challenge is technical: interoperability with existing systems, and potentially between different blockchain networks. I think this correlates directly with the point above about standards and protocols. How will these ledgers interact if they’re built separately for different types of assets and then have to work with existing market infrastructure technology?

What we’re seeing is more or less the opposite of a common framework being adopted: people are trying all sorts of different things.

Sure, but that’s what you would expect with a new set of technologies, and then some sort of Darwinian selection process will occur and the best will win. At the heart of it, this seems to be an interoperability and standards discussion. APIs and inter-system communication come to mind here. It seems that a lot of the interoperability issues could be fixed by creating standards for common APIs. You then remove the privacy concerns of shared ledgers. But APIs don’t solve for immutability and decentralised control, if that’s really what’s wanted.
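
As a hypothetical illustration of what such a standard might look like (the interface and method names below are invented, not any published specification), a minimal common ledger API could be as simple as:

```python
# Hypothetical sketch of an agreed 'common ledger API'; the method names are
# invented for illustration and do not come from any published standard.
from abc import ABC, abstractmethod

class LedgerAPI(ABC):
    """If every vendor exposed the same minimal surface, integrating
    different DLT solutions would be an implementation detail, not a rebuild."""

    @abstractmethod
    def submit_transaction(self, payload: dict) -> str:
        """Submit a transaction and return its identifier."""

    @abstractmethod
    def get_transaction(self, tx_id: str) -> dict:
        """Fetch a confirmed transaction by identifier."""

    @abstractmethod
    def get_balance(self, account: str, asset: str) -> int:
        """Report holdings of an asset for an account."""
```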

Emerging solutions

An interesting takeaway is that R3 is not building a blockchain. That’s surprising to some people: one of the world’s most well known “blockchain companies” isn’t building a blockchain!

I think it’s surprising because some people thought that private “distributed ledger technology” would be the panacea for all the ills of public blockchains (total transparency of the network, mining and potential centralisation, anonymous actors, and slow transaction times). However, we have seen that this is not the case. In my opinion, R3 realised that the financial problems it aims to solve are not blockchain-compatible at the present time.

We are seeing the number of nodes in these distributed ledger networks shrink all the way down to two – ie back to bilateral communication, with “consensus” being the two nodes agreeing. This is centralisation, and the exact opposite of what blockchains try to solve for. A blockchain is supposed to offer immutable consensus. The benefits of transparency – no middleman, trustless P2P transacting, and speed – are what appealed to me about blockchains to begin with.

This also applies to replication of the data: while this can certainly be permissioned to allow certain nodes to perform certain actions, those in the network benefit by knowing that whatever business logic was supposed to happen did happen the way it was supposed to.

Well, in every system and with every tool there are tradeoffs. When you’re performing certain capital market operations, and privacy and confidentiality matter most, a distributed ledger may not be your best tool, particularly when we’re still trying to get consensus right for scaling to hundreds of thousands of transactions per second.

Corda solves the consensus and ordering problems by not being a blockchain requiring network consensus. Instead, computers communicate and agree bilaterally. This gets rid of a lot of the complexity involved with the privacy issues of forming a network with your competitors. This also brings in a great debate about whether or not a blockchain will be the end solution, and if that solution will need consensus. Let the debate begin! In my opinion, Corda can be considered more of an oracle that can connect to blockchains if necessary.

What do you mean, an oracle that can connect to blockchains?

What I mean by oracle is a bridge between a blockchain and the outside world, so in the case of Corda it’s a platform that can speak to blockchains but is not a blockchain itself.
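
A minimal, hypothetical sketch of that idea: an off-chain service observes the outside world and relays attested facts to one or more chains, without being a chain itself (the class and method names below are invented for illustration).

```python
# Minimal, hypothetical sketch of the 'oracle' idea: an off-chain service that
# observes the outside world and relays attested facts to one or more chains.
class BlockchainClient:
    """Stand-in for a real chain connection; submit() is an invented method."""
    def __init__(self, name: str):
        self.name = name
        self.received = []
    def submit(self, fact: dict) -> None:
        self.received.append(fact)

class Oracle:
    """Not a blockchain itself: it just speaks to blockchains when needed."""
    def __init__(self, chains: list):
        self.chains = chains
    def attest(self, fact: dict) -> None:
        signed = {**fact, "attested_by": "oracle"}   # signing omitted for brevity
        for chain in self.chains:
            chain.submit(signed)

oracle = Oracle([BlockchainClient("chain_a"), BlockchainClient("chain_b")])
oracle.attest({"fx_rate_usd_sgd": 1.35})
```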

On 2 June this year, Morgan Stanley Research published a report stating, “For ASX, blockchain deployment seeks to reduce market costs, defend the clearing monopoly and grow new revenue streams”. It’s amazing that we’ve moved from blockchains being “disruptive” to blockchains being used to “defend the clearing monopoly” so quickly! No wonder there’s confusion! I tried to clarify this here.

You’re getting a lot of Orwellian doublethink from the banks. This is the ability to hold two contradictory thoughts in your head and believe that both are true. In this case: blockchains will change the world, but we can’t use them properly for certain things we need to do, or in a way we’re comfortable doing them.

There have also been cautious, or even negative, sentiments in recent days about the utility of blockchain technology. The CEO of ICE isn’t convinced about blockchains.

Sure, some will hate, some will love. What are you getting at?

I would just say be cautious of false narratives, and that there is a deep need for understanding what this technology is really good at, and what it might not be good at.

For me, consensus is a feature, not a bug. A blockchain is a transparency machine like nothing that has come before it. Therefore, if you want a blockchain, look for use cases where total transparency is suitable. There are three questions that need to be answered in order to help you guide your decision making:

  1. Who are you?
  2. What do you want to achieve?
  3. Who will be the nodes?

If you can answer these questions, then you’re on your way to figuring out consensus and the types of permissions you want to configure in your blockchain experiment.

Knowing the types of entities that are going to be involved in the transactions, as well as the types of assets being created, is also a big step. Once you have a handle on this, figuring out the consensus layer is much easier and you can start to build applications on top of the stack.
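
As a hypothetical example of how the answers to those questions might be written down (the roles and permissions below are invented for illustration), a first pass at a permissioning layer can be as simple as a table of roles and allowed actions:

```python
# Hypothetical example only: node roles and permissions invented for illustration.
PERMISSIONS = {
    "issuer_bank": {"read": True, "write": True,  "validate": True},
    "member_bank": {"read": True, "write": True,  "validate": False},
    "regulator":   {"read": True, "write": False, "validate": False},
    "auditor":     {"read": True, "write": False, "validate": False},
}

def can(node_role: str, action: str) -> bool:
    """Check whether a node role is allowed to perform an action on the ledger."""
    return PERMISSIONS.get(node_role, {}).get(action, False)

assert can("regulator", "read") and not can("regulator", "write")
```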

What about the proofs of concept that we’re seeing in the media?

A lot of the use cases that companies are going after right now don’t need a blockchain; or at the very least, the end clients (often banks) don’t need a blockchain solution for them yet. A lot of the blockchain companies are also proofs of concept themselves, and have still not been taken “out of the box”. This is where separating hype from reality starts. I also think a lot of the use cases people are looking to a blockchain to solve are things that aren’t meant to be solved by a blockchain.

From the company side, it’s important to define your identity. Are you a blockchain company or a company using a blockchain? There’s a big difference. For example, if you’re working on solving issues in trade finance and you’re using a blockchain as a solution, unless you’re designing your own platform from scratch, you are just improving efficiencies using technology, but you’re still a trade finance company.

Industry

Clearly, we’re at the start of the innovation cycle, and the problem is just that the hype and promise have accelerated beyond how quickly we can deliver. This is an unfortunate reality, but sometimes necessary to attract the investment needed to light up a technology. Can we reach the promised land of $20bn in reduced annual costs by using distributed ledgers?

I think we do eventually reach the $20bn mark, and that’s nice, but it’s not revolutionary. It’s also a drop in the bucket compared with what banks spend today. In order to get there and switch systems, the costs saved will need to outweigh the money invested to do so. This hurdle may be too large to jump. Perhaps the way to think about it is: are there other accrued costs that will also be saved, aside from settlement costs and back-office savings? The answer to this is yes.

While the cost savings are appealing to banks for the many reasons already discussed, I think the more relevant story will be how we can generate revenue from the apps built on DLT. While ideas are floating around now, the reality will probably look very different from their original conception.

You’re talking about how blockchain/DLT technology providers will monetise?

Yes, the VC funding cycle has become long in the tooth and the IPO market is no longer attractive. Some of the private company tech darlings, including Airbnb, are getting valuation write-downs. The fintech narrative is starting to question monetisation paths, and where the revenues will come from when VC money dries up.

Scary picture. When and how will VC money dry up?

This can come from rate hikes in the future, or recession, or some shock to the system. It’s hard to predict. However, the funding cycle has become long in the tooth. Global growth has slowed, and even Mary Meeker pointed this out in exquisite detail in her latest state of the union.

Particularly in the blockchain space, The DAO should be looked at as the top. It is really madness in many ways, and the sheer amount of money that was raised is astounding. I think we are post peak-hype, and reality will start to set in sooner rather than later.

The DAO raised the equivalent of $150m from pseudonymous investors, to fund unknown projects, to be voted on by unknown participants. That really does seem pretty nuts. It was also hacked recently – or at least it behaved exactly as it was coded to, to the detriment of many of the investors who hadn’t scrutinised the code.
So the billion dollar question is … as celebrated startups move towards becoming technology providers, unable to monetise on a “per ticket” basis, how are the company valuations justified? Who should we invest in?

Valuations seem to be based on the level at which the last round was raised as a starting point, particularly for the bigger startups raising later-stage rounds.

The financial services companies investing in these large rounds will not be taken for fools. They understand valuation like no other. What’s interesting is the lack of high-profile VCs investing in these bigger rounds. The funding seems to be coming from the titans of finance and those at risk of being ‘disintermediated’ by cryptocurrencies. It’s a good narrative-control play.

The funding source from finance titans can also come back and bite DLT startups. If they are beholden to only the established incumbents, they may not be able to design the disruptive ecosystems promised by blockchain technology.

I think it’s way too early to predict any clearcut winners. I would be investing in the cloud companies that will be hosting all these companies, their data and their applications, and also the companies that are using blockchain technology properly. This isn’t an easy thing to do when people are trying to fit square pegs into round holes. Simplicity always wins.

What’s next for the DLT/blockchain industry?

Companies need to deliver, and companies need to make money to stay in business. Therefore, if you are under certain time constraints to make something people want, and there are still inherent problems in the technology you want to use, you pivot to making things that can improve existing workflows.

This is what you’ve called “industry workflow tools” in your blog, and though some costs may be saved, this doesn’t transform finance any more than the next efficiency technology. In fact, in many ways it exposes us to the same risks as have been seen in the past, because privacy and confidentiality are more important than anything else for banks performing capital market operations.

The problem with this thinking is that it does nothing to benefit the consumer, except perhaps faster transaction times. The customer experience should be a major focus for banks, as they are already among the most hated brands for younger consumers.

Perhaps some of the cost savings will be passed to consumers, settlement will speed up, and collateral released so that businesses can make better use of working capital.

We all hope so!


– This article is reproduced with kind permission. Some minor changes have been made to reflect BankNXT style considerations. Read more here. Main image: Sashkin, Shutterstock.com

About the author

George Samman

George Samman is the former CMO of Fuzo, which is using blockchain to bring financial inclusion to the developing world. He is also committee chair for blockchain and financial services at the Wall Street Blockchain Alliance (WSBA). He co-founded BTC.sx, now magnr, a bitcoin trading platform, and is a former Wall Street senior portfolio manager and market strategist, as well as technical analyst.
