A tale of two cryptocurrencies: Ethereum and Bitcoin’s ongoing challenges

halvening.png

It’s been one of the most interesting months in the history of cryptocurrency. The price of Bitcoin has soared to nearly $800 (then dropped to $675 as $19 million in BTC hit the market) as the reward for mining a block is soon set to be halved. Ethereum created what is arguably the world’s most complex multi-million-dollar financial instrument, the DAO, only to see it hacked through a combination of flaws in its “smart contract” and the language it was implemented in. Meanwhile, the average size of a block in the Bitcoin blockchain nears 90% of the 1MB hard limit.

I’ve been meaning to write a year-in-review followup to my previous post The Death Of Bitcoin, but with the recent events covering Ethereum I just couldn’t help but cover that too, and also speculate what effects this will have on Bitcoin.

Ethereum’s Hindenburg moment: The DAO disaster and smart contract security takeaways #

hindenburg.jpg

Once upon a time we made airships filled with hydrogen, which can be manufactured easily and inexpensively. One spectacular disaster later we learned that hydrogen was probably a poor choice.

I believe we’ve just witnessed something similar with Solidity, the language in which Ethereum smart contracts such as The DAO are authored. Much has already been written about what transpired during the DAO attack, with great analyses by Phil Daian, Peter Vessenes, and Robert Graham.

Ethereum has responded, and their proposed plan involves wrapping up certain common patterns to try to alleviate some of the sharp edges in the language. While I think they should certainly do this, I’m one of many who thinks that the Solidity programming language is overcomplicated with too much expressive power, and that an imperative programming model is probably ill-suited for this purpose.

I think it’s also a great example of LANGSEC: how programming language design affects the security properties of languages. Despite smart contracts being a relatively new field, the pitfalls of smart contract languages like Ethereum’s have already been studied. Ethereum shipped what is effectively a platform for running attacker-controlled Turing-complete programs on its blockchain. The attack was in one of the Turing-complete features (recursion). LANGSEC has something to say about that:

InputLanguages.jpg

More interestingly, this class of attack is a general problem in systems that allow for recursion.
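To make the failure mode concrete, here’s a minimal Python sketch of the recursive (reentrancy) pattern that bit The DAO. All names are illustrative, and this models only the control flow, not Solidity semantics:

```python
# Sketch of DAO-style reentrancy: the contract makes an external call
# *before* updating its bookkeeping, so the callee can recursively
# re-enter withdraw() and drain funds. Illustrative names only.

class VulnerableContract:
    def __init__(self, balances):
        self.balances = dict(balances)  # address -> funds held

    def withdraw(self, addr, receive_callback):
        amount = self.balances.get(addr, 0)
        if amount > 0:
            # BUG: external call happens before the balance is zeroed.
            receive_callback(amount)
            self.balances[addr] = 0

class Attacker:
    def __init__(self, contract, depth=3):
        self.contract = contract
        self.stolen = 0
        self.depth = depth

    def receive(self, amount):
        self.stolen += amount
        self.depth -= 1
        if self.depth > 0:
            # Re-enter before the contract updates its bookkeeping.
            self.contract.withdraw("attacker", self.receive)

contract = VulnerableContract({"attacker": 100})
attacker = Attacker(contract)
contract.withdraw("attacker", attacker.receive)
print(attacker.stolen)  # 300: three withdrawals of a 100-unit balance
```

The fix is the now-familiar “checks-effects-interactions” ordering: zero the balance before making the external call.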

I hope the risky design decisions made in the design of languages like Solidity will help influence future smart contract languages towards taking a much more conservative approach.

Perhaps I spend too much time thinking about things through the lens of access control systems, but I think the safest approach is to model smart contracts as a formally provable authorization language which is amenable to model checking. This would involve designing the language in such a way that proofs could be written in an authorization logic, such as Abadi or Nexus authorization logic. To me, whether or not to move funds is an authorization decision, and the propositions we put into such a systems are effectively credentials, such as possession of certain cryptographic keys.

The Crypto-Conditions language of the W3C Interledger Protocol (explanatory video here) provides a system that fits this bill. Beyond accepting certain cryptographic inputs, it provides little more than boolean algebra. Clearly this isn’t expressive enough to create something like The DAO, but I’m not sure The DAO is actually a good idea. Instead, a language like this gives us the foundational building blocks for things like escrowing and unlocking funds, a necessary component of connecting ledgers.
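To illustrate how far boolean algebra alone can take you, here’s a toy Python sketch of crypto-conditions-style composition. The combinators and the credential model are my own illustration, not the actual Crypto-Conditions API:

```python
# Conditions modeled as predicates over a set of presented credentials,
# composed only with boolean operators. No loops, no recursion, no
# attacker-controlled control flow — just combinational logic.

def has_key(key):
    return lambda creds: key in creds

def all_of(*conds):
    return lambda creds: all(c(creds) for c in conds)

def any_of(*conds):
    return lambda creds: any(c(creds) for c in conds)

def threshold(k, *conds):
    # Satisfied when at least k sub-conditions are satisfied.
    return lambda creds: sum(c(creds) for c in conds) >= k

# 2-of-3 escrow release: any two of buyer, seller, arbiter must sign.
release = threshold(2, has_key("buyer"), has_key("seller"), has_key("arbiter"))
print(release({"buyer", "arbiter"}))  # True
print(release({"seller"}))            # False
```

Because such a condition is just a fixed boolean circuit, its satisfiability can be analyzed exhaustively, which is exactly what makes it amenable to model checking.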

Done carefully though, and in the context of a proof framework, I think such languages could be made more expressive than boolean algebra, while avoiding the incomprehensible complexity of what a language like Solidity allows.

One last smart contract system I’d like to mention is Hawk, which I saw presented in January at the Real World Crypto 2016 conference. While Hawk provides a complex, expressive language of the sort I was just ranting against, it’s been designed using formal methods, down to building a formal model of the blockchain itself, then building the smart contract language up from there. It also leverages zero-knowledge proofs to protect the privacy of participants in smart contracts. While I’m still not sure complex smart contracts of this nature are actually a good idea, Hawk seems to have been designed both with a great deal more rigor than Solidity and also with the common pitfalls of implementing smart contracts in mind.

The entire DAO situation leaves Ethereum in a quagmire. Smart contracts are a vast, uncharted territory from a legal perspective. If a smart contract allows you to do something, are you actually violating it? These are interesting legal questions the actual attacker is posing to Ethereum. I think there’s a huge legal and regulatory minefield for smart contracts, especially when they’re effectively complex financial instruments.

Also, in case anyone is wondering, I retract my hat tip I gave to Ethereum on smart contracts in the original Death of Bitcoin post.

In contrast, the decision of Bitcoin developers to remove most of the operations from Bitcoin’s scripting language seems like a rather good one. With the price of a Bitcoin approaching $800, things seem to be shaping up for the seminal cryptocurrency. But is all well in Bitcoin land?

Bitcoin’s Blockpocalypse Is Nigh #

blockpocalypse.png

Bitcoin’s source code contains a hard-coded 1MB cap on the size of a block, which limits the number of transactions a block can hold. Originally added to thwart certain classes of theoretical attacks, the cap causes spikes in Bitcoin transaction fees and/or delays in confirmation times as it is approached. It represents the network’s maximum capacity, and if left unaddressed once reached, the network would become saturated.

As you can see from the above graph, this is set to happen fairly soon, sometime shortly after the mining reward halving set to happen on July 10th. Fortunately, there are two solutions in the works to address this problem:

One is called Segregated Witness, or “SegWit” for short, and involves a “soft fork” to Bitcoin which increases the effective number of transactions that can fit in a block by moving signatures out-of-band of individual transactions.

Another is The Lightning Network, which functions as a sort of “caching layer” for Bitcoin by moving certain transactions off of the blockchain.

Both of these solutions are extremely complicated, and in the wake of the problems that have arisen from Ethereum’s complexity, many have begun to question whether they’re actually good ideas.

Meanwhile, the hard-fork approach of simply upping the block size limit has been rejected in favor of the more complicated solutions.

The Bitcoin “mempool”, its buffer for unconfirmed transactions, has been sporadically spiking for some time, like foreshocks of a coming earthquake:

Screen Shot 2016-06-20 at 9.12.03 PM.png

Unless this problem is addressed, there could be serious consequences. People who want to sell in the wake of a crash (something Warren Buffett would probably recommend against) will find the system clogged by everyone else trying to do the same; to get a transaction through, you will need to pay higher transaction fees, and otherwise your funds are effectively frozen.

This could lead to what is effectively a “bank run” on Bitcoin: many people interested in selling their Bitcoin, but the network capacity exhausted, furthering panic and causing transaction fees to greatly spike. One Bitcoin transaction paid the equivalent of ~$50 in fees recently.

There have been many other proposals to “hard fork” Bitcoin. After all, the 1MB cap is just a number in the source code which can easily be changed to whatever value is deemed sufficient. These proposals were all shot down by Bitcoin Core. Their arguments are that hard forks are difficult and that Bitcoin needs decentralization: we can’t let Bitcoin be too good, or people won’t move on to “sidechains” like those from Blockstream, a commercial sidechain company whose founders include Bitcoin Core members like Gregory Maxwell. Some have alleged a conflict of interest there.

“Solutions” to Bitcoin’s Scalability Challenges #

bitcoinfine.png

Bitcoin developers and many Bitcoin users are well aware of these problems, and work has been underway on solutions to them for some time. However, I’ve struggled to find good descriptions of these systems, so I thought I’d take a crack at descriptions of my own.

Segregated Witness #

The confusingly named SegWit soft fork modifies the structure of Bitcoin transactions. Ordinarily a Bitcoin transaction looks like this:

Screen Shot 2016-06-21 at 8.44.13 AM.png

Each transaction consists of inputs and outputs, which are expressed in Bitcoin’s (very constrained) scripting language. Ordinarily inputs contain a signature of the transaction created with the sender’s wallet private key, and outputs contain the addresses of one or more recipients. The transaction ID is then computed over all of this data, including the signature.

There are a couple of reasons why this structure is disadvantageous. First, it is one source of transaction malleability, an early Bitcoin design flaw which was allegedly abused to steal at least some Bitcoin from Mt. Gox. Second, signatures represent some 60% of the blockchain by data volume, as the rest of the information contained within a transaction is comparatively compact.

Bitcoin ordinarily stores each block’s transactions in a Merkle tree, with the root committed to in the block header. SegWit moves the signatures out-of-band of the transactions themselves and places them in a parallel Merkle tree:

Screen Shot 2016-06-21 at 8.54.23 AM.png
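A toy model shows why moving signatures out of the data covered by the transaction ID kills this class of malleability. This uses a single SHA-256 over concatenated strings, not Bitcoin’s actual serialization or double-SHA256:

```python
import hashlib

def txid_legacy(inputs, outputs, signature):
    # Legacy Bitcoin: the signature is part of the data being hashed,
    # so a third party who re-encodes the signature changes the txid.
    return hashlib.sha256((inputs + outputs + signature).encode()).hexdigest()

def txid_segwit(inputs, outputs, signature):
    # SegWit-style: the witness (signature) is excluded from the txid,
    # so re-encoding the signature cannot change the transaction's ID.
    return hashlib.sha256((inputs + outputs).encode()).hexdigest()

sig_a = "3045...original-encoding"
sig_b = "3045...equivalent-but-re-encoded"

print(txid_legacy("in", "out", sig_a) == txid_legacy("in", "out", sig_b))  # False
print(txid_segwit("in", "out", sig_a) == txid_segwit("in", "out", sig_b))  # True
```

With the legacy scheme, anyone who can produce an equivalent-but-differently-encoded signature mints a “new” transaction ID for the same payment, which is exactly what confused systems that tracked payments by txid.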

So really “Segregated Witness” is about segregated signatures. Why the wacky name with the word “witness” then? I’m not sure… I think Bitcoin people just love inventing jargon. (Edit: Apparently the term comes from proof-of-knowledge protocols, but I still find it confusing and I’m not sure the word “witness” actually applies to this case)

This has one more interesting advantage: signatures can be garbage collected from the blockchain over time. Ordinarily these signatures are checked over and over and over again by every new miner that joins the network or processes a newly mined block. This checking and rechecking of the blockchain provides consensus between the nodes in the system.

However, after these signatures have been checked and rechecked so many times, and are so many blocks deep from the latest block, they cease to provide value. The Proof-of-Work function alone, and the fact that so many miners have already accepted those signatures as valid, suffice to authenticate the transactions. So you can imagine that after enough months the signatures can be discarded.

If successfully deployed, Segregated Witness should lead to a 1.4X improvement in overall system transaction capacity. Yep, that’s it.

Arguably Segregated Witness is how Bitcoin should’ve been designed to begin with. But instead of adding it retroactively with a hard fork, Bitcoin Core is deploying it as a mound of band-aids to the existing protocol in order to deploy it as a soft fork.

First, Segregated Witness repurposes an otherwise meaningless “anyone can spend” script pattern to indicate that an output is SegWit-enabled.

Second, the root node of the signature Merkle tree needs to be stored somewhere, but Bitcoin’s block format doesn’t have space for a second Merkle root hash. There’s one place that has space though, the so-called “coinbase” transaction (not to be confused with the company of the same name) that contains the mining rewards and a small amount of space for miner-controlled data.

SegWit has been denounced by many, including Rick Falkvinge and Coinbase CEO Brian Armstrong (the company this time, not the transaction), as an overcomplicated solution that provides little gain.

The Lightning Network #

The Lightning Network allows Bitcoin transactions to be moved “off chain,” passing between users and “hubs” through “payment channels” that carry a deadline to settle back onto the blockchain. It’s been described as a “caching layer” which allows a net settlement between parties to be calculated before actually settling on the blockchain.
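The “caching layer” idea can be sketched as a toy two-party channel. This is purely illustrative: real channels use signed commitment transactions and penalty mechanisms that are not modeled here:

```python
# Toy model of off-chain netting: many payments update an in-memory
# balance, and only the final net position is "settled" on-chain.

class PaymentChannel:
    def __init__(self, alice_deposit, bob_deposit):
        self.balances = {"alice": alice_deposit, "bob": bob_deposit}
        self.updates = 0

    def pay(self, sender, receiver, amount):
        assert self.balances[sender] >= amount, "insufficient channel balance"
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.updates += 1  # off-chain: no blockchain transaction yet

    def settle(self):
        # Only this final net state hits the blockchain, regardless of
        # how many off-chain updates occurred in between.
        return dict(self.balances)

ch = PaymentChannel(alice_deposit=100, bob_deposit=100)
for _ in range(1000):
    ch.pay("alice", "bob", 1)
    ch.pay("bob", "alice", 1)
ch.pay("alice", "bob", 5)
print(ch.settle())  # {'alice': 95, 'bob': 105} after 2001 off-chain updates
```

The hard part, and the bulk of the Lightning paper, is making this trustless: ensuring neither party can broadcast a stale state when it’s time to settle.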

An alpha of the Lightning Network just launched in the form of the Blockchain (the company, not the concept) Thunder Network:

screen-shot-2016-05-16-at-3-41-47-pm.png

Lightning provides a hub-and-spoke network where payments can pass between multiple parties before eventually settling back onto a central blockchain:

Screen Shot 2016-06-21 at 10.27.26 AM.png

If I’ve gotten anything wrong here, it’s because the Lightning Network is ridiculously complex: the paper runs some 57 pages of blockchainiac gobbledygook terminology. I’m the sort of person who reads academic papers for fun, and I can attest that this is not a paper I can remotely claim to understand, nor one I enjoyed reading. I think there are very few people on Planet Earth who have read this paper and understood it.

For a paper which details the design of a complex distributed system, there’s a surprising (or perhaps unsurprising) lack of formal methods. Such systems are notoriously hard to get correct, which has led the designers of other distributed systems to specify them in temporal logic that can be model-checked with a tool like TLA+.

I am also confused about the security properties of the Lightning Network. What happens when one of the hubs gets popped? Can someone steal all of the unsettled transactions? Suddenly this proposed scaling element starts taking on the security properties of Bitcoin exchanges, which are notorious for being popped.

I am much more interested in a system that solves many of the same problems: Interledger. Interledger connects payment networks the same way the Internet connects data networks. Any currency, store of value, or fungible asset class is welcome. Unlike the Lightning Network, Interledger needs no coordinating blockchain. This means that unlike Lightning with its hub-and-spoke architecture, Interledger is fully decentralized just like the Internet itself. Payments can be routed however the network sees fit, and the network can calculate optimal routes in a distributed manner, just like packets on the Internet.

The Interledger paper is much smaller, simpler, and written in the terminology of existing payment systems and should be comprehensible to anyone with a background in distributed systems. Finally, Interledger is formally modeled in terms of temporal logic using TLA+, which I mentioned earlier.

Screen Shot 2016-06-21 at 10.42.32 AM.png

Comparatively, I find Interledger both a breath of fresh air next to the Lightning Network and much more impressive for solving a much more general problem. Since consensus is maintained at all times (including an optional “atomic” mode that ropes in notaries to witness transactions), I find it less worrisome, or at least much more comprehensible from a security perspective.

I will be curious to see how the Lightning Network fares in production.

Ongoing Scalability Challenges #

Transaction Rates (3).png

VISANet, the network that powers VISA credit cards, reports that the network is capable of 56,000 transactions per second and peaks at about a quarter of that.

Bitcoin, on the other hand, has ~10% headroom compared to VISA’s 300%, and is presently pushing about 5 transactions per second, peaking around 7. Its total capacity isn’t enough to power a top-10 US retailer. Far from powering every payment on Earth, Bitcoin can’t even handle Bloomingdale’s.
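The capacity ceiling falls out of simple arithmetic, assuming a ~250-byte average transaction (an assumption on my part; real averages vary with usage patterns):

```python
# Back-of-the-envelope Bitcoin throughput under the 1MB block cap.
block_size_limit = 1_000_000   # bytes (the hard cap)
avg_tx_size = 250              # bytes, assumed average transaction size
block_interval = 600           # seconds (10-minute block target)

txs_per_block = block_size_limit // avg_tx_size
tps = txs_per_block / block_interval
print(f"{tps:.1f} tx/sec")     # ~6.7, roughly the peak rate cited above

visa_capacity = 56_000         # tx/sec, as reported by VISANet
print(f"VISA capacity is ~{visa_capacity / tps:.0f}x higher")
```

Any fix that leaves this formula intact (bigger average transactions, say, from more complex scripts) eats directly into the same ceiling.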

Part of the problem is the way Bitcoin’s consensus algorithm works: the transaction throughput is coupled to the proof-of-work problem and the maximum size of a block.

Consensus algorithms are a well-studied problem in computer science, with many formally proven solutions. If the role of Bitcoin’s Proof-of-Work function is changed from being the crux of its consensus algorithm to something more like a leader election system, then blockchains can use a more traditional consensus algorithm, and therefore reap much higher throughput. It would also allow for lower confirmation times, since we’re not waiting on the Proof-of-Work lottery winner to produce a block in order to decide which transactions are validated.
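The leader-election idea can be sketched in a few lines: a toy in which the PoW winner merely *proposes* a block, and a classical BFT-style quorum commits it immediately rather than waiting out confirmations. Purely illustrative; real protocols like ByzCoin are far subtler:

```python
# Toy "proof-of-work as leader election": lowest hash wins the right to
# propose this round's block; a >2/3 quorum of known validators commits it.

import hashlib

validators = ["v0", "v1", "v2", "v3"]

def pow_leader(round_seed):
    # Stand-in for the PoW lottery: lowest hash proposes this round.
    return min(validators,
               key=lambda v: hashlib.sha256(f"{round_seed}:{v}".encode()).hexdigest())

def commit(block, votes):
    # Classical BFT-style quorum: more than 2/3 of validators must approve.
    return len(votes) * 3 > len(validators) * 2

leader = pow_leader(round_seed=42)
block = {"proposer": leader, "txs": ["a->b:10", "c->d:5"]}
votes = {"v0", "v1", "v3"}   # 3 of 4 validators approve
print(commit(block, votes))  # True: committed without waiting for more blocks
```

The key shift is that finality comes from the quorum, not from burying a block under subsequent Proof-of-Work, which is what enables both higher throughput and faster confirmation.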

Tendermint, a next-generation blockchain system which uses an algorithm similar to Practical Byzantine Fault Tolerance, has recently claimed to have reached 10,000 transactions per second in simulated tests on Amazon EC2. While I expect actual over-the-Internet performance to be slower, that makes it over 1,000x faster than Bitcoin, and it starts to reach a transaction rate that is actually practical for a modern payments network.

Byzcoin is a similar system which uses Proof-of-Work to elect leaders, then runs a consensus algorithm to process transactions.

Using a more traditional consensus algorithm can massively boost performance. For this reason I don’t think the Bitcoin approach is long for this world. We’ll see.

Irrational Exuberance #

Irrational Exuberance.png

Despite the recent attack on the DAO and Bitcoin’s looming-but-well-known problems, cryptocurrency markets are still strong. An investment analyst has told CNBC that Bitcoin is the new gold and should be considered a safe-haven asset. An IMF report labeled Bitcoin’s blockchain the Internet of Trust.

I think cryptocurrency is in its infancy and these sorts of pronouncements are premature. I can’t explain the Bitcoin price, except to say that I think a lot of people believe in Bitcoin to a degree of religiosity, and for that reason it may continue to hold its value. The mining reward halving appears to be causing Bitcoin to deflate exactly as Satoshi said it would.

I don’t have a crystal ball, but unless Bitcoin’s capacity problems are addressed I don’t expect the price to keep going up.

Conclusion #

i want to believe.png

Ethereum and Bitcoin have taken two very different approaches to implementing a cryptocurrency. Both approaches have their respective faults, but they teach the same core lesson.

In either case, there’s a common theme: complexity. I don’t think this complexity is necessary complexity, either: it’s incidental complexity, arising largely from these systems not being designed from first principles to solve specific problems, but rather from glomming new mechanisms onto existing systems whose designs probably need to be reconsidered from the get-go.

As I originally described in my post The Death of Bitcoin, I think there’s plenty of innovation yet to be done in this space beyond what Bitcoin has already done. I think two of the systems I originally called out in that post, Hyperledger and Tendermint, provide more practical systems for meeting real-world needs. I am also still curious to see how Zcash and Stellar SCP fare in this space. I am also excited about what Interledger can do to unite disparate payment networks into a single interoperable system.

There’s a balance that needs to be struck between Bitcoin’s conservatism and Ethereum’s recklessness, and a parsimony that must be maintained when “complecting” (to use one of Rich Hickey’s favorite words) these designs to improve performance. These systems need to evolve to handle the capacity of real-world payment networks, which requires adding more complexity, but this needs to be done in a selective and tasteful way, drawing from the rich history of computer science rather than inventing new doodads to glom onto the protocol to solve each and every problem.

I hope this post has been informative and prescriptive. I do see potential in so-called “blockchain technology” (although I’d prefer we get away from that term and move on to “distributed ledger technology” and “immutable databases” already). I am curious to see how this space evolves.
