The Bitcoin Classic developers should use the block size increase to showcase their abilities

Bitcoin Classic has, for me, had a questionable start. The team that wants to lead development of the Bitcoin protocol, a technology with a market capitalisation approaching $6 billion, has done a few things that concern me. One of their team, Michael Toomim, boasted about being high in an IRC chat with Core developers, while his brother, Jonathan, admitted that their plan is to just copy segregated witness code from Core when it's ready.

My biggest concern, however, is that Classic are simply taking all the hard work and years of effort that have gone into maintaining Bitcoin Core, tweaking a tiny bit of code, and then taking control of the whole project.

Anyone with even a basic level of coding skill could probably work out how to increase the block size limit to 2MB themselves… it’s not complicated.

I’m not closed-minded to the idea that there are teams out there other than Core who possess the skill set to advance Bitcoin; I just want to see proof of this before we go handing over the keys.

It has previously been established that increasing the block size through a soft fork is technically possible, but more complex to code.

It is likely that the Bitcoin block size is going to have to be increased on multiple occasions as the network grows. Why clunkily kick old software off the network every time through a hard fork, when we can instead preserve some legacy functionality and gracefully degrade older clients?

Doing the legwork now and coding a resilient soft fork solution for future block size increases would showcase the capabilities of whoever implemented it.

In summary, we’d move to a two-block system. Old clients would see the 1MB ‘legacy’ block, while those who had upgraded would also see an additional expanded block of, say, 8MB.

Transactions that occurred on the expanded blocks would be invisible to older clients, but those clients would still be protected from double spends and from being kicked off the network completely.

Once coded, future block size increases could easily be rolled out using the same mechanism. All the software needs to know is the maximum block size it can accept. The updated software would accept blocks up to 8MB.
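The rule described above can be sketched in a few lines. This is purely illustrative pseudocode of the idea, not code from any real Bitcoin client: each node simply validates blocks against the largest size its software knows about, so expanded blocks are invisible to legacy nodes rather than invalid for the whole network.

```python
# Hypothetical sketch of the proposed soft-fork rule.
# The names and limits below are illustrative assumptions, not real client code.

LEGACY_LIMIT = 1_000_000      # 1MB: understood by every client
EXPANDED_LIMIT = 8_000_000    # 8MB: understood only by upgraded clients

def accepts_block(block_size: int, max_accepted: int) -> bool:
    """A client accepts any block no larger than the maximum it knows about."""
    return block_size <= max_accepted

# A legacy client still follows the 1MB blocks...
assert accepts_block(900_000, LEGACY_LIMIT)
# ...but an 8MB expanded block is simply invisible to it.
assert not accepts_block(5_000_000, LEGACY_LIMIT)
# An upgraded client follows both block types.
assert accepts_block(900_000, EXPANDED_LIMIT)
assert accepts_block(5_000_000, EXPANDED_LIMIT)
```

A later upgrade would only need to raise `max_accepted`, which is what makes repeated increases cheap once the initial mechanism exists.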

The next update could increase the maximum block size to 32MB for example, and miners would start mining 8MB and 32MB blocks.

In that case, the 1MB legacy blocks would finally disappear. At this point anybody who still hadn’t updated their old software would be kicked off the network after a long grace period, and 8MB would become the legacy block size.

The network would go from 1MB+8MB blocks to 8MB+32MB blocks, with the newest software recognising 32MB as the maximum block size. In addition, the software could provide alerts that the main chain is now mining larger blocks and older software requires an upgrade to participate fully.
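The alert mentioned above could work along these lines. Again, this is a hedged sketch with invented names, assuming a client can observe the sizes of blocks being mined on the main chain:

```python
# Illustrative sketch of the upgrade alert: if the main chain is mining
# blocks larger than this client's maximum, warn the operator.
# Function name and message format are assumptions for illustration.

def upgrade_alert(observed_size, my_max):
    """Return a warning string if the observed block exceeds our limit."""
    if observed_size > my_max:
        return (f"Main chain is mining {observed_size:,}-byte blocks; "
                f"this software accepts at most {my_max:,} bytes. "
                f"Upgrade to participate fully.")
    return None

# A 1MB-limited client seeing an 8MB block would be told to upgrade;
# blocks within its limit produce no alert.
assert upgrade_alert(8_000_000, 1_000_000) is not None
assert upgrade_alert(900_000, 1_000_000) is None
```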

This change would require an initial outlay of effort, but would provide future benefit for Bitcoin, eliminating much of the contentiousness and fear that surrounds hard forks.

I’m open-minded. If the Classic developers or any other team step up to the challenge and show themselves capable and willing to look at more complex solutions in the best long-term interests of Bitcoin, they will earn my support. As long as they’re a cover band, I’ll stick with the original artist.

photo credit: my drum kit 2 via photopin (license)