Why I’m massively in favour of a hard fork block size increase, and also massively against one

Some of my recent reddit posts have been interpreted as meaning I am in favour of small blocks and against raising the block size limit.

This is not my position at all. I’m making the important case that Bitcoin cannot rely on on-chain scaling alone. Satoshi mentioned Moore’s law in the white paper, and it was a compelling point: for my first few years following Bitcoin it seemed reasonable that Bitcoin could scale on-chain indefinitely.

Unfortunately global propagation is harder than it first seemed when blocks were tiny, and on-chain scaling is not as viable as first thought. Moore’s law alone is not our scaling saviour.

That said, I’m not opposed to hard forks to increase the block size – I think they are necessary. My concern is with hard forks being seen as an easy solution to scaling.

If I seem like more of a small blocker than I am, it’s because I’m trying in my mind to balance out the community by pushing the small block cause. I want people to realise that on-chain scaling has real implications and is not a long-term solution.

I’m incredibly sympathetic to the argument that we need Bitcoin to be attractive, and low transaction fees are part of what first attracted me to Bitcoin. However, we’ve also got to be careful of precedent.

The block size debate is more than technical – it is about the politics and future direction of Bitcoin.

If we head in the wrong direction and become dependent upon bigger and bigger blocks, there is a genuine risk we embark on a slippery slope and slowly erode what makes Bitcoin special.

I’m not convinced anyone is using Bitcoin at the moment to buy coffee. I’m also sympathetic that we want to make Bitcoin accessible and that lower fees helps the poorest participate, but we need to be cautious.

Bitcoin’s decentralised nature is our democracy, and good democracy requires checks and balances. It might not feel like it at times, but the passionate debate and resistance over changing the status quo is giving us exactly that.

No matter what you think of your opponents, we’re all playing an important role in Bitcoin’s governance. There has never before been anything like it. Fierce debate over monetary policy has taken place behind closed doors throughout most of history, now we all get our say.

I am not opposed to a block size increase, I am opposed to a block size increase being easy. Not because I think bigger blocks will ruin Bitcoin, but because I think lots of block size increases would ruin Bitcoin.

We need to put up a fight against anything that could change what Bitcoin currently is. That doesn’t mean we shouldn’t ever change Bitcoin, but that such changes should have stood up to immense scrutiny.

You might be massively in favour of increasing the block size, but you should also be thankful in the face of resistance. If Bitcoin ever becomes easy to change it becomes easy to break.

That’s why I’m simultaneously opposed to a block size increase while also being in favour of one.

Yes I’m a paradox, but I’m quite happy that way.

Bitcoin is under siege! We need to fight against post-Truth propaganda, and a plan B to reclaim Bitcoin if taken

We now have a completely divided community where people believe nonsense. A sizable minority have now been convinced that SegWit is dangerous and creates an insurmountable technical debt. These people generally have no development experience, and just blindly repeat misinformation despite the protests of those who do. The vitriol they have been fed is a contagion that is spreading, while others just want to block SegWit out of spite.

I recently tried to compile a list of developers who were opposed to SegWit. The exhaustive list consisted of four. That’s right… four. From the stink kicked up by the anti-SegWit brigade you’d think this number would be far higher.

If you repeat a lie often enough, people will believe it. There is a real risk that enough of the non-technical community now believes SegWit is too complicated and risky to prevent its activation. For the technical community this is a total non-debate, actual developers opposed to SegWit are the flat earth society of Bitcoin. Disagree with this? Try to list developer names and credentials opposing SegWit and you’ll soon realise how feeble the technical opposition is.

In addition to SegWit hate, the vitriol directed at Blockstream is absurd. Bitcoin is and always will be open source, and Blockstream’s business model depends entirely on the success of an open and decentralised Bitcoin. All the big names there have a proven track record of dedicating themselves to Bitcoin’s advancement. Their business model is to profit from their expertise, gained by valuable contributions to Bitcoin’s development. This is a sound and reasonable business model that has been successful in many other open source projects such as MySQL. The profit they make can be used to further advance Bitcoin – it is a win-win.

People literally believe that Blockstream is Evil Corp. I’ve seen people argue that Blockstream profits from keeping blocks small so they can charge for the lightning network. This demonstrates a shocking lack of comprehension and common sense. There are even conspiracy theories that Blockstream is a secret banking trojan horse to bring down Bitcoin from the inside. People peddling such misinformed nonsense need their heads inspecting.

Five years ago in response to scaling concerns, I used to argue that Bitcoin could scale infinitely on-chain, often citing Moore’s law. The more I learned about Bitcoin, the more I realised this isn’t viable without risking Bitcoin’s fundamental value proposition – decentralisation.

I have not been “brainwashed by Blockstream lies”, I have simply joined the consensus of those with a more informed technical understanding. With off-chain scaling we can have our decentralised, inexpensive and instant digital money cake, and eat it too. Sadly, we now live in a post-truth world, and having the better argument is often trumped by those shouting the loudest.

Valid concerns can be raised about user experience, missed opportunities, and yes, Lightning Network and Sidechains aren’t ready yet and we do need solutions now. Well, guess what, we have a solution right now: SegWit will immediately ease the stress on the network; it is coded, extensively tested and ready to launch… and there is even consensus for a hard fork block size increase after its activation.

The only thing that will prevent SegWit from activating is misinformation combined with a political power grab by opportunistic miners.

There is now a movement, in the form of Bitcoin Unlimited, to hand over control of the blocksize to miners. There are many reasons why Bitcoin Unlimited is a terrible answer to the block size debate. Sadly, much of this discussion takes place in the bitcoin-dev mailing list where the brightest technical minds hang out, while the rest of the community indulges in misinformed squabbles on reddit. In short, handing over control of the block size to miners would be terribly centralising.

People arguing that the community wants a block size increase are right. I’m all for a block size increase too, however it is vitally important for the health of Bitcoin that the best technical solutions win and we do not concede to misinformation and fear. SegWit MUST be activated before a hard fork block size increase.

If the propaganda succeeds in persuading miners to fritter control of Bitcoin’s block size limits away to an implementation as poorly conceived as Bitcoin Unlimited, then that chain and those who created it must be punished by the market.

To do this, I propose Bitcoin 4Core, a hard fork response that would clearly support the scaling vision of Bitcoin Core, and hopefully recruit their talented development team.

I believe the best way to protect the network from attack and simultaneously improve decentralisation would be to introduce additional proofs of work: four proofs of work, each with a 40-minute block creation target and its own difficulty. We could add Ethash, Scrypt and Equihash to give a mix of CPU and memory intensive methods, and improve the diversity of hardware. We could also take the opportunity to introduce a 4MB maximum block size.
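The timing intuition behind this can be sketched with a quick simulation: four independent proof-of-work “lanes”, each tuned to a 40-minute target, behave like four Poisson processes, so the combined block arrival rate is 4 × (1/40) = 1/10 blocks per minute, preserving the familiar 10-minute average interval overall. This is an illustration of the arithmetic only, not an implementation of the proposal:

```python
import random

LANES = 4
TARGET_PER_LANE = 40.0  # minutes between blocks on each individual lane

def simulate_combined_interval(n_blocks=100_000, seed=42):
    """Merge block arrivals from all lanes and return the mean interval in minutes."""
    rng = random.Random(seed)
    arrivals = []
    for _ in range(LANES):
        t = 0.0
        for _ in range(n_blocks // LANES):
            # Proof-of-work block discovery is well modelled by an
            # exponential (memoryless) inter-arrival time.
            t += rng.expovariate(1.0 / TARGET_PER_LANE)
            arrivals.append(t)
    arrivals.sort()
    gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
    return sum(gaps) / len(gaps)

print(f"combined mean block interval: {simulate_combined_interval():.1f} minutes")  # ~10
```

Each lane would retarget its own difficulty independently, but the headline point is that splitting work across four algorithms need not slow the chain down.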

By using proof of work methods with existing altcoin implementations, the mining ecosystems already exist, though some altcoins would likely face severe disruption as miners fled to profit from Bitcoin. Existing Bitcoin miners also wouldn’t be shut out completely, as they would be with a wholesale change of PoW, and could reluctantly return with diminished income and influence when they realise that the economic majority will overwhelmingly follow the technical majority when given a choice.

I don’t know if the Core developers would support a proposal like this, but I personally think it would be a great way to reclaim Bitcoin and give a clear mandate to the sound vision of the Core development team. This, however, should be a last resort, and I remain optimistic that SegWit can still activate despite all the noise.

People who argue that introducing SegWit as a soft fork is “too complicated” are concern trolling

Back in February I wrote a piece on the then big-block flavour of the month, Bitcoin Classic.

I was frustrated that the approach of rival implementations to Bitcoin Core was basically to lift most of the work of the core development team, make a few simple tweaks, and then try and push their implementation as the saviour of Bitcoin.

So I threw down a gauntlet: instead of being a cheap cover band, actually write some code that showcases your abilities and proves your worth. Do that, and a rival implementation could earn the respect and credibility essential to advancing their agenda.

Segregated Witness (SegWit) is a clever way to almost double Bitcoin’s capacity without increasing the block size, while also solving other problems such as transaction malleability. It was widely agreed that SegWit was a win-win.

Fast forward to now and SegWit has been developed, fully tested and is ready to be implemented as a soft fork.

Great news you would think, except if you go to the big blocker parts of the Internet, suddenly SegWit is considered dangerous!

The argument isn’t that SegWit is bad, it’s that it is way too complicated to be introduced as a soft fork, and should have been implemented as a hard fork. They also claim that the complexity of the code (over 500 lines), and compromises required as a soft fork will make Bitcoin really difficult to develop for in the future.

The developers at Bitcoin Core, who have delivered the solid dependability for which Bitcoin has become known, have collectively decided that SegWit was not just within their capabilities to write, but also to build upon in the future.

If any serious developer is arguing that Bitcoin is going to be too complicated for them after SegWit, they’re probably not a good enough developer to be working on such critical software. Anyone who isn’t a developer frankly needs to keep their concerns to themselves, as they are not qualified to hold such a view.

I’d be a lot less harsh if SegWit had suddenly been announced and implemented under a shroud of secrecy. The Core developers said in December however, that SegWit would be coded as a soft fork.

If any rival team of developers disagreed with this approach they had a simple solution… write their own implementation of SegWit as a hard fork.

This would give them an opportunity to showcase their abilities, and give the community something to think about. If it was as simple and elegant as they say… they could have had the code ready months before Core, impressed us all with its elegance, and really built some momentum.

What do we have instead? We have a small community determined to do everything in its power to block SegWit activation. It’s a shame that instead of sitting on the sidelines complaining, they didn’t take some initiative. They need to learn from this experience, as right now they just look like the petty children who have taken their ball and gone home because the game isn’t going their way.

photo credit: John Spooner Beware of the Troll via photopin (license)

The blocksize debate: is an end in sight for the civil war that has engulfed Bitcoin?

Depending on which parts of the Internet you inhabit, your perception of what’s happening in Bitcoin land can vary hugely.

The Bitcoin community is bitterly divided. For years now it has been split into two camps, those who think Bitcoin needs an urgent blocksize increase, and those that think other scaling approaches should be prioritised.

The “big blockers” are worried that with the current limit of mostly full 1MB blocks, there isn’t enough capacity for Bitcoin to grow. They think this will cause real harm to Bitcoin’s network effect, and that not addressing it urgently could result in Bitcoin losing its position and momentum as #1 cryptocurrency.

Whether you see merit in this view or not, it’s important to recognise that to somebody who is convinced that failing to urgently raise the blocksize could lead to Bitcoin’s downfall, the current standoff and ongoing lack of an increase would be incredibly frustrating. It is understandable that frustration and helplessness would lead to a deep-seated suspicion of, and contempt for, those they see as standing in their way.

For those of you who don’t visit the big blocker communities, it’s staggering to see the vitriol and anger directed at those “progress preventers”, the Bitcoin Core developers, the team that has long served as custodian of the main Bitcoin implementation.

While big blocker communities can feel a little bit like the front line of a war, frequenting the “small blocker” parts of the Internet can feel a lot happier – you wouldn’t necessarily realise there even was a war.

The thing is that everyone, big and small blockers alike, agree that Bitcoin needs to scale.

The Bitcoin Core team have identified a few interesting ideas that they believe are the best way to scale Bitcoin, primarily Segregated Witness (SegWit), and Lightning Network.

Lightning Network is not popular with big blockers. It aims to move transactions off chain, sending them directly between individuals rather than being stored by every participant on the network.

They are skeptical, arguing that it is hypothetical and unproven, and that even if it achieved everything claimed, it does nothing to address the scaling problems Bitcoin is facing right now. Many also believe that transactions taking place “off chain” are undesirable and not part of Satoshi’s vision.

They contend that on chain scaling is an essential and easy fix that can be implemented immediately, and that Lightning Network is a distraction, causing Core developers to neglect more pressing issues.

I can understand these concerns, but I also see the merit in the approach taken by the Core developers. In summary, an increase in blocksize is a barrier to running a node and reduces decentralisation, a sacred and essential property of Bitcoin which, they contend, must be preserved as much as possible.

Middle ground is hard to find when the argument is so subjective. On one side, a $0.09 transaction fee is far too high and going to put off new users, so Bitcoin never grows. On the other, a $0.09 fee is far too cheap to justify requiring that every full node – thousands now, possibly millions in the future – store details of $2 coffee purchases for thousands of years to come, leading to a bloated chain that will suffocate under its own weight and jeopardise the highly prized property of decentralisation.

SegWit seems to be a middle ground. It works by splitting the data from transactions into two parts, one of which is included in 1MB blocks as before, the other stored separately and not contributing towards the block size limit, while improving other areas of Bitcoin (like fixing transaction malleability) as an added bonus.

This, the developers claim, will give an effective block size increase to around 1.7MB without requiring that everyone upgrade their software (a hard fork).
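The ~1.7MB figure follows from SegWit’s block-weight rule (BIP 141): a block’s weight is base size × 3 plus total size, capped at 4,000,000 weight units. The effective block size then depends on what fraction of transaction bytes are witness data; the 55% witness share below is an assumption chosen to illustrate how a typical mix lands near 1.7MB:

```python
MAX_WEIGHT = 4_000_000  # BIP 141 block weight limit

def effective_block_size(witness_fraction: float) -> float:
    """Max total block size in MB, given the share of bytes that are witness data.

    Base bytes count 4x towards weight (3 + 1), witness bytes count 1x, so:
    weight = 4 * base + 1 * witness = total * (4 - 3 * witness_fraction)
    """
    return MAX_WEIGHT / (4 - 3 * witness_fraction) / 1_000_000

print(effective_block_size(0.0))            # 1.0 MB: no witness data, no gain
print(round(effective_block_size(0.55), 2)) # ~1.7 MB with a typical mix
```

The capacity gain is therefore not a fixed number: blocks full of multisig transactions (more witness data) stretch further than blocks of simple payments.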

Great news, you would think, the big blockers and small blockers can both agree this is a win win for Bitcoin. Also, there’s no longer need to wait, SegWit is coded, tested and ready for implementation.

The thing is, to the surprise of those who don’t frequent the big blocker communities, the frustration and suspicion has grown so pernicious that SegWit is not trusted. They believe it doesn’t do enough to address Bitcoin’s urgent scaling problem, that it has taken too long, and that it will take too long to come into effect.

There is almost a sense that, in accepting SegWit, they will have “lost”, and that they still haven’t been listened to. Some even argue that introducing SegWit as a soft fork is more dangerous.

All this frustration and bad feeling has manifested itself in the rejection by the big blocker community of SegWit. They would rather block its implementation than “lose”.

You might think they’d be barmy to block something that is ready to increase Bitcoin’s capacity, but that is exactly the plan. They have completely lost confidence in Bitcoin Core and many would like to see a switch to a rival implementation, Bitcoin Unlimited, which would allow miners to decide the maximum block size instead.

There is a genuine belief that in blocking SegWit, they can force a stalemate that will enable them to push the community into choosing “their” scaling solution, and that they can still win the war.

If you’ve not passed by this community, this may sound absolutely outrageous. To everyone else, the war is almost over, but to those on the other side, battle has just commenced.

So, what happens now?

In order to activate, SegWit requires 95% of miners to vote for it. Currently, mining pool ViaBTC has stated it will vote against SegWit, and since it controls over 5% of the hash power, its veto will succeed.
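The veto arithmetic is worth making concrete. Under BIP 9 style signalling, at least 95% of the blocks in a 2016-block retarget period (1916 of 2016) must signal support, so a pool with just over 5% of the hash power can block activation single-handedly. A simplified sketch:

```python
PERIOD = 2016      # blocks per retarget period
THRESHOLD = 1916   # 95% of 2016, as specified in BIP 9

def activates(signalling_blocks: int) -> bool:
    """True if enough blocks in the period signalled support."""
    return signalling_blocks >= THRESHOLD

# With ~5.1% of hash power withholding its signal, expect roughly
# 103 non-signalling blocks per period, keeping the count just short.
print(activates(PERIOD - 103))  # False: 1913 < 1916
print(activates(PERIOD - 100))  # True: 1916 >= 1916
```

The margin is razor-thin by design: the 95% threshold was chosen so that soft forks only activate with near-unanimous miner support.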

This leads to an interesting dynamic. To those outside the big block community, those that have most vocally demanded the network capacity increase are now the ones standing in its way. In a war of ideas, it’s hard to see that the big blockers are going to suddenly gain much new support when it looks like this.

How will the Core developers react? Well, I think they’ll patiently respect the 95% activation threshold.

It’s also interesting to note that a number of prominent Core developers signed an agreement in February about how to scale Bitcoin.

The agreement was that SegWit would be worked on as a priority, and that once it was finished the developers would take around 3 months to write code for a hard fork to increase the block size to somewhere between 2-4MB.

They then went on to estimate that SegWit would be coded by April, and that if that were the case the hard fork would be coded by July 2016. This is unfortunate, because this optimistic timescale has led to accusations that the Core developers failed to keep their “promise” to code a hard fork by July.

Software often takes longer than hoped, but it is a shame this mention of July 2016 has led some in the big block community to feel they have been betrayed and misled, when it was an estimate rather than a commitment.

If the Core developers present had said SegWit would take until October 2016 instead of April 2016, it is possible that consensus would not have been reached – and you could argue it was agreed on false pretences. While I believe this was a genuine underestimation, I can understand why others, already cynical, would assume the worst.

So, what happens now? Well, SegWit will probably not activate, and the Core developers who signed that agreement will spend the next 3 months writing the code they promised for a hard fork – those present signed the agreement and their reputation now depends on it.

It would actually be good for the big blocker cause if the Core developers present reneged on the agreement, as the big blockers would be vindicated and would gain new support.

In the meantime, the big blockers will promote Bitcoin Unlimited, and despite their overwhelming optimism in the face of what to many looks like adversity, it will probably face the same fate as Bitcoin XT and Bitcoin Classic, similar attempts which failed before it.

Around 3 months from now we’ll possibly still be waiting for SegWit activation, but we’ll probably have code for a blocksize increase. The thing is, part of the agreement was that the code would not be implemented by Core until after SegWit had activated.

At that point, I feel the guns may fall silent, and the great Bitcoin war could finally reach its conclusion.

Will full blocks really be bad news for Bitcoin?

The idea that full blocks, or a ‘fee event’ will be bad news for Bitcoin is based on a couple of assumptions. First, that Bitcoin is used as a currency, and second, that those using it as such will stop doing so.

This is just wrong. Bitcoin is young and evolving, with only a minority of transactions used for purchases.

Bitcoin is primarily a commodity and store of value at this stage of its existence. Full blocks have little impact on this role.

Leicester City Football Club are currently leading the English Premier League. They are reaching capacity every game with demand so high that £50 tickets are touted at £15,000 for a pair, a major ‘fee event’.

No journalist would frame this as bad news for Leicester City. It is an achievement, a direct consequence of their success.

Despite this, higher capacity would enable greater ticket sales instead of ‘missed opportunities’. On 7th May, as the full-time whistle blows, those opportunities will have passed forever.

This is not the case for Bitcoin. As a commodity, rather than an event, Bitcoin will increase in value as demand increases. There is no deadline at which its value drops to zero.

For Bitcoin to go from less than 8,000 transactions per day to over 200,000 in 4 years is a success story. Full blocks are a newsworthy event, a celebration of what Bitcoin has achieved. “Product in such high demand that nobody wants it any more” read no headline, ever.

Any news stories on reaching capacity are good. The real story is a 4 year use increase of 2500% and major capacity increases on the way. The future is bright.

 
photo credit: 20,000 Miles via photopin (license)

The Bitcoin Classic developers should use the block size increase to showcase their abilities

Bitcoin Classic has, for me, had a questionable start. The team that wants to lead development of the Bitcoin protocol, a technology with a market capitalisation approaching $6 billion, has done a few things that have concerned me. One of their team, Michael Toomim, boasted about being high in an IRC chat with Core developers, while his brother, Jonathan, admitted that their plan is to just copy segregated witness code from Core when it’s ready.

My biggest concern however, is that Classic are simply taking all the hard work and years of effort that has gone into maintaining Bitcoin Core, tweaking a tiny bit of code, and then taking control of the whole project.

Anyone with even a basic level of coding ability could probably work out how to increase the block size limit to 2MB themselves… it’s not complicated.

I’m not closed minded to the idea that there are teams out there other than Core who possess the skill set to advance Bitcoin, I just want to see proof of this before we go handing over the keys.

It has previously been established that increasing the block size through a soft fork is technically possible, but more complex to code.

It is likely that the Bitcoin block size is going to have to be increased on multiple occasions as the network grows. Why clunkily kick old software off the network every time through a hard fork, when we can instead preserve some legacy functionality and gracefully degrade older clients?

Doing the legwork now and coding a resilient soft fork solution for future block size increases would showcase the capabilities of whoever implemented it.

In summary, we’d move to a two block system. Old clients would see the 1MB ‘legacy’ block, while those who had upgraded would also see an additional expanded block of say 8MB.

Transactions that occurred on the expanded blocks would be invisible to older clients, but they would be protected from double spends and being kicked off the network completely.

Once coded, future block size increases could easily be rolled out using the same mechanism. All the software needs to know is the maximum block size it can accept. The updated software would accept blocks up to 8MB.

The next update could increase the maximum block size to 32MB for example, and miners would start mining 8MB and 32MB blocks.

In that case, the 1MB legacy blocks would finally disappear. At this point anybody who still hadn’t updated their old software would be kicked off the network after a long grace period, and 8MB would become the legacy block size.

The network would go from 1MB+8MB blocks to 8MB+32MB blocks, with the newest software recognising 32MB as the maximum block size. In addition, the software could provide alerts that the main chain is now mining larger blocks and older software requires an upgrade to participate fully.
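The core of the graceful-degradation idea above is that each client simply accepts any block up to the maximum size its software version knows about, so older clients keep seeing the legacy blocks they understand while newer clients also accept the expanded ones. The version names and sizes below are illustrative labels for the scheme described, not a real protocol:

```python
# Hypothetical client versions and the maximum block size each understands.
MAX_SIZE_BY_VERSION = {
    "legacy": 1_000_000,   # original clients: 1MB only
    "v2": 8_000_000,       # first expansion: 1MB legacy + 8MB expanded blocks
    "v3": 32_000_000,      # next expansion: 8MB legacy + 32MB expanded blocks
}

def accepts_block(client_version: str, block_size: int) -> bool:
    """A client accepts a block only if it fits within its known maximum."""
    return block_size <= MAX_SIZE_BY_VERSION[client_version]

print(accepts_block("legacy", 1_000_000))  # True: old clients keep working
print(accepts_block("legacy", 8_000_000))  # False: expanded block is invisible to them
print(accepts_block("v2", 8_000_000))      # True
```

The rollout rule is then uniform: each upgrade just raises the accepted maximum, and old clients degrade gracefully rather than being forked off abruptly.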

This change would require an initial outlay of effort, but would provide future benefit for Bitcoin, eliminating much of the contentiousness and fear that surrounds hard forks.

I’m open minded. If the Classic developers or any other team step up to the challenge and show themselves capable and willing to look at more complex solutions in the best long term interests of Bitcoin, they will earn my support. As long as they’re a cover band, I’ll stick with the original artist.

photo credit: my drum kit 2 via photopin (license)

A question for Core developers on the block size: let’s find the middle ground

I’m a fan of Bitcoin Core. They’ve shown themselves up to the task of delivering reliable and revolutionary software, an incredibly strong foundation upon which Bitcoin has been built.

On top of this, they have a vision for the future that is truly scalable.

The fears coming from the bigger-blocks side of the debate are that full blocks are going to create a ‘fee event’ and lead to a slowdown in processing transactions.

I’m sure most people would agree that if fees got ‘too’ high, or transactions became ‘too’ slow, any potential risks from implementing a hard fork and increasing centralisation would be outweighed by the serious damage that would be inflicted upon Bitcoin’s reputation and usefulness.

While I agree that full blocks and slower transactions in the short term do not have to signal disaster, I do have a threshold where I think an increase in block size should be urgently implemented.

Everybody has a different threshold, for some, any increase in fee will signal disaster, others may think $1000 fees per transaction are acceptable.

I know the Core team’s roadmap for scaling. I would just appreciate some clarification from them as to where they would draw the line and push out a block size increase if network performance degraded or fees became bloated.

Perhaps the Core developers are closer to the big blockers than we all realise. If we can have pre-agreed criteria as to what conditions would trigger an emergency block size increase, then the community could unite.

Let’s take a completely hypothetical example: 7 consecutive days in which a 15 cent fee fails, on average, to reach the block chain within an hour. Maybe that could be the threshold that would trigger the Core developers to release a hard fork block size increase. Some sort of threshold would eliminate much of the uncertainty which has spawned so much of the tension.

I’d just like some indication from Core as to where they stand on this. If they turn around and say they think up to $10 average fees in 2016 would be acceptable and why, I can make an objective decision about whether I share their vision. At the moment I feel a little like I’m making that decision blind.

Just some sort of clarification could help make the community feel more at ease, and defuse the tension that has been building. This dispute has been a hotbed of fear, uncertainty and doubt; transparency is the best weapon we have against them.

Forking crazy: a dramatic speculation on how a contentious hard fork could play out

Bitcoin blocks are full. Every 10 minutes, around 1MB of transactions make their way into the block chain; the network is at capacity.

The community is divided on what happens next. Little do they know that what seems like a simple choice between competing clients could turn into a plot worthy of Hollywood with subterfuge, subversion and excitement in abundance.

The Core team of developers are behind the software that has powered Bitcoin for the last 7 years. Their plan, Segregated Witness, is to start moving some of the transaction data outside the blocks to create more space inside them, and it can be introduced through a soft fork.

The Core developers also have a revolutionary idea, the Lightning Network, which they say will allow Bitcoin to scale substantially without creating huge and centralising demands on the block size.

A new team of developers, named Classic, think this will not happen quickly enough, and want to create additional capacity now by simply doubling the block size through a hard fork.

I previously explained the difference between a soft and hard fork, but for this article all you need to know is that a hard fork is generally considered more dangerous because people who don’t upgrade their software will no longer be connected to the same Bitcoin network as everyone else, which could create problems.

What is playing out is a battle of ideologies. One side believes in patiently planning for the future, the other thinks that short term capacity issues are far more pressing to prevent Bitcoin losing momentum.

In reality, the Bitcoin network could probably hard fork to 2MB blocks without major incident, and the network can probably operate at full capacity for a while without a major impact on growth. The doomsday scenario painted by either side is overblown, but since this battle dictates the future direction for Bitcoin, its importance is not.

Realising this is about power allows us to more accurately speculate on the dynamics that could play out. The first assumption to make is that as both sides believe they are right, they will use all tools available to them to secure their vision.

Things that may feel like a malicious attack to one side would be considered a moral necessity to protect Bitcoin by the other. All actions are likely to be well intentioned by those carrying them out, so try not to take them personally; remember, all parties believe they are acting in the best interests of Bitcoin.

With that in mind, let us speculate on how the Core team may react to Classic’s attempt to hard fork Bitcoin to 2MB blocks.

For a hard fork to be successful it needs consensus. A perfect hard fork would have 100% consensus, everyone would have updated their software before the fork activated and nobody would be separated from the main network.

On paper, it seems miners hold the power in determining whether a hard fork has consensus. A threshold is set for activation based on the number of recent blocks mined that support a fork, say 75%.

In reality, it’s not that simple. It’s easy to measure miner support in this way, but even if 99% of miners were in favour, if nobody else changed their software, those miners would just be mining amongst themselves, their newly minted coins worthless. Everybody else would continue using the existing network, which could be considered to have ‘won’.

Even then, it’s not that simple. If a hard fork to Bitcoin Classic were successfully achieved through 100% miner consensus but nobody else upgraded their software, the existing network could not be considered to have ‘won’, because it is dependent upon miners to function. It would actually be rendered completely useless, as no blocks would be created and no transactions could take place.

The most likely outcome is that if 75% of miners reached consensus, most of the remaining 25% would follow suit for fear of ‘losing out’. This is the most rational action to take, and consequently 75% should actually be enough to secure a successful hard fork.

The Core network would be rendered useless.

Even in the best-case scenario for Core, where 25% of miners continued to work on their side of the fork, seemingly against their economic interests, the network would still be severely disrupted. Instead of 10-minute blocks, it would take 40 minutes to process 1MB of transactions, while the Classic network could handle six times that capacity (2MB every 13.3 minutes). The Core network would be rendered useless.
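The arithmetic behind those figures is simple: until each chain reaches its next difficulty retarget, block intervals stretch in inverse proportion to the hashrate each side retains. A back-of-the-envelope check:

```python
TARGET_INTERVAL = 10.0  # minutes per block at full, pre-fork hashrate

def post_split_interval(hashrate_share):
    # Before any difficulty retarget, blocks slow down in inverse
    # proportion to the hashrate remaining on that side of the fork.
    return TARGET_INTERVAL / hashrate_share

core_interval = post_split_interval(0.25)     # 40.0 min per 1MB block
classic_interval = post_split_interval(0.75)  # ~13.3 min per 2MB block

core_throughput = 1.0 / core_interval         # MB of transactions per minute
classic_throughput = 2.0 / classic_interval

print(round(core_interval, 1), round(classic_interval, 1))  # 40.0 13.3
print(round(classic_throughput / core_throughput, 1))       # 6.0
```

So Classic's throughput advantage is a factor of six: three times the hashrate, times twice the block size.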

Following defeat in a hard fork, how could Core cling on to some hope and stay in the game? The answer comes in the form of a new hard fork of their own, deployed very quickly.

That hard fork could introduce any number of ideas, many of which may have already been coded behind the scenes in anticipation. The absolute minimum requirement would be lowering the difficulty level so that their network could continue to function.
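Why a hard fork would be needed for this, rather than waiting for the protocol to adjust on its own, falls out of Bitcoin's retarget rules: difficulty adjusts only every 2,016 blocks, scaled by expected time over actual time and clamped to a factor of four per period. A sketch of the arithmetic, under the 25%-hashrate scenario above:

```python
RETARGET_BLOCKS = 2016
TARGET_MINUTES = RETARGET_BLOCKS * 10  # two weeks at 10-minute blocks

def natural_retarget(old_difficulty, actual_minutes):
    # Bitcoin's built-in adjustment: scale by expected/actual time,
    # clamped to a factor of 4 in either direction per period.
    ratio = TARGET_MINUTES / actual_minutes
    ratio = max(0.25, min(4.0, ratio))
    return old_difficulty * ratio

# With 25% of the hashrate, blocks arrive every 40 minutes, so the
# next retarget is ~8 weeks away instead of 2.
weeks_until_retarget = RETARGET_BLOCKS * 40 / (60 * 24 * 7)
print(weeks_until_retarget)  # 8.0

# When it finally arrives, difficulty drops to a quarter, restoring
# 10-minute blocks -- but only after two months of a crippled chain.
print(natural_retarget(1.0, RETARGET_BLOCKS * 40))  # 0.25
```

Hence the appeal of simply hard-coding a lower difficulty at the fork point rather than limping through eight weeks of 40-minute blocks.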

The next priority might be to keep their network as closely synchronised with Classic’s as possible – the further the competing sides of the fork drift apart, the less likely it is anyone would ever switch away from the more dominant one.

To help stay synchronised, it would probably make sense for Core to increase the block size to 2MB. This sounds counterproductive, since this is exactly what they have been trying to avoid, but it would be essential to keep their network as closely synchronised as possible and prevent transactions that wouldn't fit into a 1MB block from disappearing from their side of the fork.

This would be life support: Core would be limping along next to Classic, running a separate but similar network.

As long as Core still has some skin in the game, they will be seen as a threat by Classic. Miners from the Classic side of the fork could conspire to attack the Core network. With the Core difficulty level decreased, attacks against the network would be far easier.

This would not be a malicious attack or a reprisal for allowing the network to be split; it would simply be an act of self-preservation by the miners to protect their investment, and should be considered the most logical and inevitable action for them to take.

With that in mind, another change Core could introduce in a hard fork would be to alter the Proof of Work (PoW). One small change to one line of code would render useless every bit of specialised mining hardware the network currently uses, and help mitigate the risk of attack from the Classic network. Far from a nuclear option, it would be a logical act of self defence that could also gain support from the altcoin community, whose hardware could be useful under a different PoW.
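The "one small change" claim is easy to illustrate. Bitcoin's proof of work hashes the 80-byte block header with double SHA-256; swapping that function for a different hash family, SHA-3 is used below purely as an illustrative stand-in, is a one-line change that instantly invalidates every SHA-256 ASIC:

```python
import hashlib

def pow_hash_current(header: bytes) -> bytes:
    # Bitcoin's proof of work today: double SHA-256 over the block header
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

def pow_hash_forked(header: bytes) -> bytes:
    # The hypothetical one-line change: a different hash family means
    # existing specialised mining hardware can no longer find valid blocks
    return hashlib.sha3_256(header).digest()

header = b"\x00" * 80  # a block header is 80 bytes
print(pow_hash_current(header) != pow_hash_forked(header))  # True
```

The code change is trivial; the economic consequence, stranding the entire installed base of mining hardware, is anything but.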

Changing the PoW as an idea isn't just hypothetical; it has already been coded, ironically by a Core developer, as a suggestion for the Classic source code. This was more likely an early act of subterfuge than a serious proposal, a way for Core to flex their muscles and show there are options open to them and they could be willing to put up a fight.


So what would happen next? With updated software, the Core developers would have a Phoenix network, resurrected from the ashes. This new network would be viewed by the community as being decentralisation-driven and backed by a group of talented developers with a proven track record. It is possible many would find such a network an attractive proposition, and the network could potentially compete with Classic.

history shows us that one format will triumph

This would not be the end of the world; divergence is not uncommon in new technologies. Betamax vs VHS, Blu-ray vs HD DVD, SD cards vs many rivals: history shows us that one format will triumph. Core vs Classic could become a noteworthy addition to the list, but ultimately one will win the format war of Bitcoin while the other will end up in the bargain bin with the rest of the altcoins.

What I have speculated would require alignment of a number of factors to be viable. Crucially, it would require sufficient appetite from the Core developers to put up such a potentially demanding and drawn out fight. While some may be keen for the challenge, without overwhelming developer consensus, the prospect of success is severely diminished.

More powerful at this stage is just the idea that they'd be prepared to put up a fight. Any miner that says they support Classic needs only a quiet word in their ear from someone in the Core team, suggesting they might create a fork that risks making the millions invested in their hardware worthless, and they just may be tempted to change their mind. Whether Classic can gain consensus, and whether Core would pursue this path, remain to be seen, but if they did, it would make fascinating viewing.