Core Developer Gavin Andresen has announced a proposal to increase the Bitcoin block size next year, which would mean significant changes to the Bitcoin code.
This measure would require a hardfork of the entire network. A hardfork is a change to the Bitcoin protocol that makes previously invalid blocks or transactions valid, and it therefore requires all users to upgrade.
Andresen has been a strong supporter of a hardfork to address scalability. At the last two Bitcoin conferences he attended, he titled his talks, "We're gonna need a bigger chain."
On his blog, Andresen has previously addressed the scalability issue and laid out his personal roadmap. A few days ago, redditors noticed the announcement, still only a rough idea, that Andresen had shared with the world. On GitHub, he had stated:
"Hard fork: allow 20MB blocks after 1 March 2016 Allows any block with a timestamp on or after 1 March 2016 00:00:00 UTC to be up to 20,000,000 bytes big (serialized).
“I believe this is the simplest possible set of changes that will work."
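To see why such a change is a hardfork rather than a backwards-compatible one, consider a minimal Python sketch (not Andresen's actual patch, which modifies Bitcoin Core's C++ consensus code) of the old and proposed validity rules, using only the numbers quoted above:

```python
from datetime import datetime, timezone

# The 20,000,000-byte cap and the 1 March 2016 date come from the proposal
# quoted above; everything else in this sketch is illustrative.
OLD_MAX_BLOCK_SIZE = 1_000_000      # 1 MB limit enforced by non-upgraded nodes
NEW_MAX_BLOCK_SIZE = 20_000_000     # proposed 20 MB limit
FORK_TIME = datetime(2016, 3, 1, tzinfo=timezone.utc)  # 1 March 2016 00:00:00 UTC

def old_rule_accepts(block_size: int, block_time: datetime) -> bool:
    """Validity check run by nodes that have NOT upgraded."""
    return block_size <= OLD_MAX_BLOCK_SIZE

def new_rule_accepts(block_size: int, block_time: datetime) -> bool:
    """Upgraded nodes apply the larger limit only to blocks timestamped
    on or after the fork date."""
    limit = NEW_MAX_BLOCK_SIZE if block_time >= FORK_TIME else OLD_MAX_BLOCK_SIZE
    return block_size <= limit

# A 5 MB block mined after the fork date is valid to upgraded nodes but
# invalid to old ones: previously invalid blocks become valid, which is
# exactly why every node has to upgrade.
big_block = (5_000_000, datetime(2016, 3, 2, tzinfo=timezone.utc))
print(old_rule_accepts(*big_block), new_rule_accepts(*big_block))  # False True
```

Under the new rule, a block that old nodes reject becomes acceptable to upgraded nodes, so any node that does not upgrade would split away from the majority chain.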
Redditors and Core Developers Disagree
Immediately after the GitHub post, much of the Reddit community went into an uproar over the announcement, though some readers welcomed the news. Another core developer, Gregory Maxwell, wrote a reply to the Reddit post:
"Reddit, I think you're jumping the gun based on watching a personal repository. I think this is just some testing code — he hasn't discussed this particular change with the other core developers; I for one would vigorously oppose it."
“For one, it's actually /broken/ because it doesn't change the protocol message size (makes for a nice example of how misleading unit tests often are; in this case they're vacuous as they don't catch that blocks over about 2MB wouldn't actually work). It's also not consistent with the last discussions we had with Gavin over his large block advocacy, where he'd agreed that his 20mb numbers were based on a calculation error.”
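Maxwell's technical objection, taken at face value, is that block validity and the peer-to-peer message size are enforced by separate limits, so raising one without the other leaves large blocks unable to propagate. The Python sketch below illustrates that gap; both constant names are illustrative stand-ins rather than Bitcoin Core's real identifiers, and the 2MB figure simply echoes Maxwell's "about 2MB" remark:

```python
# Illustrative constants only; the real limits live in Bitcoin Core's C++ code.
MAX_BLOCK_SIZE = 20_000_000        # consensus limit raised by the proposed patch
MAX_P2P_MESSAGE_SIZE = 2_000_000   # network-layer cap Maxwell says was left unchanged

def block_is_valid(block_size: int) -> bool:
    """Consensus check: is the block within the (new) size limit?"""
    return block_size <= MAX_BLOCK_SIZE

def peer_will_relay(message_size: int) -> bool:
    """Network check: will a peer even accept a message this large?"""
    return message_size <= MAX_P2P_MESSAGE_SIZE

# A 5 MB block passes consensus validation but is rejected at the network
# layer, so in practice it never propagates. Unit tests that exercise only
# block_is_valid() would not catch this, which is Maxwell's point about
# the tests being "vacuous."
size = 5_000_000
print(block_is_valid(size), peer_will_relay(size))  # True False
```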
Developer Peter Todd agreed with Maxwell via Twitter.
Recently, Cointelegraph spoke with Todd about Andresen's advocacy for increasing the block size. He said he was not in favor of it and called Andresen’s move "premature." Todd told CT that others agreed with his stance. He said that with a 20MB increase, centralized systems would grow larger, which he argued goes against the very nature of decentralization.
"If transaction demand goes up about a hundred times — which can easily happen — then we can’t just do another block increase,” he said, continuing:
“Six megabytes is the best you can do; 20 megabytes is already stretching it based on [an] optimistic set of assumptions. You are not going to get 100-megabyte blocks, gigabyte blocks; not in a system that’s decentralized."
There are many arguments against Andresen’s proposal, and he appears intent on addressing them one by one. It is no secret that some developers disagree with his ideas on scalability.
Following Maxwell's reply to the redditors, Andresen responded:
"Actually, it does change the protocol size. But yes, it is intended as 'it is time to discuss this now.' I will be writing a series of blog posts in the coming week or two responding to objections I've heard."
Objections have been discussed every day since the block size announcement. Maxwell stressed on Reddit that most of the core developers are not committed to the proposal posted by Andresen.
Lightning, Chains and Factoids
Other ideas, such as lightning networks, sidechains, and even Factom, have been mentioned as alternatives to Andresen’s idea. Bitcoin developer Mike Hearn discussed the alternatives in a recent editorial following Andresen’s proposal. He specifically highlighted lightning networks, which transmit payments through payment channels and fall back to the blockchain for arbitration.
Andresen has spoken about all of these counter-arguments openly on his blog. In his recent keynote addresses in both Boston and London, he stated, "Consensus is hard," acknowledging that his proposal may not solve the problem.
“The next scaling problem that needs to be tackled is the hardcoded 1-megabyte block size limit that means the network can support only approximately 7-transactions-per-second,” he stated.
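The roughly seven-transactions-per-second figure follows from simple arithmetic. Assuming an average transaction of around 250 bytes and one block every ten minutes (common rough assumptions, not numbers from Andresen's talk), a quick back-of-the-envelope calculation gives about 6.7 transactions per second at 1MB and about 133 at 20MB:

```python
# Back-of-the-envelope throughput estimate. The ~250-byte average transaction
# size and the 10-minute block interval are assumptions, not article figures.
AVG_TX_BYTES = 250
BLOCK_INTERVAL_SECONDS = 600

def transactions_per_second(max_block_bytes: int) -> float:
    return (max_block_bytes / AVG_TX_BYTES) / BLOCK_INTERVAL_SECONDS

print(round(transactions_per_second(1_000_000), 1))   # ~6.7 tps at 1 MB
print(round(transactions_per_second(20_000_000), 1))  # ~133.3 tps at 20 MB
```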
Shortly after that, Andresen released a blog post titled "Block Size and Miner Fees ... Again" about the disagreements over his proposal. In it, he took on the argument for keeping a 1MB block limit: "The network will be more secure with one megabyte blocks, because there will be more competition among transactions, and, therefore, higher transaction fees for miners," he stated, adding that he has countered this argument before:
"I’ve written before about the economics of the block size (and even went to the trouble of having 'Real Economists' review it), but I’ll come at that argument from a different angle."
He argued that even a 1MB block size limit handles only about 30% of transactions per day. He said that over time the blockchain will keep growing in size, and pruning it is seemingly becoming necessary.
A Fork in the Road
Andresen wrote a blog post addressing Todd's and others' focus on "centralization theory," one of a series of posts he has published on the subject since the GitHub statement.
He addressed the centralization of nodes around API services like those that heavyweights Coinbase and BitPay offer. Andresen also addressed the increasing cost of network transactions, agreeing that higher costs will likely happen:
“I agree with Jameson Lopp’s conclusion on the cause of the decline in full nodes — that it is a direct result of the rise of web based wallets and SPV (Simplified Payment Verification) wallet clients, which are easier to use than heavyweight wallets that must maintain a local copy of the blockchain.
“Give the typical user three experiences — a SPV wallet, a full node processing one-megabyte blocks, and a full node processing 20-megabyte blocks, and they will choose SPV every time.”
Andresen believes both of these trends could continue to grow, but not by very much, and that over time the network will level itself out.
Bitcoin expert Andreas M. Antonopoulos has joined the block size debate as well, encouraging others to follow along. “If you ever wondered how the bitcoin core development team debates key issues that affect ‘consensus’ code, this is your chance,” he said, adding that now is the time to see "almost every core dev ... posting detailed, nuanced and well-considered arguments for and against increasing the block size limit."
Bitcoin has experienced a hardfork in the past, and people disagreed then, too. Many remember the hardfork of March 11, 2013, when miners rejected the new implementation and the network rolled back to an earlier version of the Bitcoin protocol. The code is open source and secured by the network itself: either consensus forms around the update or the fork is ultimately rejected. Because consensus is necessary, big changes to the code happen rarely.