Does Zcash address the scaling issues that Bitcoin has?


Despite being a software engineer, I'm not very well versed in some of the specifics underlying cryptocurrencies, so please bear with me.

Problems with Bitcoin

Transactions / Sec

Bitcoin ran into a problem that has still not been solved: a hard limit on the number of transactions that can be processed at any given time. Increasing the size of each block addresses this, but to my knowledge it still leaves the possible transactions/sec at a number nowhere near what would be required to compete in global ecommerce (e.g., with Visa).

Blockchain Size

To my knowledge, the block chain will continue to grow in size for as long as Bitcoin is used, and the proposed fix for the aforementioned transactions/sec problem (increasing block size) only accelerates that growth. Bitcoin's success is therefore pegged to a sort of Moore's Law of networking, computation, and storage. This may seem like a non-issue in the 2-3 transactions per second range, but the cited ~8Mb/sec required to meet Visa-like demand seems like a much bigger deal than it's made out to be, especially for emerging markets and mobile users, and especially considering that electronic transaction volume is growing drastically worldwide.


Given that even 7 million transactions (what Visa handles in a single hour), at a much smaller block size than higher volumes would require, takes 100MiB of storage (those numbers are from 2012, before most people had even heard of Bitcoin), storage seems like a non-trivial issue to deal with as well.
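For a rough sense of scale, here's a back-of-envelope calculation. The ~250 bytes per transaction is my own assumption (a commonly quoted average for Bitcoin transactions), not a figure from the discussion above:

```python
# Back-of-envelope: bandwidth and storage for Visa-like volume,
# assuming Bitcoin-style transactions of ~250 bytes each (an assumption).

TX_PER_HOUR = 7_000_000   # Visa's cited hourly transaction volume
TX_SIZE_BYTES = 250       # assumed average transaction size

tx_per_sec = TX_PER_HOUR / 3600
bytes_per_sec = tx_per_sec * TX_SIZE_BYTES

mbit_per_sec = bytes_per_sec * 8 / 1_000_000        # sustained bandwidth
tb_per_year = bytes_per_sec * 86_400 * 365 / 1e12   # chain growth per year

print(f"{tx_per_sec:.0f} tx/s, {mbit_per_sec:.1f} Mbit/s, {tb_per_year:.1f} TB/year")
```

Under these assumptions you get roughly 1,944 tx/s, a few Mbit/s of sustained bandwidth (the same order of magnitude as the ~8Mb/sec cited above), and chain growth on the order of 15 TB per year.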


How will Zcash deal with each of those, and with other issues of scale I've overlooked? My understanding is that at least part (most?) of Bitcoin's overhead comes from the triple-entry method by which transactions are verified. Do zero-knowledge proofs render all of these issues moot? How many transactions per second could the first iteration of Zcash handle, in theory? Are there inherent limitations, as in Bitcoin's case?


We can't make any concrete assessments about scaling. As far as scaling is concerned, our tech is better in some respects (most transaction data can be pruned) and worse in others (we have an unprunable accumulator). Lots of ideas are floating around our team that could help with this while still maintaining a resilient decentralized network, but we want to address it later, as our concern is privacy first and foremost. Premature optimization could lead to poor designs.

We also may consider the tradeoffs of hardforks differently than upstream Bitcoin does.


I think "premature optimization" dogmas don't apply here, because what we're discussing is not code optimization, but rather system architecture that's based on a white paper and includes mathematically provable outcomes, right?

As an analogy for my point, if somebody is planning a 3000 story building, it's not premature optimization to ask what kind of elevator would make it possible to carry people from floor 1 to floor 3000[1], because, if the intention of the building from the outset is to have people on each floor, then not accounting for the elevator would lead to a 3000 story paperweight. Premature optimization in this analogy would be building a new highway system to support the building's perceived effects on regional traffic, without realizing that most of the building could be sold to foreigners who visit for only 2 weeks per year. IOW, both are about transport and throughput, but the former is an ancillary structural requirement for the end goal[2], while the latter is making unfounded assumptions about auxiliary needs.

Also, I'm merely asking these questions about these scaling issues, not suggesting anything be changed.

  1. Even if the initial means is to simply provide stairs
  2. Assuming the goal of Zcash is to become a dominant form of cryptocurrency, or better


Zooko has commented here about this subject before, see this post.

To expand a bit: some optimizations we've proposed impact our security proof and assumptions in non-trivial ways and require expensive and time-consuming analysis. We've considered a lot of potential changes and approaches, and I personally believe we have enough available to us after launch to gradually make performance better without impacting the design much.

This isn't to say we aren't doing any optimizations before launch; we're currently working to improve performance using novel constructions, and have contacted additional experts to verify their security before we commit to them. We're prioritizing optimizations based on their impact on the scheme's security, their practicality, and their performance benefit. In some cases we may make changes to the scheme before launch which do affect our security proof but simultaneously improve the design and performance.

We're all very aware of Bitcoin's scalepocalypse and keep that in mind in every decision we make. :slightly_smiling:


Thanks, @ebfull. I'm more interested in whether or not Zcash's architecture implicitly addresses any of the built-in scaling issues that Bitcoin has (e.g., by Zcash not requiring triple-entry, or something else along those lines). Based on your replies, I'm guessing that this isn't the case. Is that right?


The zerocash whitepaper goes into more detail, but our scheme is basically composed of two structures which are maintained by full nodes:

  1. An incremental merkle tree which is fixed-size (its depth stays the same over time, and it's append-only). This contains cryptographic commitments to Coins (we may call these "notes" instead, see #539). If you send someone money using our scheme, you construct a Coin and prove in zero-knowledge that the commitment has the correct value and some other details. The commitment is simply appended to the tree. As a result, this construction does not have a huge scalability impact.
  2. A "spent serials set" which constantly grows. When you spend some money, your zero-knowledge proof validates that you've disclosed a serial number (like on a banknote) which is derived from the Coin and from secrets held by the spender, and which cannot be changed. Full nodes append to this set and reject transactions which use the same serial twice, as those would be double-spends. This does mean, though, that a fully-validating node needs to retain the entire set. The serials are currently 32-byte hashes.

The scheme does not intend to address any scalability issues in Bitcoin, but does have some good pruning properties anyway.
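A toy sketch of those two structures, for intuition only. Everything here is illustrative: the class name `FullNodeState` is made up, SHA-256 stands in for the scheme's real commitment and serial derivation, and a plain list stands in for the fixed-size incremental merkle tree:

```python
import hashlib

def h(data: bytes) -> bytes:
    """SHA-256, standing in for the scheme's commitment/serial derivation."""
    return hashlib.sha256(data).digest()

class FullNodeState:
    """Toy model of the two structures a full node maintains.
    Names and details are illustrative, not the real Zcash design."""

    def __init__(self):
        # Append-only commitments. A real node keeps a fixed-depth
        # incremental merkle tree, so its working state stays small.
        self.commitments = []
        # Grows forever: one 32-byte serial per spend, retained to
        # detect double-spends. This is the unprunable part.
        self.spent_serials = set()

    def add_commitment(self, coin: bytes) -> None:
        # New payments only ever append; old entries are never mutated.
        self.commitments.append(h(coin))

    def try_spend(self, serial: bytes) -> bool:
        # A transaction revealing an already-seen serial is a double-spend.
        if serial in self.spent_serials:
            return False
        self.spent_serials.add(serial)
        return True

node = FullNodeState()
node.add_commitment(b"coin-data")
serial = h(b"coin-data-and-spender-secret")
print(node.try_spend(serial))  # True  (first spend accepted)
print(node.try_spend(serial))  # False (double-spend rejected)
```

The asymmetry the post describes falls out of this shape: the commitment side can be summarized compactly, while the serial set has no such summary and must be kept in full.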


I believe you will come to realize eventually that the main issue with scalability is not the block chain storage size but the fact that verification MUST be centralized. Once you come to that realization, you will eventually end up with my conclusion as to the only possible solution. A recent discussion about how Satoshi's PoW design does not solve the Byzantine Generals Problem[1] is relevant to capitulating to this realization. Note also the irreparable flaw in Segregated Witness.