# Equihash algorithm

I know I am missing something simple about the algorithm, but I don’t understand why each nonce should only give two (on average) solutions. This is how I interpret it.

We start with 1048576 blake2b hashes and split each in half to make a list of 2097152 pseudo-random numbers.

Then we XOR pairs of numbers such that the first 20 bits of the XOR equal 0. The output list should be about the same size as the input list. Rinse and repeat: at each stage the XOR output list should be about the same size as the input list.
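To see why the list size stays roughly constant, here is a toy sketch of one collision round (my own illustration with made-up small parameters, n = 40-bit strings colliding on b = 10 bits — not Zcash's exact personalization or parameters):

```python
import hashlib
from collections import defaultdict

# Toy parameters (hypothetical, much smaller than Equihash's n=200, k=9):
# n_bits-bit strings, collide on the leading b bits per round.
n_bits, b = 40, 10
N = 1 << (b + 1)            # 2^(b+1) entries, mirroring N = 2^(n/(k+1)+1)

# Pseudo-random n-bit strings derived from blake2b, one per index.
strings = []
for i in range(N):
    h = hashlib.blake2b(i.to_bytes(4, "big"), digest_size=8).digest()
    strings.append(int.from_bytes(h, "big") >> (64 - n_bits))

# Bucket by the leading b bits, then XOR every pair inside a bucket;
# members of a bucket share their leading b bits, so the XOR zeroes them.
buckets = defaultdict(list)
for s in strings:
    buckets[s >> (n_bits - b)].append(s)

out = []
for group in buckets.values():
    for i in range(len(group)):
        for j in range(i + 1, len(group)):
            out.append(group[i] ^ group[j])

# With 2^(b+1) strings spread over 2^b buckets (avg. 2 per bucket), the
# expected number of pairs is about C(N, 2) / 2^b ≈ N, so the output
# list is about the same size as the input list.
print(len(strings), len(out))
```

This is exactly the question's observation: each round trades b zeroed bits for pairings, and the list size hovers around N.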

So why are there not about 2097152 solutions per nonce?

From the white paper:
“Before the last (k-th) collision search we expect N − 2^(k−1) + 1 ≈ N = 2^(n/(k+1)+1) entries, thus on average we obtain two solutions after the last step”

2^(200/(9+1)+1) = 2097152 entries that should all equal 0 on the last step, no?

On the last (the ninth) step, not only the first 20·9 = 180 bits of the XORs should be zero, but all 200 bits.

So it’s kind of like this: on the last step the ~2^21 strings form roughly 2^21 · 2^21 / 2 = 2^41 candidate pairs, and each pair zeroes the remaining 40 bits (the ninth 20-bit chunk plus the final 20) with probability 2^(−40), which gives around 2 solutions.

Perhaps I’m far from strict with the math here, so it’s just an idea.
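To make the hand-waving concrete, here is a back-of-the-envelope calculation (my own sketch, using the paper's n = 200, k = 9) of the expected list size per step and the expected solution count:

```python
# Expected-count arithmetic for Equihash with n = 200, k = 9:
# each ordinary step collides on n/(k+1) = 20 bits; the last step must
# cancel the remaining 40 bits (the ninth 20-bit chunk plus the final 20).
n, k = 200, 9
b = n // (k + 1)            # 20 bits zeroed per ordinary step
N = 2 ** (b + 1)            # 2^21 = 2097152 initial entries

size = N
for step in range(1, k):    # steps 1..8: pairs / 2^20 keeps the list near N
    size = size * (size - 1) / 2 / 2 ** b

# Matches the paper's N - 2^(k-1) + 1: about N - 255 entries remain.
# Last step: each candidate pair must match on the remaining 2b = 40 bits.
solutions = size * (size - 1) / 2 / 2 ** (2 * b)
print(round(size), round(solutions))  # size stays near N; solutions ≈ 2
```

So the list size really does stay at about 2^21 through step 8, and it is only the much harder 40-bit requirement on the last step that collapses ~2^41 candidate pairs down to an expected 2 solutions.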