Does difficulty have anything to do with the size of the dataset Equihash has to process? If not, what factor(s) does difficulty relate to?

There are two things going on with Equihash w.r.t. how hard it is to solve.

The first thing is the chosen parameters (the N and K), and those dictate how much memory space/bandwidth will be needed to find a solution. My understanding is that these are fixed (or at least changed extremely rarely).

The second thing (and more to your question) is that a solution must have at least *d* leading zeros. This (*d*) is the parameter that can be adjusted by the network so that we continually hit the target block time.

EDIT: So to compare to Bitcoin: it works exactly the same way. We essentially just hash until we find a digest that has enough leading zeros. It's just that our hashing algorithm is different in that it requires high memory bandwidth.

The second thing above is equivalent to what Bitcoin does, and is what is canonically thought of as the "difficulty".
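This leading-zero check is easy to sketch. The snippet below (Python; `leading_zero_bits` and `meets_difficulty` are hypothetical helper names, not real client code) counts the leading zero bits of a double-SHA256 digest and compares against a target *d*:

```python
import hashlib

def leading_zero_bits(digest: bytes) -> int:
    """Count the leading zero bits of a byte string."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
        else:
            # bit_length() of the first nonzero byte locates its first 1 bit
            bits += 8 - byte.bit_length()
            break
    return bits

def meets_difficulty(header: bytes, d: int) -> bool:
    """True if SHA256d(header) has at least d leading zero bits."""
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return leading_zero_bits(digest) >= d

# A target of 0 leading zeros is always met
print(meets_difficulty(b"example header", 0))  # True
```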

In terms of the actual hardness of the PoW, yes, that is largely dictated by the size of the dataset (linked to the first thing above) - the limiting factor is sorting a list of `2^((n/(k+1))+1)` tuples. For example, with `n = 96, k = 3`, that is `2^25` rows.
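As a quick sanity check of that list size (Python sketch; `list_rows` is a made-up helper name):

```python
def list_rows(n: int, k: int) -> int:
    """Number of tuples Equihash must sort: 2^(n/(k+1) + 1)."""
    assert n % (k + 1) == 0, "Equihash requires n divisible by k+1"
    return 2 ** (n // (k + 1) + 1)

print(list_rows(96, 3))   # 33554432, i.e. 2^25
print(list_rows(200, 9))  # 2097152, i.e. 2^21 (Zcash's production parameters)
```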

Thank you both. I didn't even realise the same leading-zero approach to difficulty applied to Equihash… So, when the dataset is sorted, is it then somehow reduced to a 32- or 64-bit integer which is then assessed for leading zeros? And if/when that integer doesn't have sufficient leading zeros, what parts of the dataset can be altered in order to try again? For example: if a Bitcoin block hash comes up short of the target difficulty, there's a nonce field and even a date/time field that can be incremented to produce a slightly different dataset for the next hash.

The block header can be thought of as `Data | Nonce | Solution`. The Equihash algorithm takes as input `Data | Nonce` and outputs `[Solution_1, Solution_2, ...]` (two on average). The difficulty check takes as input the block header hash `SHA256(SHA256(Data | Nonce | Solution))` and checks for sufficient leading zeroes. The overall PoW is therefore to try different nonces, and for each nonce check each returned solution as a possibility for satisfying the difficulty check.

Because the Equihash algorithm depends on `Data`, which contains a timestamp, altering the timestamp will alter the solution space in the same way altering the nonce will. Similarly for altering the coinbase, included transactions, etc.
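Putting those pieces together, the overall PoW loop might look like the following sketch (Python; `equihash_solve` is a hypothetical stand-in for a real Equihash solver, and the 4-byte nonce encoding is illustrative):

```python
import hashlib

def sha256d(b: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def leading_zero_bits(digest: bytes) -> int:
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
        else:
            bits += 8 - byte.bit_length()
            break
    return bits

def mine(data: bytes, d: int, equihash_solve):
    """Try nonces; for each nonce, test every returned Equihash solution
    against the difficulty target d (leading zero bits of the header hash)."""
    nonce = 0
    while True:
        prefix = data + nonce.to_bytes(4, "little")
        for solution in equihash_solve(prefix):
            header = prefix + solution
            if leading_zero_bits(sha256d(header)) >= d:
                return nonce, solution
        nonce += 1

# Demo with a trivial fake "solver" and a zero target (always satisfied)
nonce, sol = mine(b"blockdata", 0, lambda inp: [b"dummy-solution"])
print(nonce, sol)  # 0 b'dummy-solution'
```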

That was very helpful - thank you again.

Page 6 of the Equihash paper (pdf) might shed more light if you're interested, and may go into the level of detail you're looking for. Either way it's a useful reference.

How do you estimate the time to find a block? For Bitcoin it is

time to find block = difficulty * 2^32 / hashrate

where difficulty is the current difficulty, hashrate is the number of hashes your miner calculates per second, and time is the average number of seconds between the blocks you find.
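As a sketch of that formula (Python; the numbers in the demo are illustrative only):

```python
def expected_seconds(difficulty: float, hashrate: float) -> float:
    """Expected seconds to find a block: difficulty * 2^32 / hashrate."""
    return difficulty * 2**32 / hashrate

# Illustrative: at difficulty 1 and 1 GH/s, expect about 4.3 seconds per block
print(expected_seconds(1, 1e9))  # 4.294967296
```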

We have a hashrate of 2/60 = 1/30 hashes/second per thread?

Do I just divide by 4?

The time to find a block is the same calculation as for Bitcoin; we just have a slower hash rate.

Sorry to resurrect an old thread… I'm getting some strange output when I perform this calculation:

Using the difficulty and network hash rate at the time of this writing for Bitcoin:

D = 253,618,246,641

H = 1,798,095,669 GH/s

T = D * 2^32 / H / 60 = ~10 minutes

This is in agreement with expectations. For Zcash, however, a difficulty of 12,000 implies that the network hash rate is 34 GH/s, which is obviously incorrect:

H = D * 2^32 / 2.5 / 60 = 34 GH/s

If I divide this answer by 10^6, then I get reasonable numbers. Is this 10^6 factor baked into the formula?
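For reference, the Bitcoin half of that calculation reproduced in code, using the figures quoted above:

```python
# Reproducing the Bitcoin check with the thread's figures
D = 253_618_246_641   # difficulty
H = 1_798_095_669e9   # network hash rate in H/s (1,798,095,669 GH/s)

minutes = D * 2**32 / H / 60
print(round(minutes, 1))  # 10.1 -- roughly the expected 10-minute block time
```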

I was also interested.

How do you convert network difficulty to hashrate, in Sol/s and H/s?

ZEC find block = Difficulty * 8192 / hashrate

ZEC(testnet) find block = Difficulty * 32 / hashrate

More precise:

Bitcoin hashrate = 2^32 * sum(past 144 Difficulties) / (previous TimeStamp - TimeStamp 144 blocks ago)

Zcash Sol/s = 2^13 * sum(past 120 difficulties) / (previous TimeStamp - TimeStamp 120 blocks ago)

Which is easier to understand as averages:

Zcash Sol/s = 2^13 * avg(120 D's) / avg(120 TS's)

edit: more precisely, Zcash Equihash H/s = 2^12 * avg(120 D's) / avg(120 TS's).
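That windowed estimate can be sketched as follows (Python; the function name, the list-based timestamp handling, and the demo figures are illustrative assumptions):

```python
def zcash_network_solrate(difficulties, timestamps):
    """Estimate network Sol/s over a window of blocks:
    2^13 * sum(difficulties) / elapsed seconds.
    Expects len(timestamps) == len(difficulties) + 1, since N block
    intervals are bounded by N+1 timestamps."""
    elapsed = timestamps[-1] - timestamps[0]
    return 2**13 * sum(difficulties) / elapsed

# Illustrative: 120 blocks at difficulty 12,000, spaced 150 s apart
diffs = [12000] * 120
stamps = [i * 150 for i in range(121)]
print(zcash_network_solrate(diffs, stamps))  # 655360.0 Sol/s, i.e. ~655 kSol/s
```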

The "12" comes from the number of leading zero bits required in a valid hash of the header when D=1.

In HUSH, it did not have any leading 0's, so its difficulty = the number of Equihash runs needed to find a valid hash.