So I test my rig management code on a small test rig (four GTX 1080 Tis). I have been analyzing the log files since Aug 8th and noticed the following.
Below is a plot showing the hourly Accepted Share values for each GPU. GPU0 had an issue where its Sol/s would halve without the miner reporting a failure, so I had written code to watch GPU performance over time (running average and variance) to deal with that. You can see the halving of shares earned per GPU on Sept 28th that we all noticed. So nothing here raised any red flags for me.
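For anyone curious what that kind of watchdog looks like, here is a minimal sketch of a per-GPU running mean/variance monitor using Welford's online algorithm. The class name, threshold, and warm-up count are illustrative assumptions, not my actual rig code.

```python
class GpuHealth:
    """Track a running mean/variance of Sol/s samples for one GPU."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations

    def update(self, sols):
        # Welford's online update: numerically stable mean/variance
        # without storing the sample history.
        self.n += 1
        delta = sols - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (sols - self.mean)

    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

    def is_degraded(self, sols, factor=0.6):
        # Flag the "Sol/s halved without failing" symptom: the latest
        # sample sits well below the long-run average. The 0.6 factor
        # and 10-sample warm-up are arbitrary choices for this sketch.
        return self.n > 10 and sols < factor * self.mean
```

Feeding each hourly Sol/s reading through `update()` and checking `is_degraded()` would catch a silent halving like GPU0's long before it shows up in the payout numbers.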
However, when I did the same plot for the Dev Fee Shares below, I noticed two things.
- The DevFee is all over the place, with periods of high rates and low rates (why?).
- The Dev Fee shares did NOT halve with the earned Shares.
So here is a different plot showing that the EWBF DevFee goes from 1.9% before the share halving to 3.6% after (computed as a flat average). The DevFee now sits up at 8% for many hours when it peaks. So what is going on here?
Update: the DevFee pattern seems to correlate with difficulty. When difficulty drops (total mined shares go up), the DevFee spikes; when difficulty is high (mined shares go down), the DevFee is low. So the DevFee effectively removes the variability in shares earned caused by difficulty fluctuation and keeps it for the dev. A flat average calculation is therefore not correct, and the stated DevFee is NOT 2% (before, and especially after, the change at Flypool).
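To see why the averaging method matters, here is a small sketch comparing a pooled (flat) rate against per-hour effective rates. The hourly share counts are made-up numbers chosen to mimic the pattern above: the fee runs hot exactly in the high-share (low-difficulty) hours, so those hours dominate the pooled rate.

```python
# Hypothetical hourly (total shares mined, DevFee shares) pairs:
# three "high difficulty" hours at an apparent 2% fee, two
# "low difficulty" hours where the fee spikes to 8%.
hours = [
    (100, 2), (100, 2), (400, 32), (100, 2), (400, 32),
]

total = sum(t for t, _ in hours)
dev = sum(d for _, d in hours)

flat_rate = dev / total                    # pooled rate over the whole log
hourly_rates = [d / t for t, d in hours]   # per-hour effective fee rates

print(f"pooled rate:  {flat_rate:.1%}")
print("hourly rates:", [f"{r:.1%}" for r in hourly_rates])
```

With these numbers the per-hour rate swings between 2% and 8%, and the pooled rate lands well above 2% because the fee concentrates in the hours with the most shares. That is the same shape as the Flypool data, just with invented counts.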
I am NOT against DevFees and happily pay them, but NOT when I am lied to about what the DevFee really is.
Below is a scatter plot of total shares mined vs. DevFee shares mined. The right side is before the earned shares halved on Flypool and the left is after. Either way there is a strong, steep correlation between mined shares and DevFee shares: even a small increase in mined shares and the DevFee skyrockets (it's NOT 2%, folks)!
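The slope of that scatter is the real marginal fee rate: fit devfee_shares = a * total_shares + b by least squares and compare the slope a against 0.02. A sketch, using made-up hourly points shaped like the plot (a true 2% fee would give a slope near 0.02):

```python
def fit_line(xs, ys):
    """Least-squares slope and Pearson r for paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sxx, sxy / (sxx * syy) ** 0.5

# Hypothetical hourly (total shares, DevFee shares) points where the
# fee grows faster than 2% of the total -- stand-ins for the real log data.
totals  = [100, 150, 200, 300, 400, 500]
devfees = [  2,   5,  10,  18,  28,  40]

slope, r = fit_line(totals, devfees)
print(f"slope = {slope:.3f}, r = {r:.3f}")
```

On these invented points the slope comes out several times larger than 0.02 with r near 1, which is the "strong and steep" relationship the scatter shows. Running the same fit on the actual hourly log counts would put a hard number on the true fee rate.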