Hashrate/power usage observations

A few observations on hashrates and power usage, based on my 3-4 months of mining with 19 GTX 1070s. I’ve had EVGA cards of various varieties (SC, SC2, FTW, FTW2, FTW Hybrid, Black), a couple of MSI Gaming X, a Zotac Mini, two Gigabyte G1s, and a Gigabyte Xtreme. All are open-air dedicated mining rigs built with all-new components.

The EVGA cards provide the best hashrate per watt. Of them, the FTW varieties (both hybrid and non-hybrid) do the best: 440-450 Sol/s at 110 W. Non-FTW EVGAs consistently get 420-430 Sol/s at 110 W.

The MSI and Gigabyte cards use much more power to get near 450 Sol/s, around 140-150 W or more. Lowering the power limit to reach 110 W drops the hashrate close to 400. The Zotac Mini was the absolute worst: it never went above 420 Sol/s at max power, and fell below 400 at 120 W. The Zotac also ran about 20 °F hotter than every other card under any conditions.
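
The hashrate-per-watt comparison above can be made explicit. This is a quick sketch using the round figures quoted in this thread; individual cards will vary:

```python
# Rough Sol/W comparison using the figures quoted above.
# (sols, watts) pairs are midpoints of the ranges reported in this thread.
cards = {
    "EVGA FTW":     (445, 110),  # ~440-450 Sol/s at 110 W
    "EVGA non-FTW": (425, 110),  # ~420-430 Sol/s at 110 W
    "MSI/Gigabyte": (450, 145),  # ~450 Sol/s at 140-150 W
    "Zotac Mini":   (400, 120),  # below 400 Sol/s at 120 W
}

for name, (sols, watts) in cards.items():
    print(f"{name:13s} {sols / watts:.2f} Sol/W")
```

The EVGA FTW comes out just over 4 Sol/W, while the MSI/Gigabyte numbers land near 3.1 Sol/W, which is the gap being described.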

I’ve spent a considerable amount of time playing with core and memory frequencies, the power limit, and even the voltage/frequency curve. The voltage/frequency curve made a difference, but I couldn’t find a setting that didn’t eventually crash the miner. Very touchy.

Interested in other experiences on this.


Very good post

Only one question: in your experience, which rig/card is the fastest at recovering its investment?

Definitely the EVGA FTW variants. At a 55-65% power limit they’ll hit 440-450 Sol/s using 110-115 W each. Crank them up to 70-80% power and you’ll get 465-475 Sol/s at 140 W. The FTW Hybrid stays stable at the highest clocks, up near 480-490 Sol/s, while still running cooler than the rest. If you can find them at a good price, the hybrids are the best. I’ve never tried a non-FTW hybrid, so I can’t speak to those.

Thank you. What would be a good price at this moment in time?

$450 for a Hybrid FTW is a good deal; they normally go for $500 or more. 1070s don’t come cheap at all anymore, though.

My EVGA 1080 Hybrid does 513 Sol/s @ 122 W, which equals 4.2 Sol/W on average.

I didn’t go with a 1080 since they cost significantly more than a 1070: 450 Sol/s for $450 (1070) vs. 513 Sol/s for $570 (EVGA FTW 1080).
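
Put in terms of upfront cost per unit of hashrate, using the prices quoted here (market prices obviously move around):

```python
# Dollars of purchase price per Sol/s of capacity, from the quoted figures.
gtx1070 = {"price": 450, "sols": 450}  # $450 for ~450 Sol/s
gtx1080 = {"price": 570, "sols": 513}  # $570 for ~513 Sol/s

for name, card in [("GTX 1070", gtx1070), ("GTX 1080", gtx1080)]:
    print(f"{name}: ${card['price'] / card['sols']:.2f} per Sol/s")
```

At these prices the 1070 works out to $1.00 per Sol/s vs. about $1.11 for the 1080, which is the cost argument being made.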

It is “only” 513 because I undervolted it, so I can’t overclock it that much. But you should look at the Sol/W, not the Sol/s, unless your electricity costs nothing! Of course mine can do much more than 513.

Yes, 4.2 Sol/W for the 1080 and a similar 4.1 Sol/W for the 1070 (450 Sol/s at 110 W). Dividing hashrate by efficiency gives power draw: 122 W (513/4.2) for the 1080 vs. 110 W (450/4.1) for the 1070. Including the variation you’ll get from card to card, they are pretty much equivalent.
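
To spell out the arithmetic: hashrate divided by efficiency gives power draw, since (Sol/s) / (Sol/W) = W.

```python
# (Sol/s) / (Sol/W) cancels to watts.
gtx1080_watts = 513 / 4.2  # ~122 W
gtx1070_watts = 450 / 4.1  # ~110 W

print(f"1080 draws ~{gtx1080_watts:.0f} W, 1070 draws ~{gtx1070_watts:.0f} W")
```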

yes, but on the GTX1080 you get the result faster :slight_smile:

This isn’t necessarily true. If you tune for best efficiency you can actually leave money on the table; I have found many times that more power produces more hashing. In other words, a card running at worse efficiency can still make more money than one at higher efficiency, even after taking power costs into account.

I see many people trying to hit 4 Sol/W, which is great, but many times they could make more money by mining less efficiently.


How’s that? You still need to pay, or at least should pay, for the power consumed :man_shrugging:
Efficiency - Wikipedia :wink:

It’s pretty straightforward, actually. You just need to do some back-of-the-envelope calculations regarding your ZEC / (sol/s) and compare it against a second calculation that looks at the net delta of kW-hr saved. That being said, it’s going to be a little different for each individual because of variance in card tolerance for undervolt/OC settings, etc., but in general (i.e., assuming you don’t pay $0.50 / kW-hr), I absolutely believe that @17outs is correct.

so what should I change or calculate now?

@17outs is correct. Regardless, the point of my original post is that EVGA cards tend to perform better whether run at 60% power or 100% power. It aggravated me to the point that I sold the six non-EVGA cards I had and am replacing them with EVGA FTW (hybrid and non-hybrid) models now. The efficiency gain will offset the cost of selling/buying these six in roughly 10 months, which is also roughly how long it takes at this point to “break even” (not counting selling your used cards when you’re done).

My EVGA cards hash about 100 Sol/s higher collectively than my mixed rig with Asus, MSI, and Gigabyte cards. However, they have pigtail power cords, which makes them more of a hassle to plan for power-wise.

I’m not familiar with a “pigtail” power connector on these cards. If you mean that the better models (i.e., FTW) have two 8-pin power connectors, then yes. You can get an 8-pin to 2x 8-pin adapter to connect these to your PSU.

Hi guys, I’m a new miner and very confused about hashrate and power usage. The problem starts with the average hashrate the profitability calculator shows. I have 3x RX 580 8 GB, an H81M-K motherboard, 8 GB RAM, a 240 GB SSD, and a 1000 W PSU. The calculator shows an average of $44/mo (it also notes it can be more or less). My hashrate right now averages 299 H/s per GPU, 893-898 H/s total for all three, and my earnings are about $70-87/mo. Am I doing something wrong? Please, I’m like a fish out of water here. Thanks.

I’d first ask why you are mining Zcash ($65/mo) instead of Ethereum ($88/mo)? Those numbers are the amounts after subtracting electricity costs. AMD cards are more profitable mining Ethereum than Zcash at the moment.

How do you figure you’re making $70-87/mo right now mining zcash on those three cards? Are you not subtracting what it costs you in electricity?
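
To illustrate why subtracting electricity matters for a rig like that, here is a sketch. The per-card wattage, rig overhead, and electricity price are all assumptions for illustration, not figures from this thread:

```python
# Net monthly profit = gross payout minus electricity cost.
# Wattage and electricity price below are assumed placeholders.
gross_usd_per_month = 87          # top of the quoted $70-87/mo range
rig_watts = 3 * 135 + 100         # three RX 580s plus board/PSU overhead (assumed)
usd_per_kwh = 0.12                # assumed electricity price

electricity = rig_watts / 1000 * 24 * 30 * usd_per_kwh
net = gross_usd_per_month - electricity
print(f"electricity: ${electricity:.2f}/mo, net: ${net:.2f}/mo")
```

Even at a modest electricity price, a rig drawing ~500 W costs tens of dollars a month to run, so a gross figure and a net-of-electricity figure can easily differ by the amount being discussed here.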