I frequently see conversations in mining forums that extol the virtues of reducing voltage input on GPUs to increase efficiency. People proudly post that they are getting 4+ sol/watt, etc.
My question is this: Why?
Example: If I’m getting 400 sol/s at 80% voltage and I crank it up to 100% and get 460 sol/s, why wouldn’t I do that, as long as the additional coin I’m mining is worth more than the additional electricity I’m using? Sure, your efficiency (sol/watt) goes down because of diminishing returns, but who cares, as long as each additional watt increases your profit?
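To make the marginal-profit reasoning concrete, here’s a quick back-of-envelope sketch. All the numbers in it (power draws, electricity rate, coin value per sol/s) are made up for illustration, so plug in your own card’s readings:

```python
# Back-of-envelope marginal-profit check.
# Every constant below is an assumed, made-up number for illustration.

ELEC_RATE = 0.12         # assumed electricity price, $/kWh
USD_PER_SOL_DAY = 0.005  # assumed $ earned per sol/s per day

def daily_profit(sol_s, watts):
    """Daily revenue minus daily power cost for one card."""
    revenue = sol_s * USD_PER_SOL_DAY
    power_cost = watts / 1000 * 24 * ELEC_RATE  # kW * hours * $/kWh
    return revenue - power_cost

low  = daily_profit(400, 150)  # 80% voltage: 400 sol/s @ 150 W (assumed draw)
high = daily_profit(460, 220)  # 100% voltage: 460 sol/s @ 220 W (assumed draw)

print(f"80%  setting: ${low:.2f}/day")   # ~$1.57/day at ~2.7 sol/watt
print(f"100% setting: ${high:.2f}/day")  # ~$1.67/day at ~2.1 sol/watt
print(f"Cranking up is worth it: {high > low}")
```

With those made-up numbers, efficiency drops from about 2.7 to 2.1 sol/watt, but daily profit still goes up, which is exactly my point.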
What other factor am I missing here? Does this practice reduce the useful life of the card?