I often hear that improvements in mining technology (more hashes per kWh) would reduce Bitcoin's energy usage. Isn't that a misunderstanding? Isn't the energy usage largely independent of the actual technology used by miners, because efficiency improvements just lead to a difficulty increase?
Consider this thought experiment. Say a miracle ASIC were invented with 1000 times the hash rate per kWh of current ASICs. Here is what I think would happen. At first, miners switching to the miracle ASIC would be hugely more profitable than the rest, so the others would switch as well, the global hash rate would rise, and with it the difficulty. Miners still running the old ASICs would become unprofitable and would have to scramble to get the new ones, too. In the end everyone would have switched to the miracle ASICs, some miners would give up, new ones would enter, the difficulty would be 1000 times what it was before, and the energy usage would be just about the same. (All in all, the energy usage is determined by the mining reward: new miners have an incentive to enter until most of the reward is eaten up by costs, and energy is the largest cost factor, so energy usage won't come down as long as PoW is used.)
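To make the equilibrium I mean concrete, here is a tiny back-of-the-envelope sketch. All numbers (BTC price, electricity price, efficiency figures) are hypothetical placeholders, just to illustrate that the efficiency term drops out of the energy usage:

```python
# Back-of-the-envelope sketch: miners enter until energy cost roughly
# eats the whole mining reward, so at equilibrium
#   total_kwh_per_day * electricity_price ~= reward_per_day
# and the hash rate scales with efficiency, but the energy does not.
# All numbers below are made up for illustration.

def equilibrium_energy(reward_usd_per_day, electricity_usd_per_kwh, hashes_per_kwh):
    total_kwh_per_day = reward_usd_per_day / electricity_usd_per_kwh
    total_hashes_per_day = total_kwh_per_day * hashes_per_kwh
    return total_kwh_per_day, total_hashes_per_day

reward = 144 * 3.125 * 60_000   # ~blocks/day * BTC/block * USD/BTC (hypothetical price)
price = 0.05                    # USD per kWh (hypothetical)

for efficiency in (1e12, 1e15):  # "current" vs. 1000x "miracle" ASICs, in hashes/kWh
    kwh, hashes = equilibrium_energy(reward, price, efficiency)
    print(f"{efficiency:.0e} hashes/kWh -> {kwh:,.0f} kWh/day, {hashes:.2e} hashes/day")
```

Running this, the daily energy comes out identical in both cases, while the daily hash count (and hence the difficulty) is 1000 times higher with the miracle ASIC, which is exactly the outcome I'm describing above.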
Am I missing something important there?
(BTW: I'm aware there could be some shifts if, e.g., the miracle ASIC is much more expensive than the old ones. My point in this thought experiment is that we shouldn't expect the energy usage to change by an order of magnitude, despite a three-orders-of-magnitude change in mining efficiency.)