I keep seeing news stories that give examples of how much energy a type of technology uses (usually AI or crypto). They claim very big numbers, like a whole ecosystem using "as much as a small country," or a single use costing "as much as an average home uses in a year."
Since the crypto ecosystem is so big, and I'm less inclined to defend it, I haven't thought much about those claims. But AI, while it still has problematic aspects, also has a lot of useful applications. When I run a single query, the idea that it uses the same energy as driving my car ten miles (or whatever) doesn't pass the smell test.
How are these numbers generated? Historically, media doesn't do great with science reporting ("a cure for cancer was just invented," etc.), so I'm just trying to get some context/perspective.
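For what it's worth, the smell test can be run as back-of-envelope arithmetic. The figures below are assumptions, not claims from any particular article: gasoline holds roughly 33.7 kWh per US gallon, a typical car gets around 25 mpg, and published per-query estimates for large language models tend to land somewhere in the 0.3–3 Wh range (estimates vary widely by model and methodology).

```python
# Back-of-envelope smell test. All figures are rough assumptions:
# - gasoline: ~33.7 kWh of energy per US gallon
# - car efficiency: ~25 miles per gallon
# - LLM query: assumed 0.3-3 Wh per query (published estimates vary)

GASOLINE_KWH_PER_GALLON = 33.7
MPG = 25.0
MILES = 10.0

# Energy to drive 10 miles: gallons burned times energy per gallon.
drive_kwh = (MILES / MPG) * GASOLINE_KWH_PER_GALLON  # ~13.5 kWh

query_wh_low, query_wh_high = 0.3, 3.0  # assumed per-query range, in Wh

# Even against the high end of the query estimate, driving dominates.
ratio_at_least = (drive_kwh * 1000) / query_wh_high

print(f"10 miles of driving: ~{drive_kwh:.1f} kWh")
print(f"that's at least {ratio_at_least:,.0f}x one query's energy")
```

Under these assumptions a single query is thousands of times cheaper than a ten-mile drive, which is why the "one query = ten miles" framing smells off; the per-query numbers themselves are the uncertain part.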
Look for the actual data: the numbers.
Let’s call Romania a “small country.” It uses about 50 billion kWh (50 TWh) per year. https://www.worlddata.info/europe/romania/energy-consumption.php
And about 120 TWh was used mining Bitcoin worldwide in 2023. https://www.cnet.com/home/energy-and-utilities/how-much-power-does-crypto-use-the-government-wants-to-know/
Are the numbers precise to the watt-hour? No, they’re estimates. But when Bitcoin mining’s ~120 TWh is more than double Romania’s ~50 TWh, a 5-10% buffer in the estimates doesn’t invalidate the statement that crypto mining uses more energy than a small country.
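The comparison above is just a unit conversion plus a margin check. A minimal sketch, using the two figures from the linked sources and an assumed ±10% buffer on each:

```python
# Figures from the sources above:
ROMANIA_KWH = 50e9    # ~50 billion kWh/year (worlddata.info)
BITCOIN_TWH = 120.0   # ~120 TWh/year for Bitcoin mining in 2023 (CNET)

# 1 TWh = 1 billion kWh, so Romania's usage converts to ~50 TWh.
romania_twh = ROMANIA_KWH / 1e9

BUFFER = 0.10  # assumed 10% uncertainty on each estimate

# Stack the uncertainty against the claim: shrink Bitcoin's estimate
# and inflate Romania's, then see if the comparison still holds.
bitcoin_low = BITCOIN_TWH * (1 - BUFFER)   # 108 TWh
romania_high = romania_twh * (1 + BUFFER)  # 55 TWh

print(f"Bitcoin (low end):  {bitcoin_low:.0f} TWh")
print(f"Romania (high end): {romania_high:.0f} TWh")
print(f"Claim survives the buffer: {bitcoin_low > romania_high}")
```

Even with both estimates pushed in the unfavorable direction, mining still comes out well ahead, so the headline claim doesn’t hinge on watt-hour precision.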
Use multiple sources, including .gov and academic ones, to reduce the risk of falling into some conspiracy rabbit hole.