No; rather, companies are buying thousands of computers to install AV on.
Bitcoin miners are actually a very small minority of computer users, whereas AV adds an extra 10-30% power overhead (possibly more, if we factor in that modern CPUs throttle way down when not under load) to the majority of corporate PCs in operation, to say nothing of home users.
Back of the napkin math suggests that the comparison is indeed ridiculous, but only because AV usage absolutely dwarfs bitcoin usage.
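Here's one way to sketch that napkin math. Every input below is an assumption I'm pulling out of the air (PC counts, wattages, duty cycles, and the bitcoin figure are all guesses, not measurements), and the conclusion swings entirely on which inputs you pick:

```python
# Back-of-the-napkin comparison: annual energy of AV overhead vs bitcoin.
# Every constant here is an assumed round number, not a measured figure.

PCS_WITH_AV = 1.5e9        # assumed PCs (corporate + home) running AV
AVG_DRAW_W = 60            # assumed average draw per PC while in use, watts
AV_OVERHEAD = 0.20         # assumed 20% extra load from AV scanning
HOURS_ON_PER_YEAR = 2500   # assumed ~7 hours/day of active use

# watts * hours = Wh; divide by 1e12 to get TWh
av_twh = PCS_WITH_AV * AVG_DRAW_W * AV_OVERHEAD * HOURS_ON_PER_YEAR / 1e12

BITCOIN_TWH = 100          # assumed order of magnitude for bitcoin mining

print(f"AV overhead: ~{av_twh:.0f} TWh/yr; bitcoin: ~{BITCOIN_TWH} TWh/yr")
```

With these particular guesses the two come out the same order of magnitude; nudge the PC count or overhead up and AV pulls well ahead, which is the point: the comparison hinges on the sheer install base.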
My pet peeve is VP9 on YouTube vs Chrome on macOS. My rough estimate is that the missing VP9 hardware decoder on Macs, combined with YouTube's choice to drop H.264 for high-resolution videos, wastes as much power as the entire country of Puerto Rico.
It's impossible to play 8K YouTube videos even on the highest-end MacBook in Chrome. Ironically, MKBHD uploads them without being able to play them himself.
> The comparison of bitcoin vs AV energy usage is a bit ridiculous. No one is buying hundreds of GPUs to mine AV.
No, but AV runs almost everywhere.
I'd be very surprised if bitcoin mining produces even 1% of the CO2 emissions of AV software. Mostly because the reward from mining has been competed down so far that it's nowhere near profitable at normal electricity prices, so mining mostly happens in places with very cheap electricity, such as towns in China near hydroelectric dams with massive excess production.
Sometime last year, someone posted a writeup where they worked out that buying enough gas (I forget whether "natural" or "-oline") to mine 1 bitcoin, ignoring fixed costs like the generator or GPU, would cost them ~1.2 BTC. That might change if you live near an oil well, refinery, or coal mine, but I'd kinda like to see a statistical analysis of whether bitcoin's time between blocks varies with time of day based on which areas have excess solar power.
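For anyone who wants to redo that writeup's arithmetic: here's a rough sketch. I don't know the original post's inputs, so the hashrate, miner efficiency, generator yield, and gas price below are all my own assumptions; whether the result exceeds 1 BTC of value then depends on the BTC price you plug in:

```python
# Sketch of the "gas to mine 1 BTC" arithmetic. All constants are assumed.

NETWORK_HASHRATE = 400e18          # assumed network hash rate, hashes/sec
MINER_J_PER_HASH = 30e-12          # assumed ASIC efficiency (~30 J/TH)
BLOCK_REWARD_BTC = 6.25            # block subsidy, ignoring fees
BLOCKS_PER_DAY = 144               # ~one block every 10 minutes

# Power the whole network burns, and BTC minted per day:
network_w = NETWORK_HASHRATE * MINER_J_PER_HASH        # watts
btc_per_day = BLOCK_REWARD_BTC * BLOCKS_PER_DAY

# Network-wide energy cost of minting one BTC:
kwh_per_btc = network_w * 24 / 1000 / btc_per_day

GEN_KWH_PER_GALLON = 7             # assumed small gasoline generator yield
GAS_USD_PER_GALLON = 3.5           # assumed pump price

usd_gas_per_btc = kwh_per_btc / GEN_KWH_PER_GALLON * GAS_USD_PER_GALLON
print(f"~{kwh_per_btc:,.0f} kWh, ~${usd_gas_per_btc:,.0f} of gasoline per BTC")
```

With these numbers a generator-powered miner pays far more for fuel per BTC than grid miners do for electricity, which is consistent with the writeup's ">1 BTC of gas per BTC mined" conclusion at most historical BTC prices.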
That said, both are wasteful and ultimately neither should exist.