[–] cogman@lemmy.world 32 points 17 hours ago (1 children)

And I'll bet roughly 50-90% of that usage is idiots doing crypto/AI garbage.

[–] LodeMike@lemmy.today 5 points 15 hours ago (2 children)

No lol, the amount of power that cloud services use is atrocious. The serverless trend makes things add up: it's cheaper in terms of hardware, but oh boy do all those layers of abstraction make things heavy, especially cold-starting each function and the communication between functions. (I think "functions" is the usual term, not applets.)

I actually don't believe the ~5% figure at all, especially given these companies' track record of honesty (or the lack of it).

Sure, datacenters are rather efficient, but multiply that across 330 million people in the US and it adds up.

[–] DarkDarkHouse@lemmy.sdf.org 5 points 9 hours ago

I would bet that hardware being way more efficient, plus corporate IT infrastructure being consolidated into data centers, makes this much more energy efficient than the alternative. The fact that we're running far more layered and compute-intensive systems doesn't really change that.

[–] cogman@lemmy.world 15 points 14 hours ago* (last edited 13 hours ago)

The amount of power AI and crypto require is orders of magnitude more than pretty much any regular application needs. The company I work at uses somewhere around 2000 CPU cores' worth of compute at AWS (and we have ~100 microservices; we're a fairly complex org that way).

Generally speaking, an 80-core CPU system draws ~200 W. That means my company's entire fleet eats about 5 kW of power when running full bore (it isn't doing that all the time). My company is not a small company.
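To spell that math out (a rough sketch; the cores-per-system and per-system wattage are just my ballpark figures from above, not measurements):

```python
# Back-of-envelope fleet power estimate, using the comment's numbers.
total_cores = 2000        # ~2000 CPU cores of AWS compute
cores_per_system = 80     # assume 80-core systems
watts_per_system = 200    # ~200 W per 80-core system at full bore

systems = total_cores / cores_per_system      # 25 systems
fleet_watts = systems * watts_per_system      # 5000 W
print(f"{systems:.0f} systems, ~{fleet_watts / 1000:.1f} kW full bore")
# -> 25 systems, ~5.0 kW full bore
```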

Compare that to what a single Nvidia A100 eats. Those GPUs draw up to 400 W, and when doing AI/crypto work you're running them as hard as possible, so you're eating the full 400 W. That means just 12 GPUs running AI or crypto eat about the same amount of power as my company's 100 different applications running full bore. Now imagine the model training of something like ChatGPT, which can eat pretty much as many GPUs as you can throw at it.
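The GPU side of that comparison, same caveats (the 400 W figure is the A100's ballpark max draw; real draw varies by model and workload):

```python
# How many full-tilt A100s match the whole CPU fleet's draw?
a100_watts = 400          # ~400 W per A100, pegged by AI/crypto work
fleet_watts = 5000        # the ~5 kW fleet estimate from above

gpus_to_match_fleet = fleet_watts / a100_watts    # 12.5
print(f"~{gpus_to_match_fleet:.0f} A100s match the entire fleet")
# -> ~12 A100s match the entire fleet
```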

To put all of this in perspective: 5 kW is roughly what a mini-split system will consume.

Frankly, I'm way more concerned about my company's travel budget in terms of CO2 emissions than I am about our datacenter usage.