Using it: not all that energy intensive (one LLM query is roughly the same as three pre-AI-bullshit Google searches, iirc). Training it: very energy intensive.
Yes, it would, but we haven't even replaced all our previous needs with renewables, so it ain't helping.
Oh damn. Very good article btw.
According to numbers floating around online, that would mean one Llama query is around as expensive as 10 Google searches. And it's likely those costs will increase further.
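Just to sanity-check that ratio, here's the back-of-envelope math. The per-query figures are the commonly cited online estimates (roughly 0.3 Wh per traditional search, ~3 Wh per LLM query), not measured values, so take them with a grain of salt:

```python
# Back-of-envelope: commonly cited (but uncertain) per-query energy estimates.
GOOGLE_SEARCH_WH = 0.3  # ~0.3 Wh per traditional Google search (assumed)
LLM_QUERY_WH = 3.0      # ~3 Wh per one LLM query (assumed)

ratio = LLM_QUERY_WH / GOOGLE_SEARCH_WH
print(f"One LLM query ~ {ratio:.0f} Google searches")  # ~10
```

If the real per-query cost is higher (longer prompts, bigger models), the ratio only gets worse.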
It still seems like the biggest factor here is the scale of adoption. Unfortunately, the total energy cost of AI might even scale exponentially: the more complex the queries get, the better the responses will likely be, and that will drive adoption even further.
This pace is so clearly unsustainable it's horrifying. It was obvious to some degree, but it seems it's worse than I thought.