Ragdoll_X@sh.itjust.works · 1 point · 1 week ago (edited)

You're not gonna save the world by not using ChatGPT, just like you won't save all those slaves in Zambia by not buying from Apple, and just like you didn't destroy Twitter by joining Bluesky.

Having a real effect requires systemic change, so if you want to actually make a difference you can do things like canvassing, running for local office or school board, educating friends and family about politics, or killing a few politicians and tech CEOs. You know, basic stuff.

Also, I asked Gemini's Deep Research to research this for me, because why not UwU

Executive Summary

Estimates for the energy consumed by ChatGPT during its training and inference phases vary considerably across studies, reflecting the complexity of the models and the proprietary nature of the data. Training a model like GPT-3 is estimated to require around 1.3 GWh of electricity^1^, while more advanced models such as GPT-4 may consume significantly more, with estimates ranging from 1.75 GWh to over 62 GWh.^2^ Models comparable to GPT-4o are estimated to consume between 43.2 GWh and 54 GWh during training.^3^ These are substantial energy demands: the training of GPT-4 may have exceeded the annual electricity consumption of very small nations multiple times over.

The energy used during inference, the process of generating responses to user queries, also spans a wide range of estimates, from 0.3 to 2.9 watt-hours per query.^4^ This translates to an estimated annual inference energy consumption of roughly 0.23 TWh to 1.06 TWh, which can be comparable to the entire annual electricity consumption of smaller countries like Barbados.

The lack of official data from OpenAI and the diverse methodologies employed by researchers contribute to the variability in these estimates, highlighting the challenges in precisely quantifying the energy footprint of these advanced AI systems.^4^

  1. https://balkangreenenergynews.com/chatgpt-consumes-enough-power-in-one-year-to-charge-over-three-million-electric-cars/

  2. https://www.baeldung.com/cs/chatgpt-large-language-models-power-consumption

  3. https://www.bestbrokers.com/forex-brokers/ais-power-demand-calculating-chatgpts-electricity-consumption-for-handling-over-78-billion-user-queries-every-year/

  4. https://epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use
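For a rough sense of how the annual inference numbers above follow from the per-query estimates, here's a minimal back-of-the-envelope sketch in Python. The per-query figures (0.3 Wh and 2.9 Wh) are the cited estimates; the query volume of roughly 1 billion prompts per day is purely an assumption for illustration, since OpenAI doesn't publish official numbers.

```python
# Back-of-the-envelope conversion: per-query energy -> annual total.
# The per-query figures (0.3 Wh and 2.9 Wh) are the cited estimates;
# the ~1 billion queries/day volume is an assumed, illustrative value.

WH_PER_TWH = 1e12  # 1 TWh = 10^12 Wh

queries_per_day = 1e9              # assumption, not an official figure
queries_per_year = queries_per_day * 365

for wh_per_query in (0.3, 2.9):
    annual_twh = wh_per_query * queries_per_year / WH_PER_TWH
    print(f"{wh_per_query} Wh/query -> ~{annual_twh:.2f} TWh/year")

# Output:
# 0.3 Wh/query -> ~0.11 TWh/year
# 2.9 Wh/query -> ~1.06 TWh/year
```

At that assumed volume, the 2.9 Wh figure lands right around the 1.06 TWh upper bound; the summary's 0.23 TWh low end presumably reflects different per-query and query-volume assumptions in the underlying sources, which is exactly why the estimates spread so widely.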

SoftestSapphic@lemmy.world · -3 points · 1 week ago
Ragdoll_X@sh.itjust.works · 0 points · 6 days ago (edited)

You're who my comment is about.