The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could see energy usage as high as 45 GWh.

A daily energy use of 45 GWh is enormous; it corresponds to an average continuous draw of roughly 1.9 GW. A typical modern nuclear power plant produces between 1 and 1.6 GW of electricity per reactor, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the output of two to three nuclear reactors, an amount that could be enough to power a small country.
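The arithmetic behind those figures can be checked with a quick back-of-envelope calculation (a sketch, assuming the article's reported 18 Wh per query and 2.5 billion queries per day):

```python
# Back-of-envelope check of the article's energy figures.
# Assumptions (from the article): 18 Wh/query, 2.5 billion queries/day.
wh_per_query = 18
queries_per_day = 2.5e9

daily_gwh = wh_per_query * queries_per_day / 1e9  # Wh -> GWh
avg_draw_gw = daily_gwh / 24                      # average continuous draw

print(f"{daily_gwh:.0f} GWh per day")       # 45 GWh per day
print(f"{avg_draw_gw:.2f} GW average draw") # 1.88 GW average draw
```

At roughly 1.9 GW of continuous draw, the comparison to one or two reactors' worth of output follows directly.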

      • Encrypt-Keeper@lemmy.world · 4 months ago

        AI models require a LOT of VRAM to run. Failing that, they need some serious CPU power, but it’ll be dog slow.

        A consumer model that is only a small fraction of the capability of the latest ChatGPT model would require at least a $2,000+ graphics card, if not more than one.

        Like I run a local LLM with an RTX 5070 Ti, and the best model I can run with that thing is good for ingesting some text to generate tags and such, but not a whole lot else.

          • gerryflap@feddit.nl · 4 months ago

            It’s horrendously slow, unusable imo. With the larger DeepSeek distilled models I tried that didn’t fit into VRAM, you could easily wait 5 minutes until it was done writing its essay, compared to just a few seconds when it does fit. But that’s with an RTX 3070 Ti, not something the average ChatGPT user probably has lying around.

          • Evono@lemmy.dbzer0.com · 4 months ago

            Basically I can run 9B models on my 16 GB GPU mostly fine, getting responses of, let’s say, 10 lines in a few seconds.

            Bigger models, if they don’t outright crash, take 5x or 10x longer for the same task, so long it isn’t even useful anymore.

            So, much worse.
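The VRAM numbers in the comments above can be sanity-checked with a weights-only estimate (a sketch; the precisions shown are illustrative, and real usage also needs memory for the KV cache and activations, which this ignores):

```python
# Rough weights-only VRAM estimate: parameter count times bytes per parameter.
# This deliberately ignores KV cache and activation memory, which add more on top.
def weight_gib(params_billion: float, bytes_per_param: float) -> float:
    """Return the GiB needed just to hold the model weights."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

# A 9B-parameter model at fp16 (2 bytes/param): ~16.8 GiB, too big for a 16 GB card.
print(f"{weight_gib(9, 2):.1f} GiB at fp16")
# The same model quantized to 4-bit (0.5 bytes/param): ~4.2 GiB, fits comfortably.
print(f"{weight_gib(9, 0.5):.1f} GiB at 4-bit")
```

This is consistent with the anecdotes: a 9B model fits a 16 GB GPU only once quantized, and anything substantially larger spills out of VRAM and falls back to the much slower CPU path.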