• redsunrise@programming.dev · ↑291 ↓1 · 15 days ago

    Obviously it’s higher. If it were any lower, they would’ve made a huge announcement out of it to prove they’re better than the competition.

    • Ugurcan@lemmy.world · ↑41 ↓8 · edited · 15 days ago

      I’m thinking otherwise. I think GPT5 is a much smaller model - with some fallback to previous models if required.

      Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates into an inferior quality in American Investor Language (AIL).

      And 2025’s investors don’t give a flying fuck about energy efficiency.

    • Chaotic Entropy@feddit.uk · ↑25 ↓2 · 15 days ago

      I get the distinct impression that most of the focus for GPT5 was making it easier to divert their overflowing volume of queries to less expensive routes.
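      The routing idea described here, sending most queries to a cheap model and falling back to an expensive one only when needed, can be sketched roughly as below. This is purely illustrative: the model names, the complexity heuristic, and the threshold are all invented, not anything OpenAI has documented.

      ```python
      # Hypothetical sketch of cost-based query routing with fallback.
      # All model names and thresholds here are made up for illustration.

      def estimate_complexity(prompt: str) -> float:
          """Crude proxy: longer, question-dense prompts score higher."""
          words = len(prompt.split())
          questions = prompt.count("?")
          return min(1.0, words / 200 + 0.2 * questions)

      def route(prompt: str, threshold: float = 0.5) -> str:
          """Pick a model tier by estimated complexity."""
          if estimate_complexity(prompt) >= threshold:
              return "large-expensive-model"  # fallback for hard queries
          return "small-cheap-model"          # default, lower cost per query

      print(route("What's the capital of France?"))  # routes to the cheap tier
      ```

      A real router would likely use a learned classifier rather than a word count, but the economics are the same: every query kept on the small model is compute (and energy) not spent on the big one.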

    • T156@lemmy.world · ↑4 ↓2 · 15 days ago

      Unless it wasn’t as low as they wanted it to be. It’s at least cheap enough to run that they can afford to drop the pricing on the API compared to their older models.

    • thatcrow@ttrpg.network (banned) · ↑1 · 14 days ago

      It warms me heart to see y’all finally tune in to the scumbag tactics our abusers constantly employ.