  • drislands@lemmy.world · ↑9 · 2 hours ago

    The problem as I see it is that there is an upper limit on how good any game can look graphically. You can’t make a game that looks more realistic than literal reality, so any improvement can only approach that limit. (Barring direct brain interfacing that gives better info than the optic nerve.)

    Before, we started from a point so far removed from reality that practically anything would be an improvement. Say “reality” is 10,000. Early games started at 10; when we switched to 3D it became 1,000. That’s an enormous relative improvement, even if it’s far from the max. But now your improvements are going from 8,000 to 8,500, and while that’s still a big absolute improvement, it’s relatively minor – and you’re never going to hit a perfect 10,000, so the amount you can improve by gets smaller and smaller.
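
    A quick sketch of that arithmetic (Python, using the made-up fidelity scale above) shows how the relative gains collapse even while the absolute steps stay large:

        # Toy numbers from the comment: "reality" = 10,000 on an arbitrary scale.
        REALITY = 10_000
        steps = [10, 1_000, 8_000, 8_500]  # early 2D, early 3D, last gen, this gen

        for before, after in zip(steps, steps[1:]):
            ratio = after / before
            print(f"{before} -> {after}: +{after - before} absolute, "
                  f"{ratio:.1f}x relative, {REALITY - after} short of reality")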

    All that to say, the days of huge graphical leaps are over, but the marketing for video games acts like that’s not the case. Hence all the buzzwords around new tech without much to show for it.

  • Ibaudia@lemmy.world · ↑11 · 4 hours ago

    I don’t understand why developers and publishers aren’t prioritizing spectacle games with simple graphics, like TABS, Mount & Blade, or similar. Use modern processing power to just throw tons of shit on screen; make it totally chaotic and confusing. Huge battles are super entertaining.

    • UnderpantsWeevil@lemmy.world · ↑5 ↓1 · 2 hours ago

      The dream of the '10s/20s game industry was VR. Hyper-realistic settings were supposed to supplant the real world. Ready Player One was what big development studios genuinely thought they were aiming for.

      They lost sight of video games as an abstraction and drank too much of their own cyberpunk kool-aid. So we had this fixation on Ray Tracing and AI-driven NPC interactions that gradually lost sight of the gameplay loop and the broader iterative social dynamics of online play.

      That hasn’t eliminated development in these spheres, but it has bifurcated the space between game novelty and game immersion. If you want the next Starcraft or Earthbound or Counterstrike, you need to look towards the indie scene and its low-graphics, highly experimental studios (where games like Stardew Valley and Undertale and Balatro live). The AAA studios are just turning out 100-hour-long movies with a few obnoxious gameplay elements sprinkled in.

  • parlaptie@feddit.org · ↑8 ↓2 · 5 hours ago

    There’s no better generational leap than Monster Hunter Wilds, which looks like a PS2 game on its lowest settings and still chugs at 24fps on my PC.

    • upandatom@lemmy.world · ↑2 ↓3 · 3 hours ago

      Could’ve done your research before buying. Companies aren’t held to standards because people are uninformed buyers.

  • dragonlobster@programming.dev · ↑22 · 7 hours ago

    I don’t mind the graphics that much, what really pisses me off is the lack of optimization and heavy reliance on frame gen.

  • PlexSheep@infosec.pub · ↑7 · 6 hours ago

    To be fair, it isn’t just about graphics.

    Something like Zelda: Twilight Princess HD to Breath of the Wild was a huge leap in gameplay alone. (And also in graphics, but that’s not my point.)

  • 2ugly2live@lemmy.world · ↑2 · 6 hours ago

    I feel like we won’t be able to see the difference for a couple of years yet, like CGI in old movies.

    • I Cast Fist@programming.dev · ↑4 · 3 hours ago

      The generational leap from PS3 to PS4 already wasn’t that significant, and that happened more than 10 years ago. The biggest differences seem to be lighting/shadows and texture size, the latter of which balloons game size and can tank performance.

  • Steve Dice@sh.itjust.works · ↑39 ↓1 · 21 hours ago

    I mean, how much more photorealistic can you get? Regardless, the same game would look very different in 4K (real, not what consoles do) vs 1080p.

    • hlmw@lemm.ee · ↑4 · 13 hours ago

      The lighting in that image is far, far from photorealistic. Light transport is hard.
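
      For context, “light transport” is the quantity the rendering equation describes; real-time renderers can only ever approximate it. In the usual notation (a standard formula, not something from this thread):

          L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (\omega_i \cdot n) \, d\omega_i

      Outgoing light at a point depends on incoming light from every direction, which in turn bounces off every other surface, so the equation is recursive. That recursion is what makes convincing lighting so hard to do in real time.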

      • Steve Dice@sh.itjust.works · ↑4 · 13 hours ago

        That’s true, but realistic lighting still wouldn’t make anywhere near the same amount of difference that the other example shows.

  • HEXN3T@lemmy.blahaj.zone · ↑45 ↓3 · edited · 22 hours ago

    Let’s compare two completely separate games to a game and a remaster.

    Generational leaps then:

    Good lord.

    EDIT: That isn’t even the Zero Dawn remaster. That is literally two still-image screenshots of Forbidden West on both platforms.

    Good. Lord.

    • Maggoty@lemmy.world · ↑23 · 20 hours ago

      Yeah no. You went from console to portable.

      We’ve had absolutely huge leaps in graphical ability. Denying that we’re getting diminishing returns now is just ridiculous.

      • HEXN3T@lemmy.blahaj.zone · ↑7 ↓2 · 12 hours ago

        We’re still getting huge leaps. It simply doesn’t translate into massively improved graphics. What those leaps do result in, however, is major performance gains.

        I have played Horizon Zero Dawn, its remaster, and Forbidden West. I am reminded how much better Forbidden West looks and runs on PS5 compared to either version of Zero Dawn. The differences are absolutely there; they’re just not as spectacular as the jump from 2D to 3D.

        The post comes off like a criticism of hardware not getting better enough, fast enough. Wait until we can create dirt, sand, water or snow simulations in real time, instead of having to fake the look of physics. Imagine real simulations of wind and heat.

        And then there’s Gaussian splatting, which absolutely is a huge leap. Forget trees practically being arrangements of PNGs: what if each and every leaf and branch had volume? What if leaves actually fell off?
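
        Roughly speaking, a splatted scene is a huge list of soft, oriented ellipsoids instead of textured triangles. A minimal sketch of the per-splat data (Python; field names are illustrative, not from any particular implementation):

            from dataclasses import dataclass

            @dataclass
            class GaussianSplat:
                mean: tuple[float, float, float]    # 3D center of the soft ellipsoid
                scale: tuple[float, float, float]   # per-axis size; with rotation, defines covariance
                rotation: tuple[float, float, float, float]  # orientation quaternion
                opacity: float                      # 0..1, how solid this splat is
                color: tuple[float, float, float]   # real renderers often store SH coefficients instead

            # A leaf with actual volume would be a handful of these, rather than a
            # flat PNG-textured quad that turns edge-on as you walk past it.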

        Then there’s efficiency. What if you could run Monster Hunter Wilds at max graphics, on battery, for hours? The first gen M1 Max MacBook Pro can comfortably run Baldur’s Gate III. Reducing power draw would have immense benefits on top of graphical improvements.

        Combined with better and better storage and VR/AR, there is still plenty of room for tech to grow. Saying “diminishing returns” is like saying that fire burns you when you touch it.

        • I Cast Fist@programming.dev · ↑1 · 3 hours ago

          What those leaps do result in, however, is major performance gains.

          Which many devs will make sure you never feel, by “optimizing” the game only for the most bleeding-edge hardware.

          Then there’s efficiency. What if you could run Monster Hunter Wilds at max graphics, on battery, for hours? The first gen M1 Max MacBook Pro can comfortably run Baldur’s Gate III. Reducing power draw would have immense benefits on top of graphical improvements.

          See, if games were made with a performance-first mindset, that’d be possible already. Not to dunk on performance gains, but there’s a saying that every time hardware gets faster, programmers make their code slower. I mean, you can totally play emulated SNES games with minimal impact compared to leaving the computer idling.

          Saying “diminishing returns” is like saying that fire burns you when you touch it.

          Unless chip fabrication can figure out a way to make transistors “stack” on top of one another, effectively making 3D chips, they’ll continue to be “flat” sheets that can only increase core count horizontally. Single-core frequency peaked in the early 2000s; from then on it’s been about adding more cores. Even the gains from an RTX 5090 vs an RTX 4090 aren’t that big. Now compare those with the gains from a GTX 980 vs a GTX 1080.
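
          The “more cores” ceiling is roughly Amdahl’s law: the serial part of a frame never gets faster, so extra cores buy less and less. A quick Python illustration (the 90% parallel figure is just an assumption for the example):

              def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
                  # Amdahl's law: the serial share of the work ignores extra cores.
                  return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

              # Even if 90% of a frame parallelizes perfectly, returns shrink fast:
              for cores in (2, 4, 8, 16, 64):
                  print(f"{cores} cores: {amdahl_speedup(0.9, cores):.2f}x")
              # 2: 1.82x, 4: 3.08x, 8: 4.71x, 16: 6.40x, 64: 8.77x (never past 10x)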

        • Maggoty@lemmy.world · ↑3 · 9 hours ago

          I am reminded how much better Forbidden West looks and runs on PS5 compared to either version of Zero Dawn.

          Really? I’ve played both on PS5 and didn’t notice any real difference in performance or graphics. I did notice that the PC version of Forbidden West has vastly higher minimum requirements, though. Which is the opposite of performance gains.

          Who the fuck cares if leaves are actually falling off or spawning in above your screen to fall?

          And BG3 has notoriously low minimums, it is the exception, not the standard.

          If you want to see every dimple on the ass of a horse, then that’s fine: build your expensive computer and leave the rest of us alone. Modern “next gen” graphics aren’t adding anything to a game.

    • starman2112@sh.itjust.works · ↑14 · 19 hours ago

      The fact that the Game Boy Advance looks that much better than the Super Nintendo despite being a handheld, battery-powered device is insane.

    • HEXN3T@lemmy.blahaj.zone · ↑2 ↓5 · 21 hours ago

      It is baffling to me that people hate cross-gen games so much. Like, how awful for PS4 owners, who don’t have to buy a new console to enjoy the game, and how awful for PS5 owners, who get the same fidelity at over 60FPS, or significantly higher fidelity at the same frame rate.

      They should have made the PS4 version the only one. Better yet, we should never make consoles again, because they can’t make you comprehend four dimensions to be new enough.

      • Maggoty@lemmy.world · ↑14 · 20 hours ago

        The point isn’t about cross generation games. It’s about graphics not actually getting better anymore unless you turn your computer into a space heater rated for Antarctica.

    • Dil@is.hardlywork.ing · ↑1 · edited · 5 hours ago

      We should be looking at more particles, more dynamic lighting, more effects. Realism is for sure a goal, just not in the way you think: Pixar movies have realistic lighting and shadows but aren’t “realistic”.

      After I started messing with Cycles in Blender I went back to wanting more “realistic” graphics; it’s better for stylized games too.

      But yeah, I want the focus to shift towards procedural generation (I like how Houdini and Unreal approach it right now), more physics-based interactions, elemental interactions, real-time fire, smoke, fluid, etc. Destruction is the biggest disappointment; I was really hoping for an FPS that let me spend hours bulldozing and blowing up the map.

      • mrvictory1@lemmy.world · ↑0 · 2 hours ago

        Destruction is the biggest disappointment; I was really hoping for an FPS that let me spend hours bulldozing and blowing up the map.

        Ever heard of The Finals?

    • The Picard Maneuver@lemmy.world (OP) · ↑79 · 1 day ago

      So many retro games are replayable and fun to this day, but I struggle to return to games whose art style relied on being “cutting edge realistic” 20 years ago.

      • sploosh@lemmy.world · ↑45 · 1 day ago

        I dunno, Crysis looks pretty great on modern hardware, and it’s 18 years old.

        Also, CRYSIS IS 18 WHERE DID THE TIME GO?

      • MudMan@fedia.io · ↑15 ↓3 · 1 day ago

        Really? Cause I don’t know, I can play Shadow of the Colossus, Resident Evil 4, Metal Gear Solid 3, Ninja Gaiden Black, God of War, Burnout Revenge and GTA San Andreas just fine.

        And yes, those are all 20 years ago. You are now dead and I made it happen.

        As a side note, man, 2005 was a YEAR in gaming. That list gives 1998 a run for its money.

        • Cethin@lemmy.zip · ↑4 · 20 hours ago

          I would say GoW and SotC at least take realism as inspiration, but aren’t realistic. They’re like an idealized version of realism. They’re detailed, but they’re absolutely stylized. SotC landscapes, for example, look more like paintings than places you’d see in real life.

          Realism is a bad goal because you end up making every game look the same. Taking our world as inspiration is fine, but it should almost always be expanded on. Know what your game is and make the art style enhance it. Don’t just replicate realism because that’s “what you’re supposed to do.”

          • MudMan@fedia.io · ↑2 ↓1 · 20 hours ago

            Look, don’t take it personally, but I disagree as hard as humanly possible.

            Claiming that realism “makes every game look the same” is a shocking statement, and I don’t think you mean it like it sounds. That’s like saying that every movie looks the same because they all use photographing people as a core technique.

            If anything, I don’t know what “realism” is supposed to mean. What is more realistic? Yakuza because it does these harsh, photo-based textures meant to highlight all the pores or, say, a Pixar movie where everything is built on this insanely accurate light transfer, path traced simulation?

            At any rate, the idea that taking photorealism as a target means you give up on aesthetics or artistic intent is baffling. That’s not even a little bit how it works.

            On the other point, I think you’re blending technical limitations with intent in ways that are a bit fallacious. SotC is stylized, for sure, in that… well, there are kaijus running around and you sometimes get teleported by black tendrils back to your sleeping beauty girlfriend.

            But is it aiming at photorealism? Hell yeah. That approach to faking dynamic range, the deliberate crushing of exteriors from interiors, the way the sky gets treated, the outright visible air adding distance and scale when you look at the colossi from a distance, the desaturated take on natural spaces… That game is meant to look like it was shot by a camera all the way. They worked SO hard to make a PS2 look like it has aperture and grain and a piece of celluloid capturing light. Harder than the newer remake, arguably.

            Some of that applies to GoW, too, except they are trying to make things look like Jason and the Argonauts more than Saving Private Ryan. But still, the references are filmic.

            I guess we’re back to the problem of establishing what people mean by “realism” and how it makes no sense. In what world does Cyberpunk look similar to Indiana Jones or Wukong? It just has no real meaning as a statement.

            • Cethin@lemmy.zip · ↑3 · 20 hours ago

              If anything, I don’t know what “realism” is supposed to mean. What is more realistic? Yakuza because it does these harsh, photo-based textures meant to highlight all the pores or, say, a Pixar movie where everything is built on this insanely accurate light transfer, path traced simulation?

              The former is more realistic, but not for that reason. The lighting techniques are techniques, not a style. Realism is trying to recreate the look of the real world. Pixar is not doing that. They’re using advanced lighting techniques to enhance their stylized worlds.

              Some of that applies to GoW, too, except they are trying to make things look like Jason and the Argonauts more than Saving Private Ryan. But still, the references are filmic.

              Being inspired by film is not the same as trying to replicate the real world. (I’d argue it’s antithetical to it to an extent.) Usually film is trying to be more than realistic. Sure, it’s taking images from the real world, but they use lighting, perspective, and all kinds of other tools to enhance the film. They don’t just put some actors in place in the real environment and film it without thought. There’s intent behind everything shown.

              I guess we’re back to the problem of establishing what people mean by “realism” and how it makes no sense. In what world does Cyberpunk look similar to Indiana Jones or Wukong? It just has no real meaning as a statement.

              Cyberpunk looks more like Indiana Jones than Persona 5. Sure, they stand out from each other, but it’s mostly due to environments.

              I think there’s plenty of games that benefit from realism, but not all of them do. There are many games that could do better with stylized graphics instead. For example, Cyberpunk is represented incredibly well in both the game and the anime. They both have different things they do better, and the anime’s style is an advantage for the show at least. The graphics style should be chosen to enhance the game. It shouldn’t just be realistic because it can be. If realism is the goal, fine. If it’s supposed to be more (or different) than realism, maybe try a different style that improves the game.

              Realism is incredibly hard to create assets for, so it costs more money and usually takes more system resources. For the games that are improved by it, that’s fine. There are a lot of games that could be made on a smaller budget, made faster, run better, and look more visually interesting if they chose a different style, though. I think it should be a consideration that developers are allowed to make, but most are just told to do realism because it’s the “premium” style. They aren’t allowed to do things that are better suited to their game. I think this is bad, and it also leads to a lack of diversity in styles.

              • MudMan@fedia.io · ↑1 ↓2 · 19 hours ago

                I don’t understand what you’re saying. Or, I do, but if I do, then you don’t.

                I think you’re mixing up technique with style, in fact. And really confusing a rendering technique with an aesthetic. But beyond that, you’re ignoring so many games. So many. Just last year, how do you look at Balatro and Penny’s Big Breakaway and Indiana Jones and go “ah, yes, games all look the same now”. The list of GOTY nominees in the TGAs was Astro Bot, Balatro, Wukong, Metaphor, Elden Ring and Final Fantasy VII R. How do you look at that list of games and go “ah, yes, same old, same old”.

                Whenever I see takes like these I can’t help but think that people who like to talk about games don’t play enough games, or just think of a handful of high profile releases as all of gaming. Because man, there’s so much stuff and it goes from grungy, chunky pixel art to lofi PS1-era jank to pitch-perfect anime cel shading to naturalistic light simulation. If you’re out there thinking games look samey you have more of a need to switch genres than devs to switch approach, I think.

                • Cethin@lemmy.zip · ↑2 · 16 hours ago

                  By “all games look the same” I’m being hyperbolic. I mean nearly all AAA games and the majority of AA games (and not an insignificant number of indies even).

                  Watch this video. Maybe it’ll help you understand what I’m saying.

                  Whenever I see takes like these I can’t help but think that people who like to talk about games don’t play enough games, or just think of a handful of high profile releases as all of gaming.

                  Lol. No. Again, I was being hyperbolic and talking mostly about the AAA and AA space. I personally almost exclusively play indies that know what they’re trying to make and use a style appropriate to it. I play probably too many games. I also occasionally make games myself, I was an officer in a game development club in college, and I have friends in the industry. I’m not just some person who doesn’t understand video games.

        • snooggums@lemmy.world · ↑14 · edited · 1 day ago

          Did those go for realism though, or were they just good at balancing the more detailed art design with the gameplay?

          • MudMan@fedia.io · ↑5 ↓4 · 1 day ago

            Absolutely they went for realism. That was the absolute peak of graphics tech in 2005, are you kidding me? I gawked at the fur in Shadow of the Colossus, GTA was insane for detail and size for an open world at the time. Resi 4 was one of the best looking games that gen, and when the 360 came out later that year it absolutely was the “last gen still looked good” game people pointed at.

            I only went for that year because I wanted the round number, but before that Silent Hill 2 came out in 2001 and that was such a ridiculous step up in lighting tech I didn’t believe it was real time when the first screenshots came out. It still looks great, it still plays… well, like Silent Hill, and it’s still a fantastic game I can get back into, even with the modern remake in place.

            This isn’t a zero sum game. You don’t trade gameplay or artistry for rendering features or photorealism. Those happen in parallel.

            • snooggums@lemmy.world · ↑9 ↓1 · edited · 1 day ago

              They clearly balanced the more detailed art design with the game play.

              GTA didn’t have detail on cars to the level of a racing game, and didn’t have characters with as much detail as Resident Evil, so that it could have a larger world for example. Colossus had fewer objects on screen so it could put more detail on what was there.

              • MudMan@fedia.io · ↑2 ↓3 · 1 day ago

                Yeah. So like every other game.

                Nothing was going harder for visuals, so by default that’s what was happening. They were pushing visuals as hard as they would go with the tech that they had.

                The big change isn’t that they balanced visuals and gameplay. If anything the big change is that visuals were capped by performance rather than budget (well, short of offline CG cutscenes and VO, I suppose).

                If anything, they were pushing visuals harder than now. There is no way you’d have seen a pixel-art deck-building game on GOTY lists in 2005; it was all AAA as far as the eye could see. We pay less attention to technological escalation now, by some margin.

                • snooggums@lemmy.world · ↑5 · 1 day ago

                  Yeah. So like every other game.

                  Except for the ones that don’t do a good job of balancing the two things. Like the games that have incredible detail but shit performance and/or awful gameplay.

      • conditional_soup@lemm.ee · ↑2 · 24 hours ago

        STALKER is good, though I mostly played a lot of Anomaly, and I’m not sure that STALKER was ever known for bleeding-edge graphics.

    • conditional_soup@lemm.ee · ↑5 · 24 hours ago

      Idk, I’d say that pursuing realism is worthy, but you get diminishing returns pretty quick when all the advances are strictly in one (or I guess two, with audio) sense. Graphical improvements massively improved the experience of the game moving from NES or Gameboy to SNES and again to PS1 and N64. I’d say that the most impressive leap, imo, was PS1/N64 to PS2/XBox/GameCube. After that, I’d say we got 3/4 of the return from improvements to the PS3 generation, 1/2 the improvement to PS4 gen, 1/5 the improvement to PS5, and 1/8 the improvement when we move on to PS5 Pro. I’d guess if you plotted out the value add, with the perceived value on the Y and the time series or compute ability or texture density or whatever on the x, it’d probably look a bit like a square root curve.
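
      That guessed curve is easy to sketch. A few lines of Python treating perceived value as the square root of compute (units entirely made up for illustration) reproduce the shape, with equal hardware increments buying ever-smaller perceived improvements:

          import math

          # Perceived value modeled as sqrt(compute); all units are arbitrary.
          prev = 0.0
          for compute in range(1000, 6000, 1000):
              value = math.sqrt(compute)
              print(f"compute {compute}: value {value:.1f} (+{value - prev:.1f})")
              prev = value
          # Value added per step: +31.6, +13.1, +10.0, +8.5, +7.5, a flattening curve.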

      I do think that there’s an (understandably, don’t get me wrong) untapped frontier in gaming realism in that games don’t really engage your sense of touch or any of the subsets thereof. The first step in this direction is probably vibrating controllers, and I find that it definitely does make the game feel more immersive. Likewise, few games engage your proprioception (that is, your knowledge of your body position in space), though there’ve been attempts to engage it via the Switch, Wii, and VR. There’s, of course, enormous technical barriers, but I think there’s very clearly a good reason why a brain interface is sort of thought of as the holy grail of gaming.

      • jpreston2005@lemmy.world · ↑4 · 23 hours ago

        Having a direct brain interface game, one realistic enough to overcome the uncanny valley, would destroy people’s lives. People would, inevitably, prefer their virtual environment to the real one. They’d end up wasting away, plugged into some machine. It would lend serious credence to the idea of a simulated universe, and reduce the human experience by replacing it with an improved one. Shit, give me a universe wherein I can double-jump, fly, or communicate with animals, and I’d have a hard time returning to this version.

        We could probably get close with a haptic feedback suit, a mechanism that allows you to run/jump in any direction, and a VR headset, but there would always be something tethering you to reality. A direct brain-to-machine interface would have none of that; it would essentially be hijacking our own electrical neural network to run simulations, much like humans trying to play Doom on literally everything. It would be as amazing as it was destructive, finally realizing the warning from so many parents before its time: “that thing’ll fry your brain.”

        • conditional_soup@lemm.ee · ↑4 · 23 hours ago

          Tbf, it’s kinda bullshit that we can’t double jump IRL. Double jumping just feels right, like it’s something we should be able to do.

          Yeah, no, it’d likely be really awful for us. I mean, can you imagine what porn would be like on that? That’s a Fermi paradox solution right there. I could see the tech having a lot of really great applications too, like training simulations, but the video game use case is simultaneously exhilarating and terrifying.

    • snooggums@lemmy.world · ↑7 · 1 day ago

      Like CGI and other visual effects, realism has some applications that can massively improve the experience in some games. Just like how lighting has a massive impact, or sound design, etc.

      Chasing it at the expense of game play or art design is a negative though.

    • ProfessorProteus@lemmy.world · ↑2 · 23 hours ago

      I agree generally, but I have to offer a counterpoint with Kingdom Come: Deliverance. I only just got back into it after bouncing off in 2019, and I wish I hadn’t stopped playing. I have a decent-ish PC and it still blows my entire mind when I go roaming around the countryside.

      Like Picard said above, in due time this too will look aged, but even 7 years on it looks and plays incredibly well, even at less-than-highest settings. IMHO it’s the most visually impressive game ever created (disclaimer: I haven’t seen or played Horizon). Can’t wait to play KC:D 2!

    • Cid Vicious@sh.itjust.works · ↑1 · 1 day ago

      It’s the right choice for some games and not for others. Just like cinematography, there’s different styles and creators need to pick which works best for what they’re trying to convey. Would HZD look better styled like Hi-Fi Rush? I don’t really think so. GOW? That one I could definitely see working more stylized.

  • renegadespork@lemmy.jelliefrontier.net · ↑29 · 22 hours ago

    This is true of literally any technology. There are so many things that can be improved in the early stages that progress seems very fast. Over time, the industry finds most of the optimal ways of doing things and starts hitting diminishing returns on research & development.

    The only way to break out of this cycle is to discover a paradigm shift that changes the overall structure of the industry and forces a rethinking of existing solutions.

    The automobile is a very mature technology and is thus a great example of these trends. Cars have achieved optimal design and slowed to incremental progress multiple times, only to have the cycle broken by paradigm shifts. The most recent one is electrification.
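
    One common way to model that cycle is a logistic “S-curve”, with each paradigm shift starting a fresh curve on top of the old plateau. A toy Python sketch (the midpoints and units are arbitrary):

        import math

        def logistic(t: float, midpoint: float) -> float:
            # Classic S-curve: slow start, steep middle, diminishing-returns plateau.
            return 1.0 / (1.0 + math.exp(-(t - midpoint)))

        def industry_progress(t: float) -> float:
            # A paradigm shift is a second curve stacked on the first one's plateau,
            # e.g. combustion engines and then electrification.
            return logistic(t, midpoint=5.0) + logistic(t, midpoint=15.0)

        for t in range(0, 21, 5):
            print(t, round(industry_progress(t), 2))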

    • Maggoty@lemmy.world · ↑9 ↓3 · 20 hours ago

      Okay then why are they arbitrarily requiring new GPUs? It’s not just about the diminishing returns of “next gen graphics”.

      • Obelix@feddit.org · ↑2 · 11 hours ago

        If you think about it, gaming GPUs have been in a state of crisis for over half a decade. First there were shortages because everybody used them to mine Bitcoin, then the COVID chip shortages happened, and now AI is killing cheaper GPUs. As a result, many people are stuck with older hardware, Steam Decks, or consoles and haven’t upgraded their systems, and those highly flammable $1000+ GPUs will not lead to everyone upgrading their PCs. So games are targeting older GPUs.

      • AdrianTheFrog@lemmy.world · ↑4 ↓1 · 18 hours ago

        Path tracing is a paradigm shift: a completely different way of rendering a scene from the usual one. It’s just a slow and expensive one (it has existed for many years, but only recently started to become possible in real time thanks to advancing GPU hardware).
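
        Reduced to a toy sketch (Python; a one-pixel Monte Carlo estimate against a hypothetical sky light, not any engine’s actual code), the difference looks like this:

            import math, random

            def rasterized_shade(n_dot_l: float) -> float:
                # Rasterization: one analytic formula per pixel (Lambert here).
                # Shadows and bounce light must be faked with maps and probes.
                return max(0.0, n_dot_l)

            def path_traced_shade(samples: int = 256) -> float:
                # Path tracing: average the light carried by random rays.
                # Shadows and bounces fall out naturally, but the estimate is
                # noisy and expensive; hence denoisers and upscalers.
                total = 0.0
                for _ in range(samples):
                    theta = math.acos(random.random())   # uniform hemisphere direction
                    light = 1.0 if theta < 0.3 else 0.0  # toy visibility of a small sky light
                    total += light * math.cos(theta)
                return total / samples * 2.0 * math.pi   # Monte Carlo estimate of the integral

            print("raster:", rasterized_shade(0.7))
            print("path traced:", round(path_traced_shade(), 3))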

        Yes, usually the improvement is minimal. That is because games are designed around rasterization and have path tracing as an afterthought. The quality of path tracing still isn’t great because a bunch of tricks are currently needed to make it run faster.

        You could say the same about EVs, actually: they have existed since like the 1920s, but are only becoming useful for actual driving because of advancing battery technology.

        • Maggoty@lemmy.world · ↑3 · 17 hours ago

          Then let the tech mature more, so it’s actually analogous to modern EVs and not to EVs of 30 years ago.

          • AdrianTheFrog@lemmy.world · ↑4 ↓2 · 16 hours ago

            Yea, it’s doing that. RT is getting cheaper, and PT is not really used outside of things like Cyberpunk’s “RT Overdrive” mode, which is basically just for show.

            • Maggoty@lemmy.world · ↑5 · 16 hours ago

              Except it’s being forced on us and we have to buy more and more powerful GPUs just to handle the minimums. And the new stuff isn’t stable anyways. So we get the ability to see the peach fuzz on a character’s face if we have a water-cooled $5,000 spaceship. But the guy rocking solid GPU tech from 2 years ago has to deal with stuttering and crashes.

              This is insane, and we shouldn’t be buying into this.

              • AdrianTheFrog@lemmy.world · ↑1 · edited · 13 hours ago

                It’s not really about detail; it’s about basic lighting, especially in dynamic situations.

                (Sometimes it is used to provide more detail in shadows, I guess, but that is also usually a pretty big visual improvement.)

                I think there’s currently a single popular game where RT is required? And I honestly doubt a card old enough to not support ray tracing would be fast enough for any alternate minimum setting it would have had instead. Maybe the people with 1080 Tis are missing out, but there aren’t that many of them, honestly. I haven’t played that game and don’t know all that much about it; it might be a pointless requirement for all I know.

                Nowadays budget cards support RT, and even integrated GPUs do (at probably unusable speeds, but still).

                I don’t think every game needs RT or that RT should be required, but it’s currently the only way to get the best graphics, and it has the potential to completely change what is possible with the visual style of games in the future.

                Edit: also, the vast majority of new solid GPUs started supporting RT 6 years ago, with the 20 series from Nvidia.

                • Maggoty@lemmy.world · ↑1 · 9 hours ago

                  That’s my point though, the minimums are jacked up well beyond where they need to be in order to cram new tech in and get 1 percent better graphics even without RT. There’s not been any significant upgrade to graphics in the last 5 years, but try playing a 2025 AAA with a 2020 graphics card. It might work, but it’s certainly not supported and some games are actually locking out old GPUs.

  • RightHandOfIkaros@lemmy.world · ↑26 · 23 hours ago

    Ironically, Zelda: A Link to the Past ran at 60fps, and Ocarina of Time ran at 20fps.

    The same framerates are probably in the Horizon pictures below lol.

    Now, Ocarina of Time had to run at 20fps because it had one of the biggest draw distances of any N64 game at the time. This was so the player could see to the other end of Hyrule Field, or other large spaces. They had to sacrifice framerate, but for the time it was totally worth the sacrifice.
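
    That tradeoff is easy to ballpark: visible area, and with it the amount of geometry in view, grows roughly with the square of the draw distance. A toy Python sketch with a made-up scene density:

        import math

        OBJECTS_PER_SQUARE_UNIT = 0.01  # hypothetical scene density
        FOV_FRACTION = 0.25             # slice of the full circle the camera sees

        for draw_distance in (100, 200, 400):
            visible_area = math.pi * draw_distance ** 2 * FOV_FRACTION
            print(draw_distance, int(visible_area * OBJECTS_PER_SQUARE_UNIT), "objects in view")
        # 100 -> 78, 200 -> 314, 400 -> 1256: each doubling of the horizon roughly
        # quadruples what the hardware has to cull, sort and draw.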

    Modern games sacrifice performance for an improvement so tiny that most people would not be able to tell unless they are sitting 2 feet from a large 4k screen.

    • Maalus@lemmy.world · ↑16 ↓1 · 23 hours ago

      Had to, as in “they didn’t have enough experience to optimize the games”. Same for Super Mario 64. Some programmers decompiled the code and made it run like a dream on original hardware.

      • RightHandOfIkaros@lemmy.world · ↑8 ↓1 · 21 hours ago

        The programming knowledge did not exist at the time. It’s not that they lacked experience; it was impossible for them to have knowledge that simply hadn’t been developed yet. You can’t really count that against them.

        Kaze optimizing Mario 64 is amazing, but it would have been impossible for Nintendo to have programmed the game like that, because Kaze is able to use programming techniques and knowledge that literally did not exist when the N64 was new. It’s like saying that the NASA engineers who designed the Atlas LV-3B spacecraft were bad engineers, or incapable of making a good rocket design, just because of what NASA engineers could design today with knowledge that did not exist in the ’50s.

    • CancerMancer@sh.itjust.works · ↑6 · 21 hours ago

      One of the reasons I skipped the other consoles and got a GameCube was that all the first-party stuff was buttery smooth. Meanwhile, trying to play shit like MechAssault on Xbox was painful.

      • RightHandOfIkaros@lemmy.world · ↑2 · 18 hours ago

        I never had trouble with MechAssault, because the fun far outweighed infrequent performance drops.

        I am a big proponent of 60fps minimum, but I make an exception for consoles from the 5th and 6th generations. The amount of technical leap and improvement, both in graphics technology and in gameplay innovation, far outweighs any performance dips as a cost of such improvement. The 7th generation is on a game-by-game basis, and personally, the 8th generation (Xbox One, Switch, and PS4) is where it became completely unacceptable to run even a single frame below 60fps. There is no reason that target could not have been met by then, let alone now. The Switch was especially disappointing here: Nintendo built what was basically a 2015 mid-range smartphone, then tried to make games for a real game console, with performance suffering massively as a result. 11fps, docked, in Breath of the Wild’s Korok Forest or in Age of Calamity (anywhere in the game, take your pick) is totally unacceptable, even if it happened only once rather than consistently.

        • thisismyhaendel@lemmy.world · ↑1 · 3 hours ago

          I’m usually tolerant of frame drops, especially when they make hard games easier (like on the N64), but I agree it has gotten much worse on recent consoles. Looking at you, Control on PS4 (seems like it should just have been a PS5 game with all the frame drops; even just unpausing freezes the game for multiple seconds).