What I don’t understand is why games look prettier, but things like NPC AI (which is really path-finding and decision trees, not actual AI), interactivity of the game world, destructibility of game objects - all those things are objectively worse than they were in games from 10-15 years ago (with some exceptions like RDR2).
How can a game like Starfield still have all the Bethesda jank but now the NPCs lack any kind of daily routine?
Most enemies in modern shooters barely know how to flank, compare that to something like F.E.A.R., which came out in 2005!
From the image it seems like you’re expecting a fourth dimension in games now? I don’t think you’d like the development cycle on that. Miegakure is still on the way, isn’t it?
The problem as I see it is that there is an upper limit on how good any game can look graphically. You can’t make a game that looks more realistic than literal reality, so any improvement is just going to approach that limit. (Barring direct brain interfacing that gives better info than the optic nerve.)
Before, we started from a point that was so far removed from reality that practically anything would be an improvement. Like, say “reality” is 10,000. Early games started at 10, then when we switched to 3D it was 1,000. That’s an enormous relative improvement, even if it’s far from the max. But now your improvements are going from 8,000 to 8,500, and while it’s still a big absolute improvement, it’s relatively minor – and you’re never going to get a perfect 10,000, so the amount you can improve by gets smaller and smaller.
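To make the arithmetic explicit with those same made-up numbers (a toy model, nothing more):

$$\text{relative gain} = \frac{q_{\text{new}} - q_{\text{old}}}{q_{\text{old}}}, \qquad \frac{1000 - 10}{10} = 99\times \quad\text{vs.}\quad \frac{8500 - 8000}{8000} \approx 6\%$$

And the closer you sit to the 10,000 cap, the smaller that numerator can ever get.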
All that to say, the days of huge graphical leaps are over, but the marketing for video games acts like that’s not the case. Hence all the buzzwords around new tech without much to show for it.
Graphics are only part of it; with the power that’s available, I’m disappointed in the low quality that gets put to release. I loved Jedi: Survivor, a brilliant game, but it was terribly optimised. I booted it today and got nothing but asset-loading flashes as walls and structures in my immediate vicinity and eyeline popped white into existence.
Good games aren’t solely reliant on graphics, but Christ, do they waste what they have. Programmers used to push everything to the max; now they get away with pushing beta releases to print.
Well, hypothetically you can get to a perfect 10k: you can have more geometric/texture/lighting detail than the eye could process. From a technical perspective, anyway.
Of course you have the technical capability, and that’s part of the equation. The other part is the human effort to create the environments. The tech sometimes makes it easier on the artist (for example, better light modeling in the engine at runtime means less effort spent baking lighting in, plus the ability for the author to basically “etc…” their way to more detail via smoothing or machine-learning extrapolation). Despite this, more detail does mean more man-hours to make the most of it, and this has caused massive cost increases as models got more detailed and more models and environments became feasible. The level of artwork that went into the whole of Pac-Man is less than a single model in a modern game.
I don’t mind the graphics that much, what really pisses me off is the lack of optimization and heavy reliance on frame gen.
I don’t understand why developers and publishers aren’t prioritizing spectacle games with simple graphics, like TABS, Mount & Blade, or similar. Use modern processing power to just throw tons of shit on screen; make it totally chaotic and confusing. Huge battles are super entertaining.
The dream of the '10s/20s game industry was VR. Hyper-realistic settings were supposed to supplant the real world. Ready Player One was what big development studios genuinely thought they were aiming for.
They lost sight of video games as an abstraction and drank too much of their own cyberpunk kool-aid. So we had this fixation on Ray Tracing and AI-driven NPC interactions that gradually lost sight of the gameplay loop and the broader iterative social dynamics of online play.
That hasn’t eliminated development in these spheres, but it has bifurcated the space between game novelty and game immersion. If you want the next Starcraft or Earthbound or Counter-Strike, you need to look towards the indie studios and their low-graphics, highly experimental scene (where games like Stardew Valley and Undertale and Balatro live). The AAA studios are just turning out 100-hour-long movies with a few obnoxious gameplay elements sprinkled in.
There’s no better generational leap than Monster Hunter Wilds, which looks like a PS2 game on its lowest settings and still chugs at 24fps on my PC.
Could’ve done your research before buying. Companies aren’t held to standards bc people are uninformed buyers.
Never said I bought it. Why would I buy a 70€ game without running the benchmark tool first?
I just still find it ridiculous that it looks and runs like ass when MH World looks and runs way better on the same PC. Makes me wonder what’s really behind whatever ‘technological advancements’ have been put into Wilds. It’s like it’s an actual scam to make people buy new hardware with no actual benefit.
This is what a remaster used to look like.
Pretty sick if you ask me
I agree whole heartedly
It was a remake not a remaster. The hit boxes weren’t the same.
Technically the original source code was adapted to the SNES, even including some (most?) glitches, so I’d say it’s more like a port or remaster than a remake, even though the graphics and audio were remade.
The difference is academic and doesn’t affect my point.
To be fair, it isn’t just graphics.
Something like Zelda: Twilight Princess HD to Zelda: Breath of the Wild was a huge leap in just gameplay. (And also in graphics, but that’s not my point.)
Idk. Breath of the Wild felt more like a tech demo than a full game. Tears of the Kingdom felt more fleshed out, but even then… the wideness of the world couldn’t hide its shallowness in a lot of places. Ocarina of Time had a smaller overall map, but every region had this bespoke, carefully crafted setting and culture and strategy. By the time you got to Twilight Princess, you had this history to the setting and this weight to this iteration of the Zelda setting.
What could you really do in BotW that you couldn’t do in Twilight? The graphics got a tweak. The amount of running around you did went way up. But the game itself? Zelda really peaked with Majora’s Mask. So much of this new stuff is more fluff than substance.
What? BotW was awesome! There was so much to explore, the world was interesting, the NPCs are good, and so on. OoT and Majora’s Mask are both amazing too, of course, but BotW is a modern masterpiece.
The question is whether “realism” was ever a good target. The best games are not the most realistic ones.
So many retro games are replayable and fun to this day, but I struggle to return to games whose art style relied on being “cutting edge realistic” 20 years ago.
I dunno, Crysis looks pretty great on modern hardware and it’s 18 years old.
Also, CRYSIS IS 18 WHERE DID THE TIME GO?
There’s a joke in there somewhere about Crysis being the age of consent but I just can’t land it right now.
Probably because I’m old enough to remember its release.
I guess the joke can’t run Crysis
Yeah, but it was about 15 years ahead of its time.
Really? Cause I don’t know, I can play Shadow of the Colossus, Resident Evil 4, Metal Gear Solid 3, Ninja Gaiden Black, God of War, Burnout Revenge and GTA San Andreas just fine.
And yes, those are all 20 years ago. You are now dead and I made it happen.
As a side note, man, 2005 was a YEAR in gaming. That list gives 1998 a run for its money.
Did those go for realism though, or were they just good at balancing the more detailed art design with the gameplay?
Absolutely they went for realism. That was the absolute peak of graphics tech in 2005, are you kidding me? I gawked at the fur in Shadow of the Colossus; GTA was insane for detail and size for an open world at the time. Resi 4 was one of the best-looking games that gen, and when the 360 came out later that year it absolutely was the “last gen still looked good” game people pointed at.
I only went for that year because I wanted the round number, but before that Silent Hill 2 came out in 2001 and that was such a ridiculous step up in lighting tech I didn’t believe it was real time when the first screenshots came out. It still looks great, it still plays… well, like Silent Hill, and it’s still a fantastic game I can get back into, even with the modern remake in place.
This isn’t a zero sum game. You don’t trade gameplay or artistry for rendering features or photorealism. Those happen in parallel.
They clearly balanced the more detailed art design with the game play.
GTA didn’t have detail on cars to the level of a racing game, and didn’t have characters with as much detail as Resident Evil, so that it could have a larger world for example. Colossus had fewer objects on screen so it could put more detail on what was there.
Yeah. So like every other game.
Nothing was going harder for visuals, so by default that’s what was happening. They were pushing visuals as hard as they would go with the tech that they had.
The big change isn’t that they balanced visuals and gameplay. If anything the big change is that visuals were capped by performance rather than budget (well, short of offline CG cutscenes and VO, I suppose).
If anything they were pushing visuals harder than now. There is no way you’d see a pixel art deck building game on GOTY lists in 2005, it was all AAA as far as the eye could see. We pay less attention to technological escalation now, by some margin.
Yeah. So like every other game.
Except for the ones that don’t do a good job of balancing the two things. Like the games that have incredible detail but shit performance and/or awful gameplay.
I would say GoW and SotC at least take realism as inspiration, but aren’t realistic. They’re like an idealized version of realism. They’re detailed, but they’re absolutely stylized. SotC landscapes, for example, look more like paintings than places you’d see in real life.
Realism is a bad goal because you end up making every game look the same. Taking our world as inspiration is fine, but it should almost always be expanded on. Know what your game is and make the art style enhance it. Don’t just replicate realism because that’s “what you’re supposed to do.”
Look, don’t take it personally, but I disagree as hard as humanly possible.
Claiming that realism “makes every game look the same” is a shocking statement, and I don’t think you mean it like it sounds. That’s like saying that every movie looks the same because they all use photographing people as a core technique.
If anything, I don’t know what “realism” is supposed to mean. What is more realistic? Yakuza because it does these harsh, photo-based textures meant to highlight all the pores or, say, a Pixar movie where everything is built on this insanely accurate light transfer, path traced simulation?
At any rate, the idea that taking photorealism as a target means you give up on aesthetics or artistic intent is baffling. That’s not even a little bit how it works.
On the other point, I think you’re blending technical limitations with intent in ways that are a bit fallacious. SotC is stylized, for sure, in that… well, there are kaijus running around and you sometimes get teleported by black tendrils back to your sleeping-beauty girlfriend.
But is it aiming at photorealism? Hell yeah. That approach to faking dynamic range, the deliberate crushing of exteriors from interiors, the way the sky gets treated, the outright visible air adding distance and scale when you look at the colossi from a distance, the desaturated take on natural spaces… That game is meant to look like it was shot by a camera all the way. They worked SO hard to make a PS2 look like it has aperture and grain and a piece of celluloid capturing light. Harder than the newer remake, arguably.
Some of that applies to GoW, too, except they are trying to make things look like Jason and the Argonauts more than Saving Private Ryan. But still, the references are filmic.
I guess we’re back to the problem of establishing what people mean by “realism” and how it makes no sense. In what world does Cyberpunk look similar to Indiana Jones or Wukong? It just has no real meaning as a statement.
If anything, I don’t know what “realism” is supposed to mean. What is more realistic? Yakuza because it does these harsh, photo-based textures meant to highlight all the pores or, say, a Pixar movie where everything is built on this insanely accurate light transfer, path traced simulation?
The former is more realistic, but not for that reason. The lighting techniques are techniques, not a style. Realism is trying to recreate the look of the real world. Pixar is not doing that. They’re using advanced lighting techniques to enhance their stylized worlds.
Some of that applies to GoW, too, except they are trying to make things look like Jason and the Argonauts more than Saving Private Ryan. But still, the references are filmic.
Being inspired by film is not the same as trying to replicate the real world. (I’d argue it’s antithetical to it to an extent.) Usually film is trying to be more than realistic. Sure, it’s taking images from the real world, but they use lighting, perspective, and all kinds of other tools to enhance the film. They don’t just put some actors in place in the real environment and film it without thought. There’s intent behind everything shown.
I guess we’re back to the problem of establishing what people mean by “realism” and how it makes no sense. In what world does Cyberpunk look similar to Indiana Jones or Wukong? It just has no real meaning as a statement.
Cyberpunk looks more like Indiana Jones than Persona 5. Sure, they stand out from each other, but it’s mostly due to environments.
I think there’s plenty of games that benefit from realism, but not all of them do. There are many games that could do better with stylized graphics instead. For example, Cyberpunk is represented incredibly well in both the game and the anime. They both have different things they do better, and the anime’s style is an advantage for the show at least. The graphics style should be chosen to enhance the game. It shouldn’t just be realistic because it can be. If realism is the goal, fine. If it’s supposed to be more (or different) than realism, maybe try a different style that improves the game.
Realism is incredibly hard to create assets for, so it costs more money and usually takes more system resources. For the games that are improved by it, that’s fine. There are a lot of games that could be made on a smaller budget, faster, run better, and look more visually interesting if they chose a different style, though. I think it should be a consideration that developers are allowed to make, but most are just told to do realism because it’s the “premium” style. They aren’t allowed to do things that are better suited to their game. I think this is bad, and it also leads to a lack of diversity in styles.
I don’t understand what you’re saying. Or, I do, but if I do, then you don’t.
I think you’re mixing up technique with style, in fact. And really confusing a rendering technique with an aesthetic. But beyond that, you’re ignoring so many games. So many. Just last year, how do you look at Balatro and Penny’s Big Breakaway and Indiana Jones and go “ah, yes, games all look the same now”. The list of GOTY nominees in the TGAs was Astro Bot, Balatro, Wukong, Metaphor, Elden Ring and Final Fantasy VII R. How do you look at that list of games and go “ah, yes, same old, same old”.
Whenever I see takes like these I can’t help but think that people who like to talk about games don’t play enough games, or just think of a handful of high profile releases as all of gaming. Because man, there’s so much stuff and it goes from grungy, chunky pixel art to lofi PS1-era jank to pitch-perfect anime cel shading to naturalistic light simulation. If you’re out there thinking games look samey you have more of a need to switch genres than devs to switch approach, I think.
By “all games look the same” I’m being hyperbolic. I mean nearly all AAA games and the majority of AA games (and not an insignificant number of indies even).
Watch this video. Maybe it’ll help you understand what I’m saying.
Whenever I see takes like these I can’t help but think that people who like to talk about games don’t play enough games, or just think of a handful of high profile releases as all of gaming.
Lol. No. Again, I was being hyperbolic and talking mostly about the AAA and AA space. I personally almost exclusively play indies who know what they’re trying to make and use a style appropriate to it. I play probably too many games. I also occasionally make games myself, I was the officer in a game development club in college, and I have friends in the industry. I’m not just some person who doesn’t understand video games.
STALKER is good, though I mostly played a lot of Anomaly, and I’m not sure STALKER was ever known for bleeding-edge graphics.
We should be looking at more particles, more dynamic lighting, more effects. Realism is for sure a goal, just not in the way you think; Pixar movies have realistic lighting and shadows but aren’t “realistic”.
After I started messing with Cycles in Blender I went back to wanting more “realistic” graphics; it’s better for stylized games too.
But yeah, I want the focus to shift towards procedural generation (I like how Houdini and Unreal approach it right now), more physics-based interactions, elemental interactions, real-time fire, smoke, fluid, etc. Destruction is the biggest disappointment; I was really hoping for an FPS that let me spend hours bulldozing and blowing up the map.
Destruction is the biggest disappointment; I was really hoping for an FPS that let me spend hours bulldozing and blowing up the map.
Ever heard of The Finals?
The Finals is included in my disappointment
Factorio and Balatro
Like CGI and other visual effects, realism has some applications that can massively improve the experience in some games. Just like how lighting has a massive impact, or sound design, etc.
Chasing it at the expense of gameplay or art design is a negative, though.
A Link to the Past > Ocarina of Time
Fight me
Idk, I’d say that pursuing realism is worthy, but you hit diminishing returns pretty quickly when all the advances are strictly in one (or I guess two, with audio) sense. Graphical improvements massively improved the experience of a game moving from the NES or Game Boy to the SNES, and again to the PS1 and N64. The most impressive leap, imo, was PS1/N64 to PS2/Xbox/GameCube. After that, I’d say we got 3/4 of that return from improvements in the PS3 generation, 1/2 the improvement in the PS4 gen, 1/5 the improvement for the PS5, and 1/8 the improvement moving on to the PS5 Pro. I’d guess if you plotted out the value add, with perceived value on the y-axis and the time series or compute ability or texture density or whatever on the x-axis, it’d probably look a bit like a square root curve.
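Something like this, if you want the hand-wavy curve written out (k is just an arbitrary scale factor I’m making up):

$$v(x) = k\sqrt{x}, \qquad v'(x) = \frac{k}{2\sqrt{x}} \longrightarrow 0 \quad \text{as } x \to \infty$$

i.e. the marginal perceived value of each extra unit of compute keeps shrinking, which roughly matches the generational fractions above.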
I do think that there’s an (understandably, don’t get me wrong) untapped frontier in gaming realism, in that games don’t really engage your sense of touch or any of the subsets thereof. The first step in this direction is probably vibrating controllers, and I find that they definitely do make a game feel more immersive. Likewise, few games engage your proprioception (that is, your knowledge of your body’s position in space), though there have been attempts to engage it via the Switch, Wii, and VR. There are, of course, enormous technical barriers, but I think there’s very clearly a good reason why a brain interface is thought of as the holy grail of gaming.
Having a direct brain-interface game that’s realistic enough to overcome the uncanny valley would destroy people’s lives. People would inevitably prefer their virtual environment to the real one. They’d end up wasting away, plugged into some machine. It would lend serious credence to the idea of a simulated universe, and reduce the human experience by replacing it with an improved one. Shit, give me a universe wherein I can double-jump, fly, or communicate with animals, and I’d have a hard time returning to this version.
We could probably get close with a haptic feedback suit, a mechanism that allows you to run/jump in any direction, and a VR headset, but there would always be something tethering you to reality. A direct brain-to-machine interface would have none of that; it would essentially be hijacking our own electrical neural network to run simulations. Much like humans trying to play Doom on literally everything. It would be as amazing as it was destructive, finally realizing the warning so many parents gave before its time: “that thing’ll fry your brain.”
Tbf, it’s kinda bullshit that we can’t double jump IRL. Double jumping just feels right, like it’s something we should be able to do.
Yeah, no, it’d likely be really awful for us. I mean, can you imagine what porn would be like on that? That’s a Fermi paradox solution right there. I could see the tech having a lot of really great applications too, like training simulations for example, but the video game use case is simultaneously exhilarating and terrifying.
I agree generally, but I have to offer a counterpoint with Kingdom Come: Deliverance. I only just got back into it after bouncing off in 2019, and I wish I hadn’t stopped playing. I have a decent-ish PC and it still blows my entire mind when I go roaming around the countryside.
Like Picard said above, in due time this too will look aged, but even 7 years on, it looks and plays incredible even at less-than-highest settings. IMHO the most visually impressive game ever created (disclaimer: I haven’t seen or played Horizon). Can’t wait to play KC:D 2!
It’s the right choice for some games and not for others. Just like cinematography, there’s different styles and creators need to pick which works best for what they’re trying to convey. Would HZD look better styled like Hi-Fi Rush? I don’t really think so. GOW? That one I could definitely see working more stylized.
I would argue that late SNES-era games look far better than their early 3D-era follow-ups.
Late 16-bit games had to lean into distinct art directions, which allowed them to stand the test of time.
Let’s compare two completely separate games to a game and a remaster.
Generational leaps then:
Good lord.
EDIT: That isn’t even the Zero Dawn remaster. That is literally two still-image screenshots of Forbidden West on both platforms.
Good. Lord.
Yeah no. You went from console to portable.
We’ve had absolutely huge leaps in graphical ability. Denying that we’re getting diminishing returns now is just ridiculous.
We’re still getting huge leaps. It simply doesn’t translate into massively improved graphics. What those leaps do result in, however, is major performance gains.
I have played Horizon Zero Dawn, its remaster, and Forbidden West. I am reminded how much better Forbidden West looks and runs on PS5 compared to either version of Zero Dawn. The differences are absolutely there, it’s just not as spectacular as the jump from 2D to 3D.
The post comes off like a criticism of hardware not getting better enough, fast enough. Wait until we can create dirt, sand, water, or snow simulations in real time, instead of having to fake the look of physics. Imagine real simulations of wind and heat.
And then there’s Gaussian splatting, which absolutely is a huge leap. Forget trees practically being arrangements of PNGs; what if each and every leaf and branch had volume? What if leaves actually fell off?
Then there’s efficiency. What if you could run Monster Hunter Wilds at max graphics, on battery, for hours? The first gen M1 Max MacBook Pro can comfortably run Baldur’s Gate III. Reducing power draw would have immense benefits on top of graphical improvements.
Combined with better and better storage and VR/AR, there is still plenty of room for tech to grow. Saying “diminishing returns” is like saying that fire burns you when you touch it.
What those leaps do result in, however, is major performance gains.
Which many devs will make sure you never feel, by “optimizing” the game only for the most bleeding-edge hardware.
Then there’s efficiency. What if you could run Monster Hunter Wilds at max graphics, on battery, for hours? The first gen M1 Max MacBook Pro can comfortably run Baldur’s Gate III. Reducing power draw would have immense benefits on top of graphical improvements.
See, if games were made with a performance-first mindset, that’d be possible already. Not to dunk on performance gains, but there’s a saying that every time hardware gets faster, programmers make their code slower. I mean, you can totally play emulated SNES games with minimal impact compared to leaving the computer idling.
Saying “diminishing returns” is like saying that fire burns you when you touch it.
Unless chip fabrication can figure out a way to make transistors “stack” on top of one another, effectively making 3D chips, they’ll continue to be “flat” sheets that can only increase core count horizontally. Single-core frequency peaked in the mid-2000s; from then on it’s been about adding more cores. Even the gains from an RTX 5090 vs an RTX 4090 aren’t that big. Now compare that with the gains from a GTX 980 vs a GTX 1080.
I am reminded how much better Forbidden West looks and runs on PS5 compared to either version of Zero Dawn.
Really? I’ve played both on PS5 and didn’t notice any real difference in performance or graphics. I did notice that the PC version of Forbidden West has vastly higher minimum requirements, though. Which is the opposite of performance gains.
Who the fuck cares if leaves are actually falling off or spawning in above your screen to fall?
And BG3 has notoriously low minimums; it is the exception, not the standard.
If you want to see every dimple on the ass of a horse then that’s fine, build your expensive computer and leave the rest of us alone. Modern Next Gen Graphics aren’t adding anything to a game.
HFW runs like butter on PC, and personally I noticed a big difference between HZD and HFW on PS5.
I’m assuming you’re playing on a bad TV. I have a 4k120 HDR OLED panel, and the difference is night and day.
I also prefer to enjoy new things, instead of not enjoying new things. It gives me a positive energy that disgruntled gamers seem to be missing.
I’m playing on a normal TV because I’m not made of money.
So you’re claiming new hardware isn’t perceivably better, despite not using a display which is actually capable of displaying said improvements. I use such a display. I have good vision. The quality improvement is extremely obvious. Just because not everyone has a high end display doesn’t mean that new hardware is pointless, and that everyone else has to settle for the same quality as the lowest common denominator.
My best hardware used to be Intel on-board graphics. I still enjoyed games, instead of incessantly complaining how stagnant the gaming industry is because my hardware isn’t magically able to put out more pixels.
The PS5 is a good console. Modern GPUs are better than older ones. Games look better than they did five or ten years ago. Those are cold, hard, unobjectionable facts. Don’t like it? Don’t buy it.
I do like it.
The fact that the Game Boy Advance looks that much better than the Super Nintendo, despite being a handheld, battery-powered device, is insane
Is it that much better? The colours just look more saturated to me
There’s noticeably more detail, especially along the coastline. Also, the more saturated colors improve contrast
The GBA just has reworked art. The SNES could easily do the same thing.
Because most GBA games were meant to be desaturated due to the terrible screen
What game is the first one?
Final Fantasy 4 (2 in the USA)
It appears to be a Final Fantasy game, so likely either 4 or 6, aka 2 or 3 in the US
It is baffling to me that people hate cross-gen games so much. Like, how awful for PS4 owners that they don’t have to buy a new console to enjoy the game, and how awful for PS5 owners that the game runs at the same fidelity at over 60 FPS, or at significantly higher fidelity at the same frame rate.
They should have made the PS4 version the only one. Better yet, we should never make consoles again, because they can’t make you comprehend four dimensions, so they’ll never be new enough.
The point isn’t about cross generation games. It’s about graphics not actually getting better anymore unless you turn your computer into a space heater rated for Antarctica.
It’s a pointless point. Complain about power draw. Push ARM.
ARM isn’t going to magically make GPUs need less brute force energy in badly optimized games.
…So push ARM. By optimising games.
EDIT: This statement is like saying “Focusing on ARM won’t fix efficiency because we aren’t focusing on ARM”.
I mean, how much more photorealistic can you get? Regardless, the same game would look very different in 4K (real, not what consoles do) vs 1080p.
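For scale, “real” 4K is a bigger jump than it sounds:

$$3840 \times 2160 = 8{,}294{,}400 \ \text{px} \qquad\text{vs}\qquad 1920 \times 1080 = 2{,}073{,}600 \ \text{px}$$

so it’s pushing exactly four times the pixels of 1080p.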
The lighting in that image is far, far from photorealistic. Light transport is hard.
That’s true, but realistic lighting still wouldn’t make anywhere near the same amount of difference that the other example shows.
Kind of like smartphones. They all kind of blew up into this rectangular slab, and…
Nothing. It’s all the same shit. I’m using a OnePlus 6T from 2018, and I think I’ll have it easily for another 3 years. Things eventually just stagnate.
I was hoping that eventually smartphones would evolve to do everything. Especially when things like Samsung DeX were introduced, it looked to me like maybe in the future phones could replace desktops, running a full desktop OS when docked and some simplified mobile UI + power saving when in mobile mode.
But no, I only have a locked-down computer.
Yeah, whatever happened to that? It was such a good idea, and could have been absolutely game-changing if it had actually been marketed to the people who would benefit the most from it
I used it for a while when I worked two jobs. I’d clock out of job 1 (I had an agreement with them to be allowed to use the screen and input devices at my desk for job 2), then I’d plug in my Tab S8 and get to work, instead of having to carry two chunky laptops.
So it still exists! What I noticed is that a Snapdragon 8 Gen 1 feels underpowered and that Android (and this is the bigger issue) does not have a single browser that works as a full-fledged desktop version. All the browsers I tested had some shortcomings, especially with drag and drop or context menus or whatever. Like, things work, but you’re constantly reminded that you’re running a mobile OS: weird behavior, oversized context menus, and so on.

I wish you could launch into a Linux VM instead of the DeX UI. Or for Samsung to double down on the concept. The Motorola Atrix was so ahead of its time. Like, your phone transforming into your tablet, into your laptop, into your desktop. How fucking cool is that?
Apple would be in a prime position: their entire ecosystem is now ARM-based and they have the chips with enough power. But it’s not their style to do something cool to threaten their bottom line. Why sell one phone when you can sell a phone, laptop, tablet, and desktop separately?

It’s super easy to forget, but Ubuntu tried to do it back in the day with Convergence as well, and amusingly this article also compares it to Microsoft’s solution on Windows Phone. It’s a brilliant idea, but apparently no corporation with the ecosystem to make it actually happen has the will to risk actually changing the world, despite every company talking about wanting an “iPhone moment”
Apple would be in a prime position: their entire ecosystem is now ARM-based and they have the chips with enough power. But it’s not their style to do something cool to threaten their bottom line. Why sell one phone when you can sell a phone, laptop, tablet, and desktop separately?
Let’s be real, Apple’s biggest risk would be losing the entire student and young professional market by actually demonstrating that they don’t need a MacBook Pro to use the same 5 web apps that would work just as well on a decent Chromebook (if such a thing existed)
Linux VM
Or just something like Termux, a terminal emulator for Android. Example screenshot (XFCE desktop over a VNC server); I didn’t know what else to fit in there:
Full desktop apps, running natively under Android. For better compatibility, Termux also has proot-distro (similar to chroot), where you can have… let me copy-paste:
Supported distributions (format: name < alias >):
* Alpine Linux < alpine >
* Arch Linux < archlinux >
* Artix Linux < artix >
* Chimera Linux < chimera >
* Debian (bookworm) < debian >
* deepin < deepin >
* Fedora < fedora >
* Manjaro < manjaro >
* openKylin < openkylin >
* OpenSUSE < opensuse >
* Pardus < pardus >
* Ubuntu (24.04) < ubuntu >
* Void Linux < void >

Install selected one with: proot-distro install <alias>
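So getting a usable Debian shell looks roughly like this (a sketch; `login` is proot-distro’s companion command to `install`, and the tool itself comes from the Termux repos):

pkg install proot-distro      # grab the tool from Termux’s package manager
proot-distro install debian   # download and unpack the Debian rootfs
proot-distro login debian     # drop into a shell inside the Debian userland

From there, apt should work as usual inside the guest.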
Though there is apparently some performance hit. I just prefer Android, but maybe you could even run full LibreOffice under some distro this way.
If it can be done by Termux, then someone like Samsung could definitely make something like that too, but integrated with the system and with more software available in their repos.
What’s missing from the screenshot but is interesting too: an NGINX server (reverse proxy, lazy file sharing, serving wget-mirrored static websites), kiwix-serve (serving ZIM files, including the entire Wikipedia, from the SD card), and Navidrome (a music server).
And brought to any internet-connected computer via a Cloudflare Quick Tunnel (because it needs neither an account nor a domain name). The mobile data upload speed will finally matter, a lot. You get the idea: GNU+Linux. And Android already has the Linux kernel part.
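The tunnel part is a one-liner, for what it’s worth (a sketch; assumes cloudflared is installed, and 8080 is a placeholder port where something like NGINX or kiwix-serve is listening):

cloudflared tunnel --url http://localhost:8080
# prints an ephemeral https://<random>.trycloudflare.com URL; no account, no domain name needed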
Linux on DeX was a thing but was killed by Samsung
Yeah, I remember trying it and while it works the performance hit was too big for my use case. But it’s been a while!
Fortunately I’m in a position where I don’t have to juggle two jobs anymore so I barely use Dex these days.
Which in reverse is also why Samsung isn’t investing a lot into it, I suppose - it’s a niche use case. I would guess that, generally, people with a desktop setup would want something with more performance than a mobile chip.
There is an official Android desktop mode; I tried it, and it isn’t great, of course, but my phone manufacturer (OnePlus) has clearly put no work into making it functional
The OnePlus 6 line of phones is one of the very few with good Linux support (I mean, GNU/Linux support). If custom ROMs no longer cut it, you can get even more years with Linux. I had an iPhone, was eventually fed up, got an Android, aaand I realized I am done with smartphones lol. Gimme a laptop with phone stuff (push notifications without killing the battery, VoLTE) and my money is yours, but no such product exists.
One company put a stupid fucking notch in their screen and everyone bought that phone, so now every company has to put a stupid fucking notch in the screen
I just got my tax refund. If someone can show me a modern phone with a 9:16 aspect ratio and no notch, I will buy it right now
I miss physical keyboards on phones
What do you expect next? Folding phones? That would be silly!
Games did teach me about diminishing returns though