
  • The spirit of your point is right, but: game patches existed back then. The first patch for Half-Life was 1.0.0.8, released in 1999 (the release version was 1.0.0.5). I cannot find the patch notes or the exact release date, as my search results are flooded with “25th anniversary patch” results.

    What was true is that players patching their games was not a matter of course for many years. It was a pain in the ass. The game didn’t update itself. You didn’t have a launcher to update your game for you. No. Instead, you had to go to the game’s website and download the patch executable yourself. And it wasn’t just a simple “Game 1.1 update.exe” patch. That’d be too easy. It was a patch from 1.0.9 to 1.1, and if you were on 1.0.5.3 you had to get the patch for 1.0.5.3 to 1.0.6.2, then a patch from that to 1.0.8, then a patch from that to 1.0.9. Then you had to run all of those in sequence. This is a huge, huge part of why people eventually fell in love with Steam back in the day. Patches were easy and “just worked”; it was amazing compared to what came before.
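
    To make the tedium concrete, here’s a minimal sketch of the chain you effectively had to resolve by hand. The versions follow the example above, but the patch filenames are made up:

    ```python
    # Hypothetical sketch of the old manual patching flow: each patch only
    # upgrades one specific version to the next, so you walk the whole chain.
    PATCHES = {
        "1.0.5.3": ("patch_1053_to_1062.exe", "1.0.6.2"),
        "1.0.6.2": ("patch_1062_to_108.exe", "1.0.8"),
        "1.0.8": ("patch_108_to_109.exe", "1.0.9"),
        "1.0.9": ("patch_109_to_11.exe", "1.1"),
    }

    def patch_chain(installed: str, target: str) -> list[str]:
        """Return the patch executables to download and run, in order."""
        chain, version = [], installed
        while version != target:
            if version not in PATCHES:
                raise ValueError(f"no patch path from {version} to {target}")
            exe, version = PATCHES[version]
            chain.append(exe)
        return chain

    print(patch_chain("1.0.5.3", "1.1"))
    # ['patch_1053_to_1062.exe', 'patch_1062_to_108.exe',
    #  'patch_108_to_109.exe', 'patch_109_to_11.exe']
    ```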

    The end result was that patches existed, but the game people remember (and played) was by and large defined by what it was at release. Also, console games weren’t patched, although newer printings of a game would see updates. Ocarina of Time’s 1.0 release was exclusive to Japan; the North American release was 1.1 for the first batch of sales. After the initial batch sold out, it was replaced by 1.2. That was common back then. As far as I know there was no way for consumers to get their copies updated, or to even find out about the updates. But they did exist.



  • Paying over a third of all revenue generated from searches on Apple’s platform. That’s incredible. I’m not a lawyer, so I have no idea how this will work out legally, but I have a hard time parsing such an enormous revenue share as anything other than an aggressive attempt to stymie competition. Flat dollar payments are easier to read as less damning, but willingly giving up that much revenue from a source suggests the revenue itself is no longer the primary target. The target is the competitive advantage of keeping (potential) competitors from accessing that source.



  • Typical corporate greed in that sense. It’s stupid but I’m not at all surprised by that attitude.

    The thing is, even if they were morally right in that sense… it’s already too late. This is trying to close the barn door not just after the horse left, but after the horse already ran off and made it two states over. There’s definitely value to LLMs in having more data and more up-to-date data, but Reddit is far from the only source and I cannot imagine that they possess enough value there to have any serious leverage.

    Reddit would/will survive being taken out of internet search results. Not without costs, though: it will arrest their growth rate (or accelerate their shrink rate, as appropriate) and make people less interested in using the site.




  • That really depends on what their goal is.

    From a business perspective it’s not worth fighting to eliminate 100% of ad block use. The investment is too high. But if they can eliminate 50% or 70% or 90% of ad block use on YouTube? That could be worth the effort for them. If they can “win” on Chrome and make it a bit annoying for Firefox, that would likely be enough for Google to declare it a huge success.

    People willing to really dig all the way in to get a solution they desire are not the norm. Google can be OK with the 1% of us out there as long as we aren’t also making it possible for another huge chunk of people to piggyback off it effortlessly.



  • The stuff that made Vista shitty for most end users wasn’t actually fixed by W7. For the most part, W7 was a marketing refresh released after Vista had already been “fixed.” Not saying it was a small update or anything like that, just that the broken stuff had been more or less fixed beforehand.

    Vista’s issues at launch were almost universally a result of the change to the driver model. Hardware manufacturers, despite MS delaying things for them, still did not have good drivers ready at release. They took years after the fact to get good, stable drivers out there. By the time that happened, Vista’s reputation as a pile of garbage was well cemented. W7 was a good chance to reset that reputation while also implementing various other major upgrades.


  • Cities Skylines sees a fairly decent improvement going to AMD’s 3D V-Cache chips (a 17% speedup here for the 5800X3D). What’s your ability to increase the budget to go for a 7800X3D look like? If this is a genre of game you like and you want to hold off as long as possible between upgrades, it might be worth springing for the extra. The difference the 3D cache provides in some games is rather extraordinary. City builders, automation games, and similar titles tend to benefit the most. AAA games tend to benefit the least (some with effectively no gain).

    A 7600X should be more than capable of handling the game, though. So it’s not a question of need, but whether it’s worth it to you.

    You do not want 4800 CL40 RAM, though; that’s too slow. I’d strongly recommend going for 32GB of RAM as well; 16GB can be gobbled up quickly, especially if you want to use mods in Cities Skylines.

    Going up even to DDR5-6000 is not much of a price increase. I’d suggest 6000 and something in the range of CL36-CL40. There are a lot of 32GB kits with those specs in the ~$90 range. I would not build a gaming system today with 16GB of RAM.
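
    If you want to see why 4800 CL40 is slow, the usual napkin number is first-word latency: CL cycles divided by the memory clock, which runs at half the transfer rate. A quick sketch of that standard formula:

    ```python
    # Approximate first-word latency in nanoseconds.
    # CL is in clock cycles; the memory clock is half the MT/s rate,
    # so latency_ns = CL / (MT/s / 2 / 1000) = 2000 * CL / MT/s.
    def latency_ns(mts: int, cl: int) -> float:
        return 2000 * cl / mts

    for mts, cl in [(4800, 40), (6000, 40), (6000, 36)]:
        print(f"DDR5-{mts} CL{cl}: {latency_ns(mts, cl):.1f} ns")
    # DDR5-4800 CL40: 16.7 ns
    # DDR5-6000 CL40: 13.3 ns
    # DDR5-6000 CL36: 12.0 ns
    ```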



  • It’s also because their current shows suck, and because any shows that are actually good get shitcanned after season 2, since Netflix sees less subscriber growth from a show after two seasons.

    I’m always surprised at how often other people (not you) will defend this practice from Netflix. It’s a classic case of following the data in a stupid way. If their data shows that interest drops off after two seasons, I don’t doubt it.

    But… that comes with a cost. They have built a reputation as a company that doesn’t properly finish the shows it starts, one that will leave viewers hanging. That makes it harder to get people invested in a new series, even one that’s well reviewed. Why get interested in something you know will end on a cliffhanger?

    That kind of second-order impact from their decision isn’t going to show up in the data. Doesn’t change that it happens all the same.



  • ME2 is a good game in isolation, but I think it played a big part in getting BioWare where they are now.

    ME2 saw them move far, far more in the action-RPG direction that was wildly popular at the time, with a narrative that was in retrospect just running in place (ME2 contributes effectively nothing towards the greater plot, and zero major issues are introduced if it is excised from the trilogy). I feel the wild success ME2 saw after going in this direction caused BioWare to (a) double down on trend chasing, and (b) abandon one of their core strengths: strong, cohesive narratives. ME3 chased multiplayer shooter trends, DA:I and ME:A both chased open-world RPG trends, Anthem chased the live service trend, and the first try at DA4 chased more live service stuff before Anthem launched to shit and they scrapped the whole thing to start over.

    All the while, from what I saw firsthand (of those I played) or read about secondhand (of those I did not play), none of those games put any serious focus on BioWare’s bread and butter of well-written narratives. ME3 in particular is a narrative mess, with two solid payoffs (the Krogan and Geth-Quarian arcs) and the rest being some of the worst writing I’ve seen in a major video game.

    ME2 was great. ME2 also set BioWare on a doomed path.




  • The movie made sense IMO; its main issue is that so much of the crew is hollow. Their characters are threadbare; they’re on screen for the express purpose of dying. Even if we don’t pick up on it specifically, we pick up on it subconsciously, and they feel off. The geologist and biologist that die early on have basically one trait each (the geologist is a fake tough guy, the biologist is nerdy-nervous). They don’t feel like real people.

    I liked Prometheus a lot, but the very real problems with it would, in my estimation, require way more than a director’s cut to fix. Unless there’s a lot of filmed character development sitting out there, I suppose. The insignificant characters needed to be replaced with a far smaller number of significant characters to join the handful of existing significant ones. It basically requires a rewrite.


  • It’s useless for answering a question that wasn’t asked, sure. But I didn’t pretend to answer that question. What it is useful for is answering the topic question. You know, the whole damn point?

    How much of a factor off do you think the estimate is? You think they need three drives of redundancy each? Ten? Chances are they’re paying half (or less) of retail pricing for storage drives. The estimate of what they could get with $100m was also 134 EB, a mind-boggling sum of storage. I wouldn’t be surprised if they’re only adding on the order of 1 EB/year of needed storage. There’s also a lot more room in their budget than 0.34%.

    The point is to get a quick and simple estimate to show that there really will not be a problem with Google acquiring sufficient storage. If you want a very accurate estimate of their costs, you’ll need data that we do not have. I was not aiming for a highly accurate estimate of their costs. I made that clear right from the beginning.
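
    For anyone who wants to redo the napkin math themselves, the whole method is just budget ÷ price per TB ÷ redundancy. The price and redundancy below are deliberately conservative assumptions, not known figures; plug in whatever you believe instead:

    ```python
    # Back-of-the-envelope storage estimate. Every input is an assumption.
    budget_usd = 100e6    # hypothetical storage budget
    usd_per_tb = 12.0     # assumed bulk price per TB (retail is higher)
    redundancy = 3        # assume every byte lives on three drives

    raw_tb = budget_usd / usd_per_tb
    usable_tb = raw_tb / redundancy
    print(f"raw: {raw_tb / 1e6:.1f} EB, usable: {usable_tb / 1e6:.1f} EB")
    # raw: 8.3 EB, usable: 2.8 EB
    ```

    Even with those pessimistic inputs, that covers an estimated ~1 EB/year of growth several times over.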

    If each video were on a single hard drive, the site would not be able to function, as even the fastest multi-actuator hard drive can only do 524 MB/s in a perfect vacuum.

    The most popular videos are all going to be kept in RAM; they don’t read them off disk with every single view request. If you wanted a comment going over the finer details of server architecture, you shouldn’t have looked at the one saying it was doing back-of-the-envelope math on storage costs only, eh?
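
    That single-drive point is easy to sanity-check with made-up but plausible numbers for bitrate and concurrent viewers:

    ```python
    # Why one popular video can't live on one drive. The bitrate and
    # viewer count are illustrative assumptions, not measured figures.
    drive_mb_s = 524       # fastest multi-actuator HDD, best case
    bitrate_mbit = 5       # assumed bitrate of a 1080p stream, Mbit/s
    viewers = 10_000       # assumed concurrent viewers of one video

    demand_mb_s = viewers * bitrate_mbit / 8   # Mbit/s -> MB/s
    print(f"demand: {demand_mb_s:,.0f} MB/s vs one drive: {drive_mb_s} MB/s")
    # demand: 6,250 MB/s vs one drive: 524 MB/s
    ```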