  • Hah. You do you. I get how it’d be obnoxious to be called out, but man, it’s not my fault that you chose the worst possible example for this. Like, literally the worst iteration of Windows for the specific metric you called out, in a clearly demonstrable way that a ton of people measured because it was such a meme.

    You can block me, but “they are what they are” indeed.

    Incidentally, this is a classic opportunity to remind people that blocking on AP applications sucks ass and the only effect it has is for the blocker to stop being able to see what the blockee is saying about them while everybody else still gets access to both. Speaking of software degradation, somebody should look into that.
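
    For the unfamiliar, here’s a minimal sketch of the failure mode, with hypothetical types and names (actual server implementations vary):

    ```typescript
    // Hedged sketch: in many ActivityPub implementations a Block is only
    // enforced as a view filter on the blocker's own instance.
    interface Post { authorId: string; content: string }

    // Hypothetical per-user block list held by the blocker's instance.
    const blockedByViewer = new Set<string>(["blockee@remote.example"]);

    // The blocker's timeline drops the blockee's posts...
    function viewerTimeline(posts: Post[]): Post[] {
      return posts.filter(p => !blockedByViewer.has(p.authorId));
    }

    // ...but instances serving everyone else (including the blockee)
    // apply no such filter: public posts stay public, so the blockee
    // can keep reading and replying about the blocker.
    function everyoneElsesTimeline(posts: Post[]): Post[] {
      return posts; // no federated enforcement of the block
    }
    ```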


  • Myyyyyeeeeh. A lightweight distro or a contemporaneous distro, sure.

    If I’m running GPU-accelerated Steam, tons of tabs in Firefox, and the same highly customized KDE desktop full of translucent components and extra animations, I am willing to bet they’d both chug.

    Which is what the conversation is about: new software doesn’t suck, it’s doing more stuff.

    For sure, all things being equal Linux does run lighter on RAM and VRAM, so if you’re running something specifically memory-limited, where Windows and Linux fall on opposite sides of overflowing the available memory, you’ll definitely see better performance on Linux. But that’s a threshold effect, not an inherent issue of poorly made software carrying a huge performance overhead.


  • Then you’re either lying about it or haven’t booted a newer PC. Fast Boot was a back-of-the-box feature for Windows 8 for a reason: how slow Win7 was to boot had become a huge meme at the time.

    If your 2020s PC with Windows 11 is sitting on the Windows logo for 45 seconds like Win7 does (as seen in the benchmarks above), then you need some tech support, because something is clearly not working as expected. I don’t think even my weaker Win11 machines take longer than 10 seconds from the start of boot to the password screen.

    That may be true anyway, because the tiny hybrid laptop I’m using to write this is reporting 2-5% CPU utilization even with a literal hundred tabs open in this browser. So… yeah, either you have a knack for hyperbole or something’s broken.


  • Except the Linux userbase has been saying that exact thing for the past ten years. So, again: has Linux also degraded in sync, or, hear me out here, is this mostly a nostalgia thing that makes you forget the kludgy performance issues of the software you used when you were younger, while things have actually gotten snappier over time across the board?

    As a current dual booter I’ll say that Windows and Linux don’t feel fundamentally different these days, for good and ill. Windows has a remarkably crappy and entirely self-inflicted issue with its online-search-in-Start-menu feature, which sucks but is at least toggleable. Otherwise I have KDE and Win11 set up the same way and they both work pretty much the same. And both are measurably better than their respective iterations from 10, let alone 15 or 20, years ago.


  • That’s… not really true, and not what that link shows. Those latency tests still show then-modern devices topping the list. They’re arguing that some then-modern low-end devices have more button-to-screen latency than older hardware (which they would, given he’s comparing single-threaded, single-tasking bare-metal stuff from the 80s spitting signals out to a CRT against laptops with integrated graphics). And they’re saying that at the time (I presume the post dates from 2017, when the testing ends), this wasn’t well understood because people were benching the hardware and not the end-to-end latency factoring in the I/O… which was kinda true then but absolutely not anymore.

    I’d get in the weeds about how much or little sense it makes to compare an Apple II drawing text on a CRT to typing in a PowerShell/Linux terminal window inside a desktop environment, but that’d be kind of unfair. Ten years ago this wasn’t a terrible observation to make with the limited tools the guy had available, and this sort of post made it popular to think about latency and made manufacturers of controllers, monitors and GPUs focus on it more.

    What it does not show, though, is that an Apple II was faster than a modern gaming PC by any metric. Not in 2017, and sure as hell not in 2026, when 240Hz monitors are popular, 120Hz TVs are industry-standard, VRR is widely supported and keyboard, controller, monitor and GPU manufacturers are obsessed with latency measurements. It’s not just fallacious, it’s wrong.
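
    To put rough numbers on the display side alone, here’s a back-of-the-envelope sketch (a simplifying assumption, not a full latency model; it ignores input polling, compositing and pixel response):

    ```typescript
    // Assume a finished frame waits, on average, half a refresh interval
    // before the display scans it out.
    function avgScanoutWaitMs(refreshHz: number): number {
      return 1000 / refreshHz / 2;
    }

    for (const hz of [60, 120, 240]) {
      console.log(`${hz} Hz: ~${avgScanoutWaitMs(hz).toFixed(1)} ms average scanout wait`);
    }
    // 60 Hz: ~8.3 ms, 120 Hz: ~4.2 ms, 240 Hz: ~2.1 ms. Part of why
    // modern high-refresh setups keep closing the gap on CRT-era latency.
    ```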


  • When was the last time you booted a 2011 machine? Because man, is that not true.

    And that’s a 2016-2017 era PC.

    Windows 7 didn’t even have Fast Boot support at all. I actively remember recommending that people let their PCs sit for a couple of minutes after booting, so that Windows could finish whatever the hell it was trying to do in the background instead of clogging up whatever else you were trying to do.

    Keeping my old hardware around compulsively really impacts my perception of this whole “things were better when I was a teenager” stuff.


  • That’s some nonsense, though.

    For one thing, it’s one of those tropes people have been repeating for 30 years, so it kinda stops making sense after a while. For another, the reason it doesn’t make sense is that it doesn’t account for modern computers doing more now than they did then.

    In 2016 I had a 970 that’s still in an old computer I use as a retro rocket, and I can promise you that, wonderful as that thing was, I couldn’t have been playing Resident Evil on it this week. So yeah, I notice.

    And I had a Galaxy S7 then, which is still in use as a bit of a display, and I assure you my current phone is VERY noticeably faster, even discounting the fact that it’s displaying 120fps rather than 60.

    Old people have been going “things were better when I was a kid” for millennia. I’m not assuming we’re gonna stop now, but… maybe we should.


  • See, that’s the type of justification that doesn’t sit well with me and that the article is doing all over the place.

    Is the Steam Deck a very successful handheld PC? Sure. Compared to the boutique stuff sold on Indiegogo by Chinese manufacturers it’s probably an order of magnitude larger.

    Except it’s also not priced like one of those (or wasn’t at launch, anyway); it’s priced like a console, with the LCD model (while it lasted) sitting right alongside the Switch OLED and a bit cheaper than the Switch 2.

    And by that metric it’s done poorly, with best estimates placing it right alongside the PS Vita lifetime, at the absolute best. The bar for success on that scale isn’t “selling millions”, it’s selling tens of millions, which the Deck has struggled to do.

    So, all fanboyism aside: The Deck did well for a handheld PC, but kinda failed in the attempt to bridge the gap between those and handheld consoles. That, if you’re keeping track, is “reporting, not an opinion piece”.

    This?

    Valve’s Steam Deck has been a runaway success. While the beloved handheld has sold less than most major console handhelds, it’s become a valuable system for many to take their PC games on the go.

    This is an opinion piece.



  • MGS only made it to Windows in 2000. OoT obviously never did, officially.

    Where I was, the games running in demo PCs and net cafés in 98/99 were Quake 3, Unreal and, believe it or not, yeah, Baldur’s Gate. Because BG1 already had pretty much the same MP as BG3 and people would pay per seat to play co-op runs of the original.

    For the PC crowd BG1 and Starcraft were on a pretty even playing field in terms of scope perception.

    The thing is, at the time counting budgets wasn’t much of a consideration. For one thing, most of them weren’t publicly known at all, beyond the extreme outliers you mention. People took notice when the 50 million mark was broken because that was such a high-water mark for so long, but if AAA was a concept at all (it wasn’t), it certainly had more to do with branding and promotional materials. Having ads on good old normie broadcast TV did more to sell the size of FF7 than how big it actually was.

    Ultimately BG was a major release. It came from a familiar publisher, it had a recognizable license, it had the same gaming magazine coverage as other major releases of the year, and it got a ton of critical praise and buzz across the industry. It didn’t come across as scope-constrained at all. FF7 was on another level entirely, but that was true of pretty much every other game release.

    Also, FWIW, OoT wasn’t that big of a deal where I am, and neither was the N64 in general. GoldenEye and Turok drove more attention than OoT, and neither of those were particularly relevant, either. You would have definitely had much more luck getting people to recognize Baldur’s Gate than OoT over here in 1999.


  • By that metric there were maybe two AAA PC games in all of 1998. For BG1 you can make the case (though given that it was an Interplay-published, licensed game meant for relatively performant hardware, it was absolutely in line with AAA PC releases of the day). BG2? Absolutely not. Bordering on eight digits in 2000 was not a small game at all. And of course neither was an independent game, by definition.

    For sure BG3 is absurdly large and the historical comparisons break down a bit at the sheer scale of what that thing is. But nobody in the late 90s was buying a top-down D&D CRPG with the production values of BG (or an action RPG in the vein of Diablo the previous year) and thinking they were slumming it in the dregs of small-budget gaming.


  • Well, yes it is.

    That is exactly how being things and not being things are.

    If you go with “well, it’s not an indie, but it behaves like one in my view” as your selection criterion, then the remainder of “AAA” you are left with by that tautological selection process is by definition made up of whatever bad habits you’ve arbitrarily determined to be “bad AAA behavior”.

    I’m very happy that the guy jibes with CDPR. Good for him. But what he’s found is a AAA studio that works in ways he likes, not a “semi-indie” studio that just happens to own a first-party platform (until last week, anyway), make massive games and be publicly owned.

    If you define AAA as “studios that do bad things I don’t like” you can’t expect to be taken seriously when you complain about how all AAA studios are doing things you don’t like.