• leshy@r.nf · 2 points · edited · 3 days ago

    AMD is fairly very aware of this

    PCGamer needs to edit this stuff.

  • ℍ𝕂-𝟞𝟝@sopuli.xyz · 14 points · 4 days ago

    enabled with FSR 4 technology

    I’m pretty sure we’ll have a separate corpo-English by 2100 that is not intelligible by normal people.

    • Obi@sopuli.xyz · 2 points · 4 days ago

      The only reason I opened the article was to find out what FSR meant. They never actually spell it out; you can tell from context that it’s AI upscaling, but I guess they just assume you know the acronym…

  • Suppoze@beehaw.org · 11 points · 4 days ago

    You know what would gather even more interest? Games not running like shit on native resolution.

      • Viri4thus@feddit.org · 4 points · 4 days ago

        I find it amusing that the company promoting brute-force calculation of ray trajectories, rather than optimised code (a competition defeat device), calls native rendering “brute force”. Meanwhile, some of the best games of the past decade run on potato-powered chips.

        • ZeroHora@lemmy.ml · 3 points · 4 days ago

          I can’t get over this bullshit. Nvidia could become the best company in the world, with the best products at the best prices, and I’d still never forgive this level of bullshit. Fucking “brute-force rendering”.

          • Viri4thus@feddit.org · 3 points · 4 days ago

            Absolutely. Jensen is so rich that he couldn’t spend his fortune within a typical human lifespan even if he wanted to.

            All that success came because, in the late noughties and early 10s, NVIDIA (at least in the EU) was giving away free GPUs to universities and awarding grants on the condition that researchers use CUDA. Same with developers: they had two engineering teams in Eastern Europe that served as outsourcing for code to cheapen game development, as a way to promote NVIDIA’s software “optimisations”. Most TWIMTBP games of that era, Bryan Rizzo’s time, have some sort of competing-hardware defeat device. They were so successful that their modern GPUs, Blackwell, can barely run some of their old games…

  • the_q@lemm.ee · 9 points · 4 days ago

    Nvidia creates a problem, then creates the solution and charges a premium for it. The industry smells money and starts baking said problem into games. AMD gets left behind and tries to play catch-up, offering open-source implementations of comparable technologies to create its own solution. Gamers still buy Nvidia.

    • FreeBooteR69@lemmy.ca · 5 points · 3 days ago

      Neither of these two is our friend, though AMD is much nicer to the open-source world. I tend to buy AMD because, at the very least, the hardware I’ve bought has good value and tremendous Linux support.

    • Dasus@lemmy.world · 1 point · 4 days ago

      I probably will, yeah.

      Or I was going to. I would’ve got a 5070 Ti, but I had no luck with stock when it came out, then drank most of the money, so I thought I’d give it a bit of time.

      I’m gonna wait a few months to see how this turns out after the 5060 Ti comes out and whatnot.

    • moody@lemmings.world · 1 up · 2 down · 4 days ago

      Is it? I haven’t used an Nvidia GPU since the GTX series, but my understanding was that DLSS was very effective. Meanwhile, the artifacting on FSR bothers the crap out of me.

      • ShinkanTrain@lemmy.ml · 3 points · edited · 4 days ago

        Yes. FSR4 is the first version that uses dedicated hardware, the way DLSS does. The consensus seems to be that it’s on the same level as DLSS 3 (the CNN model) but heavier to run, which is pretty great for a first attempt.

        • moody@lemmings.world · 1 point · 4 days ago

          I see. It’s unfortunate that it requires dedicated hardware, but I guess it makes sense when the main competitor already has that.

    • ShinkanTrain@lemmy.ml · 4 points · 4 days ago

      FSR4 is absolutely noticeable. I can’t tell the difference between native 4K and 1440p upscaled to 4K with FSR4. That’s a giant performance boost.
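      (For context on why that’s such a big boost: the pixel-count arithmetic below is a rough sketch, assuming standard 2560×1440 and 3840×2160 resolutions; actual frame-time savings vary per game and don’t scale perfectly linearly with pixel count.)

      ```python
      # Back-of-the-envelope: how many fewer pixels the GPU shades per frame
      # when rendering at 1440p and upscaling to 4K, versus native 4K.
      native_4k = 3840 * 2160   # 8,294,400 pixels
      qhd_1440p = 2560 * 1440   # 3,686,400 pixels

      ratio = native_4k / qhd_1440p
      print(f"Native 4K shades {ratio:.2f}x the pixels of 1440p")  # 2.25x
      ```

      That 2.25× reduction in shaded pixels is where most of the headroom comes from, minus the fixed cost of running the upscaler itself.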

      • ObtuseDoorFrame@lemm.ee · 1 point · 4 days ago

        I’ve even been having trouble telling the difference between Super Resolution 4 and native. Driver-level upscaling this good is a game-changer; I might not even have to deal with OptiScaler.