Early reactions to Nvidia’s DLSS 5 were swift and skeptical, with some observers likening the technology to an Instagram-style filter applied over gameplay footage. Nvidia CEO Jensen Huang rejected the comparison, but subsequent clarifications have helped outline how the system actually works – and where it can fall short.

  • BradleyUffner@lemmy.world · +11/-1 · 8 hours ago (edited)

    I’m not sure why this is such a big deal. It’s only going to affect the 7 or 8 people in the world who can afford the two top-of-the-line graphics cards and the RAM required to run it.

    • SaharaMaleikuhm@feddit.org · +1/-1 · 36 minutes ago

      Well, I hate the AI slop look it outputs, and until now DLSS was supposed to increase performance – now it does the exact opposite. Also, fuck Nvidia, they deserve all the hate.

    • Someonelol@lemmy.dbzer0.com · +21 · 8 hours ago

      This is a culmination of years in which gamers kept getting ignored when they ask for some consideration to their needs like more memory, better Linux drivers, and smaller form factors. Instead we get hit with their server farm AI slop runoff and told this is what we want. To me it’s an insult for wanting something that’s more specialized to gaming and graphical work rather than some LLM platform.

      • BradleyUffner@lemmy.world · +5 · 7 hours ago

        Ahh, I get it now. It’s not about them creating something that sucks, it’s about them creating something that sucks instead of something else we’ve been asking for that would actually be useful. Thanks for filling in the missing piece.

  • etherphon@piefed.world · +77 · 14 hours ago

    So they have this nice 3D card, which they had a hand in inventing and “perfecting” to render the entire 3D scene in beautiful, stunning detail, and then another card with AI instructions that totally ignores everything that just happened, takes a screenshot, and basically puts a filter on it in real time. What a massive waste of power and computation.

  • Gerudo@lemmy.zip · +15/-2 · 11 hours ago

    Instead of doing this bullshit, can we just have regular DLSS be actually good? I can’t stand turning it on for my handheld because it’s a blurry, smeary mess as is.

    • Snoopey@lemmy.dbzer0.com · +10 · 9 hours ago

      You might be thinking of FSR? I don’t know of any handhelds that support DLSS, as they all use AMD hardware, and they also don’t support the latest AMD RDNA tech, so you’re stuck with crappy FSR 2 most of the time, which is indeed a horrible blurry, glitchy mess.

  • shyguyblue@lemmy.world · +134/-2 · 17 hours ago

    Best comment about this was from a video posted yesterday:

    Nvidia keeps saying that this tech is still a work in progress, yet they made the decision to release a demo in its current state…

    • CosmoNova@lemmy.world · +5 · 5 hours ago (edited)

      Yes because it makes people talk about that instead of their love affair with Palantir and their passionate support for Israel.

      Whoever decided to showcase DLSS 5 in its current form is probably getting a raise and a nice bonus for this diversion.

      • dvoraqs@lemmy.world · +4/-22 · 15 hours ago

        What isn’t? Everything is growing, decaying, or changing in some way, honestly.

        • BassTurd@lemmy.world · +21 · 14 hours ago

          AI is the most rushed-to-market product I think I’ve ever seen in my life. It makes Cyberpunk’s release look like a polished gem in comparison. Yes, things evolve over their life cycle for better or worse, but none of those other things were so ingrained in everything, cost even a fraction of what LLMs do, both monetarily and environmentally, or sucked as hard.

          AI is a different monster. A shitty, shitty monster.

          • Corngood@lemmy.ml · +13 · 13 hours ago

            It’s because CEOs don’t play cyberpunk, but they did try chatgpt and got an immediate boner thinking about all the people they could lay off.

            • BremboTheFourth@piefed.ca · +4 · 13 hours ago

              More like they saw the potential to ruin most plebs’ already limited abilities to figure out wtf is happening in the world, thought they would be too smart to fall for that shit, and decided they should open Pandora’s box and sell it like it’s Prometheus’s fire

              • badgermurphy@lemmy.world · +3 · 9 hours ago (edited)

                Unlike Pandora’s box, though, a lot of the dumber applications of this stuff will go back in when the VC money dries up.

    • justOnePersistentKbinPlease@fedia.io · +48/-4 · 16 hours ago

      As microslop was constantly saying last year, LLMs and their ilk are a product in search of an application.

      Every company is desperate to find anything these garbage machines can do well enough to validate the trillion or so dollars pumped into them.

      Late edit: Also, Salesforce is literally mostly about barely functional tech with shiny demos. That’s why there is a consulting and customization industry worth at least tens if not hundreds of billions that supports just their software.

      • SaharaMaleikuhm@feddit.org · +1 · 32 minutes ago

        But then Nvidia really doesn’t need to find one. They sell hardware. They need to design new, better hardware and make good drivers. But it’s never enough, is it? It always has to be more, like cancer.

      • Valmond@lemmy.dbzer0.com · +15 · 15 hours ago

        That’s clearly the insane part: okay, it can be a bit helpful in this or that scenario, but they spent as if every person on earth would want to pay 250 euros a month for it…

    • phar@lemmy.world · +11/-12 · 17 hours ago

      Demos are very often an example of in-progress work or technology. That literally happens all the time.

      • Epzillon@lemmy.world · +38/-2 · 16 hours ago

        Doesn’t really matter IMO. If you have known bugs and flaws you don’t showcase those, or if they are present in the showcase you at least address them and show what is to be expected upon release. NVIDIA just flat out didn’t care. As soon as motion increases, the artifacting is crazy. How do you even decide that this is remotely good enough for a demo?

        • [deleted]@piefed.world · +24 · 16 hours ago

          Nvidia hears people like motion blur and AI slop so they put some AI slop in their motion blur.

          • Valmond@lemmy.dbzer0.com · +5 · 15 hours ago

            Ugh. “Everyone is doing BLOOM, let’s also do BLOOM but at +150% more!”

            I remember that; motion blur came after, and now I guess AI 😓

        • Gsus4@mander.xyz · +14/-1 · 15 hours ago (edited)

          :3

          PS: The hallucinations are artistic freedom 😂

          • justOnePersistentKbinPlease@fedia.io · +2 · 11 hours ago

            “Hallucinations” are an inherent part of the approach.

            It is literally impossible to prevent them. The systems work by building a fuzzy average response to a query via complex statistics. There is no thinking or creativity.
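The “fuzzy average” point above can be made concrete with a toy next-token sampler. This is a hedged illustration, not any vendor’s or library’s actual API: `sample_next_token` and its parameters are invented for the sketch. A model only produces a probability distribution over continuations, and the reply is drawn from it, so a plausible-but-wrong continuation always has nonzero probability.

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, rng=None):
    # Toy next-token sampler (hypothetical helper, not a real library API).
    # A language model outputs only a probability distribution over the
    # next token; the reply is *drawn* from that distribution, so a
    # plausible-but-wrong token can always be chosen ("hallucination").
    rng = rng if rng is not None else np.random.default_rng()
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()                       # numerical stability
    p = np.exp(z) / np.exp(z).sum()    # softmax
    return int(rng.choice(len(p), p=p))

# Even with logits strongly favouring token 0, token 1 still gets picked
# a noticeable fraction of the time.
rng = np.random.default_rng(0)
draws = [sample_next_token([2.0, 0.0], rng=rng) for _ in range(2000)]
```

Lowering the temperature sharpens the distribution but never zeroes out the tail, which is why sampling-based systems cannot be made hallucination-free by tuning alone.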

        • phar@lemmy.world · +2 · 12 hours ago

          Okay, but that is not what the person said, or what the poster above quoted as being the best part. I’m not commenting on the overall performance; I’m just saying that demos very often are exactly what that sentence implies they shouldn’t be.

      • Telorand@reddthat.com · +29 · 17 hours ago

        And yet, they chose to demo a broken technology with obvious bugs and flaws. The demos from tech companies are supposed to make people excited, not recoil in disgust.

        This isn’t some tiny company, either. It’s fucking nVidia, who supposedly has the money to create a good demo.

  • empireOfLove2@lemmy.dbzer0.com · +33 · 15 hours ago

    Further confirming this is not meant to ever be used by actual gamers, and instead exists only to advertise real time genAI modification to existing video media.

  • Ilixtze@lemmy.ml · +36/-1 · 17 hours ago

    An img2img slop filter for every frame in real time. Great job, Nvidia, what a dumb waste of resources.

    • Deestan@lemmy.world · +14 · 13 hours ago

      The waste is the point.

      It needs to be more expensive, because that can be leveraged for higher valuations.

      • friend_of_satan@lemmy.world · +11 · 14 hours ago

        DLSS stands for “Deep Learning Super Sampling.” It was always gen-AI. Those extra details weren’t being revealed; they were being generated.

        • JohnEdwa@sopuli.xyz · +14 · 11 hours ago (edited)

          While true, the way DLSS 2/3/4 does it is to take a bunch of low-res renders of the game over time while jittering the camera very slightly, then stitch them all together to generate a new, higher-res image that very closely matches what a native render would have looked like. The GenAI part is essentially just a very advanced temporal blending function that’s really good at detecting and smoothing out edges.

          DLSS 5 then runs an AI Instagram filter on top of the frame for “enhanced visuals”, because obviously we want our games to look like cheap AI slop.
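The jitter-and-accumulate scheme described in the comment above can be sketched in a few lines. This is a toy illustration under simplifying assumptions, not Nvidia’s implementation: `temporal_upscale` is a hypothetical name, the “scene” is a static array standing in for a frame, and the camera jitter cycles deterministically through sub-pixel phases instead of using a real jitter sequence and motion vectors.

```python
import numpy as np

def temporal_upscale(ground_truth, scale=2, rounds=8, alpha=0.5):
    # Toy temporal super-sampling (illustrative only, not Nvidia's code).
    # Each "frame" renders the static scene at 1/scale resolution with a
    # sub-pixel camera jitter, so over time every high-res pixel gets
    # sampled. Samples are blended into a persistent history buffer via
    # an exponential moving average, which is how extra detail is resolved
    # without ever rendering a full-resolution frame.
    H, W = ground_truth.shape
    history = np.zeros((H, W))
    for _ in range(rounds):
        for oy in range(scale):
            for ox in range(scale):
                # this jitter phase only "sees" one pixel per scale x scale block
                mask = np.zeros((H, W), dtype=bool)
                mask[oy::scale, ox::scale] = True
                history[mask] = (1 - alpha) * history[mask] + alpha * ground_truth[mask]
    return history

# After enough jittered low-res frames, the history buffer converges
# toward the full-resolution scene.
gt = np.random.default_rng(1).random((16, 16))
out = temporal_upscale(gt)
```

Real implementations additionally reproject the history buffer along motion vectors and use a learned blending weight per pixel; the static-scene version here only shows why the accumulated result can contain genuine detail rather than invented content.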

      • Gsus4@mander.xyz · +5 · 15 hours ago

        But it was working fine and was probably cheaper; this makes it worse. Where the fuck is QA?

          • Gsus4@mander.xyz · +9/-1 · 15 hours ago

            “Those responsible for sacking the people who have just been sacked have been sacked.”