• Darkassassin07@lemmy.ca · 2 days ago

    Microsoft and Nvidia have been trying for years to shift computing onto their own systems, with your computer reduced to little more than a remote access terminal into that power, available only when these companies allow you access to it.

    See: Nvidia GeForce Now, Xbox Cloud Gaming, and pretty much every popular LLM (there are self-hosted options, but that’s not the major market right now, nor the direction it’s headed).

    There are of course struggles there that they’ve had a hard time overcoming. Particularly with something like gaming, you need a low-latency, high-speed internet connection; but that’s not necessary for all applications, and connectivity has been improving (slowly).

    • NotANumber@lemmy.dbzer0.com · 10 hours ago

      Actually, open-weights models have gotten better and better, to the point that they can compete meaningfully with ChatGPT and Claude Sonnet. Nvidia is actually one of the ones spearheading this with Nemotron. The issue is more that most of the really competent models need lots of VRAM to run, and small models lag quite far behind, although with Nemotron Nano they’re getting better.
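
      For a rough sense of why VRAM is the bottleneck, here’s a back-of-envelope sketch in Python. The parameter counts and quantization levels are illustrative assumptions, not the specs of Nemotron or any particular model:

          # Rough estimate of VRAM needed just to hold the model weights.
          # Real usage is higher: KV cache, context length, and runtime
          # overhead all add on top of this.
          def estimate_vram_gb(params_billion: float, bits_per_weight: int) -> float:
              bytes_per_weight = bits_per_weight / 8
              return params_billion * 1e9 * bytes_per_weight / (1024 ** 3)

          # Illustrative sizes: a "small" ~9B model vs a ~70B model,
          # at 4-bit quantization and at full 16-bit precision.
          for params, bits in [(9, 4), (9, 16), (70, 4), (70, 16)]:
              print(f"{params}B @ {bits}-bit: ~{estimate_vram_gb(params, bits):.0f} GB")

      That’s roughly why a 4-bit quantized small model (a few GB of weights) fits on a typical gaming GPU, while the bigger models that actually compete with the hosted services want 24 GB+ of VRAM or multiple cards.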