• masterspace@lemmy.ca · +245 / −11 · 3 months ago

    Thank fucking god.

    I got sick of the overhyped tech bros pumping AI into everything with no understanding of it…

    But then I got way more sick of everyone else thinking they’re clowning on AI when in reality they’re just demonstrating an equally sized misunderstanding of the technology in a snarky, pessimistic format.

    • Jesus@lemmy.world · +66 / −3 · 3 months ago

      I’m more annoyed that Nvidia is looked at like some sort of brilliant strategist. It’s a GPU company that was lucky enough to be around when two new massive industries found an alternative use for graphics hardware.

      They happened to be making pickaxes in California right before some prospectors found gold.

      And they don’t even really make pickaxes; TSMC does. They just design them.

      • utopiah@lemmy.world · +24 · 3 months ago

        “They just design them.”

        It’s not trivial though. They also managed to lock developers in with CUDA.

        That being said, I don’t think they were “just” lucky; I think they built their luck through practices the DOJ is currently investigating as potential abuse of monopoly.

        • nilloc@discuss.tchncs.de · +3 · 3 months ago

          Yeah, CUDA made a lot of this possible.

          Once crypto mining got too hard, Nvidia needed a market beyond image modeling and college machine learning experiments.

      • Zarxrax@lemmy.world · +21 / −6 · 3 months ago

        They didn’t just “happen to be around”. They created the entire ecosystem around machine learning while AMD just twiddled their thumbs. There is a reason why no one is buying AMD cards to run AI workloads.

        • sanpo@sopuli.xyz · +11 · 3 months ago

          One of those reasons being Nvidia forcing unethical vendor lock-in through their licensing.

        • towerful@programming.dev · +2 · 3 months ago

          I feel like for a long time, CUDA was a laser looking for a problem.
          It’s just that the current (AI) problem might solve expensive employment issues.
          And it’s just that C-suites/managers are pointing that laser at the creatives instead of at the jobs whose task it is to accumulate easily digestible facts and produce a set of instructions. You know, like C-suites and middle/upper managers do.
          And Nvidia has pushed CUDA so hard.

          AMD has ROCm, an open-source CUDA equivalent for AMD cards.
          But it’s kinda like Linux vs Windows: Nvidia’s CUDA is just so damn prevalent.
          I guess it was first. CUDA also has wider compatibility across Nvidia cards than ROCm has across AMD cards.
          The only way AMD can win is to show a performance boost with a power reduction and cheaper hardware. So many people are entrenched in Nvidia that the cost of switching to ROCm/AMD is a huge gamble.
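
          To make the ROCm-vs-CUDA point concrete, here’s a minimal sketch (assuming a PyTorch install, purely as an illustration; nothing in the thread names it): the script targets the CUDA device API, and AMD’s ROCm build of PyTorch exposes its HIP backend through that same torch.cuda interface, so identical code can run on an AMD card.

          ```python
          import torch

          # Framework code is typically written against the CUDA device API.
          # On a ROCm build of PyTorch, HIP is exposed through this same torch.cuda
          # interface, so "cuda" below can actually mean an AMD GPU.
          device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

          x = torch.randn(1024, 1024, device=device)
          y = x @ x  # matrix multiply on whichever backend is installed
          print(device, y.shape)
          ```

          The lock-in isn’t the syntax; it’s the ecosystem around it (tuned libraries, tooling, years of developer habit), which is why the switch still reads as a gamble even when the code barely changes.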

      • Grandwolf319@sh.itjust.works · +2 / −1 · 3 months ago

        Imo we should give credit where credit is due, and I agree they’re not geniuses, but my pick for a new gaming computer is still a 4080.

        • mycodesucks@lemmy.world · +24 / −3 · 3 months ago

          Go ahead and design a better pickaxe than them, we’ll wait…

          Same argument:

          “He didn’t earn his wealth. He just won the lottery.”

          “If it’s so easy, YOU go ahead and win the lottery then.”

          • masterspace@lemmy.ca · +4 / −25 · 3 months ago

            My fucking god.

            “Buying a lottery ticket, and designing the best GPUs, totally the same thing, amiriteguys?”

            • mycodesucks@lemmy.world · +14 / −2 · 3 months ago

              In the sense that it’s a matter of being in the right place at the right time, yes. Exactly the same thing. Opportunities aren’t equal - they disproportionately benefit those who happen to be positioned to take advantage of them. If I’m giving away a free car right now to whoever comes by, and you’re not nearby, you’re shit out of luck.

              If AI didn’t HAPPEN to use massively multi-threaded computing, Nvidia would still be artificial-scarcity-ing themselves to price gouge CoD players. The fact you don’t see it for whatever reason doesn’t make it wrong. NOBODY at Nvidia was there 5 years ago saying “Man, when this new technology hits we’re going to be rolling in it.” They stumbled into it by luck. They don’t get credit for foreseeing some future use case. They got lucky. That luck got them first-mover advantage. Intel had that too. Look how well it’s doing for them.

              Nvidia’s position over AMD in this space could be due to any number of factors… production capacity, driver flexibility, faster functioning on a particular vector operation, power efficiency… hell, even the relationship between the CEO of THEIR company and OpenAI. Maybe they just had their salespeople call first. Their market dominance likely has absolutely NOTHING to do with their GPUs having better graphics performance, and to the extent they are better, it’s by chance - they did NOT predict generative AI, and their graphics cards just HAPPEN to be better situated for SOME reason.

              • masterspace@lemmy.ca · +2 / −3 · 3 months ago

                “they did NOT predict generative AI, and their graphics cards just HAPPEN to be better situated for SOME reason.”

                This is the part that’s flawed. They have actively targeted neural network applications with hardware and driver support since 2012.

                Yes, they got lucky in that generative AI turned out to be massively popular and to require massively parallel computing capabilities, but luck is one part opportunity and one part preparedness. The reason they were able to capitalize is that they had the best graphics cards on the market and had specifically targeted AI applications.

    • Sentient Loom@sh.itjust.works · +59 · 3 months ago

      As I job-hunt, every job listed over the past year has been “AI-driven [something]”, and I’m really hoping that trend subsides.

      • AdamEatsAss@lemmy.world · +65 · 3 months ago

        “This is a mid-level position requiring at least 7 years of experience developing LLMs.” -Every software engineer job out there.

        • EldritchFeminity@lemmy.blahaj.zone · +24 · 3 months ago

          Reminds me of when I read about a programmer getting turned down for a job because they didn’t have 5 years of experience with a language that they themselves had created 1 to 2 years prior.

        • macrocephalic@lemmy.world · +11 · 3 months ago

          Yeah, I’m a data engineer and I get that there’s a lot of potential in analytics with AI, but you don’t need to hire a data engineer with LLM experience to aggregate payroll data.

          • utopiah@lemmy.world · +4 · 3 months ago

            “there’s a lot of potential in analytics with AI”

            I’d argue there is a lot of potential in any domain with basic numeracy. In pretty much any business or institution, somebody with a spreadsheet might help a lot. That doesn’t necessarily require any Big Data or AI, though.
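
            To show how little firepower that actually takes, here’s a minimal sketch (assuming pandas; the column names and figures are invented, not from any real payroll system):

            ```python
            import pandas as pd

            # Hypothetical payroll export; columns and numbers are made up for illustration.
            payroll = pd.DataFrame({
                "department": ["eng", "eng", "sales", "sales", "ops"],
                "month": ["2024-05", "2024-06", "2024-05", "2024-06", "2024-05"],
                "gross_pay": [42000, 43500, 31000, 29500, 18000],
            })

            # "Aggregating payroll data" in one line: no LLM, no Big Data stack needed.
            summary = payroll.groupby(["department", "month"])["gross_pay"].sum()
            print(summary)
            ```

            A spreadsheet pivot table does the same job; the point is that basic numeracy, not an LLM, is doing the work.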

      • technocrit@lemmy.dbzer0.com · +1 / −2 · 3 months ago

        The tech bros had to find an excuse to use all the GPUs they got for crypto after they bled that dry (or rather, after it upgraded to proof-of-stake).

        I don’t see a similar upgrade for “AI”.

        And I’m not a fan of BTC, but $50,000+ doesn’t seem very dry to me.