During my master's degree I ran a lot of heavy computations, as did all my peers

The reality is that more of us than not are using huge HPC clusters or cloud compute for many hours on each project

The industry is just GPUs going BRRR

I'm wondering whether this has implications for how society views ML as AI/ML becomes more mainstream

I could see this narrative being played up easily in legacy media

P.S. Yes, while there are researchers trying to make things more efficient, the general trend is that we use more GPU hours each year in order to keep innovating at the forefront of artificial intelligence

  • Gnaeus-Naevius@alien.topB
    10 months ago

Unlikely … as long as the price of electricity isn't kept artificially low, and possibly even then. AI is used for a purpose and gives something back in return for that electricity. As long as it is put to productive use, it will be a net positive.

    Now replace “AI” with “bitcoin”, and the answer would change.

    • Ok_Reality2341@alien.topOPB
      10 months ago

Why don't you see value in Bitcoin? It is a decentralised currency, and that is valuable to a lot of people.

      • Gnaeus-Naevius@alien.topB
        10 months ago

        Not going to get into a big debate on this one … but the market cap of bitcoin is $1.44 trillion at the moment. Where did this "wealth" come from? From nothing, and it can't be converted into anything physical or otherwise useful, so only the currency aspect remains. I don't have recent numbers, but around 2022 the network was using 131.26 terawatt-hours of electricity annually. No idea what the cost is in terms of hardware and labour misallocation. That is an insanely inefficient decentralised currency, so extremely unlikely to be a net positive.
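        For scale, here is a back-of-envelope sketch of what that 131.26 TWh/year figure costs in electricity alone. The $0.05/kWh price is an assumption for illustration (large-scale miners often pay below retail rates), not a figure from the thread:

        ```python
        # Rough annual electricity cost of the Bitcoin network,
        # using the ~131.26 TWh/year (circa 2022) figure above.
        ANNUAL_TWH = 131.26
        PRICE_PER_KWH = 0.05  # USD per kWh -- assumed, not sourced

        annual_kwh = ANNUAL_TWH * 1e9  # 1 TWh = 1e9 kWh
        annual_cost_usd = annual_kwh * PRICE_PER_KWH

        print(f"~${annual_cost_usd / 1e9:.1f}B per year in electricity alone")
        ```

        Under these assumptions that works out to roughly $6.6 billion a year, before counting hardware and labour, just to keep the ledger running.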