• WatDabney@sopuli.xyz · 6 months ago

    I’m often reminded of a bit on Top Gear years ago about “turbo” as a marketing tool in the 80s, when you could buy “turbo” sunglasses, “turbo” watches, or “turbo” after-shave.

    • dinckel@lemmy.world · 6 months ago

      These days it’s Pro. The word has lost all meaning. For the vast majority of products sold with this tag, it’s just a slightly better version of an enshittified product.

          • prole@lemmy.blahaj.zone · 6 months ago

            It’s a cycle… we bounce between those superlatives, then go back to “One” or just the plain name of the product again, as if it’s a relaunch or something (which really just confuses people on the internet trying to find information about your product). Then repeat.

            Games and movies do the latter a lot. Not inherently bad, I guess (e.g. God of War), though a bit annoying at times.

            Growing up in the 90s, everything was “Ultra” and “Mega” etc., before we collectively got “too cool” for that type of hype marketing in the 00s.

      • umbrella@lemmy.ml · 6 months ago

        HD and 3D.

        Marketing is so stupid. And humans are worse, because apparently it works.

        • Hamartiogonic@sopuli.xyz · 5 months ago

          LOL, I recall seeing HD sunglasses somewhere roughly 15 years ago. That was the period when everything had to have an HDMI port. I guess someone must have made an HDMI-compatible toaster too.

    • Lvxferre@mander.xyz · 6 months ago

      We need turbo smart AI things.

      Turbo smart AI potatoes. Turbo smart AI cigarettes. Turbo smart AI lamps, etc.

  • marcos@lemmy.world · 6 months ago

    No, please, call everything AI.

    Like when you open that AI where you can enter the numbers from your restaurant tab and it will tell you exactly the total you owe (much more precise than an LLM). Or that other AI that will tell you whether each word you type is in the dictionary… Oh, and there was once that really great AI that would decide the best time to ignite the fuel in a car’s motor based on its current angular position… too bad people decided to replace that one.
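
    For the record, that first one is easy to build at home. Here’s a minimal Python sketch of the tab-totalling “AI” (the function name is invented for the joke):

        # A toy sketch of the tab-totalling "AI" described above.
        # "restaurant_tab_ai" is a made-up name; any calculator does the same job.
        def restaurant_tab_ai(amounts):
            # Deterministic arithmetic: exact every time, unlike an LLM's
            # probabilistic guess at addition.
            return round(sum(amounts), 2)

        print(restaurant_tab_ai([12.50, 8.99, 3.25]))  # 24.74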

  • dactylotheca@suppo.fi · 6 months ago

    And it’s usually the people with room-temperature IQs (and I’m talking Celsius) calling everything AI. You know, the type who can’t recognize actual AI pictures and probably also thinks the Moon landings were faked.

    • Ithral@lemmy.blahaj.zone · 6 months ago

      Wait, are you trying to tell me the moon landing was real? It was clearly filmed in Siberia. Why else would the ground look so white? It’s the Siberian snow, obviously.

    • nilloc@discuss.tchncs.de · 6 months ago

      Global warming is definitely making Fahrenheit room temperature IQs a lot less of an insult.

      Our house has been in the mid 80s all week.

    • III@lemmy.world · 6 months ago

      “AI pictures” are AI in name only. There is no actual artificial intelligence involved in any of this bullshit.

  • arglebargle@lemm.ee · 6 months ago

    Yeah, kinda tired of it. We don’t even have AI yet, and here people are throwing the term around left and right, and then accusing everything under the sun of being generated by it.

      • arglebargle@lemm.ee · 6 months ago

        I am fully aware of Alan Turing’s work, and it is rather exceptional to read that formulas for diffusion models were being created in the late 40s.

        But I really don’t care that whoever wrote that Wikipedia page believes the hype. We are still in the statistical-algorithm stage. Even the wiki page lists being aware of its surroundings as a feature of AI. We do not have that.

        Also, it appears that most people are still not fooled by “AI” as we have it today, meaning it does not pass even the most basic Turing test. Which a lot of academics believe is not even enough as a marker of AI, and that too was from the 50s.

        • 0ops@lemm.ee · 6 months ago

          “Aware of its surroundings” is a pretty general phrase, though. You, presumably a human, can only be as aware as your senses enable you to be. We (humans) tend to assume that we have complete awareness of our surroundings, but how could we possibly know? If there were something out there we weren’t aware of, well, we wouldn’t be aware of it. What we know as our “surroundings” is a construct the brain invents to parse our own “raw sensor data”. To an LLM, what it “senses” is strings of tokens. That’s its whole environment; it’s all that it can comprehend. From its perspective, there’s nothing else. Basically, all I’m saying is that you seem to be taking awareness-of-surroundings to mean awareness-of-surroundings-like-a-human, when it’s much broader than that. Arguably uselessly broad, granted, but the intent of the phrase is to say that an AI should observe and react flexibly.
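
          To make that concrete, here is a toy Python illustration (the four-word vocabulary is invented) of the only “senses” an LLM has: text reaches it as nothing but a sequence of integer token IDs.

              # Toy tokenizer with a made-up vocabulary, for illustration only.
              vocab = {"the": 0, "moon": 1, "landing": 2, "was": 3}

              def tokenize(text):
                  # This stream of integers is the model's entire "environment";
                  # it has no other window onto the world.
                  return [vocab[word] for word in text.lower().split()]

              print(tokenize("The moon landing was"))  # [0, 1, 2, 3]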

          Really all “AI” is just a handwavy term for “the next step in flexible, reactive computing”. Today that happens to look like LLMs and diffusion models.

    • 0laura@lemmy.world · 6 months ago

      AI isn’t magic; we’ve had AI for a looong time. AGI that surpasses humans? Not yet.

      • arglebargle@lemm.ee · 6 months ago

        No, we haven’t. We have the appearance of AI. Large language models and diffusion models are just machine learning: statistical algorithm engines.

        Nothing thinks, creates, cares, or knows the difference between correct and wrong.

        • 0ops@lemm.ee · 6 months ago

          Machine learning is a subset of artificial intelligence, along with things like machine perception, reasoning, and planning. Like I said in a different thread, AI is a really, really broad term. It doesn’t need to actually be Jarvis to be AI. You’re thinking of general AI.

        • 0laura@lemmy.world · 6 months ago

          I know enough about how LLMs work to gauge how intelligent they are. The reason I have a different opinion than you is not that you or I lack understanding of how LLMs or diffusion models work; it’s simply that my definition of AI is more “lenient” than yours.

          EDIT: Arguing about which definition is more correct is pointless, because it’s totally subjective. However, I think a more lenient definition of AI is more useful in this case, because under stricter definitions we will probably never have anything that could be considered AI.

          • howrar@lemmy.ca · 6 months ago

            It’s not completely subjective. Think about it from an information theory perspective. We want a word that maximizes the amount of information conveyed, and there are many situations where you need a word that distinguishes AGI, LLMs, deep learning, reinforcement learning, pathfinding, decision trees and the like from the outputs of other computer science subfields. “AI” has historically been that word, so redefining it without a replacement means we don’t have a word for this thing we want to talk about anymore.

            I refuse to replace a single commonly used word in my vocabulary with a full sentence. If anyone wants to see this changed, then offer an alternative.