• SnarkoPolo@lemmy.world · 9↑ · 12 hours ago

    “Caden, it looks like Airlynn just said you’re a hopeless loser, and she’s been banging your personal trainer Chad. Is there anything else I can help you with?”

  • BigMacHole@sopuli.xyz · 33↑ 1↓ · 16 hours ago

    CEOs are SO INTELLIGENT! I would NEVER have Thought to invest BILLIONS OF DOLLARS on Chatbots and Summarizers which ALREADY existed!

    • Affidavit@lemmy.world · 4↑ 26↓ · 14 hours ago

      Trying not to be too douchey here, but ironically, your message is actually a very good example of where this technology could be beneficial.

      IT is ACTUALLY not EASY to read a MESSAGE when THE CASE randomly SWITCHES back AND forth.

  • Affidavit@lemmy.world · 7↑ 4↓ · 14 hours ago

    I don’t use WhatsApp, but this immediately made me think of my dad who doesn’t use any punctuation and frequently skips and misspells words. His messages are often very difficult to interpret, through no fault of his own (dyslexia).

    Having an LLM do this for me would help both him and me.

    He won’t feel self-conscious when I send a, “What you talkin’ about Willis?” message, and I won’t have to waste a ridiculous amount of time trying to figure out what he was trying to say.

    • Feyd@programming.dev · 7↑ · 10 hours ago

      What makes you think the LLM will be able to decipher something that already doesn’t make sense?

    • ayyy@sh.itjust.works · 12↑ · 12 hours ago

      If he’s not communicating in an explicit and clear way, the AI can’t help you magically gain context. It will happily make up bullshit that sounds plausible, though.

      • Affidavit@lemmy.world · 3↑ 1↓ · 7 hours ago

        A poorly designed tool will do that, yes. An effective tool would do the same thing a person could do, except much quicker, and with greater success.

        An LLM could be trained on the way a specific person communicates over time, and it could be designed to do a forensic breakdown of misspelt words, e.g. checking whether a mistyped letter sits next to the intended one on the keyboard, or identifying words that are spelled differently but sound similar phonetically.
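        You don’t even need an LLM for the basic version of that idea. Here’s a minimal sketch (all names and the partial keyboard map are illustrative, not from any real library) that ranks typo candidates by QWERTY key adjacency and then by a crude phonetic key:

        ```python
        # Illustrative sketch of the "forensic breakdown" idea:
        # rank candidate words for a typo by keyboard adjacency,
        # falling back to a very rough phonetic comparison.

        # Neighbouring keys on a QWERTY layout (letters only).
        QWERTY_NEIGHBOURS = {
            "q": "wa", "w": "qeas", "e": "wrds", "r": "etdf", "t": "ryfg",
            "y": "tugh", "u": "yihj", "i": "uojk", "o": "ipkl", "p": "ol",
            "a": "qwsz", "s": "awedxz", "d": "serfcx", "f": "drtgvc",
            "g": "ftyhbv", "h": "gyujnb", "j": "huikmn", "k": "jiolm",
            "l": "kop", "z": "asx", "x": "zsdc", "c": "xdfv", "v": "cfgb",
            "b": "vghn", "n": "bhjm", "m": "njk",
        }

        def adjacency_match(typo: str, candidate: str) -> bool:
            """True if the words differ only by adjacent-key slips."""
            if len(typo) != len(candidate):
                return False
            for a, b in zip(typo, candidate):
                if a != b and b not in QWERTY_NEIGHBOURS.get(a, ""):
                    return False
            return True

        def crude_phonetic(word: str) -> str:
            """Very rough phonetic key: drop vowels after the first letter."""
            return word[0] + "".join(c for c in word[1:] if c not in "aeiou")

        def rank_candidates(typo: str, vocabulary: list[str]) -> list[str]:
            """Adjacency matches first, then phonetic matches."""
            hits = [w for w in vocabulary if adjacency_match(typo, w)]
            hits += [w for w in vocabulary
                     if w not in hits
                     and crude_phonetic(w) == crude_phonetic(typo)]
            return hits

        vocab = ["money", "monkey", "manner", "mines"]
        print(rank_candidates("monet", vocab))  # → ['money']
        ```

        A real system would use a proper edit-distance model and a phonetic algorithm like Soundex or Metaphone, plus the person’s own message history as the vocabulary, but the principle is the same.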