• Limeey@lemmy.world
    9 months ago

    It all comes down to the fact that LLMs are not AGI: they have no clue what they’re saying, why, or to whom. They have no concept of “context” and as a result no ability to “know” whether they’re giving correct info or just hallucinating. A sketch of what I mean is below.
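
    To illustrate (a toy sketch in plain Python, not how any real model is implemented, and the probabilities are made up): generation is just sampling from a distribution over next tokens. Notice there is no step anywhere in the loop that checks the output against reality.

    ```python
    import random

    # Toy stand-in for an LLM: a lookup table of next-token probabilities.
    # (Hypothetical numbers, purely for illustration.)
    NEXT_TOKEN_PROBS = {
        "the capital of australia is": {"canberra": 0.6, "sydney": 0.4},
    }

    def generate(prompt: str) -> str:
        """Sample the next token from the model's distribution.

        Note what's missing: no step in this loop verifies the chosen
        token against facts. The "model" only knows which continuations
        were statistically likely in its training data.
        """
        dist = NEXT_TOKEN_PROBS[prompt]
        tokens, weights = zip(*dist.items())
        return random.choices(tokens, weights=weights)[0]

    # 40% of the time this confidently answers "sydney", and nothing in
    # the sampling process can tell the difference.
    print(generate("the capital of australia is"))
    ```

    A real LLM replaces the lookup table with a neural network, but the generation loop is the same shape: likely-sounding continuations come out whether or not they’re true.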