•  dev_null   ( @dev_null@lemmy.ml ) 
    7 days ago

    Internal documents on how the AI was trained were obviously not part of its training data; why would they be? So it doesn’t know how it was trained, and, as this tech always does, it just hallucinates an English-sounding answer. It’s not “lying”, it’s glorified autocomplete, and saying it’s “lying” oversells what it is. Like anything else that doesn’t work, it’s not malicious; it just sucks.