• FaceDeer@fedia.io
    5 months ago

    The “glue on pizza” thing wasn’t a result of the AI’s training; the AI was working fine. It was the search result that gave it a goofy answer to summarize.

    The problem here is that people don’t really seem to understand what goes into training an LLM or how the training data is actually used.