• sobchak@programming.dev
    1 day ago

    I think part of it is that they believe they can train models on developers' work and then replace those developers with the models. The other part is that the company is heavily invested in coding LLMs and the tooling around them, so they're trying to hype them up.