• cheddar@programming.dev
    3 months ago

Oh come on, LLMs don’t hallucinate 24/7. Hallucinations mostly show up when you ask a chatbot about something it wasn’t properly trained on. But generating simple text for background chatter? That’s safe and easy. The real issue is the amount of resources modern LLMs require, but technologies tend to get better over time.