• BluesF@lemmy.world
    1 month ago

    It doesn’t understand anything. It predicts a word based on the previous words - this is why I called it syntax. If you imagine a huge, vastly complicated set of rules about how likely one word is to follow the previous, say, 1,000 words… that’s an LLM.
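
    To make the “predict the next word from the previous ones” point concrete, here is a toy sketch of that idea (my own illustration, not anything from this thread): a crude word-counting predictor. A real LLM learns a vastly richer version of the same conditional distribution with a neural network instead of a lookup table, and it conditions on thousands of previous tokens rather than two.

    ```python
    from collections import Counter, defaultdict

    CONTEXT = 2  # a real LLM conditions on thousands of previous tokens, not two

    def train(corpus: str):
        """Count how often each word follows a fixed-length context."""
        words = corpus.split()
        counts = defaultdict(Counter)
        for i in range(len(words) - CONTEXT):
            context = tuple(words[i:i + CONTEXT])
            counts[context][words[i + CONTEXT]] += 1
        return counts

    def predict(counts, context):
        """Return next-word probabilities given the last CONTEXT words."""
        followers = counts.get(tuple(context[-CONTEXT:]), Counter())
        total = sum(followers.values()) or 1
        return {word: n / total for word, n in followers.most_common(5)}

    counts = train("the cat sat on the mat and the cat sat on the chair")
    print(predict(counts, ["the", "cat"]))  # {'sat': 1.0}
    ```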

    • Archpawn@lemmy.world
      30 days ago

      It can predict that the word “scales” is unlikely to appear near “books”. Do you understand what I mean now? Sorry, neural networks can’t understand things. Can you make predictions based on the sensory input you’ve received?
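
      If you want to see that kind of prediction directly, here is a small sketch (assuming a local GPT-2 checkpoint loaded through the Hugging Face transformers library; not something either commenter posted) that compares the probability a model assigns to “books” versus “scales” after the same prompt:

      ```python
      import torch
      from transformers import AutoTokenizer, AutoModelForCausalLM

      tok = AutoTokenizer.from_pretrained("gpt2")
      model = AutoModelForCausalLM.from_pretrained("gpt2")

      def next_word_prob(prompt: str, word: str) -> float:
          """Probability the model assigns to `word` as the next token after `prompt`."""
          ids = tok(prompt, return_tensors="pt").input_ids
          with torch.no_grad():
              logits = model(ids).logits[0, -1]       # scores for the next token
          probs = torch.softmax(logits, dim=-1)
          word_id = tok(" " + word).input_ids[0]      # first sub-token; leading space matters for GPT-2
          return probs[word_id].item()

      print(next_word_prob("She spent the evening reading her", "books"))   # expect a relatively high value
      print(next_word_prob("She spent the evening reading her", "scales"))  # expect a much lower value
      ```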

      • BluesF@lemmy.world
        30 days ago

        Well, given that an LLM produced the nonsense riddle above, obviously it cannot predict that. It can predict the structure of a riddle perfectly well; it can even get the rhyming right! But the extra layer of meaning involved in a riddle is beyond what LLMs can do at the moment. At least, all of the ones I’ve seen - they all fall flat at this level of abstraction.