• @4AV@lemmy.world
    21 points · 2 years ago

    It doesn’t have “memory” of what it has generated previously, other than the current conversation. The answer you get from it won’t be much better than random guessing.

    • randint
      -2 points · 2 years ago

      Maybe it should keep a log of what was generated? Would that even work though?

      • @sep@lemmy.world
        14 points · 2 years ago

        Ignoring the huge privacy/liability issue… there are other LLMs than ChatGPT.

      • @BetaDoggo_@lemmy.world
        1 point · 2 years ago

        The model is only trained to handle 4k tokens, roughly 2,000 words depending on complexity. Even if it had a log of everything asked, it wouldn't be able to use any of it.
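To illustrate that last point, here is a minimal sketch of why a log longer than the context window can't all be fed to the model: only the most recent entries fit in the window, and everything older gets dropped. The helper names are hypothetical, and a naive whitespace split stands in for a real BPE tokenizer (real tokenizers produce more tokens per word).

```python
CONTEXT_WINDOW = 4096  # tokens the model can attend to (4k, per the comment above)

def tokenize(text):
    # Naive stand-in for a real tokenizer: one token per whitespace-separated word.
    return text.split()

def fit_log_to_context(log_entries, limit=CONTEXT_WINDOW):
    """Keep only the most recent log entries whose total token count fits the window."""
    kept, used = [], 0
    for entry in reversed(log_entries):  # walk backwards from the newest entry
        n = len(tokenize(entry))
        if used + n > limit:
            break  # the window is full; older entries are unusable
        kept.append(entry)
        used += n
    return list(reversed(kept))  # restore chronological order

# A log of 50 past generations, each about 300 "tokens" long (15,000 total).
log = [f"generated item {i} " * 100 for i in range(50)]
window = fit_log_to_context(log)
print(f"{len(log)} entries logged, {len(window)} fit in the context window")
```

Running this, only the newest 13 of the 50 logged entries fit in a 4k window; the other 37 might as well not exist as far as the model is concerned, which is the crux of the objection.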