Sometimes it can be hard to tell if we’re chatting with a bot or a real person online, especially as more and more companies turn to this seemingly cheap way of providing customer support. What are some strategies to expose AI?

  • zappy@lemmy.ca
    1 year ago

    I’m trying to tell you that limited context is a feature, not a bug; other bots like Replika do the same thing. Even when all past data is stored server-side and available, it won’t matter unless you reduce the weighting of older messages; otherwise you prevent significant change in the output values (and the change shrinks further as the history grows). Time decay of information is important to making these systems useful.
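
    To illustrate the time-decay idea, here is a minimal sketch of one common scheme, exponential decay by message age. The function name and the half-life parameter are hypothetical, not how Replika or any specific bot actually implements it:

    ```python
    def decayed_weights(ages_in_turns, half_life=10.0):
        """Down-weight older messages exponentially (illustrative scheme).

        ages_in_turns: how many turns ago each message occurred (0 = newest).
        half_life: age at which a message's weight drops to 50% (assumed value).
        """
        return [0.5 ** (age / half_life) for age in ages_in_turns]

    weights = decayed_weights([0, 5, 10, 20])
    # newest message keeps full weight (1.0); a 10-turn-old message counts 0.5,
    # and a 20-turn-old one only 0.25 — so old context fades instead of
    # accumulating forever.
    ```

    With a scheme like this, the total influence of the history stays bounded no matter how long the conversation runs, which is the point being made above.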

    • tikitaki@kbin.social
      1 year ago

      Give an example, please, because I don’t see how, in normal use, the weighting would matter at any significant scale given the massive volume of training data.

      Any interaction the chatbot has with one person is dwarfed by the total amount of text the AI consumed during training. It’s like saying Sagittarius A* gets changed over time by pulling in a few planets: definitely true, but a very small effect.