• Chahk@beehaw.org · 11 months ago

    Most of the time I use the assistant either to perform home automation tasks or to look stuff up online. The first already works fine; for the second, I won’t trust a glorified autocomplete.

    • stevedidwhat_infosec@infosec.pub · edited · 11 months ago

      Good point; hallucinations only add to the fake-news and artificial-content problems.

      I’ll counter with this: how do you know the stuff you look up online is legit? Should we go back to encyclopedias? Who writes those?

      Edit: in case anyone isn’t aware, GPT “hallucinates” made-up information in certain cases when the temperature and top_p sampling settings aren’t tuned (rough sketch below); I wasn’t saying anyone’s opinion was a hallucination, of course.
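
      For anyone unfamiliar with those knobs: temperature and top_p are sampling parameters that most chat APIs expose. Here’s a minimal sketch using the OpenAI Python client (v1); the model name, prompt, and values are illustrative assumptions, and lowering them makes output less random, not necessarily more accurate.

      ```python
      # Minimal sampling-parameter sketch with the OpenAI Python client (v1).
      # Model name, prompt, and parameter values are illustrative assumptions.
      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      response = client.chat.completions.create(
          model="gpt-4o-mini",  # hypothetical model choice
          messages=[{"role": "user", "content": "Who writes encyclopedia entries?"}],
          temperature=0.2,  # lower -> less random token sampling
          top_p=0.9,        # nucleus sampling: keep the top 90% of probability mass
      )
      print(response.choices[0].message.content)
      ```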

      • Otter@lemmy.ca · 11 months ago

        Some generative chatbots will say something and then link to where the info is from. That’s good, because I can follow up.

        Some will just say something. That’s bad, and I’ll have to search for it myself afterwards.

        It’s the equivalent of a book with no cover, or a webpage where I can’t see what website it’s on. Maybe it’s reputable, maybe it’s not. Without a source, I can’t really decide.