      • jws_shadotak@sh.itjust.works · 11 months ago

        ChatGPT consistently makes up shit. It’s difficult to tell when something is made up because, as a language model, it’s built to sound confident, like a person stating a fact they actually know.

        It knows how to talk like a subject matter expert because that’s the kind of writing that gets published most, and so that’s what it’s trained on, but it doesn’t always have the facts needed to answer a question. It makes shit up to fill the gap and presents it just as confidently, but it’s wrong.

      • Chahk@beehaw.org · 11 months ago

        Most of the time I use the assistant either to perform home automation tasks or to look stuff up online. The first already works fine, and for the second I won’t trust a glorified autocomplete.

        • stevedidwhat_infosec@infosec.pub · 11 months ago

          Good point. Hallucinations only add to the fake-news and artificial-content problems.

          I’ll counter with this: how do you know the stuff you look up online is legit? Should we go back to encyclopedias? Who writes those?

          Edit: in case anyone isn’t aware, GPT “hallucinates” (generates made-up information), especially when sampling settings like temperature and top_p aren’t tuned well. I wasn’t saying anyone’s opinion was a hallucination, of course.
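
          For anyone curious, temperature and top_p are just sampling parameters you can pass in an API call. A minimal sketch, assuming the official OpenAI Python client (the model name and prompt are placeholders):

              from openai import OpenAI

              client = OpenAI()  # reads OPENAI_API_KEY from the environment

              # Lower temperature and top_p narrow the sampling distribution, making the
              # output more deterministic (the model can still hallucinate either way).
              response = client.chat.completions.create(
                  model="gpt-4o-mini",  # placeholder model name
                  messages=[{"role": "user", "content": "Who wrote 'The Art of Computer Programming'?"}],
                  temperature=0.2,  # lower = less random
                  top_p=0.9,        # nucleus sampling cutoff
              )
              print(response.choices[0].message.content)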

          • Otter@lemmy.ca · 11 months ago

            Some generative chatbots will say something and then link to where the info came from. That’s good, because I can follow up.

            Some will just say something. That’s bad and I’ll have to search myself afterwards.

            It’s the equivalent of a book with no cover, or a webpage where I can’t see what site it’s on. Maybe it’s reputable, maybe it’s not. Without a source I can’t really decide.

      • CJOtheReal@ani.social · 11 months ago

        Because ChatGPT isn’t reliable for actual information, and I don’t want any “assistant” at all.

  • runswithjedi@lemmy.world · 11 months ago

    Assistant already reads off a paragraph when I’m just trying to turn off a light. No way do I need something that will recite the entire bibliography of the sources it used to find those controls.

  • atrielienz@lemmy.world · 11 months ago

    Man, this type of shit is why I’m getting rid of Google Assistant and going to a FOSS home assistant setup. I don’t want ChatGPT. I want to add things to my calendar and my shopping list, turn off lights, and open/close blinds. I want to mute speakers at a certain time with a routine that isn’t broken every five minutes. I want timers that work reliably. I want to be able to make an announcement when Amazon is at the door. Why are they making this so difficult?
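
    For what it’s worth, most of that list doesn’t need any LLM. A minimal sketch of turning off a light locally, assuming a self-hosted Home Assistant instance and its REST API (the host, token, and entity ID below are placeholders):

        import requests

        # Placeholders: point these at your own Home Assistant instance.
        HA_URL = "http://homeassistant.local:8123"
        TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

        # Call the light.turn_off service via the REST API.
        response = requests.post(
            f"{HA_URL}/api/services/light/turn_off",
            headers={"Authorization": f"Bearer {TOKEN}"},
            json={"entity_id": "light.living_room"},  # placeholder entity
            timeout=10,
        )
        response.raise_for_status()
        print(response.json())  # entities whose state changed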

  • wazzupdog@lemmy.ml · 11 months ago

    How can it replace Google Assistant if there is no Google Assistant on my phone? (I removed it.)

  • atocci@kbin.social · 11 months ago

    I just want to be able to consistently make searches using what’s on my phone screen. Is that too much to ask? The screen search button disappears every other month and I’m sick of it. I don’t invoke the Assistant for any other reason.

  • Pyr_Pressure@lemmy.ca · 11 months ago

    Can’t see it being that useful if it remains restricted to info 2+ years old.

    You wouldn’t be able to ask it the weather or anything of the sort.