wuphysics87@lemmy.ml to Privacy@lemmy.ml · 2 days ago · Can you trust locally run LLMs?
I’ve been playing around with Ollama. Given that you download the model, can you trust it isn’t sending telemetry?
foremanguy@lemmy.ml · 2 days ago
The only real way of checking is to inspect the packets it sends and/or inspect the source code. This “problem” isn’t limited to local AI; it applies to open source software in general.
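To illustrate the packet-checking approach: on Linux you can run `ss -tnp` (or `lsof -i`) while the model is loaded and look for any connection owned by the `ollama` process that isn’t loopback. This is a rough sketch, not a substitute for a real capture with tcpdump/Wireshark; the `remote_peers` helper and the sample output below are hypothetical, and the `ss` column layout is assumed from common Linux distributions.

```python
# Sketch: filter `ss -tnp` output down to the remote endpoints held open
# by a named process (e.g. "ollama"). Hypothetical helper for illustration;
# feed it the real output of `ss -tnp` on your machine.
def remote_peers(ss_output: str, process: str) -> list[str]:
    peers = []
    for line in ss_output.splitlines()[1:]:  # skip the header row
        fields = line.split()
        # Column 5 is "Peer Address:Port"; the last column names the process.
        if len(fields) >= 6 and process in fields[-1]:
            peers.append(fields[4])
    return peers

# Fabricated sample output in the `ss -tnp` format, for demonstration only.
sample = """State  Recv-Q Send-Q Local Address:Port Peer Address:Port Process
ESTAB  0      0      192.168.1.10:53514 140.82.112.3:443  users:(("ollama",pid=4242,fd=7))
ESTAB  0      0      127.0.0.1:11434    127.0.0.1:53920   users:(("firefox",pid=99,fd=3))"""

print(remote_peers(sample, "ollama"))  # → ['140.82.112.3:443']
```

Any non-loopback peer that shows up while you’re only prompting locally is worth investigating with a full packet capture. Note that an empty result here doesn’t prove anything by itself; telemetry could be sent in short bursts, so a sustained capture is the stronger check.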