misk@sopuli.xyz to Technology@lemmy.world · English · 1 year ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
Mahlzeit@feddit.de · English · 1 year ago
Oh, I see. The attempts to extract training data from ChatGPT may be criminal under the CFAA. Not a happy thought. I did say “making available” to exclude “hacking.”

JackbyDev@programming.dev · English · 1 year ago
The point I’m illustrating is that plenty of things reasonable people would assume are fine can be called hacking under the law.