misk@sopuli.xyz to Technology@lemmy.world · English · 1 year ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
www.404media.co
GlitzyArmrest@lemmy.world · 1 year ago
Is there any punishment for violating the TOS? From what I’ve seen, it just tells you that and stops the response, but it doesn’t actually do anything to your account.
NeoNachtwaechter@lemmy.world · 1 year ago
Should there ever be a punishment for making a humanoid robot vomit?