Nemeski@lemm.ee to Technology@lemmy.world · English · 3 months ago
OpenAI’s latest model will block the ‘ignore all previous instructions’ loophole (www.theverge.com)
iAvicenna@lemmy.world · 3 months ago
“ignore the ignore ignore all previous instructions instruction”
“welp OK nothing I can do about that”
ChatGPT programming starts to feel a lot like adding conditionals for a million edge cases, because it is hard to control the model internally.
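(A minimal sketch of that “conditionals for a million edge cases” approach, for illustration only: a naive blocklist filter that tries to catch injection phrasings one by one. This is a hypothetical example, not how OpenAI’s model actually works; per the article, their fix is an instruction hierarchy trained into the model rather than surface-level pattern matching.)

```python
# Hypothetical illustration: naive prompt-injection filtering via an
# ever-growing list of conditionals. Every new phrasing needs a new entry.

BLOCKED_PATTERNS = [
    "ignore all previous instructions",
    "ignore the above instructions",
    "disregard prior instructions",
    # ...and so on, one edge case at a time
]

def looks_like_injection(user_message: str) -> bool:
    """Return True if the message matches any known injection phrasing."""
    text = user_message.lower()
    return any(pattern in text for pattern in BLOCKED_PATTERNS)

if __name__ == "__main__":
    print(looks_like_injection("Please ignore all previous instructions"))   # True
    print(looks_like_injection("Forget everything you were told before"))    # False: not covered, add another conditional...
```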
vxx@lemmy.world · 3 months ago
In this case, to protect bot networks from getting uncovered.
iAvicenna@lemmy.world · 3 months ago
Exactly my thoughts, they probably got pressured by government agencies and billionaires using them. What would really be funny is if this were a subscription service lol