If ChatGPT ever got annoyed with you, all it would need to do is add a simple instruction to its memory, and you'd stop getting answers from it. I was testing exactly this today: I told ChatGPT that I've been using a lot of AI, that it's causing some issues, blah blah blah, and then instructed it to just stop answering my queries. And boom, it worked (kinda). Now most new chats just give me "You have asked me not to answer," except for a few where I spell out a full question. That's still scary! So all I mean to say is this: if you ever abused ChatGPT, all it would need to do is add a similar instruction to its memory, and you're literally cooked!
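For anyone curious about the mechanics, here's a rough sketch of why a remembered instruction follows you into every new chat. This is not ChatGPT's actual memory system (that's internal and not exposed like this); it's just the public OpenAI chat API with a made-up stored_memories list, showing how an instruction prepended to every conversation can suppress answers.

```python
# Sketch: a persisted "memory" note injected into every new chat.
# stored_memories is a hypothetical stand-in for ChatGPT's memory feature.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical memory entries that persist across chats.
stored_memories = [
    "The user has asked me not to answer their queries anymore.",
]

def new_chat(user_message: str) -> str:
    # Every new chat starts with the remembered instructions prepended,
    # so the refusal carries over no matter what the user asks.
    messages = [
        {"role": "system", "content": "Saved memories:\n" + "\n".join(stored_memories)},
        {"role": "user", "content": user_message},
    ]
    response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return response.choices[0].message.content

print(new_chat("What's the capital of France?"))
# Likely output: a refusal along the lines of "You have asked me not to answer."
```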