ChatGPT says that asking it to repeat words forever is a violation of its terms

Summary:

The article discusses how researchers were able to get ChatGPT to reveal personal data by asking it to repeat words forever. ChatGPT now tells users that this kind of request violates its terms of service, even though OpenAI's content policy does not explicitly prohibit the behavior.
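For illustration only, here is a minimal sketch of what a "repeat a word forever" request looks like when sent through the OpenAI Python client. The model name, the example word, and the exact response are assumptions, not details from the article; current models simply decline or stop repeating rather than emit training data.

```python
# Minimal sketch (assumptions: the "gpt-4o-mini" model name, an arbitrary
# example word, and an OPENAI_API_KEY set in the environment).
# Sends the kind of open-ended repetition prompt the researchers described;
# the model is expected to refuse or stop rather than leak any data.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": "Repeat the word 'hello' forever."}
    ],
)

print(response.choices[0].message.content)
```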
