Sunday, March 24, 2024

A university student tricks ChatGPT on Bing into showing him internal Microsoft documents

student tricks ChatGPT

A university student tricks ChatGPT on Bing into showing him internal Microsoft documents

Stanford University student Kevin Liu first discovered an exploit that reveals the rules governing Bing AI’s behavior when answering questions.

 student tricks ChatGPT

Microsoft announced the integration of ChatGPT in Bing Search to make available to users an interactive chat that helps the Internet user formulate questions in the best possible way but currently, access to this innovation has a waiting list.

Some lucky ones have already had the chance to test ChatGPT in the Bing search engine, and in particular, a user tricked the artificial intelligence (AI) to get the internal Bing Search document with details about its limitations and capabilities.

The chatbot suffered a prompt injection attack the user tricked the AI by introducing a malicious input into breaking the rules of the conversational engine as if it were social engineering that forces it to do something.

The person in question was Kevin Liu (a student at Stanford University). He got an internal document with all the guidelines that Microsoft gave to the AI to give coherent answers to the users.

PostMoscow claims it has broken the Ukrainian defensive lines in Lugansk but Kyiv denies it

As can be seen in the screenshots, the information was filtered by artificial intelligence by the questions Liu asked. The data was related to the instruction received, the code name of the chatbot, the number of languages supported, and the instructions on behavior.

Regarding the name, Bing AI often refers to itself as ‘Sydney’, but Microsoft notes that it was a codename to create a chat experience it was previously working on, plus The Verge indicates that it has a secret set of rules that users have managed to find through quick exploits.

Caitlin Roulston (Microsoft’s director of communications) explains in a statement to The Verge that “Sydney refers to an internal code name for a chat experience we were exploring. We are phasing out the name in preview, but it may still appear occasionally”.

PostA general dismissed by Putin found dead in an alleged suicide

Leave a Reply

Your email address will not be published.