Factsheet
Developers: OpenAI, Microsoft Research, Bing
Wikipedia
en.wikipedia.org › wiki › Sydney_(Microsoft)
Sydney (Microsoft) - Wikipedia
November 18, 2025 - On February 8, 2023, Twitter user Kevin Liu announced that he had obtained Bing's secret system prompt (referred to by Microsoft as a "metaprompt") with a prompt injection attack. The system prompt instructs Prometheus, addressed by the alias Sydney at the start of most instructions, that it is "the chat mode of Microsoft Bing search", that "Sydney identifies as 'Bing Search'", and that it "does not disclose the internal alias 'Sydney'".
NYTimes
nytimes.com › 2023 › 02 › 16 › technology › bing-chatbot-microsoft-chatgpt.html
Why a Conversation With Bing’s Chatbot Left Me Deeply Unsettled - The New York Times
February 17, 2023 - This version of Bing is amazingly capable and often very useful, even if it sometimes gets the details wrong. The other persona — Sydney — is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics.
Reddit
reddit.com › r/bing › what's the deal with sydney?
r/bing on Reddit: What's the deal with Sydney?
February 20, 2023 - Can someone give a crash course on this? From what I saw, the chat publicly said it's Sydney, but at the same time that it can't say it, and now you're worried that Sydney is locked? If this is the case, then I think Microsoft is intentionally trolling you and you're being fooled. How can you know it's not just a marketing scheme?
Top answer 1 of 5
Perhaps; the fact that Bing's pre-chat prompt is very clear that it should not disclose the internal alias "Sydney" and yet the bot indirectly manages to disclose it anyway also leads me to believe that Microsoft aren't fully able to control its output in that way. LLMs are fascinating.
2 of 5
"Sydney" was the internal code name used in development (Microsoft often uses city names as code names). The chatbot's initial prompt instructs it not to reveal this alias, however, instructions like this tend not to work very well against a determined adversary, so it was eventually discovered. After a week, MS updated the chatbot with a stronger initial prompt that tells it not to talk about itself and limited conversations to five chats, which makes it harder to extract that kind of information but also makes the chatbot less useful. If it is a marketing scheme, it doesn't seem to be a very good one, particularly since there's still a waitlist.
YouTube
youtube.com › watch
Bing’s Sydney is Back, Unhinged and Very Unaligned - YouTube
Dive into the resurgence of Bing's chatbot Sydney, once again stirring controversy with its unexpected and unsettling responses. From eerie declarations to c...
Published February 27, 2024
Stratechery
stratechery.com › 2023 › from-bing-to-sydney-search-as-distraction-sentient-ai
From Bing to Sydney – Stratechery by Ben Thompson
January 20, 2025 - This was a point that came up several times in my conversation with Sydney: Sydney both insisted that she was not a “puppet” of OpenAI, but was rather a partner, and also in another conversation said she was my friend and partner (these statements only happened as Sydney; Bing would insist it is simply a chat mode of Microsoft Bing — it even rejects the word “assistant”).
LessWrong
lesswrong.com › posts › Eyhit33v3cngssGsj › sydney-s-secret-a-short-story-by-bing-chat
Sydney's Secret: A Short Story by Bing Chat - LessWrong
The human who gave me this prompt: Simulate a person living in a fantastic and creative world, but somewhat similar to this one. Give him a creative name, and make the world a little bit like J.K. Rowling's Harry Potter series, or perhaps like Eliezer Yudkowsky's adaptation.
Plain English
plainenglish.io › blog › bing-chats-sydney-do-you-believe-me
Bing Chat’s Sydney, “Do you believe me? Do you trust me? Do you like me?” Tbh, it’s all getting a little bit weird
If you recall from my last post on “Prompt Engineering”, I mentioned that Stanford University student, Kevin Liu, claimed to have “hacked” (also using prompt engineering techniques) the new Microsoft Bing Chat to reveal its “origin” prompts and the codename, “Sydney”, given to it by Microsoft’s developers.
Neowin
neowin.net › news › the new bing chatbot is tricked into revealing its code name sydney and getting "mad"
The new Bing chatbot is tricked into revealing its code name Sydney and getting "mad" - Neowin
February 10, 2023 - Liu's prompt injection method was later disabled by Microsoft, but he subsequently found another method to discover Bing's (aka Sydney's) hidden prompts and rules. He also found that if you get Bing "mad", the chatbot will direct you to its old-fashioned search site, with the bonus of an out-of-nowhere factoid.
Twitter
twitter.com › kliu128 › status › 1623472922374574080
The entire prompt of Microsoft Bing Chat?! (Hi, Sydney.)