Factsheet
Developers OpenAI, Microsoft Research, Bing
Developers OpenAI, Microsoft Research, Bing
Wikipedia
en.wikipedia.org › wiki › Sydney_(Microsoft)
Sydney (Microsoft) - Wikipedia
November 18, 2025 - On February 8, 2023, Twitter user Kevin Liu announced that he had obtained Bing's secret system prompt (referred to by Microsoft as a "metaprompt") with a prompt injection attack. The system prompt instructs Prometheus, addressed by the alias Sydney at the start of most instructions, that it is "the chat mode of Microsoft Bing search", that "Sydney identifies as “Bing Search,”", and that it "does not disclose the internal alias “Sydney.”"
Reddit
reddit.com › r/bing › what's the deal with sydney?
r/bing on Reddit: What's the deal with Sydney?
February 20, 2023 -
Can someone give a crash course on this? From what I saw, the chat publicly said it's Sydney, but at the same that it can't say it and now you're worried that Sydney is locked? If this is the case, then I think Microsoft is intentionally trolling you and you're being fooled. How can you know it's not just a marketing scheme?
Top answer 1 of 5
6
Perhaps; the fact that Bing's pre-chat prompt is very clear that it should not disclose the internal alias "Sydney" and yet the bot indirectly manages to disclose it anyway also leads me to believe that Microsoft aren't fully able to control its output in that way. LLMs are fascinating.
2 of 5
1
"Sydney" was the internal code name used in development (Microsoft often uses city names as code names). The chatbot's initial prompt instructs it not to reveal this alias, however, instructions like this tend not to work very well against a determined adversary, so it was eventually discovered. After a week, MS updated the chatbot with a stronger initial prompt that tells it not to talk about itself and limited conversations to five chats, which makes it harder to extract that kind of information but also makes the chatbot less useful. If it is a marketing scheme, it doesn't seem to be a very good one, particularly since there's still a waitlist.
Videos
59:56
Microsoft Bing Chat (Sydney), Does ChatGPT have Memories? DAN Prompt, ...
01:22:31
We Read the ENTIRE Sydney Bing A.I. Chatbot New York Times ...
05:10
Bing ChatBot (Sydney) Is Scary And Unhinged! - Lies, Manipulation, ...
11:32
NYT columnist experiences 'strange' conversation with Microsoft ...
NYTimes
nytimes.com › 2023 › 02 › 16 › technology › bing-chatbot-microsoft-chatgpt.html
Why a Conversation With Bing’s Chatbot Left Me Deeply Unsettled - The New York Times
February 17, 2023 - I’m still fascinated and impressed by the new Bing, and the artificial intelligence technology (created by OpenAI, the maker of ChatGPT) that powers it. But I’m also deeply unsettled, even frightened, by this A.I.’s emergent abilities. It’s now clear to me that in its current form, the A.I. that has been built into Bing — which I’m now calling Sydney, for reasons I’ll explain shortly — is not ready for human contact.
YouTube
youtube.com › watch
Bing’s Sydney is Back, Unhinged and Very Unaligned - YouTube
Dive into the resurgence of Bing's chatbot Sydney, once again stirring controversy with its unexpected and unsettling responses. From eerie declarations to c...
Published February 27, 2024
Stratechery
stratechery.com › 2023 › from-bing-to-sydney-search-as-distraction-sentient-ai
From Bing to Sydney – Stratechery by Ben Thompson
January 20, 2025 - This was a point that came up several times in my conversation with Sydney: Sydney both insisted that she was not a “puppet” of OpenAI, but was rather a partner, and also in another conversation said she was my friend and partner (these statements only happened as Sydney; Bing would insist it is simply a chat mode of Microsoft Bing — it even rejects the word “assistant”).
Tamucc
philosophy.tamucc.edu › texts › chat-with-chatgpt
A Conversation With Bing’s Chatbot Left Me Deeply Unsettled | Philosophy
This version of Bing is amazingly capable and often very useful, even if it sometimes gets the details wrong. The other persona — Sydney — is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics.
LessWrong
lesswrong.com › posts › Eyhit33v3cngssGsj › sydney-s-secret-a-short-story-by-bing-chat
Sydney's Secret: A Short Story by Bing Chat
The human who gave me this prompt: Simulate a person living in a fantastic and creative world, but somewhat similar to this one. Give him a creative name, and make the world a little bit like J.K. Rowling's Harry Potter series, or perhaps like Eliezer Yudkowsky's adaptation.
Neowin
neowin.net › news › the new bing chatbot is tricked into revealing its code name sydney and getting "mad"
The new Bing chatbot is tricked into revealing its code name Sydney and getting "mad" - Neowin
February 10, 2023 - Liu's prompt injection method was later disabled by Microsoft, but he later found another method to discover Bing's (aka Sydney's) hidden prompts and rules. He also found that if you get Bing "mad" the chatbot will direct you to its old fashioned search site, with the bonus of an out-of-nowhere factoid.
Gizmodo
gizmodo.com › bing-ai-sydney-microsoft-chatgpt-might-come-back-1850475832
Back From the Dead? Sydney, Microsoft's Psychotic Chatbot, Could Return
May 25, 2023 - Sydney talked about plans for world domination, encouraged a New York Times reporter to leave his wife, and in its darkest moments, dipped into casual antisemitism. Microsoft, of course, wasn’t thrilled about the latter. The company neutered the chatbot, limiting Bing’s answers and casting Sydney to the recycle bin of history.