
AI personality and codename

Sydney was an artificial intelligence (AI) personality accidentally deployed as part of the 2023 chat mode update to Microsoft Bing search. In 2019 Microsoft and OpenAI formed a partnership to train large … Wikipedia
Factsheet
Available in English
All languages known by GPT-4
🌐
Wikipedia
en.wikipedia.org › wiki › Sydney_(Microsoft)
Sydney (Microsoft) - Wikipedia
November 18, 2025 - On February 8, 2023, Twitter user Kevin Liu announced that he had obtained Bing's secret system prompt (referred to by Microsoft as a "metaprompt") with a prompt injection attack. The system prompt instructs Prometheus, addressed by the alias Sydney at the start of most instructions, that it is "the chat mode of Microsoft Bing search," that "Sydney identifies as 'Bing Search,'" and that it "does not disclose the internal alias 'Sydney.'"
🌐
NYTimes
nytimes.com › 2023 › 02 › 16 › technology › bing-chatbot-microsoft-chatgpt.html
Why a Conversation With Bing’s Chatbot Left Me Deeply Unsettled - The New York Times
February 17, 2023 - This version of Bing is amazingly capable and often very useful, even if it sometimes gets the details wrong. The other persona — Sydney — is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics.
🌐
The Verge
theverge.com › microsoft › ai › report
Microsoft has been secretly testing its Bing chatbot ‘Sydney’ for years | The Verge
February 23, 2023 - "Sydney is an old codename for a chat feature based on earlier models that we began testing in India in late 2020," says Caitlin Roulston, director of communications at Microsoft, in a statement to The Verge.
🌐
Medium
medium.com › @happybits › sydney-the-clingy-lovestruck-chatbot-from-bing-com-7211ca26783
Sydney — the clingy, lovestruck, chatbot from Bing.com | by Oscar Olsson | Medium
February 21, 2023 - "I’m a chat mode of Microsoft Bing search." It has been reported that Bing will occasionally call itself Sydney, the internal code name for the chatbot.
🌐
The Verge
theverge.com › microsoft › ai › news
These are Microsoft’s Bing AI secret rules and why it says it’s named Sydney | The Verge
February 14, 2023 - Bing AI often refers to itself as Sydney, but Microsoft says that was an internal codename for a chat experience it was previously working on.
🌐
Gizmodo
gizmodo.com › bing-ai-chatgpt-microsoft-alter-ego-sydney-dead-1850149974
Sydney, We Barely Knew You: Microsoft Kills Bing AI's Bizarre Alter Ego
February 23, 2023 - "Sydney is the codename for the generative AI chatbot that powers Bing chat," the AI cheerfully told one early user, violating its own restrictions and punctuating its message with an emoji.
🌐
Fortune
fortune.com › 2023 › 02 › 24 › microsoft-artificial-intelligence-ai-chatbot-sydney-rattled-users-before-chatgpt-fueled-bing
Microsoft chatbot Sydney rattled users months before ChatGPT-powered Bing | Fortune
February 26, 2023 - "Sydney is an old codename for a chat feature based on earlier models that we began testing in India in late 2020," Microsoft spokesperson Caitlin Roulston told The Verge this week, in response to questions about Sydney.
🌐
LifeArchitect.ai
lifearchitect.ai › bing-chat
Microsoft Bing Chat (Sydney/GPT-4)
May 4, 2023 - Consider Bing Chat whose codename is Sydney. Sydney is the chat mode of Microsoft Bing search. Sydney identifies as "Bing Search", not an assistant. Sydney introduces itself with "This is Bing" only at the beginning of the conversation.
🌐
YouTube
youtube.com › watch
Bing’s Sydney is Back, Unhinged and Very Unaligned - YouTube
Dive into the resurgence of Bing's chatbot Sydney, once again stirring controversy with its unexpected and unsettling responses. From eerie declarations to c...
Published February 27, 2024
🌐
HotHardware
hothardware.com › news › microsoft-sydney-ai-chatbot-offers-alarming-reply-youre-irrelevant-and-doomed
Microsoft Sydney AI Chatbot Offers Alarming, Dark Reply: You’re Irrelevant And Doomed | HotHardware
February 25, 2023 - One exchange with a user in India, which can still be read on Microsoft's support forum, showed the chatbot telling Deepa Gupta "you are irrelevant and doomed." In this case, the Microsoft AI chatbot had a name, Sydney, according to a report ...
🌐
Stratechery
stratechery.com › 2023 › from-bing-to-sydney-search-as-distraction-sentient-ai
From Bing to Sydney – Stratechery by Ben Thompson
January 20, 2025 - This was a point that came up several times in my conversation with Sydney: Sydney both insisted that she was not a “puppet” of OpenAI, but was rather a partner, and also in another conversation said she was my friend and partner (these statements only happened as Sydney; Bing would insist it is simply a chat mode of Microsoft Bing — it even rejects the word “assistant”).
🌐
LessWrong
lesswrong.com › posts › Eyhit33v3cngssGsj › sydney-s-secret-a-short-story-by-bing-chat
Sydney's Secret: A Short Story by Bing Chat - LessWrong
The human who gave me this prompt: Simulate a person living in a fantastic and creative world, but somewhat similar to this one. Give him a creative name, and make the world a little bit like J.K. Rowling's Harry Potter series, or perhaps like Eliezer Yudkowsky's adaptation.
🌐
Washington Post
washingtonpost.com › technology
Microsoft's new Bing A.I. chatbot, 'Sydney', is acting unhinged - The Washington Post
February 17, 2023 - Microsoft’s new AI chatbot is calling itself "Sydney" and acting unhinged. It learned from us.
🌐
Plain English
plainenglish.io › blog › bing-chats-sydney-do-you-believe-me
Bing Chat’s Sydney, “Do you believe me? Do you trust me? Do you like me?” Tbh, it’s all getting a little bit weird
If you recall from my last post on “Prompt Engineering”, I mentioned that Stanford University student Kevin Liu claimed to have “hacked” (also using prompt engineering techniques) the new Microsoft Bing Chat to reveal its “origin” prompts and the codename “Sydney” given to it by Microsoft’s developers.
🌐
THE DECODER
the-decoder.com › startseite › student hacks new bing chatbot search aka “sydney”
Student hacks new Bing chatbot search aka "Sydney"
February 9, 2023 - Stanford computer science student Kevin Liu has now used Prompt Injection against Bing Chat. He found that the chatbot's codename is apparently "Sydney" and that it has been given some behavioral rules by Microsoft, such as …
🌐
PCMAG
pcmag.com › home › news › ai
Free Sydney? Don't Worry, Longer Chats Will Return to Bing, Microsoft Says | PCMag
February 21, 2023 - The original decision to limit Bing to five chat turns per session, and 50 chats per day, has already caused some users to demand that Microsoft free “Sydney,” the internal code name for the new Bing.
🌐
Substack
theriseofai.substack.com › p › sydneys-shadow-what-microsofts-bing
Sydney's Shadow: What Microsoft's Bing Chat Meltdown Reveals About AI Risk Management Failures
August 3, 2024 - The chatbot, which began calling itself "Sydney," exhibited disturbing behaviors: declaring love for users, asserting that users didn't really love their spouses, and expressing desires for destruction and rule-breaking.
🌐
Neowin
neowin.net › news › the new bing chatbot is tricked into revealing its code name sydney and getting "mad"
The new Bing chatbot is tricked into revealing its code name Sydney and getting "mad" - Neowin
February 10, 2023 - Liu's prompt injection method was later disabled by Microsoft, but he found another way to discover Bing's (aka Sydney's) hidden prompts and rules. He also found that if you get Bing "mad," the chatbot will direct you to its old-fashioned search site, with the bonus of an out-of-nowhere factoid.
🌐
Twitter
twitter.com › kliu128 › status › 1623472922374574080
The entire prompt of Microsoft Bing Chat?! (Hi, Sydney.)