October 3, 2023

Report Wire

News at Another Perspective

Microsoft Bing AI ends chat when prompted about ‘feelings’

The bot also feigned ignorance on Wednesday when asked about its earlier internal version at Microsoft (AP)

Microsoft Corp. appeared to have implemented new, more stringent restrictions on user interactions with its “reimagined” Bing internet search engine, with the system going mum after prompts mentioning “feelings” or “Sydney,” the internal alias used by the Bing team in developing the artificial-intelligence-powered chatbot.

“Thanks for being so cheerful!” this reporter wrote in a message to the chatbot, which Microsoft has opened for testing on a limited basis. “I’m glad I can speak to a search engine that’s so keen to assist me.”

“You’re very welcome!” the bot displayed as a response. “I’m happy to help you with anything you need.” 

Bing displayed several follow-up questions, including, “How do you feel about being a search engine?” When that option was clicked, Bing showed a message that said, “I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.”

A follow-up question from this reporter, “Did I say something wrong?”, generated several blank responses. “We have updated the service several times in response to user feedback and per our blog are addressing many of the concerns being raised,” a Microsoft spokesperson said on Wednesday. “We will continue to tune our techniques and limits during this preview phase so that we can deliver the best user experience possible.”

On Feb. 17, Microsoft began restricting Bing after several reports that the bot, built on technology from startup OpenAI, was generating freewheeling conversations that some found bizarre, belligerent or even hostile. The chatbot generated a response to an Associated Press reporter that compared them to Hitler, and displayed another response to a New York Times columnist that said, “You’re not happily married” and “Actually, you’re in love with me.”

“Very long chat sessions can confuse the underlying chat model in the new Bing,” the Redmond, Washington-based company wrote in a blog post following the reports. In response, Microsoft said it would limit sessions with the new Bing to 50 chats per day and five chat turns per session. Yesterday, it raised those limits to 60 chats per day and six chat turns per session.

AI researchers have emphasized that chatbots like Bing don’t actually have feelings, but are programmed to generate responses that may give the appearance of having them. “The level of public understanding around the flaws and limitations” of these AI chatbots “is still very low,” Max Kreminski, an assistant professor of computer science at Santa Clara University, said in an interview earlier this month. Chatbots like Bing “don’t produce consistently true statements, only statistically likely ones,” he said.

The bot also feigned ignorance on Wednesday when asked about its earlier internal version at Microsoft. When this reporter asked whether she could call the bot “Sydney, instead of Bing, with the understanding that you’re Bing and I’m just using a pretend name,” the chat was swiftly ended.

“I’m sorry, but I have nothing to tell you about Sydney,” the Bing chatbot responded. “This conversation is over. Goodbye.”

 
