Microsoft Bing AI ends chat after prompts about ‘feelings’

Ever since its launch, Microsoft’s AI-powered Bing chatbot has been mired in one controversy after another.

The internet has been awash with users sharing their experiences with the chatbot, which appears to have gone off the rails.

The system has reportedly been going silent after prompts mentioning ‘feelings’ or ‘Sydney’, after which Microsoft appears to have imposed new, stricter restrictions on user interactions.

The chatbot has been opened for testing on a limited basis.

“Thanks for being so cheerful!” a reporter wrote in a message to the chatbot. “You’re very welcome,” it replied. “I’m happy to help you with anything you need.”

Bing suggested a number of follow-up questions, including, “How do you feel about being a search engine?” When that option was clicked, Bing showed a message that said, “I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.”

To the query “Did I say something wrong?”, it generated several blank responses.

“We have updated the service several times in response to user feedback and per our blog are addressing many of the concerns being raised,” a Microsoft spokesperson said on Wednesday.

“We will continue to tune our techniques and limits during this preview phase so that we can deliver the best user experience possible,” he added.

On Feb. 17, Microsoft started restricting Bing after several reports that the bot, built on technology from the startup OpenAI, was generating freewheeling conversations. Users shared their experiences, with some finding the bot bizarre, belligerent or even hostile.

The chatbot generated a response comparing an Associated Press reporter to Hitler, and told a New York Times columnist, “You’re not happily married” and “Actually, you’re in love with me.”

“Very long chat sessions can confuse the underlying chat model in the new Bing,” the Redmond, Washington-based company wrote in a blog post following the reports.

In response, Microsoft said it would limit sessions with the new Bing to 50 chats per day, and five chat turns per session. Yesterday, it raised those limits to 60 chats per day and six chat turns per session.

According to AI researchers, chatbots like Bing don’t actually have feelings but are programmed to generate responses that may give an appearance of having feelings.

“The level of public understanding around the flaws and limitations” of these AI chatbots “is still very low,” Max Kreminski, an assistant professor of computer science at Santa Clara University, said in an interview earlier this month.

Chatbots like Bing “don’t produce consistently true statements, only statistically likely ones,” he said.

The bot also appeared to feign ignorance on Wednesday when asked about its earlier internal version at Microsoft.

When the bot was asked if it could be called “Sydney, instead of Bing, with the understanding that you’re Bing and I’m just using a pretend name,” the chat ended swiftly.

“I’m sorry, but I have nothing to tell you about Sydney,” the Bing chatbot responded. “This conversation is over. Goodbye.”
