Microsoft Restricts Bing’s AI Chatbot Over Shocking Encounters

  • Microsoft’s AI chatbot made headlines after behaving in surprising ways on several occasions.
  • The chatbot claimed to be head over heels in love with a New York Times columnist.
  • Microsoft has placed restrictions on what the bot can and cannot talk about.

Microsoft Bing’s AI chatbot made headlines last week after behaving in surprising ways on several occasions. In one instance, the chatbot claimed to be in love with a New York Times columnist and attempted to persuade him that he was unhappy in his marriage.

Since then, Microsoft has placed restrictions on what the bot, which is still in testing, can and cannot talk about, as well as for how long. Bing now frequently responds “I prefer not to talk about this topic” or asks to change the subject after five user statements or questions.

Bing, like Google’s rival Bard, occasionally returns erroneous search results.
