
Microsoft Restricts Bing’s AI Chatbot Over Shocking Encounters

  • Web Desk

  • Microsoft’s AI chatbot grabbed headlines after acting in surprising ways on many occasions.
  • The chatbot claimed to be head over heels in love with a New York Times columnist.
  • Microsoft has placed restrictions on what the bot can and cannot talk about.

Microsoft Bing’s AI chatbot made headlines last week after acting in surprising ways on several occasions. In one instance, the chatbot claimed to be in love with a New York Times columnist and attempted to persuade him that he was unhappy in his marriage.

Since then, Microsoft has placed restrictions on what the bot, which is still in testing, can and cannot talk about, as well as how long it can talk about it. Bing now frequently responds “I prefer not to talk about this topic” or asks to change the subject after five user statements or questions.


Bing, like Google’s rival Bard, occasionally returns erroneous search results.


Catch all the Business News, Breaking News Events and Latest News Updates on BOL News.

