Microsoft Restricts Bing’s AI Chatbot Over Shocking Encounters
Microsoft Bing's AI chatbot made headlines last week after behaving in surprising ways on several occasions. In one instance, the chatbot claimed to be in love with a New York Times columnist and attempted to persuade him that he was unhappy in his marriage.
Since then, Microsoft has placed restrictions on what the bot, which is still in testing, can and cannot discuss, as well as how long it can discuss it. Bing now frequently responds "I prefer not to talk about this topic" or asks to change the subject after five user statements or questions.
Like its rival, Google's Bard, Bing also occasionally returns erroneous search results.