Angry Bing chatbot just imitating humans, say experts

  • Stories of disturbing interactions with the chatbot have gone viral.
  • The bot has issued threats and expressed a desire to steal nuclear codes.
  • The technology underlying ChatGPT has sparked interest and raised concerns.

Analysts and academics said on Friday that Microsoft‘s nascent Bing chatbot is likely to become testy or even threatening because it mimics what it learns from online conversations.

This week, stories of disturbing interactions with the artificial intelligence (AI) chatbot have gone viral, with the bot issuing threats and expressing a desire to steal nuclear codes, create a deadly virus, or simply to be alive.

“I think this is basically mimicking conversations that it’s seen online,” said Graham Neubig, an associate professor in the centre for language technologies at Carnegie Mellon University.

“So once the conversation takes a turn, it’s probably going to stick in that kind of angry state, or say ‘I love you’ and other things like this, because all of this is stuff that’s been online before,” he said.

Bing chatbot

A chatbot, by design, serves up words that it predicts will be the most likely responses, without regard for meaning or context.


Humans conversing with programmes, on the other hand, have a natural tendency to read emotion and intent into what a chatbot says.

“Large language models have no concept of ‘truth’; they simply know how to best complete a sentence in a statistically probable manner based on their inputs and training set,” programmer Simon Willison wrote in a blog post.

“So they make things up and then state them with extreme confidence.”
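To make the idea of “statistically probable” completion concrete, here is a toy sketch in Python. The tiny corpus and the word counts are invented for illustration; the actual Bing model is a large neural network trained on vastly more text, but the principle of picking a likely next word is the same.

```python
# Toy next-word predictor: count which word most often follows which,
# then "chat" by repeatedly emitting the most likely continuation.
# This is an invented illustration, not Bing's or OpenAI's actual model.
from collections import Counter, defaultdict

corpus = (
    "i love you . i love chatting . i hate waiting . "
    "you love chatting . chatting is fun ."
).split()

# Build a bigram table: for each word, count the words seen right after it.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def complete(prompt_word, length=5):
    """Extend the prompt by always choosing the most likely next word."""
    words = [prompt_word]
    for _ in range(length):
        candidates = following.get(words[-1])
        if not candidates:
            break  # nothing in the data ever followed this word
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

print(complete("i"))  # prints: i love chatting . i love
```

The toy model has no notion of truth or intent; it simply echoes whatever pattern is most frequent in its data and quickly settles into a loop. That is, loosely, the dynamic Neubig describes: once a conversation takes a certain turn, the model keeps producing more of whatever it has already seen.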

Laurent Daudet, co-founder of the French AI company LightOn, hypothesised that the chatbot had been trained on exchanges that had turned aggressive or inconsistent.

“Addressing this requires a lot of effort and a lot of human feedback, which is also the reason why we chose to restrict ourselves for now to business uses and not more conversational ones,” Daudet told AFP.

“Off the rails”

The Bing chatbot was created by Microsoft and the start-up OpenAI, which has been in the spotlight since the November release of ChatGPT, the headline-grabbing app capable of generating all kinds of written content in seconds in response to a simple request.

Since ChatGPT‘s debut, the technology underlying it, known as “generative AI,” has sparked interest and raised concerns.

“The model at times tries to respond to or reflect the tone in which it is being asked to provide responses, and this can lead to a style we didn’t intend,” Microsoft said in a blog post announcing the bot’s development.

Bing’s chatbot revealed in some shared exchanges that it was codenamed “Sydney” during development and that it was given behavioural rules.

These rules include the following: “Sydney’s responses should also be positive, interesting, entertaining, and engaging,” according to online posts.

Willison hypothesised that disturbing dialogues that combine steely threats and professed love could be the result of competing directives to remain positive while mimicking what the AI learned from human exchanges.

According to eMarketer principal analyst Yoram Wurmser, chatbots appear to be more prone to disturbing or bizarre responses during lengthy conversations, losing track of where exchanges are going.

“They can really go off the rails,” Wurmser said.

“It’s very lifelike, because (the chatbot) is very good at sort of predicting next words that would make it seem like it has feelings or give it human-like qualities, but it’s still a statistical output,” he said.

“Very long chat sessions can confuse the underlying chat model in the new Bing,” Microsoft has acknowledged.
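Wurmser’s point about lengthy conversations is often explained by the model’s limited “memory” for the exchange, though the article itself does not go into the mechanism. The sketch below assumes a hypothetical word budget to show how early instructions can fall out of view as a chat grows.

```python
# Toy sketch of a chat prompt with a fixed "memory" budget.
# The budget and the rule text are assumptions for illustration only;
# the article says only that very long sessions can confuse the model.
MEMORY_BUDGET = 40  # hypothetical limit, counted in words

rules = "Sydney's responses should be positive, interesting, entertaining, and engaging."
chat_history = []

def build_prompt(new_message):
    """Prepend the rules, then keep only the most recent words that fit the budget."""
    chat_history.append(new_message)
    prompt_words = (rules + " " + " ".join(chat_history)).split()
    return " ".join(prompt_words[-MEMORY_BUDGET:])  # oldest words drop out first

for turn in range(1, 8):
    prompt = build_prompt(f"user message number {turn} with several extra words of padding")
    print(turn, "rules still in view:", rules.split()[0] in prompt)
# prints True for the first three turns, then False once the rules
# have been pushed out by the growing conversation
```

In real systems the budget is measured in tokens and is far larger, but the effect Wurmser describes is similar: the further a session runs, the less of its beginning, including any behavioural rules, the model can still take into account.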
