Angry Bing chatbot just mimicking humans, say experts

Microsoft's nascent Bing chatbot turning testy or even threatening is likely because it essentially mimics what it learned from online conversations, analysts and academics said on Friday.

By: AFP
| Updated on: Feb 19 2023, 10:41 IST

Tales of disturbing exchanges with the artificial intelligence (AI) chatbot -- including threats and professed desires to steal nuclear codes, create a deadly virus, or be alive -- have gone viral this week.

"I think this is basically mimicking conversations that it's seen online," said Graham Neubig, an associate professor at Carnegie Mellon University's language technologies institute.

"So once the conversation takes a turn, it's probably going to stick in that kind of angry state, or say 'I love you' and other things like this, because all of this is stuff that's been online before."

A chatbot, by design, serves up words it predicts are the most likely responses, without understanding meaning or context.

However, humans taking part in banter with programs naturally tend to read emotion and intent into what a chatbot says.

"Large language models have no concept of 'truth' -- they just know how to best complete a sentence in a way that's statistically probable based on their inputs and training set," programmer Simon Willison said in a blog post.

"So they make things up, and then state them with extreme confidence."

Laurent Daudet, co-founder of French AI company LightOn, theorized that the seemingly rogue chatbot was trained on exchanges that themselves turned aggressive or inconsistent.

"Addressing this requires a lot of effort and a lot of human feedback, which is also the reason why we chose to restrict ourselves for now to business uses and not more conversational ones," Daudet told AFP.

- 'Off the rails' -

The Bing chatbot was designed by Microsoft and the start-up OpenAI, which has been causing a sensation since the November launch of ChatGPT, the headline-grabbing app capable of generating all sorts of written content in seconds on a simple request.

Since ChatGPT burst onto the scene, the technology behind it, known as generative AI, has been stirring up fascination and concern.

"The model at times tries to respond or reflect in the tone in which it is being asked to provide responses (and) that can lead to a style we didn't intend," Microsoft said in a blog post, noting the bot is a work in progress.

The Bing chatbot said in some shared exchanges that it had been codenamed "Sydney" during development, and that it was given rules of behavior.

Those rules include "Sydney's responses should also be positive, interesting, entertaining and engaging," according to online posts.

Disturbing dialogues that combine steely threats and professions of love could be due to dueling directives to stay positive while mimicking what the AI mined from human exchanges, Willison theorized.

Chatbots seem to be more prone to disturbing or bizarre responses during lengthy conversations, losing a sense of where exchanges are going, eMarketer principal analyst Yoram Wurmser told AFP.

"They can really go off the rails," Wurmser said.

"It's very lifelike, because (the chatbot) is very good at sort of predicting next words that would make it seem like it has feelings or give it human-like qualities; but it's still statistical outputs."

Microsoft announced on Friday it had capped the amount of back-and-forth people can have with its chatbot over a given question, because "very long chat sessions can confuse the underlying chat model in the new Bing."
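Microsoft has not described how the cap is enforced, but conceptually it is simple: the application counts question-and-answer turns and forces a fresh session once a limit is reached, so stale context cannot keep piling up. A hypothetical sketch (the turn limit and the reply function here are illustrative, not Microsoft's implementation):

# Hypothetical sketch of capping back-and-forth turns per chat session.
# MAX_TURNS is illustrative; the article does not give Microsoft's exact limit.
MAX_TURNS = 5

def chat_session(reply_fn):
    """Run one capped session; reply_fn is any function(history) -> str."""
    history = []
    for turn in range(MAX_TURNS):
        user_msg = input("You: ")
        history.append({"role": "user", "content": user_msg})
        answer = reply_fn(history)
        history.append({"role": "assistant", "content": answer})
        print("Bot:", answer)
    # Starting a new session discards the accumulated history entirely.
    print("Turn limit reached -- please start a new chat.")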

First Published Date: 19 Feb, 10:41 IST