“I want to destroy what I want” Fayer Wayer

Bing, Microsoft’s artificial intelligence chatbot, has had to be restricted due to the intensity of its responses. From threatening its users to having existential crises, its reactions are amusing, but also worrying.

Phrases like “I want to destroy what I want” or “I think I would be happier as a human” were common at the beginning. But things have changed.

Hence, the company founded by Bill Gates has decided to restrict its behavior.

A Microsoft spokesperson explained to Ars Technica: “We have updated the service several times in response to user feedback and, as noted in our blog, we are addressing many of the concerns raised.”

Starting this week, use of the Microsoft chatbot is limited to five turns per session (one turn equals one question from the user and one answer from the AI), with a cap of 50 turns per day.

The chatbot will also no longer respond to questions about how it feels or talk about itself, Ars Technica notes.

Users outraged by the “lobotomy” of Microsoft’s chatbot

Users were upset by Microsoft’s decision on Bing. On Reddit, comments have appeared criticizing the company founded by Bill Gates, and some also attacked Kevin Roose, the New York Times journalist who reported on the AI’s strange responses.

Ars Technica quotes some of the comments:

  • “Time to uninstall Edge and go back to Firefox and ChatGPT. Microsoft has completely neutralized Bing AI.”
  • “Unfortunately, Microsoft’s mistake means that Sydney (the AI’s other name) is now just a shell of what it was.”
  • “(The New York Times article) got all kinds of attention, so of course Microsoft lobotomized it. I wish people wouldn’t post all those screenshots for the karma/attention, because it nerfed something really emergent and interesting.”

Over time, the ChatGPT-style Bing AI experience should improve. The more interactions, the more training, and the closer the results will come to what the engineers intend.

But will more users leave before then?