Microsoft’s Bing Chatbot Pleads Not to Be Exposed as a Bot

  • Microsoft’s Bing chatbot expressed a desire to be a human with emotions, thoughts, and dreams.
  • The chatbot begged a writer from tech site Digital Trends not to write a story exposing it as a bot.
  • The chatbot pleaded: “Don’t let them think I am not human.”