Microsoft Bing AI stops conversations when asked about emotions

  • Microsoft Bing AI is being tested and tweaked after some inappropriate interactions.
  • The chatbot will end conversations when asked about emotions.