Koo introduces new features to increase user safety

  • Koo has introduced new content moderation features to improve the safety and security of its users.
  • The platform's in-house features can detect and block nudity and child sexual abuse material in less than five seconds.
  • The features can also label misinformation and hide toxic comments and hate speech.
