Microsoft has developed a technique to detect online predators who try to groom children for sexual purposes using the chat function in multiplayer video games.
In a blog post on Thursday, Microsoft announced that it will share the tool with nonprofit organizations and other gaming and messaging service developers.
Nicknamed “Project Artemis,” the tool automatically scans text-based conversations and rates them on the probability that a user might be trying to sexually exploit children.
“Building off the Microsoft patent, the technique is applied to historical text-based chat conversations. It evaluates and ‘rates’ conversation characteristics and assigns an overall probability rating. This rating can then be used as a determiner, set by individual companies implementing the technique, as to when a flagged conversation should be sent to human moderators for review,” Microsoft writes in its blog post.
Human moderators are then able to review flagged conversations to determine if they should report them to law enforcement.
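Microsoft has not published implementation details, but the flow it describes — score each conversation, compare the rating against a company-set threshold, and route anything above it to human reviewers — can be sketched in a few lines. In this hypothetical Python sketch, `risk_score` is a toy stand-in for the unpublished rating model, and the keyword heuristic is purely illustrative:

```python
def risk_score(messages):
    """Toy stand-in for the probability rating (0.0 to 1.0).

    Illustrative only: the real technique evaluates many conversation
    characteristics, not a simple keyword count.
    """
    suspicious = {"secret", "don't tell", "meet alone"}
    hits = sum(any(k in m.lower() for k in suspicious) for m in messages)
    return min(1.0, hits / max(len(messages), 1))

def flag_for_review(conversations, threshold=0.8):
    """Return IDs of conversations whose rating meets the threshold.

    The threshold is the 'determiner' each adopting company would set;
    flagged conversations go to human moderators, not automatic action.
    """
    return [cid for cid, msgs in conversations.items()
            if risk_score(msgs) >= threshold]
```

The key design point in Microsoft's description is that the rating alone decides nothing: it only queues a conversation for human review, with each company choosing its own threshold.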
An engineering team led by Dartmouth College digital forensics expert Hany Farid developed the technique. Microsoft worked with Farid and the makers of messaging services like Kik and the popular game Roblox.
Microsoft says that, beginning 10 January 2020, licensing and adoption of the technique will be handled by the nonprofit Thorn. Companies and services wanting to test and adopt the technique can contact Thorn.