Microsoft’s ‘Project Artemis’ chat scanner to detect child sex predators

Codenamed Project Artemis, the technique combs through historical messages and looks for indicative patterns and characteristics before assigning a probability rating.

By: DINA BASS
| Updated on: Aug 20 2022, 18:50 IST
The tech company, which makes the Xbox gaming system, announced it's sharing the tool starting Friday, Jan. 10, 2020 with non-profit organizations and other gaming and messaging service developers. (AP)

Microsoft Corp. will share a tool it's been using on its Xbox gaming service to scan online text chats and detect adults seeking to groom and exploit children for sexual purposes. 

Codenamed Project Artemis, the technique combs through historical messages and looks for indicative patterns and characteristics before assigning a probability rating. That can then be used by companies to decide which conversations on their platforms should get a closer look by a human moderator, wrote Courtney Gregoire, Microsoft's chief digital safety officer, in a blog post.
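Project Artemis itself is proprietary and its model has not been published, but the two-stage flow the article describes — score a conversation's message history against indicative patterns, then route anything above a threshold to a human moderator — can be sketched roughly as follows. All pattern strings, weights, and thresholds here are hypothetical illustrations, not the real system's features:

```python
import re
from dataclasses import dataclass

# Hypothetical indicative patterns with illustrative weights;
# the real system's features and model are not public.
INDICATIVE_PATTERNS = {
    r"\bhow old are you\b": 0.2,
    r"\bdon'?t tell (your|ur) (mom|dad|parents)\b": 0.5,
    r"\bsend (me )?(a )?(pic|photo)\b": 0.4,
    r"\bour (little )?secret\b": 0.5,
}

@dataclass
class Conversation:
    conversation_id: str
    messages: list  # historical text messages, oldest first

def risk_score(convo: Conversation) -> float:
    """Combine per-pattern weights into a probability-like rating in [0, 1]."""
    score = 0.0
    for message in convo.messages:
        for pattern, weight in INDICATIVE_PATTERNS.items():
            if re.search(pattern, message.lower()):
                score += weight
    return min(score, 1.0)

def triage(convos, threshold=0.6):
    """Return conversations whose rating warrants a closer look by a human moderator."""
    return [c for c in convos if risk_score(c) >= threshold]
```

In practice a system like this would use a trained classifier rather than hand-written rules; the point is only the shape of the pipeline: an automated probability rating first, with human review reserved for conversations that cross the threshold.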

Tech companies are grappling with how to stem a rising tide of child pornography and exploitation online as images and nefarious texts overwhelm moderators and private chat apps make detection tougher. Companies in the industry reported a record 45 million online images of child sexual abuse in 2018, the New York Times noted in September. Adult predators use built-in chat functions on popular video games and private messaging apps to groom children and solicit nude photos, sometimes by posing as kids themselves.

Microsoft's so-called grooming detection technique promises to help rein in that behavior in text communications, but it leaves unaddressed voice chat in multiplayer games like Fortnite, which serves as another avenue for child sex predators.

The project started at a November 2018 hackathon, co-sponsored with two child welfare groups, that looked not just at new technology ideas but also at legal and policy issues. Since then, Microsoft has been developing the tools in collaboration with the companies behind the online video game Roblox and the messaging app Kik; The Meet Group, which owns social meeting apps like MeetMe and Skout; and Thorn, a non-profit organization co-founded by actors Ashton Kutcher and Demi Moore to fight child sex abuse.

The team was led by Dartmouth College Computer Science Professor Hany Farid, who previously worked with Microsoft to build PhotoDNA, a tool that's been used by 150 companies and organizations to find and report images of child sexual exploitation. Farid has written in opposition to the proliferation of end-to-end encryption in social and private messaging services, arguing that it makes detecting and preventing child abuse more difficult.

Starting Jan. 10, Thorn will handle licensing of Project Artemis, which is built on Microsoft patents and available for free to qualifying online services, which can sign up by emailing antigrooming@thorn.org. Microsoft said it is already using the technique for Xbox chats and is looking at doing the same for Skype.


First Published Date: 10 Jan, 19:16 IST