
Researchers Who Built CSAM Detection System Warn Technology is Dangerous

Researchers who built a child sexual abuse material (CSAM) detection system similar to the one Apple intends to implement are warning, in an opinion piece for The Washington Post, that the technology is dangerous.

Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, and Anunay Kulshrestha, a graduate researcher at the Princeton University Center for Information Technology Policy, dispute the claim that opposition to Apple's scanning of user photos is rooted in "misunderstandings."

"We wrote the only peer-reviewed publication on how to build a system like Apple's — and we concluded the technology was dangerous," they write. "We're not concerned because we misunderstand how Apple's system works. The problem is, we understand exactly how it works."

Mayer and Kulshrestha say that the system can easily be repurposed for censorship and surveillance.

"A service could simply swap in any content-matching database," they write, "and the person using that service would be none the wiser."
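To see why such a swap would be invisible, consider a minimal sketch of client-side hash matching. This is not Apple's actual NeuralHash and private set intersection pipeline; the function names and database contents below are hypothetical, and SHA-256 stands in for a perceptual hash only to keep the example self-contained.

```python
import hashlib

# Minimal conceptual sketch of client-side matching, NOT Apple's actual
# system. SHA-256 stands in for a perceptual hash; all names here are
# hypothetical.

def content_hash(data: bytes) -> str:
    """Hash a photo's bytes into the opaque form the client matches against."""
    return hashlib.sha256(data).hexdigest()

def scan(photo: bytes, database: set[str]) -> bool:
    """Flag the photo if its hash appears in the provider-supplied database."""
    return content_hash(photo) in database

# From the device's perspective these two databases are indistinguishable:
# both are just sets of opaque hashes. One could hold CSAM fingerprints,
# the other fingerprints of banned political imagery.
csam_db = {content_hash(b"known-abuse-image-bytes")}
censorship_db = {content_hash(b"banned-political-image-bytes")}

photo = b"banned-political-image-bytes"
print(scan(photo, csam_db))        # False
print(scan(photo, censorship_db))  # True -- same client code, swapped database
```

The sketch illustrates the researchers' point: the matching code on the device is identical either way, and nothing visible to the user constrains what the provider puts in the database.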

The researchers note that Apple's stance on potential misuse is a reversal of its earlier position. In 2016, Apple refused to help the FBI unlock a terrorist's iPhone, arguing in court filings that if it built such a capability, it would inevitably be abused.

“It’s something we believe is too dangerous to do,” Apple explained. “The only way to guarantee that such a powerful tool isn’t abused … is to never create it.”
