
Apple To Scan US iPhones For Images of Child Sexual Abuse



Apple To Scan US iPhones For Images of Child Sexual Abuse. Photo by cottonbro from Pexels: https://www.pexels.com/photo/person-in-white-dress-shirt-holding-black-smartphone-5083010/

Apple unveiled plans to scan U.S. iPhones for child sexual abuse images and illegal content — a move that pleased child protection groups but raised alarm among security researchers who say it could expose millions of people’s personal devices to scrutiny by the government.

Apple detailed its proposed automated system, known as “neuralMatch,” and said it will proactively scan for illegal images and content before they are uploaded to iCloud.

If a match is found, it will be reported to law enforcement and the user’s account will be disabled. In addition, Apple said it will notify the National Center for Missing and Exploited Children (NCMEC). The scheme will initially roll out only in the U.S.

Separately, Apple is planning to scan users’ encrypted messages for sexually explicit content as a child safety measure. This too alarmed privacy advocates.

The detection system will flag only images that are already in a database of known child sexual abuse imagery. Parents taking innocent pictures of their children bathing need not worry.


Researchers say the matching tool, neuralMatch, only sees mathematical fingerprints that represent the illegal images, but they warn it could be repurposed by authoritarian governments to surveil their citizens.

Dr. Matthew Green, a top cryptography researcher at Johns Hopkins University, disclosed details of the new program before Apple made a statement on it.

Green warned that the way the system works introduces new security risks for users: the phone downloads a list of fingerprints produced by the National Center for Missing and Exploited Children (NCMEC) that correspond to its database of abuse images, then checks photos against that list.

“Whoever controls this list can search for any content they want on your phone and you really don’t have any way of knowing what’s on the list because it’s obscured — just a bunch of opaque numbers, even if you hack into your phone to get the list,” Green said. “The theory is that you will trust Apple to only include really bad images. Say, the images carefully selected by NCMEC.”
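As a rough illustration of what matching against such a list can look like, here is a minimal sketch in Python. It is not Apple’s neuralMatch or NeuralHash system: the hashing step, fingerprint values and function names below are invented for illustration, and a real perceptual hash is designed so that visually similar images produce the same fingerprint, unlike the cryptographic hash used here to keep the example self-contained.

```python
import hashlib

# Illustrative sketch only -- NOT Apple's neuralMatch/NeuralHash.
# Real systems use perceptual hashing so near-identical images share a
# fingerprint; SHA-256 is used here purely to keep the example runnable.

# Opaque fingerprints supplied by a child-safety organization's database.
# The value below is a made-up placeholder.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for the perceptual hash of an image's content."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint matches the known list
    before the photo would be uploaded."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

# The device only learns whether a photo's fingerprint appears in the list;
# the list itself is just opaque numbers with no readable description.
print(flag_before_upload(b"holiday photo"))  # False: no match in the list
```

In a scheme of this shape, the list on the device is exactly the kind of opaque set of numbers Green describes, which is why he argues users cannot independently verify what content it actually targets.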

Apple’s technology balances “the need for privacy with digital safety for children,” said Julie Cordua, the CEO of Thorn, a nonprofit funded by Demi Moore and Ashton Kutcher that uses technology to help protect children from sexual abuse by identifying victims.

Apple said that its goal is to create technology that empowers people and enriches their lives while helping them stay safe.
