How Apple will scan for child exploitation images and why it is a matter of concern

Apple to scan devices for child exploitation images

Among this year’s software updates from Apple is one aimed at protecting children from exploitation. The move, however, has raised concerns. Here is how the tech giant plans to pull off this difficult task, and why critics are worried.

Apple announces a solution to stop child exploitation

One of Apple’s software updates this year takes aim at a serious social problem. The feature will “help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM),” said Apple. The update also includes cryptographic technology designed to limit the spread of CSAM online.

For starters, the on-device protection will warn children before they send or receive sensitive images, and it can alert parents when the user is under 13 years old. Additionally, Apple will intervene when users search for CSAM-related topics through Search or Siri.

How will Apple make this possible?

According to Apple’s blog post, iOS and iPadOS will use cryptography to detect known CSAM images stored in iCloud Photos. Images are matched on the device against a database of known CSAM images provided by child safety organizations. This does not happen by looking through photos but through fingerprint matching. When the number of matches crosses a threshold, Apple will report the account to the National Center for Missing and Exploited Children (NCMEC) for further action. Users can appeal if they believe their account was flagged by mistake.
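
To make the matching step concrete, here is a minimal sketch of threshold-based fingerprint matching. It is not Apple’s implementation: Apple matches perceptual fingerprints against a database stored as unreadable hashes, while this sketch substitutes an ordinary SHA-256 hash and an invented threshold purely to show the flow of comparing photos against a known list and acting only once a threshold is crossed.

```python
import hashlib

# Hypothetical stand-in for the known-image database. The real system matches
# perceptual fingerprints; SHA-256 of the raw bytes is used here only to
# illustrate the flow, so it would catch exact copies only.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}
REPORT_THRESHOLD = 30  # illustrative value; the real threshold is not public

def fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint for an image (SHA-256 stand-in)."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(photo_library: list[bytes]) -> int:
    """Count photos whose fingerprint appears in the known database."""
    return sum(1 for img in photo_library if fingerprint(img) in KNOWN_FINGERPRINTS)

def should_flag_for_review(photo_library: list[bytes]) -> bool:
    """An account is surfaced for review only once matches cross the threshold."""
    return count_matches(photo_library) >= REPORT_THRESHOLD
```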

Apple clarified that users’ privacy is its top priority; the database itself is a series of unreadable hashes. “This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image,” explained Apple.
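
The threshold is what keeps individual matches meaningless on their own. A rough way to see the idea is threshold secret sharing: each matching image contributes one share of a secret, and the secret (standing in for whatever unlocks the voucher contents) cannot be reconstructed until enough shares exist. The toy Shamir-style sketch below illustrates that principle only; it is not Apple’s actual protocol, and every value in it is made up.

```python
import random

PRIME = 2**127 - 1   # field modulus for the toy scheme
THRESHOLD = 3        # illustrative; the real threshold is not public

def make_shares(secret: int, n_shares: int, threshold: int = THRESHOLD):
    """Split `secret` into points on a random polynomial of degree threshold - 1."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0; gives garbage with fewer than THRESHOLD shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    secret = 123456789
    shares = make_shares(secret, n_shares=5)
    assert reconstruct(shares[:THRESHOLD]) == secret       # enough matches: recoverable
    assert reconstruct(shares[:THRESHOLD - 1]) != secret   # too few: stays hidden
```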

What are the concerns against this?

While child exploitation is a serious offense, the new update has raised eyebrows. Many worry that governments could misuse this surveillance technology, and the fact that it comes from a company that has always vouched for privacy was surprising. Experts like cryptographer Matthew Green of Johns Hopkins University fear that the system could be used to frame innocent people by sending them images designed to trigger CSAM matches.

What are the other new Apple features?

Apart from this, Apple’s new communication safety feature for Messages will also work towards this goal. With this feature, sensitive images received through Messages will be blurred, and parents can be alerted. The same rule applies when a child sends a sensitive image. Additionally, when users try to look up CSAM-related topics through Search or Siri, they are told why the topic is harmful and pointed to resources for reporting child exploitation.
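
As a rough illustration, the communication safety decision described above could be sketched as follows. The function names, the classifier flag, and the age-cutoff handling are assumptions made for illustration; the actual check runs on-device inside Messages.

```python
from dataclasses import dataclass

PARENT_ALERT_AGE_CUTOFF = 13  # parents are alerted only for children under 13

@dataclass
class IncomingImage:
    sender: str
    flagged_sensitive: bool  # result of an assumed on-device sensitivity check

def handle_incoming_image(image: IncomingImage, child_age: int) -> dict:
    """Decide whether to blur the image and whether to notify parents."""
    if not image.flagged_sensitive:
        return {"blur": False, "notify_parents": False}
    return {
        "blur": True,                                    # sensitive content is blurred
        "notify_parents": child_age < PARENT_ALERT_AGE_CUTOFF,
    }
```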
