Apple develops technology to scan iPhones for images of child sexual abuse

Apple said its messaging app will now be able to identify sensitive content and warn users about it, without giving the company the ability to read private communications.

The tool, which Apple has dubbed "neuralMatch," detects known images of child sexual abuse without decrypting people's messages. If a match is found, the image is reviewed by a human, who can notify law enforcement if necessary.
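The article does not describe how the matching works internally, so the sketch below only illustrates the general idea of comparing a user's photos against a database of hashes of known abusive images, with matches escalated for human review. The hash value, directory name, and use of SHA-256 are illustrative assumptions; real systems such as PhotoDNA or neuralMatch rely on perceptual hashes that survive resizing and re-encoding, which an exact cryptographic hash does not.

import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abusive images, of the kind
# maintained by clearinghouses. The value below is a made-up placeholder.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the hex SHA-256 digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return photos whose hashes appear in the known-image database.

    In a deployed system a match would not go straight to police: it would
    first be escalated for human review, as the article describes.
    """
    return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_HASHES]

if __name__ == "__main__":
    for match in flag_matches(Path("photos")):
        print(f"match flagged for human review: {match}")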

But security researchers warned that the tool could be put to other purposes, such as government surveillance of dissidents or protesters.

Matthew Green of Johns Hopkins University, one of the leading cryptography researchers, expressed concern that the tool could be used to frame innocent people by sending them harmless-looking images designed to trigger matches for child pornography, fooling Apple's algorithm and alerting law enforcement. "Researchers have been able to do this fairly easily," he said.

Tech companies such as Microsoft, Google, and Facebook have for years shared "blacklists" of known images of child sexual abuse. Apple also scans files stored on its iCloud service, which is not as securely encrypted as its messages, for such images.

The company has come under pressure from governments and police to allow encrypted information to be monitored.

A delicate balancing act

Moving ahead with these measures will require Apple to strike a delicate balance between cracking down on child exploitation and maintaining its high-profile commitment to protecting its users' privacy.

Apple believes it can achieve that balance with technology it developed in consultation with several prominent cryptographers, including Dan Boneh, a professor at Stanford University whose work in the field has earned him a Turing Award, often described as technology's version of the Nobel Prize.

The computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse in Apple's system but argued that it was outweighed by the need to fight child sexual abuse.

"Is it possible? Of course. But is it something that worries me? No," said Hany Farid, a researcher at the University of California, Berkeley, who argues that many other programs designed to protect devices from various threats work in a similar way. WhatsApp, for example, provides its users with end-to-end encryption to protect their privacy, but it also uses a malware detection system and warns users not to open suspicious links.