Apple’s child abuse detection software may be vulnerable to attack


Apple has plans to detect images of child sexual abuse on some of its devices

Yuichiro Chino/Getty Images

Researchers warn that Apple’s soon-to-be-launched algorithm for detecting images of child sexual abuse on iPhones and iPads may incorrectly flag people as being in possession of illegal images.

NeuralHash will be launched in the US with an update to iOS and iPadOS later this year. The tool will compare a hash – a unique string of characters created by an algorithm – of every image uploaded to the cloud with a database of hashes for known images …
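The matching step described above can be sketched in a few lines. Note this is only an illustration of the comparison logic, not Apple’s implementation: NeuralHash is a perceptual hash produced by a neural network, whereas the stand-in below uses SHA-256 purely to show how an uploaded image’s hash would be checked against a database of known hashes. All image data and database entries here are hypothetical.

```python
import hashlib

# Stand-in hash function. NeuralHash is a perceptual neural-network hash
# designed so visually similar images map to similar hashes; SHA-256 is
# used here only to demonstrate the lookup mechanism.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known images.
known_hashes = {
    image_hash(b"known-image-A"),
    image_hash(b"known-image-B"),
}

def is_flagged(uploaded_image: bytes) -> bool:
    """Flag an upload if its hash matches an entry in the known-hash database."""
    return image_hash(uploaded_image) in known_hashes

print(is_flagged(b"known-image-A"))   # matches the database -> True
print(is_flagged(b"holiday-photo"))   # no match -> False
```

The researchers’ concern maps directly onto this scheme: because a perceptual hash deliberately tolerates small image changes, distinct images can collide on the same hash, producing a false match against the database.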
