Friday, August 13, 2021

Apple Will Keep Clarifying This CSAM Mess Until Morale Improves - Gizmodo

Photo: Mladen Antonov/AFP (Getty Images)

Last week, Apple announced new tools to detect and report child pornography and sexually explicit materials. It’s a noble mission and no one’s going to argue against catching child predators. That said, the rollout has turned into a debacle of epic proportions.

The controversy centers around two features Apple says it will deploy later this year in iOS 15 and iPadOS 15. The first involves scanning photos that have been shared to iCloud for child sex abuse materials (CSAM). The second involves scanning messages sent to and from children’s accounts to stop them from sharing explicit images. (If you want a more detailed dive into how these features work, you can read more here.)
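
For a rough intuition of the first feature: the system compares fingerprints (hashes) of photos being uploaded against a database of fingerprints of already-known abuse images, rather than trying to judge the content of each photo. Apple's actual pipeline uses a perceptual hash called NeuralHash plus cryptographic blinding so the device never learns the result of a match; the sketch below is a deliberately simplified, hypothetical stand-in (plain SHA-256, made-up database entries) meant only to show the matching idea.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images, as supplied
# by child-safety organizations. (Entries here are made up; Apple's real
# system uses a perceptual "NeuralHash" and blinded matching, not SHA-256.)
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Fingerprint an image. A real perceptual hash would also match
    near-duplicates; SHA-256 only matches byte-identical files."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """True if the photo's fingerprint appears in the known-image list.
    Nothing about non-matching photos is learned or reported."""
    return fingerprint(image_bytes) in KNOWN_IMAGE_HASHES

# An ordinary photo's fingerprint won't be in the database.
print(matches_known_database(b"\x89PNG...vacation photo bytes..."))  # False
```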

As soon as the two features were announced, privacy and security experts sounded the alarm that, however well-intentioned, Apple was building a “backdoor” that could be misused by police or governments and create new risks. Apple replied with a lengthy FAQ. Thousands have since signed an open letter asking Apple to halt its work on the features and reaffirm its commitment to end-to-end encryption and user privacy. Yesterday, a Reuters report claimed that Apple employees are also raising concerns internally. In a bid to calm fears, the company promised that it wouldn’t allow governments to abuse its CSAM tools as a surveillance weapon. And today, Apple released yet another PDF, titled “Security Threat Model Review of Apple’s Child Safety Features,” in the hopes that further clarification will clear up “misunderstandings” about how this all works. (Spoiler: It won’t.)

This has been a public relations nightmare that is uncharacteristic for Apple. The company has gadget launches down to a science, and its events are always slick, well-produced affairs. After the backlash, Apple quietly admitted that perhaps it hadn’t fully thought out its communication strategy for two complex tools, and that announcing both features simultaneously, even though they don’t work the same way, may be part of why everyone’s confused. It has since launched an aggressive campaign to explain why its tools don’t pose a privacy and security threat. And yet journalists, experts, and advocacy groups remain befuddled. Hell, even Apple software chief Craig Federighi looked flustered while trying to break it all down for the Wall Street Journal. (And Federighi is normally cool as a cucumber when it comes to telling us how it all “just works.”)

Some of the confusion swirls around whether Apple is scanning your actual iPhone for CSAM. According to Federighi, the answer is both yes and no. The scanning occurs during the iCloud upload process: some of it happens on your phone, and some of it happens in the cloud. There have also been questions about how Apple arrived at an error rate of “one in 1 trillion.” That answer appears to boil down to advanced math. In all seriousness, Apple says it made its calculations using the most conservative parameters possible, but that doesn’t answer the original question: Why should we trust that number? Apple also set its reporting threshold at 30 CSAM-matched images, which feels like an arbitrary number, and the company didn’t have much of an answer for why, beyond saying that child predators purportedly collect far more CSAM images than that.
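
For a sense of why a threshold changes the math: if each individual image comparison has some tiny probability p of a false match, the chance of one account racking up 30 or more false matches is a binomial tail that collapses extremely fast as the threshold grows. The numbers below are placeholders, not Apple’s published parameters, and the independence assumption is a simplification; the sketch only shows the shape of the argument behind a “one in a trillion”-style figure.

```python
from math import exp, lgamma, log

def log_binom_pmf(n, k, p):
    """log P(X = k) for X ~ Binomial(n, p), computed in log space so the
    huge binomial coefficients never overflow a float."""
    log_choose = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
    return log_choose + k * log(p) + (n - k) * log(1 - p)

def prob_at_least(n, k, p):
    """Upper-tail P(X >= k). With a tiny p the terms shrink so quickly that
    summing a few hundred of them captures essentially all the mass."""
    top = min(n, k + 300)
    return sum(exp(log_binom_pmf(n, i, p)) for i in range(k, top + 1))

# Placeholder inputs -- assumptions for illustration, NOT Apple's figures.
p = 1e-6    # assumed false-match probability for any single image
n = 20_000  # assumed photos one account uploads to iCloud in a year
k = 30      # Apple's stated reporting threshold

print(f"P(an account hits {k}+ false matches) ~ {prob_at_least(n, k, p):.2e}")
```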

In a briefing with reporters today, Apple tried to give further assurances that its tools have simply been mischaracterized. For instance, it said its CSAM hash database would be created from an intersection of hashes supplied by two or more child safety organizations operating in separate sovereign jurisdictions; basically, the hashes won’t be provided by any one government. It also said there would be no automated reporting, and that it was aware it would have to expand its human review team. Apple also said it would maintain a public list of root hashes of every encrypted CSAM database shipping in every OS that supports the feature, and that third-party auditors are welcome to verify each version of the database. Apple repeatedly stated that these tools aren’t set in stone and are still very much in the works, though it demurred on whether any changes have been made since the brouhaha started.
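
The “intersection of two jurisdictions” point and the “public root hash” promise are both easy to picture as simple data operations: only hashes vouched for by multiple independent organizations make the shipped list, and a single fingerprint of that list is published so outsiders can check that every device got the same one. Below is a hedged sketch with invented hash values; it uses a plain sort-and-hash where Apple’s actual database format presumably differs.

```python
import hashlib

# Hash lists from two hypothetical child-safety organizations operating
# under different governments (all values are made up for illustration).
org_a = {"aa11aa11", "bb22bb22", "cc33cc33"}
org_b = {"bb22bb22", "cc33cc33", "dd44dd44"}

# Only entries vouched for by BOTH organizations are shipped, so no single
# government's list can add an entry on its own.
shipped_db = org_a & org_b  # {"bb22bb22", "cc33cc33"}

def root_hash(hash_db):
    """A single fingerprint for the whole database. Publishing this per OS
    build lets auditors confirm every device ships the identical list.
    (Sketch only: sorted concatenation + SHA-256, not Apple's real format.)"""
    return hashlib.sha256("".join(sorted(hash_db)).encode()).hexdigest()

print(sorted(shipped_db))
print(root_hash(shipped_db))
```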

This is the epitome of getting lost in the weeds. If you take a step back, the conflict isn’t really about the nuts and bolts of these tools (though they should certainly be vigorously examined for weaknesses). The conflict is about whether these tools should exist at all, and whether Apple should be taken at its word when so many experts are alarmed. What’s surprising is how badly Apple has stumbled at reassuring everyone that it can be trusted with this.

It’s too early to say which side will prevail, but this is how it’s all going to go down: Critics won’t stop pointing out how Apple is creating an infrastructure that can be abused, and Apple won’t stop trying to convince us all that these tools are safe, private, and accurate. One side will hammer the other into submission, or at least until they’re too tired to protest any further. The rest of us will remain utterly confused.



https://news.google.com/__i/rss/rd/articles/CBMiVWh0dHBzOi8vZ2l6bW9kby5jb20vYXBwbGUtd2lsbC1rZWVwLWNsYXJpZnlpbmctdGhpcy1jc2FtLW1lc3MtdW50aWwtbW9yYWxlLTE4NDc0ODQyOTbSAVlodHRwczovL2dpem1vZG8uY29tL2FwcGxlLXdpbGwta2VlcC1jbGFyaWZ5aW5nLXRoaXMtY3NhbS1tZXNzLXVudGlsLW1vcmFsZS0xODQ3NDg0Mjk2L2FtcA?oc=5

