Apple's decision to scan iPhone users' photos in order to combat child pornography has not been well received.
Many companies, led by WhatsApp, have addressed an open letter to Apple after the American company announced its intention to scan all of its users' photos.
Although Apple says it will use the scanning of photos in users' galleries and in iCloud exclusively to combat child pornography, and claims that errors will be all but nonexistent, the decision has been poorly received by users and companies alike.
The open letter, which launched an initiative calling on Apple to reconsider and change its decision on this approach to combating child pornography, has gathered almost 5,000 signatures from academics, security researchers, IT professionals and others who have been responsible for privacy protection and cybersecurity over the past decade and a half.
The head of WhatsApp, Will Cathcart, said that Apple's new decision is deeply concerning, and noted that publicly available mechanisms for reporting child exploitation and abuse through WhatsApp registered more than 400,000 cases during 2020, which were reported to the National Center for Missing & Exploited Children.
Even though Apple and Facebook, WhatsApp's parent company, have their own reasons to quarrel and regularly call each other out, this time WhatsApp appears to have very solid grounds for its criticism of the company from Cupertino.
Edward Snowden also voiced his dissatisfaction with Apple's stance, as did many other prominent figures, warning that this approach opens a back door into Apple's system for abuse and manipulation. Some went so far as to claim that this is a "backdoor in action" and that it is only a matter of time before it is exploited.
American politician Brianna Wu called it the worst Apple idea ever.
Here is what other experts and prominent figures in the field of cybersecurity and online safety have said in their public criticism of Apple.
These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.
— Matthew Green (@matthew_d_green) August 5, 2021
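As a rough illustration of the mechanism Green describes, the sketch below shows hash-list matching with a reporting threshold. It is purely illustrative: the hash list, threshold value, and function names are hypothetical stand-ins, and Apple's actual system relies on a perceptual hash (NeuralHash) that tolerates small image changes, unlike the cryptographic hash used here for simplicity.

```python
# Illustrative sketch only, not Apple's implementation: compare each photo's
# fingerprint against a list of known banned-image fingerprints and flag the
# library once the number of matches crosses a threshold.
import hashlib

# Hypothetical database of fingerprints of known banned images.
BANNED_HASHES: set[str] = set()

# Hypothetical threshold: how many matches are needed before anything is reported.
REPORT_THRESHOLD = 30


def fingerprint(image_bytes: bytes) -> str:
    """Toy stand-in for a perceptual hash; a real system would use a
    similarity-preserving hash rather than SHA-256."""
    return hashlib.sha256(image_bytes).hexdigest()


def library_exceeds_threshold(images: list[bytes]) -> bool:
    """Return True only if enough photos match the banned-image list."""
    matches = sum(1 for img in images if fingerprint(img) in BANNED_HASHES)
    return matches >= REPORT_THRESHOLD
```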
Apple's filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully-built system just waiting for external pressure to make the slightest change. https://t.co/f2nv062t2n
— EFF (@EFF) August 5, 2021
These "child protection" features are going to get queer kids kicked out of their homes, beaten, or worse. https://t.co/VaThf222TP
— Kendra Albert (@KendraSerra) August 5, 2021
EFF reports that the iMessage nudity notifications will not go to parents if the kid is between 13-17 but that is not anywhere in the Apple documentation that I can find. https://t.co/Ma1BdyqZfW
— Kendra Albert (@KendraSerra) August 6, 2021
Apple plans to modify iPhones to constantly scan for contraband:
“It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops,” said Ross Anderson, professor of security engineering. https://t.co/rS92HR3pUZ
— Edward Snowden (@Snowden) August 5, 2021
This is the worst idea in Apple history, and I don't say that lightly.
It destroys their credibility on privacy. It will be abused by governments. It will get gay children killed and disowned. This is the worst idea ever. https://t.co/M2EIn2jUK2
— Brianna Wu (@BriannaWu) August 5, 2021
Just to state: Apple's scanning does not detect photos of child abuse. It detects a list of known banned images added to a database, which are initially child abuse imagery found circulating elsewhere. What images are added over time is arbitrary. It doesn't know what a child is.
— SoS (@SwiftOnSecurity) August 5, 2021
In other words, not only does the policy have to be exceptionally robust, so does the implementation.
— matt blaze (@mattblaze) August 6, 2021
I spent another hour searching, and there indeed appears to be no way to delete a useless at-icloud-dot-com email account that Apple used to force everyone to take, except for deleting your entire Apple account and losing everything you've bought. Lock-in indeed! pic.twitter.com/JT5KOta5jA
— Tim Sweeney (@TimSweeneyEpic) August 7, 2021
I will share some very detailed thoughts on this related topic later.
— Tim Sweeney (@TimSweeneyEpic) August 6, 2021
I believe in privacy - including for kids whose sexual abuse is documented and spread online without consent. These efforts announced by @Apple are a major step forward in the fight to eliminate CSAM from the internet. https://t.co/TQIxHlu4EX
— ashton kutcher (@aplusk) August 5, 2021
What do you think about Apple's decision?