Another year, another alarming breach of personal privacy by Apple.
It seems like only yesterday that Apple battled the FBI, refusing to cooperate with law enforcement when it came to cracking a terror suspect's phone. Tim Cook, Apple's CEO, would give speeches at universities and Macworld conferences about the importance of not treating users like 'the product.' Apple is well known for its relentless advertising campaigns detailing its commitment to privacy and user security. But, much like Apple's excuse that it slowed down your iPhone in the iOS 10 update in order to "save your phone's battery life," it might have all been bullshit.
Recently, reports have surfaced that Apple is planning to search iPhones, iPads, and Macs for sexually explicit material involving minors. This covers not only iCloud, but also the device's internal storage, as well as messages sent to and from the encrypted iMessage system.
The system, Apple claims, was designed in cooperation with the National Center for Missing and Exploited Children (NCMEC) and is part of an effort to track known images of child abuse, using a technology called perceptual image hashing.
An image hash is a signature derived from an image file: a string of numbers and letters computed from the file's data. Modify the file in any way and some of those numbers and letters change, indicating it is no longer the original. Perceptual image hashing works differently: it matches the visual content of the images themselves, rather than the raw file data behind them, in order to find child sexual abuse material, or CSAM.
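To illustrate the distinction, here is a minimal sketch comparing an ordinary cryptographic file hash with a naive "average hash," a simple stand-in for perceptual hashing. This is not Apple's NeuralHash; it assumes the Pillow library is installed, and photo.jpg is just a placeholder file name.

```python
# Minimal sketch: cryptographic hash vs. a naive perceptual (average) hash.
# This is NOT NeuralHash; it only illustrates the general idea.
import hashlib
from PIL import Image

def crypto_hash(path):
    # Any byte-level change to the file produces a completely different digest.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def average_hash(path, size=8):
    # Downscale, grayscale, then mark each pixel as above/below the mean brightness.
    # Mild edits (resizing, recompression) usually leave most of these bits unchanged.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return hex(int(bits, 2))

print(crypto_hash("photo.jpg"))    # changes entirely if the file is re-saved
print(average_hash("photo.jpg"))   # stays close even after mild edits
```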
So what about images that have been slightly modified? Well, Apple has an answer for that too. At first, the company claimed it was only searching image hashes; now it says it has developed a deep-learning model trained to examine the images themselves and find close approximations of child abuse imagery, even if an image has been changed in some way. Apple calls this technology NeuralHash, and it has already been used on the iCloud platform.
This is nothing new for cloud platforms that host images online. Everything from Dropbox to AWS runs some form of hash searching against files hosted on their servers, looking for things like child abuse material and even terrorist content. What Apple is planning to do here, in the coming iOS 15 and macOS Monterey updates, takes it in a direction that experts claim sets a very dangerous precedent.
Alec Muffett, a former Facebook staffer and renowned encryption specialist, took to Twitter to describe what he considers a betrayal of consumer trust.
although it ostensibly exists to prevent upload of CSAM to their iCloud platform, they are using the user’s device to do it and making the tectonic-shift statement that ‘it’s ok by us to do this sort of thing to user devices.’
Muffett went on to voice political concerns over this decision as well, citing the many ways this technology could be misused by less democratic nations around the world.
How such a feature might be repurposed in an illiberal state is fairly easy to visualize. Apple is performing proactive surveillance on client-purchased devices in order to defend its own interests, but in the name of child protection, [...] What will China want them to block?
Matthew Green, a cryptographer and professor at Johns Hopkins University, raised many of the same concerns. He, and others, believe that now that Apple has opened the door to scanning images on devices, instead of just images in the cloud, the precedent could invite pressure from governments around the world to use this technology for other, less altruistic purposes, like stifling political protests or targeting marginalized groups for political persecution.
The Electronic Frontier Foundation (EFF), an organization founded to protect the rights of people online, has chimed in on this issue, arguing Apple is going in the wrong direction.
Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.
Predictably, not everyone is against this idea. A vocal part of the social media world insists that this is all a big misunderstanding of what's going on.
There is still debate as to whether images are really being scanned on the phone at all, or simply hashed as they are uploaded to iCloud. Several noted industry professionals don't believe the former is the case.
David Thiel, a technology and cybersecurity researcher at Stanford University, believes the focus of the concern should be somewhere else.
Apple's proposed approach to 'protecting kids' online doesn't end at perceptual image hashing. Apple also has plans to take on "sexting" and its potential to result in people's nudes being leaked or used in extortion attempts, and that is the problem Thiel finds genuinely alarming.
Apparently, a separate on-device detection system, distinct from the CSAM hash matching, would be built into the iPhone's software to detect nude images and warn users before they are sent.
Thiel isn't the only academic who isn't totally against the concept, either. Dr. Nicholas Weaver, a computer scientist and lecturer at the University of California, Berkeley, points to the legal pressure behind Apple's decision.
Weaver refers to the EARN IT Act of 2020: a U.S. bill introduced with bipartisan support that would make companies liable for any CSAM images hosted on their servers, including encrypted ones. Hosting images uploaded to iCloud puts Apple in a precarious position, since the iPhone maker must now balance its privacy-focused marketing against the demands of the U.S. government. It can have one or the other, but it can't have both.
Apple calls concerned people a "screeching minority"
You read that title right. In an internal memo circulated at Apple, dated August 6th, there is a message from Marita Rodriguez, director of strategic partnerships at NCMEC, which says the following:
Team Apple,
I wanted to share a note of encouragement to say that everyone at NCMEC is SO PROUD of each of you and the incredible decisions you have made in the name of prioritizing child protection.
It’s been invigorating for our entire team to see (and play a small role in) what you unveiled today.
I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.
Our voices will be louder.
Our commitment to lift up kids who have lived through the most unimaginable abuse and victimizations will be stronger.
During these long days and sleepless nights, I hope you take solace in knowing that because of you many thousands of sexually exploited victimized children will be rescued, and will get a chance at healing and the childhood they deserve.
Thank you for finding a path forward for child protection while preserving privacy.
What's interesting to note here is that a large chunk of the pushback against this policy is coming from Apple employees themselves. On August 12th, Reuters reported that an alarming number of employees had flooded Apple's internal Slack messaging system with concerns about the system's potential for misuse.
Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.
Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.
This is an awkward position for a company that just tried to write off its opposition's concerns as the "screeching voices of the minority."
NeuralHash reverse-engineered on GitHub
Reddit user and developer Asuhariet Ygvar claims to have reverse-engineered the algorithm NeuralHash uses to detect CSAM on Apple devices. He says that NeuralHash was already included in iOS as early as version 14.3, and that, using model files extracted from the iOS binary, he managed to rebuild the entire program in Python. Ygvar even provides the source code so that other developers can build it themselves.
According to Ygvar, the perceptual image hashing Apple uses to detect CSAM is a seven-step process (a rough sketch of the final steps follows the list).
1. Convert image to RGB.
2. Resize image to 360x360.
3. Normalize RGB values to [-1, 1] range.
4. Perform inference on the NeuralHash model.
5. Calculate dot product of a 96x128 matrix with the resulting vector of 128 floats.
6. Apply binary step to the resulting 96 float vector.
7. Convert the vector of 1.0 and 0.0 to bits, resulting in 96-bit binary data.
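As a rough illustration of steps 5 through 7, the sketch below uses a random 128-float vector in place of the real model output and a random 96x128 matrix in place of Apple's actual hashing matrix; both are stand-ins, not the genuine NeuralHash components.

```python
# Minimal sketch of steps 5-7 of the pipeline Ygvar describes.
# The "descriptor" and "hash_matrix" below are random placeholders,
# NOT the real NeuralHash model output or Apple's seed matrix.
import numpy as np

rng = np.random.default_rng(0)
descriptor = rng.standard_normal(128)          # step 4 output: 128 floats from the model
hash_matrix = rng.standard_normal((96, 128))   # fixed 96x128 projection matrix

projected = hash_matrix @ descriptor           # step 5: dot product -> 96 floats
bits = (projected > 0).astype(np.uint8)        # step 6: binary step on each float
neural_hash = "".join(map(str, bits))          # step 7: 96-bit binary string

print(len(neural_hash), neural_hash[:16])      # 96 bits, e.g. '1010...'
```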
When questioned by a Reddit user as to whether the algorithm he reconstructed is the same one Apple is using now, Ygvar said this:
First of all, the model files have prefix NeuralHashv3b-, which is the same term as in Apple's document. Secondly, in this document Apple described the algorithm details in Technology Overview -> NeuralHash section, which is exactly the same as what I discovered. For example, in Apple's document:

Second, the descriptor is passed through a hashing scheme to convert the N floating-point numbers to M bits. Here, M is much smaller than the number of bits needed to represent the N floating-point numbers.

And as you can see from here and here N=128 and M=96.

Moreover, the hash generated by this script almost doesn't change if you resize or compress the image, which is again the same as described in Apple's document.
It wasn't long before another Reddit user was able to use the information from Ygvar's build of NeuralHash to create a 'collision', a false match, between two completely unrelated images.
This is the very circumstance many experts in the field feared. If collisions can be manufactured, attackers could craft innocent images that trigger matches (false positives) or subtly alter real material so it evades detection (false negatives), completely defeating the purpose of NeuralHash.
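For intuition, a collision simply means two unrelated images end up with identical (or nearly identical) 96-bit hashes. The toy comparison below uses a made-up near-match threshold and placeholder hash strings; it is only an illustration, and Apple's real system compares blinded hashes using private set intersection rather than a plain bit comparison like this.

```python
# Illustration only: what a NeuralHash "collision" means in practice.
# Two hashes from unrelated images that are identical (or within a small
# Hamming distance) would be treated as the same image by a naive matcher.

def hamming_distance(a: str, b: str) -> int:
    """Count differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

hash_of_real_photo = "0110" * 24      # placeholder 96-bit hash of an ordinary photo
hash_of_crafted_noise = "0110" * 24   # adversarially crafted image with the same hash

distance = hamming_distance(hash_of_real_photo, hash_of_crafted_noise)
print(distance)        # 0 -> the two unrelated images "match"
print(distance <= 2)   # True under a hypothetical near-match threshold
```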
An Apple spokesperson told Motherboard yesterday that, despite Ygvar's findings, the version of NeuralHash (v3B) that was reverse-engineered isn't the same one Apple will be using going forward.
Apple has been spying on you for years
Back in 2019, Apple was caught eavesdropping on conversations through the Siri voice assistant. After weeks of public denials, the company finally apologized once it was revealed that recordings of users' conversations were being shared with third parties.
For most non-Apple users, getting the message across to iPhone owners that a multi-billion-dollar technocratic monopoly doesn't have their best interests at heart is like trying to tell a Christian that Jesus Christ didn't die for our sins. There is a sort of alternative lifestyle associated with the brand that has taken on almost religious proportions over the past 20 years. This, however, appears to be the last straw for many Apple users. They bought products marketed on privacy superior to the competition's, and Apple has now gone further than its competitors ever did to breach their trust.