What’s Going Wrong at Apple These Days?


Apple’s new child safety features for the upcoming iOS, iPadOS, and macOS releases are raising serious privacy concerns across the industry. In the same week, the company has also been marred by reports that it promoted scam apps on the App Store, as well as by claims of workplace sexism.

Are Apple’s Child Protection Features a Slippery Slope?

Apple on Thursday announced a series of new features that will be available with the upcoming iOS 15, iPadOS 15, and macOS Monterey. The features are intended as a way for the company to detect and report instances of sexual abuse toward children. These child safety features include:

  • New machine learning-based communication tools, called Communication Safety, to help parents monitor their children’s online communications for sexually explicit content
  • A cryptography tool called neuralMatch to detect child sexual abuse material (CSAM)
  • Updates to Siri and Search to catch instances of CSAM being accessed through Apple devices

The first and third features could prove genuinely useful for parents and guardians. The reaction to neuralMatch, however, has been mixed, mainly because of privacy concerns arising from the tool’s ability to scan iPhones and other devices for images of child sexual abuse.

neuralMatch is designed to scan every image on an iPhone, iPad, or Mac before it is uploaded to iCloud. As its name suggests, the tool computes a hash of each image saved on the device and matches it against a database of known CSAM. A match is manually reviewed before being reported to the National Center for Missing and Exploited Children (NCMEC).

While neuralMatch could help curb child abuse by identifying offenders, it raises serious privacy concerns. “This is a really bad idea,” tweeted Matthew Green, a cryptographer and professor at Johns Hopkins University.

Privacy Implications

Green’s apprehensions are twofold. First, there is the possibility of neuralMatch being used by authoritarian governments to invade user privacy. He adds, “Initially I understand this will be used to perform client-side scanning for cloud-stored photos. Eventually, it could be a key ingredient in adding surveillance to encrypted messaging systems.”

It is difficult to say whether Apple, which a few years ago declined the FBI’s and the U.S. government’s request to install a backdoor in its products, is now capitulating to such demands. Its stance at the time was to disallow surveillance of encrypted user data.

Apple scanning photos is not new. Apple, along with Google, Microsoft, Facebook, and other companies, already scans users’ cloud storage for images that match known CSAM. The problem with this particular update is that it moves the scanning onto the user’s device, giving the system direct access to local storage.

Greg Nojeim, co-director of the Center for Democracy & Technology’s Security & Surveillance Project, agrees. In a sharp critique of the company, Nojeim said, “Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope creep not only in the U.S., but around the world. Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”

Technological Issues

Second, Green believes that hash-based image scanning, searching, and matching is not an ideal solution.

neuralMatch is based on a cryptographic technique called private set intersection, which, according to Apple, determines whether there is a match without revealing the result. The device creates a cryptographic safety voucher containing the encoded match result, which is uploaded to iCloud Photos along with the image.
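To make the idea concrete, here is a minimal, toy sketch of a Diffie-Hellman-style private set intersection. This is not Apple’s actual protocol (and the prime, hash-to-group mapping, and function names are illustrative assumptions); it only shows the textbook principle PSI builds on: two parties learn which hashed items they share without revealing the rest of either set.

```python
# Toy DH-style private set intersection (PSI) -- NOT Apple's protocol,
# just the underlying idea: both sides blind their hashed items with
# secret exponents, so only matching items become comparable.

import hashlib
import math
import secrets

P = 2**127 - 1  # a Mersenne prime; fine for a demo, not for production


def to_group(item: str) -> int:
    """Hash an item to an integer in [2, P-1]."""
    digest = hashlib.sha256(item.encode()).digest()
    return int.from_bytes(digest, "big") % (P - 2) + 2


def random_exponent() -> int:
    """Pick a blinding exponent invertible modulo P-1."""
    while True:
        r = secrets.randbelow(P - 1)
        if r > 1 and math.gcd(r, P - 1) == 1:
            return r


def psi(client_items, server_items):
    """Return the items both sides share, computed on blinded values only."""
    r, s = random_exponent(), random_exponent()
    # Client blinds its hashed items and "sends" them to the server.
    client_blinded = [pow(to_group(x), r, P) for x in client_items]
    # Server re-blinds the client's values and blinds its own set.
    double_blinded = [pow(c, s, P) for c in client_blinded]
    server_blinded = {pow(to_group(y), s, P) for y in server_items}
    # Client strips its own blinding: ((h^r)^s)^(r^-1) = h^s mod P,
    # which is directly comparable with the server's blinded set.
    r_inv = pow(r, -1, P - 1)
    client_final = [pow(d, r_inv, P) for d in double_blinded]
    return [x for x, v in zip(client_items, client_final) if v in server_blinded]


print(psi(["cat.jpg", "dog.jpg"], ["dog.jpg", "fish.jpg"]))  # ['dog.jpg']
```

Neither side ever sees the other’s raw hashes, only the blinded values; the exponent arithmetic works modulo P-1 by Fermat’s little theorem, which is why the blinding cancels out only for items both sets contain.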

What the tool essentially does is extract a hash of the image (a digital fingerprint of sorts) and search for it among the hashes of the 200,000 known child sexual abuse images it was trained on. This opens the door to targeting and framing an innocent person: an attacker could send across a harmless image deliberately crafted to share a hash with a known CSAM image.
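The collision concern is easiest to see with a toy perceptual hash. The sketch below is not Apple’s NeuralHash (the function names and the tiny “images” are invented for illustration); it uses a simplified difference hash, which encodes brightness gradients, to show how two visually unrelated images can nonetheless produce identical hashes and trigger a match.

```python
# Toy "difference hash" (dHash) demo -- NOT Apple's NeuralHash.
# Perceptual hashes map visually similar images to nearby hashes,
# which is exactly why a crafted benign image can collide with a
# flagged one.

def dhash(pixels):
    """One bit per horizontally adjacent pixel pair:
    1 if the left pixel is brighter than the right."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return int("".join(map(str, bits)), 2)


def matches_known_csam(image_pixels, known_hashes, max_distance=2):
    """Flag the image if its hash is within max_distance bits
    (Hamming distance) of any hash in the database."""
    h = dhash(image_pixels)
    return any(bin(h ^ known).count("1") <= max_distance
               for known in known_hashes)


# Two very different 2x3 grayscale "images" with the same brightness
# gradients hash identically -- a toy version of Green's collision worry.
image_a = [[10, 20, 30], [10, 20, 30]]
image_b = [[100, 200, 250], [5, 90, 255]]
assert dhash(image_a) == dhash(image_b)
```

Real perceptual hashes use far more bits, but the structural point stands: unlike a cryptographic hash, similarity-preserving hashing makes deliberate collisions a realistic attack surface.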

Imagine someone sends you a perfectly harmless political media file that you share with a friend. But that file shares a hash with some known child porn file? pic.twitter.com/YdNVB0xfCA

— Matthew Green (@matthew_d_green) August 5, 2021

Moreover, Green pointed out that an average user can’t review these stored hashes.

Needless to say, child protection groups such as NCMEC have welcomed the decision. “Apple’s expanded protection for children is a game-changer,” stated John Clark, president and CEO of NCMEC. “With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”

Technology can help curb such abuse, but if not carefully implemented, it can also impinge on the very rights it was developed to protect. The real question is: at what cost? That remains to be seen.

Apple’s App Store Woes Continue

Apple has once again fallen short of user expectations, this time by promoting a set of autonomous sensory meridian response (ASMR) apps. Users, and especially developers, are disgruntled that these apps, which have little to no functionality, are being unfairly promoted over millions of others.

Apple promoting these slime apps again.

A few of them have $10+ weekly subscriptions.

One of them doesn’t even do anything.

— Beau Nouvelle (@BeauNouvelle) August 4, 2021

What’s more, these apps charge a disproportionately high amount of money for the bare minimum they are designed to do, so much so that they have earned a reputation bad enough to be categorized as scam apps. For instance, one such app charges AU$13 ($9.60) a week, taking the yearly charge to AU$676 ($499.35), as a developer pointed out.

This is ironic considering Apple’s App Store Review Guidelines clearly state the following under the business section: “While pricing is up to you, we won’t distribute apps and in-app purchase items that are clear rip-offs. We’ll reject expensive apps that try to cheat users with irrationally high prices.”

In 2020, Apple rejected or removed:

  • 48,000 apps for using hidden or undocumented features
  • 150,000 apps for being misleading, spam, or illegitimate imitators
  • 215,000 apps for privacy violations, such as collecting more user data than necessary
  • 95,000 apps for bait-and-switch fraud

Still, it is one thing for scam apps to bypass Apple’s review policies, but quite another for the company to actively promote them. The recent batch was promoted on the Australian App Store as ‘Slime Relaxations,’ a collection of ASMR apps.

“Did you know that slime is therapeutic to play with? Kneading, poking, and stretching that multi-colored goo can elicit instant brain and body tingles, and a sense of bliss,” the company’s App Store preview page reads. “The downside? This gets messy real fast - counterproductive to your goal of relaxation. That’s where these apps come in.”

Twitter user and developer Simeon scathingly ripped into the functionality of some of the apps. He pointed out how these apps are free for the first few days but start charging later on, and how Apple’s in-app payment system is designed to keep charging users even after the free period has ended.

I don’t care how many times Apple informs you that you can cancel your subscription, they’ve built an automated system to allow you to easily pre-consent to shady apps taking your money after a time delay


— Simeon (@twolivesleft) August 5, 2021

The Washington Post reported in June that scam apps make up 2% of the 1,000 top apps on the App Store and have generated approximately $48 million during their time there. And since Apple takes a 30% cut of in-app purchases from apps generating over $1 million, the company has a lot to gain.

Apple has also previously failed to catch a scam app related to the Apple Watch; it only came to light when developer Kosta Eleftheriou pointed it out.

Closing Thoughts

The child safety features and the promotion of scam apps are only two of the controversies surrounding Apple this week. The company was also accused of workplace sexism by Ashley Gjøvik, a senior engineering program manager at Apple. Privacy worries, scam apps, and sexism claims in a single week seem like a new low for the world’s most valuable company.

It is hard to imagine that Apple did not foresee the controversy its new child protection features would stir, and even harder to believe that the company unknowingly promoted the said slime apps. After all, the company has always positioned itself as a champion of user privacy and of putting the consumer first.

Apple, please do better. Clean up this mess. Protect users. And let me, as a developer, feel like I can bet on myself again. I know a lot of other honest developers share this sentiment, too.

— Kosta Eleftheriou (@keleftheriou) February 2, 2021

However, it remains to be seen when the company will resume acting like it.

Let us know if you enjoyed reading this news on LinkedIn, Twitter, or Facebook. We would love to hear from you!