Sometimes it feels like Apple really doesn’t understand security and privacy. Or maybe they understand it perfectly well and just don’t want you to understand it, because they want control over their users. Since 2019, Apple has been pushing the idea that Apple products are great at privacy and security. You’ve probably seen plenty of commercials, ads, and billboards trying to convince you of that. In real life, Apple often behaves in exactly the opposite way from what it preaches in those ads.

TL;DR: The only way you can be sure that software does what it says it does is if you can audit the source code and reproduce the results. Everything else is just lip service.

If you’re not that gullible, you might want to dig deeper and audit the code to verify that it’s doing what the developer says it does. This “scientific method” of reproducing a claim is the only way you can really trust software. Try that with Apple, and you’ll be met with a “don’t worry about it.” It’s kind of like when someone on Facebook posts a conspiracy theory, you ask for their sources, and they reply with, “Bro, trust me.”

If that sounds kind of shady to you, it should.
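To make the “reproduce the claim” idea concrete, here’s a minimal sketch in Swift of what verifying a reproducible build looks like: you build the software yourself from the source code you audited, then check that your binary is byte-for-byte identical to the one the vendor ships. The file paths and names below are hypothetical placeholders, not anything Apple actually publishes.

```swift
import Foundation
import CryptoKit

// Compute a SHA-256 fingerprint of a file on disk.
func sha256Hex(of path: String) throws -> String {
    let data = try Data(contentsOf: URL(fileURLWithPath: path))
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

do {
    // Hypothetical paths: the binary the vendor ships vs. the one you rebuilt
    // yourself from the source code you audited.
    let shipped = try sha256Hex(of: "/tmp/VendorApp-shipped.bin")
    let rebuilt = try sha256Hex(of: "/tmp/VendorApp-rebuilt.bin")
    print(shipped == rebuilt
        ? "Build reproduced: the shipped binary matches the audited source."
        : "Mismatch: you're back to \"trust me, bro\".")
} catch {
    print("Could not read a binary: \(error)")
}
```

If the hashes match, the shipped binary provably came from the code you audited, and no trust is required. With Apple’s closed, signed binaries, you can’t do this at all.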

Apple has, for example, tried to sneak code into its software that bypasses your VPN: MacOS Big Sur(veillance) bypasses Firewall/VPN to tell Apple what programs you run on your computer – PIA VPN Blog (privateinternetaccess.com). That was back in November of 2020, and Apple only removed that backdoor feature after they got caught (see: Apple removes feature that allowed its apps to bypass MacOS firewalls and VPNs | ZDNet).
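As a toy illustration (not Apple’s actual implementation), here’s roughly why an OS-level exclusion list defeats a user-installed firewall: if the operating system consults its own allowlist before handing a connection to your filter, traffic from excluded processes is never inspected at all. The bundle identifier and rules below are made up for the sketch.

```swift
// Toy model of a user-installed "block everything" firewall sitting behind
// an OS-level exclusion list the user never sees.
struct Connection {
    let bundleID: String      // identifier of the process making the connection
    let destination: String
}

// Hypothetical allowlist baked into the OS, invisible to the user.
let osExclusionList: Set<String> = ["com.example.vendor-daemon"]

// The rule the user thinks is protecting them: block everything.
func userFirewallAllows(_ connection: Connection) -> Bool {
    return false
}

func osDeliversTraffic(_ connection: Connection) -> Bool {
    if osExclusionList.contains(connection.bundleID) {
        return true                           // the user's firewall never even sees this
    }
    return userFirewallAllows(connection)     // everything else goes through the filter
}

let phoneHome = Connection(bundleID: "com.example.vendor-daemon",
                           destination: "telemetry.example.com")
print(osDeliversTraffic(phoneHome))           // true: "blocked" by nothing, visible to no one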

Now Apple is also going to start scanning your conversations and content for things that Apple disagrees with. See: Apple’s Plan to “Think Different” About Encryption Opens a Backdoor to Your Private Life | Electronic Frontier Foundation (eff.org). It’s not really a backdoor, though, since Apple owns the keys to both ends of its “end-to-end encryption.” End-to-end encryption doesn’t mean much when someone else owns both ends. That’s like letting your country’s government keep keys to everybody’s house. Don’t worry about it, we’ll keep you secure. *wink* *wink*
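Here’s a minimal CryptoKit sketch of why “end-to-end” only means something when the endpoints alone hold the private keys: Alice and Bob can each derive the same shared secret from the other’s public key, but anyone holding an escrowed copy of either private key can derive exactly the same secret. The escrow scenario is hypothetical and purely for illustration.

```swift
import Foundation
import CryptoKit

// Alice and Bob each generate a private key that should never leave their device.
let alice = Curve25519.KeyAgreement.PrivateKey()
let bob   = Curve25519.KeyAgreement.PrivateKey()

func secretBytes(_ secret: SharedSecret) -> Data {
    secret.withUnsafeBytes { Data($0) }
}

do {
    // Genuine end-to-end: only the two endpoints can derive this shared secret.
    let aliceSide = try alice.sharedSecretFromKeyAgreement(with: bob.publicKey)
    let bobSide   = try bob.sharedSecretFromKeyAgreement(with: alice.publicKey)
    print(secretBytes(aliceSide) == secretBytes(bobSide))    // true

    // Hypothetical escrow: the vendor kept a copy of Bob's private key.
    let vendorCopy = try Curve25519.KeyAgreement.PrivateKey(rawRepresentation: bob.rawRepresentation)
    let vendorSide = try vendorCopy.sharedSecretFromKeyAgreement(with: alice.publicKey)
    print(secretBytes(vendorSide) == secretBytes(aliceSide)) // also true: the "end" is not yours alone
} catch {
    print("Key agreement failed: \(error)")
}
```

The math doesn’t care who holds the key. Whoever controls an “end” can read everything that end can read.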

Privacy and security and end-to-end encryption are really just marketing terms when it comes to Apple products.

It’s framed as fighting child abuse, which is certainly a goal we can all agree with. But when you think about what else Apple’s ability to scan your content could be used for, the potential for future societal abuse becomes much clearer.

This video goes into a lot of detail about how Apple actually shares your data with 3rd parties.

Here’s another video with more detail about how your iPhone is nowhere near private.

John Gruber’s Daring Fireball explains Apple’s Child Safety features, which will scan private content, in more detail. It covers how Apple intends to scan for and recognize the actual naughty stuff, and on a technical level it does seem okay. The first big problem is that you HAVE TO trust that Apple does what they say they do; you can’t verify it at all. The second problem is that this sets a precedent for expanding the feature to scan for, and enforce, whatever Apple wants.
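For a rough sense of the mechanism, here’s a deliberately simplified sketch of on-device hash matching. Apple’s real system uses a perceptual hash (NeuralHash) plus cryptographic set-intersection techniques rather than a plain SHA-256 lookup, but the core idea is the same: your device fingerprints your photos and compares them against a database you can neither inspect nor audit. The database entry below is a placeholder.

```swift
import Foundation
import CryptoKit

// Fingerprints supplied by the vendor; you can't see what they correspond to.
let opaqueDatabase: Set<String> = [
    "fingerprint-supplied-by-vendor"   // placeholder value
]

// Fingerprint a photo on the device.
func fingerprint(_ photo: Data) -> String {
    SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
}

// Report whether the photo matches anything in the opaque database.
func scan(_ photo: Data) -> Bool {
    opaqueDatabase.contains(fingerprint(photo))
}

let photo = Data("stand-in for real image bytes".utf8)
print(scan(photo))   // false here, but nothing in the mechanism limits what the database may contain
```

Notice that nothing in the code constrains what goes into the database. Only policy does, and policy can change.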

Only trust software that doesn’t require trust.

Apple’s Craig Federighi tried to do some damage control in a recent interview as well, but again… you can’t just video call Joanna Stern at the Wall Street Journal, deliver a bunch of marketing talk, and expect people to trust you.

The only way we as users can actually trust that a company’s software does what the company says it does… especially a company with a long track record of reality-distortion-field lies… is if anybody can independently verify what the software does. Craig Federighi says they don’t do the bad stuff they very well could do, but you have to trust that what he says is true. It’s just another “trust me, bro” style response.

If Apple really cares about security and privacy, they should prove it and let you prove it yourself too. Remove the requirement for trust, and you will gain full trust.