Hacker Times | new | past | comments | ask | show | jobs | submit | login

It's really quite unfortunate how much of Apple software is designed around "privacy is when you trust Apple" :/


Exactly. What does privacy even mean when your entire digital existence is owned by and visible to a single entity?


Some of it is self-serving, and some is explainable by the deep and pervasive tension between security/privacy/autonomy and usability.


I generally do not impute malice to it, but in many cases it is the lazy way out.


Ultimately, whoever controls operating system updates always has control over your privacy. Even if Apple did offer perfect privacy, there's no reason an update couldn't completely change that.

Bluntly: if you don't trust your OS vendor, then you can't use OS updates. There are people in this category but it's a lot of work.

Much easier to trust your OS vendor (at least to this extent).


> Ultimately, whoever controls operating system updates always has control over your privacy

This is not true. FLOSS (and reproducible builds) allows the community to verify the code and significantly (though not fully) reduce the trust required in the OS vendor.
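The reproducible-builds part of that argument boils down to something simple: if a build is bit-for-bit reproducible, anyone can rebuild the OS from the published source and check that it matches the shipped binary by comparing hashes. A minimal sketch of the comparison step (the byte strings here are hypothetical stand-ins for real build artifacts):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hash a build artifact so independent builds can be compared."""
    return hashlib.sha256(data).hexdigest()

# With reproducible builds, the vendor's shipped binary and a binary
# rebuilt from the published source by anyone in the community must be
# byte-identical -- so their hashes must match exactly.
vendor_binary = b"\x7fELF...identical bytes..."      # hypothetical shipped artifact
community_rebuild = b"\x7fELF...identical bytes..."  # hypothetical local rebuild

assert sha256_of(vendor_binary) == sha256_of(community_rebuild)
print("builds match")
```

A single mismatched hash from any independent rebuilder is evidence the shipped binary doesn't correspond to the published source, which is why this reduces (but doesn't eliminate) the trust placed in the vendor.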


Not really. They are moving into homomorphic encryption, where the entire query and its processing are encrypted and Apple has no knowledge of what you actually requested.


It's completely unclear how much they're moving into homomorphic encryption. The only resource I'm able to find about it is an announcement from 30 July saying that they can now do caller ID lookup using homomorphic encryption, and they've announced an SDK that developers can use to leverage it. But the announcement is so vague that it's entirely unclear how much this can actually be used for practical workloads. And the idea that they're going to go all in on homomorphic encryption is speculative based on what Apple has revealed so far.

That's notable, as we're discussing a case where Apple said they would do something, and then not only didn't do it, but went out of their way to pretend that they never said they would.


Also, the homomorphic encryption is a requirement for third-party caller ID providers, not Apple themselves. Apple's first-party "Contact Photos" caller ID feature operates primarily on the "trust Apple" security model AFAIK.


I'm not aware of any other company of Apple's size (or anywhere approaching it) that has been as committed to privacy tech. Of course they are not perfect and sometimes get it wrong, but they constantly release new technologies that further our privacy. Who else does it better?


It comes down to what you identify as privacy. Apple is committed to not giving your data to any other company and keeping it protected in their ecosystem. They'll sell access to you for ads, but only expose your cohort to the advertiser.

From that lens, Google is also committed to never giving your personal data (think Gmail content, Maps behaviour, pins, etc.) to other companies and keeping it all in their ecosystem, for themselves only. Your data is their key advantage, the base of the ad empire, and they won't let another company run away with it.

If we call Apple privacy focused, Google also fits the bill; the question comes down to whether we see Apple or Google as part of our intimate circle, within our private life. I assume you do for Apple but not for Google.


There is no serious person who could think that Google is a privacy-focused company. Their entire business is founded on knowing everything about their users. It's an ad company. They need user data to function, and they will never release tech that compromises their business. Just look at the direction of ad blocking and Chrome to see where they are headed.


The Apple side is similar: their entire current business is to middleman your relationship with other companies. You buy Apple products, purchase and subscribe to apps and services from the App Store, use iCloud, etc.

They need you in their ecosystem, the same way Google needs you in theirs.

And I totally agree with you: I wouldn't call Google privacy focused, and I don't call Apple privacy focused either, even as they market it harder than anyone else.


Google is a privacy antagonist. Apple is privacy focused because it suits their business. Apple has been privacy focused for years and has built several technologies to prove it. It's not hollow marketing to build privacy software.


I don't define "privacy" as "only a single company has access to all my stuff", so to me Apple's claims are just marketing. I'd buy an argument about good security and some protection against other companies, just not "privacy".


Google is a "privacy antagonist" with an Open Source OS you can build locally and modify to your heart's content? And Apple's been privacy focused, suing security researchers for copyright violation when they try to analyze iOS?

Methinks you're holding a double standard. Compared to Android and Linux, Apple's "promise" is no better than the one Microsoft offers Bitlocker customers.


These companies shouldn't be graded on a curve. Everyone knows Microsoft is crap for privacy. But Apple has their reality distortion field, and it's important to show people that their privacy promises are BS.


Okay, but from an evolutionary sense, which company should we be supporting? The one company that is somewhat moving towards privacy, or the ten others that don't give a shit? Which one should survive? Would you like to see companies copy Apple's privacy approach or Facebook's "dumb fucks" approach?


I was unaware there exists a fully homomorphic encryption scheme that has the right trade-offs between security and computational effort to make this economically viable for even moderate to small workloads.

I’ve always thought it was either far too time or far too space intensive to be practical.

Do you have sources on this, either from Apple or academic papers of the scheme they’re planning on using?


They posted about this recently [0][1]. They are using homomorphic encryption in iOS 18 for Live Caller ID Lookup.

[0] https://www.swift.org/blog/announcing-swift-homomorphic-encr...

[1] https://qht.co/item?id=41111129


I've posted about this above a little after you did. Reading the article, I'm unable to determine whether or not this has any practical utility outside of niche applications or if it has the potential to be broadly useful. Has anyone reviewed the SDK that can render an opinion?


Homomorphic encryption is broadly useful and in fact should be ubiquitous for remote computation that would otherwise leak private data (not to comment specifically on Apple's implementation). They did open-source it, though, which suggests they want others to follow.
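For anyone wondering what "the server computes on data it can't read" looks like concretely, here's a toy sketch using Paillier, a classic additively homomorphic scheme. This is not Apple's actual construction (their library reportedly uses a lattice-based scheme) and the parameters are demo-sized, but it shows the core trick: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so an untrusted party can do the addition without ever seeing the inputs.

```python
import math
import random

# Tiny Paillier keypair (demo-sized primes; real deployments use
# moduli thousands of bits long).
p, q = 1789, 1861
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)  # valid because gcd(lam, n) == 1 for these primes

def encrypt(m: int) -> int:
    """Enc(m) = (n+1)^m * r^n mod n^2, with random r coprime to n."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Dec(c) = L(c^lam mod n^2) * mu mod n, where L(x) = (x-1)//n."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# The "server" multiplies ciphertexts without ever seeing 20 or 22:
a, b = encrypt(20), encrypt(22)
total = (a * b) % n2          # homomorphic addition of the plaintexts
print(decrypt(total))         # -> 42
```

Paillier only supports addition; fully homomorphic schemes (which support arbitrary computation) are what make things like private lookups possible, at a much higher computational cost.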


Can you point to other ways this is used or is intended to be used?


It's useful for situations that would otherwise be illegal, so the tradeoffs are less relevant.


Following through on a public privacy promise does not require R&D.


[flagged]


> You have to trust Apple and if you don't then buy another computer.

No, I don't have to trust Apple or anyone absolutely, and I can and do use a Mac, as the least bad option for my purposes, without fully trusting Apple. You're presenting a false dichotomy. Trust is a matter of degree, not all or nothing. In general I trust my mother (except her advice, which I often ignore), but if she insisted that she had to install cameras in my home for my "safety", I would start to have serious doubts about her.

> Because they have the ability at any point to push software to your computer that compromises your privacy and security in almost undetectable ways. And short of them having an entirely open-source OS that will always be the case.

1) They can't push software to my computer, because I've blocked the software update mechanism with Little Snitch and only install when I'm ready.

2) Open source is not a panacea, as demonstrated by the XZ Utils backdoor.

Moreover, closed source does not render users helpless. Closed source software can be reverse engineered; indeed, I've done that myself with macOS many times. The behavior of closed source software can also be observed in various ways with various tools, for example, the aforementioned Little Snitch. And even when macOS bypassed Little Snitch one time a few years ago, that was detected by developers. The whole world is watching Apple closely, so the company can't simply get away with whatever they want undetected.

> The whole point of their privacy stance is to protect you from companies like Google and rogue governments who we already know can't be fully trusted

We already know that Apple can't be fully trusted.

We also know that Apple actually caters to "rogue" governments such as China, by removing apps from the App Store at the Chinese government's request, as well as handing over control of iCloud servers to China, and reportedly Apple source code too, for inspection.


> 1) They can't push software to my computer, because I've blocked the software update mechanism with Little Snitch

They can because they (Apple) run in kernel space and Little Snitch no longer does/can. So LS, god love it, only works because Apple lets it. If Apple wanted to they could push updates to your OS without LS knowing about it.


> They can because they (Apple) run in kernel space and Little Snitch no longer does/can. So LS, god love it, only works because Apple lets it.

I already addressed this later in my comment: "And even when macOS bypassed Little Snitch one time a few years ago, that was detected by developers. The whole world is watching Apple closely, so the company can't simply get away with whatever they want undetected."

> If Apple wanted to they could push updates to your OS without LS knowing about it.

They could release an update that allows later updates to bypass Little Snitch, but they can't magically make that ability retroactive and get it on a Mac that doesn't currently have that update installed.


>They could release an update that allows later updates to bypass Little Snitch

Or the ability could be there already without us knowing.

But fortunately we can still run our own routers/switches and see outgoing traffic. If you configure LS to block everything then you could confirm it with your network gear.
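As a concrete sketch of the "confirm it with your network gear" step: run a capture on the router and watch for anything the Mac emits while Little Snitch is set to deny-all. The interface name and LAN address below are hypothetical placeholders.

```shell
# Capture on the router's LAN interface. Any packet sourced from the
# Mac (hypothetically at 192.168.1.50) while the host firewall is in
# deny-all mode would indicate traffic bypassing Little Snitch.
tcpdump -ni eth0 'src host 192.168.1.50 and not arp'
```

This shows destinations, timing, and volumes even when the payloads themselves are encrypted.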


> But fortunately we can still run our own routers/switches and see outgoing traffic.

Which is useless if they simply encrypt the data before sending it over SSL.

Then you will never know what they are sending to their servers.


>Which is useless if they simply encrypt the data before sending it over SSL.

Not entirely useless: you'd still know they were sending something, and it would be proof they could bypass Little Snitch.


> Or the ability could be there already without us knowing.

An empirically baseless conspiracy theory.

> But fortunately we can still run our own routers/switches and see outgoing traffic. If you configure LS to block everything then you could confirm it with your network gear.

That was my point. Apple isn't actually capable of completely avoiding detection.

Anyway, just as I don't trust Apple absolutely, I don't distrust Apple absolutely either. They do some user-hostile things, which are richly deserving of criticism, but I prefer to engage with facts and evidence rather than idle speculation and paranoia.


>An empirically baseless conspiracy theory.

You misunderstand, I think. I don't think it's likely; it would be easy to catch them. I also don't distrust Apple completely. I was (pedantically) replying to your comment that "They can't push software to my computer, because I've blocked the software update mechanism with Little Snitch". "Can't" is too strong. What I'm saying is that they could, as in, they have the ability to do so. They can't time travel, because that's impossible. But they can establish network connections, because they control the firmware and kernel, which exceeds your control via Little Snitch.


> I was (pedantically) replying to your comment

Please don't be a pedant. It's not appreciated, and it only makes the conversation worse.


If you worry about Apple: if they thought you were a true adversary, they could install some code just for you. You really have no idea.


I really have no idea what you're saying.


And critically, the suspicion is that they handed off the keys for the iMessage encryption used inside China. And if they did it for the money in China, would they really not do it for the right US government inquiry?

Google's better in that they chose not to do business in China the same way. For now.


> by removing apps from the App Store at the Chinese government's request

So they follow the law in the market they’re operating in? Holy shit, how dare they.


I'm not sure why you're replying to me with a sarcastic response when I was merely disputing the notion that "The whole point of their privacy stance is to protect you from companies like Google and rogue governments".

On the other hand, the issue is not as simple as Apple following the law, because Apple chose to lock down iPhone and set themselves up as the sole gatekeeper for app installations. On the Mac, which allows distribution from outside the App Store, it's not possible to completely ban apps in this way.


It depends; they cried much harder about the DMA in the EU and still aren't really fully compliant. In China, they were quite okay with throwing citizens under the bus without much complaint.


Because they assume the EU would be weaker and more malleable to US influence than China and the CCP.

US companies keep trying to impose the local US rules, policies, and way of thinking of their HQ whenever they operate abroad, especially in Europe, but do a total 180 when they operate in China.


> Because they assume the EU would be weaker and more malleable to US influence than China and the CCP.

That doesn't really fit, because the course of action would be the same in both cases. If the CCP says they have to ban apps then enable side loading or third party stores so customers in China can install the banned apps from another source but the company can still feign compliance. This is the same thing the EU says they have to do anyway, except that it actually defeats the onerous regulation in China whereas in the EU the regulation is intended to benefit rather than oppress the user and Apple objects to it because they're the one wearing the boot.

The explanation that fits is that they care about their own control (and so fight the EU) but don't care as much about China oppressing their customers (and so bend the knee there).

> US companies keep trying to impose local US rules, policies and way of thinking of their HQ whenever they operate abroad, especially in Europe, but totally do an 180 when they operate in China.

In general company leaders should try to have morals and use their influence to push back against rules from governments trying to harm their people. Obviously publicly-traded companies have a perverse tendency to do the opposite of this once they're controlled by Wall St rather than the original founders.


Following the laws of a market you choose to operate in does not absolve you of your actions - just ask IBM.


> Because they have the ability at any point to push software to your computer that compromises your privacy and security in almost undetectable ways. And short of them having an entirely open-source OS that will always be the case.

We know this isn't true, because the opposite is how we know this is happening: even when the OS is closed source, someone can still inspect its network traffic or get it running in a virtual machine and see what it's doing, and then publish the result if they find something untoward.

But once your data has been exfiltrated onto their servers, you no longer have any way of auditing what they do with it after that. So the only way to have any assurance of your own privacy is to call any behavior that exfiltrates your data a violation of trust.

It might be easier to audit open source code than closed, and maybe then we should be demanding open source operating systems, but there is still a large difference between "this is difficult but possible and so someone in the world could do it and let everyone else know" and "there is no mechanism for members of the public to validate their claims at all".


Trusting Apple not to break the computer I paid them $1k+ for is very different from trusting Apple to not hoover up and sell my personal data (or let it get stolen).


> Because they have the ability at any point to push software to your computer that compromises your privacy and security in almost undetectable ways. And short of them having an entirely open-source OS that will always be the case.

I don't believe such things would be undetectable. If that were the case, how would we have discovered the OCSP server?

Another point I don't understand: if it were made open source, wouldn't Apple have the same level of control? After all, they'd still be the ones building the binaries. As another commenter noted, the XZ backdoor proved that even open-source software shouldn't be blindly trusted.


> If that were the case, how would we have discovered the OCSP server

Because Apple is not trying to obfuscate anything.

If they did you would never have discovered it.


I regularly discover things that Apple tries to obfuscate.


Such as?


Unreleased products for example


You don't think anyone would have ever analyzed the network traffic coming out of a MacBook with Wireshark?


> and rogue governments

As opposed to the orderly and just governments, that are so clearly-defined and well trusted?

Or are you more highlighting the unlawful conduct from businesses like NSO Group that are so shamefully allowed to persist without government scrutiny? It's hard for me to tell, since both of them have hacked iPhones to obtain privileged access and will likely do it again. From where I'm standing, it looks like their "promise" has more holes in it than Swiss cheese.



