Apple software SVP Craig Federighi says that the Private Cloud Compute (PCC) servers used for Apple Intelligence features are deliberately basic, and for good reason. Stripping them down is one of the many decisions the company made to ensure that its AI cloud servers form a “hermetically sealed privacy bubble” with your iPhone.
According to Federighi, Apple Intelligence features follow a three-stage hierarchy. Most processing is done on the device itself, without sending data to servers. If external processing power is needed, Apple’s own servers are the next step. And if those can’t handle the request, users are asked for permission to use ChatGPT. A rough sketch of that routing logic follows.
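To make the hierarchy concrete, here is a minimal Swift sketch of that routing decision. The `ProcessingTier` enum and the `routeRequest` helper are hypothetical illustrations, not Apple’s actual API.

```swift
import Foundation

// Hypothetical model of the three-stage hierarchy described above.
enum ProcessingTier {
    case onDevice            // preferred: data never leaves the iPhone
    case privateCloudCompute // Apple's stateless PCC servers
    case chatGPT             // third party, used only with explicit consent
}

func routeRequest(fitsOnDevice: Bool,
                  pccCanHandle: Bool,
                  userApprovedChatGPT: Bool) -> ProcessingTier? {
    if fitsOnDevice { return .onDevice }
    if pccCanHandle { return .privateCloudCompute }
    // ChatGPT is never used silently; the user must opt in per request.
    return userApprovedChatGPT ? .chatGPT : nil
}
```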
We’ve also discussed the five safeguards Apple applies to its own servers, which include the “extraordinary step” of verifiable transparency.
Federighi mentions in an interview with Wired that part of the privacy protection comes from making the PCC servers extremely basic (even if the chips inside them are not). It’s difficult to envision a data center server with no hard drives or SSDs for storing user data, but that’s precisely what Apple has built: the PCC servers are as bare-bones as possible. Most notably, they have no persistent storage, meaning there is no drive that can retain processed data after the fact.
Additional safeguards ensure that no data can survive a reboot. Each server incorporates Apple’s dedicated hardware key manager, the Secure Enclave, and randomly generates the encryption key for the file system at every boot. Once a PCC server reboots, no data is retained, and as an extra precaution, the entire system volume becomes cryptographically unrecoverable. At that point, the server can only start fresh with a new encryption key.
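Conceptually, this works like a file system whose key exists only in memory for the life of a boot. Below is a minimal CryptoKit sketch of the idea, assuming a single per-boot symmetric key; the real mechanism lives in the Secure Enclave and covers the entire volume, not individual blobs.

```swift
import CryptoKit
import Foundation

// Sketch of an ephemeral per-boot key, assuming one key per volume.
struct EphemeralVolume {
    // Generated fresh at every boot; never written to storage.
    private let bootKey = SymmetricKey(size: .bits256)

    func seal(_ plaintext: Data) throws -> Data {
        // `combined` is non-nil with the default 12-byte nonce.
        try AES.GCM.seal(plaintext, using: bootKey).combined!
    }

    func open(_ ciphertext: Data) throws -> Data {
        try AES.GCM.open(AES.GCM.SealedBox(combined: ciphertext),
                         using: bootKey)
    }
}

// After a reboot, `bootKey` is gone, so every sealed blob becomes
// cryptographically unrecoverable — the property Federighi describes.
```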
One longstanding weakness of iCloud was that data was encrypted, but not end-to-end: either Apple or a hacker who gained access to Apple’s servers could read it. Apple has been gradually rolling out E2E encryption for more and more iCloud data (though you need to enable it), but that model posed a problem for the PCC servers.
Federighi says the unique challenge of doing large language model inference in the cloud was that the data had to be readable by the server for inference, yet it still needed to stay within a privacy bubble with the phone. End-to-end encryption, where the server knows nothing, wasn’t applicable here, so Apple devised another way to achieve a similar level of security. The solution was twofold. First, all the usual server tools that could give an administrator (or a hacker) access to your data, such as load balancers and data loggers, sit outside the protected area and can’t decrypt it. Second, there is no persistent storage: once the response is sent back to your phone, the data is deleted and can never be recovered.
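Here is a rough Swift sketch of that boundary, with hypothetical names (`IncomingRequest`, `handleInsidePCC`, and a pre-shared `nodeKey` standing in for whatever key exchange Apple actually performs). The point is that intermediaries only ever handle ciphertext, and the decrypted prompt lives solely inside the request handler.

```swift
import CryptoKit
import Foundation

// The load balancer and loggers only ever see this opaque payload.
struct IncomingRequest {
    let ciphertext: Data
}

func handleInsidePCC(_ request: IncomingRequest,
                     nodeKey: SymmetricKey,
                     runInference: (Data) -> Data) throws -> Data {
    // Decryption happens only inside the protected boundary.
    let prompt = try AES.GCM.open(
        AES.GCM.SealedBox(combined: request.ciphertext), using: nodeKey)
    let response = runInference(prompt)
    // Encrypt the reply for the phone. Local copies simply go out of
    // scope; with no persistent storage, nothing survives the request.
    return try AES.GCM.seal(response, using: nodeKey).combined!
}
```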
Apple mentioned that the “extraordinary step” referred to earlier is that anyone can check that the system works as the company claims. All production PCC server builds are publicly available for inspection, so people not affiliated with Apple can verify that PCC is doing (and not doing) what the company claims, and that everything is implemented correctly. Every PCC server image is recorded in a cryptographic attestation log, essentially an indelible record of signed claims, and each entry includes a URL for downloading that individual build. PCC is designed so that Apple can’t put a server into production without logging it. Beyond transparency, the log acts as a crucial enforcement mechanism against bad actors who might set up rogue PCC nodes and redirect traffic: if a server build hasn’t been logged, iPhones won’t send it any Apple Intelligence queries or data. Apple says this is an unprecedented step for any cloud company. A simplified sketch of that device-side check appears below.
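The device-side refusal might look something like the following sketch, assuming the phone holds a set of build measurements fetched from the transparency log; `shouldTrust` and `publishedMeasurements` are illustrative names, not Apple’s real attestation protocol.

```swift
import CryptoKit
import Foundation

// The server proves which software image it booted; the phone hashes
// that attestation and refuses any build it can't find in the
// append-only transparency log.
func shouldTrust(serverAttestation: Data,
                 publishedMeasurements: Set<Data>) -> Bool {
    let measurement = Data(SHA256.hash(data: serverAttestation))
    return publishedMeasurements.contains(measurement)
}

// If this returns false, the iPhone never sends the query at all,
// so an unlogged (rogue) PCC node receives no user data.
```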
“Creating the trust model where your device will refuse to issue a request to a server unless the signature of all the software the server is running has been published to a transparency log was certainly one of the most unique elements of the solution—and totally critical to the trust model,” Federighi said.
While the interview mainly recapitulated already known information, the iPhone 16 launch naturally means more people will be paying attention.
Photo by Alexander Huayhua on Unsplash.