Because of this I appreciated that the article stressed the importance of open-source firmware and called out companies like Intel for user-hostile approaches.
I don't think open source is really all that important here, and the article is misleading in that respect; in fact, if we don't hold the keys, all that open source does is let us see exactly how we're being locked out. (Of course, there's also the Ken Thompson Hack --- inspecting the binary is the only real way to determine whether anything unusual is present.)
This is a related article which everyone interested in this topic should read:
The real problem here IMHO is that just as the OS is horrendously complex and cannot be left to the user to configure properly, trusted hardware is broken too! For a very recent example, see https://tpm.fail . The TPMs in that disclosure had passed rigorous CC EAL4+ and FIPS 140-2 certifications, so even the certifications fail to catch the very flaws they are supposed to test for. (I haven't studied the matter in enough detail to determine whether the testing regime itself is weak, whether there's a Boeing/FAA level of corruption, or something in between.)
For another recent example, Java Card has been shown to be weak in certain use cases.
The big problem with these hardware flaws is that you end up putting absolute faith in them, since they form the TCB (trusted computing base). When the hardware is eventually (and almost certainly) found to contain a flaw, the entire rest of your security tends to fall apart, and generally you cannot repair it without replacing the device entirely. This might be ok (you will eventually replace the hardware through normal obsolescence) or not (embedded [in your body] medical devices).
What I like about the HP ProLiant platform is that the TPM chip is an add-on card.
For example, forcing automatic updates with no postponement because users simply won't apply security updates otherwise. Taking freedom away from users? Yeah, kinda. Increasing security? Absolutely.
I think Google offers a good compromise with the write-protect screw in their Chromebooks, which allows overriding the bootloader.
One of the drivers of all this technology is that big business wants to be able to sue providers when everything breaks and avoid the “your users did that” excuse.
OSFC 2019: https://osfc.io/archive
PSEC 2019: https://platformsecuritysummit.com/2019/videos
FOSDEM 2020: https://fosdem.org/2020/schedule/track/open_source_firmware_...
> Most people remember when the FBI wanted a backdoor into iPhones and Tim Cook refused.
Fewer people remember when the CCP wanted a backdoor into iCloud and Apple said yes, for fear of losing its largest growth market. Presently, all iCloud users in China are hosted by a Chinese company, in China, to which the CCP by law has full access.
There is also a claim that Apple cancelled a plan to end-to-end encrypt iCloud phone backups, which would have meant that Apple (and thus the FBI) could no longer decrypt your phone's backup. All iPhones signed in to iCloud back up to iCloud by default, with encryption that Apple/FBI can read. The claim is that the FBI specifically requested that Apple not further secure these backups, since that would break their current methods of easily accessing them.
Note that your backups generally contain your complete iMessage and SMS history, including all attached images and videos.
There’s little practical point in denying a backdoor into a seized phone if Apple/FBI already have a copy of everything on it that they can decrypt and read because your backups are encrypted to Apple, not you.
(They do this because many, many people lose their device, and have forgotten their Apple ID password which then needs to be reset. The naïve solution to this is to simply encrypt the backups to an Apple key, which is always decryptable by Apple as required for restore after resetting your password via alternate ID verification. Unfortunately it puts every single user of iCloud backup at risk of bulk surveillance.)
The whole thing is rather performative, like they are showing off for the market. Indeed, people well respected (such as the author of TFA) are repeating this meme without any of the associated caveats.
Make sure you tell your family and friends to disable iCloud, and more specifically iCloud device backups, if they value their privacy.
You can use iTunes backup to your own computer. But then that computer must be securely full-disk encrypted, and it's you who holds the password and/or key.
There is also the idevicebackup2 command-line tool from the libimobiledevice suite for doing those backups on Linux, macOS, or Windows hosts. It supports the iOS backup protocols, including the native backup encryption.
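As a hedged sketch of that workflow (the command names come from libimobiledevice; the backup directory is just an example path, and a paired iOS device must be connected over USB):

```shell
# Turn on the device's native backup encryption, so the backup on disk
# is encrypted with a password you choose (it will prompt for one).
idevicebackup2 encryption on

# Take a full local backup into an example directory of your choosing.
idevicebackup2 backup --full ~/ios-backups
```

Subsequent runs without --full do incremental backups into the same directory, which is what makes this a reasonable stand-in for iCloud backup on your own hardware.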
Having a secure iOS device is possible: disable iCloud backups (and probably Messages and Photos sync too), and use a custom numeric PIN of more than 10 digits (many more if you are using it as T9 input for a passphrase). The Secure Enclave's KDF is only configured for ~100ms of stretching IIRC, but that's sufficient with a long PIN.
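A back-of-the-envelope check of that claim (the ~100 ms stretching figure is from the comment above; the function is mine, and it assumes the attacker has to pay the enclave's stretching cost on every guess rather than attacking offline):

```python
# Worst-case brute-force time for an all-numeric PIN, assuming each
# guess costs the Secure Enclave's ~100 ms of key stretching.
def brute_force_years(digits: int, seconds_per_guess: float = 0.1) -> float:
    guesses = 10 ** digits                     # size of the numeric keyspace
    seconds = guesses * seconds_per_guess      # worst case: try them all
    return seconds / (3600 * 24 * 365)

# A 6-digit PIN falls in about a day; a 10+ digit PIN takes decades.
print(round(brute_force_years(10), 1))  # ~31.7 years
```

This is why the long PIN matters more than the stretching time: each extra digit multiplies the attacker's work by 10, while the 100 ms figure is fixed by the enclave's configuration.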
I trust Apple's implementation more than I trust myself not to accidentally leave my disk unlocked in public. And I do mostly trust myself not to leave my disk unlocked in public.
(I know this is a digression from the topic but we're nicely tucked away at the bottom of the comments)
I don't use laptops anymore. Just host machines for VMs. And I always shut them all down, whenever I leave the building. I have a commercial UPS, with a deadman circuit. So I just cabled that line with the CAT6, and there are motorcycle-style kill switches in key places (desk, bathroom, kitchen and bed).