
Apple security and privacy

I have seen this exact behavior going back to El Capitan (and posted about it here). On Mojave, I don't have a permanent Little Snitch rule, but I seem to recall one or two instances of attempted connections to radarsubmissions, which I blocked. So perhaps Apple going behind our backs is less of an issue now, but I believe it still happens (at least in this regard).
Apple has a long history of knowing better than its users. "You may think that you don't want to share this info, but we know that, deep down, you really do."
 


I have removed SMS 2FA to the greatest extent possible from all my sensitive online accounts. A combination of iOS-based soft tokens, Verisign hardware tokens, and Google Voice have allowed me to be almost completely decoupled from SMS. Ironically, the one site that insists on clinging to SMS, and hence is extremely vulnerable to SIM swapping scams, belongs to my mobile phone provider!
It doesn't work for me. I tried adding my Google Voice number to PayPal as a contact number and I get the dreaded red triangle with an "!". No explanation as to why. In the past I have had trouble using Google Voice as a contact number with other companies.

Key Bank doesn't seem to offer 2FA. BofA does and I was able to change the 2FA-SMS to my Google Voice number.
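For readers curious what an iOS-based "soft token" actually computes: most of them implement standard TOTP (RFC 6238), which derives a short-lived code from a shared secret and the current time, with no SMS involved. A minimal sketch (the function names are mine; the test values below are the published RFC 6238 SHA-1 vectors):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over a big-endian counter, then dynamic truncation."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # low nibble of last byte picks the window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP evaluated over the current 30-second time window."""
    if for_time is None:
        for_time = time.time()
    return hotp(secret, int(for_time) // step, digits)

# With the RFC 6238 test secret, time 59 yields the documented code:
print(totp(b"12345678901234567890", for_time=59, digits=8))  # → 94287082
```

The point of the exercise: the secret never leaves the phone, so there is nothing for a SIM-swap attacker to intercept.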
 


Bombich Software just posted version 5.1.8 of their excellent backup/cloning app - Carbon Copy Cloner. In the release notes I spotted this interesting (disturbing?) tidbit:
Errors related to accessing Apple's super-secret /private/var/db/fpsd/dvp folder are now suppressed. No application is allowed to see the contents of this folder, you won't even be able to see this folder in the Finder. Apple has not documented the purpose nor content of this super-secret folder.
 


Ric Ford

MacInTouch
Bombich Software just posted version 5.1.8 of their excellent backup/cloning app - Carbon Copy Cloner. In the release notes I spotted this interesting (disturbing?) tidbit:
Yes, I mentioned that on the MacInTouch Home Page this morning. Just now, I changed ACL for the relevant folder in Mojave (it wasn't present in Sierra), using Path Finder, and looked at it from the command line, so at least it's accessible with effort, though it's still mysterious.
Bash:
pwd
/private/var/db/fpsd
ls -l@
total 0
drwx------  2 _fpsd  _fpsd   64 Jan 22 10:23 SC Info
drwxrwxrwx@ 5 _fpsd  _fpsd  160 Dec 28 17:47 adi
    com.apple.metadata:com_apple_backup_excludeItem     61
cd adi
ls -l@
total 8
-rw-rw-rw-@ 1 _fpsd  _fpsd     0 Nov 14 19:07 adi-gb.lck
    com.apple.lastuseddate#PS      16
-rw-rw-rw-@ 1 _fpsd  _fpsd  3365 Nov 28 22:21 adi.pb
    com.apple.FinderInfo      32
    com.apple.lastuseddate#PS      16
-rw-rw-rw-@ 1 _fpsd  _fpsd     0 Nov 14 19:13 adi.pb.lck
    com.apple.lastuseddate#PS      16
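The name/size pairs that `ls -l@` prints under each entry can also be read programmatically. A minimal sketch (Python's `os.listxattr`/`os.getxattr` exist on macOS and Linux; the function name is mine, and reading this particular folder still requires loosening its permissions first, as described above):

```python
import os

def xattr_sizes(path: str) -> dict:
    """Return {attribute_name: value_size} for a file, mirroring the
    name/size pairs that `ls -l@` prints beneath each entry."""
    sizes = {}
    for name in os.listxattr(path, follow_symlinks=False):
        value = os.getxattr(path, name, follow_symlinks=False)
        sizes[name] = len(value)
    return sizes
```

Given access, `xattr_sizes("/private/var/db/fpsd/adi/adi.pb")` would report the same `com.apple.FinderInfo` and `com.apple.lastuseddate#PS` attributes shown in the listing.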
 


Yes, I mentioned that on the MacInTouch Home Page this morning. Just now, I changed ACL for the relevant folder in Mojave (it wasn't present in Sierra), using Path Finder, and looked at it from the command line, so at least it's accessible with effort, though it's still mysterious.
...
Web searching 'adi.pb mac' yields this tidbit: /Users/Shared/adi - galvanist

Maybe artifacts of the App Store and iBooks, formerly stored in /Users/Shared/adi?
 


Yes, I mentioned that on the MacInTouch Home Page this morning. Just now, I changed ACL for the relevant folder in Mojave (it wasn't present in Sierra), using Path Finder, and looked at it from the command line, so at least it's accessible with effort, though it's still mysterious.
...
/private/var/db/fpsd
I wonder if fps stands for FairPlay Streaming?
_fpsd user?
... So here's my take upon further investigation: the _fpsd daemon is directly linked to iTunes, as part of a core framework.
 


Bombich Software just posted version 5.1.8 of their excellent backup/cloning app - Carbon Copy Cloner. In the release notes I spotted this interesting (disturbing?) tidbit:
Yes, I mentioned that on the MacInTouch Home Page this morning.
That caught my eye yesterday, so I went to the link to see it, but the quoted note is not there: no mention of anything "secret", and the only mention of "private" is under v.5.1.4 ("Errors related to being unable to access Apple-private folders in the user home folder are now suppressed."). So now even talking about this "super-secret" folder is "suppressed"?
 



So now even talking about this "super-secret" folder is "suppressed"?
I think you misunderstand what they're referring to as being "suppressed": the errors about the folder are suppressed, not the fact that Apple-private folders exist.

In other words: "We're not going to report errors about accesses to a folder that does nothing but return errors anytime we try to read it, because it's not meant to be read from anyway."
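The logic being described — treat access failures on known Apple-private paths as expected and skip them quietly rather than logging an error — might look something like this sketch (entirely hypothetical; the path list and function are illustrative, not CCC's actual code):

```python
import os

# Paths a backup tool might treat as expected-unreadable.
# /private/var/db/fpsd/dvp is the folder CCC's release note mentions.
EXPECTED_UNREADABLE = {"/private/var/db/fpsd/dvp"}

def scan_dir(path: str) -> list:
    """List a directory for backup, silently skipping folders that are
    expected to deny access instead of surfacing an error for them."""
    try:
        return os.listdir(path)
    except (PermissionError, FileNotFoundError):
        if path in EXPECTED_UNREADABLE:
            return []   # suppressed: this folder always refuses reads
        raise           # anything else is a real error worth reporting
```

Unexpected failures still propagate; only the known-hopeless path is quieted.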
 


Ric Ford

MacInTouch
Thunderbolt-based attacks seem extremely dangerous to me, as they happen invisibly at a very low level in completely different areas from other attacks we've learned to guard against.

We covered the issue on MacInTouch at least as early as Jan. 2015 when the Thunderstrike bootkit for OS X was created as a proof-of-concept by Trammell Hudson, and he followed up with more details, as Apple responded with a partial patch:
Trammell Hudson said:
Thunderstrike 2

Thunderstrike 2 was partially fixed as part of Mac EFI Security Update 2015-001 in June 2015 (VU#577140, CVE-2015-3692). Systems running OS X 10.10.4 and higher are no longer trivially vulnerable. "Partially patched" means that the Protected Range Registers (PRR) are locked before the S3 script is run, which prevents a Darth Venamis "Dark Jedi" attack on the boot flash, but it does not fix several other issues. We have disclosed the following remaining problems to Apple's product firmware security team:
  • Option ROMs are still loaded during normal boots. Snare's 2012 Option ROM attack still works and could be extended to other Option ROMs built into the system's WiFi or video card.
  • The S3 resume bootscript is still unprotected and executes code from RAM, allowing code injection into PEI from either a software attack or an Option ROM attack.
  • BIOS_CNTL.BLE and BIOS_CNTL.SMM_SWP are not locked, allowing the NVRAM to be corrupted (and leaving open possible attacks on the PRR).
  • TSEGMB is left unlocked, allowing DMA into SMRAM and code injection into SMM.
  • And there are some other issues that we haven't fully tested.
Even with the patch applied, it is still possible for Thunderstrike 2 to write to Option ROMs and continue to spread to new machines, to persist in the S3 resume script until the next full reboot, to hide in System Management mode and evade detection from software scanning, and to "irrevocably" brick systems by corrupting NVRAM.
I posted another follow-up in Sept. 2017 about additional EFI / firmware / Thunderbolt attacks, including a link to this Ars Technica report:
Dan Goodin said:
An alarming number of patched Macs remain vulnerable to stealthy firmware hacks
An alarming number of Macs remain vulnerable to known exploits that completely undermine their security and are almost impossible to detect or fix even after receiving all security updates available from Apple, a comprehensive study [PDF] released Friday has concluded.
The exposure results from known vulnerabilities that remain in the Extensible Firmware Interface, or EFI, which is the software located on a computer motherboard that runs first when a Mac is turned on. EFI identifies what hardware components are available, starts those components up, and hands them over to the operating system. Over the past few years, Apple has released updates that patch a host of critical EFI vulnerabilities exploited by attacks known as Thunderstrike and ThunderStrike 2, as well as a recently disclosed CIA attack tool known as Sonic Screwdriver.
Fast-forward to the present, and we have new research and articles about "Thunderclap:"
Bleeping Computer said:
Thunderclap Vulnerabilities Allow Attacks Using Thunderbolt Peripherals
Modern computers that come with a Thunderbolt interface and run Windows, macOS, Linux, or FreeBSD are vulnerable to a range of Direct Memory Access (DMA) attacks performed by potential attackers with physical access to the device using malicious peripherals.

The security flaws collectively dubbed "Thunderclap" can be exploited to run arbitrary code at the highest possible privilege level on the system to potentially access or steal "passwords, banking logins, encryption keys, private files, browsing," and other sensitive data present on machines that come with ports for peripherals that use PCI Express (PCIe) and USB-C ports.

The Thunderclap vulnerabilities provide potential attackers with direct and unlimited access to a machine's memory because these ports come with low-level and very privileged direct memory access (DMA), which supplies any malicious peripherals with much more privileges than regular USB devices.
...
The Thunderclap platform used to discover the vulnerabilities, as well as a number of proof-of-concept attacks, is described in the "Thunderclap: Exploring Vulnerabilities in Operating System IOMMU Protection via DMA from Untrustworthy Peripherals" paper available here in PDF format, authored by A. Theodore Markettos, Colin Rothwell, Brett F. Gutstein, Allison Pearce, Peter G. Neumann, Simon W. Moore, and Robert N. M. Watson.

More information on some of the experiments behind the vulnerabilities and the Thunderclap hardware platform is provided in Colin Rothwell's "Exploitation from malicious PCI Express peripherals" PhD thesis that can be found here.

The Thunderclap paper was presented at the NDSS 2019 conference that takes place in San Diego, USA, between 24-27 February 2019.
 


Per this commentary in the DuoLab documentation referred to above:
As a general rule of thumb, always run the latest version of macOS (10.12 at the time of writing). While Apple has historically provided security updates for at least the two previous OS versions, they typically do not contain all the security patches that ship for the current OS version and this seems increasingly true for EFI firmware updates.
Does it not make sense to urge Apple to take a more comprehensive, broad-based security support position, in order to fend off the proliferation of these vulnerabilities?
(I can imagine the expense involved would be somewhat trivial....)
 


Ric Ford

MacInTouch
This might have a bit to do with Apple's hard push on customers to adopt 2FA (though there may be other commercial reasons, as well):
Malcolm Owen said:
'Celebgate' iCloud hack perpetrator sentenced to 34 months in prison

... Brannan acquired access in a variety of ways, including simply answering security questions in forgotten password systems that could be easily answered by reviewing the victim's other public social media accounts. He also used phishing to acquire credentials, using email addresses that looked as if they were legitimate security accounts from Apple.

The teacher is not the only person to receive punishment for "Celebgate," as last year George Garofano was sentenced to eight months in prison followed by three years of supervised release for accessing more than 200 iCloud accounts. In 2017, Edward Majerczyk received nine months in prison and paid $5,700 to one victim for hacking into more than 300 iCloud and Gmail accounts.
 


Ric Ford

MacInTouch
It's "funny" how Apple doesn't want to restrict itself from viewing or analyzing customers' private data....
EFF said:
Fix It Already: Nine Steps That Companies Should Take To Protect You

Apple should let users encrypt their iCloud backups.

Data on your Apple device is encrypted so that no one but you can access it, and that’s great for user privacy. But when data is backed up to iCloud, it’s encrypted so that Apple, and not just the user, can access it. That makes those backups vulnerable to government requests, third-party hacking, and disclosure by Apple employees. Apple should let users protect themselves and choose truly encrypted iCloud backups.
 


Lately my iPhone Xs has started failing to completely log me in using Face ID. I tap the screen to wake up the phone; slide up from the bottom, and get the Face ID icon, and then a checkmark, indicating successful recognition.... And that's all that happens. I'm left at that screen, instead of being logged in. So I swipe up again, and go through the whole process. This repeats forever. I'm forced to power off my phone in order to get the keypad to log in.

A few minutes ago, I wiped out the Face ID entries and rescanned. I also scanned an additional face. For the past 5 minutes, things seem to be working again.

So I'm posting to share what may be the solution, but also I'm curious to see if anyone else has had this happen to them. If the malady comes back, I'll post again....
 


Lately my iPhone Xs has started failing to completely log me in using Face ID. (snip).
Well, I spoke too soon. This morning I got the same thing (a circle with checkmark) but failure to log in. As usual, I was forced to power off in order to get the keypad. This time I went back to Preferences and disabled the "Attention aware" features. So far, it's working... knock on wood. I hope this update isn't also premature.

All in the hope this helps...
 


It's "funny" how Apple doesn't want to restrict itself from viewing customers' private data....
I think it helps clarify things to put some context around where they are leaving the window open to view the data. This is specifically about backups, not all iCloud data. There are classes of iCloud data to which Apple does not have access ("end to end").
Apple said:
iCloud security overview
...
End-to-end encryption requires that you have two-factor authentication turned on for your Apple ID. ... End-to-end encryption provides the highest level of data security. Your data is protected with a key derived from information unique to your device, combined with your device passcode, which only you know. No one else can access or read this data.
These features and their data are transmitted and stored in iCloud using end-to-end encryption:
  • Home data
  • Health data (requires iOS 12 or later)
  • iCloud Keychain (includes all of your saved accounts and passwords)
  • Payment information
  • QuickType Keyboard learned vocabulary (requires iOS 11 or later)
  • Screen Time
  • Siri information
  • Wi-Fi network information
To access your data on a new device, you might have to enter the passcode for an existing or former device.

Messages in iCloud also uses end-to-end encryption. If you have iCloud Backup turned on, your backup includes a copy of the key protecting your Messages. This ensures you can recover your Messages if you lose access to iCloud Keychain and your trusted devices. When you turn off iCloud Backup, a new key is generated on your device to protect future messages and isn't stored by Apple.
If you force "end to end" encryption of backups, that probably pragmatically forces iCloud Keychain on also. Apple is making choices about end-to-end encryption - they aren't totally avoiding it in a "funny" way.

In the context of backups, I think the discussion goes off into the swamp if we don't also take into account the backup context and policy the users are running in. For example, if >50% are doing zero backups, then creating a policy which will make them hesitate to turn backups on has another quite significant impact. Similarly, if this iCloud backup is their one and only copy, then losing the password would mean the user (or heirs) would lose access to the only copy. The nominal backup settings with iTunes as a local backup don't have encryption on, either. Same thing with Time Machine....

At the top of every Password Manager or "secure email" FAQ list I've seen is an entry that essentially is "Can you help me open this if I forgot the passcode?" issue. For a product with a 1+ million user base, even 1% falling into this catch-22 turns into tons of service time. Completely locked up accounts also become a source of storage constipation. They tend to get abandoned in place.
 


Ric Ford

MacInTouch
EFF said:
Apple should let users encrypt their iCloud backups
Data on your Apple device is encrypted so that no one but you can access it, and that’s great for user privacy. But when data is backed up to iCloud, it’s encrypted so that Apple, and not just the user, can access it. That makes those backups vulnerable to government requests, third-party hacking, and disclosure by Apple employees. Apple should let users protect themselves and choose truly encrypted iCloud backups.
There are classes of iCloud data where Apple does not have access ("end to end")
... if >50% are doing zero backups, then creating a policy which will make them hesitate to turn backups on has another quite significant impact
...The nominal backup settings with iTunes as a local backup aren't with encryption on either. Same thing with Time Machine.
Here again is the typical Apple issue: Apple doesn't let customers choose whether to use end-to-end encryption for iCloud backups, the way they can choose encryption for Time Machine. Apple forces its million-plus customers to all jump through the same hoops the same way, with no individual options.

I appreciate your highlighting end-to-end encryption vs. Apple-accessible encryption. But you cannot have any end-to-end encryption at all, according to Apple, unless you enable 2FA, which has a raft of problems that belies any "ease of use" claim, as we've documented in past discussions here.
At the top of every Password Manager or "secure email" FAQ list I've seen is an entry that essentially is "Can you help me open this if I forgot the passcode?" issue.
Yes, and all those Password Managers cannot help you if you forget the password. Is the support load dragging those companies under water?

In fact, Apple's security mechanisms for many things are anything but easy to use, for example failing to identify security realms and requiring customers to re-enter the same Apple ID credentials repeatedly to accomplish simple tasks (e.g. checking online orders) after already authenticating (sometimes in multiple ways).

Lastly, note that there is no end-to-end encryption of your contacts, calendars, Safari browsing history, photos, voice memos, notes, email, etc., even if you jump through all the 2FA hoops. But a variety of third-party (non-Apple) backup apps and cloud services do provide end-to-end encryption.
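The end-to-end property under discussion comes down to who can derive the decryption key. A minimal sketch of a user-held key, derived from a passcode only the user knows (illustrative only; per the iCloud security overview quoted earlier, Apple's real scheme also mixes in device-unique material, and the function name here is mine):

```python
import hashlib

def derive_backup_key(passcode: str, salt: bytes) -> bytes:
    """Derive a 32-byte encryption key from a secret only the user knows.
    A provider storing only the ciphertext and salt cannot recompute the
    key without the passcode, so it cannot decrypt the backup."""
    # PBKDF2-HMAC-SHA256 with a high iteration count to slow guessing.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)
```

This is also the flip side discussed above: with a key derived only from the user's secret, a forgotten passcode means the backup is unrecoverable, by anyone.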
 


Ric Ford

MacInTouch
Apple still has an unpatched "high-severity" flaw in macOS, which has just been disclosed by Google Project Zero researchers, though Apple was notified three months ago:
The Hacker News said:
Google Discloses Unpatched 'High-Severity' Flaw in Apple macOS Kernel
Cybersecurity researcher at Google's Project Zero division has publicly disclosed details and proof-of-concept exploit of a high-severity security vulnerability in macOS operating system after Apple failed to release a patch within 90 days of being notified.
...
In addition to this vulnerability, the Project Zero researcher also found a similar copy-on-write behavior bypass (CVE-2019-6208) by abusing another function on macOS operating system.

The researcher notified Apple of both the vulnerabilities back in November 2018 and the company privately acknowledged the existence of the flaws. While Apple patched the latter flaw in January 2019 update, the former flaw remains unaddressed even after the 90-day deadline Project Zero provides the affected companies.

So, the researchers made the vulnerability public with a "high severity" label and also released the proof-of-concept code that demonstrates the bug, which remains unpatched at the time of writing.
AppleInsider said:
'High severity' kernel security flaw found in macOS file system
Google's Project Zero has revealed a "high severity" flaw in the macOS kernel, one which could allow an attacker to make changes to a file without macOS being informed, an issue that could lead to infected files being opened and allowing more malicious activities to become available to abuse.
...
According to Project Zero's procedures, it discovered the flaw and advised Apple of its existence in November 2018, at the same time as issuing a 90-day deadline to fix the flaw before it is published, to encourage the development of a fix. Proof-of-concept code for the flaw and an explanation has since been posted by the team.

An update on February 28 advises the team has been in contact with Apple about the issue, but no fix for the problem has been released. "Apple are intending to resolve the issue in a future release, and we're working together to assess the options for a patch," team researcher Ben Hawkes notes.

This is not the first time Project Zero has taken aim at Apple's software. In February, it was revealed Apple had patched two flaws in iOS found by the team that were used to hack iPhones and iPads in the wild, while in 2015, three zero-day exploits in Mac OS X were disclosed.
 


Thunderbolt-based attacks seem extremely dangerous to me, as they happen invisibly at a very low level in completely different areas from other attacks we've learned to guard against.
This is a DMA attack, which as a class of attack isn't new at all. Remember back in 2002 when Quinn "The Eskimo" won First Place at the MacHax Best Hack contest, for FireStarter:
TidBITS said:
First prize ... went to Quinn "The Eskimo" for FireStarter, a program that draws a QuickTime burning flames effect and then propagates the effect to any Mac you plug in via FireWire, all without requiring any special software on the target Mac. That Mac can even be booted from an installation CD, or be waiting at the Login window.
Since then there have been many FireWire DMA attacks. I think macOS has since added protections to stop them.
 


From a report on The Register:
Shaun Nichols said:
Level up Mac security, and say game over to malware? System alerts plus Apple game engine equals antivirus package

Infosec guru Patrick Wardle has found a novel way to attempt to detect and stop malware and vulnerability exploits on Macs – using Apple's own game engine.

The boss of Objective-See, a maker of in-our-opinion must-have macOS security tools, explained at this year's RSA Conference, held this week in San Francisco, how he and his colleagues developed a series of rules to potentially identify malicious software and network intruders, then plugged them into Apple's macOS games development toolkit to create a capable Mac security suite.

The idea, said Wardle, was to develop a package that would address what he saw as serious deficiencies in the Mac security space, both technically and culturally, from insecure Safari browser code to Apple fans convinced their computers can't fall victim to software nasties.
 


Every time I boot my MacBook Air (7,2), Early 2015, running Mojave 10.14.2, in addition to the normal login password, I now get a dialog box with "Identity Servicesd wants to use MacBook Pro login keychain".

There is no longer Keychain repair. This is an ongoing problem for others for many years - I did a search. I am not going to toss out my old keychain.

Sorry to sound strident. I guess Apple is testing my loyalty and patience with yet another problem. If anyone has found a solution, you should shout it loudly from the mountaintops. Lots of users with the same problems, and no apparent solutions. Thanks to all at MacInTouch.
 


Ric Ford

MacInTouch
Every time I boot my MacBook Air (7,2), Early 2015, running Mojave 10.14.2, in addition to the normal login password, I now get a dialog box with "Identity Servicesd wants to use MacBook Pro login keychain". There is no longer Keychain repair. This is an ongoing problem for others for many years - I did a search.
This may be a different problem, but I had a similar frustrating issue after recovering from an unwanted/unexpected Mojave update to an older macOS system, and this helped:
Apple Support said:
If your Mac keeps asking for the login keychain password
The password of your macOS user account might not match the password of your login keychain. Either create a new login keychain or update it with your new password.
 


Every time I boot my MacBook Air (7,2), Early 2015, running Mojave 10.14.2, in addition to the normal login password, I now get a dialog box with "Identity Servicesd wants to use MacBook Pro login keychain".
Sorry, not quite clear on the problem. At what point does the new dialog appear? Is it while the machine is still booting?
 


Sorry, not quite clear on the problem. At what point does the new dialog appear? Is it while the machine is still booting?
The dialog appears at the final stages of booting. After a verbose boot, the progress bar appears, then the desktop is painted, I have Activity monitor and Calendar set to launch at startup, and then the dialog appears. My passwords for logging in and unlocking Keychain access are identical. Thanks.
 


Ric Ford

MacInTouch
Two current Apple zero-days surfaced, at least one of which Apple knows about but hasn't fixed yet:
Zero Day Initiative (ZDI) said:
Pwn2Own Vancouver 2019: Day One Results
The contest started with the team of Fluoroacetate (Amat Cama and Richard Zhu) targeting the Apple Safari web browser. They successfully exploited the browser and escaped the sandbox by using an integer overflow in the browser and a heap overflow to escape the sandbox.... The demonstration earned them $55,000 USD and 5 points towards Master of Pwn.

... The final entry in Day One saw the phoenhex & qwerty team (@_niklasb @qwertyoruiopz and @bkth_) targeting Apple Safari with a kernel elevation. They demonstrated a complete system compromise. By browsing to their website, they triggered a JIT bug followed by a heap out-of-bounds (OOB) read – used twice – then pivoted from root to kernel via a Time-of-Check-Time-of-Use (TOCTOU) bug. Unfortunately, it was only a partial win since Apple already knew of one of the bugs used in the demo. Still, they earned themselves $45,000 USD and 4 points towards Master of Pwn.
 


Ric Ford

MacInTouch
Here's an interesting trick, attacking iOS via Chrome:
Bleeping Computer said:
Malvertising Campaign Abused Chrome to Hijack 500 Million iOS User Sessions
Multiple massive malvertising attacks which targeted iOS users from the U.S. and multiple European Union countries for almost a week used a Chrome for iOS vulnerability to bypass the browser's built-in pop-up blocker.

... The April campaign used landing pages hosted on .world domains and made use of pop-ups to hijack user sessions and redirect the victims to malicious landing pages.

... the malicious payloads used by the eGobbler group during these massive malvertising campaigns exploited a yet unpatched vulnerability in the Chrome for iOS web browser — the Chrome team is investigating the issue after Confiant reported the flaw on April 11.

... This campaign was designed by the eGobbler malvertising group to specifically target iOS users but it was not the first. During November 2018, Confiant monitored another campaign run by the ScamClub group which managed to hijack roughly 300 million iOS user sessions and redirected them all to adult content and gift card scams.
 


Ric Ford

MacInTouch
Another iPhone compromise via Apple "enterprise" mechanisms:
Ars Technica said:
Well-funded surveillance operation infected both iOS and Android devices
Researchers recently discovered a well-funded mobile phone surveillance operation that was capable of surreptitiously stealing a variety of data from phones running both the iOS and Android operating systems. Researchers believe the malware is so-called "lawful intercept" software sold to law-enforcement and governments.

... The iOS version was installed using the Apple Developer Enterprise program, which allows organizations to distribute in-house apps to employees or members without using the iOS App Store. The apps masqueraded as mobile carrier assistance apps that instructed users to “keep the app installed on your device and stay under Wi-Fi coverage to be contacted by one of our operators.”

The Apple-issued digital certificate used to distribute the malicious iOS apps was associated with an Italy-based company called Connexxa S.R.L. Infected iPhones also connected to domains and IP addresses belonging to Connexxa.

... Lookout researchers reported their findings to Apple, and the company revoked the enterprise certificate. The revocation has the effect of preventing the apps from being installed on new iPhones and stopping them from running on infected devices.
 


Another iPhone compromise via Apple "enterprise" mechanisms:
Ars Technica said:
Well-funded surveillance operation infected both iOS and Android devices
Researchers recently discovered a well-funded mobile phone surveillance operation that was capable of surreptitiously stealing a variety of data from phones running both the iOS and Android operating systems. Researchers believe the malware is so-called "lawful intercept" software sold to law-enforcement and governments.

... The iOS version was installed using the Apple Developer Enterprise program, which allows organizations to distribute in-house apps to employees or members without using the iOS App Store. The apps masqueraded as mobile carrier assistance apps that instructed users to “keep the app installed on your device and stay under Wi-Fi coverage to be contacted by one of our operators.”
This seems to imply that everything is working as intended. Are you suggesting Apple eliminate the enterprise installation option?
 


Ric Ford

MacInTouch
This seems to imply that everything is working as intended. Are you suggesting Apple eliminate the enterprise installation option?
Not at all, Robbie, the only thing I'm "suggesting" by posting information about a dangerous attack vector is that people should be wary and aware of it, so that they aren't losing valuable / private / personal / sensitive data to criminals or others without realizing it through this sort of trick.
 


Not at all, Robbie, the only thing I'm "suggesting" by posting this information about a dangerous attack vector is that people should be wary and aware of it, so that they aren't losing valuable / private / personal / sensitive data to criminals or others without realizing it through this sort of trick.
I imagine that Apple will implement new controls on how Enterprise apps are distributed and installed. Maybe require Enterprise apps to only be installed on devices that are MDM managed? I am trying to think of a valid use case for Enterprise apps for the general public, that is not against Apple's TOS.
 


I imagine that Apple will implement new controls on how Enterprise apps are distributed and installed. Maybe require Enterprise apps to only be installed on devices that are MDM managed? I am trying to think of a valid use case for Enterprise apps for the general public, that is not against Apple's TOS.
Maybe, but many corporations would complain bitterly, especially in light of the fact that BYOD (bring your own device) is a very popular thing these days. Employees want the ability to install corporate apps on their personal phones, and they don't want to give their employer total control over that phone in order to do this.

I don't think the existing system (of allowing an installed profile to allow enterprise-signed apps) should be changed, but Apple will have to be vigilant about companies that abuse the service, revoking the certificates from those that do.
 


Not at all, Robbie, the only thing I'm "suggesting" by posting information about a dangerous attack vector is that people should be wary and aware of it, so that they aren't losing valuable / private / personal / sensitive data to criminals or others without realizing it through this sort of trick.
Ric, rereading my question, the wording was unnecessarily accusatory in tone - that was not at all my intention and I apologize for not being clearer. Thank you for both the initial post and your response.
I imagine that Apple will implement new controls on how Enterprise apps are distributed and installed. Maybe require Enterprise apps to be installed only on devices that are MDM-managed? I am trying to think of a valid use case for Enterprise apps for the general public that is not against Apple's TOS.
Rob, you may indeed be right, but I think that would be unfortunate. We've used iOS devices in our enterprise for… well, since they first started being usable with cellular, but have only recently put in an MDM system. We also have a lot of BYOD users who use our (single) enterprise app, but aren't in the MDM.

Enterprise distribution currently requires the user to approve the certificates used, as well as agree to install the app. Unfortunately, there are always going to be people who will be susceptible to that problem, but at least by revoking the certificate, Apple can stop it. My admittedly uninformed understanding is that this is not the case with the Androids, unfortunately.
 


Maybe, but many corporations would complain bitterly, especially in light of the fact that BYOD (bring your own device) is a very popular thing these days. Employees want the ability to install corporate apps on their personal phones, and they don't want to give their employer total control over that phone in order to do this.
I don't think the existing system (of allowing an installed profile to allow enterprise-signed apps) should be changed, but Apple will have to be vigilant about companies that abuse the service, revoking the certificates from those that do.
From a legal standpoint (and I help manage devices for enterprise customers), my opinion is that if you have company data on the phone, it must be managed, at least partially. If you (the employee) don't want to have a managed device, then don't expect to do work on it (and don't expect the company to reimburse you for it). At the same time, if a company wants their employees to utilize a BYOD program, they have the obligation to set up management policies that protect the company without putting undue restrictions on their users’ personal use of the devices. If they can't do that, then they should provide their employees with company-owned devices.

Apple has been fine-tuning its MDM specification over the past few years, limiting many of the more restrictive settings and functions to devices that are "supervised". Right now, most supervised devices are enrolled through Apple's Device Enrollment Program (DEP), and only corporate-owned devices can be in DEP. While it is possible to supervise a non-DEP device, you have to have physical access to it, and, reading the tea leaves, that option will be going away very soon. Looking at the current restrictions profile, nearly half the options are listed as "Supervised device only", and that number gets bigger with every update. Most of the restrictions still available to non-supervised devices deal with how iOS handles managed vs. unmanaged data and apps. Set up properly, a BYOD user should not be inconvenienced in their personal use of the phone.
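As an illustration, the managed/unmanaged separation described above is expressed in the Restrictions payload (com.apple.applicationaccess) of a configuration profile. A hypothetical fragment is sketched below; the identifier and UUID are placeholders, and the key names are drawn from Apple's documented Restrictions payload:

```xml
<dict>
    <key>PayloadType</key>
    <string>com.apple.applicationaccess</string>
    <key>PayloadIdentifier</key>
    <string>com.example.restrictions</string>
    <key>PayloadUUID</key>
    <string>7F000000-0000-4000-8000-000000000001</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <!-- Honored even on unsupervised BYOD devices: keep managed
         documents from opening in personal (unmanaged) apps -->
    <key>allowOpenInFromManagedToUnmanaged</key>
    <false/>
    <!-- "Supervised device only": ignored on an unsupervised phone -->
    <key>allowAirDrop</key>
    <false/>
</dict>
```

An MDM server would deliver this inside a signed .mobileconfig profile; on an unsupervised BYOD device, only the managed/unmanaged keys take effect.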

It's unfortunate that the companies that followed the rules will be hurt the most. But the abuse of enterprise certificates was pretty bad for Apple's privacy stance, and they have to be looking to clamp down on unregulated distribution of enterprise apps. It's a pretty big security hole right now.
 


From a legal standpoint (and I help manage devices for enterprise customers), my opinion is that if you have company data on the phone, it must be managed, at least partially. If you (the employee) don't want to have a managed device, then don't expect to do work on it (and don't expect the company to reimburse you for it). At the same time, if a company wants their employees to utilize a BYOD program, they have the obligation to set up management policies that protect the company without putting undue restrictions on their users’ personal use of the devices. If they can't do that, then they should provide their employees with company-owned devices.
I completely agree, but policies like this must be the decision of the corporation managing the phones, not of the phone's manufacturer. Apple cannot and should not start micro-managing the mobile access networks of all their corporate customers.

FWIW, I have an employer-issued (Android) phone and they do not take total control of the phone. They installed a sandbox environment ("Workplace"), which I must sign-in to, in order to access corporate apps and data. If the company decides, they can wipe anything in that sandbox, but that will not touch the rest of the phone. If I install an app outside of the sandbox, then that instance doesn't have access to the VPN needed to access corporate data. This allows me to use common apps (like Outlook) for both corporate and personal mailboxes, and the company can't touch the personal ones.

They have a similar kind of environment for employees using iPhones.

There is no need to give the company total control over the entire device in order to secure company data.
 


I completely agree, but policies like this must be the decision of the corporation managing the phones, not of the phone's manufacturer. Apple cannot and should not start micro-managing the mobile access networks of all their corporate customers. ... There is no need to give the company total control over the entire device in order to secure company data.
I manage one corporation's MDM for Apple devices, and, as I understand it, the MDM implementation lets me control the company data on a device while having no impact on the user's personal data.

For what it's worth, the company is a Google G Suite user. When a device is used, nothing impedes or impacts an employee until they attempt to add their employee Google-hosted email to the Gmail app (or to Apple's Mail app). They are notified that, to do so, they must download, install and approve the relevant profile. If in agreement, they so agree and are prompted through the process. There are advisories as to the specific data management to which the employee is consenting.

So, I can, if need be, revoke a device's access. The device will no longer be able to access/retrieve mail and, when next online, will have any on-device cache of the data and the account credentials removed.

But the user's access to their installed apps or their own accounts (as distinct from the employee account) will remain. So, my existing setup corresponds to our mutual understanding that "there is no need to give the company total control over the entire device in order to secure company data."
 


Ric Ford

MacInTouch
Howard Oakley continues to explore various issues with Apple's security changes in macOS 10.14 and beyond:
The Eclectic Light Co. said:
There’s more to notarization than that
... Currently, it’s relatively easy to distribute malware through a legitimate distributor. What you need to do boils down to:
  • Purchase an unused Developer ID and certificates on the black market, which is cheap and easy.
  • Obtain the source code to the app, or mimic it, inserting your malicious code.
  • Build and sign the hijacked app on any Mac anywhere in the world.
  • Replace the genuine app with your malware, either directly on the server or by redirection.
All of these can be performed remotely, by anyone with fairly minimal knowledge and resources.

Notarization doesn’t prevent such hijacking, but makes it significantly more difficult.

... In just over a month we’ll see the first evidence of what the new system will be, at WWDC 2019. All the indications are that hardening and notarization are just part of major changes which we’ll then spend the coming months and years trying to discover in detail. Only then will we be able to judge whether notarization is snakeoil or a valuable advance.
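On a Mac, you can inspect an app's signing and Gatekeeper/notarization status yourself with Apple's command-line tools. A minimal sketch follows; the application path is a placeholder, and the uname guard simply lets the script run harmlessly on non-Mac systems by printing what it would do:

```shell
#!/bin/bash
APP="${1:-/Applications/Example.app}"   # placeholder path

run() {
  # These tools exist only on macOS; elsewhere, just show the command.
  if [ "$(uname)" = "Darwin" ]; then "$@"; else echo "would run: $*"; fi
}

# Verify the code signature; --strict rejects tampered bundles
run codesign --verify --deep --strict --verbose=2 "$APP"

# Ask Gatekeeper whether it would allow the app to launch; a
# notarized app reports "source=Notarized Developer ID"
run spctl --assess --type execute --verbose "$APP"

# Check for a stapled notarization ticket (requires Xcode tools)
run xcrun stapler validate "$APP"
```

None of these checks would catch the hijacking scenario Oakley describes if the attacker signs with a valid (if stolen or black-market) Developer ID, which is exactly the gap notarization is meant to narrow.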
And he also writes about issues with Apple's "quarantine" mechanisms:
The Eclectic Light Co. said:
Quarantine: Apps
It’s essential to remember that the quarantine flag is an opt-in system, and not one imposed by macOS itself. Any developer, including malware authors, can download files from the Internet without setting the flag on them, and any app on your Mac can change or strip the quarantine flag on any item to which it has write permission. The use of these flags in security is very much a gentleman’s agreement, which is easily broken when software doesn’t behave like a gentleman.
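The quarantine flag Oakley describes is just an extended attribute, com.apple.quarantine, that any process with write access can read or strip. A minimal sketch (the xattr commands are macOS-only and shown as comments; the sample attribute value is illustrative, not taken from a real download):

```shell
#!/bin/bash
# On macOS you can inspect or strip the flag directly:
#   xattr -p com.apple.quarantine ~/Downloads/SomeApp.dmg   # print it
#   xattr -d com.apple.quarantine ~/Downloads/SomeApp.dmg   # remove it

# The attribute's value is a semicolon-separated record:
#   flags(hex);download-timestamp(hex);downloading-agent;UUID
sample='0083;5d0a1b2c;Safari;1A2B3C4D-5E6F-7081-92A3-B4C5D6E7F809'
IFS=';' read -r flags stamp agent uuid <<< "$sample"
echo "downloaded by: $agent"
echo "quarantine flags: 0x$flags"
```

Because the one-line `xattr -d` removes the flag, any app (malicious or not) with write permission to a file can opt it out of Gatekeeper's first-launch check, which is the "gentleman's agreement" Oakley refers to.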
 


Ric Ford

MacInTouch
Not at all, Robbie, the only thing I'm "suggesting" by posting information about a dangerous attack vector is that people should be wary and aware of it, so that they aren't losing valuable / private / personal / sensitive data to criminals or others without realizing it through this sort of trick.
Here's some new perspective from Apple itself about these security problems, responding to a public relations issue:
Apple PR said:
The facts about parental control apps
Over the last year, we became aware that several of these parental control apps were using a highly invasive technology called Mobile Device Management, or MDM. MDM gives a third party control and access over a device and its most sensitive information including user location, app use, email accounts, camera permissions, and browsing history. We started exploring this use of MDM by non-enterprise developers back in early 2017 and updated our guidelines based on that work in mid-2017.

MDM does have legitimate uses. Businesses will sometimes install MDM on enterprise devices to keep better control over proprietary data and hardware. But it is incredibly risky—and a clear violation of App Store policies—for a private, consumer-focused app business to install MDM control over a customer’s device. Beyond the control that the app itself can exert over the user's device, research has shown that MDM profiles could be used by hackers to gain access for malicious purposes.
 


MDM as an attack vector was demonstrated in March 2016:
The Register said:
MDM is the guts of managing enterprise, business, and education-owned Apple devices:
Jamf said:
Apple device management for your business
Learn how you can empower your employees to be more productive with their Apple devices.
Apple PR said:
The facts about parental control apps
MDM does have legitimate uses. Businesses will sometimes install MDM on enterprise devices to keep better control over proprietary data and hardware. But it is incredibly risky . . . Beyond the control that the app itself can exert over the user's device, research has shown that MDM profiles could be used by hackers to gain access for malicious purposes.
Stunning that Apple would characterize its system to manage enterprise systems as "incredibly risky." Can't imagine that's a message they want to send to Megaworldwide, Inc.

A different standard? What's missing in Apple's PR communiqué is an explanation of how the expelled "consumer" parental control apps differ from MDM controls in enterprise, and whether the installation of an MDM "consumer" app gives the app developer control over the device that is (1) not available to, or (2) not explained to, its parental owners.

Apple mentions Verizon Smart Family as offering iPhone parental controls that aren't being shut out of the App Store:
Verizon Wireless said:
What is Verizon Smart Family?
Verizon Smart Family is a service that gives you parental controls to help manage your kids' smartphone* use.

From a single app, you'll get:

Content filtering
Call, text and purchase monitoring and limiting
Contact management
Internet pausing
Location services (Verizon Smart Family Premium)
"Smart Family" is actually rather limited, and, ahem, doesn't block purchase and installation of apps that will be charged to the Verizon bill, though it does allow setting a dollar limit. Would it be possible for a user to work around that limit by purchasing an iTunes Gift Card?

Is this another unsolvable "he said / she said" controversy? Or is enough information available about the "banned" apps and whatever special and specific dangers they posed to evaluate conflicting claims?
 


Stunning that Apple would characterize its system to manage enterprise systems as "incredibly risky." Can't imagine that's a message they want to send to Megaworldwide, Inc. A different standard? What's missing in Apple's PR communique is an explanation of how the expelled "consumer" parental control apps differ from MDM controls in enterprise, and if the installation of an MDM "consumer" app gives the app developer control over the device (1) not available to, or (2) not explained to, its parental owners.
To be fair, Apple didn't say that its "system to manage enterprise systems" is incredibly risky, per se. They said (emphasis mine) that
it is incredibly risky—and a clear violation of App Store policies—for a private, consumer-focused app business to install MDM control over a customer’s device.
I don't think it is unreasonable to raise concerns over how general use of MDM-level controls by consumer apps might open large opportunities to compromise security or privacy, though it certainly is fair to question why Apple is changing its focus now.

FWIW, there is a concise roundup of perspectives on the matter at Michael Tsai's blog:
 


Stunning that Apple would characterize its system to manage enterprise systems as "incredibly risky."
They didn’t. They said it is incredibly risky for a third party app to have MDM on a customer’s device, for exactly the reasons that enterprises using MDM want it: because it gives them an extraordinary degree of access and control.
 

