
malware and security

Ironically, the take-away for me from all this was that anti-virus software remains difficult to distinguish from the threats it is supposed to protect us from. None of these tools would function without being given carte blanche to phone home. In spite of preferences seemingly to the contrary, none of them would simply run when asked; they all insisted on daemons in the background. Is it unreasonable to expect these folks to be sensitive to this?
Yes, for the most part it is unreasonable in today's macOS environment. It's not that these apps need to “phone home” in the traditional sense; rather, they must stay up to date on malware definitions to be effective against ever-changing malware types and variants. If you have privacy concerns, check the developer's policy to see what information, if any, is being passed to them. If they require registration, that's the minimum check to make before use.

Background processes are necessary to provide real-time / on-access protection for newly downloaded or changed files. They also handle all scheduled scanning and definition / app update checks, and they are needed to provide a menu icon.

There have been a couple of requests to the ClamXAV developer to eliminate all background processes whenever Sentry and all scheduled events are disabled. Although such settings are not optimal for full protection, the developer has agreed to provide such an option in a future release for users who demand complete control. This will also eliminate the menu-bar icon, which uses practically no resources at all, but should satisfy such purists.
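For anyone curious which background pieces an anti-malware tool (or any other app) actually installs, launchd jobs are easy to enumerate from Terminal. A minimal sketch; the "clam" pattern below is purely illustrative and may not match the actual job labels ClamXAV uses:

```shell
# List launchd jobs loaded for the current user and filter by a
# vendor string; "clam" is an illustrative pattern, not a guaranteed label.
launchctl list | grep -i clam

# Installed daemon/agent definitions typically live in these folders:
ls /Library/LaunchDaemons /Library/LaunchAgents ~/Library/LaunchAgents
```

The same two commands work for auditing any third-party background software, not just anti-malware tools.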
 


Yes, for the most part it is unreasonable...
I don't think it is unreasonable. I know why developers claim software needs to phone home. I know what information their privacy notices claim to be gathering. What I don’t know is any way to verify any of the above claims.

With Microsoft and its Windows Installers (with bonus secret marketing surveys) and now Google’s example of “let’s just grab this data and feign innocence when it upsets someone” setting industry privacy standards, I am really supposed to trust all this?

I download from Apple, directly from developer sites (after reading reviews), and occasionally, source code from GitHub. Nowhere else. In over thirty years of personal computer use, I have only been “infected” one other time besides the above instance. That was in 1987. Every single “infection” has come from a “trusted source”, i.e. directly from an established developer.

In short, I don’t feel the need for my CPU and I/O bandwidth to be eaten up by constantly running antivirus processes. I have 267 processes and 905 threads (each of which “uses practically no resources at all”). Meanwhile, folks in this forum complain about macOS getting more sluggish with each update. All those processes don’t add up?

I have five persistent iCloud processes (with 2 threads each). I don’t use iCloud. Five Photos processes (with 2 threads each). I wouldn’t let Photos near my photo library. Siri processes, and I don’t even have a microphone. SafeEjectGPU processes, when I have no ejectable GPU, etc. Most of these could be avoided with one-line-of-code tests. How many other more obscurely titled processes do I really not need?
 


I don't think it is unreasonable. I know why developers claim software needs to phone home. I know what information their privacy notices claim to be gathering. What I don’t know is any way to verify any of the above claims.
I understand the frustration. I was narrowly responding to your specific questions about ClamXAV and similar anti-malware apps, not the proliferation of background processes with each new version of macOS. Most of those are easily supported by current hardware and won't noticeably impact the average user, though they are somewhat unnecessary energy consumers. Nor does my answer address your ability to verify privacy policies. That requires research into the company, which is difficult and time-consuming for almost all of us. Some developers have even accidentally violated their own policies due to faulty coding and / or poor QC.

If you really feel you want to control these unused processes, then grab a copy of LaunchControl for $15 and learn how to safely use it.
 


I have five persistent iCloud processes (with 2 threads each). I don’t use iCloud. Five Photos processes (with 2 threads each). I wouldn’t let Photos near my photo library. Siri processes, and I don’t even have a microphone. SafeEjectGPU processes, when I have no ejectable GPU, etc. Most of these could be avoided with one-line-of-code tests. How many other more obscurely titled processes do I really not need?
Found this trick that will shut off photoanalysisd, the background process that's identifying faces and possibly objects:
Guy Blanco IV said:
Controlling photoanalysisd
We're going to make a cron job that kills the photoanalysisd service every minute. The command takes less than 10ms to run and doesn't error out if the service is already dead.
It may be possible to apply Guy Blanco's idea to other unwelcome processes.
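For reference, the cron approach Guy Blanco describes amounts to a single crontab entry. This is a sketch of the idea only, not a tested recipe, and on recent macOS releases SIP may prevent the kill from succeeding:

```shell
# Sketch of the cron-job idea: kill photoanalysisd every minute.
# pkill runs in milliseconds and exits harmlessly if the process
# is already dead. Add this line via `crontab -e`:
* * * * * /usr/bin/pkill photoanalysisd
```

Substituting a different process name would, in principle, extend the trick to other unwelcome daemons, with the same SIP caveat.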

From observation in Little Snitch, I found that many Apple processes engage in bi-directional telemetry, and it's possible that those blocked by Little Snitch accumulate logs to phone home when I deactivate Little Snitch to take updates.

It's possible, as I've mentioned before, to delete some of what I consider Apple bloatware (Game Center was one that had no place on computers in the workplace). Such applications are simply built on top of the underlying BSD/Darwin core of macOS and can be removed without creating instability; removal even seems to improve speed. For the ones that can be deleted, turn off SIP and use a program like the late Reggie Ashworth's full, still-supported (non-Mac App Store) version of AppDelete to scrape out all of an application's hidden files. I even deleted Safari once, with no negative side effects. In days gone by it was possible to delete Spotlight, but that no longer seems to be the case.

Of course, the Apple apps you delete are likely to return with the next major point release, though not with standalone security updates. That's one reason I prefer not to "upgrade" to a new version, e.g., El Capitan to Sierra, until the final (often .6) release is out. Then I can create a Carbon Copy Cloner clone of my stripped-down macOS system and clone that to other Macs in the office.
 


Found this trick that will shut off photoanalysisd, the background process that's identifying faces and possibly objects:
That did not work for me. However, using LaunchControl in expert mode, I could either shut it down entirely or change the interval from 7200 (the default) to 72000 (quite rare). I can also shut down PowerNap, which strikes me as a good idea.

That said, to run the crontab or use LaunchControl on that particular item, you need to shut off System Integrity Protection (SIP). In short: reboot into the Recovery partition, go to Terminal, and type
/usr/bin/csrutil disable
then reboot normally, use LaunchControl to remove or restrict the daemon, and finally go back into Recovery and run
/usr/bin/csrutil enable

It would be nice to have a clever way to play with SIP without two reboots each time...
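One small consolation: you can at least verify the current SIP state from a normal boot, without any reboot, even though changing it still requires Recovery:

```shell
# Report the current System Integrity Protection state.
# Read-only; works from a normal boot.
csrutil status
```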
 


I'm still wrestling with what exactly makes a Ukrainian company intrinsically less trustworthy than a Finnish one-man show (per his GitHub profile).
The increasingly international provenance of software raises interesting, complicated, and perhaps unresolvable questions about security and privacy.

At a very simple level, how confident can anyone be about the security of a piece of software written by a single individual, regardless of the individual's location? Even the most honest, careful, talented developers produce software with bugs, and sometimes those bugs create security vulnerabilities. Even if you have access to the source code or can perform formalized software security testing, issues still can slip through. That's the best case.

What happens if the developer is careless, has strong incentives to de-emphasize security, or is outright dishonest? Will you ever know? Perhaps there is some comfort if the developer is in a friendly legal jurisdiction. In that case, perhaps you can pursue legal compensation if negligence or criminal behavior can be demonstrated, but doing so can be costly and may have a low chance of success. In any case, if legal remedies are sought, chances are that they're being pursued in response to a security/privacy breach that already has happened. By then, it may be that no amount of compensation will match the financial losses and damages to privacy, reputation, and so on.

Now let's look at some widespread practices in the industry. Many commercial software developers (regardless of size) rely on outsourced development partners with highly variable degrees of oversight. This includes everything from traditional apps to websites. You may insist on buying/licensing/using software written by local developers, but there is a substantial probability that some, most, or even all of the actual code was written somewhere else by someone else. I'd wager that most people who don't have firsthand experience with the process would be astonished at how much code is developed by outsourced developers.

Complicating things further, outsourced development partners themselves often rely on additional third party developers unless contractually prohibited from doing so. While some outsourcing is entirely local, and most people think of India as the base of most international outsourcing, places like Russia, Ukraine, and eastern Europe are very commonly used. They have very reasonable costs and significant pools of developers with excellent reputations, world-class technical training, and surprisingly good English skills. Latin America is growing quickly, too, with a lot of development outsourcing being done in places like Argentina, Uruguay, and Mexico. If things go wrong, however, opportunities for redress may be limited.

A variation is how often code is developed by in-house developers based in other countries. Just about every large software provider has a significant in-house international development footprint, and many surprisingly small firms do, too. In my own work, I often encounter firms with a three- or four-person headquarters in places like Boston or NY and a 24-person overseas software development department, without any public mention of the overseas operation. Often, the ability of HQ to keep a close eye on development is less than ideal.

In other words, modern software development often involves a very complicated international supply chain, the security of that supply chain can be highly variable, and it can be pretty rare for end users to have true visibility into where their software comes from.

This ends up being an extremely unsatisfying post. On the one hand, I often think that people worry too much about security/privacy issues that are derived from the geographical origin of well-known, professionally managed products. On the other hand, the variability and vulnerability of the software supply chain across the full range of websites and apps is so large and the associated guarantees and protections are so small that I sometimes feel like pen and paper are the future.
 


Ric Ford

MacInTouch
The increasingly international provenance of software raises interesting, complicated, and perhaps unresolvable questions about security and privacy....
Thanks for describing some of these critical software supply chain issues and their complexity.

To add just a couple of additional notes, I'll start by mentioning that developers typically use third-party software libraries within their apps, from a wide variety of sources with a wide variety of issues, including many severe security and privacy problems. Examples abound, including some nasty code in iOS apps, while more benign libraries also can contain security flaws, as we've discussed recently re SQLite.

Meanwhile, it seems that almost every app now connects over the Internet to remote systems to exchange data and code, which can be anything but transparent and can create all kinds of issues on the fly, regardless of what code happened to be in the app when someone first downloaded it. Firewalls such as Little Snitch can be helpful here, but we can't know or see what's going back and forth. Apple is a prime example - we might think we know what's happening, such as a software update being installed, but we can't actually see the traffic, and there's astounding complexity involved, as Apple hooks the customer into Apple ID, iCloud, iMessage, App Store, iTunes, OS signing, software update scans, etc.
 


To add just a couple of additional notes, I'll start by mentioning that developers typically use third-party software libraries within their apps, from a wide variety of sources with a wide variety of issues, including many severe security and privacy problems.
Excellent point. In my day job, I occasionally participate in technical due diligence assessments of firms that develop software for internal use or for external distribution. Some firms manage their software development processes very well, with regular, close review of code as it is developed and with tracking of incorporated libraries and other third-party code, updating them as necessary, and auditing periodically. With such firms, there is little worry about whether the code is generated in one country or another. The processes are good enough to instill confidence.

Other firms are not so rigorous or proficient; sometimes they have no idea that one of their staff decided to include a particular third-party library or code snippet in their product. In those cases, the code often is out of date, revealing potential vulnerabilities, and sometimes the code is incorporated in violation of applicable licenses, opening potential legal/financial liabilities. Perhaps needless to say, firms that operate in that fashion will tend to have other significant issues with their software, whether the code is developed in Kiev or in Kansas.
 


That did not work for me. However, using LaunchControl, in expert mode, I could either shut it down entirely, or change the interval
I've clicked into the LaunchControl website several times and fled like a scared bunny - I found no detailed documentation on the web site, and it appears possible to unwittingly choose options that would bork a system. I sorta' understand what I was doing when I tried to shut off photoanalysisd in Terminal, and how the cron job trick would work, if it worked, but that's one process, and it took some research to reach that understanding.

Is LaunchControl idiot-proof (or George-proof)? Does it provide in-application explanation of what processes may be unnecessary and warnings against disabling those which are essential?

Lastly, about LaunchControl: you used what amounts to a cron job inside LaunchControl to shut down photoanalysisd. Any idea if LaunchControl provides a different and separate cron service? If that's the case, would removing LaunchControl result in photoanalysisd returning from the grave? Does LaunchControl work by installing its own kernel extension (kext)?
I often think that people worry too much about security/privacy issues that are derived from the geographical origin of well-known, professionally managed products.
Parallels, which was developed in Russia, used to prohibit users from blocking its software from phoning home. An acquaintance learned about Parallels Toolbox for Mac, a set of utilities that mostly replicates native functions. I tried to persuade her not to use it where it is (mostly) duplicative, because of this language I extracted from the Parallels site in July 2017:
"Parallels reserves the right, and you authorize Parallels, to gather data on key usage including license key numbers, Authorized Device IP addresses or other applicable device identifier (including MAC address or UDID), domain counts and other information deemed relevant, to ensure that our products are being used in accordance with the terms of this Agreement. . . . You agree not to block, electronically or otherwise, the transmission of data required for compliance with this Agreement. Any blocking of data required for compliance under this Agreement is considered to be violation of this Agreement and will result in immediate termination of this Agreement pursuant to Section 5."
I can no longer find the blocking prohibition on the legal pages of Parallels' site. Still, Parallels' Privacy Policy, shown updated May 16, 2018, is pretty scary, and I presume that using Little Snitch or an alternative to block it from phoning home would cause it to refuse to work, since it couldn't check DRM or send back to Parallels the extensive range of data it collects, tied to identified users:
Parallels said:
Privacy Policy
The personal information that you are asked to provide, and the reasons why you are asked to provide it, will be made clear to you at the point we ask you to provide your personal information and may include name, postal address, email address, phone number, date of birth, language preference, job title and business affiliations . . .

When you visit our website and/or use our products, we may collect certain information automatically from your device (e.g., information like your IP address, device type, unique device identification numbers, browser-type, broad geographic location (i.e., country or city-level location) and other technical information). We may also collect information about how your device has interacted with our website and/or products, including the pages accessed and links clicked. . . . In some cases, your personal data will be supplemented by information retrieved from public sources, such as online media or employer websites, for the purpose of confirming your current professional position or address. . . .

We do not sell or otherwise make your personal data available to third parties, although we may disclose your personal information to the following categories of recipients: to our group companies (including those in Cyprus, Estonia, Germany, Malta, Russia, Spain, the United Kingdom and United States), third party service providers and partners who provide data processing services to us (e.g., to support the delivery of, provide functionality on, or help to enhance the security of our website and/or products), or who otherwise process personal information for purposes that are described in this Privacy Notice or notified to you when we collect your personal information (when they perform services on our behalf, mainly to maintain and support our IT systems).

We may also disclose your personal data to third parties including law enforcement bodies, regulatory, government agencies or other third parties in the following circumstances: (a) to undertake the activities listed above; (b) to conform to legal requirements or comply with legal process (including assisting in the investigation of suspected illegal or wrongful activity or to deal with any misuse of the product); (c) to sell, make ready for sale or dispose of our business in whole or in part including to any potential buyer or their advisers.
Comforting or not, Corel, a Canadian company, acquired Parallels in December 2018.
 


Ric Ford

MacInTouch
Is LaunchControl idiot-proof (or George-proof)? Does it provide in-application explanation of what processes may be unnecessary and warnings against disabling those which are essential?
It's not entirely idiot-proof, but there are a number of built-in checks and warnings, plus extensive built-in help. In addition, macOS restricts what you can do via System Integrity Protection (SIP), which you have to disable by jumping through arcane hoops if you want to do something like disable photoanalysisd or gamed.
Any idea if LaunchControl provides a different and separate cron service? If that's the case, removing LaunchControl would result in photosynthid returning from the grave?
LaunchControl has a variety of checkboxes and options to control spawning, running at load time, background/interactive behavior, "globbing" and more.
 


Ric Ford

MacInTouch
To add just a couple of additional notes, I'll start by mentioning that developers typically use third-party software libraries within their apps, from a wide variety of sources with a wide variety of issues...
Here's an example: QuickBooks 2016 for Mac, a standalone app specifically designed to hold very sensitive financial data:
Intuit/QuickBooks for Mac said:
... This program uses the following open source components under their respective licenses: Blast.c, Boost, Excelsior! And SQLite.
... This distribution may contain Sparkle. Copyright (c) 2006 Andy Matuschak.
... This distribution may contain IFVerticallyExpandingTextfield. Copyright (c) 2006, Andrew Bowman.
... This distribution contains AttachedWindow and NSColor Contrasting Label Extensions by Matt Gemmell
... This distribution contains InspectorKit, created by Steven Degutis
... This product code includes Apple Sample Code - Sketch-112 - Copyright © 2005 Apple Inc.
... This distribution may contain CHCSVParser. Copyright (c) 2014 Dave DeLong
... This distribution may contain MailCore 2 Copyright (C) 2001 - 2013 - MailCore team
 


Ric Ford

MacInTouch
Here's an example: QuickBooks 2016 for Mac, a standalone app specifically designed to hold very sensitive financial data:
Intuit/QuickBooks for Mac said:
... This program uses the following open source components under their respective licenses: Blast.c, Boost, Excelsior! And SQLite.
... This distribution may contain Sparkle. Copyright (c) 2006 Andy Matuschak.
... This distribution may contain IFVerticallyExpandingTextfield. Copyright (c) 2006, Andrew Bowman.
... This distribution contains AttachedWindow and NSColor Contrasting Label Extensions by Matt Gemmell
... This distribution contains InspectorKit, created by Steven Degutis
... This product code includes Apple Sample Code - Sketch-112 - Copyright © 2005 Apple Inc.
... This distribution may contain CHCSVParser. Copyright (c) 2014 Dave DeLong
... This distribution may contain MailCore 2 Copyright (C) 2001 - 2013 - MailCore team
Meanwhile, here are some network connections that QuickBooks makes to 13 different IP addresses:
  • cdn.mxpnl.com
  • d24n15hnbwhuhn.cloudfront.net
  • ofx-prod-brand.intuit.com
  • http-download.intuit.com
  • download.fidir.intuit.com
Identifying the owners and purposes of "mxpnl.com" and "d24n15hnbwhuhn.cloudfront.net" is left as an exercise for the reader.
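For readers taking up the exercise, the standard lookup tools go a long way; a quick sketch:

```shell
# Who registered the domain? (whois output format varies by registrar)
whois mxpnl.com

# Where does the CDN hostname actually resolve right now?
dig +short cdn.mxpnl.com
```

A Little Snitch rule can then be written against whatever hostnames the lookups reveal.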
 


Ric Ford

MacInTouch
Warnings from Talos:
Cisco said:
CleanMyMac X incomplete update patch privilege escalation vulnerability
CVE-2019-5011
An exploitable privilege escalation vulnerability exists in the helper service CleanMyMac X, version 4.20, due to improper updating. The application failed to remove the vulnerable components upon upgrading to the latest version, leaving the user open to attack. A user with local access can use this vulnerability to modify the file system as root.
Most Prevalent Malware Files March 14 - 21, 2019
...
SHA 256: dcf0fd2f6cc7b7d6952e8a2a9e31d760c1f60dd6c64bffae0ab8b68384a21e8b
MD5: f22a024b4c98534e8ba7a1c03b0b6132
VirusTotal: https://www.virustotal.com/#/file/dcf0fd2f6cc7b7d6952e8a2a9e31d760c1f60dd6c64bffae0ab8b68384a21e8b/details
Typical Filename: unpacknw.zip
Claimed Product: N/A
Detection Name: Osx.Malware.Bpbw::agent.tht.talos
 


Kaspersky said:
Cryptocurrency businesses still being targeted by Lazarus
It’s hardly news to anyone who follows cyberthreat intelligence that the Lazarus APT group targets financial entities, especially cryptocurrency exchanges. Financial gain remains one of the main goals for Lazarus, with its tactics, techniques, and procedures constantly evolving to avoid detection.
...
It’s no secret that Apple products are now very popular among successful internet startups and fintech companies, and this is why the malicious actor built and used macOS malware. While investigating earlier Lazarus incidents, we anticipated this actor would eventually expand its attacks to macOS.
...
We’d therefore like to ask Windows and macOS users to be more cautious and not fall victim to Lazarus. If you’re part of the booming cryptocurrency or technological startup industry, exercise extra caution when dealing with new third parties or installing software on your systems. It’s best to check new software with an antivirus or at least use popular free virus-scanning services such as VirusTotal. And never ‘Enable Content’ (macro scripting) in Microsoft Office documents received from new or untrusted sources. Avoid being infected by fake or backdoored software from Lazarus – if you need to try out new applications, it’s better to do so offline or on an isolated network virtual machine which you can erase with a few clicks. We’ll continue posting on Lazarus’s latest tactics and tricks in our blog. In the meantime, stay safe!
 


Meanwhile, here are some network connections that QuickBooks makes to 13 different IP addresses:
  • cdn.mxpnl.com
  • d24n15hnbwhuhn.cloudfront.net
  • ofx-prod-brand.intuit.com
  • http-download.intuit.com
  • download.fidir.intuit.com
Identifying the owners and purposes of "mxpnl.com" and "d24n15hnbwhuhn.cloudfront.net" is left as an exercise for the reader.
mxpnl.com belongs to Mixpanel, an analytics service. Any cloudfront.net domain is part of Amazon's CloudFront content delivery network.
 


Kaspersky Labs said:
An EXE infection for your Mac

The idea that macOS is invulnerable is a myth, as we’ve said many times before. Recently, cybercriminals found yet another way to tiptoe past its built-in defense mechanism. They collected data about the infected system and fed it into adware using files with the EXE extension, which usually runs only in Windows. An EXE file infecting Mac users? Strange, but the method does work.
...
The irony is that the malware was added not just anywhere, but to a pirated copy of a security product — the Little Snitch firewall. Users who tried to save on paying for a license predictably ended up with a headache instead.
The infected version of the firewall was distributed using torrents. Victims downloaded to their computers a ZIP archive with a disk image in DMG format — so far, normal. But a close look at the contents of this DMG file reveals the presence of the MonoBundle folder with a certain installer.exe inside. This is not a typical macOS object; EXE files usually just don’t run on Mac machines.
....
In fact, Windows executables are so unsupported in macOS that Gatekeeper (a security feature of macOS that prevents suspicious programs from running) simply ignores EXE files. This is quite understandable: It makes little sense to overload the system by scanning obviously inactive files, especially with one of Apple’s selling points being operating speed.
...
After installation, the malware first collects information about the infected system. Cybercriminal interest is focused on the name of the model, device IDs, processor specifications, RAM, and many other things. The malware also harvests and sends information about installed applications to its C&C server.
Simultaneously, it downloads several more images to the infected computer with installers masked as Adobe Flash Media Player, or Little Snitch. They are in fact run-of-the-mill adware tools that pester you with banners.
 


In fact, Windows executables are so unsupported in macOS that Gatekeeper (a security feature of macOS that prevents suspicious programs from running) simply ignores EXE files . . .
Some years ago I tried CrossOver, a proprietary version of the open-source Wine project, to run the Windows version of Quicken 98 on my Mac. It worked well enough that Quicken 98 launched, then shut down, apparently because Intuit had turned off its authorization server.

While researching CrossOver and Wine, I realized their components could, in theory, open on a Mac and run Windows malware, in part because they use Mono, the Microsoft-supported open-source alternative to .NET. This report from Kaspersky seems to resolve the hypothetical into real-world fact.

As to Gatekeeper not checking for .exe files in a .dmg, I think it's safe to presume someone downloading known pirated software would disable Gatekeeper's safety verifications before installing?

Those of us who aren't acquiring pirated software should be pretty safe, but it is apparently rather easy to create the kind of packaged software that delivered this malware load in an .exe file.
Wineskin said:
What is Wineskin ?
Wineskin is a tool used to make ports of Windows software to Mac OS X. The ports are in the form of normal Mac application bundle wrappers. It works like a wrapper around the Windows software, and you can share just the wrappers if you choose.
Best of all, it's free! Make ports/wrappers to share with others, make ports of your own open source, free, or commercial software, or just make a port for yourself! Why install and use Windows if you don’t need to?
 


As to Gatekeeper not checking for .exe files in a .dmg, I think it's safe to presume someone downloading known pirated software would disable Gatekeeper's safety verifications before installing?
If it's signed with a revoked Developer ID, then Gatekeeper should still prevent it from being opened, even when Gatekeeper is otherwise disabled.

If it's known malware, then an updated XProtect would prevent it from opening.

But there's another factor here that I don't think Kaspersky considered. In order for any file to be checked by Gatekeeper (or XProtect), it must have a quarantine attribute, and that will only happen if the file was downloaded by a Quarantine-savvy app. Most pirate sites use torrent downloads, and most such apps do not quarantine their files. I'd have to guess that's the real problem here, not that Gatekeeper doesn't check .exe files.
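The quarantine attribute is easy to check for yourself; a sketch, with a hypothetical download path:

```shell
# Print the quarantine attribute of a downloaded file, if present.
# (~/Downloads/Example.dmg is a hypothetical path; substitute your own.)
xattr -p com.apple.quarantine ~/Downloads/Example.dmg

# A "No such xattr" error means the downloading app never quarantined
# the file, so Gatekeeper and XProtect won't inspect it at launch.
```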
 


But there's another factor here that I don't think Kaspersky considered. In order for any file to be checked by Gatekeeper (or XProtect), it must have a quarantine attribute
Al, I don't know if I'm confused or my post was confusing.

It is my understanding that Gatekeeper checks to see if an application is from the Mac App Store or from an identified developer who paid Apple's developer fee. Gatekeeper gives App Store apps an automatic pass, but even apps from identified developers outside the App Store are met with a challenge that seems designed to urge users to stay inside Apple's walled garden:
Apple said:
Safely Open Apps on Your Mac
If your Mac is set to allow apps from the App Store and identified developers, the first time that you launch an app from an identified developer, your Mac asks if you’re sure you want to open it.
As to Xprotect, I understand it is not a comprehensive malware blockade, and what malware it does block isn't documented, at least by Apple.

What I had been trying to suggest is that someone who is willing to download and install pirated software would ignore Gatekeeper even if it went full alarm. After all, what Gatekeeper does is check for authentication, not check the app itself. As the recently headlined hack of Asus firmware servers to deliver malware in signed firmware reminds us, even software certified authentic may be counterfeit. Perhaps, somehow, the pirated copy of Little Snitch with its added malware load managed to sneak through Gatekeeper without setting off alarms. An unsettling prospect.
 


even apps not from the App Store but from identified developers are met with a challenge that seems designed to urge users to stay inside Apple's walled garden:
That depends on which setting the user has enabled and how the app was downloaded. With "App Store" selected, the user is challenged before being able to open an app from an identified developer; with "App Store and identified developers" selected, they are only warned (if the app was quarantined) that it was downloaded from the internet. If the app wasn't quarantined, there is no warning.
Perhaps, somehow, the pirated copy of Little Snitch with its added malware load managed to sneak through Gatekeeper without setting off alarms.
I wish I had a sample to test for myself. As you have guessed, much of the recent Mac malware has been found to be signed with a valid Apple Developer ID that Apple then had to revoke. Sometimes this happens very quickly, but I've seen more than 24 hours pass after having personally reported an issue to product security.

As I said, I'd like to see a sample so that I could determine whether Gatekeeper actually skips checks of .exe files that have been quarantined, or whether those files were simply never quarantined by the downloading app. We've certainly seen many examples of the latter when it comes to BitTorrent malware.
 


DFG

MacRumors reports that, starting with macOS 10.14.5, notarization will be required for all apps and kernel extensions to run with default Gatekeeper settings.

Apple gives very little information about the notarization process, other than stating that it is an automated tool.

As far as I understand, developers are still free to publish unsigned apps, which, however, can only be run if Gatekeeper is configured to "allow applications downloaded from Anywhere".

Apple removed the GUI to enable this in macOS Sierra, but thankfully it can be re-enabled with the following terminal command:
Code:
sudo spctl --master-disable
Given that the workaround is still there, maybe this isn't the end of the world, but Apple is slowly closing all the loopholes and making macOS more and more like iOS.
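For anyone who does run that command, it's worth knowing how to check the current state and how to undo it. A sketch (these are macOS-only tools, so the calls are guarded here; the re-enable command is commented out because it needs sudo):

```shell
# spctl exists only on macOS, so guard the calls to keep this runnable anywhere.
if [ "$(uname)" = "Darwin" ]; then
    spctl --status    # prints "assessments enabled" (the default) or "assessments disabled"
    # To restore the default behavior after using --master-disable:
    #   sudo spctl --master-enable
fi
result="spctl status checked"
echo "$result"
```

Note that once assessments are re-enabled, the "Anywhere" option disappears from the System Preferences GUI again.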

(The day the last loophole will be closed is the day I will abandon macOS.)
 


MacRumors reports that, starting with macOS 10.14.5, notarization will be required for all apps and kernel extensions to run with default Gatekeeper settings.

Apple gives very little information about the notarization process, other than stating that it is an automated tool.
Notarization was covered extensively during WWDC-2018 last June, was implemented in High Sierra 10.13.6 and a good bit of information on it can be found in this developer note Notarizing Your App Before Distribution. I am running quite a few apps that have been notarized by their developers.
As far as I understand, developer are still free to publish non-signed apps, which however can only be run if Gatekeeper is configured to "allow applications downloaded from Anywhere".
No, you can still open such apps using a right-click/control-click on the app and choosing "Open."
 


As far as I understand, developer are still free to publish non-signed apps, which however can only be run if Gatekeeper is configured to "allow applications downloaded from Anywhere".
No, you can still open such apps using a right-click/control-click on the app and choosing "Open."
And once you do this and approve the app, macOS will remember your approval and not ask again.

Additionally, Gatekeeper's protection only applies to software downloaded over a network. Software you compile yourself or install from local media (for those of you who still remember software distributed on CD/DVD) never triggers Gatekeeper.
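You can see both points for yourself from Terminal. A sketch, using a hypothetical app path (macOS-only tools, so the calls are guarded; the destructive command is left commented out):

```shell
# Hypothetical app path; xattr and spctl exist only on macOS, so guard the calls.
APP="/Applications/Example.app"
if [ "$(uname)" = "Darwin" ] && [ -e "$APP" ]; then
    xattr -p com.apple.quarantine "$APP"   # show the quarantine attribute, if any
    spctl -a -vv "$APP"                    # ask Gatekeeper whether it would accept this app
    # Deleting the attribute means Gatekeeper never examines the app again,
    # much like locally built or CD/DVD-installed software:
    #   xattr -dr com.apple.quarantine "$APP"
fi
checked="$APP"
echo "assessed $checked"
```

The right-click "Open" approval works because macOS records your decision for that app; clearing the quarantine attribute reaches a similar end state by removing the trigger entirely.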
 


Ric Ford

MacInTouch
Howard Oakley has more today about macOS 10.15 notarization changes:
Eclectic Light Co. said:
Macs move closer to compulsory notarization

... Most importantly, no one can notarize their own app, only Apple can. Developers upload their apps to Apple for notarization, and that involves checking the app meets more stringent criteria and undergoing automated testing for evidence of malicious behaviour. For the last couple of years, most malware for macOS has been correctly signed using valid developer certificates. Apple’s testing routines are intended to prevent malware from being notarized, thus give you better confidence that a new app isn’t going to be bad news.

In order for an app to be successfully notarized, it must also be hardened, which restricts its behaviours. Hardening applies a set of rules which are not unlike those applied to App Store apps. For example, if a hardened app wants to access your address book, it has to declare that when it’s built, provide an explanation which you will be given, and then pass through standard privacy checks when you first run it.

Hardening doesn’t ban potentially harmful behaviours like executing JIT-compiled code, but they are only allowed if the app declares them prior to notarization. Most importantly for many notarized apps, they aren’t run in a sandbox in the way that App Store apps are required to be, so the requirements of hardening should seldom if ever get in the way of the app or its user.

Notarization is specific to that release of that particular app. If there is a problem with that version, Apple can revoke notarization for that alone. Using developer certificates, Apple can only revoke the certificate, which blocks all subsequent first runs on any app of any version signed using the same developer certificate.

When Apple makes notarization of apps mandatory, any app which is to pass through first run checks and has been downloaded from the Internet will have to be notarized (or be from the App Store), or Gatekeeper simply won’t allow it to run. This brings major change to the way in which we supply, obtain, and install apps.
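For those who want to check where a given app stands under these rules, the bundled command-line tools can report it. A sketch with a hypothetical app path (macOS-only tools, guarded; Xcode's command-line tools are assumed for `xcrun`):

```shell
# Hypothetical app path; these tools exist only on macOS, so guard the calls.
APP="/Applications/Example.app"
if [ "$(uname)" = "Darwin" ] && [ -e "$APP" ]; then
    xcrun stapler validate "$APP"     # succeeds if a notarization ticket is stapled to the app
    codesign -d --verbose=2 "$APP"    # a hardened app lists "runtime" among its signature flags
    spctl -a -vv "$APP"               # notarized apps report "source=Notarized Developer ID"
fi
done_msg="notarization check sketched"
echo "$done_msg"
```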
 


Ric Ford

MacInTouch
FYI:
ESET said:
OceanLotus: macOS malware update
Early in March 2019, a new macOS malware sample from the OceanLotus group was uploaded to VirusTotal, a popular online multi-scanner service. This backdoor executable bears the same features as the previous macOS variant we looked at, but its structure has changed and its detection was made harder. Unfortunately, we couldn’t find the dropper associated with this sample so we do not know the initial compromise vector.

We recently published a detailed update about OceanLotus and how its operators employ a wide range of techniques to gain code execution, achieve persistence, and leave as little trace as possible on a Windows system. OceanLotus is also known to have a malicious macOS component. This article details what has changed from the previous macOS version analyzed by Trend Micro ...
Trend Micro said:
New MacOS Backdoor Linked to OceanLotus Found
We identified a MacOS backdoor (detected by Trend Micro as OSX_OCEANLOTUS.D) that we believe is the latest version of a threat used by OceanLotus (a.k.a. APT 32, APT-C-00, SeaLotus, and Cobalt Kitty). OceanLotus was responsible for launching targeted attacks against human rights organizations, media organizations, research institutes, and maritime construction firms. The attackers behind OSX_OCEANLOTUS.D target MacOS computers which have the Perl programming language installed.

The MacOS backdoor was found in a malicious Word document presumably distributed via email.
 




I recently bought a new Mac with Mojave as the OS. One of the “features” that may have come with it is that my Safari topic searches, e.g., “baking powder”, are being redirected from Google to Bing. I’ve checked my Safari preferences and Google is the specified search engine. I tried Firefox (redirected to Bing), Chrome (Google), and Opera (Google). There are three other Macs in my household, all running High Sierra. They don’t have this problem. Thus, the redirection appears limited to Safari and Firefox on my new Mac. How do I get rid of this redirection?
 

