
Linux

Strange that SMB1 should still be the default in current iterations of Ubuntu. I had to replace my old NAS at home about two years ago, because the version of Linux Mint (which is based on Ubuntu) I had upgraded my laptop to no longer supported SMB1, but my old NAS only supported SMB1. Not a big deal... once I moved the data to an SMB2/3-capable device, the shares started mounting again.

Maybe Ubuntu and Mint differ on which versions of SMB they choose to support out-of-the-box, but SMB1 is exceptionally outdated and really should be avoided if at all possible.
 


The problem is that the Linux Samba client configuration file (/etc/samba/smb.conf) does not enable SMB2 or SMB3 by default. I spent a lot of time trying suggestions about how to edit the file, until finding the copy-and-paste settings below that did the job.
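The gist of the edit (these are the commonly recommended settings; exact option names can vary a little by Samba version) goes in the [global] section of /etc/samba/smb.conf:

    [global]
        # Make the Samba client negotiate SMB2/SMB3 instead of
        # falling back to the old SMB1 (NT1) default.
        client min protocol = SMB2
        client max protocol = SMB3

After saving the file, remount the shares (or reboot) so the new minimum takes effect.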
Grrrrr. What worked on my Mint 19.2 install at home on a Synology 718 wasn't successful at work (with Mint 19.2 at work, but a newer Synology 1019). Both Synologies were running the same Synology DSM, with Intel NUCs at both places, but they differ, and there are some differences in how they're set up. Everything in both places is updated, and I loaded the Synology DSM Management Console side by side for both the 718 and 1019 to compare the options on both units.
Strange that SMB1 should still be the default in current iterations of Ubuntu.
I had a Pop!_OS 19.10 install on a test machine. It is a re-skin of Ubuntu, though possibly the changes go deeper than the skin. I concluded that smb.conf files with the same settings as the Pop install would access the Synology under SMB1, but not after SMB2 was set as the minimum.
SMB1 is exceptionally outdated and really should be avoided if at all possible.
Agree.

This should be easier! Between all the changes in macOS and trying to integrate Synologies and Linux systems, I find myself being not just the guy who keeps computers going, but the unprepared guy trying to learn network management on the fly.
 


Grrrrr. What worked on my Mint 19.2 install at home on a Synology 718 wasn't successful at work (with Mint 19.2 at work, but a newer Synology 1019). Both Synologies were running the same Synology DSM, with Intel NUCs at both places, but they differ, and there are some differences in how they're set up.
I may be preaching to the choir... On the Synology's web page, open Control Panel, click File Services, click SMB/AFP/NFS. Expand the SMB section if necessary, click Advanced Settings. There you can specify the Minimum and Maximum SMB protocols.

My Synology is behind a VPN router and has to support some older Wintels and Macs. It's set to use SMBv1 through SMBv3. As soon as I retire the last Windows 7 PC this year I'll get rid of SMBv1, if not SMBv2. I don't recall why I chose these settings but I have "Transport encryption mode:" set to "Auto" and only "Enable Opportunistic Locking" is checked.
 


Strange that SMB1 should still be the default in current iterations of Ubuntu.
This article dated July 9, 2019 discusses the survival and pending deprecation of SMB1 in the Samba SMB "clone" used in Linux:
The Register said:
Years late to the SMB1-killing party, Samba finally dumps the unsafe file-sharing protocol version by default
The move by Samba to drop SMB1 can be seen as long overdue, given that Microsoft has been moving to get rid of the file-server protocol version from its operating systems for several years now, even before it was revealed to be one of the NSA's favorite weak points to exploit.
Re:
My Synology is behind a VPN router and has to support some older Wintels and Macs. It's set to use SMBv1 through SMBv3.
When I have any of our Synologies set as you describe, to SMBv1 as a minimum, there's no issue with Linux computers connecting, and everything works as expected. What Windows systems we have are on the shelf and haven't been updated in a long time. Macs seem to do just fine connecting with AFP. I haven't thought to check their connectivity with AFP off and only SMB on. The "Linux native" NFS setting seems to be completely ineffective.

Until something recently changed, the Linux systems worked well over AFP. File saving is now weird, and there's no way to delete files on the Synology share from a Linux system logged in via AFP.

A Malwarebytes observation about ransomware payloads traveling laterally across a network is scary enough that, if I can't get this figured out, we will be using Synology Assistant on the Linux systems to log in and manage files directly in Synology File Station.
Malwarebytes Labs - December 14 2018 said:
How threat actors are using SMB vulnerabilities
Some of the most devastating ransomware and Trojan malware variants depend on vulnerabilities in the Windows Server Message Block (SMB) to propagate through an organization’s network. ...a worm-like infection that keeps spreading itself requires little effort for multiplying returns. And that’s exactly what the SMB vulnerabilities allow their payloads to do: spread laterally through connected systems.
Confirming, my system at home, with the Synology SMB profile set to minimum SMB2, is working tonight as expected. I've been on this two full days and haven't yet identified the disabling differences between my Linux system at home and the ones at work.
 


It's set to use SMBv1 through SMBv3. As soon as I retire the last Windows 7 PC this year I'll get rid of SMBv1, if not SMBv2.
Confirming, my system at home, with the Synology SMB profile set to minimum SMB2, is working tonight as expected. I've been on this two full days and haven't yet identified the disabling differences between my Linux system at home and the ones at work.
I've found no setting other than permitting SMB1 on the Synology to let the Linux computers on the same network usefully access the Synology encrypted shared storage.

NFS is the "native" Linux option. Synology documentation says it isn't possible to access encrypted folders on that protocol.

AFP doesn't seem to provide a Linux user full file management options.

My idea of using the Synology File Station "file manager" as a way for users to manage their resources on the file-share is a real productivity killer, adding extra steps that if not done correctly could lead to file destruction.
Dept. of Homeland Security: CISA said:
SMB Security Best Practices
  • disabling SMBv1 and
  • blocking all versions of SMB at the network boundary by blocking TCP port 445 with related protocols on UDP ports 137-138 and TCP port 139, for all boundary devices.
As I've been unable to get the darn things to work right without leaving SMB1 active, I'm down to relying on the second step, blocking SMB at the network boundary. I spent way too much time finding Netgear assertions that our Nighthawk routers do that by default, and I confirmed it with Steve Gibson's ShieldsUP, which reported that neither router exposes UPnP or TCP port 445 to the Internet.
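For a Linux box serving as the gateway, the second CISA step translates to something like the following sketch (iptables syntax; the right chains and interfaces depend on your setup):

    # Drop SMB traffic crossing the network boundary (CISA's port list)
    iptables -A FORWARD -p tcp --dport 445 -j DROP      # SMB over TCP
    iptables -A FORWARD -p tcp --dport 139 -j DROP      # NetBIOS session service
    iptables -A FORWARD -p udp --dport 137:138 -j DROP  # NetBIOS name/datagram

On a consumer router like the Nighthawks, the equivalent blocking is reportedly built in.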

With apologies to Sam for saying I'd not do what he suggested, I have. I'm relying on blocking SMB services at the "network boundary" but not happy about it.
Microsoft Windows Support said:
Guidelines for blocking specific firewall ports to prevent SMB traffic from leaving the corporate environment
Blocking connectivity to the ports may prevent various applications or services from functioning.
A couple of things I did that may offer some insurance. Our primary work Synology uses Synology's HyperBackup targeted to an older unit on the network that's used only to receive backups. I was able to set the Synology firewall on the target computer to receive the backups without SMB of any kind active on the receiving unit.

We also rotate backups of the main unit to USB hard drives. I set them to automatically dismount when the HyperBackup is finished. That offers some assurance the USB backup won't be attacked by ransomware.
 


Ric Ford

MacInTouch
The latest Linux looks good:
Ars Technica said:
Ubuntu 19.10: It’s fast, like “make old hardware feel new” fast
There's even support for the ZFS filesystem (though it's "EXPERIMENTAL").

... After spending recent weeks with Ubuntu 19.10, I can say confidently it is quite simply the best Ubuntu Canonical has ever released. The first reason I like 19.10 so much is that it feels insanely fast. Everyday tasks like opening applications, dragging windows, activating the search interface, and even just moving the cursor around are all noticeably faster than in 19.04. The speed boost is immediately noticeable from the minute you pop in the live CD, and it's even faster once you have 19.10 installed.

I happened to be testing a top-of-the-line MacBook around the time I first installed the 19.10 beta on my aging Lenovo x240, and it instantly made the Mac feel like a sloth. Ubuntu 19.10 ran circles around the Mac even on much, much less powerful hardware, and nothing says success during testing like software that makes old hardware feel newer. Even if that were all you got out of Ubuntu 19.10, I'd call it a win.

#performance #ZFS
 


I need to use a Windows laptop at work, but I'm allowed to run a Hyper-V VM. So I have the latest Ubuntu installed on it and use it for 90% of my work. It's much better than Windows and much worse than macOS. I've discovered, however, that not everyone uses macOS like I do, so it might work fine for you.

Things I love about macOS:

1. Consistent keyboard shortcuts. Command-Q, W, C, V, X, and ',' work everywhere. No weird exceptions, no 'this app treats every tab as a window', etc.

2. I use Alfred. On Windows the only thing even close is Wox, and it sucks. On Linux they have Albert (yes, it's a pure clone). Neither is really good.

3. You can adapt macOS to fit your desires. Keystroke changes, launchers, etc. just fit seamlessly. On Windows it's impossible to remove all Command-key shortcuts (Windows key) without disabling OS functions. Command-L, for example, is hard-coded to the lock screen. You can only reassign it by turning off the lock screen functionality.

4. Apps are much, much, much better on macOS. Pasteboard apps are clunky on Windows and Linux, but there are multiple excellent options on macOS. iTerm2 is way, way, way better than even the options on Linux (which shocked me!).

In general, everything on Linux seems a little bit broken. You have to tweak this there and hunt down a dependency here and find something to make that thing work with Gnome vs KDE. Your mileage may vary.
 


Ric Ford

MacInTouch
... In general, everything on Linux seems a little bit broken. You have to tweak this there and hunt down a dependency here and find something to make that thing work with Gnome vs KDE. Your mileage may vary.
On the other hand, I've been getting that "little bit broken" feeling more and more with Macs lately, while the beauty of Linux (I believe) is that it doesn't install all kinds of proprietary black boxes (hardware, software and network-based) that you can't understand, can't control, and can't even remove. Apple doing exactly that, while aggressively integrating unknowable AI mechanisms, is making it increasingly non-viable for some of us (along with Apple’s exorbitant pricing and quality problems).

The great things about macOS are its past excellence in human interface design and quality (now rapidly being destroyed for the sake of cheap, short-term profits and marketing/sales, on top of unwanted complexity), and the excellence, energy and breadth of its support community, which Linux can't begin to compete with, and the traditional quality of third-party Mac applications (though most good apps are now cross-platform).

Windows, of course, enjoys the broadest support by far, with the greatest standardization and the largest collection of cost-effective software and hardware. But it's a security nightmare.

Too bad we can't have the best of all three worlds....
 


Too bad we can't have the best of all three worlds....
A case can be made that we should be looking at all three platforms for our daily needs today. Times have changed, and our decision making should evolve with them. Using Linux or Windows may not fit into your daily life, but there may be solutions there waiting to be had. Need a cheap, low-load server? You can buy a cheap Windows laptop and run a number of free software titles to add features to Windows 10, or wipe the drive and install a flavor of Linux you prefer. Having a built-in screen frees you from needing an extra input device or display to connect, even if it is not the highest resolution or the most color accurate.

There are also a number of problems we face now that we did not in the past, especially as it pertains to malware and security. The ability to easily, and cheaply, set up a Raspberry Pi to run Pi-hole to serve as an ad-blocking service for your entire home should make us all take notice. This is something you cannot do easily, or cheaply, in macOS. In fact, Apple is locking out or deprecating products and services we might have used for such purposes even on an individual machine basis.

What seems ages ago, Apple asked us to Think Different. Now, I get the strong impression they want us to think only as they do and to just trust them in all matters. They may have actually become what they satirized in their 1984 Super Bowl commercial.
 


On the other hand, I've been getting that "little bit broken" feeling more and more with Macs lately...
I'll just add this and then shut up. It's a matter of degree. My Ubuntu VM needs to use xRDP as a server to get adequate performance under Windows Hyper-V. Microsoft even bundles and sets it up for you if you click the express install of Ubuntu. However... xRDP doesn't play well with Gnome and PolicyKit, so you get strange permissions errors and have to Google the crap out of it to track down fixes, not all of which work. Some of the links are for previous versions of Ubuntu that don't work the same now, some are for newer versions of PolicyKit, which Ubuntu doesn't use yet. There is no way, no way, that someone other than a developer, someone who really understands Linux, can debug and fix this stuff. You complain about buggy software, but the difference in level of issues is several orders of magnitude.
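For what it's worth, the fix most often cited for those xRDP/GNOME permissions errors is a local PolicyKit override along these lines (the file path and colord action list come from community write-ups and may not match every Ubuntu release):

    # /etc/polkit-1/localauthority/50-local.d/45-allow-colord.pkla
    [Allow colord for all users]
    Identity=unix-user:*
    Action=org.freedesktop.color-manager.create-device;org.freedesktop.color-manager.create-profile;org.freedesktop.color-manager.modify-profile;org.freedesktop.color-manager.delete-profile
    ResultAny=no
    ResultInactive=no
    ResultActive=yes

Even so, my point stands: nobody but a developer should have to know that file exists.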

You might get lucky. You might not hit the edge cases. The distributions are getting better, but there's a final gap they're never going to be able to bridge because there's no overarching vision. There's no QA group paid to make sure the edges are filed off. macOS has had buggy releases before and they result in something like Snow Leopard. There's no Snow Leopard in the Linux desktop future.
 


Ric Ford

MacInTouch
... you complain about buggy software, but the difference in level of issues is several orders of magnitude. You might get lucky. You might not hit the edge cases. The distributions are getting better, but there's a final gap they're never going to be able to bridge because there's no overarching vision. There's no QA group paid to make sure the edges are filed off. macOS has had buggy releases before and they result in something like Snow Leopard. There's no Snow Leopard in the Linux desktop future.
I don't disagree that using Linux can involve a lot of headaches with configuration and compatibility in a way that's often different from using macOS, even today. On the other hand, we have plenty of examples of equally obscure and infuriating problems in macOS now, and troubleshooting Apple's software has the feel of the old Windows troubleshooting voodoo that we mocked decades ago, as we wade through the same kind of muck but with orders of magnitude more complexity.

And on the flip side, let me cite one personal example. A number of years ago, I set up a zero-cost Linux Ubuntu system for non-technical friends who have used it constantly for basic functionality ever since, with zero problems – email, web browsing, YouTube, Facebook, spreadsheets & word processing (using LibreOffice), etc. I even installed software for a very specific application need a couple of years ago, which took more effort than installing a Mac app but wasn't all that difficult in the end.

The viability and suitability of one platform vs. another obviously depends on the user's very specific needs in all respects.
 


... My Ubuntu VM needs to use xRDP as a server to get adequate performance under Windows Hyper-V. Microsoft even bundles and sets it up for you if you click the express install of Ubuntu. However... xRDP doesn't play well with Gnome and PolicyKit, so you get strange permissions errors and have to Google the crap out of it to track down fixes, not all of which work ... There is no way, no way, that someone other than a developer, someone who really understands Linux, can debug and fix this stuff.
I would argue that this is the fault of Microsoft's Hyper-V environment, not Linux.

I have installed plenty of Linux systems on standalone PCs and they, for the most part, just work. Sure, I often need to add or configure packages, but nothing that a curious person with a "missing manual" type reference couldn't handle.

When I run VMs on my Windows PC, I use VirtualBox, not Hyper-V, and I get similar experiences. After I install the Guest Additions drivers, the integration with Windows (desktop resizing with window size, shared folders, clipboard integration) works well enough that I rarely have to think about it. And if I want to run the VM headless (remote access only), I can install a VNC server in the VM or use VirtualBox's own RDP server, which runs outside of the guest OS and shouldn't interfere with it.
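A rough sketch of that headless route (the VM name here is hypothetical, and VirtualBox's built-in RDP server requires the Oracle Extension Pack):

    # Enable VirtualBox's RDP server (VRDE) for a VM named "ubuntu-dev"
    VBoxManage modifyvm "ubuntu-dev" --vrde on
    VBoxManage modifyvm "ubuntu-dev" --vrdeport 3389

    # Boot it headless, then connect with any RDP client
    VBoxHeadless --startvm "ubuntu-dev"

Because VRDE sits outside the guest, there's nothing to break inside the Linux VM itself.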
 


I've found no setting other than permitting SMB1 on the Synology to let the Linux computers on the same network usefully access the Synology encrypted shared storage.
In reading comments following the Linux Mint blog for November 2019, discussing the pending December release of Mint 19.3, I found a user question and answer from "Clem," Clement Lefebvre, founder and lead of the Mint project.
I read the thread about the Samba bug that isn't fixed and believe it may explain my issues connecting Linux systems to the Synology set to require a minimum of SMB2.
 


My Ubuntu VM needs to use xRDP as a server to get adequate performance under Windows Hyper-V. Microsoft even bundles and sets it up for you if you click the express install of Ubuntu. However... xRDP doesn't play well with Gnome and PolicyKit, so you get strange permissions errors and have to Google the crap out of it to track down fixes
I wonder if there are similarities to the way Parallels had an auto-install variety of Ubuntu 14.04, which I tested on a new 15" MacBook Pro in early 2015? That was my first serious effort to try out a Linux install, and I found it was apparently restricted by Parallels. I suppose the good of that was Parallels knew it would work. But I found no way to upgrade the Ubuntu VM or installed software and concluded it was a "special" Parallels compatible version.
I have installed plenty of Linux systems on standalone PCs and they, for the most part, just work.
Which, for the most part, is my experience after deciding to try Linux on bare metal. The "unusual" problems I've had arose from buying new hardware before it was supported by the Linux kernel. That's becoming less an issue, as it now seems even Nvidia is actively supporting its latest GPUs (though getting that support quickly may require running "bleeding edge" versions).
I use Alfred.
I had Alfred, Hazel, Text Expander, and a library of customized Automator actions.

Issues:

  • Keeping them updated with seemingly frequent mandatory paid updates
  • Not inadvertently "automating" a file structure implosion.
I didn't trust users I supported to safely utilize the "powers" in Hazel, Text Expander, or Automator. And I found that moving from my Mac, which had such tools, to other Macs without such tools challenged my own muscle memory, so I learned to do the best with "native" software and not rely on interface extensions.

Countervailing macOS productivity taxes?
  • deprecation of File > Save As
  • unwanted "Versions" in Preview and other Apple applications
  • CPU wasters, such as Game Center and photoanalysisd
  • the need to carefully control Apple's push to auto-update, lest (e.g.) Catalina appear and obsolete important 32-bit applications
Apps are much, much, much better on macOS.
That's an unquestionable Mac advantage. I first started trying to replace aging Macs with Linux systems in 2015; we had to buy some new Macs last year to keep software in service... software, however, that won't run on Catalina.
 


[FYI:]
elementary OS said:
Introducing elementary OS 5.1 Hera
Last October, we announced elementary OS 5 Juno with wide-ranging updates to provide a more refined user experience, improve productivity for new and seasoned users alike, and take our developer platform to the next level. Today we’re pleased to announce elementary OS 5.1 Hera, the latest major update.
Hera builds on the solid foundation of Juno while bringing:
  1. A brand new first-run experience with Greeter and Onboarding
  2. Flatpak support with Sideload and AppCenter
  3. Major updates around accessibility and System Settings
  4. Iterative improvements across nearly all apps
  5. The latest hardware support with a new Linux kernel and hardware enablement stack
If you’re just interested in downloading it, head on over to elementary.io to get yourself a copy...
To download for free, enter 0 in the Custom field to the right of "Pay What You Want:".
 


Ric Ford

MacInTouch
[FYI:]
To download for free, enter 0 in the Custom field to the right of "Pay What You Want:".
Here's more about it:
Jason Evangelho said:
Meet The Linux Desktop That’s More Elegant Than Mac And Windows 10
... I’ve honestly struggled to capture the precise words that are needed to convey how enthralling elementary OS is to me. It should be said that it looks much better in motion, and to appreciate it you simply have to use it.

Fortunately, for the average PC user, using it is just that: simple. And with today’s introduction of version 5.1, I may finally make the switch myself. It feels like macOS, but with a sharper focus on appearance and the freedom of choice that Linux excels at.
 


Sam Herschbein said:
[FYI:]
To download for free, enter 0 in the Custom field to the right of "Pay What You Want:".
... I don't want to discourage people from learning new systems, but after going through another set of posts from people contemplating leaving the Mac to go to Linux, or worse (really!) Windows, I hope you'll allow me to get something off my chest.

In almost 40 years of software development, I have used more flavors of OS than I care to remember, and really, the only one I actually dread using is Windows, although a short bout around 2000-2001 with pure Windows NT showed Microsoft could actually put out decent software, and then they put the Fisher-Price interface that was Windows XP on it and ruined it. Today [I think that] Windows 10 is just as bad as Windows 8 was; they were just smart enough to put back enough features to keep people from doing a mass migration to the Mac.

I like Linux. I use it every day for embedded software development, and I thank the hardware vendors for releasing their development systems on Windows, Linux and (even) macOS. In the embedded field, more and more people are doing development in Linux environments, but it amazes me that so many of them do it by running a Linux VM on a Windows box. When I was told that no one had a problem if I wanted to run my development environment on Linux with a Windows VM for mail and Windows-specific tools, it was one of my better days at work.

I don't want to discount other people's experience with Linux; it is a very productive operating system. But for the things I hear people complaining about here – interface changes, preference changes, dropping support for 32-bit applications or certain devices, etc. – Linux does not solve those problems. An upgrade from CentOS 5.8 to CentOS 6.2 essentially cost me an afternoon's work, as I downgraded from a backup. My desktop was nothing like it was previous to the upgrade; none of my preferences were being applied, because the new upgrade was looking in different directories for its resource files, and there was no one-to-one translation to make migration simpler. The only solution was to drop back to 5.8.

Going from Ubuntu 11 to Ubuntu 14 was a similar experience with the default Gnome 3-based desktop with the really simplified menu systems, overlarge fonts, integrated App Store and Amazon shopping (!). Luckily, by then I had moved on to KDE and was able to install Kubuntu as the interface; otherwise it would have been another downgrade.

You can argue that being able to use Gnome 2, Gnome 3 or Kubuntu gives me more flexibility, but, honestly, I've never felt the need to move from the default macOS experience to something else, as much as I've felt the need to do so on Linux. Yes, I still miss my colored sidebar icons in the Finder, and they really want you to sign into an iCloud account. But you don't have to sign into iCloud. It'll still let you log in and use your computer. If you stuck with a previous version of macOS Server, you could practically run your own iCloud server for about $20. And the interface changes are minor annoyances most people get used to, compared to the total upheaval that can accompany a major release of Linux.

(I will give Linux credit for providing support packages that allow people to run 32-bit applications on 64-bit OSes, unlike dropping them completely, like Catalina. But they don't make it easy to do, if you don't know what to look for.)

macOS is a mature OS. People are both bored with it and uncomfortable when changes are introduced. When changes are made to the interface, [some people wonder] how they can go back to running 5-year-old OSes in a VM. Elementary OS Hera 5.1 looks quite good, very familiar in fact. People seem excited to go to it, but isn't that the same user interface people want to leave? Aren't those the same Mac-like applications they already have? What am I missing about Elementary that would make one want to move to it, other than it running on cheaper hardware?

If you're looking for stability in your operating system and interface, macOS is about as stable as you're going to get. Compare OS X 10.0.0 to macOS 10.15.x, and you will see refinements: throbbing buttons gone flat, pinstripes gone flat and, yes (sadly), colors to grayscale. But most of the interface elements and menus remained as before.

Contrast that to the Windows 98 -> XP -> Vista -> Windows 7 -> 8 -> 10 interface changes. Each was a major headache for support people suddenly trying to tell their customers where familiar buttons ended up. And as I described above, major Linux upgrades can result in essentially unusable systems, unless you want to spend a couple of days figuring out where all the configuration scripts moved to. There is a reason why answering "how do you do x on Linux" often involves having to figure out exactly what distribution and release you are running.

In closing, and sorry this is so long, the only thing I envy the non-Apple space is the variety of hardware available. I understand that a lot of people want to move to PC-based hardware for that reason, too. If you're in that situation and you're familiar with the Mac, but not with Linux, I'd recommend spending your time on learning how to build and run a Hackintosh rather than trying to learn a system that looks like a Mac but will quickly start to show its seams as soon as you start doing anything more involved than browsing the file system.
 


Ric Ford

MacInTouch
... a short bout around 2000-2001 with pure Windows NT showed Microsoft could actually put out decent software ...
Well, that actually came from the DEC/VAX world, so it's not surprising that it was superior....
Wikipedia said:
Windows NT
... Microsoft hired a group of developers from Digital Equipment Corporation led by Dave Cutler to build Windows NT, and many elements of the design reflect earlier DEC experience with Cutler's VMS and RSX-11, but also an unreleased object-based operating system developed by Dave Cutler for DEC Prism.
An upgrade from CentOS 5.8 to CentOS 6.2 essentially cost me an afternoon's work, as I downgraded from a backup.
Something tells me that CentOS probably isn't the most appropriate comparison to macOS. I haven't had problems like that updating Ubuntu or Mint. Obviously, jumping across major versions of macOS (e.g. to Catalina) has its own problems.
Yes, I still miss my colored sidebar icons in the Finder, and they really want you to sign into an iCloud account. But you don't have to sign into iCloud. It'll still let you log in and use your computer.
My experience with macOS 10.13 and 10.14 is that Apple will not stop harassing you to login to iCloud if you choose not to set up automatic login. You can't say "no" and you also can't say no to harassment about updates you don't want (e.g. Catalina), etc., etc.
What am I missing about Elementary that would make one want to move to it, other than it running on cheaper hardware?
An intense urge to be free from the new Apple's ever-increasing and ever creepier (with "machine learning") manipulations and abuses of its customers, and its astonishing disrespect for the most fundamental humanistic principles that produced the computing revolution and success of the original Macintosh, as Tim Cook's Apple Inc. transitions from a computer company to an entertainment company à la Disney/Sony.
 


My experience with macOS 10.13 and 10.14 is that Apple will not stop harassing you to login to iCloud if you choose not to set up automatic login. You can't say "no" and you also can't say no to harassment about updates you don't want (e.g. Catalina), etc., etc.
Don't forget Apple's regular, persistent nag to "upgrade your security" by turning on Two-Factor Authentication (2FA) across both macOS and iOS devices. The results may not be quite what you expect, many people are badly prepared for them, the process has been documented to potentially become a complete nightmare and, of course, there's the new scenario where, in some cases, you can't actually switch it back off once you switch it on!
 


macOS is a mature OS. People are both bored with it and uncomfortable when changes are introduced.
Good observation, but the dynamic I see among many users, particularly here, is: it works for us now, please don't break it, so we can go on with our business. I write about science and technology and am not a QA technician whose job is to find bugs. My Mac is a tool for my research and writing and for doing auxiliary tasks like locating illustrations to use with my articles and managing my business. It gets to be a serious problem when Apple does things like fiddle with file name formats in ways that make my archives unreadable without searching through all the documents in my files. Change for the sake of change is a marketing technique to sell products, but current users want stability with improved performance, and will evaluate the usability of new features.
 


I don't disagree that using Linux can involve a lot of headaches with configuration and compatibility in a way that's often different from using macOS, even today. On the other hand, we have plenty of examples of equally obscure and infuriating problems in macOS now, and troubleshooting Apple's software has the feel of the old Windows troubleshooting voodoo that we mocked decades ago, as we wade through the same kind of muck but with orders of magnitude more complexity.
I've been a systems administrator in a mixed Windows/Mac/Linux environment for 20+ years and agree with what you say. Macs used to be the machines that "just worked", but honestly now it's the Linux systems that work best out-of-the-box.

Much has been said about the need to "make" things work under Linux and to some extent that's true. Once you start looking to do things with a Linux system that could be considered "outside the norm", some extra effort may be necessary to get it working. However, the underpinnings of Linux by-and-large are pretty basic so the extra work isn't overly difficult. For example, I run Retrospect servers to backup all of our client machines. Installing the client on any client machine is simply a matter of running the installation program, but on some Linux distributions I have to do some additional work to get that client daemon to run automatically at startup. Sounds like a PITA, but all it entails is creating a basic text file with some needed info in it. It's extra work, but it's simple work. The same can't be said for Macs or Windows 10 – those OSes are far more complicated, and working "under the hood" on them becomes more and more difficult (if not impossible).

I will say that I'm seeing far fewer problems with new Windows 10 systems than I am with new Macs, and for what Apple charges, that absolutely shouldn't be the case. I honestly can't recommend a $2500 MacBook Pro to a user when a similarly spec'd $1300 Win10 laptop is that much cheaper, far easier to upgrade/repair, and has an OS that honestly seems to work better.
 


... An upgrade from CentOS 5.8 to CentOS 6.2 essentially cost me an afternoon's work
Red Hat Enterprise (and CentOS, by extension) is particularly nasty here. Major version upgrades really aren't supported. Red Hat recommends a clean install each time.

If you make sure you put your home directories and /usr/local/ in a separate partition from the rest of the system, you can just wipe the system partition and install. But if you don't, it's a real pain and one of the reasons I no longer like Red Hat.

Other distributions I've used (Debian, Ubuntu, Fedora) don't seem to have these problems any more (Fedora used to have all of Red Hat's problems, but that was many years ago). Major version upgrades are supported and are pretty painless.

For Debian and Ubuntu, it’s (mostly) just changing a configuration file and then running a normal package upgrade, which will pull in over 1000 packages. For Fedora, their "dnf" package manager has a module (optionally installed) which automatically does whatever is needed.
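For the curious, the mechanics look roughly like this (release names and versions are just examples from that era):

    # Ubuntu: the supported path from one release to the next
    sudo do-release-upgrade

    # Debian: point apt at the new release, then do a full upgrade
    sudo sed -i 's/stretch/buster/g' /etc/apt/sources.list
    sudo apt update && sudo apt full-upgrade

    # Fedora: the optional system-upgrade plugin for dnf
    sudo dnf install dnf-plugin-system-upgrade
    sudo dnf system-upgrade download --releasever=31
    sudo dnf system-upgrade reboot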
Going from Ubuntu 11 to Ubuntu 14 was a similar experience with the default Gnome 3-based desktop with the really simplified menu systems, overlarge fonts, integrated App Store and Amazon shopping (!). Luckily, by then I had moved on to KDE and was able to install Kubuntu as the interface; otherwise it would have been another downgrade.
Strange. When I upgraded back then, it retained my existing desktop and configuration. It didn't force me from GNOME version 2 to 3. I remember explicitly installing Ubuntu's "Unity" interface, because I wanted to try it. (I later got rid of it).

Did you perform a clean install (followed by restore of home directories) instead of an upgrade?

Or maybe it was because you jumped from 11 to 14 (skipping 3 years of releases) as a part of the upgrade?
 


Well, that actually came from the DEC/VAX world...
via Dave Cutler, accompanied by his vast knowledge of the internals of OpenVMS/VMS.

Microsoft, with NT, wanted to eventually have clustering done just as OpenVMS does. Didn't work out too well in the end for Microsoft, despite all of Dave's efforts. There is nothing more rock-solid than an OpenVMS cluster. (I've managed OpenVMS clusters with up-times of many years without a single reboot.)

Me? OpenVMS Systems Engineer since 1989. And Dave Cutler is a personal friend of mine: we go back to years-ago DEC.
 


Ric Ford

MacInTouch
macOS is a mature OS.
Actually, that claim doesn't hold any water whatsoever, if you think about it. The most fundamental and critical parts of the current macOS are anything but mature:
  1. new file system! (APFS, virtually undocumented and silently changing all the time)
  2. new graphics system! (Metal 2, which follows Metal 1, as open standards and past standards are discarded)
  3. new storage system! (radically new and non-standard, T2 hardware-based)
  4. new boot system (T2/BridgeOS)
  5. new kernel drivers! (all-new architecture with Catalina)
  6. new security/authentication systems (constant changes)
  7. new software/media distribution/installation (change after change to systems and stores, plus expired certificates and installer bugs)
  8. "Catalyst" (a buggy mess critical to Apple's own highlight apps)
  9. and more (e.g. iOS device sync, iCloud, etc.)...
I mean... seriously... we need to compare those with Linux's or Windows' file system, graphics system, storage systems, boot system, kernel driver architecture, etc., if you want to claim "maturity" is an Apple advantage.

#applequality
 


Elementary OS Hera 5.1 looks quite good, very familiar in fact. People seem excited to go to it, but isn't that the same user interface people want to leave?
Let's put a positive spin on it and just call Elementary OS an unabashed homage to OS X as it was in, say, Snow Leopard.
Aren't those the same Mac-like applications they already have?
If you'd listened to Elementary co-founders Daniel Foré and Cassidy James Blaede (the UX designer) on as many Linux-centric podcasts as I have, I think their goal of replicating an Apple-esque ecosystem would be clear. That extends to offering a customized set of applications consistent with Elementary's theming and selling them through an application store where users register credit cards.
What am I missing about Elementary that would make one want to move to it, other than it running on cheaper hardware?
I was surprised how well DistroTest's presentation of Elementary OS 5.1 worked over the Internet. It's beautiful and very appealing. That's not to say it was speedy, though, so I skipped the said-to-be-possible step of installing applications, an important evaluation step given that there aren't many "Elementary" applications, leaving users dependent on the "standard" UI-agnostic Linux applications.
DistroTest said:
Test a New Operating System
On our website you will find many (857 versions of 254) operating systems which you can test directly online without an installation.
 


Ric Ford

MacInTouch
Of course, the issue of macOS vs. other operating systems brings up the issue of Apple pushing people to iPads as its recommended alternatives to Macs and other personal computers. Here's a related article:
Gizmodo said:
iPadOS vs a MacBook Pro in All the Tasks That Really Matter
For years now, Apple has been pushing the iPad as a laptop replacement; with the arrival of iPadOS, it might just have a serious shot at getting you to ditch your computer for good (or at least leaving it behind on trips). To test the current state of play, we put an iPad Pro up against a MacBook Pro in five key computing workflows.
 


Ric Ford

MacInTouch
FYI:
BleepingComputer said:
Ubuntu Linux Gets Intel Microcode Update to Fix CPU Hangs
Canonical has released a new Linux Intel microcode update for Ubuntu that fixes an issue causing Intel Skylake CPUs to hang after a warm reboot.

On November 12th, 2019, new Intel microcodes were released to mitigate a vulnerability discovered in the Transactional Synchronization Extensions (TSX) feature in Intel processors and a vulnerability in Intel Xeon processors that could lead to a denial of service attack from a local privileged user.

After the update was released, a regression was discovered that was causing Intel Skylake processors to hang after a warm reboot.
 


Of course, the issue of macOS vs. other operating systems brings up the issue of Apple pushing people to iPads as its recommended alternatives to Macs and other personal computers. Here's a related article:
Apple's pushing people to iPads as a recommended alternative to Macs is a joke. Can you get to the iPad's filesystem? Nope. Can you install an application like 'Terminal' and then, say, connect via ssh to a remote server? Nope. The list of things one can’t do with an iPad as opposed to a Mac is endless. On the flip side, my newly-acquired Surface Pro 7 can do all that and function as a tablet as well.
 


Linux can be amazingly useful or a pain in the neck. Like all things with computers, some care needs to be taken when jumping into its world. I have been using Linux for nearly a decade now and switched from Mac completely last year. Here are some quick observations that everyone should feel free to add to.

Ubuntu desktop: generally stable, with a wide range of hardware support because of the focused and sponsored effort on that experience. But the experience will vary significantly by flavor.

Kubuntu desktop: The KDE-based desktop was quite buggy and resource-hungry a few years ago. Newer, Plasma-based versions, however, are stellar for ease of use, design, compatibility, and resource demands. A review of the current version:

Standard Ubuntu desktop: Has gone through major design changes because of the switches from Gnome 2 to 3, then Unity, and then back to Gnome 3. Then there are the changes to the display stack, too (X, Wayland, Mutter, Mir). The most recent version, 19.10, is getting good reviews, however:

Xubuntu desktop: Based on Xfce, this distro lacks the staff support available to Ubuntu and Kubuntu. The focus here is on stability and incremental change, along with a clean interface. Xubuntu is the desktop I use. Because the team is small, however, bugs can crop up and continue for some time until a fix arrives. Recent versions have gotten better.

MX-Linux has made notable headway on providing an alternative Xfce distro that deserves serious consideration. A review of MX-Linux 19 is available at:

Note, because of its versatility and small footprint resource-wise, Xfce is the desktop of choice for numerous distros that have a specific purpose, such as Kali (recently switched from Gnome) and Ubuntu Studio.

Lubuntu desktop: Based on the LXDE desktop, this flavor has replaced Xubuntu as the desktop for low-end hardware. LXDE, for instance, is the desktop for Raspbian on Raspberry Pi computers. Design and style are not a strength of this distro, however.

For truly low-end hardware, Knoppix-based distros should be looked at. My current favorite is Tiny Core Linux (the basic install with the Flwm desktop is 16 MB - yes, that is right). A distro like Tiny Core will run entirely in memory and can easily work on very old computers (the recommended configuration is Pentium 2 or better, 128 MB of RAM plus some swap).

Red Hat/CentOS: Gnome-based. Good hardware support, but the Gnome bugs/changes over the last several years have been a problem for some.

Ubuntu Mate: An effort to keep Gnome 2 going by forking that project and calling the desktop environment Mate. For many who want a little more with the desktop but also want stability and broad hardware support, Ubuntu Mate is a good choice.

Elementary OS: An independent fork of Ubuntu that is focused on design. The team behind this OS is small, and they are creating and maintaining their own desktop called Pantheon. The design is very Mac-like, but hardware support and fixes have lagged relative to the goals they have set for themselves.

Linux Mint: An Ubuntu-derived but independently run version that features Mate, Cinnamon (an alternative, Gnome3-based desktop), or Xfce desktops. The team behind this distro tends to push the envelope for supporting newer hardware.

This list of distros only touches on a few. Linux is all about fragmentation, as even multiple versions of alternative desktop environments have gained widespread traction. If you want support for cutting-edge hardware, Mint might be your choice. If you are deeply concerned about design and careful about what hardware you use, Elementary might be for you.

Sometimes CentOS provides support for something that Ubuntu does not, and sometimes the opposite happens. Until recently, font display was a mess on anything but Xfce desktops or Elementary. But, Mate, KDE, and the current Gnome 3 variations have all made remarkable improvements since 2018 on font display.

Still, there are always regressions to worry about, especially when the OS is getting updated often (the 2017 versions of Xubuntu had some display regressions that I saw firsthand).

Unlike a Mac, where distro, desktop, and hardware were all rolled together into a neat package, Linux requires independent examination of all of these factors.
 


Let's put a positive spin on it and just call Elementary OS an unabashed homage to OS X as it was in, say, Snow Leopard.
I am thinking about trying this on standard hardware instead of Apple stuff. I would like to buy a cheap laptop (refurbished from Walmart or Staples or somewhere) for $150 or so. They have Intel Core i3/5 processors which should be more than enough for my needs. Any advice about which brand or model, or if that even makes any difference?
 



I am thinking about trying this on standard hardware instead of Apple stuff. I would like to buy a cheap laptop (refurbished from Walmart or Staples or somewhere) for $150 or so.
Speaking of Apple stuff, I found it very handy to dual-boot Linux Mint on my 11" MacBook Air. The base Core i5 Air with only a 128 GB SSD turned out to be a great tool in our mixed Mac and Linux setup at work, and easy to carry between locations. The only issue I had setting it up with the rEFInd boot manager was that I originally installed Mint from the LiveUSB while connected via a USB Ethernet adapter, but hadn't forced recognition of the onboard WiFi. A quick update fixed that.

Generally, HP and Dell computers are said to be the most likely inexpensive brands to support Linux conversion without issues. You'd want to verify Linux compatibility for the specific model you're considering, as some of the cheapest Windows computers were the most locked down.
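One low-effort way to check a given machine is to boot it from a Linux live USB and see whether the kernel bound a driver to each device, for example:

    # List PCI devices along with the kernel driver each one is using
    lspci -nnk

    # Narrow it to the network hardware, the usual trouble spot
    lspci -nnk | grep -iA3 network

    # And the USB side (webcams, card readers, etc.)
    lsusb

A device listed without a "Kernel driver in use" line is the one that will need extra work.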

There are new models of the HP Stream laptop for as low as $180 [Amazon link]. They're very basic laptops but should give a taste of what Linux can do.
Undoubtedly better in the same price range would be a used Lenovo ThinkPad. Frequent Ubuntu spokesman Alan Pope (Developer Advocate on the Snapcraft team) is a ThinkPad aficionado. He recently guested to discuss ThinkPads on a Jupiter Extras podcast:
Jupiter Broadcasting said:
Jupiter Extras #34, 11/22/19, "popey on ThinkPads"
Chz sits down with Alan Pope (popey) to discuss his thoughts about ThinkPads, and why they might be the perfect Linux laptop. Find out what those model numbers really mean, plus our tips for picking which one is right for you.
 




There also is an iOS version of the venerable Attachmate Reflection for UNIX, which is free for basic use.
I also use Reflection a lot. The keyboard is the best thing about it (tabs, ctrl, arrow keys, etc.), but I can't figure out how to paste anything from the clipboard. The setup to SSH into a server is fairly simple.

I also use Remoter in order both to SSH and to VNC, and it also works well... The keyboard isn't as good as Reflection's, but it does accept input from the clipboard. The setup to access servers has a basic and an advanced option, should you want to customise your shell or VNC display.

I use both of these products on my old iPad mini (v.1) and have had no problems accessing either Mac or Linux systems.
 



I've been experimenting with a number of different Linux distros (in Parallels VMs) - Ubuntu, Mint, Ubuntu-MATE - and two on the Raspberry Pi 4 - Raspbian and Ubuntu 19.10 with MATE desktop. (The Pi 4 is used as an AirPrint server for my old networked printer).

One question I have: given that all these distros are built with contributions from all over the place, how can we be sure that they don't include malware, backdoors, etc.? While I follow basic security procedures in configuring each install, am I still setting myself up for intrusion?

By the way, I'm pleasantly surprised by the quality and completeness of these Linux distros. A recent install of Ubuntu 19.10 (64-bit) on the Pi 4 had a few bumps in the road, and then was a bit laggy in use until I installed the MATE desktop, but so far seems to be working fine. Note: I'm just a Linux dabbler at this point, heavily reliant on the wealth of expert info out there.

#Linux #RaspberryPi #security #supplychain
 



I've been experimenting with a number of different Linux distros (in Parallels VMs) - Ubuntu, Mint, Ubuntu-MATE - and two on the Raspberry Pi 4 - Raspbian and Ubuntu 19.10 with MATE desktop. (The Pi 4 is used as an AirPrint server for my old networked printer).
One question I have: given that all these distros are built with contributions from all over the place, how can we be sure that they don't include malware, backdoors, etc.? While I follow basic security procedures in configuring each install, am I still setting myself up for intrusion?
By the way, I'm pleasantly surprised by the quality and completeness of these Linux distros. A recent install of Ubuntu 19.10 (64-bit) on the Pi 4 had a few bumps in the road, and then was a bit laggy in use until I installed the MATE desktop, but so far seems to be working fine. Note: I'm just a Linux dabbler at this point, heavily reliant on the wealth of expert info out there.
Any package you install has root-level access, so you don't have cast-iron guarantees against installing malware etc. However, there are ways of reducing your exposure:

- The distributions usually sign the metadata files that they provide, which makes you more confident that the package you just installed came from Ubuntu or Debian etc. That means that you are trusting their developers and maintainers to avoid malware (or at least remove it ASAP). Given that they provided all the base packages you are running (the kernel, core libraries, etc.), that is probably a trust you already have.
- Some of the distribution repositories may have no quality assurance done. Universe in Ubuntu, for example, will have less validation than the core package lists.
- Adding third-party repositories raises the same trust question. Do you trust Google's apt repository? Do you trust Ubnt's? By adding them to your configuration, you are giving them the ability to have their packages installed, so be careful about which of those you enable. If you do, understand the configuration files so you can be sure that they don't override packages from your distribution if you don't want them to.
- Installing a raw package (dpkg -i blah.deb) means you need to trust the source of that package, and that they checked it.
- The horrendous pattern "curl get.sh | sudo sh" gives a random shell script from the web root access* to your machine. Read the script before you let it loose (a safer pattern is sketched below).
- Package managers for languages have regular security bulletins. Make sure you are aware of them for any you enable (there was a Node package recently which did crypto mining).
[*See here for example. –MacInTouch]
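That safer pattern, sketched here with a hypothetical URL, is to download the script, read it, and only then run it:

    # Save the script to a file instead of piping it straight into a root shell
    curl -fsSL -o install.sh https://example.com/install.sh   # hypothetical URL

    # Inspect it, and checksum it if the vendor publishes a hash
    less install.sh
    sha256sum install.sh

    # Only then execute it
    sudo sh install.sh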

#security #supplychain
 


Ubuntu...Raspbian...given...how can we be sure that they don't include malware, backdoors etc.?
The basic answer is to verify your download's checksum before installing.
It's FOSS said:
How To Verify Checksum In Linux [Beginner Guide]
Most common use of checksum is in checking if the downloaded file is corrupted.
It's an important question, because I recall the servers of a couple of major Linux distributions were hacked and, very briefly, served .iso files with embedded malware.
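In practice the check is a line or two in a terminal (the filenames here are illustrative, for an Ubuntu image and the SHA256SUMS file from the same download page):

    # Compute the image's hash and compare it to the published value by eye
    sha256sum ubuntu-19.10-desktop-amd64.iso

    # Or let sha256sum do the comparison against the distro's checksum file
    sha256sum -c SHA256SUMS --ignore-missing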

A clean .iso won't protect you from yourself. One way to get into "trouble" with Linux is to Google up "answers" or "applications", then copy and paste possibly dangerous commands into Terminal. It is easy to link your install to a PPA (a third-party application source), or to install a .deb package, and unless you have some confidence because they're affiliated with your distro, it's not easy to verify safety.

Example: I wanted to download a YouTube clip that fascinated my granddaughter. I installed a Firefox extension that, uh oh, required download and installation of a .deb "helper application" from the extension's website. I'm embarrassed to confess I did that, then woke up and immediately removed both the extension and helper application. Perhaps overkill, but since I'd been using the TimeShift backup program that comes with Linux Mint, I went further and did a nuke and pave reinstall.

Martin Wimpress, who has recently been named lead of Canonical's Ubuntu desktop team, touches lightly on what goes on behind the scenes in creating Ubuntu and how that benefits Ubuntu 'flavours', like Mate, and distros based on Ubuntu (Mint, Elementary, Zorin, Pop!_OS) in this December 4, 2019 Podcast Interview:

He and fellow Canonical employee Alan Pope produce the Ubuntu Podcast. They teased that the next episode (Season 12, Episode 36) will discuss what goes into building a Linux distro. Show notes for the Ubuntu Podcast series, and past episodes, can be found at UbuntuPodcast.org

This is all about the rather reliable and perhaps even staid Ubuntu and its derivatives. Over in the geekier "build your own Linux" world of Arch, there's the AUR, about which the Arch-based Manjaro developers warn:
Manjaro said:
Arch User Repository
The AUR, as a community-maintained repository, presents potential risks and problems.
Possible risks of using AUR packages:
  • Multiple versions of the same packages.
  • Out of date packages.
  • Broken or only partially working packages.
  • Improperly configured packages which download unnecessary dependencies, or do not download necessary dependencies, or both.
  • Malicious packages (although extremely rare).

#security
 

