
2019 Mac Pro and alternatives

Ric Ford

MacInTouch
Before I parallelized this step, it took some ten hours to complete, on fewer total files. On my mid-2012 2.6GHz MacBook Pro 15" Retina, this step used 16 worker threads and used to take about 2.5 hours. On my new 2019 2.4GHz Core i9 MacBook Pro 15", it takes just over half an hour using 32 worker threads. Not all of this speedup is due to having twice as many cores, naturally....
Does your 2012 MacBook Pro Retina at least have an SSD? Even if so, I'm sure it's far slower (probably half the speed) than the one in the new MacBook Pro, and storage differences might have a major effect on your results. To really get an understanding of the effects of the processor and cores, one would like to see results from booting both systems off the same (fast) storage device. Unfortunately, the best way to do that here is probably a Thunderbolt 2 SSD, which would constrain the top speed. But wait... I wonder if using an internal RAM disk for the data on each computer would provide the most meaningful comparison?
 


Ric Ford

MacInTouch
Price anchoring on the Vegas in the iMac Pro may not manage expectations well....
I couldn't agree more with everything you said, and the price for MPX modules is a critical, unknown factor. They could be frighteningly high, well into the thousands of dollars each, depending on how much Apple charges for the modules themselves on top of the GPUs and storage devices contained within.

The only reason I could justify comparisons without MPX modules/prices is that I assume standard PCIe cards will work in the 2019 Mac Pro's slots, so any compatible PCIe graphics card should be an option, making comparisons across systems easier. But, absolutely, anyone who buys MPX modules is going to be paying a heavy price for the privilege.

The other thing I haven't mentioned is the potential in this radically new system for thermal problems. Did Apple actually design it so perfectly that there won't be any problematic hot spots inside or outside of MPX modules and so standard PCIe cards won't have any issues? I hope that's the case, but it seems more than a little challenging from an engineering/design perspective. (Of course, if Apple were to mess that up, its customers paying $10,000, $20,000, or $50,000 aren't going to be real happy.)
 


Does your 2012 MacBook Pro Retina at least have an SSD? Even if so, I'm sure it's far slower (probably half the speed) than the one in the new MacBook Pro, and storage differences might have a major effect on your results. To really get an understanding of the effects of the processor and cores, one would like to see results from booting both systems off the same (fast) storage device. Unfortunately, the best way to do that here is probably a Thunderbolt 2 SSD, which would constrain the top speed. But wait... I wonder if using an internal RAM disk for the data on each computer would provide the most meaningful comparison?
Yes, it has a Transcend JetDrive 725 960GB as the internal SSD. Testing with Blackmagic Disk Speed Test gives write ~ 200MB/s, read ~ 450MB/s. The new MacBook Pro reaches 2.7GB/s write and 2.6GB/s read. That's… quite a large difference. Much larger than I expected!

RAM disk… Now that's interesting, it's been a while since I used one of those. Quickly thinking: 2012 MacBook Pro has 16 GB of RAM, and I'd need 10 GB to hold all files. That's going to be a squeeze, since the processing itself needs a couple of GB of RAM too. It's an experiment I might try over a rainy weekend. :-)
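For anyone who wants to try that experiment, here's a minimal sketch of setting up a macOS RAM disk from Python. The hdiutil/diskutil invocations are the standard ones; the 10 GB size and the "RamDisk" volume name are just assumptions for illustration.

```python
import subprocess

# A RAM disk sized (an assumption) at 10 GB, to hold the file set described above.
SIZE_GB = 10
SECTORS = SIZE_GB * 1024**3 // 512  # hdiutil sizes ram:// devices in 512-byte sectors

# Create the RAM-backed device node; hdiutil prints something like "/dev/disk4".
dev = subprocess.run(
    ["hdiutil", "attach", "-nomount", f"ram://{SECTORS}"],
    capture_output=True, text=True, check=True,
).stdout.strip()

# Format and mount it as an HFS+ volume named "RamDisk".
subprocess.run(["diskutil", "erasevolume", "HFS+", "RamDisk", dev], check=True)
print(f"RAM disk mounted at /Volumes/RamDisk ({dev})")

# When finished, eject it to release the memory:
#   subprocess.run(["diskutil", "eject", dev], check=True)
```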
 


Ric Ford

MacInTouch
You could have 1000 processes like these running and they could all be handled with ease by a single processor core because of processor scheduling.
I understand your point, but mds_worker processes may be a counter-example, given their deleterious impact at times.

Other things that ramp up fans on my MacBook Pro, for what it’s worth:
  • Malwarebytes scanning many new files
  • Carbon Copy Cloner (rsync)
  • FHD video
  • certain raw image adjustments (in multiple apps)
  • Spotlight indexing
But here's the thing: While Mac hardware has been getting faster and faster - with huge leaps from 2011-era and earlier systems to current Macs - the software has been getting slower and slower at the same time! Mac OS X 10.6.8 was radically faster than OS X 10.9 - so much so that I had to upgrade all our systems from hard drives to SSDs. Then OS X 10.10 through macOS 10.12 were significantly slower than OS X 10.9 (which was a great Mac OS that I would still be running if it were up to date on security and app support). And, then, I've found that macOS 10.14 is even slower than macOS 10.12, on hardware that's 50% faster, when it comes to startup times, getting to the Finder, etc. (After that, it's more sprightly on the faster hardware.)

So... that's why I wondered if more cores would help, because OS X/macOS software has slowed dramatically over the years in my experience, on ever-faster hardware. (Graphics and video are an exception, being faster nowadays while taking advantage of radical improvements in GPU hardware.)

Certainly, Apple's headlong rush to integrate A.I./machine learning into all aspects of its platforms is likely to aggravate the issue, considering how much processing power these things demand - so much so that Apple, Intel, Nvidia, AMD et al are investing massive amounts of money in custom silicon to run the stuff faster! And we can count on these invisible processes competing with the work we're trying to get done on the Mac (photo processing, video editing, scientific modelling, whatever).

P.S. As we've moved from running programs on our unconnected Macs to doing vastly more work on the global Web/Internet platform, performance of browsers and JavaScript has become critical and often is the cause of system slowdowns. (Meanwhile, modern apps are constantly phoning home to cloud services.) So the effect of more cores on Safari/Firefox/Chrome sub-processes is another critical issue.

P.P.S. I just rebooted, and "mds_worker" started sucking up a lot of CPU (along with Malwarebytes' RTProtectionDaemon).
 


Ric Ford

MacInTouch
The problem, of course, as my readers have pointed out to me since my article on WWDC, is this: things change. One reason why so many of the creatives ask for a modular Mac Pro is that they want to be as future-proof as possible.
Thom, I'd guess that you might have a lot of Windows users in your audience. How about these $3-4K PC systems I've been describing here - are they as fast for Photoshop, Premiere, etc. as they seem to be on the surface? Do you have folks using mid-range, modular Windows systems in preference to higher-priced Macs and getting good results, or is there something holding them back in the photo/video/audio world?
 


I understand your point, but mds_worker processes may be a counter-example, given their deleterious impact at times.

Other things that ramp up fans on my MacBook Pro, for what it’s worth:
  • Malwarebytes scanning many new files
  • Carbon Copy Cloner (rsync)
  • FHD video
  • certain raw image adjustments (in multiple apps)
  • Spotlight indexing
But here's the thing: While Mac hardware has been getting faster and faster - with huge leaps from 2011-era and earlier systems to current Macs - the software has been getting slower and slower at the same time! Mac OS X 10.6.8 was radically faster than OS X 10.9 - so much so that I had to upgrade all our systems from hard drives to SSDs.
Aside from raw image adjustments, most of these tasks are I/O-bound, not CPU-bound. And most of the macOS slowdowns appear also to be due to large amounts of file system I/O, which is why an SSD speeds it up.

CPU-bound tasks wouldn't be affected by increasing storage performance. And adding extra cores/threads/processes won't help if everybody is blocking on the same file system, which is why new and better SSD performance (nVME, new controllers, more PCI bandwidth, etc.) is such a big deal.
 


I have a workflow (written in Python) where one step takes input files and aggregates them into output files. This step can be split into sequential parallel parts, each processing just shy of 1000 zipped XML files. Since I know that all threads will block on I/O, I schedule two worker threads per core.

Before I parallelized this step, it took some ten hours to complete, on fewer total files. On my mid-2012 2.6GHz MacBook Pro 15" Retina, this step used 16 worker threads and used to take about 2.5 hours. On my new 2019 2.4GHz Core i9 MacBook Pro 15", it takes just over half an hour using 32 worker threads.

Not all of this speedup is due to having twice as many cores, naturally. Increased memory bandwidth and a faster SSD will contribute for sure. Still, it's a significant gain. (I'm very happy with the new laptop.) Would a 28-core machine be even better? I reckon I'd run into an I/O bottleneck.
28 cores (56 virtual CPUs via hyperthreading) should be able to let you run 112 worker threads. That will definitely speed things up, but you're right that they'll start blocking on I/O if nothing else changes.

Fast storage will help. So will optimizing your apps to minimize I/O concurrency. One possible way might be for a single master thread to do all the I/O (read raw data into memory buffers and write completed data from memory buffers to output files), so your worker threads deal entirely with in-memory buffers.
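As a minimal sketch of that single-I/O-thread pattern (file names here are hypothetical, and gzip decompression stands in for the real aggregation work; it conveniently releases Python's GIL, so threads can genuinely overlap):

```python
import gzip
import queue
import threading
from pathlib import Path

IN_DIR = Path("input")             # hypothetical: directory of .xml.gz files
OUT_FILE = Path("aggregated.xml")  # hypothetical output file
NUM_WORKERS = 32                   # ~two workers per core, per the rule of thumb above

jobs: "queue.Queue" = queue.Queue(maxsize=NUM_WORKERS * 2)  # bounds memory use
results: "queue.Queue" = queue.Queue()

def reader():
    # The single thread that owns all input I/O.
    for path in sorted(IN_DIR.glob("*.xml.gz")):
        jobs.put(path.read_bytes())
    for _ in range(NUM_WORKERS):
        jobs.put(None)  # one poison pill per worker

def worker():
    # Workers touch only in-memory buffers, never the file system.
    while (blob := jobs.get()) is not None:
        results.put(gzip.decompress(blob))  # stand-in for the real per-file work
    results.put(None)

threads = [threading.Thread(target=reader)]
threads += [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()

# The main thread is the single writer: all output I/O happens here.
finished = 0
with OUT_FILE.open("wb") as out:
    while finished < NUM_WORKERS:
        chunk = results.get()
        if chunk is None:
            finished += 1
        else:
            out.write(chunk)
for t in threads:
    t.join()
```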
 


Ric Ford

MacInTouch
Aside from raw image adjustments, most of these tasks are I/O-bound, not CPU-bound. And most of the macOS slowdowns appear also to be due to large amounts of file system I/O, which is why an SSD speeds it up. ...
I probably should have noted that the macOS 10.14 system is using Apple's very fastest flash technology (1TB T2-based), yet the older machine, running macOS 10.12 on far slower AHCI flash with a much slower CPU, boots faster. So, no, in this case, it's software (or networking) performance that's the problem, not storage and not CPU. And, as you note, the raw image processing issues are pretty much pure CPU (and, surprisingly, I think more CPU than GPU).

As far as CCC/rsync goes, I think you might find that it's more CPU-bound than you realize. It seemed odd to me, too, when I realized what was happening. A quick look in Activity Monitor shows four cores rather active while ramping up CPU temps and fans. I expect it's doing a lot of data comparison work to determine which files need to be backed up in my typical backup scenario of comparing hundreds of gigabytes of files (metadata, I guess) on source and backup drives but only actually copying a tiny fraction of that data (i.e. the few changed files).
 


I wonder if anyone else remembers Apple's old tag line, "Macintosh, the computer for the rest of us"?
The IIcx from 1989 was $5,369 when released ($11,088 with inflation to 2019 dollars), and it was geared towards the pro market.

The first PowerBook G3 was $5,699 in 1997 (just over $9,000 in today's dollars for the base model).

Or the 20th Anniversary Mac from 1997, as well, that started out at something like $7,500 (it wasn't a pro model though).
 


Ric Ford

MacInTouch
The IIcx from 1989 was $5,369 when released ($11,088 with inflation to 2019 dollars), and it was geared towards the pro market....
I actually spent some time putting together a chart about that earlier. I do remember the price pain of old Macs. I coughed up for the original Mac, based partly on a business case, but I never had the money to buy anything like a Mac IIci for myself (nor the outrageously expensive IIfx). I never even considered the 20th Anniversary model.

I've always gravitated to the middle ground, or cost-effective lower levels, like iBooks and Mac Minis. The Quadra 650 is one of my all-time favorites; the SE/30 another one; the Power Macs were good (except for reliability issues with the G5 towers), and the mid-range PowerBooks and MacBook Pros have been worth their hefty prices for me over the long term, but I was lucky/smart enough to avoid the nightmares with defective graphics hardware. (We did get burned badly for months of misery with a defective MacBook Pro internal SATA cable, but those 2011-2012-era MacBook Pro 13" computers have otherwise been great over many years with easy upgrades to SSD.) A Mac Mini G4 was astoundingly reliable, and a later Intel Mini the same, but Apple made them more and more inaccessible, pushing me to the MacBook Pros of that era as a more accessible alternative with built-in screen, keyboard and "UPS" for a very modest price premium. (Thunderbolt 3 support helps mitigate the problem of closed desktop systems, but Thunderbolt 3 is more expensive than internal upgrades were in the past, and external upgrades aren't great for laptops.)
 


... But, absolutely, anyone who buys MPX modules is going to be paying a heavy price for the privilege.
I forgot about one corner case. The Radeon Pro 580X module will probably be affordable by 'mere mortals'. (However, I suspect there aren't too many use cases where folks would want more than one of these. Most of the folks who will buy the default configuration buy it because it is the lowest option available.) It does have some upside in that it is only a 'half height' MPX module, so the x8 socket that the two Thunderbolt controllers would have used is still available (each MPX bay has an x16 and an x8 "standard" slot). A 'skinny' x8 card would probably work there (e.g., an M.2 SSD card). For folks who need to drive six displays (or need some of the Thunderbolt 3 sockets for data and need more video outs), another 580X module will work.
The other thing I haven't mentioned is the potential in this radically new system for thermal problems.
The 'half height' 580X module may be one of those issues if pushed to extremes. It runs cooler than the others, so 'half' of the surface area to shed heat isn't critical, but I suspect it doesn't have lots of headroom. It may be clocked closer to an iMac's embedded GPU speeds than a mainstream desktop card.
Did Apple actually design it so perfectly that there won't be any problematic hot spots inside or outside of MPX modules and so standard PCIe cards won't have any issues?
The shroud around the MPX modules should keep them relatively independent from other modules and cards. If the shroud itself becomes a 'backstop' heat sink, they may run into problems with radiation to adjacent cards. In most standard designs, the 'hot' side of the card is mostly covered by a fan. The fan is sucking air in, so it won't be a radiative surface. Plus, air being sucked into the channel between cards has a mild benefit of moving air over the adjacent card's 'cooler' side (it is still a net heat source, though).

Once you start putting 3rd-party stuff in there, with its own design for how the air flow needs to be redirected and/or disrupted, that flow is going to get sloppy through the system. For example, the 4-drive RAID system in an MPX bay: you'll have very blunt forward surfaces on those relative to the airflow. Something that goes in and severely disrupts the airflow, and doesn't do anything active to correct that, could run into 'hotspots'.

The 'expansion mount point' downstream of the CPU cooler is a 'hot spot', or at least a 'much warmer than ambient' spot (contributing to why there is nothing normally there). Anything there is at least in the heated "Santa Ana" wind zone. The MPX modules channel their heated air until about the exit of the system. The CPU channel basically dumps into that empty space (there is a huge gap between there and the 'exit'). Anything large enough to get in the way of most of the air flow coming out of the CPU shroud is going to get a substantive percentage of the heat from the CPU. Plus, any turbulence bouncing that downstream CPU air away from the 'exit' isn't going to help.

Slot 5 appears to be 'perfectly' tuned for the Afterburner card. I could see folks getting into trouble if they put a card there that needed aux power (more than 75W) and needed to cool it; that is a spot where you could get onto a slippery slope. The same goes for trying to fill slots 5-8 with cards all running close to 75W socket power: that falls in between the sweet spots of the two fan cooling zones in the top half of the system.

Apple also doesn't have any noise specs on the marketing tech specs page. At full blast (consuming up near 1.2 kW) with all 4 fans going, this may not be 'quiet' anymore.

The rack version may have an even louder upper threshold (and perhaps more cooling). Apple doesn't have exact dimensions on that, but they could make the 3 main fans a bit bigger in the rack system, move more cubic feet of air per minute normally, and spin them faster when they start to fall behind. They may not have to constrain themselves to normal household power, either. In short, they are working on another 'option': an even more expensive enclosure where they don't care so much about noise. If the workload is long-duration, normally running substantively over 1 kW in power consumption, the rack option may be a better fit (at even bigger bucks) with basically the same logic board.

(The rack moves the power button and two Thunderbolt ports to the front, so it may need a slight variant on the logic board. Those appear to be a module on top in the AR view, so perhaps they have one board and two mount points.)
 


Ric Ford

MacInTouch
I forgot about one corner case. The Radeon Pro 580X module will probably be affordable by 'mere mortals'....
For some silly reason, I was picturing the entry-level Mac Pro with a standard PCIe graphics card, not an MPX module. If that skimpy system includes an MPX graphics module, then maybe MPX prices aren't going to be quite as extravagant as I'd feared.

I wonder how open and expensive the MPX system is for third parties? Promise obviously has early access, but that company has always had an oddly close relationship with Apple. Will we see MPX storage and graphics modules from OWC, for example, or Sonnet or others? And how expensive will they be? Lots of unanswered questions at the moment.
 


For some silly reason, I was picturing the entry-level Mac Pro with a standard PCIe graphics card, not an MPX module. If that skimpy system includes an MPX graphics module, then maybe MPX prices aren't going to be quite as extravagant as I'd feared.
A "half height" MPX GPU module that only took socket standard power (75W) could be very close to a standard PCIe graphics card. If the edge of the module only plugs into the x16 socket (and skips the other 400W power pins), then for most connectivity purposes, it is just a standard card with no active fan cooler on it.

The only modification needed would be to route 4 DisplayPort streams into the 'custom' socket. It shouldn't be very hard at all to figure out which pins those are (nicer if Apple just documented it and had a low-cost certification system). But if it were "OK" to kill off all the DisplayPort feeds to the standard Thunderbolt 3 sockets, then it could be even easier (which is another reason Apple should help "do it right"). At least hardware-wise.

You still need a graphics driver to support a full range of functions like boot/FileVault login and Internet Recovery screens. That would be the bigger hiccup for 3rd-party GPUs that were 'new' to Mac system usage. I don't think the MPX physical format is the main blocker in the GPU context.

If Apple just 'lifts' an embedded iMac GPU from a subsection of an "already paid for" iMac design, the costs here should be relatively cheap. It would need some adjustment, but most of the work is done. Set it to the same parameters as in the iMac, and it should run the same, with merely some PCB layout tweaks to feed the module's edge pins.
I wonder how open and expensive the MPX system is for third parties? Promise obviously has early access, but that company has always had an oddly close relationship with Apple. Will we see MPX storage and graphics modules from OWC, for example, or Sonnet or others? And how expensive will they be? Lots of unanswered questions at the moment.
Storage wouldn't be surprising to me. Four to six 2.5" SSDs or four U.2 drives would be a good fit. (With zero contribution to provisioning the DisplayPort streams for Thunderbolt 3, the required edge connector is merely a standard PCIe slot, if the card can survive on just 75W.) The only thing MPX about it would be the physical dimensions of the bay.

Can't see where you would need to license anything from Apple at all if only using the standard PCIe slot portion for connectivity. The only real blocker is wondering whether there will be enough customers to make it worthwhile. (The 2019 Mac Pro is priced so high that the run rate is low.)

Graphics, I'm not so sure about, because there are two pairs of 8-pin power ports, plus a 6-pin power port (for a better-than-entry-level card). If competing solely on cost, the overlap with the "approved" eGPU card list is going to be pretty high, and spanning both a Thunderbolt GPU enclosure and the Mac Pro probably generates more volume to sell into. The third-party MPX candidate would be Blackmagic: Apple has worked with them on eGPUs that are more tightly integrated into the Mac ecosystem, and the GPU is embedded in their designs.

Blackmagic could do something similar to what Apple can do with the embedded iMac GPU and just put it into MPX format. If it overlapped with Apple driver work for something else in the Mac line-up, that would work, and it would bump the component parts volume across both their eGPU and this card.

If you look at the logic boards of the last two Blackmagic eGPUs and at MPX, there are probably some similarities. I can see why Apple 'participated closely' in that, because it was probably partially 'free' R&D for them. The MPX shell is different, but similar enough to get some insights.

The problem for a higher-power GPU card, though, is how the thermal feedback system would couple to the fan driving the air through the GPU. Maybe that is part of the custom socket with the higher power; that would make sense. If shipping in 400W, then you should have a way for the card on the other side to say, "I'm hot, blow more air."
 


28 cores (56 virtual CPUs via hyperthreading) should be able to let you run 112 worker threads. That will definitely speed things up, but you're right that they'll start blocking on I/O if nothing else changes. Fast storage will help. So will optimizing your apps to minimize I/O concurrency. One possible way might be for a single master thread to do all the I/O (read raw data into memory buffers and write completed data from memory buffers to output files), so your worker threads deal entirely with in-memory buffers.
That's a good idea, using in-memory files. I was planning to parallelise another step in the workflow and that would require using a master thread anyway. I'll see about using in-memory files instead of on-disk ones.

On the MacBook I'm still CPU-bound with 32 workers for the job I described. Using the Intel Power Gadget I can see that utilisation is at 99%. Clock speed still hovers around 3GHz. (Screenshot)
 


But here's the thing: While Mac hardware has been getting faster and faster - with huge leaps from 2011-era and earlier systems to current Macs - the software has been getting slower and slower at the same time! Mac OS X 10.6..8 was radically faster than OS X 10.9 - so much so, that I had to upgrade all our systems from hard drives to SSDs. Then OS X 10.10 through macOS 10.12 were significantly slower than OS X 10.9 (which was a great Mac OS that I would still be running if it were up to date on security and app support). And, then, I've found that macOS 10.14 is even slower than macOS 10.12, on hardware that's 50% faster, when it comes to startup times, getting to the Finder, etc. (After that, it's more sprightly on the faster hardware.)
You touch on an interesting issue here. Way back in the '70s, when I briefly worked for Honeywell's computer division, I was surprised to learn that the operating system used more than half of their "small" mainframe's computing power. In other words, more than half of the machine's computing power went to the overhead of managing the programs that did the work. I don't know what the ratio is today, but I suspect that much more of a modern Mac's processing power goes into what is essentially overhead. Much of that "overhead" is intended to make the machine easier to use, but conversely some of that ease of use comes at a high cost in overhead. For example, speech recognition in Siri takes much more overhead than processing keyboard input, and as a touch typist I don't bother with speech input.

Cloud computing adds another form of overhead: the time to retrieve information from the cloud. My first effort at collaboration using Google Docs gave me a vivid example. Two of us several hundred miles apart were working at once on the same section of a document and the delay from my input to the change appearing on screen seemed to be tens of seconds, so long that I moved to another part of the document to avoid screwing up the other person's input.

This leaves me wondering how much the overhead of these supposed enhancements in performance winds up costing in processing and response times. Or, to put it another way: does our human perception of no net speed improvement with faster hardware reflect a trade-off with performance enhancements, many of which don't really enhance performance very well -- or at all -- for some of us?
 


Ric Ford

MacInTouch
You touch an interesting issue here. Way back in the '70s, when I briefly worked for Honeywell's computer division, I was surprised to learn that the operating system used more than half of their "small" mainframe's computing power. In other words, more than half of the machine's computing power went to the overhead of managing the programs that did the work. I don't know what the ratio is today, but I suspect that much more of a modern Mac's processing power goes into what is essentially overhead.
I picked up my iPhone this morning, turned on Low Power Mode, as I normally do, and realized that overhead is very much a factor, and you can see it by looking at power drain. Apple is all about reducing power now, because power translates to battery drain from iPhones to Mac laptops and iPads.
Now, before someone slams the concept, I do understand that performance is a factor, too (and may or may not be considered "overhead"), and iPhone power usage depends a lot on radio factors, but when an iPhone, iPad or Mac drains its battery sitting on a table unused, that's showing overhead.
 


Now, before someone slams the concept, I do understand that performance is a factor, too (and may or may not be considered "overhead"), and iPhone power usage depends a lot on radio factors, but when an iPhone, iPad or Mac drains its battery sitting on a table unused, that's showing overhead.
Excellent point. The power overhead also includes things like display lighting that don't use the processor. I see a big difference in rated power draw and expected lifetime on my MacBook Air when I turn the display brightness down.
 


Ric Ford

MacInTouch
Excellent point. The power overhead also includes things like display lighting that don't use the processor. I see a big difference in rated power draw and expected lifetime on my MacBook Air when I turn the display brightness down.
That's very true, but macOS/iOS is pretty aggressive about shutting off the screen when it's not in use, and batteries still drain with it off. Here are some examples of Apple system overhead:
Apple said:
Use Low Power Mode to save battery life on your iPhone

Low Power Mode reduces or affects these features:
  • Email fetch
  • "Hey Siri"
  • Background app refresh
  • Automatic downloads
  • Some visual effects
  • Auto-Lock (defaults to 30 seconds)
  • iCloud Photos (temporarily paused)
 


For example, speech recognition in Siri takes much more overhead than processing keyboard input, and as a touch typist I don't bother with speech input.
My iPad 9.7" updated to iOS 12.1.3 yesterday. While exploring Settings > Siri and Search > Siri Suggestions, I turned off "Suggestions in Search," "Suggestions in Look Up," and "Suggestions on Lock Screen", and rendered local search on the device next to useless. Turning "Suggestions in Search" back on restored at least some functionality. I'm not sure how those relate to the telemetry I associate with Siri, but the exercise does show how Apple has merged Siri, which I associate with cloud connections and services, into a local activity, thereby increasing the likelihood I'll leave it enabled and possibly more "cloud connected."
This leaves me wondering how much the overhead of these supposed enhancements in performance wind up costing in processing and response times.
My venerable 15" MacBook Pro (R.I.P.) had "only" 6 GB of RAM, which was more than Apple officially supported. I had a Crucial SATA III SSD installed by a not-Apple shop after confirming it would work with the old laptop's SATA I connector. There was considerable performance gain, although the SSD was capable of multiple times the SATA I connector's max speed. I attribute much of the speed gain to lower latency.

Most of my work on that system was in Excel or LibreOffice Calc. Spreadsheets work best when entirely in RAM, but at some point an "enhancement" in OS X started making much more active use of disk cache to reduce "memory pressure." That might work great on a really fast Apple PCIe SSD; not so great through SATA I. By limiting how many spreadsheets I had open, and with settings I read about and changed through Terminal, I was able to avoid the "benefits" of that Apple enhancement. It was another of those attempts to have my Macs work like I want, not like Apple dictates, which I've given up on since Apple's updates sweep them away.
I picked up my iPhone this morning, turned on Low Power Mode, as I normally do, and realized that overhead is very much a factor
Over in Android Land, my recent LG phones (V30, LG G8) offer "Battery Saver." I just leave mine on Maximum. I disconnected my G8 from the charger four hours ago. It is at 94% charge and reports that it will last (ha!) 45 h and 33 min. I have been using it during the morning: I've taken some photos of a construction project, checked email, and read several websites, including MacInTouch and the review of the new OnePlus 7 Pro below:
Hot Hardware said:
OnePlus 7 Pro Review
Though the OnePlus 7 Pro sports a rather capacious 4,000 mAh battery, the phone pulls off only middling performance here in our PCMark Work 2.0 battery life test. Further, dropping screen resolution and display refresh rate only gained us a little over a half hour of additional up-time. Perhaps a lower RAM configuration at 6GB or 8GB may offer slightly better battery life as well.
The reviewed phone has 12 GB of RAM, 256GB UFS 3.0 storage, and a 90Hz 3120x1440 OLED screen. Android OEMs have been in a RAM race, with observers commenting that adding large amounts of RAM isn't necessary but does consume battery.
 


Ric Ford

MacInTouch
Aside from raw image adjustments, most of these tasks are I/O-bound, not CPU-bound.
As far as CCC/rsync goes, I think you might find that it's more CPU-bound than you realize. It seemed odd to me, too, when I realized what was happening....
Thinking more about this, I'm going to guess that file metadata is cached in RAM, so CCC/rsync can slam through all that data with maximum CPU power, doing its comparisons, and it's only when files actually are different and need to be copied that things might get I/O-bound. (We're talking about metadata for literally millions of files.)
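Not CCC's or rsync's actual code, of course, but a toy sketch of why that comparison phase can peg the CPU once metadata is cached: it reduces to millions of in-memory stat comparisons, with the paths below purely hypothetical.

```python
import os
from pathlib import Path

def needs_copy(src: Path, dst: Path) -> bool:
    # rsync-style "quick check": compare size and modification time only,
    # never file contents.
    if not dst.exists():
        return True
    s, d = src.stat(), dst.stat()
    return s.st_size != d.st_size or int(s.st_mtime) != int(d.st_mtime)

def changed_files(source: Path, backup: Path):
    # Walking a tree whose metadata is already cached in RAM is pure CPU work;
    # actual I/O happens only later, for the files that really changed.
    for root, _dirs, files in os.walk(source):
        for name in files:
            src = Path(root, name)
            dst = backup / src.relative_to(source)
            if needs_copy(src, dst):
                yield src

# Hypothetical paths, purely for illustration:
for f in changed_files(Path("/Users/me/Documents"), Path("/Volumes/Backup/Documents")):
    print(f)
```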
 


For me, the new Mac Pro will be a "necessary evil". I have four external hardware RAID systems (each less than two years old), fed by PCIe cards, and two nearly new, very pricey Eizo displays (for the macOS UI).

Even at the low-end (with more storage and RAM, however), a new Mac Pro would blow away my dual-CPU 2012 Mac Pros. I will likely use all of the PCIe slots in the new Mac Pro. Thunderbolt just doesn't cut it for storage bandwidth and speed. I will most likely get the 16-core CPU.

Depending on prices, I may go with the Afterburner card and one of the Vega II Duo MPX modules. It is a shame that it looks like Nvidia GPUs won't be an option. (I've got an unused, brand-new Nvidia GPU that I currently can't use with macOS 10.13.6, as dual Nvidia GPUs were last supported in macOS 10.12.x.)

If Apple hadn't introduced this form factor, I would have considered a hackintosh, though I probably would have moved to a Linux system or even a Windows box... In the past year I've grown very fond of using FCPX...
 


I wish there were a form factor between the Mac Mini and the Mac Pro.

This year, I replaced a MacBook Pro (Mid 2012) with a Mac Mini (2018). I was sold on upgradable RAM, four Thunderbolt 3 ports, and the potential for an eGPU upgrade later. With an iPad, I rarely travel with my Mac anymore. I am not interested in the iMac form factor. I've had them before, and while they are great machines, I prefer the flexibility of managing displays separately.

I would be interested in something equivalent to an iMac Pro in specifications, minus the display, plus the ability to manage RAM, SSDs, and a few PCIe cards (but probably not 8). It wouldn't be for the highest-end users, and it would sit between the Mini and the Pro, including in price. (My past includes the Mac IIci and Quadra 950, so the current proposition is not altogether unfamiliar.)
 


Ric Ford

MacInTouch
My past includes the Mac IIci and Quadra 950, so the current proposition is not altogether unfamiliar
I see that the Quadra 950 started at $8500 back in 1992 - $15,526 in today’s dollars - making the 2019 Mac Pro look like a bargain in comparison. Unfortunately, Apple seems determined not to provide anything like a modern Quadra 650* equivalent for fear of undermining its current pricing model, and in the context of the current iMac Pro, considering its display, RAM, CPU, GPU, and storage, the 2019 Mac Pro is overpriced.

* $2780 in 1994
 


In the absence of an xMac (and you can't make a purchase decision based on something you can't buy now, or possibly ever), the options for those of us enthusiasts who want a powerful but expandable system are:
  (a) Hackintosh
  (b) iMac Pro
  (c) Mac Pro

I bought my first Mac in 1995 (a Power Mac 6100/66) and, since then, I've always bought 'professional' models in preference to 'consumer'. My main machine is a heavily upgraded 2009 Mac Pro, bought in 2013 after the trashcan proved too little for too much, and I still regularly use its 2006 predecessor. So I've been considering the options/reasoning/excuses for going for the 2019 Mac Pro rather than the alternatives....

First off: Hackintosh. Doesn't really appeal. Yes, I know many people can and do use them as daily drivers, but it'll take a lot of research and maintenance... I'm certainly not afraid of tweaking stuff and getting my hands dirty, but I'd rather actually use the thing.

Next: iMac Pro. I have a very nice 4K panel (Dell P2715Q) already. All expansion [must be] via Thunderbolt 3; in my case, that'll be plenty (more) boxes with power supplies for things like hard drives. Internal GPU upgrades aren't possible, and an eGPU requires hacks to use the internal display (and I don't really have desk space for two 27" panels). Current CPUs are the previous, Skylake-generation Xeon Ws, so comparisons between the 8-core iMac Pro and the forthcoming Mac Pro are somewhat misleading: base clock speeds are about 10% higher on the Mac Pro, and though turbo boost speeds are slightly lower, there's considerably more cache memory. And then there's the long-term issue of dust build-up and the restrictions on CPU upgrades due to power and cooling capacity (no 28-core option here)... Even as it stands, the iMac Pro is a better value option, but much will depend on when/if Apple upgrades to newer Xeons, and what it does or doesn't do about the price gap.

Then, the Mac Pro. A big capital outlay, but this is a long-term investment. With the right options and judicious upgrades, it should last me as long as the 2006 Mac Pro. The Promise J2i hard drive enclosure is likely to be expensive but a lot neater than Thunderbolt 3 boxes, and others will almost certainly offer alternatives. The standard 256GB SSD is mean, but there are many options to add NVMe drives via PCIe cards. More USB ports? There's a cheap card for that. The Radeon Pro 580X GPU has drawn comments that it is 'weak'. No, it shouldn't be standard on a $6000 system (and that will likely be £6000 here in the UK), but it's much better than many of the cards fitted to Mac Pros of the past, and more than capable for many tasks (I have one in my 2009 Mac Pro). And it's easy to add a more powerful card and keep the 580X as an extra for GPU-accelerated applications. No issues getting a CPU upgrade fitted, and cleaning will be no problem whatsoever.

In effect, what I'd be doing is buying this Mac upgrade, plus the one I would buy in four or five years, plus the one four or five years after that, all in one go. It might be overkill for what I do today, but in a few years it won't be, and it'll still be a very, very capable tool even then.
 


In effect, what I'd be doing is buying this Mac upgrade, plus the one I would buy in four or five years, plus the one four or five years after that, all in one go. It might be overkill for what I do today, but in a few years it won't be, and it'll still be a very, very capable tool even then.
I purchased my Mac Mini, with upgradable RAM and Thunderbolt expansion, expecting that it could last as long as my MacBook Pro (about 6 to 7 years). A Mac Pro could last twice as long, 12 to 14 years, but at 3.5 times the initial cost and with expandability that may never get used. The specs on the Mac Pro are wildly better on processor, graphics, and RAM (but not SSD), but the price point creates a gap in the product lineup. How is the current (trash can) $2,999/$3,999 Mac Pro, still available on Apple's website, being replaced with a $5,999 model? How do you justify $5,999 with no display and a paltry 256GB SSD when an iMac Pro comes with a 5K display and a 1TB SSD for $1,000 less? I'm not the high-end professional Apple wants for the new Mac Pro. I'm a prosumer who doesn't want the constraints of the iMac Pro. I am suggesting a product in the $2,999 to $3,999 range. Maybe that's the new Mac Pro if priced right. Until then, I love this Mac Mini.
 


I purchased my Mac Mini, with upgradable RAM and Thunderbolt expansion, expecting that it could last as long as my MacBook Pro (about 6 to 7 years). A Mac Pro could last twice as long, 12 to 14 years, but at 3.5 times the initial cost and with expandability that may never get used. The specs on the Mac Pro are wildly better on processor, graphics, and RAM (but not SSD), but the price point creates a gap in the product lineup. How is the current (trash can) $2,999/$3,999 Mac Pro, still available on Apple's website, being replaced with a $5,999 model? How do you justify $5,999 with no display and a paltry 256GB SSD when an iMac Pro comes with a 5K display and a 1TB SSD for $1,000 less? I'm not the high-end professional Apple wants for the new Mac Pro. I'm a prosumer who doesn't want the constraints of the iMac Pro. I am suggesting a product in the $2,999 to $3,999 range. Maybe that's the new Mac Pro if priced right. Until then, I love this Mac Mini.
I fully agree. There is currently a gaping hole in the Mac line-up. The Mini's a great machine - it would be a lot faster than my 2009 Mac Pro for many tasks, but I know I'd run into its limitations rather too quickly for my liking - straight away I'd want an eGPU, Thunderbolt 3 enclosures for hard drives… Apple should plug that gap - my choice would be for a smaller Mac Pro case with two or three PCIe slots and an i9, or perhaps a Mini Pro with better graphics and cooling, plus a faster CPU. The only macOS machines in this gap are Hackintoshes, sadly.
 


Apple should plug that gap - my choice would be for a smaller Mac Pro case with two or three PCIe slots and an i9, or perhaps a Mini Pro with better graphics and cooling, plus faster CPU. The only macOS machines in this gap are Hackintoshes, sadly.
It's funny that, after all these years, what people really want is a modern Power Mac 8500. It had a replaceable processor, lots of RAM, 3 slots, and drive bays. Come on, Apple, go back to the future!
 



... I suspect that much more of a modern Mac's processing power goes into what is essentially overhead. Much of that "overhead" is intended to make the machine easier to use, but conversely some of that ease of use comes at a high cost in overhead.
One of my favorite quotes in computing was attributed to an engineer at, I believe, DEC, who said "No matter how clever the hardware boys are, the software boys just piss it away."

Assessing how much of a performance hit macOS imposes is complicated by the fact that some of the raw power of your machine is consumed by the essential, 'always-on' parts of your OS (i.e. the kernel and whatever else is required to make programs run at all) and some by transient programs running in user space to do housekeeping or interact with the user. Someone mentioned 'mds_worker'. That, if I recall correctly, is the Spotlight indexer. It'll fire up from time to time and run for a while, then go away. While it's running, it will suck up substantial amounts of CPU, memory and disk I/O, so you'll see a performance hit. When it's not, though, the cost is negligible. I'd expect the same to be true of Siri - when you're actually talking to Siri, you're paying a price in CPU, but when you're not, the cost is probably negligible.

I'm currently running a CPU-intensive task on my laptop, so my machine is running essentially flat-out. Activity Monitor shows the load for 'system' at around 57%, with the rest taken up by user space processes (such as the 3D renderer I'm running). But a few moments ago, 'system' was at around 9%, so there's a lot of variability. And some part of that 'system' load will actually be calls executed on behalf of my renderer, so factoring out what's using what is very hard.
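(As a concrete way to watch that user/system split outside Activity Monitor, here's a small sketch using the third-party psutil package; nothing about it is specific to the renderer above.)

```python
# Sketch: sample the user/system CPU split while a workload runs.
# Requires the third-party psutil package (pip install psutil).
import psutil

for _ in range(5):
    t = psutil.cpu_times_percent(interval=2.0)  # averages over a 2-second window
    print(f"user {t.user:5.1f}%  system {t.system:5.1f}%  idle {t.idle:5.1f}%")
```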

Apple's XNU kernel (a hybrid based on the original Mach micro-kernel) is an amazing bit of engineering, but it's reportedly somewhat inefficient, due to context switching needed to support multiple OS's. So there's definitely a performance hit there; a simpler micro-kernel will get much more out of the hardware in terms of basic processing power. On the other hand, if you tried to do all the things that macOS can do on a simpler kernel, it might be less efficient overall than XNU, which is optimized for those tasks.

TL;DR: Apple's architecture is optimized for running macOS with all its bells and whistles, not just Siri and Spotlight, but also modern GUI-based applications, multiple OS's and so forth. As such, both the permanent parts of the OS and kernel and transient utility programs running in user space will suck up a lot of the raw power of the hardware. You could get more performance on the same hardware with a simpler OS, some minimalistic Linux, say. But then you wouldn't get the modern macOS experience.
 


Like others here, I historically bought pro-level machines since 1995 or so (my first Mac was bought in 1984 – I'll never forget the day I was able to afford an SE/30). As a freelance professional Mac technician at that time, it was always the best option for me – I would get the next-highest model, and eventually it would be full of PCI cards and hard drives (sometimes sticking in one more than was officially supported). I bought a new computer every 5 years or less and was always thrilled with my new purchase – even the brand-new G5 that immediately needed a new processor: an Apple technician came to the house and replaced the motherboard the day I called Apple, thanks to AppleCare.

In 2012, my Mac Pro was showing its age; it didn't look like a new one would be the best purchase for me – the fastest Mac available was the mid-2012 15" MacBook Pro Retina. I bought a 16GB RAM BTO model and the Thunderbolt Display, and was extremely happy with my purchase (and when the new Mac Pro came out in 2013, I had absolutely no buyer's remorse). That machine has lasted me over 7 years, longer than any other of my machines.

But the machine has developed a hardware problem, which has caused increasing slowdowns, and I realized that it was finally time to get a more modern machine. After putting off the purchase for months, and after seeing the cost of the new Mac Pro (if I weren't retired, I might have bought one), and after some excellent advice from Ric and others here on the forum, I bought a 2018 Mac Mini. I opted for the 1TB SSD, but only 8GB RAM – it was much cheaper to buy RAM elsewhere and install it myself. And by the third time of opening the Mini to get at the RAM slots, it got pretty easy (I leave it to the imagination of the reader, who may also have experience in this area, as to why I needed to open the machine 3 times). It plays well with the Thunderbolt Display and all the Thunderbolt 1 drives hanging off the display. I look forward to getting some I/O native to USB-C or Thunderbolt 3; I'm trying to find some quad-NVMe PCIe cards to use with a PCIe expansion box that will work with a Mac.

This machine is very responsive. I work regularly in Photoshop, Illustrator, Logic Pro, and various word processors, with the occasional use of Final Cut Pro; the Mini handles it all with ease. I am once again happy with a new Mac purchase.
 


Ric Ford

MacInTouch
I look forward to getting some I/O native to USB-C or Thunderbolt 3; I'm trying to find some quad-NVMe PCIe cards to use with a PCIe expansion box that will work with a Mac.
That might not actually make sense, as a single NVMe SSD is already pushing the limits of Thunderbolt 3, if I understand the details correctly, so four of them on a single card hooked up through Thunderbolt 3 would mean they'd just be handcuffed by the bandwidth limits.
Wikipedia said:
Thunderbolt 3
Compared to Thunderbolt 2, Intel's Thunderbolt 3 controller (codenamed Alpine Ridge, or the new Titan Ridge) doubles the bandwidth to 40 Gbit/s (5 GB/s)

PCI Express
PCI Express 3.0 x4 (four lanes): 3.94 GB/s

NVM Express
M.2, formerly known as the Next Generation Form Factor (NGFF), uses an M.2 NVMe solid-state drive computer bus. Interfaces provided through the M.2 connector are PCI Express 3.0 (up to four lanes).
 


It's funny that, after all these years, what people really want is a modern Power Mac 8500. It had a replaceable processor, lots of RAM, 3 slots, and drive bays. Come on, Apple, go back to the future!
...a machine which you had to completely disassemble to add memory or VRAM, or change the battery. What was the procedure? Oh yeah:
  1. Remove screws.
  2. Remove top housing.
  3. Reach way inside and remove the power actuator. Not too hard to remove, but a bear to put back.
  4. Remove the PCI card retainer.
  5. Remove the CPU daughter card.
  6. Remove all PCI cards.
  7. Disconnect and remove the AV module.
  8. Disconnect all the remaining cables: power supply (2), floppy drive, CD-ROM, speaker, SCSI.
  9. Remove the logic board screw.
  10. Slide the logic board part way out.
  11. Lift the logic board catch.
  12. Pivot the logic board out.
  13. Add memory.
  14. Do everything in reverse.
Be sure to have bandages on hand for the scraped knuckles.
 


Most discussion on the new Mac Pro relates to the fact that with upgradeable components we should plan on using the machine for many years. My concern is Apple. Are they going to say the latest operating system won’t run on it in 4, 5, or 6 years with no security upgrades on the last system? Anyone else have this concern, or am I worrying unnecessarily?
 


Ric Ford

MacInTouch
Most discussion on the new Mac Pro relates to the fact that with upgradeable components we should plan on using the machine for many years. My concern is Apple. Are they going to say the latest operating system won’t run on it in 4, 5, or 6 years with no security upgrades on the last system? Anyone else have this concern, or am I worrying unnecessarily?
I would think that any of us who bought the original Mac Pro, whose 32-bit boot code Apple cynically and unnecessarily abandoned*, might have that concern, but Apple is definitely marketing the overpriced new model as a long-term investment.


*But it boots and runs a completely up-to-date, 64-bit Linux, no problem.
 


...a machine which you had to completely disassemble to add memory or VRAM, or change the battery. What was the procedure?
It was decades ago, but I only remember maxing out the machine a few times: once when RAM prices got low enough to move up to the next tier, once to replace the daughtercard.

You're being overly obtuse; we want a modern 8500-like machine, not the actual 8500 design.
 


That might not actually make sense, as a single NVMe SSD is already pushing the limits of Thunderbolt 3, if I understand the details correctly, so four of them on a single card hooked up through Thunderbolt 3 would mean they'd just be handcuffed by the bandwidth limits.
FWIW: Yup. To be precise, there are few if any M.2 NVMe SSD blades that are truly limited by the PCIe3 x4 interface today, but it's close. A 4-SSD card would be heavily limited. The top-end 4-SSD cards (example: the Highpoint SSD7101) have a PCIe3 x16 bus interface and can use every bit of it.
 


It was decades ago, but I only remember maxing out the machine a few times: once when RAM prices got low enough to move up to the next tier, once to replace the daughtercard.
You're being overly obtuse; we want a modern 8500-like machine, not the actual 8500 design.
Obtuseness aside, I don't know that machines with upgradable components are actually a good idea. Sure, there are users who need PCIe slots to install specialized hardware, but I'm talking about upgrading as a means to extend the life of the computer.

As you can tell, I had a Power Mac 8500, which I upgraded extensively. I spent nearly as much on internal upgrades alone as I did on the original computer:
  • More VRAM ($70)
  • More memory ($205)
  • Graphics card ($230)
  • CPU upgrade ($690)
  • Ultra2 Wide SCSI ($429)
  • Hard drive ($399)
Sure, I was able to keep using it for 12 years, but what I missed out on in that time was all of the base-level improvements, such as faster system clocks and memory. I didn't even have USB ports.

For example, instead of upgrading it to last 5 more years, maybe I could have bought a Power Mac G5 for less than I had spent on upgrades.

So, in retrospect, I regret the upgrades, especially at the end, when the Power Mac 8500 suddenly decided it didn't want to boot. That's when I learned that the problem with a SCSI tape backup system is that you can't use it to restore when new Macs don't have any SCSI ports! (Fortunately, I was eventually able to get it to boot one final time and transfer the data off of it.)
 



That might not actually make sense, as a single NVMe SSD is already pushing the limits of Thunderbolt 3, if I understand the details correctly, so four of them on a single card hooked up through Thunderbolt 3 would mean they'd just be handcuffed by the bandwidth limits.
Don't forget, we had the discussion a few months ago where I determined that Thunderbolt 3 limits data transfer to 22 Gbps. That means that even a Thunderbolt 3 interface will be slower than the max speed of some of the higher-performing NVMe drives.
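The back-of-the-envelope numbers, as a quick sketch (assuming ~0.985 GB/s per PCIe 3.0 lane after 128b/130b encoding, and the 22 Gbps usable-payload figure above):

```python
# Rough check of the bandwidth ceilings discussed in this thread.
TB3_DATA_GBPS = 22              # usable PCIe payload over Thunderbolt 3 (per above)
tb3_gbytes = TB3_DATA_GBPS / 8  # = 2.75 GB/s

PCIE3_LANE = 0.985              # GB/s per PCIe 3.0 lane after encoding (approx.)
nvme_x4 = 4 * PCIE3_LANE        # ~3.94 GB/s: one NVMe SSD's interface limit
quad_card_x16 = 16 * PCIE3_LANE # ~15.76 GB/s: a 4-SSD x16 card like the SSD7101

print(f"TB3 usable payload: {tb3_gbytes:.2f} GB/s")
print(f"One NVMe SSD (x4): {nvme_x4:.2f} GB/s -> already above the TB3 ceiling")
print(f"Quad-SSD card (x16): {quad_card_x16:.2f} GB/s -> ~{quad_card_x16 / tb3_gbytes:.1f}x what TB3 can carry")
```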
 


Ric Ford

MacInTouch
Obtuseness aside, I don't know that machines with upgradable components are actually a good idea. Sure, there are users who need PCIe slots to install specialized hardware, but I'm talking about upgrading as a means to extend the life of the computer. As you can tell, I had a Power Mac 8500, which I upgraded extensively. I spent nearly as much on internal upgrades alone as I did on the original computer...
I think you have to be smart about what you're doing. I never regretted upgrading my Power Mac G5 with an eSATA card for high-speed external storage, an SSD for much faster boot and app access, or RAM upgrades for better performance. The SSD cost a few extra bucks, but the other upgrades were pretty reasonable.

To your point, though, there is no way to upgrade Thunderbolt 2 to Thunderbolt 3, which is a game changer. Heck, Apple couldn't even manage to upgrade its own 2013 Mac Pro, which it is still selling, to Thunderbolt 3 over the course of six years....
 

