
Apple Oct. 2018 announcements

What one modern Mac would you most like MacInTouch to get for testing?

  • 2018 Mac Mini

    Votes: 35 79.5%
  • 2018 MacBook Air Retina

    Votes: 3 6.8%
  • 2018 MacBook Pro

    Votes: 5 11.4%
  • iMac 5K

    Votes: 1 2.3%
  • Other... (please describe)

    Votes: 0 0.0%

  • Total voters
    44
  • Poll closed.

Ric Ford

MacInTouch
I look forward to hearing your impressions and experiences with your new baby, Ric!
I'm already trying to line up USB-C storage and video adapters/connectors. I got Mojave running on another system, though I'll probably focus on the Mini for Mojave testing.

One small concern, as tiny as it is, could be a real irritation: what if the Mini lacks any sort of speaker? It's just... unpleasant... to think that I would have to attach additional speakers to hear a startup chime and user interface sounds. Ugh. My MacBook Pro's internal speaker is surprisingly useful (and I don't always need a big sound system hooked up).

So I'm looking for the most minimal possible solution. Any suggestions for an elegant, tiny speaker that doesn't need babysitting, if Apple drops this ball?

(I guess this wouldn't be an issue if my display had a speaker... but none of mine do.)
 


I'm already trying to line up USB-C storage and video adapters/connectors. I got Mojave running on another system, though I'll probably focus on the Mini for Mojave testing. One small concern could be a real irritation, as tiny as it is, and that is if the Mini lacks any sort of speaker....
Didn't I read that Apple has abandoned the "startup chime" and even the LED that signals laptops are powered but perhaps taking a nap?

Even if your new Mojave Mini has that startup ding, will you still be waiting around for the "mysterious" boot progress bar to finish? Looking forward to that report on your new system.

Long ago and far away when I was designing museum exhibits, I discovered the joy of plugging a sound source into a cheap laser display with audio input. Beethoven's 9th from a CD player (no amp required) could actually be heard through the stepper motor, and seen on the wall. Just sayin'

There's a wide range of Bluetooth speakers, some very small, that also work through the headphone jack - that is, if you have a headphone jack. It's an unusual day when 9to5Toys doesn't list sales of those; the last one I bought was $20 and sounds pretty good.

You could splurge. I recently paid $320 for a 49" TCL 4K UHD HDR TV. It's a surprisingly good monitor, and set at 1920 x 1080 makes everything comfortably legible. Sound through its built-in speakers is better than any of my speaker-equipped computer monitors, but not as good as the rather small set of Sony powered analog computer speakers that date back to Windows 98.
 



Ric Ford

MacInTouch
Don't forget that you'll also need a 10GbE switch, which, while prices have come down recently, will still run you at least $200.
Ouch, didn't think of that. A quick look around turned up a Netgear Nighthawk Pro Gaming Switch at $235 that has just two 10GbE ports (plus 8 Gigabit ports), while there are 10GigE switches with more ports (often SFP rather than RJ45) at some painful prices. A Netgear 4-Port 10Gig switch (with an SFP+ uplink port) is $399.99, while a $575 Netgear 8-port 10GigE switch touts "10 Gigabit performance for less than $100 per port."
 


Ric Ford

MacInTouch
Howard Oakley is wrestling with Apple's options and prices, as he struggles to find a suitable iMac replacement and considers the new Mac Mini:
The Eclectic Light Co. said:
Which Mac next?
... Apple’s new Mac models are very welcome, indeed long overdue. I am delighted that it has at last breathed some life back into two important Mac ranges, the MacBook Air and Mac mini. But I’m amazed that it has left its flagship desktop systems so outdated and overpriced, and leaves customers like me in a quandary, unable to buy a system which meets a fairly undemanding specification.
 


Ouch, didn't think of that. A quick look around turned up a Netgear Nighthawk Pro Gaming Switch at $235 that has just two 10GbE ports (plus 8 Gigabit ports), while there are 10GigE switches with more ports (often SFP rather than RJ45) at some painful prices. A Netgear 4-Port 10Gig switch (with an SFP+ uplink port) is $399.99, while a $575 Netgear 8-port 10GigE switch touts "10 Gigabit performance for less than $100 per port."
I did see this Netgear 8-port Gigabit Ethernet Unmanaged Switch, Desktop, 2x10-Gig/Multi-Gig, earlier for $199; I didn't realize it has two 10GbE ports. I guess it's the non-gaming version of that Nighthawk switch. An advantage it has, though, is no fans, unlike those rackmount switches.

The thing with 10GBase-T is that it takes a lot of power to run, which may be why you don't see any desktop switches with more than a couple ports. If you need a switch with 8+ 10GBase-T ports, you may have to settle for a switch with a loud fan.
 


What I really meant to say was "secure boot drive", i.e. the main partition/volume with a bootable OS on it, as opposed to data drives or backup drives. But, there's an issue, too: You're going to back up to a non-T2 backup drive, so all your files (including all sensitive data) will live outside the T2 security realm, anyway (since the backup drive needs to be a separate device).
The backup (or external boot) drive can still be encrypted via FileVault, but the keys to unlock it are stored on the external device. That's slightly less secure but a reasonable trade-off. It means another Mac can actually decrypt and mount that drive - for example, a Time Machine backup drive being used to restore a Mac destroyed in a fire. You actually don't want that fire-destroyed T2 to be the singular holder of the key to restore the drive. Pragmatically, the keys to an encrypted backup drive also have to be backed up so you have a coherent "whole" backup. (With APFS and other approaches, there is a small clear-text stub that holds some metadata and the encrypted key, which can be unlocked with a password.)
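To make that "clear-text stub" idea concrete, here's a minimal sketch of the general shape - a randomly generated volume key wrapped by a password-derived key. The names and parameters are my own illustration (not Apple's actual FileVault format), using the third-party Python "cryptography" package:

```python
# Toy illustration of a password-wrapped volume key. Structure only -
# this is NOT FileVault's real on-disk format.
import base64, os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.fernet import Fernet

def _kek(password: str, salt: bytes) -> bytes:
    # Derive a key-encryption key from the user's password.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=200_000)
    return base64.urlsafe_b64encode(kdf.derive(password.encode()))

def make_stub(volume_key: bytes, password: str) -> dict:
    # The "clear-text stub": some metadata (here, just the salt) plus the wrapped key.
    salt = os.urandom(16)
    return {"salt": salt, "wrapped_key": Fernet(_kek(password, salt)).encrypt(volume_key)}

def unwrap_stub(stub: dict, password: str) -> bytes:
    # Any machine that knows the password can recover the volume key - no T2 required.
    return Fernet(_kek(password, stub["salt"])).decrypt(stub["wrapped_key"])
```

The point being that everything needed to unlock the volume travels with the drive itself; only the password lives elsewhere.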

There are still non-T2 Macs out there (far more than those with the T2). The mechanisms to secure those drives are still in place for the other storage drives. The T2 adds incrementally to that security in that it makes sure the boot process that is asking for the passwords to decrypt the external drive (or alternate internal devices if/when those come along) has been authenticated.
Meanwhile, there's the whole world of TCG Opal storage security standards, which seem similar in goals (though I don't know much about it). How does that compare with Apple's proprietary T2 system?
It is very similar, but it is designed around putting a secure enclave inside the drive. Apple puts its secure enclave inside the main system. As this is largely a fallout from its iOS devices (just reusing the same implementations to get economies of scale), the notion of an external/removable drive doesn't fit. Those are all embedded drives, so we end up with an embedded drive here, too.

Opal either uses a mini "pre-boot authentication" OS off the drive or the BIOS (UEFI) to gather the password to unlock the key. Those aren't as unified a trust chain as the Apple approach. If the pre-boot authentication software can be tweaked (directly or indirectly), the password can be gathered.

Windows BitLocker will work with Opal 2 drives. Conceptually, Apple could wrap FileVault around Opal 2 (and later) as well, but I suspect keeping their own system secure probably fills most of their time. (Lots of folks are trying to hack in, so it's a never-ending 'spy vs. spy' game.)
 


I have to say, the news that I might be able to use NVMe storage in my ancient 2010 Mac Pro 5,1 may sway me from getting a faster but graphics-challenged Mini, especially given the cost of Thunderbolt drives - apparently any case is in the hundreds of dollars. (USB 3 drive cases are apparently much cheaper; I wonder what the performance difference is?)

I really don't like the idea of nonreplaceable drives, even if they are just a bunch of flash memory. If one of these fails and doesn't fail "just right," I gather we have an expensive paperweight. (We can use an external drive to boot but, correct me if I'm wrong, a failing internal drive will screw things up regardless. That's based on an old memory of complaints at MacInTouch; am I misremembering?)

I've added up the price, and I'm basically looking at $2,000 to replace my computer with one that's twice as fast on CPU and one-third as fast on GPU, at a time when everyone seems to be putting loads onto the GPU. I could, of course, go to a 2013 Mac Pro, which is even more expensive, lacks Thunderbolt 3, may or may not be supported past the next OS, and is pretty obsolete now, but which has good GPU speeds. Or spend $500 more and get a Mini with an eGPU, if it's even supported. Or go to an iMac and have fewer ports but a new nonreplaceable monitor, even harder to service than a Mini...

Sadly, I have to buy during this fiscal year. The new Mac Pro might be just the right solution, but all we have so far is a name. It's going to be a tough choice, though really, the Mini’s looking best... just awaiting that Compute score...
 


I talked tonight with a friend, who is a big Apple fan and customer, about Apple's announcements this week and the concept of Apple switching from Intel-based Macs to computers based on its own, ARM-based processors, which are getting very fast.

While he likes the new iPad Pro a lot and acknowledges its amazing benchmark performance, he pointed out that the key here is software, just as software was the key to the Mac's success, more than any hardware superiority vs. Windows hardware.

So we discussed important Mac software that isn't supported on an iPad Pro. Let's start with Apple's own software:
  • Xcode
  • Final Cut Pro X
  • FileMaker Pro
When Apple can produce compelling iPad Pro versions of its own professional apps, then that whole concept that Tim Cook has been pushing might make a little more sense.
OK, I'm going to declare the challenge complete! I've ordered a 2018 Mac Mini, due for delivery by Nov. 15. We had contributions from 20 people for a grand total of $1865 (minus PayPal fees). That's good enough (not counting tax) for this 2018 Mac Mini:

2018 Mac mini
  • 3.2GHz 6‑core 8th‑generation Intel Core i7 (Turbo Boost up to 4.6GHz)
  • 16GB 2666MHz DDR4
  • 512GB SSD storage
  • Intel UHD Graphics 630
  • 10 Gigabit Ethernet
Subtotal: $1,799.00
Shipping: $8.00
Estimated Tax: $112.94
Order Total: $1,919.94

Some decision factors along the way:
  • People indicated clearly that they wanted this to be a Mac Mini and not a different Mac. (I might possibly have chosen a laptop or iMac otherwise.) The Mac Mini has been one of Apple's great computers in the past, and I think many of us are anxious to know if this will be another good one.
  • I ended up opting for the i7 processor mostly because that has been something people really wanted/liked in the past. I don't have a particular need for that much power myself currently, but the budget was there, and this lets us assess the capability offered by the higher end of this promising 8th-generation CPU family.
  • I ended up limiting internal storage to 512GB, because the price for going to 1TB just seemed too high*, and the point of Thunderbolt 3 and 10Gbps USB-C is to support high-speed external storage (which will be a priority for my testing).
  • As noted earlier, I went for the 10Gigabit Ethernet port, because that's unique and it will be good to be able to test that (though I don't currently have any 10GigE devices).
  • I went for 16GB RAM, because I'm concerned about the difficulty of doing our own memory upgrades in this machine - a big question as we go forward.

*Apple charges $400 to bump the internal SSD from 512GB to 1TB, while a high-speed 500GB SSD from Samsung retails for less than $150! Apple's 2TB configuration costs $1,400 more than its 256GB baseline, while a pair of high-speed Samsung 970 1TB M.2 cards will only cost you $456. Apple's getting away with highway robbery here.
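Just to put numbers on that markup (using the prices quoted above, which will of course drift over time), a quick back-of-the-envelope sketch:

```python
# Apple BTO upgrade price vs. roughly equivalent retail flash, per the figures above.
apple_bto = {"512GB -> 1TB": 400, "256GB -> 2TB": 1400}
retail    = {"512GB -> 1TB": 150, "256GB -> 2TB": 456}   # Samsung retail, as quoted

for step in apple_bto:
    ratio = apple_bto[step] / retail[step]
    print(f"{step}: Apple ${apple_bto[step]} vs. retail ${retail[step]} -> {ratio:.1f}x")
```

That works out to roughly a 2.7x to 3.1x premium on the added capacity alone.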
We could have another PayPal challenge (autocorrect wanted to make it a papal challenge) to get the 64GB RAM upgrade from OWC, once you have tested it with 16GB.
 



Ric Ford

MacInTouch
USB 3 drive cases are apparently much cheaper; I wonder what the performance difference is?
You can see benchmarks if you scroll back in this discussion or poke around discussions in the previous MacInTouch forums.

Thunderbolt 3 is 40 Gbps; USB 3.1 Gen 2 is 10 Gbps (carried over Apple's Thunderbolt 3/USB-C ports); and USB 3.0 is 5 Gbps. Performance differences depend on which of those connections you use, what's supported by the drive enclosure (e.g., UASP on USB 3, 10 Gbps on USB-C), and the speed of the devices themselves.

The Samsung T5 and SanDisk Portable SSD are faster than USB 3.0 but not faster than USB 3.1 Gen 2. The Samsung 970 NVMe is faster than all but Thunderbolt 3.
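As a rough illustration of what those nominal link rates mean in practice, here's a back-of-the-envelope sketch for a (hypothetical) 100GB copy; real-world throughput will be lower, because of protocol overhead and drive limits:

```python
# Best-case transfer times for a 100 GB copy at nominal link rates.
# Treat these as upper bounds, not benchmarks.
links_gbps = {"Thunderbolt 3": 40, "USB 3.1 Gen 2": 10, "USB 3.0": 5}
payload_gb = 100  # gigabytes to copy (hypothetical)

for name, gbps in links_gbps.items():
    seconds = payload_gb * 8 / gbps   # gigabytes -> gigabits, divided by Gb/s
    print(f"{name:>14}: ~{seconds:4.0f} s (~{seconds / 60:.1f} min)")
```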
correct me if I'm wrong, a failing internal drive will screw things up regardless.
It can do that, depending on the details of the failure mode. I've seen a bad drive in a Power Mac disable the system, so it couldn't be booted from a good drive. I assume that a bad drive or controller could disable any system by, for example, flooding it with garbage I/O, if things really went wrong.
It's going to be a tough choice, though really, the Mini’s looking best... just awaiting that Compute score...
Mac Mini specs say:
Apple said:
Graphics:
Intel UHD Graphics 630
And you can search Geekbench results for that:
Geekbench Browser said:
 


It looks like Apple may have crippled the new MacBook Air's performance, much like it did with the 12" MacBook:
It appears the retina MacBook Air has a fan and the retina MacBook does not. That actually makes a difference (outlined in an Engadget article: Why the new MacBook Air isn’t ‘a bigger MacBook’.)

The Dell XPS is more naturally compared to the function key [two-port] MacBook Pro 13" (which Apple has left drifting in the wind a bit). Those are both U-series CPU based and "non-entry" class laptops.

I don't think the Y-series was chosen to 'cripple' the system, but rather to give it longer battery life....

An Apple comparison of the retina MacBook Air 13" vs. 2-port MacBook Pro 13" reveals:

wireless web: 12 vs. 10 hours
battery: 50.3 vs. 54.3 watt-hours
screen: 2560 x 1600 (both)
brightness: 300 vs. 500 nits
color space: sRGB vs. P3
weight: 2.75 vs. 3.02 lbs.

The Retina screens soak up lots of power. The MacBook Air is smaller (so the battery is smaller). But Apple also set the 'conflicting' goal that battery runtime had to be longer. Somewhere, Apple needed power savings. Going with the Y-series does that while keeping to the lighter-weight constraint.

The PC World head-to-head comparison probably very carefully avoids talking about the displays. The more common Dell XPS 13 configurations have a 1080p display. That's the opposite trade-off (i.e., keep the higher-power-consuming CPU package and dump power consumption from the screen). If the primary usage is as a 'desktop replacement' and you will hook up to an external screen 95% of the time, then that is a reasonable trade-off.

Dell also chops out some aluminum by going carbon fiber for the palm rest. (If someone is trying to match the XPS up to the MacBook Air on weight, Apple's competing system, the MacBook Pro 13", is just heavier.)

Apple laptops are designed to be laptops first and any 'desktop, clamshell closed' mode is after that. The display in the Mac laptop matters in the push to go 'retina' across the whole line up.

P.S. I think this shows that the MacBook's super-minimized volume is too small for the trade-offs they have chosen. An A-series [ARM] processor (and a flip to iOS) would probably help it. If they dropped the trackpad and went to a 360° hinge, they'd have a '2-in-1'-like system.

The new retina MacBook Air is probably what the MacBook should have been shifted to. It isn't crippled; it is better (two ports, both Thunderbolt, a non-throttling CPU, and a weight that doesn't try to limbo under the old 11" MacBook Air's 2.38 lbs). Using aluminum meant they over-zealously attacked volume to save weight, past the point of having something that fit well with the CPU choices they had available. The retina MacBook Air expands the volume back out and actually 'de-cripples' the overall system, because that suspect weight target isn't an issue anymore.

When Intel either uncorks its 10nm process or gets to a better one, the Y-series will improve over time, and it's a good fit for this retina MacBook Air. Or AMD does better at mobile CPUs and Apple gives them a shot. Apple does need to make it more affordable, but that's 'doable' too (if they try).
 


And you can search Geekbench results for that:
Interesting. Those results go from around 17,651 to around 37,825, but the average seems to be around 22,000, which means we could see quite a range of results from Apple.

The card in my Mac Pro is Metal-Geekbenched at 60,000 or so. My MacBook Pro's AMD Radeon R9 M370X Geekbenches Metal at 39,060 and OpenGL (which I assume the PC listings are for) at 37,608. The built-in Iris clocks OpenCL at 27,285.

The MacBook Pro performance is really “good enough” for me; but I don't want to go much slower, because I'd like video conversions to take a reasonable time, and sometimes I need to plow through a hundred or two Photoshop images.

I think we really need to see real Mac Mini compute results in OpenGL and Metal to narrow it down; right now it appears the range is from slower-than-Iris Pro built-in speeds to a little slower than R9-laptop speeds.
 


Ouch, didn't think of that. A quick look around turned up a Netgear Nighthawk Pro Gaming Switch at $235 that has just two 10GbE ports (plus 8 Gigabit ports), while there are 10GigE switches with more ports (often SFP rather than RJ45) at some painful prices. A Netgear 4-Port 10Gig switch (with an SFP+ uplink port) is $399.99, while a $575 Netgear 8-port 10GigE switch touts "10 Gigabit performance for less than $100 per port."
I had thought about the switches/cabling/etc. for the network. The challenge may be closed, but I just donated to the cause.
 


Ric Ford

MacInTouch
I had thought about the switches/cabling/etc. for the network. The challenge may be closed, but I just donated to the cause.
Thanks, Wayne! There are a lot of auxiliary items to consider, and I'm looking to try an eGPU (and see if it helps Affinity Photo run better than it does on my MacBook Pro, which lacks any discrete GPU).
 


Didn't I read that Apple has abandoned the "startup chime" and even the LED that signals laptops are powered but perhaps taking a nap?
I definitely remember reading about elimination of the startup chime. I don't expect the new models to have it.

As for the power LED, the Air never had one. The new mini does have it (according to the photos on Apple's web site).

... there are 10GigE switches with more ports (often SFP rather than RJ45) at some painful prices.
A switch with a lot of 10G ports is going to be expensive, because those speeds are not (yet) mass-market technology. Switches with many 10G ports are mostly being sold for use in data centers, not people's homes.

SFP interfaces (more specifically, SFP+ for 10G speeds) should not be deal breakers. I would actually expect to see them on most 10G switches, because the majority of 10G deployments (at least at this time) are over fiber, and there are many different kinds of fiber - single-mode, multi-mode, long-range, short-range, different shape connectors, etc. SFPs allow a manufacturer to sell one switch that is compatible with whatever kind of optical network a customer wants to deploy.

You can get 10GBase-T SFP+ transceivers to allow copper cabling for these switches. Just make sure to double-check switch compatibility. Although SFP is a de-facto standard and devices are mostly interoperable, some switch vendors sell equipment with firmware that will refuse to recognize other vendors' SFPs, even if they would otherwise work. (This is why many aftermarket SFP vendors sell versions designed for different brands of switches - they're the same SFP, but with different ID bits in the firmware in order to be accepted by the corresponding switch firmware.)
 



Pricing is an enormous topic ...
But that's not all. In addition, Apple is price-gouging with the leverage of the control it asserts. We know Apple's skill at purchasing from its suppliers at dirt-cheap rates (cf. Isaacson's biography and other references), yet it's selling that NAND to its customers at quadruple retail rates for equivalent M.2 SSDs.
[Re Apple's flash storage costs], one of the factors here is how many NAND packages Apple is using to get to the higher capacities. If Apple is using 2 packages, and other 1-2TB SSDs are using 4 packages, then those aren't the same packages. Either the density of the dies is higher, or more dies are stacked higher (3D). If Apple is buying the most bleeding-edge NAND, the discounts, even at large volume, aren't going to be as [large as] for mature (or trailing-edge) NAND packages.

The teardown for the Mini should help narrow that aspect down. There seem to be two NAND chips on top of the logic board. There could be two - or a spot for two - on the bottom. However, if Apple is trying to do this with more exotic chips, then that's part of the price.

The iMac Pro gets to 4TB with four NAND chips on two daughter-boards (two packages on top and bottom). Apple's 4TB iMac Pro BTO option is $2,800. The Mac Mini's is half of that for 2TB.

However, the pricing on the 2TB option for the Mini is quite high. You can buy an OWC Express 4M2 and a 2TB Samsung M.2 drive and still have several hundred dollars left over (and more capacity, since you still have the Mini's internal drive and three more empty M.2 slots). It is effectively priced high enough to make Thunderbolt external solutions look more affordable.
There may or may not be some unstated technical justification for eschewing industry standards, but there is no justification for this kind of price gouging, which is a serious competitive disadvantage with any customer willing to consider a non-Apple alternative.
Apple may be haggling some of these price spikes away for higher-volume (corporate) purchasers. It is a common enterprise market product technique to price things higher than you want to sell them for, so you can hand back a 'discount' so buyers can feel better about haggling the price down. Dell/HP tend to have discount codes from places people can 'puzzle' to, so that it isn't only the "big" players that get some relief. Apple generally doesn't haggle with 'normal' folks.

I don't expect Apple to pass along volume buying discounts. (They aren't Walmart or Newegg/Fry's or Costco.) However, they shouldn't be using SSD storage to plug problems in their profit margins either (e.g., margins are down on components x, y, and z, so crank the NAND mark-up higher to fill the gap). If one of Apple's strategic objectives is to get more customer data onto SSD sooner rather than later, then that's a conflicting agenda.
 


If Apple were to produce a new MacBook Pro with a current generation CPU+GPU and the same form factor and keyboard as the mid-2012 MacBook Pro, while retaining most of the ports and user replaceable/upgradeable RAM and storage, I would throw money at Apple to buy it, gladly paying a premium compared to hardware from competitors. (I could say something similar for most other Apple product lines. For example, I have very little faith that the next Mac Pro will be anything else than a wildly overpriced vanity project.)

What I will not do is pay a steep premium for a laptop that may still have significant issues with keyboard reliability, CPU throttling, limited ports, and poor repairability just to satisfy an Apple designer's fetish for thinness.

After several decades of buying mostly Macs for my own business and advising clients to do the same, my last few purchases have been Dells and Lenovos, and I no longer recommend Macs to most clients. After these announcements, I'm starting to think that I may have purchased my last new Mac for personal use, as well, an idea that would have been inconceivable not that long ago.
I need a computer, not a dongle hanger. I need terabytes of space, not 256GB. I need to be able to deal quickly with large files from digital cameras with 50Mpx resolution and 4K video.

I don't need something that's thin and cute. I need a workhorse - one where I can actually upgrade and manage storage. I'd pay $1500 for a machine with user-installable RAM, a 2.5" SSD that's easy to replace, and a Blu-ray-writing optical drive. Throw in a standard HDMI port, SD card reader, headphone jack, a few USB-C ports, and bring back the MagSafe connector, and I'm buying them by the carload.
 


Ric Ford

MacInTouch
I need a workhorse - one where I can actually upgrade and manage storage. I'd pay $1500 for a machine with user-installable RAM, a 2.5" SSD that's easy to replace, and a Blu-ray-writing optical drive. Throw in a standard HDMI port, SD card reader, headphone jack, a few USB-C ports, and bring back the MagSafe connector, and I'm buying them by the carload.
Does it need to run macOS, or can it run Windows/Linux...
 


(I guess this wouldn't be an issue if my display had a speaker... but none of mine do.)
When you set up your new Mini, Ric, give us a list of "stuff" you're using with it, including your displays. It'll be helpful for some of us to know which combinations seem to work really well.

Glad this all worked out well!
 


Ric Ford

MacInTouch
When you set up your new Mini, Ric, give us a list of "stuff" you're using with it, including your displays. It'll be helpful for some of us to know which combinations seem to work really well. Glad this all worked out well!
I definitely will. I decided I really needed a 4K UHD display to do any kind of serious testing, and the best I had was QHD, so I just got a 27" LG 27UK650-W from Amazon, which seems good so far. I've been working with it all day, trying to get up to speed. SwitchResX is a big help (but somewhat confusing, too), and there are some differences between HDMI and DisplayPort. (I'm testing with my 2015 MacBook Pro, so far.)

It doesn't have speakers but does have a headphone jack, so I guess I could connect something to that to get sound out of the Mac Mini - that'll be one of the experiments when the Mini arrives.

The monitor definitely looks good; it has a helpful controller for settings (underneath the screen) and offers native 4K resolution (at which everything is too small to be usable, except in HiDPI/scaled modes, much like the MacBook Pro's own retina screen).

It came with HDMI and DisplayPort cables, but I needed a Mini DisplayPort cable for the MacBook Pro's Thunderbolt 2 connector. I already had one I've used since 2015 (Accell Mini DisplayPort to DisplayPort with Locking Latch), and it's good for 4K at 60Hz.

The HDMI setup offered 4K at 30Hz, but that looks like a specification limitation for the 2015 MacBook Pro's HDMI output, while the Mac Mini specs list a maximum of 4096-by-2160 at 60Hz over HDMI. (I like using HDMI for the monitor to avoid using up a precious Thunderbolt port, which is useful for other things.)
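For what it's worth, the 30Hz limit falls straight out of the arithmetic: uncompressed 4K at 60Hz needs more bandwidth than an HDMI 1.4-class port (which is what the 2015 MacBook Pro's 4K-at-30Hz spec implies) can carry, while HDMI 2.0 (as on the 2018 Mini) can handle it. A quick sketch, counting active pixels only (blanking pushes the real requirement somewhat higher):

```python
# Pixel bandwidth for 4K UHD at 24 bits/pixel, vs. effective HDMI data rates
# (~8.2 Gbps for HDMI 1.4, ~14.4 Gbps for HDMI 2.0). Active pixels only.
def gbps(width, height, hz, bits_per_pixel=24):
    return width * height * bits_per_pixel * hz / 1e9

for hz in (30, 60):
    print(f"3840x2160 @ {hz} Hz: ~{gbps(3840, 2160, hz):.1f} Gbps")
```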

One slightly annoying oddity is that my beloved Logitech M720 Triathlon has gotten a little jerkier since I swapped monitors. I'm not sure if it's Bluetooth interference or what - haven't taken time to troubleshoot it yet (as it's not dysfunctional, just a little unpleasant).
 


Others have commented on this, but I'll put in my 2 cents, as well. The iPad Pro demo was a pretty good indication that the A-Series chip will find its way into 2020-2021 Macs.
Apple's slide saying that iPads outsold Dell + HP + Lenovo + Microsoft Windows notebooks combined (probably limited to a certain class of laptops) suggests otherwise. If the iPad and iOS are 'beating' Windows, then what does Apple need macOS for, to compete with them? The question for macOS on the A-series that doesn't really have a good answer is: where is iOS on the A-series doing badly (or substantively worse than macOS)? How would macOS come to the rescue?
[I think] we'll have a compatibility layer for Intel software to ease the transition, and Parallels will release their virtualization system for Windows software.
Why virtualization? The iPad Pro demo devoted a substantial amount of time to an upcoming version of Photoshop on iOS. It won't be 100% the same in 2019, but two years into development (2021), how many critical 'missing pieces' are there going to be?

One of the big pushes at WWDC this year was a core library that allows for more unified applications to be ported to both macOS and iOS. The 'flow' of apps is probably going in two directions. Some iOS apps will come over to macOS (making it a broader ecosystem), and some will flow in the opposite direction (same impact). Basically, many of those are going to be a win/win for both sides.

Apple has only made shifts in the past when the target being shifted to was faster than what it was coming from. The A-series getting to parity (vs. process and microarchitectural improvements in the Intel line-up) isn't getting out in front. Virtualization would only add more overhead (the new chips would need to be substantially faster to avoid backsliding on 'upgrades'). If Intel remains stuck in time, perhaps. But Intel and AMD both being stuck still in 2021 is somewhat unlikely.
It will be as fast (or faster) than all but the fastest Core i9 chips (or whatever Intel has released by then) and certainly with a lower TDP.
It is far from that. Apple isn't likely to close that gap in two years. Geekbench scores are far from a holistic picture of the task. First, the A12X came after the A10X (Apple's release of the X variant of the series is slowing down). The curve is starting to flatten out for the A-series as many of the same 'tricks' are woven in.

Second, the A-series is tuned for 32-bit-sized problems. The standard SPEC benchmarks need modification, because iOS wants apps scoped for iPhone-sized memory constraints.
The iPhone XS & XS Max Review: Unveiling the Silicon Secrets
... On iOS, 429.mcf was a problem case as the kernel memory allocator generally refuses to allocate the single large 1.8GB chunk that the program requires (even on the new 4GB iPhones). I’ve modified the benchmark to use only half the amount of arcs, thus roughly reducing the memory footprint to ~1GB. ...
The iPad Photoshop demo appeared to use a large PSB file, so there are workarounds, but the A-series isn't extensively benchmarked against double-digit-GB-sized problems.

Third, there have been several "A12 is great" articles pointing to Geekbench stats that are overly obsessed with single-thread performance. A12 tends to fall short on multicore benchmarks.

Lower power and thermal envelope? Yes. The A-Series is ready to take over the MacBook place in the line-up. On the desktop front, though, the Mini just moved "up" to desktop processors. The A12 doesn't touch those in multicore at all. If the desktop is plugged into the wall, then runtime battery life really doesn't matter as much.
Otherwise (as for the demo), the MacBook Air was a disappointment. Apple has doubled down on the poorly-designed butterfly keyboard, eliminated essential ports (and the SD card slot!) .... The Mini .... and it's a Core i3? Then have to pay $900 more for a machine with 512GB of storage, a better CPU, and 16GB of RAM? Yeah...
If there were an A-series in the MacBook Air, it probably would get the same keyboard and even fewer ports (since the A-series doesn't have robust I/O capabilities - e.g., there is no x4 PCI-e v3 to hook a Thunderbolt controller to).

The Core i3's multicore numbers are better when normalized against the number of cores, and probably clearly better on sustained (not 3-to-4-minute benchmark) workloads.

As much as Apple talked about iPads outselling some vaguely-defined set of systems from their Windows competitors, Chromebooks have been making a dent. (The lowering of the 'regular mainstream' iPad price is a partial response.) Apple still needs a low-cost laptop. That could turn out to be an iOS device in a "Mac-like" form factor: put a 2020-2021 A1x processor and most of the internals of a 2019-2020 iPad Pro into a MacBook-sized system, drop the trackpad (and stuff in more battery), put a 360° hinge on it, and give it just an sRGB screen (an iPad Pro 12 variation). If Apple used cheaper parts (e.g., from the affordable iPad 9.7"), they could push out a $699-799 solution - better than the $999 old MacBook Air they are stuck on now - especially if it could run 12-16 hours (basically a full work day).

The iPad Pro has one USB Type-C connector with DisplayPort Alt mode. The 12" MacBook has the exact same I/O limitation. There is overlap there, but the A-series MacBook could just as easily become a resurrected "iBook" (running iOS).

Apple could prune off the bottom of the Mac laptop line-up with the A-series, but replacing the rest (desktop and top end laptops) isn't practical. And splitting the Mac line-up onto two different processors only adds to overhead for developers and Apple.
 


[Opal] is very similar, but it is designed around putting a secure enclave inside the drive. Apple puts their secure enclave inside the main system. [...] Opal either uses a mini "pre-boot authentication" OS off the drive or the BIOS (UEFI) to gather the password to unlock the key. Those aren't as unified a trust chain as the Apple approach. If the pre-boot authentication software can be tweaked (directly or indirectly), the password can be gathered
Great summary! And purely coincidentally, the news today is about someone finding several problems of just this sort in different Opal implementations:
 


If one of Apple's strategic objectives is to get more customer data onto SSD sooner rather than later, then that's a conflicting agenda.
Or is the strategic objective to get more folks to pay for iCloud storage? Just like the push for subscription software, push for subscription storage to get an ongoing revenue stream and further lock-in into the ecosystem.
 


FYI... a new email from Costco shows many new iPad Pro models available on pre-order with shipping next week.

I have not compared prices, but Costco's prices are typically below regular retail. If a new iPad Pro is on your shopping list, it might be worth checking out.
 


Ric Ford

MacInTouch
One slightly annoying oddity is that my beloved Logitech M720 Triathlon has gotten a little jerkier since I swapped monitors. I'm not sure if it's Bluetooth interference or what - haven't taken time to troubleshoot it yet (as it's not dysfunctional, just a little unpleasant).
I figured it out: it's 30 Hz vs. 60 Hz. Using a 30 Hz monitor mode makes my mouse feel jittery moving around the screen, while 60 Hz is smoother. So I've switched back from HDMI to DisplayPort on this 2015 MacBook Pro (which offers nothing higher than 60 Hz).
 


Regarding the switches: look for the multi-gig switches - those support the NBase-T standard that the Apple devices do (10Gb/5Gb/2.5Gb/1Gb), as opposed to the older 10GBase-T (10Gb or 1Gb). You have more flexibility with devices on the other side, and speeds up to 5Gb can run on Cat 5e cabling, so even users with older cabling they don't want to replace can run faster than gigabit speeds. Also, the new NBase-T controllers tend to be less expensive, and there are ≤5Gb cards available as well, at even lower cost, for other machines.
 


Apple's slide saying that iPads outsold Dell + HP + Lenovo + Microsoft Windows notebooks combined (probably limited to a certain class of laptops) suggests otherwise. If the iPad and iOS are 'beating' Windows, then what does Apple need macOS for, to compete with them? ...

Apple could prune off the bottom of the Mac laptop line-up with the A-series, but replacing the rest (desktop and top end laptops) isn't practical. And splitting the Mac line-up onto two different processors only adds to overhead for developers and Apple.
I never suggested a "splitting" of the Mac line into A-Series and Intel CPUs; as you write, it doesn't make sense.

The thrust of your logic implies that Apple really does care what happens to the Macintosh and macOS-on-Intel. I am not convinced (of the latter, at least). They will make much more money if the chips in Macs are made by foundries they control. Regarding any A-Series deficiency (relative to the use in Macs): That's just a generation or two away in computer time, so two years hence and presto, everything about which you complain will be resolved. If a fanless iPad can pretty much run with some of the big dogs on critical tasks, an A14(?) will fix that.

I think it should be remembered that everything Apple has tried to do regarding hardware since Steve's crew begat the Mac has been to not leave money on the table. There have been some detours along the way (e.g., clones), but that's the direction.
 


Ric Ford

MacInTouch
2018 Mac mini
  • 3.2GHz 6‑core 8th‑generation Intel Core i7 (Turbo Boost up to 4.6GHz)
  • 16GB 2666MHz DDR4
  • 512GB SSD storage
  • Intel UHD Graphics 630
  • 10 Gigabit Ethernet
Subtotal: $1,799.00
Shipping: $8.00
Estimated Tax: $112.94
Order Total: $1,919.94
So, what does a similarly-configured 2018 MacBook Pro cost? It has fewer CPU cores and fewer ports but better graphics, a battery, keyboard, and display... for an extra $700 (39% more).

2018 MacBook Pro 13"
  • 2.7GHz quad‑core 8th‑generation Intel Core i7, Turbo Boost to 4.5GHz
  • 16GB 2133MHz LPDDR3
  • 512GB SSD storage
  • Intel Iris Plus Graphics 655
  • no Ethernet
Subtotal: $2,499.00
+ Shipping
+ Tax

How about a similarly-configured iMac? Maybe this is the sweet spot, with a fast (but previous-generation) CPU, decent graphics, SSD, two Thunderbolt 3 ports, four USB 3 ports, and an SDXC card reader, to go along with a 4K*, 21-inch display (and you could add a bigger one via a Thunderbolt 3 port).

21.5‑inch iMac 4K
  • 3.6GHz quad‑core 7th‑generation Intel Core i7, Turbo Boost to 4.2GHz
  • 16GB 2400MHz DDR4
  • 512GB SSD storage
  • Radeon Pro 555 2GB
  • Gigabit Ethernet
Subtotal: $2,199.00
+ Shipping
+ Tax

*Apple is using an unusual "4K" resolution of 4096x2304.
 


Lyman was commenting on the limitations of Target Display Mode:
The last item is the energy impact one. Neither computer can go to sleep without this setup disconnecting. Pragmatically, you're going to have to put both computers into the 'never sleep' mode, if you don't want to go through the reconnect process every time you wake the computer up. For "never sleep" servers, perhaps that was the normal mode anyway, but for most end user setups, that is abnormal.
Oddly, I’m not seeing a number of the limitations & disadvantages of Target Display Mode that are described in the Apple Support Article. (I'm using a 2010 27” iMac as a second monitor for a 2017 27” iMac).

Specifically:
  • There’s no need to press Command-F2 to toggle into Target Display Mode. As soon as I plug the USB-C to DisplayPort cable into the 2010 iMac’s DisplayPort port, the 2010 iMac immediately launches Target Display Mode and turns into a second monitor.
  • After the 2010 iMac is in Target Display Mode, performing a Restart on the 2017 iMac leaves the 2010 iMac in Target Display Mode; i.e., by the time the 2017 iMac has rebooted, the 2010 iMac is still acting as a 2nd monitor. (I’m wondering whether this is due to how rapidly the 2017 iMac reboots — it has an internal SSD.)
  • When the 2017 iMac goes to sleep, the 2010 iMac stays in Target Display Mode, and its display also goes to sleep. When the 2017 iMac wakes up, the 2010 iMac's display also wakes up, and is still a 2nd monitor.
    • (At first, I thought maybe I’d configured the 2017 iMac not to go to sleep, and that only its display was sleeping, not the iMac itself. However, the “Prevent computer from sleeping automatically when the display is off” setting in System Preferences -> Energy Saver is off (unchecked) for the 2017 iMac.)
  • Once the “display” iMac is in Target Display Mode, the only two ways to get it out of Target Display Mode are to (a) shut down the “primary” iMac (Shutdown, not Restart), or (b) disconnect the cable between them.
For any Target Display Mode veterans: is this consistent with what you see when using Target Display Mode? Or have I lucked into a hardware configuration that magically does just what I want it to do, instead of what it’s documented to do? :-). Obviously, this isn’t a problem for me; but I am curious about the observed deviation from the way Target Display Mode is supposed to work.

BTW, here’s some primary configuration info for both iMacs:
  • 2010 iMac: 27” i7 2.8 GHz, running macOS Sierra 10.12.6
  • 2017 iMac: 27” i7 4.2 GHz, running macOS Sierra 10.12.6
 


Ric Ford

MacInTouch
one of the factors here is how many NAND packages Apple is using to get to the higher capacities. If Apple is using 2 packages, and other 1-2TB SSDs are using 4 packages, then those aren't the same packages. Either the density of the dies is higher, or more dies are stacked higher (3D). If Apple is buying the most bleeding-edge NAND, the discounts, even at large volume, aren't going to be as [large as] for mature (or trailing-edge) NAND packages.
I took a look at SSD upgrade pricing for the 15" MacBook Pro:

Storage (TB)   Add'l TB   Add'l price   Add'l price / total TB   Add'l price / add'l TB
0.25           0          $0            n/a                      n/a
0.5            0.25       $200          $400                     $800
1              0.75       $600          $600                     $800
2              1.75       $1,400        $700                     $800
4              3.75       $3,400        $850                     $906.67

How does that compare to high-speed Samsung 970 EVO SSDs, for example (using Amazon pricing)?

Storage (TB)   Price        $/TB
0.5            $148.00      $296.00
1              $228.00      $228.00
2              $598.00      $299.00
4*             $1,196.00    $299.00

* two 2TB cards

These exorbitant Apple prices and lock-up are just not a problem with a Dell XPS 15, though.
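If anyone wants to reproduce the per-TB columns (or rerun them with current prices), the arithmetic is simple; here's a quick sketch using the BTO prices quoted above:

```python
# Incremental cost of the 15" MacBook Pro SSD options vs. the 256GB base,
# using the 2018 BTO prices in the table above.
base_tb = 0.25
bto = {0.5: 200, 1.0: 600, 2.0: 1400, 4.0: 3400}   # total TB -> upgrade price ($)

for tb, price in bto.items():
    added = tb - base_tb
    print(f"{tb:>4} TB: +${price:>5}  ->  ${price / added:,.0f} per added TB, "
          f"${price / tb:,.0f} per total TB")
```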
 



Ric Ford

MacInTouch
Given that the becoming-ubiquitous T2 controls storage and secure boot, I wondered if it would block dual-booting Linux, as I've done on several Macs, including my 11" MacBook Air, using the rEFInd boot manager.
Turned up one post that said the T2 chip only allows internal install of macOS or a Boot Camp install of Windows 10.
Apple's support site says it is possible to boot a Mac with a T2 chip from an external drive, if the drive hosts a "compatible operating system," defined as macOS or Windows 10. With the T2 chip, gymnastics are required in "Startup Security Utility" to enable external-drive boot. Anyone know if it is possible to install, or even run, Linux on Macs with T2 chips?
It may not currently be possible to use Linux on Apple's built-in flash drive, although it may be possible to boot Linux from an external drive:
Jason Evangelho/Forbes said:
Booting Linux Is Impossible On New Apple Hardware
... it's disappointing to learn that it's impossible to boot Linux on newer Apple hardware like the 2018 MacBook Pro and Mac Mini. The culprit is Apple's T2 Security Chip, a custom processor that's responsible for managing elements like Touch ID, SSD controllers and most importantly to this conversation, Secure Boot. Secure Boot "ensures that the lowest levels of software aren’t tampered with and that only trusted operating system software loads at startup."
...
Apple's boot time security options on T2 chip-enabled systems allow three levels of security: Full, Medium and None. However, reports have come in that even with it disabled, users are still unable to boot a Linux OS as the hardware won't recognize the internal storage device. Using the External Boot option (pictured above), you may be able to run Linux from a Live USB, but that certainly defeats the purpose of having an expensive machine with bleeding-edge hardware.

I understand if you're scratching your head while reading this story. Why pay an exorbitant amount of money for Apple hardware if you have no intention of running macOS? Why not go with more supported platforms like the Dell XPS 13 or Lenovo Thinkpad lines? Everyone has their reasons. One guy happens to believe that Apple makes some of the best Linux hardware. That guy is Linux creator Linus Torvalds. . .
 


I figured it out: it's 30 Hz vs. 60 Hz. Using a 30 Hz monitor mode makes my mouse feel jittery moving around the screen, while 60 Hz is smoother. So I've switched back from HDMI to DisplayPort on this 2015 MacBook Pro (which offers nothing higher than 60 Hz).
You might look into SwitchResX. It unlocks resolutions your video "card" is capable of producing but macOS doesn't have available.
 


Ric Ford

MacInTouch
You might look into SwitchResX. It unlocks resolutions your video "card" is capable of producing but macOS doesn't have available.
As a matter of fact, I just purchased it today. Along with allowing a choice among a large range of resolutions (or even creating custom ones), it has other very useful features, including displaying HiDPI status, refresh rate, resolution, aspect ratio and more. (Some of these are displayed in the menubar pop-up.) And it can save icon arrangements of the desktop, which is a huge benefit in my usage (though I'm having some glitches, at least with Path Finder).

SwitchResX has been very, very helpful as I've been working with this LG 4K monitor and trying to understand related options and issues.

P.S. I just tried it on a 2017 MacBook Air, and it was shocking to see all the viable high-resolution modes that Apple hides from you!
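If you want to see what macOS itself reports for attached displays (independent of SwitchResX), system_profiler will dump the display inventory; here's a trivial wrapper, purely as a convenience sketch:

```python
# Print macOS's own report of attached displays (resolution, connection type, etc.).
# system_profiler is the stock macOS command-line tool; Python just runs it here.
import subprocess

print(subprocess.run(["system_profiler", "SPDisplaysDataType"],
                     capture_output=True, text=True, check=True).stdout)
```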
 




You might look into SwitchResX. It unlocks resolutions your video "card" is capable of producing but macOS doesn't have available.
Another really cool feature is the ability to create custom resolutions by tweaking the low-level timing parameters.

Unix/Linux users running XFree86 (and later X.org) have had this ability for decades. It's really nice when you have some custom LCD panel that requires a resolution/frequency nobody else has heard of before. It's also great for fixing brain-dead industry decisions.

For instance, many years ago, when I was still using a CRT multisync monitor, the next standard resolution up from 1024x768 was 1280x1024. The problem with this resolution is that it's not the standard 4:3 (1:1.33) aspect used by non-widescreen displays, but is 5:4 (1:1.25), which means you either end up with black borders on the edges or non-square pixels.

By tweaking the video timings, however, I was able to get even el-cheapo video cards to output 1360x1024, which is a 4:3 aspect, using the same number of lines. And I've never found a monitor that had a problem syncing with it.

I've also used tweaked video modes to maximize the use of video RAM, back when spare VRAM wasn't needed for things like texture mapping. So a 1MB video card could be tweaked to output a 1024x1024 (8-bit) image, with a very wide overscan region (168 pixels on each side) to keep the pixels square and the image centered.
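If anyone wants to see those figures worked through, here's a quick sanity check of the aspect ratios (1360x1024 comes out at 1.328, close to 4:3's 1.333) and the 1MB-framebuffer claim:

```python
# Sanity-check the figures above: display aspect ratios, and how an 8-bit
# 1024x1024 framebuffer exactly fills 1 MiB of VRAM.
modes = {"1024x768": (1024, 768), "1280x1024": (1280, 1024), "1360x1024": (1360, 1024)}
for name, (w, h) in modes.items():
    print(f"{name}: aspect {w / h:.3f}  (4:3 = {4 / 3:.3f}, 5:4 = {5 / 4:.3f})")

w, h, bytes_per_px = 1024, 1024, 1   # 8-bit indexed color
print(f"1024x1024 @ 8 bpp = {w * h * bytes_per_px / 2**20:.2f} MiB")
```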

It always annoyed me that I could never do stuff like that on my Windows and Mac computers. Of course, it's less important today, now that we all use LCD displays, where it is almost always better to output the signal at the display's native resolution and let software perform any scaling that may be needed.
 


iFixit just posted their 2018 MacBook Air teardown. Important (to me, at least) points:
  • Getting inside is easy - pentalobe screws, but nothing else holding the lower case on.
  • Memory and storage are soldered down and are not replaceable.
  • The battery is replaceable, but you need to remove the motherboard and speakers in order to get access. Removal involves four screws and six adhesive strips - not as easy as prior MacBook Airs, but better than other Apple laptops.
  • As with all other Apple laptops, the keyboard is permanently attached to the top case, so replacement is going to be a major effort.
  • I didn't notice any secret diagnostic port for the storage (as with the iMac Pro - this may be a "feature" of using the T2 for the SSD controller). If the computer gets so messed up that you can't even use Target Disk mode, there won't be any mechanism (short of de-soldering the flash chips and the T2 and somehow getting them to work together on a new board) for extracting your files. This means backups (which are always very important) become extra important, since you can pretty much forget data recovery. Hopefully, it also means that any damage severe enough to prevent normal access will also make chip-level access impossible (because of a damaged T2 losing the encryption keys), so dumpster divers won't be able to get your files after you've given up. Of course, using FileVault in addition to the T2's encryption can protect against this if this "hopeful" guess turns out to be wrong.
 

