I have two monitors connected to my Mac Pro 3,1 OS X 10.11.6. They work fine. The problem is when one of the monitors is shut off. If I accidentally move a window to the shut off monitor, I can't bring it back to the operating monitor. It appears that the OS does not recognize that the other monitor is shut off. If I turn on the second monitor, the window appears on the second monitor.

It would be nice if the OS recognized that the second monitor is off and set the screen boundaries to that of the first monitor.

Is there any way to do this?

I posted this question in the Apple Forum but got no reply.
 


Neal, unplug the second monitor's video cable from the Mac Pro. Powering off is sometimes simply not enough.
 


... It would be nice if the OS recognized that the second monitor is off and set the screen boundaries to that of the first monitor. ...
This has been a problem with multiple displays on Macintosh computers since sometime last century.

A good work-around is to install Display Menu, a menu bar applet, and click on Mirror Displays.
 


Ric Ford

MacInTouch
Here's a new one... 4K display panels used in cheaper displays, with none of their native benefits and some new image quality issues:
The Next Web said:
That 1440p monitor you bought might have a 4K screen
This is actually bad for consumers. A 4K panel is capable of higher resolution than a QHD panel, but these have to be down-scaled. And that means that a 1440p monitor with a 4K panel has a worse image than one with a 1440p panel.
 


I've been looking into getting two displays to mount to a dual-display arm on my office desk. Since the newest MacBook Pros have USB-C and [an optional external] "triple-dongle" that has HDMI, I was curious to hear what others are using in terms of
a) what displays did you buy, and
b) how do you connect them to the new MacBook Pro?

It doesn't seem that many displays have USB-C connectivity, and if they do and they're 4K, you pay through the nose for them. Apple wants $699 for the LG UltraFine 4K they offer on their site.

If I go the HDMI route, then I need two triple-dongles coming out of both sides. Not the worst thing, but not the most aesthetically pleasing.

So if you're using the new USB-C-only MacBook Pro and you're driving two displays, at least HD (I'd like to be able to edit 4K video), please share your setup and rationale.

Thanks
 


I've been looking into getting two displays to mount to a dual-display arm on my office desk. Since the newest MacBook Pros have USB-C and [an optional external] "triple-dongle" that has HDMI, I was curious to hear what others are using in terms of
a) what displays did you buy, and
b) how do you connect them to the new MacBook Pro?...
DisplayPort is a reasonable choice for connecting a 4K display today. No worries about versions.
 


Since the newest MacBook Pros have USB-C and [an optional external] "triple-dongle" that has HDMI, I was curious to hear what others are using.
Forgive me for jumping on your question with only a slightly related answer. I can tell you that when Apple introduced the USB-C/Thunderbolt 3 Macs in 2016, those of us who purchased one entered dongle hell, from which we have yet to emerge.

USB-C to Mini DisplayPort adapters that purported to support Apple's own Mini DisplayPort monitors often didn't. Apple's own adapter that looked like a USB (Type A) to Thunderbolt 3 adapter wasn't one (it provided charging current but not data signals), and the same is true for the still-sold USB-C Multiport adapter (USB-A, USB-C charge-only, and HDMI). External devices that connect via USB micro plugs often fail to mount as external mass-storage volumes.

In my case, the first major help came from OWC, with their (expensive) 13-port Thunderbolt 3 dock. When it arrived, I could finally connect my very expensive Apple 27" Mini DisplayPort Cinema Display to my laptop and expect to get an image on the monitor, although sometimes a series of chants and plug-unplug-replug rituals was necessary (especially to be certain the computer itself was connected to AC power).

It's just in the past week or so that I've discovered that the MultiPort adapter doesn't support Thunderbolt or USB data.
 


I have a quite-old Apple Cinema Display 22" DVI that has been in use on my desk for years. As the display has slowly darkened, I have considered replacing the fluorescent bulbs illuminating the screen but decided that the difficulty and expense simply wasn't worth it. Hence I now have three other monitors, two different Dell Ultrasharps and an Acer (all 27" and purchased from Craigslist), all running off of an Nvidia GTX 960 card in my 2009 Mac Pro. My old oak teacher's desk can handle three monitors, but four don't fit.

As the Cinema Display is still useful (but hasn't been my main monitor for many years), I would now like to mount it on, or attach it to, a shelving unit behind my desk. Of course, the Cinema Display is my only monitor without VESA mounting capability, so there is no easy way to do so.

I'm sure, way back when, other folks desired to mount their 22" Cinema Displays rather than have them stand on a desk. Anyone know of any solutions that they might have come up with?
 



Thank you for noting that link. Unfortunately, it is for the 20" anodized aluminum display, and I have the much older, plastic-encased 22-inch display with the non-removable rear stand.

I know they made two versions of my display; the first used DVI, the second used ADC.
 


Advice sought: I’m buying a new iMac and am looking for an adapter/cable to connect it to my second monitor (which has HDMI and DVI-I inputs). My old iMac is circa 2010, and so Thunderbolt 3/USB-C to HDMI adapters are a new experience for me.

I was about to purchase one of Apple’s USB-C Digital AV Multiport adapters but was alarmed by the 2-out-of-5-star rating it’s given on the Apple Store’s website. According to the reviews there, the quality is horrendous.

Does this match your experience - and/or do you have any recommendations for alternate products/solutions? I don’t mind paying more for something that’s reliable and lasts a (reasonably) long time.

The monitor I’m currently using is relatively low-res (1920x1200), but I could see going to a 4K external monitor in the future. (I’m not a gamer, so a 60 Hz refresh rate is all I need.)

Thanks in advance!
 


Thanks to the obscurity connected with USB-C functionality labeling, I’ve tried to stay away from Apple’s adapters where connecting video from USB-C to anything older is concerned. We have a newish iMac that runs a second monitor using the following two Monoprice items.
For your case, I’d consider this:
Assuming you already have an HDMI cable, this would be what I’d try in lieu of a more elaborate multi-port box.
 



Advice sought: I’m buying a new iMac and am looking for an adapter/cable to connect it to my second monitor (which has HDMI and DVI-I inputs).
A USB C-to-HDMI cable is probably all you need. I bought one of these to connect my 2018 MacBook Pro to projectors:

I thought about getting an adapter so I could connect an HDMI cable to my computer, but I figured why bother with an adapter and a cable when I could just use a single cable.
 


Advice sought: I’m buying a new iMac and am looking for an adapter/cable to connect it to my second monitor (which has HDMI and DVI-I inputs). My old iMac is circa 2010, and so Thunderbolt 3/USB-C to HDMI adapters are a new experience for me.
Thanks to everyone who posted with suggested solutions to my question! I wanted to provide a brief follow-up report on what succeeded (and failed) on my quest for said adapter.

Initially, I tried the Belkin USB-C (USB Type C) to HDMI Adapter, a relatively high-priced (and, I thought, high-end) USB-C to HDMI adapter. Several people recommended it for (a) working well with iMacs and (b) running 24/7 without failing/overheating.

Unfortunately, it wouldn't produce more than 1080p output for my Samsung monitor (native resolution 1920x1200). Way too fuzzy; I returned it.

Found success with a Benfei USB-C (Thunderbolt 3) to DVI adapter - it gave me the Samsung's native resolution at 60 Hz.

In the end, I decided to have a go at using the old 2010 iMac, in Target Display Mode, as my 2nd monitor for my new iMac. Works very well, so far, with a Cable Matters USB-C to Mini DisplayPort cable connecting the two. I'm keeping the Benfei adapter and the Samsung monitor, though, as backups (or a 3rd monitor).
 



DFG

Buy 4K/8K display* and high-quality cable. Connect to computer.
Get SwitchResX to display resolution mode details and switch among them.
I have long debated whether I should switch to a 4K display for my hackintosh desktop. I am currently running at 2560x1440 on a 27" display. This is a format officially supported by Apple, as it is identical to the previous iMac 27". The menu bar, Dock, and icons are the right size for me, even if somewhat on the smallish side. I do not wish to go much smaller. It looks OK, but I have been spoiled by my MacBook Pro retina display. Ideally, I would want to go 5K, that is, exactly doubling my current resolution. This is another Apple-sanctioned format, same as the current iMac 5K.

There are several factors that are important:

1. Monitor size. At my desk, I am limited to a 27" monitor. Even if I could go bigger, I am not sure I want a larger monitor right in front of my eyes (maybe 31" would be the max I would consider).

2. Graphics card. I have a relatively old Nvidia GeForce GTX 760, which does support 4K (3840x2160@60Hz), but alas, not 5K.

3. Cost. 4K displays are now available for a reasonable cost. Not so 5K displays.

So I am pondering whether I should go to 4K, which is not an Apple-sanctioned format. This means that the menus and icons would look even smaller than they do now. I do not know whether I would like that format, and my eyes aren't getting any younger. That's the reason I haven't made the switch yet.
 


Ric Ford

MacInTouch
... So I am pondering whether I should go to 4k, which is not an Apple-sanctioned format. This means that the menu and icons would look even smaller than what they are now. I do not know whether I would like that format, and my eyes aren't getting any younger. That's the reason I haven't made the switch yet.
It's a bit confusing, but it became clear when I got the 27-inch LG 27UK650-W display, which is marketed as a "4K HDR" display but which, incredibly, can actually show 8K images on the screen, as well as 5K and a huge variety of other resolutions (thanks to the SwitchResX pref pane utility).

What's confusing is Apple's marketing. The iMac "5K" actually displays our standard 27-inch 2560x1440 picture on the screen... except it doubles/quadruples all the pixels, so there are 5K worth of them, making for a higher-quality image, though you see the same-sized objects and type as you would on a standard 2560x1440 (i.e. the same content as with a non-retina/non-"HiDPI" screen).

You can also use this screen like Apple's 21-inch iMac "4K" retina display for a bit larger image: a pixel-doubled 2048x1152 (or the other "4K", which is a HiDPI version of 1080p.)
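The pixel-doubling arithmetic behind these "retina" modes is simple enough to sketch. The helper below is made up purely for illustration (it isn't anything from macOS):

```python
# A HiDPI/"retina" mode renders a "looks like" W x H layout into a
# backing store with double the pixels in each dimension.
def backing_pixels(looks_like_w, looks_like_h, scale=2):
    """Pixel dimensions backing a 'looks like' resolution."""
    return looks_like_w * scale, looks_like_h * scale

# iMac "5K": the familiar 2560x1440 layout, quadruple the pixels.
print(backing_pixels(2560, 1440))  # (5120, 2880)

# The 21-inch iMac "4K"-style mode: pixel-doubled 2048x1152.
print(backing_pixels(2048, 1152))  # (4096, 2304)

# The "other 4K": a HiDPI version of 1080p.
print(backing_pixels(1920, 1080))  # (3840, 2160)
```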
 


FWIW: I’ve got an 11” mid-2013 MacBook Air with 250GB SSD, Core i7 processor and 8GB RAM, running High Sierra (no need for Mojave yet, I’ll wait for at least 1 update, if at all). I mostly do pretty pedestrian things - internet, email, docs, Keynote presentations, very occasionally some not-too-sophisticated video editing (iMovie) - and it all works fine, at reasonable speed. I was thinking of the new MacBook Air simply because it’s newer (newer CPU mostly). But since what I have is fine for me, and the newer Air’s specs are no great improvement, I’ve decided not to buy. The MacBook Pro would be truly much more capable, but most of that improvement would only be needed (or used) by someone who has much more high-end needs. For me, it’d be like having a Lamborghini just to drive to the corner store and back.
Sounds to me like exactly the right attitude about upgrading any computer.

If you haven't seen a Retina display in person yet, though, you should swing by an Apple Store if you find yourself near one. It may be my age, or my deteriorating vision (will need my first set of prescription lenses next year), but I find a Retina display makes an enormous difference, simply because of the contrast (dynamic range). And of course, your mileage may vary.
 


Ric Ford

MacInTouch
It may be my age, or my deteriorating vision (will need my first set of prescription lenses next year), but I find a Retina display makes an enormous difference, simply because of the contrast (dynamic range). And of course, your mileage may vary.
I've been doing a lot of monitor/graphics testing lately, and calibration can make a huge difference in contrast and image quality. Unfortunately, Apple hides the critical controls, so you have to know the secret incantation for accessing the real/good version.

In System Preferences > Displays > Color, there's a "Calibrate..." button that leads to a mechanism that seems wholly dysfunctional in my testing, but if you invoke the more useful version via secret incantation—holding down the Option key while clicking the "Calibrate..." button—then you hopefully will find a hidden "Expert Mode" checkbox that reveals previously-hidden tools in the next steps, which worked amazingly well in my experience to get a good, sharp image from my display. And you can name and save a variety of these "display profiles" and switch among them as you like. (Uncheck "Show profiles for this display only" to see a variety of other interesting options.)
 



I am looking for a very good quality 21:9 monitor, larger than 30". I just bought a Dell U3417W to go with my Mac Mini. Looks like a good monitor but has a fair amount of light bleed in the corners - not disgraceful, but at $800, I definitely would not say it was "very good". I am currently using an Apple Cinema Display, and it certainly does not have this kind of bleed on it. Its problem is that it flickers on and off. Are there 16:9 4K monitors? (I am going to run this at some lower resolution, because the eyesight just isn't as good as it used to be.)
 


I've been doing a lot of monitor/graphics testing lately, and calibration can make a huge difference in contrast and image quality. Unfortunately, Apple hides the critical controls, so you have to know the secret incantation for accessing the real/good version.

In System Preferences > Displays > Color, there's a "Calibrate..." button that leads to a mechanism that seems wholly dysfunctional in my testing, but if you invoke the more useful version via secret incantation—holding down the Option key while clicking the "Calibrate..." button—then you hopefully will find a hidden "Expert Mode" checkbox that reveals previously-hidden tools in the next steps, which worked amazingly well in my experience to get a good, sharp image from my display. And you can name and save a variety of these "display profiles" and switch among them as you like. (Uncheck "Show profiles for this display only" to see a variety of other interesting options.)
In all the years the "Calibrate" function has been available to us, I've never used a straight-from-the-factory Mac that had a properly calibrated display. Ric's advice is spot-on. Yes, do calibrate it using the function within the System Pref as a starting point; the display's colors will become richer and everything just looks better (and more accurate)!

If you are a professional involved in the graphics arts or photography, you might want to consider a hardware calibration system (these are expensive). But in my case, my 2015 27" iMac (calibrated using the system function) provides pretty accurate results in Lightroom when I commit my edits to paper or when I upload jpegs to one of the companies creating my holiday cards.
 


Ric Ford

MacInTouch
If you are a professional involved in the graphics arts or photography, you might want to consider a hardware calibration system (these are expensive).
I was on the verge of buying a (less expensive) hardware calibration tool, but the reviews on Amazon said that the software was terrible, and I don't need any extra problems at this point, so I didn't buy one. But if anyone has tips on a good option and procedure for using it easily and effectively, I'd appreciate knowing about that.
 


In all the years the "Calibrate" function has been available to us, I've never used a straight-from-the-factory Mac that had a properly calibrated display. Ric's advice is spot-on. Yes, do calibrate it using the function within the System Pref as a starting point; the display's colors will become richer and everything just looks better (and more accurate)!
This shouldn't be surprising. The correct calibration for a display depends as much on the ambient room lighting as on the display itself.

If you calibrate it "perfectly" and move the computer to another room with different lighting, it will no longer be perfect. Depending on how sensitive/picky/fussy you are, you may want to calibrate it multiple times under different conditions (e.g. dark room, sunlight, incandescent light, etc.) and switch profiles when the room light changes.
 


It's a bit confusing, but it became clear when I got the 27-inch LG 27UK650-W display, which is marketed as a "4K HDR" display but which, incredibly, can actually show 8K images on the screen, as well as 5K and a huge variety of other resolutions
That's not entirely correct. Assuming the Amazon specs are correct, the panel's native resolution is 3840 x 2160 - a 4K display. It may be able to sync to a 5K or 8K signal and down-sample it to the physical resolution, but that's not putting any more information on the screen. It's just consuming more video memory and bandwidth.

What's confusing is Apple's marketing. The iMac "5K" actually displays our standard 27-inch 2560x1440 picture on the screen... except it doubles/quadruples all the pixels, so there are 5K worth of them, making for a higher-quality image, though you see the same-sized objects and type as you would on a standard 2560x1440 (i.e. the same content as with a non-retina/non-"HiDPI" screen).
This is what Apple's been doing (in various ways) for years.

Apps draw images in terms of virtual pixel coordinates which are independent of the display's DPI. Apps always work in terms of "points" (1/72 of an inch) for the coordinate space, regardless of the resolution (using floating-point numbers to deal with the fact that there may be multiple pixels between points). The "backing store" (GPU memory storing the image) is typically 2x the resolution of the signal going to the display, and the display's scale factor determines the ratio of points to pixels.

This is the same tech that's been used for ages when printing, which is why a document you send to a 100dpi printer (e.g. an ink-jet printer in draft mode) ends up the same size as that document sent to a 2400dpi laser printer. The high DPI allows everything to be printed with more detail and better halftoning, but everything comes out the same size.

On non-retina displays, the ratio of points to backing-store pixels is 1:2, so everything is rendered at double size. Then the GPU downsamples it to produce the video signal. This produces high-quality anti-aliasing, because the image contains data about the colors "between" the pixels.

With a retina display (as I understand it), the signal sent to the display is always the display's native resolution, so the backing store memory is always double that resolution. The scale factor you set in the Displays preference pane determines the points-to-pixels ratio. When it is set to "larger text", you get a larger ratio - more pixels per point, resulting in a larger and sharper image. When it is set to "more space", you get a smaller ratio - fewer pixels per point, resulting in a smaller and less sharp image. At the extreme end of "more space", you're down to the same 2:1 ratio used for non-retina displays, producing extremely small type on a retina display.
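As a rough sketch of that model (illustrative only, not Apple's actual API), mapping app-facing point coordinates into the backing store is just multiplication by the scale factor:

```python
def point_to_pixel(x_pt, y_pt, scale_factor):
    """Map a point coordinate (points are 1/72") to a backing-store pixel."""
    return x_pt * scale_factor, y_pt * scale_factor

def backing_store_size(signal_w, signal_h, factor=2):
    """The backing store is typically double the outgoing signal per side."""
    return signal_w * factor, signal_h * factor

# On a 2x display, a window corner at point (100, 50) lands at
# backing-store pixel (200, 100).
print(point_to_pixel(100, 50, 2.0))   # (200.0, 100.0)

# Fractional scale factors (e.g. 1.25x) put points between pixels,
# which is why the coordinate space uses floating point.
print(point_to_pixel(100, 50, 1.25))  # (125.0, 62.5)
```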

One thing that I'm curious about is what is actually happening when you use SwitchResX to select an 8K image for a 4K retina display. Is the Mac sending an 8K signal to the display (probably meaning a 16K backing store and a 2:1 ratio)? Is it setting a 1:1 point-pixel ratio on the existing 8K backing store and therefore just reducing the size of everything? Something else? I can't imagine anyone wanting to do real work in this mode, since everything would end up unreadably small, but I am curious about what is actually going on behind the scenes.
 


Ric Ford

MacInTouch
That's not entirely correct. Assuming the Amazon specs are correct, the panel's native resolution is 3840 x 2160 - a 4K display. It may be able to sync to a 5K or 8K signal and down-sample it to the physical resolution, but that's not putting any more information on the screen....
Just a quick follow-up, while I'm also involved in email discussions on this topic:

The 27-inch LG 27UK650-W has its own on-screen display that provides some related options and information about that hardware, independent of anything happening on the Mac.

This shows 3840 x 2160 for the monitor itself with SwitchResX settings of:
  • 6720 x 3780
  • 1080p HiDPI
  • 5K, etc.
With SwitchResX set to 1080p (non-HiDPI), the display says it's at 1920 x 1080, as expected.

The monitor itself can also be adjusted to different modes. So, you can pick "1:1", and a 1080p non-HiDPI image shrinks to cover only that number of pixels in the middle of the screen. Other modes expand the same image to fill the screen.

Meanwhile, setting SwitchResX to a high-resolution mode, e.g. 5K or 8K non-HiDPI/retina, results in macOS screen captures of that full set of pixels, not anything smaller.

And you see a desktop of that size on the screen (e.g. 5K), although the Mac/monitor system combination apparently creates such images through some sort of magic manipulation of the 4K pixels actually contained in the display panel. (We don't know yet how this is done.)

Lastly, the Mac does its own anti-aliasing internally (visible in screen captures), which adds a little extra confusion/complexity (and includes things like "Font Smoothing" preferences plus other mechanisms outside the user's control).
 


Ric Ford

MacInTouch
The 27-inch LG 27UK650-W has its own on-screen display that provides some related options and information about that hardware, independent of anything happening on the Mac.

This shows 3840 x 2160 for the monitor itself with SwitchResX settings of:
  • 6720 x 3780
  • 1080p HiDPI
  • 5K, etc.
Despite the 27" display telling us that it's set to 4K (3840 x 2160), it actually handles a SwitchRes setting equivalent to Apple's iMac "5K" retina display, and it looks quite good in that mode, whatever is going on behind the scenes!
 


The monitor itself can also be adjusted to different modes. So, you can pick "1:1", and a 1080p non-HiDPI image shrinks to cover only that number of pixels in the middle of the screen.
This is also a good way to demonstrate (for others) the advantage of a retina/HiDPI screen vs. simply using a lower resolution.

A 1080p image scaled to full-screen by the display is going to look different from a 1080p HiDPI screen (which is actually a 4K image with scaling applied to the rendering pipeline so the graphics are the same size as they would be on a 1080p screen). The former will show a more chunky/blurry image than the latter. But the latter will use more video RAM and GPU cycles.
Despite the 27" display telling us that it's set to 4K (3840 x 2160), it actually handles a SwitchRes setting equivalent to Apple's iMac "5K" retina display, and it looks quite good in that mode, whatever is going on behind the scenes!
Since you're only reducing the image's dimensions by 25% (5120x2880 => 3840x2160), text should be readable at normal point sizes as long as the scaling algorithm (wherever it is implemented) is using interpolation and isn't just discarding every fourth row and column. The minimum legible point size is likely to be larger when in 5K mode vs. 4K mode, but I wouldn't expect any other usability issues.
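The difference between discarding samples and interpolating can be shown with a toy 1-D example at the same 4:3 ratio (5120 -> 3840). The weights below are a crude box-filter approximation, purely for illustration; real scalers use better filters (bilinear, Lanczos, etc.):

```python
# Downscale a 1-D row of pixel values from 4 samples to 3
# (the same 4:3 ratio as 5120 -> 3840).

def decimate(row, keep_every=4, drop_index=3):
    """Naive scaling: drop every fourth sample outright."""
    return [v for i, v in enumerate(row) if i % keep_every != drop_index]

def box_filter(row):
    """Average the source samples overlapping each destination sample.
    Each destination pixel covers 4/3 source pixels; crude fixed weights."""
    out = []
    for j in range(0, len(row), 4):
        b = row[j:j + 4]
        out.extend([
            (b[0] * 3 + b[1]) / 4,  # weighted toward sample 0
            (b[1] + b[2]) / 2,      # midpoint of samples 1-2
            (b[2] + b[3] * 3) / 4,  # weighted toward sample 3
        ])
    return out

row = [0, 100, 100, 0]    # a thin bright stroke on a dark field
print(decimate(row))      # [0, 100, 100] - stroke shifts and fattens
print(box_filter(row))    # [25.0, 100.0, 25.0] - stroke stays centered
```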
 


Ric Ford

MacInTouch
A 1080p image scaled to full-screen by the display is going to look different from a 1080p HiDPI screen (which is actually a 4K image with scaling applied to the rendering pipeline so the graphics are the same size as they would be on a 1080p screen). The former will show a more chunky/blurry image than the latter.
Yes, that's quite apparent in testing. Using the HiDPI/retina version of 1080p results in thinner, "cleaner" text, while using non-retina resolution gives text a bit thicker, darker appearance.

The display hardware (on-screen display options, selected via a "joystick" control in the display) also offers smoothing options that make quite a difference in results.
 


In all the years the "Calibrate" function has been available to us, I've never used a straight-from-the-factory Mac that had a properly calibrated display. Ric's advice is spot-on. Yes, do calibrate it using the function within the System Pref as a starting point; the display's colors will become richer and everything just looks better (and more accurate)!

If you are a professional involved in the graphics arts or photography, you might want to consider a hardware calibration system (these are expensive). But in my case, my 2015 27" iMac (calibrated using the system function) provides pretty accurate results in Lightroom when I commit my edits to paper or when I upload jpegs to one of the companies creating my holiday cards.
Hardware calibration beats the heck out of anything you will achieve by fiddling with the built-in OS controls and eyeballing it. One purchase I have never for one second regretted was hardware calibration. I recommend the X-Rite i1 Pro. Look for deals on it right after the Massacre of the Turkeys.
 



Ric Ford

MacInTouch
There are lots of specifications and details in this Apple support document:
Apple said:
Use 4K displays, 5K displays, and Ultra HD TVs with your Mac

You can use 4K displays and Ultra HD TVs with these Mac computers:
  • iMac (27-inch, Late 2014) and later
  • iMac Pro (2017)*
  • Mac mini (Late 2014) and later*
  • Mac Pro (Late 2013)*
  • MacBook (Retina, 12-inch, Early 2015) and later
  • MacBook Air (Early 2015) and later
  • MacBook Pro (Retina, Late 2013) and later
* You can learn more about connecting multiple displays to your iMac Pro, Mac mini (2018), or Mac Pro (Late 2013).
...
 



There are lots of specifications and details in this Apple support document:
You can get all kinds of HiDPI resolutions using SwitchResX with the MSI Radeon RX 560 that Apple specifies for a Mac Pro 5,1 - apparently on any monitor, including my main one, an Apple 30" Cinema Display.

The highest HiDPI resolution is 4K (3840 x 2160), but there's no option to scale the system fonts accordingly like you can with a Retina display - unless someone knows of a way to fool Mojave?

For me, the standard 2560 x 1600 (that's 16:10 vs. today's standard 16:9 ratio) is still the best one for normal use. There's a HiDPI 2560 x 1440 option - 16:9 - but that means giving up about 1-3/4" of vertical screen real estate to black bars, and the improvement in quality is microscopic at best (unless you zoom with Control/swipe up, in which case the difference is obvious). But HiDPI resolutions up to 3360 x 1890 are very useful for some applications.

Of course, there are also some silly resolutions like 7680 x 4320, but that serves no purpose other than causing you to restart to escape from it.
 


Ric Ford

MacInTouch
The highest HiDPI resolution is 4K (3840 x 2160), but there's no option to scale the system fonts accordingly like you can with a Retina display - unless someone knows of a way to fool Mojave?
This is a little confusing. In SwitchResX, "3840 x 2160 HiDPI" should display Mac objects at twice the size of 7680 x 4320 non-HiDPI, using the same total number of pixels. And all 3840 x 2160 modes should show objects at half the size of objects in 1080p modes.
 


This is a little confusing. In SwitchResX, "3840 x 2160 HiDPI" should display Mac objects at twice the size of 7680 x 4320 non-HiDPI, using the same total number of pixels. And all 3840 x 2160 modes should show objects at half the size of objects in 1080p modes.
The 3840 x 2160 HiDPI setting shrinks the entire image down to fit on the 30" Cinema Display, system fonts/Mac objects and all. That results in black bands at the top and bottom, because the screen is 16:10 rather than 16:9.

My understanding is that you get a different control panel if you're using a Retina display (presumably on a Mac that supports it). It looks like that also enlarges the system fonts if you fit more picture on the same screen in an HiDPI setting?
 


The highest HiDPI resolution is 4K (3840 x 2160), but there's no option to scale the system fonts accordingly like you can with a Retina display - unless someone knows of a way to fool Mojave?
I think there is a misunderstanding (possibly on my part) about what SwitchResX means by a "HiDPI" screen.

As I understand it, any HiDPI mode will produce a video signal at your display's native resolution. The "resolution" setting is used to set a scale factor so everything renders to a size equivalent to the selected resolution.

In other words, if you have a 4K (3840x2160) display and you set it to 1920x1080 HiDPI, you will get a 4K screen but with a 2x scale factor, making objects render at the same size that they would on a non-HiDPI 1920x1080 screen.

A 3840x2160 HiDPI screen, displayed on a 4K display, seems pointless to me. It would be generating a 4K desktop with a scale factor for a 4K image (1:1) - no different from a non-HiDPI 3840x2160 resolution. If displayed on a higher-resolution screen (e.g. 5K or 8K), then it would produce that display's native resolution with a scale factor (e.g. 4/3x or 2x, respectively) in order to produce an image of equivalent size to a 4K signal sent to that screen.

Apple's preference panel is doing the same thing, but they present it differently to the user. It sees a "retina" display and always goes into "HiDPI" mode, presenting the scaling factors as actual scaling factors instead of as the resolution equivalents that SwitchResX does.
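Under that interpretation, the scale factor falls out of the panel's native width and the selected "looks like" width. This is a made-up helper, not SwitchResX's or Apple's code (note that 5120/3840 works out to 4/3, about 1.33x):

```python
def scale_factor(panel_native_w, looks_like_w):
    """HiDPI modes drive the panel at native resolution; the selected
    'resolution' only sets how large on-screen objects render."""
    return panel_native_w / looks_like_w

# 4K panel in "1920x1080 HiDPI": 2x scale, objects sized as on 1080p.
print(scale_factor(3840, 1920))  # 2.0

# 4K panel in "3840x2160 HiDPI": 1:1, same as non-HiDPI 4K.
print(scale_factor(3840, 3840))  # 1.0

# 5K panel showing a 4K-equivalent desktop: 5120/3840 = 4/3 (~1.33x).
print(scale_factor(5120, 3840))
```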
 


Ric Ford

MacInTouch
My understanding is that you get a different control panel if you're using a Retina display (presumably on a Mac that supports it).
I just tested on a 2017 MacBook Air without a retina display (running macOS 10.12 Sierra). SwitchResX didn't display any "HiDPI" modes for the internal screen, but it allows choosing higher resolutions than Apple's Displays preference pane does (and, thus, smaller on-screen objects than Apple's software allows).

With the MacBook Air connected via DisplayPort to an external 4K monitor, SwitchResX shows HiDPI modes, and Apple's Displays preference pane shows the same sort of options shown for Apple MacBook Pro retina displays.
 


Ric Ford

MacInTouch
In other words, if you have a 4K (3840x2160) display and you set it to 1920x1080 HiDPI, you will get a 4K screen but with a 2x scale factor, making objects render at the same size that they would on a non-HiDPI 1920x1080 screen.
That's correct. SwitchResX refers to this resolution as "1080p" (e.g. 1080p60 for 60Hz) and either HiDPI or not: you get the same size but different appearances for non-HiDPI vs. HiDPI.
Apple's preference panel is doing the same thing, but they present it differently to the user. It sees a "retina" display and always goes into "HiDPI" mode, presenting the scaling factors as actual scaling factors instead of as the resolution equivalents that SwitchResX does.
Selecting "Default for display" in Apple's Displays preference pane for a 27" 4K monitor switches the display mode to 1080p60 HiDPI (as shown in SwitchResX). In Apple's "Scaled" preference pane options, this is shown as "Larger Text"/"Looks like 1920 x 1080."

You can scale this up to "More Space"/"Looks like 3840 x 2160" in Apple's preference pane, which is 3840 x 2160 non-HiDPI in SwitchResX. And you can choose several intermediate retina/HiDPI resolutions in both Apple's preference pane and SwitchResX, including 2560 x 1440 HiDPI (like Apple's 27-inch iMac "5K" or iMac Pro).
 

