
sd2020

macrumors newbie
Jul 27, 2021
Australia
I bought a BenQ EW3280U today and I'm having the same resolution issue on my M1 Mac mini.
The perfect resolution, 3008 x 1692, is only available over the USB-C cable, but somehow I'm getting a maximum of 30 Hz instead of 60 Hz via USB-C. HDMI runs at 60 Hz, but everything is either too big or too small.

Pretty bummed out, tbh. :( Does anyone know a fix for all of this?

USB-C vs HDMI
I can get 3008x1692 @ 60 Hz on a Dell U3219Q with an M1 Mac mini using a 0.8 m Apple Thunderbolt cable. I use the monitor at its native resolution. However, the monitor's USB hub switching glitches when using the USB-C cable; the issues appear to be related to the Dell firmware when using the monitor's USB-C input.
 

jdmc

macrumors member
Nov 12, 2018
A lot of folks commenting in this thread seem to be having difficulties trying to use "scaled" display resolutions. Let me take a shot at explaining what's going on.

The process can be abstractly understood by dividing it into three parts: (1) the macOS UI rendering engine; (2) the frame-buffer downscaling engine; and (3) the video datastream output engine.

Part 1, the UI rendering engine, only works at two pixel densities: the "normal" density that has been the same in macOS for decades, and a "HiDPI" density, which was created when the first "Retina" displays came out. The "HiDPI" mode basically assumes that each display pixel is half as wide, and half as tall, as a "normal" display pixel, and therefore renders everything across twice as many pixels in each dimension. (In other words, in terms of the number of pixels used, it draws everything twice as large.) Now, you might think that if you have an intermediate display resolution selected, that macOS actually renders the screen image directly into that pixel density, using a non-integer scaling factor, but it doesn't; there are only "normal" (1x) and "HiDPI" (2x) rendering modes.
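To make that concrete, here's a tiny Swift sketch (the type and function names are made up for illustration, not anything from Apple's actual APIs): a size expressed in points maps to exactly 1x or 2x as many pixels in each dimension, never to a fractional factor.

```swift
// Illustrative only: macOS renders at exactly 1x ("normal") or 2x ("HiDPI"),
// never at a fractional density.
struct Size { let width: Int; let height: Int }

func renderedPixels(forPointSize size: Size, hiDPI: Bool) -> Size {
    let factor = hiDPI ? 2 : 1                 // only 1 or 2, nothing in between
    return Size(width: size.width * factor, height: size.height * factor)
}

// A hypothetical 500 x 400 point window:
let window = Size(width: 500, height: 400)
let normal = renderedPixels(forPointSize: window, hiDPI: false)   // 500 x 400 px
let retina = renderedPixels(forPointSize: window, hiDPI: true)    // 1000 x 800 px
print("1x: \(normal.width) x \(normal.height), 2x: \(retina.width) x \(retina.height)")
```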

Helpful background info: In ancient times, when displays were made with cathode-ray tubes, the images on screen were produced by a sweeping beam of electrons impacting the back side of the display glass, which had a phosphorescent coating that would emit light when excited by the passing beam. The beam itself was an analog physical phenomenon, rather than a discrete digital one. If you ran a CRT at a resolution of 1024x768, for example, the beam would actually trace 768 discrete horizontal lines across the screen. If you switched to a resolution of 1280x960, then the display would trace 960 discrete lines. The phosphors would simply light up wherever the beam hit them; there was no fixed "native" resolution of the screen.

Modern LCD/LED displays are completely different: each has a fixed grid of physically discrete display pixels, the size and arrangement of which can't be changed. Therefore, if a display receives a digital video signal with a resolution different from its own, it must mathematically interpolate the provided signal into its native resolution in order to display it. Interpolating from a lower (coarser) resolution signal to a higher (finer) pixel grid is always going to result in some blurriness or jaggedness in the visible result. Interpolating from a higher-resolution signal to a lower-resolution pixel grid fares much better, but is dependent on the quality of the algorithm used, and the computational resources available for the task. (As far as I'm aware, displays are not normally designed to accept incoming video signals that use a higher resolution than the display's own.)

So, let's suppose you have a 4K 16:9 display, with a 3840x2160 grid of physical pixels, and a diagonal screen size of 27". Let's further suppose that if you tell macOS to render to a matching-size 3840x2160 target frame buffer, using the "normal" pixel density, the result makes everything on the screen look too small to you; whereas if you have macOS use "HiDPI" mode (everything double the size in pixels), everything looks too big. Suppose, at that screen size, your most preferred visual size would result from rendering at a pixel density somewhere between "normal" and "HiDPI", but as we know, the renderer can't do that on its own. What then?

Well, one option is to tell macOS to render at "normal" (small-looking) pixel density, but to a frame buffer that's smaller than the actual screen dimensions; then send that signal to the display and let the display "upscale" each video frame it receives to its native resolution. (This is what's described in the "Displays" system settings as "low-resolution modes".) As explained above, this will allow the image to be displayed, but the upscaling will unavoidably give a somewhat fuzzy, blurry appearance to the visible image. In other words, it works, but the result isn't as sharp and pretty as it could be.
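A rough sketch of why this path looks soft, assuming a 3840x2160 panel and a few illustrative 1x signal sizes (my examples, not a list of what macOS actually offers): unless the stretch factor is a whole number, every source pixel gets smeared across fractional panel pixels.

```swift
import Foundation

// Hypothetical 1x frame sizes sent to a 3840 x 2160 panel in "low resolution" mode.
let panelWidth = 3840.0
let signals: [(Double, Double)] = [(1920, 1080), (2560, 1440), (3008, 1692)]

for (w, h) in signals {
    let factor = panelWidth / w      // same factor vertically for these 16:9 sizes
    let clean = factor.truncatingRemainder(dividingBy: 1) == 0
    let note = clean ? "integer, stays reasonably crisp" : "fractional, looks soft"
    let factorText = String(format: "%.3f", factor)
    print("\(Int(w)) x \(Int(h)) -> panel upscales by \(factorText) (\(note))")
}
```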

The other thing you can do is (1) tell macOS to render at "HiDPI" (big-looking) pixel density, but to a frame buffer that's larger than the actual screen dimensions; then (2) separately interpolate that higher-resolution rendering down to the actual native resolution of the display; and finally (3) send that interpolated video frame to the display in its own native resolution.

This latter method gives outstanding results, but the key thing to understand about it is that it places a much higher workload on the video subsystem of the computer, in two ways: (1) it requires a lot more memory to accommodate the extra-large video frame buffer; and (2) you're adding the whole new computationally-demanding task of interpolating/downscaling the larger frame buffer into the target resolution. Only after that process is completed can the result be sent to the display device.
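Here's the same idea as back-of-the-envelope Swift, reusing the 3840x2160 panel from the earlier example and a hypothetical "looks like 3008 x 1692" setting; the names are made up, and the point is only the ordering of render, downscale, send.

```swift
import Foundation

// Sketch of the scaled-HiDPI path: oversized render, then GPU downscale to native.
struct Frame {
    let width: Int
    let height: Int
    var pixelCount: Int { width * height }
}

let native    = Frame(width: 3840, height: 2160)   // the panel's fixed pixel grid
let looksLike = Frame(width: 3008, height: 1692)   // the UI size you actually want

// (1) Render the UI at 2x into an oversized frame buffer (6016 x 3384 here).
let rendered = Frame(width: looksLike.width * 2, height: looksLike.height * 2)

// (2) Downscale that render to the native grid (extra GPU work on every frame).
// (3) Send the native-resolution result to the display.
print("Rendered \(rendered.width) x \(rendered.height), sent \(native.width) x \(native.height)")
print(String(format: "Pixels rendered vs. a native-only setup: %.1fx",
             Double(rendered.pixelCount) / Double(native.pixelCount)))   // ~2.5x
```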

So, in theory, it really shouldn't matter what interface protocol is being used to convey the video signal from the computer to the display, as the video frames being sent are at the native resolution of the display, even when "scaling" is being used. Rather, the performance bottleneck lies in the capacity of the computer's video subsystem to access enough RAM to hold both the extra-large frame buffer and the downsampled conversion, and also to have access to the necessary computational resources to actually perform the conversion fast enough to keep up with the selected video refresh rate.

The most resource-intensive "scale" setting is always going to be the largest one that's still smaller than the display's native resolution. In that case, the RAM needed to hold the video data will likely be around four times what would be required for an ordinary native-size render, plus all the downscaling computation (which is completely unnecessary for a native-size render).
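As a quick sanity check on that "around four times" figure, assuming 4 bytes per pixel and a hypothetical "looks like 3360 x 1890" setting on a 3840x2160 panel (the exact scaled steps vary by display, so treat the numbers as illustrative):

```swift
import Foundation

// Rough memory arithmetic for the largest scaled mode below native, at 4 bytes/pixel.
func frameMiB(_ width: Int, _ height: Int) -> Double {
    Double(width * height * 4) / 1_048_576
}

let nativeMiB = frameMiB(3840, 2160)            // buffer actually sent to the panel
let renderMiB = frameMiB(3360 * 2, 1890 * 2)    // 6720 x 3780 oversized HiDPI render

print(String(format: "Native frame: %.0f MiB", nativeMiB))    // ~32 MiB
print(String(format: "HiDPI render: %.0f MiB", renderMiB))    // ~97 MiB
print(String(format: "Combined:     %.1fx a native-only setup",
             (nativeMiB + renderMiB) / nativeMiB))            // ~4.1x
```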

When Apple says that the M1 Mac mini "supports two 4K displays", it means at those displays' native resolution. Trying to run two displays at an intermediate "scaled" resolution can be as taxing on the computer's internal video subsystem as driving four or five such displays without scaling.

I hope this helps to explain the difficulties some of you have been experiencing.

~ Justin

________

~ Note: I've made several rounds of corrections & refinements to the text above in the five hours since first posting it. Hope you enjoy. ~
 

joevt

macrumors 604
Jun 21, 2012
In other words, in terms of the number of pixels used, it draws everything twice as large.
In terms of pixels in one dimension.
In terms of total pixels (or area), it draws everything four times as large.

Now, you might think that if you have an intermediate display resolution selected, that macOS actually renders the screen image directly into that pixel density, using a non-integer scaling factor, but it doesn't; there are only "normal" (1x) and "HiDPI" (2x) rendering modes.
Back in the macOS Leopard and Snow Leopard days, you could choose arbitrary scaling factors between 1x and 3x (like 2.5x).
https://www.tonymacx86.com/threads/adding-using-hidpi-custom-resolutions.133254/post-2175982
That ability might still exist but I haven't tried it (it would require some patching to create the alternate non-2x scaling modes).

Helpful background info: In ancient times, when displays were made with cathode-ray tubes, the images on screen were produced by a sweeping beam of electrons impacting the back side of the display glass, which had a phosphorescent coating that would emit light when excited by the passing beam. The beam itself was an analog physical phenomenon, rather than a discrete digital one. If you ran a CRT at a resolution of 1024x768, for example, the beam would actually trace 768 discrete horizontal lines across the screen. If you switched to a resolution of 1280x960, then the display would trace 960 discrete lines. The phosphors would simply light up wherever the beam hit them; there was no fixed "native" resolution of the screen.
No fixed native resolution that the display output circuitry can detect or be affected by, but there is a certain number of red/green/blue phosphors per inch horizontally and vertically (except that CRTs with an aperture grille, https://en.wikipedia.org/wiki/Aperture_grille, had vertically continuous phosphors; but even so, the beam has a minimum thickness, so lines could overlap vertically?).

(As far as I'm aware, displays are not normally designed to accept incoming video signals that use a higher resolution than the display's own.)
True, but some displays can. For example, my Acer XV273K 4K display can accept resolutions between 341x192 (or some stupidly low numbers like that) and 8K. Scaling up or down works as well as can be expected (for scaling up it uses interpolation; for scaling down it uses info from multiple input pixels for each output pixel).

This latter method gives outstanding results, but the key thing to understand about it is that it places a much higher workload on the video subsystem of the computer, in two ways: (1) it requires a lot more memory to accommodate the extra-large video frame buffer; and (2) you're adding the whole new computationally-demanding task of interpolating/downscaling the larger frame buffer into the target resolution. Only after that process is completed can the result be sent to the display device.
Frame buffers are not very large compared to the video RAM. I have tried a scaled mode of 16384x16384 with a Radeon Pro W5700. Performance of macOS is greatly reduced and you need to set the cursor size to max to see what you're clicking on.
A 16K x 16K frame buffer is 1 GB. The GPU has 8 GB total. The screenshots are ≈250 MB (the .png file format uses compression).

The maximum that you would normally expect to get is something like 8K = 127 MB.
Intel graphics will have less RAM available (1.5 GB) than a discrete GPU but it's enough for most jobs.
I suppose if you had an 8K display, then you might have scaled modes up to 16K = 507 MB.
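Those figures work out if you assume 4 bytes per pixel (my assumption); a quick check:

```swift
import Foundation

// Frame buffer sizes at 4 bytes per pixel, rounded up to whole MiB.
func bufferMiB(_ width: Int, _ height: Int) -> Int {
    Int((Double(width * height * 4) / 1_048_576).rounded(.up))
}

print("16384 x 16384:", bufferMiB(16384, 16384), "MiB")   // 1024 MiB = 1 GB
print(" 7680 x  4320:", bufferMiB(7680, 4320), "MiB")     // ~127 MiB (8K)
print("15360 x  8640:", bufferMiB(15360, 8640), "MiB")    // ~507 MiB (scaled up to 16K)
```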

One thing GPUs are supposed to do efficiently is scale things, so the W5700 can scale 16Kx16K to 4K without too much trouble 60 times per second.

When Apple says that the M1 Mac mini "supports two 4K displays", it means at those displays' native resolution. Trying to run two displays at an intermediate "scaled" resolution can be as taxing on the computer's internal video subsystem as driving four or five such displays without scaling.
If Apple says the M1 Mac mini supports two 4K displays, then they mean all the scaled modes that you can get for a single display should still work when two displays are connected. But you do have a point that scaling takes some resources; I would like to see benchmarks that measure the performance loss due to scaling.
 

waydabber

macrumors 6502
May 27, 2010
Just to chime in: the M1 is totally capable of producing scaled resolutions up to 7K, and it has zero problem mapping these resolutions to any physical display resolution. With the (free and open source) app BetterDummy you can create a virtual screen of any size, set it as the mirror main, and mirror it to your physical display (which can even be a 2K display), creating custom HiDPI resolutions of your liking. You can do this with multiple displays as well. It works beautifully on the cheapest M1 with 8 GB of RAM. Since BetterDummy creates dozens upon dozens of resolution options in small increments, you can easily customize the size of the screen to whatever best fits your needs.
 

joevt

macrumors 604
Jun 21, 2012
Just to chime in: the M1 is totally capable of producing scaled resolutions up to 7K, and it has zero problem mapping these resolutions to any physical display resolution. With the (free and open source) app BetterDummy you can create a virtual screen of any size, set it as the mirror main, and mirror it to your physical display (which can even be a 2K display), creating custom HiDPI resolutions of your liking. You can do this with multiple displays as well. It works beautifully on the cheapest M1 with 8 GB of RAM. Since BetterDummy creates dozens upon dozens of resolution options in small increments, you can easily customize the size of the screen to whatever best fits your needs.
I think you've done testing with external displays, but what about the internal display of an M1 Mac? Can you describe how BetterDummy works with the notch of the M1 Max or M1 Pro MacBook Pro? There might be an issue as described at https://forums.macrumors.com/thread...rt-for-m1-macs-and-more.2284799/post-30543013
 

waydabber

macrumors 6502
May 27, 2010
Hey @joevt, thanks for the info. Actually, I had no idea, as I don't have an nMBP. BetterDummy works with internal displays as well, but I think macOS in general can't really handle the situation well when a display is mirrored to the nMBP's notched screen (Apple probably considers this a rare use case, and I think they're right).

I can add scaling options for the native display size of the 14" and 16" MBP. The only problem is that these displays have a super weird aspect ratio that doesn't scale really well (for example, the native 14" resolution of 3024x1964 has a greatest common divisor of only 4), but I can do approximations. The notch will probably still cover part of the screen, and apps will not work around it.
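To show what I mean about the aspect ratio (plain Swift arithmetic, nothing BetterDummy-specific): 3024x1964 only reduces by a factor of 4, so there's no small whole-number ratio to work with.

```swift
// Greatest common divisor of the 14" MacBook Pro panel dimensions.
func gcd(_ a: Int, _ b: Int) -> Int { b == 0 ? a : gcd(b, a % b) }

let (width, height) = (3024, 1964)
let divisor = gcd(width, height)
print("GCD: \(divisor)")                                         // 4
print("Reduced ratio: \(width / divisor):\(height / divisor)")   // 756:491
```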
 