
BlueTide
macrumors regular
Original poster
This is mostly to vent my own frustrations and has little to no bearing on the actual products.

For some time now, I've not had a decent external monitor for my personal use. I sold the old one when I relocated to a new home, which also left me with just a late 2013 maxed-out MacBook Pro - a fantastic machine, btw. That is a little inadequate for my photography work, and it also doesn't drive modern displays too well.

So, I've been eyeing the new iMac Pros. I do some video, but really, the rendering times are almost irrelevant compared to the time I spend on edits and how often I actually use it. I could lie to myself about doing more of that in the future. Reason says that a standard iMac would be enough.

Yet I kind of want the Pro. Money is not a real problem; spending it foolishly just hurts me emotionally. The Pro also doesn't make much sense, since the display isn't that good for what I do. It's not bad, just not that great.

Other demands are being quiet and being sleek. I'd love a future Mac Display with a Mac Pro in a closet, hidden away and connected via a very long TB3 cable running to my desk. But I have no idea how much that will cost or when we'll even find out about it.

Gah. Choices. Now, where was that buy button again...
 
May I ask what you want from the display?
I mean, I assume resolution is not your problem with the iMac display. Is it that it's only dithered 10-bit? Or that it's P3 and not AdobeRGB? Because aside from that, I can't think of anything not superb on the iMac's display.
 
Sure; but you hit some notes already. A few additional ones:

- hardware calibration
- self calibrating regularly outside office hours
- HDR
- integrated kvm switch

Haven't seen the last three in one package, though. Edit: forgot a personal one: a larger screen than 27".
 
Since I can't really think of a way to phrase this that won't make it sound wrong, I'll preface it by saying that this is not some blind fanboy defence, but merely counter arguments to see what your opposing arguments would be

1) Hardware calibration
Well, first off, the factory calibration is really good, and gives an average DeltaE of like 0.6 or something (that's for the iMac, not the Pro, but I assume it's the same).
Second, calibration through ColorSync in macOS should give the same results as calibration through knobs and whatnot on the display, no?

2) Self calibrating
I have no idea how that should work, let alone how it should work well. Care to enlighten me?

3) Yeah, fair point - No argument there. It's about the only thing that genuinely disappoints me with the iMac displays as of right now.

4) KVM
Isn't that a really minor complaint though? Getting an external KVM switch shouldn't be that problematic?
Or perhaps get keyboards and mice that can handle several Bluetooth connections or something (obviously not ideal for all workflows) - also, this doesn't seem to be an issue with the display, more with the computer part, no?
 
..but merely counter arguments to see what your opposing arguments would be

That's cool, no worries. And this is partially a personal value choice too, so I understand that opinions may vary.

1) Hardware calibration
Well, first off, the factory calibration is really good, and gives an average DeltaE of like 0.6 or something (that's for the iMac, not the Pro, but I assume it's the same).
Second, calibration through ColorSync in macOS should give the same results as calibration through knobs and whatnot on the display, no?

Calibrations tend to drift over the lifetime of a product, so you need a solution no matter how good the factory default is. Well, assuming you care, that is. Software calibration does not adjust the display itself; it alters the data that is sent to the display, whereas with hardware calibration the computer sends the values straight to the display, which then takes care of the conversion. This has numerous benefits that matter more or less depending on what you do and how you work - and as long as a true 10-bit pipeline isn't working in all apps, it matters a bit more. I also quite like how easy it often is to change calibration targets from the display as needed, without software workarounds.
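
To make that concrete, here's a minimal sketch (my own illustration, not anything Apple or any monitor vendor actually ships) of why a correction applied in software to an 8-bit framebuffer costs you tonal levels, while a hardware LUT leaves the values the computer sends untouched. The gamma numbers are made up for the example; the point is only that re-quantising to 8 bits after the curve merges neighbouring levels.

import numpy as np

# All 256 input levels of an 8-bit framebuffer.
levels = np.arange(256)

# An example software calibration curve, e.g. correcting a native gamma of
# ~2.4 down to a 2.2 target (the exact curve is just for illustration).
corrected = 255.0 * (levels / 255.0) ** (2.2 / 2.4)

# Software calibration: the OS applies the curve and re-quantises to 8 bits
# BEFORE the signal leaves the computer.
software_out = np.round(corrected).astype(np.uint8)
print("distinct levels after software calibration:", len(np.unique(software_out)))
# -> fewer than 256: several input levels collapse onto the same output value,
#    which is what shows up as banding in smooth gradients.

# Hardware calibration: the computer sends the original 0..255 values untouched;
# the display applies the same correction in its own (typically 10-16 bit) LUT,
# so no input levels are lost on the way there.
hardware_sent = levels
print("distinct levels sent with hardware calibration:", len(np.unique(hardware_sent)))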

2) Self calibrating
I have no idea how that should work, let alone how it should work well. Care to enlighten me?

Your monitor wakes up at 4am every Sunday (or what have you), pops out a calibrator, calibrates, and goes back to sleep. Ready for work when I wake up.

4) KVM
Isn't that a really minor complaint though? Getting an external KVM switch shouldn't be that problematic?

This is minor, you're right. Especially given that the iMac, to my knowledge, does not support external sources anyway *cough*. It would just be terribly nice for me to have one display that, with a press of a button, can switch to command another 'puter (say, the Linux box that does the real number crunching). Less hassle, nicer, cleaner desk.
 
Your monitor wakes up at 4am every Sunday (or what have you), pops out a calibrator, calibrates, and goes back to sleep. Ready for work when I wake up.

But how can it do accurate calibrations? Surely the display only knows what data it receives, not how it actually looks on the display. Unless there's a spectrometer or something attached.

Calibrations tend to drift over the lifetime of a product, so you need a solution no matter how good the factory default is. Well, assuming you care, that is. Software calibration does not adjust the display itself; it alters the data that is sent to the display, whereas with hardware calibration the computer sends the values straight to the display, which then takes care of the conversion. This has numerous benefits that matter more or less depending on what you do and how you work - and as long as a true 10-bit pipeline isn't working in all apps, it matters a bit more. I also quite like how easy it often is to change calibration targets from the display as needed, without software workarounds.

Assuming you only work in OS X/macOS, I don't see the issue here though? Since the OS uniformly handles all the data going to the display, if it converts all the signals to the "calibrated" signals, shouldn't that be fine?

This is minor, you're right. Especially given that the iMac, to my knowledge, does not support external sources anyway *cough*. It would just be terribly nice for me to have one display that, with a press of a button, can switch to command another 'puter (say, the Linux box that does the real number crunching). Less hassle, nicer, cleaner desk.

Yeah. Prior to the 5K displays you could put the computer in Target Display Mode (which required a restart, so it still wouldn't be as fluid as your proposed workflow) and use external sources. But when the 5K displays came out, no single cable could carry the data stream, so Apple cut support for it and hasn't added it back in. TB3 could support it in theory, though.
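
For a rough sense of the numbers, here's a back-of-the-envelope sketch (it ignores blanking intervals and protocol overhead, so take the figures as approximate):

# Rough arithmetic for why 5K @ 60 Hz was a problem for a single cable at the
# time (blanking intervals and protocol overhead ignored).
width, height, refresh = 5120, 2880, 60
bits_per_pixel = 30  # 10 bits per RGB channel

pixel_rate = width * height * refresh          # pixels per second
raw_gbps = pixel_rate * bits_per_pixel / 1e9   # uncompressed video data rate

print(f"5K @ 60 Hz, 10-bit: ~{raw_gbps:.1f} Gbit/s")   # ~26.5 Gbit/s
print("DisplayPort 1.2 payload: ~17.3 Gbit/s")          # why one DP 1.2 link wasn't enough
print("Thunderbolt 3 link:      40 Gbit/s")             # why TB3 could carry it in theory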
 
But how can it do accurate calibrations? Surely the display only knows what data it receives, not how it actually looks on the display. Unless there's a spectrometer or something attached.

Precisely. Assumption is that there is an external, or integrated, colorimeter/spectrometer. From experience, integrated ones that automate all this are such a nice feature...

Assuming you only work in OS X/macOS, I don't see the issue here though? Since the OS uniformly handles all the data going to the display, if it converts all the signals to the "calibrated" signals, shouldn't that be fine?

Nope. Well, it depends on what counts as fine. For the vast, vast majority, it's fine. The issue is more pronounced in the 8-bit realm (which is why I use that as an example here), where software profiling basically means you don't get the full breadth of the signal to the monitor. Hardware (3D) LUTs help with this, even if they don't fully remedy the issue. In a full 10-bit world it's less of a problem, but we ain't there yet. And at the same time, expanding color gamuts require that much more.

Since you seem interested, here's an old read, but the same principles still apply: http://www.eizo.com/library/basics/maximum_display_colors/

There WAS a claim a few days back that the iMac Pro would have both hardware calibration and LUT support, but the post did not offer any references. We'd still need to know how to calibrate it.
 
Precisely. Assumption is that there is an external, or integrated, colorimeter/spectrometer. From experience, integrated ones that automate all this are such a nice feature...

Ah. That makes a lot of sense then. Never heard of systems like that before, but it sounds awesome!

Since you seem interested, here's an old read, but the same principles still apply: http://www.eizo.com/library/basics/maximum_display_colors/

Thank you! I am indeed interested. I've done video colour grading for a while on my iMac (not in a professional environment, but in what I'd call a high-end enthusiast way). All of what you said also makes perfect sense. It also disappointed me a bit when the iMac Pro was revealed that it wasn't true 10-bit but only dithered. Although I've never done a side-by-side test to see if I can see the difference. My current iMac also uses dithering, and I've heard that if you don't notice a flicker, the dithered image is as good as true 10-bit - and I don't see any flickering artifacts from the temporal dithering.
I would assume macOS' ColorSync does send 10-bit info to the dithering engine of the display though, so you'd still get all the information of 10 bits per sample even if it goes through the dithering workaround.
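
As a toy illustration of what that temporal dithering does - just the principle of alternating between the two nearest 8-bit levels, not how any actual panel implements its FRC:

import numpy as np

# Approximating a 10-bit level on an 8-bit panel by alternating between the
# two nearest 8-bit levels over several frames. Principle only; real FRC uses
# cleverer spatial/temporal patterns.
target_10bit = 517                   # a 10-bit code value (0..1023)
target = target_10bit / 1023.0       # as a fraction of full scale

lo = int(target * 255)               # nearest 8-bit level below
hi = min(lo + 1, 255)                # nearest 8-bit level above
frac = target * 255 - lo             # how often the higher level is needed

frames = 60                          # one second at 60 Hz
sequence = np.where(np.arange(frames) / frames < frac, hi, lo)

# The eye averages over time, so the perceived level sits between lo and hi.
perceived = sequence.mean() / 255.0
print(f"target {target:.4f}, perceived ~{perceived:.4f} (alternating {lo} and {hi})")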

There WAS a claim a few days back that the iMac Pro would have both hardware calibration and LUT support, but the post did not offer any references. We'd still need to know how to calibrate it.

I would assume Apple's way of doing that would be for ColorSync to send its colour profile to the display hardware, so the hardware calibration would still be done via the OS and not with knobs on the display or whatnot. Seems like the Apple way to do it. ColorSync detects that it's an iMac Pro display and, instead of applying the profile in software, sends it off to the display's hardware.

That's assuming hardware calibration is indeed a feature of course. Which I hadn't heard of before you mentioned it, and since it sounds like your source may be a bit unreliable too, who knows.
 
It also disappointed me a bit when the iMac Pro was revealed that it wasn't true 10-bit but only dithered.

Yeah. It's not such a big deal as such, all things considered - more like comparing faux leather to real leather. Faux can even be better, but it's... just... not right. None of these really matter on their own - there's no perfect package that I am aware of; the trade-offs are just different. And to reiterate, I didn't mean to imply that the iMac display is bad, not at all. Especially in the non-Pro, it's very nice bang for the buck.

I would assume Apple's way of doing that would be for ColorSync to send its colour profile to the display hardware, so the hardware calibration would still be done via the OS and not with knobs on the display or whatnot. Seems like the Apple way to do it. ColorSync detects that it's an iMac Pro display and, instead of applying the profile in software, sends it off to the display's hardware.

Don't know how they'd do that, but that's a pretty good idea! That could help third party calibrator companies too as it might become easier to support the ecosystem and they'd need to do less of their own software. Perhaps.
 
I am guessing that Apple is going to raise the bar with the promised modular Mac Pro - shall we call it the mMP? I think that for some, the iMac Pro will be a bridge to the mMP. If people think that the iMac Pro is expensive, I am guessing that the price of the mMP is going to take some posters' breath away.
 
I wouldn't be too surprised. I am not too worried about the cost as such, but more about the compromises I may be forced to make despite the high cost. In any case, it would be nice to know more about mMP to be able to make more informed calls.
 
Yeah. It's not such a big deal as such, all things considered - more like comparing faux leather to real leather. Faux can even be better, but it's... just... not right. None of these really matter on their own - there's no perfect package that I am aware of; the trade-offs are just different. And to reiterate, I didn't mean to imply that the iMac display is bad, not at all. Especially in the non-Pro, it's very nice bang for the buck.


Never got the impression you thought it was a bad display either. Just wanted to hear more about what you thought :)
I personally love my 2014 5K iMac, and get sad when I see "generic" displays on friends' laptops - and desktops, for that matter. There are of course other good displays out there, but what you see on the average person's desk doesn't often come close to what the iMac offers. In fact, and this is hilarious to me personally, I saw someone on a photography forum recommend a laptop that boasted support for "45% of the NTSC colour space". That was, btw, a 6-bit display as well. And it was a 2017 laptop...



Actually, maybe you could clear up another display confusion I have.

My iMac's display registers as 10-bit in macOS as "ARGB(2-10-10-10)": 10 bits each for red, green and blue. But why is alpha showing 2 bits? On my 8-bit MacBook Pro display it shows "ARGB(8-8-8-8)": 8 bits each for red, green and blue, and 8 for alpha. That confuses me a great deal, and I haven't been able to Google any answers.
 
I wouldn't be too surprised. I am not too worried about the cost as such, but more about the compromises I may be forced to make despite the high cost. In any case, it would be nice to know more about mMP to be able to make more informed calls.

I wouldn't be surprised if Apple are still messing around with prototypes for the new modular Mac Pro, as they didn't seem to have a clue what it was going to look like in June. For me, it's definitely going to arrive in 2019. I'd be amazed if it arrived December 2018.

I understand your comments, but you don’t seem to have much of a need for all this super high tech external monitor that you’re craving? Do you do photography professionally? With a 200-300% increase in photo editing speed with the iMac Pro, that should be a huge draw for you - you can always slap on an extra monitor at a later date. I've used a matte 17" MacBook Pro as a monitor for my video work and most people always say that my colours are really nice. Because of this, I've never really considered that I would need a monitor that's anything more.

If photography isn't your profession and it's just for fun, I'd just wait for a six-core iMac to be announced next year and be done with it. Otherwise, if having an external monitor is that important, then you could always hide the iMac Pro beneath your desk and use a first-class monitor above.

I can’t see true HDR coming to the iMac for some time - not unless it’s dumbed down HDR like the Panasonic TVs deliver.

I may be wrong though. Either way, stop your whining and crack on with life! ;p I’ve been using a 2011 MacBook Pro to run my business - I’m sure that 2013 one is good enough! ;D

I'm just teasing, but at least the iMac Pro is a choice now. All the best with your decision, and thanks for sharing a lot of knowledge in relation to screen tech too.
 
I can’t see true HDR coming to the iMac for some time - not unless it’s dumbed down HDR like the Panasonic TVs deliver.

Not so sure about that, honestly. I'm betting it'll be there for the next version. There's relatively little required in terms of hardware changes to meet the DisplayHDR 600 tier. In fact, I'd assume a software update could bring DisplayHDR 400. Bump the brightness by 100 nits and DisplayHDR 600 should be possible too.
https://www.anandtech.com/show/12144/vesa-announces-displayhdr-spec-and-tiers

I may be wrong though. Either way, stop your whining and crack on with life! ;p I’ve been using a 2011 MacBook Pro to run my business - I’m sure that 2013 one is good enough! ;D

I didn't hear any whining ;)

You may seriously need to get that 2011 replaced though. The GPU is a ticking time bomb that'll die any moment now.
Regarding display needs, well, they vary greatly depending on what exactly you do. Your photos may not, strictly speaking, require as much colour precision as the OP needs - for instance, if the OP works with gradients a lot, banding could be a bigger issue.
 
My iMac's display registers as 10-bit in macOS as "ARGB(2-10-10-10)": 10 bits each for red, green and blue. But why is alpha showing 2 bits? On my 8-bit MacBook Pro display it shows "ARGB(8-8-8-8)": 8 bits each for red, green and blue, and 8 for alpha. That confuses me a great deal, and I haven't been able to Google any answers.



I am not quite sure if I understand what you are asking. 10+10+10+2 = 32. 4x8 = 32. 2^5 = 32. Computers generally like numbers like that, but the exact reasons in this case evade my knowledge. Back in the day, that helped to do cheap bit-shift operations, you could pack data nicely into long ints etc., but I don't know how display signals and monitors work these days or whether that's still relevant.



Not that we need to pack everything into 32s. SGI, for one, used 12 bits per channel quite some time back already.
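
To show what that 32-bit packing looks like in practice, here's a small sketch of the generic ARGB(2-10-10-10) and ARGB(8-8-8-8) bit layouts (an illustration only, not Apple's actual framebuffer definitions):

def pack_argb2101010(a, r, g, b):
    # a: 0..3 (2 bits), r/g/b: 0..1023 (10 bits each) -> 2+10+10+10 = 32 bits
    return (a & 0x3) << 30 | (r & 0x3FF) << 20 | (g & 0x3FF) << 10 | (b & 0x3FF)

def pack_argb8888(a, r, g, b):
    # a/r/g/b: 0..255 (8 bits each) -> 4 x 8 = 32 bits
    return (a & 0xFF) << 24 | (r & 0xFF) << 16 | (g & 0xFF) << 8 | (b & 0xFF)

# A mid-grey pixel, fully opaque, in both formats:
print(hex(pack_argb2101010(3, 512, 512, 512)))   # 10 bits per colour, only 2 left over for alpha
print(hex(pack_argb8888(255, 128, 128, 128)))    # 8 bits per channel across the board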



I wouldn't be surprised if Apple are still messing around with prototypes for the new modular Mac Pro, as they didn't seem to have a clue what it was going to look like in June. For me, it's definitely going to arrive in 2019. I'd be amazed if it arrived December 2018.



Yeah, that wouldn't surprise me either. It may be a long wait. On the bright side, I have relatively little free time to enjoy a new 'puter anyhow.



I understand your comments, but you don’t seem to have much of a need for all this super high tech external monitor that you’re craving? Do you do photography professionally?



Need? Not strictly speaking, no. I do it as a hobbyist, but it gets rather intense at times. And a good monitor does help save quite some time by letting me switch between different targets. Those actually aren't that super high-tech… well, at least not that costly. Just the price of a good DSLR body or something.



At work it’s another matter but there I have an IT department that just gives/builds me what I need.



With a 200-300% increase in photo editing speed with the iMac Pro, that should be a huge draw for you - you can always slap on an extra monitor at a later date.



But that's the thing - it's not. In photo editing, most of the time is spent on various brush tools, curve tools and saving (large file sizes). Processor speed affects the real throughput very little; it's mostly manual labor. Yeah, I do some video and 3D at home too, but it's more aspirational than a real need. Thus the chat about monitors in the first place - a good one can speed up the other parts of the workflow more than a faster computer can.



Doesn’t mean that I don’t want the Pro. :) I think I am trying to talk myself into getting one.



I may be wrong though. Either way, stop your whining and crack on with life! ;p I’ve been using a 2011 MacBook Pro to run my business - I’m sure that 2013 one is good enough! ;D



I know, I know I should! :) This 2013 MBP is in fact still surprisingly good, even for editing Hasselblad files. It's just that it doesn't drive modern monitors anymore, and the occasional 3D and video work is a pain. Gah, life and choices... :)
 
Sounds like the original poster wants an EIZO ColorEdge. They don't have a KVM switch but have the other things listed: hardware calibration, a built-in calibrator and whatnot.

It would still require a computer. But if you did buy an iMac, why not have dual screens?
 
I am not quite sure if I understand what you are asking. 10+10+10+2 = 32. 4x8 = 32. 2^5 = 32. Computers generally like numbers like that, but the exact reasons in this case evade my knowledge. Back in the day, that helped to do cheap bit-shift operations, you could pack data nicely into long ints etc., but I don't know how display signals and monitors work these days or whether that's still relevant.

Well, my question basically comes down to: "What's the point of the alpha channel, and why is it (or isn't it) a problem that it's only 2 bits on the 10-bit display vs. 8 bits on the 8-bit display?" I get your point with the whole 32-because-binary argument, but if that's the only reasoning, surely something is lost by going from 256 possibilities for the alpha channel to only 4, and the 10-bit panel, in one way or another, doesn't surpass the 8-bit one that has 8 bits per alpha sample... What is the purpose of this alpha channel? It would seem obvious it had to do with opacity, but that's handled by the compositor anyhow and you don't need an alpha channel in the display for that.
 
Not so sure about that, honestly. I'm betting it'll be there for the next version. There's relatively little required in terms of hardware changes to meet the DisplayHDR 600 tier. In fact, I'd assume a software update could bring DisplayHDR 400. Bump the brightness by 100 nits and DisplayHDR 600 should be possible too.
https://www.anandtech.com/show/12144/vesa-announces-displayhdr-spec-and-tiers

I agree that it'll be their next step up with the iMac, but I'd be very surprised if Apple puts a better screen on an iMac than on the iMac Pro within a year of release. I think 2019, to coincide with the new Apple monitor. Otherwise, sort out the iMac Pro with the new display first and then the iMac...

I didn't hear any whining ;)

Me neither, it was a tongue-in-cheek wind-up - I'm British, it's our sense of humour! ;)

You may seriously need to get that 2011 replaced though. The GPU is a ticking time bomb that'll die any moment now.
Regarding display needs, well, they vary greatly depending on what exactly you do. Your photos may not, strictly speaking, require as much colour precision as the OP needs - for instance, if the OP works with gradients a lot, banding could be a bigger issue.

Yeah, the GPU has already died once and has been replaced. I have the iMac Pro coming the first week of January, so I'm hoping it can last another two weeks!

Yeah, I get what you're saying. I think colour correction with video is much harder than with photographs though, as the files themselves are more like poor JPEGs - the data per frame is vastly different, so it's a much harder task to colour correct film than photographs. Plus, banding is usually seen by zooming in a bit, and I'm assuming that an iMac 5K display would easily show this... but I know everyone has their own reasons.
 
What is the purpose of this alpha channel? It would seem obvious it had to do with opacity, but that's handled by the compositor anyhow and you don't need an alpha channel in the display for that.

As a disclaimer, we are now on topics where I don't recall the facts precisely. But, to my knowledge, for most displays the alpha channel is simply unused. The compositing is done beforehand, so all you send to the monitor is an RGB raster. However, that is/was not universally true. IIRC, there were devices that took the signal as input and did compositing of their own afterwards, and those could use the alpha. Say, a titling machine back in the 90s that put a banner over a live news feed, which used alpha for obvious reasons.
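
As a toy example of that "compositing is done beforehand" point - a plain Porter-Duff "over" blend, just to show that the alpha gets consumed on the computer's side and only RGB needs to reach the panel, not how Quartz actually implements it:

# An RGBA foreground composited onto an opaque background. The alpha is used
# here, before anything goes out; what reaches the panel is plain RGB.
def over(fg_rgba, bg_rgb):
    r, g, b, a = fg_rgba                       # channels as 0.0..1.0 floats
    return tuple(a * f + (1.0 - a) * bck for f, bck in zip((r, g, b), bg_rgb))

banner = (1.0, 1.0, 1.0, 0.5)                  # 50% transparent white overlay
video  = (0.2, 0.4, 0.6)                       # the "live feed" pixel underneath

print(over(banner, video))                     # -> roughly (0.6, 0.7, 0.8): RGB only, no alpha left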
 
  • Like
Reactions: RuffDraft
As a disclaimer, we are now on topics where I don't recall the facts precisely. But, to my knowledge, for most displays the alpha channel is simply unused. The compositing is done beforehand, so all you send to the monitor is an RGB raster. However, that is/was not universally true. IIRC, there were devices that took the signal as input and did compositing of their own afterwards, and those could use the alpha. Say, a titling machine back in the 90s that put a banner over a live news feed, which used alpha for obvious reasons.


That does make a lot of sense - I also didn't get why my Macs would need an alpha channel in the display, as the GPU does the compositing along with Quartz.
Thanks :)
 
Check out Vincent Laforet's take on the iMac Pro if you haven't already.

I think similarly to other posters here: if you're going to max out an iMac, you might as well truly max it out by getting a baseline iMac Pro if funds allow. You're getting a much newer, much better GPU, a much more reliable Xeon CPU with more cores and nearly equivalent single-core performance (and once your application makes use of more than 4 cores, the iMac Pro beats the iMacs hands down), and fast ECC RAM, which you could boost to 64GB and be out...

I get that the price increases significantly with everything that you add on, but it's up to you as to whether it's worth adding... you can always update the RAM at a later date, and the baseline GPU is still vastly better than the one you'll find inside the standard iMac...

As much as the new 2017 iMacs are great and powerful enough, the GPU is much older in comparison to the one inside the iMac Pro, the temperatures are slightly annoying/worrying, and with the iMac Pro's ports, speakers, 1080p FaceTime camera, UHS-II-capable SD card slot, etc., it's just a much more complete machine.

Granted, it suits other users more, but if the price difference is debatable for you, then I think you may end up regretting choosing the 2017 iMac over the iMac Pro. The 2017 iMac is still awesome, but the spec differences and additional ports make the baseline iMac Pro a bit of a bargain over a maxed-out iMac... you only lose out on the size of the RAM.

Either way, all the best with your purchase!
 