Geekbench does not represent real-world performance. Do you really expect an iMac with an i9-9900K to perform like a PC with an i9-9900K and a bigger cooler? Wow... You'd better check the size of the iMac's cooler.

That doesn't justify photographers having to spend much more on a Mac instead of a PC.

What they spend is their own choice. They can buy a cheap camera, too; it can do the job. They choose the tools that they feel will enable them to produce their best work. What that tool costs is irrelevant if they can cover the costs with the work they do. Therefore you can't say "$6000 is too expensive." That's an opinion, not a fact.
 

It is too expensive, since you can build a better PC for less, or a much better PC at $6000. Most of the people using MF cameras don't actually own them, because they're too expensive; the digital back alone costs more than $10,000.

Tell me, does the $6000 base Mac Pro perform better than an 8-core PC that costs less? People buying a Mac Pro for photo work have no choice, because the Mac Pro is the only proper desktop available on the Mac side.

If the Mac Pro were worth every single penny for photo work, I wouldn't even complain, but 8 cores for $6000? Seriously? A $500 Ryzen 9 3900X performs much faster than the Mac Pro's 16-core option. There is no way to justify Xeons and server parts for photography.
 

I can take killer photos with an old camera from ten years ago and a vintage used lens I paid $50 for, but it doesn't mean I want to. Very few business decisions are made purely on a theoretical performance-per-dollar basis; it's a complex calculus of what works best for you.
 
It's totally unclear to me why people are comparing a complete system with power supply and cooling (Mac Pro) to a bare chip (a "Ryzen" or "Threadripper"). They're not comparable.

Show me a comparable design that enables good sustained performance and then maybe it makes sense to compare.

Because the Intel CPUs in this thing are the major BOM cost (well, that and the unnecessarily difficult and expensive-to-manufacture case, using non-standard parts), and that's where the biggest savings could have been made while improving performance drastically in the VAST MAJORITY of scenarios and pulling even, or only slightly behind, in most of the rest. Intel leads in AI and machine learning, neither of which is big in the Apple market, and both are better handled by the Afterburner card or the GPUs anyhow.

PSU and cooling? Add a thousand dollars if you want to go overboard, even two thousand if you like. The Threadripper system will still be MUCH, MUCH cheaper.

This would not impact Apple's profitability, yet give end users a faster machine for less.

I'm not sure just why so many Apple fanboys here seem so dead set against getting better performance for less money...even without impacting Apple's profit margins at all.
 
You don't need a Xeon for photography, videography, or music production.



Sure thing. Here is one


This has tons of image viewer suggestions. But all of them are just bad apps.

And if you work in HDR


8-bit is fine for managing files, but I cannot keep switching the monitor profile just to manage files. A workflow by nature has to be smooth and free of unwanted interference.

The problem actually stems from Microsoft not giving enough weight to this issue. It's a simple fix, but they seem to have ignored it all along.

Don't get me wrong, I love Windows; in fact I use it for gaming a lot. It just didn't fit my flow. I know the Mac Pro is expensive and I probably don't need Xeon/ECC, but something that is known to JUST work is a better bet than experimenting at this stage. Plus our video production is FCPX-based, so that is another thing that ties us to the Mac. We are trying out DaVinci mainly for its color correction capabilities, but then again it should play well with our 10-bit monitors. Time and testing will tell.
I'm unsure if either of those two helped or not.

The first link appears to find fault with the default application Windows uses to open photos (i.e. the "Photos" app). The solution appeared to be to use a different application which honors color calibration (apparently the Photos app does not).
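For anyone wondering what "honoring color calibration" actually involves: a color-managed viewer converts each image from its embedded profile into the monitor's ICC profile before display. Here is a minimal sketch of that transform, assuming Pillow's ImageCms module is available; the monitor profile path is hypothetical and would normally come from the OS:

```python
# Minimal sketch of a color-managed image load using Pillow's ImageCms.
# The monitor profile path is hypothetical; a real viewer would ask the OS
# which ICC profile is assigned to the display the window is on.
import io
from PIL import Image, ImageCms

MONITOR_PROFILE = r"C:\Windows\System32\spool\drivers\color\MyCalibratedDisplay.icm"

def load_color_managed(path):
    img = Image.open(path).convert("RGB")
    embedded = img.info.get("icc_profile")  # profile baked into the file, if any
    if embedded:
        src = ImageCms.ImageCmsProfile(io.BytesIO(embedded))
    else:
        src = ImageCms.createProfile("sRGB")  # assume sRGB when untagged
    dst = ImageCms.getOpenProfile(MONITOR_PROFILE)
    # Convert pixels into the monitor's color space before handing them to the UI.
    return ImageCms.profileToProfile(img, src, dst, outputMode="RGB")
```

An application that skips this step sends the raw file values straight to a calibrated wide-gamut display, which is why the colors look off.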

The second appears to be a problem with HDR, and it didn't look like the actual cause of the problem was identified. That Windows is the problem was speculation on the poster's part, with nothing other than "it works fine with my Xbox" leading him to conclude it was a Windows issue. This is not to say Windows isn't the problem, but it was more confusing than helpful.

I appreciate the links though.
 
Because the Intel CPUs in this thing are the major BOM cost (well, that and the unnecessarily difficult and expensive-to-manufacture case, using non-standard parts), and that's where the biggest savings could have been made while improving performance drastically in the VAST MAJORITY of scenarios and pulling even, or only slightly behind, in most of the rest. Intel leads in AI and machine learning, neither of which is big in the Apple market, and both are better handled by the Afterburner card or the GPUs anyhow.

PSU and cooling? Add a thousand dollars if you want to go overboard, even two thousand if you like. The Threadripper system will still be MUCH, MUCH cheaper.

This would not impact Apple's profitability, yet give end users a faster machine for less.

I'm not sure just why so many Apple fanboys here seem so dead set against getting better performance for less money...even without impacting Apple's profit margins at all.

Uh, because one machine exists and the other doesn't? If both existed, I'd almost certainly go with the cheaper one, as I don't do video. There's that word again, "If".

Philip II of Macedon told the Spartans:
You are advised to submit without further delay, for if I bring my army into your land, I will destroy your farms, slay your people, and raze your city.
The Spartans replied with a single word: "If."
(stolen from Wikipedia)
 
Some buyers choose Macs over PCs to avoid Windows. I am one of them. I'm also someone who needs to make a profit, so choosing a "decent" 7,1 build at $12K over a $6K PC with roughly similar raw performance is essentially tossing away $6K of net. WTF?

1) I use Windows at work and the double edged sword of updating is frustrating
2) I typically work in close proximity to my computer - often near set on a cart - so loud fans are non grata.
3) TB3 I/O is useful and well supported
I am still puzzled by the concerns about updates. While the way Microsoft has implemented them wouldn't be my first choice, I haven't had any issues with them.

Example from last week: I recently updated my system to Windows 10 1909. Apparently there was an update for it one night because when I sat down at the system the next day I noticed this:

[Screenshot: task bar showing the Windows Update alert icon]

A little yellow dot indicating an update was pending installation. When I clicked on it the following appeared:

[Screenshot: dialog shown after clicking the update alert icon]

A dialog informing me that a restart was required. I could have scheduled the restart or restarted the system right then; I chose the latter, as I saw no reason to put it off before starting my day. The restart took less than 60 seconds (this on an Ivy Bridge laptop with a SATA HD). What I know wouldn't have happened is the system restarting during my work day: I have my active hours defined as 5:00 am until 10:00 pm, so there was no concern about Windows spontaneously restarting my system in the middle of the day.
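For what it's worth, the "active hours" window is just a pair of settings the OS consults before scheduling a restart. Here is a hedged sketch of reading it with Python's standard winreg module; the registry path and value names are my assumption about where current Windows 10 builds keep them, so verify on your own machine before relying on this:

```python
# Read the Windows Update "active hours" window via the registry.
# ASSUMPTION: the key path and value names below match Windows 10 1903+ behavior;
# check your own machine before depending on them.
import winreg

SETTINGS_KEY = r"SOFTWARE\Microsoft\WindowsUpdate\UX\Settings"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, SETTINGS_KEY) as key:
    start, _ = winreg.QueryValueEx(key, "ActiveHoursStart")  # hour of day, 0-23
    end, _ = winreg.QueryValueEx(key, "ActiveHoursEnd")      # hour of day, 0-23

print(f"Automatic restarts are blocked between {start}:00 and {end}:00")
```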

If I don't feel like updating, I don't have to. I can even pause the installation of updates for up to seven days if I'm concerned about them interrupting work:

[Screenshot: Windows Update settings with updates paused for one week]

I am really puzzled as to what the issue is with applying updates. It's not nearly as bad as people make it out to be. IMO it's certainly not a reason to avoid using Windows.
What they spend is their own choice. They can buy a cheap camera, too; it can do the job. They choose the tools that they feel will enable them to produce their best work. What that tool costs is irrelevant if they can cover the costs with the work they do. Therefore you can't say "$6000 is too expensive." That's an opinion, not a fact.
I am tired of hearing this argument. I work in a very large organization which can easily afford to buy the more expensive of two products. However, I can't say that I've ever not had to justify purchasing the higher-cost product, and many times when I do, I am still forced to buy the lower-cost option.
 
Uh, because one machine exists and the other doesn't?

You mean because Apple didn't build the other one? Duh... that's what we're complaining about.

If you're talking about the existence of the CPUs in question... 🤣

AMD are producing a HEAP more Threadripper/EPYC/Ryzen processors than Intel is making 28 core workstation Xeons :D

You can bet that Apple didn't ship the new Pro for MONTHS after the announcement because Intel are simply unable to supply in volume. Their ability to supply these very high core count chips is extremely limited; they can't make them. Which is lucky, because nobody (outside of the fringe nut-job subset of the Apple crowd) even wants them, due to the performance per watt sucking so badly.

The 28-core desktop parts are as close to a "paper launch" as you can get, existing purely so Intel can claim they have a "competitive" part on the market, even if zero are actually sold.
 
I am still puzzled by the concerns about updates. While the way Microsoft has implemented them wouldn't be my first choice, I haven't had any issues with them.

I am really puzzled as to what the issue is with applying updates. It's not nearly as bad as people make it out to be. IMO it's certainly not a reason to avoid using Windows.

I expect that most people are bothered not by the installation of updates--which can be intrusive if you don't go to the effort of making them avoid certain time windows, etc.--but by what they break. There's a reason corporate IT organizations both limit what software can be installed and do rigorous testing of updates before releasing them: updates are notorious for breaking the functionality of software and/or third-party hardware.

Case in point: at my business one of my employees uses a Windows 10 machine to run a very specialized piece of CAD software; it's a specialized layer written to run on top of Rhino 3D. For several weeks she was having issues with rendering crashing. The software vendor wanted to blame the brand of graphics card in her PC, even though the makers of Rhino say it's fully supported. Why? Because it's not what they use on their development machines. Meanwhile, the problems all started shortly after a software update of their program... and a Windows update that also installed.

We never figured out exactly whose software update broke things, but I eventually solved the problem myself by going in and first installing a new version of the graphics card driver, and then tweaking a bunch of settings in the graphics card driver software. In the meantime she had lost countless hours of valuable work time due to crashes. More hours were lost with the software vendor pointing fingers. Because of my 20+ years in IT I was able to muddle through to a solution.

There's a big upside to the smaller closed ecosystem of MacOS--not only does almost every application I ever use just plain work--but updates virtually never break anything. I can't recall the last time an update broke any functionality whatsoever. Conflicting drivers/driver settings are also a non-issue. In the Windows ecosystem it's commonplace--it's the tradeoff for having a huge choice in hardware and software. To some people it's worth it, to others not.
 
I expect that most people are bothered not by the installation of updates--which can be intrusive if you don't go to the effort of making them avoid certain time windows, etc.--but by what they break. There's a reason corporate IT organizations both limit what software can be installed and do rigorous testing of updates before releasing them: updates are notorious for breaking the functionality of software and/or third-party hardware.

Case in point: at my business one of my employees uses a Windows 10 machine to run a very specialized piece of CAD software; it's a specialized layer written to run on top of Rhino 3D. For several weeks she was having issues with rendering crashing. The software vendor wanted to blame the brand of graphics card in her PC, even though the makers of Rhino say it's fully supported. Why? Because it's not what they use on their development machines. Meanwhile, the problems all started shortly after a software update of their program... and a Windows update that also installed.

We never figured out exactly whose software update broke things, but I eventually solved the problem myself by going in and first installing a new version of the graphics card driver, and then tweaking a bunch of settings in the graphics card driver software. In the meantime she had lost countless hours of valuable work time due to crashes. More hours were lost with the software vendor pointing fingers. Because of my 20+ years in IT I was able to muddle through to a solution.
If you never figured out what caused the issue, then why are you blaming Windows? Perhaps it was the update for the graphics card, especially since you solved it by installing a new graphics driver and tweaking a bunch of its settings. This sounds more like a graphics card driver issue than a Windows update issue.

There's a big upside to the smaller closed ecosystem of MacOS--not only does almost every application I ever use just plain work--but updates virtually never break anything. I can't recall the last time an update broke any functionality whatsoever. Conflicting drivers/driver settings are also a non-issue. In the Windows ecosystem it's commonplace--it's the tradeoff for having a huge choice in hardware and software. To some people it's worth it, to others not.
All updates have the potential to cause problems, even macOS updates. To pretend they don't is being disingenuous. The fact you haven't had any issues with macOS updates doesn't change that fact. By that metric Windows updates are trouble free because I haven't had any issues with them.
 
You mean because Apple didn't build the other one? Duh... that's what we're complaining about.

If you're talking about the existence of the CPUs in question... 🤣

AMD are producing a HEAP more Threadripper/EPYC/Ryzen processors than Intel is making 28 core workstation Xeons :D

You can bet that Apple didn't ship the new Pro for MONTHS after the announcement because Intel are simply unable to supply in volume. Their ability to supply these very high core count chips is extremely limited; they can't make them. Which is lucky, because nobody (outside of the fringe nut-job subset of the Apple crowd) even wants them, due to the performance per watt sucking so badly.

The 28-core desktop parts are as close to a "paper launch" as you can get, existing purely so Intel can claim they have a "competitive" part on the market, even if zero are actually sold.
I could have sworn that you didn't understand why Apple Fanboys wouldn't want a faster cheaper machine. I think they would love one but it doesn't exist. I certainly would.
 
I can take killer photos with an old camera from ten years ago and a vintage used lens I paid $50 for, but it doesn't mean I want to. Very few business decisions are made purely on a theoretical performance-per-dollar basis; it's a complex calculus of what works best for you.

If an alternative existed, some people would still complain about the price and more. Tell me, what is the price of a Ryzen or Threadripper system?
 
That's a blanket statement without much merit. A pro photographer running a successful business (not a hobbyist who calls themselves a pro, but someone who makes their *living* doing it) won't blink at $6000. They are buying lenses and cameras that cost thousands of dollars, and they don't use any of them more than the computer they also need to process and prepare all of their photos for distribution. It's just a cost of doing business, and if you know how to run a business, passing on that cost is minor, just like you pass on all of your costs. Spread across all of the jobs over the lifetime of the machine, the cost is pretty tiny.
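To put rough numbers on "spread across all of the jobs": the figures below are purely illustrative, not anyone's actual rates, but they show why the machine cost disappears into the noise for a working pro.

```python
# Back-of-the-envelope amortization with made-up numbers.
machine_cost = 12_000        # hypothetical Mac Pro + display build
lifetime_years = 4
jobs_per_year = 60           # e.g. weddings / commercial shoots
avg_job_revenue = 2_500

cost_per_job = machine_cost / (lifetime_years * jobs_per_year)
print(f"Cost per job: ${cost_per_job:,.0f}")                          # ~$50
print(f"Share of job revenue: {cost_per_job / avg_job_revenue:.1%}")  # ~2%
```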

I mean, sure, I think there's some kernel of truth to this claim. But if Canon puts out a 24-70 f2.8 for $2500 and Nikon puts out an optically inferior 24-70 f2.8 for $6000, Nikon shooters have a right to make some noise about it. For some people it may even be the straw that breaks the camel's back - they may switch to Canon over it, even though they'll miss X or Y feature and have to deal with migrating their whole kit.

Photographers of all stripes have to deal with realistic budgetary concerns too, and they’re all hoping to get the best value for their dollar same as anyone else.
 
If you never figured out what caused the issue, then why are you blaming Windows? Perhaps it was the update for the graphics card, especially since you solved it by installing a new graphics driver and tweaking a bunch of its settings. This sounds more like a graphics card driver issue than a Windows update issue.
I recently had a discussion with a developer for a mobile game company. The developer was complaining that their users blame the game for every stupid mistake, even ones the users make themselves, and it was blowing up their support costs. I pointed out that their game was, in fact, a buggy piece of crap, and they should look to that for why their support costs were blowing up. "But we're not responsible for everything that goes wrong!" he said. To which I replied, "When your software is responsible for so much of what goes wrong, when anything goes wrong, the user will justifiably think it's your game screwing up again. Most of the time, they're probably right."

Windows justifiably has a reputation for updates breaking things. It doesn't matter if they have recently gotten better about it; they earned that reputation fairly, and only an extended period of things going near perfectly will make a dent in it. I suspect some vendors love to dump support on Windows by saying, "It's Windows' fault," and closing the ticket. They got your money when you bought the thing, you're not paying them any more to fix it, and Windows' reputation shields theirs. Whereas Apple knows that you'll be dragging that MacBook into the Genius Bar long after the warranty is gone, and they can't play the circular finger-pointing game if it's all their kit.
 
I mean, sure, I think there's some kernel of truth to this claim. But if Canon puts out a 24-70 f2.8 for $2500 and Nikon puts out an optically inferior 24-70 f2.8 for $6000, Nikon shooters have a right to make some noise about it. For some people it may even be the straw that breaks the camel's back - they may switch to Canon over it, even though they'll miss X or Y feature and have to deal with migrating their whole kit.

Photographers of all stripes have to deal with realistic budgetary concerns too, and they’re all hoping to get the best value for their dollar same as anyone else.

This kind of thing happens *all* the time in the camera world. Not necessarily on price... but that comes into play as well. Most recently, witness the technology shift away from classic SLRs to mirrorless cameras, for example. The reason so few people just switch systems is exactly the same as with computers: the decision on what tools you use for a job is based on much, much more than the cost of the tools in isolation.
 
The mass-migration to mirrorless is actually the reason I made that analogy - I'm seeing a ton of people making the painful choice to drop their EF/F mount gear for a Sony mirrorless system.
 
Geekbench does not represent real-world performance. Do you really expect an iMac with an i9-9900K to perform like a PC with an i9-9900K and a bigger cooler? Wow... You'd better check the size of the iMac's cooler.

As I'm sure you're aware, attempting to disparage evidence put forward, while failing to offer any countervailing evidence, is not an approach taken seriously in the real world.

Perhaps you have actual countervailing evidence?
 
The decision on what tools you use for a job is based on much, much more than the cost of the tools in isolation.

True, but if we're honest, the biggest one is often simply that humans fear/dislike change.

I mean we have end users at work who don't want to use Chrome instead of IE, purely because it is different. Even though we rolled out Chrome because IE is deprecated.
 
As I'm sure you're aware, attempting to disparage evidence put forward, while failing to offer any countervailing evidence, is not an approach taken seriously in the real world.

Perhaps you have actual countervailing evidence?

And there is evidence, though it's something most people can't obtain unless they have built PCs before. It is a simple fact that a bigger cooler is better, because a small cooler cannot exhaust hot air efficiently. This is how the 2013 Mac Pro failed.

Also, the 2019 iMac's i9-9900K is undervolted in order to work within the iMac's cooler.

If you have computer knowledge, it wouldn't be a problem.
 
That's a blanket statement without much merit. A pro photographer running a successful business (not a hobbyist who calls themselves a pro, but someone who makes their *living* doing it) won't blink at $6000. They are buying lenses and cameras that cost thousands of dollars, and they don't use any of them more than the computer they also need to process and prepare all of their photos for distribution. It's just a cost of doing business, and if you know how to run a business, passing on that cost is minor, just like you pass on all of your costs. Spread across all of the jobs over the lifetime of the machine, the cost is pretty tiny.

I earn my living from photography and videography 100%, and I keep running into photographers who claim my Fuji MF kit and my Canon R lenses are overkill for wedding photography and that one can get away with cheaper lenses and better sharpness from Sony. Yes, Sony is sharper and cutting edge, but for ME my Sony A7R3 is the last camera I want to use. Just as photography is not about sharpness but about character, colors, and ultimately the moment captured, computing cannot be measured in pure performance. What price/performance means to anyone cannot be computed in metrics. For me, during peak workloads my time is money: I can shoot more if I spend less time editing. I used an 18-core Windows system throughout 2018 for editing. The color management issues and UI scaling issues simply drove me nuts when editing large projects, and Premiere Pro was a disaster for us working in 4K. So we moved to Final Cut.

Those claiming and comparing the Mac Pro to AMD are not full-time photographers and do not realize that the Thunderbolt protocol is simply unstable with AMD. I have personally checked all the options.

You are absolutely right. Compared to my $30K investment in bodies and lenses, even though the Mac Pro is a stretch for us, my business and workflow mean more than trying out and experimenting with new stuff.
 
If an alternative existed, some people would still complain about the price and more. Tell me, what is the price of a Ryzen or Threadripper system?

VelocityMicro.com

ProMagix HD80

For $6,600 you are getting 24 Threadripper cores (at 3.8GHz), along with 256GB of RAM, a 5700 XT, and a 1TB NVMe SSD.

This is why people are complaining. It isn't the cost of the 7,1 - it is how little you get for your money. At the end of the day, you are paying about $4,500 for the Apple Tax.
 
I'm unsure if either of those two helped or not.

The first link appears to find fault with the default application Windows uses to open photos (i.e. the "Photos" app). The solution appeared to be to use a different application which honors color calibration (apparently the Photos app does not).

The second appears to be a problem with HDR, and it didn't look like the actual cause of the problem was identified. That Windows is the problem was speculation on the poster's part, with nothing other than "it works fine with my Xbox" leading him to conclude it was a Windows issue. This is not to say Windows isn't the problem, but it was more confusing than helpful.

I appreciate the links though.

It's not just the default Photos application. Even Windows Explorer is not color managed, which is where the issue stems from. Also, the apps mentioned in the linked post and a few other forums are garbage and should only be used as a last resort.

Here is another thread. It seems you need to use an older viewer, hacked to make it work. Why is this even an issue in 2019?


What is beautiful about the Mac is that the entire OS is color managed. I simply don't need to care about color management anymore. I calibrated my displays and have not looked back since. Trust me, my photos look 99% the same when delivered to clients, and I am very happy about it.

HDR is indeed an issue in Windows. My teammate who works and edits on Windows hasn't found a solution with either Blackmagic or Adobe; they both claim it's due to poor HDR management in Windows. They are willing to sell us expensive hardware and much more expensive monitors when this could simply be achieved with a Windows fix. He forwarded me the link and I pasted it here. I don't work in HDR, so frankly I'm not concerned.
 
I am willing to stipulate that the price delta ssgbryan notes above is a reasonable example to cite.
That said, the more cogent facet of this debate for many users in photo/video/imaging is the productivity equation. Let's start with storage, including backups (often both RAW and RGB versions are archived), where TB3 is the best combination of cost/reliability/throughput I've used that is truly plug and play. The fact that clients/vendors can mount via the same physical port using USB-C 3.x (or just the older 3.0 via a crossover cable) - all without really needing to know what they're doing - is huge. Sure, some of them may end up reading the drives at USB 2.0 speeds without realizing it, but at least they can access the files.
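To quantify the "USB 2.0 speeds without realizing it" point with illustrative numbers (assumed throughputs, not measurements), the per-shoot difference is dramatic:

```python
# Rough copy-time math for handing off one job's files; throughput figures are assumptions.
shoot_size_gb = 256                     # RAW + RGB versions of a single job
rates_mb_s = {"USB 2.0": 40, "TB3 / external NVMe": 2000}

for label, rate in rates_mb_s.items():
    minutes = shoot_size_gb * 1024 / rate / 60
    print(f"{label}: about {minutes:.0f} minutes to copy {shoot_size_gb} GB")
# USB 2.0: ~109 minutes; TB3: ~2 minutes.
```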
Why the big tangent on TB3? It's just one example of how the oft derided "walled garden" has an upside as well.
If you have a machine room (sound-insulated closet, etc.), then loud fans are not an issue. For my world, having a quiet workstation is a big plus. Is it worth $4,500? Perhaps not, but specialty cabinets capable of handling the needed TDP at under 70 dB are not cheap.
Sure, working near set off a cart is a specific use case - but over the years I've gotten more and more annoyed by the audio emanations of my workstations just when editing/grading/versioning/etc. What's the price tag on peace of mind?
Bottom line - if Apple can offer me a complete solution, including a calibrated 10 bit 6K monitor with legit HDR nits and deep blacks - that has a nice long lifespan with judicious upgrades - it's a solid investment. The real issue is how strong Apple comes with coding resources, Metal support incentives, etc to foster a dynamic ecosystem for creatives on OSX.
If they don't, it will be hard to ROI what shapes up to be a $20K capital outlay. (12c/96/1TB, XDR, 3 years of AppleCare, NVMe RAID, etc).
 
If you have a machine room (sound-insulated closet, etc.), then loud fans are not an issue. For my world, having a quiet workstation is a big plus. Is it worth $4,500? Perhaps not, but specialty cabinets capable of handling the needed TDP at under 70 dB are not cheap.
Sure, working near set off a cart is a specific use case - but over the years I've gotten more and more annoyed by the audio emanations of my workstations just when editing/grading/versioning/etc. What's the price tag on peace of mind?

I can't speak to how loud workstation Quadro cards are, but CPU coolers in AMD-land are extremely quiet nowadays. Even at full blast, a be quiet! Dark Rock Pro will only be ~22 dB.
 
I am willing to stipulate that the price delta ssgbryan notes above is a reasonable example to cite.
That said, the more cogent facet of this debate for many users in photo/video/imaging is the productivity equation. Let's start with storage, including backups (often both RAW and RGB versions are archived), where TB3 is the best combination of cost/reliability/throughput I've used that is truly plug and play. The fact that clients/vendors can mount via the same physical port using USB-C 3.x (or just the older 3.0 via a crossover cable) - all without really needing to know what they're doing - is huge. Sure, some of them may end up reading the drives at USB 2.0 speeds without realizing it, but at least they can access the files.
Why the big tangent on TB3? It's just one example of how the oft derided "walled garden" has an upside as well.
If you have a machine room (sound-insulated closet, etc.), then loud fans are not an issue. For my world, having a quiet workstation is a big plus. Is it worth $4,500? Perhaps not, but specialty cabinets capable of handling the needed TDP at under 70 dB are not cheap.
Sure, working near set off a cart is a specific use case - but over the years I've gotten more and more annoyed by the audio emanations of my workstations just when editing/grading/versioning/etc. What's the price tag on peace of mind?
Bottom line - if Apple can offer me a complete solution, including a calibrated 10 bit 6K monitor with legit HDR nits and deep blacks - that has a nice long lifespan with judicious upgrades - it's a solid investment. The real issue is how strong Apple comes with coding resources, Metal support incentives, etc to foster a dynamic ecosystem for creatives on OSX.
If they don't, it will be hard to ROI what shapes up to be a $20K capital outlay. (12c/96/1TB, XDR, 3 years of AppleCare, NVMe RAID, etc).

The XDR is not a calibrated monitor, at least not to an accurate professional standard. There will hopefully be a third-party LUT box released that allows 3D LUT calibration.
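For readers unfamiliar with what a LUT box does: 3D LUT calibration pushes every RGB triple through a measured lookup cube so the panel's response matches a target. A toy sketch of applying such a cube with NumPy, using nearest-neighbor lookup for brevity (real calibration hardware interpolates, typically tetrahedrally, and the identity cube here is a stand-in for measured data):

```python
# Toy 3D LUT application in NumPy. Nearest-neighbor lookup only; real LUT
# boxes use trilinear/tetrahedral interpolation. The identity cube stands in
# for a measured calibration.
import numpy as np

N = 17                                         # common LUT grid size
grid = np.linspace(0.0, 1.0, N)
# lut[r, g, b] -> corrected (R, G, B); identity mapping for this sketch.
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

def apply_lut(pixels, lut):
    """pixels: float array (..., 3) in [0, 1]; returns LUT-corrected pixels."""
    n = lut.shape[0]
    idx = np.clip(np.rint(pixels * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

frame = np.random.rand(4, 4, 3)                # stand-in for a video frame
corrected = apply_lut(frame, lut)              # identity LUT: values snap to the grid
```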
 