The reason the GT120 is a good "boot screen card" is that it's slot-powered. Therefore, both mini 6-pin connectors can be used to drive the really powerful card.

If you use a K5000 to replace it, the second card's power supply may be affected. (Of course, there are lots of workarounds, e.g. an extra PSU, SATA power adapters, or the Pixlas mod.)

Also, the K5000 has insanely bad performance for the price.
 
Sorry, no. The engineering shop I work in uses SolidWorks on custom Windows workstations. However, a customer came in a few months back with SolidWorks running via Boot Camp on a MacBook Pro.

As for Mac-compatible engineering apps, I know Autodesk Fusion 360 is available for Mac, as well as other Autodesk software. I know of some others but for the life of me can't remember the names.
There's AutoCAD and Alias from Autodesk.

Rhino is on Mac (which is what I personally use).
(Grasshopper for Rhino is also on Mac now, though still in beta; it's still very usable in its current state.)

A few others which may be used in more specialized engineering fields would be Vectorworks, Modo, Cinema 4D, Maya, MoI3D, Blender, SketchUp, and OpenSCAD.

Apple designers/prototypers use Alias and Rhino. That said, I imagine they also have Boot Camped machines running NX, Catia, and the like, as well as CAM software (which is sorely lacking on macOS).

---
That aside, the GPUs in the upper-end Mac builds are good to great for all of this software (like the Radeon Pro 560 in the MacBook Pro or the 580 in the iMac). It's highly unlikely you're going to get better CAD performance using more 'pro' GPUs than these.
(Unless we're talking something like GPGPU-based rendering or simulation, in which case more VRAM generally means larger scenes can be computed at a faster pace.)
3D modelling, on the other hand, would definitely need ECC memory.
How so?

I don't know; I don't think I've ever experienced a bit flip when modeling (though I very well could have). Thing is, with modeling at least, you see the results; if there's an error, you'll very likely notice something went weird.

Further, the more adept designers/engineers/modelers triple-check their work.
 
I know this is a long shot, but does anyone use a FirePro despite the lack of official drivers? I was wondering if they can be used as accelerator cards in FCPX when paired with a standard GPU, e.g. a GTX-series card or something similar.
 
Hi guys, I am using a Quadro 4000 (cMP 12-core 3.0) in Blender and it's very slow in the viewport in textured mode :(. My GTX 670 in an old Hackintosh was way faster. Also, I never got 10-bit mode working, even with special software and the right cables. I gave up.
 
Hi guys, I am using a Quadro 4000 (nMP 12-core 3.0) in Blender and it's very slow in the viewport in textured mode :(. My GTX 670 in an old Hackintosh was way faster. Also, I never got 10-bit mode working, even with special software and the right cables. I gave up.

You state nMP? So I take it you are running it via eGPU? I wonder if that is affecting your performance.
 
Of course, my mistake (it's a cMP, not an nMP).

Ah OK, looking at the specs of the Quadro 4000 vs. the GTX 670, yeah, I wouldn't compare them. The GTX 670 is a newer-generation card than the Quadro 4000. The GTX 670 being a Kepler card, it would be more comparable to a Quadro K4000. I don't know if you would see any improvement, though, as I haven't tried them to compare.
 
Yeah, to confirm what has been said: you don't need ECC for photography, and a consumer-level GPU has more than enough power. Even an Intel iGPU can handle Photoshop and Capture One.
 

I just want to emphasise that this thread isn't about whether workstation cards are needed or not. This thread is a place to talk about what people are doing with the cards.

Plus, on the Nvidia side, if you want 10-bit colour your choices are made for you.
For anyone wondering, here is a great piece of info regarding how Adobe Creative Cloud utilises Nvidia GPUs:

http://images.nvidia.com/content/qu...obePremierPro-SolutionOverview-US-Fnl-WEB.pdf
 
I've been in photography for four decades, so bear with my ignorance ;)

Adobe has its own Mercury Engine, which is more efficient than OpenGL. It supports a few OpenCL features. It isn't accelerated by CUDA.

I also confirm what others have said about 10-bit output. Barely anyone ever needs it; perhaps only half a percent of global photography has required it, because most imagery (almost everything) doesn't have a colour palette anywhere close to what 10-bit can provide.
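
(For context on the numbers behind that: 8-bit gives 2^8 = 256 levels per channel while 10-bit gives 2^10 = 1024, i.e. roughly 16.7 million vs. 1.07 billion representable RGB colours. A quick way to check the arithmetic:)

```python
# Per-channel levels and total RGB colours for 8-bit vs. 10-bit (quick arithmetic check).
for bits in (8, 10):
    levels = 2 ** bits      # 256 for 8-bit, 1024 for 10-bit
    total = levels ** 3     # three channels: R, G, B
    print(f"{bits}-bit: {levels} levels/channel, {total:,} colours")
# 8-bit:  256 levels/channel, 16,777,216 colours
# 10-bit: 1024 levels/channel, 1,073,741,824 colours
```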

We have to bear in mind that most publishing has been done on Macs, and only a handful of AMD GPUs have supported 10-bit output, starting with El Capitan.

The advent of HDR video content will speed adoption.

Here's what the platforms provide:

- Nvidia's settings offer 8- and 10-bit output on Windows, even on consumer cards three years old.

- Nvidia doesn't offer 10-bit output on the Mac.

- A small selection of AMD cards offer 10-bit output on the Mac. Most recent cards support it.

- AMD's 10-bit output option doesn't show in Windows, so I presume it switches automatically.
 
See, this is what I was hoping for: guys like yourself who have been there, done that, and have a lot more experience than me, so that I can learn and discuss. Thank you for joining in. I should be the one saying excuse my ignorance.

It is just very hard to get information.
 
Thanks for joining in. Yeah, ECC on the GPU is more of a nice-to-have feature, but it's not the end of the world if it's not there.


I would disagree. I have a 5770 and I'm a photographer. I have a 27" ACD and an older 23" ACD that I use for music/email, and if I have the 23" plugged in, Lightroom is soooo slow. It's pretty slow normally, as it's written so poorly, but it's noticeably faster with one display given the card's piddly amount of RAM.

It doesn't seem to matter what machine you use with Lightroom. I also have a high-end i7 Windows machine with a 1070 and it still runs slowly, but a card that has 2 GB or more of RAM runs more smoothly if you have higher-resolution monitors like 2K or 4K; 1080p panels are much easier to drive.

My hesitation about getting a newer card is that none seem to work well out of the box. The Nvidia cards seem to have really poor performance in creative apps because of poor OpenCL/GL performance, so the lower-end ATI cards outperform them in those apps. The ATI cards don't all have native support either, so it's editing kext files etc.; they still aren't natively supported. I use the machine for a living, and not having to mess with it is the reason I use a Mac.

Unless you use older cards of both variants, performance gains are minimal in actual usage.

All the options seem to give limited performance gains relative to their actual power in a 5,1, and the drivers seem pretty poor. Hopefully with High Sierra the 400- and 500-series cards from ATI will work natively; then I'll buy one. Hopefully this mining craze will cool off too, meaning prices get back to normal.
 
It's funny you mention Lightroom speed, as Adobe have just these last few days admitted that Lightroom is painfully slow.
 
See, this is what I was hoping for: guys like yourself who have been there, done that, and have a lot more experience than me, so that I can learn and discuss. Thank you for joining in. I should be the one saying excuse my ignorance.

It is just very hard to get information.

Thanks. It's best to go on specialist post-production forums with some hardcore geeks. This forum is a generalised one, and unfortunately many Mac users haven't been able to experience what hardware and software are truly capable of, because Apple is always late at supporting industry standards.

Monitor-wise, yes, go for an Eizo or NEC with at least 96% Adobe RGB coverage.

The cheaper 10-bit panels are for 10-bit consumption, which is different from 10-bit or wide-gamut production. DCI-P3 panels aren't suitable for the best prints either.

The more expensive monitors let you proof for print, digital, etc. You can't soft-proof properly for print on a cheaper high-contrast panel even if it says 'wide gamut'.
 
Hi guys, I am using a Quadro 4000 (cMP 12-core 3.0) in Blender and it's very slow in the viewport in textured mode :(. My GTX 670 in an old Hackintosh was way faster. Also, I never got 10-bit mode working, even with special software and the right cables. I gave up.

Old post, and not sure if you will read it, but it may help someone. I've been talking to Nvidia regarding the stock Mac Edition Quadros: the Quadro 4000 for Mac was not a 10-bit card; however, the K5000 for Mac does support 10-bit.
 
FWIW, there are plenty of people who have 10-bit color working on macOS on NVIDIA hardware. For example:

http://www.insanelymac.com/forum/to...-macos-high-sierra-update-11012017/?p=2526682

(screenshot attached)
 
OK, here is where it gets tricky. 10-bit on a GeForce is DirectX only. Fine for your monitor, basic tasks, and gaming.

Photoshop does not read 10-bit over DirectX; it reads it via OpenGL, which GeForce cards do not do. Only Quadro and AMD pro cards do.

That is after months of back and forth between Adobe and Nvidia regarding this very subject.

If you don't use Photoshop or other software using OpenGL as the API, then none of this matters.
 

Obviously there is no DirectX in macOS. And our GeForce cards do support 10-bit colour.

Anyway, my Photoshop on macOS detected that I am using a GeForce card and let me enable 10-bit colour.
(screenshot attached)

TBH, I am very new to this area and don't even know how to test whether my 10-bit setup is actually working or not. Is there any simple test I can do to confirm your info is right?
 

I have the 30-bit monitor choice selected on a stock GT120, haha.

The best way to tell is with a gradient: a black-to-white gradient with the various shades of grey in between. If you get banding, it's 8-bit. If it's a smooth transition, it's 10-bit.
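
If you want a test image for that, here's a minimal sketch (Python, assuming NumPy and Pillow are installed; the output filename is arbitrary) that writes a 16-bit greyscale ramp you can open in Photoshop with the 30-bit display option enabled:

```python
# Minimal sketch: write a 16-bit greyscale ramp for checking banding.
# Assumes NumPy and Pillow are installed; output filename is arbitrary.
import numpy as np
from PIL import Image

width, height = 4096, 512
ramp = np.linspace(0, 65535, width).astype(np.uint16)  # black -> white across the full 16-bit range
img = np.tile(ramp, (height, 1))                        # repeat the ramp vertically
Image.fromarray(img, mode="I;16").save("gradient_16bit.tif")
```

On an 8-bit path each grey step spans many pixels, so faint vertical bands may be visible; on a working 10-bit path the ramp should look noticeably smoother.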

This can be a very confusing topic when it comes to 8-bit and 10-bit. There is also the whole issue of true 10-bit versus 8-bit plus dithering to produce a false 10-bit.

It is super confusing.
If you want to get even more confused, here is some interesting reading that goes into this topic. Caveat: it is about Windows, but it does explain a bit.

https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/12
I am going to be doing more research on this, as I am finding colour management to be a really fascinating topic. I am fully planning on getting an Eizo ColorEdge screen sometime in the new year, and have just bought a Quadro P4000 to test this stuff out. I will do a full report sometime next year after I can fully do the research. My current screen is only 8-bit, so I can't really give a proper opinion yet.
macOS does support 10-bit and has drivers for it, but that is with the AMD cards; I am not knowledgeable enough to comment on that regarding their use in a cMP.
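
(One quick sanity check worth mentioning, as a sketch rather than something from this thread: on recent macOS versions, system_profiler's display report usually includes a framebuffer or pixel depth line, which reads as 30-bit ARGB2101010 on a working 10-bit path. Something like this should surface it:)

```python
# Sketch: ask macOS for display info and pull out the framebuffer/pixel depth lines.
# Uses the built-in system_profiler tool; no third-party modules required.
import subprocess

out = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    # Field name varies across macOS versions, so match both spellings.
    if "Framebuffer Depth" in line or "Pixel Depth" in line:
        print(line.strip())  # e.g. "Framebuffer Depth: 30-Bit Colour (ARGB2101010)"
```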

I just find this a fascinating topic. I would like to eventually test this out with both GeForce and Quadro cards to get a real answer as to how it works in the Mac workspace.
 