
h9826790

macrumors P6
Apr 3, 2014
16,656
8,587
Hong Kong
Does the Nvidia GPU use Metal in macOS? I thought that FCPX was optimised for AMD in general.

Yes it does, but it still doesn't perform well in FCPX rendering.

[Attached screenshots: 1080 Ti Metal benchmark results]
 
  • Like
Reactions: itdk92

itdk92

macrumors 6502a
Nov 14, 2016
504
180
Copenhagen, Denmark
I can attest to this. FCPX 10.4 and latest High Sierra beta with a MSI RX 580 8GB "GamingX" yields a 19-20s BruceX. 10.13.x is a must.
itdk, I also noticed that you upgraded from the 6-core to the 8-core. Are you seeing faster export times?

I love to change my main Mac all the time to test stuff out. That’s one of the advantages of having Macs and all kinds of hardware at hand :)

We are now on a 12-core.

If you ask me, on the 5,1, the 12-core is a must.
 
  • Like
Reactions: devon807

bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
If you ask me, on the 5,1, the 12-core is a must.

For video, completely agree. I'm down to a 6-core 5,1 at the moment. Not sure I can work like this for another few weeks. Most of the normal stuff is fine, except rendering. Keeping an eye on a couple of 12-core trays and hoping prices budge a little.
 
  • Like
Reactions: devon807

devon807

macrumors 6502
Dec 31, 2014
372
95
Virginia
I love to change my main Mac all the time to test stuff out. That’s one of the advantages of having Macs and all kinds of hardware at hand :)

We are now on a 12-core.

If you ask me, on the 5,1, the 12-core is a must.
Currently looking into the 12-core upgrade. I was saving for a big GPU upgrade, but I will stick with my Nvidia/AMD setup for now :) I wish GPU prices didn't suck. :(
 

orph

macrumors 68000
Dec 12, 2005
1,884
393
UK
You put two in and they're in?
It's not SLI, but I assume some apps will be able to use both cards?
>.< It's always kind of been the dream to have some OpenCL and CUDA at the same time, if the apps you use are clever enough to see both cards.

I've never seen much info on asymmetric cross-brand GPU use, and it won't work in Windows ;)
 

devon807

macrumors 6502
Dec 31, 2014
372
95
Virginia
How does it work? How do you switch cards?
It does not, yet. :) I have some things in the works for when macOS 10.13.4 comes out; right now it's just R&D.

You put two in and they're in?
It's not SLI, but I assume some apps will be able to use both cards?
>.< It's always kind of been the dream to have some OpenCL and CUDA at the same time, if the apps you use are clever enough to see both cards.

I've never seen much info on asymmetric cross-brand GPU use, and it won't work in Windows ;)
Most of it is just preliminary data gathering. Nothing worth actually doing this for yet. AME is the only app that uses both (via Metal GPU acceleration).
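For anyone following the dual-card question: any Metal-aware app can at least see every installed GPU, whether or not it actually splits work between them. A minimal Swift sketch (macOS-only API; nothing here is specific to AME or FCPX):

```swift
import Metal

// List every Metal-capable GPU macOS exposes. An app that supports
// multi-GPU work can pick any (or several) of these devices.
let devices = MTLCopyAllDevices()
for device in devices {
    print("GPU: \(device.name)")
    print("  low power: \(device.isLowPower)")    // integrated vs. discrete
    print("  headless:  \(device.isHeadless)")    // no display attached
    print("  removable: \(device.isRemovable)")   // eGPU (macOS 10.13+)
}
```

Whether OpenCL and CUDA kernels can run side by side on mixed cards is a separate question; this only shows what Metal itself can see.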
 
  • Like
Reactions: orph

bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
AME is the only app that uses both (via Metal GPU acceleration).

Curious - are you actually getting acceleration benefits with AME CC 2018? Have an open ticket with Adobe and NVIDIA about this. CUDA is enabled, selectable, and fully utilized in all Adobe applications (that support it), except AME.

AME offers options/choices of CUDA, OpenCL, Metal and Software Only for GPU acceleration. Regardless of what setting I choose, I am getting NO BENEFITS with any of the GPU acceleration options over software only. All times are identical.

Have done lengthy tests to document the issue with many different codecs for Adobe & NVIDIA. GPU is a GTX 1080 FE. Identical issues with GTX 680 Official Mac version.

NVIDIA's development team says AME is not written to properly support CUDA or any GPU acceleration. Adobe's answer is vague; they think it ends up being accelerated only through Dynamically Linked timelines...
 

devon807

macrumors 6502
Dec 31, 2014
372
95
Virginia
Curious - are you actually getting acceleration benefits with AME CC 2018? Have an open ticket with Adobe and NVIDIA about this. CUDA is enabled, selectable, and fully utilized in all Adobe applications (that support it), except AME.

AME offers options/choices of CUDA, OpenCL, Metal and Software Only for GPU acceleration. Regardless of what setting I choose, I am getting NO BENEFITS with any of the GPU acceleration options over software only. All times are identical.

Have done lengthy tests to document the issue with many different codecs for Adobe & NVIDIA. GPU is a GTX 1080 FE. Identical issues with GTX 680 Official Mac version.

NVIDIA's development team says AME is not written to properly support CUDA or any GPU acceleration. Adobe's answer is vague; they think it ends up being accelerated only through Dynamically Linked timelines...
Odd. It may be because I do a lot of downsampling from 4k ProRes to 1080p via H.264. With this preset, I get 100% of the GPU usage.
 

bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
Odd. It may be because I do a lot of downsampling from 4k ProRes to 1080p via H.264. With this preset, I get 100% of the GPU usage.

Thanks for the info. Will test some 4K and 2K downsamples when I get a chance.
 

orph

macrumors 68000
Dec 12, 2005
1,884
393
UK
@bsbeamer If AME is like most video encoders, it will only make very light use of the GPU unless something is being done to the video that can be accelerated better on the GPU than on the CPU.

So, as @devon807 mentions, he is moving between resolutions, and the resizing of the video will be accelerated by the GPU.

From what I know (and I'm not up on the subject), the higher the resolution, the more benefit you will see, but unless you're doing high-end workflows even a low-end card (like a GTX 660 or the ATI equivalent) will work about as well as a Titan (unless you're VRAM limited).

Also, lots of things can change, so let's look at video codecs (;) the joy never stops). Different video codecs are better supported by GPU acceleration (and CPU scaling), i.e. some codecs will use the GPU more than others (or scale better across more CPU cores).

It's a large and boring subject :p the last time I dived into it was a while ago.
My tip: DNxHR is worth a look as an editing codec.

HandBrake had a big article somewhere explaining how little a GPU helps with H.264 encodes; unless you're doing size changes it doesn't help in a huge way.

Also, Mac Pros don't have the Quick Sync video encoder on the CPU.
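To make the resize point concrete, here is a minimal Swift sketch of that kind of job (4K source down to a 1080p H.264 file) using AVFoundation's built-in preset. The file paths are placeholders, and whether the scale/encode actually lands on the GPU (via VideoToolbox) or in software is decided by the system, not by this code:

```swift
import AVFoundation
import Foundation

// Downscale a 4K clip (e.g. ProRes) to 1080p H.264 with a stock preset.
let source = URL(fileURLWithPath: "/path/to/source_4k_prores.mov")   // placeholder
let output = URL(fileURLWithPath: "/path/to/output_1080p.mp4")       // placeholder

let asset = AVAsset(url: source)
guard let export = AVAssetExportSession(asset: asset,
                                        presetName: AVAssetExportPreset1920x1080) else {
    fatalError("1080p preset not available for this asset")
}
export.outputURL = output
export.outputFileType = .mp4

// Keep the command-line process alive until the export finishes.
let done = DispatchSemaphore(value: 0)
export.exportAsynchronously {
    if export.status == .completed {
        print("Export finished")
    } else {
        print("Export failed: \(String(describing: export.error))")
    }
    done.signal()
}
done.wait()
```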
 

bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
I (and NVIDIA engineers) suspect that AME is basically just like all other encoding software: it does not utilize the GPU for encodes. And that would make sense, technically speaking. But Adobe does not advertise and promote AME's capabilities that way. That's where the confusion lies.

It would make sense for resize to take advantage of the GPU, as most basic transform effects (size and position) are GPU accelerated in PPRO. Unlike the GPU-accelerated effects lists that are published for PPRO & AE (and filterable within the software), there is no published list of the functions, processes, or workflows that are GPU accelerated in AME. Apparently Adobe is working on that list for NVIDIA.
 
Last edited:
  • Like
Reactions: orph

orph

macrumors 68000
Dec 12, 2005
1,884
393
UK
From what I understand, Quick Sync was not used in CC apps until recently (in Windows). I think someone mentioned that the iMac Pro uses the Vega GPU for some extra encoding acceleration (only in FCPX?), but I'm not clear on that.

There are some dedicated CUDA-accelerated encoders, but from what I understand they give lower-quality output than a CPU (and are codec limited? I think GPUs now have hardware decoding & encoding of video, but only in a limited capacity with a limited list of codecs, though I may be wrong there).

Oh, this looks interesting; it contains a list of codecs supported by GPU encoding in some apps, across different generations of GPU:
https://developer.nvidia.com/ffmpeg

And this is an informative reply on Stack Exchange:
https://video.stackexchange.com/questions/14656/why-processor-is-better-for-encoding-than-gpu
And, as ever, if it works on Windows it may not work in macOS :(
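On the Mac side the hardware path that ffmpeg exposes is VideoToolbox rather than NVENC, so a rough way to run the software-vs-hardware comparison yourself is something like the sketch below. The ffmpeg path and the input file are assumptions (a Homebrew install and a local input.mov), not anything from this thread:

```swift
import Foundation

// Encode the same clip twice: once with the VideoToolbox hardware encoder,
// once with the libx264 software encoder, so output quality and time can
// be compared. Requires ffmpeg to be installed at the path below.
func encode(codec: String, output: String) throws {
    let task = Process()
    task.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg")   // assumed Homebrew path
    task.arguments = ["-y", "-i", "input.mov",
                      "-c:v", codec, "-b:v", "10M",
                      output]
    try task.run()
    task.waitUntilExit()
}

try encode(codec: "h264_videotoolbox", output: "hw.mp4")   // hardware encode
try encode(codec: "libx264", output: "sw.mp4")             // software encode
```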
 

bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
Sorenson Squeeze used to support GPU encoding; unsure if it still does. The quality was pretty horrible compared to non-GPU encodes.

There are dedicated H.264 machines that use GPU-style acceleration and do perform well, but their price tag really puts them in broadcast-facility territory. It would be hard to get any real benefit when outputting a timeline with those, unless connected via SDI and playing the timeline(s) into them in real time.
 
  • Like
Reactions: foliovision

PowerMike G5

macrumors 6502a
Oct 22, 2005
556
245
New York, NY
From my usage, AME utilizes the GPU only when it has to scale the footage (like 4K to 2K) or when encoding GPU-accelerated effects that haven't been rendered out yet. So a powerful GPU matters in some instances and not in others.
 

Raima

macrumors 6502
Jan 21, 2010
400
11
Got some BruceX test results.

Hackintosh, Intel i7-8700K
32 GB 3000 MHz DDR4
macOS 10.13.5
FCPX 10.4.2

Single RX 580 8 GB - 16 secs
Single Vega 64 8 GB - 14.5 secs
Dual RX 580 8 GB - 11.65 secs

Some observations: 10.13.2 performs faster than 10.13.3-10.13.5. The Vega 64 was getting 12.7 secs; it has now increased to 14.5 secs.

I also have problems with dual RX 580s though: the 4K monitor is not displaying in macOS, only the 1080p monitor. Anyone know a fix by any chance?
 

orph

macrumors 68000
Dec 12, 2005
1,884
393
UK
Have you tried different ports or cables for the 4K output? Also, does the 4K display show up in macOS in System Information?

Nice hack; was it easy to set up?
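If it helps with the troubleshooting: a quick way to check whether macOS is actually driving the 4K panel (and at what resolution) is to ask CoreGraphics for the active display list. A minimal Swift sketch; nothing in it is specific to the RX 580:

```swift
import CoreGraphics

// Print every display macOS is currently driving and its pixel size.
// If the 4K monitor never shows up here, the OS isn't bringing it up at all.
var count: UInt32 = 0
_ = CGGetActiveDisplayList(0, nil, &count)                 // first call: just get the count
var displays = [CGDirectDisplayID](repeating: 0, count: Int(count))
_ = CGGetActiveDisplayList(count, &displays, &count)       // second call: fill the list
for id in displays {
    print("Display \(id): \(CGDisplayPixelsWide(id)) x \(CGDisplayPixelsHigh(id)) px")
}
```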
 

Raima

macrumors 6502
Jan 21, 2010
400
11
Have you tried different ports or cables for the 4K output? Also, does the 4K display show up in macOS in System Information?

Nice hack; was it easy to set up?

Yeah, I tried different ports. I don't recommend using 4K on dual RX 580s. I was able to optimise single-card performance.

Looking to release a video soon.

It was initially hard to set up, but there are a lot of helpful people. Now it's pretty much rock solid and I'm pretty happy with it.
 
  • Like
Reactions: slamjack

slamjack

macrumors member
Jul 22, 2011
69
13
Moscow
Yeah, I tried different ports. I don't recommend using 4K on dual RX 580s. I was able to optimise single-card performance.

Looking to release a video soon.

It was initially hard to set up, but there are a lot of helpful people. Now it's pretty much rock solid and I'm pretty happy with it.
Could you please show the results in a screen capture? What is the performance like when working with a heavy timeline?
 

flallnatural

macrumors newbie
Nov 14, 2018
3
0
I have dual RX 580s; my BruceX test is 8.74 sec (background render off). It's very fast!
My system: 4.2 GHz Intel Core i7, 64 GB 3000 MHz DDR4 RAM, SSD, dual RX 580 8 GB, running High Sierra! (Hackintosh ;-)

Any chance you could share with me what you did to get this to work? I have gotten Mojave to work with a single RX 580 and a Vega 64. With my new setup, I'm looking to get dual RX 580s working. I've ordered 2 of the Sapphire Nitro RX 580 8GB and am hoping to get this working in Mojave.
 

h9826790

macrumors P6
Apr 3, 2014
16,656
8,587
Hong Kong
Any chance you could share with me what you did to get this to work? I have gotten Mojave to work with a single RX 580 and a Vega 64. With my new setup, I'm looking to get dual RX 580s working. I've ordered 2 of the Sapphire Nitro RX 580 8GB and am hoping to get this working in Mojave.

How did you install two Nitro+ RX 580s in the cMP?

Slots 1 + 3, and remove all the HDDs in bays 2, 3, and 4?
 

startergo

macrumors 603
Sep 20, 2018
5,022
2,283
Hey, how about two WX 7100s instead of two RX 580s? Same specs and a smaller footprint?
 

startergo

macrumors 603
Sep 20, 2018
5,022
2,283
Then why not 2x PULSE RX 580s with the WX 7100 clock speeds? That would be way cheaper, and with a better cooler.

https://forums.macrumors.com/threads/sapphire-pulse-rx580-8gb-vbios-study.2133607/

You prefer a single-slot card rather than better cooling?

I actually prefer both :) but there is no way to have both, so my second priority is space (we only have 3 PCIe slots to play with) unless there is a good PCIe expansion option.

I see you used the settings from the WX 7100, but have you actually tested that card? I am very curious.
 
Last edited: