If I'm not fast and change the window's prefs to a setting other than the image background, the system freezes again...

Seems similar to this case.

https://forums.macrumors.com/thread...re-acceleration.2180095/page-12#post-27491463

TBH, I have absolutely no idea why this happens. I can't reproduce the issue on my setup. But so far, there have been absolutely zero freeze reports from Vega users.

Also, this mod actually makes AppleGVA believe our Mac is an iMac Pro (with Vega). So having a Vega onboard should be "more compatible" (theoretically).
 
Has anyone tested this with Blackmagic Design's DaVinci Resolve?

What kind of test are you looking for?

I am at home now and can do some tests. If you can tell me the exact procedure to perform the test, I should be able to show you the result.

Anyway, for your info, DV's timeline can utilise UVD (hardware decoding). It can also use VCE (hardware encoding) to export H264 video (both single-pass and multi-pass).
 
In theory, Davinci Resolve will decode and encode H264 already by default on the GPU, so it might not be a good software to test :(

No it won't. Without hwaccel, DV can NOT utilise the GPU video engine for H264 on cMP.

I just made a quick test by importing the Sony 4K HEVC 10-bit HDR video (link in post #1) and exporting to H264. As I said in the post #1 Q&A, "the GPU is working" is not the same as "the video engine is working". By default, DV can only use the GPU for compute/rendering (drawing each frame); it cannot actually use the GPU to decode H264 video, or to encode each frame into H264. (Of course, I am only talking about macOS, and on the cMP.)
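One quick way to see whether the OS exposes a video-engine path at all (as opposed to just general GPU compute) is to look at which hardware-acceleration methods a decoder such as ffmpeg reports. A minimal sketch, assuming ffmpeg is installed; the sample output below is illustrative of what `ffmpeg -hwaccels` typically prints on macOS:

```python
# Sketch: distinguish "GPU present" from "video engine usable".
# Assumes ffmpeg is installed; the sample string is illustrative
# output of `ffmpeg -hwaccels` on macOS, not a live capture.

def parse_hwaccels(output: str) -> list[str]:
    """Parse `ffmpeg -hwaccels` output into a list of method names."""
    lines = output.strip().splitlines()
    # The first line is the header "Hardware acceleration methods:"
    return [line.strip() for line in lines[1:] if line.strip()]

sample = """Hardware acceleration methods:
videotoolbox
"""

methods = parse_hwaccels(sample)
print("videotoolbox" in methods)  # True means the VideoToolbox path is advertised
```

Note that even when `videotoolbox` is listed, whether a given codec/card combination actually hits the video engine is a separate question — which is exactly the distinction made above.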

This is WITHOUT the hwaccel mod. (I didn't realise I hadn't captured the export settings properly, but I didn't change anything; the settings are identical to the second capture below.)
hwaccel OFF.png


This is with hwaccel mod
Screenshot 2019-07-04 at 4.16.19 PM copy.png
 
Are you running the latest Davinci Resolve Studio 16 beta?
 

No, just DV 15

But I very much doubt that DV 16 can utilise the video engine if this function is disabled at the OS level.

There is no way an application can use a function that is not supported by the OS.

It's just like telling me that DV 16 can use CUDA in Mojave: simply impossible until a web driver is available, because the OS has no support for it by default.
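The OS-level gating argument above boils down to a capability check: an application can only take the hardware path when the platform advertises it, and otherwise must fall back to software. A tiny sketch of that pattern — the names here are hypothetical, not real macOS/VideoToolbox APIs:

```python
# Illustrative sketch of OS-gated feature use. The capability names
# are hypothetical, not real macOS APIs. The point: the app falls
# back to a software path whenever the OS does not advertise the
# capability, no matter what the app itself supports.

def os_supports(capability: str, advertised: set[str]) -> bool:
    """Check whether the OS advertises a given capability."""
    return capability in advertised

def choose_decoder(advertised: set[str]) -> str:
    if os_supports("h264_hw_decode", advertised):
        return "hardware"   # video engine path (needs the hwaccel mod on cMP)
    return "software"       # CPU-only fallback

print(choose_decoder({"h264_hw_decode"}))  # hardware
print(choose_decoder(set()))               # software
```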
 

Makes sense. It could be worth trying the DV 16 Studio beta anyway, though. I just heard it should be supported, but of course maybe they were referring to newer Macs.
 
Cool, looks like Resolve will get even faster. Going from 62W of GPU power to 144W is a massive jump in utilisation, and while it's a tad hard to tell, it looks like more VRAM is used too.
 

That iStat wattage reading is erroneous. However, the percentage increase should be roughly correct.

That 62W may represent about 40W actual.

And 144W may be about 100W.

But that "more than 100% increase" should be accurate. In fact, you can see it in the Activity Monitor GPU usage history graph.

I am not sure what's limiting the GPU usage (as a compute device) when hwaccel is OFF, but my feeling is that the CPU is not fast enough to decode the demanding source HEVC video in this case. Therefore, the GPU has nothing to do but wait (most of the time).
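The scaling argument above can be checked with quick arithmetic: if the iStat readings are off by a roughly constant factor, the percentage increase survives the error. The "actual" watt figures below are the rough estimates from the post, not measurements:

```python
# Quick check of the wattage reasoning. The iStat readings
# (62 W -> 144 W) are assumed to be off by a constant scale factor,
# so the relative increase is unaffected by the error.

reported_before, reported_after = 62.0, 144.0
increase = (reported_after - reported_before) / reported_before * 100
print(f"reported increase: {increase:.0f}%")  # ~132%, i.e. "more than 100%"

# If both readings are scaled by a common factor, the ratio is unchanged.
# Mapping 144 W reported to ~100 W actual gives the scale:
scale = 100.0 / 144.0
actual_before = reported_before * scale
print(f"estimated actual before: {actual_before:.0f} W")  # ~43 W, near the ~40 W guess
```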
 
I assume your project is just an export with no corrections?

I'll have to look at how one of my projects behaves. From memory, once you have a mix of corrections and effects, GPU/CPU use changes depending on what's going on, but that may be timeline-only and not during export.
I'm really not sure; I'll have a look later.

The main thing is the big speed boost, and I assume this also boosts timeline performance, which is really cool.

Edit:
OK, you can tell I don't work with 4K HEVC 10-bit 60fps video files. I grabbed the Sony sample video with Resolve 15 and only get 8-10fps in timeline editing, with almost no GPU use and no massive CPU use.
I added a few LUTs and saw no real change.
Only stabilisation seems to use the GPU in any real way, and that's only 50% per the GPU history.
OS X 10.13, so no hardware decoding, and project settings only set to 1080p.

So hardware decoding takes this video from unworkable to a viable option.
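The gap between software and hardware decoding here is easy to see from raw pixel throughput. The frame geometry below is nominal UHD (an assumption; the Sony clip may differ slightly), and the 8 fps figure is the software-decode rate reported above:

```python
# Rough pixel-throughput comparison for a 4K HEVC 10-bit 60 fps clip.
# 3840x2160 is assumed UHD geometry; 8 fps is the CPU-only decode
# rate reported in the post, 60 fps is what real-time playback needs.

width, height = 3840, 2160
pixels_per_frame = width * height        # ~8.3 million pixels per frame

realtime = pixels_per_frame * 60         # throughput needed for smooth playback
software = pixels_per_frame * 8          # what software decoding managed

print(f"real time: {realtime / 1e6:.0f} Mpx/s")   # ~498 Mpx/s
print(f"software:  {software / 1e6:.0f} Mpx/s")   # ~66 Mpx/s
print(f"shortfall: {realtime / software:.1f}x")   # 7.5x short of real time
```

A 7.5x shortfall is why no amount of timeline tweaking helps without hardware decoding: the CPU simply cannot keep up with the source.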
 
cMP 5,1 3.33 dual processor
firmware 141.0.0.0.0
NVME SSD
RX580
monitor: 32in 4k running scaled at 2560x1440 connected by DisplayPort

Does this work with firmware 141.0.0.0.0?

Does this do anything whatsoever if you are not converting videos?

I am having an issue when I attempt to play YouTube videos at full screen: the monitor frequently goes black for a few seconds at random, as if going to sleep, but then comes back on again. Then it happens again... every 30 seconds or so...

YouTube videos also really seem to increase my CPU B temperature (the processor is warped slightly, causing imperfect contact with the heatsink).

Thanks!
 
Does this work with firmware 141.0.0.0.0?

Yes

Does this do anything whatsoever if you are not converting videos?

Yes, please read the Q&A section in post #1.

I am having an issue when I attempt to play YouTube videos at full screen, the monitor will frequently go black for a few seconds randomly, as if going to sleep, but then it comes back on again. Then it happens again...every 30 seconds or so...

Before or after this hwaccel mod?

Which browser are you using?

I have never seen this issue on my setup. Which OS are you running?
 
Before - have not attempted.

Safari & Waterfox, but I can now confirm this also occurs in VLC; opening the window, or making it bigger, seems to trigger it.

10.14.5

Thanks!

If VLC also shows this issue, check your cable, monitor settings, etc. Of course, it could be the card's issue, but it could be something else.

Until you fix this issue, I would say don't go for the mod; otherwise it may cause more confusion.

I have never seen this issue in 10.14.5 with my PULSE RX580, so it shouldn't be an OS issue.

Once you have performed the mod, you can enjoy 4K or even 8K YouTube video in a Chromium-based browser. It doesn't have to be Chrome; I use the Brave browser and am very happy with it.
 
Has anyone else tried this out on High Sierra with a RX580?

I think I've seen a single report of one user with a Vega card (and it did work!) and that was it.
 
Awesome! Has anyone tried this with the MP 6,1 with the built in Firepros? I read through all pages and did not see it referenced.
 

I don't think it will work. The D700 is just the 7970 / 280X, and the 7xxx cards have already been tested: no hwaccel.

But it may be possible to get that by using an eGPU.
 
Has anyone used this in an Adobe Premiere Pro workflow? Any experience on general timeline performance using 4K h.264-based codecs like XF-AVC?

I've noticed that even the 12-core 3.46 can struggle with real-time playback with more than one stream of decoding. If this offloads the decoding onto the GPU for playback, that sounds like it would be amazing.

I plan on testing with my Vega Frontier Edition when back from some current travels.
 
I tried using this in High Sierra with the RX 580; it caused GPU restarts (i.e. it did not work and made the system unusable).
 
Indeed. Bummer though.

The AppleGVA method isn't possible in High Sierra as the file doesn't exist or is vastly different, right?
 