So for anyone who does a lot of Millumin stuff, this is not the card to get. I am still doing some testing, but configuring monitors so that the FPS stays at 60 is a chore. (I have a 3840x1600 ultrawide as my main and a 1080p off to the side; program monitors were two 27" displays @ 3840x2160.) I did get it to work, but the config is really bizarre. I even threw my 580X back in to drive the 1080p monitor and it was still pretty much the same story.
My guess is the retail 6800XT or 6900XT will be better for this, or the retail W6800.
I have been in contact with the Millumin devs, and more testing is required before making complete recommendations. I have some DisplayPort adapters on the way to test another theory; currently I only have OWC USB-C to dual DisplayPort adapters, which may be causing the bottleneck. Will know soon enough.
On another front, I am doing more After Effects and Element 3D testing. Really enjoying the speed of the new card. Premiere, Media Encoder, everything is way, way faster than the 580X (not that I was expecting any less).
No word from the TG Pro devs yet, but I did send them some info from my system so they could look into it more.
How much faster is After Effects and Media Encoder compared to the 580X MPX? 2x? 3x?

It really depends on what you are doing....
Awesome!!! This is great info. Thanks a lot. Helps me on my upgrade decision.
But for Media Encoder from a Premiere timeline, 60 min ProRes 422 > Vimeo 1080p:
580X did it in 24:28
W6800X Duo did it in 9:28
So between 2-3x faster for that; I imagine once you get to higher res stuff, the lack of memory on the 580x will start to show.
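For anyone who wants to sanity-check the "between 2-3x" figure, the two encode times above work out like this (a quick sketch, nothing Adobe-specific):

```python
def to_seconds(mmss: str) -> int:
    """Convert a MM:SS time string to total seconds."""
    minutes, seconds = (int(part) for part in mmss.split(":"))
    return minutes * 60 + seconds

old = to_seconds("24:28")  # 580X encode time -> 1468 s
new = to_seconds("9:28")   # W6800X Duo encode time -> 568 s
speedup = old / new
print(f"{speedup:.2f}x faster")  # -> 2.58x faster
```

So roughly a 2.6x speedup for that particular export.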
After Effects is tricky because it really depends on what you are doing, and on what is GPU accelerated and what isn't. Element 3D (a plug-in from Video Copilot) uses the GPU, so the speedup was quite substantial. I just confirmed that both cores are being used by Element 3D, but the scaling is not linear; neither is pegged. I have a 16-core and none of the CPU cores were maxing out, but there wasn't much I was rendering that was not Element 3D.
The Puget Systems AE benchmark does not currently run in the latest version of AE, but I will keep checking back; hopefully they will have a fix soon, which will give a better overall picture.
I prefer someone to just tell me what to get. Lol. I'm gonna get the 6800 Duo I think. Premiere Pro and After Effects.

Do it. It's an incredible card. I've been doing a lot of work in Unreal Engine 4 and it screams. Upgraded from a W5700X and the render times from FCPX and After Effects are amazing.
How about blender?
triton100 According to what I see on the Blender site, Cycles uses the CPU and GPU for rendering, so I imagine it would provide a nice render speedup. If you are not using Cycles or another GPU-accelerated rendering pipeline, then I imagine you are in the same boat as me with LightWave 3D. This card did nothing for me as far as rendering goes in native LightWave, but the viewport OpenGL performance is way better and I was able to crank the texture resolution to maximum.
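If you do go the Cycles route, GPU rendering has to be switched on in the preferences; a minimal sketch of doing that from Blender's scripting console, assuming Blender 3.1 or later where Cycles gained a Metal backend on macOS (the property names below are from the Blender Python API):

```python
import bpy  # only available inside Blender's bundled Python

# Select the Metal compute backend in the Cycles add-on preferences
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "METAL"

# Refresh the device list and enable every detected device
prefs.get_devices()
for device in prefs.devices:
    device.use = True

# Tell the current scene to render on the GPU instead of the CPU
bpy.context.scene.cycles.device = "GPU"
```

Whether both GPUs on a Duo card actually get used, and how well it scales, would still need real-world testing.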
So what you're saying is: I'm waiting.

It looks like Apple pushed the estimate for my card by another week. Dammit.
That sucks when that happens, but in a weird way I kind of like ordering the MPX modules from Apple. The time frame is generally odd, and it almost feels like Santa's elves are hard at work crafting the MPX module for you personally, lol.
LOL, I received the same email; since I sent him the support package info -- guess you did as well.
Don't buy the W6800X Duo, it's crap. I made 2 videos explaining that.
Damn. Now I'm confused. Lol.
Get the Vega and be happy. In my next video I plan to show why the Vega IIs are a much better value.
From Matt, the developer of TG Pro, who is helping to check the W6800X Duo stats:
I've been looking into this along with the info from the support package and it seems that the GPU Proximity 6 sensor is giving a way higher value than the rest so I may have to exclude that one, or if I can figure out what it's actually for, relabel it. The other interesting thing is that Apple didn't expose the W6800X temperatures - the keys are there, but the values for both GPUs are 0C which means they haven't been set. I'm hoping that the drivers with macOS 12 may fix this, although it's really up to Apple since they do the graphics drivers.
So either way, I'll get this sorted out in the TG Pro 2.59 update which I'm working on right now.
This is a very vague statement (although I would watch the videos; the more info the better). I do not think the card is crap; I think many thought it was going to be a holy grail, when in reality it isn't much more compute than the previous top-of-the-line stuff.
I like your answer! After a very long time I found someone whose approach I can value a lot. Thanks for that. I would like to have your "charm" too.
I believe the better answer really boils down to what you already own coupled with what you do most in your workflow.
If you have a 580X or a 5500X, why even bother with the Vega II, since it is yesterday's technology? For all we know, the Metal API in Big Sur might still be optimized for Vega II rather than RDNA2 cards; that in itself could have a huge impact looking ahead. Your choice should be based solely on your workflow needs. The W5700X module is a much better value if all you are doing is video editing; throw 2 in if you need more, at a much better price/performance ratio.
If you already have a Vega II then I agree it is kind of pointless to upgrade to a W6800X, at least right now; who knows whether that will still be true once macOS Monterey is released? Better drivers and/or more Metal optimization targeting RDNA2?
If your primary focus is rendering 3D graphics with GPU-based rendering solutions, then the W6800X Duo is a much better choice than the Vega II Duo.
If all you want to do is play games, buy a Windows rig... OK, if you MUST game on the Mac, then a retail 6800XT or 6900XT is probably your better choice (and way cheaper). I just compared running "Shadow of the Tomb Raider" on my Mac with the W6800X Duo against my Windows rig, which has an Nvidia 1080Ti. It was not much better than the 1080Ti. All this means is that either the Feral port for macOS is just so-so, or the more obvious answer: DirectX is much more mature than Metal and thus more efficient. (Was running @ 3840x1600, 60Hz, vsync on, everything maxed.)
I am not disappointed at all in my decision to go with the W6800X Duo. I mean really ANYTHING I bought was going to be better than the 580x. I chose the 580x on purpose when I purchased the Mac Pro because I knew RDNA2 was right around the corner.
Of course the Vega IIs are a better value now, because the price was slashed; but I just do not see the point in spending money on yesterday's tech.
Just do as much research as possible so that you make the best choice for you. It is your decision, and you should not let me or anyone else talk you into a decision you might regret. Not only that, we are talking about a big wad of cash; spend it wisely.
Maybe you should wait for macOS Monterey to be released; that could change the equation.