
Reindeer_Games

macrumors 6502
Nov 29, 2018
286
228
Pueblo, CO
Forget about marketing gimmicks on websites. We have had these things since the 90s - every generation of tech gets the same marketing hype from graphics card makers, from Matrox to Nvidia.

.....

You can have fun right now and see how 8K RED will work on your system. Download the 8K samples from RED’s website along with their editor and playback apps, which are free. Load an 8K video in the app and watch Activity Monitor while the video caches.

If you are on Windows, Task Manager will show you what is happening in RAM and VRAM - you will see how much of each is consumed and why.
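For anyone who wants to log that rather than eyeball it, here is a minimal sketch of the same check from a script - it assumes psutil is installed and nvidia-smi is on the PATH (i.e. Windows or Linux with Nvidia drivers), so it is purely illustrative rather than anything RED ships:

```python
# Rough sketch: poll system RAM and NVIDIA VRAM usage once per second
# while the RED player caches an 8K clip. Assumes psutil is installed
# and nvidia-smi is on the PATH; on macOS you would watch Activity
# Monitor instead, as described above.
import subprocess
import time

import psutil

def vram_used_mib():
    # nvidia-smi reports used framebuffer memory in MiB, one line per GPU
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return [int(line) for line in out.strip().splitlines()]

if __name__ == "__main__":
    for _ in range(60):  # watch for one minute
        ram = psutil.virtual_memory()
        print(f"RAM used: {ram.used / 2**30:5.1f} GiB ({ram.percent:.0f}%)  "
              f"VRAM used (MiB): {vram_used_mib()}")
        time.sleep(1)
```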

Very true about marketing and pushing sales.

Oh I need a better GPU before I’d step into that arena-LOL. Which is why we’re all hoping for a Christmas miracle-happy holidays to all!
 
Jul 4, 2015
4,487
2,551
Paris
Very true about marketing and pushing sales.

Oh I need a better GPU before I’d step into that arena-LOL. Which is why we’re all hoping for a Christmas miracle-happy holidays to all!

GeForce and Titan don't support 10-bit color anyway, so trying to edit that HDR RED footage would be kinda pointless.

Don't worry bro, the Mac Pro will be updated ;) I'm just hoping it has Navi and Apple supports overclocking. That would be a first for Macs.
 
  • Like
Reactions: Reindeer_Games

Reindeer_Games

macrumors 6502
Nov 29, 2018
286
228
Pueblo, CO
GeForce and Titan don't support 10-bit color anyway, so trying to edit that HDR RED footage would be kinda pointless.

Don't worry bro, the Mac Pro will be updated ;) I'm just hoping it has Navi and Apple supports overclocking. That would be a first for Macs.

I'd be happy if they'd just play nice and deliver drivers, personally - this 8K stuff I'm rooting for in the professional community is above my pay grade, really - LOL. Photography is my primary creative application, but Excel gets its fair share of workouts. Most people don't realize how data-intensive trend analysis can get when you are talking about 20-200K data points spread across multiple worksheets. I've just always focused on hardware, since software always comes second and is "easier" to modify - not by me, but by other, more talented people. There is nothing more beautiful than a driver-update notification ping when you've been limping along on buggy drivers.
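As a rough illustration of that kind of multi-sheet trend pull - the workbook name "measurements.xlsx" and the "date"/"value" columns are made up for the example, and it needs pandas plus openpyxl installed:

```python
# Minimal sketch of multi-sheet trend analysis on 20K-200K rows.
# File and column names are hypothetical, for illustration only.
import pandas as pd

# sheet_name=None loads every sheet into a dict of DataFrames
sheets = pd.read_excel("measurements.xlsx", sheet_name=None, parse_dates=["date"])

# Stack the rows spread across the sheets into one frame
df = pd.concat(sheets.values(), ignore_index=True).sort_values("date")

# Simple trend: 30-sample rolling mean plus a least-squares slope
df["rolling_mean"] = df["value"].rolling(30).mean()
x = (df["date"] - df["date"].iloc[0]).dt.days
slope = ((x - x.mean()) * (df["value"] - df["value"].mean())).sum() / ((x - x.mean()) ** 2).sum()
print(f"{len(df)} points, trend ~ {slope:.4f} units/day")
```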

I was under the impression 10-bit has been working in Windows for a while.
I don't have a 10-bit monitor and don't handle that media, so I've paid it no heed - I can adjust the setting, but the difference wouldn't be visible or noticeable to me if I did. Are you saying this function doesn't work on your Titan XP? I'll take that with a grain of salt - as well as your claim to own stock in a company you bash nearly as hard as you do. Just an observation; I don't really care what your portfolio looks like, but why would anyone still own stock that has performed the way Nvidia's did in Q4 2018 unless a turnaround was expected? I prefer win/win scenarios over win/lose ones though - it tends to be more productive.

I understand that's a resolution output setting - but one would conclude it enables the instruction set throughout the GPU, and if it's capable of outputting it, it's capable of processing it to one degree or another.

"High Dynamic Range, or HDR, isn't a new concept in photography. It isn't even new to PC gaming, as some of the oldest games with HDR (using simple bloom effects) date back to the Valve Source engine (early 2000s). Those apps, however, used the limited 24-bit (8-bit per color, 16.7 million colors in all) color palette to emulate HDR. Modern bandwidth-rich GPUs such as the GTX 1080 have native support for large color palettes, such as 10-bit (1.07 billion colors) and 12-bit (68.7 billion colors), to accelerate HDR content without software emulation. This includes support for 10-bit and 12-bit HVEC video decoding at resolutions of up to 4K @ 60 Hz, or video encoding at 10-bit for the same resolution."-https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/3.html

For me, Turing and Apple speaking natively at a hardware and software level is an ideal solution to many of our problems, but I have read great things leaking about Navi. We'll see what happens in Vegas next month and through March (end of Q1 2019). If they want to open the door to both, so much the better. The grass is always greener on the other side, and opening up GPU interchangeability eliminates the fear-of-missing-out BS that drives 90% of the disputes between PC/Mac/Hackintosh users.
 
Last edited:
Jul 4, 2015
4,487
2,551
Paris
I'd be happy if they'd just play nice and deliver drivers, personally - this 8K stuff I'm rooting for in the professional community is above my pay grade, really - LOL. Photography is my primary creative application, but Excel gets its fair share of workouts. Most people don't realize how data-intensive trend analysis can get when you are talking about 20-200K data points spread across multiple worksheets. I've just always focused on hardware, since software always comes second and is "easier" to modify - not by me, but by other, more talented people. There is nothing more beautiful than a driver-update notification ping when you've been limping along on buggy drivers.

I was under the impression 10-bit has been working in Windows for a while.
I don't have a 10-bit monitor and don't handle that media, so I've paid it no heed - I can adjust the setting, but the difference wouldn't be visible or noticeable to me if I did. Are you saying this function doesn't work on your Titan XP? I'll take that with a grain of salt - as well as your claim to own stock in a company you bash nearly as hard as you do. Just an observation; I don't really care what your portfolio looks like, but why would anyone still own stock that has performed the way Nvidia's did in Q4 2018 unless a turnaround was expected? I prefer win/win scenarios over win/lose ones though - it tends to be more productive.

I understand that's a resolution output setting - but one would conclude it enables the instruction set throughout the GPU, and if it's capable of outputting it, it's capable of processing it to one degree or another.

"High Dynamic Range, or HDR, isn't a new concept in photography. It isn't even new to PC gaming, as some of the oldest games with HDR (using simple bloom effects) date back to the Valve Source engine (early 2000s). Those apps, however, used the limited 24-bit (8-bit per color, 16.7 million colors in all) color palette to emulate HDR. Modern bandwidth-rich GPUs such as the GTX 1080 have native support for large color palettes, such as 10-bit (1.07 billion colors) and 12-bit (68.7 billion colors), to accelerate HDR content without software emulation. This includes support for 10-bit and 12-bit HVEC video decoding at resolutions of up to 4K @ 60 Hz, or video encoding at 10-bit for the same resolution."-https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/3.html

For me, Turing and Apple speaking natively at a hardware and software level is an ideal solution to many of our problems, but I have read great things leaking about Navi. We'll see what happens in Vegas next month and through March (end of Q1 2019). If they want to open the door to both, so much the better. The grass is always greener on the other side, and opening up GPU interchangeability eliminates the fear-of-missing-out BS that drives 90% of the disputes between PC/Mac/Hackintosh users.

10-bit definitely doesn't work on GeForce in Windows. I have tried with three pro monitors and four graphics cards: a 980, 1070, 1080 Ti and Titan XP. I have been posting results on this forum for four years.

The option sometimes shows up on some driver versions and then disappears on others. When the drop-down option does appear it doesn't seem to do anything, and after a reboot it goes back to 8-bit.

This is widely known (it will appear on the first pages of Google results) and the guy's video is not reliable.

There are some DirectX 12 games that can force 10-bit color in full-screen mode, but it will be dithered color.

On macOS there is no GeForce 10-bit output, and not even an option to switch between color depths.

Moving up to a Quadro gives access to full 10-bit in applications.

AMD supports genuine 10-bit across their whole range.
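For what it's worth, on macOS you can at least see what depth the display pipeline is currently running at: system_profiler lists a "Pixel Depth" per display (e.g. 30-bit ARGB2101010 vs 32-bit ARGB8888). A rough check, assuming macOS:

```python
# Rough macOS-only check of the active display bit depth via
# system_profiler's "Pixel Depth" field. On Windows the equivalent
# lives in the NVIDIA/AMD control panel, as discussed above.
import subprocess

report = subprocess.check_output(
    ["system_profiler", "SPDisplaysDataType"], text=True
)
for line in report.splitlines():
    if "Pixel Depth" in line:
        print(line.strip())
```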
 

MIKX

macrumors 68000
Dec 16, 2004
1,815
691
Japan
Apple started listening when Gilles Aurejac got his cMP to boot M.2 NVMe and then dosdude2 produced ROMtool. The slow but growing tide of bug reports (thanks, tsialex) resulted in BootROM 140.0.0.0 making NVMe booting native.

Perhaps they're listening once more re Nvidia drivers. :rolleyes: Let's hope.
 
Last edited:

itdk92

macrumors 6502a
Nov 14, 2016
504
180
Copenhagen, Denmark
That card will barely help. The CPU and memory are a big bottleneck if we are talking about 8K RAW, regardless of whether you use a 1080 Ti, 2080 Ti or 3080 Ti.

My PC runs at 5 GHz with 4 GHz memory and a Titan XP, and it is 'barely' good enough to handle 8K RAW. On an old cMP with 1333 MHz memory it would be a laughable experience.

Our Mac Pro 5,1 systems can edit 8K RED RAW 10:1 easily at 21 fps, even with nodes.

We just can’t go higher no matter how many graphics cards we throw at it.

That might be a bandwidth limit but I don’t think so, and these RTX GPUs are so much better that we might see something good.
 
  • Like
Reactions: Reindeer_Games
Jul 4, 2015
4,487
2,551
Paris
Our Mac Pro 5,1 systems can edit 8K RED RAW 10:1 easily at 21 fps, even with nodes.

We just can’t go higher no matter how many graphics cards we throw at it.

That might be a bandwidth limit but I don’t think so, and these RTX GPUs are so much better that we might see something good.

21 fps is poor performance for 10:1 (that's a high compression ratio). Yes, a new GPU won't do anything. The bandwidth limit is primarily in your system's RAM. Secondarily, the PCIe 2.0 interface will also become saturated. You can't get around this with pipe dreams about upgrades.
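Some back-of-envelope numbers behind that argument - all rough assumptions for illustration, not measurements:

```python
# Rough bandwidth sanity check: decoded 8K frames vs PCIe 2.0 and
# cMP RAM budgets. All figures are approximate assumptions.
W, H = 8192, 4320              # 8K full-frame RED sensor, approx.
bytes_per_px_rgb16 = 3 * 2     # debayered to 16-bit RGB
fps = 24

frame_mb = W * H * bytes_per_px_rgb16 / 1e6
stream_gbs = frame_mb * fps / 1e3
print(f"Decoded 8K frame: ~{frame_mb:.0f} MB, ~{stream_gbs:.1f} GB/s at {fps} fps")

pcie2_x16_gbs = 8.0            # PCIe 2.0 x16, ~8 GB/s per direction
ddr3_1333_gbs = 32.0           # triple-channel DDR3-1333, theoretical peak
print(f"PCIe 2.0 x16 budget: ~{pcie2_x16_gbs} GB/s, "
      f"cMP RAM budget: ~{ddr3_1333_gbs} GB/s")
```

On those rough numbers, just moving decoded frames to the GPU at playback rate already eats most of a PCIe 2.0 x16 link, before any grading work happens.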
 

itdk92

macrumors 6502a
Nov 14, 2016
504
180
Copenhagen, Denmark
21 fps is poor performance for 10:1 (that's a high compression ratio). Yes, a new GPU won't do anything. The bandwidth limit is primarily in your system's RAM. Secondarily, the PCIe 2.0 interface will also become saturated. You can't get around this with pipe dreams about upgrades.

You don’t understand.

It’s 21 fps 8K RED RAW 10:1 in the Resolve Color Tab, while on a 4K timeline and playing back at Full Res Premium. That’s pretty good, especially because a few nodes don’t even impact the playback speed, which remains at 21 fps.
 

Draeconis

macrumors 6502a
May 6, 2008
987
281
Just to clarify: does anyone have a reference 2070 running, with boot screens, on a 5,1 running Windows 10 x64 installed via EFI?
 
  • Like
Reactions: Squuiid

Reindeer_Games

macrumors 6502
Nov 29, 2018
286
228
Pueblo, CO
Our Mac Pro 5,1 systems can edit 8K RED RAW 10:1 easily at 21 fps, even with nodes.

We just can’t go higher no matter how many graphics cards we throw at it.

That might be a bandwidth limit but I don’t think so, and these RTX GPUs are so much better that we might see something good.

Supposedly it's gonna get easier, though I still can't find a data-rate number - and it's just a theory on why the boot screen is present - but I'm excited to see what 2019 is going to bring our way.

https://ymcinema.com/2018/07/07/a-thesis-red-8k-reduces-data-rate-and-file-size/
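In the absence of an official figure, a first-principles estimate is possible - assuming roughly 16 bits per photosite of raw Bayer data before debayer and a straight 10:1 reduction, which is an assumption for illustration rather than RED's published spec:

```python
# Very rough REDCODE 10:1 data-rate estimate from first principles.
# Assumed figures, not RED's published numbers.
W, H = 8192, 4320
bits_per_photosite = 16
fps = 24
compression = 10

raw_bps = W * H * bits_per_photosite * fps
compressed_mbs = raw_bps / compression / 8 / 1e6
print(f"~{compressed_mbs:.0f} MB/s at {fps} fps, 10:1 (rough estimate)")
# roughly 170 MB/s -> about 600 GB per hour of footage
```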

Just to clarify: does anyone have a reference 2070 running, with boot screens, on a 5,1 running Windows 10 x64 installed via EFI?


Spacedust is working on a non-reference 2070, but we haven't heard from anyone claiming to have a working system for a bit. Most are holding off unless they're looking to do R&D at this point.
 
Last edited:

Spoon!

macrumors 6502
Dec 9, 2018
256
391
Seriously ... if Apple doesn’t start supporting NVIDIA via eGPU within the next year, my current Mac will be my last - unless AMD pulls their heads from their rears and releases comparable high-end GPUs; the current sad selection of GPUs is not worth my time. I can’t even imagine what you MP users are dealing with.
 

Reindeer_Games

macrumors 6502
Nov 29, 2018
286
228
Pueblo, CO
I came across what appear to be pretty solid leaks on the rest of Nvidia's new releases. The GTX 1160's memory looks... underwhelming, IMHO:

https://www.techspot.com/news/78044-rtx-2060-vs-gtx-1160-ten-new-leaks.html

This is a route I definitely wasn't expecting, since the leaks suggest a different chipset for the GTX rehash even though the tech industry has been reporting a surplus of GP106 chips due to the mining crash. I was kind of expecting them to overclock the GP106 and rebrand it, like they have so many times before.

The RTX 2060 looks good though and has promise for me.....if things get better on the Apple/Nvidia front.
 
Last edited:

09872738

Cancelled
Feb 12, 2005
1,270
2,125
.....if things get better on the Apple/Nvidia front.

Yup, this happens to be the key question... w/o drivers for Turing cards & Mojave this discussion is rather academic. It's just insane what goes on between Nvidia and Apple... kinda reminiscent of kindergarten.
 
Last edited:
  • Like
Reactions: Reindeer_Games

TheStork

macrumors 6502
Dec 28, 2008
296
190
...
The RTX 2060 looks good though and has promise for me.....if things get better on the Apple/Nvidia front.
I've been looking at the RTX 2060, too, to replace the R9 280X in my Mac Pro 5,1, since it has a TDP of 160 watts. So, you early adopters, let us know if we get a boot screen with the RTX 2060. (Yes, I know there are no drivers yet. I can wait.)
 
  • Like
Reactions: Reindeer_Games

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,677
The Peninsula
I've been looking at the RTX 2060, too, to replace the R9 280X in my Mac Pro 5,1, since it has a TDP of 160 watts. So, you early adopters, let us know if we get a boot screen with the RTX 2060. (Yes, I know there are no drivers yet. I can wait.)
These are questions that Windows and Linux users never need to ask. Of course every card has a BIOS/boot screen. Of course.
 
  • Like
Reactions: thornslack

Reindeer_Games

macrumors 6502
Nov 29, 2018
286
228
Pueblo, CO
I've been looking at the RTX 2060, too, to replace the R9 280X in my Mac Pro 5,1, since it has a TDP of 160 watts. So, you early adopters, let us know if we get a boot screen with the RTX 2060. (Yes, I know there are no drivers yet. I can wait.)

Well, remember, if you read my previous post - I'm pretty hesitant, since Boot Camp reportedly won't boot (only Win10 EFI). But hope springs eternal for the patient. With the price of GPUs jumping higher and higher, I'm getting to the point that without official support on somebody's end it's become too painful (financially) for everybody - Apple and Nvidia included, but mostly the users.

Apple and Nvidia need to understand that they aren't competing just against AMD and the PC; they are competing against their own past mistakes - learn from them. Macs have always held their value due to their durability - take that away and you can't expect them to hold value for anyone above a basic consumer. Once they get that, I'll spend some more of my money with them. Right now I would only commit to purchasing an RTX if I were building a new PC to drop it into as plan B - if the money becomes available - and continue running my 770 in the cMP.

Always have a plan A, B, C.... but I'm an optimist who thinks that with enough pressure from the pro/superuser community, both Apple and Nvidia are far more likely to listen to the noise to help their climb back from the recent economic slide. My personal opinion is that Nvidia has been waiting to come out of the apex of the slide and will drop the drivers concurrently with the 2060 launch to take full advantage of the momentum. Just a feeling though - and I don't want to build another computer, so I wait.
 
Last edited:
  • Like
Reactions: TheStork

star-affinity

macrumors 68000
Nov 14, 2007
1,996
1,333
Seriously ... if Apple doesn’t start supporting NVIDIA via eGPU within the next year, my current Mac will be my last - unless AMD pulls their heads from their rears and releases comparable high-end GPUs; the current sad selection of GPUs is not worth my time. I can’t even imagine what you MP users are dealing with.

My impression from reading posts online and communicating a little with Nvidia's Director of Mac Product Management via email (I've done some bug reporting for their Web Driver a few times) is that the drivers for Mojave are coming – it's probably just that it takes time to get them right. It's not difficult to imagine that the resources for producing the Mac drivers are far less than what they have for Windows, and getting the drivers ready for both a new OS (Mojave) and new GPUs (the RTX series) – if that's their intention – is probably not a small task.

But still, this is just my impression. I don't really have any inside info on what's going on behind the scenes. But I hope for the best…
 

skyline r34

macrumors 6502
Oct 10, 2005
397
33
San Diego
Confirmed: OS X boots. Just purchased a Titan RTX - it's not for this Mac Pro, I'm just testing to see if the card will work, and it does. An unboxing/install/benchmark video will be up on YouTube in a few weeks.
 

Attachments

  • IMG_0986.jpeg
  • IMG_0987.jpeg
  • IMG_0988.jpeg
  • IMG_0989.jpeg
  • IMG_0990.jpeg
  • IMG_0991.jpeg

Reindeer_Games

macrumors 6502
Nov 29, 2018
286
228
Pueblo, CO
Confirmed: OS X boots. Just purchased a Titan RTX - it's not for this Mac Pro, I'm just testing to see if the card will work, and it does. An unboxing/install/benchmark video will be up on YouTube in a few weeks.

Very nice and good to hear-thanks for the update.

I think it might be reasonable to presume the 2060 will as well-but confirmation is always better.
 
  • Like
Reactions: TheStork

Spoon!

macrumors 6502
Dec 9, 2018
256
391
My impression from reading posts online and communicating a little with Nvidia's Director of Mac Product Management via email (I've done some bug reporting for their Web Driver a few times) is that the drivers for Mojave are coming – it's probably just that it takes time to get them right. It's not difficult to imagine that the resources for producing the Mac drivers are far less than what they have for Windows, and getting the drivers ready for both a new OS (Mojave) and new GPUs (the RTX series) – if that's their intention – is probably not a small task.

But still, this is just my impression. I don't really have any inside info on what's going on behind the scenes. But I hope for the best…
By the time the drivers are out, the new version of macOS will be out.
 

Reindeer_Games

macrumors 6502
Nov 29, 2018
286
228
Pueblo, CO
By the time the drivers are out, the new version of macOS will be out.

Then don't buy one. Negativity gets no one anywhere - and relatively slowly, I might add. Mojave isn't that great IMO - I'll be happy with HS Turing drivers.

And coordinating a web-driver launch with a Mojave update might be the plan. But Nvidia has already announced RTX support for macOS at their CES presentation - which macOS version they are referring to isn't exactly clear yet.

*I am only referencing their CEO's comments about RTX Arnold support on both Windows and macOS. In no way would I encourage any Mac user to spend their hard-earned money until Nvidia delivers on that sales pitch.
 
Last edited:
  • Like
Reactions: bsbeamer

Spoon!

macrumors 6502
Dec 9, 2018
256
391
Then don't buy one. Negativity gets no one anywhere - and relatively slowly, I might add. Mojave isn't that great IMO - I'll be happy with HS Turing drivers.

And coordinating a web-driver launch with a Mojave update might be the plan. But Nvidia has already announced RTX support for macOS at their CES presentation - which macOS version they are referring to isn't exactly clear yet.

*I am only referencing their CEO's comments about RTX Arnold support on both Windows and macOS. In no way would I encourage any Mac user to spend their hard-earned money until Nvidia delivers on that sales pitch.
I’m still on High Sierra so I can keep using my NVIDIA GPU. But with the announcement of the Radeon VII, I might actually be able to update the OS.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
And coordinating a web-driver launch with a Mojave update might be the plan. But Nvidia has already announced RTX support for macOS at their CES presentation-to which macOS they are referring to isn't exactly clear yet.

Is there a link to that? Hard to believe they would make time for that announcement on stage just for the Mac Pro unless they were also planning on supporting eGPU.
 

Reindeer_Games

macrumors 6502
Nov 29, 2018
286
228
Pueblo, CO
Is there a link to that? Hard to believe they would make time for that announcement on stage just for the Mac Pro unless they were also planning on supporting eGPU.

I agree-and hope it's all stepping stones.


1 hr 14 min 23 sec is the most interesting part creatives will want to hear - but again, the proof is in the pudding. At 1 hr 18 min he speaks directly about Arnold - nothing has been said about the Mac Pro specifically.

When the speech was given I couldn't help but notice similarities to some of the comments made in this very forum - LOL!

EDIT: I have been informed that the 1:18 comment more than likely refers to the "Max" 3D animation software, and may not actually be "Macs" as stated in the CC subtitles.
 
Last edited: