
ActionableMango

macrumors G3
Sep 21, 2010
9,613
6,909
I seem to remember in the early days of the nMP someone posted a block diagram he'd found, and the second GPU has no path to the monitors. It's the wiring that is the limitation, not OS X.

I don't understand why so many people are saying OS X doesn't support screens connected to multiple video cards. People have been doing that on this forum for years. Apple even sold MP 5,1 with two 5770 video cards as a BTO option. The second card wasn't compute-only and it ran monitors just fine.

Heck, I've seen crazy multiple card, multiple monitor setups here on MR all the way up to 10 monitors.

TEN monitors!

https://forums.macrumors.com/threads/success-10-screens-on-a-mac-pro.1562363/
 
  • Like
Reactions: Flint Ironstag

goMac

macrumors 604
Apr 15, 2004
7,663
1,694

Specs seem a little high with TDP a little low, but all in all not totally out of line with the sort of process change Nvidia (and AMD) will bring.

I seem to remember in the early days of the nMP someone posted a block diagram he'd found, and the second GPU has no path to the monitors. It's the wiring that is the limitation, not OS X.

Apple is intentionally keeping the second GPU as a device meant for compute. Their understanding is that if you're in something like FCPX, one GPU will be busy with drawing and the other will be busy with OpenCL. This doesn't translate well to graphics-heavy workflows, but they don't want a second monitor causing the compute GPU to be slowed down.

It matters for the number of displays you can connect, but it probably won't matter for multi GPU rendering. Modern multi GPU rendering virtualizes everything so it doesn't matter what connects to what.

I don't understand why so many people are saying OS X doesn't support screens connected to multiple video cards. People have been doing that on this forum for years. Apple even sold MP 5,1 with two 5770 video cards as a BTO option. The second card wasn't compute-only and it ran monitors just fine.

I don't know why either. Apple sold Mac Pros with multiple GPUs for just this reason. It's documented, it's been tested. I'm pretty sure OS X has supported this since it launched. I'm even pretty sure NeXTStep supported this.
 

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
The X could mean 10, therefore x80=1080.
Just like OS X :)
8GB of GDDR5 memory? I'd buy it if it were GDDR5X. It should be, but Nvidia failing to mention it?! Or was it just a slip?!
The rest seems possible, but a little too perfect.
A 512-bit memory bus at 8G would suck up a lot of power though.
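(A hedged aside on what that bus implies: if "8G" is read as an assumed 8 Gbps per-pin data rate rather than 8 GB of capacity, the raw bandwidth works out as below; either way, wide GDDR5 buses are power-hungry, which is the point being made.)

```latex
% Back-of-envelope bandwidth for a 512-bit bus at an assumed 8 Gbps per pin:
\text{bandwidth} = \frac{512~\text{bits} \times 8~\text{Gbps}}{8~\text{bits/byte}} = 512~\text{GB/s}
```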
Intel released a goodie for gamers, the Skull Canyon NUC with TB3. Add an eGPU and there you have it.
 
  • Like
Reactions: koyoot

Mago

macrumors 68030
Aug 16, 2011
2,789
912
Beyond the Thunderdome
I know, but I would not simply dismiss this chart. People are, I think, underestimating the jump in nodes from 28nm to 14/16nm. 16nm is supposed to double the performance, so it would be logical to think that the GTX 1080/X80 would have 4096 CUDA cores, for example.

I'm not disagreeing that it may be fake. However, even if it is fake, it can be extremely close to reality/turn out to be reality (if we discard the X80Ti - a GP100 chip with GDDR5, even though the Titan based on the same ASIC uses HBM2 ;)). I still think the specs of the X80 and Titan are spot on.
Sorry, but this chart isn't consistent with the power savings from switching from 28nm to 14nm. You previously published here some data from GF about this; assuming it's true that they double the number of transistors at the same clock and the process saves nearly 75% per transistor, the simple math says the final power should be less than the original, about 15-25% less.

This chart seems like a mockup or clickbait.
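(A hedged back-of-envelope for the relation being argued here, assuming dynamic power scales roughly with transistor count times per-transistor power at a fixed clock; the numbers below follow from that assumption, not from the chart itself:)

```latex
% Assuming P \propto N_\text{transistors} \cdot p_\text{per-transistor} at a fixed clock,
% doubling the transistor count while the new process cuts per-transistor power
% by a fraction s gives
\frac{P_\text{14nm}}{P_\text{28nm}} \approx 2\,(1 - s).
% For total power to land 15-25% below the 28nm part, s must be about 0.575-0.625,
% i.e. roughly a 60% per-transistor saving rather than 75%.
```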
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Only AMD is using GloFo 14 nm process. Nvidia builds their GPUs on TSMC 16 nm FF+ process.
 

Mago

macrumors 68030
Aug 16, 2011
2,789
912
Beyond the Thunderdome
What I really want to know about is Polaris and Vega. I know Nvidia is likely to be far superior, but the gap between Polaris or Vega and Pascal will redefine all GPU pricing.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Huh? Where is the information on that chart that NV will use a 14nm process?

The chart from VeriSilicon was comparing GloFo's 28 nm process to GloFo's 14 nm FF process. Nvidia has never used a GloFo process; it has always used TSMC. Do not take that slide as an indication of the power consumption of Nvidia GPUs.
 

Mago

macrumors 68030
Aug 16, 2011
2,789
912
Beyond the Thunderdome
Huh? Where is the information on that chart that NV will use a 14nm process?

The chart from VeriSilicon was comparing GloFo's 28 nm process to GloFo's 14 nm FF process. Nvidia has never used a GloFo process; it has always used TSMC. Do not take that slide as an indication of the power consumption of Nvidia GPUs.
Not this chart; I'm talking about the chart with the alleged Pascal GPU.
 

lowendlinux

macrumors 603
Sep 24, 2014
5,460
6,788
Germany
What I really want to know about is Polaris and Vega. I know Nvidia is likely to be far superior, but the gap between Polaris or Vega and Pascal will redefine all GPU pricing.
I may be holding out pointless hope, but I don't think Nvidia is going to be superior in any measurable way this generation.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
It looks like Polaris 10 is a 4GB HBM1 GPU at Nano size.

More than enough for Pitcairn replacement.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
As you said, CrossFire (like SLI) is a nightmare on Windows too; maybe Apple just didn't have it working stably enough to release to the public.

The current OS X OpenGL implementation with multi-GPU is a mess, especially if you want to use two or more GPUs with one monitor and run software that uses both OpenGL and OpenCL. Apple needs a Grand Central Dispatch for GPUs (GCD itself wasn't released until OS X 10.6, and only for CPUs), and I believe this is one of the goals of Metal... hopefully in the next release.
 
Last edited:

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
The current OS X OpenGL implementation with multi-GPU is a mess, especially if you want to use two or more GPUs with one monitor and run software that uses both OpenGL and OpenCL. Apple needs a Grand Central Dispatch for GPUs (GCD itself wasn't released until OS X 10.6, and only for CPUs), and I believe this is one of the goals of Metal... hopefully in the next release.

The current OS X OpenGL support for multiple GPUs isn't really a mess. Like I said, multi-GPU is complicated on any platform. DirectX 12 makes it easier, so the most you could say is that things aren't as easy on OS X. But calling it a mess is really overdramatic. You could write a game that uses multiple GPUs in OS X right now if you wanted to; it just takes a bit of work, and there aren't enough multiple-GPU Macs out there for anyone to care.

Apple has been doing multiple GPU rendering demos on OS X since 2010 with OpenGL:
https://developer.apple.com/videos/play/wwdc2010/422/
( ^ I posted the wrong link before, this is the right one)

But again, even though it's possible, no one cares.
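(For illustration only: the session above is OpenGL-based, but a minimal modern sketch of "the OS does see every GPU, the work is on the app" could use Metal, which ships with OS X El Capitan. MTLCopyAllDevices() returns every installed GPU, display attached or not; the listing below is just a hypothetical example.)

```swift
import Metal

// Minimal sketch: list every GPU Metal can see on a Mac.
// MTLCopyAllDevices() returns all installed GPUs on OS X,
// whether or not a display is attached to them.
for device in MTLCopyAllDevices() {
    // isLowPower is true for integrated GPUs; isHeadless is true for a GPU
    // with no display connected (e.g. a compute-only card).
    print("\(device.name)  lowPower=\(device.isLowPower)  headless=\(device.isHeadless)")
}
```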
 
Last edited:

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
The current OS X OpenGL support for multiple GPUs isn't really a mess. Like I said, multi-GPU is complicated on any platform. DirectX 12 makes it easier, so the most you could say is that things aren't as easy on OS X. But calling it a mess is really overdramatic. You could write a game that uses multiple GPUs in OS X right now if you wanted to; it just takes a bit of work, and there aren't enough multiple-GPU Macs out there for anyone to care.

Apple has been doing multiple GPU rendering demos on OS X since 2010 with OpenGL:
https://developer.apple.com/videos/play/wwdc2010/422/
( ^ I posted the wrong link before, this is the right one)

But again, even though it's possible, no one cares.
Thanks for the link. Yes, my wording was somewhat strong.

You're right, the basics are there. If the real world is as nice as demoed in this video, all apps should almost automatically take advantage of multiple GPUs and it just works... even hot-plugging a GPU is supported, and different brands together. But, as far as I have understood, in real-world use these have not worked correctly: latency is too high, the APIs are broken, etc., and that is why they're not used. The second reason has been CUDA, because it gave easier access for building multi-GPU software. Is that why Apple sacked it? To make developers support multi-GPU through OS X's APIs and not CUDA?

Anyway, OS X would greatly benefit from Grand Central Dispatch-style technology for GPUs to ease developers' task of building software that automatically works with one or more GPUs. All graphics and GPU compute is going to happen through Metal in the future.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
But, as far as I have understood, in real-world use these have not worked correctly: latency is too high, the APIs are broken, etc., and that is why they're not used.

I don't know about broken APIs. It seems like a pretty important API for driving the system.

Latency would have been an issue maybe back in 2010. Crossfire and SLI bridges would have bypassed the latency, but they only work on Windows. On newer systems it shouldn't be an issue anymore, which is why AMD and Nvidia are dropping the bridges.

I think it really comes down to developers not caring. The Mac Pro isn't enough of the market, and it's too much extra work to do for such a small slice of the market. Aspyr announced they were working on multi GPU support for Civ 5 on the Mac Pro, but they dropped it because they realized almost none of their customers had Mac Pros. Especially nMPs.

The second reason has been CUDA, because it gave easier access for building multi-GPU software. Is that why Apple sacked it? To make developers support multi-GPU through OS X's APIs and not CUDA?

I don't think CUDA has any special multi-GPU APIs beyond what all the other compute libraries have. I don't think there is anything automatic in CUDA.

Anyway, OS X would greatly benefit from Grand Central Dispatch-style technology for GPUs to ease developers' task of building software that automatically works with one or more GPUs. All graphics and GPU compute is going to happen through Metal in the future.

No one on the market has an automatic solution. There are easier solutions, but there will never be anything completely automatic. It's kind of like wishing for a solution that will automatically make all your software run over all 32 cores on a processor. It's not really going to happen. Crossfire and SLI aren't automatic either, and they require the program to support them and provide a profile.

Apple could make some changes to Metal to make it easier, but I really don't think it's high on their priority list. You're talking about something that would only work on the Mac Pro, and would really only be specific to gamers. Again, not very high on the priority list. It's not impossible that Apple would work on making multi-GPU rendering easier, but they have a lot of other, probably more important, things to work on.
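(To make the "it takes a bit of work / nothing is automatic" point concrete, here is a rough, hypothetical sketch of what manual multi-GPU dispatch can look like with Metal: the app creates a command queue per device, splits the workload itself, and waits for all of the pieces. encodeWork is a placeholder for the app's own encoding code, not a real Metal API.)

```swift
import Metal
import Dispatch

// Rough sketch of manual multi-GPU dispatch: one command queue per device,
// with the app splitting the work and synchronizing the results itself.
// encodeWork(commandBuffer, sliceIndex, sliceCount) stands in for whatever
// compute/render encoding the app actually does.
func renderAcrossAllGPUs(encodeWork: (MTLCommandBuffer, Int, Int) -> Void) {
    let devices = MTLCopyAllDevices()
    let group = DispatchGroup()

    for (index, device) in devices.enumerated() {
        guard let queue = device.makeCommandQueue(),
              let commandBuffer = queue.makeCommandBuffer() else { continue }
        group.enter()
        // Each GPU gets its own slice of the frame/workload.
        encodeWork(commandBuffer, index, devices.count)
        commandBuffer.addCompletedHandler { _ in group.leave() }
        commandBuffer.commit()
    }

    // The app, not the OS, is responsible for waiting and for
    // compositing the partial results afterwards.
    group.wait()
}
```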
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
AMD said in their Capsaicin presentation that in the near future, two or more GPUs will become more and more common and mainstream... maybe Apple will share their view? That's why I was thinking that they have to help and motivate developers in this matter. I didn't find any tools for this in MetalKit.

I've been playing with an idea of how to simultaneously utilize the iGPU on machines where there is also a dGPU. According to the video you posted, it should just work. I don't know how the thermals have been designed for that though... so much potential in a lot of Macs is unused.
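(On the iGPU-plus-dGPU idea: as a hypothetical sketch, Metal at least lets an app pick out the integrated GPU explicitly via isLowPower and hand it background work on its own queue; whether the thermals and scheduling make that worthwhile in practice is exactly the open question.)

```swift
import Metal

// Sketch: route background compute to the integrated GPU (isLowPower == true)
// while the discrete GPU keeps driving the display.
let devices = MTLCopyAllDevices()

if let igpu = devices.first(where: { $0.isLowPower }),
   let queue = igpu.makeCommandQueue() {
    // Long-running compute work would be encoded onto command buffers
    // from this queue, leaving the discrete GPU free for drawing.
    print("Offloading background compute to \(igpu.name)")
    _ = queue
}
```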

The following tests were run with CUDA, and there is some benefit with two GPUs... but they're CUDA only. You can also find our old friend mentioned there: http://barefeats.com/gpu680v6.html

Oculus Rift reported that even a Mac Pro with D700s cannot meet their minimum specs. It is weird, because their minimum spec is an R9 290, and dual D700s can produce about 50% more compute power than a 290. Are the programmers at Oculus Rift really that bad, that they cannot use Apple's simple guidelines for dual-GPU programming? Or is the reality of using multi-GPU in OS X worse than Apple claims? I see it purely as a lack of decent support for developing for multiple GPUs.

http://www.shacknews.com/article/93...upport-if-apple-ever-releases-a-good-computer
 
Last edited:

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
What AMD touted is that in the future there will be multiple GPU dies on the same interposer, acting as a single GPU. The ability to use multiple GPU dies, even if they are connected through the motherboard rather than directly through an interposer, is crucial for future computing, and right now nobody is denying it.
 

Mago

macrumors 68030
Aug 16, 2011
2,789
912
Beyond the Thunderdome
AMD said in their Capsaicin presentation that in the near future, two or more GPUs will become more and more common and mainstream... maybe Apple will share their view? That's why I was thinking that they have to help and motivate developers in this matter. I didn't find any tools for this in MetalKit.

I've been playing with an idea of how to simultaneously utilize the iGPU on machines where there is also a dGPU. According to the video you posted, it should just work. I don't know how the thermals have been designed for that though... so much potential in a lot of Macs is unused.

The following tests were run with CUDA, and there is some benefit with two GPUs... but they're CUDA only. You can also find our old friend mentioned there: http://barefeats.com/gpu680v6.html

Oculus Rift reported that even a Mac Pro with D700s cannot meet their minimum specs. It is weird, because their minimum spec is an R9 290, and dual D700s can produce about 50% more compute power than a 290. Are the programmers at Oculus Rift really that bad, that they cannot use Apple's simple guidelines for dual-GPU programming? Or is the reality of using multi-GPU in OS X worse than Apple claims? I see it purely as a lack of decent support for developing for multiple GPUs.

http://www.shacknews.com/article/93...upport-if-apple-ever-releases-a-good-computer
Oculus and its partners want to build a new PC market. Many PCs actually meet the requirements for VR, but they don't want that; they want consumers to believe their machines are obsolete and need an upgrade to enable the wonderful world that VR promises. That's hype manipulation.

It's like saying a PC isn't qualified to play Crysis just because it can't do dual-display HD (or qHD) at full settings; that's the real extreme requirement for Oculus.

VR actually requires a bit less than 2x the GPU power needed for the same game at the same settings on a single display.

This reminds me of when MS launched Windows Vista. It was a market flop since it had a deliberate handicap that made it run slower and require a much more powerful PC just for a few visual effects; people soon learned their systems were downgraded by this "upgrade", and you know the rest of the story.

I will soon order a PS VR just to bite on the hype. From the reviews I have a mixed impression: some people report nausea in some games (maybe the same people who report nausea at sea or when flying); all previous 3D systems had reports of this syndrome. If VR tech can't overcome this, it's terrible news for Oculus and OSVR, and also for Cardboard. It's a thing to be concerned about.
 

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
Broadcom is abandoning the WiFi business it seems.
Apple uses their WiFi chips, as well as GbE, so what will happen here?
Maybe they'll finally start using an all-round Intel networking solution.
That would be great.
Imagination also seems to be in trouble. Apple's got a stake in it; let's hope the iPhone's integrated graphics core will keep showing its claws. Laying off people and having no R&D money (might not be the case though) could make it stall.
There's a rumor that AMD might provide some IP to Intel in the graphics department. How about that?
Also, I could see Apple turning to AMD for the graphics core, if Imagination doesn't make it.
That would make Apple all AMD, which is dangerous for them.
Correction: it seems Imagination will even reinforce the PowerVR team after all.
I would like MIPS to come back in full force though.
 

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
[Attached image: stats.png]

Developers using OS X...
 

tuxon86

macrumors 65816
May 22, 2012
1,321
477
So that's a reflection of what people on Stack Overflow are using to code, not coders in general.

While OS X has not surpassed the combined user base of Windows XP, Vista, 7, 8, and 10 which hold down a total 53.2 percent of active coders, surveyors StackOverflow note that if trends continue, next year, Windows will claim less than 50 percent.
 