
tuxon86

macrumors 65816
May 22, 2012
1,321
477
That's not properly VR, it's telepresence, and stereo vision is optional there and depends more on bandwidth than on graphics processing; nor do you record the entire 360° scene. I'd call it stereo-vision telepresence instead.

You don't get to define what VR is or isn't either, buddy...

In fact you are way off base when it comes to VR; maybe finish school and study it a bit before trying to define what VR is or isn't.

Like I said, we are using it today and you're not, so find a new hobby; this one is starting to get boring.
 
  • Like
Reactions: ssgbryan

Mago

macrumors 68030
Aug 16, 2011
2,789
912
Beyond the Thunderdome
You don't get to define what VR is or isn't either, buddy...

In fact you are way off base when it comes to VR; maybe finish school and study it a bit before trying to define what VR is or isn't.

Like I said, we are using it today and you're not, so find a new hobby; this one is starting to get boring.
Yes sir, you are an authority...
 

Derpage

Suspended
Mar 7, 2012
451
194
Whether VR is really the future or not isn't that relevant. VR capability is a practical benchmark. Its high requirements give you an idea of what sort of technologies can be driven by the hardware you are interested in. Arguing about whether this is a craze or the future of computing just obfuscates the reality that a closed system limits your ability to adapt to trends that the manufacturer did not anticipate, or, worse yet, built into their sales model.
 

mattspace

macrumors 68040
Jun 5, 2013
3,343
2,975
Australia
Skylake, and Fiji.
Yes, but that's probably not locked into the hardware, and a nnMP or more likely an OS update, or both, could change that.

From the way I've heard it described by engineers, the "all displays are driven by a single card" is actually a pretty fundamental part of the design of the OS X display system, and to change it would more or less require rewriting the entire display system from scratch. While it *does* actually work to varying degrees to have different displays driven by different cards, it's more by accident than intent.
 
  • Like
Reactions: Xteec

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,677
The Peninsula
Skylake, and Fiji.


From the way I've heard it described by engineers, the "all displays are driven by a single card" is actually a pretty fundamental part of the design of the OS X display system, and to change it would more or less require rewriting the entire display system from scratch. While it *does* actually work to varying degrees to have different displays driven by different cards, it's more by accident than intent.
Or run Windows on your MP6,1 - even XP can utilize more than one card for displays. ;)
 

mattspace

macrumors 68040
Jun 5, 2013
3,343
2,975
Australia
I wonder why Apple would engineer OS X that way? To deliberately remove a feature from your operating system that is of benefit to your customers. It seems really odd.

Probably part of the baggage inherited from *NIX/NeXT - it couldn't be changed by the time OS X's architecture was set for release, and it was overtaken by the rise of graphics cards that could drive multiple monitors, so it was never a priority worth the resources necessary to change it.
 
  • Like
Reactions: Nugget and Xteec

flat five

macrumors 603
Feb 6, 2007
5,580
2,657
newyorkcity
3D on a display screen is in fact 2D. You render your scene in 3D but the resulting image displayed on your screen is 2D.
Make this simple test: create a 3D box on your monitor and, without rotating it, try to see what is hidden behind it. With VR I can "move" behind the box and look at what is there. I'm the one moving, not the scene. You can't do that on a monitor.
(not disagreeing etc.. just a launch point)

our eyes only work in 2D.. everything we see is 2D.. we can only see up/down/left/right but can't actually see forward/backward (which is why we can look at a '3D' model or 3D movie on a flat panel and make sense of it as having 3 dimensions.. we're used to interpreting 3 dimensional info through 2 dimensional input.. but that's just our brains trying to make some sort of sense out of things ;) ).. it's things like motion/relative size/light&shadows which our brains use to perceive distance, in a very similar way to how 3D modeling apps work -- but we don't actually see the distance.

in the simple test, you have to physically move behind the 3D box in order to look 'inside' of it with 2 dimensional vision.. with 3D vision, you wouldn't have to move around the box in order to look inside.. this would be us experiencing the 4Dimensional world in a way we've never been able to before (i mean, we've never been able to even recognize dimensions other than x,y,z using the tools we're born with).. virtual reality tech will possibly be a big one in furthering our understanding of reality reality. (ironic or funny that we currently call this stuff 'virtual' reality)

personally, i think VR tech and/or the future tech based off rift etc is going to be offering up much more than what most people can imagine right now.. it's not going to be 'just another way to view content as we know it' (like 3D movies or similar do).. yes, doing a walkthrough of a yet to be constructed structure in a manner that is a more real or humanlike experience will be awesome.. and things like that, i think, could drive the VR market on its own..
but at the same time, things like that are only scratching the surface of what we may experience and learn from technology along these lines.
 
Last edited:

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
From the way I've heard it described by engineers, the "all displays are driven by a single card" is actually a pretty fundamental part of the design of the OS X display system, and to change it would more or less require rewriting the entire display system from scratch. While it *does* actually work to varying degrees to have different displays driven by different cards, it's more by accident than intent.

Mmmmmm no. This isn't really getting at what the problem is.

All operating systems assume a display is only driven by one card. Windows, Mac, Linux, whatever. Even all the hardware assumes it. Unless you're plugging a monitor into two different cards, at the end of the day, only one card is driving it.

Crossfire works by slicing up your output, dividing it up among cards, and then sending all the slices back to the main card powering the display. So if you divide your display in two, one card will draw the top half, the other card will draw the bottom half, and then one card will take the two halves, put them together, and then send them off to the display that the card is connected to. In the past this has been done over the bridge, but now software is fast enough to do it.

There actually isn't a problem here. OS X can totally do partial rendering on one card, and then send it to another card for final display (They've demoed this at WWDC previously). The issue is that it's not automatic. A developer has to write the code by hand to do it. If Apple or AMD wanted to, nothing is stopping them right now from writing Crossfire support. It's just that no one cares.

They also may already have it working for automatic graphics switching. It's likely that when your dGPU turns on in your Macbook Pro, it's just forwarding the rendered 3D output from your discrete GPU to your integrated card.

The other thing to keep in mind is that Crossfire is complicated. Done the wrong way, trying to send resources back and forth between cards can lead to a performance decline. That's why Crossfire and SLI have profiles. DirectX 12 requires that developers be willing to get their hands a little dirty. There is no multi-GPU implementation that is totally automatic and just works with everything.

So can classic MacOS, but not OS X, it seems.

I don't think this is true either. Classic Mac OS barely supported OpenGL, much less multiple cards.

The underlying display system under OS X, while having other issues, doesn't prevent this from going on. Nothing is wrong with OS X here, and it's certainly not a result of some sort of UNIX heritage outdatedness.
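
To make that concrete, here's a rough sketch of the "one card renders part of the frame, the card wired to the display assembles it" idea, written against Metal. This is just an illustration of the manual path a developer can take today, not Apple's or AMD's actual Crossfire code; the resolutions, the choice of devices, and the blocking readback are all simplified assumptions, and error handling is omitted.

Code:
import Metal

let devices = MTLCopyAllDevices()
guard devices.count >= 2 else { fatalError("this sketch wants two GPUs") }
let displayGPU = devices[0]   // pretend this is the card wired to the monitor
let helperGPU  = devices[1]   // the card doing the extra rendering work

let width = 1920
let height = 1080
let halfHeight = height / 2

// Offscreen target on the helper GPU for the bottom half of the frame.
let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm,
                                                    width: width, height: halfHeight,
                                                    mipmapped: false)
desc.usage = [.renderTarget, .shaderRead]
desc.storageMode = .managed          // CPU-visible, so we can read it back
let helperTarget = helperGPU.makeTexture(descriptor: desc)!

// "Render" the bottom half on the helper GPU (a clear pass stands in for
// the real draw calls).
let queue = helperGPU.makeCommandQueue()!
let cmd = queue.makeCommandBuffer()!
let pass = MTLRenderPassDescriptor()
pass.colorAttachments[0].texture = helperTarget
pass.colorAttachments[0].loadAction = .clear
pass.colorAttachments[0].storeAction = .store
pass.colorAttachments[0].clearColor = MTLClearColor(red: 0, green: 0, blue: 1, alpha: 1)
cmd.makeRenderCommandEncoder(descriptor: pass)!.endEncoding()

// Flush the result to CPU-visible memory, then (naively) block on it.
let blit = cmd.makeBlitCommandEncoder()!
blit.synchronize(resource: helperTarget)
blit.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()

// Full-size frame on the display GPU; copy the helper GPU's half into it.
let frameDesc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm,
                                                         width: width, height: height,
                                                         mipmapped: false)
frameDesc.storageMode = .managed
let finalFrame = displayGPU.makeTexture(descriptor: frameDesc)!

let bytesPerRow = width * 4
var pixels = [UInt8](repeating: 0, count: bytesPerRow * halfHeight)
helperTarget.getBytes(&pixels, bytesPerRow: bytesPerRow,
                      from: MTLRegionMake2D(0, 0, width, halfHeight), mipmapLevel: 0)
finalFrame.replace(region: MTLRegionMake2D(0, halfHeight, width, halfHeight),
                   mipmapLevel: 0, withBytes: pixels, bytesPerRow: bytesPerRow)

// The display GPU would now render its own top half into finalFrame and
// present it - that "put the halves together" step is what Crossfire/SLI
// profiles automate for you.

A real renderer would double-buffer instead of blocking, and would pick the display GPU from the actual display rather than just taking devices[0], but the flow - render on one card, synchronize, hand the result to the card driving the screen - is the whole trick.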
 
Last edited:

mattspace

macrumors 68040
Jun 5, 2013
3,343
2,975
Australia
Mmmmmm no. This isn't really getting at what the problem is.

All operating systems assume a display is only driven by one card. Windows, Mac, Linux, whatever. Even all the hardware assumes it. Unless you're plugging a monitor into two different cards, at the end of the day, only one card is driving it.
I don't think this is true either. Classic Mac OS barely supported OpenGL, much less multiple cards.

I think you're reading me backwards. The problem, as I understand it, is that OS X is architecturally not able to reliably have multiple displays driven by more than 1 GPU - in other words, if you had 3 GPUs on an OS X system, it's not designed to allow you to have 1 monitor plugged into each card - what it is architected to do is drive all 3 monitors off of one card, and keep the others for non-display compute.

In scenario 1, you might have 3 GPUs, each pushing pixels for a single 4k monitor.
In scenario 2, 1 GPU has to push all the pixels of 3x 4k monitors, and the other 2 GPUs perhaps sit idle.

For VR this potentially means it's tricky to have a dedicated graphics card driving each eye as a separate display.

In the Classic Mac OS days, when OpenGL wasn't an issue and we're just talking about drawing pixels to the screen, you usually only had 1 monitor per graphics card; you could plug in as many cards as you had NuBus/PCI slots, with a display on each, and that was the way it was designed to work. My understanding is that is not the case for OS X.

I'm happy to be corrected on this, but what I've been told is that scenario 1 may work, but it's not a supported config, and weirdness can be expected.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
I think you're reading me backwards. The problem, as I understand it, is that OS X is architecturally not able to reliably have multiple displays driven by more than 1 GPU - in other words, if you had 3 GPUs on an OS X system, it's not designed to allow you to have 1 monitor plugged into each card - what it is architected to do is drive all 3 monitors off of one card, and keep the others for non-display compute.

In scenario 1, you might have 3 GPUs, each pushing pixels for a single 4k monitor.
In scenario 2, 1 GPU has to push all the pixels of 3x 4k monitors, and the other 2 GPUs perhaps sit idle.

No, that's totally wrong. OS X only supports running a display off of the GPU it's physically attached to. There are several tech notes on it, and it's baked into the APIs.

https://developer.apple.com/library/mac/technotes/tn2229/_index.html

"Multiple GPU support has existed in Mac OS X for a long time"

The only time you get to something close to this is on the Mac Pro, where all six Thunderbolt ports are internally wired to one card. But even on the old Mac Pro Apple supported multiple GPU configurations with each wired to different monitors, and each powering their monitors independently.

In the Classic Mac OS days, when OpenGL wasn't an issue and we're just talking about drawing pixels to the screen, you usually only had 1 monitor per graphics card; you could plug in as many cards as you had NuBus/PCI slots, with a display on each, and that was the way it was designed to work. My understanding is that is not the case for OS X.

Nope. OS X totally supports that.
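
If anyone wants to verify this on their own multi-GPU machine, a few lines of Swift against the public CoreGraphics/Metal APIs will report which GPU is currently driving each attached display (the calls are real; the printed wording is just illustrative):

Code:
import CoreGraphics
import Metal

// Ask CoreGraphics how many active displays there are, then fetch them.
var displayCount: UInt32 = 0
CGGetActiveDisplayList(0, nil, &displayCount)
var displays = [CGDirectDisplayID](repeating: 0, count: Int(displayCount))
CGGetActiveDisplayList(displayCount, &displays, &displayCount)

// For each display, ask which Metal device (GPU) is currently driving it.
for display in displays {
    if let gpu = CGDirectDisplayCopyCurrentMetalDevice(display) {
        print("Display \(display) is driven by: \(gpu.name)")
    } else {
        print("Display \(display): no Metal device reported")
    }
}

On a box with monitors plugged into different cards, you'd expect a different device per display, which is exactly what the tech note describes.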
 

mattspace

macrumors 68040
Jun 5, 2013
3,343
2,975
Australia
Nope. OS X totally supports that.


OK, that's all really interesting and seems to run counter to what I've been hearing of late, which I confess hadn't sounded correct to me, because I'd known Apple had sold multi-GPU cMPs. But having never used one (all my OS X systems back to ~2003 have been PowerBooks or minis), and based on what I've been told lately, I assumed they must have just had all screens running off one card.

thanks for the clarification.
 

Flint Ironstag

macrumors 65816
Dec 1, 2013
1,334
744
Houston, TX USA
I don't pretend to know what's going on under the hood, but from experience with mac pro towers with multiple GPUs, all displays are definitely NOT driven by a single card. For anecdotal evidence, ask the folks who have a GT120 plus a high performance, non-flashed card in their mac pro tower. Leave a monitor plugged into the GT120, and it will provide boot screens. Leave a monitor plugged into the high performance card, and it will happily provide video once the login screen is reached (drivers are loaded).

Individual cards can absolutely output their own video.
 

Fl0r!an

macrumors 6502a
Aug 14, 2007
909
530
Apple just decided to hook up all 6 TB ports + HDMI to the same GPU. I have no idea why they did that, because those old AMD cards are limited to 2 legacy (DVI/HDMI) connections at a time. Using both GPUs would have doubled that number to 4 concurrent legacy displays, and additionally the load would be shared equally.
This wouldn't have required anything but a different routing on the PCB.

It's certainly not a limitation of OS X; you can easily throw multiple GPUs into a Hackintosh and use them to build a huge video wall. OS X just lacks Crossfire/SLI.
 
  • Like
Reactions: ActionableMango

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland

Mago

macrumors 68030
Aug 16, 2011
2,789
912
Beyond the Thunderdome
You guys forget that if Apple wants to go with external GPUs, they have to make multiple-GPU setups work properly and in plug-and-play mode.

It is, however, a bit embarrassing that you have to ask Apple to do something that other platforms have had for years... (multiple displays from multiple GPUs).

P.S. http://www.overclock3d.net/articles/gpu_displays/amd_polaris_10_engineering_sample_pictured/1

Apple has a patent on a Thunderbolt display with a GPU on board; what I mean is that enabling a GPU over Thunderbolt depends on drivers.

Razer just released its external GPU cage (empty, at $500). It only supports a few GPUs for now (on Windows), but at least it is not a proprietary device and should work with any TB3 PC (or Mac?).

Razer's cage seems strangely familiar to me; it looks like a TB2 cage. Maybe they just rebranded it and updated it to TB3, or picked up production before it launched in the OS X environment.
They didn't want to mess with it because Metal was and is coming.

So why did Apple include those lines on the logic boards? If there is no plan to support it, why spend money on that (unless Windows gaming is worth it for Apple)...

As you said, CrossFire (like SLI) is also a nightmare on Windows; maybe Apple just didn't have it working stably enough to release to the public.
 

Bubba Satori

Suspended
Feb 15, 2008
4,726
3,756
B'ham
Apple has a patent on a Thunderbolt display with a GPU on board; what I mean is that enabling a GPU over Thunderbolt depends on drivers.

Razer just released its external GPU cage (empty, at $500). It only supports a few GPUs for now (on Windows), but at least it is not a proprietary device and should work with any TB3 PC (or Mac?).

Razer's cage seems strangely familiar to me; it looks like a TB2 cage. Maybe they just rebranded it and updated it to TB3, or picked up production before it launched in the OS X environment.

So why did Apple include those lines on the logic boards? If there is no plan to support it, why spend money on that (unless Windows gaming is worth it for Apple)...

As you said, CrossFire (like SLI) is also a nightmare on Windows; maybe Apple just didn't have it working stably enough to release to the public.

Yeah, I think that's the case. Maybe with the new generation of cards they'll be more stable in multiple configurations and Apple will have a go at it.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
I know, but I would not simply dismiss this chart. People are, I think, underestimating the jump in nodes between 28nm and 14/16nm. 16nm is supposed to double the performance. So it would be logical to think that the GTX 1080/X80 would have 4096 CUDA cores, for example.

I'm not disagreeing that it may be fake. However, even if it is fake, it can be extremely close to reality, or turn out to be reality (if we discard the X80 Ti - a GP100 chip with GDDR5, even though the Titan based on the same ASIC uses HBM2 ;)). I still think the specs of the X80 and Titan are spot on.
 