
Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
You're not getting solid 5K gaming with any card in this generation's iMac, particularly when even current top-of-the-line enthusiast cards can't do it.

Polaris 10 with adaptive sync will do just that. It fits perfectly into the iMac's 125W TDP class while giving Fiji-level performance. With adaptive sync, even 43 fps looks smooth.

I can play StarCraft 2 on my MBP's 750M at the full Retina 2880x1800 resolution at playable rates with medium settings. In some tests the Radeon Fury is six times faster than the 750M, and a 5K screen has less than three times the pixels of the 15" rMBP. StarCraft 2 will run at 5K without a hiccup, with better settings. Of course there are games that won't run butter smooth at ultra settings in 5K, but do you really think Apple is going to demo those? The list I made is the Apple marketing version, with a dose of sarcasm.
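The pixel arithmetic behind the "less than three times the pixels" claim checks out; a quick back-of-the-envelope sketch (using the standard 5K iMac and 15" rMBP panel resolutions):

```python
# Pixel-count comparison: 5K iMac panel vs. 15" Retina MacBook Pro panel.
imac_5k = 5120 * 2880   # 14,745,600 pixels
rmbp_15 = 2880 * 1800   # 5,184,000 pixels
ratio = imac_5k / rmbp_15
print(f"5K has {ratio:.2f}x the pixels of the 15-inch rMBP")  # ~2.84x
```

So a card roughly three times faster than a 750M should, all else being equal, hold the same frame rates at 5K; all else is rarely equal, of course, since settings and memory bandwidth also move.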

And don't forget that with Metal, and especially with the complete version, graphics engines will boost their performance even on older HW.
 
Last edited:

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
Broadwell-EP 1600 v4 is MIA. According to the roadmap, I believe it should have been out at the same time as the 2600.
Still, there are 2600s from 4 cores up; Apple could use those. Base clocks are low, though.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
Personally I see no value in TrueAudio, and I don't think it has a place in OS X/iOS. Core Audio and Audio Units already cover most of that; it's another proprietary format and it relies on a DSP. To me it just screams 'stillborn'.

There's already a DSP integrated into Apple's ARM Ax chips, used for instance for noise cancelling. It's actually called an ISP, because Apple uses its power for image processing too. Without it, taking pictures with an iPhone would be very slow and face detection wouldn't work; the ARM cores are too slow for that. If Apple goes into virtual reality, it will need all the signal-processing power it can get to create AR that works.

Specialized multiprocessor systems à la Commodore Amiga (with a bundle of co-processors) are coming back. For the past five years x86 CPUs have mainly evolved in power efficiency and core count, and there's going to be an end to that. (OK, you can add more cores, but 20 cores are already pretty useless in general use. There are specialized cases, of course...)

Just look at the iPhone/iPad. A lot of things are integrated into one chip: two or more CPU cores, two or more GPU cores, the ISP (DSP), the M9... The CPU becomes more like a police officer on a congested street. I wouldn't be surprised if we soon have a desktop running two OSes on two different CPUs in one computer simultaneously, not just for security but for many other services. With a very low-specced CPU you can do miracles when there's a group of specialized co-processors around.

My guess is that THIS is just what Apple is doing with AMD at the moment. There's going to be x86-64 and ARM integrated on the same chip, with GPU, GPGPU, M9, DSP, eDRAM and HBM2. It will run two OS variants in one machine so cleverly that we won't even notice there are two computers running in one box.
 
Last edited:

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
Not that there's a good chance of the other predictions coming true, but you really lost me on this one.
It had a dose of sarcasm in it. Apple promised similar things when Metal for OS X was announced: "these companies are going to support it", and they still aren't.

And 8K and DP 1.4 are both already existing standards, so nothing new there, really...
New file system? You'd hope, but I doubt that is high on Apple's priority list.

If you look at what I wrote above, a new filesystem is mandatory to converge two OSes running on the same chip. Maybe only one of the CPUs has access to the file system, and data between them is shared via a RAM disk? HSA is, in any case, one method to accomplish this.

And if OS X becomes macOS 11, a new filesystem would be one way to make it a new OS. Apple has gained attention lately for their stance on privacy. Maybe there's going to be a co-processor specialized for that: a chip that encrypts everything on the fly, without the CPU having access. That would need a new, or at least an improved, file system too.
 
Last edited:

Roykor

macrumors 6502
Oct 22, 2013
292
315
Polaris 10 with adaptive sync will do just that. It fits perfectly into the iMac's 125W TDP class while giving Fiji-level performance. With adaptive sync, even 43 fps looks smooth.

I can play StarCraft 2 on my MBP's 750M at the full Retina 2880x1800 resolution at playable rates with medium settings. In some tests the Radeon Fury is six times faster than the 750M, and a 5K screen has less than three times the pixels of the 15" rMBP. StarCraft 2 will run at 5K without a hiccup, with better settings. Of course there are games that won't run butter smooth at ultra settings in 5K, but do you really think Apple is going to demo those? The list I made is the Apple marketing version, with a dose of sarcasm.

And don't forget that with Metal, and especially with the complete version, graphics engines will boost their performance even on older HW.

So you think the big game developers are going to release their top games on OS X? You'll still need a Windows partition to play the best games. I don't see Metal becoming a game changer for game developers focused on DirectX.

I have seen a couple of very good videos, backed up with data from serious gamers, examining whether 4K gaming is around the corner and whether it actually works. 45 fps is not good; you want higher frame rates simply because of frame drops. You don't want to see stutter in a very active moment, right?

Don't get me wrong, a nice video card in an iMac would be more than welcome. I'm also interested in the heat problems that come with it, something iMacs have always had to deal with because of the all-in-one design.
 

filmak

macrumors 65816
Jun 21, 2012
1,418
777
between earth and heaven
My guess is that THIS is just what Apple is doing with AMD at the moment. There's going to be x86-64 and ARM integrated on the same chip, with GPU, GPGPU, M9, DSP, eDRAM and HBM2. It will run two OS variants in one machine so cleverly that we won't even notice there are two computers running in one box.

Very nice thoughts. Nice to remember Amiga...

Can you please tell me what the benefit or practical use case of running two OSes would be?
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Can you explain how this works? I can't find this info and haven't heard it before.



It would be nice to see Nvidia back in the picture, or at least to have a choice. But what about their OpenCL-only vision?

I personally don't think Nvidia is in any trouble. If you're building a good (gaming) machine, it's an Nvidia card that fills the case. Check out any new builds on YouTube or any recommended builds on the internet: it's Nvidia all over. They are also working on chips to build into cars.
Search for AMD HIP and the Boltzmann Initiative. Also, Otoy ported the Octane renderer so that its CUDA code can work on any platform and any hardware. The HPC market is starting to use the HIP compiler to port CUDA software to AMD hardware. It's slowly starting to grow. AMD also launched the S9300 x2 GPU, based on dual Fiji, and they've already won a pretty big contract: http://www.amd.com/en-us/press-releases/Pages/firepro-s9300-x2-gpu-2016mar31.aspx
AMD said:
Canadian Hydrogen Intensity Mapping Experiment (CHIME) will harness the AMD FirePro™ S9300 x2 Server GPU, the world’s fastest single-precision GPU accelerator, to analyze extraordinary amounts of data to help create a new, very detailed 3D map of the largest volume of the Universe ever observed.
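For anyone wondering what "porting CUDA with HIP" actually means in practice: the hipify tools mechanically rename CUDA runtime calls to their HIP equivalents, after which the same source can be compiled for AMD or Nvidia hardware. The toy sketch below mimics that idea with a tiny, illustrative subset of the real API mapping; it is not the actual tool:

```python
# Minimal sketch of hipify-style translation: CUDA runtime names are
# textually rewritten to their HIP counterparts (a small subset only).
CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaFree": "hipFree",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

def hipify(source: str) -> str:
    # Naive string replacement, in the spirit of hipify-perl.
    for cuda_name, hip_name in CUDA_TO_HIP.items():
        source = source.replace(cuda_name, hip_name)
    return source

print(hipify("cudaMalloc(&ptr, n); cudaDeviceSynchronize();"))
# hipMalloc(&ptr, n); hipDeviceSynchronize();
```

The real hipify-perl/hipify-clang tools handle kernels, launch syntax and hundreds of API names, but the principle is this simple, which is why most CUDA codebases port with little manual work.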

Every site currently advises buying the R9 390 because it has the best performance-to-price ratio on the market. And if you look at sites that test GPUs across a wider range of games, not just one, you'll see that, comparing reference models, it is AMD that is currently winning every performance bracket.

In 4K the Fury X is faster than the Titan X, but at a ratio of 8:7, so in all fairness I would call it a draw. At 1440p the Fury X is currently faster than the GTX 980 Ti, and the Fury ties with it. At the same resolution the R9 390X is faster than the GTX 980 in 13 of the 15 games in the TechPowerUp suite (comparing reference models), and the R9 390 ties with it. There is no point comparing the GTX 970 to the R9 390. At 1080p the R9 380 is faster than the GTX 960 in 12 of 15 games, and the R9 380X has no real competitor, starting to tie with the GTX 970 in some games.

And then there's the DX12 fiasco for Nvidia, which has come to light recently. The R9 390X is as fast in DX12 titles as the GTX 980 Ti. It has to be, because it has similar compute power, and that is the factor that will determine the performance of future GPUs in the gaming market. Also, I've posted recently about the changes MS is making to HLSL and the DX12 initiative: it looks like they don't want to stop developing the API, and parts of its functionality, like HLSL, they are making open source.

Nvidia has a lot to be afraid of. The only things that make Nvidia GPUs better at this moment are CUDA and efficiency. If you want raw power, you have to look elsewhere.

I have written all of this over and over on these forums. The mindshare Nvidia enjoys is beyond belief: people believe that a 4 TFLOPs GPU from Nvidia is magically more powerful than a 4 TFLOPs GPU from any other vendor.
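The TFLOPs point is just arithmetic: peak FP32 throughput is shader count × 2 ops per clock (one fused multiply-add) × clock speed, regardless of vendor. Using the published shader counts and nominal clocks for the R9 390X and GTX 980 Ti (figures from memory, so treat them as approximate):

```python
# Peak FP32 throughput: shaders x 2 ops per clock (one FMA).
# Shader counts / clocks are nominal published figures; retail cards vary.
def fp32_tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(f"R9 390X:    {fp32_tflops(2816, 1050):.2f} TFLOPs")  # ~5.91
print(f"GTX 980 Ti: {fp32_tflops(2816, 1000):.2f} TFLOPs")  # ~5.63
```

Which is the "similar compute power" argument above in numbers: on paper the two cards are within a few percent of each other, and any real-world gap comes from drivers, API overhead and architecture, not from the raw FLOPs.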

And if anyone wants to ask about gaming on OS X: there is strong evidence that Mass Effect: Andromeda (Mass Effect 4) is coming to OS X. I have posted a thread about it on this forum.
 

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
So now only Polaris is missing for the nMP to become a reality.
The GPU options are still somewhat blurry. I also wouldn't expect Fiji in the nMP, because of its DP and memory limits.
Polaris seems the right choice. But will it be Polaris 11 in the D310 and Polaris 10 in the D510/D710? Or will we have to wait for Vega 10 for the high-end models next year?
Polaris + Vega would make sense.
Still, both GDDR5X (at high data rates) and HBM2 consume somewhat more power, which will have to come out of a total power budget that is already tight.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
Very nice thoughts. Nice to remember Amiga...

Can you please inform me what is the benefit or practical case of running two OS ?
Case a) The ARM could handle disk I/O, en-/decryption, USB, the DSP/ISP and sleep-mode activity, running only in kernel mode. The x86-64 could run apps and the UI and act as bandmaster of the GPGPU. It would be possible to switch the roles according to need, and you could reboot either one without crashing the other.
Case b) Integrating the iOS and macOS worlds. Connect an iPad Pro to a Mac Pro with TB and they could start to take on each other's roles. You could have a split screen where you share documents from both devices as if they were one. Imagine a Mac Mini that works as a dock for an iPad Pro. This would also make a new kind of hybrid laptop possible.
So now only Polaris is missing for the nMP to become a reality.
The GPU options are still somewhat blurry. I also wouldn't expect Fiji in the nMP, because of its DP and memory limits.
Polaris seems the right choice. But will it be Polaris 11 in the D310 and Polaris 10 in the D510/D710? Or will we have to wait for Vega 10 for the high-end models next year?
Polaris + Vega would make sense.
Still, both GDDR5X (at high data rates) and HBM2 consume somewhat more power, which will have to come out of a total power budget that is already tight.
It could be that we'll see dual 2GB Tonga as the starter GPU set, because not everyone needs a strong GPGPU. For the better models there could be dual Polaris Pro 4GB and dual Polaris XT 8GB.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
The thinking in the industry is that Nvidia's long-term position is not good without some sort of x86 core (or maybe if ARM really takes off). More and more computers are shifting to integrated GPUs, as are consoles. As it stands, Nvidia has no strategy to get back into the console market. Without a CPU/GPU in a single package they're locked out of those industries.

They're pushing hard into HPC because it's really all they've got left where they're not actively threatened. Even HPC may eventually fall to single-package solutions, but Nvidia will use a CUDA lock-in to stall that.
Tell me, my friend: if any market sees that CUDA stalls it rather than making its work better, will it stick with CUDA or go somewhere else?
What if we are witnessing the birth of an API that works as well as CUDA and is open source? Will they stick to CUDA?

The industry is moving in the direction of open-source solutions.
So now only Polaris is missing for the nMP to become a reality.
The GPU options are still somewhat blurry. I also wouldn't expect Fiji in the nMP, because of its DP and memory limits.
Polaris seems the right choice. But will it be Polaris 11 in the D310 and Polaris 10 in the D510/D710? Or will we have to wait for Vega 10 for the high-end models next year?
Polaris + Vega would make sense.
Still, both GDDR5X (at high data rates) and HBM2 consume somewhat more power, which will have to come out of a total power budget that is already tight.
Polaris has a 1/3 DP ratio. At least, that's what can be seen on the engineering sample of the Ellesmere GPU.

Broadwell-EP still not on Intel ARK?
Nope :p.
 

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
I wouldn't bet on Tonga now, though it's possible of course.
And going with 2GB again would be a step back.

koyoot, let's hope the info on Polaris is accurate.
About ARK, that was actually a rhetorical question; I checked first. :)
 

--AG--

macrumors member
Dec 20, 2012
36
14
OpenCL may be open, but my own coding attempts with it have been nothing but a struggle. In contrast, my CUDA projects have worked just fine. I guess my opinion is not objective, and I am by no means an expert, but my conclusion is the same as with other open-source projects: typically, commercial projects/software just work better.
 

fuchsdh

macrumors 68020
Jun 19, 2014
2,028
1,831
Polaris 10 with adaptive sync will do just that. It fits perfectly into the iMac's 125W TDP class while giving Fiji-level performance. With adaptive sync, even 43 fps looks smooth.

I can play StarCraft 2 on my MBP's 750M at the full Retina 2880x1800 resolution at playable rates with medium settings. In some tests the Radeon Fury is six times faster than the 750M, and a 5K screen has less than three times the pixels of the 15" rMBP. StarCraft 2 will run at 5K without a hiccup, with better settings. Of course there are games that won't run butter smooth at ultra settings in 5K, but do you really think Apple is going to demo those? The list I made is the Apple marketing version, with a dose of sarcasm.

And don't forget that with Metal, and especially with the complete version, graphics engines will boost their performance even on older HW.

StarCraft II is hardly the most graphically demanding game (I was playing it just fine on a pre-unibody MBP), and frame rates consistently lower than 60fps are a non-starter for a lot of people.

No doubt the tech is advancing fast, and I don't begrudge Apple or anyone else for not worrying about native-res gaming when putting out these panels, as they have better uses for the space (and gaming is focused on niche stuff like 120/144Hz displays that are in turn irrelevant to most users). But I'll buy into native-res iMac gaming when BareFeats shows me it's possible. Right now the iMacs have decent cards for 1080p/1440p gaming, but that's about it.

http://barefeats.com/imac5k2.html
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
http://www.bitsandchips.it/images/2016/03/31/zhash1.png
http://www.bitsandchips.it/images/2016/03/31/zmem1.png
http://www.bitsandchips.it/images/2016/03/31/zraytrace1.png

Looks like there's a lot going on in the industry after a few days of complete silence ;).

I copied the only 100% non-April-Fools item from that site. Well, not quite: the only other thing that is not an AF joke is that Zen will have Broadwell-E levels of performance/efficiency in a better power envelope (this CPU is supposed to have a 95W TDP, compared to 140W).
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
StarCraft II is hardly the most graphically demanding game (I was playing it just fine on a pre-unibody MBP), and frame rates consistently lower than 60fps are a non-starter for a lot of people.

Our eyes start to perceive animation/movement as fluid from about 43Hz/fps upward. The problem before was tearing on a 60Hz panel running at a lower or higher fps, or vsync dropping frames. Adaptive sync will fix that, IF a game can keep its framerate above 43fps. For 3D it has to be at least double, depending on how the 3D glasses work (passive or active).

VR, on the other hand, needs two screens running at a minimum of 43fps with adaptive sync.
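To put these rates in frame-time terms (the 43 fps threshold is the poster's own claim, not an established figure), the per-frame budget is simply 1000 ms divided by the frame rate; adaptive sync displays a frame when it's ready instead of waiting for the next fixed 60 Hz refresh tick:

```python
# Per-frame time budget at the frame rates discussed in this thread.
for fps in (43, 48, 60, 90):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```

The gap between 43 fps (about 23 ms) and 60 fps (about 17 ms) is what the rest of the thread is arguing over: whether ~6 ms per frame is noticeable once tearing is removed.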
 

Stacc

macrumors 6502a
Jun 22, 2005
888
353
I copied the only 100% non-April-Fools item from that site. Well, not quite: the only other thing that is not an AF joke is that Zen will have Broadwell-E levels of performance/efficiency in a better power envelope (this CPU is supposed to have a 95W TDP, compared to 140W).

First, these results are most likely an April Fools joke. Second, even if they are true, the Zen part still isn't competing with Haswell-E and Skylake. I, for one, am skeptical that AMD can catch up to Intel.
 

deppest

macrumors member
Oct 6, 2009
69
8
Our eyes start to perceive animation/movement as fluid from about 43Hz/fps upward. The problem before was tearing on a 60Hz panel running at a lower or higher fps, or vsync dropping frames. Adaptive sync will fix that, IF a game can keep its framerate above 43fps. For 3D it has to be at least double, depending on how the 3D glasses work (passive or active).

VR, on the other hand, needs two screens running at a minimum of 43fps with adaptive sync.

Flicker-fusion frequency is highly variable between individuals and depends heavily on the display device and the content being displayed (e.g. dark/light, moving/still). Where do these 43 fps come from?
 

zephonic

macrumors 65816
Feb 7, 2011
1,314
709
greater L.A. area
There's already a DSP integrated into Apple's ARM Ax chips, used for instance for noise cancelling. It's actually called an ISP, because Apple uses its power for image processing too. Without it, taking pictures with an iPhone would be very slow and face detection wouldn't work; the ARM cores are too slow for that. If Apple goes into virtual reality, it will need all the signal-processing power it can get to create AR that works.

Specialized multiprocessor systems à la Commodore Amiga (with a bundle of co-processors) are coming back. For the past five years x86 CPUs have mainly evolved in power efficiency and core count, and there's going to be an end to that. (OK, you can add more cores, but 20 cores are already pretty useless in general use. There are specialized cases, of course...)

Just look at the iPhone/iPad. A lot of things are integrated into one chip: two or more CPU cores, two or more GPU cores, the ISP (DSP), the M9... The CPU becomes more like a police officer on a congested street. I wouldn't be surprised if we soon have a desktop running two OSes on two different CPUs in one computer simultaneously, not just for security but for many other services. With a very low-specced CPU you can do miracles when there's a group of specialized co-processors around.

My guess is that THIS is just what Apple is doing with AMD at the moment. There's going to be x86-64 and ARM integrated on the same chip, with GPU, GPGPU, M9, DSP, eDRAM and HBM2. It will run two OS variants in one machine so cleverly that we won't even notice there are two computers running in one box.


I am specifically talking about TrueAudio and its DSP. I don't see it adding value, as Apple already has most of that covered, and in pro audio we have seen a shift away from dedicated DSP rather than towards it.

I've only read AMD's promo page, and while it may be a boon for game developers (primarily on the Windows side), I don't see any use cases beyond that. I'm inclined to view it primarily as a marketing attempt to entice gamers away from Nvidia. If enough games support it, it may eventually become a thing, I guess. But I doubt game development is a big factor in Apple's OS design decisions.

And finally, while things like the motion co-processor or facial recognition may require more power than ARM can throw at them, audio processing demands are modest by comparison. Convolution reverb can get pretty CPU-intensive, but it's nowhere close to graphics demands.
Companies like Avid and UA continue to sell hardware DSP, but in reality those boxes are little more than dongles for their software, and in some cases they actually impede performance.

I don't know about the return of the co-processor or of dedicated DSP. I'm just a layman on the outside looking in, but it seems to me the industry is actually moving in the opposite direction. Even the future for dGPUs doesn't look very bright to me at the moment.
 

fuchsdh

macrumors 68020
Jun 19, 2014
2,028
1,831
Our eyes start to perceive animation/movement as fluid from about 43Hz/fps upward. The problem before was tearing on a 60Hz panel running at a lower or higher fps, or vsync dropping frames. Adaptive sync will fix that, IF a game can keep its framerate above 43fps. For 3D it has to be at least double, depending on how the 3D glasses work (passive or active).

VR, on the other hand, needs two screens running at a minimum of 43fps with adaptive sync.
Flicker-fusion frequency is highly variable between individuals and depends heavily on the display device and the content being displayed (e.g. dark/light, moving/still). Where do these 43 fps come from?

Yeah, I'm going to need the receipts as well. I can create an animation with no tearing, play it back, and tell 48 versus 60 frames apart.
 

Stacc

macrumors 6502a
Jun 22, 2005
888
353
Yeah, I'm going to need the receipts as well. I can create an animation with no tearing, play it back, and tell 48 versus 60 frames apart.

This argument over which frame rates are observable is silly. 60 FPS is the gold standard for PC gaming, and arguably even that is now out of date, given that we have 120+ Hz monitors and VR that demands 75+ Hz. Certainly you can tolerate frame rates lower than 60Hz, but it's not ideal. We have tolerated 24Hz in films, but remember, we aren't interacting with a film.

No shipping video card can consistently play the most graphically intense games at 4K/60 FPS, let alone 5K. Reliably getting 4K@60Hz in every game requires dual video cards like the Nvidia GTX 980 Ti or AMD Fury. Certainly you can play older and less demanding games (like StarCraft 2) with less GPU power. Adaptive sync can alleviate some of the tearing you get when you drop below 60Hz, but it's no reason not to target an average of 60 FPS.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
I'm just curious: do any of you talking about framerates actually game on your machines?

I play Hearthstone and Heroes of the Storm competitively every day, and in the future I'll also be playing Overwatch. Framerate is not my concern when I'm focused on tactics and doing the right thing at the right time.
 