Is koyoot in radio silence mode? No response....



I'd love to see the link supporting this claim. Nvidia said almost nothing about consumer Pascal when they launched the P100. The only future plans discussed were that P100 chips in PCIe form-factor cards would come in Q1 CY17 - which they've hit.

It's "alternative facts" to claim that Nvidia blogs last spring talked about the differences between the P100, Telsa and GeForce cards.

https://www.pcper.com/reviews/Editorial/Rumor-Thoughts-NVIDIAs-High-End-Pascal-GPUs
https://devblogs.nvidia.com/parallelforall/inside-pascal/

See the Compute Capability 6.0 table in the second link.

Kepler, Maxwell, and Pascal GP100 make exactly the same amount of SM resources available, but to a different number of cores in each case.

With Kepler it's 192 cores per SM, with Maxwell it's 128, and with Pascal GP100 it's 64. In essence, 64 Pascal cores do the same job as 192 Kepler cores, because they are not starved for resources. This is also why the GTX 980 was faster than the GTX 780 Ti in compute. We see a similar story with the similarly core-counted Titan X and P40: the P40 is faster despite having the same number of cores and a lower core clock.

What about consumer Pascal? Each SM in the GP102, 104, and 106 GPUs has 128 cores, fed by the same amount of resources as the previous version of the Nvidia architecture: exactly the same as Maxwell, just with higher core clocks. This is what I meant when I wrote that the Pascal and Maxwell consumer architectures are no different. If you overclock a GTX 980 Ti to GTX 1080 clock levels, the difference in performance is very small.
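To make the cores-per-SM comparison concrete, here is a minimal sketch using the plain CUDA runtime API. The mapping values mirror the well-known _ConvertSMVer2Cores table from NVIDIA's deviceQuery sample; the helper name coresPerSM is mine, not an NVIDIA API.

```cpp
#include <cstdio>
#include <cuda_runtime.h>

// CUDA cores per SM by compute capability (subset relevant to this thread).
static int coresPerSM(int major, int minor) {
    if (major == 3) return 192;               // Kepler
    if (major == 5) return 128;               // Maxwell
    if (major == 6 && minor == 0) return 64;  // Pascal GP100 (CC 6.0)
    if (major == 6) return 128;               // Pascal GP102/104/106 (CC 6.1)
    return -1;                                // not covered here
}

int main() {
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    printf("%s: CC %d.%d, %d SMs, %d cores/SM\n",
           prop.name, prop.major, prop.minor,
           prop.multiProcessorCount, coresPerSM(prop.major, prop.minor));
    return 0;
}
```

The same per-SM resources divided over 64 cores (GP100) versus 128 (GP104) is exactly the ratio being argued about here.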

How come? Take a GTX 980 Ti, a 2816-CUDA-core GPU, clocked at 1492 MHz core and 2090 MHz memory:

[benchmark screenshot: perf_oc.png]

136 FPS.

And a GTX 1080, reference design, with 320 GB/s and 2560 CUDA cores:

[benchmark screenshot: perf_oc.png]

137 FPS, with higher core clocks.

This is the reality of consumer GPUs. I am dumbfounded that you never knew this was the case. The only new Pascal GPU was the GP100 chip; GP102 and the rest of the consumer GPUs were just Maxwell GPUs on the 16 nm node.

P.S. None of the consumer or professional GPUs made with the GP102, 104, and 106 chips have the new features from the GP100 chip (49-bit memory addressing, unified memory, and fast FP16). Guess why?
 
Are those features actually usable, or are the GPUs just compatible with CC 6.1? ;)

Because there is a difference ;).
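One way to see what a given board actually reports, rather than arguing from its CC label, is to query it directly. A minimal sketch using documented cudaDeviceProp fields from CUDA 8 (which fields a driver sets to 1, and on which chips, is exactly the contested point):

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp p;
    cudaGetDeviceProperties(&p, 0);
    printf("%s: CC %d.%d\n", p.name, p.major, p.minor);
    // Can the device allocate cudaMallocManaged (unified) memory at all?
    printf("managed memory:            %s\n", p.managedMemory ? "yes" : "no");
    // Can CPU and GPU access managed memory concurrently (page faulting)?
    printf("concurrent managed access: %s\n", p.concurrentManagedAccess ? "yes" : "no");
    return 0;
}
```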

It's funny that you are completely unaware of this. If you want links, here's the first that pops up:
http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5
After initially testing FP16 performance with SiSoft Sandra – one of a handful of programs with an FP16 benchmark built against CUDA 7.5 – I reached out to NVIDIA to confirm whether my results were correct, and if they had any further explanation for what I was seeing. NVIDIA was able to confirm my findings, and furthermore that the FP16 instruction rate and throughput rates were different, confirming in a roundabout manner that GTX 1080 was using vec2 packing for FP16.

As it turns out, when it comes to FP16 NVIDIA has made another significant divergence between the HPC-focused GP100, and the consumer-focused GP104. On GP100, these FP16x2 cores are used throughout the GPU as both the GPU's primary FP32 core and primary FP16 core. However on GP104, NVIDIA has retained the old FP32 cores. The FP32 core count as we know it is for these pure FP32 cores. What isn't seen in NVIDIA's published core counts is that the company has built in the FP16x2 cores separately.

To get right to the point then, each SM on GP104 only contains a single FP16x2 core. This core is in turn only used for executing native FP16 code (i.e. CUDA code). It’s not used for FP32, and it’s not used for FP16 on APIs that can’t access the FP16x2 cores (and as such promote FP16 ops to FP32). The lack of a significant number of FP16x2 cores is why GP104’s FP16 CUDA performance is so low as listed above. There is only 1 FP16x2 core for every 128 FP32 cores.
Compatibility with a Compute Capability level MEANS NOTHING if your hardware is not actually capable of doing this.
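For reference, this is roughly what native, vec2-packed FP16 CUDA code looks like: two half-precision values packed into one __half2 register and processed by a single FP16x2 instruction. A minimal sketch using the cuda_fp16.h intrinsics (the kernel itself is mine; half arithmetic compiles for CC 5.3 and up):

```cpp
#include <cuda_fp16.h>

// y = a*x + y on packed half2 data: one HFMA2 instruction performs two
// FP16 fused multiply-adds per thread. On GP100 this path runs at twice
// the FP32 rate; on GP104 it is serviced by the single FP16x2 core per
// SM described in the quote above, hence the 1/64-rate FP16 throughput.
__global__ void haxpy(int n, __half2 a, const __half2 *x, __half2 *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {  // n counts half2 pairs, not individual halves
        y[i] = __hfma2(a, x[i], y[i]);
    }
}
```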

Will this end this conversation? The true new Pascal uArch, at both the high and the low level, is only the GP100 chip. GP102, 104, and 106 are the reused Maxwell uArch with a few tweaks on the new node.
 
All the Pascals have native FP16 and 49-bit addressing. It's quite a leap to argue that because the GP100 has better FP16 and FP64 performance, the other chips must be rebadged Maxwells.

Yes, this ends the discussion.
 
49-bit addressing and FP16 are not the point. You have completely skipped the most important part about consumer Pascal GPUs, which is the structure of the SMs and the resources available to them, and focused on features instead.

uArchitectures are not defined by features, which can be added easily. They are always defined by the execution resources and the structure of the cores. The fact is simple: consumer Pascal has exactly the same uArchitecture as Maxwell, which I have pointed out before. GP100, the real Pascal chip, is a completely different uArchitecture.
 
For the gamers out there - "a taller version of the nMP..."
http://wccftech.com/corsair-one-preorder-ready/

Looks way better than most PC cases/systems out there. Yet it is still ugly; everything is, both internally and externally, compared to the cylindrical Mac Pro. At least people don't have to pay a fortune to get it and won't feel bad about being forced to overpay the Apple/Intel/AMD/USA/California tax.
 
Holy crap. Check this out. I want one.
Dual DP1.4, nice.
USB3.0 still?!


"... but it may be wiser to save up and wait for better 8K technology. The UltraSharp 32 Ultra HD showed brilliant pixels but took a long time to re-draw images on screen.

That’s because GPUs aren’t ready for 8K yet, and Dell had to work with Nvidia until the last second to tweak drivers to support 8K. ..."
http://www.pcworld.com/article/3155...ch-8k-display-the-ultrasharp-32-ultra-hd.html

It requires two DP 1.3 cables just to drive it at 60 Hz (https://en.wikipedia.org/wiki/DisplayPort#Resolution_and_refresh_frequency_limits_for_DisplayPort), let alone do HDR and wide color without substantial compression. Remember the initial wave of 4K panels and the folks saying being limited to 30 Hz was "OK" just because of all those pixels? For slow-refresh, pixel-peeping applications, sure; but for general-purpose utility, hardly.
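The back-of-envelope arithmetic bears this out: 8K at 60 Hz with 10-bit RGB is 7680 × 4320 × 30 bit × 60 ≈ 59.7 Gbit/s of raw pixel data before blanking overhead, while a single DP 1.3 (HBR3) link tops out at 32.4 Gbit/s raw, roughly 25.9 Gbit/s effective after 8b/10b encoding. Hence the two cables.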

Sure, the drivers will get better, but all of this generation-0.5 stuff is far more for the folks with more money than sense. Fart $5K down the drain... sure, no problem, it's a $1M budget anyway. Just dump the cost overruns onto my customers for the difference.

USB 3.0 because it is primarily keyboards, speakers, and mice being hooked up, with perhaps some legacy thumb drives mixed in, plus charging phones and other portables.
 
If that's the same panel the "iMac Pro" is going to use, I can live with an iMac Pro and not miss a Mac Pro for a while.

It seems unlikely we're going to see a 32" iMac. First, there has never been one. Second, a GPU that can drive this at 60 Hz under a wide range of workloads is hardly in the "mid-upper" mobile-class zone. Barring major changes to targeted TDP envelopes, the "oomph" to do this likely won't come to the midstream level for at least one or two more iterations. Cost-wise, too, getting the panel into the iMac price range is more than a year out.

Apple needs an iMac that is more than comfortable with 5K under a wide variety of graphics workloads before moving on to trying to "master" 8K. Doing more with data at 5K (e.g., leveraging something like Optane and SSD-backed GPUs) is a broader-based "pro" move than chasing the early-8K crowd, which is substantially smaller.
 

Eh, while I think an 8K iMac is unlikely, I don't think it will be because of limited GPU power. That has never stopped Apple before. For instance, the limited GPU in the MacBook Pro is advertised as being able to drive two 5K displays, which is close to the number of pixels in an 8K display. Low-powered GPUs have never stopped Apple from delivering Retina screens; some of the first Retina iPads and integrated-graphics-driven Retina laptops were fairly underpowered.
 
About the 8K iMac ;)

http://appleinsider.com/articles/15...ac-8k-later-this-year-display-partner-lg-says

Well, LG is the manufacturer that has the exclusivity for IPS displays, and Apple buys the displays strictly from LG ;).

They did not pull this out of their ass ;). The information came out first on LG's blog ;).

I will not be staggered if in the upcoming future we see three sizes of iMac:
21.5" 4K, 27" 5K, and 31.5" 8K.

"Apple has also announced that they will release the 'iMac 8K' with a super-high resolution display later this year," the press release states. Apple, however, has not publicly announced plans for a new iMac update.

Ah yes, the Late 2015 8k iMac. I remember that update well.

They did not pull this out of their ass ;).

Whether or not Apple was tinkering with LG screens, it certainly does seem like they pulled that iMac update out of their ass.
 
Eh, while I think an 8K iMac is unlikely, I don't think it will be because of limited GPU power. That has never stopped Apple before. For instance, the limited GPU in the MacBook Pro is advertised as being able to drive two 5K displays, which is close to the number of pixels in an 8K display.


5K: 5120 × 2880 × 10 bit × 3 channels ≈ 442M bits per frame (442,368,000); × 2 displays ≈ 884M
8K: 7680 × 4320 × 10 bit × 3 channels ≈ 995M bits per frame (995,328,000)

(995 − 884) / 884 ≈ 13%, and that is just pixel-fill overhead. Haven't even gotten to model-manipulation overhead at all.


The MBP there is driving two to three displays (not sure if two 5Ks nuke the internal display), and this is just one; the iMac would then be spent in terms of driving another monitor. And frankly, the nominal mode of the MBP is driving just one monitor. The number of MBP owners likely to hook up two 5K monitors is probably less than 10% of that user base; it is a corner case. In the "one monitor" model, the MBP in your case is likely running circles around the iMac on anything with dynamic model content. Being "able to drive" and the nominal mainstream mode are two different things.
About the 8K iMac ;)

http://appleinsider.com/articles/15...ac-8k-later-this-year-display-partner-lg-says
......

They did not pull this out of their ass ;).


Next to the definition of "irrational, overhyped Apple fanboy" is a picture of AppleInsider. In the context of hyper-partisan Mac rumors, it is up there with wccftech when it comes to wishful thinking posing as analysis.

I'm sure Apple acquired some 8K panels to do research on. But a just-around-the-corner iMac product... extremely likely not (back then as well as now).
 
What I meant is that Apple is working on an 8K iMac. LG, which supplies the displays to Apple, did not pull this information out of its ass.

Funniest part? Next year, 2018, is the 20th anniversary of the iMac's introduction, and Apple usually likes to release new product versions on such occasions.
 
What I actually believe is that AppleInsider (the source of that rumor) bought it from some "trusted" source like our fondly remembered DarkNetGuy (who I thought was a bit more accurate).

Another rumor from them at the same time was Apple's Thunderbolt 5K Display... which ended up as the troublesome LG 5K "Thunderbolt" display (one you can't hook any Thunderbolt peripheral to, and which only uses TB3's MST feature to merge two DP 1.2 signals into a single cable).

I don't doubt we will see an 8K display on some future Mac, but not anytime soon (OK, in a future without Cook; with Captain Cook the Mac has no future).

8K displays will remain a very specialized niche, as 4K did for years before reaching the mainstream.
 
They did not. It was an official LG blog that posted the information about Apple releasing an 8K iMac.

LG supplies the displays to Apple. It is a similar story to the information that popped up on December 20th last year on Pike's blog, about upcoming Apple hardware, which he had to pull later on.
 