Status: Not open for further replies.
Absolutely nobody around here mentions that AMD left the high-end market. The fanboys shifted the goalposts to the perf/watt argument and have since insisted it's an economic choice, not a performance choice, driving their decisions. They stick with this argument even after AMD openly admitted it cannot compete with Nvidia. Well, AMD realized it couldn't cut into Intel's market share either and is now back to attempting high-end cards. Half the time I don't even know how to reply to the nonsense spewed around here.

Yeah, brand new card shows up, wipes the floor with everything in existence and yet certain people whine.

"It isn't as efficient as it could be" HUH ?

"The memory bandwidth is really gonna catch up with it in 6-10 months" WHAT ?

Shows that more than a few here have other agendas, agendas that would rather the truth got shoved in a closet.

I suppose if my paycheck had "AMD" on it, I'd be looking for sharp edges on the new heatsink to complain about too.
 
The amazing bits of conversation are those involving AMD's bleeding-edge tech. Apple chose budget-bin parts for the 500-, 600- and 700-series. The 5K iMac cards are kludges of old chips. Intel is good enough for the mini... There is no evidence of Apple moving to the forefront of the GPU market, yet these folks keep prattling on about tech that Apple has shown, time after time, it is not interested in. What does Apple have in store for us? Probably more *old* rebranded kludges marketed as "innovative, my arse!" or whatever they choose as their defiant rallying call.
 
I suppose if my paycheck had "AMD" on it, I'd be looking for sharp edges on the new heatsink to complain about too.
That's a really sad ad hominem attack. When you run out of other arguments, attack the speaker.

There's plenty of fodder to take down AMD (such as the simple fact that the GTX 1080 is in reviewers' hands, and the Polaris/UrsaeMinoris/NorthStar/Vega/AlphaLyrae have yet to even see a paper launch).

No need for ad hominem attacks when it's so easy to take her out on substance.
 
AMD Polaris may beat Pascal in the 50-130 W segment this year. That matters for Macs, except the Mac Pro.
 
http://forums.anandtech.com/showthread.php?t=2473319
NTMBK said:
  • GP100 has 64 FP32 cores per SM, GP104 has 128 FP32 cores per SM
  • GP100 has 2 processing blocks per SM, GP104 has 4 processing blocks per SM
  • GP100 has 64KB of shared memory per SM, GP104 has 96KB of shared memory per SM
  • GP100 has 32k registers per processing block, GP104 has 16k registers per processing block
  • GP100 has 32 FP64 cores per SM, GP104 has 4 FP64 cores per SM
This pretty much sums up the differences between GP100 and GP104.
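Those per-SM figures can be cross-checked with a little arithmetic. The sketch below takes the numbers straight from the quoted list and derives per-core resources from them; "registers" means 32-bit entries:

```python
# Back-of-the-envelope comparison of the per-SM resources quoted above.
gp100_sm = {"fp32": 64, "fp64": 32, "blocks": 2, "smem_kb": 64, "regs_per_block": 32 * 1024}
gp104_sm = {"fp32": 128, "fp64": 4, "blocks": 4, "smem_kb": 96, "regs_per_block": 16 * 1024}

def per_core_resources(sm):
    # Total register file per SM = blocks x registers per block.
    total_regs = sm["blocks"] * sm["regs_per_block"]
    return {
        "regs_per_fp32_core": total_regs // sm["fp32"],
        "smem_bytes_per_fp32_core": sm["smem_kb"] * 1024 // sm["fp32"],
        "fp64_ratio": sm["fp64"] / sm["fp32"],
    }

print(per_core_resources(gp100_sm))
# GP100: 1024 registers and 1024 B shared memory per FP32 core, 1:2 FP64 rate
print(per_core_resources(gp104_sm))
# GP104: 512 registers and 768 B shared memory per FP32 core, 1:32 FP64 rate
```

Both chips carry the same 64K registers per SM; GP100 just spreads them over half as many FP32 cores, which is why it suits register-hungry compute better.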

@netkas http://www.computerbase.de/2016-05/...agramm-ashes-of-the-singularity-async-compute

In 4K it looks like there is a regression with Async on, due to the narrow memory bus. As I have said, in 6-10 months you will see performance tanking in 4K gaming and high-resolution VR. This is the first glimpse of that.
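One way to make the bandwidth concern concrete is to divide the card's published bandwidth (320 GB/s for the GTX 1080's 256-bit GDDR5X) by the pixels pushed at 4K60. The resulting bytes-per-pixel figure is only a budget, not a measurement, since real traffic depends heavily on the engine:

```python
def bytes_per_pixel_budget(bandwidth_gbs, fps, width, height):
    """Average memory-traffic budget per pixel per frame, in bytes."""
    return bandwidth_gbs * 1e9 / fps / (width * height)

# GTX 1080: 320 GB/s published bandwidth; 4K at 60 fps.
budget = bytes_per_pixel_budget(320, 60, 3840, 2160)
print(f"~{budget:.0f} bytes of memory traffic per pixel per frame at 4K60")
```

Every extra pass (G-buffer writes, async compute reads, post-processing) eats into that fixed budget, which is the mechanism behind the "bus catches up with it at 4K" worry.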
 
This image explains why not everyone is happy about the 1080:

[Chart: performance per dollar at 2560×1440 (perfdollar_2560_1440.png)]
 

Although, personally, I don't think "performance per dollar" is a meaningful measurement in the high-end GPU context, Ars seems to give a good explanation (quoting):

This is the sort of price rise that only a company without competition could get away with. If AMD's Fury range had fared better, perhaps Nvidia might have pushed the 1080 further or been more aggressive on price.

Lack of competition (since obviously this is the case) hurts everyone, it seems. We don't get the best possible GPUs, and we don't get the best possible prices. I do hope AMD will be able to make a comeback, mostly for the customers' sake.

Still, I read in the same article that the 1080 is faster by as much as 62 percent compared to the corresponding Maxwell-based 980. That's not bad at all.
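The two numbers can be combined into the perf/dollar metric the chart shows. The sketch below uses the ~62% uplift quoted above and launch MSRPs from memory (GTX 980: $549; GTX 1080: $599, $699 for the Founders Edition), so the prices are illustrative, not sourced:

```python
def perf_per_dollar(relative_perf, price_usd):
    """Relative performance divided by price; only ratios are meaningful."""
    return relative_perf / price_usd

baseline = perf_per_dollar(1.00, 549)   # GTX 980 at launch MSRP (assumed)
msrp = perf_per_dollar(1.62, 599)       # GTX 1080, board-partner MSRP (assumed)
founders = perf_per_dollar(1.62, 699)   # GTX 1080 Founders Edition (assumed)

print(f"perf/$ vs GTX 980: {msrp / baseline:.2f}x at MSRP, {founders / baseline:.2f}x as Founders")
```

The Founders premium alone shaves a visible chunk off the value proposition, which is what the chart above is complaining about.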
 
Want to know why people are disappointed? Because if Nvidia had used the GP100 architecture in GP104, the increase in performance would be MUCH higher. On what level? Add 30-40% to what you see currently. All Nvidia did on the new node was brute-force the Maxwell architecture (the SM layout is exactly the same, apart from a smaller cache and register file) and sell it as a brand-new thing.

Overall performance is great. But how much better would it be if Nvidia had used the GP100 architecture in GP104?

http://semiaccurate.com/forums/showpost.php?p=262609&postcount=1197 I suggest reading this post...
 
Nvidia knew there won't be competition any time soon at the GTX 1080 performance level. So why put the latest and greatest tech there when you can milk the HPC industry with it instead. And it could be that we're gonna see the "Flounders Edition" of the GTX 1080 for some time. But for the GTX 1070, the Flounders Edition season will be shorter, just until the competition starts.
 
I wonder what people around here will complain about when/if a "Ti" version comes out later on with even better performance...
 
I thought you were going to show an image where various PR hacks/shills were revealed.

Not fair!

yep, these shills are doing great work.. they've definitely won me over and i'm now going to buy macs with AMD in them instead of nVidia. :rolleyes:

or maybe they're working for apple instead? with the goal of trying to convince people that AMD inside the entire mac lineup is a wise move?

or?
i mean, what exactly would be the point of planting a GPU shill in a mac forum?

---
also.. if this is PR work at play, it's entirely ineffective.. i don't think the 5 of you arguing about GPUs for the past 25 pages realize how boring the conversation is to outside readers.. even tech minded readers.. PR work has to at least be a little bit interesting in order to be effective.
 
I wonder what people around here will complain about when/if a "Ti" version comes out later on with even better performance...
There will not be a Ti version of GP104. The GTX 1080 Ti will be based on GP102, which will be derived from GP100, but without the FP64 cores.

And you know what? That thing will be a beast. This has nothing to do with personal preferences, just technical analysis. Think about it: you get twice the CUDA cores of a GTX 1070 (3840) with 30-40% better IPC.

I ask anyone who comes to this thread: leave brand preferences at home and let's talk ONLY technical stuff. Analyze it from top to bottom.
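As a sanity check on that claim (twice the GTX 1070's 1920 CUDA cores, plus 30-40% better IPC), a naive scaling model gives a rough ceiling. Equal clocks and perfect scaling are assumed here, which no real card hits:

```python
def projected_speedup(base_cores, new_cores, ipc_gain):
    """Naive scaling model: speedup = core ratio x IPC gain, equal clocks assumed."""
    return (new_cores / base_cores) * ipc_gain

gtx1070_cores = 1920
gp102_cores = 2 * gtx1070_cores  # 3840, the figure quoted in the post

for gain in (1.30, 1.40):
    speedup = projected_speedup(gtx1070_cores, gp102_cores, gain)
    print(f"IPC +{gain - 1:.0%}: ~{speedup:.1f}x a GTX 1070 (upper bound)")
```

So the "beast" estimate works out to roughly 2.6-2.8x a GTX 1070 before clocks, power limits, and memory bandwidth pull it back down.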
I do not see who you are talking to, so I suppose it is MVC.

You are talking to a guy who sells Nvidia GPUs with Mac EFI, and calls everyone who does not agree with his world view an AMD shill or Apple PR.

Think about who the "real" shill is here. No, this is not a stunt aimed at him; I'm just pointing out a certain hypocrisy. If he wants to talk technical analysis of GPUs he is welcome, but without his typical "I am better, you are stupid" approach, and without the usual nitpicking to prove his view of the world.

All people discuss here are technicalities about GPUs, and how they relate to the next-gen Mac Pro. Potentially, all of this discussion is pointless if Apple uses the Fiji ASIC in the upcoming Mac Pro refresh. For me personally that would be a huge disappointment. Why? Because 4096 GCN 1.2 cores need 96 ROPs, not 64. Otherwise it will not be faster in graphics than the R9 390X.
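The 64-vs-96 ROP argument comes down to peak pixel fill rate, which scales with ROP count times clock regardless of how many shaders sit behind it. A quick sketch, using an illustrative Fury-class clock of 1.05 GHz (the clock is an assumption, not a spec):

```python
def peak_fill_rate_gpix(rops, clock_ghz):
    """Peak pixel fill rate: one pixel per ROP per clock, in Gpixel/s."""
    return rops * clock_ghz

clock = 1.05  # GHz, illustrative Fury-class clock
for rops in (64, 96):
    print(f"{rops} ROPs @ {clock} GHz -> {peak_fill_rate_gpix(rops, clock):.1f} Gpixel/s peak")
```

With 64 ROPs the extra shaders can compute pixels faster than the back end can write them out, which is the bottleneck the post is pointing at.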
 
It would be nice if Apple could offer an Nvidia model for those who need CUDA - even as an "Apple Founder's Special Tax Edition". In principle it should be possible to put both Nvidia and AMD cards in the same machine and run them together, separately, or however needed.

But if Apple has CUDA angst, at least they should offer a proper alternative software tool that can do the same things CUDA does. And it should be in Apple's top-10 priority list for macOS and Xcode. Unless Apple wants to become like Nintendo, where most of the good apps are in-house productions.

A Mac Pro SE with a Xeon v5 and one GPU would be an interesting product. Then the market could decide whether it favors the one- or two-GPU model.
 
It is true. That's why it'd be Mac Pro SE. Like iPhone SE. And it should be cheaper too.

  • Xeon v5
  • 8GB DDR4
  • Radeon 480, Fury Nano and Nvidia GTX 1080 as CTO options
  • Starting price US $1999

I would buy this machine in a heartbeat. It's the machine a lot of users have wanted for a long time (i.e., the fabled xMac). The problem has always been that Apple doesn't feel this kind of machine fits between the Mac mini, iMac and Mac Pro. Apple has been diversifying the iPhone/iPad line to try to cover almost any use case, and to an extent this has happened on the budget side of the Mac line with the Mac mini and MacBook Air. I don't see it happening on the high end, though. If you fall into the minority of users who want a headless Mac with more power than a Mac mini/iMac, Apple would much rather you just pony up for the Mac Pro.

Another option that could help fill this niche is external GPUs. If the Mac mini got the attention it deserves and went back to quad-core CPUs, a Mac mini with a TB3 GPU could fill that role nicely. I think external GPUs could play a nice role for Apple if they want them to. Have a MacBook Pro/Mac mini/iMac and want more GPU power? Simply plug in an external puck-like device and you have it. Enthusiasts could buy something like a Razer Core and put their own off-the-shelf GPU into it.
 
It is true. That's why it'd be Mac Pro SE. Like iPhone SE. And it should be cheaper too.

  • Xeon v5
  • 8GB DDR4
  • Radeon 480, Fury Nano and Nvidia GTX 1080 as CTO options
  • Starting price US $1999
A Dell T3420 SFF or T3620 Minitower with
  • E3-1240 v5 (quad-core with HT, 3.5 GHz, 3.9 GHz Turbo)
  • 8 GiB DDR4 ECC
  • Quadro® K620 2GB or FirePro™ W2100 2GB
  • 256 GB NVMe SSD
  • Price: about $1,299
 
Yep. The Quadro® K620 is a 41 W super GPU.

And Apple wants to sell you a dual-core i7 Mac mini for the same price...

I think we are on the core issue here. But yes, it would be nice to have it at $1,599.
 
You are comparing a Kepler-based GPU with the latest ones, Aiden.

Lets get back to topic of GTX 1080.

[Chart: LuxMark scores for the GTX 1080 (luxmark-1080.png)]

Any thoughts?

The GTX 980 Ti is the WaterForce model.
 
So far, unfortunately, not. I am interested in this also, because of the context with the chart above.
 