I can see the Dell UP2715K for £699 in the UK on Amazon.
Wow. Guess it would've been a good idea to check Amazon UK before I called his bluff.

Well in that case that's awesome that you can get that monitor for so cheap in the UK. If you live in the UK, by all means go buy a PC (or build a hackintosh), get a 5K monitor, and be happy if that floats your boat :)

For those of us in the rest of the world I'm pretty sure my points are still valid.

To take it a step further, Apple made the choice to go AMD back when the only 5K monitor was $3000 (USD), and they sold the baseline Retina iMac for $2499. I think we can all agree it's pretty grossly inaccurate to say that "AMD is cheap and Apple wants a good profit margin for their overpriced Macs."

While I'm sure Apple does make a healthy margin on the iMacs, the Retina iMacs were for a long time the cheapest way to even GET a 5K display. For those of us outside the UK, it is still one of the cheapest ways to get a computer with a 5K display.

Not BS. An extra £50 off what I estimated but this was literally the first result from a Google search. With time I'd be able to find a better price.


Pointless when the vast majority of the world's content is at 1080p at best.


I can afford it. But why would I want to waste my cash when I know no games will run at 5K and no content will be available for it either? Hell, the internet in my area struggles to stream 1080p. As it stands at the moment, Apple's 5K iMac is nothing but an expensive gimmick. Not very useful for many at the current time.

Wait, you didn't even answer my question. But it's ok, I already know the answer, as I said.

Oh no, I did, you just failed to notice (edited for nastiness). That's ok, I'll repeat it for you. "To work with photos at near native resolution." Hmmmmm where can I find a bunch of 5K(+) content.... Oh. That's right. The photos folder on my iMac. Just because you still shoot with an 8MP iPhone 4S and only work with 1080p content doesn't mean the rest of us do.

Nonetheless, I get it. It's not a good value for you. That's fine. Don't buy it. Send Apple an email telling them how you'd like them to build something else. Go lobby your local representative so maybe you can get internet that doesn't struggle to stream 1080p. All of these things would probably be more effective than complaining on this message board.
 
Oh wait, I did, you just can't read very well. Guess they don't teach you to read English at school in the UK. That's ok, I'll repeat it for you. "To work with photos at near native resolution." Hmmmmm where can I find a bunch of 5K(+) content.... Oh. That's right. The photos folder on my iMac. Just because you still shoot with an 8MP iPhone 4S doesn't mean the rest of us do.

Great. Viewing huge photos. That's fantastic. Totally worth the cash. Totally not possible on a 1080p monitor either.
I use a BlackBerry 9900 btw, not a 4S :p
 
Great. Viewing huge photos. That's fantastic. Totally worth the cash. Totally not possible on a 1080p monitor either.
I use a BlackBerry 9900 btw, not a 4S :p

Well at least you have a sense of humor.

I don't think either one of us is going to convince the other so I'll just say this.

I find editing photos on my 5K iMac to be a significant upgrade from doing the same on my old 1440p iMac. I don't have to zoom to check or adjust pixel-level sharpness, and photos just look downright amazing with 5K worth of detail on screen.

I've recently started shooting and editing video (both 1080p and 4K) and while I am by no means a videographer, I find the 5K screen makes this task a lot easier.

Finally, I just like the 4x-scaled 1440p 5K resolution and hate the way Apple's non-integer scaled resolutions look.
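
The difference is easy to see in the numbers. Below is a minimal Python sketch (my own illustration, not Apple's code): the "looks like" modes are the standard 5K iMac scaling options, and the render-at-2x-then-downsample behavior is how macOS handles scaled HiDPI modes.

[CODE]
# How a 5120x2880 panel maps macOS "looks like" modes to real pixels.
PANEL = (5120, 2880)

def scaling(looks_like):
    backing = (looks_like[0] * 2, looks_like[1] * 2)  # macOS renders HiDPI at 2x
    factor = backing[0] / PANEL[0]                    # 1.0 means 1:1 panel pixels
    return backing, factor

for mode in [(2560, 1440), (2880, 1620), (3200, 1800)]:
    backing, factor = scaling(mode)
    note = "pixel-exact (integer)" if factor == 1.0 else f"downsampled by {factor:.3f}x"
    print(f"looks like {mode[0]}x{mode[1]} -> renders {backing[0]}x{backing[1]}, {note}")
[/CODE]

Only the default 2560x1440 mode lands exactly on panel pixels; the other modes are resampled, which is the softness I'm complaining about.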

For me, 5K was worth the money, but I totally understand if it isn't for you.

Finally, for the record, I (partially) agree with you that Apple probably did choose AMD because it was the cheaper option. Where we disagree is that I don't think it was some grand plot by Apple to overprice their computers and screw their customers. Oh, and as I've made clear, I certainly don't think 5K is useless (vs 4K or any other resolution).
 
I just want to chime in and say that, regardless of 5K content availability, I use the iMac 5K for long sessions of Unity and Blender, and I find the crispness of the screen to be much more comfortable on my eyes. I also have countless monitors, such as the Dell U3415, U2515H, and LG 34UM95, and although they are great screens I always end up back on my 5K iMac.
 
5K is... kinda pointless in a world where we haven't even got 4K gaming and video streaming to work yet. I could get a £600 5K monitor for my £700(ish) PC today if I wanted (Dell UP2715K), but what is the point? Why bother? Apple's 5K thing is nothing more than a gimmick in a world that isn't even fully prepared for 4K.

So let me ask you a question: where is all this 5K content hiding that makes a 5K iMac so damn desirable? And other than displaying stuff at a 5K resolution, can the iMac play Fallout 4 at 5K or make some heavy edits to a 5K video file? I already know the answer is no.
It seems you are only a gamer. Do you know why 5K resolution matters? Because it is the resolution that provides a 100% view of a 12-megapixel (iPhone 6s) photo without any zooming in and out. This is a real productivity help for designers and photographers, which is something a gamer will never understand.
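
For concreteness, here is the raw pixel arithmetic behind that claim, as a quick Python sketch (4032x3024 is the published iPhone 6s photo resolution; whether "without any zoom" holds exactly depends on orientation):

[CODE]
# Viewing an iPhone 6s photo at 100% (1 photo pixel = 1 screen pixel) on 5K.
photo = (4032, 3024)   # iPhone 6s rear camera, ~12.2 MP
panel = (5120, 2880)   # 27" 5K iMac, ~14.7 MP

print(f"photo: {photo[0] * photo[1] / 1e6:.1f} MP, panel: {panel[0] * panel[1] / 1e6:.1f} MP")
print(f"width visible at 100%:  {min(1.0, panel[0] / photo[0]):.0%}")
print(f"height visible at 100%: {min(1.0, panel[1] / photo[1]):.0%}")
# The full width fits with room to spare; the height is ~95%, so a
# landscape shot is viewable essentially 1:1 with minimal scrolling.
[/CODE]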
 
It seems you are only a gamer. Do you know why 5K resolution matters? Because it is the resolution that provides a 100% view of a 12-megapixel (iPhone 6s) photo without any zooming in and out. This is a real productivity help for designers and photographers, which is something a gamer will never understand.

If I'm unable to ever understand it (because I play games, some great logic there), why bother explaining it to me? Even though I have acknowledged above that it is useful for photos. Or did I? Because according to you, I'll never understand it since all I do with my life is play video games. :rolleyes:

Was going to leave it there, but sod it, I'll bite. Instead of buying a £600 5K monitor, let's just say I've spent £2K on a 5K iMac. It can do huge photos. Great. Now what else have you got for me? Because £2K is a lot of cash to spend on an i5 machine with mediocre graphics.

I already know the answer, and it's nothing. No 5K games, no 5K videos, no real 5K content. And by the time 5K or 4K becomes the norm, the hardware in that iMac will no doubt be dated and unable to handle it. Other than photos, 5K is a gimmick. A gimmick that Apple are pricing high and running on hardware that is okay at best.

But hell, all I do is play video games according to you, so what the hell do I know?
 
AMD = dirt cheap, speedy but high failure rate
NVidia = Expensive, slower but low failure rate.

I think this is very subjective... With AMD cards I never had any issue, and not one of my AMD cards died on me.
I can't say the same for NVidia... all 7-8 models I've had over the last decade are in the trash bin.
The last AMD card I have in a desktop computer is still up and running, and that rig is from 2005.

Let's see if the AMD Radeon R9 M395X in the new iMac survives until I buy a new one (2-3 years).
 
I think this is very subjective... With AMD cards I never had any issue, and not one of my AMD cards died on me.
I can't say the same for NVidia... all 7-8 models I've had over the last decade are in the trash bin.
The last AMD card I have in a desktop computer is still up and running, and that rig is from 2005.

Let's see if the AMD Radeon R9 M395X in the new iMac survives until I buy a new one (2-3 years).

True, it is subjective.

I have friends who've had an AMD graphics card and replaced it with a new card in the same machine 3 times, whereas I've had other friends who've owned AMD since the beginning and have never had an issue. I've known only 2 people who have had an issue with NVidia, and their replacement GPUs didn't have any issues.

So either AMD is notoriously bad or they pretend to look at it and send the same card back hoping the computer manufacturer will eat the cost by sending a new computer/part.
 
AMD = dirt cheap, speedy but high failure rate
NVidia = Expensive, slower but low failure rate

Slower, really? Is that why 9 out of the top 10 video card benchmark entries are nVidia cards? #9 is an AMD, and it's over a quarter (29%) slower... for the exact same price as the 980 Ti.

[Screenshot: PassMark high-end GPU rankings]


Source: http://www.videocardbenchmark.net/high_end_gpus.html
 
True, it is subjective.

I have friends who've had an AMD graphics card and replaced it with a new card in the same machine 3 times, whereas I've had other friends who've owned AMD since the beginning and have never had an issue. I've known only 2 people who have had an issue with NVidia, and their replacement GPUs didn't have any issues.

So either AMD is notoriously bad or they pretend to look at it and send the same card back hoping the computer manufacturer will eat the cost by sending a new computer/part.

NVidia doesn't produce video cards... You have to put the blame on each individual manufacturer (MSI, Gigabyte, EVGA...) for the failure, unless you can demonstrate that the failure was with the GPU chip itself. Same with AMD on Windows machines; they outsource the board making to different partners. On the Apple side, on the other hand, it depends on who made the custom boards for them.
 
Slower, really? Is that why 9 out of the top 10 video card benchmark entries are nVidia cards? #9 is an AMD, and it's over a quarter (29%) slower.

Nice catch, I conflated it with typical AMD vs Intel specs by accident, thanks for correcting me. I'm NVidia 100%; I can only hope that the Skylake MBPs have NVidia cards.
 
Slower, really? Is that why 9 out of the top 10 video card benchmark entries are nVidia cards? #9 is an AMD, and it's over a quarter (29%) slower... for the exact same price as the 980 Ti.

Source: http://www.videocardbenchmark.net/high_end_gpus.html
I suggest comparing current gaming and compute benchmarks, not PassMark, which is mostly CPU limited (draw calls, etc.).

http://pic.yupoo.com/ztwss/Ftoyshb7/medish.jpg (DX11 game)
[Benchmark chart: Quantum Break, Ultra settings]


NVidia doesn't produce video cards... You have to put the blame on each individual manufacturer (MSI, Gigabyte, EVGA...) for the failure, unless you can demonstrate that the failure was with the GPU chip itself. Same with AMD on Windows machines; they outsource the board making to different partners. On the Apple side, on the other hand, it depends on who made the custom boards for them.
Yes, boards are made by manufacturers. But the dies are made by Nvidia and AMD, which sell them to the manufacturers. And that's where we have to look ;). Of course boards can fail, as always. But what percentage of failures are GPU die failures, and what percentage are board failures? I suppose only the IHVs know this.
 
Seriously, you're going to call out the industry standard video card benchmarking tool as somehow being wrong because it doesn't prove your "opinion"?
Check my post again, and pay attention to the difference between the R9 380X and the GTX 970. The GTX 970 uses the same die as the GTX 980M, but cut down and with lower core clocks. The R9 380X is the same die as in the M395X, but with a slightly lower core clock. Is this opinion, or pure fact, observed in both DX11 and DX12?
 
Check my post again, and pay attention to the difference between the R9 380X and the GTX 970. The GTX 970 uses the same die as the GTX 980M, but cut down and with lower core clocks. The R9 380X is the same die as in the M395X, but with a slightly lower core clock. Is this opinion, or pure fact, observed in both DX11 and DX12?

I don't give credence to game-based benchmarks, mostly because games are optimized for a specific GPU brand. Both AMD and nVidia are guilty of this.

I only trust GPU-agnostic benchmarks, and those benchmarks show AMD GPUs consistently underperforming compared to nVidia in real-world tests.
 
I don't give credence to game-based benchmarks, mostly because games are optimized for a specific GPU brand. Both AMD and nVidia are guilty of this.

I only trust GPU-agnostic benchmarks, and those benchmarks show AMD GPUs consistently underperforming compared to nVidia in real-world tests.
If you want benchmarks that do not favor any architecture, why take PassMark, which is famous for being CPU bound? It all comes down to how many commands are pushed to the GPU and then executed. And here, because of the serial nature of the application, GCN is bottlenecked and underutilized. It is a similar situation to what we had with DX11 titles.
 
why take PassMark, which is famous for being CPU bound?

Ok, let's say for the sake of argument that PassMark is CPU bound.

These tests report the average of all the tests done on a particular GPU (not simply the peak score from one insane guy with liquid nitrogen cooling and extreme overclocking).

That's hundreds if not thousands of different systems with different CPUs, clock rates, RAM, etc., regardless of whether it's an AMD or nVidia GPU being tested. So are you saying all the AMD testers scored lower than their nVidia counterparts because they must have been using slower processors when they did their tests?
 
No. CPU bound means that something on the path from CPU to GPU prevents the GPU from being fully utilized.

DX11 was a serial API that was CPU bound. The limit was how many draw calls could be pushed to each GPU's command scheduler. Because not enough commands were dispatched, GCN could not be fully utilized. That is a fairly easy, and pretty rough, way of explaining it ;). DX12, because it is a multithreaded, multi-engine API, completely removes that bottleneck on GCN. Nvidia does not get as big an improvement in performance because it was not bottlenecked by the API. At least, not in the way GCN was.

Why do I compare DX11/12 to PassMark? Because PassMark behaves like DX11. If it didn't, we would 100% see AMD GPUs beating even the old Kepler GTX 780 Ti.
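
To make the draw-call argument concrete, here is a toy Python model of it (my own illustration; the throughput numbers are invented, not measurements of real hardware):

[CODE]
# Toy model: a serial (DX11-like) API submits draw calls from one thread,
# while a multithreaded (DX12-like) API records command lists on several
# cores. A "wide" GPU that can consume more commands per second is hurt
# the most by the serial path. All numbers are invented for illustration.
SINGLE_THREAD_RATE = 1.2e6   # draw calls/sec one CPU thread can submit
THREADS = 4                  # cores recording command lists in parallel

def utilization(gpu_capacity, submit_rate):
    """Fraction of the GPU's command throughput actually being fed."""
    return min(1.0, submit_rate / gpu_capacity)

for name, capacity in [("wide GPU (GCN-like)", 4.0e6), ("narrow GPU", 2.0e6)]:
    serial = utilization(capacity, SINGLE_THREAD_RATE)
    parallel = utilization(capacity, SINGLE_THREAD_RATE * THREADS)
    print(f"{name}: serial API {serial:.0%} utilized, multithreaded {parallel:.0%}")
[/CODE]

In this model the wide GPU jumps from 30% to 100% utilization while the narrow one only goes from 60% to 100%, which is the shape of the DX11-to-DX12 gains described above.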
 
I suggest comparing current gaming and compute benchmarks, not PassMark, which is mostly CPU limited (draw calls, etc.).

http://pic.yupoo.com/ztwss/Ftoyshb7/medish.jpg (DX11 game)
[Benchmark chart: Quantum Break, Ultra settings]

Yes, boards are made by manufacturers. But the dies are made by Nvidia and AMD, which sell them to the manufacturers. And that's where we have to look ;). Of course boards can fail, as always. But what percentage of failures are GPU die failures, and what percentage are board failures? I suppose only the IHVs know this.

Most boards die because of bad caps or VRMs, not because of the graphics chip. The die is the most protected part of the GPU card. That, and solder traces failing due to board warping on long GPUs not equipped with backplates.

Ok, let's say for the sake of argument that PassMark is CPU bound.

These tests report the average of all the tests done on a particular GPU (not simply the peak score from one insane guy with liquid nitrogen cooling and extreme overclocking).

That's hundreds if not thousands of different systems with different CPUs, clock rates, RAM, etc., regardless of whether it's an AMD or nVidia GPU being tested. So are you saying all the AMD testers scored lower than their nVidia counterparts because they must have been using slower processors when they did their tests?

Never mind our friend Koyoot; in his own special place, AMD is God incarnate and can do no wrong. Prepare to be swamped by links to blog posts from anonymous persons, or AMD's marketing pamphlets, as he tries to make his point. He's a bit annoying but totally harmless.
 
Never mind our friend Koyoot; in his own special place, AMD is God incarnate and can do no wrong. Prepare to be swamped by links to blog posts from anonymous persons, or AMD's marketing pamphlets, as he tries to make his point. He's a bit annoying but totally harmless.
Why do you attack me, and not what I bring to the table? Why do you attack people who appear to be more educated on the topic of GPUs than you are? Why do you resist objective facts about performance and attack people instead? If you do not agree, why attack the person and not the message? I am tired of you and others constantly attacking me when I talk about OBJECTIVE FACTS.

Can you answer me this? Do you not want to be shown that AMD is not as bad as you, and the rest of the people here, paint it?

Can people not comprehend the simple fact that AMD is not as bad as it is painted here?

Those blogs and posts by anonymous people explained why Nvidia behaves the way it does in current benchmarks. Why do you downplay that? Do you not understand it, or do you not want to understand it? Maybe you just do not want to see that Nvidia currently has weaker hardware, which now shows in every benchmark because software has caught up (even DX11 titles!), as pinpointed in every single link I provided. Tell me, Tuxon, I am tired of this. Why do you attack me, and every single thing I bring?

When I brought up DX12 title benchmarks, you downplayed them, saying they were AMD-sponsored games. Why, then, does Quantum Break show EXACTLY the same GPU behavior as the "AMD-sponsored" titles? Quantum Break is not supposed to be sponsored. How come it shows exactly the same behavior?
 
Last edited:
In the end it probably just came down to cost versus benefit. AMD had to work with Apple to support Apple's proprietary TCON and its single-tile display. It wouldn't surprise me if nVidia didn't want anything to do with that at all, let alone for a reasonable price.

Contrary to popular belief, Apple can't just buy a bunch of "980m" chips, throw them into a 5K iMac, and have them just work.
 
Seriously, you're going to call out the industry standard video card benchmarking tool as somehow being wrong because it doesn't prove your "opinion"?

This is coming from someone who owns a GTX 970 (my gaming PC is a 970/4690K), and I have to call some BS on PassMark, just as with any synthetic benchmark. It's clearly not taking other important parameters into account. The R9 390, for example, either matches or easily beats the GTX 970 in almost every real-world benchmark, except for Fallout 4 IIRC, where all AMD cards perform noticeably worse than their nVidia counterparts.

Basing your opinion of a card purely off synthetic benchmarks is a mistake. If I'm planning to play Quantum Break and the AMD card simply performs better in real-world benchmarks, then PassMark is pretty much irrelevant.
 
No. CPU bound means that something on the path from CPU to GPU prevents the GPU from being fully utilized.

DX11 was a serial API that was CPU bound. The limit was how many draw calls could be pushed to each GPU's command scheduler. Because not enough commands were dispatched, GCN could not be fully utilized. That is a fairly easy, and pretty rough, way of explaining it ;). DX12, because it is a multithreaded, multi-engine API, completely removes that bottleneck on GCN. Nvidia does not get as big an improvement in performance because it was not bottlenecked by the API. At least, not in the way GCN was.

Why do I compare DX11/12 to PassMark? Because PassMark behaves like DX11. If it didn't, we would 100% see AMD GPUs beating even the old Kepler GTX 780 Ti.

Well said. The technical details are spot on. My understanding is roughly the same as yours: GCN was built to be a forward-looking, parallel-threading monster. It was seriously held back by DX11.

That's hundreds if not thousands of different systems with different CPUs, clock rates, RAM, etc., regardless of whether it's an AMD or nVidia GPU being tested. So are you saying all the AMD testers scored lower than their nVidia counterparts because they must have been using slower processors when they did their tests?

If you read tech websites, you'll see that people budget their computers with similarly priced components that fit together. No one in their right mind would buy an AMD FX-4300 CPU and pair it with a GTX 980 Ti. That means the cheaper GPUs in PassMark's database are heavily correlated with cheap CPUs (mostly AMD's at the moment), which, for the GCN architecture, holds back performance severely.
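
That pairing bias is easy to demonstrate with a toy simulation in Python (all numbers invented): define two GPUs with identical true performance, sample the "budget" one alongside slower CPUs more often, and a CPU-bound benchmark's database averages diverge anyway.

[CODE]
import random

random.seed(1)
TRUE_GPU_SCORE = 100.0   # both GPUs are defined to be equally fast

def sampled_score(cpu_factor):
    # CPU-bound test: the reported score is capped by the CPU, plus noise.
    return TRUE_GPU_SCORE * cpu_factor * random.uniform(0.95, 1.05)

def database_average(cpu_pool, n=10_000):
    return sum(sampled_score(random.choice(cpu_pool)) for _ in range(n)) / n

fast_heavy = [1.0, 1.0, 1.0, 0.8]   # pricier builds: mostly fast CPUs
slow_heavy = [1.0, 0.8, 0.8, 0.7]   # budget builds: mostly slower CPUs

print(f"GPU paired with fast CPUs: {database_average(fast_heavy):.1f}")
print(f"GPU paired with slow CPUs: {database_average(slow_heavy):.1f}")
# Same GPU, different database averages, purely from CPU pairing.
[/CODE]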

Also, I have not seen a reputable electronics enthusiast website that uses PassMark for routine testing. This non-PassMark list includes the giants: Anandtech, Tom's Hardware, HardOCP, Hexus, TechPowerUp, and more.

In addition, synthetic benchmarks are typically not useful as indicators of real-world performance. One reason is that no one buys their hardware to run synthetic benches all day; another is that real-world applications (games or otherwise) are all optimized for modern architectures. Who releases unoptimized, generic-code applications? It makes no sense. Even at the compiler level, you can tick a few flags to output code that takes advantage of performance-boosting features.

[doublepost=1460677650][/doublepost]
Yes, boards are made by manufacturers. But the dies are made by Nvidia and AMD, which sell them to the manufacturers. And that's where we have to look ;). Of course boards can fail, as always. But what percentage of failures are GPU die failures, and what percentage are board failures? I suppose only the IHVs know this.

Actually, this isn't hard to deduce if you want a qualitative answer. If you look at all the different brands of AMD GPUs (ASUS, Gigabyte, MSI, Sapphire, XFX, PowerColor, etc.) and get access to some reliability data (it's not always available, but it does crop up from time to time in reliability studies), you'll see that the variability from manufacturer to manufacturer falls within a certain range. Now, if you compare the size of that variability (statistically) to the variability in GPU reliability as a function of the die maker (AMD/Nvidia), you'll see that the board manufacturer (OEM) probably has a bigger impact on the reliability of the GPU than the die maker does.

I'd wager that the OEM has far more to do with the reliability of the GPU than either AMD or Nvidia. I can't gather all the reliability studies, but here is one:

http://hexus.net/business/news/comp...eport-gigabyte-top-motherboards-msi-graphics/

Notice that there is a more than 2x increase in failure rates between the bottom of that list and the top. I doubt AMD or Nvidia makes any modern GPU that is 2x more likely to fail than the other's.
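
A back-of-envelope Python version of that comparison (the rates below are invented, merely shaped like the OEM return-rate reports such as the one linked above):

[CODE]
# Hypothetical per-OEM RMA rates vs a hypothetical die-maker difference.
# If the OEM-to-OEM spread is several times the die-maker spread, the
# board maker dominates observed reliability. All numbers are made up.
oem_rma_pct = {"OEM A": 1.5, "OEM B": 2.0, "OEM C": 2.6, "OEM D": 3.4}
die_rma_pct = {"AMD die": 0.4, "Nvidia die": 0.5}

def spread(rates):
    return max(rates.values()) - min(rates.values())

ratio = max(oem_rma_pct.values()) / min(oem_rma_pct.values())
print(f"spread across board OEMs: {spread(oem_rma_pct):.1f} points ({ratio:.1f}x top-to-bottom)")
print(f"spread across die makers: {spread(die_rma_pct):.1f} points")
[/CODE]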
 
Why do you attack me, and not what I bring to the table? Why do you attack people who appear to be more educated on the topic of GPUs than you are? Why do you resist objective facts about performance and attack people instead? If you do not agree, why attack the person and not the message? I am tired of you and others constantly attacking me when I talk about OBJECTIVE FACTS.

Can you answer me this? Do you not want to be shown that AMD is not as bad as you, and the rest of the people here, paint it?

Can people not comprehend the simple fact that AMD is not as bad as it is painted here?

Those blogs and posts by anonymous people explained why Nvidia behaves the way it does in current benchmarks. Why do you downplay that? Do you not understand it, or do you not want to understand it? Maybe you just do not want to see that Nvidia currently has weaker hardware, which now shows in every benchmark because software has caught up (even DX11 titles!), as pinpointed in every single link I provided. Tell me, Tuxon, I am tired of this. Why do you attack me, and every single thing I bring?

When I brought up DX12 title benchmarks, you downplayed them, saying they were AMD-sponsored games. Why, then, does Quantum Break show EXACTLY the same GPU behavior as the "AMD-sponsored" titles? Quantum Break is not supposed to be sponsored. How come it shows exactly the same behavior?

Because it is an AMD-sponsored game, silly. It was made first and foremost for the consoles, which are AMD-optimized, and then ported to PC. And besides marketing blurbs, you aren't really bringing anything to any GPU discussion. You try to pass yourself off as an insider but never achieve anything more than displaying silly fanboyism.

If I displease you, you're free to put me on ignore. But keep in mind that I'll still contradict your propaganda.
 