I think the reason is quite clear: money and convenience! Apple got a good discount from AMD, so they have probably signed a long-term exclusivity deal. Something that benefits Apple and benefits AMD, because AMD was struggling to stay alive. It also makes life easier for Apple, because the 5K screen requires some additional hardware that I don't fully understand, and I'm sure they'd rather not have to make it work with every kind of GPU. So laziness and money, I guess.

I don't think it's quite as clear as that, although I'm sure money is a HUGE factor. AMD had to collaborate with Apple on the TCON for their proprietary single-tile 5K P3 display. We don't know much more than that; for all we know, nVidia could have told them to buzz off.

AMD and nVidia have different business models. Neither AMD nor Apple is currently struggling, especially AMD with their console contracts. I mean come on, you are something like 5x more likely to see or use an AMD product than an nVidia one in your everyday routine; PlayStation and Xbox alone almost guarantee that.

What I don't understand is that Apple has been striving to be considered a premium computer manufacturer, and you would expect them to pick out the best hardware options for their customers to choose from. Especially in the high-priced BTO options, you should be able to pick the very best components; heck, they charge you through the nose for them. Right now I'm starting to feel like the donkey in Tom & Jerry, sitting with a classy, slick machine I know I paid tons of money for, and for everything beyond lightweight tasks it doesn't perform great. Apple seems to be more focused on the outside of these machines than the inside.

You just feel nVidia is more premium because they currently have the better GPU lineup out. And with the adoption of Vulkan those lines are being blurred more and more. The Fury X was AMD's answer to the (previous-generation) Titan X, and it held its own.

This is reminiscent of saying the Galaxy S (whatever) is better than the iPhone simply because they have different launch dates. Of course the newest thing from either competing company is going to be better.


I've never been extremely focused on the detailed specs of every component.
But currently the gap between Macs and what you can actually get for the big bucks, hardware-wise, has become too big. I ordered my own PC components the other day. The machine I'm building will be something like 4x faster than my maxed-out 5K iMac in pretty much every aspect, and it will cost me about the same as buying the next maxed-out BTO iMac 5K. So this is a better solution for me. Finally I will have a machine that renders quickly, with an awesome GPU, and I no longer need to be told by Apple which GPU I should buy.

List the parts you are using...

It's literally impossible to build a computer that is 4x faster than a maxed-out iMac at the same price. Even if you are doing something as weird as comparing it to last year's iMac, you still can't build a PC that is 4x faster for the same money. You'll exceed the entire price of the iMac just in the quest to find a CPU that is 4x faster. The iMac uses M.2 PCIe storage, so nothing there is 4x faster. Thunderbolt, USB, etc. are all similar. Memory can definitely be improved, but not 4x; maybe 2x. So really, when you say "pretty much every aspect," you mean "only the GPU"?

Plus, you are building a desktop tower, not an AIO, and the two shouldn't be compared so casually. And YOU are building it yourself. If I spec something equivalent to a maxed-out iMac with a GTX 1080 through another manufacturer, I come up with this (it's hard to find everything Apple offers in a BTO PC, especially a 6700K and M.2 storage, but Alienware has something similar):

i7-6700K, 32GB RAM, 1TB M.2 SSD, Windows 10 Home 64-bit, GTX 1080 Founders Edition, with an optical drive (I couldn't remove that, so knock off 50 bucks).

[Attached screenshot: Alienware configuration and price]


No monitor, no speakers, no mic, no camera, no Thunderbolt. (I'm ASSUMING it comes with a mouse and keyboard, although I'm not positive about that.)
 


You are right, the newer card from either company will always be better than the last. But I don't think AMD will release anything this year that can compete with nVidia. I think they have accepted that they lost that fight, the same way they lost against Intel, and instead they are competing on value for money. And that's not in line with what Apple has been doing in recent years; they sell expensive computers, and you should expect to get the best they can possibly build for that money... imo.

I exaggerated when I said 4x across the board, but it's not far from it.
These are my specs:
My iMac has 32GB of 1600MHz DDR3 RAM. My new computer will get 128GB of 2400MHz DDR4, so four times as much RAM, and faster RAM at that.
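The RAM jump can be quantified with the standard peak-bandwidth formula (transfer rate × 8 bytes per transfer × channels). A minimal sketch; the channel counts are assumptions (dual-channel in the iMac, quad-channel on an X99 board), not measured figures:

```python
# Peak theoretical memory bandwidth: MT/s x 8 bytes per transfer x channels.
# Channel counts are assumptions (dual-channel iMac, quad-channel X99 board).

def peak_bandwidth_gb_s(mt_per_s, channels, bus_bytes=8):
    """Peak bandwidth in GB/s for DDR memory with a 64-bit (8-byte) bus per channel."""
    return mt_per_s * bus_bytes * channels / 1000

ddr3_imac = peak_bandwidth_gb_s(1600, channels=2)  # 32GB DDR3-1600, dual channel
ddr4_new  = peak_bandwidth_gb_s(2400, channels=4)  # 128GB DDR4-2400, quad channel

print(f"iMac DDR3-1600, dual channel : {ddr3_imac:.1f} GB/s")   # 25.6 GB/s
print(f"New PC DDR4-2400, quad channel: {ddr4_new:.1f} GB/s")   # 76.8 GB/s
```

So on paper the new build has roughly 3x the memory bandwidth on top of 4x the capacity, though real-world gains depend heavily on the workload.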

I have a 4GHz i7 in my Mac (the top processor of the 2014 model). Now I will get the six-core Broadwell-E i7-6850K. So two more cores, faster memory writes, and for the 3D rendering I do (I use 3ds Max) something like four times the performance. So I'm going from a consumer to a prosumer processor. Xeons would be bad for me, since Adobe programs are what I mostly use.

An overclocked MSI GeForce GTX 1080 Gaming Z, which is definitely at least 4x the speed of my R9 M295X, with twice as much VRAM.

Additionally, my monitor will be a 32" Asus PA329Q. I have to say the iMac's 5K screen will be one of the things I'll miss, but at least this one covers 100% of the Adobe RGB space, is 32", and is supposed to be good, so I will be OK.

1TB of SSD (same as I have in the iMac, but iMacs have faster read/write speeds, so this is one of the few areas where I downgrade). But I can live with 550 MB/s; I probably won't even notice it much. That's what I don't get about Apple: they excel so much in certain areas like this, then leave stuff like the GPU so far behind.

Also, I will have a 4TB secondary hybrid HDD with NAND flash and an 8TB archive HDD. So no need for plugged-in external HDDs.

And everything will be almost silent thanks to great cooling, which is something else the iMac sucks at; it's too thin and has lousy ventilation for heavy workloads (which has actually annoyed me a lot).

But to each his own. I've bought iMacs for the last 8 years because I think they are beautiful machines with decent hardware. But I want to be able to push more pixels with my machine; I need a real workhorse, and this time around that is more important to me than a nice-looking computer and OS X. And the Mac Pro is just a useless design that looks uglier than a PC tower once everything is connected, with completely outdated hardware at way too high a price. So only an idiot would buy one unless they absolutely have to... which leaves me no choice.

PS: about price. I will build the machine myself. Yes, it ended up more expensive than the iMac. I paid about 5,000 USD for my iMac (they are more expensive in Norway than in the US) and my new computer cost 6,500 USD (including keyboard, mouse, speakers, webcam, and all that). But since I can sell my iMac and write off the new machine through my company, I'll pretty much break even.
 

That's a killer machine. If I were serious about gaming, or had any need for more graphical power at all, I'd build something similar.

My opinion is you went the best route for your needs, whether it's for the tasks you need to perform or even just a fun hobby (building PCs).

If someone needs or just wants a GTX 1080 today, what will they need or want tomorrow? Buying a new iMac every year is crazy (unless you can afford it and just don't give a ****). But with a PC you can swap out the graphics card and/or add another (two GTX 1080s in SLI) every year or two. Same with the CPU. Plus you can easily overclock the CPU, GPU, and its VRAM with little concern over temps (AIO liquid cooling is affordable and easy to install these days), and if anyone should know about temps, it's you with an M295X. I can cook my 775M (100W TDP) at 90-95°C playing games on my iMac.

This is why I think it's silly to entertain the idea of a 1080 in an iMac, as cool as that would be. I'd make the argument: why stop there? Put two 1080s in there in SLI; hell, Apple can figure it out, put a new desktop Titan X in there, or maybe two of them! Exactly as silly as that sounds to some people is how silly a single 1080 in there sounds to me.

I find it highly unlikely people are running into bottlenecks with their aging software, since the iMac is currently as fast as it's ever been. Which means you (not you specifically) shouldn't have been using an iMac in the first place, since it's never been fast enough for your needs or wants. Granted, I'm sure there are certain situations with certain software that's been upgraded and is showing slowdowns. But I'd venture a guess that 99% of the time that's under Windows, since an OS X dev is programming their apps to run well on current Macs, not programming for imaginary GTX 1080 iMacs. And if you need to run Windows and you want a GTX 1080, well, it's very clear a Mac isn't for you (again, not you specifically), since neither the hardware nor the software aligns with what you need. I mean, if that's not a sign, what is?

Don't get me wrong, I want the iMac to be as good as it can be. Personally I want a 6-core CPU, Intel specifically; that would help me tremendously with encoding times. But Apple makes a machine with a higher core count (the Mac Pro), and once updated it will also accommodate those who need more graphical power. But even if they switched back to nVidia, it wouldn't be the gaming GeForce cards; it would be the professional Quadro cards.
 


Yeah, I think this PC I ordered is the right setup for me. I think a lot of people (myself included) buy the wrong setup considering what they use the machine for. Even computer stores often make bad recommendations, not understanding what the buyer is going to use the machine for. The same way I have been stuck with an iMac, a lot of people waste money on Mac Pros without even knowing if it will do them any good. Many designers get one to work in Adobe programs, not even aware that most of Adobe's apps mainly use one core. I mean, just imagine what they pay; those are the actual people who should get the iMac, because Photoshop has kind of reached its peak of computing needs, and the higher clocks of Skylake would actually give them better performance.
I think it's important to think about which tasks you do the most. Like me: I do 3D and Unity, visual stuff in After Effects, artwork in Photoshop... and a bit of recreational gaming.
Skylake is good for gaming and general use (not the most demanding tasks; so the general consumer, which makes sense in an iMac). Broadwell-E is better if your main use of the computer is professional work in demanding software, but your main tools aren't dependent on lots of cores and threads. If I were using my machine for 3D and video rendering 80% of my day, I would definitely go for Xeon processors. But since 3D is about 20% of my day, and Photoshop, After Effects, etc. are where I spend most hours, I would actually get less performance with Xeons because of their low clock speeds, so the Broadwell-E in-betweener is a perfect fit for me. For the GPU it's the same: if you are a 3D artist using the regular CPU renderers, getting a cheaper GTX 10-series card is better than getting an expensive Quadro. The Quadro cards are good if you use GPU renderers like Octane, or are a CAD designer who needs some additional viewport effects. So for me the GeForce GTX 1080 is awesome, and then I can do a bit of gaming with great performance as well. To be honest, anyone getting a Quadro should think hard about what they actually need it for, because in many cases you lose performance and up the price. A Titan X or GTX 1080 would probably be a better option for a lot of users in many situations. ...rant over :D
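The clocks-versus-cores reasoning above can be sketched with Amdahl's law. The parallel fractions and clock speeds below are illustrative assumptions, not benchmarks of these exact chips:

```python
# Amdahl's-law sketch of the clocks-vs-cores trade-off described above.
# Parallel fractions and clock speeds are illustrative assumptions, not benchmarks.

def relative_speed(clock_ghz, cores, parallel_fraction):
    """Throughput relative to a 1 GHz single core, assuming the parallel part
    scales perfectly with cores and everything scales with clock speed."""
    serial = 1.0 - parallel_fraction
    time = (serial + parallel_fraction / cores) / clock_ghz
    return 1.0 / time

# Mostly-serial Photoshop/After Effects-style day: assume only 30% parallelizable.
skylake   = relative_speed(4.0, cores=4,  parallel_fraction=0.3)  # high clock, few cores
broadwell = relative_speed(3.6, cores=6,  parallel_fraction=0.3)  # middle ground
xeon      = relative_speed(2.4, cores=10, parallel_fraction=0.3)  # many slower cores

print(f"4.0 GHz 4-core : {skylake:.1f}x")    # fastest for this mix
print(f"3.6 GHz 6-core : {broadwell:.1f}x")
print(f"2.4 GHz 10-core: {xeon:.1f}x")       # slowest, despite the most cores

# For a 95%-parallel pure-render day the ranking shifts toward core count,
# and the high-clock quad-core drops to last place.
render = {name: relative_speed(ghz, n, 0.95)
          for name, (ghz, n) in
          {"skylake": (4.0, 4), "broadwell": (3.6, 6), "xeon": (2.4, 10)}.items()}
```

Under these assumed numbers the six-core chip sits in the sweet spot exactly as described: it never wins by much, but it never loses badly either.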

For me, being on the iMac, like you say, shouldn't have happened in the first place. I'm guessing the reason I got iMacs in the first place was the all-in-one form factor, etc. Such a beautiful machine, so I kind of always wished it would fit my needs. But I've always been annoyed by the lack of speed compared to what I could get elsewhere. And yeah, upgrading the iMac on every update cycle to get slightly better performance is just crazy, when I can now just replace the GPU or CPU or whatever as time goes by. So a cheaper solution in the long run, with a better performance result.

PS: and yeah, you have a point about the GTX 1080 in an iMac. It might not be a good fit, and there's no point in Apple starting the GPU race on a machine that's not primarily meant for gaming. The only reason I think they should consider it is that if a GTX 1080M could work in an iMac, Apple should let people have the option. When you buy a high-priced, premium-built machine like that, give the user the option to get the best available parts that can fit into it. But I assume Apple has reasons for its decisions. And even if people want to game on their iMacs, Apple probably has different ideas about what the machine is for.
 
iMcLovin, have you considered an SSD like the Samsung 950 Pro M.2? Because now, with M.2 slots, you can reach (and go beyond) the speed of the Macs' SSDs (2200MB/s read, 1500MB/s write).
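Those throughput figures translate directly into load times. A quick back-of-the-envelope sketch; the 50GB project size is a hypothetical round number, and latency and caching are ignored:

```python
# Rough sequential-transfer-time comparison for the SSD speeds quoted above.
# The 50GB project size is an illustrative round number, not a benchmark.

def seconds_to_read(size_gb, throughput_mb_s):
    """Sequential read time for a file, ignoring latency and caching."""
    return size_gb * 1000 / throughput_mb_s

project_gb = 50  # e.g. a folder of raw video footage

sata_ssd = seconds_to_read(project_gb, 550)   # typical SATA III SSD ceiling
nvme_m2  = seconds_to_read(project_gb, 2200)  # Samsung 950 Pro-class NVMe read

print(f"SATA SSD (550 MB/s) : {sata_ssd:.0f} s")   # ~91 s
print(f"NVMe M.2 (2200 MB/s): {nvme_m2:.0f} s")    # ~23 s
```

So for large sequential loads the NVMe drive is roughly 4x quicker on paper; small random reads gain far less.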

Btw, in my case I was toying with the idea of keeping the late 2014 iMac for productivity and building a gaming PC to connect to the 4K TV. With every day that goes by and every rumor I read, my hope for an nVidia iMac vanishes a little more. Fortunately for me, since I mostly do photo editing on the iMac, my actual, real need for GPU power is only gaming. We will see next month.
 

Oh wow, no, I didn't know those existed. Geez, that's almost twice as fast as the iMac SSD. Do they exist in desktop versions?
 
Sure, nowadays all the mid- to high-end motherboards have an M.2 x4 slot, so you should check your board and, if it has one, swap the SATA disk for an M.2 ;)
 

Oh well, I already bought the SSD, so it will have to be a future upgrade. But I'll definitely check that out.
 

Wouldn't it be kind of funny, now that you've bought a PC, if the next iMac has a 1080M? Lol
 
I shake my head at the fanboys making excuses for Apple. The iMac is a VERY expensive machine, and it should use the best hardware available - especially integral parts like the GPU. Using exclusively AMD's inferior mobile GPUs is simply Apple being lazy and cheap, period.

As far as longevity of an AIO/iMac goes, I've had my 680MX/i7/SSD iMac for 3.5 years, and it still works fine for this casual gamer. The only game I had to seriously compromise on was Doom 4, which had to be played at 1080p. It still looked good, though. I don't need or want to buy a new GPU every year; if I did, I'd get a PC. I'd buy a new iMac with an nVidia 1080M in a heartbeat, but another subpar iMac update with a slow AMD GPU? No thanks!
 

But then it would be an even more expensive machine, since Apple is not going to sacrifice their margins (just as the component makers don't; hence why their best GPU cards often run nearly $1,000).

People, with some justification, carp about what a "loaded" 5K iMac costs now, but throw in a 6-, 8-, or even 10-core Broadwell-E i7 and an 8GB nVidia 1080 and we'd be talking $5,000-6,000. That's deep into Mac Pro pricing territory, and probably a couple grand over what a top-end Windows gaming workstation with a third-party 27" 5K display would run.

Now, such a machine might be a real "halo" model for Apple, but how many would they sell? And even if they only custom-made them (buying the parts as needed, which would drive up the build cost and, by extension, the sale price), what kind of sacrifices would they have to make to the rest of the line in order to support such top-end components? And would the general market for the iMac consider those sacrifices worth it?
 

Which other AIO on the market would you pick over an iMac? Or are you doing that weird thing of comparing it to a DIY custom-built PC?

Haven't most benchmarks shown M395X > M295X > 780M > 680MX in nearly all productivity tasks? How is that subpar and slow when your point of reference is the slowest of the bunch?

M295X 5K iMac users typically run Doom at Ultra settings, 1200p, under OpenGL, btw. And of course AMD is seeing much larger gains than nVidia with Vulkan.

I'm not making excuses for Apple, nor am I a fanboy. Matter of fact, you have it quite the opposite: you are trying to justify nVidia in an iMac. Put it this way: when your iMac was brand new, it was complete and utter garbage for gaming, just trash compared to the PCs I was building at the time. It was a joke. But NOW the iMac needs a 1080, which would thermal-throttle and have junky drivers on an outdated OpenGL and/or unused Metal system? #applefanboy#1
 
Basically, the AMD chips were surplus from PS4 production; Apple saw an opportunity for some cheap chips, dropped nVidia, and went AMD.
My current Mac is a 2012 with a 680MX GPU, and it is/was the best of the recent machines. I won't bother upgrading unless Apple goes with a better GPU than the current AMD crop, preferably nVidia.

Same for me. The nVidia 680MX in the 2012 model was the thing that made me scrap my PC and go back to Mac for the first time since my old Mac SE.
 

I'm curious, why? I reread my previous post and realize I came off like an ass; didn't mean to. But going to a Mac from a PC for the GPU seems like an odd choice, ever... on any iMac.

In 2012, if you just stuck with nVidia options, the 660, 660 Ti, 670, 680, and 690 were all better options in a PC than the 680MX in an iMac. Even an OC'd 650 Ti could hold its own, and then you have SLI options. And of course you had the better previous-gen desktop options available, the 580 and higher. There was also a plethora of desktop AMD options that were more powerful (the 78xx and higher, I believe).

It's interesting: on PC forums we were laughing at the 680MX back then. And I can understand why it's liked here so much; it's not a bad GPU. But then those same people hate AMD so much because it's not the best thing available, as if the 680MX was when it was first put in an iMac? Granted, it was probably the best mobile option, but then my question still stands: which AIO does everyone prefer over an iMac with a better GPU than the rumored RX 4xx?

You've got to at least understand my point even if you don't agree. Lol.

Edit: changed 560 ti to 580
 

Hehe, yeah, that would have been a bit funny :D But then again, I'm glad I made the switch. If Apple had released that before I ordered my PC, I probably would have made another iMac 5K purchase, which, like we discussed, would have been stupid. My new machine is a pure production machine with a heavy focus on GPU- and CPU-intensive tasks, something I really need as I work in the games and VFX industry.

I do own a bunch of other Macs and iPads though (Mac mini with SSD as my entertainment system, MacBook Retina, iPad Pro, Air 2, Air 1, etc.), so it's not like I'm leaving Apple, but the iMac will no longer be my main work machine.

But I hope, for the sake of new iMac buyers, they will still offer options for top-end GPUs that can handle the tight space. That would prove that Apple is shifting its focus a bit, something I think they should do for the future of Macs. VR, games, and AR are becoming increasingly important in people's lives, and Apple should acknowledge that. So it would be nice if OS X and Macs in general became more than primarily productivity machines.
 
Seeing reports of the full desktop RX 470 chipset (120 W) in laptops, e.g. the Dell Alienware. Who knows, we might get a version of this in our iMacs. I don't expect anything heavier. My next machine will be a Hackintosh.
I'm done with the high prices and zero upgradeability. iMacs are doomed if you hope to be able to use VR in the near future.
 
I'm curious, why? I reread my previous post and realize I came off like an ass; I didn't mean to. But going from a PC to a Mac for the GPU seems like an odd choice, ever, on any iMac.

In 2012, if you just stuck with nVidia options, the 660, 660 Ti, 670, 680, and 690 were better options in a PC than a 680MX in an iMac. Even an OC'd 650 Ti could hold its own, and then you have SLI options. And of course you had the better previous-gen desktop options available, the 580 and higher. There were also a plethora of desktop AMD options that were more powerful (78xx and higher, I believe).

It's interesting: on PC forums we were laughing at the 680MX back then. And I can understand why it's liked here so much; it's not a bad GPU. But then those same people hate AMD so much because it's not the best thing available, as if the 680MX was when it was first put in an iMac? Granted, it was probably the best mobile option, but then my question still stands: which AIO does everyone prefer over an iMac with a GPU better than the rumored RX 4xx?

You've got to at least understand my point even if you don't agree. Lol.

Edit: changed 560 ti to 580

I disagree. I overclock the 680MX at 250/375 (at 100% stability, with max temps in the low 80s Celsius), which gives me about the same 3DMark score as the desktop GTX 680 at default clocks. Yes, you can overclock the GTX 680 too, and then it will move up again. The 680MX is a downclocked desktop GTX 680, which is better than a GTX 650/660 (except for the better cooling options in a PC). Again, a casual gamer doesn't want/need 100+ framerates for competing in twitch-based FPS, or 8xAA or whatever.

Also, the argument that an iMac would suddenly be crazy expensive just because one swapped the top AMD mobile GPU for the nVidia one is bogus.
 
RX470 desktop chip like in the just announced Dell Alienware laptop
And it has a max of 120 W? Can you provide a link to the new Alienware? I thought Alienware went with nVidia chips.

I read that the M395X is 12% better than the M295X, and the RX 470 is 22% better than the M395X, so probably only those with an M295X will see a real benefit?
 
I read that the M395X is 12% better than the M295X, and the RX 470 is 22% better than the M395X, so probably only those with an M295X will see a real benefit?

According to the Barefeats results, the M395X was about 8% better than the M295X at 2560x1440 in the games they tested.

The RX 460 with 2 GB for the base and the RX 470 with 4 GB as the BTO option seem pretty logical for what we'll see offered.
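Worth noting for the percentages being thrown around: chained gains like "12% better than X" and "22% better than that" compound multiplicatively, not additively. A quick sketch using the rough figures quoted above (forum numbers, not verified benchmarks):

```python
# Chained GPU performance ratios multiply; the figures are the rough
# numbers quoted in the thread, not verified benchmark results.
m395x_over_m295x = 1.12   # claimed: M395X ~12% faster than M295X
rx470_over_m395x = 1.22   # claimed: RX 470 ~22% faster than M395X

rx470_over_m295x = m395x_over_m295x * rx470_over_m395x
print(f"RX 470 vs M295X: ~{(rx470_over_m295x - 1) * 100:.0f}% faster")
# → RX 470 vs M295X: ~37% faster (not 12 + 22 = 34%)
```

Using the Barefeats ~8% figure instead of 12% gives 1.08 × 1.22 ≈ 1.32, i.e. roughly 32%, so M295X owners would see the biggest jump either way.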
 
According to the Barefeats results, the M395X was about 8% better than the M295X at 2560x1440 in the games they tested.

The RX 460 with 2 GB for the base and the RX 470 with 4 GB as the BTO option seem pretty logical for what we'll see offered.
Considering that within the same TDP they could fit a 1070 that performs almost twice as fast, isn't it a shame? :)
 
I disagree. I overclock the 680MX at 250/375 (at 100% stability, with max temps in the low 80s Celsius), which gives me about the same 3DMark score as the desktop GTX 680 at default clocks. Yes, you can overclock the GTX 680 too, and then it will move up again. The 680MX is a downclocked desktop GTX 680, which is better than a GTX 650/660 (except for the better cooling options in a PC). Again, a casual gamer doesn't want/need 100+ framerates for competing in twitch-based FPS, or 8xAA or whatever.

Also, the argument that an iMac would suddenly be crazy expensive just because one swapped the top AMD mobile GPU for the nVidia one is bogus.

I don't think so. If the current top AMD GPU in card form is 200-240 USD and the current top (well, technically second-to-top) nVidia GPU in card form is 650-700 USD, we would see that difference reflected in some respect in an iMac. And since businesses typically operate with a percentage-based profit margin, that difference would likely grow.

Crazy expensive is subjective, of course. A 400-600+ USD difference might not mean much to you, but I would consider that price jump in a BTO iMac to be at least "significantly" more expensive.

This also shows that Apple is following a business model of some sort. A cheaper iMac = less profit to some degree. It's hard to say what sort of analysis they have going on, but I'm sure that if they knew they could charge 400-600+ USD more for an iMac, maintain their profit margins, and people would still buy it, then they would do it. But for some reason they aren't.
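The markup point can be sketched numerically. All figures below are illustrative: the card prices are midpoints of the ranges quoted above, and the 40% markup is a purely hypothetical placeholder, not Apple's actual margin.

```python
# Sketch: a fixed percentage markup amplifies a component cost delta.
# Card prices are midpoints of the ranges quoted in the thread; the
# markup is a hypothetical placeholder, not Apple's real margin.
amd_card = 220       # USD, midpoint of the quoted 200-240 range
nvidia_card = 675    # USD, midpoint of the quoted 650-700 range
markup = 0.40        # hypothetical 40% markup on component cost

cost_delta = nvidia_card - amd_card          # delta at cost
retail_delta = cost_delta * (1 + markup)     # what the buyer would see
print(f"component delta: ${cost_delta}, BTO price delta: ~${retail_delta:.0f}")
# → component delta: $455, BTO price delta: ~$637
```

Which lands squarely in the "400-600+ USD" range mentioned above: the markup turns a ~$455 component gap into a noticeably larger sticker gap.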

I wouldn't be surprised to see mostly integrated graphics in the not-so-distant future. For the time being I'm a very content Apple user, but that's today; if I feel their products stop offering the experience I want, I'll move on to other systems. I'm starting to see this with the ATV: while I like it a lot, "the future of TV" is UHD and HDR, and devices coming out now can actually leverage those techs. One day I may feel the same about the iMac, and I'll likely end up building a PC again.

However, I'm going to bow out of this discussion. It's pointless and silly for me to argue with your opinion on this, since it is yours and inherently not incorrect. Meanwhile, it's pointless and silly for you, since I think we both know where Apple is going with the iMac. I also apologize for my post above that came off childish.
 