Depends on one's definition of "user serviceable." Are the parts accessible for the end user to make repairs/upgrades? Yes. Does Apple want you touching anything other than the RAM or SSD? No.

And now I feel dumber for contributing to the dumbest discussion on the internet.
 
Likewise, where in the classic Mac Pro manual does it describe how to swap a CPU, or a power supply unit?
I don't think it does, so are we to consider those parts not user serviceable? Or are they in fact user serviceable?

Glad you brought up that Apple doesn't consider the power supply or CPU user serviceable either!
 
True, but in a properly designed computer both can be replaced without very much fuss.

Usually only custom builds have user-replaceable power supplies. Most Dells and HPs I see have tinkered with the power supply form factor enough that a standard ATX one isn't going to fit. And both Dell and HP (and Apple) don't usually make their power supplies available.

The reason the power supply in a Mac Pro is accessible but not considered user serviceable is that (much like the GPUs in the Mac Pro) Apple wants to make it easy for an AppleCare technician to do a replacement, but not the user. The same is really true of Dell and HP. They make it easy enough for their technicians to repair, but they're not usually going to send the part on its own to the user. The only difference between Apple and HP is that HP will send someone to you to get your machine fixed, while Apple makes you come to them.

Apple is using security Torx and makes the GPU swappable because they want the Genius Bar to be able to do it instead of tossing the whole machine when a GPU dies. But they're using things like security Torx to make it just hard enough that end users will be discouraged from doing it.
 
AMD is a sinking ship.

The question is would Apple be interested in purchasing them? Given their product line and the high likelihood that we will see AMD under new management in the near future...
 
^ goMac pretty much killed it as far as I'm concerned. Nothing more needs to be said about screws.

cMP PCIe cards are considered user-serviceable by Apple (hence the detailed instructions in the manual).
The nMP manual clearly tells you not to attempt to service anything but the RAM and SSD.

Hence, we got Apple-sanctioned video card upgrades for cMP. Not so likely for nMP.
IF such a thing ever happens, they'll tell you to get the upgrade done at an Apple authorized service shop.
 
Screw the Torx. Let's get back to the issue that gave rise to this thread. Please.

AMD is a sinking ship.

The question is would Apple be interested in purchasing them? Given their product line and the high likelihood that we will see AMD under new management in the near future...

I can only hope that team green continues to support Mac OS with driver releases. If Apple buys AMD, things could get really weird really fast in the GPU arena.
 
IF such a thing ever happens, they'll tell you to get the upgrade done at an Apple authorized service shop.

This. If not for the screws, then the thermal paste.

I think it is a bad sign that Apple has gone with the cheapest/most desperate provider for GPUs whilst shrinking their machines. AMD chips run hotter and slower, using more power. They are the OPPOSITE of what Macs need.

I haven't looked recently, but I know a while ago AMD was beating Nvidia in the FCPX benchmarks, which is pretty much The Only Thing Apple Will Care About.
 
Don't think AMD will be going anywhere despite the recent troubles; my guess would be a buyout/rescue if the crap hit the fan. It would cause chaos in huge markets that depend on them (Apple, Sony, Microsoft, Nintendo) if AMD just went *poof* and suddenly died.

What I think is most mysterious is the Nvidia job posting on LinkedIn mentioning working with Apple on future products. Whether that's them just keeping the heat on in the winter or what, maybe they've got secret plans no one can talk about.
 
This. If not for the screws, then the thermal paste.



I haven't looked recently, but I know a while ago AMD was beating Nvidia in the FCPX benchmarks, which is pretty much The Only Thing Apple Will Care About.

why wouldn't they care how other pro apps fare on their hardware? Marketing this thing strictly as a glorified Final Cut/Logic tube doesn't seem like it would sustain interest in the product line.
 
AMD is a sinking ship.

The question is would Apple be interested in purchasing them? Given their product line and the high likelihood that we will see AMD under new management in the near future...

I hope not, and I hope the government would stop them if they tried. AMD does need help, and they need Zen to be a badass, but they/we don't need the Apple kind of help.
 
why wouldn't they care how other pro apps fare on their hardware? Marketing this thing strictly as a glorified Final Cut/Logic tube doesn't seem like it would sustain interest in the product line.

I don't think Apple sees third party apps as unimportant. They just think third party apps should do the same things FCPX does: adopt OpenCL, support multiple cards. Apple doesn't care about applications that use non-Apple technology.

(Read that as: Apple has never cared, does not care, and will never ever care about CUDA.)
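
Just to make "adopt OpenCL, support multiple cards" concrete, here's a minimal sketch using nothing but the standard Khronos OpenCL host calls (not Apple's code, just the generic API): it enumerates every GPU the runtime exposes, which is the first step an app has to take before it can split work across both of the nMP's FirePros.

Code:
/* Minimal OpenCL device enumeration: print every GPU the first platform exposes.
   Build: cc list_gpus.c -framework OpenCL (on OS X) or -lOpenCL (elsewhere). */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>   /* OS X ships OpenCL as a framework */
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platform;
    cl_device_id gpus[8];
    cl_uint count = 0;
    char name[256];

    /* Grab the first platform (on OS X there's only Apple's). */
    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS)
        return 1;

    /* Ask for GPU devices only; a dual-FirePro nMP should report two. */
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, gpus, &count) != CL_SUCCESS)
        return 1;

    for (cl_uint i = 0; i < count; i++) {
        clGetDeviceInfo(gpus[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        printf("GPU %u: %s\n", i, name);
    }
    return 0;
}

An app that actually wants both cards then creates a context/queue per device and divides the work itself, which is what FCPX does; a CUDA-only app can't touch the AMD cards at all, which is exactly the problem.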
 
It would be much better if Apple had used Bonaire rather than Cape Verde as the cheapest part.

It has 896 shaders vs. 640 in Verde, it has OpenCL 2.0 support, and it's GCN 1.1, so DirectX 12 feature level 12_0.
 
We can stop arguing about it here; I started a poll.

https://forums.macrumors.com/threads/to-the-end-of-the-screwgate-scandal.1929958/#post-22107463

Back to regularly scheduled programming.

I think it is a bad sign that Apple has gone with the cheapest/most desperate provider for GPUs whilst shrinking their machines. AMD chips run hotter and slower, using more power. They are the OPPOSITE of what Macs need.
Excuse me, what? The Fury X costs the same as a 980 Ti and is consistently MUCH quieter. Don't bring up the fact that the Ti can be water-cooled as well, because then it becomes more expensive than the X. Besides, they trade blows, and with the X you get the benefit of a much cooler and quieter card.

Just wait till Arctic Islands. Nvidia has another thing coming...
 
Excuse me, what? The Fury X costs the same as a 980 Ti and is consistently MUCH quieter. Don't bring up the fact that the Ti can be water-cooled as well, because then it becomes more expensive than the X. Besides, they trade blows, and with the X you get the benefit of a much cooler and quieter card.

Just wait till Arctic Islands. Nvidia has another thing coming...

Please show me any review where the 980 Ti and Fury "trade blows".

Other than the fanboy sites, nobody is raving about Fury.
 
Please show me any review where the 980 Ti and Fury "trade blows".

Other than the fanboy sites, nobody is raving about Fury.
Oh, and let's not forget that the 900 series doesn't support asynchronous compute the way the GCN architecture does, which, I'd like to remind you, is coming in DX12. Once drivers are fully optimised, expect to see the 980 Ti getting beaten up pretty well.
 
http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/15

If you actually want to understand how good the X is, read that.


Maybe you should. From the page you linked:

"With Shadow of Mordor things finally start looking up for AMD, as the R9 Fury X scores its first win. Okay, it’s more of a tie than a win, but it’s farther than the R9 Fury X has made it so far."

I have a Fury and a 980Ti. No real comparison.

UPDATE: And more "wonders of asynchronous compute"? Maybe you and coyote should tag team the posting of AMD PR drivel a little better.
 
Maybe you should. From the page you linked:

"With Shadow of Mordor things finally start looking up for AMD, as the R9 Fury X scores its first win. Okay, it’s more of a tie than a win, but it’s farther than the R9 Fury X has made it so far."

I have a Fury and a 980Ti. No real comparison.

UPDATE: And more "wonders of asynchronous compute"? Maybe you and coyote should tag team the posting of AMD PR drivel a little better.
37.9 vs 40.9 fps with the Ti winning in Crysis 3, and that's not trading blows???
It does better in Far Cry 4 as well, even beating out the Titan X, and it's the same story for The Talos Principle. It gets just 2 fps less than the Ti at 4K in Civilization: Beyond Earth, etc... If you read the AT review you can clearly see it trading blows. You just seem to want to ignore that for some reason. And you still haven't replied to me regarding the Fury's acoustics and heat output.
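(For reference, that Crysis 3 gap works out to (40.9 - 37.9) / 40.9 ≈ 7.3%, i.e. the Ti leads by a bit over 7% in that title.)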
 
37.9 vs 40.9 fps with the Ti winning in Crysis 3, and that's not trading blows???
It does better in Far Cry 4 as well, even beating out the Titan X, and it's the same story for The Talos Principle. It gets just 2 fps less than the Ti at 4K in Civilization: Beyond Earth, etc... If you read the AT review you can clearly see it trading blows. You just seem to want to ignore that for some reason. And you still haven't replied to me regarding the Fury's acoustics and heat output.


I'm going to give you the benefit of the doubt and guess that your enthusiasm won out over your drive to actually read the review.

Here are some sobering points from the conclusion: (you may wish to sit down first)

Had this card launched against the GTX Titan X a couple of months ago, where we would be today is talking about how AMD doesn’t quite dethrone the NVIDIA flagship, but instead how they serve as a massive spoiler, delivering so much of GTX Titan X’s performance for a fraction of the cost. But, unfortunately for AMD, this is not what has happened. The competition for the R9 Fury X is not an overpriced GTX Titan X, but a well-priced GTX 980 Ti, which to add insult to injury launched first, even though it was in all likelihood NVIDIA’s reaction to R9 Fury X.

The problem for AMD is that the R9 Fury X is only 90% of the way there, and without a price spoiler effect the R9 Fury X doesn’t go quite far enough. At 4K it trails the GTX 980 Ti by 4%, which is to say that AMD could not manage a strict tie or to take the lead. To be fair to AMD, a 4% difference in absolute terms is unlikely to matter in the long run, and for most practical purposes the R9 Fury X is a viable alternative to the GTX 980 Ti at 4K. None the less it does technically trail the GTX 980 Ti here, and that’s not the only issue that dogs such a capable card.

At 2560x1440 the card loses its status as a viable alternative. AMD’s performance deficit is over 10% at this point, and as we’ve seen in a couple of our games, AMD is hitting some very real CPU bottlenecking even on our high-end system.
 
Perhaps a brandy before the final paragraph:

Once you get to a straight-up comparison, the problem AMD faces is that the GTX 980 Ti is the safer bet. On average it performs better at every resolution, it has more VRAM, it consumes a bit less power, and NVIDIA’s drivers are lean enough that we aren’t seeing CPU bottlenecking that would impact owners of 144Hz displays. To that end the R9 Fury X is by no means a bad card – in fact it’s quite a good card – but NVIDIA struck first and struck with a slightly better card, and this is the situation AMD must face. At the end of the day one could do just fine with the R9 Fury X, it’s just not what I believe to be the best card at $649.
 
I'm going to give you the benefit of the doubt and guess that your enthusiasm won out over your drive to actually read the review.

Here are some sobering points from the conclusion: (you may wish to sit down first)

Had this card launched against the GTX Titan X a couple of months ago, where we would be today is talking about how AMD doesn’t quite dethrone the NVIDIA flagship, but instead how they serve as a massive spoiler, delivering so much of GTX Titan X’s performance for a fraction of the cost. But, unfortunately for AMD, this is not what has happened. The competition for the R9 Fury X is not an overpriced GTX Titan X, but a well-priced GTX 980 Ti, which to add insult to injury launched first, even though it was in all likelihood NVIDIA’s reaction to R9 Fury X.

The problem for AMD is that the R9 Fury X is only 90% of the way there, and without a price spoiler effect the R9 Fury X doesn’t go quite far enough. At 4K it trails the GTX 980 Ti by 4%, which is to say that AMD could not manage a strict tie or to take the lead. To be fair to AMD, a 4% difference in absolute terms is unlikely to matter in the long run, and for most practical purposes the R9 Fury X is a viable alternative to the GTX 980 Ti at 4K. None the less it does technically trail the GTX 980 Ti here, and that’s not the only issue that dogs such a capable card.

At 2560x1440 the card loses its status as a viable alternative. AMD’s performance deficit is over 10% at this point, and as we’ve seen in a couple of our games, AMD is hitting some very real CPU bottlenecking even on our high-end system.
That point about the competition not being the overpriced Titan X is very valid. I guess I kinda got carried away, since seeing a flagship card from AMD was great, because as consumers we need AMD. We really do. Without them, Intel and Nvidia can and will totally try to screw us over.

Here's hoping Arctic Islands gets to market first and Zen reaches Skylake performance. 'Cause we need the competition, and it looks like AMD is gonna bring it.
 