I have already proposed something: compare Final Cut Pro X performance between AMD and Nvidia Maxwell GPUs. You will see why Apple went with AMD GPUs as its go-to solution.

People don't understand that one of the reasons Apple ditched Nvidia is that Apple does not want application performance to be optimized through drivers.

Actually, that is quite perceptive. In many ways it makes sense. Apple does something similar with Metal, giving the application direct access to the GPU.
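For anyone curious what "direct access" means in practice, here is a minimal sketch of a Metal compute dispatch in Swift: the application itself builds the pipeline and submits work to the GPU, with no driver-level, per-application tuning layer in between. The kernel name "scaleValues" is a made-up example, not anything Apple ships.

[CODE]
import Metal

// Minimal Metal compute dispatch: the application talks to the GPU directly.
// "scaleValues" is a hypothetical kernel compiled into the app's default library.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),
      let kernel = library.makeFunction(name: "scaleValues") else {
    fatalError("Metal setup failed")
}
let pipeline = try! device.makeComputePipelineState(function: kernel)

// The app allocates and owns the buffer the kernel will work on.
var input = [Float](repeating: 1.0, count: 1024)
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)

// The app, not the driver, decides how the work is split across the GPU.
let threadsPerGroup = MTLSize(width: 64, height: 1, depth: 1)
let groupCount = MTLSize(width: input.count / 64, height: 1, depth: 1)
encoder.dispatchThreadgroups(groupCount, threadsPerThreadgroup: threadsPerGroup)
encoder.endEncoding()

commandBuffer.commit()
commandBuffer.waitUntilCompleted()
[/CODE]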
 
  • Like
Reactions: mrxak
I was very close to buying my nMP back in Feb of this year, but I decided to wait and I have been waiting for a long time now. And idk if I can wait till March next year; I really need a new rig soon.
 
Final Cut Pro X is not optimized for specific hardware. It is optimized for OpenCL, and hardware that is better at OpenCL will always win for Apple.

@d00mz I think it all depends on what your needs are, what your current machine is, and whether you can get by with, let's say, a refurbished Mac Pro for some time.

Refurb http://www.apple.com/shop/browse/home/specialdeals/mac/mac_pro then sell it after some time and maybe buy another refurb (or a new one) a few months later?
 
...
Would you also sue Intel because you can't keep all cores working at full turbo speed? If you don't know your stuff, it might not be readily evident that full turbo applies only when few cores are at work.
...
What do you mean by this statement?

I just ordered several servers with dual E5-2699v3 processors (18/36 cores/threads, 2.3 GHz base, 3.6 GHz turbo) - and I'll be loving that turbo boost with high core count processors.
 
I find it funny they let AMD get away with what made them ditch Nvidia for a while (see: the 8600M GT overheating scandal that affected multiple manufacturers). Plus, going with AMD over Nvidia seems to rub against every Apple marketing philosophy we've ever seen (20% thinner!! 15% lighter!! 30% more power-efficient!! 40% better performance than last year's model!!). Maxwell seems made for the ideal Apple machine, yet they refuse it.

So it could be an OpenCL vs CUDA war, as many have said. Or what about the fact that AMD is in bad financial shape and is frequently the lowest bidder? No coincidence they won contracts with 3 out of 3 console makers this generation. Some bookie at Apple must have calculated that the ROI on repairing cooked iMacs vs. selling computers with cheap AMD parts at a premium is still in the green when the dust settles. Either way, they stick with AMD to the detriment of their product quality.
 
  • Like
Reactions: MacVidCards
What do you mean by this statement?

I just ordered several servers with dual E5-2699v3 processors (18/36 cores/threads, 2.3 GHz base, 3.6 GHz turbo) - and I'll be loving that turbo boost with high core count processors.
His response was in regard to the "sue Apple for wrongful claims" idea being discussed. What he means is that Turbo Boost is variable depending on thermals and the number of cores in use.

As you know, there are times the E5 you mentioned will be operating at 2.3GHz, times when cores will be at 3.6GHz, and everything in between, depending on thermals and the number of cores in use. I.e., there's no way Intel can guarantee what speed the CPU will be running at at any given moment.

Therefore, by pl595's logic, you could sue HP for false claims, since it won't ALWAYS operate at 3.6GHz turbo even when only two cores are in use, won't ALWAYS operate at 3.4GHz with 3 cores in use, etc. I'm not so sure MG's logic is apples to apples in this instance, but this sub-topic about suing Apple is ridiculous to begin with, so who cares. :rolleyes:
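To make the point concrete, here is a toy model in Swift (the bin numbers are made up for illustration, not Intel's published tables): the achievable clock depends on how many cores are active and whether there is thermal headroom, which is exactly why no single figure can be guaranteed.

[CODE]
// Toy Turbo Boost model: the advertised peak applies only at low active-core
// counts, and losing thermal headroom pulls the clock back toward base.
// The bin values below are illustrative, not Intel's actual specifications.
struct ToyXeon {
    let baseGHz: Double
    let turboBinsGHz: [Double]   // max turbo for 1, 2, 3, 4, 5+ active cores

    func expectedGHz(activeCores: Int, thermalHeadroom: Bool) -> Double {
        let bin = min(max(activeCores, 1), turboBinsGHz.count) - 1
        return thermalHeadroom ? turboBinsGHz[bin] : baseGHz
    }
}

let cpu = ToyXeon(baseGHz: 2.3, turboBinsGHz: [3.6, 3.6, 3.4, 3.1, 2.8])
print(cpu.expectedGHz(activeCores: 2,  thermalHeadroom: true))   // 3.6
print(cpu.expectedGHz(activeCores: 18, thermalHeadroom: true))   // 2.8
print(cpu.expectedGHz(activeCores: 18, thermalHeadroom: false))  // 2.3
[/CODE]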
 
  • Like
Reactions: mrxak
His response was in regard to the "sue Apple for wrongful claims" idea being discussed. What he means is that Turbo Boost is variable depending on thermals and the number of cores in use.
My bad.

Intel classes processors as "low core count", "medium core count" and "high core count" based on the chip layout. I read "low core count" as that architectural classification, when in fact "high core count" CPUs definitely have turbo as well.
 
Given Apple's way of introducing updates to their Mac Pro line, they'll probably announce the next revision at WWDC in October next year, which would be the longest break Apple has ever taken from updating their pro lineup. And given that this would be the first branch off a complete overhaul in today's fast-moving computer market, it gives off a whiff of paralysis.

This makes me quite sad, because if I'm forced to buy a new computer before that happens, it will be a used model from 2012. Thanks, Apple, for putting the least amount of effort into your Mac Pro line so that I have to resort to older models...
 
Given Apple's way of introducing updates to their Mac Pro line, they'll probably announce the next revision at WWDC in October next year, which would be the longest break Apple has ever taken from updating their pro lineup. And given that this would be the first branch off a complete overhaul in today's fast-moving computer market, it gives off a whiff of paralysis.

This makes me quite sad, because if I'm forced to buy a new computer before that happens, it will be a used model from 2012. Thanks, Apple, for putting the least amount of effort into your Mac Pro line so that I have to resort to older models...

I'm sorry, but can you please explain this one further? "Given Apple's way of introducing updates to their Mac Pro line"
 
WWDC is in June, not October.

Besides, past updates have happened in January, March, April, June, July, August, and December. If we can be sure of one thing, it's that updates to the Mac Pro have been entirely unpredictable.

But October is the month stuff announced at WWDC usually ships ;) If not later in the Mac Pro's case, so October is an early estimate.

Well, WWDC has been the most frequent, and after more than 2 years of no word from Apple right after a complete overhaul, it is very unlikely that they will just sneak it in.
 
But October is the month stuff announced at WWDC usually ships

In some cases, yes. But as I pointed out, past MP updates have been released in 7 different months spanning the entire calendar year. WWDC (and the month of October) seem to have little to do with any of it.
 
I find it funny they let AMD get away with what made them ditch Nvidia for a while (see: the 8600M GT overheating scandal that affected multiple manufacturers). Plus, going with AMD over Nvidia seems to rub against every Apple marketing philosophy we've ever seen (20% thinner!! 15% lighter!! 30% more power-efficient!! 40% better performance than last year's model!!). Maxwell seems made for the ideal Apple machine, yet they refuse it.

So it could be an OpenCL vs CUDA war, as many have said. Or what about the fact that AMD is in bad financial shape and is frequently the lowest bidder? No coincidence they won contracts with 3 out of 3 console makers this generation. Some bookie at Apple must have calculated that the ROI on repairing cooked iMacs vs. selling computers with cheap AMD parts at a premium is still in the green when the dust settles. Either way, they stick with AMD to the detriment of their product quality.

This is all so obviously true, yet rarely heard. Apple isn't even going to have to use their usual "forced obsolescence through driver non-support" route. The iMac 5Ks will reduce their numbers quickly through self-immolation. Going to be some unhappy folks when they're faced with the choice of a new logic board out of warranty or chucking the lovely 5K panel. Then again, the brown spot that will likely form over the AMD space heater might make the decision easier.

Good time to push the Mac Pro, as they will be rather sour on the all-in-one idea by then.

I'm going to predict a 7,1 in the next 6 months. With Tahiti turning 4 years old in a couple of weeks, they won't have much choice. (Good excuse for cake & ice cream.) I can't imagine AMD still makes them, unless they have a "retro" division.
 
This is off-topic (but hey, nothing in the last 30 pages has been on topic, so what the heck) but there is something I don't understand:

The assertion that Apple went with AMD even though nVidia makes better parts doesn't apply to non-retina iMacs (they ship with integrated Intel or NVIDIA GeForce GT 750M/GT 755M).

So obviously Apple doesn't choose AMD solely for political reasons. Why would they cripple their flagship iMac 5K if nVidia has better alternatives? They have no problem putting them in the regular iMacs.
 
The assertion that Apple went with AMD even though nVidia makes better parts doesn't apply to non-retina iMacs (they ship with integrated Intel or NVIDIA GeForce GT 750M/GT 755M).

Apparently next week we'll see the new Retina iMac 21.5" with Broadwell Iris Pro 6200 and R9 M380. Is this the bye-bye for Nvidia... maybe the lower-res iMacs are going to be iGPU-only?

Also, this means the product launch will be just a press release... no October show this year.

I hope they'll update the 5K's CPU to Skylake at the same time. It'd give 20W of relief for the internal sauna...
 
After rolling on the floor laughing, ofc, thanks to the amount of nonsense being posted here...
Kudos to MacVidCards and AidenShaw for bringing that much desired common sense/logic, you name it, to this discussion.
 
This is off-topic (but hey, nothing in the last 30 pages has been on topic, so what the heck) but there is something I don't understand:

The assertion that Apple went with AMD even though nVidia makes better parts doesn't apply to non-retina iMacs (they ship with integrated Intel or NVIDIA GeForce GT 750M/GT 755M).

So obviously Apple doesn't choose AMD solely for political reasons. Why would they cripple their flagship iMac 5K if nVidia has better alternatives? They have no problem putting them in the regular iMacs.

They are merely the last ones left to switch. The iMac refresh of 2012 had the 650M; the 750M is the same chip. The last Nvidia remnants to sweep out the door.

After rolling on the floor laughing, ofc, thanks to the amount of nonsense being posted here...
Kudos to MacVidCards and AidenShaw for bringing that much desired common sense/logic, you name it, to this discussion.

Yeah, I have another theory. An nMP refresh might be closer than we think. The period between June 2013 and Jan 2014 saw a similar uptick in "concerned consumers" ardently defending the 6,1 despite all logic and reason.

So this recent groundswell of cheering and yay-rah-ing might signal an imminent refresh: the PR department getting a team of "support" staff entrenched before release.

Apple touts "thinner, faster, quieter, greener" as their mantra, then switches the whole line over to the "hotter, noisier, slower, cheaper" GPU guys. Just as Adobe gets CUDA working in their apps on OS X, Apple pulls the rug out and forces an OpenCL switchover for their AMD space heaters. Now it's Metal; no wonder Adobe is getting tired of being whipped around and wants off the train.

It is a pity that Apple isn't just trying to make the best, fastest, and longest-lived machines. They are charging us for them; why not just really make them?
 
  • Like
Reactions: JamesPDX
I agree completely for a workstation or server.

In fact, I just ran a 48-hour job on 24 threads (dual 6-core/12-thread) on a ProLiant system, which stayed at 111% of its rated CPU GHz for the entire run. Computer-room environment with 19°C front inlet air temperature, and six internal fans with high-performance heat sinks. Intel CPUs throttle back when the temperature rises; keep them cool and you get Turbo plus Hyper-Threading all day long.

For laptops (and the iMac is a stationary laptop) and low-end desktops, however, it's pretty much the norm for "thermal management" to reduce the top frequency when things get hot, and has been for a long time.

Expecting "it [to] operate at x frequency in continuous operation" simply means that you didn't do your homework.
I'm not sure if I was unclear, but the issue I mean is not being able to maintain the base frequency, not the turbo-boosted frequency. If a system is unable to maintain its base frequency under continuous operation, that is, IMO, a failing of the product. And no amount of "homework" changes that.
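For anyone who wants to check their own machine rather than argue, here is a rough sketch: time the same CPU-bound workload over and over and watch whether throughput drops as the box heats up. It only infers throttling from timing, it doesn't read the actual clock, so treat it as an illustration rather than a rigorous benchmark.

[CODE]
import Foundation

// Rough throttle check: run an identical CPU-bound workload repeatedly.
// If later rounds take noticeably longer than the first (cool) round, the
// machine is pulling its clocks down under sustained load.
func busyWork() -> Double {
    var x = 0.0
    for i in 1...20_000_000 {
        x += sin(Double(i))
    }
    return x
}

var firstRound = 0.0
for round in 1...30 {
    let start = Date()
    _ = busyWork()
    let elapsed = Date().timeIntervalSince(start)
    if round == 1 { firstRound = elapsed }
    let relativeSpeed = firstRound / elapsed   // 1.0 = as fast as the cool run
    print(String(format: "round %2ld: %5.2f s (%3.0f%% of initial speed)",
                 round, elapsed, relativeSpeed * 100))
}
[/CODE]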
 
nope. didn't miss anything.
OK. Any particular reason you're not providing one?
likewise, it's unreasonable to expect a consumer to know what GHz even means..
if you travel at a rate of 50mph, you more-likely-than-not, understand what that means..
now explain a rate of 2.7GHz.. i honestly don't think you can.. sorry.
Irrelevant. Whether one understands the meaning or not has no bearing on the fact that one is not receiving what is being advertised.
 
Certain form factors have trade-offs. Laptops were designed primarily for portability, without the need for a separate monitor. If you're expecting a laptop to do the job of a workstation, you're choosing the wrong tool for the job.

Legally I see no case here. Apple didn't specify how long a computer will sustain its advertised speed while trying to do something it was not designed for, as shown in the technical specifications sheet.
I expect a laptop to operate at its advertised capability in continuous use. What task it is performing is irrelevant.
 
  • Like
Reactions: mburkhard
Not trying to start anything here but regarding the possible legal suit, come on.
It's reasonable for us to want the max performance all the time, but you also need to account for the particular machine and the use it's designed for.
Also, nowhere do you see speed specs for the GPUs, and for the CPUs you see no SKUs but only the advertised Intel (Intel being the focus here) processor speeds, single-core and turbo. They never claim full speed 100% of the time, and in fact no one else does either, so that won't stick.
Would you also sue Intel because you can't keep all cores working at full turbo speed? If you don't know your stuff, it might not be readily evident that full turbo applies only when few cores are at work.
We need to be realistic here, know what we're talking about, and not get heated up over nothing.
I'm not saying we should just take it without questioning, but do it with proper knowledge.
Perhaps I haven't been clear, but my comments are not in relation to the turbo boost speed but rather the base clock rate. If a system cannot maintain its base clock rate due to thermal issues, then the product is defective.

Likewise, a legal case seems plausible. Who would have thought a legal case could be built on MB versus MiB with regard to hard drives, yet it happened.
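For reference, the discrepancy that case turned on is plain arithmetic: drives are marketed in decimal units (1 MB = 10^6 bytes) while operating systems at the time reported binary units (1 MiB = 2^20 bytes), a gap of roughly 5% at the megabyte level. A quick sketch:

[CODE]
import Foundation

// The gap behind the drive-capacity suits: decimal vs. binary units.
let mb  = 1_000_000.0   // 1 MB  = 10^6 bytes (how drives are marketed)
let mib = 1_048_576.0   // 1 MiB = 2^20 bytes (how the OS reported capacity)
let shortfall = 1.0 - mb / mib
print(String(format: "A decimal MB is %.1f%% smaller than a binary MiB.", shortfall * 100))
// Prints: A decimal MB is 4.6% smaller than a binary MiB.
[/CODE]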
 
Likewise, a legal case seems plausible. Who would have thought a legal case could be built on MB versus MiB with regard to hard drives, yet it happened.
Just because it becomes a legal case doesn't mean it's not idiocy. ;)

East Texas has thousands of ridiculous troll lawsuits... it only shows how idiotic that part of the legal system is. Again, have fun with that.

Anyway, is there a reason you keep going on about the iMac CPU in a Mac Pro thread? Maybe take it up in the iMac forum?
 
  • Like
Reactions: mrxak