Do you feel such an expectation is unreasonable?

what happens when it sometimes runs at 3.2GHz? should apple be able to charge you more since it's going faster than advertised?
because maybe i should quit allowing apple to monitor my workflow.. next thing i know, they'll be sending a bill for those cpu spikes :(

(i kid, i kid.. i don't really feel like arguing with you about this)
 
i don't disagree with you.. just couldn't come up with a better analogy. sorry.

there's a point i'm trying to make but maybe i'll just refrain until clearer words come to me.
Apple goes to great lengths to have generic ads, and hides technical details (try to find which 8-core CPU is in the MP6,1 on an Apple website).

A lawsuit would be very difficult, since Apple makes few concrete claims.

(For example, imagine that Apple claimed that an iMac is as fast as an entry-level Z-series at some video rendering task. But what if that turned out to be true only for renders of 30 seconds or shorter, because the iMac would overheat and throttle back on longer renders? There could be some grounds for legal action in that case. But Apple makes no such comparative claims.)
 
OK, here we go:

https://forums.macrumors.com/thread...rottling-heat-and-performance.1815601/page-40

If you don't want to read 40+ pages I'll sum up.

The 5K iMac, when fitted with the better GPU (I use the term loosely), routinely runs at 105°C.

The fan comes on and the CPU and GPU frequencies plunge until it gets a little cooler, then they go back up, bouncing off the thermal limit. Apple's true level of hubris becomes apparent when you see that in Windows the fan can spin faster and keep the machine cooler. So, in order to maintain thinness and quiet, they let the internals boil themselves to death. (105°C is higher than boiling, in fact.)
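
If you want a feel for what that bouncing looks like, here's a rough toy model (every number in it is made up, purely to illustrate the bang-bang behaviour, not measured from any iMac):

Code:
# Toy model of bang-bang thermal throttling. All constants are hypothetical;
# the point is only the shape of the curve: the clock dives at the limit,
# creeps back up once it cools a few degrees, and the die spends its life
# pinned against the ceiling.
LIMIT_C   = 105.0       # hypothetical throttle point
AMBIENT_C = 30.0
MAX_MHZ   = 850.0       # hypothetical boost clock
MIN_MHZ   = 400.0       # hypothetical throttled clock

temp, clock = AMBIENT_C, MAX_MHZ
for second in range(120):
    heat_in  = 0.09 * clock                  # more heat the faster it runs
    heat_out = 0.55 * (temp - AMBIENT_C)     # cooling limited by the quiet fan curve
    temp += 0.1 * (heat_in - heat_out)
    if temp >= LIMIT_C:
        clock = MIN_MHZ                      # throttle hard at the limit
    elif temp < LIMIT_C - 5:
        clock = MAX_MHZ                      # ramp back up once it cools a little
    if second % 10 == 0:
        print(f"t={second:3d}s  temp={temp:6.1f}C  clock={clock:4.0f}MHz")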

Note the many posts by me over the last year or so predicting the early demise of said machines. Note that my predictions have started coming true. (See the recent repair shop posts about fixing 100+ logic boards for Apple.)

The machine desperately needed quiet, cool Nvidia cards. Apple got the bargain-basement AMD space heaters instead and let these poor machines run at high clocks and fry themselves to death. 105°C as a standard operating temperature is suicide, and Apple knew it.

Read the thread: several times people spoke with Apple engineers who said 105°C wasn't possible, only to be corrected. Expect brown areas over the GPU on that lovely 5K screen; there's no way to stop the radiated heat.
 
what happens when it sometimes runs at 3.2GHz? should apple be able to charge you more since it's going faster than advertised?
because maybe i should quit allowing apple to monitor my workflow.. next thing i know, they'll be sending a bill for those cpu spikes :(

(i kid, i kid.. i don't really feel like arguing with you about this)
Perhaps I missed it but I don't see an answer to my question in your response.
 
Apple goes to great lengths to have generic ads, and hides technical details (try to find which 8-core CPU is in the MP6,1 on an Apple website).

A lawsuit would be very difficult, since Apple makes few concrete claims.
If they advertise a system as operating at x frequency, which they do, then it is not unreasonable for a consumer to expect it to operate at x frequency in continuous operation.
 
If they advertise a system as operating at x frequency, which they do, then it is not unreasonable for a consumer to expect it to operate at x frequency in continuous operation.
I agree completely for a workstation or server.

In fact, I just ran a 48-hour job on 24 threads (dual 6-core/12-thread) on a ProLiant system, which stayed at 111% of the nominal CPU clock for the entire run. Computer-room environment with 19°C front inlet air temperature, and six internal fans with high-performance heat sinks. Intel CPUs throttle back when the temperature rises; keep them cool and you get Turbo plus Hyper-Threading all day long.
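
If you'd rather measure than argue, a few lines of Python (this assumes the third-party psutil package is installed and that your OS reports clock frequency to it) will log how the clock holds up during a long job:

Code:
# Sample the CPU clock once a second and report how often it stays at or above
# the advertised base frequency. Requires the third-party psutil package.
import time
import psutil

BASE_MHZ = 2700          # put your CPU's advertised base clock here (hypothetical value)
SAMPLES  = 60            # one minute of sampling; raise this for a long run

at_or_above = 0
for _ in range(SAMPLES):
    freq = psutil.cpu_freq()             # current/min/max MHz as reported by the OS
    if freq and freq.current >= BASE_MHZ:
        at_or_above += 1
    time.sleep(1)

print(f"{100 * at_or_above / SAMPLES:.0f}% of samples at or above {BASE_MHZ} MHz")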

For laptops (and the iMac is a stationary laptop) and low-end desktops, however, it's pretty much the norm for "thermal management" to reduce the top frequency when things get hot, and has been for a long time.

Expecting "it [to] operate at x frequency in continuous operation" simply means that you didn't do your homework.
 
Perhaps I missed it but I don't see an answer to my question in your response.
nope. didn't miss anything.

If they advertise a system as operating at x frequency, which they do, then it is not unreasonable for a consumer to expect it to operate at x frequency in continuous operation.

likewise, it's unreasonable to expect a consumer to know what GHz even means..
if you travel at a rate of 50mph, you, more likely than not, understand what that means..
now explain a rate of 2.7GHz.. i honestly don't think you can.. sorry.

like- i feel you're sitting around complaining about data@2.5GHz vs 2.7GHz but you don't actually know what that means.. it's just a number to us (consumers).. so you're going to be in court telling the judge to watch you run geekbench on your mba and point out that it says it's running at a clock less than the rating listed for the cpu.

i mean, that's seriously your entire argument (again- at least that's how i feel about it right now).

anyways-- if you're bent on a few MHz being dropped off a cpu under intense load then i'm not quite sure if i should show you this, for fear of you going postal ; )

Internal vs. External
The clock that usually gets included in marketing materials is the internal clock, but a processor also has an external clock that determines how quickly the processor can communicate with the outside world. The internal clock represents how quickly the processor can manipulate the data it already has, while the external clock specifies how quickly it can read the information it needs to manipulate or how quickly it can output the manipulated data. As of the date of publication, external clocks are frequently significantly slower than internal clocks. For example, while a processor may run at 3 GHz, its external clock could be anywhere from a few hundred MHz to 1 GHz. Since the external clock determines how quickly the processor can communicate with the system's memory, it has a significant effect on your processor's real-world speed.
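
A back-of-the-envelope sketch of the point the article is making (all numbers below are hypothetical, just to show the arithmetic): on a streaming workload, a fast core can spend most of its time waiting on the slower external clock.

Code:
# Rough arithmetic with made-up but plausible numbers: how much data the core
# could consume per second versus how much the external (memory/bus) clock can
# actually deliver.
CORE_GHZ            = 3.0    # internal clock from the marketing sheet
BYTES_PER_CYCLE     = 16     # hypothetical bytes the core could consume per cycle
BUS_MHZ             = 800    # hypothetical external clock
BUS_WIDTH_BYTES     = 8      # 64-bit memory bus
TRANSFERS_PER_CLOCK = 2      # e.g. double data rate

core_demand_gbs = CORE_GHZ * BYTES_PER_CYCLE                               # GB/s the core could chew through
bus_supply_gbs  = BUS_MHZ * 1e-3 * BUS_WIDTH_BYTES * TRANSFERS_PER_CLOCK   # GB/s the bus can deliver

print(f"core could consume ~{core_demand_gbs:.0f} GB/s")
print(f"external bus delivers ~{bus_supply_gbs:.1f} GB/s")
print(f"on a pure streaming load the core waits ~{100 * (1 - bus_supply_gbs / core_demand_gbs):.0f}% of the time")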
 
Really worried this is the excuse Apple needs to completely exit the pro market.
Watch them say, "because no one wants these kinds of machines anymore", not, "because we built the wrong machine for the job"

Could see them undoing a lot of the damage by:
1. Ditch AMD for Nvidia, take more of a hit on the upfront component cost but save on the AppleCare bill while getting superior performing, cooler, more energy-efficient products
2. Reintroduce the "Power Mac" brand as a line of cMP-like towers with modern parts, or partner with IBM on Mac OS X workstations so they can keep those "ugly" looking towers away from all the pretty iToys
3. Save face by keeping the nMP as the "prosumer" line - for people who like the Mac Mini and need a bit more oomph, but don't need a massive, upgradeable workhorse machine

Not that it'll ever happen, sadly.
 
Really worried this is the excuse Apple needs to completely exit the pro market.
Watch them say, "because no one wants these kinds of machines anymore", not, "because we built the wrong machine for the job"

Could see them undoing a lot of the damage by:
1. Ditch AMD for Nvidia, take more of a hit on the upfront component cost but save on the AppleCare bill while getting superior performing, cooler, more energy-efficient products
2. Reintroduce the "Power Mac" brand as a line of cMP-like towers with modern parts, or partner with IBM on Mac OS X workstations so they can keep those "ugly" looking towers away from all the pretty iToys
3. Save face by keeping the nMP as the "prosumer" line - for people who like the Mac Mini and need a bit more oomph, but don't need a massive, upgradeable workhorse machine

Not that it'll ever happen, sadly.

what's with the doomsday talk?

but really.. this forum loves to say how apple has already abandoned the creative pros who were once a cornerstone of the mac user base..

thing is, if they just quit making the mac pro like you allude to.. then we would truly see what 'apple abandons pros' looks like.

it would be an uproar.. possibly the largest apple outburst ever.. why? because there are a million people out there using macs in such a way or with such&such software to justify a mac pro, along with those who simply need a mac pro..

i understand if you spend hour(s) a day on a forum and twelve people yell "apple is doomed.. apple abandons pros".. it seems like an uproar.. but it's not. at least nothing in comparison to apple truly no longer catering to professionals.

i guess my question to you is- do you honestly see apple discontinuing the mac pro any time soon?
 
Marketing is important as buyers make purchasing decisions based on what they're told a product can do. If a product fails to meet the marketed capabilities then the manufacturer is being misleading.

However, I believe the issue extends further than just base versus turbo speed. I believe what people are referring to is the system's inability to maintain even the base speed for extended periods of time. Not because the components are incapable of doing so, but because the packaging of the components causes thermal issues that necessitate the decreased performance.

Certain form factors have trade-offs. Laptops were designed primarily for portability without the need for a separate monitor. If you're expecting a laptop to do the job of a workstation, you're choosing the wrong tool for the job.

Legally, I see no case here. Apple didn't specify how long a computer will sustain its advertised speed while trying to do something it was not designed for, as shown in the technical specifications sheet.

Could see them undoing a lot of the damage by:
1. Ditch AMD for Nvidia, take more of a hit on the upfront component cost but save on the AppleCare bill while getting superior performing, cooler, more energy-efficient products


I see much more of an advantage for consumers with Apple staying with AMD in the long run. Ever since Apple stayed with AMD cards we are seeing much more development of OpenCL in many professional applications. Do we really want to see most applications on a proprietary platform like CUDA, with no options besides Nvidia? We need a second choice for consumers, and we are seeing increased development with AMD to create more competition in the graphics card field.
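
As a rough illustration of the portability argument (this assumes the third-party pyopencl package and at least one installed OpenCL driver; it is only a sketch, not how any shipping app does it), the same OpenCL kernel source runs unchanged on AMD, Nvidia, or Intel devices:

Code:
# Enumerate every OpenCL platform/device on the machine and run the same tiny
# kernel on each one. Requires the third-party numpy and pyopencl packages.
import numpy as np
import pyopencl as cl

a = np.random.rand(1024).astype(np.float32)

for platform in cl.get_platforms():                  # one entry per vendor driver
    for device in platform.get_devices():            # GPUs and CPUs it exposes
        ctx = cl.Context([device])
        queue = cl.CommandQueue(ctx)
        prog = cl.Program(ctx, """
            __kernel void scale(__global float *x) { x[get_global_id(0)] *= 2.0f; }
        """).build()
        buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                        hostbuf=a)
        prog.scale(queue, a.shape, None, buf)         # same kernel, any vendor
        out = np.empty_like(a)
        cl.enqueue_copy(queue, out, buf)
        print(device.name, "ok:", np.allclose(out, a * 2))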

No, I haven't said that. I was simply impressed by how well Apple's iPhone is doing and its custom CPU. All I'm saying is they might benefit from doing the same thing with their workstations.

Why does it not surprise me that you're making yet another outlandish comparison, and with two quite different architectures? It seems to be the current trend on MacRumors nowadays. While Apple's current "A" series mobile processors are getting quite powerful, they still have a long way to go to compare to current Intel processors.

I don't think Geekbench scores are even comparable between desktop and mobile, except on the same platform.
 
I see much more of an advantage for consumers with Apple staying with AMD in the long run. Ever since Apple stayed with AMD cards we are seeing much more development of OpenCL in many professional applications. Do we really want to see most applications on a proprietary platform like CUDA, with no options besides Nvidia? We need a second choice for consumers, and we are seeing increased development with AMD to create more competition in the graphics card field.
Correct me if I'm misunderstanding the benchmarks out there, but hasn't Maxwell gotten pretty good with OpenCL? It seems like AMD still has a slight edge in the majority of benchmarks, but Nvidia appears to be in the hunt.

Competition is good, so I'd like to see AMD stay relevant, but I'd rather see Apple go with the best technology, whatever that is.

Is there any sense in Apple buying AMD outright?

Is there any sense in Apple developing its own Mac-based GPU based on their ARM work?

As the Mac transforms more and more into appliance computing, open standards become less and less important. (not commenting here on whether that's good or bad, but that seems to be the direction the industry is moving in... the Mac is going to follow the iPhone/iPad, not the other way around)
 
OK, here we go:

https://forums.macrumors.com/thread...rottling-heat-and-performance.1815601/page-40

If you don't want to read 40+ pages I'll sum up.

The 5K iMac, when fitted with the better GPU (I use the term loosely), routinely runs at 105°C.

The fan comes on and the CPU and GPU frequencies plunge until it gets a little cooler, then they go back up, bouncing off the thermal limit. Apple's true level of hubris becomes apparent when you see that in Windows the fan can spin faster and keep the machine cooler. So, in order to maintain thinness and quiet, they let the internals boil themselves to death. (105°C is higher than boiling, in fact.)

Note the many posts by me over the last year or so predicting the early demise of said machines. Note that my predictions have started coming true. (See the recent repair shop posts about fixing 100+ logic boards for Apple.)

The machine desperately needed quiet, cool Nvidia cards. Apple got the bargain-basement AMD space heaters instead and let these poor machines run at high clocks and fry themselves to death. 105°C as a standard operating temperature is suicide, and Apple knew it.

Read the thread: several times people spoke with Apple engineers who said 105°C wasn't possible, only to be corrected. Expect brown areas over the GPU on that lovely 5K screen; there's no way to stop the radiated heat.

This is why I'm a fan of fat computers with proper cooling instead of a slit-fan in the back. I'm also wondering if I should remove the 2nd SSD from my Mini to give it a bit of a break, thermal-wise. I can always stick that SSD into my Thunderbolt MultiDock 2.
 
Really worried this is the excuse Apple needs to completely exit the pro market.
Watch them say, "because no one wants these kinds of machines anymore", not, "because we built the wrong machine for the job"

Could see them undoing a lot of the damage by:
1. Ditch AMD for Nvidia, take more of a hit on the upfront component cost but save on the AppleCare bill while getting superior performing, cooler, more energy-efficient products
2. Reintroduce the "Power Mac" brand as a line of cMP-like towers with modern parts, or partner with IBM on Mac OS X workstations so they can keep those "ugly" looking towers away from all the pretty iToys
3. Save face by keeping the nMP as the "prosumer" line - for people who like the Mac Mini and need a bit more oomph, but don't need a massive, upgradeable workhorse machine

Not that it'll ever happen, sadly.

I know. I was looking at the new Boxx computers today, rubbing my jaw and wondering exactly what kind of beast it would be if it could be made to run OSX unhindered and with overclocking unleashed.
 
Correct me if I'm misunderstanding the benchmarks out there, but hasn't Maxwell gotten pretty good with OpenCL? It seems like AMD still has a slight edge in the majority of benchmarks, but Nvidia appears to be in the hunt.

Competition is good, so I'd like to see AMD stay relevant, but I'd rather see Apple go with the best technology, whatever that is.

Is there any sense in Apple buying AMD outright?

Is there any sense in Apple developing its own Mac-based GPU based on their ARM work?

As the Mac transforms more and more into appliance computing, open standards become less and less important. (not commenting here on whether that's good or bad, but that seems to be the direction the industry is moving in... the Mac is going to follow the iPhone/iPad, not the other way around)
I have already proposed something: compare performance in Final Cut Pro X between AMD and Nvidia Maxwell GPUs. You will see why Apple went with AMD GPUs as the go-to solution.

People don't understand that one of the reasons Apple ditched Nvidia is that Apple does not want application performance optimized through the drivers. That is why they developed a low-level API (primitive compared to the other options, but still low-level): to take performance optimization out of the drivers and put it in the app itself. Application performance is what matters most here. With low-level APIs the drivers have essentially no influence over application performance; all optimization is done in the application itself. The API tells the app what to do, and there is no room for error correction in the drivers; it's all done by the applications. That is a good thing, regardless of what people here will try to say: it gives the application full control over the hardware, with all of its features, and with asynchronous compute as the most important feature.

What is the context here? Nvidia GPUs fly when they have great drivers. But Ashes of the Singularity, and Nvidia's own developer guide, have shown that context switching between graphics and compute on Maxwell GPUs brings a decrease in performance. Not to mention that their GPUs have only one asynchronous compute engine.

Before low-level APIs there was the possibility that GPUs were underused (look at DX11 and AMD GPUs...), not fully utilized. Now they can be.

And I'm writing all of this for the Xth time in this thread.

P.S. People in the hackintosh scene have already done direct comparisons between Maxwell and GCN GPUs.
 
I have already proposed something: compare performance in Final Cut Pro X between AMD and Nvidia Maxwell GPUs. You will see why Apple went with AMD GPUs as the go-to solution.

People don't understand that one of the reasons Apple ditched Nvidia is that Apple does not want application performance optimized through the drivers. That is why they developed a low-level API (primitive compared to the other options, but still low-level): to take performance optimization out of the drivers and put it in the app itself. Application performance is what matters most here. With low-level APIs the drivers have essentially no influence over application performance; all optimization is done in the application itself. The API tells the app what to do, and there is no room for error correction in the drivers; it's all done by the applications. That is a good thing, regardless of what people here will try to say: it gives the application full control over the hardware, with all of its features, and with asynchronous compute as the most important feature.

What is the context here? Nvidia GPUs fly when they have great drivers. But Ashes of the Singularity, and Nvidia's own developer guide, have shown that context switching between graphics and compute on Maxwell GPUs brings a decrease in performance. Not to mention that their GPUs have only one asynchronous compute engine.

Before low-level APIs there was the possibility that GPUs were underused (look at DX11 and AMD GPUs...), not fully utilized. Now they can be.

And I'm writing all of this for the Xth time in this thread.

P.S. People in the hackintosh scene have already done direct comparisons between Maxwell and GCN GPUs.

So, those iMacs running at 105°C day in and day out: are they running asynchronous compute, or just burning into Singular Ashes?

What difference do DX12 benchmarks of pre-release Windows software make when solitary repair facilities are reporting 100+ AppleCare reflows of melted iMac logic boards and their blazing AMD GPUs?

There is a reason that Apple gets AMD GPUs for pennies on the dollar, but they have begun "paying it backward" now. We'll see if that wakes them up, or whether they keep helping the AMD space heater division get rid of old stock.
 
Not trying to start anything here but regarding the possible legal suit, come on.
It's reasonable for us to want the max performance all the time, but you also need to account for the particular machine and the use it's designed for.
Also, you see speed specs nowhere for the GPUs, and for the CPUs you see no SKUs, just the advertised Intel processor speeds (Intel being the focus here), single-core and turbo. They never claim full speed 100% of the time (no one else does, in fact), so that won't stick.
Would you also sue Intel because you can't keep all cores working at full turbo speed? Because if you don't know your stuff, it might not be readily evident that full turbo applies only at low core counts.
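A toy example with made-up turbo bins (not any real Intel SKU, purely illustrative) shows why the headline clock only applies at low core counts:

Code:
# Hypothetical turbo bins for an imaginary 4-core CPU (not a real Intel SKU),
# showing why an "up to 3.9 GHz" chip runs well below that with every core
# loaded, and only if there is thermal and power headroom to begin with.
BASE_GHZ = 3.3
TURBO_GHZ_BY_ACTIVE_CORES = {1: 3.9, 2: 3.8, 3: 3.6, 4: 3.5}   # made-up bins

for active, turbo in TURBO_GHZ_BY_ACTIVE_CORES.items():
    print(f"{active} core(s) active: up to {turbo} GHz "
          f"(+{100 * (turbo - BASE_GHZ) / BASE_GHZ:.0f}% over base)")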
We need to be realistic here and know what we're talking about and not get heated up for anything.
I'm not saying we should just take it without questioning, but do it with proper knowledge.
 
Not trying to start anything here but regarding the possible legal suit, come on.
It's reasonable for us to want the max performance all the time, but you also need to account for the particular machine and the use it's designed for.
Also, you see speed specs nowhere for the GPUs, and for the CPUs you see no SKUs, just the advertised Intel processor speeds (Intel being the focus here), single-core and turbo. They never claim full speed 100% of the time (no one else does, in fact), so that won't stick.
Would you also sue Intel because you can't keep all cores working at full turbo speed? Because if you don't know your stuff, it might not be readily evident that full turbo applies only at low core counts.
We need to be realistic here and know what we're talking about and not get heated up for anything.
I'm not saying we should just take it without questioning, but do it with proper knowledge.

Please go read the iMac thread.

You'll get your eyes opened up.
 
Would you let go of the space heater, trash can, and whatever disparaging descriptions you so much enjoy using for the hardware in this particular version of the Mac Pro?
We know you don't like it. Here's an idea: sell yours, buy an HP/Dell/whatever, and move on to your much-hyped film biz stuff and selling cards.
I'm sure HP/Dell forums need someone bragging about how cool their machines are and how easily you can overclock them and how high you can get on some bench scores. Wouldn't that be great?!
You're so resourceful you can sure hack one of those WS to run OS X, right? You can even use your own boards, how cool is that?
Maybe when the next nMP comes out, if it suits you (which I very much doubt, or you wouldn't admit it), come back with something interesting to share.
Right now, I don't think you have anything to add here.
 
I'm aware of the limitations of the iMac; however, that's not a machine for me, and precisely because of that I don't own one. It's not my thing. The Mac Pro is, and that's why I'm here.
You don't see me in the iMac threads saying it's rubbish or anything, do you? Because I don't consider it to be; you just need to make sure it suits you before you buy it. If it doesn't, stay away from it, like anything else.
 
koyoot, not arguing really.
It's just that the same people keep punching the same bag and that's annoying.
I'm still not sure why some people who apparently don't like a thing Apple makes keep showing up here just to put it down.
 
About the iMac: does the M295X heat up because of the GPU itself, or because the construction of the iMac does not dissipate the heat from the GPU? A 125W TDP GPU can work in the Mac Pro with no problem at all. In the iMac there is a problem, no question about that. But is it due to the GPU, or the construction of the computer? If there were a 120W Nvidia GPU inside the iMac, it would also heat up to the same degree as the AMD GPU. Don't forget about that.

Manuel, if you're saying that MVC said something again... I have him on ignore. And somehow the forum is now much cleaner ;).
 