
silentsage (macrumors member, original poster):
A question to the graphics card gurus on this forum.

I loaded Windows 7 on my i5 27" iMac. I ran GPU-Z, and it says the GPU clock is 503 MHz, and the Memory clock is 850 MHz.

I assume this means the graphics card is underclocked. Is there an ATI overclocking utility that I can use?

Thanks.
 
If there is, is it even safe? The iMac already gets pretty hot as it is with the quad-core chips in there!
 
I just got my i5 4850 this afternoon, and it indeed does get hot just sitting there looking at the internet, not really working it at all. I checked the temps on the upper back and they were around 100 degrees F! Oh well, it will heat my room in winter; summer is another thing! It ain't "green" for sure. I am going to see how much power it draws with a meter. When I sit there looking at the screen, I can feel the heat on my face from it. Heat and long computer life don't go together, that is for sure! Since I had it on for about 3-4 hours, the screen has had a couple or so flashes. Hope this is not a bad sign!
 
The heat is to be expected, seeing as aluminium conducts heat extremely well, so it just means it's conducting the heat away from the insides of the computer, which is a good thing. Download iStat Pro, which will let you see what temps your computer is running at, and you can determine whether they are unsafe or not.
The screen flash thing is a fault with a few computers; search these forums. It's not confirmed yet whether it is a hardware or software fault. I'd monitor it, and if it continues or gets worse, call AppleCare before your 14-day replacement warranty is up.
 
I fail to see what is so bad about 100°F, Roy43.

My (soon-to-be-replaced) Power Mac G5 plus monitor are sitting here taking up ~200W at idle. That gives you a true indicator of "heat" since heat is work and can be measured in Watts. Temperature is another thing entirely and gives you no usable measure of work (another way of talking about heat) since you are only measuring one area of the iMac. Each internal component of my G5 is > 40°C, including the ambient sensors. Some are near 60°C. This gives you temperatures ranging from 104°F to 140°F.

The 27-inch Core 2 Duo iMac takes up ~150W at idle. Factor in the i5/i7, and you find that those CPUs take up 40W LESS than the Core 2 Duos, according to graphs at Anandtech.

So I will be getting a 27" i7 iMac that has 60% more screen resolution, 2x the cores, 2x the performance *per core*, 2x the GPU power and uses nearly HALF the electricity at idle, meaning it will be outputting HALF the heat.
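To put rough yearly numbers on those idle draws (the ~200 W and ~110 W figures above; the hours-per-day is an assumed figure, so treat this as a back-of-envelope sketch only):

```python
def yearly_kwh(idle_watts, hours_per_day=8.0):
    """kWh per year at a given idle draw, assuming the machine
    idles `hours_per_day` hours a day (assumed, not measured)."""
    return idle_watts * hours_per_day * 365 / 1000.0

# ~200 W for the G5 + monitor vs. ~110 W for an i5/i7 iMac
# (150 W Core 2 Duo idle minus the ~40 W difference from Anandtech)
print(yearly_kwh(200))  # 584.0 kWh/year
print(yearly_kwh(110))  # 321.2 kWh/year
```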

As for your screen problem, I apologize. But please stop it with the troll-y remarks about the "greenness" of the iMac, when in fact it is probably the most efficient computer they've produced, and is probably more efficient than most other computers.

And to what fruitbench.ben said... EXACTLY! The reason you point your gun at the case and it reads 100°F (very cool for a computer) is because the iMac is highly efficient at pulling the heat AWAY from the interior and out into the environment. If you're coming from a crappy Dell or HP tower, the reason you point your gun and don't get a reading of 100°F is because it's probably 100°C on the inside, and not removing that heat. :)
 
Yeaa.... no. Heat is different from work; a gfx card that does a lot of work but creates little heat will last a lot longer than a gfx card that does little work but overheats like nothing else. Heat = bad, mkay?

And yes, Apple has a history of underclocking graphics cards, I wouldn't be surprised.
 
You have no idea what you're talking about. "Work" is a physics term. In the case of electronics, Heat and Work are nearly the same because all energy put into the component will be inevitably output as HEAT, barring anything that produces light (LEDs may output something like 90% light and 10% heat with the energy they are given... I don't remember the exact ratio).

As far as your assertions about in what kind of situation a GPU will last longer, you make little sense. You (again) don't understand that heat and work are essentially the same, but you're also assuming that HEAT = TEMPERATURE, which it does not. Yes, high TEMPERATURES will reduce the life of electronics, but as reported in numerous threads the ATI 4850 GPU is perfectly within the norms of ALL GPUs. It idles around 40°C-50°C and doesn't normally go above 80°C and this is perfectly acceptable. Almost ALL video cards achieve these same temperatures. And, again, just because most GPUs have about these temperatures says NOTHING about their WORK or their HEAT. Are you understanding yet??

An ATI 5970 is going to require a LOT more energy, and thus will output a LOT more heat than the iMac 4850. And of course the Wattage requirement of the GPU is more or less a gauge of its WORK and thus HEAT, but says nothing about TEMPERATURE! I bet you the 5970 operates within 5°C of the 4850 under idle/load, and you know why??? A more efficient heatsink/fan. It's probably outputting 2-4x the heat, if not more, but remaining the same temperature because HEAT != TEMPERATURE. Got it yet?

And just because the 5970 may operate at the same TEMPERATURE while doing more WORK (and thus outputting more HEAT) says absolutely nothing about the longevity of the card versus the 4850.

So please, the next time you say "Yeah... No" make sure you have some idea about the subject.
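For what it's worth, the heat-vs-temperature point can be made concrete with a toy steady-state model (all numbers below are invented for illustration): die temperature is roughly ambient plus power dissipated times the cooler's thermal resistance, so a card dumping 4x the heat can sit at the same temperature if its cooler is 4x better.

```python
def die_temp_c(power_w, theta_c_per_w, ambient_c=25.0):
    """Steady-state die temperature: ambient plus power dissipated
    times the cooler's thermal resistance (deg C of rise per watt).
    A bigger/better heatsink and fan means a smaller theta."""
    return ambient_c + power_w * theta_c_per_w

# Modest card with a modest cooler vs. power-hungry card with a much better cooler:
print(die_temp_c(60, 0.5))     # 55.0 C
print(die_temp_c(240, 0.125))  # 55.0 C -- same temperature, 4x the heat output
```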
 
I loaded Windows 7 on my i5 27" iMac. I ran GPU-Z, and it says the GPU clock is 503 MHz, and the Memory clock is 850 MHz.

I assume this means the graphics card is underclocked. Is there an ATI overclocking utility that I can use?

Thanks.

That's not underclocked, those are standard factory specs for the ATi mobility version HD 4850. Often people forget or may not even know it's the mobile card in these bad boys. Still does a fine job, as far as iMacs go that is.

(one source of many: http://www.notebookcheck.net/AMD-ATI-Mobility-Radeon-HD-4850.13975.0.html )

Manufacturer: ATI
Series: Mobility Radeon HD 4000
Codename: M98
Pipelines: 800 - unified
Core Speed*: 500 MHz
Shader Speed*: 500 MHz
Memory Speed*: 850 MHz
Memory Bus Width: 256 Bit
Memory Type: GDDR3
Max. Amount of Memory: 1024 MB
Shared Memory: no
DirectX: DirectX 10.1, Shader 4.1
Transistors: 956 Million
Technology: 55 nm
Features: OpenGL 2.0, PCI-E 2.0 x16, PowerPlay, DisplayPort support up to 2560x1600, HDMI support up to 1920x1080 (both with 7.1 AC3 audio), 1x Dual-Link/Single-Link DVI, 1x Single-Link DVI support (all display ports have to be supported by the laptop manufacturer)
Notebook Size: large
Date of Announcement: 09.01.2009
Link to Manufacturer Page: http://ati.amd.com/products/mobilityrade...
 
The mobile version of the ATI 4850 is basically just an underclocked desktop 4850, and its performance is about the same as a desktop 4830's.
 
You have no idea what you're talking about. "Work" is a physics term. In the case of electronics, Heat and Work are nearly the same because all energy put into the component will be inevitably output as HEAT, barring anything that produces light (LEDs may output something like 90% light and 10% heat with the energy they are given... I don't remember the exact ratio).

As far as your assertions about in what kind of situation a GPU will last longer, you make little sense. You (again) don't understand that heat and work are essentially the same, but you're also assuming that HEAT = TEMPERATURE, which it does not. Yes, high TEMPERATURES will reduce the life of electronics, but as reported in numerous threads the ATI 4850 GPU is perfectly within the norms of ALL GPUs. It idles around 40°C-50°C and doesn't normally go above 80°C and this is perfectly acceptable. Almost ALL video cards achieve these same temperatures. And, again, just because most GPUs have about these temperatures says NOTHING about their WORK or their HEAT. Are you understanding yet??

An ATI 5970 is going to require a LOT more energy, and thus will output a LOT more heat than the iMac 4850. And of course the Wattage requirement of the GPU is more or less a gauge of its WORK and thus HEAT, but says nothing about TEMPERATURE! I bet you the 5970 operates within 5°C of the 4850 under idle/load, and you know why??? A more efficient heatsink/fan. It's probably outputting 2-4x the heat, if not more, but remaining the same temperature because HEAT != TEMPERATURE. Got it yet?

And just because the 5970 may operate at the same TEMPERATURE while doing more WORK (and thus outputting more HEAT) says absolutely nothing about the longevity of the card versus the 4850.

So please, the next time you say "Yeah... No" make sure you have some idea about the subject.

Clap clap. Nice rambling, though through all this you have also demonstrated that you actually know nothing about the architecture of the GPU cores produced by Nvidia/ATI. The R700 GPUs (like the 4850) are based on a 55 nm process; the RV870 (5870) uses a 40 nm process, meaning they use less power and produce less heat, and therefore can be pushed further. The fan itself is still damn crappy on the 5870; the whole improvement comes from the new GPUs. Ease off the physics and do some reading on the R700 vs. RV870.
 
I'm probably just pushing a simple misunderstanding even further but...

You haven't really changed the fact that people are making a simple distinction error when describing heat / temperature. Sure, a smaller process usually means lower power consumption. The heatsink doesn't necessarily reduce watt consumption. And your comments aren't any more germane to the discussion than you believe they are.... So:

"Clap clap. Nice rambling, though through all this you have also demonstrated that you actually know nothing about the architecture of the GPU cores produced by Nvidia/ATI."

Anyone could have said what you just said by reading a tech review off Anandtech or Tomshardware.
 
The 27-inch Core 2 Duo iMac takes up ~150W at idle. Factor in the i5/i7, and you find that those CPUs take up 40W LESS than the Core 2 Duos, according to graphs at Anandtech.

Apple is specifying the maximum continuous power as 241W for the 21.5" model and 365W for the 27" model. They don't distinguish between duo, i5 and i7, just the screen size.

A lot of this power is used for the LED screen (LEDs are typically more power hungry than the old fluorescent light sources at producing the same brightness); that's why it is warm, even when idling.

I am planning to do my own measurements as soon as I get mine.
 
I'm pretty sure LED uses substantially less power than their CCFL counterparts.
 
itommyboy - Thanks. I've run a few games on it, and I have to say I'm pleased with the performance. I'm getting far better performance with the i5/4850 combo than I did with my old iMac 3.06/8800GS. Add that to the incredible screen and that makes the iMac a tremendous value.

Regarding the discussions concerning heat and temperature - I'm an electrical engineer and I've designed computer motherboards for the last 30 years. The back of the iMac is aluminum for a reason - it's a great heat sink. Hot heat sinks are good - it means they're working.

If you want to know whether you have a thermal problem with a computer, just watch the temperature of the chips. As long as they're below about 90 C you're OK. If they get to 95 C it bears watching. If they're above 100 C, the cooling needs adjustment.

Failures due to temperature follow an exponential distribution. At 90 C the chips will run fine for a very long time. At 100 C the lifetime drops by about 10x. At 105 C you have a real problem.
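That "10x per 10 C" rule of thumb can be sketched as a quick model (a simplified Arrhenius-style approximation, not vendor data):

```python
def relative_lifetime(temp_c, ref_temp_c=90.0, decade_c=10.0):
    """Chip lifetime relative to running at ref_temp_c, assuming
    lifetime falls by 10x for every decade_c degrees above it
    (simplified rule of thumb, not vendor data)."""
    return 10.0 ** (-(temp_c - ref_temp_c) / decade_c)

for t in (80, 90, 100, 105):
    print(t, round(relative_lifetime(t), 3))
# 80 C lasts ~10x the reference; 100 C ~0.1x; 105 C ~0.03x
```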

The temperature of the case of the iMac is nothing to worry about - it's designed to operate that way.
 
The lab prototype LEDs are just catching up with mainstream CCFLs; commercial LEDs are still at around half the efficiency of commercial CCFLs.
 
Weird how pretty much every LED HDTV consumes less power than CCFL-outfitted ones, though...
 
I'm probably just pushing a simple misunderstanding even further but...

You haven't really changed the fact that people are making a simple distinction error when describing heat / temperature. Sure, a smaller process usually means lower power consumption. The heatsink doesn't necessarily reduce watt consumption. And your comments aren't any more germane to the discussion than you believe they are.... So:

"Clap clap. Nice rambling, though through all this you have also demonstrated that you actually know nothing about the architecture of the GPU cores produced by Nvidia/ATI."

Anyone could have said what you just said by reading a tech review off Anandtech or Tomshardware.

Yes, but do you realize what the advantages are in going from 55 nm to 40 nm? The new cards are much more efficient, using less power and producing less heat, and then they pump the crap out of them, increasing the core and memory speeds to make them fly. Effectively you can push a 40 nm part so much further than a 55 nm one. At the clocks the 5870s are being run they are bloody hot, and the cooler fan on the new cards is damn loud as a result. The limiting factor for CPUs and GPUs in modern systems is heat; get crazy cooling and you can push the cards way beyond base specs. So in relation to this forum, the 4850 would be under-clocked by Apple for a good reason: to keep it stable.
 
In relation to this forum I guess we try to make things as precise and clear as possible, so here I go again. It's not under-clocked by Apple or anyone else for that matter.

I keep making this point not to be a blowhard but because it's very important for people to know, especially first time buyers; more importantly PC switchers who now feel they can get a good enough gaming experience along with OSX, but may feel duped by what the card really is post purchase. As I and countless others have been pointing out for about 8 months now since this card was released in the last gen of iMacs, the ATi HD 4850 is simply the mobility version - not any stronger or weaker than it was intended to be by AMD. The clocks are very similar and often dead on when compared to the same GPU in a windoze laptop.

In hopes that this does not scare anyone away from a new iMac I'd also like to say again that imho they are very decent cards as far as iMacs go thus far. My 24" has been my go to machine as of late playing such titles as COD4, Borderlands, Dragon Age Origins, WOW, Torchlight and the like on great looking and smooth running high settings, many games often max'd.
 