
DaveinJapan

macrumors member
Original poster
Apr 9, 2005
I just bought an iMac, and I've noticed that it seems a bit slower than my PowerBook. It SHOULD be a bit more maxed out than the PowerBook, but for some reason it doesn't seem as quick. Also, I noticed in Activity Monitor that the yellow "active" RAM is a lot higher than the active RAM on my PowerBook while running similar tasks... is there a reason for this? I wonder if the RAM is installed correctly, but it reads 1.5GB on both machines (all things being equal, the iMac should be a little faster, no?).

Should I call Apple about this? Or maybe try opening up the iMac again and re-seating the RAM module? Any suggestions?
 
DaveinJapan said:
I just bought an iMac, and I've noticed that it seems a bit slower than my PowerBook. It SHOULD be a bit more maxed out than the PowerBook, but for some reason it doesn't seem as quick. Also, I noticed in Activity Monitor that the yellow "active" RAM is a lot higher than the active RAM on my PowerBook while running similar tasks... is there a reason for this? I wonder if the RAM is installed correctly, but it reads 1.5GB on both machines (all things being equal, the iMac should be a little faster, no?).

Should I call Apple about this? Or maybe try opening up the iMac again and re-seating the RAM module? Any suggestions?


Go into System Preferences and then Energy Saver. In one of the tabs, check whether the processor performance option is set to Automatic, and if it is, set it to the highest setting (sorry if that's unclear, my iMac G5 isn't near me).
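
If you'd rather check from the Terminal, here's a rough Python sketch that just dumps the current power-management settings via OS X's pmset -g command; the exact key names vary by OS X version, so this only flags likely candidates rather than naming the setting authoritatively:

```python
# Rough sketch: dump OS X's power-management settings so you can see whether
# processor speed is being reduced, without opening System Preferences.
# "pmset -g" prints the active settings; key names vary by OS X version,
# so this just highlights lines that look processor-related.
import subprocess

output = subprocess.check_output(["pmset", "-g"]).decode()
print(output)

for line in output.splitlines():
    if "reduce" in line.lower() or "processor" in line.lower():
        print("Possible processor-speed setting:", line.strip())
```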
 
DaveinJapan, you are right.

The G5 is a bluff. I've said it a million times before, but many people in this community just don't want to accept it.

Actually, a single G5 processor would be slower than its equivalent G4.

I bought a dual 2.0GHz G5 to do an After Effects render and it was just 20% faster than my 1GHz 12" PowerBook. Hey! We are talking about two 2GHz 64-bit processors against ONE 32-bit 1GHz. ¿?¿?¿?

I'm tired of saying that the "speed" all the kids are bragging about is video RAM.

The G5 was a big scam and Jobs couldn't keep it up any more. OS X can compensate for some deficiencies, but only up to a point.

I'm glad I'm not the only one realizing the facts that many people here should. I'm glad I returned that G5 computer to the store.

By the way... the movie "THE INCREDIBLES" that was made by Pixar used Intel processors, not Apple computers, for the rendering... read the credits!
 
Actually, Pixar began the process of switching their workstations over to G5s about a year ago. Their render farm is still x86 Linux, I believe. If the G5s are such a fraud, then why have they been chosen over x86 processors for supercomputers?
 
mymemory said:
The G5 is a bluff. I've said it a million times before, but many people in this community just don't want to accept it.

Actually, a single G5 processor would be slower than its equivalent G4.

I bought a dual 2.0GHz G5 to do an After Effects render and it was just 20% faster than my 1GHz 12" PowerBook. Hey! We are talking about two 2GHz 64-bit processors against ONE 32-bit 1GHz. ¿?¿?¿?

I'm tired of saying that the "speed" all the kids are bragging about is video RAM.

The G5 was a big scam and Jobs couldn't keep it up any more. OS X can compensate for some deficiencies, but only up to a point.

I'm glad I'm not the only one realizing the facts that many people here should. I'm glad I returned that G5 computer to the store.

By the way... the movie "THE INCREDIBLES" that was made by Pixar used Intel processors, not Apple computers, for the rendering... read the credits!


Pixar has nothing to do with it as the G5 didn't exist when the render farm was created.

I will let Ars Technica answer your other comments here and here. I trust them a lot more than you, as they track CPUs for a living. Long story short: the G5s are wonderful chips, IBM just couldn't ramp the clock speed fast enough or lower their power consumption.

And to answer the original poster: remember that the AltiVec implementation in the G5 is weaker than the G4's, so chances are you are seeing that difference.
 
IJ Reilly said:
Actually, Pixar began the process of switching their workstations over to G5s about a year ago. Their render farm is still x86 Linux, I believe. If the G5s are such a fraud, then why have they been chosen over x86 processors for supercomputers?

Well, isn't fraud a little bit strong? My impression is that the same issue is true of the Pentium III and Pentium 4 generations -- that a P3 is faster at a given clock speed than a P4.
 
mymemory said:
The G5 is a bluff. I've said it a million times before, but many people in this community just don't want to accept it.

Actually, a single G5 processor would be slower than its equivalent G4.

I bought a dual 2.0GHz G5 to do an After Effects render and it was just 20% faster than my 1GHz 12" PowerBook. Hey! We are talking about two 2GHz 64-bit processors against ONE 32-bit 1GHz. ¿?¿?¿?

I'm tired of saying that the "speed" all the kids are bragging about is video RAM.

The G5 was a big scam and Jobs couldn't keep it up any more. OS X can compensate for some deficiencies, but only up to a point.

I'm glad I'm not the only one realizing the facts that many people here should. I'm glad I returned that G5 computer to the store.

By the way... the movie "THE INCREDIBLES" that was made by Pixar used Intel processors, not Apple computers, for the rendering... read the credits!

Would that be the stock 2GHz (256MB RAM) that you said you bought to do a project and then returned to the Apple Store?
:rolleyes: :rolleyes: :rolleyes: :rolleyes:
 
mkrishnan said:
Well, isn't fraud a little bit strong? My impression is that the same issue is true of the Pentium III and Pentium 4 generations -- that a P3 is faster at a given clock speed than a P4.

Well yes... I was responding to the poster who called the G5 a "scam." People are getting so weird about this. It's like they're deeply and personally offended by Apple's move to Intel and have to find reasons for it other than the obvious. This actually has me kind of worried about Apple's future. A lot of Apple customers seem to be wedded to ideas that don't seem to have a lot to do with computer technology.
 
rdowns said:
Would that be the stock 2GHz (256K RAM) that you said you bought to do a project and then returned to the Apple Store?
:rolleyes: :rolleyes: :rolleyes: :rolleyes:

I don't think you can even boot OS X with less than 64 MB RAM (maybe 32, but I don't know, I've never tried).

:rolleyes: ;)
 
Historical overview with some links

IJ Reilly said:
Actually, Pixar began the process of switching their workstations over to G5s about a year ago. Their render farm is still x86 Linux, I believe. If the G5s are such a fraud, then why have they been chosen over x86 processors for supercomputers?

Yes - the switch from a Sun render farm to Intel Xeon occurred back in 2003. Here's the CNET story: http://news.com.com/2100-1001-983898.html. This was around the time the MR forums were talking about Steve Jobs at the Intel conference and the like (https://forums.macrumors.com/threads/19532/). At about the same time, the desktops migrated to OS X: http://www.macworld.com/news/2003/10/28/pixarosx/. Then in 2004 there were reports that Pixar had switched to OS X and G5s for their "main production work": http://www.macnn.com/articles/04/03/10/pixar.switches.to.os.x.g5/. I don't know if that means they upgraded their desktops, switched their farm, or both; I didn't find anything definitive in my searches on the subject. There was some confusion on MR at the time of the G5 transition as well: https://forums.macrumors.com/threads/96454/

I find it interesting that IBM's chart-topping (http://www.top500.org/lists/plists.php?Y=2005&M=06) supercomputer uses the 32-bit PPC 440 chip running at 700 MHz. Granted, it uses 65,000 of them. The second-place system uses the same 32-bit 700 MHz chips, 40,000 or so for that one. It's interesting that the PPC 970 or another PPC wasn't used by IBM for those projects.
 
IJ Reilly said:
Well yes... I was responding to the poster who called the G5 a "scam." People are getting so weird about this. It's like they're deeply and personally offended by Apple's move to Intel and have to find reasons for it other than the obvious. This actually has me kind of worried about Apple's future. A lot of Apple customers seem to be wedded to ideas that don't seem to have a lot to do with computer technology.

Too true... Part of this may be because of Apple's anti-Intel campaigns of the past, but it does seem like there are people out there who are convinced speeds on Mactels will suck before Apple has even sold any. :(
 
mkrishnan said:
Too true... Part of this may be because of Apple's anti-Intel campaigns of the past, but it does seem like there are people out there who are convinced speeds on Mactels will suck before Apple has even sold any. :(

...or that the performance stats for the G5 (or PPCs in general) were some kind of big lie that Apple's been telling us.
 
DaveinJapan said:
G5 should be faster than G4, right?

Wrong. Well, partially right. G4 to G5 is a LOT like when Intel switched from the Pentium III to the Pentium 4. The P4 (and the G5) push more cycles per second (more MHz/GHz) but the P3 and G4 for the most part did more work per clock cycle.

Back when the P4 was introduced, Tom's Hardware Guide (one of the prominent x86 hardware review sites) ran a test between the 1GHz P3, the 1.2GHz AMD Athlon, and the 1.4GHz P4. To my recollection, the P3 beat the P4 in 5 out of 8 tests even though the P4 had a 40% higher clock speed, and the Athlon beat the P4 in 6 out of 8 tests even though the P4 had a 17% higher clock speed.

From what I saw on the Mac review sites (like BareFeats, etc.), the PowerPC G4 and G5 are in a similar situation. For some things, the G5 is faster. It certainly helps when you want to use 64-bit software, since the G4 is strictly a 32-bit processor and the G5 is 64-bit. But for "most programs" running 32-bit code, the G4 would beat the G5 most of the time at the same clock speed.
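
To put the clock-speed-versus-work-per-cycle point in concrete numbers, here's a tiny back-of-the-envelope sketch; the per-cycle figures are invented purely for illustration, not measured values:

```python
# Back-of-the-envelope illustration of why a higher-clocked chip can still lose:
# rough throughput ~ clock speed (GHz) * average work done per cycle (IPC).
# The IPC numbers here are made up for illustration only, not measurements.
chips = {
    "P3 @ 1.0 GHz":     {"ghz": 1.0, "ipc": 1.30},
    "Athlon @ 1.2 GHz": {"ghz": 1.2, "ipc": 1.25},
    "P4 @ 1.4 GHz":     {"ghz": 1.4, "ipc": 0.90},
}

for name, c in chips.items():
    throughput = c["ghz"] * c["ipc"]  # billions of "units of work" per second
    print(f"{name}: ~{throughput:.2f} billion units of work per second")
```

With those made-up numbers, the 1.4GHz P4 comes out behind both lower-clocked chips, which is the same pattern people saw with the G4 vs. the G5 at comparable clocks.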

Trouble is, try finding a 2.7GHz G4. :)
 