That's just plain wrong. I know more graphic designers, both freelancers and employed, who work with iMacs or MacBook Pros than with Mac Pros.

I love how people always use the "I'm fine with only 4GB and you should be too!"

Yes, you can 'run' everything on an iMac, but it won't be anywhere near as fast as a Mac Pro (or a much cheaper PC) loaded with RAM. (Try batch processing 200 12MP photos while working on a logo with 4 other apps open. Not pretty.)

If time = money, then an iMac is a waste of my money, because 4GB is not enough anymore. If Photoshop, Lightroom and Illustrator can each access 3.8GB of RAM to run optimally, that tells me I should have at least 12GB installed to get as much done, as fast as possible, in order to make as much profit as I can. If I can save myself 10 hrs/week in computing time, I just made myself available for a potential $500 worth of additional work.
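For what it's worth, the arithmetic behind that claim checks out; a quick sketch (the hourly rate is only implied by the post, never stated):

```python
# Numbers taken from the post above; the $/hr rate is derived, not quoted.
hours_saved_per_week = 10
extra_income_per_week = 500   # dollars of additional billable work

implied_hourly_rate = extra_income_per_week / hours_saved_per_week
print(implied_hourly_rate)    # 50.0, i.e. an assumed $50/hr freelance rate
```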
 
Why are people still using the old "Windows is unstable" argument? What exactly is unstable about Windows? It hasn't been prone to crashes since Windows 98; I can't even remember the last time it crashed on me. Modern OSes do not have stability issues anymore, so get some new material. It's not the OS's fault if your network at school or work is unstable and breaks everything because the techs thought it was a good idea to tie all the software and login information to the network.

Maybe because it's not old, and it's still true?
Windows IS unstable, especially Vista.
Of course modern OSes don't have stability issues, and that's why Windows is NOT a modern OS.
It IS the OS's fault if you're working with perfectly working, powerful hardware and common software and you still get affected by Windows' instability.
 
I love how people always use the "I'm fine with only 4GB and you should be too!"

Yes, you can 'run' everything on an iMac, but it won't be anywhere near as fast as a Mac Pro (or a much cheaper PC) loaded with RAM. (Try batch processing 200 12MP photos while working on a logo with 4 other apps open. Not pretty.)

If time = money, then an iMac is a waste of my money, because 4GB is not enough anymore. If Photoshop, Lightroom and Illustrator can each access 3.8GB of RAM to run optimally, that tells me I should have at least 12GB installed to get as much done, as fast as possible, in order to make as much profit as I can. If I can save myself 10 hrs/week in computing time, I just made myself available for a potential $500 worth of additional work.

And I love the "a 2008 computer in 2009 won't do it anymore" crowd. (exaggeration) :)

Of course there are exceptions; if you are "batch processing 200 12MP photos while working on a logo and having 4 other apps open" all the time, then by all means get a Mac Pro.

You won't need 3.8GB of RAM to create a logo in Illustrator, though. Apart from that, I was talking about pure graphic design, not photography.

And that "potential $500 worth of additional work" needs to be found in these times of recession; I hope your business runs better than mine does right now. :eek:
 
Maybe because it's not old, and it's still true?
Windows IS unstable, especially Vista.
Of course modern OSes don't have stability issues, and that's why Windows is NOT a modern OS.
It IS the OS's fault if you're working with perfectly working, powerful hardware and common software and you still get affected by Windows' instability.

Windows is not unstable at all; I've had just as many application crashes on Mac as I have on Windows, and I have been using Windows for a very long time. Also, it's not always the OS's fault for crashing: Adobe programs are far from perfect, and they crash on both systems. Powerful hardware does not stop crashes; at most it delays them.

I also don't understand what you mean when you say it's not modern; you do realize Mac OS X is based on UNIX, which is old too. Considering all the hardware Windows has to be compatible with, it's incredibly stable. Really, when you think about it, Leopard should be far more stable than it is, because it only runs on specific hardware Apple chooses.
 
Maybe because it's not old, and it's still true?
Windows IS unstable, especially Vista.
Of course modern OSes don't have stability issues, and that's why Windows is NOT a modern OS.
It IS the OS's fault if you're working with perfectly working, powerful hardware and common software and you still get affected by Windows' instability.
BS. A well-maintained Windows machine with Vista or XP and proper hardware drivers is not unstable in any way. I've used Vista for several hours daily since January 2007, and whatever stability issues there were initially got sorted within a 3-month window once hardware manufacturers got the hang of Vista's new driver model. Had it been unstable I simply wouldn't have used it; these are my work machines that have to bring in $$$ for me 24/7, and I can't be arsed to waste time troubleshooting.

OS X can feign stability pretty well thanks to the fact that the hardware, the OS and much of the popular software all come from a single manufacturer. It lives inside the Apple bubble. But this bubble is very easy to pop, and then the boy in the bubble dies because he has no skin. Some innocent third-party add-on like DivX or Logitech Control Center or whatever can grind a Mac to a halt. How many people did not get the Mac rendition of the Blue Screen of Death when they installed Leopard? What was it this time: DivX, RAID, LCC, Kaleidoscope...? Aww, one little germ sneaks into the bubble and whammo, instant death. Try killing a Windows PC with a weapon as feeble as DivX and see what happens. Hint: nothing that'll make it freeze on bootup...

As for "modern OS", I wonder when OS X will get network capabilities from this side of the year 2000?

Let's consider this scenario. There are two people in a home; you have one computer each, one portable mp3 player each, and a NAS drive with gigs of music files on it.

If both computers are Vista or Win7 machines, the setup is a breeze. Both machines will detect the NAS drive automatically, and you can use "Map network drive" so that the computers mount the NAS drive permanently. For all intents and purposes the NAS will be treated like a local drive. Right, so then you point Windows Media Player to the folder on the NAS drive where your music files are. WMP will consider this part of its music library and will constantly monitor the NAS folder for changes. When you rip a music CD on your machine, it will show up in the other person's WMP music library a few minutes later, and that person can then sync the new music to his or her portable player.

Now let's replace the two PCs with Macs. First you'll notice they have trouble detecting the NAS drive; it doesn't show up in Finder. In order for network discovery to work, you have to disable Leopard's firewall entirely, or mount the drive with a homemade script. So you disable the firewall (feels really secure, doesn't it?), and voilà, the NAS drive pops up in Finder's left-hand pane. But you have to log in to the NAS drive manually, or again, write a script that does it for you. Or you can drag an alias to the Login Items under your user account; then it will map the NAS drive automatically on startup. Unfortunately it will also open a Finder window on every startup, but whatever.

Right, now onto making the music files part of the iTunes library. You'll find that iTunes can't monitor folders at all; you have to add files or folders manually. So every time you rip a CD on one computer, you have to go to the other computer, fire up iTunes and add the new files manually. You could of course share the library from one computer, but then the other computer can't copy those files to the portable mp3 player, because the shared library is off limits to iPods. And eventually you'll discover that iTunes doesn't really like having its library on a NAS drive: sometimes it will default back to your local (empty) music folder when you update iTunes; sometimes it will say the files don't exist and mark them with exclamation points, even though the NAS drive is clearly online because you're browsing it in a Finder window. It all feels so... whatever the opposite of modern is.
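The "script" workarounds mentioned above can only go so far, but the folder-watching part is easy enough to sketch. A minimal, hypothetical example (the NAS path and snapshot filename are made up for illustration): it reports files that have appeared in a watched folder since the last run, which you could then add to iTunes by hand or via AppleScript. This is roughly the bookkeeping WMP's library monitoring does automatically.

```python
import json
from pathlib import Path

def new_music_files(watch_dir, snapshot_file):
    """Return files that appeared in watch_dir since the last run.

    Keeps a JSON snapshot of previously seen paths, so each run only
    reports what is new.
    """
    snapshot = Path(snapshot_file)
    seen = set(json.loads(snapshot.read_text())) if snapshot.exists() else set()
    current = {str(p) for p in Path(watch_dir).rglob("*") if p.is_file()}
    snapshot.write_text(json.dumps(sorted(current)))
    return sorted(current - seen)

# Hypothetical usage (paths are made up):
# for f in new_music_files("/Volumes/NAS/Music", "/tmp/music_snapshot.json"):
#     print("new file:", f)
```

You'd run it from cron or an iCal alarm; it's still manual compared to WMP, which is the point of the complaint.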
 
Windows is not unstable at all; I've had just as many application crashes on Mac as I have on Windows, and I have been using Windows for a very long time. Also, it's not always the OS's fault for crashing: Adobe programs are far from perfect, and they crash on both systems. Powerful hardware does not stop crashes; at most it delays them.

Windows IS unstable. I've used Windows for almost all my life, and the same programs have always run better on Mac OS X. It has always been more stable, in every respect.

I also don't understand what you mean when you say it's not modern; you do realize Mac OS X is based on UNIX, which is old too. Considering all the hardware Windows has to be compatible with, it's incredibly stable. Really, when you think about it, Leopard should be far more stable than it is, because it only runs on specific hardware Apple chooses.

I don't really mean that the OS is more modern; it's just that, according to stainlessliquid, modern OSes don't have stability problems, and since Windows has quite a lot of them, it's not a modern OS.

Really, when you think about it, Leopard should be far more stable than it is, because it only runs on specific hardware Apple chooses.

Doesn't change the fact that it's more stable.


BS. A well-maintained Windows machine with Vista or XP and proper hardware drivers is not unstable in any way. I've used Vista for several hours daily since January 2007, and whatever stability issues there were initially got sorted within a 3-month window once hardware manufacturers got the hang of Vista's new driver model. Had it been unstable I simply wouldn't have used it; these are my work machines that have to bring in $$$ for me 24/7, and I can't be arsed to waste time troubleshooting.

It has nothing to do with hardware. Of course bad hardware and unstable drivers do cause problems, but that's just a small part of it. Other problems are caused by Windows itself (it's unstable). Things like the registry just make it worse.

OS X can feign stability pretty well thanks to the fact that the hardware, the OS and much of the popular software all come from a single manufacturer. It lives inside the Apple bubble. But this bubble is very easy to pop, and then the boy in the bubble dies because he has no skin. Some innocent third-party add-on like DivX or Logitech Control Center or whatever can grind a Mac to a halt. How many people did not get the Mac rendition of the Blue Screen of Death when they installed Leopard? What was it this time: DivX, RAID, LCC, Kaleidoscope...? Aww, one little germ sneaks into the bubble and whammo, instant death. Try killing a Windows PC with a weapon as feeble as DivX and see what happens. Hint: nothing that'll make it freeze on bootup...

So what? There are a lot of programs which will break Windows just as easily. Hint: Malware?

As for "modern OS", I wonder when OS X will get network capabilities from this side of the year 2000?

Again, read my reply to the previous poster.
 
Windows IS unstable. I've used Windows for almost all my life, and the same programs have always run better on Mac OS X. It has always been more stable, in every respect.



I don't really mean that the OS is more modern; it's just that, according to stainlessliquid, modern OSes don't have stability problems, and since Windows has quite a lot of them, it's not a modern OS.



Doesn't change the fact that it's more stable.

I'm sorry, but no one person can attest to instability. If Macs are so much more stable, why don't more businesses use them? They obviously would if they were that much more stable, and companies would start making many products for them if businesses switched. They're not more stable at all. Name some stability issues, please, and don't use Apple applications either. The only way to tell if one is more stable than the other is to run the exact same processes, which is impossible: applications run differently on Windows than they do on Mac. I've been using Adobe Photoshop CS4 on Windows 7 for a solid week and it didn't crash once! I used the same application on the Mac for about a week and it did crash.

Oh, and do I need to explain what a "fact" is to you? Because there's no way you can just create one without solid evidence.
 
Windows IS unstable. I've used Windows for almost all my life, and the same programs have always run better on Mac OS X. It has always been more stable, in every respect.
That's what I thought when I bought an iMac and started using it for graphics back in December of 2007. A couple of days later, Adobe Photoshop CS3 had corrupted several PSD documents on the server. It turned out that the combination of Leopard and CS3 resulted in file corruption when saving to a server. This was one of a gazillion early problems with Leopard; it took 3 or 4 10.5.x releases to sort them out. After 17 years with Windows I'm trying to think of any problem I've ever had that comes close to the severity of actual destruction of the files you're working on, but I can't really think of anything. I think a song file I was working on in Cubase on Windows 3.11 was corrupted back in 1994 or 1995, but that's about it.

It has nothing to do with hardware. Of course bad hardware and unstable drivers do cause problems, but that's just a small part of it. Other problems are caused by Windows itself (it's unstable). Things like the registry just make it worse.
Do you have any actual examples of how this supposed instability manifests itself? You keep saying it's "unstable", which is about as helpful as leaving your car at a workshop and describing the problem as "it feels broken somehow". Windows 98 was universally unstable. Windows NT 4.0, 2000, XP, Vista and Windows 7 are not.

So what? There are a lot of programs which will break Windows just as easily. Hint: Malware?
You're comparing the damage that malware (as in explicitly malicious software whose only purpose is destruction) can do to Windows, with the damage that benign, harmless application enhancers such as a video codec can do to OS X? I rest my case...

Techguy172 said:
Really when you think about it Leopard should be far more stable than it is because it only runs on specific hardware apple chooses.
Precisely. One system only runs on a very small selection of machines with combinations of hardware components that were hand-picked by the OS manufacturer. The other system was written to run on millions of hardware combinations; the OS manufacturer has zero control over whether you try to install it on a homebuilt computer, a Mac(!) or an effing bicycle. In other words, it's a marvel that Windows even starts, and it's an epic fail that OS X isn't 100% rock solid.
 
That's just plain wrong. I know more graphic designers, both freelancers and employed, who work with iMacs or MacBook Pros than with Mac Pros. Many even still use G5s with around 2GB of RAM and CS3/4. You simply DO NOT NEED a Mac Pro for graphic design anymore. For animation and video editing, yes, but not for graphic design.

I have more stuff open than you on my iMac with no problems – right now Mail, Firefox, Adium, iCal, iTunes, Acrobat Pro, Illustrator CS4, InDesign CS4, Photoshop CS4 & Handbrake.

You don't have to do everything in Photoshop, vector programs are there for a reason. ;)

This is very true, but throw in Photoshop as well, because it'll still run fine. My home system has 1GB, my work system has 2GB. I keep at least ten apps running at all times (not the most intensive of apps in that example, but you get the idea). What's funny is the same thing could be said back in OS 9 with 256MB(!) of RAM. Adobe, wtf?
 
Windows IS unstable. I've used Windows for almost all my life, and the same programs have always run better on Mac OS X. It has always been more stable, in every respect.

Right, and because YOU continue to say so, that makes it true? I've also used Windows (and Macs) "almost all my life", and I've had just as many crashes, bombs of doom, hardware failures etc. on Macs as I did on Windows. For every blue screen or random reboot I had a black Apple bomb or a total computer freeze.

Ever work in a design lab before OS X? There was a rule, "Save every 5 minutes", and not to make you a good student, but to stop kids from going into a blind rage and destroying their PowerMac/G3 because it corrupted their file again.

Doesn't change the fact that it's more stable.

Sorry to be the first to tell you this, but OPINION does not equal FACT. Facts are backed up with data; if you care to provide your long-established list of data, I'd be happy to look through it. ;)

Now everyone take a breath and realize some real OPINIONS:

1. Macs aren't attacked because there aren't enough of us yet to make it worth a 13-year-old kid's time. In several challenges the Mac was the first to be taken down, with a simple Safari hack. Read it here

2. OS X's first few versions were terrible, and the problems weren't remedied until 10.3. READ ABOUT 10.1 Vista was also terrible until SP1. Does it still have issues? YES, but that is where Windows 7 comes in. If Apple got 3 sub-releases to fix major issues, Windows should be allowed the same.

3. Everyone is entitled to an OPINION. But let's not confuse them with FACTS. My opinion is that OS X is king, but Vista (especially 64-bit) is very capable and not far behind in stability and speed.

Now let's arm wrestle to see who is right... old-fashioned style! :D
 
Ever work in a design lab before OS X? There was a rule, "Save every 5 minutes", and not to make you a good student, but to stop kids from going into a blind rage and destroying their PowerMac/G3 because it corrupted their file again.
From 1999 to 2002 (the Mac OS 9 days for the most part) I worked at a web design company with 50 employees all in one big open room (no cubicles)... I worked on sound design and graphics on PC, and all except one of the graphic designers around me were on Macs. The nearest other group were the coders (Lingo, ASP etc.), and a little less than half of them were on Macs, the rest on PCs.

After a few weeks at the company it dawned on me that most of the computer troubles in the room had to do with Macs, which was weird since I'd always heard of it as the machine that "just works"... it was the Macs that the IT dude was scurrying around and fixing, and it was mostly the Mac users who broke out in tirades of 4-letter words due to crashes in Director, Flash etc. Of course these were issues with Macromedia's and Adobe's buggy software rather than the Mac as such, but it also had to do with RAM issues. Back then Macs couldn't allocate RAM dynamically, you had to set a fixed amount of RAM for each application, and if you blew the limit strange things started happening. The IT dudes hated Macs because they networked so poorly with PCs... Macs for some odd reason couldn't handle dynamic IP allocation at the time, and I remember one weird phenomenon where the IT dude after some nagging allowed us to use ICQ... Mac users were getting ICQ messages intended for other people every time the IP numbers were shifted around. My girlfriend who worked there was on a Mac, and I barely ever dared send her private ICQs again fearing that it would end up on some whole other Mac in the house...

Apple should thank the maker that Windows 95, 98 and ME were so crappy because that's the only conceivable reason why some people still held on to their Macs before OSX came out. OS9 was pathetic and Apple had no software whatsoever. No browser (for a short while there OS9 actually shipped with MSIE and Outlook Express...!), not even a basic mp3 player (Mac users had to pay for some buggy shareware junk called "SoundJam"). Only the hardcore Mac professionals like designers and musicians kept Apple afloat through those dark days...
 
I don't think this has been mentioned before, if I'm wrong please forgive me.

As far as I know, OS X has far better font smoothing at small sizes. If I'm using 8 pt Helvetica Condensed for copy text in InDesign and look at the whole page, the type will still look like 8 pt Helvetica Condensed in OS X, but in Windows it will look like some pixel font.

Are there workarounds for this in Windows I don't know of?

I've only used Windows for games in the last couple of years & I'm really curious for the answer. :)
 
Have to fully agree with Anuba.

I never had so many corrupt .psd files as during the time we had that silly G3 server. Oh man, very bad memories.

To an outsider, OS 9 was funny when you got the bomb message. Not for the user, though.

And I never understood why Apple removed the reset button from their later machines (the PowerMac MDDs didn't have a reset button, and the same goes for the G5s and newer).

If the Mac freezes, you have to shut it down, but in most cases this doesn't work and you have to pull the power cord out of the wall.
 
And I never understood why Apple removed the reset button from their later machines (the PowerMac MDDs didn't have a reset button, and the same goes for the G5s and newer).

If the Mac freezes, you have to shut it down, but in most cases this doesn't work and you have to pull the power cord out of the wall.

Just hold the power button for a couple of seconds. ;)

And you may call me weird: I switched back in the OS 9 days, got to know it in graphic design college and liked it far more than Windows 95/98. For me personally it was more stable, more logical and more user-friendly than Windows ever was.
 
OS 9 had its issues with extensions and such, but it was awesome compared to Windows at the time.
 
Just hold the power button for a couple of seconds. ;)

And you may call me weird: I switched back in the OS 9 days, got to know it in graphic design college and liked it far more than Windows 95/98. For me personally it was more stable, more logical and more user-friendly than Windows ever was.

Oh, Windows 95/98 were terrible, and I have lots of horror stories, so I feel your pain in switching back! I think my problem is I got re-introduced to Macs in school during the late OS 8 and then OS 9 days, and the new G3s in the lab seriously crashed/hung/lost files etc. on a daily basis... it was dark times for me, haha.
 
It IS the OS's fault if you're working with perfectly working, powerful hardware and common software and you still get affected by Windows' instability.
It would be the OS's fault if that happened; luckily, it doesn't.
 
Windows IS unstable. I've used Windows for almost all my life, and the same programs have always run better on Mac OS X. It has always been more stable, in every respect.

The difference is that when Windows breaks (because you installed malware; actually Windows is pretty good against viruses and stuff these days, unless you download it yourself and hit OK a few times, thanks Vista!), you can still run the OS, log in, save all your stuff, or maybe even keep working.

I have never had a Mac break where it was still usable. Whenever OS X has a problem, it's a reinstall and re-image from Time Machine. OS X either works or it doesn't; there's no way to coax it to keep going.

So how exactly is that better than Windows?

I'm not gonna lie, my next PC is going to be a Latitude, not a MacBook, just because of the OS and support that come with the computer.
 
I have never had a Mac break where it was still usable. Whenever OS X has a problem, it's a reinstall and re-image from Time Machine. OS X either works or it doesn't; there's no way to coax it to keep going.
Yes...! What's up with that?

Back when I used Windows PCs exclusively, one of the many variations on the "Mac just works, Windows just breaks" taunts I always heard from Mac users was that we, the Windows users, constantly have to reinstall Windows to keep the machine running. While this was a wild exaggeration, I could see the point buried under the excess of FUD, because after a year or so the registry would be filled up with loads of residual crap from all the installations and uninstallations.

So then when I got my first Mac, I expected no less than being able to solve problems and keep the system fresh without ever having to reinstall it. As soon as problems did occur and I started asking for advice on Mac forums and such, the answer to every single problem no matter how insignificant was "do an archive/reinstall".
- But it's just the...
- Archive/reinstall.
- Are you sure I can't...
- Archive/reinstall.
- Really?
- Archive/reinstall.

If you're lucky, they'll grace you with the "repair disk permissions" suggestion before they hit you with the big archive/reinstall sledgehammer. That's another funny Mac thing: I'm not sure how it manages it, but these fabled 'disk permissions' seem to get screwed up on a daily basis; there are always dozens of errors in there.

Another thing that was supposedly so great about the Mac was that you never had to reboot the system after installing stuff. Oh those silly Windows users, they spend all their days rebooting after installs. After a couple of years with OS X machines I gotta say... is there anything that doesn't require a reboot on a Mac? Every little puny X.0.0.0.0.0.0.0.1 update of iCrap-this or QuickFart-that wants me to sign some goddamn license agreement (identical to the umpteen previous ones I already signed and haven't stopped agreeing with) and then reboot the system. How this is somehow smoother than the silent installs that Windows Update does in Vista without rebooting, I have yet to figure out.

Oh, and another taunt - UAC. Apple even made a commercial, and a funny one at that, about Vista's UAC and the constant nagging. And sure, it's a pain in the rear, at least in the beginning before it settles down. But I fail to see how clicking past an alert is more bothersome than entering your damn password every time you install updates or move something in or out of the Applications folder, like in Leopard. Both Vista and Leopard treat the user like an intruder, just in different ways.
 
I don't understand what people would be doing that requires the reinstall... Anecdotal, I know, but I've been managing about 15 systems since OS 9 and I have never reinstalled a Mac OS.

Tried it once on my parents' computer, but that wasn't what the problem was, so it was unnecessary anyway.

Windows registry, on the other hand....
 
Vista Ultimate (and Business, IIRC) has imaging software (on top of rollback), and there's always Acronis True Image for other versions. Install all your apps, set up Windows to save files on a separate drive/partition, then make a snapshot/image of the boot/program drive.
Personally I make two: one with all the drivers installed and one with all my apps installed :)

Basically the same role/principle as archive/reinstall on a Mac. Saves no end of time when you do actually reinstall on Windows.
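Real imaging tools work at the block level, but the snapshot-and-compare principle behind that workflow can be sketched at file level. A toy illustration only, not a stand-in for Acronis or Vista's built-in backup, and the usage paths are hypothetical:

```python
import hashlib
from pathlib import Path

def snapshot(root):
    """Map each file under root to a content hash: a file-level 'image'."""
    base = Path(root)
    return {
        str(p.relative_to(base)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in base.rglob("*")
        if p.is_file()
    }

def changed_since(root, baseline):
    """Files added or modified since the baseline snapshot was taken."""
    now = snapshot(root)
    return sorted(f for f, h in now.items() if baseline.get(f) != h)

# Hypothetical usage:
# base = snapshot("C:/Apps")   # take the 'image' right after a clean setup
# ...install things, work...
# print(changed_since("C:/Apps", base))
```

A real image also captures the boot sector, registry and permissions, which is why the block-level tools exist; this just shows why having a known-good baseline makes a later reinstall so much faster.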

UAC is a pain for people who know what they're doing, but there are apps which can make it 'silent' and only bug you about important things :)
 
Vista Ultimate (and Business, IIRC) has imaging software (on top of rollback), and there's always Acronis True Image for other versions. Install all your apps, set up Windows to save files on a separate drive/partition, then make a snapshot/image of the boot/program drive.
Personally I make two: one with all the drivers installed and one with all my apps installed :)
Yeah, it's nice... but I prefer a clean install and then letting Windows do its thing and download the myriad of security updates, service packs etc, and then I go about installing the latest software and drivers. I've documented every little step I make and follow those instructions to the letter. I actually find the whole procedure enjoyable, though I haven't done it in about a year... I was thinking of just replacing Vista Ultimate with Win7 beta, I'll be replacing this computer before the Win7 beta expires in September(?) anyway, so...

UAC is a pain for people who know what they're doing, but there are apps which can make it 'silent' and only bug you about important things :)
UAC is considerably quieter in Win7, so I'll put up with it. The only thing about it that's still annoying is the black flash you see when it switches to secure desktop mode. I've tweaked it in Vista so that I get the UAC prompt on the regular desktop... I know it makes the machine slightly less secure, but the black flash was the visual equivalent of rusty nails scraping a blackboard and it just had to go.
 
Yes...! What's up with that?

Back when I used Windows PCs exclusively, one of the many variations on the "Mac just works, Windows just breaks" taunts I always heard from Mac users was that we, the Windows users, constantly have to reinstall Windows to keep the machine running. While this was a wild exaggeration, I could see the point buried under the excess of FUD, because after a year or so the registry would be filled up with loads of residual crap from all the installations and uninstallations.

So then when I got my first Mac, I expected no less than being able to solve problems and keep the system fresh without ever having to reinstall it. As soon as problems did occur and I started asking for advice on Mac forums and such, the answer to every single problem no matter how insignificant was "do an archive/reinstall".
- But it's just the...
- Archive/reinstall.
- Are you sure I can't...
- Archive/reinstall.
- Really?
- Archive/reinstall.

If you're lucky, they'll grace you with the "repair disk permissions" suggestion before they hit you with the big archive/reinstall sledgehammer. That's another funny Mac thing: I'm not sure how it manages it, but these fabled 'disk permissions' seem to get screwed up on a daily basis; there are always dozens of errors in there.

Of course the replies will be less technical, since the number of Mac users experiencing problems is nothing compared to the number of Windows ones; Mac users just use the shortcut, reinstall, and don't want to learn how to fix things. On the other hand, Windows users encounter so many problems that, like it or not, they learn how to fix a lot of them, so fewer people will just tell you to reinstall. If you want more specialized support, you'll have to speak with experienced users or qualified Apple support guys.

Oh, and another taunt - UAC. Apple even made a commercial, and a funny one at that, about Vista's UAC and the constant nagging. And sure, it's a pain in the rear, at least in the beginning before it settles down. But I fail to see how clicking past an alert is more bothersome than entering your damn password every time you install updates or move something in or out of the Applications folder, like in Leopard. Both Vista and Leopard treat the user like an intruder, just in different ways.

Sure, so if anyone comes to your home, touches your Vista PC and wants to install malware, he/she just needs to press "Yes" a couple of times and that's it. On Mac OS X, without the password he/she can't do anything.
 