Is it just me, or did the point of this topic get completely lost? I thought the comparison was supposed to be between Crysis 1 and Crysis 2 performance, not whether you have a *gaming* machine or a clarification of hardware specs. If Crysis 1 runs well on a particular machine, and the devs say Crysis 2 should run even faster than 1, logic dictates that the same machine that ran Crysis 1 should run Crysis 2 even better.

Wouldn't that be the only thing discussed here?
 
Maybe it's supposed to run better on comparable machines at the time? As in, when Crysis 1 was released, most current-gen systems were unable to handle it, while current-gen systems will run Crysis 2 fairly well?
 
Crysis had Low, Medium, High, and Very High settings. I get the impression that Crysis 2 has High, Very High, and Ultra High.

I could play Crysis quite well on my 8600M GT if I put everything to its absolute lowest. I wouldn't want to try running Crysis 2 on that. It would explode. And that is disappointing, considering how much hype Crytek put behind the "Runs Faster Looking Better" idea.

I'm really hoping performance improves, because it doesn't run nearly as well as it should, IMO.
 
Maybe it's supposed to run better on comparable machines at the time? As in, when Crysis 1 was released, most current-gen systems were unable to handle it, while current-gen systems will run Crysis 2 fairly well?

Well, from what I've experienced, that's exactly what I've been thinking they meant as well.
 