
overheated
macrumors member · Original poster · Jun 3, 2004
Can someone with more knowledge than me give some feedback on how a new 2011 MacBook Pro with the 6750M chip would compare against a 2008 Mac Pro with the NVIDIA GTX 285? I play the COD series in Boot Camp on Windows 7.

My thinking is: if the graphics performance would be similar or perhaps superior, I would replace the Mac Pro with the 2.3 GHz 15" 2011 MacBook Pro.

I play on a 30" ACD at a resolution of 1440x900 or thereabouts, with most settings turned way down to get max FPS.

Please chime in with your thoughts.
 
Is this a joke? There's no comparison between the GTX 285 and the 6750M. The 285 trumps the 6750M.

Let's put it this way: a GTX 285 is roughly 1.4-2x faster (depending on the title) than the desktop Radeon HD 5770 (800 cores), which in turn is about 1.8x faster than the mobile 6750M (480 cores). So for gaming you're looking at a 285 that is roughly 2-4 times faster than the mobile chip in the MBP, depending on the game.

Go with the Mac Pro unless you plan on playing DX11 games on mid settings.
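The arithmetic behind that "2-4 times" claim is just the two ratios multiplied together. A quick sketch (the 1.4-2x and 1.8x factors are the poster's own rough estimates, not measured benchmarks):

```python
# Combine the poster's rough speedup factors.
# These multipliers are forum estimates, not benchmark results.

gtx285_vs_5770 = (1.4, 2.0)   # GTX 285 vs desktop HD 5770, range by title
hd5770_vs_6750m = 1.8         # desktop HD 5770 vs mobile 6750M

low = gtx285_vs_5770[0] * hd5770_vs_6750m
high = gtx285_vs_5770[1] * hd5770_vs_6750m

print(f"GTX 285 vs 6750M: roughly {low:.1f}x to {high:.1f}x")
# → roughly 2.5x to 3.6x, in the ballpark of the "2-4 times" claim
```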
 
Is this a joke? There's no comparison between the GTX 285 and the 6750M. The 285 trumps the 6750M.

Let's put it this way: a GTX 285 is roughly 1.4-2x faster (depending on the title) than the desktop Radeon HD 5770 (800 cores), which in turn is about 1.8x faster than the mobile 6750M (480 cores). So for gaming you're looking at a 285 that is roughly 2-4 times faster than the mobile chip in the MBP, depending on the game.

Go with the Mac Pro unless you plan on playing DX11 games on mid settings.

He's right; I had a GTX 285 in my PC last year (I've since moved to a GTX 480). The GTX 285 will blow the doors off any laptop's GPU. The 2011 MBP is still very fast compared to other laptops.
 
I play on a 30" ACD at a resolution of 1440x900 or thereabouts, with most settings turned way down to get max FPS.

BWAHAHAHAAA!!!

Seriously, how many hundreds of frames per second do you need? There comes a point where the framerate won't increase due to physical bottlenecking of bandwidth.

Framerates above around 60 fps are completely pointless, and if a game has motion blur, framerates above around 25 fps are pointless.
 
Not true.....??? Why are you here? To waste people's time, it seems.

You don't have to like his attitude, but I actually agree with him.

Unless a game engine's hitbox code is tied to fps for multiplayer purposes, more than 60 fps seems pointless beyond your own perception of what you want. At 60 fps, the game reacts smoothly enough that it doesn't matter where in time, between drawn frames, your eyes are sampling their roughly 30 fps; it will still be showing you more frames than your brain physically registers.

Personally, I think any game where you need a high fps for a competitive advantage is badly designed. It may not be a hack or a code, but it's still a cheat, and that is lame.
 
You don't have to like his attitude, but I actually agree with him.

Unless a game engine's hitbox code is tied to fps for multiplayer purposes, more than 60 fps seems pointless beyond your own perception of what you want. At 60 fps, the game reacts smoothly enough that it doesn't matter where in time, between drawn frames, your eyes are sampling their roughly 30 fps; it will still be showing you more frames than your brain physically registers.

Personally, I think any game where you need a high fps for a competitive advantage is badly designed. It may not be a hack or a code, but it's still a cheat, and that is lame.

Well, that's your own personal opinion. 30 FPS is not enough to play competitively, especially in FPS games. I require a minimum of 60 FPS, with 120 FPS preferred.
 
Well, that's your own personal opinion. 30 FPS is not enough to play competitively, especially in FPS games. I require a minimum of 60 FPS, with 120 FPS preferred.

Unless you actually have one of the suits your avatar depicts (i.e. hopped up on nanite speed), your eye response time and reflexes are insufficient to notice any difference between 60 fps and higher. And since the only thing you'd notice at lower fps is ghosting, which a bit of motion blur removes, there's not even any point in more than 30.
 
If I get less than 125 fps in Quake it is very noticeable; same with Call of Duty or any Quake-engine game. If you get less than 100 fps in any Half-Life-engine game, it is very noticeable.

Furthermore, I cannot play at less than 85 Hz or my eyes hurt after a while; I prefer 100 or 120. The only competitive FPS game I have ever seen that feels smooth at 60 fps was Quake 4, and that was the Doom 3 engine's cap.

It doesn't matter if you guys think 30 or 60 fps is OK for you, because for most people it's not, and saying that just makes you look stupid. If you can play those games at 30 or 60 fps, more power to you, but dropping more than 20 fps from the standard specs used in those games makes it unplayable for me.
 
I know you supposedly can't really see more than 30 fps, so I find this back-and-forth interesting. I might agree that up at 60 fps there is a smoothness you don't have at 30 fps, or that in high-demand moments the box producing 60 fps will dip down less than the one that usually holds 30 fps. My 3.2 GHz P4 would run Unreal Tournament at about 80 fps. For the overall experience, 80 seemed better than 30, but this is not a scientific opinion. For a television analogy, it's kind of like why 240 Hz is better than 120 Hz. :)
 
If I get less than 125 fps in Quake it is very noticeable; same with Call of Duty or any Quake-engine game. If you get less than 100 fps in any Half-Life-engine game, it is very noticeable.

Furthermore, I cannot play at less than 85 Hz or my eyes hurt after a while; I prefer 100 or 120. The only competitive FPS game I have ever seen that feels smooth at 60 fps was Quake 4, and that was the Doom 3 engine's cap.

It doesn't matter if you guys think 30 or 60 fps is OK for you, because for most people it's not, and saying that just makes you look stupid. If you can play those games at 30 or 60 fps, more power to you, but dropping more than 20 fps from the standard specs used in those games makes it unplayable for me.

No, lack of paragraphs makes people look stupid. Keep your insults to yourself. We're talking about hardware performance here, not professional gaming careers in Korea.
 
Well that's your own personal thoughts. 30 FPS is not enough to play competitively, especially in FPS games. I require a minimum of 60FPS while 120FPS being preferred.

The short and skinny of my post was that more than 60 fps is wasted, not more than 30 fps. Even though eyeballs won't process more than 30 fps, like someone else said, what you will notice is ghosting, and 60 fps gets rid of a lot of that.

To you Quake-engine gamers who talk about how necessary 120 fps is: I've already addressed that being a competitive advantage, not a performance requirement, as if you'd notice a visual difference.
 
30 fps is not enough for a PC game.

It's a total myth that your brain can only handle that much in a video game.

It's not true in games, because they don't have natural motion blur like movies do.

Anyone who spreads this crap is not using their two eyes. It's very easy to tell the difference between 60 fps and 30 fps.

Once you get beyond 60 fps, the importance of frame rate drops dramatically. It's always good to have some reserve, though, for when there is lots of action on screen or a scene is poorly optimized.
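One way to see why the jump from 30 to 60 fps feels much bigger than the jump from 60 to 120 is to look at frame times rather than frame rates. A quick illustrative sketch:

```python
# Convert frame rates to per-frame times (ms) to show diminishing returns:
# each doubling of fps halves the frame time, so the absolute gain shrinks.

def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame is on screen at a given frame rate."""
    return 1000.0 / fps

for lo, hi in [(30, 60), (60, 120), (120, 240)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo:>3} -> {hi:>3} fps saves {saved:5.2f} ms per frame")
# 30 -> 60 saves 16.67 ms, 60 -> 120 saves 8.33 ms, 120 -> 240 saves 4.17 ms
```

Each step doubles the frame rate, but the absolute reduction in frame time (and latency) keeps halving, which is consistent with the thread's observation that differences above 60 fps get harder to spot.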
 
Unless you actually have one of the suits your avatar depicts (i.e. hopped up on nanite speed), your eye response time and reflexes are insufficient to notice any difference between 60 fps and higher. And since the only thing you'd notice at lower fps is ghosting, which a bit of motion blur removes, there's not even any point in more than 30.

Well, it seems you don't know what you are talking about either. I can see far beyond 60 FPS. I can notice a difference between 30, 60, 90, 120, and 240 FPS. Keep your crap opinion to yourself, please. It is a fact that humans can register way past 60 FPS.
 
If I get less than 125 fps in Quake it is very noticeable; same with Call of Duty or any Quake-engine game. If you get less than 100 fps in any Half-Life-engine game, it is very noticeable.

Furthermore, I cannot play at less than 85 Hz or my eyes hurt after a while; I prefer 100 or 120. The only competitive FPS game I have ever seen that feels smooth at 60 fps was Quake 4, and that was the Doom 3 engine's cap.

It doesn't matter if you guys think 30 or 60 fps is OK for you, because for most people it's not, and saying that just makes you look stupid. If you can play those games at 30 or 60 fps, more power to you, but dropping more than 20 fps from the standard specs used in those games makes it unplayable for me.

So true, but if your eyes hurt, then it's probably because you are using a CRT. Try Acer's 120 Hz LCD monitor. It's so much better than a CRT.
 
Well, it seems you don't know what you are talking about either. I can see far beyond 60 FPS. I can notice a difference between 30, 60, 90, 120, and 240 FPS. Keep your crap opinion to yourself, please. It is a fact that humans can register way past 60 FPS.

If it's a fact, I'm sure you'll be posting a link to something credible? I don't have much of an opinion one way or another. But if it's a fact, back it up. If you can't, then...
 
If it's a fact, I'm sure you'll be posting a link to something credible? I don't have much of an opinion one way or another. But if it's a fact, back it up. If you can't, then...

It's a FACT that I can tell a difference. I don't care what you think I can or can't see. Everybody's eyes are different.
There is a very noticeable difference in smoothness between 60, 120, and 240 Hz.
Ask any true gamer, tell them you can't see past 30/60, and you will be told you are wrong every time.
 
The difference between 30 FPS and 60 FPS is easily noticeable, that's for sure.

60 FPS to 90 FPS is somewhat noticeable.

I always prefer 100+ FPS for online gaming, just in case something strange happens and the FPS dips 20-30 or so. It's ALWAYS nice to have reserves.



Anyway, I'm actually more curious about the speed difference between the MBP model 6,2, which has the GT 330M (512 MB), and the new ATI one. How much faster is it?
 
It's a FACT that I can tell a difference. I don't care what you think I can or can't see. Everybody's eyes are different.
There is a very noticeable difference in smoothness between 60, 120, and 240 Hz.
Ask any true gamer, tell them you can't see past 30/60, and you will be told you are wrong every time.

Apologies. You seem to be confused about what I said.

You said it was a fact that "humans" can discern between 60 fps and up (say, 80 fps, 100 fps, etc.). I didn't say anything about refresh rate (60 Hz, 120 Hz, etc.), nor did I say that you mentioned it.

For the record, I, and most people who know what they're looking for, can differentiate between 60 Hz, 120 Hz, etc. refresh rates. I find it pretty hard to game on LCDs in general; that's one of the several reasons I own a plasma TV.

I don't know that I can differentiate between 60fps and 80fps, or between 80 and 120fps. You said it was a fact that humans can, so I simply asked for a source for that fact. I never said you could or couldn't tell the difference between different frames-per-second rates.

I'm not sure where to find a "true gamer", but if I see one, I'll be sure to inquire.
 
Wow! It's incredible how defensive people are over their own eyesight!

I know I can tell the difference between a 50Hz and a 100Hz TV screen, but it's only because TVs don't do motion blur, and hence I see ghosting in fast movement. I can't tell the difference between 100Hz and 200Hz.

I have no doubt the human eye can easily see 200 Hz and over; that's been "proved", i.e. if you flash a picture up on a screen for 5 ms, you can tell what it was. But that's not relevant, as it's just a residual flash, and not comparable to a computer game.

The eye effectively compresses the huge amount of data it receives by a monumental amount, and there is a physical data transfer and processing time delay of around 100ms before you actually "see" anything.

Considering all this, any gameplay difference you may experience at frame rates over around 30 fps is going to be from ghosting only. I'd bet (perhaps the consumption of a hat) that any perceived lack of gaming ability at frame rates below 60 fps vs. above is purely psychological.

Which is what Full Screen Motion Blur was designed to address.
 
I find it weird that he can see more than 60 fps when the ACD itself is 60 Hz and can't show more than 60 fps anyway. Anything over 60 is actually wasted, which is why a lot of games give you the option to lock the fps at 60.

And before you ask, I have a Samsung 3D computer LCD that can also display 120 Hz, and I honestly can't really see any difference over 60. Gameplay is smooth, but I'd rather push resolution and effects in games and get 60 fps than play at a higher framerate.
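The 60 Hz cap mentioned above can be illustrated with a toy simulation: with vsync-style presentation, a display samples whatever frame is newest at each refresh, so rendering faster than the refresh rate just means some frames are never shown. This is a simplified model, not any particular game's presentation code:

```python
# Toy model: a 60 Hz display picks the newest finished frame at each
# refresh, so rendering at 120 fps still presents only 60 frames/second.

REFRESH_HZ = 60    # display refresh rate
RENDER_FPS = 120   # GPU render rate
SECONDS = 1

# Times at which the display refreshes over one second.
refresh_times = [i / REFRESH_HZ for i in range(REFRESH_HZ * SECONDS)]

# Frame i finishes rendering at time i / RENDER_FPS; at each refresh the
# display shows the newest frame that has finished by then.
shown = {int(t * RENDER_FPS) for t in refresh_times}

print(f"rendered {RENDER_FPS * SECONDS} frames, displayed {len(shown)} unique")
# 120 frames rendered, only 60 unique frames ever reach the screen
```

Of course this says nothing about input latency or physics tick rates, which is where the "tied in with fps" argument earlier in the thread comes from.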
 
I find it weird that he can see more than 60 fps when the ACD itself is 60 Hz and can't show more than 60 fps anyway. Anything over 60 is actually wasted, which is why a lot of games give you the option to lock the fps at 60.

And before you ask, I have a Samsung 3D computer LCD that can also display 120 Hz, and I honestly can't really see any difference over 60. Gameplay is smooth, but I'd rather push resolution and effects in games and get 60 fps than play at a higher framerate.

And some people who are sick get better when they take a placebo... the mind is a powerful thing :)
 
If it's a fact, I'm sure you'll be posting a link to something credible? I don't have much of an opinion one way or another. But if it's a fact, back it up. If you can't, then...

http://www.boallen.com/fps-compare.html

Here's 15 vs 30 vs 60. I can definitely tell the difference. And in games, when it's not a static GIF but there are other guys moving as well as yourself, à la FPS games, even higher frame rates help it feel smooth.


According to Quazimojo, the 6750M is very comparable to the GTX 260M.

https://forums.macrumors.com/threads/1102639/

This is pretty accurate, looking at my GTX 260M benchmarks from my laptop over a year and a half ago. =) If the GTX 460M to GTX 485M jump is a good indicator, the 485M, although not much "higher" in terms of nomenclature, is nearly TWICE as powerful as a 460M.
 