The minimum specs for the game require a 3 GHz processor and 128 MB of VRAM; the recommended specs are 3.5 GHz and 256 MB.
Even though your MacBook is one of the newer versions with the X3100, I'd be surprised if the game even started up, let alone ran at playable speed.
I would like to play this game on my MacBook... is that possible?
Hell no! Nor on a baseline HD 2400 XT iMac.
Well, sorry, but you need to rethink what you said. Not true: Rainbow Six Vegas is playable on an iMac with the HD 2400 XT when you turn the graphics settings down to between low and medium.
It's still a mobile chipset, and the HD 2600 Pro in the iMac is a mobile chipset too. The Mac Pro is the only model that can run this game at max settings (HD 2600 XT and GeForce 8800 GT).
Well, sorry, but you need to rethink what you said.
I'm an all-around gamer (PC/Mac/Linux) and have tested everything from a TNT, 9200 SE/Pro, X800 XL/XT, 6600 GS/GT, and 7900 GS up to an 8800 GTX in SLI. Last year R6 Vegas was my FPS of choice, and I tested it on a 7900 GS and on an 8800 GTX, both single and in SLI. With the 7900 GS it was barely playable, the framerate sitting between 20 and 40 fps depending on the situation, at native 1680x1050 on medium/low settings. The 7900 GS is a lot more powerful than the 2600 XT, which struggles at 1680x1050 with almost everything set to low.
So, if you call a 10-15 fps slideshow gaming, then sure, no problem.
Max settings with the 2600 XT in a bad Xbox port using Unreal Engine 3?? Surely you don't know what max settings are...
In theory it will run on the X3100, since it supports Shader Model 3, but no driver is optimized for it.
Sometimes I forget that some people actually play at 800x600, but for me that's not playing by any means. I loved the game itself, but I have to agree the port's visuals are crappy at native resolution or at 800x600, and it scales very badly on any kind of hardware.
Here you have some analysis: http://www.anandtech.com/video/showdoc.aspx?i=2895&p=2
Don't buy the claim that the HD 2600 XT is much better just because of 200 MHz more on the core. Those stupid Apple guys fitted the GDDR3 version instead of the GDDR4 one. Both are still sh/tty though.
Fairly certain the PC version requires a Shader Model 3-compliant video card, which none of the Intel onboard chipsets are. Sorry!