Haha, barely perceptible? You're crazy, man, and you're claiming that I know nothing about computers?

Iris 5100 vs. HD 4000:
Shader processing units: 160 vs. 16
Actual shader performance: 314 vs. 51

What else do you want to bring up? ROPs? Texture Compute? ALUs? Feature sets? The Iris 5100 is vastly superior in all of these.

http://www.game-debate.com/gpu/inde...-5100-mobile-vs-intel-hd-graphics-4000-mobile

I'm not disputing that the update could have been better; I obviously would have shelled out for a quad-core Iris Pro 5200. But come on, man, don't be ignorant! Iris 5100 vs. HD 4000 barely perceptible? Please. A game that is unplayably slow on the HD 4000 will run on the Iris 5100. That is perceptible.
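For scale, here's a quick back-of-envelope on the game-debate figures quoted above (a rough sketch; these are spec-sheet numbers from that comparison page, not measured game performance):

# Spec-sheet ratios from the game-debate comparison quoted above (paper figures, not game benchmarks)
iris_5100_units, hd_4000_units = 160, 16
iris_5100_shader_perf, hd_4000_shader_perf = 314, 51

print(iris_5100_units / hd_4000_units)              # 10.0x the shader processing units
print(iris_5100_shader_perf / hd_4000_shader_perf)  # ~6.2x on game-debate's shader metric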

For all those extra shaders and other extra hardware that Intel is bragging about, the Iris should be 2-3x faster in REAL GAMES (3DMark/Heaven benchmarks aren't games).
Game-debate unfortunately only compares shaders and doesn't show real game benchmarks comparing the two.

Show me the benchmarks that demonstrate the superiority of the regular Iris. When running at decent resolutions (at least 1080p, not 720p), the difference between the two becomes nil. Even at crappy resolutions with a not very demanding game (BioShock is an older title running an outdated graphics engine), the difference here is pathetic, as both chips can't even reach 30 fps, which IMHO is the bare minimum.
[attached benchmark chart]

This is even worse:
[attached benchmark chart]


Granted, that's the HD 5000, but the Iris 5100 hasn't been shown to provide a huge performance increase over it either. The extra cache on the Iris Pro makes all the difference...
 

I don't know if this will paste well in here, so here is the link:
http://www.anandtech.com/show/7072/intel-hd-5000-vs-hd-4000-vs-hd-4400

Essentially, comparing the HD 4000 to the HD 5000 shows anywhere from a 2.3% to a 40.8% improvement in performance, depending on the game.

I agree that you can cherry-pick games that see very little benefit. You can also pick games that show a large improvement. And this comparison is against the HD 5000 you mentioned, not the Iris 5100. The point, though, is that GPUs don't run on magic: the Iris has far more shaders, more ALUs, and a more advanced feature set. It's much more powerful. I don't know why some games improve by only 2.3%, but I'd guess they simply don't make good use of the GPU's resources; it might seem obvious, but games that don't stress the GPU don't benefit much from a better one.

Intel HD 5000 (Haswell ULT GT3) vs. Intel HD 4000 (Ivy Bridge ULV GT2), HD 5000 advantage:
GRID 2: 16.2%
Super Street Fighter IV: AE: 12.4%
Minecraft: 16.9%
Borderlands 2: 3.0%
Tomb Raider (2013): 40.8%
Sleeping Dogs: 6.5%
Metro: LL: 2.3%
BioShock 2: 24.4%
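
As a rough sanity check, averaging those eight per-game numbers (a quick sketch using nothing beyond the figures in the table above):

# Per-game HD 5000 advantage over the HD 4000, from the AnandTech numbers above (percent)
advantages = {
    "GRID 2": 16.2,
    "Super Street Fighter IV: AE": 12.4,
    "Minecraft": 16.9,
    "Borderlands 2": 3.0,
    "Tomb Raider (2013)": 40.8,
    "Sleeping Dogs": 6.5,
    "Metro: LL": 2.3,
    "BioShock 2": 24.4,
}
mean_advantage = sum(advantages.values()) / len(advantages)
print(f"Average HD 5000 advantage: {mean_advantage:.1f}%")  # ~15.3%, matching AnandTech's own summary quoted below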
 

The link you showed has all the games running at mediocre resolutions...

To quote AnandTech: "On average, Intel's HD 5000 offered a 15.3% performance advantage over Intel's HD 4000 graphics. Whether or not that's impressive really depends on your perspective. Given the sheer increase in transistor count, a 15% gain on average seems a bit underwhelming."

My point was that neither chip is even decent for gaming, unless you think 720p resolution is 'decent'. Once you go to 1080p, the gap gets much smaller due to the memory bandwidth limitations of DDR3. Then again, if you're dead set on getting the mini as a 'gaming device' (which IMHO is ludicrous), go ahead and get the Iris mini. The 5100 just hasn't really panned out performance-wise, unlike its much faster Iris Pro sibling, which has some pretty decent gaming performance even at higher resolutions.
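
To put very rough numbers on the bandwidth point (an illustrative sketch only; it assumes nominal dual-channel DDR3-1600 and ignores that sustained bandwidth is lower and shared with the CPU):

# Pixel counts per frame at the two resolutions discussed above
pixels_720p = 1280 * 720      # 921,600
pixels_1080p = 1920 * 1080    # 2,073,600
print(pixels_1080p / pixels_720p)   # 2.25x more pixels to shade and move every frame

# Peak dual-channel DDR3-1600 bandwidth, shared between CPU and GPU:
# 1600 MT/s * 8 bytes per transfer * 2 channels
print(1600e6 * 8 * 2 / 1e9)         # 25.6 GB/s, the same regardless of resolution

Whatever per-pixel memory traffic a game generates, 1080p demands roughly 2.25x as much of it from the same fixed pool, which is part of why the Iris Pro's 128 MB of eDRAM helps so much more than the 5100's extra shaders do.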
 
Blah blah blah, who cares if the graphics are faster? You can't game on either, and giving up upgradeable RAM, quad-core, and dual SSD drives for RAID is NOT worth it. I'd go back to Intel HD 3000 graphics before I would give up quad-core and upgradeability. The new Mac mini sucks. End of story.
 