So I tried your recommended settings on my iMac 5K 2014 with max anti-aliasing and noticed a huge difference at 2560x1440. The graphics are clearly smoother.

I applied the same max anti-aliasing setting on my Corsair One with the Alienware 34 set at 120Hz, and I didn't notice any difference at all, but FPS dropped from 110-120 to the 70s when I turned anti-aliasing on. Why is that? In essence, I am not able to notice any difference between turning AA on and turning it off.
The difference is because AA techniques are more graphically intensive. You're probably running WoW at the full resolution of your Corsair One's monitor, which makes Anti-Aliasing somewhat moot. Anti-Aliasing is supposed to smooth out the rough edges of a lower resolution by using a "half pixel" method to soften jagged lines. It takes some calculation, and it taxes your GPU more. If your computer is already running at full resolution, you can't take advantage of the "half pixel" plotting, and you're just wasting computational resources. On the iMac, though, you have a 5K monitor with spare pixels to use as "half pixels" for Anti-Aliasing.
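
To illustrate the idea in code: here's a minimal Python sketch of supersampling, one common AA family (the scene test and sample counts are invented for the example, not from any real renderer):

```python
# Minimal sketch of supersampling AA: each output pixel averages several
# sub-pixel samples, producing the in-between shades that soften a jagged edge.

def inside_scene(x, y):
    # Hypothetical scene: everything below the line y = 0.5*x is "covered".
    return y < 0.5 * x

def render(width, height, samples_per_axis):
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            n = samples_per_axis
            hits = 0
            for sy in range(n):
                for sx in range(n):
                    # Sample positions spread evenly inside the pixel.
                    x = px + (sx + 0.5) / n
                    y = py + (sy + 0.5) / n
                    hits += inside_scene(x, y)
            # Coverage in [0, 1]; fractional values are the "half pixel" shades.
            row.append(hits / (n * n))
        image.append(row)
    return image

aliased  = render(8, 4, 1)  # 1 sample/pixel: only 0.0 or 1.0, hard stair-steps
smoothed = render(8, 4, 4)  # 4x4 samples: fractional coverage softens the edge
```

Each output pixel shades n*n samples instead of one, which is where the extra GPU work comes from.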
 


I am learning something today.

So at 3440x1440 with max 10/10 settings in WoW, I technically don't need anti-aliasing, am I right?

Is the same true for the iMac 5K: if you run at native 5K resolution, you don't need anti-aliasing?

For the iMac 5K: native 5K resolution vs. a lower resolution with max anti-aliasing, which is more taxing on the GPU? Which one gets better FPS in WoW?
 
AA is less taxing than running at full resolution. And yes, at max resolution you technically don't need anti-aliasing. So in WoW you would get higher FPS at a lower resolution with AA.
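
For the curious, here's the rough pixel arithmetic behind that claim (a big simplification: real cost depends on the AA type, and MSAA/CMAA shade far fewer samples than full supersampling):

```python
# Back-of-the-envelope sample counts, assuming shading cost scales with pixels.
native_5k = 5120 * 2880   # 14,745,600 pixels
qhd       = 2560 * 1440   #  3,686,400 pixels
print(native_5k / qhd)    # 4.0 -> native 5K shades 4x the pixels of 1440p

# So 1440p plus a cheap AA mode (e.g. MSAA or CMAA) is usually much less work
# than native 5K, while full 4x SSAA at 1440p approaches the 5K pixel count.
```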
 

Got it, thanks for the explanation. On the Corsair One, I will just turn off AA then.

Appreciate your time and patience.
 
No problem. I'm having a slow day at work, and this was a fun conversation. :D

It is indeed a fun conversation. Now I have to think carefully and make a decision.

I just found out that Micro Center is selling the iMac Pro with Vega 56 for $3999...a $1000 discount!!!!

How does the Vega 56 compare to the 64 in terms of gaming performance?

I wonder if anyone here has the Vega 56 and plays WoW?
 
Ok, now I'm getting more into my iMac Pro gaming! :) I just played Doom with everything on ultra + AA at 1440p, and the Vega 64 isn't even breaking a sweat (Vulkan). Capping the framerate at 59, the GPU temps are moderate - and no fan noise. I also plugged in my external Thunderbolt sound card, the Focusrite Clarett, and voila - great sound quality. You have to admit the iMac 5K display is pretty darn amazing for gaming, even if it's just 60Hz. The image quality is excellent.

To offset the lack of fan control software, you can tweak the graphics settings and/or use a framerate limiter, stay at 1440p, and make sure the card operates a bit under max load. Then temps are more moderate. I don't think the frequent extreme temperature jumps that otherwise happen are ideal for the long term.
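
The frame limiter idea is simple enough to sketch in a few lines of Python (a toy loop showing the principle, not any specific limiter tool; TARGET_FPS and the render callback are placeholders):

```python
# Hypothetical frame limiter: sleep out the remainder of each frame's time
# budget so the GPU idles between frames instead of running flat out.
import time

TARGET_FPS = 59
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~16.9 ms per frame

def run_capped(render_frame, num_frames=600):
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()                        # the game's actual work
        elapsed = time.perf_counter() - start
        leftover = FRAME_BUDGET - elapsed
        if leftover > 0:
            time.sleep(leftover)              # GPU/CPU rest -> lower temps
```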

PS: Does anyone know how much improvement in gaming sound quality I would get from buying an external USB Sound Blaster card - with support for EAX 5.0 or other gaming sound FX/reverb (using a pair of studio monitors)?
 

You can actually still benefit from AA even when running at native resolution, just not as much as at a lower resolution. It's about how the GPU does the sampling and rendering.

http://www.nvidia.com/object/coverage-sampled-aa.html

From the above link (the very last part), you can see that without AA, the picture consists only of blue, white, and black. That's at native resolution. With AA, the GPU will "create" some grey pixels to make the edge look smoother.

Here is another example. All of 1, 2, 3, 4 are at native resolution. However, 2 (and 4) look smoother because of AA.
[Image: Antialias-vrs-Cromapixel.svg - antialiasing comparison]


I actually just made some samples on my Mac. Both pictures are rendered at my screen's native resolution, and they do look different.

No AA, the edges are more pixelated.
[Image: NOAA.jpg]


With AA. Definitely smoother.
[Image: AA.jpg]


Is the difference very obvious inside the game? No (especially at high resolution).
Are they different? Yes.
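
The screenshots above can be reproduced numerically; here's a tiny Python sketch (the "image" is an invented 0/1 grid) showing how averaging a higher-resolution edge produces the grey in-between pixels:

```python
# Start from a hard black/white edge sampled at 2x resolution, then average
# 2x2 blocks down to screen resolution. The values between 0 and 1 are the
# grey pixels that AA "creates" along the edge.

def downsample_2x(hi):
    h, w = len(hi), len(hi[0])
    return [[(hi[y][x] + hi[y][x+1] + hi[y+1][x] + hi[y+1][x+1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# 4x8 high-res image of a diagonal edge (1 = white, 0 = black).
hires = [
    [1, 1, 1, 1, 1, 1, 0, 0],
    [1, 1, 1, 1, 1, 0, 0, 0],
    [1, 1, 1, 0, 0, 0, 0, 0],
    [1, 1, 0, 0, 0, 0, 0, 0],
]
for row in downsample_2x(hires):
    print(row)   # values like 0.75 and 0.25 appear along the edge
```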
 

Thank you so much for explaining. It is very educational.
 


@h9826790 I just did a test on my Corsair One 7700K 1080Ti 16GB in World of Warcraft.
Monitor: Alienware 34 inch 1440p 120Hz

With anti-aliasing turned off, I get 100-120 FPS, and I am noticing the pixelated issues you pointed out. I always thought they were normal at my monitor's resolution.

I turned anti-aliasing up to the max setting, which in World of Warcraft is SSAA 4X + CMAA, the highest anti-aliasing setting, along with 10/10 ultra settings for everything else. Guess what....those pixelated artifacts almost disappear! I can't believe it!!! The downside is that FPS dropped to 70-80.

Location of test: Stormwind

Question for all: Keeping anti-aliasing on and sacrificing FPS...is it worth it?

Practically and embarrassingly speaking, I seriously cannot see the difference between 120 FPS, 100 FPS, and 70 FPS. Is there something wrong with my eyes?
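
One possible explanation (my own arithmetic, not vision science): frame-time differences shrink as FPS climbs, so the 70-to-120 gap is much smaller in milliseconds than the familiar 30-to-60 gap:

```python
# Frame-time arithmetic: how many milliseconds each frame is on screen.
for fps in (30, 60, 70, 100, 120):
    print(f"{fps:>3} FPS -> {1000 / fps:.1f} ms per frame")

# 120 FPS -> 8.3 ms and 70 FPS -> 14.3 ms: only ~6 ms apart,
# versus ~16.7 ms between 30 and 60 FPS.
```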


Updated: I tested it outside of Stormwind (Stormwind has a lot of people, which sometimes contributes to lower FPS). In places with fewer people, FPS went up to 90-100.

However, I only notice the difference if I actually zoom in and pay attention; then I see the pixelated effect. If I just play normally without anti-aliasing, I barely notice anything.

The GTX 1080Ti is seriously a beast. I knew it was powerful, but I didn't think it was this powerful. I thought it was just a little better than the Vega 64 based on all the reviews I saw online. I guess it is way ahead...
 

IMO, that's the beauty of having a high-Hz monitor. We can enjoy the best possible graphics without tearing or noticeable frame drops.

My experience is more or less the same as yours. I can't tell the difference between 120 and 70 FPS, and the pixelated effect is not that noticeable most of the time. So, my settings are as follows.

Turn the graphics settings as high as possible (including AA) with V-sync off (because my monitor only has FreeSync, no G-Sync), and let the FPS vary between 60 and 144. As long as the variation is within this range, I can't tell the difference and I won't see any tearing.

If the game can still deliver >144 FPS even at max settings, turn on Dynamic Super Resolution. The graphics will be further improved, and performance will eventually drop below 144 FPS.

If the lowest FPS drops below 60, reduce AA. But the 1080Ti is really a beast; most games can run MSAA x4 without issue. It actually lets my 9-year-old Mac deliver max performance in VR :D
[Image: VR details.JPG]
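
In pseudocode, the procedure above boils down to something like this (just a sketch; the 60 and 144 thresholds come from my setup, and the function names are invented):

```python
# Decision rule for tuning: protect the 60 FPS floor first, then spend any
# spare headroom above the 144 Hz ceiling on image quality.

def next_tweak(min_fps, max_fps, aa_level, dsr_enabled):
    if min_fps < 60 and aa_level > 0:
        return "reduce AA one notch"
    if max_fps > 144 and not dsr_enabled:
        return "enable Dynamic Super Resolution"
    return "keep current settings"

print(next_tweak(min_fps=55, max_fps=130, aa_level=4, dsr_enabled=False))
print(next_tweak(min_fps=90, max_fps=180, aa_level=4, dsr_enabled=False))
```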
 
I've been gaming on iMacs since 2013, and I've hardly ever had a problem with tearing. I just use triple-buffered vsync, and capping the framerate at 59 also helps reduce lag (I don't know if this is still applicable in Windows 10, but I do it out of habit). AMD now also has "Enhanced Sync", which will hopefully be available on the iMac Pro when a proper GPU driver arrives.
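
My rough reasoning on why the 59 FPS cap helps (back-of-the-envelope, not a measurement): with plain vsync the GPU can run ahead and fill the buffered-frame queue, so each displayed frame is several refresh intervals old; capping just under the refresh rate keeps the queue drained:

```python
# Worst-case display latency ~ queued frames * refresh interval.
refresh_ms = 1000 / 60                      # 16.7 ms per refresh at 60 Hz

queued_frames_uncapped = 3                  # e.g. a full triple-buffer queue
queued_frames_capped = 1                    # 59 FPS cap keeps the queue drained
print(queued_frames_uncapped * refresh_ms)  # ~50 ms
print(queued_frames_capped * refresh_ms)    # ~17 ms
```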
 


You are able to install a 1080Ti in a 2009 Mac Pro? Wow....that is impressive.

I think with G-Sync I can't go beyond 120Hz, because my monitor is 120Hz; I think G-Sync puts a hard cap on it. I have never seen the WoW FPS meter go beyond 120, it just stops at 120.

When I increased anti-aliasing to the max level, I noticed the FPS tends to fluctuate a lot, but it never went below 60.

I have been gaming on Macs since 2007. It wasn't until recently that I considered a gaming PC. My iMac 5K 2014 just can't do it well. I first considered an eGPU, but after realizing this was the first 5K machine and does not have TB3, eGPU is not supported. So I was seriously debating getting an iMac Pro with Vega 56 or 64 to enhance my experience in WoW, or getting a Corsair One with an Alienware 34 monitor. I ended up getting the latter a few days ago.

Earlier, when I started this thread, I was actually having buyer's remorse over the Corsair One due to too much wiring and it not being as minimalist as the iMac Pro. However, after testing WoW at all max settings the whole day and getting used to the new 34 inch monitor at 120Hz, I have to admit that the Corsair One is indeed a great gaming machine. The Alienware 34 inch at 120Hz is also a great gaming monitor.

Now that I am used to 120Hz, going back to the 60Hz iMac 5K, even though I don't know how to describe the difference, I think my eyes "FELT" the difference..
 
Well, the iMac 2014 was always GPU-underpowered and ran very hot; certainly not a good iMac generation for gaming. I have to set aside some of my initial scepticism about the iMac Pro. A full unofficial GPU driver is right around the corner, and with an external sound card everything works quite well. I'm still hoping for some software ability to control the fans, which is rather important.

How are the latest gaming monitors? I haven't seen any in person. I imagine the image quality itself can't compare with the 5K iMac? Also, the 2017 iMac generation improved the colours, which really pop. Playing games with Reshade using Vibrance at 0.250 looks stunning. Running games at 4K instead of 5K is also a good compromise (and doesn't really need AA), but the GPU runs a lot hotter than at 1440p.

I'm sure you'd get used to 60Hz again, but it sounds like you have a nice setup now. Loving my iMac Pro though. ;)
 

Triple buffering surely helps. I always had that on when I was using the ACD, and I actually needed it ON, otherwise I could see the tearing. What surprised me is that with a gaming monitor, I can now turn it OFF and I never see any tearing.

Yes, the 1080Ti works fine in my 2009 Mac Pro. Some guys even install multiple Titan Xs, but that's mainly for CUDA computation. For gaming, a single 1080Ti makes more sense.

You can always lock the output to 60Hz and check whether you notice the difference now. What I found is that I can only notice the difference in mouse cursor smoothness. All other UI animation, web page scrolling, etc. look the same to me.

Yes, the frame rate will be locked with any kind of V-sync. IMO, this is good; rendering more frames than your monitor can display just wastes power and generates more heat with no benefit. 120 FPS is good enough already.

If there were a G-Sync version of my monitor, I would get it. However, there is only one option for a 49" 32:9 monitor, so there is no alternative G-Sync choice. What I find interesting is that when I used the 27" ACD, I could really see the tearing if I turned off V-sync. But with this 144Hz gaming monitor, I never see any tearing even with V-sync off (even though the GPU delivers more than 200 FPS). Before I switched to this new monitor, I always turned on V-sync because I hate tearing. In fact, I worried that not having G-Sync might affect my gaming experience, but luckily I don't actually need it with my current setup.

I think 60Hz vs 120Hz is very personal. IMO, a 60Hz monitor is not a huge problem for gaming, as long as the GPU can stably deliver 60 FPS and never drop below it (I am just a casual gamer). Do I want to go back? Not at this moment. What do I really want? An OLED monitor at 7680x2160 @ 75Hz (or above). That would be good for both gaming and working, but obviously it won't happen in the foreseeable future. I really love this super-ultrawide experience; it makes gaming like watching movies (I love RPGs), super enjoyable :D And I really don't want to go back to anything 27" 16:9, no matter whether it's 5K or 8K :p
[Image: Ultrawide gaming_filtered.jpg]
 

Up until yesterday, I really thought I wouldn't notice the difference between 60Hz and 120Hz; I never truly understood why so many YouTube tech reviewers vouch for the higher refresh rate and abandon a beautiful iMac. After setting up the Corsair One to play WoW at a steady 120Hz, I went back to my iMac 5K. I still love the high resolution, but for some reason I can't explain, there is a difference....it is like looking at the new iPad Pro (2017) vs the iPad Pro (2016)...it is very subtle...but your eyes "feel" it. I agree with you that 60Hz is still great for gaming. If the iMac Pro with Vega 64 can run WoW at 60 FPS at max settings in 5K at all times, I am seriously considering getting it.

The Corsair One is really super fast: fast boot time, fast load times...pretty much everything is faster...I feel that I am using my iMac less and less now.

Just curious, where did you get driver support for the 1080Ti on macOS?
 

Nvidia usually publishes the web driver for their GPUs within 24 hours of an OS update.

You can get it from their official website, but this page has all the direct links, which is very handy.

http://www.macvidcards.com/drivers.html
 
I've been doing some experimenting with my iMac Pro w/Vega 64. I turned the fans up to 2100rpm in OSX and booted into Windows. Pushing the GPU at 100% (Crysis 3, or demanding DX11 games at 4K, etc.) pushed the GPU temperature into the high 80s (86-89C) after about 10-20 minutes. At this speed the iMac Pro fans are fairly audible, although they can go up to 2500rpm. In terms of noise-to-rpm ratio, it's pretty similar to my late 2012 iMac, though of course the iMac Pro fans are more powerful.

Judging from this, the overclocking potential for Vega 64 might not be very big, as the Vega Pro 64 runs rather hot under full load (no big surprise there). I'm new to AMD graphics cards, but my old late 2012 iMac w/680MX could be overclocked aggressively (+250/+375) with very manageable temperatures and 100% stability, except in especially demanding games like Crysis 3, which would push the temperature up to 90C or more at that overclock.

Many desktop Vega owners have been able to significantly undervolt their cards, as AMD runs the voltage settings rather high on Vega. I would love to test whether it's possible to tweak the Vega Pro 64 for more performance. Until Apple/AMD decides to release a proper Bootcamp GPU driver for the Vega Pro 56/64, this is all guesswork. Right now we have to run a Radeon 580 driver, so there's no clock, voltage or fan control available in overclocking programs like MSI Afterburner.

As far as gaming in 5K, it's not necessary IMO. 4K looks super sharp, like native resolution. Put some Reshade SMAA on top (and Vibrance at 0.250 to make the colours really pop) to smooth out the most obvious things like wires and straight lines, and everything looks great.
 
Seeing all the discussion and enthusiasm here, I am pretty sure the gaming market is just as big as the professional market, if not bigger. From a business standpoint, I don't understand why Apple refuses to invest in gaming.
 
Let's just say that Apple did in fact start catering to, or at least made more noise to make themselves more visible to, the gaming community. How much do you think an actual gaming-grade system made by Apple would cost? A machine that would have to be upgradable in terms of GPU(s), RAM, and storage, and have more gaming-centric CPUs such as a higher-clocked i7. The cost would be stratospheric relative to equivalent and far more customizable Windows boxes, which already have a well-established gaming community. The iMac Pro base model alone starts at 5 grand. I wish Apple luck in trying to convince the majority of gamers to spend that at a minimum on a system. Not many gamers can write off a 5-10 thousand dollar system as a business expense or recoup the money by raiding :D

Apple doesn't need to try to convince gamers to buy and use their machines; they need to convince creative professionals that they weren't abandoned. If the hardware Apple turns out just happens to be somewhat viable for gaming, that's an added plus, but it shouldn't be their primary focus. As much as I would love Apple to roll out a high-end gaming rig, I just don't see it happening at all. The only really promising venture is seeing what and how this forthcoming "modular" Mac Pro will be. I'm looking forward to it. The one thing we do know about it is that it will be obscenely expensive! :)
 
We're not talking about creating a separate gaming platform or anything, just some friggin' functional audio and graphics drivers for Bootcamp. Who at Apple thought it was OK to ship the iMac Pro, an expensive high-end machine, without basic Bootcamp driver support? Very disappointing.
 
There is some good news though. Matt (behind bootcampdrivers.com) is supposed to release a full driver this weekend (hopefully), and maybe this will perform better. Right now, AFAIK you can't undervolt or overclock, or use newer software features like Enhanced Sync..


Any update on these drivers, and how is the performance after using them?
 
You are mostly right; it is really not that bad. I got the Corsair One Pro 7700K 16GB 1080Ti recently. It is the most beautifully designed PC on the market in my opinion, and by PC standards it is pretty minimalist, hence I bought it.

I ended up going back to a wired mouse, the DeathAdder Elite; I am not sure which is the best wireless mouse for WoW. I got the Corsair K63 wireless keyboard, thinking that it would free me of wires. I ended up wiring it to charge it, because the battery wouldn't last more than 2 days with the lighting on, and the K63 also doesn't automatically shut down to save battery the way the Apple Wireless Keyboard does, which is quite disappointing. Then there are also the speakers: I got the Nommo Chroma, which has wires, and I don't like that.

If someone can recommend a great wireless mouse, keyboard and speakers, that would be great.
Never heard of the Corsair One. Looks like a nice machine. How is the fan noise, and is there any throttling?

I'm thinking of getting the F131 from Maingear or the Bolt from Digital Storm. A small tower with a full acrylic side panel and hard-line tubing...what a dream.
But I'm open to a mini tower that looks sleek and minimalist like that Corsair.

IMO, if one is at all serious about gaming, just get a dedicated PC for it (and an ultrawide monitor ;)).
Mac for everything else.
 