Do you have Rocket League, League of Legends and/or Cities: Skylines (with or without the After Dark expansion)?
Windows or OS X doesn't matter, but OS X is of course preferable. And if you have any of these games, could you try out different settings and resolutions to see which combination gives the best visuals while keeping performance acceptable? (Something like the sketch below.)
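To make the request concrete, this is roughly the kind of sweep I mean (a hypothetical sketch; the resolutions, presets and FPS values are invented placeholders, not measurements):

```python
# Hypothetical settings sweep: which resolution/preset combinations keep
# the game above a target frame rate? All FPS values here are invented
# placeholders - you'd replace them with numbers measured in-game.
TARGET_FPS = 60

results = {                        # (resolution, preset) -> measured avg FPS
    ("5120x2880", "very high"): 40,
    ("2560x1440", "ultra"):     80,
    ("1920x1080", "ultra"):     120,
}

for (res, preset), fps in sorted(results.items(), key=lambda kv: -kv[1]):
    verdict = "playable" if fps >= TARGET_FPS else "too slow"
    print(f"{res} @ {preset}: {fps} FPS -> {verdict}")
```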

Great thread!
Hi Tanax, great to have you here!

I don't have either Rocket League or Cities: Skylines, but I have LoL, and as I stated in the OP:

League of Legends runs at 60 fps at 5K, very high detail (yes) - Windows 10

But beware: at 5K the UI is ridiculously small, and you'd have to crank up your mouse sensitivity accordingly. So if you're gaming competitively, stay at 1440p ultra and don't worry about a thing except for the lag ;)
 
I just tried some rift runs at 1440p with all details on high. I have no problem getting a steady 60 FPS in D3 on my i5/M395X.

At times, yes. I'd suggest you don't play high-level Greater Rifts with 4 players very often. That really isn't optimized very well. Frame rates can take a total dump.
 
At times, yes. I'd suggest you don't play high-level Greater Rifts with 4 players very often. That really isn't optimized very well. Frame rates can take a total dump.
Very interesting - I wonder whether that's the case under Windows as well, and whether the GPU is actually the culprit.
 
I think the bottom line is that the original post is about upgrading from the 650M. Of COURSE the M395X is going to feel like a massive upgrade (because it is!).

For the rest of us that have been using the M295X for a year, this is hugely disappointing. If anything it's poor value for money.

I suspect next year we'll see a massive GPU upgrade from Apple that will blow us all away. At least, I tell myself that...
 
Very interesting - I wonder whether that's the case under Windows as well, and whether the GPU is actually the culprit.

I find Diablo 3 to be a useful test only insofar as running around at, say, Torment III by yourself - which any modern-ish PC can do at 60 fps. I've seen the frame rate on my iMac plummet to 15 fps at 1440p when a few witch doctors are running around setting off their armies.
 
I find Diablo 3 to be a useful test only insofar as running around at, say, Torment III by yourself - which any modern-ish PC can do at 60 fps. I've seen the frame rate on my iMac plummet to 15 fps at 1440p when a few witch doctors are running around setting off their armies.
I get what you're saying, but I'm hesitant to point the finger at any single component here.

Is it the CPU that can't keep up?
Is it Diablo III itself which isn't optimized for all the minions?
Is it only with witch doctors?
Is it the GPU?
Is it the internet connection?
Do we have other bottlenecks, like an HDD?

I just can't believe the GPU is the problem when you get above 140 FPS at 1440p/ultra playing on your own, but plummet to 15 fps once you start a multiplayer session. I'm obviously no expert on this, but when you cut 140 FPS in half (2 players?) you get 70, once more in half (3 players?) you get around 35, and once more in half you'd get 17-18 FPS. So many questions...
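Spelled out, the naive halving model I'm describing looks like this (a toy calculation only; real rendering load doesn't scale per player like this, which is exactly why the 15 fps figure is suspicious):

```python
# Toy arithmetic: what the frame rate would look like if each extra
# player literally halved it. Illustrative numbers, not measurements.
solo_fps = 140

for players in range(1, 5):
    fps = solo_fps / 2 ** (players - 1)
    print(f"{players} player(s): ~{fps:.0f} FPS")

# Output:
# 1 player(s): ~140 FPS
# 2 player(s): ~70 FPS
# 3 player(s): ~35 FPS
# 4 player(s): ~18 FPS
```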
 
You're welcome! Thanks to everyone who makes it as lively as it is.

Sorry, I don't have it, but I think I know someone who has it - gotta confirm that first. OS X or Windows?

I'll definitely be doing a Boot Camp install just for games. The performance gain is worth the reboot, and I already have Windows 8 waiting to be upgraded to 10. It's nice to hear you can get by in OS X, though - definitely not the case on my old MBP, where every little gain counts.
 
I get what you're saying, but I'm hesitant to point the finger at any single component here.

Is it the CPU that can't keep up?
Is it Diablo III itself which isn't optimized for all the minions?
Is it only with witch doctors?
Is it the GPU?
Is it the internet connection?
Do we have other bottlenecks, like an HDD?

I just can't believe the GPU is the problem when you get above 140 FPS at 1440p/ultra playing on your own, but plummet to 15 fps once you start a multiplayer session. I'm obviously no expert on this, but when you cut 140 FPS in half (2 players?) you get 70, once more in half (3 players?) you get around 35, and once more in half you'd get 17-18 FPS. So many questions...

I'm sure it's not linear like you're saying, but I'm a pretty hardcore D3 player, so I know the frame rate can get absolutely hammered depending on what you're doing. D3 is just poorly optimized. Even running around town results in little hitches all over the place at times. Blizzard switched their file types recently to try and combat some issues they'd been having, but it doesn't look like it made much difference to the unstable frame rate.

In any case, D3 is really not a good test, since there are SO many variables at play which lead to frame rates in the 100s or in the 30s, depending on what is going on.
 
In any case, D3 is really not a good test, since there are SO many variables at play which lead to frame rates in the 100s or in the 30s, depending on what is going on.
You're absolutely right about that. We need another game which is a good test :)
 
I'm obviously no expert on this, but when you cut 140 FPS in half (2 players?) you get 70, once more in half (3 players?) you get around 35, and once more in half you'd get 17-18 FPS. So many questions...


As far as the GPU is concerned, the difference between 1 player and 2 players is just one extra model on the screen. The change in fps should be next to nothing - unless having a second player also doubles the number of monsters/objects on the screen as well (which I don't think it does in Diablo, but I have to admit I haven't played it online).

If playing multiplayer really is halving the fps, that's much more likely to be a network issue or a fault in the way the game is programmed, not a limitation of the GPU/CPU.

Apologies if I'm wrong about how the game works, though - I guess it's possible adding a player also doubles all the effects and monsters.
 
As far as the GPU is concerned, the difference between 1 player and 2 players is just one extra model on the screen. The change in fps should be next to nothing - unless having a second player also doubles the number of monsters/objects on the screen as well (which I don't think it does in Diablo, but I have to admit I haven't played it online).

If playing multiplayer really is halving the fps, that's much more likely to be a network issue or a fault in the way the game is programmed, not a limitation of the GPU/CPU.

Apologies if I'm wrong about how the game works, though - I guess it's possible adding a player also doubles all the effects and monsters.

It's a lot more than one extra model on screen. It's all the fireworks and explosions created by that player, the effect said explosions have on the enemies etc. There can be more than 10x (throwing random numbers out there) the effects on screen when you add a witch doctor to a party.
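For what it's worth, both arguments fit a back-of-the-envelope frame-time model (a sketch with invented per-item costs, not profiled numbers from Diablo III): one extra character model is noise, but multiplying the number of on-screen effects by 10 is not.

```python
# Back-of-the-envelope frame-time model. All per-item costs are invented
# placeholders in milliseconds per frame, not profiled numbers.
BASE_MS   = 5.0    # fixed per-frame cost (scene, UI, post-processing)
MODEL_MS  = 0.2    # cost per character model on screen
EFFECT_MS = 0.05   # cost per particle effect on screen

def fps(models: int, effects: int) -> float:
    frame_ms = BASE_MS + models * MODEL_MS + effects * EFFECT_MS
    return 1000.0 / frame_ms

print(f"1 player,  50 effects:  ~{fps(1, 50):.0f} FPS")   # ~130: baseline
print(f"2 players, 50 effects:  ~{fps(2, 50):.0f} FPS")   # ~127: one extra model barely matters
print(f"2 players, 500 effects: ~{fps(2, 500):.0f} FPS")  # ~33: 10x the effects tanks it
```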
 
It's a lot more than one extra model on screen. It's all the fireworks and explosions created by that player, the effect said explosions have on the enemies etc. There can be more than 10x (throwing random numbers out there) the effects on screen when you add a witch doctor to a party.
I'm gonna try that one. I don't know if or when I can gather three friends (do I even have that many?) or what classes they play, but I'd love to shed some light on these mysteries. Diablo III, here I come!
 
Hi Tanax, great to have you here!

I don't have either Rocket League or Cities: Skylines, but I have LoL, and as I stated in the OP:

League of Legends runs at 60 fps at 5K, very high detail (yes) - Windows 10

But beware: at 5K the UI is ridiculously small, and you'd have to crank up your mouse sensitivity accordingly. So if you're gaming competitively, stay at 1440p ultra and don't worry about a thing except for the lag ;)

Thanks!
What about LoL on OS X? Also, I'm not interested in playing at 5K resolution; 1440p will be more than enough! :)
 
It's a lot more than one extra model on screen. It's all the fireworks and explosions created by that player, the effect said explosions have on the enemies etc. There can be more than 10x (throwing random numbers out there) the effects on screen when you add a witch doctor to a party.
Hi there, I just played on a Windows 7 desktop machine with an i5-4670 CPU and a GTX 770. We were three (monk/sorc/demon hunter), and in a single rift (out of 5 or 6) the FPS dropped to 20. If it's that bad even on a desktop system, I can imagine it's a problem with Diablo III itself and not with the GPU in the current iMac. And to add insult to injury:
- that was at 1920x1080 resolution
- we weren't even three witch doctors ...
 
Thanks!
What about LoL on OS X? Also, I'm not interested in playing at 5K resolution; 1440p will be more than enough! :)
Haha, I see ;) I'll be at my own iMac in around 14 hours. I'll tell you afterwards - first, a good night's sleep :)
 
I think the bottom line is that the original post is about upgrading from the 650M. Of COURSE the M395X is going to feel like a massive upgrade (because it is!).

For the rest of us that have been using the M295X for a year, this is hugely disappointing. If anything it's poor value for money.

I suspect next year we'll see a massive GPU upgrade from Apple that will blow us all away. At least, I tell myself that...
I think the better question is... why do you need to upgrade your computer on a yearly basis? Especially when you just got a maxed-out 2014 model???
 
Why question my needs? Surely it should be disappointing to anyone, no? If something doesn't really improve, should we not be dissatisfied?
There are improvements, like much FASTER SSDs, a better display, and a slightly faster graphics card. I don't see why anyone would be dissatisfied.

BTW, throwing your money at an iMac (which includes a 5K display) on a yearly upgrade cycle so you can play video games is dumb. Sorry.
 
There are improvements, like much FASTER SSDs, a better display, and a slightly faster graphics card. I don't see why anyone would be dissatisfied.

BTW, throwing your money at an iMac (which includes a 5K display) on a yearly upgrade cycle so you can play video games is dumb. Sorry.

Faster SSDs very few will notice - this has been discussed ad nauseam. The display improvements I can't comment on, because I haven't seen the new display in person yet; from what I gather, it's not easy to tell the difference. The GPU speed improvements are marginal.

Personally, how I spend my money is none of your concern, so please keep your insults to yourself. I wasn't necessarily going to buy one - I just like to see progress, especially from a forward-thinking company like Apple.
 
Don't you find 1440p lacking in detail after trying 5K? I can't go back - is it just me?
For me personally, the mostly broken UI sizing and the mouse problems (I don't want to switch settings around every time) kill the joy of having 5K graphics. I prefer smooth gameplay at 60 fps anyway, so I have no problem with 1440p.
 
It's free on Steam and native on OS X. It's also based on the Source 2 engine.
I guess I could download it, play the tutorial and then delete it again, but that experience won't be the same as a 5v5 fight in the late game. It's no use if I just play around for a few minutes, because that's not the kind of usage you'd want to see. That's why I asked whether anyone else actually plays the game: less hassle, more accuracy.
 