I have tried running the RDR2 benchmark at 4K high several times now since being asked to provide results proving it can achieve 59 fps at native 4K, and the best I've managed is a 52 fps average. I have no idea what my initial settings were beyond 4K high before I switched to scaling + ultra textures, so until I can prove otherwise: no, it certainly does not perform at the level of a 2080 Ti, and even if its extra VRAM got it there at some point, it could not sustain that for long given the power constraints. I knew that going in, which is what led to such shock and excitement on my part. I know what I saw in the benchmark results, but since I cannot replicate them, I have to conclude it was a fluke. @rkuo is right that professionals are paid to do this correctly, and I'm sorry to @rkuo for so ardently assuming I would be able to replicate my results; even though I don't believe I misread what I saw, I must acknowledge it was either a fluke or that I am genuinely mistaken. Still, this is the closest we're going to get from Apple for a while, and I'm sorry if my overexcitement about its power oversold anyone on its abilities. It's better than the 2070 Super we have in the house for our games, but beyond that I'm not the one to quantify it.
It's all right. Like you say, it's the best card available, and since there's nothing better to buy, people buying it know what they're paying for.
 
We need a disclaimer if a video is just a rehash of literally every video before it and doesn't contain any useful new information or benchmarks.

I actually wrote such a disclaimer myself in my first post and asked people to try to post reviews with new, interesting info. Of course it's hard to know what everybody else has already seen and what's interesting to you or anyone else. I have let worse reviews than this one pass without objection simply because they do no harm. My reason for posting it was his short comparison between the iMac and other AIOs, the benchmark of the performance gain in Adobe Premiere, the comparison with a Dell laptop, and his final words regarding AS. You can always do like I do and fast-forward the video, or jump to the desired chapter as in this case, without watching the whole thing. :)
 
We need a disclaimer if a video is just a rehash of literally every video before it and doesn't contain any useful new information or benchmarks.

I must admit I look at a lot of the 'professional' YouTube channels much like you have done above. It's a hard judgment call, tbh. I video stuff mainly because I kept being asked about the kit I get access to, and because it makes me look at what I need to know about something in a bit more detail - it helps me get my thoughts together. On top of that, I found a lot of 'review'-type stuff just didn't go into the detail I needed - and I wondered who else thought the same thing? It's hard to get a lot of content into 10-15 minutes of video.

The thing is, though, you have to look at how people consume YouTube. Here, we consume all the videos about product X. In general, though, people have their fave channels and follow those, so those channels tend to cover the same topics - and with the former viewing method it can appear like they're just repeating the same old stuff. I guess they are; but it's for their viewership, I imagine, rather than for how it's viewed here.

I'd rather have too much content than not enough, but I do understand your point completely.
 
I agree with all of your points above; I was only talking about this thread specifically, where we are likely to have viewed all the previous videos.
Btw you seem to have bought your iMac already. Why are you still watching these reviews? Do something fun instead! ;)
 
I have tried running the RDR2 benchmark at 4K high several times now since being asked to provide results proving it can achieve 59 fps at native 4K, and the best I've managed is a 52 fps average. […]
My friend, you may be the first person on the internet to admit they made a mistake. Good for you.

Really no need to oversell the iMac; it is good on its own merits, and the 5700 XT is a fine and fast (but not the fastest!) video card.
 
iMac 27" 2019 i9 — 125 tracks
MacBook Pro 15" 2018 2.2 GHz i7 — 67 tracks
iMac 27" 2014 4.0 GHz i7 — 42 tracks
iMac 27" 2017 3.5 GHz i5 — 27 tracks
MacBook Pro 13" 2016 i5 — 14 tracks
MacBook Air 13" 2018 i5 — 3 tracks
 
Is there any way to get more screen space on current macs?

Of course. You can Option-click 'Scaled' in the Displays preference pane to set the scaled resolution to whatever you want, including native 5K without scaling:

[Screenshot: macOS Displays preferences showing the full list of scaled-resolution options]
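To see what each scaled setting actually buys you in workspace, the arithmetic is simple. A minimal sketch (the "looks like" sizes below are common macOS options on a 5K panel; the exact menu entries vary by machine, so treat them as assumptions):

```python
# Workspace comparison for "looks like" scaled resolutions on a 5K (5120x2880) iMac.
# Each option renders the UI as if the screen had that many points; more points
# means more screen space, at the cost of smaller UI elements.
DEFAULT = (2560, 1440)  # the default "looks like" size on a 5K iMac

looks_like = [(2560, 1440), (2880, 1620), (3200, 1800), (5120, 2880)]

for w, h in looks_like:
    gain = (w * h) / (DEFAULT[0] * DEFAULT[1])  # workspace area vs. the default
    print(f"looks like {w}x{h}: {gain:.2f}x the default workspace area")
```

So running "native 5K without scaling" gives four times the default workspace area, which is why everything looks tiny at that setting.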
 
Actually, I do not agree with his statement. I have a 2019 i9 iMac with a 2560x1440 144 Hz display, and I can game at 144 fps with the monitor at 144 Hz without any additional tools. I also have a 4K 60 Hz third display connected. I get this in both macOS and Boot Camp: 144 Hz works without issue in macOS and in Windows 10, with no additional tools and with the iMac's screen kept active.
Got so bored waiting for my iMac to arrive I went ahead and bought that 34" Xiaomi gaming display to use with it. :eek:
 
Finally set up my 8-core i7 with the 5700 XT 16 GB. XCOM 2 is indeed laughably unoptimised: at 5K it runs 10-15 fps on Maximum. But I've found that if you dial down a couple of sane settings (no AA, directional shadows etc.), I can get a very enjoyable ~40 fps at 4K. Especially since I've been gaming 2000+ hours at 5-20 fps on 1440p low settings on my 2014 5K iMac.

I haven't tested it under the Wintendo yet, so I may get some improvements from that.

Civ VI runs perfectly smoothly at 50-60 fps with absolutely everything maxed out at native 5K (just remember to scale the UI to 200%). In the super-stressful late-game GTS benchmark with everything on, it hovers around 25 fps, but everything still looks smooth to my eye (i.e. hundreds of windmills turning at a good clip without any hiccups), so I doubt it would bother me to play a full game at those settings.

In both games I was hoping for a more dramatic decrease in loading times, so I'm guessing the SSD read was never the bottleneck I imagined it to be.

I've created a Wintendo partition and am starting to fill it with games. Currently downloading the new Flight Sim, Chimera Squad and AOE II Definitive Edition.
 
But John90976 showed in his images that he wasn't running the game at 4K: it was 4K with a 0.5 resolution scale, which he says works out to about 1152p, i.e. "2K". As he wrote, you may not see a visual difference, but it certainly would affect the performance.
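How a 0.5 resolution scale maps to an effective resolution depends on whether the slider scales each axis or the total pixel count, and games differ on this (an assumption here, not something RDR2 documents in-game). Neither interpretation lands exactly on 1152p, which is why the figure quoted above is approximate:

```python
# Effective render resolution for a given resolution-scale factor,
# under two common interpretations of the slider.

def per_axis(w: int, h: int, s: float) -> tuple:
    """Scale each axis by s (so pixel count drops by s squared)."""
    return int(w * s), int(h * s)

def per_area(w: int, h: int, s: float) -> tuple:
    """Scale total pixel count by s (each axis drops by sqrt(s))."""
    f = s ** 0.5
    return int(w * f), int(h * f)

print(per_axis(3840, 2160, 0.5))  # (1920, 1080) - i.e. plain 1080p
print(per_area(3840, 2160, 0.5))  # (2715, 1527) - between 1440p and 1080p
```

Either way, the GPU is pushing at most half the pixels of native 4K, so the performance numbers aren't comparable to a true native-4K run.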
10-core 5700XT



What I said was: the 5700 XT beats the 2080 Ti on performance AND price. The two come together.

It offers 50% or more of the performance (more, per the YouTube link I provided) for a third of the price. That's just as a PC card.

On the Mac it has extra VRAM as well.

At £1200, the 2080 Ti is a pretender.

Azrael.
As for 4K: for GPUs in general, efficiency is important, so playing with settings makes sense.

Meanwhile, we're in the transition to 4K becoming a 'mainstream' gaming thing.

Azrael.
 
Finally set up my 8 core i7 with the 5700 XT 16Gb. XCOM 2 is indeed laughably unoptimised. […]

Software can kill any hardware.

Azrael.
 
I have tried running the RDR2 benchmark at 4K high several times now since being asked to provide results proving it can achieve 59 fps at native 4K, and the best I've managed is a 52 fps average. […]

I wouldn't beat yourself up too much for it.

It's better than a 2070 Super, and I'd have taken that. It's on the heels of a 2080 Super. Also impressive.

And it hammers the 2080 Ti on performance per price: you get at least 50% of the performance for a third of the price.

In a lot of games, the 5700 XT isn't too far behind the 2080 Ti, and that varies according to the game. Around 25% give or take?

We can't get a £1200 2080 Ti in the iMac. But the case you clearly lay out is that we shouldn't feel 'too bad' with the 5700 XT with 16 GB of VRAM. At £500.

Unprecedented performance in an iMac.

Azrael.
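The value argument above reduces to simple arithmetic. A back-of-envelope sketch using the figures quoted in this thread (assumptions: 5700 XT at roughly 75% of a 2080 Ti's performance per the "around 25% behind" estimate, £500 vs £1200 street prices):

```python
# Performance-per-pound comparison using figures quoted in the thread (assumed).
cards = {
    "RX 5700 XT":  {"relative_perf": 0.75, "price_gbp": 500},
    "RTX 2080 Ti": {"relative_perf": 1.00, "price_gbp": 1200},
}

for name, c in cards.items():
    # Normalise to relative performance per £1000 spent.
    value = c["relative_perf"] / c["price_gbp"] * 1000
    print(f"{name}: {value:.2f} relative-perf per £1000")
```

On those (rough) numbers, the 5700 XT delivers nearly twice the performance per pound, which is the whole of the argument; the 2080 Ti still wins outright performance.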
 
I have tried running the RDR2 benchmark at 4K high several times now since being asked to provide results proving it can achieve 59 fps at native 4K, and the best I've managed is a 52 fps average. […]

I've tried World of Warcraft (hello, Blizzard and WoW fans...) and anything beyond 1440p is a 'big ask'.

1080p and 1440p (HD and QHD) are rock solid: 100+ fps. At graphics preset 5, smooth as butter. Even at WoW's graphics setting 10? Rock solid.

I tried 4K. I couldn't really see the fidelity difference (WoW is what it is: a cartoony-style MMO).
I tried 5K. Problems: 15 fps at WoW setting 10, 100% scaling. Fans were going; the system was struggling.

This was with 8 GB of RAM. I have my 32 GB now, and according to Linus (Tech Tips...), going from 8 to 16 GB makes a big difference, so 32 GB should give plenty of room to WoW, which seems a bit of a memory hog. Maybe all games are.

HD or QHD. Golden.

Anyone who wants smooth-as-butter 4K or 5K may be reaching somewhat.

Even the 2080 Ti has a fight on its hands to deliver a smooth-as-butter 60 fps 4K experience.

It's not something I see as credible until Ampere and RDNA2 bring on the Branston.

Azrael.
 