
Marshall73

macrumors 68030
Apr 20, 2015
2,713
2,837
Can you quantify "a lot of heat"? I was watching one of the reviewers yesterday, and even after a 30-minute stress test they did observe some throttling, but they said the machine was warm, not anywhere near as hot as the existing Intel systems get.
I ran Diablo 3 today for about an hour and it made the MacBook Air hot across the bottom rear of the case. Not as hot as my late 2016 MacBook Pro with the AMD 460 gets, but it was pretty warm.

World of Warcraft, which is now native, runs at 60fps and barely warms the MacBook Air, although Warcraft does hard-freeze after anywhere between 8 and 20 minutes of play. I suspect it has something to do with Rosetta, as I played on the earlier non-native version, which had issues. I think I need to find the Rosetta files and clean them out, or possibly just reinstall Warcraft.

Update: I reinstalled World of Warcraft (Shadowlands) to a different path but it still crashes. It's a bit of a weird one, though: the game continues to run but the display is frozen. You can still cast spells and hear it all going on, but the display stays frozen. When you quit, it takes ages to get back to the desktop, if it does at all; sometimes you have to hold the power button to force a shutdown and restart.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
25 games tested. Very impressive! I mean Bioshock 2 Remastered, Borderlands 2-3 and Deus Ex Mankind Divided on ultra settings on iGPU through Rosetta??? WOW!
It's impressive for an internal GPU, but in absolute terms, the results are not so good. Look here: https://forums.macrumors.com/thread...gb-ram-screens-and-settings-included.2269262/

For example, Dota 2 (a very old game) has performance issues, and StarCraft 2 requires low-quality shaders. With Intel Macs, you could plug in an external GPU and get up to 85% of native performance, but now you're stuck.
 

Luposian

macrumors 6502
Apr 10, 2005
389
258
Wait for the dust to settle. Wait for updates to older games that may see M1 optimizations. Be patient and don't jump to conclusions. Buy what you can afford and give it a go. I know I will.
 

Sanpete

macrumors 68040
Nov 17, 2016
3,695
1,665
Utah
It's impressive for an internal GPU, but in absolute terms, the results are not so good. Look here: https://forums.macrumors.com/thread...gb-ram-screens-and-settings-included.2269262/

For example, Dota 2 (a very old game) has performance issues, and StarCraft 2 requires low-quality shaders. With Intel Macs, you could plug in an external GPU and get up to 85% of native performance, but now you're stuck.
Yes, it's impressive for what it is, not for what it isn't. Not many entry-level computers are impressive compared to eGPUs.

Now that Apple has a more inviting platform, more games might be optimized for it.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
For example, Dota 2 (a very old game) has performance issues, and StarCraft 2 requires low-quality shaders. With Intel Macs, you could plug in an external GPU and get up to 85% of native performance, but now you're stuck.

In my book, improving the experience of 99.9% of users who will never use an eGPU comes before the 0.01% of users that will have one.

As much as I like the eGPU idea, unfortunately, it's incompatible with the base architecture of Apple Silicon Macs. I sold mine a week after Apple's WWDC announcement, as it was very clear to me where this was going.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
In my book, improving the experience of 99.9% of users who will never use an eGPU comes before the 0.01% of users that will have one.

As much as I like the eGPU idea, unfortunately, it's incompatible with the base architecture of Apple Silicon Macs. I sold mine a week after Apple's WWDC announcement, as it was very clear to me where this was going.

Sorry, but I call it ********. External GPUs were absolutely fine before; the only issues arose because Apple didn't want to release OpenGL drivers.

I think this is more of a marketing issue than anything. If they allowed you to use an external GPU, it would be much easier to buy a Mac mini and a high-end 3D GPU card, as many people were doing. Now, your only option is to purchase whatever model with a better GPU Apple releases.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
Wait for the dust to settle. Wait for updates to older games that may see M1 optimizations. Be patient and don't jump to conclusions. Buy what you can afford and give it a go. I know I will.

I want to believe in the M1 Mac, but without the option to pair it with a high-end 3D GPU, it gets hard to justify the purchase. Not just that, but there aren't many details yet on how x86 emulation/virtualization works with QEMU.
 

Fomalhaut

macrumors 68000
Oct 6, 2020
1,993
1,724
I concur. Seriously, there is no way that an on-package chip sharing memory with the CPU cores and other cores can have the performance of a dedicated GPU like the 2060.

Some of you who expected performance like Nvidia's discrete GPUs just don't understand how computer components work.
I guess you haven't seen the PS5 and Xbox Series X then, which have exactly what you're describing: a huge SoC with a 12 TFLOP GPU, equivalent to an RTX 2060. https://www.anandtech.com/show/11992/the-xbox-one-x-review/3

They run at 180-200W TDP though, not 10-15W like the M1, but it shows that there isn't an inherent limit to putting a powerful GPU on a SoC.
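
To put those wattage numbers in perspective, here's a rough perf-per-watt sketch in Swift, using the approximate TFLOPS and TDP figures quoted in this thread. Purely illustrative; treat the numbers as assumptions, not official specs.

```swift
// Rough perf-per-watt comparison using the approximate figures quoted in this
// thread; the TFLOPS and wattage values are assumptions, not official specs.
struct GpuFigure {
    let name: String
    let tflops: Double   // peak FP32 throughput, in TFLOPS
    let watts: Double    // approximate power budget
    var gflopsPerWatt: Double { tflops * 1000.0 / watts }
}

let parts = [
    GpuFigure(name: "Console-class SoC", tflops: 12.0, watts: 200.0),
    GpuFigure(name: "Apple M1", tflops: 2.6, watts: 10.0),
]

for part in parts {
    print("\(part.name): ~\(Int(part.gflopsPerWatt)) GFLOPS per watt")
}
// Roughly 60 vs 260 GFLOPS per watt: the SoC packaging isn't the limit,
// the power budget is.
```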
 

bill-p

macrumors 68030
Jul 23, 2011
2,929
1,589
I guess you haven't seen the PS5 and Xbox Series X then, which have exactly what you're describing: a huge SoC with a 12 TFLOP GPU, equivalent to an RTX 2060. https://www.anandtech.com/show/11992/the-xbox-one-x-review/3

They run at 180-200W TDP though, not 10-15W like the M1, but it shows that there isn't an inherent limit to putting a powerful GPU on a SoC.

The One X has 12GB of GDDR5 as unified memory with a 384-bit memory interface.

In comparison, the M1 has at most 16GB of LPDDR4X with a 128-bit memory interface. The difference in memory performance is starker than it looks. There's a reason the One X is as powerful as it is.
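
For a sense of scale, here's a back-of-the-envelope bandwidth calculation in Swift. The bus widths are the ones mentioned above; the per-pin transfer rates are commonly cited figures I'm assuming, not numbers from this thread.

```swift
// Back-of-the-envelope memory bandwidth: bus width (bits) × per-pin data rate
// (Gbps) ÷ 8. The per-pin rates are assumed commonly cited figures
// (GDDR5 at ~6.8 Gbps for the One X, LPDDR4X-4266 for the M1).
func bandwidthGBps(busWidthBits: Int, gbpsPerPin: Double) -> Double {
    Double(busWidthBits) * gbpsPerPin / 8.0
}

let oneX = bandwidthGBps(busWidthBits: 384, gbpsPerPin: 6.8)    // ≈ 326 GB/s
let m1   = bandwidthGBps(busWidthBits: 128, gbpsPerPin: 4.266)  // ≈ 68 GB/s
print("Xbox One X: ~\(Int(oneX)) GB/s, M1: ~\(Int(m1)) GB/s")
```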
 

Chompineer

Suspended
Mar 31, 2020
502
1,183
Ontario
The One X has 12GB of GDDR5 as unified memory with a 384-bit memory interface.

In comparison, the M1 has at most 16GB of LPDDR4X with a 128-bit memory interface. The difference in memory performance is starker than it looks. There's a reason the One X is as powerful as it is.

One X is old news, Series X is out and on GDDR6, 16GB unified.
 

bill-p

macrumors 68030
Jul 23, 2011
2,929
1,589
One X is old news, Series X is out and on GDDR6, 16GB unified.

Well, yeah. I'm just saying... the M1 is an impressive feat of engineering. The CPU is nothing to sneeze at. But in terms of GPU performance, it's still behind what AMD and Nvidia are able to achieve.

Apple could do the same thing for the M1X or M2: use GDDR6 or even HBM2 as unified memory instead of LPDDR4X, but that would undoubtedly destroy the battery life figures we are seeing with the M1.

So it remains to be seen what Apple will choose to do with their next chip. I'll just enjoy my M1 in the meantime.
 

zakarhino

Contributor
Sep 13, 2014
2,611
6,963
I pray the rumors about Apple exploring selling these chips to third parties turn out to be true. I'd love to run Linux on these machines, but we all know Apple will do their best to ensure nobody can run third-party operating systems.
 

PortoMavericks

macrumors 6502
Jun 23, 2016
288
353
Gotham City
In my book, improving the experience of 99.9% of users who will never use an eGPU comes before the 0.01% of users that will have one.

As much as I like the eGPU idea, unfortunately, it’s incompatible with the base architecture of Apple Silicon Macs. I sold mine a week after Apples WWDC announcement as it was very clear to me where this was going.
You're being too simplistic. Those 0.1% of users are making videos on YouTube, and they've become the main source of information shaping purchase intent.
 

zakarhino

Contributor
Sep 13, 2014
2,611
6,963
This will never happen. 100% sure about that.

I know, just let a man dream...

 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Sorry, but I call it ********. External GPUs were absolutely fine before; the only issues arose because Apple didn't want to release OpenGL drivers.

I think this is more of a marketing issue than anything. If they allowed you to use an external GPU, it would be much easier to buy a Mac mini and a high-end 3D GPU card, as many people were doing. Now, your only option is to purchase whatever model with a better GPU Apple releases.

External GPUs were fine before exactly because the drivers were there. But Apple Silicon Macs with Apple GPUs change the rules of the game.

For the first time ever, Apple offers streamlined, high-performance GPU capabilities and a single programming model across all of its platforms. You have unified memory, which allows pro workflows that were impossible until now. You have programmable GPU caches and TBDR guarantees. You have advanced texture compression (that desktop GPUs don't support). Most of all, you have predictability. As a programmer with a keen interest in GPUs, I can assure you it's a big deal. That's why gaming consoles work: you know exactly how the hardware behaves and you have intimate control over it. Apple gives you all of this on all their devices, from the Apple TV to the Mac.

Now, eGPUs complicate this. They break the nice set of assumptions and guarantees that Apple has carefully lined up. They behave differently. They make things confusing for both the developer and the user. How would you explain to a user that their video rendering runs slower on a high-end desktop GPU than on an integrated M1? How do you convince a game dev to program and test for the super-rare case that a user might run an eGPU? An eGPU will definitely be faster in some cases: a lot of games, scientific computation on the GPU, some other niche cases. But overall, eGPU support would make the average user and developer experience worse, not better.

One thing we know about Apple: they don't target the niche use case. They target the majority. Just as the M1 screwed over the very small subset of users who want to run multiple monitors or need more RAM on an entry-level machine, I am fairly sure Apple is more than ready to screw over an even smaller group of eGPU users if it means improving the experience for the vast majority of users.
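
To make the unified-memory point concrete, here is a minimal Metal sketch in Swift. This is my own illustration, assuming a recent macOS SDK, not code from any of the posts above: on Apple silicon a single shared buffer is visible to both CPU and GPU, while on a discrete or external GPU the CPU-side copy has to be explicitly synchronized.

```swift
import Metal

// On Apple silicon, the CPU and GPU share one pool of memory, so a
// .storageModeShared buffer is visible to both with no copy. On a discrete
// or external GPU you would typically use .storageModeManaged and explicitly
// flush CPU-side writes before the GPU can see them.
let device = MTLCreateSystemDefaultDevice()!
let values: [Float] = [1, 2, 3, 4]
let byteCount = values.count * MemoryLayout<Float>.stride

// Unified-memory path: one allocation, usable by CPU and GPU alike.
let shared = device.makeBuffer(bytes: values, length: byteCount,
                               options: .storageModeShared)!

// Discrete-GPU path (e.g. an Intel Mac with an eGPU): the CPU-side copy has
// to be synchronized into the GPU's own VRAM.
if !device.hasUnifiedMemory {
    let managed = device.makeBuffer(bytes: values, length: byteCount,
                                    options: .storageModeManaged)!
    managed.didModifyRange(0..<byteCount)   // tell Metal to sync this range
}
_ = shared
```

The point of the sketch is the branch: code written only for the unified-memory path makes assumptions an eGPU breaks, which is exactly the predictability argument above.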
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
I guess you haven't seen the PS5 and XBox X then, which have exactly what you're describing: a huge SoC with a 12TFlop GPU, equivalent to a GTX2060. https://www.anandtech.com/show/11992/the-xbox-one-x-review/3

They run at 180-200W TDP though, not 10-15W like the M1, but it shows that there isn't an inherent limit to putting a powerful GPU on a SoC.

By the way, Apple claims to do 2.6 TFLOPS at 10 watts. Imagine what they could do with a 60-watt SoC.
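
For reference, here's a quick sketch of where a ~2.6 TFLOPS figure can come from. The ALU count and clock are widely reported estimates for the 8-core M1 GPU, not official Apple specs.

```swift
import Foundation

// Peak FP32 throughput = ALU count × 2 (one FMA counts as 2 ops) × clock.
// ALU count and clock below are widely reported estimates, not Apple specs.
let alus = 1024              // 8 GPU cores × 128 FP32 ALUs
let clockGHz = 1.278         // approximate peak clock
let tflops = Double(alus) * 2.0 * clockGHz / 1000.0
print(String(format: "~%.1f TFLOPS", tflops))   // prints "~2.6 TFLOPS"
```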
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
You're being too simplistic. Those 0.1% of users are making videos on YouTube, and they've become the main source of information shaping purchase intent.

Ah, it will blow over. The performance and user experience of Apple Silicon are hard to overlook. People will soon start ignoring the minor annoyances. In the end, eGPUs were used mostly by content creators who were dissatisfied with Intel Mac performance. With the M1 doing so well in this area, eGPUs aren't really needed anymore anyway.
 

iLes

macrumors member
Jul 25, 2007
45
15
Tewksbury Massachusetts
I tweeted my performance test results here:
I stacked the MacBook Air M1 up against a 2019 Alienware M17 with an RTX 2070, running my Ax Performance Test (iPad version here: https://apps.apple.com/us/app/ax-performance-test/id1534481764). The results were mind-blowing. The M1 easily beat the Alienware in the number of physics cubes dropped. The Alienware registered more polygons per second because it has a 144Hz refresh rate (as opposed to 60Hz for the MacBook Air), but it could not match the number of objects in the scene. I first ran the iPad app, then built an x64 version and ran it on the MacBook via Rosetta 2 translation, and the M1 still easily beat the Alienware. When Unity releases a player for the M1, I will build a native M1 app and run the test again. I might add that the Alienware's cooling fans were screaming for help while the MacBook Air did all of this in complete silence :)

MacBook Air M1: [screenshot of test results]

2019 Alienware M17 (GeForce RTX 2070): [screenshot of test results]

iPad Air 4 (Apple A14): [screenshot of test results]
 