
Piggie

macrumors G3
Original poster
Feb 23, 2010
9,191
4,145
I am not sure I follow the letdown here. The systems this chip replaced all had iGPUs to begin with, and the new Apple Silicon beats the crap out of any iGPU in existence; even matching GTX 1080Ti levels (still the most popular graphics card in the Steam survey) speaks volumes. These systems never included a dedicated GPU, so let's hold off on the disappointment until the higher-end 13", iMac, 16" MBP and Mac Pro, which USED to have dGPUs, are updated.

Perhaps you may not have read some of my text.
To say it once more: I'm not complaining about or feeling let down by the new M1 machines.

I'm saying this about Apple's previous machines over the years, and hoping that the new M1 is a sign that the past can be left behind and that a new era of Apple taking GPUs seriously is where Apple Silicon is going.

Regarding one thing you typed in your post:
Please correct me if I'm wrong, but I don't believe anyone has suggested the new M1 chip is anywhere near an Nvidia GTX 1080 Ti in performance.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
The thought of paying £1000 / £2000 and having a computer with basically a low-power laptop graphics chip is simply something I personally could not accept.

Then you are not the target market for Apple. As Apple says, they don't do cheap. Compared to other premium machines, the M1 machines have very good GPUs.

We might, at long, long last, see Apple take this side of computing seriously and put effort into competing with AMD and Nvidia in this up-till-now ignored area of computer chip design?

They do take GPUs very seriously. I mean, they have their own GPU API (and a very nice one at that), and they have some pretty much unique GPU technology that makes them 2-3 times faster in graphics than AMD or Nvidia. Their more expensive machines will feature more powerful GPUs. But don't expect 2060-class performance from a $999 laptop.
 
  • Like
Reactions: Piggie

thenewperson

macrumors 6502a
Mar 27, 2011
992
912
Well, phones had gotten thicker in recent years, as the thinness was getting silly.
I recall so many people, even on these forums, continually begging for Apple to stop it, fit bigger batteries, etc.
I'm not hearing any real complaints these days due to the phones becoming thicker.
Sure, but Jony presided over those. And they've gone right ahead and made things thinner + lighter with the 12s (as well as reducing battery capacity). I think there's this weird need to pinpoint a single person to represent whatever bad (or good) thing someone wants to comment on, when whatever it is is likely more institutional.
 

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
That’s great, but why “shame about graphics” when the M1’s graphics are already the “fastest” integrated graphics TODAY?
What’s the shame in being the “fastest” while consuming less power?

Could Apple make something that beats an RTX 3090 today? IMO, you bet! Will they want to pay the thermal cost? I don’t think so.
Could they make something as powerful as an RTX 3090 that fits in a Mac Pro? Well, we’ll have to wait and see, won’t we?

I think you’re missing OP’s point. They’re not shaming the current M1 chip. They’re speaking about the history of Apple and gaming graphics, and posing the question “does Apple silicon pave the way for a new era of Mac gaming?”

And I think OP poses a good question. About 80% of all Mac sales are notebooks. I’d guess that also translates to the user base, meaning the majority of those running macOS are using notebooks. Of course top-tier iMacs, the 16” MacBook Pro, and the Mac Pro can already run AAA games no problem, but historically the lower-end notebooks have struggled (and this is true for any manufacturer, so it’s not a knock on Apple specifically).

But that’s why so many devs have not felt gaming on the Mac is worth the investment. If the majority of Mac customers are using low-end notebooks that can’t run AAA games well, then it’s not worth it for them. The only way Apple and AAA gaming will become mainstream is if Apple can bring AAA game graphics to their lower-end notebooks. At that point, it becomes much more lucrative for a dev.

I think the M1 does show us that this is going to be a reality. Of course we’re still in the early days, but if even the MBA, the most popular Mac product, can now run some AAA games, that opens the floodgates for devs to port Mac versions optimized for Apple Silicon.

I am cautiously optimistic that AS will usher in a new era of gaming on the Mac.
 
  • Like
Reactions: leman

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
OP, why don't you just wait and see what Apple does when they release more powerful M chips? Sitting here wondering what they are going to do makes no sense, and I don't see what point you're getting at. I just have to wonder sometimes: do people actually USE their Macs, or do they sit around creating negative theories about Apple (and bringing up the dead, Steve Jobs, SMH), especially when Apple puts out new tech that is getting rave reviews?
 
  • Like
Reactions: Piggie and Ethosik

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Perhaps you may not have read some of my text.
To say it once more: I'm not complaining about or feeling let down by the new M1 machines.

I'm saying this about Apple's previous machines over the years, and hoping that the new M1 is a sign that the past can be left behind and that a new era of Apple taking GPUs seriously is where Apple Silicon is going.

Regarding one thing you typed in your post:
Please correct me if I'm wrong, but I don't believe anyone has suggested the new M1 chip is anywhere near an Nvidia GTX 1080 Ti in performance.
That was a typo, it was supposed to be 1050 Ti.
 

4sallypat

macrumors 601
Sep 16, 2016
4,034
3,782
So Calif
.....Their more expensive machines will feature more powerful GPUs. But don't expect a 2060-class performance from a $999 laptop.
Agreed. My $2500+ 2019 MBP 16" has better graphics, but at the cost of more power drain and more heat:
  • Base 16" has AMD Radeon Pro 5300M with 4GB of GDDR6 memory
  • Top-end 16" has optional AMD Radeon Pro 5600M with 8GB of HBM2 memory
 
  • Like
Reactions: Piggie

happyslayer

macrumors 65816
Feb 3, 2008
1,031
579
Glendale, AZ
Given that these are the lower-end "budget" Macs, I am looking forward to seeing what Apple does with the more high-end Macs. I bought the base MBA with only 8GB RAM and the 7-core GPU, and I've now played a couple of games that, on my previous 2018 15" Pro, got it super hot, cranked the fans, and didn't seem to play as smoothly. Just played an hour of World of Warcraft on it yesterday and it never even got warm! It also never slowed down or even acted like it was working too hard. So far I am impressed with the M1 tech and super interested to see what comes next.
 

Anonymous Freak

macrumors 603
Dec 12, 2002
5,604
1,388
Cascadia
The M1 - a chip in Apple's lowest-end computers - has graphics performance better than the integrated graphics in any Intel or AMD chip.

It has better performance than AMD and Nvidia *midrange desktop* cards.

For Apple's first effort, launched in low-end systems, this is extraordinarily impressive. Apple won't put this in any system that used a discrete GPU in the Intel era, but for replacing systems that used integrated graphics before, this is a huge leap.

Don't assume that this is what will come to the 16" MacBook Pro, or iMac Pro, etc.

Apple will either ramp up their graphics to seriously compete with discrete graphics - or they'll just use discrete graphics.
 

Piggie

macrumors G3
Original poster
Feb 23, 2010
9,191
4,145
Wild guess, but I suspect heat will be the only thing that forces Apple to design their own separate GPU?
Perhaps they won't need to for desktops.
I mean, the brand-new Xbox and PS5 are a single chip and don't need a separate GPU,
but of course they need a LOT of well-designed cooling.
Built-in GPUs in faster "M1X?" machines would be quicker, of course, if they can keep the heat down.

Very much looking forward to seeing what route they take.
I suspect most here are feeling it will have to be a separate GPU at some point?
 

poorcody

macrumors 65816
Jul 23, 2013
1,339
1,584
Many seem to be thinking Apple may have its own separate GPU chip to fit inside higher-end Macs next year.
Now, of course, I don't think anyone is expecting an RTX 3090, with its 28 billion transistors and many thousands of GPU cores, from Apple next year ;)
But they could make a small start along that path, with a goal of reaching, in perhaps 5+ years, what the current leader PC machines can enjoy.
Considering Apple just made the fastest integrated GPU, why would it take them five years to make a dGPU matching the best? GPUs are actually simpler than CPUs, and there's no reason not to believe that, just as the experience of making iPhone/iPad CPUs led to the M1, the experience of making iPhone/iPad GPUs could lead to a "G1". And given that the biggest bottleneck with GPUs is heat, and Apple has matched Intel's top performance with 75% less power, it seems like Apple could be quite good at GPUs.

That's assuming they will make a discrete GPU. Maybe it will be more of a GPU die in the chiplet package to take advantage of their unified memory architecture. Since they are replacing machines with discrete GPUs already, I think they will have to do something to at least match them.

The reason you can get better graphics performance in the PC world is simply that you can buy a tower and install giant graphics cards with huge fans and power supplies. You can't achieve the same in the Mac world because of the lack of tower-based (or otherwise similarly expandable) machines. (I'm ignoring the Mac Pro purely because it is priced out of most budgets.) So your criticism of Apple not pursuing top-flight graphics is fair in that regard, IMO. I think they tried to have eGPUs fill that need to an extent.

I wonder if you will see more emphasis on graphics by Apple though, as the rumors are implying Apple wants to go big in the augmented/virtual-realty technology...
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Wild guess, but I suspect heat will be the only thing that forces Apple to design their own separate GPU?

Apple has been designing their own custom GPUs for years. M1 contains a custom Apple GPU. I really don’t understand what you mean. You seem to be confusing physical chips, GPU IP and the specific implementation of that IP.
 

theluggage

macrumors G3
Jul 29, 2011
8,011
8,444
Perhaps you may not have read some of my text.
To say it once more I'm not complaining or feeling let down by the new M1 Machines.

...I think your subject line was a mistake. There have been other threads here from people all disappointed because the Letter-envelope compatible, 12-hour battery life ultraportable was only as powerful as a 3-year-old full-fat desktop PCIe GPU. (Oh, the humanity...)

As for the future - Apple's problem was always that they prioritised thinness, lightness and battery life (+ proprietary, non-upgradeable solutions) over raw performance. If the performance-per-watt that these new ultraportable machines have demonstrated scales up to 16" MBPs and iMacs that have the power/cooling to support more cores and higher clock-speeds, then Apple may well have solved that conflict.

They've also gone some way to justify the whole non-upgradeable proprietary solution thing by integrating everything on the SoC to get faster interconnects.

A lot of the video editing etc. workloads in question scale well to multi-threading (after all, the current Intel/AMD/NVIDIA approach to higher-end workstations is already to throw more CPU and GPU cores at the problem) so a future Mac workstation could do worse than stick multiple M1-like SoCs in a box. More likely we'll see a M1X/M2/Whatever but I'd still hazard a guess that the eventual Mac Pro replacement might be a slot-in multi-processor solution.
 

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
What do you mean by "integrate"? It would need to be kept as a separate application.

Integrate as in integrate, why would it need to be a separate app...?

If you say because it is a fork of the original Open Source app, then no, they do not have to; once they fork the app, they can do anything they want with it...

Blender already has rudimentary (compared to FCPX, Resolve, & Nuke) video editing & compositing modules; Apple would just be integrating everything together into an all-in-one DCC software suite...
 

matrix07

macrumors G3
Jun 24, 2010
8,226
4,895
I think you’re missing OP’s point. They’re not shaming the current M1 chip. They’re speaking about the history of Apple and gaming graphics, and posing the question “does Apple silicon pave the way for a new era of Mac gaming?”

And I think OP poses a good question. About 80% of all Mac sales are notebooks. I’d guess that also translates to the user base, meaning the majority of those running macOS are using notebooks. Of course top-tier iMacs, the 16” MacBook Pro, and the Mac Pro can already run AAA games no problem, but historically the lower-end notebooks have struggled (and this is true for any manufacturer, so it’s not a knock on Apple specifically).

But that’s why so many devs have not felt gaming on the Mac is worth the investment. If the majority of Mac customers are using low-end notebooks that can’t run AAA games well, then it’s not worth it for them. The only way Apple and AAA gaming will become mainstream is if Apple can bring AAA game graphics to their lower-end notebooks. At that point, it becomes much more lucrative for a dev.

I think the M1 does show us that this is going to be a reality. Of course we’re still in the early days, but if even the MBA, the most popular Mac product, can now run some AAA games, that opens the floodgates for devs to port Mac versions optimized for Apple Silicon.

I am cautiously optimistic that AS will usher in a new era of gaming on the Mac.
Yeah, he clarified that, but you have to admit “Great machines - Shame about the graphics” in an M1-specific forum does point to the M1 chip. In his opening he even said, “My above title comments, in all honesty, could have been leveled at most/all Apple machines over the past couple of decades.” That also suggests he was talking about the M1 in the title.

I believe that from what the first version of AS achieved, we can see a trajectory that’s much better than what Intel can provide, and we can also hope that it can beat AMD and even Nvidia. Apple has always been constrained by thermal efficiency, but AS is so efficient that, you’re right, for the first time we can be optimistic.
 

Yebubbleman

macrumors 603
May 20, 2010
6,024
2,616
Los Angeles, CA
My above title comments, in all honesty, could have been leveled at most/all Apple machines over the past couple of decades.
In fact this is the single reason which has stopped me from going all in on Apple machines, as much as I would otherwise have loved to.
The thought of paying £1000 / £2000 and having a computer with basically a low-power laptop graphics chip is simply something I personally could not accept.

And I accept this is 100% down to Steve Jobs not personally having any strong interest in this area of computing.
If Steve had always been a "gamer" and loved entertainment titles, the history of Apple in this regard would be totally different.
But that's the past...

So, now we have Apple ARM-based silicon, and Apple has the freedom to stretch their wings in any direction they wish.

Given this new-found freedom, do you think Apple will finally kick off the past when it comes to not taking GPUs and entertainment seriously?
Or still leave it to AMD (as they won't use Nvidia) to handle this?

Or might there be a chance that finally, after literally decades, and given they have total freedom now to follow any path they like,
we might, at long, long last, see Apple take this side of computing seriously and put effort into competing with AMD and Nvidia in this up-till-now ignored area of computer chip design?

Many seem to be thinking Apple may have its own separate GPU chip to fit inside higher-end Macs next year.
Now, of course, I don't think anyone is expecting an RTX 3090, with its 28 billion transistors and many thousands of GPU cores, from Apple next year ;)
But they could make a small start along that path, with a goal of reaching, in perhaps 5+ years, what the current leader PC machines can enjoy.

So, given their freedom, do you think they will expand in this direction and aim to have the best consumer CPUs and GPUs to offer Apple customers?
Or do you think the concept of not trying too hard with graphics, and "it's good enough", is so deeply ingrained into the company's core mentality that even with their new freedom it's simply not going to happen?

Considering the 2020 Intel versions of both the MacBook Air and the 2-port 13" MacBook Pro, let alone the Intel integrated graphics in any 13" laptop, Mac or otherwise, the M1's graphics are pretty amazing. Considering that it's pretty obvious that there will be higher performance SoCs destined for the 16" MacBook Pro, the iMacs, and maybe even the replacement to the 2020 Intel 4-port 13" MacBook Pro (be that a 13" or a 14" MacBook Pro), things are only going to get better from here.

If gaming is your concern, however, then you're focusing on the wrong element of the problem. The problem isn't that Apple's SoC graphics are lame (because, all things considered, they're not). The problem is that, by switching away from x86 (let alone designing GPUs in a fundamentally different way from the rest of the PC industry [Intel Macs included here]) and requiring Metal to really realize the performance, they're pushing away developers that either don't have the resources to go all-in on Apple's APIs and frameworks or don't want to, which will limit Mac ports of popular PC games (which were already on the decline).

The other problem is that macOS, like iOS and iPadOS, keeps advancing, with Apple forcing both users and developers to keep current. If your game operates on free-to-play and/or has recurring content updates, you'll keep up (see "League of Legends" and all modern Blizzard titles). If you put out a game once, and maybe issue a couple of patches to resolve issues you couldn't get resolved in beta, but nothing else, your game will eventually stop working on the Mac by virtue of not being updated to support a particular transition (be it PowerPC to Intel, 32-bit Intel to 64-bit Intel, Intel to Apple Silicon, OpenGL to Metal, and so on). It's not a great platform for games for all of these reasons.

Our best bet for gaming on the Mac is iOS and iPadOS games that either get ported to macOS via Catalyst and/or just run natively on Apple Silicon Macs. Not stellar. But, as they say, Macs are not a good gaming platform.
 

SlCKB0Y

macrumors 68040
Feb 25, 2012
3,431
557
Sydney, Australia
If you say because it is a fork of the original Open Source app, then no, they do not have to; once they fork the app, they can do anything they want with it...
That's not how the GPLv2 license works at all. A person can't just take GPL code, fork it, and then change the licensing. The license follows the fork, and all subsequent and derivative work is also GPL.

If you then integrate that with another app as you're suggesting, all the code from the other apps must also be released as GPL code, as the integrated code is considered a derivative work.
 

widEyed

macrumors regular
Aug 18, 2009
175
68
...I think your subject line was a mistake. There have been other threads here from people all disappointed because the Letter-envelope compatible, 12-hour battery life ultraportable was only as powerful as a 3-year-old full-fat desktop PCIe GPU. (Oh, the humanity...)

As for the future - Apple's problem was always that they prioritised thinness, lightness and battery life (+ proprietary, non-upgradeable solutions) over raw performance. If the performance-per-watt that these new ultraportable machines have demonstrated scales up to 16" MBPs and iMacs that have the power/cooling to support more cores and higher clock-speeds, then Apple may well have solved that conflict.

They've also gone some way to justify the whole non-upgradeable proprietary solution thing by integrating everything on the SoC to get faster interconnects.

A lot of the video editing etc. workloads in question scale well to multi-threading (after all, the current Intel/AMD/NVIDIA approach to higher-end workstations is already to throw more CPU and GPU cores at the problem) so a future Mac workstation could do worse than stick multiple M1-like SoCs in a box. More likely we'll see a M1X/M2/Whatever but I'd still hazard a guess that the eventual Mac Pro replacement might be a slot-in multi-processor solution.
Been wondering the same. They'd need lots of compiler work, I guess, but I wonder if the future Mac Pro and iMac Pro might use multiple SoCs rather than keep expanding the die for short-run, specialist-market Macs. Also wondering if any future models will allow conventional off-SoC RAM to go beyond the on-chip limitations (16 GB in the case of the M1, and even if that's equivalent to 32 GB of Intel Mac RAM as some are speculating, it's not the 64, 128 or 512 GB we might expect to see on a future Mac Pro if it stayed with Intel CPUs), and what kind of bus these Macs have to the rest of the motherboard/connectors (no PCIe, by the sounds of it?).

Also really hoping that the 4x TB ports on 2 separate buses and eGPU support come back to the mini, or that a mini Mac Pro tower without the Mercedes-Benz-priced enclosure becomes a thing. Hopefully they just need to do more macOS work to support eGPUs on Apple Silicon. Apple's trend towards integrating everything and preventing DIY upgrades that can save us the Apple tax (300%) on RAM and SSDs is something I'm hoping doesn't get doubled down on. They seemed to listen on user-upgradable RAM more recently, which is promising.
 

widEyed

macrumors regular
Aug 18, 2009
175
68
I want to see Apple do two things:

1 - Fork Blender & integrate it with Final Cut Pro X & Logic Pro X, creating a comprehensive DCC (Digital Content Creation) software suite.

2 - Release an xMac, guts of the forthcoming new (smaller) Apple Silicon, but in a more affordable chassis. Or maybe a re-envisioning of the Mac Cube?
More like give up the part of the FCP market they haven't already bled to Adobe Premiere to DaVinci Resolve, with its much-improving NLE tools.

The Cube died its death for very good reasons.
 

SlCKB0Y

macrumors 68040
Feb 25, 2012
3,431
557
Sydney, Australia
If you say because it is a fork of the original Open Source app, then no, they do not have to; once they fork the app, they can do anything they want with it...

Just to clarify, what you are describing is much more in line with the BSD license. That's how Apple was able to use so much BSD code and integrate proprietary software (like all of the GUI) with it to create OS X. Although they don't have to, Apple does contribute back a significant amount of code to the BSD project via Darwin and other software.

Even Microsoft used a lot of BSD code in Windows NT and 2000, especially for their networking stack. I'm sure there are remnants here and there in modern Windows versions.
 

wyrdness

macrumors 6502
Dec 2, 2008
274
322
The Intel equivalents of these machines also had integrated graphics. Apple's integrated graphics have been demonstrated to be far faster than Intel's. We simply don't know what Apple is going to do for their higher-end machines. There are rumours of a future discrete Apple GPU, but they're completely unconfirmed.
 