M1/M2 are "SoCs," which stands for "System on a Chip." Apple designs the entire chip, basically everything on it, including the GPU. They started putting PowerVR GPUs from Imagination Tech in their first A-series SoCs a dozen years ago and currently license Imagination IP, so their GPUs can be described as PowerVR derivatives.
 
Great! Thanks.

Ah, I was really hesitant because I kind of only want to bring one MBP while travelling, but I might need better GPU performance to run some apps locally.

Ah... when will Apple's GPUs start to rock...
 
The M1 Max in the MacBook Pro is pretty great with its 24- or 32-core GPU.
 
Does it really work with Stable Diffusion? I've seen many people say even the M1 Max series is slow.

It's slower than Nvidia for sure, partly due to software maturity and partly because Nvidia has dedicated ML hardware… and of course, the power consumption is in an entirely different class. But for a laptop it's not too terrible, and performance will likely improve going forward.

The thing is, you kind of are able to use Stable Diffusion on the M2 Air. It won't be very fast, but it should suffice for hobbyist use. You can't say the same for other ultraportable business ultrabooks in the same price category.
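For anyone curious what "using it" looks like in practice, here's a minimal sketch of running Stable Diffusion on Apple silicon through Hugging Face's diffusers library and PyTorch's MPS (Metal) backend. The model ID and step count are just illustrative defaults, and the first call downloads several GB of weights:

```python
def pick_device() -> str:
    """Prefer Apple's Metal backend (MPS) when PyTorch reports it, else fall back to CPU."""
    try:
        import torch
        if getattr(torch.backends, "mps", None) and torch.backends.mps.is_available():
            return "mps"
    except ImportError:
        pass
    return "cpu"


def generate(prompt: str, steps: int = 30):
    """Generate one image with Stable Diffusion; downloads model weights on first call."""
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
    pipe = pipe.to(pick_device())  # "mps" on an M-series Mac, else CPU
    return pipe(prompt, num_inference_steps=steps).images[0]
```

On an M-series machine `pick_device()` should return `"mps"`; lowering `steps` trades image quality for speed.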
 
It's probably best to use a cloud provider for Stable Diffusion. I'm using replicate.com, which provides A100s via AWS. Laptop GPUs aren't really up to this kind of task.
 
It's slower than Nvidia for sure, partly due to software maturity and partly because Nvidia has dedicated ML hardware… and of course, the power consumption is in an entirely different class.
The rumors about how much power the RTX 4090 will require are insane. If left unchecked, there could come a time when desktop machines need two power supplies. I know a lot of this is rumor, but there seems to be some truth to the idea that Nvidia is taking a page out of Intel's playbook by simply boosting power to increase performance (I know there are other improvements too).

I absolutely love how my MBP gets the same performance on battery as it does plugged in. My Razer's performance tanks because the battery can't fully power its RTX 2070. My MBP lasts all day doing all sorts of tasks, whereas the Razer's battery lasts half as long at best.

I still enjoy using my Razer, but for most of my mobility needs, the MBP is the better option.
 
It works
Performance per watt is still going to favour the RTX 4090, as it's built on a process two full nodes ahead of the RTX 3090's. I believe the current rumour is a TGP of 450 W, up to a max of 660 W.
 
up to a max of 660 W.
That's the issue: 660 watts. And will we see even higher requirements from a Ti model that will eventually come out?

I think it's crazy how much power both Intel processors and Nvidia GPUs are requiring.
 
It's crazy (in a good way) how little power the Apple silicon computers use. My Mac Studio Ultra draws just 16–17 W when sitting idle.
 
Does it really work with Stable Diffusion? I've seen many people say even the M1 Max series is slow.
I've been playing around with Stable Diffusion on my M1 MBP for the past few days, and I'm very pleasantly surprised by how functional it is. Sure, it's probably not as fast as a 4090, but it runs way better than I would have expected.
 
Could you be more specific? Like, how long did it take to generate one picture, and with what prompts?
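As a rough way to reason about that question: generation time is essentially the number of denoising steps divided by the machine's iteration throughput. The throughput figures below are placeholders for illustration, not measured benchmarks:

```python
def seconds_per_image(steps: int, iters_per_sec: float) -> float:
    """Back-of-envelope: one image takes (denoising steps) / (iterations per second)."""
    return steps / iters_per_sec


# Hypothetical throughputs at 512x512 (assumptions, not measurements):
print(seconds_per_image(50, 2.5))   # e.g. an M1 Max-class GPU -> 20.0 s
print(seconds_per_image(50, 20.0))  # e.g. a fast desktop card -> 2.5 s
```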
 
That's the issue: 660 watts. And will we see even higher requirements from a Ti model that will eventually come out?

I think it's crazy how much power both Intel processors and Nvidia GPUs are requiring.
Why does it matter if it performs accordingly? The performance per watt is going to be much better than the RTX 3090's. The new AMD graphics cards are also rumoured to have increased power requirements.

I mean, I'm thrilled that my machine (Mac Studio) only sips power compared to those desktop monsters, but some people need all the performance money can buy, especially in a desktop machine.
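To make the performance-per-watt point concrete, here's a toy comparison. The figures are hypothetical rumor-era numbers, chosen only to show that a card can draw more total power while still being more efficient:

```python
# Hypothetical figures for illustration only (not benchmarks):
cards = {
    "RTX 3090": {"relative_perf": 100, "watts": 350},
    "RTX 4090 (rumored)": {"relative_perf": 190, "watts": 450},
}

# A higher total power draw can still mean better efficiency if
# performance grows faster than wattage.
for name, c in cards.items():
    ppw = c["relative_perf"] / c["watts"]
    print(f"{name}: {ppw:.3f} perf/W")
```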
 
Why does it matter if it performs accordingly?
For me, I'd rather not spend a lot of money on electricity or deal with the extra heat. It's a personal preference, but I'm not inclined to have a desktop using kilowatts' worth of power.
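For a rough sense of what that preference costs in practice: annual electricity cost is just watts × hours × rate. The usage pattern and the $0.15/kWh rate below are assumptions, not anyone's actual bill:

```python
def annual_energy_cost(watts: float, hours_per_day: float, usd_per_kwh: float = 0.15) -> float:
    """Yearly electricity cost: convert watts to kWh over a year, times the utility rate."""
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh


# Hypothetical: a 660 W GPU under load 4 h/day vs. a 17 W idle Mac Studio, 24 h/day
print(round(annual_energy_cost(660, 4), 2))   # ~144.54 USD/year
print(round(annual_energy_cost(17, 24), 2))   # ~22.34 USD/year
```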
 
Apple. And for an integrated GPU, there's absolutely nothing better. It blows Intel out of the water.

And on paper, it can also be on par with dedicated graphics cards. On paper. It's not happening with the M1 line, though, even with the M1 Ultra, and if there's an M1 Mac Pro, it won't happen there either. We'll see how the M2 line unfolds. Maybe they can scale their GPUs better with more cores; scaling is far from being as linear as it is with their CPUs.
 
That's the issue: 660 watts. And will we see even higher requirements from a Ti model that will eventually come out?

I think it's crazy how much power both Intel processors and Nvidia GPUs are requiring.
I had to get a 1000 W power supply for my RTX 3080 Ti (even though it only requires 750); with everything else I have, anything less would be a huge risk.
 