> M1/M2 are "SoCs", which stands for "System on a Chip". Apple designs the entire chip, basically everything on it, which includes the GPU. They started putting PowerVR GPUs from Imagination Tech in their first A-series SoCs a dozen years ago and currently license Imagination IP, so their GPUs can be described as PowerVR derivatives.
Great! Thanks.
> Great! Thanks.
The M1 Max in the MacBook Pro is pretty great with its 24- or 32-core GPU.
Ah, I was really hesitating. I kinda only want to bring just one MBP while travelling, but I might need better GPU performance to run some apps locally.
Ah...when Apple GPU starts to rock....
> The M1 Max in the MacBook Pro is pretty great with its 24- or 32-core GPU.
Does it really work with Stable Diffusion? I saw many people say even the M1 Max series is slow.
> Ah, by the way, something about the performance of Apple GPUs. This is the 3DMark score for the iPhone 14 Pro: https://www.notebookcheck.net/Apple-A16-GPU-5-Core-GPU-Benchmarks-and-Specs.652746.0.html
> It’s essentially approaching Nvidia’s dedicated MX450. That’s a phone we are talking about.
It's amazing what companies can do with TSMC's most advanced node. I wonder if the iPhone 14 Pro's GPU has more transistors than the MX450.
> It’s slower than Nvidia for sure, partially due to the software maturity and partially due to the fact that Nvidia has dedicated ML hardware… and of course, the power consumption is an entirely different class.
The rumors are insane about how much power the RTX 4090 is going to require. If left unchecked, there could come a time when desktop machines use two power supplies. I know a lot of this is rumors, but there seems to be some truth to the fact that Nvidia is taking a page out of Intel's playbook by just boosting the power to increase performance (I know there are other improvements).
> The rumors are insane about how much power the RTX 4090 is going to require. If left unchecked, there could come a time when desktop machines use two power supplies. I know a lot of this is rumors, but there seems to be some truth to the fact that Nvidia is taking a page out of Intel's playbook by just boosting the power to increase performance (I know there are other improvements).
Performance/Watt is still going to favour the RTX 4090, as it is built on two full nodes smaller than the RTX 3090. I believe the current rumour is a TGP of 450 Watt up to a max of 660 Watt.
I absolutely love how my MBP gets the same performance on battery as it does plugged in. My Razer's performance tanks, thanks to the fact that the battery cannot power the RTX 2070 it uses. My MBP lasts all day doing all sorts of tasks, whereas the Razer's battery would only last half as long (best-case scenario).
I still enjoy using my Razer, but for most of my mobility needs, the MBP is the better option.
> Yeah...
> Just being curious... Is it still Intel?
> Or Apple itself?
The fabrication itself is done by TSMC, but the whole SoC is designed by Apple itself.
> up to a max of 660 Watt.
That's the issue: 660 watts. And will we see even higher requirements with a Ti model that will eventually come out?
> I think it's crazy how much power both the Intel processors and Nvidia GPUs are requiring.
It's crazy (in a good way) how little power the Apple Silicon computers use. My Mac Studio Ultra draws just 16-17 W when sitting idle.
> Does it really work with Stable Diffusion? I saw many people say even the M1 Max series is slow.
I’ve been playing around with Stable Diffusion on my M1 MBP for the past few days and I’m very pleasantly surprised how functional it is. Sure, it’s probably not as fast as a 4090, but it runs way better than I would have expected.
> Ah, by the way, something about the performance of Apple GPUs. This is the 3DMark score for the iPhone 14 Pro: https://www.notebookcheck.net/Apple-A16-GPU-5-Core-GPU-Benchmarks-and-Specs.652746.0.html
> It’s essentially approaching Nvidia’s dedicated MX450. That’s a phone we are talking about.
An iPhone???? Crazy!
> I’ve been playing around with Stable Diffusion on my M1 MBP for the past few days and I’m very pleasantly surprised how functional it is. Sure, it’s probably not as fast as a 4090, but it runs way better than I would have expected.
Could you be more specific? Like how long did it take to get one picture, and under what prompts?
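For anyone wanting to try this themselves: running Stable Diffusion on Apple Silicon usually means routing PyTorch to the MPS (Metal Performance Shaders) backend rather than CUDA. A minimal device-selection sketch; the fallback logic and function name here are my own illustration, not from any particular repo:

```python
def pick_device(mps_available: bool, cuda_available: bool) -> str:
    """Choose the best torch device string for a Stable Diffusion pipeline."""
    if cuda_available:
        return "cuda"  # Nvidia GPU: fastest, has dedicated ML hardware
    if mps_available:
        return "mps"   # Apple Silicon GPU via Metal Performance Shaders
    return "cpu"       # fallback: works everywhere, but very slow

# With PyTorch installed, the two flags would come from:
#   import torch
#   device = pick_device(torch.backends.mps.is_available(),
#                        torch.cuda.is_available())
# and a diffusers pipeline would then be moved there with pipe.to(device).
print(pick_device(True, False))  # on an M1/M2 Mac this selects "mps"
```

Note that CUDA and MPS never coexist on one machine in practice; the ordering just expresses a preference when writing portable code.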
> That's the issue: 660 watts. And will we see even higher requirements with a Ti model that will eventually come out?
Why does it matter if it performs accordingly? The performance/watt is going to be much better than the RTX 3090's. The new AMD graphics cards are also rumoured to have increased power requirements.
I think it's crazy how much power both the Intel processors and Nvidia GPUs are requiring.
> Why does it matter if it performs accordingly?
For me, I'd rather not be spending a lot of money on electricity and incurring more heat. It's a personal preference, but I'm not inclined to have a desktop that is using kilowatts' worth of power.
> For me, I'd rather not be spending a lot of money on electricity and incurring more heat.
The GPU with the higher performance/W ratio will incur a lower power bill.
> The GPU with the higher performance/W ratio will incur a lower power bill.
How so?
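One way to see it: for a fixed job, energy = power × time, and time = work / performance, so energy used scales as 1/(performance per watt) regardless of peak wattage. A quick back-of-the-envelope sketch with made-up numbers (none of these figures are real benchmarks):

```python
def energy_joules(work: float, perf: float, watts: float) -> float:
    """Energy to finish a fixed workload: time = work/perf seconds, energy = watts * time."""
    return watts * (work / perf)

work = 1_000_000  # arbitrary work units for one render job

# Hypothetical "old" card: 100 units/s at 350 W -> 3,500,000 J
old = energy_joules(work, perf=100.0, watts=350.0)
# Hypothetical "new" card: 250 units/s at 450 W -> 1,800,000 J
new = energy_joules(work, perf=250.0, watts=450.0)

# The new card draws more power but finishes the job sooner,
# so it burns roughly half the energy (and half the cost) per job.
print(old / 3.6e6, new / 3.6e6)  # in kWh: ~0.97 vs 0.50
```

This only holds per unit of work done; if the faster card tempts you into running it flat out all day, the bill still goes up.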
> That's the issue: 660 watts. And will we see even higher requirements with a Ti model that will eventually come out?
I had to get a 1000 Watt power supply for my RTX 3080 Ti (when it only required 750); with everything else I have, any less would be a huge risk.
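That headroom over the rated requirement is the sensible part: modern GPUs can spike well above their rated TGP for milliseconds, so you size the PSU against peak system draw plus a margin. A rough sketch of that sizing logic; the 50% margin and the standard-size list are my own rule of thumb, not an official spec:

```python
def min_psu_watts(component_watts, transient_margin=0.5,
                  psu_sizes=(550, 650, 750, 850, 1000, 1200)):
    """Pick the smallest standard PSU covering peak draw plus transient headroom."""
    peak = sum(component_watts)          # steady-state peak of all components
    needed = peak * (1 + transient_margin)  # allow for millisecond power spikes
    for size in psu_sizes:
        if size >= needed:
            return size
    raise ValueError(f"need more than {psu_sizes[-1]} W")

# Hypothetical build: 350 W GPU + 150 W CPU + 100 W for the rest = 600 W peak.
# 600 * 1.5 = 900 W, and the smallest standard size that covers it is 1000 W.
print(min_psu_watts([350, 150, 100]))  # 1000
```

Which lines up with ending at a 1000 W unit for a card whose box only says 750.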