All true.
On the other hand: put an M1 Pro/Max into a Mac mini, provide a decent cooling solution, and here's your gaming machine, most certainly on par with Xboxes and PlayStations…
> Sounds expensive. Who would want a $1000+ console?
They'd need to stay in the $500 range if they were to make a console. It's possible if they don't take any profits and sell it at cost.
> Sounds expensive. Who would want a $1000+ console?
Well, it'd be on par with consoles, yet a more capable machine, a "normal" Mac. Maybe a bit more expensive than gaming PCs, but not outrageously so.
> They'd need to stay in the $500 range if they were to make a console. It's possible if they don't take any profits and sell it at cost.
M1 Max is estimated to cost around $150 based on die dimensions and 5nm yields. Add in a 1TB SSD, a fan, and a controller. They could hit $500 if they wanted to.
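The "$150 based on die dimensions and 5nm yields" figure can be roughly reproduced with the standard dies-per-wafer and Poisson yield formulas. A sketch with assumed inputs (the wafer price and defect density are pure guesses; the ~432 mm² die size is the commonly reported figure for the M1 Max):

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Approximate gross dies per wafer: area term minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Classic Poisson yield model: Y = exp(-D * A)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

WAFER_COST = 17_000   # assumed price of a 5nm 300mm wafer, USD (guess)
DIE_AREA   = 432      # commonly reported M1 Max die size, mm^2
DEFECT_D   = 0.07     # assumed defect density, defects/cm^2 (guess)

gross = dies_per_wafer(300, DIE_AREA)
good  = gross * poisson_yield(DIE_AREA, DEFECT_D)
print(f"{gross} gross dies, {good:.0f} good dies, "
      f"~${WAFER_COST / good:.0f} per good die")
```

With slightly more optimistic defect-density or wafer-price assumptions this lands near $150, so the estimate is at least plausible for the die alone.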
How much do you think 16GB of LPDDR5 costs? That might be the manufacturing cost of the die itself... but aren't you forgetting the memory? 16GB of LPDDR5 by itself is already not cheap, and here we have a 512-bit RAM bus, advanced packaging technology, and custom RAM modules... Not even high-end GPUs use a RAM bus this wide, since it would be too expensive. There is no way $500 even covers the bill of materials on an M1 Max machine...
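Whether $500 covers the bill of materials comes down to line items like these. Every price below is a placeholder assumption for illustration, not a real quote; the point is the structure of the calculation, so swap in your own estimates:

```python
# Hypothetical bill-of-materials sketch for an "M1 Max console".
# All prices are assumptions, not sourced figures.
bom = {
    "M1 Max die (thread's estimate)":            150,
    "16 GB LPDDR5 (assumed)":                     80,
    "advanced packaging / substrate (assumed)":   40,
    "1 TB SSD (assumed)":                         90,
    "PSU, cooling, enclosure (assumed)":          60,
    "controller (assumed)":                       40,
}
total = sum(bom.values())
for part, usd in bom.items():
    print(f"{part:<44} ${usd}")
print(f"{'total':<44} ${total}")
```

Small changes to the memory and packaging assumptions swing the total well above or below $500, which is exactly where the disagreement in this thread sits.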
> To not gain any performance turning off AA seems sus.
No, it isn't, not in this game and not in this scenario where, as he mentioned in the other thread, the CPU is a desktop i9-11900K paired with a desktop 3080 Ti.
And in this type of game the GPU is what matters most... it's not like StarCraft, which is CPU-hungry.
Bottom line: for proper games that take advantage of Apple's hardware, this GPU is impressive for 55-60W.
Have a nice day, all of you; I'm going to go play with Maya.
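To put the 55-60W point in numbers, here's a quick performance-per-watt comparison. The TFLOPS and power figures are approximate public numbers I'm assuming for the ballpark, not measurements:

```python
# Back-of-the-envelope perf-per-watt comparison.
# Figures are approximate public specs (assumptions, not benchmarks).
gpus = {
    "M1 Max GPU":  (10.4, 57),   # ~10.4 FP32 TFLOPS at ~55-60 W
    "RTX 3080 Ti": (34.1, 350),  # desktop card, ~350 W board power
}
gflops_per_watt = {name: tflops / watts * 1000
                   for name, (tflops, watts) in gpus.items()}
for name, eff in gflops_per_watt.items():
    print(f"{name}: ~{eff:.0f} GFLOPS/W")
```

On paper the desktop card is far faster in absolute terms, but per watt the M1 Max comes out well ahead, which is the point being made about a 55-60W part.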
The best you can hope for is probably around 1650-level performance in a new smaller M2 Mac mini.
> How much do you think 16GB of LPDDR5 costs?
Apple could also replace that with the more power-hungry but cheaper DDR5/GDDR6.
> They could yes, but they would need to build an entirely new SoC for that. New memory controllers, different pinout, etc.
I don't think Apple will enter the computer or console gamer's market. The future is in mobile. Apple's GPUs are getting more powerful every year while maintaining the same power envelope. I would think that in 2 to 3 years, mobile GPUs will be good enough for the effects that most gamers care about.
RTX 3060 is disappointing as it is an entry level GPU for gaming laptops.
"Entry level" for gaming laptops is something like 1650. You people have really weird expectations. Machines with a 3060 (laptop or desktop) or higher represent less than 5% of PCs on the steam hardware survey. How can you call something "entry level" when only one in 20 "gamers" have access to that tech?
> I don't think Apple will enter the computer or console gamer's market. The future is in mobile.
Except that in the x86/x64 world we are starting to see handheld PCs with very decent performance, so the future isn't only in mobile games. Quite the contrary: mobile platforms are absorbing larger titles.
> Except that in the x86/x64 world we are starting to see handheld PCs with very decent performance, so the future isn't only in mobile games. Quite the contrary: mobile platforms are absorbing larger titles.
What you described sounds awfully like an iPad (except for the battery life), don't you agree?
[image attachment]
This little **** looks like a toy. But make no mistake: it has a Core i7 with 16 GB of RAM and runs quite a few triple-A games, while offering eGPU support and being upgradeable (yes, you can replace the motherboard!). Right now the fans are loud when a game is too demanding, but give it a few more years and we may start to see the same kind of performance-efficiency boost we saw on Macs.
And no, as you might guess, it's not a low-power system running some obscure Android/Linux distro. It's a full-blown system, so you could even install Office or Photoshop and make it your daily driver.
> Did I say Macs are “only” for creative workloads? No. But this is a discussion about the M1 Max chip, specifically the GPU, a chip used in the MacBook Pro, which is designed for creative workloads. Office stuff is irrelevant to a discussion about high-end GPU workloads.
I 100% agree.
Of course you can still use a MacBook Pro for office stuff. That’s irrelevant to this discussion.
> What you described sounds awfully like an iPad (except for the battery life), don't you agree?
I'll disagree, because while the iPad (especially the Pro models) has performance similar to the MacBook Pro's, it is a more restricted system. The GPD series (and its competitors) can run full-blown Windows smoothly, so they're more like ultra-portable premium gaming systems. The GPD does have a touch screen, but it's not as suitable for drawing as the iPad's.
> Tiny correction: the RTX 3060 is the entry-level ray-tracing-capable GPU for gaming laptops.
Not the RTX 3050?
Except they’re not.
Some professional cards are clocked higher, have greater bandwidth, usually considerably more memory, and drivers built for different workflows.
The A6000, for example, has 48 GB of RAM compared to its predecessor's 24 GB, and is around 30% faster than a 3090 in machine learning tasks.
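The clock and SM differences discussed here can be turned into theoretical throughput. A sketch using the commonly published Ampere figures (84 SMs and an 1800 MHz boost for the A6000, 82 SMs and 1695 MHz for the 3090 FE; treat both as approximate):

```python
def fp32_tflops(sms: int, boost_mhz: int, cores_per_sm: int = 128) -> float:
    """Theoretical FP32 throughput for an Ampere GPU:
    SMs * cores/SM * 2 FLOPs per clock (FMA) * boost clock."""
    return sms * cores_per_sm * 2 * boost_mhz * 1e6 / 1e12

a6000   = fp32_tflops(84, 1800)  # published: 84 SMs, 1800 MHz boost
rtx3090 = fp32_tflops(82, 1695)  # published: 82 SMs, 1695 MHz boost
print(f"A6000 ~{a6000:.1f} TFLOPS, 3090 FE ~{rtx3090:.1f} TFLOPS")
```

The raw FP32 gap works out to under 10%, so if the A6000 really is ~30% faster in ML tasks, most of that difference would have to come from memory capacity or driver/software differences rather than clocks and SM count alone.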
> Not the RTX 3050?
> Default/Boost clocks on the A6000 are higher than the 3090FE, and it has 2 more SMs than the 3090FE as well.
Oh, where exactly is your source for the upclocked A6000? We're not just trying to talk about any higher spec you believe you can find that makes your point. More memory in a GPU mostly allows for higher-resolution textures.
Please explain in detail, and with sources, how doubling the GPU RAM makes for faster completion of ML tasks.
Are you speaking from experience, is that your personal experience? Or, are you going by marketing material?
Oh, I stand corrected, but I'm not sure if it's a BTO option for gaming laptops.
But is it optimized for the M1 Pro & M1 Max? I'd expect that to happen 52 weeks from now.
PC for those who want to do both work and gaming on the same machine and don't mind the sound of a blow dryer and the heat of a sauna in their room non-stop, along with a higher electricity bill.
Default/Boost clocks on the A6000 are higher than the 3090FE, and it has 2 more SMs than the 3090FE as well.