
magbarn

macrumors 68040
Oct 25, 2008
3,018
2,386
How is this possible? Only Apple doesn't understand proper cooling. Only Apple's laptops need additional cooling. Intel laptops (other than MacBooks, of course) always have adequate cooling, so they never thermal throttle. /s
Apple's Intel laptops had inadequate cooling. Apple fixed it by replacing the CPU/GPU.

In these comparisons there should be screenshot scene comparisons, as I suspect the macOS version may be rendering less or have a shorter draw distance. Were both laptops fully equalized on "eye candy" settings?

The reason I'm asking is that for the games I've tried on both macOS and in Boot Camp, the Windows/Boot Camp version had better graphical fidelity.
 
  • Angry
Reactions: EPO75

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
Apple's Intel laptops had inadequate cooling. Apple fixed it by replacing the CPU/GPU

In these comparisons there should be screenshot scene comparisons, as I suspect the macOS version may be rendering less or have a shorter draw distance. Were both laptops fully equalized on "eye candy" settings?

The reason I'm asking is that for the games I've tried on both macOS and in Boot Camp, the Windows/Boot Camp version had better graphical fidelity.

The joke is that plenty of laptops beyond Apple's had (and still have) thermal issues with Intel's chips. That's the problem with laptops: what individual OEMs actually do with cooling and BIOS settings matters so much.

Anyway, the graphical settings appeared to be the same to me. But 🤷‍♂️.
 

Surne

macrumors member
Sep 27, 2020
76
57
That game is translated through Rosetta and not optimized for M1 and it still managed a higher frame rate than on the PC.
Keep in mind the 3080 in the video is not the full mobile 3080; it's a gimped variant. Nvidia started doing this annoying thing where they keep the same name for a GPU that has been cut back to perform severely below what it normally would. So they'll take a 3080 and effectively neuter it down to the level of a weaker GPU but keep its name as 3080. The 3080 in this video performs more in the mobile 3060 range than like a full mobile 3080.

Regardless, it's still very impressive, as it's performing around a mobile 3060 while running through translation layers.
 
Last edited:
  • Like
Reactions: Mayo86

mi7chy

macrumors G4
Oct 24, 2014
10,625
11,296
Michy, I noticed you want to trade in your MBA M1 16GB/256GB. Apple have a great trade-in scheme; you can even arrange it all on the web.

Thinking about it, but last I checked they gave peanuts, and to be honest I just need a rental to satisfy curiosity. I probably wouldn't use it enough to justify spending $3500 on the 16" M1 Max 32-GPU I'm interested in.
 

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
Keep in mind the 3080 in the video is not the full mobile 3080; it's a gimped variant. Nvidia started doing this annoying thing where they use the same name for different configurations of the same GPU. So they'll take a 3080 and effectively neuter it down to the level of a weaker GPU but keep its name as 3080. The 3080 in this video performs more in the mobile 3060 range than like a full mobile 3080.

Regardless, it's still very impressive, as it's performing around a mobile 3060 while running through translation layers.

According to the guy (who may be mistaken) he had the 100W variant which is not the top but should be better than a mobile 3060 unless something else is constraining it.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
I'm aware, but it should still be better than it is according to other benchmarks of this game. It's about 20% off the mark (actually more; looking at the numbers, it definitely looks hamstrung).
Maybe that should be Apple's angle: you don't need to do anything special to get great game performance from Apple Silicon, whereas on a traditional PC you have to muck with settings to get the best performance.
 
  • Like
Reactions: crazy dave

Taz Mangus

macrumors 604
Mar 10, 2011
7,815
3,504
Maybe that should be Apple's angle: you don't need to do anything special to get great game performance from Apple Silicon, whereas on a traditional PC you have to muck with settings to get the best performance.
Not to mention that the laptop needs to be plugged into a power outlet to get peak performance, with the fans roaring loudly.
 

Melbourne Park

macrumors 65816
Thinking about it but last I checked they gave peanuts and to be honest I just need a rental to satisfy curiosity and probably wouldn't use it enough to justify spending $3500 for the 16" M1 Max 32-GPU I'm interested in.
I was surprised Apple were offering me good money for my August 2017 15.4". Goodness knows what will happen to my computer - it runs fine, so it must go someplace ...

In many countries you can buy one from Apple and return it within 14 days. Check that Apple do that where you live. Better than renting if you are ready to run your programs.

When the upgrades happen, refurbs might have some good deals ... some time away for affordability, I reckon ... you might end up with your trial machine ...
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Well, the stories I've heard say Intel promised certain thermal windows for their CPUs but couldn't deliver as promised.
The current narrative is that Apple decided to start working on Apple Silicon for the Mac when Intel missed their target for the 2015 MacBook and 2016 MacBook Pro.
 

anticipate

macrumors 6502a
Dec 22, 2013
936
768
I mean, it's not bad. 4K max-settings AAA gaming is definitely possible.

In Shadow of the Tomb Raider, at totally maximum/ultra settings (including AMD's upscaling mode, I forget what it's called), at 4K, my M1 Max 16" got 33 fps average.

My desktop RTX 3070 based PC got 74 with RT off and DLSS on; 53 with ray tracing and DLSS on at the same maxed 4K settings.

At 1440p the Max went to 61 fps, same maximized settings.

So that's very impressive for the M1 Max, especially running in Rosetta. But even a desktop 3070 will crush it. For some games.

For productivity, FCP, Resolve, and even Premiere are faster than on even a desktop 3080 based machine (all assuming 8-core GPUs). The chip is designed for content creation, not gaming. That's why I have a gaming PC, clearly. But it isn't bad at all. Certainly as capable as my old iMac Pro with a Vega 64 was.
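For a quick sense of scale, the frame rates quoted above can be turned into ratios. This is a sketch using only the poster's numbers, which are anecdotal results rather than controlled benchmarks:

```python
# Shadow of the Tomb Raider at 4K, maxed settings, upscaling on,
# as reported in the post above (anecdotal, single-machine numbers).
results_4k = {
    "M1 Max (Rosetta)": 33,
    "RTX 3070 desktop (DLSS, RT off)": 74,
    "RTX 3070 desktop (DLSS, RT on)": 53,
}

baseline = results_4k["M1 Max (Rosetta)"]
for name, fps in results_4k.items():
    # Express each result relative to the M1 Max figure.
    print(f"{name}: {fps} fps ({fps / baseline:.2f}x the M1 Max)")
```

By the same arithmetic, the 61 fps 1440p figure lands much closer to the desktop card's 4K numbers, though the resolutions differ so that isn't a direct comparison.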
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
Nvidia started doing this annoying thing where they keep the same name for a GPU that has been cut back to perform severely below what it normally would. So they'll take a 3080 and effectively neuter it down to the level of a weaker GPU but keep its name as 3080.
I really hope NVIDIA learn that if they keep selling neutered GPUs as if they're "real" 3080s, then their competitors are going to benchmark their neutered GPUs as if they're "real" 3080s.

Hopefully the reputational damage outweighs the money they've made scamming customers into buying 3080s that only run like 3060s.
 

Surne

macrumors member
Sep 27, 2020
76
57
I really hope NVIDIA learn that if they keep selling neutered GPUs as if they're "real" 3080s, then their competitors are going to benchmark their neutered GPUs as if they're "real" 3080s.

Hopefully the reputational damage outweighs the money they've made scamming customers into buying 3080s that only run like 3060s.
Agreed. It's incredibly stupid and scammy on Nvidia's part. They deserve whatever bad looks and backlash they get for misleading people.
 
  • Like
Reactions: JMacHack

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
Maybe that should be Apple's angle: you don't need to do anything special to get great game performance from Apple Silicon, whereas on a traditional PC you have to muck with settings to get the best performance.

I guess I would phrase it as: with any given PC laptop, you don't know what you're actually buying half the time - and that can be true for the CPU too, not just the GPU. To give them credit, Intel is sort of addressing some of this with Alder Lake by reporting two numbers for power usage rather than just "TDP", which was beyond a joke even for desktops and waaaay beyond one for laptops. But it really needs the OEM to also be upfront about exactly which chips are inside and how they've been configured, as well as the cooling capacity, if those settings are modifiable.
 
Last edited:

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
Keep in mind the 3080 in the video is not the full mobile 3080 it's a gimped variant.
As far as I can tell there is no such thing. All Laptop 3080 chips have equal CUDA core counts, memory interface width, and so forth.

What's going on is just thermal management. Nvidia permits system integrators to configure a wide range of different power limits. When Razer made a thin and light 3080 laptop, they set the target at somewhere around 100W and the 3080's power management controls boost clocks to stay at or below that target. But another vendor making something much chunkier with a lot more cooling can choose to raise the limit as high as 155W.

It's not just the 3080. As you can see in these specs, every 30-series laptop GPU supports a range of power settings.
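A toy model can illustrate the point about vendor-set power limits. Dynamic power grows steeply with clock speed (roughly cubically once voltage scaling is included), so a boost governor pinned under a power cap tends to settle at a clock near the cube root of the cap ratio. The 155W/1710MHz reference point below is an illustrative assumption, not an Nvidia spec, and real boost behavior is far more complicated:

```python
# Toy model (not Nvidia's actual algorithm): if power scales ~clock^3,
# a governor holding the GPU at a power cap runs the clock at roughly
# (cap / reference_power)^(1/3) of the reference clock.
def boost_clock_mhz(power_limit_w, ref_power_w=155.0, ref_clock_mhz=1710.0):
    """Estimate sustained boost clock under a vendor-set power limit."""
    return ref_clock_mhz * (power_limit_w / ref_power_w) ** (1 / 3)

# Sample the kind of TGP range OEMs can choose from for the same chip.
for limit in (80, 100, 125, 155):
    print(f"{limit:>3} W -> ~{boost_clock_mhz(limit):.0f} MHz")
```

The takeaway matches the thread: two laptops with the "same" GPU name but 100W vs 155W limits end up meaningfully apart in sustained clocks, with no hint of that in the product name.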

 

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
As far as I can tell there is no such thing. All Laptop 3080 chips have equal CUDA core counts, memory interface width, and so forth.

What's going on is just thermal management. Nvidia permits system integrators to configure a wide range of different power limits. When Razer made a thin and light 3080 laptop, they set the target at somewhere around 100W and the 3080's power management controls boost clocks to stay at or below that target. But another vendor making something much chunkier with a lot more cooling can choose to raise the limit as high as 155W.

It's not just the 3080. As you can see in these specs, every 30-series laptop GPU supports a range of power settings.


For this generation of Nvidia products, that's what people mean when they say "gimped variant" or "not full". It's generally understood that the wattage, and with it the performance, has been reduced rather than core counts or the like. That's also what irks people, as OEMs can claim to have a certain GPU and CPU, but the large range of power profiles means actual performance can vary substantially from what the consumer will assume based on the names.
 
Last edited:
  • Like
Reactions: JMacHack and Surne

magbarn

macrumors 68040
Oct 25, 2008
3,018
2,386
As far as I can tell there is no such thing. All Laptop 3080 chips have equal CUDA core counts, memory interface width, and so forth.

What's going on is just thermal management. Nvidia permits system integrators to configure a wide range of different power limits. When Razer made a thin and light 3080 laptop, they set the target at somewhere around 100W and the 3080's power management controls boost clocks to stay at or below that target. But another vendor making something much chunkier with a lot more cooling can choose to raise the limit as high as 155W.

It's not just the 3080. As you can see in these specs, every 30* laptop GPU supports a range of power settings.

Thermal management or gimping, the end result is the same; I call it deceptive advertising. It would be akin to BMW selling a V8 with fuel management limiting its power to 6-cylinder levels due to reduced cooling capacity.

The old Nvidia Max-Q branding was actually more consumer friendly, but I guess the OEMs didn't like their Max-Q designs being passed over for the full-powered variants, so now we just have utter confusion.
 
  • Like
Reactions: Surne

JouniS

macrumors 6502a
Nov 22, 2020
638
399
For this generation of Nvidia products, that's what people mean when they say "gimped variant". It's generally understood that the wattage, and with it the performance, has been reduced rather than core counts or the like. That's also what irks people, as OEMs can claim to have a certain GPU and CPU, but the large range of power profiles means actual performance can vary substantially from what the consumer will assume based on the names.
Isn't that just the usual cores vs. clock rate issue? A bigger chip with more cores is more expensive, but if you have a fixed power budget, it will also perform better than the smaller chip. Some consumers may be poorly informed and some vendors may use misleading marketing tactics, but the trade-off itself is sensible and easy to understand.
 

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
Isn't that just the usual cores vs. clock rate issue? A bigger chip with more cores is more expensive, but if you have a fixed power budget, it will also perform better than the smaller chip. Some consumers may be poorly informed and some vendors may use misleading marketing tactics, but the trade-off itself is sensible and easy to understand.

It depends on the usage and on GPU vs CPU - e.g. Ian Cutress from AnandTech showed one case where an i7 was beaten in almost every metric by the i5 in an otherwise identical laptop (or was that LTT? I can't remember now). But yes, the bigger concern is how easily it lets vendors mislead people about their products, and Nvidia, Intel, and AMD don't enforce branding rules around wattage to ensure the consumer knows what they're actually getting.

Edit: found it. Ian goes through one example here and mentions the story I talked about above:

 
Last edited:

Melbourne Park

macrumors 65816
Imagine if it was my business; I'd prefer not to abuse the return policy.
I think a lot of the reviewers do just that.

You could call Apple and ask them if they mind you buying one with the intention of testing how well it will run some of your software. If they say that is fine by them, then you are not doing anything wrong.

Also I presume their policy is part of the "Apple Tax".

I returned a new iPhone 11 because it was losing its email. Apple returned the funds, no questions asked. I did go to a store, though, trying to find out what was going on. They had no idea. So in desperation (I was going to Europe from Australia in two days) I bought an 11 Pro instead, which had no bugs. With that crazy costly Pro phone, I left my camera gear at home. I took a lot of videos. I've hardly used my Sony full frame since ...

I presume the return policy suits their manufacturing model better than it would more "open" notebooks - i.e. ones where you can upgrade the SSD, RAM or GPU and change the battery yourself. You cannot with Apple, and the case is not easy to get into. So Apple are protected there.

When I had my keyboard replaced by Apple, they replaced the whole top of the notebook - including a new trackpad, and a new battery. Just one whole piece changeover.

Other US companies do the same 14-day return thing - although I think it's not always so easy with them. I had problems getting my money back from HP for a monitor that did not perform as their service department had promised it would (before buying, I had checked with them about a capability I had to have).

The risk for you, though, is scratching the casing or dropping the computer. I dropped my MBP the day after I bought it and put a slight bend into the bottom of its thin screen. So you'd have to put it on a desk and leave it there!!!

Apple don't let you go to their store and run tests there, do they? So unless you find a cooperative dealer, the only option left is the buy-and-return one.
 
Last edited:

Homy

macrumors 68030
Jan 14, 2006
2,510
2,461
Sweden
Watch the 165W RTX 3080, with its huge 300W power brick, getting destroyed when unplugged (as a true laptop should be used), even in performance mode, while using 4 times more battery power: only 29 fps vs 92 for the M1 Max running a Rosetta game.
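Taking the post's figures at face value (the "4 times more battery power" claim is the poster's, and no absolute wattages are given, so this is a ratio-only sketch), the efficiency gap works out roughly like this:

```python
# Back-of-the-envelope perf-per-watt from the on-battery numbers above.
fps_m1_max = 92
fps_rtx_laptop = 29
power_ratio = 4  # RTX laptop battery draw relative to the M1 Max, per the post

fps_ratio = fps_m1_max / fps_rtx_laptop        # ~3.2x the frame rate
perf_per_watt_ratio = fps_ratio * power_ratio  # ~12.7x the efficiency
print(f"{fps_ratio:.1f}x fps at 1/{power_ratio} the power "
      f"-> ~{perf_per_watt_ratio:.0f}x perf/W")
```

Even if the power ratio is off by a factor of two, the perf-per-watt gap on battery would still be several-fold.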

 
Last edited:
  • Like
Reactions: jdb8167

Surne

macrumors member
Sep 27, 2020
76
57
Watch the 165W RTX 3080, with its huge 300W power brick, getting destroyed when unplugged (as a true laptop should be used), even in performance mode, while using 4 times more battery power: only 29 fps vs 92 for the M1 Max running a Rosetta game.

Performance mode is auto-disabled on battery.

Source: me. I own a Legion 5 Pro with a full-powered RTX 3070. The Legion 7 uses the exact same Lenovo Vantage software as my Legion 5 Pro.
 