
opeter

macrumors 68030
Aug 5, 2007
2,709
1,619
Slovenia
Recently I changed my mighty PC for a M1 Mac Mini...
Of course the M1 is efficient; it is in another league. Comparing an M1 with a PC that has desktop components and a discrete GPU is like comparing a cat with a whale. Both have their uses.

Anyway, now try to run some games on your M1, like Crysis Remastered.
 
  • Like
Reactions: mi7chy

opeter

macrumors 68030
Aug 5, 2007
2,709
1,619
Slovenia
Macs in general aren't for gaming.

Yes, this thinking applies to the (latest) computers, especially those with Apple Silicon.

But it wasn't true before that. At least from 2006 onward you could use Boot Camp and install Windows on a Mac, if needed. Also, some older PowerPC games got a Universal binary release. Many AA/AAA games, even indies, had a native Mac port. The biggest boom was in the 2000s and then through the 2010s, up until the end of the decade.

The biggest problem was that many (most) of these games had only 32-bit versions or used 32-bit code. With macOS Catalina or newer, this segment died.

And let's not forget that the x86 Mac Pros could accept (and still accept) many different GPUs. And the last generations of x86 Macs could use external GPUs through Thunderbolt 2 and 3.

Whereas with the new Macs, especially the ARM ones with integrated graphics, you are limited to whatever Apple puts in there. So your only choice will be the games available in the Mac App Store* or on Apple Arcade (and how long will that service last?).

Remember my words: Apple will discontinue Rosetta 2 within two years at most.
The future macOS that comes in 2024 won't support Intel/x86-64 Mac applications anymore.

*Apple is trying, more or less, to "shut down" or rather restrict the Mac platform from downloading and installing external software, similar to what it does with iOS.

OP never mentioned gaming though.

Can you please elaborate: why would one need a GeForce RTX 2070, if not for gaming?

For work, integrated graphics are more than capable, or, if you need power for your workstation, there are professional-grade GPUs like the Nvidia Quadros or AMD Radeon Pro/FirePros with Adobe, Autodesk, DaVinci, etc. certifications, not to speak of the CAD/CAM/CAE vendors...
 
Last edited:

macguy2021

Suspended
Jun 2, 2021
101
148
Can you please elaborate: why would one need a GeForce RTX 2070, if not for gaming?

For work, integrated graphics are more than capable, or, if you need power for your workstation, there are professional-grade GPUs like the Nvidia Quadros or AMD Radeon Pro/FirePros with Adobe, Autodesk, DaVinci, etc. certifications, not to speak of the CAD/CAM/CAE vendors...

Perhaps that's a question better posed to the OP. I highly doubt it was for gaming because you would not buy any Mac for gaming, let alone one on ARM to replace a spec'd out gaming PC if you wanted to continue gaming. OP doesn't sound disappointed with the transition, either. So I'd say it wasn't for gaming, whatever the reason he had that GPU.

I'd say it was something of the case where it was for video editing or just to have a fast powerful PC. Plenty of people buy machines that are recommended to them that are overpowered for their needs. I don't see it as a problem. I've always been a "sledgehammer where a tack hammer would do" person when it comes to computer parts and specs. I always go for max CPU, RAM, SSD storage, etc on my MBPs. I think it futureproofs the device and I'm one that notices even the slightest bit of lag and where most people would find it trivial, to me those few seconds add up over time. Most aren't counting, however I am. If there is one thing I feel like I'm willing to overspend to prevent, it's slow computers/ devices or slow internet. I spend all day every day on a computer, tablet, or smartphone and it matters to me.

Again, all speculation as far as the use for that powerful of a GPU until the OP answers that question himself.
 

thunng8

macrumors 65816
Feb 8, 2006
1,032
417
Perhaps that's a question better posed to the OP. I highly doubt it was for gaming because you would not buy any Mac for gaming, let alone one on ARM to replace a spec'd out gaming PC if you wanted to continue gaming. OP doesn't sound disappointed with the transition, either. So I'd say it wasn't for gaming, whatever the reason he had that GPU.

I'd say it was something of the case where it was for video editing or just to have a fast powerful PC. Plenty of people buy machines that are recommended to them that are overpowered for their needs. I don't see it as a problem. I've always been a "sledgehammer where a tack hammer would do" person when it comes to computer parts and specs. I always go for max CPU, RAM, SSD storage, etc on my MBPs. I think it futureproofs the device and I'm one that notices even the slightest bit of lag and where most people would find it trivial, to me those few seconds add up over time. Most aren't counting, however I am. If there is one thing I feel like I'm willing to overspend to prevent, it's slow computers/ devices or slow internet. I spend all day every day on a computer, tablet, or smartphone and it matters to me.

Again, all speculation as far as the use for that powerful of a GPU until the OP answers that question himself.
He uses it for video editing. The M1 is super fast for that task, even beating a mobile Nvidia RTX 3080.

 
  • Like
Reactions: macguy2021

thunng8

macrumors 65816
Feb 8, 2006
1,032
417
Yes, this thinking applies to the (latest) computers, especially those with Apple Silicon.

But it wasn't true before that. At least from 2006 onward you could use Boot Camp and install Windows on a Mac, if needed. Also, some older PowerPC games got a Universal binary release. Many AA/AAA games, even indies, had a native Mac port. The biggest boom was in the 2000s and then through the 2010s, up until the end of the decade.

The biggest problem was that many (most) of these games had only 32-bit versions or used 32-bit code. With macOS Catalina or newer, this segment died.

And let's not forget that the x86 Mac Pros could accept (and still accept) many different GPUs. And the last generations of x86 Macs could use external GPUs through Thunderbolt 2 and 3.

Whereas with the new Macs, especially the ARM ones with integrated graphics, you are limited to whatever Apple puts in there. So your only choice will be the games available in the Mac App Store* or on Apple Arcade (and how long will that service last?).
That is an outright lie. You are not limited to the App Store or Apple Arcade on a Mac.
 

opeter

macrumors 68030
Aug 5, 2007
2,709
1,619
Slovenia
That is an outright lie. You are not limited to the App Store or Apple Arcade on a Mac.
Right now: maybe. But for how long? What will the situation be in two years? Apple is the one who wants to close/lock the system down even further.
 
Last edited:

thunng8

macrumors 65816
Feb 8, 2006
1,032
417
Right now: maybe. But for how long? What will the situation be in two years? Apple is the one who wants to close/lock the system down even further.
I'd say never. You cannot develop on a locked-down system. Developers, as well as the whole scientific/research community and the universities doing research, would abandon macOS forever if that were to happen.

There would be no need for WWDC - Apple's largest event - because no one could effectively develop on such a system.
 

Fravin

macrumors 6502a
Original poster
Mar 8, 2017
803
1,059
Rio de Janeiro, Brazil
Of course the M1 is efficient; it is in another league. Comparing an M1 with a PC that has desktop components and a discrete GPU is like comparing a cat with a whale. Both have their uses.

Anyway, now try to run some games on your M1, like Crysis Remastered.

Forget gaming. I make my living editing videos. And I was surprised that this tiny computer handles video editing as well as my PC did, sometimes faster. And it really wowed me that my power bill was cut down.

Thank you.
 

Fravin

macrumors 6502a
Original poster
Mar 8, 2017
803
1,059
Rio de Janeiro, Brazil
I'd say it was something of the case where it was for video editing or just to have a fast powerful PC.
You were right.

That GPU was used for faster rendering in video editing. Nvidia provides software (drivers) that offloads rendering tasks to the GPU cores.

(screenshot attachment)
 
  • Like
Reactions: macguy2021

tornado99

macrumors 6502
Jul 28, 2013
454
445
An M1 mini hits 40W peak power, and an Intel NUC hits 60W peak power.

If you ran them at 100% CPU all day, the M1 would use about 1 kWh and the NUC about 1.4 kWh. In reality they use far less, as they idle more than they run at peak.

A domestic fridge uses 2 kWh a day. An electric oven/grill far higher.

That's why I doubt the entire premise of this thread. There's nothing particularly magical about an M1 Mini compared to other products in its category, as far as your *total* electricity bill is concerned.
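The back-of-envelope arithmetic above can be sketched like this (a minimal sketch using this post's rough peak-power figures, not measured values):

```python
# Daily energy at sustained peak load, using the rough figures from this post.
HOURS_PER_DAY = 24

def daily_kwh(watts: float, hours: float = HOURS_PER_DAY) -> float:
    """Energy in kilowatt-hours: power (W) x time (h) / 1000."""
    return watts * hours / 1000

m1_mini = daily_kwh(40)    # 40 W peak, all day: ~0.96 kWh
intel_nuc = daily_kwh(60)  # 60 W peak, all day: ~1.44 kWh
fridge = 2.0               # typical domestic fridge, kWh/day

print(f"M1 mini: {m1_mini:.2f} kWh/day, NUC: {intel_nuc:.2f} kWh/day, fridge: {fridge:.1f} kWh/day")
```

Even the worst case (flat-out all day) keeps both machines under a fridge's daily consumption, which is the point being made here.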
 

thunng8

macrumors 65816
Feb 8, 2006
1,032
417
An M1 mini hits 40W peak power, and an Intel NUC hits 60W peak power.

If you ran them at 100% CPU all day, the M1 would use about 1 kWh and the NUC about 1.4 kWh. In reality they use far less, as they idle more than they run at peak.

A domestic fridge uses 2 kWh a day. An electric oven/grill far higher.

That's why I doubt the entire premise of this thread. There's nothing particularly magical about an M1 Mini compared to other products in its category, as far as your *total* electricity bill is concerned.
Not sure where you got 40W from. I can't get it to peak over 30W when stressing both the CPU and GPU.

In any case, the OP doesn't use a NUC, since it is horribly slow (at video editing) - he uses a desktop with a discrete graphics card that most likely peaks over 400W.
 
  • Like
Reactions: duervo

tornado99

macrumors 6502
Jul 28, 2013
454
445
It's not drawing 400W 24 hours a day though; it will be idling at around 10W. Let's say it ran at 400W for 4 hours a day: that's still only about 1.8 kWh, less than a fridge.
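The duty-cycle estimate works out like this (assuming, as above, 400 W under load for 4 hours and roughly 10 W idle for the remaining 20):

```python
# Duty-cycle estimate: heavy load for part of the day, idle for the rest.
def daily_kwh_mixed(load_w: float, load_h: float, idle_w: float,
                    hours_per_day: float = 24) -> float:
    """Daily energy in kWh for a load/idle split of the day."""
    idle_h = hours_per_day - load_h
    return (load_w * load_h + idle_w * idle_h) / 1000

# 400 W x 4 h + 10 W x 20 h = 1600 Wh + 200 Wh = 1.8 kWh
print(daily_kwh_mixed(400, 4, 10))
```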
 
  • Angry
Reactions: duervo

tornado99

macrumors 6502
Jul 28, 2013
454
445
From the angry faces it seems some people on here need to get acquainted with the difference between watts and joules (a kWh is really just another way of stating joules: 1 kWh = 3.6 MJ).
 
  • Like
Reactions: opeter

thunng8

macrumors 65816
Feb 8, 2006
1,032
417
It's not drawing 400W 24 hours a day though, it will be idling at 10W. Let's say it was running at 400W for 4 hours a day, that's still only 1.8kWh, less than a fridge.
Where did you get 10W from? Most desktops with a discrete GPU idle at around 70-100W. You only get less than 10W if it is turned off.

I do get your point though in comparison with other appliances.
 

lJoSquaredl

macrumors 6502a
Mar 26, 2012
522
227
I mean, my old Windows PC had a 500W PSU, and it was a pretty low-power computer in gaming terms. Since I switched to only MBAs and MBPs I'm well below that, probably 5-10W most of the day. I can get used to this, along with the fanless experience even under strenuous workloads :)
 