
salamanderjuice

macrumors 6502a
Feb 28, 2020
580
613
I would not be surprised if Intel were still on 14nm, making up some BS explanation for why they are still there while Apple has 2nm iPhone chips in 2024.

Intel stayed at 14nm to save on manufacturing costs. From 2006 to 2020 they had all the PC OEMs as customers. Where's the incentive to improve when you have a monopoly? Shareholders are happy because their dividends go up, and Intel management gets their bonuses.

Apple's quarter-billion annual iPhone chip orders to TSMC are the only reason the M1 chips were possible. AMD and Intel together ship about a quarter-billion PC chips annually, and that was already the case back when Apple was still using Intel chips.

Intel is in danger of being rendered impotent if Windows 11 on ARM gets momentum. Imagine chips second only to the iPhone's in a Windows machine.

The only reason you'd go x86 is legacy software. Give it a decade after new Windows machines are 80% ARM and Intel becomes a shadow of its former self.
It was definitely manufacturing troubles, bud. You can see it in the way they first "launched" 10nm chips as supremely supply-limited, tiny, low-power mobile parts, and in the fact that they had to backport their 10nm designs to 14nm. You don't do that for fun, and you don't do that while your competitor is nipping at your heels.

They did rest on their laurels a bit while AMD was down, which is why consumers were stuck on four cores max for so long, but they definitely had 10nm manufacturing troubles.
 

Zest28

macrumors 68030
Jul 11, 2022
2,581
3,932
You are actually only counting cores and MHz... As far as I understand, the power draw is the same or less despite the larger chip and increased performance. So who cares how large the chip is? In the end we got a 20-40% performance increase between two generations (there must be some GPU architecture advances involved to explain this). Read Intel's history and then you will understand how amazing this is.

The M2 MBA's power draw and heat are much higher than the M1 MBA's. So what makes you think it is any different for the M2 Pro?

Higher clock speeds + more cores = more heat + more power draw. It's why the M2 MBA thermally throttles so much more than the M1 MBA.
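
For what it's worth, the first-order reason higher clocks are so costly: dynamic power scales roughly with C·V²·f, and higher clocks usually need higher voltage, so power grows faster than frequency does. Here is a minimal sketch with purely illustrative numbers (nothing below is a measured M1/M2 figure):

```python
# Rough first-order CMOS dynamic power model: P_dyn ~ C * V^2 * f.
# The voltage/frequency numbers are purely illustrative assumptions,
# not measured M1/M2 figures.

def dynamic_power(cap: float, volts: float, freq_ghz: float) -> float:
    """Relative dynamic power in arbitrary units."""
    return cap * volts ** 2 * freq_ghz

base = dynamic_power(cap=1.0, volts=0.90, freq_ghz=3.2)   # hypothetical "M1-like" operating point
fast = dynamic_power(cap=1.0, volts=0.95, freq_ghz=3.5)   # hypothetical "M2-like" operating point

# ~9% more clock (plus ~6% more voltage) costs ~22% more dynamic power,
# which is why the higher-clocked part runs noticeably hotter.
print(f"clock: +{3.5 / 3.2 - 1:.0%}, dynamic power: +{fast / base - 1:.0%}")
```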

I care about it: if I want a laptop with fan noise under heavy loads, I might as well buy a PC laptop. That's the great thing about the current 16" M1 Max MacBook Pro, the fans almost never turn on.
 
Last edited:

kasakka

macrumors 68020
Oct 25, 2008
2,389
1,073
Flagship Intel/AMD CPUs are normally used in high-end gaming machines, where you will have a high-end graphics card that eats even more power, so you will need a 1000W power supply anyway. This is what helps them in that regard.

I'm happy with the Apple Silicon transition; I think it was the best thing Apple has done in years. Not only did they manage to pack in a tremendous CPU performance/consumption ratio, but also great graphics performance, so it's really awesome.
That's a bit off the mark. My gaming PC is an Intel 13600K + PNY GeForce RTX 4090 in a Cooler Master NR200P, powered by a Corsair SF750 750W Platinum SFX-size power supply. It runs cool and quiet. The whole computer is something like two Mac Studios stacked in size, and it could be even smaller with a different case and CPU cooler. The cost is still far less than a top-spec Mac Studio.

Apple Silicon is absolutely worthwhile for more compact, portable devices though. I can't wait to trade my 2019 Intel i9 MacBook Pro for an M2 Max model, because the Intel one is hot, noisy, and underperforming.

I don't care that much about non-upgradeable RAM and disk space on a laptop, but having those limitations on a Mac Studio or Mac Mini is inexcusable. I could live with the RAM given its unique design, but omitting a standard M.2 NVMe slot is ridiculous and exists only so that you have to buy Apple's overpriced storage.

I paid about 230 euros for a 2 TB Samsung 980 Pro drive for my PC, yet getting the same disk space on a Mac Studio costs an extra 690 euros! GTFO, Tim Cook.
 

MauiPa

macrumors 68040
Apr 18, 2018
3,438
5,084
To paraphrase Forrest Gump, "silly is as silly says, ma'am." Certainly, Intel makes some high-powered (i.e. hot) i9s, but I am having trouble finding them in laptops. Why is that? Too hot, too power hungry? I haven't found the current generation (HP doesn't even offer i9s in laptops). At Dell (I'd never buy a Dell, I've had too much bad luck with them) there are some Alienware models with paltry battery life that are very heavy, using the i9-12900HK, which is not as fast as the M2 Pro and Max. These are pretty darn expensive too, so much for the haters' theory that Apples are overpriced (premium priced, sure). I'll admit that the Alienware carries a lot of extra weight and heat to support discrete graphics cards, but if the Intel chips are so good, why is that?

Anyway, Apple Silicon is, was, and remains awesome for a lot of seriously valid reasons. I totally get it if you are stuck on Windows (I personally did my time on Windows and will never go back), or are a serious gamer (which already means stuck on Windows, yeah?). I get why you want to stay; just don't expect comparable performance, weight, and battery life for a giveaway price. It doesn't exist.
 

jimmirehman

macrumors 6502a
Sep 14, 2012
519
384
The problem I see is upgradability. With memory and storage unified and no room for expansion, prices of these units should be way lower, because the tech is basically disposable now.

Mac mini should be sub $500.
$299: 8GB / 256GB
$399: 16GB / 256GB
$499: 16GB / 512GB

I’m going to spend over $1000 for a Mac Mini with 16GB and a 1TB SSD. And then that’s it. I can’t add anything more down the line.

I’m a fanboy but this is a fail.

I’m all for unification, but allow additional expansion beyond the unified memory and storage.
 

Unami

macrumors 65816
Jul 27, 2010
1,446
1,724
Austria
Comparing "nanometers" is an apple to oranges comparison and not really representative. E.g. Intel managed to get 242 million transistors on a square millimeter with it's 7nm process (formerly called "Enhanced 10nm SuperFin"), while tsmc "only" does 147 million transistors per mm2 with it's 5nm process. There are other factors like fin pitch, min metal pitch, cell height and gate pitch to consider. Nowadays "x-nm process" has become a marketing term .
 
Last edited:
  • Like
Reactions: Basic75

Spaceboi Scaphandre

macrumors 68040
Jun 8, 2022
3,414
8,106
I dunno what the hell Macworld is talking about. The thrill of Apple Silicon is still very much there, given what monsters these chips are and the absurdly long battery life they bring. The M1 is the reason I switched to Mac in the first place, and it's made so many people around me who hated Macs for years get one for themselves, since the chips give them a far better laptop experience than their Windows counterparts did.
 

hfvienna

macrumors newbie
Sep 10, 2015
8
9
Vienna, Austria
It is a pity that people who don't have the slightest clue about technology are allowed to write such an article. The steps Apple has made in the last 3 years are nothing less than revolutionary, and it must be accepted that physics and chemistry are sometimes hard to bend. Advantages like those already mentioned here by others take dramatic effort, and little TikTokers hungry for sensation don't get how big the struggle in the background is. It is sad that such comments drag this huge success down into everyday business and show no respect for the fact that sometimes we have to wait a bit for the next jump. A Japanese manager once explained it to me as a dance floor that naturally follows every steep slope. Don't let uninformed youngsters spoil the joy we have in Apple's progress.
 
  • Like
Reactions: MacPowerLvr

GumaRodak

macrumors 6502a
Mar 14, 2015
583
362
The future of gaming is streaming from the cloud. Why invest $4,000 in a good gaming machine if you can play on a ***** computer for $300? High-performance computers will be sold only to those who really need them.
 

MajorFubar

macrumors 68020
Oct 27, 2021
2,174
3,825
Lancashire UK
It bugs me that some (actually, many) people condemn or canonize a computer depending on how well it plays games. I'm not telling you what you should do in your spare time, but frankly that's a ridiculous metric. Grow up.
 
  • Like
Reactions: MacPowerLvr

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
The future of gaming is streaming from the cloud. Why invest $4,000 in a good gaming machine
I won't disagree that streaming is probably the future, but it's nowhere near here yet. Also, I have a decent gaming rig that plays nearly all games, and it did not cost me 4,000 dollars. More like 1,500. Granted, this was a few years ago, but most people don't need an RTX 4090.

One thing that has largely held streaming back is network bandwidth. If you don't have the fastest broadband, or there's high latency, it's not feasible. I've tried a number of services, and even with good broadband it was not a very enjoyable experience.
 
  • Like
Reactions: Mackilroy

Philip Turner

macrumors regular
Dec 7, 2021
170
111
There are parts of the world where $/kWh is much higher than where you live. As such, halving the power consumption of a device that is turned on for at least 12 hrs/day, 365 days a year, for the next decade helps lower power bills. If you only see it as marketing nonsense, then you are not as smart as you think you are.
Let's fact-check this. In the US, electricity costs $0.16/kWh and in Germany, $0.48/kWh. Let's use a 32-core M1 Max GPU at 75% ALU utilization (40 W), constantly running computationally intensive simulations for 12 hrs/day.

40 W x 12 hrs/day x 365 days/year x 1 kW/1000 W = 175.2 kWh/year, which at $0.16/kWh is $28/year in the US and at $0.48/kWh is $84/year in Germany.

Most people aren't running their computers at full tilt every day of the year. Let's say an Intel/NVIDIA system is 3x less power efficient: $84/year US, $252/year Germany (generous). If someone upgrades their $1,500 system every 10 years, that amortizes to $150/year (conservative). The cost of buying the equipment exceeds the lifetime energy cost, even with the most unfair estimates of energy consumption.
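
For anyone who wants to re-run or tweak the arithmetic, here is a minimal sketch of the same estimate (the 40 W draw, 12 hr/day duty cycle, tariffs, and 3x efficiency penalty are all the assumptions stated above, not measured data):

```python
# Reproduces the back-of-the-envelope estimate above. All inputs (40 W draw,
# 12 hr/day duty cycle, tariffs, 3x efficiency penalty) are assumptions from
# the post, not measurements.
def annual_energy_cost(watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

for region, tariff in [("US", 0.16), ("Germany", 0.48)]:
    apple = annual_energy_cost(40, 12, tariff)        # M1 Max GPU at ~40 W
    pc = annual_energy_cost(40 * 3, 12, tariff)       # assumed 3x less efficient system
    print(f"{region}: ~${apple:.0f}/year vs ~${pc:.0f}/year")
# US: ~$28/year vs ~$84/year
# Germany: ~$84/year vs ~$252/year
```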
 
  • Like
Reactions: bobcomer

Sydde

macrumors 68030
Aug 17, 2009
2,563
7,061
IOKWARDI
You realize that the Mini predates the M1 chip, and the M1 Studio is a chonky boy where I don't see any advantages that the M1 gives us in that form factor.

Well, the Mini has not included a graphics card since the G4 days. The graphics is whatever is built into the CPU. So, if you want to do some heavy lifting, the base M1 appears to score a bit more than twice as high in Metal as the Intel on-chip graphics, and the M1 Pro around 4x (no reliable M2 Pro scores that I can see so far). Compared to top-of-the-line cards, the M-series has some catching up to do in terms of raw performance, if that is your big need. The AS Mini looks a good deal better all around compared to the i3~i7 line. And mostly quieter, AIUI.
 

Andropov

macrumors 6502a
May 3, 2012
746
990
Spain
Well, the Mini has not included a graphics card since G4 days. The graphics is whatever is built into the CPU.
The first Intel Minis also had discrete graphics cards. I believe the last one was the 2011 Mac Mini with the Radeon HD 6630M.
 
  • Like
Reactions: maflynn

TechnoMonk

macrumors 68030
Oct 15, 2022
2,604
4,111
I agree with this. On a laptop it does make a difference. On a desktop, maybe if you're environmentally conscious or have multiple computers running, but I can't imagine saving much money compared to the other power-hungry devices at home.


I wish Apple would get into gaming. Ease of repair isn't guaranteed on a Windows laptop either. Many lower-end Windows laptops have soldered RAM chips. I think this is the future regardless of the OS.


I hope you're right, but I don't know if Apple is going to make a larger iMac with all the current economic and other issues going on. It would be risky to design and make something like this if it didn't sell. I suspect the starting price would be around 2k USD.


What are you doing where you think you're going to experience this mind-blowing performance? Maybe if you edit video or run benchmarks, but for what the average person does, I think we've reached a plateau. Sure, compared to a ten-year-old computer it's going to seem crazy fast, but if you put an M1 and this M3 side by side, I don't think I could tell them apart doing things like opening a webpage, creating an MS Word doc, or even scrolling through photos.
This "power doesn't matter on desktops and workstations" meme should die already. Have you used the latest AMD/Intel/Nvidia chips outside of gaming? Those things get hot under sustained load and need to be under-volted and under-clocked by 20-30% to prevent throttling and destroying the PSU. These benchmarks of overclocked short bursts are useless. I will take an efficient processor that can run at its maximum performance level any day.
Saving money on power is nice, but for me the performance drop is the bigger problem.
 
  • Like
Reactions: MacPowerLvr

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
Let's fact-check this. In the US, electricity costs $0.16/kWh and in Germany, $0.48/kWh. Let's use a 32-core M1 Max GPU at 75% ALU utilization (40 W), constantly running computationally intensive simulations for 12 hrs/day.

40 W x 12 hrs/day x 365 days/year x 1 kW/1000 W = 175.2 kWh/year, which at $0.16/kWh is $28/year in the US and at $0.48/kWh is $84/year in Germany.

Most people aren't running their computers at full tilt every day of the year. Let's say an Intel/NVIDIA system is 3x less power efficient: $84/year US, $252/year Germany (generous). If someone upgrades their $1,500 system every 10 years, that amortizes to $150/year (conservative). The cost of buying the equipment exceeds the lifetime energy cost, even with the most unfair estimates of energy consumption.
Even then it is still a cost savings. That's still money in your pocket, whether it be $0.10, $1.00, $100.00, or $1,000.00.

How much you earn is not what's important; how much you save and invest is.

People whine about fuel prices whenever the price of a barrel of oil goes up, and yet most car owners do not use their cars 12 hrs/day.

And using your metrics, would the cost of a car be lower than its fuel over 10 years?
 
Last edited:

TechnoMonk

macrumors 68030
Oct 15, 2022
2,604
4,111
Let's fact-check this. In the US, electricity costs $0.16/kWh and in Germany, $0.48/kWh. Let's use a 32-core M1 Max GPU at 75% ALU utilization (40 W), constantly running computationally intensive simulations for 12 hrs/day.

40 W x 12 hrs/day x 365 days/year x 1 kW/1000 W = 175.2 kWh/year, which at $0.16/kWh is $28/year in the US and at $0.48/kWh is $84/year in Germany.

Most people aren't running their computers at full tilt every day of the year. Let's say an Intel/NVIDIA system is 3x less power efficient: $84/year US, $252/year Germany (generous). If someone upgrades their $1,500 system every 10 years, that amortizes to $150/year (conservative). The cost of buying the equipment exceeds the lifetime energy cost, even with the most unfair estimates of energy consumption.
My 100 W M1 Max is pretty close in performance to an Nvidia/AMD workstation drawing 1300 W. It would make a big difference if only Apple had better support for libraries and software.
Edit: Not to mention the amount of money spent on cooling these things.
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
My 100 W M1 Max is pretty close in performance to an Nvidia/AMD workstation drawing 1300 W. It would make a big difference if only Apple had better support for libraries and software.
Edit: Not to mention the amount of money spent on cooling these things.

Gamers put more importance on ease of repair/upgrades, modularity of the computer, and the gaming library than on performance per watt, power consumption, quietness of the system, and waste heat.

Most comparison reviews of Apple chips vs Intel chips rarely mention power consumption, or don't emphasize it enough.

Hence Intel had little to no fear of sticking with 14nm from 2014-2020, only moving their ass from 2021 onward because Apple went 5nm on them.

Like back during the first month after the Mac Studio M1 Ultra's release: Apple claimed the Ultra's iGPU had the same performance as an Nvidia RTX 3090 dGPU. PC gamers were in disbelief and furious... then celebrated when one or a few benchmarks disproved Apple's claim, with the Ultra falling short by less than 20%.

Did anyone mention that the fully loaded Mac Studio M1 Ultra accomplished this while drawing less than 215W, with a thermal output of less than 734 BTU/h?

The RTX 3090 dGPU on its own, without any other part or component, consumes between 355W and 365W.

So I could operate two Mac Studio M1 Ultras on only a little more power than a single RTX 3090 dGPU consumes by itself.
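
For reference, the 734 BTU/h figure is just a unit conversion from watts (1 W is about 3.412 BTU/h); here is a quick check using the numbers quoted above:

```python
# Unit-conversion check on the figures quoted above (215 W for a fully loaded
# Mac Studio M1 Ultra, 355-365 W for an RTX 3090 card on its own).
W_TO_BTU_PER_HOUR = 3.412

studio_ultra_watts = 215
rtx_3090_watts = (355, 365)

print(f"Mac Studio M1 Ultra: ~{studio_ultra_watts * W_TO_BTU_PER_HOUR:.0f} BTU/h")   # ~734 BTU/h
print(f"Two Mac Studios: {2 * studio_ultra_watts} W vs one RTX 3090 alone: "
      f"{rtx_3090_watts[0]}-{rtx_3090_watts[1]} W")
```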

It does not enter the equation either because power is that cheap for them or because they never consider it as part of the purchase.

It is equivalent to US vehicle owners buying the biggest SUV or pickup they can afford and never caring about, or even glancing at, its rated fuel consumption. It only crops up when Putin starts a war or the Middle East descends into conflict.
 
Last edited:

TechnoMonk

macrumors 68030
Oct 15, 2022
2,604
4,111
Gamers put more importance on ease of repair/upgrades, modularity of the computer, and the gaming library than on performance per watt, raw performance, power consumption, quietness of the system, and waste heat.
I am not a PC gamer; I will use an Xbox or PlayStation when I am gaming. With AI going mainstream, GPU and unified memory, along with sustained performance, could become more important. Apple has a huge opportunity here if they can get their software support and strategy right.
 
  • Like
Reactions: George Dawes

russell_314

macrumors 604
Feb 10, 2019
6,659
10,260
USA
This "power doesn't matter on desktops and workstations" meme should die already. Have you used the latest AMD/Intel/Nvidia chips outside of gaming? Those things get hot under sustained load and need to be under-volted and under-clocked by 20-30% to prevent throttling and destroying the PSU. These benchmarks of overclocked short bursts are useless. I will take an efficient processor that can run at its maximum performance level any day.
Saving money on power is nice, but for me the performance drop is the bigger problem.
I think it matters to some but not most. Some people with desktop computers overclock them and buy extra cooling hardware to make it possible. It’s not just short bursts. Of course, this is a small part of the total PC market.

These are people who want to tinker with their computer just because it gives them some enjoyment. It’s not a logical thing. It’s kind of like the Mac user who installs an app to monitor RAM usage. This app is totally counterproductive because it uses RAM, but it gives the person enjoyment and he can see that pretty graph.

I think it just depends on what you’re wanting. I think efficiency is a consideration for most people. I’m fairly confident that most people living outside their parents’ house would choose a more efficient device unless it had a significant performance disadvantage.
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
I am not a PC gamer; I will use an Xbox or PlayStation when I am gaming. With AI going mainstream, GPU and unified memory, along with sustained performance, could become more important. Apple has a huge opportunity here if they can get their software support and strategy right.
They gotta provide incentives for game developers to port their work to macOS.

It does not help that Apple's Metal requires a rewrite of what already works with AMD or Nvidia dGPUs.
 
  • Like
Reactions: tevion5 and Basic75

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
These are people who want to tinker with their computer just because it gives them some enjoyment. It’s not a logical thing. It’s kind of like the Mac user who installs an app to monitor RAM usage. This app is totally counterproductive because it uses RAM, but it gives the person enjoyment and he can see that pretty graph.

I think it just depends on what you’re wanting. I think efficiency is a consideration for most people. I’m fairly confident that most people living outside their parents’ house would choose a more efficient device unless it had a significant performance disadvantage.
I use MenuMeters to monitor my fiber connection's throughput and whether my ISP is broken that day. I do not use its CPU, GPU, RAM, and other monitoring functions because, as you said, they're counterproductive.

And I fully agree with you that most PC gamers with a 1kW or 2kW PSU are still living with their parents.

If they paid their own bills, they'd opt for a PC gaming desktop that sips as little power as a laptop.

If they realized how much their health was being impacted by staying sedentary in front of a keyboard for entertainment, they'd stop gaming.

In my home we used window ACs that were two decades old, built on old tech that was not inverter-based. I switched all three of them to inverter ductless mini-splits. Doing so cut my household's power consumption by over 70%, and yet usage went up from 8 hrs to 16 hrs a day.

I did something similar when I switched from a two-decade-old top-loading twin-tub washing machine to an inverter front-loading washing machine. Consumption of power, water, and detergent went down. Time to finish a wash dropped as well.

To me, Intel chips are like incandescent bulbs while Apple chips are 223 lm/W LEDs.

A $699 Lenovo ThinkPad E14 with a 7nm AMD chip ships with a 65W charger, while a $999 MacBook Air with the 5nm M1 ships with a 30W charger.

If Apple offered a MacBook at that price that was as easily repairable, and there were macOS versions of the Windows programs we use, then I'd be inclined to move our people to Macs as a cost-saving measure.
 
Last edited: