
Pugly

macrumors 6502
Jun 7, 2016
411
403
I doubt there will ever be another jump in performance as big as these M1 chips. Judging by the A-series processors' year-to-year improvements, each new generation is going to be 5-20% faster, maybe more on the GPU side.

The Air is about three times faster than the last generation, and everything else is twice as fast or more. That's on top of the lower heat, lower energy use and better battery life, which are arguably more important.
 

EntropyQ3

macrumors 6502a
Mar 20, 2009
718
824
Process technology jumps aren't what they used to be. TSMC 3nm will bring improvements to logic density, a bit in power/performance, but very little in SRAM density and of course I/O.
It will be a rather shallow step, historically speaking, not a jump.
The next step will bring a new transistor design often referred to as GAA (Gate All Around). This will be interesting to us nerds, but it is not yet clear what real-world improvements it will bring to the table. Personally I'm most excited by a proposed new SRAM design that could help with density, but that is a ways out even if everything pans out.
What the cool kids are focusing on is packaging/interconnects and the improvements they can bring (Apple's M1 Ultra being an example). But that doesn't really address power draw, nor of course cost.
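A quick back-of-the-envelope shows why uneven scaling makes the step shallow. Suppose, purely for illustration (assumed fractions and scaling factors, not TSMC's published figures), a die is 60% logic and 40% SRAM, with logic density improving 1.6x but SRAM only 1.05x:

\[
\frac{A_{\text{new}}}{A_{\text{old}}} = \frac{f_{\text{logic}}}{s_{\text{logic}}} + \frac{f_{\text{SRAM}}}{s_{\text{SRAM}}} \approx \frac{0.6}{1.6} + \frac{0.4}{1.05} \approx 0.76
\]

That is only about a 24% overall area reduction, even with a headline 1.6x logic-density gain, because the SRAM barely shrinks.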
 
Last edited:

Born2Run

macrumors 6502
Nov 27, 2010
261
612
Hove
The 5nm tech in the M1, M1 Max and M1 Ultra is now old tech. We are on the cusp of 3nm M2s. That is a 40% decrease in size and a significant boost to performance and energy savings. The last change, from 7nm to 5nm, was less than a 30% decrease in fabrication size. The iPhone 12 has a 5nm chip, and that is almost two years old now.

Paying $4,000 or $8,000 for a 5nm-chip computer right now is probably a bad idea. When the 3nm rollout comes later this year, the longevity of those chips will be significantly better. Also, we are running into the constraints of Moore's law and will probably not see 1nm chips for several years.
I’m refusing to buy any new technology until they release 0.0000000000000000000000000000000000001nm chips… How dare Apple sell perfectly good technology now when it’s obvious those sneaky buggers are probably working on even betterer chips!
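For what it's worth, the quoted "40%" only falls out of the arithmetic if you treat the node names as literal dimensions, which they are not; "5nm" and "3nm" are marketing labels. Taken at face value anyway, as an illustration:

\[
\frac{3\,\text{nm}}{5\,\text{nm}} = 0.6 \;\Rightarrow\; \text{a 40\% linear shrink}, \qquad 0.6^2 = 0.36 \;\Rightarrow\; \text{a 64\% area shrink}
\]

The density gains foundries actually quote for a 5nm-to-3nm step are considerably smaller than that naive square-law number.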
 

marco114

macrumors 6502
Jul 17, 2001
440
458
USA
It just comes down to whether you need a computer now or later. I just replaced my mid-2015 MacBook Pro 15 Retina. It lasted seven years. That's pretty amazing.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Purchase what you need, when you need it, for the work at hand. Don't over-spec; replace when newer hardware can increase profit or offer a significant difference to the user experience. Those purchasing top-spec Macs are likely monetising them, with the ROI being significant.

Waiting on the next best thing is a fool's errand, as it never ends. You can also get burnt, as with the hapless 2016 MBP redesign...

Q-6
This. There are other things you gain. I kept my 2010 Mac Pro with 8GB of RAM for far too long, even though it still sits there now as just a compression box. While it still performed the work at the same speed as a new system (I compared it to a 2019 i9 iMac with 128GB of RAM and a Vega GPU; it seems we had plateaued in H.264 1080p export speeds for a while, until Apple's latest M1 releases), I was still stuck with SATA 2, PCIe 2, USB 2.0, DDR3 RAM and so on. I could fix some of that with PCIe cards, but my point is that I'm not just upgrading for more RAM: I also get a modern CPU and GPU, a new generation of RAM, SSDs that hit 7 GB/s now, and plenty more.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Exactly. A $600 machine can turn six figures in the right hands, while a $6K Mac can equally be obsolete in short order with some workflows. For those utilising their Macs professionally, the cost should hopefully be irrelevant. For the consumer/prosumer it's a different situation; that said, I maintain: buy what you need, when you need it, at the lowest price point that meets your needs. Apple is a master class in upselling its HW, a point well worth considering.

The current crop of Apple Silicon Macs are devastatingly fast and will remain relevant for a good while. I've only got a base model M1 MBP as that's all I need; better the $$$$ in my pocket than Apple's :) Should I need more, to the point of capitalising on more performance, it would be just a few clicks away, nor would I for one instant worry about the next best thing...

Q-6
True story: I know a local photography studio that still uses PowerPC Macs, and their most powerful one has only 1GB of RAM. They use Adobe Creative Suite 2. Another photography studio I know has more modern hardware, but it's old Lenovo desktops from around 2013 with only 6GB of RAM, running Creative Suite 4. It still suits both companies just fine. Man, I do miss the days of buying software and using it 10+ years later! Creative Suite 2 was released in 2005 and that one company is STILL using it! Can't do that today, as it's a monthly cost!

So you are exactly right in that a $600 machine can still be used for monetary gain.
 
  • Like
Reactions: Bob_DM and Queen6

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Just buy a computer when you need a computer. Regardless of what nm we're on, the M1 Macs are a significant step up over their Intel counterparts. Do you have a good enough computer now? Then wait and upgrade to the M2 or M3 or M17 or whatever you need. If you need one now, then waiting for an M2 (or more likely M3) Ultra is a stupid move, because they're literally years away from release.
Yeah, nobody asks "what nm is that processor?" If I need a high core count, a lot of video encoders/decoders and a lot of GPU cores, it still makes sense for me to get the 5nm M1 Ultra over a brand new shiny BASE 3nm M2. You cannot really compare across generations with things like this. It's like saying an X+1-gen Intel i3 is better than an X-gen i9 because it's built on a smaller node!
 
  • Like
Reactions: Queen6

romanof

macrumors 6502
Jun 13, 2020
361
387
Texas
Speak for yourself, I’m personally refusing to upgrade from my 66 MHz Power Mac 6100 until I can be certain there won’t be any technological advances after I buy my next Mac!
Reminds me of a salesman in our office way, way back. When the IBM PC came out, he dithered about buying one, but then the XT appeared as an upgrade. Then the AT, the PS/2... When I left the company in 1992 he was still refusing to purchase a PC until the final, best one was announced.

I suspect he is still waiting.
 

Queen6

macrumors G4
True story: I know a local photography studio that still uses PowerPC Macs, and their most powerful one has only 1GB of RAM. They use Adobe Creative Suite 2. Another photography studio I know has more modern hardware, but it's old Lenovo desktops from around 2013 with only 6GB of RAM, running Creative Suite 4. It still suits both companies just fine. Man, I do miss the days of buying software and using it 10+ years later! Creative Suite 2 was released in 2005 and that one company is STILL using it! Can't do that today, as it's a monthly cost!

So you are exactly right in that a $600 machine can still be used for monetary gain.
It's a marketing game; renting content only serves the provider, not the end user. In many respects a lot of SW is way beyond the average user's needs, outside of those who really employ it full-time as part of their work. If a dev delivers a solid paid upgrade, I'll update in a heartbeat. If the dev opts for the "rental" scenario instead, I'll drop them like a stone...

Exactly the same applies to HW; faster silicon won't automatically increase productivity unless you know how to apply it. One of the best computing deals I've ever had is a lowly Acer Switch 5: i3, 4GB RAM & 128GB SSD. It only cost around $600, bought on a whim, yet it still remains in the rotation as the computer delivers well beyond expectation.

Most, I feel, don't fully understand the computing power on hand or how companies operate. I've even seen some nonsense about how the original M1 is no longer relevant, LOL. The Asus 17" W10 PC I'm currently on has an 8th-gen hex-core i7 and remains a capable desktop replacement. My base M1 13" MBP beats the Asus handsomely, yet some consider the M1 to be slow...

Q-6
 
Last edited:

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
You sure existing lasers can handle the tighter tolerances?
They are using them right now. Not sure what you’re saying, unless you are worried about two or three nodes from now. But, of course, if you need a new kind of laser for a new node, why would it need neon? If it needs neon, then it’s the same laser as they use now. (The detail you can resolve using the laser is based on the wavelength of the light, which is in turn based on the gas.)
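(For reference, the usual first-order relation here is the Rayleigh criterion; the k1 and NA values below are typical textbook numbers, not any particular scanner's spec.)

\[
\text{CD} = k_1\,\frac{\lambda}{\text{NA}}, \qquad \text{EUV: } 0.4 \times \frac{13.5\,\text{nm}}{0.33} \approx 16\,\text{nm}, \qquad \text{ArF immersion: } 0.4 \times \frac{193\,\text{nm}}{1.35} \approx 57\,\text{nm}
\]

Neon only enters the picture as the buffer gas in the DUV excimer (ArF/KrF) lasers; EUV light comes from a tin plasma driven by CO2 lasers, so a hypothetical new-wavelength tool wouldn't be neon-bound anyway.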
 

rafark

macrumors 68000
Sep 1, 2017
1,839
3,212
The 5nm tech in the M1, M1 Max and M1 Ultra is now old tech. We are on the cusp of 3nm M2s. That is a 40% decrease in size and a significant boost to performance and energy savings. The last change, from 7nm to 5nm, was less than a 30% decrease in fabrication size. The iPhone 12 has a 5nm chip, and that is almost two years old now.

Paying $4,000 or $8,000 for a 5nm-chip computer right now is probably a bad idea. When the 3nm rollout comes later this year, the longevity of those chips will be significantly better. Also, we are running into the constraints of Moore's law and will probably not see 1nm chips for several years.
They're blazing fast. Totally worth it right now if you can afford them.
 

DJJAZZYJET

macrumors 6502
Jun 4, 2011
461
144
Wouldn't say 5nm is 'old tech'. I doubt Apple will be releasing any M2 chips this year, because they just don't need to. They already beat out the competition, and Apple can afford to drag its heels. When they do release the M2, I don't think they'll drop the M2 Pro/Max alongside it; those will be delayed even further down the line.
 

myhaksown

macrumors member
Feb 6, 2012
79
105
The 5nm tech in the M1, M1 Max and M1 Ultra is now old tech. We are on the cusp of 3nm M2s. That is a 40% decrease in size and a significant boost to performance and energy savings. The last change, from 7nm to 5nm, was less than a 30% decrease in fabrication size. The iPhone 12 has a 5nm chip, and that is almost two years old now.

Paying $4,000 or $8,000 for a 5nm-chip computer right now is probably a bad idea. When the 3nm rollout comes later this year, the longevity of those chips will be significantly better. Also, we are running into the constraints of Moore's law and will probably not see 1nm chips for several years.
You do realize Intel and AMD are using a 7nm process for their latest processors? That could change, though, as AMD is supposedly releasing new CPUs next month.

The best way to look at it, in my opinion, is this: if the CPU outperforms anything comparable in other companies' lineups, then the process doesn't matter. If the CPUs on the market are all using 7nm and this one is using 5nm, but 4nm is ready, that doesn't mean it's obsolete. It just means we haven't started using the new process yet. It's always going to be like this.

Perhaps this is a better example: USB 3 came out and didn't get any meaningful adoption for a few years. I didn't skip buying a computer because its USB ports were 'outdated'; I bought the computer because it performed exceptionally well for its time.
 

collin_

macrumors 6502a
Nov 19, 2018
583
888
Are you kidding? Price/performance-wise these kill Windows desktops. I could not configure a Threadripper PC with equivalent performance, memory, fast SSDs etc. for under $6,000. So $4,000 for an Ultra Mac vs $6,000 for a Threadripper. Hmmmm, which is less? I can't figure it out.

And no, don't pretend a slower Threadripper with crappy SSDs and not enough RAM is equivalent; it is not. But maybe you will have better luck. Send along the link.
Wait, what? Why are you talking about Threadripper CPUs when my post was about an i9-12900K?
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
If M2 has hardware ray-tracing acceleration, that might be worth waiting for, from a 3D/DCC perspective...?
That was my stretch prediction in the other place. I know they've been working on it for a while, just not sure if they are ready, and since the A15 didn't have it, maybe they will wait one more generation.
 
  • Like
Reactions: Boil

nquinn

macrumors 6502a
Jun 25, 2020
829
621
You do realize Intel and AMD are using a 7nm process for their latest processors? That could change, though, as AMD is supposedly releasing new CPUs next month.

The best way to look at it, in my opinion, is this: if the CPU outperforms anything comparable in other companies' lineups, then the process doesn't matter. If the CPUs on the market are all using 7nm and this one is using 5nm, but 4nm is ready, that doesn't mean it's obsolete. It just means we haven't started using the new process yet. It's always going to be like this.

Perhaps this is a better example: USB 3 came out and didn't get any meaningful adoption for a few years. I didn't skip buying a computer because its USB ports were 'outdated'; I bought the computer because it performed exceptionally well for its time.
EDIT: I was wrong.

Intel 7 is at around 100 MTr/mm² vs. TSMC's 171 MTr/mm².

Looks like they will catch up with Intel 4 in 2H of this year.
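Taking those vendor-quoted peak densities at face value (rough numbers, so treat this as ballpark only):

\[
\frac{171\ \text{MTr/mm}^2}{100\ \text{MTr/mm}^2} \approx 1.7\times
\]

i.e. roughly a 1.7x density advantage for TSMC's 5nm-class process over Intel 7.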
 
Last edited: