Yeah, I wouldn’t get the 3nm either… better to wait for the sub-atomic technology coming in 2035… but that’s old tech too if you think about what’s coming in 2050.
I’m refusing to buy any new technology until they release 0.0000000000000000000000000000000000001nm chips… How dare Apple sell perfectly good technology now when it’s obvious those sneaky buggers are probably working on even betterer chips!

The 5nm tech in the M1, M1 Max and M1 Ultra is now old tech. We are on the cusp of 3nm M2s. That is a 40% decrease in size and a significant boost to performance and energy savings. The last change was 7nm to 5nm, which was less than a 30% decrease in fabrication. The iPhone 12 has a 5nm chip and that is almost 2 years old now.
Paying $4000 or $8000 for a 5nm chip computer right now is probably a bad idea. When the 3nm rollout comes later this year, the longevity of those chips will be significantly better. Also, we are running into constraints with Moore's law and will probably not see 1nm chips for several years.
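(Side note on the figures quoted above: the 40% and sub-30% numbers are just the nominal linear shrink implied by the node names, and "nm" labels are largely marketing at this point rather than a real feature size. A quick back-of-the-envelope check, as a throwaway Python sketch with made-up helper names, purely illustrative:)

```python
# Nominal shrink implied by the marketing node names. "7nm", "5nm", "3nm"
# no longer correspond to a physical dimension, so treat this as rough
# marketing math, not geometry.

def linear_shrink(old_nm: float, new_nm: float) -> float:
    """Percent reduction in the nominal linear dimension."""
    return (old_nm - new_nm) / old_nm * 100

def area_shrink(old_nm: float, new_nm: float) -> float:
    """Percent reduction in nominal area, if the label scaled both axes."""
    return (1 - (new_nm / old_nm) ** 2) * 100

print(f"7nm -> 5nm: {linear_shrink(7, 5):.1f}% linear, {area_shrink(7, 5):.1f}% area")
print(f"5nm -> 3nm: {linear_shrink(5, 3):.1f}% linear, {area_shrink(5, 3):.1f}% area")
# 7nm -> 5nm: 28.6% linear, 49.0% area
# 5nm -> 3nm: 40.0% linear, 64.0% area
```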
This. There are other things that you gain. I kept my 2010 Mac Pro with 8GB of RAM for far too long, even though it does still sit there as just a compressor system now. While it still performed the work at the same speeds as a new system (I compared it to a 2019 i9 iMac with 128GB of RAM and a Vega GPU -- it seems we plateaued in h.264 1080p video export speeds a while ago, until Apple's latest M1 releases), I was still stuck with SATA 2, PCIe 2, USB 2.0, DDR3 RAM and more. I could fix some of the issues with PCIe cards, but you get my point: it's not just upgrading for more RAM, I also get a modern CPU, GPU, a new type of RAM, SSDs that do 7 GB/s now, and many more things (rough throughput numbers in the sketch below).

Purchase what you need when you need it for the work at hand. Don't over-spec; replace when newer hardware can increase profit or offer a significant difference to the user experience. Those purchasing top-spec Macs are likely monetising them, with the ROI being significant.
Waiting on the next best thing is a fool's errand, as it never ends; you can also get burnt, like the hapless 2016 MBP redesign...
Q-6
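(For scale on the interface list in the 2010 Mac Pro post above, here are nominal peak throughputs for those links, worked out in a small illustrative snippet; the line rates and encoding efficiencies are rounded spec-sheet assumptions, not benchmarks.)

```python
# Nominal peak throughput for the interfaces mentioned above. Line rates and
# encoding efficiencies are rounded spec-sheet assumptions, not measurements.

def usable_GB_per_s(line_rate_gbit: float, encoding_efficiency: float) -> float:
    """Raw line rate in Gbit/s -> rough usable GB/s after encoding overhead."""
    return line_rate_gbit * encoding_efficiency / 8

links = {
    "USB 2.0 (480 Mb/s)":           usable_GB_per_s(0.48, 0.8),    # ~0.05 GB/s
    "SATA 2 (3 Gb/s, 8b/10b)":      usable_GB_per_s(3.0, 0.8),     # ~0.3 GB/s
    "PCIe 2.0 x4 (8b/10b)":         usable_GB_per_s(20.0, 0.8),    # ~2 GB/s
    "PCIe 4.0 x4 NVMe (128b/130b)": usable_GB_per_s(64.0, 0.985),  # ~7.9 GB/s
}
for name, gb in links.items():
    print(f"{name:30s} ~{gb:.2f} GB/s")
```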
True story, I know a local photography studio that still uses PowerPC Macs, and their most powerful one has only 1GB of RAM. They use Adobe Creative Suite 2. Another photography studio I know has some more modern hardware, but they are old Lenovo desktops from around 2013 with only 6GB of RAM. They use the Creative Suite 4 applications. It still suits both companies just fine. Man, I do miss the days of buying software and using it 10+ years later! Creative Suite 2 released in 2005 and that one company is STILL using it! Can't do that today as it's a monthly cost!

Exactly, a $600 machine can turn six figures in the right hands, and a $6K Mac can equally be obsolete in short time with some workflows. For those utilising their Macs professionally the cost should hopefully be irrelevant. For the consumer/prosumer it's a different situation; that said, I maintain: buy what you need when you need it, at the lowest price point for your needs. Apple is a master class in upselling its HW, a point well worth considering.
The current crop of Apple Silicon Macs are devastatingly fast and will remain relevant for a good while. I've only got a base model M1 MBP, as that's all I need; better the $$$$ in my pocket than Apple's. Should I need more, to the point of capitalising on more performance, it would be just a few clicks away, nor would I for one instant worry about the next best thing...
Q-6
Yeah, nobody asks "What nm is that processor?" If I need a high core count, a lot of video encoders/decoders, and a lot of GPU cores, would it make sense for me to get the 5nm M1 Ultra vs the brand new shiny BASE 3nm M2? You cannot really do cross-gen comparisons with things like this. This is like saying an X+1 gen Intel i3 is better than an X gen i9 because it's fewer nm!

Just buy a computer when you need a computer. Regardless of what nm we’re on, the M1 Macs are a significant step up over their Intel counterparts. Do you have a good enough computer now? Then wait to upgrade for M2 or M3 or M17 or whatever you need. If you need one now, then waiting for an M2 (or more likely M3) Ultra is a stupid move, because they’re literally years away from release.
Reminds me of a salesman in our office way, way back. When the IBM PC came out, he dithered about buying it, but then the XT appeared as an upgrade. Then the AT, the PS/2... When I left the company in 1992 he was still refusing to purchase a PC until the final, best one was announced.

Speak for yourself, I’m personally refusing to upgrade from my 66 MHz Power Mac 6100 until I can be certain there won’t be any technological advances after I buy my next Mac!
It's a marketing game; renting content only serves the provider, not the end user. In many respects a lot of SW is way beyond the average user's needs, outside of those that really employ it full-time as a factor of employment. If a Dev delivers a solid paid upgrade I'll update in a heartbeat. Alternatively, if the Dev opts for the "rental" scenario, I'll drop them like a stone...

True story, I know a local photography studio that still uses PowerPC Macs, and their most powerful one has only 1GB of RAM. They use Adobe Creative Suite 2. Another photography studio I know has some more modern hardware, but they are old Lenovo desktops from around 2013 with only 6GB of RAM. They use the Creative Suite 4 applications. It still suits both companies just fine. Man, I do miss the days of buying software and using it 10+ years later! Creative Suite 2 released in 2005 and that one company is STILL using it! Can't do that today as it's a monthly cost!
So you are exactly right in that a $600 machine can still be used for monetary gain.
You sure existing lasers can handle the tighter tolerances?

The lasers already exist, and you don’t need to fuel them with neon, so existing fabs are fine.
They are using them right now. Not sure what you’re saying, unless you are worried about two or three nodes from now. But, of course, if you need a new kind of laser for a new node, why would it need neon? If it needs neon, then it’s the same laser as they use now. (The detail you can resolve using the laser is based on the wavelength of the light, which is in turn based on the gas.)

You sure existing lasers can handle the tighter tolerances?
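(Not trying to settle the neon question, but the parenthetical point about wavelength is the standard lithography rule of thumb: the smallest printable feature scales roughly as k1 · λ / NA. A rough sketch with typical textbook numbers, purely illustrative and not any particular scanner's spec:)

```python
# Rule-of-thumb resolution for a lithography tool: CD ~= k1 * wavelength / NA.
# The k1 and NA values below are typical textbook figures (assumptions),
# not any specific fab's numbers.

def min_feature_nm(k1: float, wavelength_nm: float, na: float) -> float:
    return k1 * wavelength_nm / na

# ArF excimer "deep UV" (the laser type that uses neon as a buffer gas),
# with immersion optics:
print(f"DUV 193 nm, NA 1.35: ~{min_feature_nm(0.30, 193, 1.35):.0f} nm features per exposure")

# EUV (13.5 nm light from a laser-driven tin plasma, no excimer gas):
print(f"EUV 13.5 nm, NA 0.33: ~{min_feature_nm(0.30, 13.5, 0.33):.0f} nm features per exposure")
```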
They're blazing fast. Totally worth it rn if you can afford them.

The 5nm tech in the M1, M1 Max and M1 Ultra is now old tech. We are on the cusp of 3nm M2s. That is a 40% decrease in size and a significant boost to performance and energy savings. The last change was 7nm to 5nm, which was less than a 30% decrease in fabrication. The iPhone 12 has a 5nm chip and that is almost 2 years old now.
Paying $4000 or $8000 for a 5nm chip computer right now is probably a bad idea. When the 3nm rollout comes later this year, the longevity of those chips will be significantly better. Also, we are running into constraints with Moore's law and will probably not see 1nm chips for several years.
You do realize Intel and AMD are using a 7nm process for their latest processors. That could change though, as AMD is supposedly releasing new CPUs next month.

The 5nm tech in the M1, M1 Max and M1 Ultra is now old tech. We are on the cusp of 3nm M2s. That is a 40% decrease in size and a significant boost to performance and energy savings. The last change was 7nm to 5nm, which was less than a 30% decrease in fabrication. The iPhone 12 has a 5nm chip and that is almost 2 years old now.
Paying $4000 or $8000 for a 5nm chip computer right now is probably a bad idea. When the 3nm rollout comes later this year, the longevity of those chips will be significantly better. Also, we are running into constraints with Moore's law and will probably not see 1nm chips for several years.
Wait, what? Why are you talking about Threadripper CPUs when my post was about an i9-12900K?

Are you kidding? Price/performance-wise these kill Windows desktops. I could not configure a Threadripper PC with equivalent performance, memory, fast SSDs etc. for under $6000. So $4000 for a Mac Ultra vs $6000 for a Threadripper. Hmmmmm, which is less, I can’t figure it out.
And no, don’t pretend a slower-performing Threadripper with crappy SSDs and not enough RAM is equivalent; it is not. But maybe you will have better luck. Send on the link.
That was my stretch prediction in the other place. I know they've been working on it for a while, just not sure if they are ready, and since the A15 didn't have it, maybe they will wait one more generation.

If M2 has hardware ray-tracing acceleration, that might be worth waiting for, from a 3D/DCC perspective...?
That’s why I still have my Commodore 128. I’m waiting until progress stalls and I can be sure my next system is future-proof.
I hope you're sporting the 1571 disk drive and a 1200 baud modem with that.

That’s why I still have my Commodore 128. I’m waiting until progress stalls and I can be sure my next system is future-proof.
Wait, what? Why are you talking about Threadripper CPUs when my post was about an i9-12900K.
EDIT: I was wrong.

You do realize Intel and AMD are using a 7nm process for their latest processors. That could change though, as AMD is supposedly releasing new CPUs next month.
The best way to look at it, in my opinion, is this: if the CPU outperforms anything comparable in other companies' lineups, then the process doesn't matter. If the CPUs on the market are all using 7nm and this one is using 5nm but 4nm is ready, that doesn't mean it's obsolete. It just means we haven't started using new processes. It's always gonna be like this.
Perhaps this is a better example: USB 3 came out and didn’t get any meaningful adoption for a few years. I didn’t skip buying a computer because its USB ports were outdated. I bought the computer because it performed exceptionally well for its time.