> x86 is a poor architecture. Even if Intel chip happens to be faster in some specific task, so what. RISC designs are inherently superior.

Damn! This takes me back to 1994! So happy to see the CISC versus RISC debate pop up again. Ah, the memories.
> Yes, sure. I admit I have a curiously emotional bias against the weird, more and more unrecognizable company Apple. I finally broke that spell and became disillusioned after decades of being in some thrall over having to use all Apple things in my life.

“Broke the spell”? You sound like that Intel ad. But if you prefer PC, go for it. Nothing wrong with that.
> Speed estimated outside a thermal envelope doesn’t mean absolutely anything. Furthermore, M1 Max is a mobile chip. Comparing it against a workstation chip is nonsense. M1 Max is the fastest mobile chip on the market, period.

Thermal characteristics are very important, but the “H” in the name of the chip literally means it is a mobile chip, designed for mobile.
Nah, I'm a jerk, you're right, it's all good dude. I'm more combative than anyone else here, but I sure don't want to hurt anyone, I'm just rather upset with myself if you want to know the truth. Apple can do what they want, and if people find use and joy in their continued offerings, well I can screw right off for trying to tear that down. But I think there is some sense in exploring and questioning... and some healthy observation and doubt.
> Damn, a desktop level chip that will probably run at 200w+ at max load is beating a 30w laptop chip by a little bit.

Yes. People, this is still a LAPTOP with regards to M1 Pro/Max. Just wait until power/thermals are not a consideration (Mac Pro and, to a lesser extent, a ~30" iMac).
In all seriousness, Intel will undoubtedly say this is the fastest CPU on the planet and they're right. Some people will fall for the marketing. But at the end of the day, it's mostly about performance/watt on mobile.
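To make the performance-per-watt point concrete, here is a toy sketch — the scores and wattages below are completely made up, not real benchmark results; only the arithmetic matters:

```python
# Toy perf-per-watt comparison. Scores and wattages are invented
# placeholders -- the point is only that a higher raw score can
# still mean far worse efficiency.
chips = {
    "30W laptop chip": {"score": 12000, "watts": 30},
    "200W desktop chip": {"score": 13500, "watts": 200},
}
for name, c in chips.items():
    c["points_per_watt"] = c["score"] / c["watts"]
    print(f"{name}: {c['points_per_watt']:.1f} points/W")
```

With these made-up numbers the desktop part "wins" the headline score while delivering a fraction of the efficiency per watt, which is why mobile reviews lead with perf/watt rather than the raw number.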
I do wish Apple would be aggressive in reclaiming the single-thread crown, though, and it looks like they will, as soon as the M2 MacBook Air.
If Apple does a 12-month update for the MBP, there is little chance Intel will catch Apple. If Apple does an 18-month cadence, Intel will certainly have the single-thread performance crown even if the TDP is significantly higher than Apple SoCs.
> Just imagine how amazing Windows would be if they cut out the support of the older stuff!

Good one... thank god we don't have to tolerate 32-bit support until the year 2100..
Thank god you don't say anything about Flash support being dead.
Apple is just moving forward... while Windows and the OEMs cannot do that, because they would lose a lot of the market by dropping legacy support. They will support it even after 100 years, or until a new CEO comes around who forces the world to move on.
All the real pro apps have moved to 64-bit... the rest of the apps that are still 32-bit aren't worth it. If the developers don't care about them, why should you?! There are so many apps out there..
> Why does this need to be so complicated?

Don’t fall for Intel's marketing. Intel 7 is still 10nm.
For 10nm Alder Lake to compete with TSMC's 5nm while using the legacy x86 architecture is an achievement in itself.
That image you keep posting is comparing it to the 15" MBP (2018 model), not the 16" MBP from 2019.

View attachment 1876972
The new MBP is an entire display thicker than the old MBP. My M16 is also an entire display thicker than the old MBP. How much clearer can it be? The confusion is because you are adding the rubber feet to the dimensions of the PCs for whatever reason; those definitely add height for good thermals, but they have nothing at all to do with the thickness of the computer as a whole.
I am not interested in defending the XPS, as those are terrible laptops, and I don't care to think about them in any capacity.
> My HP laptop from work does the same. Gets WAY TOO LOUD even just using Azure DevOps in Chrome.

No it’s not. My Lenovo P53 does it as well. In fact, it throttles down so much that the Windows UI lags out.
The HP EliteBook before it did the same.
I had desktops before that, which were marginally better. But they won’t ship me one now as I’m “on call”, despite me working from home and them having installed about 200 hot desks in the office (which I never go to). Muppets
> I'm upset too that some software will no longer be compatible and I'm deeply upset that the power balance of control over our own devices is being heavily weighted towards the corporations having all the say and power. Overall the benefits we get with the walled garden approach of "We, the company, decide what's best for you" do not outweigh the downsides of the ongoing assault against real innovation and computational empowerment of the public.

While I broadly agree with your post, my personal thought is that Apple switching to their own custom chips is going to reduce monopolization and consolidation of power in the industry.
> I couldn't care less if it's the fastest chip. I just want decent performance with long battery life. I don't need a Ferrari to drive down a dirt road. For most people, all they are doing is checking email, word processing, opening PDFs, and surfing the web. Even with my 400+ page PDF and Word documents, my Mac mini M1 and MacBook Air M1 handle it just fine.

I feel this way too - mostly I just want it to be quiet & cool. That said, the amount of code bloat, particularly the javascript/react/electron crap, means that even simple websites these days are monsters to run.
> I feel this way too - mostly just want it to be quiet & cool. With that said the amount of code bloat, particularly javascript/react/electron crap means that even simple websites these days are monsters to run.

And this is the thing that drives me crazy when people complain about RAM and the number of browser tabs they can have open. How well is that website optimized? How much javascript does it have? Ads? We are not in the 1990s anymore; websites are not simple static pages. I once observed a website using 2GB of RAM on my Windows computer.
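To put rough numbers on the bloat complaint — a back-of-the-envelope sketch where every asset size is made up, but in the ballpark of a modern ad-heavy page:

```python
# Hypothetical asset breakdown for a "simple" news page. Sizes in
# KB are invented for illustration; the point is how JS dominates
# the byte count of a supposedly static page.
assets_kb = {
    "html": 60,
    "css": 250,
    "first_party_js": 1200,
    "ad_and_tracker_js": 2500,
    "images": 1800,
}
total_kb = sum(assets_kb.values())
total_mb = total_kb / 1024
js_share = (assets_kb["first_party_js"] + assets_kb["ad_and_tracker_js"]) / total_kb
print(f"total ~{total_mb:.1f} MB, JavaScript ~{js_share:.0%} of bytes")
```

And that is just the transfer size — once that JavaScript is parsed and the DOM plus ad iframes are live, the in-memory footprint can be many times larger, which is how a single tab ends up eating gigabytes.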
> (…) A friend of mine who upgraded from a ~2015 MBP to an M1 Air said his Stan models that used to take 3+ hours now only take 20-30 minutes (…)

Woah, this is something that doesn’t get remotely any sort of spotlight… I would be curious if your friend or someone related to him gets an M1 Pro or Max and does the same.
> Just imagine how amazing Windows would be if they cut out the support of the older stuff!

If Windows cut off legacy support nobody would use Windows.
> If Windows cut off legacy support nobody would use Windows.

I am sure many people still would - I definitely would, so there goes your "nobody". I don't use programs that were made in 2001, but I know Windows can still run them. Part of why Windows is so horrible is that it is so bloated and carries all that legacy baggage.
> I thought it was intel 7 or something equally ridiculous

They changed its name to 7nm.
> I am sure many people still would, I definitely would so there goes your "nobody". I don't use programs that were made in 2001, but I know Windows can still run those programs. Part of why Windows is so horrible is because it is so bloated and brings all those legacy items.

The main advantage of Windows is that it can still run programs from the 1990s. The bloat of Windows is a feature, not a bug. Microsoft's corporate customers would freak out if Microsoft decided to abandon the architecture they started using in the 1980s (or the one they were using in the '90s... or the one they were using in the 2000s... like another company I could mention).
> I love these threads.
>
> Apple destroys competition: “omg! Look at these leaks. So much power! So good!”
>
> Intel performs better than Apple: “got to wait for real world tests”

One of these vendors has delivered for the past decade. The other has offered up 5% per year on contrived benchmarks - at ever-increasing thermal output.
Why does this need to be so complicated?
Intel's at 10nm, but they changed the name to 7nm. BUT, Intel's 10nm is better than TSMC's 7nm. So what, is Intel's 7nm better than TSMC's 5nm? Geez, industry, is it a nanometer or not?!
This is like saying my swimming pool is 12 yards, but it's also like 20 yards! No... take out a measuring tape. Is it 12 or 20?
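For anyone who wants the "measuring tape": the rough peak transistor-density estimates that have circulated publicly line up roughly like this. Take the exact figures with a grain of salt — they are third-party estimates, not spec-sheet values:

```python
# Marketing node name vs. approximate published peak density
# (million transistors per mm^2). Figures are rough public
# estimates, not official measurements.
nodes = [
    ("Intel 7 (formerly '10nm')", 100.0),
    ("TSMC N7 ('7nm')",            91.0),
    ("TSMC N5 ('5nm')",           171.0),
]
for name, mtr_per_mm2 in nodes:
    print(f"{name:26} ~{mtr_per_mm2:.0f} MTr/mm^2")
```

So by these estimates "Intel 7" is a bit denser than TSMC's "7nm" but well short of "5nm" — the names are brands, not measurements.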
> Just imagine how amazing Windows would be if they cut out the support of the older stuff!

It probably wouldn't even exist; businesses like backwards compatibility. I bet there would be a heck of a lot fewer computers too. Windows made the personal computer market.