
skaertus

macrumors 601
Feb 23, 2009
4,252
1,409
Brazil
IMHO the M1’s potential has yet to be fully realized by existing software, since it’s only 5 months old. There’s yet more performance to be unleashed for the M1 once developers understand how to make better use of the UMA and the currently unused cores locked within the SoC.

As for Intel, it was Apple pushing them to deliver mobile CPUs that resulted in the original MB Air. Intel just couldn’t keep the momentum going, probably because Apple is just a tiny fraction of their business and the other manufacturers are fine with whatever Intel is dishing out.

Going further back, I think it was Apple switching to PowerPC and initially having a performance advantage over x86 that got Intel into shape. When Apple switched to Intel, this likely made Intel conceited, and they lost their drive to improve.

So the way I see it, unless Apple has changed their culture of pushing the boundary, I think we’ll continue to get surprises from them in the foreseeable future. Apple, IMHO, has always competed against itself, and this will likely continue.
I am not sure it was Apple that drove Intel into being competitive back then. PowerPC was a real thing at the time, and it even powered successful video game consoles, such as the Wii (which sold over 100 million units) and the PlayStation 3 and the Xbox 360 (which sold over 80 million units each). Big business. The fact that Apple used PowerPC only made it stronger, but Apple was not the only driving force behind it.

Plus, Intel had many competitors over the years, including successful designs such as the 68000, the MIPS architecture, and the promising DEC Alpha.

But all these competitors eventually fell. By the early 2010s, Intel's dominance was clear, as AMD failed to deliver with its Bulldozer architecture (https://www.theverge.com/2012/11/15/3646698/what-happened-to-amd). Processors in smartphones were weak and did not pose a threat to Intel. Intel reigned supreme and everyone else, including Apple, had to dance to Intel's tune.

Since then, competition has become fiercer and Intel has been left behind. It is interesting how these things go and how fast they change. Less than ten years later, the undisputed king of processors is struggling for its own survival. I would not be surprised if Intel rebounds and poses some serious competition to Apple. Intel does not have to outdo the M1, but if it comes close enough, it can keep most PC makers from turning to ARM. Which is, of course, the ultimate goal.
 

Abazigal

Contributor
Jul 18, 2011
20,392
23,894
Singapore
However, perhaps Intel does not have to do away with all the drawbacks, as long as consumers are satisfied with what they get. I looked into some reviews I found on the Internet.
Thanks for taking the time to sift out and organise the numbers. I had no idea the MBP had significantly better battery life than the MBA.

Another factor we haven't considered is that the M1 chip could also allow form factors that would otherwise not have been possible with Intel chips, which could in turn draw consumers away from Intel PCs. Take, for example, how slim the M1 iMac is. Apple has managed to do away with the bump at the back and make it uniformly slim (and I presume thermal throttling won't be an issue either), while cramming in impressive speakers (using tech inherited from the HomePod). This clearly hasn't impressed many people here at MacRumors, who feel that these traits are largely wasted on a desktop, but its slim appearance may well attract many consumers who desire a compact PC that won't take up too much space on their desk.

The all-in-one market is an area where Apple seems to have virtually zero competition.

I would imagine that Apple could further optimise Final Cut Pro (I am reminded of that video where Jonathan Morrison showed himself editing video on a 2015 MacBook and how it was faster than a Windows ultrabook running Premiere), and further improve the lead beyond what can be inferred from benchmarks alone.

The MacBooks are still using the older form factor, and I wager the next revision will come with a significant redesign as well. Imagine a 16" MBP with desktop-class performance, all-day battery life, capable of sustained performance even when not plugged into the mains (something many Windows laptops still cannot do), while retaining a thin and light design that makes it easy to carry around.

What I see Apple doing is changing the rules of the game by having their Macs fill market niches that the M1 chip is uniquely positioned to excel in. So either the competition falls into Apple's trap by trying to compete on Apple's terms (and failing miserably), or they choose not to play and completely cede those markets to Apple.

Either way, Apple wins.
 
  • Like
Reactions: BigMcGuire

skaertus

macrumors 601
Feb 23, 2009
4,252
1,409
Brazil
Thanks for taking the time to sift out and organise the numbers. I had no idea the MBP had significantly better battery life than the MBA.

Another factor we haven't considered is that the M1 chip could also allow form factors that would otherwise not have been possible with Intel chips, which could in turn draw consumers away from Intel PCs. Take, for example, how slim the M1 iMac is. Apple has managed to do away with the bump at the back and make it uniformly slim (and I presume thermal throttling won't be an issue either), while cramming in impressive speakers (using tech inherited from the HomePod). This clearly hasn't impressed many people here at MacRumors, who feel that these traits are largely wasted on a desktop, but its slim appearance may well attract many consumers who desire a compact PC that won't take up too much space on their desk.

The all-in-one market is an area where Apple seems to have virtually zero competition.

I would imagine that Apple could further optimise Final Cut Pro (I am reminded of that video where Jonathan Morrison showed himself editing video on a 2015 MacBook and how it was faster than a Windows ultrabook running Premiere), and further improve the lead beyond what can be inferred from benchmarks alone.

The MacBooks are still using the older form factor, and I wager the next revision will come with a significant redesign as well. Imagine a 16" MBP with desktop-class performance, all-day battery life, capable of sustained performance even when not plugged into the mains (something many Windows laptops still cannot do), while retaining a thin and light design that makes it easy to carry around.

What I see Apple doing is changing the rules of the game by having their Macs fill market niches that the M1 chip is uniquely positioned to excel in. So either the competition falls into Apple's trap by trying to compete on Apple's terms (and failing miserably), or they choose not to play and completely cede those markets to Apple.

Either way, Apple wins.
Well, I am not 100% sure about this, for a few reasons. I will divide them into two.

The all-in-one market

First, I have serious doubts about the all-in-one market. I have not found specific iMac sales figures, but many news reports point out that the bulk of Mac sales are laptops. It is understandable.

The original iMac was a success, but it was released back in 1998. Back then, laptops were bulky, heavy, slow, and expensive. Desktop PCs reigned supreme and the iMac was the elegant and compact alternative.

Now, more than two decades later, things are different. Laptops are all over the place, and they have become much faster, cheaper, and lighter. Desktop PCs are kind of a niche now, and many of their buyers are power users seeking extreme performance or a better performance-to-price ratio than laptops offer.

In later years, the iMac incorporated desktop-class processors, making it faster than similarly-priced MacBooks. It made sense to a certain degree to have a very portable MacBook Air to carry everywhere and a more powerful iMac at home. Now, if the MacBook Air and the iMac offer the very same performance for the very same price, what is the point of having both?

There is certainly a market for the new iMac, but I guess it is a shrinking one. Some users may buy the iMac for the colors, for the design, or for the beautiful screen. But some will not buy it because the MacBook Air already offers the bulk of what they would expect. Buying an external monitor, keyboard, and mouse would help them get a similar experience for much less money.

I am sure many people will still buy the iMac. And that Apple will still be king of the all-in-one market. I am just not sure how significant it is. Most people do not have a spare pile of cash to spend just because they think the iMac will match the room's decoration, especially if it offers no performance advantage over a laptop.

Other form factors

Apple may put the M1 inside many different form factors. But I wonder how this might work.

Many PC manufacturers have already tried different form factors with varying degrees of success. Most of these form factors proved to be too cumbersome or not user-friendly. In the end, there are basically clamshell laptops, 2-in-1s, convertibles, and tablets. Apple is more conservative and has not yet delivered a convertible or a 2-in-1, which, by the way, I think is the right decision.

Apple has the iPad, which already dominates the tablet market. Android is a poor competitor here, as software is mostly optimized for smartphones. Microsoft Windows is also a poor competitor, as software is optimized for traditional computers. Including the M1 in the iPad may be overkill, but Apple is doing it and it will certainly become a great machine.

Now, Apple can deliver lighter, powerful laptops. A next-gen 16-inch MacBook Pro can be thinner and lighter and still impressive. That is for sure. But if Apple cuts down on size, battery life will suffer. The 13-inch MacBook Pro has better battery life than the MacBook Air because it has a larger battery. And it comes at a cost: the Pro is heavier than the Air. If Apple makes them smaller, it will have to shrink the battery, which means less battery life.

The 16-inch Pro will probably use a more powerful processor, which should consume more battery. It is yet to be determined how long it will last with a single charge. Hopefully, Apple can impress us all.

But the market already offers some solutions which are not too far off. The 17-inch LG Gram comes with a Core i7-1165G7 processor (which is the closest thing to M1 that Intel seems to have produced so far), a 17-inch 16:10 2560x1600 screen, an 80Wh battery (providing about 12 hours of battery life), and weighs only 2.98 lbs (less than the 13-inch Pro). The 16-inch version is even lighter at only 2.6 lbs (less than the Air).

You may argue that the MacBook Pro will likely have a faster processor, a better screen, better audio, and overall better quality than the LG Gram. But it will also cost more money. And the LG Gram is already available for sale, while the redesigned MacBook is a figment of our imagination and (hopefully) another well-kept secret that Apple is yet to announce.

In any case, the LG Gram already has many of the advantages one would expect from the yet-to-be-announced MacBook: good performance, large screen, lightweight, and good battery life. And the forthcoming 12th gen Intel processors will only improve the possibilities.

So, Apple may have an edge, but I do not think a revolution is coming.
 
  • Like
Reactions: Abazigal

BigMcGuire

Cancelled
Jan 10, 2012
9,832
14,032
The only revolution I see is ... wow... I don't hear fan noise anymore doing what I used to do! Wow, I can put this on my lap without a lap desk and not burn the **** out of my legs. Wow, this thing opens things REALLY fast. lol.

Battery life is insane. I charged it to 100% Friday night, used it for 6+ hours of light usage (book reading, web browsing) yesterday. This morning, I'm at 84% and probably won't charge till Tuesday. (M1 MBP 13").

Performance-wise, I prefer my M1 to my i7 2019 MBP (work provided) with an eGPU because it's smoother. As a user, I couldn't care less about ARM vs Intel - I've always wanted a laptop that got iPad-like battery life and didn't get untouchably hot just scrolling webpages.

Yes, I'll be looking seriously at the performance Macs later this year but, like skaertus stated, I'm sure it's not going to be cheap. Hopefully trading in the M1 helps a little. I'm hoping there's a performance M1X/M2 MBP 13" with 4 ports. I'd upgrade to that in an instant.
 
  • Like
Reactions: xraydoc

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
People here defend Apple by calling it "legacy software", that we are "using the Mac wrong".
Yeah, and that seems so wrong to me, both as a consumer, and as an IT guy.

AMD would have been the best choice imo.
I don't think that would have met their criteria for control, but I agree, performance would have been better and they wouldn't have forced so much change on their customers.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
For now. Intel is trying to rebound. There is a new CEO to conduct a deeper restructuring.

Intel has been struggling for some time now. At one point it got so far ahead of the competition that it became lazy and inefficient.

Then AMD, which is a much smaller company making Intel clones, managed to beat Intel at its own game in recent years.

Then Qualcomm, a manufacturer of processors for smartphones, decided to dip its toe in the water and make processors for PCs.

And Apple, whose own smartphone processors had become more powerful and power-efficient than Intel's, decided to put them across its whole line-up.

Intel is in a bad place, but it is a large company and it is fighting back. The 11th gen Intel processors are significantly better than the 10th gen. And the 12th gen is expected to bring several improvements.

It is not on the same level as Apple, but the gap may narrow. That will also depend on how Apple manages to improve the M1 into the M2 or something.

Competition is always good. Apple is so comfortable now that the M1 iMac seems far less worth it than the M1 MacBook Air released last year. Perhaps this is because no competitor has managed to provide similar performance in the five months since the release of the M1?

Leave Apple alone with its M1 and we will get decreasing performance improvements for increasing price adjustments.

For now? You mean “for a decade,” right? And their new CEO is more concerned with trying to force Christianity on people than with Intel’s success. (Google it)

AMD also beat Intel at its own game in the 1990s - I was there. Remember Opteron and Athlon 64?

The problem isn’t that Intel has been lazy - the problem is that it failed at execution. 14+++++ anyone?

Intel sucks, they hire bad engineers, stellar engineers don’t want to work there, and they will always have the problem of the x86 baggage. **** Intel.
 
  • Like
Reactions: BigMcGuire

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Exactly. And the route Apple has been taking is that Apple now dictates what software we can run, rather than providing an environment where we can run all the software we need. If your software is still 32-bit, sorry, Apple says you cannot run it anymore, and Apple doesn't offer the possibility of downgrading OS X, because my machine is too new.

People here defend Apple by calling it "legacy software", that we are "using the Mac wrong".

The iPad Pro never reached its full potential not due to hardware, but due to software. Time will tell if this is going to be the case for ARM Macs too. Because Rosetta 2 will be removed at some point too, and I will be curious to see which software remains then.

AMD would have been the best choice imo.

AMD has no history - none - of being able to sustain a lead for more than a few years, or of addressing the entire range of the market. AMD has been in the lead in the past, and everyone here forgets that. It never lasts. And what AMD sells today can’t compete with the M1 - so what makes you think they’ll compete with what Apple silicon can do in the future?
 

skaertus

macrumors 601
Feb 23, 2009
4,252
1,409
Brazil
For now? You mean “for a decade,” right? And their new CEO is more concerned with trying to force Christianity on people than with Intel’s success. (Google it)

AMD also beat Intel at its own game in the 1990s - I was there. Remember Opteron and Athlon 64?

The problem isn’t that Intel has been lazy - the problem is that it failed at execution. 14+++++ anyone?

Intel sucks, they hire bad engineers, stellar engineers don’t want to work there, and they will always have the problem of the x86 baggage. **** Intel.
Intel certainly has an execution problem.

But the real problem comes down to management: it has become lazy and inefficient. Intel's CEO was forced to step down in June 2018 following a consensual relationship with an employee that violated the company's policies. Since then, Intel was run by its former CFO, who took the role of "interim CEO". It took Intel more than a year and a half just to replace the interim CEO with a permanent one, and it only did so under pressure from an activist investor.

Intel's manufacturing problems are not new. Intel has been struggling with its 10nm process for years now. But perhaps it thought it had a sufficient lead to keep trying instead of outsourcing. What kind of management would allow Intel to delay its process, which is core to its business, year after year, without taking further action?

I remember AMD in the 1990s. The Athlon 64 was cheaper than Intel's Pentium, but it was not in the same ballpark in terms of performance. Now Intel has a much bigger problem.

As for the new CEO, let's see. He has been in office for what, two or three months now? There is little he could have done so far. It is still too early to tell, but some analysts are hopeful he can put Intel back on track.
 

Sydde

macrumors 68030
Aug 17, 2009
2,563
7,061
IOKWARDI
There’s yet more performance to be unleashed for the M1 once developers understand how to make better use of the UMA and the currently unused cores locked within the SoC.
This makes no sense, though. For over a decade (since Snow Leopard) Apple has had a tool in MacOS (and in iOS since 4.0) called Grand Central Dispatch which makes leveraging those "unused cores" almost trivially easy. GCD is CPU-agnostic, so a program designed with it will run faster on an 8-core platform but will not be hamstrung by only having 2 cores to work with. It also offers the advantages of multithreading with somewhat fewer nightmares for the programmer.

Because it has been around so long and is so flexible, developers have had plenty of time to figure this out.
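To make the point concrete, here is a minimal sketch (my own toy example, not anything from Apple's docs) of the pattern GCD makes easy: split the work into chunks, let concurrentPerform fan them out across however many cores the machine happens to have, and combine the results.

```swift
import Foundation

// Toy example: sum a large array in parallel. The same code runs on a
// 2-core Intel machine or an 8-core M1; GCD decides how to schedule it.
let samples = (0..<1_000_000).map { _ in Double.random(in: 0...1) }
let chunks = 8
let chunkSize = samples.count / chunks
var partialSums = [Double](repeating: 0, count: chunks)

partialSums.withUnsafeMutableBufferPointer { buffer in
    // concurrentPerform blocks until every iteration has finished;
    // each iteration writes only its own slot, so no locking is needed.
    DispatchQueue.concurrentPerform(iterations: chunks) { i in
        let slice = samples[(i * chunkSize)..<((i + 1) * chunkSize)]
        buffer[i] = slice.reduce(0, +)
    }
}

print("total:", partialSums.reduce(0, +))
```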
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Intel certainly has an execution problem.

But the real problem comes down to management: it has become lazy and inefficient. Intel's CEO was forced to step down in June 2018 following a consensual relationship with an employee that violated the company's policies. Since then, Intel was run by its former CFO, who took the role of "interim CEO". It took Intel more than a year and a half just to replace the interim CEO with a permanent one, and it only did so under pressure from an activist investor.

Intel's manufacturing problems are not new. Intel has been struggling with its 10nm process for years now. But perhaps it thought it had a sufficient lead to keep trying instead of outsourcing. What kind of management would allow Intel to delay its process, which is core to its business, year after year, without taking further action?

I remember AMD in the 1990s. The Athlon 64 was cheaper than Intel's Pentium, but it was not in the same ballpark in terms of performance. Now Intel has a much bigger problem.

As for the new CEO, let's see. He has been in office for what, two or three months now? There is little he could have done so far. It is still too early to tell, but some analysts are hopeful he can put Intel back on track.

The Athlon 64 blew away the Pentium. The Pentium, at the time, didn’t even have a true 64-bit pipeline - it had to run instructions twice through 32-bit ALUs. Cray and other supercomputer vendors switched their designs to use Opterons (which were the same design as the Athlon 64), and most of the world’s fastest supercomputers at the time used AMD CPUs.

And this guy isn’t going to do **** for Intel:

“VMware CEO Pat Gelsinger is the Chairman of the Board at a group called Transforming the Bay with Christ. This coalition of business leaders, venture capitalists, non-profit leaders and pastors aims to convert one million people over the next decade.”

You think the best engineers in Silicon Valley are going to put up with that garbage and stick around to be preached to?
 

Andropov

macrumors 6502a
May 3, 2012
746
990
Spain
This makes no sense, though. For over a decade (since Snow Leopard) Apple has had a tool in MacOS (and in iOS since 4.0) called Grand Central Dispatch which makes leveraging those "unused cores" almost trivially easy. GCD is CPU-agnostic, so a program designed with it will run faster on an 8-core platform but will not be hamstrung by only having 2 cores to work with. It also offers the advantages of multithreading with somewhat fewer nightmares for the programmer.

Because it has been around so long and is so flexible, developers have had plenty of time to figure this out.
UMA is certainly new, and not every app uses GCD. I bet Adobe apps (for example) don't use GCD.
 

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
This makes no sense, though. For over a decade (since Snow Leopard) Apple has had a tool in MacOS (and in iOS since 4.0) called Grand Central Dispatch which makes leveraging those "unused cores" almost trivially easy. GCD is CPU-agnostic, so a program designed with it will run faster on an 8-core platform but will not be hamstrung by only having 2 cores to work with. It also offers the advantages of multithreading with somewhat fewer nightmares for the programmer.

Because it has been around so long and is so flexible, developers have had plenty of time to figure this out.
Maybe he was talking about other kinds of cores like the neural engine.
 

pasamio

macrumors 6502
Jan 22, 2020
356
297
Sadly that may be true. What’s it been, 10 years since GCD came out?
Adobe have treated macOS as second tier for a long while now and have lagged in implementing macOS-specific APIs that would improve the performance of their software, since their focus has been on Windows.

GCD is a decade old, and Apple have a number of other cross-platform APIs like Accelerate, Metal and, more recently, Core ML, which generally make automatic use of the capabilities of the device executing the code. Applications leveraging these APIs should take advantage of the extra cores like the Neural Engine, and as Apple expands their hardware capabilities it stands to reason they'll add library support for that as well, meaning that developers in the ecosystem get access to future enhancements easily (potentially just a recompile against an updated SDK).
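As a small illustration of that point (my own sketch, nothing Adobe-specific), the same Accelerate call gets whatever vector hardware the machine offers, without the app knowing or caring which chip it is running on:

```swift
import Accelerate

// Dot product of two large vectors. vDSP routes this through whatever
// SIMD/vector units the current CPU provides; the calling code is
// identical on Intel and Apple Silicon.
let a = (0..<100_000).map { Double($0) }
let b = a.map { $0 * 2 }

let result = vDSP.dot(a, b)
print(result)
```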
 

Sydde

macrumors 68030
Aug 17, 2009
2,563
7,061
IOKWARDI
Maybe he was talking about other kinds of cores like the neural engine.
There is an API for using the neural engine and another for the GPU. Curiously, from what I have read, your code is not necessarily guaranteed to run on the designated target. Presumably, if the resource you are expecting to use is busy, the OS will fall back to a different resource, just to keep your code flowing as best it can.
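For what it's worth, in Core ML that choice is expressed through MLModelConfiguration, and it is indeed a preference rather than a guarantee. A rough sketch below; "SomeModel" is just a placeholder for whatever compiled model class an app ships:

```swift
import CoreML

// computeUnits is a request, not a promise: .all says "use the Neural Engine
// if you can", and the framework may still fall back to the GPU or CPU.
let config = MLModelConfiguration()
config.computeUnits = .all          // prefer the ANE, allow fallback
// config.computeUnits = .cpuAndGPU // skip the ANE entirely
// config.computeUnits = .cpuOnly   // conservative path only

// "SomeModel" is a placeholder for a generated Core ML model class.
let model = try? SomeModel(configuration: config)
```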

GCD is a decade old …

Not only that, it's open source (though I am not sure what the licensing is).
 

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
There is an API for using the neural engine and another for the GPU. Curiously, from what I have read, your code is not necessarily guaranteed to run on the designated target. Presumably, if the resource you are expecting to use is busy, the OS will fall back to a different resource, just to keep your code flowing as best it can.



Not only that, it's open source (though I am not sure what the licensing is).
Apache License, Version 2.0.
 

Andropov

macrumors 6502a
May 3, 2012
746
990
Spain
Sadly that may be true. What’s it been, 10 years since GCD came out?
I think GCD was released with Snow Leopard, so it's 12 years old now. Adobe insisted on using their own multi-platform threading API, at least until a few years ago; I don't know if anything has changed.

The full-screen API is almost as old (Lion, 10 years ago) and yet neither Photoshop nor Illustrator implements it either. They have their own full-screen mode instead, of course, which is triggered in a non-standard way, without an animation, and disables the ability to reveal the hidden menu bar/Dock by moving the cursor to the top/bottom of the screen, making the app unusable. And it doesn't play nicely with Mission Control, since it's a regular window hacked to fill the screen.
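For contrast, opting into the system behaviour is roughly a one-liner on the AppKit side (a sketch, assuming a standard NSWindowController subclass):

```swift
import AppKit

// Sketch: advertise native full screen so the green traffic-light button,
// the zoom animation, Mission Control tiling, and the auto-revealing menu
// bar/Dock all behave the standard way.
final class EditorWindowController: NSWindowController {
    override func windowDidLoad() {
        super.windowDidLoad()
        window?.collectionBehavior.insert(.fullScreenPrimary)
    }
}
```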
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
I think GCD was released with Snow Leopard, so it's 12 years old now. Adobe insisted on using their own multi-platform threading API, at least until a few years ago; I don't know if anything has changed.

The full-screen API is almost as old (Lion, 10 years ago) and yet neither Photoshop nor Illustrator implements it either. They have their own full-screen mode instead, of course, which is triggered in a non-standard way, without an animation, and disables the ability to reveal the hidden menu bar/Dock by moving the cursor to the top/bottom of the screen, making the app unusable. And it doesn't play nicely with Mission Control, since it's a regular window hacked to fill the screen.
They also insist on their own file dialogs. I have a subscription because I need it from time to time for work, but I would be *happy* to pay a monthly fee if they’d make Illustrator, Photoshop, and Lightroom Classic use native SDKs.
 

Andropov

macrumors 6502a
May 3, 2012
746
990
Spain
They also insist on their own file dialogs. I have a subscription because I need it from time to time for work, but I would be *happy* to pay a monthly fee if they’d make Illustrator, Photoshop, and Lightroom Classic use native SDKs.
Same here. From time to time I need to use some weird feature that is only available in Photoshop/Illustrator and keeps me from unsubscribing, but I mostly start working in Pixelmator, switch to Photoshop/Illustrator if I need something specific, and go back to Pixelmator. The only Adobe app I use full time is Lightroom Classic (don't really have a lot of complaints here).

I tried a few times to do some video work in Premiere since I was already paying for the subscription, but the app kept crashing every couple of hours for different reasons. I guess (hope?) I just had bad luck with that, because there's no way someone could do professional work under those conditions. I ended up buying FCPX.

To think that if they stopped reimplementing every single thing the OS already provides, and used native SDKs, they would have genuinely good and enjoyable apps... I wonder if it's bad corporate management or the baggage of decades of code.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
To think that if they stopped reimplementing every single thing the OS already provides, and used native SDKs, they would have genuinely good and enjoyable apps... I wonder if it's bad corporate management or the baggage of decades of code.
I would guess both, and also the desire to avoid added complexity for relatively little benefit.

Multiplatform software usually follows the logic of the majority platform and then tries to emulate the same behavior on minority platforms. Platform-specific APIs that follow a different logic are generally avoided, unless the benefits are clear and significant, because the added complexity is a massive source of bugs on all platforms.
 

LinkRS

macrumors 6502
Oct 16, 2014
402
331
Texas, USA
The Athlon 64 blew away the Pentium. The Pentium, at the time, didn’t even have a true 64-bit pipeline - it had to run instructions twice through 32-bit ALUs. Cray and other supercomputer vendors switched their designs to use Opterons (which were the same design as the Athlon 64), and most of the world’s fastest supercomputers at the time used AMD CPUs.

And this guy isn’t going to do **** for Intel:

“VMware CEO Pat Gelsinger is the Chairman of the Board at a group called Transforming the Bay with Christ. This coalition of business leaders, venture capitalists, non-profit leaders and pastors aims to convert one million people over the next decade.”

You think the best engineers in Silicon Valley are going to put up with that garbage and stick around to be preached to?
Hi cmaier,

I have come to respect your input in these forums, as you seem both well-informed and knowledgeable about the topics you reply to. This reply concerning the Athlon 64 vs Pentium rang some alarm bells, making me think I have a gap in my personal knowledge here. If memory serves, the direct competitor to the Athlon 64 was the Pentium 4, which used the NetBurst architecture. That architecture was significantly different from Intel's past efforts, in that it prioritized high clock speeds (cough, cough) over instructions per clock (IPC). The thought at the time was that the lower IPC could be compensated for by running the clock faster, so the Pentium 4 needed high clock speeds to be competitive.

The Athlon 64 was different, and not just because of the 64-bit extensions (AMD64), as it used the "classic" method of upping IPC (apologies for stating what you already know, but this info is to help with context :)). Clock for clock, the Athlon 64 was faster than the Pentium 4, hands down. In some game-specific scenarios (sound familiar, ha ha) Pentium 4 systems could outperform the Athlon 64 when the clock speed was high enough. The problem was that the Pentium 4's process node was unable to reliably get past ~3.8 GHz (too much heat, sound familiar again LOL), and when Athlon 64s were hitting the low 3 GHz range, they walked all over the Pentium 4. It wasn't until the Core microarchitecture came out that Intel began to retake the performance crown. This is all bypassing the multi-core efforts with the dual-core Pentium 4s, which were just two dies in one package, etc. With the release of Core 2, the Athlon 64 was no longer the fastest, and the rest is history...

All of that aside, I do not recall Intel faking a 64-bit pipeline with their CPUs. Was this during the Pentium 4 era? The Core, Core 2, and current Core i (all derived from the original Pentium Pro architecture) were all "true" 64-bit like the Athlon 64, were they not? Thanks in advance for your reply!

Rich S.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Hi cmaier,

I have come to respect your input in these forums, as you seem both well-informed and knowledgeable about the topics you reply to. This reply concerning the Athlon 64 vs Pentium rang some alarm bells, making me think I have a gap in my personal knowledge here. If memory serves, the direct competitor to the Athlon 64 was the Pentium 4, which used the NetBurst architecture. That architecture was significantly different from Intel's past efforts, in that it prioritized high clock speeds (cough, cough) over instructions per clock (IPC). The thought at the time was that the lower IPC could be compensated for by running the clock faster, so the Pentium 4 needed high clock speeds to be competitive.

The Athlon 64 was different, and not just because of the 64-bit extensions (AMD64), as it used the "classic" method of upping IPC (apologies for stating what you already know, but this info is to help with context :)). Clock for clock, the Athlon 64 was faster than the Pentium 4, hands down. In some game-specific scenarios (sound familiar, ha ha) Pentium 4 systems could outperform the Athlon 64 when the clock speed was high enough. The problem was that the Pentium 4's process node was unable to reliably get past ~3.8 GHz (too much heat, sound familiar again LOL), and when Athlon 64s were hitting the low 3 GHz range, they walked all over the Pentium 4. It wasn't until the Core microarchitecture came out that Intel began to retake the performance crown. This is all bypassing the multi-core efforts with the dual-core Pentium 4s, which were just two dies in one package, etc. With the release of Core 2, the Athlon 64 was no longer the fastest, and the rest is history...

All of that aside, I do not recall Intel faking a 64-bit pipeline with their CPUs. Was this during the Pentium 4 era? The Core, Core 2, and current Core i (all derived from the original Pentium Pro architecture) were all "true" 64-bit like the Athlon 64, were they not? Thanks in advance for your reply!

Rich S.

I don’t recall all of Intel’s marketing names (believe it or not, we CPU designers rarely knew which marketing name was which - we used our internal names for everything, and used our competitors’ internal names as well - because 3 different chips on the market may all be the same design, etc.)

The first Intel chips to support AMD64 did so by faking it. They did not have true 64-bit ALUs. I believe they were Prescotts - this should have been obvious to most folks because the first Prescotts did not support 64-bit, then suddenly they did. There’s no way to do that other than to fake it by changing the microcode to parse 64-bit instructions and turn them into a complicated series of instructions that get the same answer using 32-bit ALUs.
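To illustrate what that kind of faking costs (a toy sketch of my own, not Intel's actual microcode): a single 64-bit add has to become at least two 32-bit ALU passes plus carry handling, something like this:

```swift
// Toy model of a 64-bit add built from 32-bit operations, the way microcode
// has to do it when the hardware only has 32-bit ALUs.
func add64(_ x: UInt64, _ y: UInt64) -> UInt64 {
    let xLo = UInt32(truncatingIfNeeded: x), xHi = UInt32(truncatingIfNeeded: x >> 32)
    let yLo = UInt32(truncatingIfNeeded: y), yHi = UInt32(truncatingIfNeeded: y >> 32)

    let (lo, carry) = xLo.addingReportingOverflow(yLo)   // first 32-bit pass
    let hi = xHi &+ yHi &+ (carry ? 1 : 0)               // second pass, plus the carry

    return (UInt64(hi) << 32) | UInt64(lo)
}

print(String(add64(0xFFFF_FFFF, 1), radix: 16))  // prints "100000000"
```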
 

xraydoc

Contributor
Oct 9, 2005
11,030
5,489
192.168.1.1
I think GCD was released with Snow Leopard, so it's 12 years old now. Adobe insisted on using their own multi-platform threading API, at least until a few years ago; I don't know if anything has changed.

The full-screen API is almost as old (Lion, 10 years ago) and yet neither Photoshop nor Illustrator implements it either. They have their own full-screen mode instead, of course, which is triggered in a non-standard way, without an animation, and disables the ability to reveal the hidden menu bar/Dock by moving the cursor to the top/bottom of the screen, making the app unusable. And it doesn't play nicely with Mission Control, since it's a regular window hacked to fill the screen.
Adobe is the Microsoft of the 3rd party software world. They're stuck on a 10-15 year old code base and modernizing is too big of an undertaking.

I don't personally use Photoshop or Illustrator, but I do need to use Acrobat Pro (or DC or whatever they call it these days) periodically for work. That application is the biggest pig I've ever used. It's clunky, slow, and uses a ridiculous amount of resources. Even on modern high-end machines, it's simply awful. I avoid it at all costs until there's something specific I need it for.

It's a shame their apps are industry standard. My oldest daughter is in advertising & graphic design and even she hates them but has no choice (and she's got absolutely top-end Apple hardware)... she's stuck on Photoshop/Illustrator, but fortunately her company is otherwise a Mac house, so she can use Final Cut & Logic for video and music production.
 
  • Like
Reactions: Krevnik

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
Intel’s market share is about to be a story fit for the ole Lenin.jpeg quote on weeks where years happen.

Agreed, their market share should continue to shrink, but I don't think we should expect large market moves until fab capacity improves and Intel's competition can actually buy the capacity they need to absorb those customers.

Since TSMC and Intel are both talking about ~2 years to get new capacity online, that's a surprisingly long time for someone like AMD to be unable to aggressively capitalize on Intel's mistakes.

Sure, I agree Intel’s entrenched position is a huge advantage, and I won’t join the chorus of “Intel is DOOOMED” just yet. But I’d say the pandemic actually hasn’t helped them much - hence my edit. Sure, notebook sales are up 50% for Intel, but their consumer group CCG’s growth was only 8% ... those margins are awful for Intel. Basically they’re being relegated to the cheapest sets, and vendors are able to say they’ll go AMD or ARM unless Intel cuts prices. Similarly, their server-side business, DCG, also had its margins halved.

Again, I don't disagree with Intel's general outlook. It's just that there's a valid question of where one goes in the next 18-24 months during the shortage as new capacity is built. If you aren't going ARM and are sticking with x86, it's either AMD or Intel, and AMD simply won't have the ability to absorb the customers unless they get access to Intel's fabs. That's upwards of two CPU release cycles of "status quo" before we start seeing larger shifts in the x86 market split. That's really the crux of the point I was trying to make.

But also keep in mind that cheap hardware is where the bulk of the growth has been during the pandemic. So it's not terribly surprising to see these sorts of numbers, IMO. (Edit: if there are details on how much of this is from Intel's weaker margins versus shifts in the types of sales in the market, I'd love to see it.)

What sets Apple apart from Microsoft's attempts at ARM is that Apple has fully invested in it, building their tooling to support it and making it very easy to take an existing x86-64 macOS app and run it on Apple Silicon. Microsoft are apparently excited to announce that next year Visual Studio will finally be 64-bit. To me the message there is that moving that ecosystem hasn't been important for Microsoft, but if your developers can't work natively on the new platform, what hope do you have for adoption? To quote AWS, nothing builds developer tool ecosystems better than volume, and Apple are providing the volume for their ecosystem, whereas Microsoft's lack of support for native ARM development shows that they intrinsically value it less.

This. I've been saying this for years. Microsoft has seemingly always treated new APIs as a sort of "build it and they will come" exercise: releasing them out into the wild and hoping things happen. They can't even get the internal teams onto one thing before they've moved on to the next new shiny.

Apple, on the other hand, builds plans and roadmaps for the platform and executes surprisingly well on them. They don't share those roadmaps with the public all that well, but on iOS it was pretty incredible how you could trace new APIs in UIKit leading to new hardware or OS features year after year after year; for example, the development cost of supporting the iPhone 6/6+ or iPad split screen was cheap for teams that onboarded to Auto Layout early enough. That said, Apple's history is what makes Catalyst and SwiftUI's current state a bit perplexing to me. They feel more like a Microsoft-released API than anything else Apple has done in 20 years.
 

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
Again, I don't disagree with Intel's general outlook. It's just that there's a valid question of where one goes in the next 18-24 months during the shortage as new capacity is built. If you aren't going ARM and are sticking with x86, it's either AMD or Intel, and AMD simply won't have the ability to absorb the customers unless they get access to Intel's fabs. That's upwards of two CPU release cycles of "status quo" before we start seeing larger shifts in the x86 market split. That's really the crux of the point I was trying to make.

But also keep in mind that cheap hardware is where the bulk of the growth has been during the pandemic. So it's not terribly surprising to see these sorts of numbers, IMO. (Edit: if there are details on how much of this is from Intel's weaker margins versus shifts in the types of sales in the market, I'd love to see it.)

We’ll find out for sure when AMD releases their earnings soon. If AMD’s margins also took a hit, took less of one, or took none at all, that will tell us a lot about Intel’s design wins vs AMD. But the biggest data point for now would be Intel’s much weaker margins in the data center as well as consumer. Every analysis is pointing to AMD’s success as the primary culprit rather than pandemic effects, which makes sense given that market.

Also, analysis has indicated that Apple’s share of the premium market grew substantially given the M1 CPU advantage. IDC pegged it at twice overall market growth, and while estimates and Mac-to-PC sales comparisons are not always easy, it at least shows that there was growth in the premium consumer market to be had during the pandemic as well, and Intel doesn’t look like it got any.

Btw, I agree that even if Intel is to fall eventually (which is far from guaranteed), Intel isn’t going anywhere for a while. I’m just being somewhat more negative about where they already are from a business perspective. There are worrying signs (for Intel) that their engineering troubles are starting to come home to roost financially. This isn’t destiny (yet), but I think I peg them as being in more trouble now than you do. It’s a matter of degree rather than a qualitative disagreement.

P.S. I’m not a business person at all, this is just a random internet guy’s opinion, but others who do seem to know their stuff seem to share it (and informed it), so that’s why I share it :)
 