
257Loner

macrumors 6502
Dec 3, 2022
456
635
I'm not sure I agree 100% with that reasoning.
Yes, cell phones are at a point nowadays where they CAN be used as desktops if manufacturers make that process easy.
However, bigger computers have the advantage of a larger area (which means more transistors), which also allows for better cooling.

That is the reason mainframes STILL exist today, although for fewer applications than before.

So, bigger computers will always have a space advantage over small ones. The question is instead: will this matter? If cell phones get so powerful that using them as full-blown desktops only takes a fraction of their power 90% of the time, then it probably won't matter.
I sort of agree. V12s are still more powerful than V4s, but V4s have come such a long way that to a person living in the 1960s, a V4-powered car today would seem like a sports car. SoCs, likewise, are battery-efficient integrated chips that are easier to mass-produce, and will one day make the dedicated hardware of yesteryear seem unnecessary.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
I sort of agree. V12s are still more powerful than V4s, but V4s have come such a long way that to a person living in the 1960s, a V4-powered car today would seem like a sports car. SoCs, likewise, are battery-efficient integrated chips that are easier to mass-produce, and will one day make the dedicated hardware of yesteryear seem unnecessary.
I guess folks have to choose their "farms" to defend.

Some defend "farms" with better future potential, while others defend "farms" that have the most yield now.

As I said in another post, the world is moving towards mobile computing. "Farms" that rely on an ever-increasing use of energy to produce the same level of yield are, IMHO, untenable.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
I'm not sure I agree 100% with that reasoning.
Yes, cell phones are at a point nowadays where they CAN be used as desktops if manufacturers make that process easy.
However, bigger computers have the advantage of a larger area (which means more transistors), which also allows for better cooling.

That is the reason mainframes STILL exist today, although for fewer applications than before.

So, bigger computers will always have a space advantage over small ones. The question is instead: will this matter? If cell phones get so powerful that using them as full-blown desktops only takes a fraction of their power 90% of the time, then it probably won't matter.
I'm talking about R&D money and economies of scale of ARM SoC vendors vs x86.

Apple can be the sole user of TSMC N3 or any future process node because they have both the R&D money and the scale.

Apple's M and M Pro/Max/Ultra SoCs have die sizes with a surface area larger than any iPhone chip's. These SoCs are placed into laptops & desktops, with & without active heatsink-fan (HSF) cooling.

| Apple silicon chip | M1 Max | M1 Pro | M1 | M1 Ultra | A15 |
| --- | --- | --- | --- | --- | --- |
| Form factor | MBP 14", MBP 16", Mac Studio | MBP 14", MBP 16" | MBA, MBP 13", iMac 24", Mac mini, 2021 iPad Pro, 2022 iPad Air | Mac Studio | iPhone 13, iPhone 13 mini, iPhone 13 Pro, iPhone 13 Pro Max, 2021 iPad mini |
| Launch | Q4 2021 | Q4 2021 | Q4 2020 | Q1 2022 | Q3 2021 |
| # of dies | 1 | 1 | 1 | 2 | 1 |
| CPU cores | 10 | 10 | 8 | 20 | 6 |
| Performance cores | 8 | 8 | 4 | 16 | 2 |
| Efficiency cores | 2 | 2 | 4 | 4 | 4 |
| GPU cores | 32 | 16 | 8 | 64 | 5 |
| Neural Engine cores | 16 | 16 | 16 | 32 | 16 |
| Memory bandwidth | 400GB/s | 200GB/s | 68.2GB/s | 800GB/s | 68.2GB/s |
| Max memory | 64GB | 32GB | 16GB | 128GB | 6GB |
| Hardware-accelerated H.264, HEVC, ProRes, ProRes RAW | 1 | 1 | - | 2 | - |
| Video decode engines | 1 | 1 | - | 2 | - |
| Video encode engines | 2 | 1 | - | 4 | - |
| ProRes encode/decode engines | 2 | 1 | - | 4 | - |
| Estimated die size | 4.251cm² | 2.513cm² | 1.19cm² | 6.3765cm² | 1.116cm² |


Have any new mainframes been sold after Y2K, or are they still being maintained because of the half-century of data contained within? Companies that bought into them during the 1960s, 1970s, 1980s and even 1990s may maintain them today because they do not want operational disruptions, and it is cheaper to keep them running than to transition to a 2023 data center.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
We'll probably be there 5-6 years from now. iPhones by then will probably have the computing power of an M1/M2. However, at the moment the AAA gaming industry still pushes the newest and latest in graphics, which always requires cutting-edge desktop GPUs, meaning that phones will always remain a few years behind, at least in that space.
In every other space, the iPhone can probably emulate any desktop game up to about 2016 now. The only limitation is the ability to actually emulate, and to transform the iPhone into a desktop replacement. USB-C will partly help with connecting external devices (gaming keyboard, mouse, monitor), but iOS isn't helping the situation.
With the iPhone chip moving from 5nm (2020) to 3nm (2023) and gaining a larger die surface area, your estimate of it reaching the raw performance of an M1 may come true sooner than 5-6 years. Odds are within 15-26 months.
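As a back-of-the-envelope illustration (a sketch under idealized assumptions, not Apple's actual figures): transistor density scales at best with the inverse square of the nominal feature size, and a larger die multiplies the transistor budget further. Real-world gains are smaller, since marketing node names no longer map to physical dimensions.

```python
# Idealized node-shrink arithmetic; real foundry density gains are
# smaller than this naive model suggests (node names are marketing).

def density_gain(old_nm: float, new_nm: float) -> float:
    """Idealized transistor-density gain from a node shrink."""
    return (old_nm / new_nm) ** 2

def transistor_budget_gain(old_nm: float, new_nm: float, area_growth: float) -> float:
    """Combined gain from a node shrink plus a larger die area."""
    return density_gain(old_nm, new_nm) * area_growth

shrink = density_gain(5, 3)                   # (5/3)^2 ≈ 2.78x
combined = transistor_budget_gain(5, 3, 1.2)  # with a 20% larger die

print(f"idealized 5nm -> 3nm density gain: {shrink:.2f}x")
print(f"with 20% more die area: {combined:.2f}x")
```

Even heavily discounted, the direction of this arithmetic is why a 3nm A-series chip closing in on an M1-class transistor budget is plausible.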

I think the 3nm chip in the 2023 iPhone 15 Pro will have the raw performance of any Intel Mac, excluding the Core i9 and Xeon models.

As you can see in the 2017 chart below, even the chips used by Android makers are inching closer to Intel's.

[Chart: Geekbench scores showing Apple's A11 outpacing top chips from Qualcomm, Samsung and Huawei]


Source: https://appleinsider.com/articles/1...erates-top-chips-from-qualcomm-samsung-huawei
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
Using full-blown ARM Windows is already a reality with some Android phones (e.g., Samsung's). Sure, it's tricky to install, and very advanced games don't run well. But as I mentioned in other threads, you can definitely use it for simpler document or photo editing. If you only want to run very basic 3D games or some light 2D games, you might be able to pull that off too.
If the end user runs x86 Windows binaries, the performance hit from translation will be a factor. But once these Windows programs are recompiled for ARM, performance will improve.

This is more a problem of legacy software that is x86-only. Once Windows developers release fat binaries that work on both x86 and ARM, you will see competition in the Windows space that was absent during the Intel/AMD duopoly.
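The fat-binary idea can be sketched in a few lines: one artifact carries multiple architecture-specific slices, and the loader picks the slice matching the host CPU, falling back to translation only when no native slice exists. All names below are illustrative; this is not any real Windows or macOS loader API.

```python
# Toy model of a fat (multi-architecture) binary: the loader picks
# the native slice when present, otherwise falls back to the x86_64
# slice and pays an emulation/translation cost. Illustrative only.

ALIASES = {"amd64": "x86_64", "aarch64": "arm64"}  # common spellings

def select_slice(slices: dict, machine: str) -> tuple:
    """Return (architecture used, whether emulation is needed)."""
    arch = ALIASES.get(machine.lower(), machine.lower())
    if arch in slices:
        return arch, False            # native slice: full speed
    return "x86_64", True             # emulated: translation overhead

fat_binary = {"x86_64": b"...", "arm64": b"..."}   # stand-in payloads

print(select_slice(fat_binary, "arm64"))    # ('arm64', False)
print(select_slice(fat_binary, "AMD64"))    # ('x86_64', False)
print(select_slice(fat_binary, "riscv64"))  # ('x86_64', True)
```

The point of the sketch: once developers ship the extra slice, the emulation branch (and its performance hit) simply stops being taken.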
 

mectojic

macrumors 65816
Dec 27, 2020
1,330
2,523
Sydney, Australia
Using full-blown ARM Windows is already a reality with some Android phones (e.g., Samsung's). Sure, it's tricky to install, and very advanced games don't run well. But as I mentioned in other threads, you can definitely use it for simpler document or photo editing. If you only want to run very basic 3D games or some light 2D games, you might be able to pull that off too.
Exactly. We need it to be really optimised. If any company can pull it off and essentially make a half-iPhone, half-dockable Mac mini, it will be awesome.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
So, bigger computers will always have a space advantage over small ones. The question is instead: will this matter? If cell phones get so powerful that using them as full-blown desktops only takes a fraction of their power 90% of the time, then it probably won't matter.
This focuses too much on performance. In many uses, the CPU and the GPU are already the least important components of a computer, because they are fast enough anyway. The real value comes from what you can do with the computer and how. For example:
  • Some people value long battery life, as it enables using the computer as a truly mobile device.
  • Office users often prefer having multiple external monitors, as it makes working with multiple documents easier.
  • Larger devices have more space for internal components. For example, the RAM limits in M1/M2 devices are based on the physical dimensions of the package rather than on what the chips could trivially support if Apple wanted them to.
  • If you have to connect many devices to the computer, having internal expansion capacity can make your desk cleaner and nicer.
  • Peripherals always have to make trade-offs between connection speed, cable price, and cable length. For example, Thunderbolt is heavily constrained by lightweight copper cables.
  • Sometimes the availability of chips is a bigger constraint than power. Then it may be more cost-effective to use more power than more/bigger dies. Especially as wind/solar can make power ridiculously cheap for those with flexibility in time of use.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
This focuses too much on performance. In many uses, the CPU and the GPU are already the least important components of a computer, because they are fast enough anyway. The real value comes from what you can do with the computer and how. For example:
  • Some people value long battery life, as it enables using the computer as a truly mobile device.
  • Office users often prefer having multiple external monitors, as it makes working with multiple documents easier.
  • Larger devices have more space for internal components. For example, the RAM limits in M1/M2 devices are based on the physical dimensions of the package rather than on what the chips could trivially support if Apple wanted them to.
  • If you have to connect many devices to the computer, having internal expansion capacity can make your desk cleaner and nicer.
  • Peripherals always have to make trade-offs between connection speed, cable price, and cable length. For example, Thunderbolt is heavily constrained by lightweight copper cables.
  • Sometimes the availability of chips is a bigger constraint than power. Then it may be more cost-effective to use more power than more/bigger dies. Especially as wind/solar can make power ridiculously cheap for those with flexibility in time of use.

We're considering portable devices replacing desktops (as opposed to the portability of a notebook, or the connectivity of a cellphone). That implies running full-blown desktop apps at full performance.

Talking about, e.g., portability in this context is moot, because we are already there.

If we consider portable devices REPLACING their bigger siblings, we're not there *yet*, but we can already run full-blown software in light scenarios, which means we are getting closer to that scenario. That is what I was referring to.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
  • Larger devices have more space for internal components. For example, the RAM limits in M1/M2 devices are based on the physical dimensions of the package rather than on what the chips could trivially support if Apple wanted them to.
  • If you have to connect many devices to the computer, having internal expansion capacity can make your desk cleaner and nicer.
  • Peripherals always have to make trade-offs between connection speed, cable price, and cable length. For example, Thunderbolt is heavily constrained by lightweight copper cables.
  • Sometimes the availability of chips is a bigger constraint than power. Then it may be more cost-effective to use more power than more/bigger dies. Especially as wind/solar can make power ridiculously cheap for those with flexibility in time of use.

Are we talking about a Mac Pro with expansion for RAM, SSD and PCIe slots?

I don't think the RAM limits Apple presents are physical or even technical. They are more a design decision based on what ~80% of the expected configurations would be per SoC line.

An example would be the A12Z SoC: in the 2020 iPad Pro it came with 6GB of LPDDR4X, but months later in the 2020 Developer Transition Kit it had 16GB of RAM. These were selected per use case within a given product's life.

The 2020 M1 was limited to 16GB, while the 2022 M2 goes to 24GB. Odds are Apple did not offer higher RAM configs because it would just increase supply chain cost without meaningfully increasing units sold or revenue.

It may also decrease sales of higher-end SoCs like the Pro, Max and Ultra for people like me, who would have bought an iMac 27" replacement with an M1, 32GB RAM and a 512GB SSD. That config is overkill, but if you are coming from a 2012 Core i7 it is still a great leap forward. This may be why a 15" MBA M1 was not on offer: it would decrease sales of the MBP 16" M1 Pro.

Offering that kind of customization would counter economies of scale and force Apple to redesign parts just for desktops other than the Mac mini and Mac Studio. So it may be "cheap" to swap low-power RAM for desktop DIMMs in the desktop product lines, and even to allow for DIMM slots, but in terms of overall cost it would be more expensive to do so because of the lower quantities ordered of said parts.

These Pro desktops would be more expensive to make, and their MSRPs would reflect that, unless subsidized by other product lines that do not share that feature.

In the PC industry, units shipped annually have been approx ~80% laptops & ~20% desktops for the last ~2 decades. That was one reason Steve Jobs moved from PPC to Intel in 2006.

Mac laptops & desktops use the same SoCs. That's why they have identical or near-identical performance when comparing the same SoC SKU regardless of form factor. Apple could not/would not overclock the SoCs destined for desktops to create a differentiator.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
I don't think the RAM limits Apple presents are physical or even technical. They are more a design decision based on what ~80% of the expected configurations would be per SoC line.
You need over 8 billion transistors and capacitors for each gigabyte of DRAM. Those components take a lot of space. Just compare the sizes of the memory chips in the M2 / M2 Pro / M2 Max packages to get some idea. Or look at how many chips the highest-capacity DIMMs you can find contain.
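The arithmetic behind that claim is easy to check: a standard DRAM cell is one transistor plus one capacitor (1T1C), so one bit costs one of each, and a decimal gigabyte works out to eight billion cells.

```python
# Sanity-check: 1 GB of DRAM at one transistor + one capacitor per
# bit (the standard 1T1C cell) works out to 8 billion of each.
GB_BYTES = 10**9        # decimal gigabyte, as memory marketing uses
BITS_PER_BYTE = 8

cells_per_gb = GB_BYTES * BITS_PER_BYTE   # one 1T1C cell per bit
print(cells_per_gb)                       # 8000000000

# So a 16 GB package holds on the order of 128 billion cells:
print(16 * cells_per_gb)                  # 128000000000
```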
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
You need over 8 billion transistors and capacitors for each gigabyte of DRAM. Those components take a lot of space. Just compare the sizes of the memory chips in the M2 / M2 Pro / M2 Max packages to get some idea. Or look at how many chips the highest-capacity DIMMs you can find contain.
Apple is a systems vendor. They could easily increase the package size if there were a supply chain reason for it.

If inventory turnover is impacted due to low demand, then why offer it?

As I pointed out, the A12Z SoC in the 2020 iPad Pro came with 6GB of LPDDR4X, but months later in the 2020 Developer Transition Kit it had 16GB of RAM. These were selected per use case within a given product's life.

It is like Mathematica users demanding an Ultra chip's CPU core count but only a binned M1's GPU core count, because they have zero need for that many Ultra GPU cores alongside more than 128GB of RAM. This could be addressed if Apple designed a boutique Ultra chip from two M Pro dies rather than two M Max dies.

What is the demand for that like? A few hundred customers a year? A few thousand? Last year 28.6 million Macs were shipped worldwide.

You may as well not serve them and let Intel/AMD take that business. The Mathematica user would probably not care if they push a 1.5kW PSU, as they want the results ASAP.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
Apple is a systems vendor. They could easily increase the package size if there were a supply chain reason for it.
If they increase the package size, there will be less space left for other functions, such as battery and cooling. Designing a small device is always about making compromises. You can't fit in everything people would want to buy and that you could provide cost-effectively.
 

nexusrule

macrumors 6502a
Aug 11, 2012
625
759
We take it all in one year; we just hit it really hard. That's not what happens in the rest of the industry or historically.

This quote just means that whatever improvement they manage to make from generation to generation won't be scattered over several years but used all together; how long it takes them to build a meaningful update is another matter.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
If they increase the package size, there will be less space left for other functions, such as battery and cooling. Designing a small device is always about making compromises. You can't fit in everything people would want to buy and that you could provide cost-effectively.
It is within their purview to rearrange things, or even to reduce parts by putting them onto the SoC or the other side of the logic board.

They control the whole system.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
Lol.

That there is a massive failure of imagination. You seriously think you have *any* idea what the world will look like in 17 years? Tech moves a lot faster than the world in general (dragging it along, in fact). I could name ten things that are likely to completely derail those predictions, but that would just be a failure of imagination on my part, because most likely the biggest things in 2040 are not yet visible to us. ...although, maybe, it's not a tough call to say the endlessly ramifying AI explosion will be one of them.
Intel/AMD's advantage is x86 legacy software. They could be the next mainframe. Sales are dependent on x86 software and systems.

Qualcomm's NUVIA will hopefully cater to all price points of Windows 11 on ARM and not just the market that Apple occupies.

My statements tend to be supported by market research and relevant news articles.



 

gpat

macrumors 68000
Mar 1, 2011
1,931
5,341
Italy
Intel/AMD's advantage is x86 legacy software. They could be the next mainframe. Sales are dependent on x86 software and systems.

Qualcomm's NUVIA will hopefully cater to all price points of Windows 11 on ARM and not just the market that Apple occupies.

Steve Jobs said the same around 2011 about being in the "post-PC era".
Fast forward 12 years and the PC didn't become any less relevant.
Actually, I believe the PC market recovered a lot in recent years.
Bottom line, Apple is the only vendor that can truly afford a complete ARM transition.
They have a userbase accustomed to ditching legacy code and technologies.
PC users will never do that, simply because nobody will force them to, and also because Intel and AMD will always offer "good enough" performance to keep them in their x86 comfort zone.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
Steve Jobs said the same around 2011 about being in the "post-PC era".
Fast forward 12 years and the PC didn't become any less relevant.
Actually, I believe the PC market recovered a lot in recent years.
Bottom line, Apple is the only vendor that can truly afford a complete ARM transition.
They have a userbase accustomed to ditching legacy code and technologies.
PC users will never do that, simply because nobody will force them to, and also because Intel and AMD will always offer "good enough" performance to keep them in their x86 comfort zone.

A decade after he said that:

PCs using Intel & AMD chips (all price points)

- 2021: 322.2 million units
- 2022: 263.7 million units

Vs

Macs, iPads & iPhones using Apple M & A chips ($429 & up)

- 2021: 321.5 million units
- 2022: 316.8 million units

Vs

Android smartphones (all price points)

- 2021: 1.124 billion units
- 2022: 0.979 billion units

Vs

Android/Windows tablets (all price points)

- 2021: 110.5 million units
- 2022: 101 million units

PC sales got a bump due to COVID (2020-2022) forcing remote work & remote learning. Buyers were either upgrading PCs 4-6 years old or older, or had never had one at home. It was a lost opportunity for Apple not to start the transition to Apple Silicon in Nov 2018 rather than Nov 2020. Had they done so, M1 and M2 Macs, rather than Intel/AMD PCs, would have caught that surge in sales.

Qualcomm's NUVIA has the best shot at Windows 11 on ARM. If Microsoft can effectively push fat binaries for future Windows 11 programs, there will be an incentive to switch, especially when the NUVIA team is run by three top ex-Apple Silicon engineers.

x86 isn't designed for performance per watt and is a few process nodes behind, so its sole advantage, as I see it, is its strong association with desktops/laptops & legacy software/hardware.

They will be the next mainframe. Mainframes are still around.

Intel/AMD do not have a 5G/6G modem business unit, unlike Qualcomm, so from a buyer's point of view that is added value right there.
 

gpat

macrumors 68000
Mar 1, 2011
1,931
5,341
Italy
PC sales got a bump due to COVID forcing remote work & remote learning
https://www.gsmarena.com/qualcomm_i...g_chipset_with_oryon_cpu_cores-news-57499.php
Qualcomm's NUVIA has the best shot at Windows 11 on ARM. If Microsoft can effectively push fat binaries for future Windows 11 programs, there will be an incentive to switch, especially when the NUVIA team is run by three top ex-Apple Silicon engineers.

x86 isn't designed for performance per watt and is a few process nodes behind, so its sole advantage that I see is legacy software and legacy hardware. They will be the next mainframe. Mainframes are still around.

Intel/AMD do not have a 5G/6G modem business unit, unlike Qualcomm.

Those sales figures, considered like that, prove nothing about the decline of the PC market.
Microsoft has been fiddling with ARM for decades and I can't even begin to count the projects they started and ended up aborting.
Windows Phone had way more potential than anything about the PC running on ARM and it still failed.
Windows 10X? Aborted.
Windows Andromeda? Aborted, ended up using Android on a MS phone, imagine that.
Windows RT? Aborted.
Windows CE? Surely it's still running somewhere, like the mainframes are, but still aborted.
Surface Pro 9 with the choice between Intel and Microsoft SQ3? You'd have to be drunk to buy the latter.
Microsoft will never have the kind of vertical control on hardware, software, developers and customers that Apple has. That's why Windows on ARM will never be a real market player.
About process nodes, TSMC is the big player, and they manufacture Apple, Qualcomm and AMD chips regardless, so there you have it. No process node handicap.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091
Those sales figures, considered like that, prove nothing about the decline of the PC market.
Microsoft has been fiddling with ARM for decades and I can't even begin to count the projects they started and ended up aborting.
Windows Phone had way more potential than anything about the PC running on ARM and it still failed.
Windows 10X? Aborted.
Windows Andromeda? Aborted, ended up using Android on a MS phone, imagine that.
Windows RT? Aborted.
Windows CE? Surely it's still running somewhere, like the mainframes are, but still aborted.
Surface Pro 9 with the choice between Intel and Microsoft SQ3? You'd have to be drunk to buy the latter.
Microsoft will never have the kind of vertical control on hardware, software, developers and customers that Apple has. That's why Windows on ARM will never be a real market player.
About process nodes, TSMC is the big player, and they manufacture Apple, Qualcomm and AMD chips regardless, so there you have it. No process node handicap.
It wasn't properly supported by either Microsoft or their ARM partners.
 

sam_dean

Suspended
Sep 9, 2022
1,262
1,091

Other evidence/speculation includes:
  • Rumors were that the M2 SoC was ready much earlier but was held back by the late new Air design.
  • Reports and a video file name suggest that the M2 Pro/Max were ready as early as October 2022, which would have made it one year after the M1 Pro/Max.
  • The M2 used A15 cores (not the A16 in the iPhone 14), which suggests Apple wants to take advantage of the iPhone SoC design update each year.
  • Reports are that the M2 will be a short-lived generation because the M3 is expected to launch on time.
  • Many speculators point to iPad SoCs, which were not updated yearly. However, the base M chips go into far more devices than the old iPad SoCs did, which makes it more economical to design and launch new SoCs yearly.
  • The M1 launched in the same quarter as the A14, which suggests Apple develops the base M and the A series in tandem.
  • Apple has been fighting supply chain issues for the last 3 years, which caused many delays in new designs shipping.
Why is this important? Because gaining 10-20% in performance every two years is not impressive. But once a year? Now that's impressive and exciting.

If Apple were to continue with its prevailing product refresh cycle, the 3nm M3 would come out between the Sep 2023 iPhone 15 Pro release and Jan 2024.

Now, what Apple could do is skip the 2022 A16 Bionic chip tech and jump directly to the 2023 A17 Bionic chip tech, so it is easier to align everything in one go.

No need to delay a 2022 Mac Studio M2 Max/Ultra.

Sure, it would make the A16 a "loss" in terms of R&D spend, but that money is already spent, and you'd be further differentiating Apple Silicon from Intel/AMD.
 

gpat

macrumors 68000
Mar 1, 2011
1,931
5,341
Italy
It wasn't properly supported by either Microsoft or their ARM partners.

That's right, it wasn't, because they didn't see any potential for it to become a desirable product.
Last year Microsoft released a low-cost laptop for education to rival the Chromebook, the Surface Laptop SE.
If they cared about ARM in the least, that would've been the perfect testbed for a Qualcomm chip.
What did they put in it? The same crappy Celeron as always.
If MS truly cared about Windows on ARM development, they should start by offering a simple product: official Windows virtualization for Apple Silicon.
Even recycling the old Virtual PC brand (they own it), they could offer a paid downloadable Windows license to virtualize on ASi machines.
Optimized by them, supported by them.
Because I assure you that ASi users wanting a stable and solid way to run Windows make up a far larger market than those who will buy a Surface with an SQ3 or whatever half-baked, overpriced Nuvia product comes out next year.
 

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
Surface Pro 9 with the choice between Intel and Microsoft SQ3? You'd have to be drunk to buy the latter.
Microsoft will never have the kind of vertical control on hardware, software, developers and customers that Apple has. That's

That's not so clear-cut.
They do have vertical integration with their own hardware, e.g., Surface hardware and Xbox.
I'm not sure how good it is compared to Apple's, but their not achieving it (assuming that is the case with the Surface) probably has more to do with their internal culture than anything else.
 

gpat

macrumors 68000
Mar 1, 2011
1,931
5,341
Italy
That's not so clear-cut.
They do have vertical integration with their own hardware, e.g., Surface hardware and Xbox.
I'm not sure how good it is compared to Apple's, but their not achieving it (assuming that is the case with the Surface) probably has more to do with their internal culture than anything else.

That's a long way from doing the same thing on the whole PC platform.
They do have it on the Xbox, but that's another kind of product.
Surface is an attempt to do so, but it's not even a major player in the PC market, nor do they wish it to be.
 