
JouniS

macrumors 6502a
Nov 22, 2020
638
399
I wonder if there is a viable intermediate between the current system of tower PC assembly, and full SoC. Imagine a motherboard with a CPU socket, a GPU socket, and RAM in between. You could get the modularity that PC builders want, while having 'unified memory' and a faster interface than PCIe. It would also do great things for GPU cooling.
Unified memory is a compromise best suited for low-end systems. CPU and GPU tasks have very different requirements for the memory. If you go with a one-size-fits-all approach, you have to choose between low memory bandwidth for the GPU, low memory capacity for the CPU, and/or stupidly expensive memory.

The average PC has four main components: CPU + motherboard, RAM, SSD, and GPU. Technically you can upgrade the CPU without changing the motherboard, but it rarely makes sense. The other upgrades can often be cost-effective ways of extending the lifespan of the system. The most likely step in further integration would be combining the CPU and the chipset, possibly leaving more space on the motherboard for memory/storage extensions.
 

MarkAtl

macrumors 6502
Jul 9, 2019
402
407
that "junk" displaced mainframes and enabled personal computing for the past 30so years ... not saying that there isn't something better and it needs to be replaced, but calling 3 decades of success "junk" doesn't do it justice either ...
In the early days of Moore's Law, PC improvements were incredible. It's just that since 2011 it hasn't translated into performance increases anymore.
 

Fomalhaut

macrumors 68000
Oct 6, 2020
1,993
1,724
There's a reason that Macs are not on the approved vendor list for most companies. E.g. harder to provide IT services for, incompatible with most Enterprise software. It's cute they work for you but you're a mere data point.

Which Enterprise software are you referring to that is incompatible? Examples please!

Let's be honest - a lot of enterprise software is now web-based, and this trend is only increasing, so it doesn't matter what client machine you use. E.g. Salesforce.com, Oracle Apps, SAP, ServiceNow, MS Dynamics, Atlassian, plus nearly everything from Amazon, Google & Microsoft.

I can think of one: MS Visio - only available on Windows. But you have compatible MacOS alternatives like OmniGraffle, or web-based solutions like Lucidchart or Draw.io.

I don't have the data for "most companies", but I can say that in my last 25 years of work for large enterprises (IT-based), including Oracle & Amazon, I have seen a good proportion of Mac computers in use, and have had no restriction on using a Mac for work.

I have worked on-site at a lot of enterprise client offices and Macs are well represented, particularly for IT staff.

Curiously, one major bank I worked with used Macs everywhere - but running Windows! They found the hardware to be more reliable than the alternatives from HP, Dell, Lenovo etc. Maybe they had some Windows-only apps? Not sure; I was there working on Oracle (web-based) applications.

The reason some (most?) enterprises don't approve Macs is mostly down to cost, and partially due to lack of IT support. Possibly the reason I've seen more Macs than you in corporate settings is because I work with IT professionals who are able to configure and manage their own machines without assistance. Corporate IT is mainly just concerned with security, and as long as you conform to the standards (e.g. disk encryption, MFA tokens, VPN access etc.) they don't care what you use.

I'm happy to hear the experience of others who come from different backgrounds; I expect there are scientific, engineering, and creative applications that are not available on MacOS, but we are talking about "most users" - who are doing corporate work involving a mixture of office productivity apps and some task-specific tools - which are increasingly SaaS-based.
 
Last edited:

jz0309

Contributor
Sep 25, 2018
11,387
30,043
SoCal
There's a reason that Macs are not on the approved vendor list for most companies. E.g. harder to provide IT services for, incompatible with most Enterprise software. It's cute they work for you but you're a mere data point.
That might have been true 10 or so years ago, but since almost all IT services as well as enterprise apps are cloud/web based today, your argument doesn't hold anymore.
And as for IT support, IBM proved a few years back that it is actually cheaper to support Macs than Windows PCs.
 

jz0309

Contributor
Sep 25, 2018
11,387
30,043
SoCal
In the early days of Moore's Law, PC improvements were incredible. It's just that since 2011 it hasn't translated into performance increases anymore.
Agree that the last 10 or so years have been disappointing from a CPU performance perspective; more of the gains have come from GPUs and SSDs ...
 

Fomalhaut

macrumors 68000
Oct 6, 2020
1,993
1,724
I've been in this area for more than a decade, and as the industry shifted from bare metal to virtualisation to cloud, the software we used followed suit. Across all of our technical teams, in terms of desktop apps, their jobs can pretty much be done with a web browser, an SSH client, their text editor of choice (local or otherwise), and very occasionally an RDP client - with almost all of them running Mac laptops and a few on Linux.

Everything else is web based, either running on our own infrastructure, AWS, or SaaS: Atlassian's apps, Slack, Zapier, GApps, Zabbix/Nagios/Datadog, PagerDuty, Zendesk, BambooHR, Monday/Asana, Grafana, Xero, Salesforce, Mailchimp, GitHub, Zoom and many more.

The only other really required local apps are Excel for the finance team (Sheets doesn't cut it for them, but for the rest of us it's fine) and Adobe CS for our design teams, and those all run on Macs.

It's so liberating being part of a company not tied to Windows and desktop apps.
Thanks for that! It sounds similar to my working experience, and you named a great collection of enterprise apps that I've heard of but didn't mention.

I don't know which enterprise apps @dogslobber is referring to....maybe he works in an industry sector that has traditionally been dependent on Windows applications?

You are absolutely spot-on about the transition from physical/bare-metal to virtualized to cloud infrastructure, and the subsequent adoption of web-based apps, with occasional use of Citrix or Remote Desktop for some older apps that only run on specific OSes.
 
  • Like
Reactions: Captain Trips

jz0309

Contributor
Sep 25, 2018
11,387
30,043
SoCal
Here is Intel's actual response ... not surprising, what else were they going to say?

Intel EVP on Apple testing new chip design: ‘We feel very good with where we are competitively’​

 

dmccloud

macrumors 68040
Sep 7, 2009
3,142
1,899
Anchorage, AK
Here is Intel's actual response ... not surprising, what else were they going to say?

Intel EVP on Apple testing new chip design: ‘We feel very good with where we are competitively’​


Intel has no clue what is coming from Apple; all they can go by is the M1. The issue is that Intel is in a state of denial, given that the M1 has outperformed even their newest 11th gen mobile parts. When the pro versions of the SoC come out, Intel may find themselves in a dogfight, especially with AMD continuing to gain ground (and market share).
 

neinjohn

macrumors regular
Nov 9, 2020
107
70
Here is Intel's actual response ... not surprising, what else were they going to say?

Intel EVP on Apple testing new chip design: ‘We feel very good with where we are competitively’​

"One of the things that makes us unique, Julie, is we have engineers all around the world who sit with our customers to go create these fantastic new platforms. We launched a new Evo-based platform at the end of this year, and we're really excited about the experiences that brings and the prospects that brings for us and our customers where we've gone out and done deep co-engineering and verification of those experiences with our customers, really for the first time at scale all around the world."

From Ultrabook Review

"So, in order to get the Evo badging, laptops:
  • must run on Intel 11th gen Tiger Lake Core i5/i7 processors with Iris Xe graphics (or later), with 8+ GB of RAM and 256+ GB of SSD storage;
  • must provide consistent responsiveness on battery;
  • must instantly wake from sleep (in less than 1 second);
  • must provide 9 or more hours of real-world battery life (on laptops with a FHD display) and must be able to charge quickly over USB-C (4+ hours of battery life in under 30 minutes of charging);
  • must include modern connectivity options: WiFi 6 (Gig+), USB-C with Thunderbolt 4, optional LTE;
  • must include biometrics (IR cameras or finger sensor), Precision touchpads, backlit keyboards, 3-side narrow bezels around the display, good speakers, and a few other aspects inherited from the original Project Athena fact sheet."
Basically: "we'll send you engineers to help build you a laptop (lowering costs) that copies most of the Apple MacBook's key points, and maybe we'll sell you our parts (CPU, GPU, WiFi, Thunderbolt, etc.) at a discount." What's interesting is that this project, which started with Athena last year, predates the M1's launch by two or more years; given Apple's reportedly very demanding relationship with Intel when developing products for its own objectives, Intel probably just copied the "suggestions sheet" and started working on it once the separation was a done deal.

I wonder what AMD thinks about this. They can't replicate it with their over-extended, meagre resources. Perhaps they'll just focus on riding Renoir's success to get OEMs to build more low-MSRP laptops with their 5000 series, win the higher-performance laptops on core count, and maybe convince at least a few to use their mobile 6000-series graphics, as the performance per watt is better than NVidia's (on desktop at least).

Funny enough, since Apple can match AMD's core count and will probably beat it on all other points, the top laptops with a Ryzen 9 and discrete graphics may need a boost from NVidia's CUDA/DLSS/RTX software and compatibility to keep up with a Pro 16" with 32 GPU cores, for example.
 

Tech198

Cancelled
Mar 21, 2011
15,915
2,151
Just wondering.
Generally, in any technology, something is created which appears better in some way than what's currently available.
This has been going on since man invented the wheel/rollers.
Someone else saw this and copied it, hopefully over time improving it.
We as a species continued to do this till we found ourselves where we are today.

So Apple launches the M1 which has benefits in real world usage for a specific type of machine.
With the promise that more powerful machines will follow.

The Windows/Linux (PC) world will of course see this, and if the promised improvements at the higher end do materialize, will of course also change to gain similar benefits.
It would be crazy, would it not, to assume the PC world will simply ignore ARM and the M1 if it does deliver what people are expecting.

So, what do you think will happen over, say, the next decade?
I know AMD are reported to be working on something a little like the M1 in some ways.

I would guess, at some point Intel will do something, albeit very late.

What are your thoughts?

You're comparing this to the wheel, man? OK... going back a bit far.
Anything better will always trickle down to others sooner or later... most likely later.

Intel doing an M1, and hinting at possibly choosing between "M1 PCs" or "x86 PCs" based on which would sell better?

The majority of businesses will struggle until apps get updated natively... and even if everything could run on WOW64 (Microsoft's equivalent of Apple's Rosetta), I don't think MS is in the same performance ballpark...
 

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
Intel has no clue what is coming from Apple; all they can go by is the M1. The issue is that Intel is in a state of denial, given that the M1 has outperformed even their newest 11th gen mobile parts. When the pro versions of the SoC come out, Intel may find themselves in a dogfight, especially with AMD continuing to gain ground (and market share).

Expect a traditional Intel marketing campaign where they cherry-pick particular workflows where they have an advantage. And there is little doubt that they can find those advantages. For example, testing shows that the Apple M1 does not match the performance of the latest Intel CPUs when it comes to Single Instruction Multiple Data (SIMD) instructions*. The likely reason for this is that Apple would rather spend the transistor budget on hardware to support particular use cases like encryption, video encode and decode, and now machine learning. All of those can also be done, less efficiently, with SIMD. Overall, it is unlikely that Apple will have worse general-use performance, but expect the tame Wintel press to go along and crow about any benchmark that shows an Intel advantage.

*Arm MacBook vs Intel MacBook SIMD Benchmark
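For anyone unfamiliar with what a SIMD-friendly workload actually looks like, here's a minimal sketch of my own (illustrative only, not taken from the linked benchmark): the same scale-and-add loop written scalar and then with x86 AVX intrinsics, where the vector version processes eight floats per instruction. Apple silicon would use 128-bit NEON instead, and the M1's dedicated encode/encrypt/ML blocks are exactly the kind of fixed-function hardware that can replace loops like this.

```c
// Minimal illustrative sketch of a SIMD-friendly loop (x86 AVX shown;
// Apple silicon would use 128-bit NEON instead). Compile with e.g. -mavx.
#include <immintrin.h>

// Scalar version: one multiply-add per loop iteration.
void scale_add_scalar(float *dst, const float *src, float k, int n) {
    for (int i = 0; i < n; i++)
        dst[i] += k * src[i];
}

// AVX version: eight single-precision multiply-adds per iteration.
void scale_add_avx(float *dst, const float *src, float k, int n) {
    __m256 vk = _mm256_set1_ps(k);           // broadcast k to all 8 lanes
    int i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 d = _mm256_loadu_ps(dst + i);  // load 8 floats (unaligned OK)
        __m256 s = _mm256_loadu_ps(src + i);
        d = _mm256_add_ps(d, _mm256_mul_ps(s, vk));
        _mm256_storeu_ps(dst + i, d);         // store 8 results at once
    }
    for (; i < n; i++)                        // scalar tail for the remainder
        dst[i] += k * src[i];
}
```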

"One of the things that makes us unique, Julie, is we have engineers all around the world who sit with our customers to go create these fantastic new platforms. We launched a new Evo-based platform at the end of this year, and we're really excited about the experiences that brings and the prospects that brings for us and our customers where we've gone out and done deep co-engineering and verification of those experiences with our customers, really for the first time at scale all around the world."

From Ultrabook Review

"So, in order to get the Evo badging, laptops:
  • must run on Intel 11th gen Tiger Lake Core i5/i7 processors with Iris Xe graphics (or later), with 8+ GB of RAM and 256+ GB of SSD storage;
  • must provide consistent responsiveness on battery;
  • must instantly wake from sleep (in less than 1 second);
  • must provide 9 or more hours of real-world battery life (on laptops with a FHD display) and must be able to charge quickly over USB-C (4+ hours of battery life in under 30 minutes of charging);
  • must include modern connectivity options: WiFi 6 (Gig+), USB-C with Thunderbolt 4, optional LTE;
  • must include biometrics (IR cameras or finger sensor), Precision touchpads, backlit keyboards, 3-side narrow bezels around the display, good speakers, and a few other aspects inherited from the original Project Athena fact sheet."
Basically: "we'll send you engineers to help build you a laptop (lowering costs) that copies most of the Apple MacBook's key points, and maybe we'll sell you our parts (CPU, GPU, WiFi, Thunderbolt, etc.) at a discount." What's interesting is that this project, which started with Athena last year, predates the M1's launch by two or more years; given Apple's reportedly very demanding relationship with Intel when developing products for its own objectives, Intel probably just copied the "suggestions sheet" and started working on it once the separation was a done deal.

I wonder what AMD thinks about this. They can't replicate it with their over-extended, meagre resources. Perhaps they'll just focus on riding Renoir's success to get OEMs to build more low-MSRP laptops with their 5000 series, win the higher-performance laptops on core count, and maybe convince at least a few to use their mobile 6000-series graphics, as the performance per watt is better than NVidia's (on desktop at least).
I'm actually happy that Intel is promoting this because I very much want Thunderbolt to become mainstream unlike what happened to Firewire. The advantages of Thunderbolt are very clear and it will benefit the whole computer industry to move to Thunderbolt.

It also might force AMD to support Thunderbolt which again, I think is a very good thing. AMD competes very well on price/performance against Intel but they don't do as well with how they use I/O. Everything on Intel's list for Evo is good for the industry and will also make Apple better with competition.
 
  • Like
Reactions: Captain Trips

dmccloud

macrumors 68040
Sep 7, 2009
3,142
1,899
Anchorage, AK
Expect a traditional Intel marketing campaign where they cherry-pick particular workflows where they have an advantage. And there is little doubt that they can find those advantages. For example, testing shows that the Apple M1 does not match the performance of the latest Intel CPUs when it comes to Single Instruction Multiple Data (SIMD) instructions*. The likely reason for this is that Apple would rather spend the transistor budget on hardware to support particular use cases like encryption, video encode and decode, and now machine learning. All of those can also be done, less efficiently, with SIMD. Overall, it is unlikely that Apple will have worse general-use performance, but expect the tame Wintel press to go along and crow about any benchmark that shows an Intel advantage.

*Arm MacBook vs Intel MacBook SIMD Benchmark


I'm actually happy that Intel is promoting this because I very much want Thunderbolt to become mainstream unlike what happened to Firewire. The advantages of Thunderbolt are very clear and it will benefit the whole computer industry to move to Thunderbolt.

It also might force AMD to support Thunderbolt which again, I think is a very good thing. AMD competes very well on price/performance against Intel but they don't do as well with how they use I/O. Everything on Intel's list for Evo is good for the industry and will also make Apple better with competition.

If you saw their "event" announcing the 11th gen parts, they cherry-picked the hell out of their comparisons. The only GPUs they compared their Xe graphics to were AMD's Vega 6 (which is at least 4 years old) and nVidia's MX250 (at least 3 years old). They completely ignored all newer AMD iGPUs and the nVidia MX350 (which I believe either ties or pulls ahead of Xe). They also spent at least five minutes (felt like 20) complaining about how all benchmarks were poor indicators of performance while substituting their own test suite for said benchmarks. The one smart move Intel has made in the last 3-4 years is to make Thunderbolt 3 available to everyone royalty-free, because that will be what makes TB ubiquitous in the PC/Mac industry. Firewire was dead in the water because of Sony's refusal to pay for FW licensing and its creation of its own "IEEE 1394" connector that did the same thing.

What I have found interesting about many of the reviews of the M1 Macs is that the only real "complaints" are that it only has 2 ports (just like the models it replaced, so no change there) and not all Intel applications run on the processor. That's like Car & Driver complaining that a new Tesla doesn't use unleaded gas. Intel will cling to application compatibility and their SIMD gains because that's about the only thing they have going for them given the comparisons between the M1 and 11th gen Intel chips (which only replaced the low-end Y and U Series mobile CPUs). They are also still downplaying the ramifications of delaying the full rollout of 10nm to 2022 by bragging about all of the superficial junk that is honestly irrelevant to the big picture. That 11th gen event from Intel was Steve Ballmer levels of exaggeration and hyperbole, and I just wanted to yell "shut up!" at them during the entire event.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Apple can do something that the rest of the PC market cannot: optimize hardware and software together. Windows needs to be open enough to work on millions or billions of hardware configurations. This alone makes an M1-style processor impossible. For example, do you need a system to run dozens of virtual machines? You probably want more cores rather than higher single-threaded performance. Do you want a system to play games? You probably want to sacrifice CPU power for GPU power (an i5 with an RTX 3080 would beat out an i9 with an RTX 2080). Do you need a system for professional 3D work? Then you will need a different kind of graphics card that is better for work but not as good for games (Quadro cards).
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
You know what I'd love to see, simply out of curiosity.
Windows 10 on the brand new Xbox Series X console, which is basically a VERY graphically powerful PC with an SoC made by AMD powering it.
100% sure it could be done if Msoft wanted to, but I'm sure they don't, simply because Msoft doesn't want to make other PC builders angry.
Not everybody wants a PC for playing games. And at that point, we would be back to where we are now: an SoC for games, an SoC for general compute tasks, an SoC for professional workloads, an SoC for servers that need to run dozens of virtual machines (128+ cores).
 
  • Like
Reactions: Captain Trips

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
The PC community response? That falls into two categories:
1. The majority saying "wow, that's cool!"
2. The few in denial. (Check comment sections on M1 articles for some delicious salt)

The PC industry's response will likely be to dissect Apple's M-series and copy what they can. Until that point they're gonna play up whatever strengths they have and downplay the M-series.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
In the early days of Moore's Law, PC improvements were incredible. It's just that since 2011 it hasn't translated into performance increases anymore.
True. The fact that my 2010 Mac Pro 6-core system was just as powerful as my 2019 i9 iMac when exporting videos (a difference of ~30 seconds on 15+ minute projects) is proof that things slowed down to an absolute crawl. Only SSDs have majorly changed the game since 2010.
 
  • Like
Reactions: Captain Trips

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
That might have been true 10 or so years ago, but since almost all IT services as well as enterprise apps are cloud/web based today, your argument doesn't hold anymore.
And as for IT support, IBM proved a few years back that it is actually cheaper to support Macs than Windows PCs.
Visual Studio for Windows is still light-years ahead of Visual Studio for Mac. This and gaming are the only reasons I use Windows on a second computer. Everything else I do on my Mac.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
True. The fact that my 2010 Mac Pro 6-core system was just as powerful as my 2019 i9 iMac when exporting videos (a difference of ~30 seconds on 15+ minute projects) is proof that things slowed down to an absolute crawl. Only SSDs have majorly changed the game since 2010.
GPU performance, the number of CPU cores, and the amount of RAM all increased by (roughly) an order of magnitude from 2010 to 2020. We didn't see most of it on our desk, because the demand for high-performance desktops and laptops collapsed. Once computers were fast enough for most purposes, people started demanding the same performance in a smaller device with a longer battery life for a lower price.

Back in 2010, I was using a brand new HPC cluster, where each node had 8 CPU cores and 32 GB memory. Today the average cost-effective server I'm using has 32-48 CPU cores and 384-768 GB memory. The difference is quite significant.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,142
1,899
Anchorage, AK
You know what I'd love to see, simply out of curiosity.
Windows 10 on the brand new Xbox Series X console, which is basically a VERY graphically powerful PC with an SoC made by AMD powering it.
100% sure it could be done if Msoft wanted to, but I'm sure they don't, simply because Msoft doesn't want to make other PC builders angry.

Windows 10 would not work as well on a Series X as people think. While you are correct that Microsoft is using an AMD-designed SoC, it was designed specifically for that console and its operating system (much as Apple has designed the M1 and Big Sur to work together). The issue would be replacing the XBox operating system (which is based on Windows 10) with the full-featured, general purpose Windows 10 that is used on desktops and notebooks virtually everywhere. The XBox OS is closer to Windows 10 S Mode than either Windows 10 Home or Professional. Microsoft can put Windows on any device they want, and the PC manufacturers have no say in the matter. That's why Microsoft jumped into the PC manufacturing side with their Surface lineup.
 
  • Like
Reactions: Piggie

ian87w

macrumors G3
Feb 22, 2020
8,704
12,638
Indonesia
Which Enterprise software are you referring to that is incompatible? Examples please!

Let's be honest - a lot of enterprise software is now web-based, and this trend is only increasing, so it doesn't matter what client machine you use. E.g. Salesforce.com, Oracle Apps, SAP, ServiceNow, MS Dynamics, Atlassian, plus nearly everything from Amazon, Google & Microsoft.

I can think of one: MS Visio - only available on Windows. But you have compatible MacOS alternatives like OmniGraffle, or web-based solutions like Lucidchart or Draw.io.

I don't have the data for "most companies", but I can say that in my last 25 years of work for large enterprises (IT-based), including Oracle & Amazon, I have seen a good proportion of Mac computers in use, and have had no restriction on using a Mac for work.

I have worked on-site at a lot of enterprise client offices and Macs are well represented, particularly for IT staff.

Curiously, one major bank I worked with used Macs everywhere - but running Windows! They found the hardware to be more reliable than the alternatives from HP, Dell, Lenovo etc. Maybe they had some Windows-only apps? Not sure; I was there working on Oracle (web-based) applications.

The reason some (most?) enterprises don't approve Macs is mostly down to cost, and partially due to lack of IT support. Possibly the reason I've seen more Macs than you in corporate settings is because I work with IT professionals who are able to configure and manage their own machines without assistance. Corporate IT is mainly just concerned with security, and as long as you conform to the standards (e.g. disk encryption, MFA tokens, VPN access etc.) they don't care what you use.

I'm happy to hear the experience of others who come from different backgrounds; I expect there are scientific, engineering, and creative applications that are not available on MacOS, but we are talking about "most users" - who are doing corporate work involving a mixture of office productivity apps and some task-specific tools - which are increasingly SaaS-based.
I agree that the majority of general enterprise software can be transitioned to the cloud, assuming the company is willing to do it. When you boil it down, it's all about upfront cost, which Windows laptops will win every time. It's hard to convince executives about long-term savings when they only care about cutting costs to show good numbers the next quarter.

Having said that, depending on the company, some can still carry a ton of legacy stuff. I've seen companies where the web portal for the company's own email still has a hard requirement for Internet Explorer (yes, IE). Windows legacy software is still thriving in emerging markets, since most cloud-based services come from the US or western European countries. For example, in my country many popular non-cloud enterprise applications (e.g. accounting, CRM, software from the government, etc.) are Windows-only due to the market share.

But I agree that, in general, with the transition to cloud-based solutions (helped by the pandemic), the local OS used is less and less relevant.
 
  • Like
Reactions: Captain Trips

MarkAtl

macrumors 6502
Jul 9, 2019
402
407
GPU performance, the number of CPU cores, and the amount of RAM all increased by (roughly) an order of magnitude from 2010 to 2020. We didn't see most of it on our desk, because the demand for high-performance desktops and laptops collapsed. Once computers were fast enough for most purposes, people started demanding the same performance in a smaller device with a longer battery life for a lower price.

Back in 2010, I was using a brand new HPC cluster, where each node had 8 CPU cores and 32 GB memory. Today the average cost-effective server I'm using has 32-48 CPU cores and 384-768 GB memory. The difference is quite significant.
High end server memory might have increased by an order of magnitude but not at the lower levels. A Dell T40 ships with 8GB by default. A T110 from 2010 maxed out at 16GB.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
GPU performance, the number of CPU cores, and the amount of RAM all increased by (roughly) an order of magnitude from 2010 to 2020. We didn't see most of it on our desk, because the demand for high-performance desktops and laptops collapsed. Once computers were fast enough for most purposes, people started demanding the same performance in a smaller device with a longer battery life for a lower price.

Back in 2010, I was using a brand new HPC cluster, where each node had 8 CPU cores and 32 GB memory. Today the average cost-effective server I'm using has 32-48 CPU cores and 384-768 GB memory. The difference is quite significant.
That doesn't affect my work at all. I got burned hard by buying into the "more memory = better performance" idea when I only work on 1080p video editing. 128GB of RAM is not better for my work than 8GB - only a few seconds' difference, and not worth the $500 RAM pricing. Other than games, GPU performance really hasn't changed much since my GTX 1080.

It's 2020 (well, my iMac is 2019 tech). Computers should scream when working on 1080p video editing, but it's not that much better compared to my 2010 Mac Pro. And my iMac's fans are maxed out. It's ridiculous!
 

MarkAtl

macrumors 6502
Jul 9, 2019
402
407
Moore’s Law has never been about performance but transistor density.
You're of course correct. However, there has usually been a correlation between the increase in transistor count and higher performance. Moore's 5th paradigm:

MooresLaw.jpg


Compare this with a 9900K benchmarked against a 2600K from 2011:

Roughly a 2x increase in performance over 7 years.
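As a back-of-the-envelope check on that gap (my own arithmetic, assuming the idealized two-year doubling cadence and the 2011/2018 launch dates of the 2600K and 9900K):

```latex
% Rough sketch: idealized Moore's-Law cadence vs. the observed result.
% Seven years at a doubling every ~2 years would predict about
% 2^{7/2} ~ 11.3x scaling, versus the ~2x benchmark gain actually measured.
\[
  2^{7/2} \approx 11.3\times \quad \text{(idealized prediction)}
  \qquad \text{vs.} \qquad
  \approx 2\times \quad \text{(measured)}
\]
```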
 
  • Like
Reactions: Captain Trips

JouniS

macrumors 6502a
Nov 22, 2020
638
399
High end server memory might have increased by an order of magnitude but not at the lower levels. A Dell T40 ships with 8GB by default. A T110 from 2010 maxed out at 16GB.
The low end of servers continues below the low end of desktops and laptops, while the high end is something like 100-200 cores and up to 12 TB memory. My numbers were for typical servers that provide cost-effective performance for general workloads.
 