
spiderman0616

Suspended
Aug 1, 2010
5,670
7,499
And even then, Windows 10 has massive issues with interface consistency, both cosmetic and functional. E.g., system tools and configuration:

1. Administrative Tools: These look and function almost like something from the Windows NT days.

[Image: Administrative Tools folder in Windows 8]


2. Control Panel: Other than icon updates and the optional grouping feature, this has been the same since Windows 9x and is the primary place for system configuration.

[Image: Windows Control Panel]


3. Settings App: Introduced with Windows 8, it is still not feature-complete compared to Control Panel, has a totally different look and feel, and for some reason Microsoft has not switched over to it despite having had more than eight years to do so.

[Image: Settings app in Windows 10]


I mean it's just insane that something so important has not been unified.
Yep. Dig deep enough into any Windows build and you'll eventually see UI elements going all the way back to Windows 2000, sometimes even older. Even the "New Outlook" setting for macOS is like this. Everything looks "new" until you do something like break a new draft out into its own window; then you're immediately back in a UI that looks like Office XP.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,142
1,899
Anchorage, AK
This, 100%. I'm also old enough to have been around then and Microsoft - and Windows, in particular - was absolutely despised. I once witnessed an employee throw a monitor down a spiral staircase in frustration, only having to go back to grab the CPU base unit when he realised he hadn't destroyed the actual PC.

For a certain generation, Windows still represents the absolute pits when it comes to computing. It's no wonder they switched to Macs and iPhones and won't touch PCs still, no matter how much better Windows 10 (or even 7 at the time) is. For a lot of people, it will always represent ugly beige boxes, incomprehensible error messages and BSODs. In human terms, time in their lives they'll never get back.


True, but I think Apple really missed a trick when Windows 8 was released. That was a godawful mess of an OS with the most abysmal mish-mash of a UX in history. I don't know anyone who didn't hate it. I would shake my head in disbelief at it on a daily basis (the 'Charms' bar on a desktop OS - what a debacle). But Apple was too distracted to care - they released the trashcan Mac Pro and the lousy 2014 Mac Mini in subsequent years and wasted a golden opportunity, seemingly losing interest in the Mac itself at the time when they could have hoovered up market share from Windows users who had finally had enough.

I know a LOT of people who switched to the Mac when Windows 8 was released, mainly because 8 looked like a damn toy. In essence, Apple didn't need to pull any tricks out of their hat, because Microsoft unwittingly set up flashing billboards pointing to the Mac as the one remaining relatable UI.
 

hugodrax

macrumors 65816
Jul 15, 2007
1,225
640
Yep. Dig deep enough into any Windows build and you'll eventually see UI elements going all the way back to Windows 2000, sometimes even older. Even the "New Outlook" setting for macOS is like this. Everything looks "new" until you do something like break a new draft out into its own window; then you're immediately back in a UI that looks like Office XP.

This is because, unlike OS X, Windows is not based on a common object-oriented API. The Win32 API is a huge mess: tons and tons of duplicate items, none of them consistent, and each breaking in different ways. Even the teams within Windows duplicate effort, with each having its own group working on UI elements, libraries, etc.

All the spaghetti code, layering new **** on top of a creaky old foundation, leads to what you experience. What happened with Windows NT is that they created a new kernel (Dave Cutler's work) but then threw all the garbage from the past on top of it. What a waste of an opportunity.
 

Serban55

Suspended
Oct 18, 2020
2,153
4,344
I agree with this. The energy cost savings moving towards ARM based data centres are going to be huge. As for laptops and tablets, good enough will be good enough. There’s no short term need to beat Apple’s MX performance. There is an urgent need for something a lot better than current efforts in the Windows/ARM space.
That's already old news... all the servers in the UK will be ARM-based by the end of 2030, and the prediction is that millions of pounds will be saved just from that, because of the cooling savings and reduced building floor space.
Japan has also already started.
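The cooling-savings argument can be sketched with a back-of-envelope calculation. Every number below is a made-up illustrative assumption (fleet size, per-server wattage, PUE, electricity price), not a figure from any real deployment:

```python
# Illustrative only: hypothetical numbers showing how lower-power ARM
# servers could cut a data centre's electricity bill. None of these
# figures describe a real facility.

HOURS_PER_YEAR = 24 * 365  # 8760

def annual_energy_cost(servers, watts_per_server, pue, price_per_kwh):
    """Yearly electricity cost: IT load in kWh, scaled by PUE
    (Power Usage Effectiveness, i.e. cooling/facility overhead)."""
    kwh = servers * watts_per_server * HOURS_PER_YEAR / 1000 * pue
    return kwh * price_per_kwh

# Hypothetical 10,000-server fleet at £0.20/kWh:
x86 = annual_energy_cost(10_000, watts_per_server=350, pue=1.5, price_per_kwh=0.20)
arm = annual_energy_cost(10_000, watts_per_server=200, pue=1.3, price_per_kwh=0.20)

print(f"hypothetical saving: £{x86 - arm:,.0f} per year")
```

The point is structural: lower chip power shrinks both the IT load and the cooling overhead captured by the PUE term, so the savings compound into the "millions of pounds" range even at modest fleet sizes.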
 

skaertus

macrumors 601
Feb 23, 2009
4,250
1,401
Brazil
I can't see anyone at Dell worried about moving the XPS to a non-Intel processor over the success of the M1 processor. They seem to be fine.
Well, the XPS line may see some impact here. The Dell XPS and the MacBook line are competitors, even though many people do not acknowledge it. They are both premium light laptops, and, for many people, the operating system is secondary.
 

FlyingTexan

macrumors 6502a
Jul 13, 2015
941
783
You’re of course correct. However, there has usually been a correlation between an increase in transistors and higher performance. Moore’s fifth paradigm:

[Image: Moore's law chart]


compare this with a 9900K benchmarking against a 2600K from 2011:


Roughly 2x increase in performance over 7 years.
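As a sanity check, "roughly 2x over 7 years" works out to a compound annual growth rate of about 10%, versus the roughly 41% per year implied by a classic doubling every two years. A quick sketch:

```python
# Back-of-envelope: annualized performance growth implied by
# "roughly 2x in 7 years" vs. a two-year doubling cadence.

def annual_growth(total_factor: float, years: float) -> float:
    """Compound annual growth rate implied by total_factor over years."""
    return total_factor ** (1 / years) - 1

observed = annual_growth(2.0, 7)    # 9900K vs 2600K, per the figure above
doubling = annual_growth(2.0, 2)    # Moore's-law-style two-year doubling

print(f"observed: {observed:.1%} per year")   # ~10.4%
print(f"doubling: {doubling:.1%} per year")   # ~41.4%
```

The gap between those two rates is exactly the point being argued: desktop CPU performance in that era compounded at a fraction of the historical pace.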
Where are the last 20 years? Moore's law doesn't apply anymore.
 

pshufd

macrumors G4
Oct 24, 2013
10,149
14,574
New Hampshire
Well, the XPS line may see some impact here. The Dell XPS and the MacBook line are competitors, even though many people do not acknowledge it. They are both premium light laptops, and, for many people, the operating system is secondary.
The XPS line has desktops too. I had a look at the current models and passed because the case size had been decreased and there's a swing-out hinge for a bunch of the parts that I thought would eventually fail. So I built a PC from parts. XPS is a premium line and the XPS 13 is probably a direct competitor for the MacBook Pro/M1 - with Intel's best laptop chip too.
 

Fomalhaut

macrumors 68000
Oct 6, 2020
1,993
1,724
My guess is that Apple will load up their own datacenters with Apple Silicon powered servers and save a ton on electricity costs. They won't sell their silicon to third parties and it will end up being a competitive advantage.
I would be surprised if Apple goes back into building servers, even for their own internal use. Apple makes extensive use of external cloud services, even to provide their own "iCloud" services:


It's true that they have reduced their use of AWS recently, and are using their own data centres more, but I expect it would be a massive undertaking to replace the x86 Linux servers with M1-based machines running virtualized ARM-Linux.

It's conceivable that they might have some specialized services running home-grown Apple Silicon servers...but I think it more likely they would use commodity servers, maybe with more emphasis on ARM machines, as these become more prevalent in the industry.

I would love to be proven wrong...but I just don't think Apple wants to go back into enterprise server manufacturing or to become a generic Cloud Platform infrastructure provider competing with Amazon, Google & Microsoft.
 

pshufd

macrumors G4
Oct 24, 2013
10,149
14,574
New Hampshire
I would be surprised if Apple goes back into building servers, even for their own internal use. Apple makes extensive use of external cloud services, even to provide their own "iCloud" services:


It's true that they have reduced their use of AWS recently, and are using their own data centres more, but I expect it would be a massive undertaking to replace the x86 Linux servers with M1-based machines running virtualized ARM-Linux.

It's conceivable that they might have some specialized services running home-grown Apple Silicon servers...but I think it more likely they would use commodity servers, maybe with more emphasis on ARM machines, as these become more prevalent in the industry.

I would love to be proven wrong...but I just don't think Apple wants to go back into enterprise server manufacturing or to become a generic Cloud Platform infrastructure provider competing with Amazon, Google & Microsoft.

Maybe sell servers to AWS exclusively for use by iCloud. Amazon would have to do the work to get it all to run. I have no doubt that they could.
 

Argon_

macrumors 6502
Nov 18, 2020
425
256
It's true that they have reduced their use of AWS recently, and are using their own data centres more, but I expect it would be a massive undertaking to replace the x86 Linux servers with M1-based machines running virtualized ARM-Linux.

Native ARM Linux already exists. Of all the companies, Apple certainly has the in-house development might to refine it for their servers.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Of all the companies, Apple certainly has the in-house development might to refine it for their servers.
I would think it would not take much for Apple to write optimised Linux drivers for their Mx Macs, though I think enterprise storage and backup solutions might present a problem for them. I would think deploying Linux would be easier from a management point of view compared to macOS, due to the tooling and solutions already available for Linux.
 

Argon_

macrumors 6502
Nov 18, 2020
425
256
I would think it would not take much for Apple to write optimised Linux drivers for their Mx Macs, though I think enterprise storage and backup solutions might present a problem for them. I would think deploying Linux would be easier from a management point of view compared to macOS, due to the tooling and solutions already available for Linux.
Not for their Mx macs, but for datacenters running Mx processors to economize power.
 

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
Laptops and desktops are fundamentally the same, because they run a general-purpose operating system such as macOS and Windows. Pure consumer devices tend to use more limited operating systems such as iPadOS, Android, and Chrome OS.

"Niche" is an interesting word. Most people are outside any particular niche, but everyone belongs to at least a few niche groups. Even in computing context, I would expect that 30-50% of people still using general-purpose computers are in at least one niche user group.


The traditional GPU already disappeared a long time ago. Today's GPUs are computers with a high level of data parallelism. Their use is still growing, as people are figuring out how to use them effectively in a wide range of tasks. Common tasks such as machine learning can be pushed to special-purpose hardware (such as Apple's Neural Engine, Nvidia's Tensor Cores, or Google's Tensor Processing Units), but again 30-50% of GPGPU users are probably using them for niche tasks.

Computational science is normally done on remote hardware, but the people developing the methods and tools often have local hardware for abnormal situations. For example, I have 128 GB memory in my iMac, because it's convenient to be able to investigate problems in a local virtual machine. People working on numerical methods may have high-end consumer GPUs for similar reasons.
General purpose according to what definition? For the vast majority, the capabilities of iOS and Android are sufficient, as they cover all these users' needs. What is lacking on these OS platforms is competent software. iPads stopped being "toys" or consumable devices long ago.

I omitted laptops and minis on purpose because they come preconfigured from the factory with no internal upgrades, and hence are difficult to configure for a given situation.

GPU/GPGPU: I know, but people should be more precise in their wording. In the case of the M1 it is especially important, as it relies heavily on compute coprocessors. This difference is the core of the whole thread.

In the old days, you would probably be doing your work on minicomputers: Sun SPARC, SGI, etc. Really expensive niche computers. I see the traditional PC/Mac tower being pushed into that category. Can a niche computer be a general-purpose computer, or is this a contradiction in terms?

Computer scientists: What budgets do you have for doing research (apart from salary)? My PhD students typically have about 60000 USD running costs over three years. Why not just ask for the money you need to do your research? It is peanuts compared to salary and overhead.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
General purpose according to what definition? For the vast majority, the capabilities of iOS and Android are sufficient, as they cover all these users' needs. What is lacking on these OS platforms is competent software. iPads stopped being "toys" or consumable devices long ago.
From my point of view, a general purpose OS exposes the computer as a general purpose tool. Something that can (in principle) do anything any other computer can. If we start talking about user needs, we are no longer talking about general purpose computing.

Computer scientists: What budgets do you have for doing research (apart from salary)? My PhD students typically have about 60000 USD running costs over three years. Why not just ask for the money you need to do your research? It is peanuts compared to salary and overhead.
My expenses as a staff scientist are maybe half of that. The issue is more about university/funder policies than money. Just because you have the money doesn't mean you can spend it the way you would prefer.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Though both Dell and HP offer several Ryzen models in their ranges classed as "business".
I've checked this out myself actually, only HP offers Ryzen in their "higher-end" lineup. If you want an XPS, you're getting Intel. Or a dGPU in fact.

I would say that for every time companies have managed to reach a consensus on a standardised format, there are a much greater number of instances where consortiums of companies have released competing technologies with one (not necessarily the technologically superior solution) withering and dying, usually at the expense of early adopter consumers. It's almost always messy.
I never said it wouldn't be messy.
Because of terrible leadership. Steve Ballmer was desperate to halt the decline of Windows PCs in the iPad age; Microsoft had completely missed the boat on mobile and envisioned a touch interface and new form factors as the savior of Windows.

He went "all-in", rushing 8 out of the door when they should have stepped back to figure out how this would work on non-touch environments. It was blind panic, a fever dream of an OS in a time when Android and iOS were swallowing up market share for casual computing. The PC in every home became a smartphone in every pocket.

The whole disaster led to the Satya Nadella era and Microsoft's reimagining as a cloud and Enterprise company, and Ballmer's legacy will forever be viewed with disdain.
Ballmer was 100% a clown, basically anything Satya Nadella does is going to be good in comparison.
 

Fomalhaut

macrumors 68000
Oct 6, 2020
1,993
1,724
Not for their Mx macs, but for datacenters running Mx processors to economize power.
This is more likely. My earlier post about using virtualized ARM-Linux was referring to the current limits on dual-booting on M1 Macs.

It would be interesting to see Apple use custom Mx-based servers in their data centres, but I'm not sure that they have the appetite to get back into this kind of hardware unless there is a compelling technical or financial reason for it. It's probably cheaper for them to use commodity servers (maybe ARM-based) or 3rd party cloud offerings like the AWS Graviton 2 instances.
 

Fomalhaut

macrumors 68000
Oct 6, 2020
1,993
1,724
Maybe sell servers to AWS exclusively for use by iCloud. Amazon would have to do the work to get it all to run. I have no doubt that they could.

AWS has just started offering EC2 macOS instances, powered by the 6-core i7 Mac Mini, and will be offering M1 Macs next year. However, these are likely to be limited to macOS rather than providing a generic Linux platform. Apple would need to decide to support multiple-OS booting on Apple Silicon. It's possible, of course!
 

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
From my point of view, a general purpose OS exposes the computer as a general purpose tool. Something that can (in principle) do anything any other computer can. If we start talking about user needs, we are no longer talking about general purpose computing.


My expenses as a staff scientist are maybe half of that. The issue is more about university/funder policies than money. Just because you have the money doesn't mean you can spend it the way you would prefer.
The first point is really a grey scale.

Second point: I suspected this; the theorists at my department always seem underfinanced compared to cell and molecular biologists. Funders, like management, seem to have difficulty distinguishing between cost and value when it comes to computers. Applying their own (office) needs to the general-purpose computing needs of scientific personnel is narrow-minded and makes no economic sense whatsoever. I even heard management say "...but we support Microsoft" when Apple computers were mentioned.

By contrast, funders in biology and medicine are accustomed to very high running costs. Interesting difference...
 

SlCKB0Y

macrumors 68040
Feb 25, 2012
3,431
557
Sydney, Australia
but I expect it would be a massive undertaking to replace the x86 Linux servers with M1-based machines running virtualized ARM-Linux
I imagine if they ever did move to ARM in the data centre they would use Linux/KVM as the base hypervisor. If they wanted to be purists about it they could use Darwin/Bhyve. Using a full-blown version of macOS (as they have previously used for a server OS) makes no sense and even Microsoft has recognised this with their use of Windows Server Core.
 

SlCKB0Y

macrumors 68040
Feb 25, 2012
3,431
557
Sydney, Australia
I think enterprise storage and backup solutions might present a problem for them.
If they were going to make their own cloud platform for iCloud, there are plenty of software-based distributed storage solutions, such as Ceph or GlusterFS, that they could implement. Dedicated "hardware" storage solutions are probably not the best choice for a modern cloud architecture to be based upon.
 

SlCKB0Y

macrumors 68040
Feb 25, 2012
3,431
557
Sydney, Australia
In the old days, you would probably be doing your work on mini computers: Sun Sparc, SGI etc. Really expensive niche computers. I see the traditional PC/Mac tower being pushed into that category. Can a niche computer be a general purpose computer or is this a contradiction in terms?
It depends on what the "old days" are to you?

Your example of SGI - at their peak in the late 80s/early 90s they were firmly in the "workstation" category of computers, and because they were so performant in their niche area of graphics computation using hardware acceleration, they were able to charge huge sums of money.

But just a few years later, in the mid 90s, the confluence of Microsoft producing workable GUIs, graphics APIs, consumer hardware-accelerated 3D (e.g. 3DFx) and the resulting ports of specialised 3D modelling software to Wintel saw this formerly niche area of computing become commoditised.

So having said all that, I guess I have four points to make:

1. Whilst the SGI workstations were extremely good at one area of computing, they could be used as a general purpose computer (even though it would make no financial sense to do so). So, a niche computer can (but not always) be used as a general purpose computer.

2. An area of computing that is considered niche can become commoditised and mainstream in the blink of an eye. This has occurred time and time again.

3. Whilst expensive, I would not consider the Mac Pro a niche computer, just a high-powered general purpose computer. It is only niche in terms of its cost, and therefore the target consumer, but nothing about it is technologically niche. So it really depends on how one defines "niche" in the context of computing.

4. I would argue that currently, most niche computing is implemented in the many different types of specialised embedded computers and that "niche" has moved up the computing stack from hardware to software.
 

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
It depends on what the "old days" are to you?

Your example of SGI - at their peak in the late 80s/early 90s they were firmly in the "workstation" category of computers, and because they were so performant in their niche area of graphics computation using hardware acceleration, they were able to charge huge sums of money.

But just a few years later, in the mid 90s, the confluence of Microsoft producing workable GUIs, graphics APIs, consumer hardware-accelerated 3D (e.g. 3DFx) and the resulting porting of specialised 3D modelling software to Wintel saw this formerly niche area of computing become commoditised.

So having said all that, I guess I have three points to make:

1. Whilst the SGI workstations were extremely good at one area of computing, they could be used as a general purpose computer (even though it would make no financial sense to do so). So, a niche computer can (but not always) be used as a general purpose computer.

2. An area of computing that is considered niche can become commoditised and mainstream in the blink of an eye. This has occurred time and time again.

3. Whilst expensive, I would not consider the Mac Pro a niche computer, just a high-powered general purpose computer. It is only niche in terms of its cost, and therefore the target consumer, but nothing about it is technologically niche.
Good points, though "niche" must be about how many people use a computer, not what it can do. Point number 2 is especially valid, and that is what is driving the Mac Pro and the traditional desktop into the niche category (by my definition). Ten years ago, you needed a Mac Pro for video editing. Now you need a laptop and can do decent stuff.

If we learned anything from the M1 and A12Z, it is that the hardware alone does not define what can be considered a general-purpose computer; it is the OS. It is clear that here (hello, all fellow nerds) general purpose is defined by how much the OS lets you do. In my world that is a grey scale depending on your point of view. I consider a phone OS to be more general purpose than an embedded system, even though you have better control of all computing aspects of said embedded system.

What we also learned is that the Swiss army knife approach ("general purpose") of the Intel CPU, and perhaps the GPU, is reaching its end, and we see more and more hardware solutions to specific compute problems. Nearly everyone needs to process video and pictures today, so we have dedicated hardware for that which handily outperforms the GP processors. The next step, I expect, is a real-time rendering accelerator, so you do not need a 3090 or something similarly energy-inefficient for smooth rendering.
 