
spiderman0616

Suspended
Aug 1, 2010
5,670
7,499
There is no reason x86 has to die, or any other product, unless it's dangerous to use. And x86 drives most businesses these days -- do you want all businesses to die too, just for the sake of your "progress"?

The market decides what it wants, and right now for PCs, it's x86 by a good margin. When that market share gets down to less than 10%, you can start talking about whether it should die; until then, you're just anti-business, anti-consumer, and anti-free-market.
That's a bit harsh. I see a lot of businesses, large and small, adapting to a non-x86 world in many ways. A smaller shop may start with touch-based payment kiosks running on iPads and grow from there. A larger company, like the one I just left, may start realizing that most of its employees prefer Macs now. This is all anecdotal, of course, but just as a personal example:

When I started at my previous company in 2011, it was a Windows business through and through. Macs were an afterthought. In my tier one desktop support job, we had maybe 3 Macs--one for testing, one for development, and one for support to mess with. Very few people even bothered with them in my department. When I left the company at the beginning of this year, 3/4 of the total internal user base was on a Mac.

The last time I job hunted, the question "Do you prefer a Mac or PC?" wasn't even asked. Every offer I've gotten recently has included that question. And when I say Mac, I don't get the eye rolls or snide comments like I used to back in the old days. I really feel like the worm is starting to turn in this regard. I maintain my theory that a very, very large percentage of the Windows user base is people who learned on beige box Gateway 2000s and AOL in the 90s. That demographic overlaps with a lot of others as well, but I can't think of a worse reason to stick with x86.

Things HAVE to progress. Businesses HAVE to progress. x86 is legacy. You cannot run forever on legacy. It's just not sustainable in the days where literally almost every human on earth has a computer in their pants pocket.
 
  • Haha
  • Like
Reactions: Erixtr and alien3dx

spiderman0616

Suspended
Aug 1, 2010
5,670
7,499
That's quite interesting, since we had a very different experience. Macs might be more expensive, but the total cost of ownership was significantly lower than using Windows laptops. The real cost is the support and personnel downtime, and those were much, much rarer with Macs in our experience.
I blame IT departments for a lot of the problems businesses have with allowing Mac use. Macs are often perceived as a threat to IT people, so they rage against them to everyone who will listen to keep them out of the building. BYOD trends have really changed that, but I still see the resistance here and there.
 

ThunderSkunk

macrumors 601
Dec 31, 2007
4,075
4,561
Milwaukee Area
To use Jobs' old analogy, not everybody needs a truck; some people can get by with an economy car. The utility (truck) end of the Apple product line peaked in 2015, 7 years ago already. By 2016, Apple started chopping features and components off across their line until reducing unit cost (& increasing profit) became their primary innovation. It is not a coincidence that every change since 2016 has been in the service of making their products less expensive to produce, distribute, and support. This is pretty typical of companies that get bought out by a competitor looking to cash in on them for a few profitable years before driving them into the ground. Interesting to see Apple following the same path voluntarily, but at the trillion-dollar threshold, what's more important to the decision makers at the top? The trivial details of another disposable consumer product line that irritate users but keep people buying the products more often, or keeping the shareholder scheme in the black for another quarter? The former is only a potential risk, & the latter has to be the focus.
 

AAPLGeek

macrumors 6502a
Nov 12, 2009
731
2,271
1) You cannot get Low Power RAM modules in any other package. To move away from soldered RAM, they’d need to lose performance and increase power consumption.

2) This was the reason I sold my M1 MBP. OK, my requirements may not have been mainstream, but compatibility affected me. My OH also uses software daily for work that has reported compatibility issues on AS too.

This will come in time, but I can't wait around for it to possibly happen.
That's fine for laptops, but there's absolutely no need for low power RAM on desktops. Especially when it's extremely cost prohibitive for the average user. Apple memory upgrade prices are INSANE.
 

ThunderSkunk

macrumors 601
Dec 31, 2007
4,075
4,561
Milwaukee Area
Things HAVE to progress. Businesses HAVE to progress. x86 is legacy. You cannot run forever on legacy. It's just not sustainable in the days where literally almost every human on earth has a computer in their pants pocket.
Unfortunately, that is very much not the case in risk-averse industries, where reliability is key, and materials, processes, & equipment with a decades-long proven track record win out over the endless parade of flash-in-the-pan, short-lived innovations. As long as the aircraft & automotive industries exist, Dassault Systèmes & Autodesk aren't abandoning Intel, and as long as they don't, design & engineering firms worldwide will remain stuck on Intel as well. And for everything else, there's already the ARM computer in your pocket.
 
  • Like
Reactions: bobcomer

leman

macrumors Core
Oct 14, 2008
19,521
19,677
until then, you're just anti-business, anti-consumer, and anti-free-market.

Come on, that was really not necessary. Sure, the PC world runs on x86, but that is because of historical reasons and not because it is the best possible technology. It is hardly appropriate to call someone out as "anti-consumer" because they point out that the current monopolist legacy technology has known problems. Even disregarding the fact that x86 is over forty years old by now and has accumulated a lot of legacy cruft, the simple fact that it is controlled by only two companies is reason enough to have a guarded stance against it, especially if you claim to be pro-free-market.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
That's a bit harsh. I see a lot of businesses, large and small, adapting to a non-x86 world in many ways.
What you said, which I was responding to, was way harsher to me. If you only knew just what kind of budget I'd need to get rid of x86, it would make your toenails fall off. And we're just a small company that can't pay that kind of dues without extreme hardship. (<100 employees at our location)

I see a lot adapting too; we even have some ARM-based stuff, but only for embedded-type jobs. But I also see a heck of a lot more that are happy as a clam with their x86 gear. (As am I.)

A larger company, like the one I just left, may start realizing that most of its employees prefer Macs now. This is all anecdotal, of course, but just as a personal example:
Very anecdotal, as is mine. We have only had one user who temporarily used a Mac here, and he lost his job for reasons other than that. I only use Macs at home because Macs just aren't compatible enough for me to use one in my job. I suspect you live in a very different part of the country than I do. I'm in the Southeast; you don't see Macs at all here except in a small minority of homes. Macs only have a 16% share of the market, after all, and that counts ALL Macs, not just AS Macs.

Things HAVE to progress. Businesses HAVE to progress. x86 is legacy. You cannot run forever on legacy. It's just not sustainable in the days where literally almost every human on earth has a computer in their pants pocket.
LOL, you're spouting belief with no basis in fact. A business is there to make money, period. And as long as there is a market for your "legacy" (not my term, because it's still active), there will be businesses that cater to it, because, you know, they're in business to make money, not to get rid of their own business.
 

Larsvonhier

macrumors 68000
Aug 21, 2016
1,611
2,983
Germany, Black Forest
Interesting - makes me feel like a bumblebee (not able to fly according to the laws of physics / aerodynamics, but it practically flies every day).
Perhaps my M1 does simply not know that it should not run Windows 11 (ARM) and emulate x86 therein.
Until someone tells it to stop doing it, I'm happy with the current situation.
;-)

(I'll suppress jokes about locked-in syndrome, too serious a topic.)
 

EdT

macrumors 68020
Mar 11, 2007
2,429
1,980
Omaha, NE
I would like to perform internal upgrades to my....
TV
Washing machine
Tumble Dryer
Sofa
Coffee Table
Carpet
Floor Lamps
Log burning stove
etc

Coming from the side of "this is a PITA, why can't I do my own upgrades?",
I am actually getting why Apple are taking the route they are.
I actually tore apart and replaced the heating coils in my electric tumble dryer. Replaced the drum belt as well, since I didn't want to have to tear it apart again in however many months when that wore out.

My 24-inch iMac from 2008 wasn't really upgradable, nor was the 2013 MacBook Pro that I bought. My current 2015 27-inch iMac is mostly not upgradable either, except that you can add RAM fairly easily. But not a different graphics card, or (at least not easily) an SSD. So lack of upgradability isn't a new Apple problem.

I really wish that the M-series RAM were cheaper, because that's the non-upgradable item that will probably bite people first. I understand that the memory architecture is different on Apple's Arm chips and the performance gains are real and dramatic and not just marketing BS, according to experts who aren't being paid by Apple, so I trust what they say. But telling someone that 8 GB is "enough" when 1-2 years from now it likely won't be enough makes expensive memory a sore point for me. If I could get 32GB without raising the cost by $800-1,000, I would agree that by the time memory is a problem, technology will have changed enough that buying a new computer is more practical. That's true of Intel- or AMD-based machines as well.
 
Last edited:

EdT

macrumors 68020
Mar 11, 2007
2,429
1,980
Omaha, NE
Interesting - makes me feel like a bumblebee (not able to fly according to the laws of physics / aerodynamics, but it practically flies every day).
Perhaps my M1 does simply not know that it should not run Windows 11 (ARM) and emulate x86 therein.
Until someone tells it to stop doing it, I'm happy with the current situation.
;-)

(I'll suppress jokes about locked-in syndrome, too serious a topic.)
I would seriously be interested in what emulator you run and what drawbacks you’ve found. I’m assuming that if you have found some then they aren’t serious, and if I move to a new M platform I may want to try it.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
I blame IT departments for a lot of the problems businesses have with allowing Mac use. Macs are often perceived as a threat to IT people, so they rage against them to everyone who will listen to keep them out of the building. BYOD trends have really changed that, but I still see the resistance here and there.

I don't know; when I was running the local IT department, I preferred to do as little work as possible. Having users with Macs was great for that. But I hear that our university now has a new chief of IT who is pushing for managed, one-size-fits-all devices. Screw that. Like hell am I going to work at a place where someone else has remote access to my machine.

That's fine for laptops, but there's absolutely no need for low power RAM on desktops.

Of course there is, assuming you want to go fast. Getting to 400GB/s or higher is not exactly cheap (neither in terms of cost nor in terms of energy usage). Once your RAM starts to use an amount of power comparable to your CPU, you have a problem, no matter how you look at it. Power consumption and interconnect complexity are the limiting factors. The only reason system RAM has traditionally been upgradeable is that it's slow. Devices with high-performance RAM (be it GPUs or specialised supercomputer chips) don't have socketed RAM for exactly this reason.

And frankly, how do you imagine that working in practice? An M1 Ultra with slotted DDR5 RAM would need 16 slots, all of which must be populated. For 128GB, using the cheapest 8GB DDR5 modules I found online, that's about $1,300. And you would need a huge chassis to host all that RAM. And you can forget about having a compact, quiet, and relatively affordable machine like the Studio with this kind of setup. The mainboard costs alone would be off the charts.

P.S. The often-overlooked innovation of Apple's solution is that they give you RAM that can be both blazing fast and extremely efficient. My M1 Max has 32GB of 400GB/s RAM which consumes only 0.5 watts under normal office-like operation. That, as far as I know, is absolutely unprecedented in the industry.
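For anyone who wants to sanity-check the slot arithmetic above, here is a rough back-of-the-envelope sketch in Python. The per-DIMM bandwidth (DDR5-6400), module size, and per-module price are illustrative assumptions based on the figures quoted in this post, not official specs:

Code:
import math

# Assumed figures, for illustration only
target_bw_gbs = 800.0     # rough M1 Ultra memory bandwidth
dimm_bw_gbs = 6.4 * 8     # one DDR5-6400 DIMM: 6400 MT/s * 8 bytes per transfer = 51.2 GB/s
dimm_size_gb = 8          # smallest common DDR5 module
dimm_price_usd = 81       # roughly the cheapest 8GB DDR5 price cited above

dimms_needed = math.ceil(target_bw_gbs / dimm_bw_gbs)
print(f"{dimms_needed} DIMMs -> {dimms_needed * dimm_size_gb} GB, ~${dimms_needed * dimm_price_usd}")
# prints: 16 DIMMs -> 128 GB, ~$1296

Even with fast DDR5-6400 modules you end up at 16 fully populated slots just to approach the Ultra's bandwidth, which is where the ~$1,300 figure and the oversized mainboard come from.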
 
Last edited:

TiggrToo

macrumors 601
Aug 24, 2017
4,205
8,838
I blame IT departments for a lot of the problems businesses have with allowing Mac use. Macs are often perceived as a threat to IT people, so they rage against them to everyone who will listen to keep them out of the building. BYOD trends have really changed that, but I still see the resistance here and there.
About half our IT staff run exclusively on Macs, myself included.
 
  • Like
Reactions: spiderman0616

ct2k7

macrumors G3
Aug 29, 2008
8,382
3,439
London
Where I work, we don't have individual machines, just thin clients. The state of business software for enterprises on the Mac sometimes leaves much to be desired.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Come on, that was really not necessary.
I'm afraid it was VERY justified. He's trying to dictate to people what they can and cannot use, and that's not even close to kosher. As a business person who has budget constraints, I couldn't respond any other way.

Sure, the PC world runs on x86, but that is because of historical reasons and not because it is the best possible technology.
Good enough and cheap is very often good enough.

If I were to choose a computer architecture / operating environment for all businesses to run on, it wouldn't be x86, or Macs either; neither is the best. But guess what, there isn't a best at everything, and historical reasons are good enough to stay the course -- because it costs less, and costing less makes more profit.

It is hardly appropriate to call someone out as "anti-consumer" because they point out that the current monopolist legacy technology has known problems.
The trouble is he didn't do that -- he said Intel had to die without giving any reason why other than that it's legacy. Which, I might add, since x86 is the dominant platform in the market, is extremely anti-consumer. (Since they are the market.)

Even disregarding the fact that x86 is over forty years old by now and has accumulated a lot of legacy cruft, the simple fact that it is controlled by only two companies is reason enough to have a guarded stance against it, especially if you claim to be pro-free-market.
A guarded stance is okay! Preparing for changes, cutting greenhouse gases, making things better, all good. But calling for the death of a whole market segment, nah, that's not right, ever.

And btw, the x86 of today is a lot different from the original, so saying it's 40 years old ignores a lot of the advancements Intel and AMD have made. I'm typing on an AMD laptop at this very moment that is faster than my M1 MBA, and it's a decent amount lighter. This is no 8088. (The first PC processor I ever used!)

I might receive my M1 Studio Max this afternoon, I can't wait. :)
 
  • Like
Reactions: ArkSingularity

BellSystem

Suspended
Mar 17, 2022
502
1,155
Boston, MA
There is no reason x86 has to die, or any other product, unless it's dangerous to use. And x86 drives most businesses these days -- do you want all businesses to die too, just for the sake of your "progress"?

The market decides what it wants, and right now for PCs, it's x86 by a good margin. When that market share gets down to less than 10%, you can start talking about whether it should die; until then, you're just anti-business, anti-consumer, and anti-free-market.
Expecting the market to drive progress is not a great strategy. x86 is still alive today because Microsoft has been hesitant to commit Windows to RISC and hasn't incentivized developers to jump on the train. It takes this kind of move from companies like Apple to drive it forward. Architecturally, x86 has needed to go for a long time. It's inefficient, bloated, and stagnant. Even Microsoft is hedging its Intel bet by having ARM products.
 
  • Like
  • Haha
Reactions: leman and bobcomer

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
I would seriously be interested in what emulator you run and what drawbacks you’ve found. I’m assuming that if you have found some then they aren’t serious, and if I move to a new M platform I may want to try it.
He's probably using Parallels and running Windows on ARM with virtualization rather than emulation. The drawbacks are some lost performance (not bad), and the Windows on ARM EULA really doesn't allow it, so in a business setting where one might get software audited, there may be a problem. Some things just don't run, and there's no way to tell beforehand.

In my testing it runs pretty well, with only one application that is blocked on it for some reason. Performance is acceptable, even though Windows on ARM actually emulates x86/x64 inside the VM. I won't use it for work at all until the EULA changes, and I only run it through the Insider program.

UTM can run x86 Windows somewhat, at the expense of stability, and it's really slow. But there is hope for the future...
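Since UTM is essentially a front end for QEMU, "running x86 Windows somewhat" boils down to full software CPU emulation, roughly like the sketch below (in Python, just to spell out the QEMU invocation; the disk image and ISO names are hypothetical, and UTM normally assembles this command for you):

Code:
import subprocess

# Pure software emulation via TCG: there is no x86 hardware on an M1 to virtualize,
# which is why this is far slower than Parallels virtualizing ARM Windows.
qemu_cmd = [
    "qemu-system-x86_64",
    "-machine", "q35",
    "-accel", "tcg",                                # software CPU emulation
    "-cpu", "qemu64",
    "-smp", "4", "-m", "8192",                      # 4 emulated cores, 8GB RAM
    "-drive", "file=win11_x64.qcow2,format=qcow2",  # hypothetical disk image
    "-cdrom", "Win11_x64.iso",                      # hypothetical installer ISO
]
subprocess.run(qemu_cmd)

Parallels takes the opposite approach: it virtualizes the ARM build of Windows natively and leaves the x86/x64 translation to Windows itself inside the VM, which is why the performance hit is so much smaller.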
 

ct2k7

macrumors G3
Aug 29, 2008
8,382
3,439
London
Expecting the market to drive progress is not a great strategy. x86 is still alive today because Microsoft has been hesitant to commit Windows to RISC and hasn't incentivized developers to jump on the train. It takes this kind of move from companies like Apple to drive it forward. Architecturally, x86 has needed to go for a long time. It's inefficient, bloated, and stagnant. Even Microsoft is hedging its Intel bet by having ARM products.
x86 is also very convenient in the meantime.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Expecting the market to drive progress is not a great strategy. x86 is still alive today because Microsoft has been hesitant to commit Windows to RISC and hasn't incentivized developers to jump on the train. It takes this kind of move from companies like Apple to drive it forward. Architecturally, x86 has needed to go for a long time. It's inefficient, bloated, and stagnant. Even Microsoft is hedging its Intel bet by having ARM products.
I couldn't disagree more.
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,449
Perhaps my M1 does simply not know that it should not run Windows 11 (ARM) and emulate x86 therein.
Until about 6 months ago that involved signing up to the "Windows Insider Program", agreeing to a lot of T&Cs, and downloading a "preview" version of Windows 11 that was only licensed for test and evaluation purposes (whether or not you managed to activate it with an existing key). You probably weren't likely to get sued, but good luck if you use it for work and get hit by a software audit.

Then, with no fanfare beyond an addendum to a post on a Parallels support forum, it became possible to download the release version of Windows 11 for ARM, buy a license from Microsoft with no mention of joining any evaluation-only "Insider" program, and activate it on an M1 virtual machine. However, Microsoft don't list M1 as a supported processor (have fun debating whether that refers to virtualisation or not) so, presumably, if it breaks that's just tough.

That makes it fine for personal use (not like you were counting on MS for support anyhow), but really non-ideal for business/professional use.

...and, yeah, the x86 emulation is great if it "works for you", but it's not really a replacement for people running more demanding x86 Windows apps, probably under Boot Camp (which ain't gonna happen on Apple Silicon).

Personally, I could live without x86 - all I'm saying is that if you actually need to run x86 OSs for anything non-trivial then Apple Silicon may not be great for you.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
I couldn't disagree more.

Unfortunately, that's just how innovation works in practice. The mainstream PC market is incredibly traditionalist and anti-innovation. Almost every single user-facing innovation in the last decade, such as high-DPI displays, better-quality input devices, thin-and-light portables, universal connectivity, etc., can be traced to a handful of forward-thinking companies, with Apple at the forefront. One can poke fun at Apple's failures, such as butterfly keyboards or the Touch Bar, but the simple truth is that without Apple we would still be lugging around square black plastic boxes with terrible displays.
 

oldoneeye

macrumors regular
Sep 23, 2014
134
418
That's fine for laptops, but there's absolutely no need for low power RAM on desktops. Especially when it's extremely cost prohibitive for the average user. Apple memory upgrade prices are INSANE.
The choice of the 3D package is for performance. The power efficiency comes from not needing to boost all those signals truly "off-chip". We're going to see a lot more of this, not less.

Apple charges what people will pay. It's only loosely coupled to their costs.
 
  • Like
Reactions: Zdigital2015

robco74

macrumors 6502a
Nov 22, 2020
509
944
I checked DDR5 prices, and they don't seem too far out of line with what Apple charges. It's pretty expensive right now. Not to mention the number of slots required to match the bandwidth Apple has achieved.

Architecture shifts are not easy for those who rely on legacy apps. It really does depend on your use case. Personally, and at my work, it's a non-issue. Everything we need runs just fine on ASi. That isn't going to be the case for everyone.

Unlike MS however, Apple won't bend over backwards to maintain compatibility. If that is a requirement, then Apple's platforms are not a good solution.
 

jonblatho

macrumors 68030
Jan 20, 2014
2,529
6,241
Oklahoma
Whilst I have an M1 Mac, I feel my days on the Mac are numbered. The cost of potential upgrades is front-loaded and my workload isn't very predictable at times; moreover, I feel most of the engineering workloads I'm on tend to favour x86 (ARM isn't that prevalent in HPC workloads and cloud services aren't in a rush to move to ARM).

I do miss being able to work on the Mac Pro and change things as I needed.
Re: Arm in HPC/the cloud, a recent case study caught my attention: the UK Met Office moving part of its cloud computing to Arm with Amazon's Graviton chips.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Unfortunately, that's just how innovation works in practice. The mainstream PC market is incredibly traditionalist and anti-innovation. Almost every single user-facing innovation in the last decade, such as high-DPI displays, better-quality input devices, thin-and-light portables, universal connectivity, etc., can be traced to a handful of forward-thinking companies, with Apple at the forefront. One can poke fun at Apple's failures, such as butterfly keyboards or the Touch Bar, but the simple truth is that without Apple we would still be lugging around square black plastic boxes with terrible displays.
Innovation is great; I'm always looking for something better, and if something is better enough to take the market share of something else, that's fine by me, as long as it's the market deciding and not someone trying to control others.

As for x86 vs M1, M1 isn't better; it does the same general computer work, with the caveat that it's not compatible with the market leader. True, it's a more elegant design, but that makes absolutely no difference. What makes a difference is something that's truly revolutionary, and we haven't seen that compared to x86. It'll happen eventually, I expect, but I also expect it won't be in my lifetime, and it will be something quite different from a current digital CPU. I've worked with far too many CPUs, and they all do the same things. The market leader leads just via momentum, and that's okay too, until something comes along that is truly better. I had hopes for the Transmeta processor, but oh well. Nothing new here, move along...

That something new may be real AI, with something more than a CPU that makes that a reality, rather than just another expert system.
 

ct2k7

macrumors G3
Aug 29, 2008
8,382
3,439
London
Re: Arm in HPC/the cloud, a recent case study caught my attention: the UK Met Office moving part of its cloud computing to Arm with Amazon's Graviton chips.
I actually know the head of engineering (I think that's the title) at the Met Office, and for some reason we were talking about HPC. He said that a few small projects could move to Graviton, but the large, archaic models in Fortran are decades, if not more, away from moving to a different platform.

I'll ask my fiancé, who worked in weather / scientific computing at NTNU, but based on prior discussions, I feel he's not optimistic about a migration either.

Edit: he's a technical lead at TMO.
 
  • Like
Reactions: jonblatho