Mostly Stack Exchange and similar :p - I wouldn’t mind someone hitting me with K&R, haha

Yeah, it’s pretty light.

I have always very much avoided engaging in places like Stack Exchange - I learned to code before any of that existed, and most people online are jerks. When I've searched Stack Exchange for answers to questions I might have, I see the question, usually followed by a dozen people telling the original poster that they aren't asking the right question, that they should be using a different programming language, that if they don't know the answer they shouldn't be coding, etc.
 
Yeah, it’s pretty light.

I have always very much avoided engaging in places like Stack Exchange - I learned to code before any of that existed, and most people online are jerks. When I've searched Stack Exchange for answers to questions I might have, I see the question, usually followed by a dozen people telling the original poster that they aren't asking the right question, that they should be using a different programming language, that if they don't know the answer they shouldn't be coding, etc.
Yep. That's exactly what I mean. - I don't actively use Stack myself to ask questions, in part because I find that any question I might have has already been asked by someone else. :)
 
The power is there, but AMD and even possibly Intel will catch up in time. I fear the future market fragmentation, with developers having to develop specifically for Apple Silicon ARM and just not having the time to do so.

Not to mention that games for Apple Silicon are just not a thing - and realistically probably never will be, since Apple and gaming just don't work together - while gaming is a huge part of the PC market.

Even the new 10nm Intel CPUs will be much better than before, and AMD is already doing great in raw power.

The idea of Apple controlling both software and hardware is great, something they've been trying to do for decades, but the big question is how the support from the developers will be.

I look forward to the power, but I'm just not so sure about the future.

I am a complete noob and have no idea what I'm talking about in this area, but I'm just wondering what other people here think.
There are more developers supporting iOS on ARM now than there ever were supporting macOS on Intel.

So the developers are already there, and now it's easier for them to write code that works on both the Mac and iOS.

And games on Apple Silicon are not a thing? iOS is probably the biggest gaming platform out there. You mean AAA games.
 
There are more developers supporting iOS on ARM now than there ever were supporting macOS on Intel.

So the developers are already there, and now it's easier for them to write code that works on both the Mac and iOS.

And games on Apple Silicon are not a thing? iOS is probably the biggest gaming platform out there. You mean AAA games.

And the definition of AAA games will evolve, over time, so it always means “games not playable on Apple hardware.”
 
I doubt Apple is all that concerned about perpetually being the performance king, especially on synthetic benchmark scores. The performance of the M series will continue to increase simply from microarchitecture evolution and, of course, as they ramp up the core counts (especially for the GPU) - but remember that Apple always ranks performance per watt above the raw speed you'd get if you were hooked up to a nuclear reactor.

For Apple, user experience is king - they are more interested in every app running fast and snappy and everything feeling smooth than in a benchmark score. And of course they want to expand the kinds of software devs can provide while keeping the aforementioned fast and snappy feel - hence, for example, the major ramp-up in GPU cores.
 
I doubt Apple is all that concerned about perpetually being the performance king, especially on synthetic benchmark scores. The performance of the M series will continue to increase simply from microarchitecture evolution and, of course, as they ramp up the core counts (especially for the GPU) - but remember that Apple always ranks performance per watt above the raw speed you'd get if you were hooked up to a nuclear reactor.

For Apple, user experience is king - they are more interested in every app running fast and snappy and everything feeling smooth than in a benchmark score. And of course they want to expand the kinds of software devs can provide while keeping the aforementioned fast and snappy feel - hence, for example, the major ramp-up in GPU cores.

I look at used Apple hardware frequently and I always check the GB5 scores on everymac. I find that they correlate really well with my user and production experiences. My opinion is that Apple is doing a good job with their hardware. Software has a few more bugs than I'd like, but they don't bother me or I find workarounds.

I want to see how Alder Lake looks in a thin and light laptop. I don't really care for a 250 Watt PL2 CPU during the summer. It would be okay during the winter here so that I don't have to run my space heater.
 
Intel's Operating Margins are 30%. AMD's 20%. Apple's 30%. -- Yahoo Finance
Thanks for the correction - I should have included the word "gross" in my post. Below are links that show historical data for the three companies you're mentioning.
My original post was merely trying to say that Intel will not be the "good Samaritan" and hold off on raising prices when the market allows for it; they are a company like any other, looking after their bottom line.



 
Thanks for the correction - I should have included the word "gross" in my post. Below are links that show historical data for the three companies you're mentioning.
My original post was merely trying to say that Intel will not be the "good Samaritan" and hold off on raising prices when the market allows for it; they are a company like any other, looking after their bottom line.




I generally prefer operating margin for comparison between companies because it factors in a lot of the stuff that doesn't make it into the income statement and balance sheet.
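
Just to pin the terms down with a toy calculation (made-up numbers, purely illustrative - nothing from the actual filings):

```swift
// Gross margin ignores operating expenses; operating margin subtracts them too.
// These figures are invented purely for illustration.
let revenue = 100.0
let costOfGoodsSold = 45.0
let operatingExpenses = 25.0

let grossMargin = (revenue - costOfGoodsSold) / revenue                          // 0.55 -> 55%
let operatingMargin = (revenue - costOfGoodsSold - operatingExpenses) / revenue  // 0.30 -> 30%
print(grossMargin, operatingMargin)
```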

I wouldn't expect Intel to lower prices except to try to gain market share.

Which they are going to do with Alder Lake, judging from the pricing leaks I've seen. It indicates that AMD has done some damage to Intel. I think we saw some of this in response to Zen 3 as well: Intel lowered prices in the mid-range to be competitive. They also gave us two more cores in the mid-range after AMD went crazy with cores.
 
I wouldn't expect Intel to lower prices except to try to gain market share.

Which they are going to do with Alder Lake, judging from the pricing leaks I've seen. It indicates that AMD has done some damage to Intel. I think we saw some of this in response to Zen 3 as well: Intel lowered prices in the mid-range to be competitive. They also gave us two more cores in the mid-range after AMD went crazy with cores.
True statement, and Intel has "competed" with AMD that way in the past - they've always had deeper pockets ...
 
You can't ignore a three to one performance per watt efficiency advantage.
You can if you choose to ignore it. If you're used to an Intel-based game or an application like Microsoft Access - even if you happen to be running an AMD work-alike Ryzen processor - then even when there are other, more efficient processors or systems-on-a-chip like the ones Apple is transitioning to, if they can't run your software, you ignore them. I wouldn't be surprised if the majority of Windows users go along happily doing what they have always done and ignore what those Mac users are up to.
 
Gamer types always overestimate their relevance.
I'm not a gamer, but gamers do hold some significance in the market, since they are the loudest and often the most represented bunch. So their opinions do matter - they tend to punch above their weight, in a way. If gamers were somehow to switch to Apple and start recommending that others buy Apple - which they won't - it would most certainly boost Apple's desktop sales across the board.

The same goes with "professionals" with the Mac. Apple doesn't need to appease them, since they are a tiny, tiny fraction of the market, but their opinion does matter, especially considering Apple's whole marketing strategy of appealing to "creatives" and "people who think different".

It's an abstract concept, but a valid one.
 
The approach recommended in the post you're replying to *is* native. It's just dynamic dispatch.

It's not even necessarily dynamic dispatch (although that will also help and lessen the burden).

It's just recompiling your app. We aren't writing in freaking assembly language anymore. If you use the Apple-provided APIs, there should be minimal work beyond a recompile to get a native app. You don't need to write code specifically for Apple Silicon - this is what libraries are for.
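
To make the point concrete, here's a minimal sketch (my own illustration, nothing special): the only architecture-specific code most apps would ever carry is a conditional-compilation guard like this, and even that is rarely needed if you stay on Apple's frameworks - the same Swift source just gets compiled into both slices of a universal binary.

```swift
// Hypothetical example: everything else in the app is plain Swift against
// Apple's frameworks, so recompiling for arm64 needs no source changes.
func describeBuildArchitecture() -> String {
    #if arch(arm64)
    return "Built for Apple Silicon (arm64)"
    #elseif arch(x86_64)
    return "Built for Intel (x86_64) - possibly running under Rosetta 2"
    #else
    return "Built for some other architecture"
    #endif
}

print(describeBuildArchitecture())
```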
 
This whole thread assumes Apple's competitors are Intel and AMD. They're not really direct competitors at all. Ultimately, HP, Dell, etc. are Apple's competitors, and Macs have outperformed Dell and HP models with the same architecture.
 
It's not even necessarily dynamic dispatch (although that will also help and lessen the burden).

It's just recompiling your app. We aren't writing in freaking assembly language anymore. If you use the Apple-provided APIs, there should be minimal work beyond a recompile to get a native app. You don't need to write code specifically for Apple Silicon - this is what libraries are for.

Correct. I was talking about APIs like AVFoundation that will dynamically dispatch work to video encode/decode blocks or GPUs or Afterburner cards or what have you. Or Core ML, which will use GPUs or the Neural Engine. Or even OpenCL, which can actually run on CPUs, GPUs, and FPGAs as long as an OpenCL-compatible driver is on the system. Though of course OpenCL is no longer considered best practice on Apple platforms and is deprecated in favour of GPGPU Metal, with Metal Performance Shaders easing some of that. - That's getting rant-y though :p
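
Rough illustration of that kind of dispatch (a sketch - "MyClassifier" is a made-up, hypothetical model class name): with Core ML you only say which compute units you're willing to use, and the framework decides at runtime whether the work lands on the CPU, the GPU, or the Neural Engine.

```swift
import CoreML

// "MyClassifier" is a stand-in for whatever Xcode-generated model class you use.
func loadModel() throws -> MyClassifier {
    let config = MLModelConfiguration()
    config.computeUnits = .all  // let Core ML pick CPU, GPU, or Neural Engine

    // On an Intel Mac this falls back to CPU/GPU; on Apple Silicon the very same
    // code can be dispatched to the Neural Engine - no architecture-specific code.
    return try MyClassifier(configuration: config)
}
```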
 
Correct. I was talking about APIs like AVFoundation that will dynamically dispatch work to video encode/decode blocks or GPUs or Afterburner cards or what have you. Or Core ML, which will use GPUs or the Neural Engine. Or even OpenCL, which can actually run on CPUs, GPUs, and FPGAs as long as an OpenCL-compatible driver is on the system. Though of course OpenCL is no longer considered best practice on Apple platforms and is deprecated in favour of GPGPU Metal, with Metal Performance Shaders easing some of that. - That's getting rant-y though :p
Yeah I get it, but even the worst case (a recompile) isn't bad - you should be doing application updates for new features/bugfixes/security updates regularly anyway.

People like to turn molehills into mountains.

As you say, most of the libraries are dynamically bound and run natively anyway, and anything that isn't gets JIT-recompiled into quick native code by Rosetta 2.

The people crying right now about "re-writing" their stuff for Apple Silicon native are going to be massively butt-hurt if Apple Silicon evolves into something SiFive-based for whatever reason.
 
Yeah I get it, but even the worst case (a recompile) isn't bad - you should be doing application updates for new features/bugfixes/security updates regularly anyway.

People like to turn molehills into mountains.

As you say, most of the libraries are dynamically bound and run natively anyway, and anything that isn't gets JIT-recompiled into quick native code by Rosetta 2.

The people crying right now about "re-writing" their stuff for Apple Silicon native are going to be massively butt-hurt if Apple Silicon evolves into something SiFive-based for whatever reason.
While that will be true for most systems, it isn't always that simple, though. If you have a program that loads plug-ins or other executable code, you can't mix native and Rosetta 2 code in the same process space.
If you depend on a framework or library that's outside your control, it has to be updated first. It being a dylib isn't an advantage there if it's only distributed in binary form and you want to go native but the binary is x86-only - see the problem above.
You may also have code that uses compiler intrinsics or inline assembly. That's not common, but it exists. You may also have made assumptions in your Metal code about the GPU being an immediate-mode renderer; if you build against older SDKs macOS will emulate that behaviour on the M1 GPU, but it might give you visual glitching if you build against newer SDKs (native or not), holding back the update while you debug your shaders.

In short, for a lot of developers it will just be a matter of recompiling. Not all of them can do it immediately, for reasons outside their control, and some will have to put in more effort than that.
Not all situations are equal.
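
On the plug-in point, a host app often also wants to know whether it is itself running translated before deciding what it can load. A minimal sketch of the documented sysctl check (my example, assuming macOS 11 or later):

```swift
import Darwin

// Returns true when the current process is being translated by Rosetta 2.
// Relevant for a plug-in host: native and translated code can't share a
// process, so an x86_64-only plug-in can only load into a translated host.
func isRunningUnderRosetta() -> Bool {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    // sysctl.proc_translated reports 1 when translated, 0 when native;
    // the call fails on macOS versions that predate Rosetta 2.
    guard sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0 else {
        return false
    }
    return translated == 1
}
```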
 
It's not even necessarily dynamic dispatch (although that will also help and lessen the burden).

It's just recompiling your app. We aren't writing in freaking assembly language anymore. If you use the Apple-provided APIs, there should be minimal work beyond a recompile to get a native app. You don't need to write code specifically for Apple Silicon - this is what libraries are for.

What's wrong with assembler?
 
This whole thread assumes Apple's competitors are Intel and AMD. They're not really direct competitors at all. Ultimately, HP, Dell, etc. are Apple's competitors, and Macs have outperformed Dell and HP models with the same architecture.

Exactly.

And we should also remember that Apple Silicon M1, M1X, M2, or whatever are *only* available in Apple's devices.

So it doesn't matter what happens with all these chips. The next great Intel or AMD CPU won't be inside an Apple device... and Apple's next great M-series processor won't be in a Dell or HP computer.

We should, instead, compare the actual devices.

And it also depends on what applications you intend to run.

If you're looking to run Xcode or Final Cut Pro... it doesn't matter how great the AMD Zen 4 or the next Intel "Lake" processors are.

:p
 
This whole thread assumes Apple's competitors are Intel and AMD. They're not really direct competitors at all. Ultimately, HP, Dell, etc. are Apple's competitors, and Macs have outperformed Dell and HP models with the same architecture.
That's not correct. Dell and HP have always had more powerful laptop and workstation models than anything Apple offered. For example, they have so-called mobile workstations (laptops). Apple never played in this market.
 
That's not correct. Dell and HP have always had more powerful laptop and workstation models than anything Apple offered. For example, they have so-called mobile workstations (laptops). Apple never played in this market.
It's WAY better to get a desktop than a mobile workstation. They are stupidly expensive, and the Xeons that come with them are utter garbage.
 
Ultimately, all of the back-end computing will end up in the cloud: AWS, Microsoft, and Google, with the others begging for scraps from their table. It's just a matter of time until they get it right and they're actually cheaper than the bulk of in-house IT. They just need to work on improving their service levels.

Apple has never taken the enterprise market seriously. It's always baffled me, as the market's pretty damn rich in terms of margin - especially when you look at the government contracts.

Apple IS the only game in town in terms of a secure platform - ever since BlackBerry folded up. (Yes, open source vs. closed source is a religious debate, but you can't argue with the nefarious depths of how much spying Google does.)

I really wish Apple would get serious about going after the "entire ball of twine". As much as they lamented IBM in the 1984 video, they could really become the dominant IT company if they wanted to be - and do it in a fashion that enables capabilities.

- Go after emerging markets with a cheap phone (à la the 5c)
- Go after the enterprise market with REAL mobile device management
 