
diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
This doesn't make sense --- we're talking about consumer hardware (i.e., displays that actually exist and can find use for 8K output). If we want to arbitrarily say "well, it's the MacBook Pro, so it should do X" --- then why not 10K? Or 50K? Perhaps it should do 100K output, because it's the MacBook Pro? Obviously, that doesn't make sense, because there aren't 100K displays out there. The same goes for 8K, it seems.
LG, Samsung, and TCL will happily sell you a "cheap" 8K TV; unless we are talking about monitors, where there are higher-resolution displays, but they are not consumer priced.
 

EntropyQ3

macrumors 6502a
Mar 20, 2009
718
824
This doesn't make sense --- we're talking about consumer hardware (i.e., displays that actually exist and can find use for 8K output). If we want to arbitrarily say "well, it's the MacBook Pro, so it should do X" --- then why not 10K? Or 50K? Perhaps it should do 100K output, because it's the MacBook Pro? Obviously, that doesn't make sense, because there aren't 100K displays out there. The same goes for 8K, it seems.
We are a bunch of people shooting photos and video at 8K resolution or above, including a large group of amateurs, and it is gradually moving into the proverbial "soccer mom" territory. There is nothing particularly exotic about it any more. And over the lifetime of a MacBook Pro, definitely not.

If you check prices on 8K TV sets this Black Friday, I'm sure you'll see a number of offers in the three-digit dollar range. If you're just using your TV to watch Netflix and the like, you're probably better off buying a good 4K set at the same price for another few years, but there are a lot of people around who don't just consume. And they fit smack in the middle of the target demographic of MBPs, and arguably of Macs in general.

(Or look at it this way if you want - there are a number of high-res options already, but the moment Apple offers 8k video recording on their iPhones, that's 250 million users or so annually, just from them. And they will want to actually see their travel videos in full resolution.)
 

januarydrive7

macrumors 6502a
Oct 23, 2020
537
578
LG, Samsung, and TCL will happily sell you a "cheap" 8K TV; unless we are talking about monitors, where there are higher-resolution displays, but they are not consumer priced.
That's the thing --- content creators, as far as I know, create content for consumers. As these media become more popular, I'm betting the market, including Apple, will respond in kind.
 

duck apple

macrumors regular
Feb 26, 2009
205
68
The shift in here to "omg power savings" is hilarious. Last week it was about being the king of the hill, but now we're back to stroking our "power savings."
But I remember everybody laughing at benchmarks showing the A12X iPad Pro outperforming Intel notebooks.
Just 3 years later, everybody in the Intel fan group is cheering Intel regaining the performance crown and beating up the M1.
 
  • Like
Reactions: throAU

duck apple

macrumors regular
Feb 26, 2009
205
68
Thanks to Intel being led by an engineer again, Alder Lake might just be the start, similar to M1.
With the 12th gen, I don't believe any ******** he talks about, and I'm really worried about Intel's future.
You want Apple back, and to convince Microsoft not to jump on the ARM wagon together with the PC OEMs? First show a mobile version that beats AMD in all respects.

And who needs him? AMD is good enough for the PC world.
 
  • Angry
Reactions: EPO75

duck apple

macrumors regular
Feb 26, 2009
205
68
The power consumption figures are astonishing though. It makes me wonder why Intel bothered with e-cores at all. 10nm is roughly equivalent to tsmc 7nm, and Zen consumes half the power in benchmarks, for comparable performance. Maybe AMD has a pure architecture advantage?
It just clearly shows that Intel lies; its 10nm is no comparison to TSMC's 7nm at all.
The AMD Ryzen Threadripper 3990X, 64 cores (also TSMC 7nm), has a TDP of 280W; it's about 4x the performance of the M1 Max in real-world 3D rendering (without using any GPU power). This is something future desktop M chips need to compete with.
 

throAU

macrumors G3
Feb 13, 2012
9,204
7,356
Perth, Western Australia
The real comparable matchup will be when Alder Lake laptops start shipping.

Apple made the right move. Looking back, it's clear that Intel didn't properly communicate development issues to Apple, and communicated even more poorly to the public about their roadmap.

Intel still has a lot to lose out of everything. The silicon market hasn't been this fierce... I don't think ever.

Intel got arrogant, and spent a long time thinking no one could touch them, and this perfect storm brewed up.

I hope Intel ACTUALLY becomes competitive again. The industry needs as much competition as possible. I also REALLY hoped Blackberry would start innovating after the release of the iPhone, but a very specific type of arrogance and a lack of understanding of what made it popular killed their phone business.

There's lots of things going against them. I haven't had a look at Alder Lake outside of an Anandtech article on it in any real detail, but if it's a chiplet based design I have questions about the core to core interconnects and the speed of it. I am also curious about real world testing under sustained load.

The Intel MacBook Pro 16 ramped down aggressively shortly after starting a video call. Fans were going full blast. Sure, Alder Lake looks great in a desktop environment, but does it fall off under continuous load? Can it sustain the performance without liquid cooling? Does PCIe 5 give them the same kinds of performance gains between the CPU and GPU that having them on the same package would give you?

By no means trying to argue, or talk negatively about your statement. You have lots of valid points.

I'd have to review my original readings on what Intel was doing, but when the Surface Neo was announced the "Hybrid CPU" wasn't an actual SoC. It was a SiP (System in a Package), which is great because you can fab a bunch of Intel Core series processors, take the good ones, and connect them with Intel Atom (just saying that after I thought it was dead makes me roll my eyes). Theoretically all of these things are promising. But it's less an architected solution than trying to get all of the pizza crusts from Chuck E. Cheese to prove they reuse pizza. Sure, you can do it, but if the connectivity isn't there, and if there aren't shared caches across distinct cores, it just looks like more latency.

I'll do more of a review over the weekend. I am BEYOND happy to be wrong on all of this. I want Intel to sock Apple in the jaw. That way, Apple has to up its game.

I honestly think that no matter what intel did, Apple would have ended up with arm based macs sooner or later.

It’s just another way to get economy of scale out of their iPhone r&d and unify software development across all their platforms.

The fact that intel screwed the pooch pretty comprehensively maybe just gave Apple the hurry up.
 

aevan

macrumors 601
Feb 5, 2015
4,539
7,236
Serbia
AFAIK, 8k isn't terribly useful, yet, on consumer hardware. Unless you sit unreasonably close to your screen. Or perhaps for gaming, if you don't care about frame rates.

We're at a loooooong distance from 8K gaming. 8K anything, really. 8K TVs mostly (if not completely) use their upscalers to upscale 4K content.

I can see the benefit from shooting in 8K to be able to crop segments of the video. There is also this thing when you downsample higher resolutions to lower resolution screens to get smoother looking edges, though I'm not sure that applies to video.

I am curious why some video creators on YT mention they work with 8K footage. I don't think it's to actually make 8K videos, it must be just to have more data to produce better looking 4K videos - but even there I'm not sure.

It will be a long while before 8K is a thing (if ever, we're approaching very diminishing returns). Even 4K is not what you'd call standard today, in terms of content (most new TVs are 4K, but content is still lagging).
 

aevan

macrumors 601
Feb 5, 2015
4,539
7,236
Serbia
I honestly think that no matter what intel did, Apple would have ended up with arm based macs sooner or later.

For sure. First of all, they prefer to control the entire vertical. Second - it's not just performance in terms of raw power; their own silicon allows them to do things they want to do and improve workflows they want to improve. I doubt Intel would add ProRes acceleration, for example. Then all the hardware encryption for SSDs. Then camera stuff. Machine learning stuff. This allows Apple to create custom chips for their custom needs.

The fact that they are actually beating Intel is just a (great) bonus.
 

throAU

macrumors G3
Feb 13, 2012
9,204
7,356
Perth, Western Australia
I am curious why some video creators on YT mention they work with 8K footage. I don't think it's to actually make 8K videos, it must be just to have more data to produce better looking 4K videos - but even there I'm not sure.

As I understand it, you want to work with 8K because in the future you might want to remaster for higher resolutions.

A lot of people work with 4K and export to 1080p, as you get better quality that way than working in 1080p directly.

Presumably the same works for a 4k end result - do your effects and original footage in 8k and render out the end result at 4k...
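The downsample-then-render idea can be sketched in a few lines of Python. The 2x2 box filter and the tiny 4x4 "frame" below are illustrative stand-ins for the much better resampling filters (Lanczos and the like) that real editing software uses; the point is just that averaging high-res pixels turns hard aliased edges into smoother gradients:

```python
def downsample_2x(frame):
    """Average each 2x2 block of a grayscale frame (list of lists) into one pixel."""
    h, w = len(frame), len(frame[0])
    return [
        [
            (frame[y][x] + frame[y][x + 1] + frame[y + 1][x] + frame[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A hard diagonal edge rendered at the higher resolution (4x4 for illustration):
hi_res = [
    [0,   0,   0,   255],
    [0,   0,   255, 255],
    [0,   255, 255, 255],
    [255, 255, 255, 255],
]

# The 2x2 downsampled result contains intermediate values along the edge,
# i.e. the edge comes out anti-aliased instead of a hard 0/255 step.
print(downsample_2x(hi_res))
```

Shooting in 8K and delivering 4K gives every output pixel four source samples to average, which is where the "better quality than working at the target resolution" effect comes from.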
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
I'm really worried about Intel's future.
Don't worry about them, they still sell way more PC processors than everyone else, and by a good margin. AMD is the only real threat in that area, and we users don't really need to care about that; x86 will live on. It'll be decades before x86 becomes hard to get, even if there's a major switchover. Same for Windows and Microsoft.

Worry about it when it becomes a problem, and that won't be anytime soon.

And besides, there really isn't anything that special about Arm-based processors; lower power is the only advantage they have, and there's no guarantee they won't be surpassed too. We won't have a real revolution in computing until we give up the current digital processors for something else.
 

EntropyQ3

macrumors 6502a
Mar 20, 2009
718
824
I can see the benefit from shooting in 8K to be able to crop segments of the video. There is also this thing when you downsample higher resolutions to lower resolution screens to get smoother looking edges, though I'm not sure that applies to video.
You are quite correct, and it applies to video. Hence cameras having 12K resolution (typically downsampled to 8K) and 6K (typically downsampled to 4K). You could say that this roughly compensates for the resolution loss due to the Bayer colour sampling pattern, since the output TV/monitor panels have full RGB resolution per pixel.

Video quality has increased rapidly, and it's quite understandable. For instance, if you are a wedding photographer, you are documenting something that will be a reference point for decades, generations even. And that is how your customers regard it too - they are recording now to review with their children and grandchildren.

Not everything is about consuming the usual generic opium for the masses. Quite a few people have broader interests.
 
Last edited:

robco74

macrumors 6502a
Nov 22, 2020
509
944
Don't worry about them, they still sell way more PC processors than everyone else, and by a good margin. AMD is the only real threat in that area, and we users don't really need to care about that; x86 will live on. It'll be decades before x86 becomes hard to get, even if there's a major switchover. Same for Windows and Microsoft.

Worry about it when it becomes a problem, and that won't be anytime soon.

And besides, there really isn't anything that special about Arm-based processors; lower power is the only advantage they have, and there's no guarantee they won't be surpassed too. We won't have a real revolution in computing until we give up the current digital processors for something else.
I'm going to disagree that low power is the only advantage. There's also the fact that OEMs can now order customized silicon to meet specific needs. Apple has done quite well here. Google is now dipping a toe into the water. Even with licensed ARM cores, you can mix and match them, add more or fewer and different GPU cores, add other co-processors, encoders/decoders, etc. Of course, with Apple Silicon, we're primarily seeing laptop performance. We're still only about halfway through the transition. But on the server side, we're also seeing more powerful chips on the horizon.

Given the current climate crisis, providing comparable performance at a fraction of the power draw is nothing to sneeze at. For large scale enterprises, the energy savings may very well start to become more important.
 
  • Like
Reactions: cmaier and JMacHack

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
I'm going to disagree that low power is the only advantage. There's also the fact that OEMs can now order customized silicon to meet specific needs. Apple has done quite well here. Google is now dipping a toe into the water. Even with licensed ARM cores, you can mix and match them, add more or fewer and different GPU cores, add other co-processors, encoders/decoders, etc. Of course, with Apple Silicon, we're primarily seeing laptop performance. We're still only about halfway through the transition. But on the server side, we're also seeing more powerful chips on the horizon.

Given the current climate crisis, providing comparable performance at a fraction of the power draw is nothing to sneeze at. For large scale enterprises, the energy savings may very well start to become more important.
The elephant in the room here is the people designing the silicon. iirc Johnny Srouji is ex-Intel, as is a lot of the team who heads Apple Silicon. And Apple acquired PA Semi and the talent therein. Apple Silicon wasn’t an overnight thing, it’s built on a foundation of talent that’s been acquired over years.

We know Intel and AMD have plenty of talented engineers themselves, even though Intel has had internal struggles, a lot of top engineers still work there. Unless other companies can poach the best minds at these companies there’s no guarantee that Google or the like will be able to design processors that out compete the established players. And they’re starting from a disadvantage.

Personally I think the most likely future competitor in the ARM space is NVidia. They already have a great and experienced design team, and the financial power to poach even more talent. Granted, they’re not as big as Intel, but Intel is wedded to x86, and they’re not (likely) going to disrupt that by doing ARM designs.

AMD is a wild card. They’re the only other licensee to x86, and their chips have out-competed Intel for the last couple years. However they have started up ARM development again, and have shown the desire to disrupt the status quo.

I’m more confident in the establishment moving to ARM designs than Google competing with Apple Silicon. Google unfortunately has a history of starting ambitious projects and letting them stagnate. Qualcomm and Samsung I don’t know enough about to say anything.
 
  • Like
Reactions: throAU

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
The elephant in the room here is the people designing the silicon. iirc Johnny Srouji is ex-Intel, as is a lot of the team who heads Apple Silicon. And Apple acquired PA Semi and the talent therein. Apple Silicon wasn’t an overnight thing, it’s built on a foundation of talent that’s been acquired over years.

We know Intel and AMD have plenty of talented engineers themselves, even though Intel has had internal struggles, a lot of top engineers still work there. Unless other companies can poach the best minds at these companies there’s no guarantee that Google or the like will be able to design processors that out compete the established players. And they’re starting from a disadvantage.

Personally I think the most likely future competitor in the ARM space is NVidia. They already have a great and experienced design team, and the financial power to poach even more talent. Granted, they’re not as big as Intel, but Intel is wedded to x86, and they’re not (likely) going to disrupt that by doing ARM designs.

AMD is a wild card. They’re the only other licensee to x86, and their chips have out-competed Intel for the last couple years. However they have started up ARM development again, and have shown the desire to disrupt the status quo.

I’m more confident in the establishment moving to ARM designs than Google competing with Apple Silicon. Google unfortunately has a history of starting ambitious projects and letting them stagnate. Qualcomm and Samsung I don’t know enough about to say anything.
Intel doesn’t have any top engineers.
 

leman

macrumors Core
Original poster
Oct 14, 2008
19,521
19,679
And besides, there really isn't anything that special about Arm-based processors; lower power is the only advantage they have, and there's no guarantee they won't be surpassed too.

Lower power and a sane ISA. The problem with x86 is that it has been all over the place recently, with Intel making major blunders with their SIMD extensions. It is incredibly embarrassing that Apple's 10-thread config (of which 2 threads are anemic) offers more or less identical performance to Intel's newest 24-thread config in FP-heavy workflows, while consuming 6 times less power. It's not that x86 FP units are that much weaker — but you have to use AVX-512 to utilize them fully, which a) is situational at best and b) only ships with selected SKUs and is therefore not a prime target of devs' attention.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
I'm going to disagree that low power is the only advantage.
You can, that's fair, but I do disagree with your disagreement. :)

I'm a software guy, even when I'm doing hardware, and if I can do it on one processor, I can do it on another. Sure, some things might be faster on one processor than another at any given point in time, but you really don't know if that will be true with the next iteration of processors. In other words, any advantage is short-lived.

There's also the fact that OEMs can now order customized silicon to meet specific needs.
How is that an advantage? It just fragments the market more and creates pockets of incompatible hardware, not something that will take over the market, and it gives me *nothing* as a software guy. FWIW, they've *always* had that ability, even when purchasing Intel processors. You pay enough, you get what you want.

It so happens that a general processor with specialized software is much *cheaper* to do and makes better business sense!

Google is now dipping a toe into the water. Even with licensed ARM cores, you can mix and match them, add more or fewer and different GPU cores, add other co-processors, encoders/decoders, etc.
How does that change anything in the PC market currently? The answer is it doesn't, and may never. It may in the future, but the distant future, decades I'm talking about...

Of course, with Apple Silicon, we're primarily seeing laptop performance. We're still only about halfway through the transition. But on the server side, we're also seeing more powerful chips on the horizon.
We're also seeing it on the Intel side -- their roadmap isn't any more static than Arm's.

We've also had different server-side chips for a long time! We have a midrange computer here with a Power9 processor, and now there are Power10s. Its main benefit is database access, and boy can it push and query data like nothing in the PC world, but it also doesn't compete in the PC world.

Given the current climate crisis, providing comparable performance at a fraction of the power draw is nothing to sneeze at. For large scale enterprises, the energy savings may very well start to become more important.
So go with renewable power! That would help *much* more than the greenhouse gas difference between a MacBook Pro and another Intel-based laptop.

Have you ever calculated the cost difference in power between your MacBook and an Intel laptop per year, and then compared it to other costs? Drop in the bucket is putting it mildly! I work for a manufacturing plant; our A/C spinning up every day takes more electricity than all our PCs do in a year. Even in that large enterprise you're talking about, PCs aren't the biggest draw on power.
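For what it's worth, that back-of-the-envelope calculation is easy to run yourself. The wattages, usage hours, and $0.12/kWh rate below are made-up assumptions for illustration, not measured figures:

```python
def annual_energy_cost(avg_watts, hours_per_day, dollars_per_kwh=0.12, days=260):
    """Cost of running a device at avg_watts for hours_per_day over `days` working days."""
    kwh = avg_watts * hours_per_day * days / 1000  # watt-hours -> kilowatt-hours
    return kwh * dollars_per_kwh

# Assumed average draws: 45 W for an Intel laptop, 15 W for an M1 MacBook,
# 8 hours a day, 260 working days, $0.12/kWh.
intel_laptop = annual_energy_cost(avg_watts=45, hours_per_day=8)
m1_laptop = annual_energy_cost(avg_watts=15, hours_per_day=8)
print(f"difference: ${intel_laptop - m1_laptop:.2f}/year")  # single-digit dollars per machine
```

Under those assumptions the gap comes out to a few dollars per machine per year, which is the "drop in the bucket" point: per-PC energy savings only start to matter at a very large fleet scale.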

And we're getting to the point where personal PCs are fast enough for user use, even cheap ones. We no longer need to spend thousands to cover normal user performance needs; a few hundred is all it takes. I'll take a current 6-core i5 over *any* office computer for 99% of users out there, and that's $800 for a good one with plenty of RAM. That makes up for a LOT of difference in cost for a corporation compared to MacBooks. And that's not even getting into OSes, and you know I like backwards compatibility cost-wise!

Now it's true MacBooks have different uses that they are very good at, I'm not arguing against that, but none of those uses make any difference to where I work.

We even have a lot of PLCs in house for controlling/reporting on the manufacturing equipment, and some of those are Arm, some proprietary, some even Intel-chip based. Darn useful too, but not PCs, and I really don't deal with them much; that's for the electricians to do. :)
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
sane ISA.
That kind of argument means absolutely nothing to me; I couldn't care less what someone thinks of the ISA in an Intel processor. For where I work, costs are the most important thing, and Intel/Windows wins there with backwards compatibility.

The problem with x86 is that it has been all over the place recently, with Intel making major blunders with their SIMD extensions.
So? Can't they be overridden with software or firmware? (Yes.)

It is incredibly embarrassing that Apple's 10-thread config (of which 2 threads are anemic) offers more or less identical performance to Intel's newest 24-thread config in FP-heavy workflows, while consuming 6 times less power. It's not that x86 FP units are that much weaker — but you have to use AVX-512 to utilize them fully, which a) is situational at best and b) only ships with selected SKUs and is therefore not a prime target of devs' attention.
It's embarrassing?? Now you're really losing me. I couldn't possibly care less about that. All I care about is: does it do what I want it to do, fast enough, and at a reasonable cost? And that goes the same for me at work as an IT Manager, and me at home as an OS geek.

I already said power was an advantage to Arm -- right now, who knows about the future.
 

leman

macrumors Core
Original poster
Oct 14, 2008
19,521
19,679
That kind of argument means absolutely nothing to me; I couldn't care less what someone thinks of the ISA in an Intel processor. For where I work, costs are the most important thing, and Intel/Windows wins there with backwards compatibility.

Well, if you are fine with the fact that your CPU is underutilized, so be it. But it is strange to hear that from someone who cares about performance.

So? Can't they be overridden with software or firmware? (Yes.)

The fact that you need to write three different versions of your code to take advantage of the full range of x86 CPUs is not something that can be "overridden with firmware". This also relates to your concern about backwards compatibility. High-performance x86 is not backwards compatible at all — it relies on complementary sets of ISA extensions that interact poorly with each other (with major implications for the software industry; there are programming languages forced into weird performance-hurting workarounds because of how x86 works). ARM, on the other hand, is more concerned about backwards compatibility — by providing a sane ISA from the start.
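A rough sketch of what that per-ISA-level versioning looks like in practice: compilers and performance libraries build several copies of a hot kernel and a dispatcher picks one at runtime based on CPU feature flags (the same idea as GCC/Clang function multi-versioning). The function bodies here are trivial placeholders standing in for real SIMD code, and the flag strings mirror the names the CPU reports; none of this is a specific library's API:

```python
def dot_sse2(a, b):
    """Baseline path; SSE2 is always available on x86-64."""
    return sum(x * y for x, y in zip(a, b))

def dot_avx2(a, b):
    """Stands in for a 256-bit-wide implementation."""
    return sum(x * y for x, y in zip(a, b))

def dot_avx512(a, b):
    """Stands in for a 512-bit-wide implementation."""
    return sum(x * y for x, y in zip(a, b))

def select_kernel(cpu_flags):
    """Pick the widest implementation the CPU supports, as a runtime dispatcher would."""
    if "avx512f" in cpu_flags:
        return dot_avx512
    if "avx2" in cpu_flags:
        return dot_avx2
    return dot_sse2

# A consumer chip without AVX-512 falls back to the AVX2 path:
kernel = select_kernel({"sse2", "avx2"})
print(kernel.__name__)  # dot_avx2
```

This is the maintenance burden in question: three implementations of one function, and the fastest path only ever runs on the subset of machines that report the right flag.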

It's embarrassing?? Now you're really losing me. I couldn't possibly care less about that. All I care about is: does it do what I want it to do, fast enough, and at a reasonable cost?

Yes, it's embarrassing, because Intel gives you these very fast CPUs and makes it extremely difficult to use them to their full performance. Basic design mistakes...
 
Last edited:

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Well, if you are fine with the fact that your CPU is underutilized, so be it. But it is strange to hear that from someone who cares about performance.
I am fine with it! Performance of the software is what I'm concerned about, not some mythical benchmark.

The fact that you need to write three different versions of your code to take advantage of the full range of x86 CPUs is not something that can be "overridden with firmware". This also relates to your concern about backwards compatibility. High-performance x86 is not backwards compatible at all — it relies on complementary sets of ISA extensions that interact poorly with each other (with major implications for the software industry; there are programming languages forced into weird performance-hurting workarounds because of how x86 works). ARM, on the other hand, is more concerned about backwards compatibility — by providing a sane ISA from the start.
So it can't possibly be overridden, even if the OS knows about it and the compilers know about it. :)

Can't say I've ever run into any trouble with it personally.

Yes, it's embarrassing, because Intel gives you these very fast CPUs and makes it extremely difficult to use them to their full performance. Basic design mistakes...
Like I said, you left me way behind with that comment.
 