
mr_roboto

macrumors 6502a
Sep 30, 2020
The regions are called GOPs (Groups of Pictures) and for GOP compression algorithms like H264, HEVC, VP9, etc, each GOP is usually independent of others. This means each one can be processed in parallel, given the available hardware.
Can be processed in parallel in theory. How do you know that, in practice, the hardware has been designed to support that? You don't have the specs for Apple's video codec blocks, yet here you are effectively claiming that Apple doesn't know how to write drivers for its own peripherals.

I submit that the Occam's razor explanation here is "the hardware can't do that, actually", and the answer to your question about why Apple put in so many hardware codecs is that they know there are video editing packages which take advantage of them: video editing projects frequently contain many source clips, and those can be the unit of parallelism.
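For what it's worth, the clip-level parallelism idea is easy to sketch. This is a toy illustration only: `encode_clip` is a hypothetical stand-in for submitting one clip to one hardware encoder block, not any real API:

```python
from concurrent.futures import ThreadPoolExecutor

def encode_clip(clip):
    # Hypothetical stand-in: a real editor would hand the clip to one of
    # the SoC's hardware encoder blocks and wait for it to finish.
    name, frames = clip
    return name, f"encoded {frames} frames"

def encode_project(clips, num_encoders=2):
    # Each source clip is independent, so clips can be farmed out to as
    # many hardware encoder blocks as the chip provides.
    with ThreadPoolExecutor(max_workers=num_encoders) as pool:
        return dict(pool.map(encode_clip, clips))

project = [("intro.mov", 300), ("interview.mov", 7200), ("outro.mov", 450)]
print(encode_project(project))
```

The point is just that independent clips map naturally onto a pool of encoder instances, whatever the actual hardware supports.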

mr_roboto

macrumors 6502a
Sep 30, 2020
Actually it is, at least for LCD TVs, and I would think the same principles apply to panels for LCD computer displays. The "mother glass" produced by a modern (Gen 10.5) LCD plant is quite large; it is then cut up to produce display panels of various sizes:
"Mother glass" means something very analogous to a silicon wafer, and the processes used to prepare it to become part of an LCD stack are semiconductor manufacturing processes. They're printing thin-film transistors (TFTs) and wires on the glass. The materials and feature sizes are different, but it's recognizably the same kind of thing.

Since TFT LCD glass is an electronic circuit, you can't just chop it up to reuse partially defective panels the way you're envisioning. The mother glass is printed, then the cutting is done according to what was printed, then defects are scrapped because they can't be used. (Or they might get repaired - feature size on TFT panels should be large enough to make that practical to some extent.)

The quotes you gave later in your post were talking about advances which allow printing and cutting out differing sizes of LCD panel on a single sheet of mother glass. Presumably the older cutter tech was only capable of doing straight-line cuts across the entire sheet, which limits flexibility. But this is still talking about an up-front decision - you choose how many and which sizes are going to be printed beforehand. I don't think there's any hint of a yield harvesting scheme.

AtheistP3ace

macrumors 6502a
Sep 17, 2014
Philly
Sit back and enjoy their tea, since even the 12th gen can't hold a candle to the 1.5-year-old M1 architecture.

Yes, Intel can beat it in multicore. But tbh, Intel's best i9 Alder Lake-H delivering 50% more performance than an M1 Pro/Max while consuming 300% more (4 times!) the power is not "beating Apple".

And looking at how Alder Lake-P performs, Intel can't compete there either. The i7-1280P in a Dell XPS 13 Plus gets around 11k Cinebench R23 points at roughly 37 W (with short bursts above 60 W), while an M1 Pro/Max under 34 W gets about 12,500 points, and stays much cooler and quieter.
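Putting those rough numbers into a quick points-per-watt comparison (the figures come from the post above, not from my own measurements):

```python
def points_per_watt(cinebench_r23, watts):
    """Crude efficiency metric: benchmark score per watt of package power."""
    return cinebench_r23 / watts

# Rough figures quoted in the discussion; treat them as illustrative only.
i7_1280p = points_per_watt(11000, 37)   # Dell XPS 13 Plus, sustained
m1_pro   = points_per_watt(12500, 34)   # M1 Pro/Max, <34 W

print(f"i7-1280P: {i7_1280p:.0f} pts/W, M1 Pro: {m1_pro:.0f} pts/W")
print(f"M1 Pro advantage: {m1_pro / i7_1280p:.2f}x")
```

On those assumed figures, the M1 Pro comes out around 1.2x more efficient, on top of being faster outright.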

"Better than the competition" should mean far more than the highest benchmark numbers, especially since 100% load is extremely rare in most real-world usage.

Let's be practical here: I still use my M1 MacBook Pro 13" from late 2020. I can run a Windows 11 VM, update things, join a Zoom session, work in Anaconda Jupyter notebooks, and maybe do some SQL Server work inside the Windows 11 VM (business-intelligence apps like Microsoft SQL Server Management Studio or Visual Studio SSDT, because they're Windows-only), and it's 100% silent because it stays cool. The battery barely drops at all; I literally have no idea how I could drain it within 12 hours or so, unless I ran Cinebench all day.
And it doesn't feel like it's throttling; it's just fast.
I would really like the M1 Pro upgrade, but I can't even push the M1 to its limit.

Or, to put it more briefly: I can do all my work without issues, the battery lasts forever, and I. never. hear. a. noise.*
Not for a single second in those 1.5 years has the fan turned on, unless I was running benchmarks.
As long as Intel and AMD can't hit THIS standard, even the M1 series will be better than the competition.
(Naturally, ignoring applications like gaming or Windows-only / x86-only applications someone might need.)
Sorry, Intel. But if you need to push 12 cores to 4.5 GHz when I'm just playing a YouTube video or moving the mouse around, you're doing it wrong.


* My ThinkPad T14 G2 i7 from work... well, I do almost nothing on it, literally. Typing some numbers into Excel, scrolling through Outlook, power plan set to "better battery life" (one step above the lowest, two away from max performance), and I can't get this thing to consume less than 15% per hour even when CPU usage is below 4%.
I don't touch it for 30 minutes, and 7% of the battery is just gone.
Sometimes the fans turn on. Sometimes I don't touch it, and the fans won't stop.
It's so annoying.

Who holds the "benchmark crown" is not really important in the real world; it's simply one factor out of many.
How did you get SQL Server running in a Windows 11 ARM VM on an M1 CPU? I must know, as this is the only application I need that I still haven't found a solution for. I have an M1 Max on order for work, but I've been playing around with my wife's M1 Pro 14.

Darkseth

macrumors member
Aug 28, 2020
How did you get SQL Server running in a Windows 11 ARM VM on an M1 CPU? I must know, as this is the only application I need that I still haven't found a solution for. I have an M1 Max on order for work, but I've been playing around with my wife's M1 Pro 14.
Just to be sure, since there are so many applications that sound similar (I just started a business-intelligence course this semester, so I'm a complete beginner there):
I was referring to this one: https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-ver15
I used the "2017 standalone installer" and simply installed it inside the Windows 11 ARM VM. There was an error a few times (but that happens on most Windows PCs too, from what I've seen at my university and on my own Windows machine); by about the fourth attempt the installation went through, and after that I could start the application.

Also this one works fine for me: https://docs.microsoft.com/en-us/sq...-management-studio-ssms?view=sql-server-ver15

If you link me the exact application, I can try whether it runs for me.

But if you can't install it, I'd try "Run as administrator" on the installation file, or right-click -> Properties -> Compatibility mode set down to Windows 10, 8, or 7.
I had to use this for a little finance-game application.

theorist9

macrumors 68040
May 28, 2015
"Mother glass" means something very analogous to a silicon wafer, and the processes used to prepare it to become part of an LCD stack are semiconductor manufacturing processes. They're printing thin-film transistors (TFTs) and wires on the glass. The materials and feature sizes are different, but it's recognizably the same kind of thing.
Yes, that's correct. I had in mind the same analogy.

Since TFT LCD glass is an electronic circuit, you can't just chop it up to reuse partially defective panels the way you're envisioning. The mother glass is printed, then the cutting is done according to what was printed, then defects are scrapped because they can't be used. (Or they might get repaired - feature size on TFT panels should be large enough to make that practical to some extent.)
You've misunderstood me. The inability to reuse defective panels was something brought up by @Unregistered 4U. But if you read my original post on this, you'll see the discussion was never about that. It was instead about economies of scale, and whether LG needs a separate panel production process for the 5K 27", or whether that same production process can be used to produce all of Apple's 218 ppi panels, i.e. those in the 24" iMac, the 27" Studio Display, and the 32" Pro Display XDR.

I was saying I think it's at least possible the latter is the case since, unless there are other differences in their panels that preclude this, all three could be cut from the same mother glass. And based on what you wrote here, it seems you agree:

The quotes you gave later in your post were talking about advances which allow printing and cutting out differing sizes of LCD panel on a single sheet of mother glass.

Presumably the older cutter tech was only capable of doing straight-line cuts across the entire sheet, which limits flexibility. But this is still talking about an up-front decision - you choose how many and which sizes are going to be printed beforehand. I don't think there's any hint of a yield harvesting scheme.
Yes, this is known as Multi-Model on a Glass (MMG):


satcomer

Suspended
Feb 19, 2008
The Finger Lakes Region
The M1 was based on the processor that powered iOS devices! That was 2 years ago, so Apple can do it again with newer generations of those chips! Apple, and ARM in general, will have at least a generational advantage! Heck, even Microsoft made an ARM version, so suck on that!

mr_roboto

macrumors 6502a
Sep 30, 2020
You've misunderstood me.
I did misunderstand, and we do agree.

That said, I will point out a slight subtlety here - it's probably not significant that many Apple displays share the same PPI value. As you can see in that Samsung page about MMG, they're able to lay out TVs of radically different size on the same mother glass. Most TVs have the same pixel count, implying that pixel pitch can be very different. That makes sense, since lithography processes can print both big and small features at the same time.

I'd guess the main factor which determines whether two panel types can be laid out on the same sheet of mother glass is whether the pixel design of two different panels requires any incompatible differences in the processing steps used to print/etch layers.
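As a side note, the shared-density claim for Apple's three 218 ppi displays is easy to sanity-check from their commonly published resolutions and diagonals (the spec numbers below are my own inputs, not something from this thread):

```python
from math import hypot

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch from pixel dimensions and diagonal size."""
    return hypot(h_px, v_px) / diagonal_in

# Commonly published specs (assumed here for illustration):
panels = {
    "iMac 24 (4.5K)":       (4480, 2520, 23.5),
    "Studio Display (5K)":  (5120, 2880, 27.0),
    "Pro Display XDR (6K)": (6016, 3384, 31.5),
}
for name, spec in panels.items():
    print(f"{name}: {ppi(*spec):.1f} ppi")
```

All three land within a couple of ppi of 218, which is the sense in which they share a pixel pitch.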

Unregistered 4U

macrumors G4
Jul 22, 2002
Actually it is, at least for LCD TVs, and I would think the same principles apply to panels for LCD computer displays. The "mother glass" produced by a modern (Gen 10.5) LCD plant is quite large; it is then cut up to produce display panels of various sizes:
Ok, the way you had described it (or, more likely, how I understood you writing it) was just that there’s a large panel that they produce and then chop it up after. Instead, ahead of making the mother glass, they define all the panels that will come out of the process. If the mother glass is designed to have 9 panels of a certain size, then that’s all of the panels that will come from that mother glass.

If they’ve produced a 27” panel and then determine that they’d rather have a 24” panel, they can’t just shave off the edges of the 27” to get a 24”. So, they’d have to specifically buy separate 24” and 27” panels. And, as of now, only one company is even taking the effort to lay out 5K 27” panels.

theorist9

macrumors 68040
May 28, 2015
I did misunderstand, and we do agree.

That said, I will point out a slight subtlety here - it's probably not significant that many Apple displays share the same PPI value. As you can see in that Samsung page about MMG, they're able to lay out TVs of radically different size on the same mother glass. Most TVs have the same pixel count, implying that pixel pitch can be very different. That makes sense, since lithography processes can print both big and small features at the same time.

I'd guess the main factor which determines whether two panel types can be laid out on the same sheet of mother glass is whether the pixel design of two different panels requires any incompatible differences in the processing steps used to print/etch layers.
Ok, the way you had described it (or, more likely, how I understood you writing it) was just that there’s a large panel that they produce and then chop it up after. Instead, ahead of making the mother glass, they define all the panels that will come out of the process. If the mother glass is designed to have 9 panels of a certain size, then that’s all of the panels that will come from that mother glass.

If they’ve produced a 27” panel and then determine that they’d rather have a 24” panel, they can’t just shave off the edges of the 27” to get a 24”. So, they’d have to specifically buy separate 24” and 27” panels. And, as of now, only one company is even taking the effort to lay out 5K 27” panels.
In one of the articles they said that if you didn't cut the panel at all, the resulting screen diagonal from a Gen 10.5 plant would be 110". So now I'm envisioning one of the plant managers where they're making 27" Retina panels saying: "Leave one of those uncut for me 😁."

That would be... let's see... about 20k pixels across, which works out to ~870 Gb/s uncompressed at 120 Hz and 10 bits per channel. So with TB4 (assuming we don't have DP Alt Mode 2.0), that would need only 22.5:1 compression. But with DSC, that's visually lossless, right? ;)
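That back-of-the-envelope math can be written out explicitly. The 16:9 aspect ratio and 218 ppi inputs are my assumptions for the hypothetical uncut panel, and ~40 Gbit/s is TB4's nominal link rate, so treat the outputs as rough:

```python
from math import hypot

def uncompressed_gbps(diag_in, ppi, hz, bits_per_px=30, aspect=(16, 9)):
    """Uncompressed video bandwidth in Gbit/s for a panel of the given
    diagonal (inches), pixel density (ppi), and refresh rate (Hz)."""
    diag_px = diag_in * ppi
    h_px = diag_px * aspect[0] / hypot(*aspect)   # horizontal pixels
    v_px = diag_px * aspect[1] / hypot(*aspect)   # vertical pixels
    return h_px * v_px * bits_per_px * hz / 1e9

# Hypothetical uncut 110" panel at 218 ppi, 120 Hz, 10 bits/channel:
gbits = uncompressed_gbps(110, 218, 120)
print(f"~{gbits:.0f} Gbit/s uncompressed; over a ~40 Gbit/s TB4 link "
      f"that needs about {gbits / 40:.1f}:1 compression")
```

The result lands in the high-800s of Gbit/s, i.e. a compression ratio in the low 20s over a single TB4 link, consistent with the estimate above.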

Dismayed

macrumors member
Apr 30, 2022
I doubt it had much to do with the 5nm node. Sure, the node helps with efficiency, but the gap is just too wide.

Apple's design philosophy is very different from Intel's or AMD's. Apple relies on very wide execution backends and extremely large caches. They start with low power consumption and extend it to performance; Intel does it exactly the other way round.




There is no indication that Intel is anywhere close to reaching the M1 in efficiency, so unless they come out with a radically new core design that takes lessons from how Apple does things, I wouldn't worry about it. Similarly, Intel's fastest enthusiast CPUs are around 20% faster in single core than the M1, while consuming close to 10x the power. There is not much headroom left there, and it's not an approach you can use for laptops (and you can clearly see that premium mobile Alder Lake barely outperforms even the old M1).

So yeah, first I want to see Intel getting anywhere, because so far, they are not. Alder Lake is great of course, but it doesn't bring any noteworthy increase in efficiency, and its praised performance improvements boil down to the massive increase in cores, which gives you good results in some popular benchmarks. If that's the way the wind is blowing, well, Apple can easily add a couple of CPU cores to the next gen and get back ahead. The real trick is getting that kind of per-core performance at 5 watts, and so far only Apple can do that.



That would have been a terrible choice. Sure, sticking with Intel could give us marginally faster desktops today (and even that's not guaranteed). But the real strength of Apple Silicon is a new unified programming model: CPU, GPU, vector coprocessors, ML coprocessors, unified memory. Developing and testing becomes simpler, and it unlocks new programming paradigms. Apple Silicon is a truly heterogeneous system with multiple programmable processors that can work in unison. Can't really do that with x86 - yet. There are indications that they feel the pressure and want to get there; for example, they have been discussing multi-chip standards so that one can build complex systems from components made by different vendors. We will see how that goes.



Agreed. There is a lot new tech coming and it’s unclear how it will change computing. I am looking forward to the new innovations!
Intel is hard at work writing new benchmarks that lower clock speeds and disable cores of competitors’ chips.

MayaUser

macrumors 68040
Nov 22, 2021
To the OP - How long? It's easy: just watch Johny Srouji's track record and you'll have the answer to your question.
Intel was on top when Johny Srouji was there; since Apple took him, the A series became a marvel to watch.
So the answer is: until Johny Srouji leaves Apple. But even if he retires, Apple can still be on top of the game after that, since he laid the path and the foundation. I don't think he will leave for AMD or Qualcomm or anyone else... but time will tell.

Andropov

macrumors 6502a
May 3, 2012
Spain
To the OP - How long? It's easy: just watch Johny Srouji's track record and you'll have the answer to your question.
Intel was on top when Johny Srouji was there; since Apple took him, the A series became a marvel to watch.
So the answer is: until Johny Srouji leaves Apple. But even if he retires, Apple can still be on top of the game after that, since he laid the path and the foundation. I don't think he will leave for AMD or Qualcomm or anyone else... but time will tell.
While I don't question Srouji's skills, Apple has been hiring a lot of top talent in the field. I don't think a single person changing employers would make or break Apple's track record.

Krevnik

macrumors 601
Sep 8, 2003
But given the lead time in silicon design and fabrication, Apple has known about the M1 Ultra for years. It would seem that's enough lead time for a single company to coordinate hardware and software they totally control.

Thanks for the refresher. By region, I was thinking of regions within a frame rather than GOPs, but GOPs are a much easier way to slice things, for sure. I could swear that some encoders I worked with years ago would dynamically adjust the length of a GOP if a frame cropped up that was so expensive in terms of bits that it was worth starting a new GOP with a new key frame. I'd imagine that sort of logic might throw a wrench in things, though.

That said, you'd think Apple would be able to coordinate, but they are extraordinarily secretive. At times, their keynotes are as much for the employees to find out what's going on as they are for the public. So I wouldn't be surprised if there are some gaps in that coordination due to the secrecy.
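The adaptive-GOP idea described above can be sketched as a simple segmentation loop; the cost values and thresholds here are made up purely for illustration:

```python
def split_into_gops(frame_costs, max_gop_len=60, cut_threshold=0.8):
    """Group frames into GOPs, forcing a new key frame whenever a frame's
    predicted inter-coding cost spikes (a likely scene cut) or the GOP
    reaches its maximum length. frame_costs: per-frame cost estimates 0..1."""
    gops, current = [], []
    for cost in frame_costs:
        if current and (cost >= cut_threshold or len(current) >= max_gop_len):
            gops.append(current)     # close the old GOP...
            current = []             # ...and start a new one with a key frame
        current.append(cost)
    if current:
        gops.append(current)
    return gops

costs = [0.1] * 10 + [0.95] + [0.1] * 5   # cost spike at frame 10 = scene cut
print([len(g) for g in split_into_gops(costs)])
```

A real encoder would estimate per-frame cost from motion estimation; the part this sketch tries to capture is the decision structure (a scene-cut spike or the maximum GOP length forces a new key frame), which also shows why GOP boundaries stop being predictable in advance.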

senttoschool

macrumors 68030
Nov 2, 2017
While I don't question Srouji's skills, Apple has been hiring a lot of top talent in the field. I don't think a single person changing employers would make or break Apple's track record.
This is probably true, as the Apple Silicon effort involves thousands of people. Still, we may see competitors catch up or get close to Apple in the next 3-5 years, since they are poaching talent.

Appletoni

Suspended
Mar 26, 2021
MacBook still runs at full power and gets 18hrs of battery life…people seem to forget this.
Run lots of different software, other benchmarks, chess engines, etc., and you will see that a MacBook running at full power lasts only ~2 hours, not 18😂.

darngooddesign

macrumors P6
Jul 4, 2007
Atlanta, GA
M1 Ultra inside the MacBook Pro 16-inch.
It's doable....just gotta leave room for the cooling system. On the plus side, there's space for SD, CF Express, and XQD slots, not to mention a USB hub.


Unregistered 4U

macrumors G4
Jul 22, 2002
Looking at a lot of other benchmarks, chess, …, it was not the fastest single core.
I would assume everything other than the raw low-level computation benchmarks would vary wildly, because once a benchmark includes anything beyond simple computation, we're depending on the skill of the developer (or developers, if the tool is open source and is being tweaked over time) to know how to write code for the target platform.

Just like there will always be some Intel PC somewhere that's faster than everything Apple makes, there will always be mature Intel-focused benchmarks that perform better on Intel processors than on anything Apple makes. And I don't think that's invalid: if a user's primary use case is executing well-tuned Intel-optimized code, those benchmarks will show that such code runs best on Intel-based systems.

senttoschool

macrumors 68030
Nov 2, 2017
It's doable....just gotta leave room for the cooling system. On the plus side, there's space for SD, CF Express, and XQD slots, not to mention a USB hub.

It's actually doable without the laptop looking like this.

Intel's Alder Lake laptop chips can boost to 157 W, roughly double the maximum power draw of the M1 Max (100% CPU + GPU). If Apple let the 16" MacBook Pro run hotter and throttled the M1 Ultra whenever the user loads the CPU and GPU heavily at the same time, they could do this now.

The upside would be 20-core CPU performance or 64-core GPU performance in a laptop as thin as the 16" MacBook Pro, as long as you don't max out the CPU and GPU simultaneously.