
pshufd

macrumors G4
Oct 24, 2013
10,149
14,574
New Hampshire
Cooling the bottom chips would be mighty interesting, no?

Yup.

He mentioned another patent with a different technique.

There was also something about manufacturing and binning: defects on the interposer result in an M1 Max, while clean, adjacent chips make an M1 Ultra candidate.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
Vadim did a video this morning expecting a quad-max chip at WWDC. The diagram has two stacked Ultra chips implying that the interposers between the two ultra chips are also connected. He refers to some of Apple's patents in this area.
Can TSMC's InFO_LSI technology do that?
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,610
8,628
I feel that competitors simply don't have the means to respond effectively when you consider the final product (i.e. one company doing it all: the OS, apps, components, silicon, hardware). This is the main reason all of the news surrounding Intel's revival attempts, or whether Nvidia or AMD will be able to match Apple, comes across as irrelevant to me.

You may be able to beat Apple in one particular benchmark here or there (which often comes at the expense of everything else), but you will never be able to match the specific experience that Apple is ultimately offering, and that is what will allow Apple to continue extending their lead over the rest of the industry.
I believe AMD/Intel may have the means to, but business reality says they need to keep the low cost and low power chips providing low performance and save high performance for their high power (and high dollar) chips. They HAVE to sell a wide range of solutions at wide price points to make sure they can sell as many of all the different variants as possible. As a result, their low power solutions will, sometimes artificially, be held back from performing well.
 

pshufd

macrumors G4
Oct 24, 2013
10,149
14,574
New Hampshire
I believe AMD/Intel may have the means to, but business reality says they need to keep the low cost and low power chips providing low performance and save high performance for their high power (and high dollar) chips. They HAVE to sell a wide range of solutions at wide price points to make sure they can sell as many of all the different variants as possible. As a result, their low power solutions will, sometimes artificially, be held back from performing well.

The problem is that Apple reset the bar for low-end to M1. Yes, Apple products are relatively expensive, but when the M2 chips roll out, M1 systems will hit the used market at a substantial discount, which will raise the bar at a lower price point.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Can TSMC's InFO_LSI technology do that?

LSI is just interconnect. That's what Apple accomplished in its own way. The interposer necessary to connect two M1 Ultras would require more than interconnect; it would have to contain substantial logic circuitry to make it work.
 
  • Like
Reactions: Xiao_Xi

JimmyjamesEU

Suspended
Jun 28, 2018
397
426
But but but it's not Geekbench and Blender isn't optimized for AS.
Ignoring the part where the presenter says not to take these scores too seriously. Check.
Ignoring the part where the presenter shows improvements between versions on the Mac of 100 to 300%. Check.
Ignoring the use of RT and Tensor cores when compared to the Mac. Check.
Pretending a mature Windows program can be seriously compared to what amounts to a beta on macOS. Check.

Yes indeed, we have an mi7chy full house! It's a bingo!
 

tekboi

macrumors 6502a
Aug 9, 2006
731
145
EasŦcoast
I mean, welcome to the tech industry where companies compete and the latest and greatest doesn't stay the latest and greatest for long.

WTF is the point of this thread? Yeah, it won't be top dog for long. Water is also wet.
 
  • Like
Reactions: AlphaCentauri

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
The interposer necessary to connect two M1 Ultras would require more than interconnect; it would have to contain substantial logic circuitry to make it work.
Can TSMC connect two Ultra SOCs with its current technology?

According to Tom's Hardware, Apple decided to use CoWoS-S for the Ultra SOC because InFO_LSI wasn't ready. What are the benefits of InFO_LSI? Cheaper interposer?
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Can TSMC connect two Ultra SOCs with its current technology?

According to Tom's Hardware, Apple decided to use CoWoS-S for the Ultra SOC because InFO_LSI wasn't ready. What are the benefits of InFO_LSI? Cheaper interposer?

Again, connecting two Ultras would also require a logic die. The logic die could be connected to the two Ultras with InFO_LSI. But InFO_LSI is just a way of wiring chips together.
 
  • Like
Reactions: Xiao_Xi

Rickroller

macrumors regular
May 21, 2021
114
45
Melbourne, Australia
This is the kind of stupidity that fanboyism reveals. It's great that you like Apple products; you should, they are good. But any logical person will realise that Apple is a corporation. Their ultimate goal is to make profits. If they have no serious competition, they will absolutely charge more for their products than they would if they did have serious competition.

And furthermore, serious competition is what drives the Industry forward. If there is no competition, the driving force to make serious improvements to their products by any company diminishes, because if there already is no competition, why should major strides be made? It is more economically feasible to make the minimum amount of effort for maximum profits in the case of no competition.

So yeah, feel free to cheer Apple to succeed. But logically, also cheer their competition. It's just stupid as a consumer not to.
But isn’t it up to the competition…to compete…?
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Can TSMC connect two Ultra SOCs with its current technology?

According to Tom's Hardware, Apple decided to use CoWoS-S for the Ultra SOC because InFO_LSI wasn't ready. What are the benefits of InFO_LSI? Cheaper interposer?


Neither InFO_LSI nor CoWoS-S can do what is being proposed here. The interposer is a chip, but in these technologies it is a trailing-edge chip (e.g., a 5nm chip layered on top of a 14-16nm interposer). It is not layering two chips from the same bleeding-edge fab process on top of each other.

The one on the bottom isn't running much logic (compute) at all. It mainly serves as an adapter so the connection pads/bumps on the bottom of the advanced chip can be smaller and more densely packed without adverse power and space consumption problems on that chip. The older-tech chip takes most of the "scale down from the much larger outside pads/bumps". If the interposer is mainly just running wires and not "computing" much, the wattage it produces is going to be relatively low.

InFO_LSI isn't necessarily cheaper. The precision required is higher (the pads/bumps are smaller, which is better for perf/watt but harder to join with low defects). The interposer is smaller (which helps on interposer die costs), but you have to form a "trench" to put the interposer into.

CoWoS-S allows larger chips to be grouped together. InFO_LSI maxes out at 1x the reticle limit (around 750-850mm^2); CoWoS-S is about 3x. If the Max fits on InFO_LSI, it is "just barely" status.
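A rough back-of-the-envelope check of the reticle argument above. The reticle figures are the ones quoted in this post; the M1 Max die area (~432 mm^2) is an approximate published estimate, and the sketch ignores interposer overhead and die spacing entirely:

```python
# Area-only sanity check of the packaging limits discussed above.
# All figures are approximate; this ignores interposer overhead.

M1_MAX_DIE_MM2 = 432          # approximate published M1 Max die area
INFO_LSI_LIMIT_MM2 = 800      # ~1x reticle (750-850 mm^2 range quoted above)
COWOS_S_LIMIT_MM2 = 3 * 800   # ~3x reticle, per the CoWoS-S figure above

def fits(total_area_mm2, limit_mm2):
    """Pure area comparison -- no spacing, trench, or yield effects."""
    return total_area_mm2 <= limit_mm2

ultra = 2 * M1_MAX_DIE_MM2    # two Max dies side by side

print(fits(M1_MAX_DIE_MM2, INFO_LSI_LIMIT_MM2))  # True ("just barely")
print(fits(ultra, INFO_LSI_LIMIT_MM2))           # False
print(fits(ultra, COWOS_S_LIMIT_MM2))            # True
```

Which lines up with the post: a single Max squeaks under the 1x reticle limit, while two of them need the roomier CoWoS-S envelope.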


What is being proposed is two full-blown, highly computational logic chips glued on top of each other. Nobody is doing that. Intel Foveros might put an I/O chip on the bottom layer, but that is not where high-performance CPU/NPU/GPU cores go.
 
Last edited:
  • Like
Reactions: Xiao_Xi

oz_rkie

macrumors regular
Apr 16, 2021
177
165
But isn’t it up to the competition…to compete…?
Maybe read the entire conversation that was happening to get the full context. I was merely responding to those who were happy with there being no competition. I stated that it is in the best interest of everyone (including Apple customers) for there TO BE serious competition between Apple, Intel, AMD, etc.
 

Abazigal

Contributor
Jul 18, 2011
20,392
23,893
Singapore
Maybe read the entire conversation that was happening to get the full context. I was merely responding to those who were happy with there being no competition. I stated that it is in the best interest of everyone (including Apple customers) for there TO BE serious competition between Apple, Intel, AMD, etc.

For all the good that does. Half the world can hope for better competition and it won’t change a single thing. Only concrete physical action does. As such, I see no shame in celebrating and revelling in the current status quo, where Apple continues its ascent unabated, and I encourage the rest to rejoice in this as well.

Viva la Apple!
 
  • Haha
Reactions: oz_rkie

robco74

macrumors 6502a
Nov 22, 2020
509
944
I'm happy that Apple is actually offering up something different instead of just cobbling together off the shelf parts like every other OEM. I'm much happier with my new ASi laptop than I was with my Intel Macs. They are no longer constrained by what Intel and AMD are offering and that is refreshing.

I'm sure there will be competition if/when WoA is no longer limited to Qualcomm chips and we start seeing others compete. For now though, Apple is offering something different than just another machine built around x86.
 

Abazigal

Contributor
Jul 18, 2011
20,392
23,893
Singapore
I think it’s way better than parroting motherhood statements like “competition is better for consumers” irrespective of the current situation.

Take the Apple Watch for example. There’s just so much distance Apple has put between itself and the competition that really, the only thing the competition has to be thankful for is that, for now, the Apple Watch is still tied to the iPhone. Yet people still parrot this line ad nauseam when the reality is that Apple is facing more competition from its older line of products (and not just for the Apple Watch) than from competing brands.

Maybe there are times when this may be warranted, and then there’s Apple vs the rest of the industry, and this is where I find that said statement just falls apart.

Something about it just rubs me the wrong way. I am still in the midst of putting together a cogent response intended to demolish this statement, though it probably won’t be ready anytime soon, but yeah, something about it just irritates me.
 
Last edited by a moderator:
  • Like
Reactions: tmoerel

oz_rkie

macrumors regular
Apr 16, 2021
177
165
I think it’s way better than parroting motherhood statements like “competition is better for consumers” irrespective of the current situation.

Take the Apple Watch for example. There’s just so much distance Apple has put between itself and the competition that really, the only thing the competition has to be thankful for is that, for now, the Apple Watch is still tied to the iPhone. Yet people still parrot this line ad nauseam when the reality is that Apple is facing more competition from its older line of products (and not just for the Apple Watch) than from competing brands.

Maybe there are times when this may be warranted, and then there’s Apple vs the rest of the industry, and this is where I find that said statement just falls apart.

Something about it just rubs me the wrong way. I am still in the midst of putting together a cogent response intended to demolish this statement, though it probably won’t be ready anytime soon, but yeah, something about it just irritates me.
I don't know if you are trolling, are unable to comprehend well enough, or some other reason that you are failing to grasp what the conversation is about, i.e. the need for competition for the Industry to thrive. Anyhow, be ignorant if you want to I guess.
 
  • Like
Reactions: arvinsim

Abazigal

Contributor
Jul 18, 2011
20,392
23,893
Singapore
I don't know if you are trolling, are unable to comprehend well enough, or some other reason that you are failing to grasp what the conversation is about, i.e. the need for competition for the Industry to thrive. Anyhow, be ignorant if you want to I guess.

Your original position was that a lack of competition would lead to higher prices by Apple. I have been thinking about this since I read your argument yesterday, and I disagree with it on a fundamental level. I am still not 100% satisfied with the way I have crafted my response, but here goes.

It is my observation that Apple’s pricing strategy is not based on the idea of forcing users to pay an “Apple Tax”. Instead, Apple prices its products in a way that maximises gross margin and revenue on an absolute basis.

For example, it’s common to poke fun at Apple users for their apparent cluelessness in paying more for an Apple-branded laptop despite cheaper alternatives being available. Over time, this has morphed into a form of criticism aimed at Apple products which are higher-priced than the competition (which in Apple’s case, is basically everything they sell).

I disagree with this line of logic for the simple reason that Apple doesn’t license their software to third parties, so there is no way of knowing just how much a third-party vendor would have sold an equivalent MacBook for. As such, a MacBook running Apple software ends up offering a very different experience compared to a Windows laptop, even if both have similar hardware specs on paper.

As such, I feel it is more accurate to say that any “Apple Tax” actually reflects the value that Apple believes its software affords users, rather than some arbitrary premium conjured out of thin air.

So even if there were less or no competition in the market tomorrow, I don’t believe that Apple would meaningfully raise their prices, because of the laws of supply and demand. What Apple likely has done (and will continue to do) is forecast how much a product’s price will impact consumer demand for it. The higher a product is priced, the fewer units it will sell. Price it too low, and you leave money on the table. The trick is to find the sweet spot where units sold x gross margin (revenue - costs) is at its highest.
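The "sweet spot" logic above can be sketched numerically. Everything in this toy model is invented for illustration (the linear demand curve, the unit cost); it just shows profit = units(price) x (price - cost) peaking at an interior price rather than at the highest price the market would bear:

```python
# Toy model of the pricing argument above: pick the price that
# maximizes units_sold(price) * (price - unit_cost).
# The demand curve and cost figures are invented purely for illustration.

UNIT_COST = 600  # hypothetical build cost per unit

def units_sold(price):
    """Hypothetical linear demand: fewer buyers as price rises."""
    return max(0, 10_000 - 4 * price)

def profit(price):
    return units_sold(price) * (price - UNIT_COST)

# Scan every candidate price between cost and the price where demand hits zero.
best = max(range(UNIT_COST, 2_501), key=profit)
print(best, profit(best))  # -> 1550 3610000
```

Note the optimum ($1,550 here) sits well below the $2,500 ceiling where demand dries up: charging the maximum the model allows would leave money on the table, which is the point being made above.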

It may be tempting to look at the way Apple prices some of their products and conclude that Apple is trying to milk its user base. However, I believe that their incentive is not to squeeze every last dollar out of our pockets, but to expand the Apple user base and provide us with great experiences.

We see this in Apple’s declining hardware margins (caused mainly by higher iPhone costs, and partially offset by services revenue, contrary to popular criticism). We see this in Apple expanding their product portfolio to cover a wider range of prices.

It is this design-led business model (focusing on the experience rather than the technology) that has enabled Apple to grab monopoly-like share of industry profits. Not by charging an Apple Tax, but by making great products that people are willing to pay a premium for.

So yeah, I disagree with your original premise that a lack of competition would lead to Apple charging its users more for products, or that it may somehow lead to a slowing down in innovation, because that’s not what drives Apple. They don’t aim to outdo the competition, but to make great products for their users.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Precisely my point. You seem to be contradicting yourself. It's great that you don't NEED the most badass components, but your posts are written in such a way as to imply that just because you don't need or want it, no one else does either. Here, read the first words in your original post -
Yes I said NEED.....NEED. It was JayZTwoCents that said it too, and he has a large audience. So it's not just me. People try to convince others that they need three 3090s, a Threadripper, and 256GB of RAM to even play games.

I am even playing the hot new game at the moment on my GTX 1080. Take a look at Elden Ring's system requirements. So again, how is a 3090 NEEDED? It's a brand new game, released in 2022, that an old GTX 1080 can play at around 60 FPS on average. And the M1 Ultra is at least better than a 1080.

[Attachment: EldenRing.png - Elden Ring system requirements]
 
Last edited:
  • Like
Reactions: AlphaCentauri

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
My gosh people. Can't we have civil conversations on this website anymore? I see a lot of "You are not arbiter of Unix usage" or "you don't speak for everyone" or "neither are you" conversations. What happened to being nice to everyone? Every thread is just way too tiring on this website anymore.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
My gosh people. Can't we have civil conversations on this website anymore? I see a lot of "You are not arbiter of Unix usage" or "you don't speak for everyone" or "neither are you" conversations. What happened to being nice to everyone? Every thread is just way too tiring on this website anymore.

It's a very exciting time in computer technology. It reminds me a lot of the '80s, because we finally once again have a variety of system architectures, form factors, etc., and it's just beginning. People care about CPUs again, there's competition (though for the moment it's a bit lopsided), and there are a lot of great things to talk about. What will AMD do? What's Nvidia's plan without owning Arm? Can Intel meet its goals? But it's getting increasingly difficult to talk about all that here.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
It's a very exciting time in computer technology. It reminds me a lot of the '80s, because we finally once again have a variety of system architectures, form factors, etc., and it's just beginning. People care about CPUs again, there's competition (though for the moment it's a bit lopsided), and there are a lot of great things to talk about. What will AMD do? What's Nvidia's plan without owning Arm? Can Intel meet its goals? But it's getting increasingly difficult to talk about all that here.
I agree! I like everything! I still love Intel in my Windows gaming PCs. But the minute you or I say anything POSITIVE about Apple we are labeled as fanboys by some people. Can't we just please have nice conversations! It was an issue 10 years ago, but for some reason it has gotten much MUCH worse!

I am curious about Intel's approach to the big/little P/E cores and how AMD will improve. We can like multiple things and discuss differences PEACEFULLY! It's always nice to have debates, but we should have them peacefully.
 

oz_rkie

macrumors regular
Apr 16, 2021
177
165
Yes I said NEED.....NEED. It was JayZTwoCents that said it too and he has a large audience. So its not just me. People try to convince others that they need three 3090s and a threadripper and 256GB of RAM to even play games.

I am even playing the new HOT game at the moment on my GTX 1080. Take a look at Elden Ring's system requirements. So again, how is a 3090 NEEDED?

View attachment 1972612
Man, I am pretty sure at this point that you are trolling, but I will bite one last time. Like I said before, you don't need a 3090. But scroll back a few pages and read your own post. You were making a general statement 'Pro tip: you don't need the biggest baddest component' - here if you want to re-read it https://forums.macrumors.com/thread...idia-in-just-6-7-months.2337279/post-30924781

Like I said time and time again, yes, you might not need a 3090; someone else might, though. Try playing Cyberpunk at 4K with RT on a 1080 (you can't even enable RT on a 1080), and let me know what fps you get.

My gosh people. Can't we have civil conversations on this website anymore? I see a lot of "You are not arbiter of Unix usage" or "you don't speak for everyone" or "neither are you" conversations. What happened to being nice to everyone? Every thread is just way too tiring on this website anymore.
You literally do the same thing though, lol. Read your own post that I linked. Just because you don't need the best GPU does not mean someone else doesn't. Do you get that? Do you get that if someone WANTS to do something (like playing a high-end game at 4K ultra), they NEED the things that allow them to do it (like a high-end GPU)?

If you want to talk about not NEEDING things at a philosophical level, why are you even here on this forum typing things on your computer? From your point of view, the basic NEED of a human being is just enough food to not starve and a roof over their head. Everything else, per your logic, is not a NEED. What utter nonsense.
 
  • Like
Reactions: MayaUser

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Man, I am pretty sure at this point that you are trolling but I will bite one last time. Like I said before, you don't need a 3090. But scroll back a few pages and read your own post. You were making a general statement 'Pro tip: you don't need the biggest baddest component' - here if you wan't to re-read it https://forums.macrumors.com/thread...idia-in-just-6-7-months.2337279/post-30924781
You countered my statement saying I don't speak for everyone, but you JUST proved my point. You don't NEED.....NEED a 3090. That was my pro-tip. You confirmed it yourself. So why are you getting on my case when you CONFIRMED IT?

This whole topic is about the M1 Ultra being irrelevant when the new NVIDIA/AMD cards come out. And I just said you don't NEED the biggest baddest component. Seems relevant to the conversation at hand. People are still gaming on a GTX 1080 (myself included) on brand new games released in 2022. So starting a topic about how the M1 Ultra can't POSSIBLY compete with the 40-series NVIDIA is pointless. The M1 Ultra will still remain relevant, just like my GTX 1080 is today.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
I agree! I like everything! I still love Intel in my Windows gaming PCs. But the minute you or I say anything POSITIVE about Apple we are labeled as fanboys by some people. Can't we just please have nice conversations! It was an issue 10 years ago, but for some reason it has gotten much MUCH worse!

I am curious about Intel's approach to the big little P/E cores and how AMD will improve. We can like multiple things and discuss differences PEACEFULLY! Its always nice to have debates, but we should have them peacefully.

The problem Intel has is that even if it sat down with a clean sheet of paper and designed P and E cores that were meant to work together (e.g. each supporting the same instructions, with neither repurposed from another microarchitecture), both the P and the E cores would still need to decode variable-length instructions. That penalty can never be overcome except by a superior fab process, and there is no indication that they will beat TSMC any time soon. Long term, they need to come up with a RISC design that the market accepts.
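A toy illustration of the decode point (both "ISAs" here are invented; real x86 and ARM decoding is far more involved): with fixed-width instructions every boundary is known up front, so a wide front end can hand all of them to decoders in parallel, while with variable-width encoding each instruction's length must be read before the next instruction can even be located:

```python
# Toy sketch of why variable-length encoding hampers wide decode.
# Both instruction formats below are invented for illustration only.

def fixed_width_boundaries(code: bytes, width: int = 4):
    """Fixed width: every instruction start is known immediately,
    so all of them could be decoded in parallel."""
    return list(range(0, len(code), width))

def variable_width_boundaries(code: bytes):
    """Variable width (first byte = instruction length): each boundary
    depends on inspecting the previous instruction, a serial dependency."""
    starts, i = [], 0
    while i < len(code):
        starts.append(i)
        i += code[i]  # must read this instruction to find the next one
    return starts

# A made-up program of instructions with lengths 2, 3, 1, and 4 bytes.
prog = bytes([2, 0, 3, 0, 0, 1, 4, 0, 0, 0])
print(fixed_width_boundaries(bytes(8)))  # [0, 4]
print(variable_width_boundaries(prog))   # [0, 2, 5, 6]
```

Real designs mitigate this with predecode bits, boundary caches, and uop caches, but that machinery costs area and power, which is the penalty being described above.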

They’d still suffer from not controlling the entire software stack, but they could get to “good enough.”
 