
mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
The Mac really took off when DTP (Desktop Publishing) was fresh; Macs were the backbone of many grassroots publications. Without DTP & the Macintosh, I doubt Adobe would be where they are today...?

In reality, it wasn't the Mac that made small-scale DTP a thing, it was the LaserWriter. Affordable, networked laser printing meant that whatever computer was hooked to that printer could generate camera-ready text and layout elements, from which print plates could be made. Even if you were still doing a manual layup with hot wax and bromide cameras to screen the artwork (as we did in the pre-scanner days), the text blocks could come from a LaserWriter.
 
  • Like
Reactions: singhs.apps

Melbourne Park

macrumors 65816
Yes, the definition (or as good a definition as you can find) of a "professional" workstation is a slotbox. Professional products are, by their nature, utilitarian, ergonomic, user- & field-serviceable, downtime-minimising, flexibility-enhancing.

Look at vehicles - a professional might use a Range Rover, but a Professional utility truck, even one as styled as a Ford F-150, has a removable rear tray and can have any other body module bolted on instead - it might even be sold as a cab with a naked chassis at the back.

Looking at computers, the first mass-market professional IT tool was IMO the HP-65 calculator. Engineers and scientists all over bought them. Despite having I/O program cards, it was not upgradable.

The world's most popular Pro vehicle is the Honda Cub motorcycle. Its main feature was that it was built as an appliance. A key strategy was not having a clutch - affordable, reliable, and for the first time anyone could operate one. It's the world's most popular delivery vehicle.

What makes a computer professional is how much revenue it generates. Not its architecture.

You want more same old same old. Maybe there's another way?
 
Last edited:

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
What makes a computer professional is how much revenue it generates. Not its architecture.

You want more same old same old. Maybe there's another way?

I am standing by my thoughts on a Mac Pro Cube with a single slot half-length PCIe (Gen5 x16...?) card that tethers to an external PCIe expansion chassis...

With (as far as we know) no more discrete GPUs, an Apple PCIe expansion chassis should serve the needs of the remaining slot users...?
 

4wdwrx

macrumors regular
Jul 30, 2012
116
26
Yes, the definition (or as good a definition as you can find) of a "professional" workstation is a slotbox. Professional products are, by their nature, utilitarian, ergonomic, user- & field-serviceable, downtime-minimising, flexibility-enhancing.

Look at vehicles - a professional might use a Range Rover, but a Professional utility truck, even one as styled as a Ford F-150, has a removable rear tray and can have any other body module bolted on instead - it might even be sold as a cab with a naked chassis at the back. Or a van - a professional might use a Chrysler Voyager, but a Professional van is an empty shell that can be configured by the user, and reconfigured by the user as circumstances may require.

That independent reconfigurability is what makes them Professional Products, as opposed to Consumer products used to make money.

That's why every professional workstation maker, be it Apple, HP, Lenovo, Dell, Puget, Boxx etc. makes slotboxes.

A laptop is about being portable; it's not the same thing as a desktop. Desktop workstations used in a professional environment (i.e. an environment configured for professional work standards) don't have "small" as a requirement. If someone needs a desktop workstation as small as the trashcan, the bigger problem is that they're not operating in a professional manner, by failing to have a sufficient workspace.

If any professional needs upgradability, and the product is not upgradable, it is by virtue of that not a professional product.

My work is professional, and I use a laptop; I do professional work on it. As a professional, anyone who does good work is professional in my eyes, regardless of race, gender, orientation, or what they use.
 

Joe The Dragon

macrumors 65816
Jul 26, 2006
1,031
524
I am standing by my thoughts on a Mac Pro Cube with a single slot half-length PCIe (Gen5 x16...?) card that tethers to an external PCIe expansion chassis...

With (as far as we know) no more discrete GPUs, an Apple PCIe expansion chassis should serve the needs of the remaining slot users...?
Let's see a $999 one powered by the TB bus, so max PCIe Gen 3.0 x4 for TB4, and maybe PCIe Gen 4.0 x4 for TB5?
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
I am standing by my thoughts on a Mac Pro Cube with a single slot half-length PCIe (Gen5 x16...?) card that tethers to an external PCIe expansion chassis...

With (as far as we know) no more discrete GPUs, an Apple PCIe expansion chassis should serve the needs of the remaining slot users...?

Let's see a $999 one powered by the TB bus, so max PCIe Gen 3.0 x4 for TB4, and maybe PCIe Gen 4.0 x4 for TB5?

I clearly specified the expansion chassis would be fed via a PCIe Gen5 x16 slot, no Thunderbolt ports need be involved; power for the expansion chassis would be provided via an internal PSU...
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Looking at computers, the first mass-market professional IT tool was IMO the HP-65 calculator. Engineers and scientists all over bought them. Despite having I/O program cards, it was not upgradable.

For a calculator, being able to fundamentally alter the device's function with a user-configurable card is literally the equivalent of sticking a PCI card into a slotbox.

Its main feature was that it was built as an appliance. A key strategy was not having a clutch - affordable, reliable, and for the first time anyone could operate one. It's the world's most popular delivery vehicle.

Have you looked at the upgrades and modifications market for it? You can almost build an entire bike with 3rd-party parts - engine upgrades, brakes, you name it. What gives it professional utility is that it can be modified with all that delivery hardware. In fact, there's a specific professional "postie" version available here that's ONLY available to registered postal delivery drivers.

Again:
  • Volkswagen Transporter = empty shell users (re)configure themselves = Professional Vehicle
  • Volkswagen Kombi = Transporter with pre-fitted factory interior of seats / camping setup = Consumer Vehicle
What makes a computer professional is how much revenue it generates. Not its architecture.

No, professionalism is an attitude, a qualification, a certification, and a methodology by which one approaches one's work and the tools used to do it, not the ability to make money. Dave the Bathtub Cheese Maker might earn a living making cheese in the bathroom at home and selling it down the pub; that does not make him a professional fromager.

You want more same old same old. Maybe there's another way?

If Old = more utility and less of my time spent changing and adapting to the new, then yes, that's exactly what I want. The Faster Horse is fuelled by grass that grows in the yard; the Faster Horse can be a single-person conveyance or pull a wagon; the Faster Horse makes more Faster Horses; the Faster Horse's waste is a highly effective fertiliser for growing food. Just because Henry Ford could make cars on a production line does not make his observations about not giving people what they want correct.

Why do you think Apple lost so much of their editing marketshare to Premiere in the FCPX switch? People want the tools they want, tools that fit the mental model of how they want to do their work.

It's always funny how the "another way" always seems to be "whatever Apple happens to be hawking at that particular moment". As they've shown time and time again, Apple does not actually know or care what professionals want, by and large - they care about what they can make professionals accept. Be it touch bars, notches, dongletown life, or obsolescent appliances.
 

kvic

macrumors 6502a
Sep 10, 2015
516
460
It's an interesting idea, but I think the full-size Mac tower is just the current Intel Mac Pro with updates as necessary to extend it. Even a new motherboard is probably possible. As some folks on this thread have made abundantly clear, nothing short of an Intel slotbox that can be upgraded annually with a new GPU card, ideally from Nvidia, is going to meet the need for the committed tower buyer (Not that I think you're going to see Nvidia). That computer already exists today and can be bought at any local Apple Store. Apple Silicon is going to be used where it excels and that doesn't come from trying to force it to act like a PC.

I believe the Intel Mac Pro will be available for a couple more years (or perhaps much longer than people originally thought). I've said it a couple of times in this thread. I think an Ice Lake refresh will come out in early 2022, and I believe Apple will continue to offer new generations of high-end Radeon cards as long as the Intel Mac Pro is on the Apple Store.

The future full-tower Mac Pro is my thought on a possible successor to the current Intel Mac Pro, if Apple decides the Mac Pro is more than a hobby for them. I believe it could come out as early as 2023, or perhaps later. There could be a couple of years' overlap between this model and the Intel Mac Pro.

The NVSwitch is fascinating tech, although I doubt these will be in PCs anytime soon, let alone coming to the Mac. This is something used in places like AI research laboratories.

NVSwitch is a glorified version of NVLink. The 12-port NVSwitch seems not to be available in desktop form factors, but smaller versions of it are already everywhere.

I don't believe Apple will sell a "12-port" equivalent. A "4-port" version embedded in the motherboard of the future full-tower Mac Pro is much more likely. They'll design it in a way that allows them to scale up to 12 or more connections if they decide to be ambitious about the data centre business.

"Half the size" does not automatically mean "half as tall", or "half as wide"; "half as" can cover all dimensions, so half the overall volume might be more realistic (which would place it close to a 20L microATX box)...

"Bidirectional combined", so 300GB/s each way...? M1 Max memory bandwidth is 400GB/s, and the theoretical M1 Max Duo & Quad may have bandwidth of 800GB/s & 1.6TB/s; so the 300GB/s bandwidth of the NVLink is slower than any of those...?

A 20L microATX size box is perfect for my dream desktop!

The latest generation of NVLink is 300GB/s each way. That's a lot. You need to think through what the M1 Max's 400GB/s really means to you.
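To put those numbers side by side, here is the back-of-envelope arithmetic (figures as quoted in this thread; the Duo/Quad bandwidths are rumored multiples of the M1 Max, not confirmed parts):

```python
# Interconnect vs. on-package memory bandwidth, in GB/s, using the
# figures quoted in this thread. The Duo/Quad numbers are rumors.
nvlink_per_direction = 300           # latest-gen NVLink, each way
m1_max_memory = 400                  # M1 Max unified memory bandwidth
m1_max_duo = 2 * m1_max_memory       # rumored 2-die package: 800
m1_max_quad = 4 * m1_max_memory      # rumored 4-die package: 1600

# Caveat: NVLink moves data *between* GPUs, while the M1 figures are
# local memory bandwidth, so they aren't directly interchangeable;
# the ratios only give a sense of scale.
print(m1_max_memory / nvlink_per_direction)   # ~1.33x
print(m1_max_quad / nvlink_per_direction)     # ~5.33x
```

The point is that a single NVLink direction is slower than any of those local-memory figures, but it measures a different thing.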
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
By the end of 2022, AMD will have RX7900XT with >15,000 cores. AMD has already demonstrated how it'll be done in AMD Instinct MI250 today.

The MI250 presents as two GPUs to the software/OS. I'm skeptical that, if they rotate a tweaked version of CDNA down to the mainstream space, you'll end up with the same naming sequence for those cards. The dual-die cards will more likely be skewed toward "compute" (whether crypto, AI/ML, or non-interactive render) rather than interactive, real-time 3D rates.

Presenting as two GPUs in a single package is more just better "packaging" than Apple's W6800X Duo: a better thermal envelope and more efficient board-area usage.

However, for games, interactive 3D, etc. - apps that have little infrastructure to put two GPUs to work on a single problem with no visual/app glitches - there won't be as much impact. (Stuff like SLI/Crossfire has always been a bit glitchy at 60Hz-targeted normal refresh synchronization rates; crank that even higher and it is likely more glitchy.)



The latest rumor that Apple "officially" leaked through Information no longer mentions 4x M1 Max dies. It still mentions 2x M1 Max dies, but 4x dies will be based on the successor of the M1 Max.

It can't literally be two M1 Max dies, as there is no inter-die connect infrastructure on the dies. It would be a different die. Since it is a different die, there is some chance that it would be just one die - especially if they abandoned getting to 4 dies' worth of core count in the M1 era.

One option would be to just scale up, like they did with the M1 Pro -> M1 Max: add more GPU and CPU cores. Drop one of the NPU+video-decode clusters (the one at the 'bottom', between the extended GPU core stack and the die edge) and start from there - somewhat of a mirror image: GPU + memory controller + CPU cores + a subset of I/O (no need for an extra secure processor or SSD controller; swap some Thunderbolt complexes for simpler generic PCIe v4, etc.). Apple would probably be up near the reticle limit (> 700mm2 range), but they'd avoid packaging overhead costs and would get higher Perf/Watt. End-user costs would probably go up (paying for defective dies they can't use), but is Apple really pressed about how much they charge for upper-end M1-series SoCs? Not much.
In short, build an even bigger GPU complex (this time 64 cores) as the nucleus and wrap the CPU cores and I/O around that.

The second option would be to again 'nuke' (actually more 'repurpose') the second M1 Max's NPU+video decode into inter-die connect infrastructure. But those wouldn't be M1 Max dies anymore. The combo of the two dies would still have dual NPU+video decode, but would have pushed the collective CPU and GPU core count higher. If packaging costs are offset by higher yield rates, then it might save end users a bit of money - more bang for the buck out of the wafer starts they have access to.

In the first case, the "inter CPU/GPU complex" networking is all on the die. They might still segment it into two "zones" so they could turn a major part of the network and components off when not needed. In the second case, it is a "walk before run" approach to evolving an inter-die networking solution. If capped at just two dies, it is easier to solve than a "fully connected" (three links to three partners) solution. Taming the NUMA issues of just two is easier than four (or more).

If just "exactly clone" two M1 Max'es together then will not have much of flexible , very high bandwidth I/O left in the combo. ( a couple of x1 PCi-e v4 lanes isn't much and Thunderbolt 4 is limited. Nothing like x16 PCi-e v4 (or better) ). To get to substantively better I/O then they will need something substantively different than a M1 Max die. Period.


There were four "Jade family" code names: Jade (M1 Max), Jade-chopped (M1 Pro), Jade2C, and Jade4C. There were likely four different dies. Saying they would take four Jade dies to make a "Quad" product would beg the question of why they would need to implement a Jade4C solution at all. Maybe it was "muddled" code words before, but Jade and Jade-chopped are obviously different dies - a super high degree of overlap in basic design, but two separate die masks being used. Therefore, it is highly likely that Jade2C and Jade4C are also two separate die masks.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
I believe the Intel Mac Pro will be available for a couple more years (or perhaps much longer than people originally thought). I've said it a couple of times in this thread. I think an Ice Lake refresh will come out in early 2022, and I believe Apple will continue to offer new generations of high-end Radeon cards as long as the Intel Mac Pro is on the Apple Store.

It isn't so much whether end users believe that as whether Apple "believes" that. Apple didn't keep doing cards for the Mac Pro 2010-2012 long after they stopped selling them. Cards came along that happened to work, but that was far more driven by the new Macs those GPU models were being embedded into than by some Mac Pro-only motivation. The Mac Pro bow-waved off other Macs, not the other way around.

As for keeping a machine on sale for a long time while doing GPU-specific upgrades: Mac Pro 2013. Sales lifespan of six years. New GPUs delivered by Apple? Zero. (Other than the same bow wave off the other Macs via hacked Thunderbolt 2.)

Even back when there were the MP 2006, 2008, and 2009, there was not much official support from Apple for Mac Pros multiple generations back with Apple GPU cards. You might get one-generation-back "official support", but Apple wasn't trying (i.e., putting tons of effort into an Apple-engineered add-in card) to cover older Mac Pros.

There is no track record there.

The RDNA2 cards that Apple released in 2021 mean Apple has done a 180 on long-term Mac Pro GPU support? Not necessarily. Especially if they end up in an early 2022 Mac Pro W-3300 (Ice Lake) refresh. If the chassis was late and they released the cards in 2021 for the "one gen back" Mac Pro, pragmatically that is their same old track record - just a slightly out-of-order release process on the chassis. (If distracted by M-series and pandemic logistics hiccups, that wouldn't be surprising.)

No 3rd-party GPU drivers in macOS on M-series should be a major expectation-deceleration factor for long-term MP 2019 GPU upgrade support. If the MP 2019 is the ONLY place these cards can go, then the volume is going to shrink over time. Even if Apple throws a stop-gap MP 2022 into the mix... same path... a shrinking-over-time market. Apple putting large sums into a shrinking market is pretty unlikely. If Apple does an MP 2022, it is pretty likely primarily going to be a 'cash cow' (the same thing they did with the MP 2013: kept it going because it pumped out profits; after a while you have profits because no major new R&D expenditures are being piled on top).

"I have a very high sunk cost in a Mac Pro so Apple has to also build a higher sunk cost into the Mac Pro" is far more "myopic" than focused on what Apple is likely to do for itself or aligned with their explicitly expressed long term strategic interests.


Also a bit doubtful that the very high margin on the Apple W---X MPX cards is going to make up the difference in low volume. In the Intel-based Mac space, eGPU played a factor in bumping up the volume (new Macs with embedded GPUs had a much larger impact, but eGPU is also a factor). Apple's "get out of bed in the morning" motivation is to get folks to dump those older Intel Macs that had eGPUs for new M-series systems that don't need them as much. Apple isn't going to prematurely kill off macOS on Intel, but the expectation that they are going to throw major new R&D money at those systems is likely misguided. (If they switch and loop M-series systems into being able to push work to a 2023-2024 AMD GPGPU, maybe. But their current scorched-earth moat around GPUs in macOS on M-series is likely to bleed back into macOS on Intel too over the long term if there are no changes.)


P.S. Another indicator that the long-term future of AMD MPX cards isn't as bright as one might expect is that the W6900X is more expensive than the W6800X Duo. The W6900X is in the death-spiral pricing zone: fewer folks buy it because it is expensive, so volume goes down; so raise the price... fewer buyers... rinse and repeat.

""If I were running Apple, I would milk the Macintosh for all it's worth -- and get busy on the next great thing. " -- Steve Jobs.

Intel Macs are going to get "Steve'd". It is just a matter of time.
 
Last edited:

mikas

macrumors 6502a
Sep 14, 2017
898
648
Finland
Of course they are gonna get "Steve'd". And they are gonna get Tim'd too.

ps. I love reading your posts, because of all the detail they have. It's not about knowing everything, it's about thinking through it. And then thinking through it again.

With prognostications, there is no need to agree on everything - at least, not every time.

edit. sorry, deleted some erroneous attachments of mine.
 
Last edited:
  • Like
Reactions: JMacHack

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
"Half the size" does not automatically mean "half as tall", or "half as wide"; "half as" can cover all dimensions, so half the overall volume might be more realistic (which would place it close to a 20L microATX box)...

That is just gibberish. Half = 1/2. If you take a pie and make one vertical cut through the center of the circle, you have cut the pie in half (1/2). If you take a whole pie and make one horizontal cut through the center of the circle, you also have half a pie.

If you combine those two cuts - one vertical and one horizontal - the remaining pieces are NOT "half a pie". They are quarters of a pie. You had two 1/2s and then cut them again in half: 1/2 * 1/2 = 1/4.

1/2 * 1/2 = 1/2 is plain gibberish.

Significantly chopping a MP 2019 chassis in multiple dimensions would be a more-than-major reduction in volume, not a "half size" one.

You can try to go through gyrations, cherry-picking small cuts in multiple dimensions to get to half the volume, but it doesn't do much productive in terms of function. You would still have an oversized literal-desktop 2D footprint, but would have squeezed out a large number of options. An even smaller object to place on the floor also has low tradeoff benefits (longer cables, more airflow closer to the floor, etc.).

P.S. If Apple has some "obsessive compulsive" disorder that literal desktop Macs have to fit in an 8" x 8" 2D footprint on a desk, then the overall volume will be shrunk by that OCD constraint... not by "half". At least not a literally descriptive "half".
 
Last edited:

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
That is just gibberish. Half = 1/2. If you take a pie and make one vertical cut through the center of the circle, you have cut the pie in half (1/2). If you take a whole pie and make one horizontal cut through the center of the circle, you also have half a pie.

If you combine those two cuts - one vertical and one horizontal - the remaining pieces are NOT "half a pie". They are quarters of a pie. You had two 1/2s and then cut them again in half: 1/2 * 1/2 = 1/4.

1/2 * 1/2 = 1/2 is plain gibberish.

Significantly chopping a MP 2019 chassis in multiple dimensions would be a more-than-major reduction in volume, not a "half size" one.

Half does not mean it can only be changed in a single dimension; that's just gibberish...

If I take a rectangular object & cut it down in all three dimensions (H x W x D) to end up with an object that is half of the original volume, well, that is half...
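Both readings can be made concrete with a few lines of arithmetic: halving every dimension gives one-eighth the volume, while halving the volume with a uniform shrink means scaling each dimension by the cube root of 1/2, about 0.794 (a sketch with made-up dimensions, not a claim about what Bloomberg meant):

```python
# Scaling each of three dimensions by a factor s scales volume by s**3.
def scaled_volume(h, w, d, s):
    return (h * s) * (w * s) * (d * s)

v0 = scaled_volume(10.0, 10.0, 10.0, 1.0)       # 1000: original volume
eighth = scaled_volume(10.0, 10.0, 10.0, 0.5)   # 125: "half in every dimension"
s_half = 0.5 ** (1 / 3)                         # ~0.794 shrink per dimension
half = scaled_volume(10.0, 10.0, 10.0, s_half)  # ~500: half the *volume*
print(eighth / v0, half / v0)
```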
 

Melbourne Park

macrumors 65816
....

Why do you think Apple lost so much of their editing marketshare to Premiere in the FCPX switch? People what the tools they want, that fit the mental model of how they want to do their work.

It's always funny how the "another way" seems to always be "whatever Apple happens to be hawking at that particular moment". As they've shown time, and time again, Apple does not actually know, or care what professionals want by and large - they care about what they can make professionals accept. Be it touch bars, notches, dongletown life or obsolescent appliances.

I presume Apple switched FCP because they believed a new interface more resembling their bundled iMovie would make them more money. FCP costs a fraction of the rental model of Premiere. FCP is much less professional by traditional user requirements, and it also costs a lot less. However, people still make income from using FCP. One wonders, if Apple had continued on with the old software model and had it on a rental basis like Premiere, whether it would have ended up costing Apple profitability by dividing their development course. It seemed to me that Apple's vision was product differentiation based on intuitiveness, or "Easy to Use". That may not be the case, but there is an element of that vision in Apple's software, and hardware too.

As far as professional goes, Steve Jobs himself said that 10 years was the life of any hardware cycle. Apple supports their Mac Pros for, I think, 10 years. So anyone still running a Mac Pro 5,1 can no longer run safe operating software. Those machines are dead professionally. It's no coincidence that when the 5,1's cliff approached, Apple brought out - pretty late - a similar machine. A question then arises: did all users of 5,1s then go out and buy 7,1s? Apple would know how many did... so for those that did not, it would have shown Apple that the professional space was not based on the form factor of a 5,1 or a 7,1.

My own desire, when looking at Apple and the M1 Max chip, is that I hope Apple brings out a Pro machine that a user can add to, to extend its usable life. I think, however, that extending the usable life of a Mac Pro beyond 5 years is not a professional requirement. Wanting more than 5 years of life from a computer is a Prosumer want. That's my space. Prosumer gear is little used compared to Pro use, so Prosumers want their gear to last longer.

Professionals have written off all their hardware in three or four years. Their leases have run out. They've upgraded by swinging their finance onto better, more productive hardware, with their costs remaining the same. They've become more efficient and retained their competitive capability for the same overhead portion of their revenue. Computer performance has continued to increase significantly annually; a 10-year hardware life isn't for professionals IMO.

The area where expandability has kept old hardware going is by adding other companies' hardware to existing hardware - best shown by 3rd-party GPUs and storage hardware.

But Apple has hardly profited from such add ons to their desktop Pro hardware.

I would like it if, when I bought a new-architecture Mac Pro, I could add more of Apple's own chips to enhance its performance after purchase. I may be dreaming, though. I think the business case for that, from Apple's perspective, is weak.

I think PCI slots will not be in Apple's future. I think there is a slim chance that Apple could produce purely GPU chips; as the Max processor has shown, Apple can add a lot of GPU power if they choose to. One third of the Max chip is GPU, I think... why not sell GPU add-ons to a Pro computer that has GPU slots for Apple's own GPU add-ons? And have some empty NVMe slots as well?

I am guessing the reason Apple may not is that it would be simpler for Apple to just add more combination CPU/GPU processors than to have separate GPU slots, thereby allowing Apple to stray less from the bulk of their computer architecture. Spare or removable memory and removable NVMe disk slots would be gratefully received by me, but I am far from certain we'll get them. And if they are there, what extra cost will Apple charge for any extendability?
 
Last edited:

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
Does Apple enjoy a distinctive advantage of its GPU over Nvidia/AMD's "state of the art"?

I don't believe so. Instruction sets for GPUs are private, and vendors can change them at will from one generation to the next to suit their goals. Hence, unlike CPUs, they aren't bound by the complexity imposed by x86-64. I very much doubt that once Apple scales up GPU clocks they'll still enjoy power-efficiency bragging rights, unless they have a process-node manufacturing advantage over, say, AMD's current "state of the art".

If we do a quick & dirty comparison between the RX6900XT (AMD) and the M1 Max 36C GPU (Apple), they look surprisingly similar to me:

ALUs: 5120 (AMD) vs 4096 (Apple)
Texture units: 320 (AMD) vs 256 (Apple)
ROPs: 128 (AMD) vs 128 (Apple)
Peak clock: 2250MHz (AMD) vs 1296MHz (Apple)
Memory bandwidth: 512GB/s (AMD) vs 400GB/s (Apple)
Power: 300W (AMD) vs 60W? (Apple)
Process: TSMC 7nm (AMD) vs TSMC 5nm (Apple)

Apple doesn't seem to me having a distinctive advantage over AMD/Nvidia. We could predict how much higher Apple could clock their current GPU without destroying their marketing rhetoric of power efficiency.

I also believe the integrated design (CPU+GPU) is a disadvantage here. It limits how high Apple could push its GPU clocks. Higher frequency -> more heat -> heat spills into the CPU clusters. That might result in degraded CPU performance in concurrent CPU+GPU workloads. Apple will have to strike a balance of clocks between CPU and GPU, and in the cooling solution they'll deploy in the workstation Mac Pro.

Another limiting factor will perhaps be memory bandwidth. With future 8.5GT/s LPDDR5X, Apple will get 544GB/s, on par with the RX6900XT. Anything less than that and, by the wisdom of AMD, we could expect it's futile to push GPU clocks anywhere near as high as AMD's, because the many cores can't be fed enough data to crunch.

Why would you see this as a disadvantage for a laptop chip? It's competitive with a TOP OF THE LINE monster desktop card!?! Apple has yet to release their desktop level chips, much less their pro one.

The rumor is the new iMac/Pro will have 2 M1 Maxes slapped together, so bandwidth may well be 800GB/s. And the rumor for the pro-level machine is you might get 4 M1 Maxes slapped together, so you might get truly insane throughput on the graphics side.

I do agree that, long term, having your graphics integrated is a minus, because graphics technology will continue to move forward. But if they provide PCI support for additional graphics cards - and why not - then that could be solved.

That we are even comparing the M1 Max to the top-of-the-line RX6900XT is a triumph for Apple IMO.
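As a quick sanity check on the bandwidth figures in the quoted comparison (the 512-bit bus width for the M1 Max matches Apple's published specs; the 8.5GT/s LPDDR5X part is the speculation from the quoted post):

```python
# Unified memory bandwidth = transfer rate (GT/s) x bus width (bits) / 8.
def mem_bandwidth_gbs(transfer_rate_gt, bus_bits):
    return transfer_rate_gt * bus_bits / 8

m1_max = mem_bandwidth_gbs(6.4, 512)   # LPDDR5-6400: 409.6, marketed as 400GB/s
lpddr5x = mem_bandwidth_gbs(8.5, 512)  # 544.0, the figure in the quoted post
print(m1_max, lpddr5x)
```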
 
  • Like
Reactions: JMacHack

Melbourne Park

macrumors 65816
...

I do agree that, long term, having your graphics integrated is a minus, because graphics technology will continue to move forward. But if they provide PCI support for additional graphics cards - and why not - then that could be solved.

That we are even comparing the M1 Max to the top-of-the-line RX6900XT is a triumph for Apple IMO.

Unless Apple allows users to add an Apple-made GPU, or perhaps upgrade the SoC, such as going from an M1 Pro to an M1 Max, or upgrading to an M2 Max, etc. Or being able to add an extra M processor to the motherboard.

With the price of RX6900 XT GPUs, such an upgrade path would be attractive both for owners of Mac Pros and for Apple.

In other words, how many pro users would benefit from PCI slots if Apple provides their own upgrade paths?
 
Last edited:
  • Like
Reactions: ZombiePhysicist

kvic

macrumors 6502a
Sep 10, 2015
516
460
The MI250 presents as two GPUs to the software/OS. I'm skeptical that, if they rotate a tweaked version of CDNA down to the mainstream space, you'll end up with the same naming sequence for those cards. The dual-die cards will more likely be skewed toward "compute" (whether crypto, AI/ML, or non-interactive render) rather than interactive, real-time 3D rates.

Presenting as two GPUs in a single package is more just better "packaging" than Apple's W6800X Duo: a better thermal envelope and more efficient board-area usage.

However, for games, interactive 3D, etc. - apps that have little infrastructure to put two GPUs to work on a single problem with no visual/app glitches - there won't be as much impact. (Stuff like SLI/Crossfire has always been a bit glitchy at 60Hz-targeted normal refresh synchronization rates; crank that even higher and it is likely more glitchy.)

AMD GPUs have separated into two main microarchitectures: CDNA ("Compute DNA"?) for data centre / high-performance compute, and RDNA ("Radeon DNA"?) for gaming and professional graphics. They may share common parts at the functional-unit level, but it's better to treat them as distinct designs optimized for different workloads. Instinct MI250/250X is based on their latest CDNA 2.0 microarchitecture. Regarding the chiplet-based RX7900XT, perhaps it will present as two GPUs, or perhaps not. Wait and see how AMD handles it in their best wisdom.

It can't literally be two M1 Max dies ... ... ...

Bloomberg tried their best, but there is a limit to what a mainstream tech journalist can do. I would also think Apple selectively leaked and put it in vague language when they were feeding the guy info. A few implications stand out from the leak: 1) bigger Apple SoCs will be multi-die based; 2) their performance will be on the scale of 2X or 4X the "base model". With that in mind, I think your two approaches aren't unreasonable.

That is just gibberish. Half = 1/2 . ...

1/2 * 1/2 = 1/2 is plain gibberish.

I think the "correct" interpretation of "half sized" per Bloomberg's report is half the volume of the Intel Mac Pro. The current Intel Mac Pro is about 55L; half that is about 27L, which is about the size of a compact microATX box quite popular in some PC markets. See... I've been mentioning a compact microATX box for the upcoming smaller Mac Pro from the very beginning. I even posted my favourite case to assist people in imagining it. lol
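The "half sized" debate above comes down to simple geometry: half the *volume* is not the same as halving each *dimension*. A quick back-of-the-envelope sketch (the ~55L figure is the poster's approximation for the 2019 Intel Mac Pro) shows the difference:

```python
# "Half sized" as half the *volume*, versus halving every linear dimension.
intel_mac_pro_litres = 55.0  # approximate volume of the Intel Mac Pro

half_volume = intel_mac_pro_litres / 2        # 27.5 L -- compact microATX territory
per_axis_scale = 0.5 ** (1 / 3)               # ~0.794: shrink each edge ~21%
                                              # to halve the volume
halved_every_dimension = intel_mac_pro_litres * 0.5 ** 3  # ~6.9 L -- closer to a Mac mini

print(f"half the volume:            {half_volume:.1f} L")
print(f"edge scale for half volume: {per_axis_scale:.3f}")
print(f"halving every edge instead: {halved_every_dimension:.1f} L")
```

This is why halving two or three dimensions at once gives a box far smaller than "half sized" in the volume sense.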

"If I were running Apple, I would milk the Macintosh for all it's worth -- and get busy on the next great thing." -- Steve Jobs

Intel Macs are going to get "Steve'd". It is just a matter of time.

Post-Jobs Apple is quite a different Apple. The pace of the Intel Mac Pro's retirement will depend on market acceptance of an Apple silicon Mac Pro. I mentioned this point before, but I don't expect people to remember. I believe it's safe to assume Apple will make the transition as smooth and painless as possible for this niche market segment. By saying the Intel Mac Pro will be available for longer than people thought, I was betting the inertia in this group is bigger than among MacBook Pro/Air, iMac or Mac mini users.

Why would you see this as a disadvantage for a laptop chip? It's competitive with a TOP OF THE LINE monster desktop card!?! Apple has yet to release their desktop level chips, much less their pro one.

Unlike you, I don't have much love for Apple (or any company, for that matter). But I didn't compare a "laptop GPU" against a competitor's "desktop GPU". I was trying to illustrate, when the GPU clock of the M1 Max scales up, what sort of power consumption and what performance uplift to expect; whether that is likely to happen or not, and if not, what the possible reasons could be.
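The kind of estimate being described here can be sketched with the usual rule of thumb that dynamic power scales roughly with frequency times voltage squared, and voltage tends to rise with clock, so power grows roughly with the cube of frequency. All the numbers below are illustrative assumptions, not Apple specs or measurements:

```python
# Rough dynamic-power model: P ~ f * V^2, with V assumed to rise
# linearly with clock, so P grows roughly with f^3.
# base_clock_ghz and base_power_w are hypothetical placeholder values.
base_clock_ghz = 1.3      # assumed M1 Max GPU clock
base_power_w = 55.0       # assumed GPU power at that clock

def scaled_power(target_clock_ghz: float) -> float:
    """Estimate power at a higher clock under the f^3 approximation."""
    ratio = target_clock_ghz / base_clock_ghz
    return base_power_w * ratio ** 3

for clock in (1.3, 1.6, 2.0):
    uplift = clock / base_clock_ghz          # best-case performance scaling
    print(f"{clock:.1f} GHz: ~{uplift:.2f}x perf, ~{scaled_power(clock):.0f} W")
```

The point of such a sketch is that a modest clock bump buys linear performance at best while power climbs much faster, which is one plausible reason a vendor would not simply scale a laptop chip's clocks up for a desktop part.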
 

Pressure

macrumors 603
May 30, 2006
5,182
1,545
Denmark
Unless Apple allows users to add an Apple-made GPU, or perhaps upgrade the SoC (such as going from an M1 Pro to an M1 Max, or to an M2 Pro/Max, etc.), or add an extra M-series processor to the motherboard.

With the price of RX6900 XT GPUs, such an upgrade path would be attractive for owners of Mac Pros and also for Apple.

In other words, how many pro users would benefit from PCI slots if Apple provides their own upgrade paths?
Seeing the available upgrades all the way back from the G5 Tower to the Trashcan, I can honestly say I expect zero upgradeability from a tower or equivalent design from Apple.

They just don't do it. And whatever bone they throw at us is just not enough. We've had to figure out EFI to get support for newer graphics cards, and use said hardware's media capabilities to enable encoding and decoding of video, and all that jazz.

If we had had OpenCore with its current capabilities 10 years ago, it would have been a miracle.

There is essentially zero reason for PCIe slots if you have nothing to stick in them.
 
Last edited:

Melbourne Park

macrumors 65816
Seeing the available upgrades all the way back from the G5 Tower to the Trashcan, I can honestly say I expect zero upgradeability from a tower or equivalent design from Apple.

They just don't do it. And whatever bone they throw at us is just not enough. We've had to figure out EFI to get support for newer graphics cards, and use said hardware's media capabilities to enable encoding and decoding of video, and all that jazz.

If we had had OpenCore with its current capabilities 10 years ago, it would have been a miracle.

There is essentially zero reason for PCIe slots if you have nothing to stick in them.

The trashcan wasn't a tower ... it did have some upgradeability though.

The Cube failed due to its price (the same as a G4 tower), its lack of access, its lack of expandability, and also due to cracks in its plastic case that drew criticism.

So ... price and functionality did matter ... but the Cube did inspire other designs and technical solutions.

IMO the reason for a tower is for better cooling, a larger power supply, and for expandability. For those reasons, people will pay more. But if there isn't the promise of expandability, then people will not want to pay much more. People won't want to spend a premium price on another Cube or a Mac-Mini-Mark II.

Hence to achieve a premium price, Apple needs to offer some expandability.

Also if people buy a Mac tower - they'll stay in the Apple space, and buy more Apple gear. Phones, notebooks, services, etc etc. A tower of some type is important for Apple to appeal to the whole market.

IMO a tower is not attractive unless it promises expandability ... all I would want is expandable memory, disk slots, replaceable CPUs and / or GPU slots. No need for me for expensive 3rd party GPUs either - why not the ability to increase GPU performance via Apple's own hardware?
 

jjcs

Cancelled
Oct 18, 2021
317
153
My work is professional, but I use a laptop, and I do professional work on it. As a professional, anyone who does good work is a professional in my eyes, regardless of race, gender, orientation, or what they use.

"Professional" used to have more meaning than just "paid to do it". It usually involved a recognized profession.
 
  • Like
Reactions: mattspace

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
The Cube failed due to its price (the same as a G4 tower), its lack of access, its lack of expandability, and also due to cracks in its plastic case that drew criticism.

Yup, when they made the decorative feature (the clear perspex) the primary marketing angle, and couldn't actually deliver that part, it didn't do a lot for the machine's sales. The endemic power-button and optical-drive failures, from those components being in the heat path, didn't do a lot for the machine's sales either.

But the biggest problem it had was that Apple believed the pricing should represent the processing power of the machine, and hence it was priced equivalent to the tower, with small size as the "free" feature you got, as opposed to the "free" expandability of the tower. The market disagreed, and the common refrain of the time was that it should have been much closer to the price of the iMac than to the tower, given the iMac's similar expandability, trading a display for a better processor.
 
  • Like
Reactions: Melbourne Park

4wdwrx

macrumors regular
Jul 30, 2012
116
26
"Professional" used to have more meaning than just "paid to do it". It usually involved a recognized profession.
There is no real meaning for "professional"; it is just a society-created word. If you think you are a professional, you are a professional; don't let others dictate whether you are or are not.

It does not matter the degree or title, or the equipment, it's what you can do.
 

jjcs

Cancelled
Oct 18, 2021
317
153
There is no real meaning for professional, it is just a society created word. If you think you are professional, you are a professional, don't let others dictate you are or are not.

It does not matter the degree or title, or the equipment, it's what you can do.
There are, in fact, definitions for the word "professional". Some involve just being paid for something. Others involve working in a specific profession with educational and certification requirements.
 