
exoticSpice

Suspended
Jan 9, 2022
1,242
1,952
Listened to whom? It is extremely unlikely to be coincidental that the CSAM rollout died at the exact same time Apple pushed end-to-end encryption onto Photos. Apple has protected itself from child porn charges by completely sidestepping the liability problem. They can't help stop the child porn because now they can't see it (at rest), so it can't be even partially their fault that it is being stored and distributed on their equipment. Sgt. Schultz: they see nothing, know nothing, hear nothing, nothing, nothing, nothing.


Most of the original CSAM proposal was about scanning iCloud photos. Apple isn't dropping all of the measures; they just aren't going to chase archive builders/distributors.

"...
Instead, it's focusing on opt-in Communication Safety features that warn parents about nudity in iMessage photos as well as attempts to search for CSAM using Safari, Siri and Spotlight.

Apple plans to expand Communication Safety to recognize nudity in videos as well as content in other communications apps. The company further hopes to enable third-party support for the feature so that many apps can flag child abuse. There's no timeframe for when these extra capabilities will arrive, but Apple added that it would continue making it easy to report exploitative material. ..."


Two Apple initiatives collided here: Apple's big "we are a super mega corp, but trust us" privacy initiative, and CSAM. The 'bigger' initiative won. Apple's big initiative on perf/watt versus maximal power draw probably plays out in a similar way.
See, they could have enabled CSAM scanning for non-encrypted data, but they did not. After all, Advanced Data Protection is opt-in.

Obviously Apple lost trust after the CSAM issue; they probably built ADP in response to the pressure to stop it.



In the same way, Apple has to do right by the Pro group. One thing to keep in mind is that Apple is not above physics.
Apple is focused on perf/watt more than other companies, but if they stay on the same node they will increase power draw without looking back.
 

innerproduct

macrumors regular
Jun 21, 2021
222
353
Buying an XDR or Mac Pro or not should be an independent decision from the tax breaks. If you actually need it (real requirements, not 'want'/'desire'), then buy it. If that happens to generate some tax breaks, then that should only be "gravy on top".
I see almost no sane reason for buying Macs outside of my personal wellbeing, tbh, since my only power need is 3D rendering, and for that PCs are king, as we all know. An XDR screen will be useful and nice but is more of an indulgence for my personal workstation. I'll probably get a MP or new Mac Studio next year and want a really nice display to go with it. And with prices going up (I'm in Europe) I'm quite sure the next XDR screen will be at least 30% more expensive. Anyway, this is slightly OT. Let's get back to grinding that rumour mill 😂
 
  • Haha
Reactions: maikerukun

Boil

macrumors 68040
Oct 23, 2018
3,478
3,174
Stargate Command
M1 = no Media Engine
M1 Pro/Max/Ultra = Media Engine(s)

M2 = no hardware ray-tracing
M2 Pro/Max/Ultra/Extreme = hardware ray-tracing

Or maybe Apple has the M2 Pro & M2 Max laptop SoCs on 5nm, without hardware ray-tracing; and the M2 Ultra & M2 Extreme SoCs are on 3nm, with some of the extra transistor capacity earmarked for said hardware ray-tracing...?
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
M1 = no Media Engine
M1 Pro/Max/Ultra = Media Engine(s)

M2 = no hardware ray-tracing
M2 Pro/Max/Ultra/Extreme = hardware ray-tracing

The M2 filled in the Media Engine (ProRes).

Apple rolled out new software support for ray tracing at WWDC 2022. That doesn't mean that hardware support is coming soon. It doesn't rule it out, but it doesn't guarantee it either. Apple has a keen interest in rolling out more efficient, effective ray tracing to the current platforms at least as much as in limiting it solely to some future ones.

And the media engine stuff rolled out incrementally over several years also. Afterburner came first; Apple rolled that out to a decently large end-user audience, worked out the bugs, and only then put it into fixed-function silicon. Apple has spent substantial effort over the last 2-3 years helping 3D ray-tracing programs get their stacks adapted to Apple's Metal foundation. IMHO, it is a bit early for that to show up as fixed-function silicon. Apple really didn't get the kinks out of most of that until the last year or so on the software side. So if the API is still shifting and evolving, can they do fixed-function hardware in the most optimal way? Probably not. So waiting, like they did in the Afterburner case, until they get enough deployed API feedback would be a lower-risk path to rolling out hardware ray tracing (rather than being in a feature-checkbox race with Nvidia).


And the Apple hardware that is likely most pressed for hardware ray tracing is the tether-less VR/AR goggles; not the Mac Pro.



Or maybe Apple has the M2 Pro & M2 Max laptop SoCs on 5nm, without hardware ray-tracing; and the M2 Ultra & M2 Extreme SoCs are on 3nm, with some of the extra transistor capacity earmarked for said hardware ray-tracing...?

Again, the VR/AR goggles are in deeper need of TSMC N3 to pack hardware ray tracing into a highly constrained die area than the Mac Pro SoC is. Rumors early in 2022 about the goggles were that they were running way too hot (which very likely means the battery life was also 'way too short' to be competitive). Unloading the GPU cores there with a more power-efficient ray-trace pipeline would buy a bigger 'bang for the buck' there.
[ And a rather super-early, 'at risk' production N3 chip in early 2022 having heat problems wouldn't be that surprising. Rumors of Apple using an M1 Pro/Max in the goggles did not make much sense other than as a 'test mule' SoC just to power a prototype physical setup. And blowing the thermal budget in that context wouldn't be surprising either. ]


Apple's original plans were for the Mac Pro to be out in 2022. Loading it down with stuff that wasn't finished in 2020-2021 would not help that happen on time. The AR/VR goggles had no explicitly stated timeline, so if they slide, they just slide. There is no user base there yet.

The plain M2 suffers from die bloat. It is incrementally faster, but it is also bigger. Smaller M2 Ultra / M2 Extreme die components will be easier to package. You do not want the biggest chiplets; generally you are going to want smaller ones.
So there is a pretty good chance they are not going to take the same M1 Max die size and stuff the maximum number of transistors into the same area. They are not going to have the same area budget. So the M2 architecture in a smaller space is going to 'spend' a relatively smaller transistor budget (relative to what it could have been if they had gone for the more expensive, larger die).

The same percentage die bloat (M1 -> M2) applied to a 400+ mm^2 Max die would be bad. It would probably blow past the packaging tech TSMC used for the M1 Ultra, and the die cost would go up. They'll probably need a different packaging tech for the 'more than two' die setup regardless; it isn't going to help to use bigger-than-necessary chiplets.



A full-size E-core complex (4 cores), 2 more GPU cores per cluster, some cache increases (which N3 does not shrink as well), DisplayPort 2.1, and some NPU/media tweaks/improvements have a pretty good chance of using up most of the increased transistor budget in a smaller die size.

All of those will be more useful to a broader user base than just hardware ray tracing. More CPU and GPU cores, and keeping them fed with data (cache bump), is just better overall, general throughput. Broader display support for 2023+ higher-end displays also matters (not everyone who needs more screen resolution needs those pixels ray-trace generated).

Hardware ray tracing can be very narrow in impact with a reasonably small increment in transistor budget. Apple could squeeze a limited (or single) function into the budget. I wouldn't expect some humongous impact there though (substantive improvements, but not 'revolutionary'). And it will likely be like the AVFoundation library's relation to the media engine: if you use the Apple RT API exactly how they laid it out, then you will get the uplift; roll your own RT stack and you get nothing. What is being 'swapped out' is a call to something Apple wrote, not a general 'tool' for 3rd-party implementations.

Ray tracing likely not a 'big budget increase' driver. Especially on the first iteration/generation implementation.

It is probably more important for Apple to have portable Metal RT code (the same code works on AR/VR goggles, iPhone, iPad, Mac) than to have some custom RT code that only works on >$5K systems with the RT hardware in it.
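For a concrete sense of what "using the Apple RT API as laid out" means, here is a minimal sketch of building a Metal acceleration structure on the host side. The function name, buffer layout, and triangle count are assumptions for illustration; only the MTL* calls are the actual public API. The point is that nothing in this code cares whether traversal later runs on shader cores or on future fixed-function RT hardware.

```swift
import Metal

// Minimal sketch: build a primitive acceleration structure for one triangle mesh
// using Metal's standard ray-tracing API. `vertexBuffer` is assumed to hold
// SIMD3<Float> positions, three per triangle (hypothetical input for this example).
func buildAccelerationStructure(device: MTLDevice,
                                queue: MTLCommandQueue,
                                vertexBuffer: MTLBuffer,
                                triangleCount: Int) -> MTLAccelerationStructure? {
    // Describe the geometry the way the API expects it.
    let geometry = MTLAccelerationStructureTriangleGeometryDescriptor()
    geometry.vertexBuffer = vertexBuffer
    geometry.vertexStride = MemoryLayout<SIMD3<Float>>.stride
    geometry.triangleCount = triangleCount

    let descriptor = MTLPrimitiveAccelerationStructureDescriptor()
    descriptor.geometryDescriptors = [geometry]

    // Ask Metal how much memory the build needs, then allocate it.
    let sizes = device.accelerationStructureSizes(descriptor: descriptor)
    guard let structure = device.makeAccelerationStructure(size: sizes.accelerationStructureSize),
          let scratch = device.makeBuffer(length: sizes.buildScratchBufferSize,
                                          options: .storageModePrivate),
          let commands = queue.makeCommandBuffer(),
          let encoder = commands.makeAccelerationStructureCommandEncoder()
    else { return nil }

    // The build is an ordinary GPU command; how traversal against this structure
    // is executed later is the driver's and hardware's concern, not the app's.
    encoder.build(accelerationStructure: structure,
                  descriptor: descriptor,
                  scratchBuffer: scratch,
                  scratchBufferOffset: 0)
    encoder.endEncoding()
    commands.commit()
    commands.waitUntilCompleted()
    return structure
}
```

A shader would then intersect rays against that structure with Metal's intersector type; that code also stays the same if Apple later swaps in dedicated traversal hardware underneath.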
 

innerproduct

macrumors regular
Jun 21, 2021
222
353
Since there are few games on the Mac and no reasonably priced computer for gaming (there never was either), I view hardware ray tracing more as a professional feature on the Mac. So if Pro silicon gets it, in the form of new Afterburners or on the package, that makes the most sense. Imho
 

maikerukun

macrumors 6502a
Original poster
Oct 22, 2009
719
1,037
I see almost no sane reason for buying Macs outside of my personal wellbeing, tbh, since my only power need is 3D rendering, and for that PCs are king, as we all know. An XDR screen will be useful and nice but is more of an indulgence for my personal workstation. I'll probably get a MP or new Mac Studio next year and want a really nice display to go with it. And with prices going up (I'm in Europe) I'm quite sure the next XDR screen will be at least 30% more expensive. Anyway, this is slightly OT. Let's get back to grinding that rumour mill 😂
I mean, I don't mind the pit stop, lol... and honestly, for 3D rendering the Mac Pro is genuinely a monster... but the M1 chips are going to kill that T-Rex and birth a raptor in its place. A serious waste.
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
Apple's entire strategy in every segment other than the Mac has been total hardware dominance, and then use that to entice developers to use their platforms, making sure that they have the absolute best software ecosystem.

They understand the unspoken rule that when developers make multi-platform software, they will usually use the most powerful hardware because it lets them iterate faster.

By making the iPhone so powerful, Apple have made sure that most apps and mobile games are written for iOS first and then ported to Android afterwards. This means iOS has the better software ecosystem.

By making the iPad so powerful, they have made sure that anyone who needs power on the move has to buy an iPad instead of an Android tablet. Therefore iPad has the better ecosystem.

This situation is turned on its head with the Mac, where if someone wants to make something requiring huge amounts of GPU power, they pick up a 4090. This means that nobody at the high end is writing stuff in Metal, they're writing things in CUDA, and nVidia is playing the game Apple invented - going after developers and locking them into their ecosystem.

In much the same way that the M1 went after Intel's lunch, the Mac Pro 8.1 needs to go after nVidia's lunch. If they can't decisively crush a $2k 4090 with a $10k machine then the entire transition was a mistake.

Transitioning to Apple Silicon is such an all-or-nothing proposition for Apple that I think they know that if they want people to actually bother writing software with their proprietary APIs, they need to have developers of high-end software drooling over the Mac Pro in the same way that they currently drool over a 4090.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
CUDA seems like it’s finally starting to enter a decline. There are a lot of other options for compute languages, and hardware is starting to get more diverse again. CUDA’s lock in is also starting to play against it as dedicated hardware and accelerators get more popular again. I see a lot of GPU research still preferring CUDA but commercial packages starting to move away.

But - CUDA losing ground does not mean Metal will gain ground.
 
  • Like
Reactions: maikerukun

Mac3Duser

macrumors regular
Aug 26, 2021
183
139
In much the same way that the M1 went after Intel's lunch, the Mac Pro 8.1 needs to go after nVidia's lunch. If they can't decisively crush a $2k 4090 with a $10k machine then the entire transition was a mistake.
For the Mac Pro transition only. For laptops, ASi is the best.
 
  • Like
Reactions: maikerukun

jmho

macrumors 6502a
Jun 11, 2021
502
996
CUDA seems like it’s finally starting to enter a decline. There are a lot of other options for compute languages, and hardware is starting to get more diverse again. CUDA’s lock in is also starting to play against it as dedicated hardware and accelerators get more popular again. I see a lot of GPU research still preferring CUDA but commercial packages starting to move away.

But - CUDA losing ground does not mean Metal will gain ground.
Indeed. Apple have made their bed, and they've decided that they're not going to play nicely with anyone else, which means for them it's victory or nothing.

CUDA losing ground, and everyone moving towards more portable universal solutions is not good for Apple. But it is an opportunity.

As you say, CUDA losing ground does not mean Metal will gain ground, but if Metal doesn't gain ground then Apple Silicon is never going to be competitive in GPU based tasks.

For the Mac Pro transition only. For laptops, ASi is the best.
It depends what you're doing. If you're using first party software then I agree.

If you want to do 3D content creation or game development then you'll run into a huge number of missing features and poor performance.

A strong Mac Pro is the key to getting better quality software onto the laptops.

Even someone who only wants to use a MacBook Air and would never buy a Mac Pro should be hoping that the 8.1 Mac Pro is an amazing machine, because that will cause well optimised software to trickle down to the consumer level.
 

Kimmo

macrumors 6502
Jul 30, 2011
266
318
I mean, I don't mind the pit stop, lol... and honestly, for 3D rendering the Mac Pro is genuinely a monster... but the M1 chips are going to kill that T-Rex and birth a raptor in its place. A serious waste.

Exactly.

Raptor is good.

[Image: Screen Shot 2022-12-12 at 9.09.38 AM.png]


But kinda lightweight compared to T.Rex.

[Image: t-rex-electric-warrior-cover-art.jpg]
 

maikerukun

macrumors 6502a
Original poster
Oct 22, 2009
719
1,037
Apple's entire strategy in every segment other than the Mac has been total hardware dominance, and then use that to entice developers to use their platforms, making sure that they have the absolute best software ecosystem.

They understand the unspoken rule that when developers make multi-platform software, they will usually use the most powerful hardware because it lets them iterate faster.

By making the iPhone so powerful, Apple have made sure that most apps and mobile games are written for iOS first and then ported to Android afterwards. This means iOS has the better software ecosystem.

By making the iPad so powerful, they have made sure that anyone who needs power on the move has to buy an iPad instead of an Android tablet. Therefore iPad has the better ecosystem.

This situation is turned on its head with the Mac, where if someone wants to make something requiring huge amounts of GPU power, they pick up a 4090. This means that nobody at the high end is writing stuff in Metal, they're writing things in CUDA, and nVidia is playing the game Apple invented - going after developers and locking them into their ecosystem.

In much the same way that the M1 went after Intel's lunch, the Mac Pro 8.1 needs to go after nVidia's lunch. If they can't decisively crush a $2k 4090 with a $10k machine then the entire transition was a mistake.

Transitioning to Apple Silicon is such an all-or-nothing proposition for Apple that I think they know that if they want people to actually bother writing software with their proprietary APIs, they need to have developers of high-end software drooling over the Mac Pro in the same way that they currently drool over a 4090.
I 100% agree with this...very interested in seeing where it all goes.
 

maikerukun

macrumors 6502a
Original poster
Oct 22, 2009
719
1,037
Indeed. Apple have made their bed, and they've decided that they're not going to play nicely with anyone else, which means for them it's victory or nothing.

CUDA losing ground, and everyone moving towards more portable universal solutions is not good for Apple. But it is an opportunity.

As you say, CUDA losing ground does not mean Metal will gain ground, but if Metal doesn't gain ground then Apple Silicon is never going to be competitive in GPU based tasks.


It depends what you're doing. If you're using first party software then I agree.

If you want to do 3D content creation or game development then you'll run into a huge number of missing features and poor performance.

A strong Mac Pro is the key to getting better quality software onto the laptops.

Even someone who only wants to use a MacBook Air and would never buy a Mac Pro should be hoping that the 8.1 Mac Pro is an amazing machine, because that will cause well optimised software to trickle down to the consumer level.
Yep, agree with this too lolol. All very good points :)
 

fuchsdh

macrumors 68020
Jun 19, 2014
2,028
1,831
Anyone here with a 6K XDR? Considering one to get taxes on the right side of the year. I had expected the MP and new XDR to come out this year, so I waited. Is it a sane buy today?
The next upgrade over the existing XDR would be an iterative miniLED panel with 120Hz/HFR, or microLED, and a resolution bump. MicroLED ain't coming any time soon, and given the costs of such displays when they're not geared for gaming (aka focused on refresh rates and response times over accurate color), I don't think it's in the pipeline any time in the immediate future.
 

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
The next upgrade over the existing XDR would be an iterative miniLED panel with 120Hz/HFR, or microLED, and a resolution bump. MicroLED ain't coming any time soon, and given the costs of such displays when they're not geared for gaming (aka focused on refresh rates and response times over accurate color), I don't think it's in the pipeline any time in the immediate future.

It's crazy to me that they wouldn't upgrade to 8K. That said, they'd need to go substantially bigger than the current 32" screen to make it worthwhile.
 

maikerukun

macrumors 6502a
Original poster
Oct 22, 2009
719
1,037
It's crazy to me that they wouldn't upgrade to 8K. That said, they'd need to go substantially bigger than the current 32" screen to make it worthwhile.
What's your ideal monitor size and resolution for, say, a studio display?

For me, 6K at 32 inches feels perfect. In other words, the perfect update to the XDR for me would be the same display but with deeper black contrast, no bloom, and 120Hz.
 
  • Like
Reactions: jmho and enc0re

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
What's your ideal monitor size and resolution for, say, a studio display?

For me, 6K at 32 inches feels perfect. In other words, the perfect update to the XDR for me would be the same display but with deeper black contrast, no bloom, and 120Hz.
Well I have an 8k at 86" and I love it. It has the same dots per inch as the old 30" Cinema Display.

If you want to go retina and basically turn the 8k into a 4k'ish-but-retina screen then I think you can do around 43" for an 8k screen.

I personally prefer the old dot pitch because on a desk where you have your documents say 2-3' away from you, the retina has less value to me than the extra real estate. On a laptop or iPad or phone, you're so much closer to the screen that the use case for retina is much higher, IMO. But I can see where others would prefer otherwise.
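As a quick sanity check on those numbers, here is a minimal sketch of the pixel-density arithmetic, assuming "8K" means 7680×4320 and the 30" Cinema Display's 2560×1600 panel:

```swift
import Foundation

/// Pixels per inch given a panel's pixel dimensions and diagonal size.
func ppi(_ width: Double, _ height: Double, diagonalInches: Double) -> Double {
    (width * width + height * height).squareRoot() / diagonalInches
}

print(ppi(7680, 4320, diagonalInches: 86))  // ~102 ppi: 8K at 86"
print(ppi(2560, 1600, diagonalInches: 30))  // ~101 ppi: 30" Cinema Display
print(ppi(7680, 4320, diagonalInches: 43))  // ~205 ppi: 8K at ~43", retina-class density
```

So 8K at 86" really does land at roughly the old Cinema Display's ~100 ppi, and the same panel resolution at ~43" roughly doubles the density into retina territory.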

I find the XDR zone lighting to be an awful joke. On the few occasions where there is enough black on screen for it to zone out, it blooms and actually DISTORTS the reality of the image you're looking at. It's ANTI-fidelity. I hate it on my 12" iPad as well. The Samsung I have deals with it better, IMO, simply because it has more zones, but still, meh. A case where they want a stupid 1:1,000,000 contrast ratio on the stat sheet and don't give a darn about how lousy it actually works and looks.

That said, I'd love to just get to OLED, but they have their own issues; on a monitor, burn in really becomes more of an issue with so many static UI elements on the screen.

The only other thing on the horizon is microLED, and I think that will arrive right around the same time as fusion and true AI... any minute now. In the meantime, I suspect OLED will be improved enough to deal with burn-in before microLED gets real at reasonable scale.
 
Last edited:
  • Like
Reactions: maikerukun

innerproduct

macrumors regular
Jun 21, 2021
222
353
Since the 7900 XTX reviews started to drop, we now have some Blender 3.4 benchmarks, which give us some interesting insights. First, a properly utilised 7900 XTX is about 1.8 times faster for Blender rendering than a 6900 XT.
So if there were a Mac Pro 2019 compatible version of this card, we would be looking at just that: up to an 80% perf gain at the same power usage. That would have been a solid upgrade.
Now, if we look at a current Mac Pro, it can be outfitted with many different GPUs. The original Radeon Pro Vega IIs each have about the rendering perf of a single M1 Max. A single off-the-shelf (now $600) 6900 XT is about 35% faster than an M1 Ultra. And a system specced out with dual 6800 Duos renders at similar perf to eight M1 Max chips.
I list this as "evidence" that Apple just won't release a new Mac Pro that is worse than the current, almost 4-year-old system. Sure, they could make something that doesn't beat the top-of-the-line version, but if we don't get better perf per $ then what is the purpose? Just for specialty audio work? I mean, for video, does anyone really need anything better than the current Ultra? Or even the M1 Max?
The conclusion might then be: if there is an Apple Silicon Mac Pro, it will be better than a middle-of-the-road 2019 MP at least. That means rendering capabilities at least better than dual 6900 XTs, but with the benefits of silent running and large VRAM.
For that to be true, an M2 "Extreme" without extras will just not cut it. It might touch similar scores, but it wouldn't be better. So if Apple adds RT hardware in the M2 Pro series, that could save the day, and most people would have a good enough system. Maybe it will not beat a dual 6800 Duo or come close to PC/Nvidia perf, but at least it would be useful.
However, not being able to upgrade the GPUs would make the system a lot less interesting as an investment.

Would you guys buy an AS MP 2023 with a really fast 48-core (32+16) CPU but with a GPU in the realm of dual 6900 XTs at most? List price $12,000 with 192 GB RAM? No accelerators to be had?
 

fuchsdh

macrumors 68020
Jun 19, 2014
2,028
1,831
Well I have an 8k at 86" and I love it. It has the same dots per inch as the old 30" Cinema Display.

If you want to go retina and basically turn the 8k into a 4k'ish-but-retina screen then I think you can do around 43" for an 8k screen.

I personally prefer the old dot pitch because on a desk where you have your documents say 2-3' away from you, the retina has less value to me than the extra real estate. On a laptop or iPad or phone, you're so much closer to the screen that the use case for retina is much higher, IMO. But I can see where others would prefer otherwise.

I find the XDR zone lighting to be an awful joke. On the few occasions where there is enough black on screen for it to zone out, it blooms and actually DISTORTS the reality of the image you're looking at. It's ANTI-fidelity. I hate it on my 12" iPad as well. The Samsung I have deals with it better, IMO, simply because it has more zones, but still, meh. A case where they want a stupid 1:1,000,000 contrast ratio on the stat sheet and don't give a darn about how lousy it actually works and looks.

That said, I'd love to just get to OLED, but they have their own issues; on a monitor, burn in really becomes more of an issue with so many static UI elements on the screen.

The only other thing on the horizon is microLED, and I think that will arrive right around the same time as fusion and true AI... any minute now. In the meantime, I suspect OLED will be improved enough to deal with burn-in before microLED gets real at reasonable scale.
I haven't experienced the Pro Display with HDR content, but my Sony M9 monitor has substantially fewer zones (96 in a 27" footprint) and it works remarkably well. If there's a single small white element on a black screen (loading screens and the like) the "halo" effect is pretty pronounced, but in general use as long as you don't sit off-axis it's absolutely still an upgrade to me over a standard panel.

(Where I would say manufacturers are stat-chasing is the crop of "HDR400 verified" monitors that don't really get bright enough and only have 16 edge-lit zones. HDR simply looks worse on than off, and thus the specs are useless.)
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Would you guys buy an AS MP 2023 with a really fast 48-core (32+16) CPU but with a GPU in the realm of dual 6900 XTs at most? List price $12,000 with 192 GB RAM? No accelerators to be had?

I think buying a PC would make more sense. You'd have to really hate Windows 11 to invest in a Mac system like that, at least for 3D / VP use.

Actually, I think Apple should find a way for AS to use PCIe GPUs for the Mac Pro. Apart from giving that model a unique selling point, it would mean Apple could continue to concentrate on iPhones and laptops, without the constant need to compete with Nvidia and AMD's meatiest GPUs. They could leave it to AMD to put in the work there, or if AMD were to fall way behind, just partner with Nvidia. It doesn't seem worthwhile for Apple to take on the costs of developing every aspect of a very niche machine - and commit to doing so indefinitely.

The Mac Pro is just too far from the rest of Apple's products, specifically in terms of GPU. Ganging SoCs together is reasonable for the Ultra, but 4x Max chips (if possible) would have excessive numbers of CPU cores, whilst still not impressing compared to e.g. an RTX4090 or two. Especially not for the (presumably) massive cost.
 
Last edited: