
teagls

macrumors regular
May 16, 2013
202
101
*shrugs* Who said anything about simple? Though a 'few day' port sounds doable. £££ will bring the 'will' for porting software. How did developers get on x86 in the 1st place? Its prevalence and the £££ that it brings. 40 million Mac/iPad units (per year) will provide a sufficient incentive for many developers.

As for the app store making less money. What? Less billions? :p

X86 is a dead end. ...and Intel's struggles are set to continue for another year at least.

Intel's CPU performance has languished, and Apple silicon is already looming in the benchmarks. Xeons? Nothing special. AMD graphics? Polaris...mid-range Navi? I don't see how they represent 5 years of progress in GPUs on the Mac platform. Very mediocre. Apple haven't even bothered to put the 5700XT in a Mac between £1500-3000, and it's been out for a year.

As for GPUs. Nvidia? AMD? They've sat still for most of the last decade. The fall releases of RDNA2 and Ampere are the 1st real leap in a very long time. My point? The bar isn't 'that' high.

ARM developer kit? And? You neglected to mention that in Apple's demos it was running 3 x 4K streams and games under emulation...handling multi-gig PS files and running iOS apps natively. All the kind of things the current Intel Mac mini would struggle with?

And macOS windowing has never been buttery smooth, not since Mac OS X first graced our Macs. Even with Nvidia and AMD GPUs. Only in recent times has it been less 'juddery.' I'd take 3 x 4K streams over window smoothness. But Big Sur is beta software. I wouldn't have expected it to be smooth on an A12Z ultra-mobile chip...when it can barely be a smooth experience on Intel running beta software. Who knew.

Yet the GPU user experience on an iPad is super fast and smooth. Guess iOS and the A12Z were tuned for one another.

And yes, a mobile A12Z (which neither Intel, AMD nor Nvidia can match for its intended purpose...) is empirical proof that Apple won't match them for their intended Mac usage with the as-yet-unannounced chip (A14-based?)?

Apple-optimised and tuned software and hardware vs half-assed OpenGL crumbs from Nvidia or AMD?

I'm looking forward to seeing what Apple AS offers. If the iPad 'experience' is anything to go by, I'll take it.

Azrael.

Developers got on x86 because it was easily available. This article by Linus Torvalds sums it up perfectly.


Less than 0.01 percent of consumer mobile apps will be considered a financial success.


Intel CPUs and AMD graphics are terrible, yes. Intel tried for years to make a discrete GPU (Larrabee), which turned into their failed Xeon Phi coprocessors. But that should be an indicator of how hard it actually is to produce improvements in this technology.

Not adding the 5700XT is on Apple. What makes you think they will upgrade continuously even with their own silicon?

Nvidia has sat around!? LOL. Maxwell to Pascal was one of their biggest jumps. Pascal was so good people didn't want to upgrade to Turing. Name me a GPU better than Nvidia's! Oh wait..

LOL – demos are just that: demos. They are choreographed, rehearsed and designed to look impressive. Why would they show anything that looks bad? Got some terrible news.. real-world performance is different. But maybe you didn't understand. Just dragging a window in Finder used over 30% of the GPU. Imagine doing anything else... are you that obtuse?

You realize Apple maintained OpenGL on the Mac. They were responsible for it!! OpenGL on Nvidia works amazingly well on Linux. Guess why? Because the drivers come directly from Nvidia. Nvidia regularly updates OpenGL too. Not Apple's garbage.

You should do some deeper research into these areas. You clearly aren't aware of the facts. Especially thinking Nvidia was in control of OpenGL on Mac...
 

Azrael9

macrumors 68020
Apr 4, 2020
2,287
1,835
And yet the App Store remains. Raking in billions. 'Less than...more than...' Tomato TOMaTo.

...and they got on iOS because it's even more available. And when 40 million Mac/iPad units walk out of stores per year...that will be available.

As for Intel's attempt at a GPU? Laughable. I think they now see its increasing importance...but you're very late to the party, Intel.

Apple have clearly prioritised GPU on iPad, with a 1,000-fold improvement? That's why I fancy their chances of making a compelling GPU platform more than I do Intel's.

Nvidia and AMD sat around for the best part of 5 years. I'll give Nvidia a point for optimising their hand; they did better than Radeon over the last 5 years. But that's not saying much. GPUs were just as static as CPUs for 5 years.

Ampere and RDNA2 are the 1st serious jumps since...amazing what some actual competition can do.

Obtuse? Or myopic for not realising those demos were pretty darn impressive compared to 'window' lag, which has been a Mac specialty even on Nvidia Mac GPUs. All that on a beta 'AS' chip on a beta OS.

No one ever said Apple draped themselves in glory re: their OpenGL responsibilities. GPUs performing at 50% of their Windows equivalents. For premium Apple prices. If we're 'lucky' we'll get the year-old 5700XT in the 'new' iMac this Sep'. Which we're still...waiting for. I'll give Nvidia props for their OpenGL maintenance, but they didn't have that much competition. GL is old middleware. It's the past. And performs like it.

There's no need to do research. I lived through it. Which I'll put up against your future-gazing at Apple's potential for GPUs (when iPad clearly shows they handed the competition – Intel, AMD, Nvidia – their ass in mobile. Where the real money has been). OpenGL on Mac was a 2nd-rate disaster of middleware junk. And keeping it was a 'no go'; it was going nowhere. With devs that offered half-hearted ports...of...well...anything. OpenGL. Inefficient. It had its time back when Apple needed 'open,' as it was more widely supported than QuickDraw 3D.

Apple entering the CPU and GPU space is good for competition. It's even better for Mac users who've been getting 2nd-rate GPU performance at 1st-rate prices.

If Intel and AMD are terrible...it won't be difficult to unseat them performance-wise on CPU or GPU. I'd expect some software and hardware optimisations that Intel and AMD simply won't be able to match. There will be raw performance...but it will also be about holistic user experience (maybe they'll get rid of your window lag on the beta(!) dev kit...by putting an actual AS chip in there...and optimising Big Sur for the next 'Big Mac.')

Nvidia? Better GPU? What? A 2080 Ti for £1200? A mediocre 5700XT offers half the performance for a third of the price. Not so great. I'd look towards their hot, brute-force Ampere to see hardware that is actually worth the price. And their proprietary cul-de-sac (yes, I'm sure those users on PCs and Linux may enjoy those...'benefits...'). Ultimately irrelevant to Mac users. As Apple clearly hasn't forgiven them...

Who cares about Linux? Linus Torvalds? He ain't leading Apple. Or Mac dev'. (I'm not even sure Tim Apple is leading Mac development...) The gravitational pull of iPhone and iPad is market-leading £££, and that is going to drag the AS Mac along with them. Kicking and screaming if necessary. But it can't get any worse than it is now, or was under OpenGL's 2nd-rate existence on the Mac platform. Glad it was taken behind the cow shed and put down. I expect AS to be transformative.

But yes. The proof of the pudding will be in the eating. Let's hope it's not as bad a pudding as OpenGL...or those nasty Intel iGPUs...or those so-so Xeons...

'Cuz we ain't getting any Threadrippers or anything Nvidia. (Nvidia. Big. Hot. Not 'that' efficient.)

Apple were never going to win an uphill struggle against Nvidia/Intel on Mac towers. They tried it under Jobs and the needle barely moved. (I had a Power Mac...by the way...so we've seen how this all played out...)

The iPhone, the App Store...and even the iPad changed everything for Apple. And for the Mac. The iPhone £££ and iPad £££ (both the finance and the learning curve of chips like the A12X) have given Apple and the Mac an opportunity it wouldn't have gotten otherwise. This means the Mac, under AS hardware, can be 'born again.' Will it be better than Intel CPU/AMD GPU? It gets Metal. It will get AS. What fusion of power and performance we get out of AS '14' is yet to play out.

I don't think Apple enters the game unless Apple can offer its users a better experience.

Azrael.
 
Last edited:

patrick.a

macrumors regular
May 22, 2020
153
125
It's not that simple. The App Store is not as popular & lucrative as it used to be. Big shops that make cross-platform software see the most success. Explain to me why companies would hire very expensive iOS/macOS developers at $150k+ salaries to port existing cross-platform software to Metal and optimize it for Apple Silicon. The return is minimal, if not a total loss. Only large companies can do this. Smaller shops that rely on open source or existing x86 software frameworks can't afford that...

As for GPUs. I have the ARM Developer kit. Just dragging around windows on it uses over 30% of the GPU. On my Mac Pro with Big Sur the usage doesn't even show up. Apple will never be able to compete with Nvidia on GPUs. Nvidia's entire company and business model revolve around producing the best GPUs in the world. To think Apple can just pop something out that competes with Nvidia is absurd, and shows a clear lack of understanding of how engineering, research and development work.
Thanks for putting things into perspective. It's easy to get excited watching Apple's well-made keynote and marketing material. A lot of people in this forum, and all of YouTube, seem hyped on things Apple has yet to prove. And I don't quite believe all the smoke and mirrors 100% just yet either.
 

Azrael9

macrumors 68020
Apr 4, 2020
2,287
1,835
As a demonstration of potential, and insight into the transition and upcoming AS hardware? Promising. And yes, exciting. More so than the last transition, where Steve Jobs hummed his way through the loading of Photoshop under Rosetta.

That wasn't a shipping OS. It wasn't shipping hardware. So they offered measured insight. Rather than benchmarks and hard promises (to which people would later say, 'But you said...')

Apple Marketing 101. What people say and what they do are two different things. And what they expect is a dark science.

It was Beta software and 'stand in hardware.' That was pretty clear.

Azrael.
 

Azrael9

macrumors 68020
Apr 4, 2020
2,287
1,835
Uh, no; People couldn't AFFORD to upgrade to Turing!

True dat. After ramping up GPU prices to outrageous levels (partly due to greed...and partly due to lack of AMD competition...)

And after beta-testing people on Turing, they then hosed them further with the 'S' series they should have released in the 1st place?

Azrael.
 

teagls

macrumors regular
May 16, 2013
202
101
Uh, no; People couldn't AFFORD to upgrade to Turing!

Compared to the perf per dollar of a Mac Pro it's a steal... Apple is no better. If not the worst.

To go even further: at least when buying Turing you knew you were getting the best. With Apple's Mac Pro, LOL, you could build a faster machine for 1/3 the cost.
 
Last edited:

eflx

macrumors regular
May 14, 2020
191
207
Kind of a myth there @teagls that you can build a comparable system for 1/3 the cost. Go configure even a Lenovo server ... kiddy AMD CPU-based systems don't count ;) Don't be jealous ... go buy a Mac Pro. I bet you'd quit the cost nonsense quick if you used it for more than web surfing.
 

teagls

macrumors regular
May 16, 2013
202
101
Kind of a myth there @teagls that you can build a comparable system for 1/3 the cost. Go configure even a Lenovo server ... kiddy AMD CPU-based systems don't count ;) Don't be jealous ... go buy a Mac Pro. I bet you'd quit the cost nonsense quick if you used it for more than web surfing.

I have a 2019 Mac Pro and it's garbage compared to the AMD Epyc server I bought alongside it. The AMD Epyc server was cheaper too!
 

richinaus

macrumors 68020
Oct 26, 2014
2,432
2,186
Recently I have been running an eGPU [AMD W5700 + Razer Core], with a 16" maxed MBP.

I was never that impressed with the stability or speed of the apps I use in macOS, and felt a bit ripped off.

So I bought a 2080 Super, put it in the same Razer Core and ran the exact same apps in Boot Camp.

Suffice to say, all the apps in Boot Camp totally destroyed what I had on macOS [same computer, remember].........

Read what you want into this, but I now have a different attitude to Windows. I was able to get better output on my renders, quicker and more efficiently. It also improved my design work, letting me work more fluidly and test iterations faster.

It is more than obvious that the drivers and apps are all better in Windows. I am looking forward to seeing what ARM and Metal can achieve, but am certainly not holding my breath.
 

eflx

macrumors regular
May 14, 2020
191
207
The Apple setup for something like the Mac Pro will utilize CCIX – look it up. The only real "promise" seems to be the unified memory architecture, to obtain super-high speed & efficiency. If they utilize CCIX interconnect technology between the different processors (an add-in GPU would count here), then technically they will be able to provide expandable GPUs using standard PCIe slots.
 

Hps1

macrumors regular
Apr 14, 2017
106
28
It is more than obvious that the drivers and apps are all better in Windows.

Yeah, I've been feeling this recently. Houdini in macOS (at least in 10.13, where you can use GPU rendering) is a crash-prone mess. Just horrible. C4D is perfectly fine up until S22, where it also starts to suffer. Same goes for many other apps.

In Windows? Everything sings. Such a shame.
 

richinaus

macrumors 68020
Oct 26, 2014
2,432
2,186
Yeah, I've been feeling this recently. Houdini in macOS (at least in 10.13, where you can use GPU rendering) is a crash-prone mess. Just horrible. C4D is perfectly fine up until S22, where it also starts to suffer. Same goes for many other apps.

In Windows? Everything sings. Such a shame.

Exactly my experience in Twinmotion, Rhino, unreal, and a host of other apps.

Unless Apple releases the 3D equivalent of Final Cut / Logic, there really is little these days to keep me, as the drivers and support in the apps I use are all better in Windows (Boot Camp). Already pricing up a workstation PC for these jobs.....
 

Hps1

macrumors regular
Apr 14, 2017
106
28
Update from the Redshift team:

Considering Apple’s recent announcements regarding ARM CPUs, we’ll be looking at ARM support. But that does not change our short-term plans in any significant way.

Regarding general progress: Things have gotten pretty stable recently! And faster! We’re still working on optimizations and some last bugs and we’re also expanding the range of AMD GPU architectures we’re testing with. And, of course, have started testing with Big Sur which looks like it’ll be the OS version “Redshift for Metal” will be shipping for.

That's AGES away
 

sirio76

macrumors 6502a
Mar 28, 2013
578
416
Big Sur is expected for late October/November; doesn't seem ages to me (provided they'll have RS ready).
 

Hps1

macrumors regular
Apr 14, 2017
106
28
Yeah, I'm just whinging after expecting it every other week since 2019. My fault, not theirs.
 
  • Like
Reactions: vel0city

vel0city

macrumors 6502
Original poster
Dec 23, 2017
347
510
Big Sur is expected for late October/November; doesn't seem ages to me (provided they'll have RS ready).

Hands up, who would use a version 1.0 render engine on macOS 11.0 for professional work? This will take at least a few updates of both the OS and Redshift to be reliable and stable enough for production work. But even then, judging by the frequency of regular Redshift updates, I wouldn't expect to be using it for anything serious until well into next year. How long until farms fully support it? Some farms still don't accept Redshift 3.xx projects because of its beta status. Will existing project files, assets and plugins be fully, 100% compatible?

Total shitshow in my opinion. Beyond disappointed by this entire situation.
 
  • Like
Reactions: shuto and Hps1

Hps1

macrumors regular
Apr 14, 2017
106
28
100% would rather use a buggy/underperforming beta if it meant being able to use modern hardware.
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,174
Stargate Command
Hands up, who would use a version 1.0 render engine on macOS 11.0 for professional work? This will take at least a few updates of both the OS and Redshift to be reliable and stable enough for production work. But even then, judging by the frequency of regular Redshift updates, I wouldn't expect to be using it for anything serious until well into next year. How long until farms fully support it? Some farms still don't accept Redshift 3.xx projects because of its beta status. Will existing project files, assets and plugins be fully, 100% compatible?

Total shitshow in my opinion. Beyond disappointed by this entire situation.

I would use a current Mac Pro & stable software for paying jobs, but I would also get an Apple Silicon Mac (doesn't even have to be a Pro-line Mac at first; a Mac mini or an iMac might be enough to start) and 'side project' it to evaluate AS hardware & assorted beta software...?

Work with what works & test what needs tested until it meets deployment standards...!
 

vel0city

macrumors 6502
Original poster
Dec 23, 2017
347
510
I would use a current Mac Pro & stable software for paying jobs, but I would also get an Apple Silicon Mac (doesn't even have to be a Pro-line Mac at first; a Mac mini or an iMac might be enough to start) and 'side project' it to evaluate AS hardware & assorted beta software...?

Work with what works & test what needs tested until it meets deployment standards...!

Doable, and I get it, but over-complicated and not ideal for workflow. After all this time waiting, I hoped to be on a Mac Pro rendering in both Redshift and Octane, but here we are, looking to 2021 for that to be a viable workflow.

It's just blowing my mind that this ever turned out to be so complicated, when we used to be able to stick an Nvidia GPU in our Macs and be done with it. Apple have turned this into a nightmare for 3D artists; all the fluff about "we have listened to our creative professionals and worked with them to deliver the workstation that they need" with the Mac Pro was just marketing and lies. Apart from niche edge-case workflows, that computer is a dud so far. Ask anyone on Adobe CC, C4D, Substance, ZBrush, Quixel, Maya or Unreal if they would swap from a PC to that computer. It's a joke.
 
  • Like
Reactions: shuto

Romanesco

macrumors regular
Jul 8, 2015
126
65
New York City
Doable, and I get it, but over-complicated and not ideal for workflow. After all this time waiting, I hoped to be on a Mac Pro rendering in both Redshift and Octane, but here we are, looking to 2021 for that to be a viable workflow.

It's just blowing my mind that this ever turned out to be so complicated, when we used to be able to stick an Nvidia GPU in our Macs and be done with it. Apple have turned this into a nightmare for 3D artists; all the fluff about "we have listened to our creative professionals and worked with them to deliver the workstation that they need" with the Mac Pro was just marketing and lies. Apart from niche edge-case workflows, that computer is a dud so far. Ask anyone on Adobe CC, C4D, Substance, ZBrush, Quixel, Maya or Unreal if they would swap from a PC to that computer. It's a joke.

Yup. Everything about the Mac Pro (2019) was overpromised and underdelivered. Shame on Apple. Never getting burned like this by their products again.
 
Last edited: