
Coconut Bean

macrumors 6502
Jul 21, 2011
400
380
I ran across the AWS re:Invent video from the session put on by AMD to promote the Epyc and Radeon stuff that AWS is offering. As an aside, the video had 89 views at the time I was writing this, so it's definitely not a hot topic. Note this was done in December, so it's not about the stuff just announced.

AMD Epyc instances are offered at a 10% discount from Intel instances. Since they were launched last year, AWS has been offering them in the T, M, and R lines (burstable, balanced, and memory weighted). This year they are offering some "semi-custom" processors in the C line (compute weighted) with faster processors than you can get elsewhere, so apparently AWS is buying that yield instead of Apple.

Notably missing from the AMD offerings on AWS is anything in the X line (extreme memory weighted). The most memory in an AMD offering is 768 GiB. The X line goes up to 3,904 GiB, all Intel.

I'll be keeping an eye out for any announcements about the new stuff going into any AMD instances, but it is interesting that AMD had to sponsor a session to tell a crowd notorious for jumping on any cost optimization that they could save 10% on their EC2 bill by tacking an "a" on the end of the instance type.
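To make that concrete: switching an EC2 workload to the AMD variant really is just the one-letter suffix on the instance type. A minimal boto3 sketch (the AMI ID here is a hypothetical placeholder):

Code:
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Same AMI and size as an Intel m5 -- only the one-letter family suffix changes.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical AMI ID
    InstanceType="m5a.2xlarge",       # the "a" marks the AMD EPYC variant of m5
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])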

Are you guaranteed to get Intel if you are not running a ***a instance, or might they give you the AMD instance?
 

Coconut Bean

macrumors 6502
Jul 21, 2011
400
380
That's not true. A huge number of us went with hacks and/or Windows-based systems where we could while we waited for the 7,1. I used both myself. Still have them. Hacks in our environment are unreliable. They were mostly there, but some of us had problems, and most of us couldn't justify the risk of missing deadlines due to hack issues. Windows was also fine. All but a few pieces of software run on Windows. But most of us dislike it and are willing to sacrifice price and possibly some performance (our needs are already met) to continue using macOS.

And yes, that would be amazing, but we're not getting that and knowing Apple's history, they weren't going to give it to us.

My point is that this is a tool. If it doesn't suit the needs of the user, there are other options. In this particular instance, it fits our needs for the most part. There are much more powerful options that will run Windows/Linux/Unix depending on what you need.

EDIT: Also, forgive me for saying this, but it really seems that Apple built this thing as a workstation for those in the film/tv/music industry. It's marketed that way. It *might* not be the computer for everyone.


Point taken. I definitely don't represent everyone, but I understand the basic needs of those I work around, and 7,1 is mostly there. Enough so that we're happy.

Mind if I ask what you were planning on using the Pro for? Is a Windows workstation not an option?

At what point did the (hack/Windows) transition happen?

I always wondered what the people for whom the higher-end Mac Pro combined with an XDR display were made used before. The previous Mac Pro was outdated like 4-5 years ago, while iMacs were still slow; they have a good display, but nowhere near the XDR. It seems like the majority of those people already went with alternative solutions, likely costing a lot both in money and in other infrastructure, software, and training of employees.

According to some random tech YouTuber's review, his export times went from approximately 20 (latest maxed-out MacBook Pro) -> 12 (maxed-out iMac) -> 8 (high-specced Mac Pro). Recalling this from memory, so the numbers might be a little different. Were studios relying on Final Cut Pro just *ing around for hours watching the beachball while their machines were loading until now?
 
  • Like
Reactions: Marekul

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,677
The Peninsula
Are you sure? :)

DCG is Data Center Group, which also involves Server group.
Goalposts in motion, again.

The earlier post that you gleefully reported as proof that Intel was doomed was in fact about an IT support reorganization that shed a number of jobs.

When I pointed that out, you responded with a different Charlie-d message about server group layoffs (which have not yet happened).

And what are your views on the *fact* that Apple can't support any system with more than 32C/64T?

 

throAU

macrumors G3
Feb 13, 2012
9,239
7,398
Perth, Western Australia
Which is why they're not ****ing themselves over the AMD Threadripper 3 - they can't use the higher core counts. The 28C/56T Xeon is very close to the limit of what Apple OSX can use.

Apple OSX can't support a 64C/128T processor. Windows and Linux can.

This is something that can be fixed in software.

Linux never used to support more than ONE CPU properly either. Neither did Windows.

If Apple are not supporting Threadripper purely because their OS is unable to support more cores, they need to get their priorities sorted out, because the only reason Intel aren't running similar core counts to compete is that right now they are limited by manufacturing capability. You can bet your house that as soon as Intel are actually CAPABLE of manufacturing a 64-core processor, they will have one on the market. However, right now even Intel's top-of-the-line 28-core parts are expensive, scarce, and difficult to actually purchase (even at that price), because they simply can't produce them in volume.

I see people posting above how they think workloads won't scale beyond a core count of X, but that's garbage if:

  • you have many projects to do, and any of them are CPU bound and need to be batched
  • you want to properly isolate your processes for security via hardware virtualisation
  • you want to run tasks for multiple users off a central, high-power machine acting as a server (e.g., an Xgrid Xcode build server perhaps?)
  • you have a workload that is "embarrassingly parallel" - which a lot of media processing/rendering style tasks are (see the sketch below)
At the end of the day, these workloads exist, they are in Apple's wheel-house and they don't have a machine that can support higher core counts to do them.
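Here's a minimal Python sketch of that last bullet. render_frame is a hypothetical stand-in for any CPU-bound per-item task (encoding, ray tracing, simulation); because every item is independent, throughput scales roughly with core count:

Code:
import os
from multiprocessing import Pool

def render_frame(i: int) -> int:
    # Stand-in for real per-frame work: just burns CPU deterministically.
    acc = 0
    for n in range(2_000_000):
        acc = (acc + i * n) % 1_000_003
    return acc

if __name__ == "__main__":
    # One worker per logical core; an "embarrassingly parallel" job like this
    # keeps every core busy with zero coordination between tasks.
    with Pool(processes=os.cpu_count()) as pool:
        results = pool.map(render_frame, range(256))
    print(len(results), "frames rendered")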

And even if that isn't the case... for a moment let's play devil's advocate. Let's say Apple do not need 64 cores in any application on their platform... why pay Intel more money for slower cores? Why use hotter, more power-hungry CPUs in their products?

Right now, even AMD's 3950X outperforms anything Intel have, at any price (in less power), in some end-user workloads (and that's what they could/should be using in their iMac, for example).
 
Last edited:

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,677
The Peninsula
This is something that can be fixed in software.
But it's not a trivial fix - it requires major changes to a critical core component of the kernel, with real performance issues.

Simply supporting a flat scheduler with more threads than fit in a native integer bitmask might be a bad solution.
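A toy sketch of the problem (illustrative only - nothing like Apple's actual kernel code, just the arithmetic of the limit):

Code:
# If a scheduler tracks CPUs in one native 64-bit word, IDs above 63 don't fit.
WORD_BITS = 64

def set_cpu(mask: int, cpu: int) -> int:
    if cpu >= WORD_BITS:
        raise ValueError("CPU id does not fit in a single 64-bit mask word")
    return mask | (1 << cpu)

# A 64C/128T part needs 128 bits, i.e. an array of words (the way Linux's
# cpumask works), and that change ripples through hot scheduler paths.
NUM_CPUS = 128
words = [0] * ((NUM_CPUS + WORD_BITS - 1) // WORD_BITS)  # 2 words

def set_cpu_multi(words: list, cpu: int) -> None:
    words[cpu // WORD_BITS] |= 1 << (cpu % WORD_BITS)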
 

throAU

macrumors G3
Feb 13, 2012
9,239
7,398
Perth, Western Australia
But it's not a trivial fix - it requires major changes to a critical core component of the kernel, with real performance issues.

Simply supporting a flat scheduler with more threads than fit in a native integer bitmask might be a bad solution.

Well then I guess Apple need to give up?

Never mind that Linux has switched schedulers many, many times in the past without breaking things. So has Windows.

The internal workings of the scheduler are not exposed to user apps. They can make kernel changes of this nature as they see fit without breaking apps.

Everybody else has done it.
 
  • Like
Reactions: koyoot

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
But it's not a trivial fix - it requires major changes to a critical core component of the kernel, with real performance issues.

Simply supporting a flat scheduler with more threads than fit in a native integer bitmask might be a bad solution.
Which is why they need all of the APUs that have been leaked in the macOS Catalina beta: to test the stability of their system, with all of the required upgrades, on new hardware on all fronts, from large 64-core CPUs to the simplest ones.
 

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,677
The Peninsula
Which is why they need all of the APUs that have been leaked in the macOS Catalina beta: to test the stability of their system, with all of the required upgrades, on new hardware on all fronts, from large 64-core CPUs to the simplest ones.
Amazing that once again you can try to link something to non-existent AMD products. Amazing.

Apple can't support an AMD CPU with more than 32 cores. Let that sink in.

Do you not realize that Apple OSX barely supports current Intel Xeons, and that AMD processors with more cores are irrelevant? Or do you realize it, but won't admit it?
 

nicho

macrumors 601
Feb 15, 2008
4,250
3,250
Also, it's important to point out that on modern systems we have dozens of processes going at the same time that don't need to be parallel, but the overall system gets use out of running them separately on their own cores. I find systems with 6-8 cores on modern operating systems are just more responsive, because so many of the little processes can always find a ready core.

For everyone talking about programs not being optimised for multiple threads: sometimes it's good for these background ones.

For a week my Mac mini was a bit warmer than normal but otherwise running perfectly. Turned out an "updater" process got a bit uppity and had 100% utilisation of one of the 6 cores for that entire time. If it had been better written (i.e., multithreaded), or if this were a dual-core system, it might have crippled the whole machine.
 
  • Like
Reactions: throAU

throAU

macrumors G3
Feb 13, 2012
9,239
7,398
Perth, Western Australia
For everyone talking about programs not being optimised for multiple threads: sometimes it's good for these background ones.

For a week my Mac mini was a bit warmer than normal but otherwise running perfectly. Turned out an "updater" process got a bit uppity and had 100% utilisation of one of the 6 cores for that entire time. If it had been better written (i.e., multithreaded), or if this were a dual-core system, it might have crippled the whole machine.

Well... yeah, I guess.

But the proper solution for that is more cores, rather than slower processing (for your background jobs) :D
 

TheRealAlex

macrumors 68030
Sep 2, 2015
2,985
2,251
AMD ThreadRipper 3 is not a good match for the top-end Mac products.

AMD went for max core count over TDP. That rules it out of easy placement in the iMac Pro.

AMD prioritized it behind EPYC and Ryzen, so timing-wise it was a bad fit for the Mac Pro 2019. Most likely it would have arrived later than the Intel option (looking from when the design path would have been set, back in 2017 or so). The memory capacity is kneecapped to make EPYC more viable. Both moves by AMD are understandable, because the major priority is not placement into a Mac product but rather into subsections of the overall market. Solely selling to Apple wasn't going to solve their revenue generation and profitability problems.

The window to tie into AMD products could come with the Zen 3 architecture, but that looks to be 2021 for the Threadripper zone.

The Mac Pro has a decent chance of going back into Rip van Winkle mode. At that point there may not be many options for a sizable number of years. (Apple shouldn't do that, but... then there is their track record.) If the Mac Pro actually meets the basic technical requirements for the workload, but doesn't have the 'right' brand or an 'alternative universe' upper cap, then refusing it is a dubious stance.

If enough of the current Mac Pros get sold to give Apple confidence they have a viable product, a new Mac Pro will show up in the future. It is extremely unlikely there is a written-in-stone requirement that it has to have an AMD processor. There is probably a faction inside Apple pondering putting an Apple CPU in there (if they can keep the system price high enough to pay for a very high-priced, super-low-volume custom one), and a "wait and see" faction. There is a good chance that a future Threadripper-like product makes the cut, but treating that as close to 100% guaranteed, and not buying otherwise, isn't a prudent bet.
All true. However, the Xeon CPU line Apple chose is EOL; Intel ain't making any more, so they are stuck. I'd like to know what they are gonna offer in 2 years. By then AMD will be so far ahead that I think all these issues will be solved by AMD, and Apple will go with EPYC, or maybe ThreadRipper 5 by then.
I just cannot support something that's dead. Sure, it's gonna be a great machine for a few years, but already a 64-core PC can wipe the floor with it for about 50% of the price. Today, not in a few years. Today.
 
  • Like
Reactions: ssgbryan

ssgbryan

macrumors 65816
Jul 18, 2002
1,488
1,420
It amazes me everyone’s getting so butt hurt over this. This Mac Pro is exactly what my industry (film) asked for and we’re buying it in droves. I don’t know anyone that is complaining over potential “lost” computing power when Apple has given us 99% of what we need. None of us have any desire for the fastest possible workstation, just what gets the job done, reliably.

Because rather than delivering a general-purpose workstation, they delivered a product that only shines in a subset of what the older systems could reach.
 

polishpanda

macrumors newbie
Jan 19, 2020
4
4
At what point did the (hack/Windows) transition happen?

I always wondered what the people for whom the higher-end Mac Pro combined with an XDR display were made used before. The previous Mac Pro was outdated like 4-5 years ago, while iMacs were still slow; they have a good display, but nowhere near the XDR. It seems like the majority of those people already went with alternative solutions, likely costing a lot both in money and in other infrastructure, software, and training of employees.

According to some random tech YouTuber's review, his export times went from approximately 20 (latest maxed-out MacBook Pro) -> 12 (maxed-out iMac) -> 8 (high-specced Mac Pro). Recalling this from memory, so the numbers might be a little different. Were studios relying on Final Cut Pro just *ing around for hours watching the beachball while their machines were loading until now?
I'd say it was around three years ago when people started getting serious about hacks. By then ARRI had already released the Alexa 65 and the Alexa LF was on the way. A lot of our issues were with I/O throughput, not necessarily compute power, though that was *definitely* lagging. Outside of the final grade or VFX, realtime playback at max res is less important, and most grading/VFX systems at major facilities were already based on Unix/Linux.
 
Last edited:

fendersrule

macrumors 6502
Oct 9, 2008
423
324
I have no problem with Apple yanking Intel’s crank.

Seriously, I don’t. I consider the first gen i7 to be the best CPU ever made.

But AMD is causing some bruises that are hard to ignore. This is reminding me of the Athlon days, when Intel just didn't make sense for anything.

Stop feeding the esoteric BS that macOS rules movie production. No one has provided any proof of exactly who and what these (2019 MPs) are designated for at their price tag. No one has provided any evidence of how much work gets done on a Mac vs. a PC for anything. Adobe sucks on a Mac now; we all know that.

Here's my gripe as a long-time Apple fan. It's Apple sinking back into the PPC days again. Apple already shut out Nvidia's ecosystem. Maybe the #1 reason why I'm leaving. Nvidia makes the best GPUs with the lowest TDP. FACT. Remember when Apple was on the bleeding edge? Not with AMD GPUs they won't be...

Now, with Zen 2 being as great as it is, plus Threadripper literally ripping it up, Intel looks like a raw deal. AMD's Zen 3 will finally crown AMD without question, even though I feel like Zen 2 already does that. Intel's roadmap for 2020 = nothing. I don't even think Intel will have anything competitive for desktops until late 2021...

$5,999 for the base-level Mac Pro is a stupid-as-hell value prop, especially with a 4-year-old GPU. The prosumer is f**^ed for the first time with the Mac Pro. But now the professional is scrutinized... heavily.

I love MacOS as much as anyone else, but I’m not stupid enough to pay thousands extra for it. I happen to remember when it was a hundred extra.

This will fail. Quote me in a few years if you'd like, Aiden.

If you guys want to continue arguing software core limitations, that's fine. But it's AMD that made you do that, not Apple.
 
Last edited:

askunk

macrumors 6502a
Oct 12, 2011
547
430
London
It's funny because many articles online make the point of cutting some slack to Macs compared with TR3 PCs because "macOS and software like FCP create synergies not possible on PCs". Sure: an Intel Mac could beat an equivalent Intel PC or even an AMD PC because the software is so well written... but the point is: how much better would they actually be if they were also running on AMD chips?

Steve Jobs sent IBM to hell because in three years they couldn't deliver on their promise of a 3 GHz chip, and their roadmap on efficiency was crap. Intel has been doing even worse so far, and their roadmap to 10nm is depressing. As much as I know Apple is going ARM, I am starting to think that it wouldn't take much to switch at least the Pro line to AMD until some ARM chips are ready for them.

The general excuse, so far, has been that "Intel gives very good deals to Apple", but AMD has been providing GPUs to Apple and - besides - their CPUs are cheaper than Intel's anyway, plus they can easily let Apple design its custom architecture as much as Intel could in the past.

It wouldn't take much to switch to AMD for the new iMacs, for example.
 
  • Like
Reactions: throAU

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Amazing that once again you can try to link something to non-existent AMD products. Amazing.

Apple can't support an AMD CPU with more than 32 cores. Let that sink in.

Do you not realize that Apple OSX barely supports current Intel Xeons, and that AMD processors with more cores are irrelevant? Or do you realize it, but won't admit it?
Yes, they can. All they need to do is update their software.

If not, they will be not one but three generations behind anybody who is not bound to the Apple ecosystem of software and hardware.

It's funny because many articles online make the point of cutting some slack to Macs compared with TR3 PCs because "macOS and software like FCP create synergies not possible on PCs". Sure: an Intel Mac could beat an equivalent Intel PC or even an AMD PC because the software is so well written... but the point is: how much better would they actually be if they were also running on AMD chips?

Steve Jobs sent IBM to hell because in three years they couldn't deliver on their promise of a 3 GHz chip, and their roadmap on efficiency was crap. Intel has been doing even worse so far, and their roadmap to 10nm is depressing. As much as I know Apple is going ARM, I am starting to think that it wouldn't take much to switch at least the Pro line to AMD until some ARM chips are ready for them.

The general excuse, so far, has been that "Intel gives very good deals to Apple", but AMD has been providing GPUs to Apple and - besides - their CPUs are cheaper than Intel's anyway, plus they can easily let Apple design its custom architecture as much as Intel could in the past.

It wouldn't take much to switch to AMD for the new iMacs, for example.
ARM will never be ready to take on anything that is high-performance. It is too limited compared to the x86 ISA.
 

cube

Suspended
May 10, 2004
17,011
4,973
It's funny because many articles online make the point of cutting some slack to Macs compared with TR3 PCs because "macOS and software like FCP create synergies not possible on PCs". Sure: an Intel Mac could beat an equivalent Intel PC or even an AMD PC because the software is so well written... but the point is: how much better would they actually be if they were also running on AMD chips?

Steve Jobs sent IBM to hell because in three years they couldn't deliver on their promise of a 3 GHz chip, and their roadmap on efficiency was crap. Intel has been doing even worse so far, and their roadmap to 10nm is depressing. As much as I know Apple is going ARM, I am starting to think that it wouldn't take much to switch at least the Pro line to AMD until some ARM chips are ready for them.

The general excuse, so far, has been that "Intel gives very good deals to Apple", but AMD has been providing GPUs to Apple and - besides - their CPUs are cheaper than Intel's anyway, plus they can easily let Apple design its custom architecture as much as Intel could in the past.

It wouldn't take much to switch to AMD for the new iMacs, for example.
If Apple had stuck with PowerPC, maybe they would now be at 16nm FinFET SOI instead of 14nm FinFET bulk.
 

Coconut Bean

macrumors 6502
Jul 21, 2011
400
380
I'd say it was around three years ago when people started getting serious about hacks. By then ARRI had already released the Alexa 65 and the Alexa LF was on the way. A lot of our issues were with I/O throughput, not necessarily compute power, though that was *definitely* lagging. Outside of the final grade or VFX, realtime playback at max res is less important, and most grading/VFX systems at major facilities were already based on Unix/Linux.
Was going over to Windows/Linux considered a viable alternative?
 

DoofenshmirtzEI

macrumors 6502a
Mar 1, 2011
862
713
Are you guaranteed to get Intel if you are not running a ***a instance, or might they give you the AMD instance?
The CPU type is specified by the instance code, so you will always get the CPU type specified for the given instance code. They also offer instances with ARM processors; you won't get those snuck in on you either.
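You can check this yourself: the processor architecture is a fixed property of the instance code. A quick boto3 sketch using the EC2 DescribeInstanceTypes API:

Code:
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

resp = ec2.describe_instance_types(
    InstanceTypes=["m5.large", "m5a.large", "a1.large"]
)
for it in resp["InstanceTypes"]:
    # m5 (Intel) and m5a (AMD) both report x86_64; a1 (Graviton) reports arm64.
    print(it["InstanceType"], it["ProcessorInfo"]["SupportedArchitectures"])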
 

cube

Suspended
May 10, 2004
17,011
4,973
I ran across the AWS re:Invent video from the session put on by AMD to promote the Epyc and Radeon stuff that AWS is offering. As an aside, the video had 89 views at the time I was writing this, so it's definitely not a hot topic. Note this was done in December, so it's not about the stuff just announced.

AMD Epyc instances are offered at a 10% discount from Intel instances. Since they were launched last year, AWS has been offering them in the T, M, and R lines (burstable, balanced, and memory weighted). This year they are offering some "semi-custom" processors in the C line (compute weighted) with faster processors than you can get elsewhere, so apparently AWS is buying that yield instead of Apple.

Notably missing from the AMD offerings on AWS is anything in the X line (extreme memory weighted). The most memory in an AMD offering is 768 GiB. The X line goes up to 3,904 GiB, all Intel.

I'll be keeping an eye out for any announcements about the new stuff going into any AMD instances, but it is interesting that AMD had to sponsor a session to tell a crowd notorious for jumping on any cost optimization that they could save 10% on their EC2 bill by tacking an "a" on the end of the instance type.
So they make more profit off AMD?
 

askunk

macrumors 6502a
Oct 12, 2011
547
430
London
ARM will never be ready to take on anything that is high-performance. It is too limited compared to the x86 ISA.

Never said otherwise, at least in the short term. The point is whether Apple will fill the progressively shrinking range of CISC (x86) Macs with Intel or AMD chips. I would love them to make the right move and sell the next Pro Macs with AMD chips, but I have the feeling Apple has become too big to make such a drastic decision.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Never said otherwise, at least in the short term. The point is whether Apple will fill the progressively shrinking range of CISC (x86) Macs with Intel or AMD chips. I would love them to make the right move and sell the next Pro Macs with AMD chips, but I have the feeling Apple has become too big to make such a drastic decision.
So you are saying that Apple consumers are dumb enough not to notice that everybody else can do the same thing they do 2-3 times faster, for less?

Nobody is THAT stupid as to risk putting their computers into complete irrelevance when it comes to performance. For the time being, Intel is a "that'll-do" option for Macs. But for how much longer?

ARM will NEVER be a serious contender in the high-performance market. It will be a good option where you need reliability, like data-center jobs. But not for anything that requires the highest possible performance.

And that last thing is actually what consumers and clients demand. And no, Intel does not guarantee it right now.

They do not even guarantee power efficiency.
 