Also, modularity and "form over function" are somewhat different dimensions. Folks try to equate the two at times, but they aren't the same.


Usually that's true, but Apple does a hell of a job bringing the two together.
After so many years of waiting for the next MP I can definitely say that there is no logic left regarding Apple's choices.

There is a logic.
It just isn't based on customer needs or expectations.
I'm sure for Apple (management) personnel it all makes perfect sense.
 

That article is a bit overhyped. It has several fundamental flaws.

1. The article references the 'via Tomshardware' article, which ends in....

"... and the first PCIe 5.0 devices should debut this year. Broader availability should come in 2020. ..."

Some announcements may come in 2019, but product? Neither the PCI-e docs, the Tom's article, nor the notebookcheck article has any example of any CPU that was coming with PCI-e v5 in 2019. There are explicit examples of PCI-e 4.0 (e.g., AMD), but there are zero for 5.0. Huawei has an ARM server chip with PCI-e 4 announced last month. IBM has had Power 9 announced and shipping since 2017.

2020 isn't a "short term" timeline for getting to another Mac Pro.


2. The first CPU onto PCI-e 4 was IBM Power 9 (2017). Roadmaps from around the 2017 era had them on PCI-e 5 in 2019 with Power 10.... but then GlobalFoundries flaked on 7nm (IBM was shooting for a custom high-density '10nm'). GF stopped at 14nm. IBM has had to shift to Samsung for 7nm, and the roadmaps now say 2020+ for Power 10's arrival.

CPU (and to a lesser extent GPU) development hiccups can kick out the timelines.


3. It will be shorter than the PCI-e v1 -> v2 -> v3 pace that had updates at 4-year intervals. But that doesn't mean 4.0 is going to get skipped in a significant way. In part, it is going to be short because IBM has been the only implementer for almost two years. So since the others have been hitting the snooze button for over a year.... yeah, once they get to 4.0 they will probably transition to 5.0 without hitting the snooze button for an extra two years. It will certainly be shorter than the 3.0 -> 4.0 transition if you use that as a 'baseline'. It will probably be shorter than four years. But shorter than two? Probably not. Even two is a stretch.

There are parts and techniques, short of fully implementing 5.0, that can be 'borrowed' from 5.0 and used to implement 4.0 (which more than several are going to do, because it is simpler).

The gap between 4.0 -> 5.0 is only two years, but that may not be a broad-spectrum deployment path. Look at Ethernet. 10GbE BaseT passed in 2006. [And Ethernet speed standards increasingly came over that decade.] A Mac with 10GbE standard didn't arrive until 2017 (over a decade later). If PCI-e 5.0 cards remain extraordinarily expensive, then 5.0 won't deploy into the Mac Pro (let alone general Mac) zone for a long while (e.g., if the primary drivers are >100Gbps I/O cards). Nvidia pushing NVLink, and AMD now moving to push Infinity Fabric (for GPGPU card-to-card backplane), will also dampen deployment to the mainstream. [Proprietary backplanes put more money in those GPU vendors' pockets, so they aren't in a hurry to dump them.]

When 3.0 came, PCH chipsets held onto 2.0 for a long time. It was more cost-effective to implement, and that is where the bulk of the I/O device inertia was.

In part, v4.0 didn't come quicker to the broader market because, to a large extent, v3.0 was fast enough for an extremely large set of solutions. There are data center folks moaning about the 'delay', but was the broad workstation market grossly impeded because it didn't have PCI-e v4.0 for almost two years? Almost nobody was 'having a cow' over that delay.


Indeed, here's a part of the Tom's article that notebookcheck glosses over:

"... PCI-SIG expects the two standards to co-exist in the market for some time, with PCIe 5.0 used primarily for high-performance devices that crave the ultimate in throughput, like GPUs for AI workloads, and networking applications. That means that many of the leading PCIe 5.0 devices will land in data center, networking, and HPC environments, while less-intense applications, like desktop PCs, are fine with the PCIe 4.0 interface. ..."


4. In the context of a Mac Pro (being on target in this forum): Intel isn't jumping to 5.0 this year. AMD isn't either (in fact, sticking with the same socket has a decent chance of meaning issues getting to 5.0, and needing another whole set of motherboards just to get to 4.0).

Pragmatically, you need CPUs, motherboards, sockets (both sides), and cards/plug-ins to get to the new solutions. Getting all those typically disparate groups to sync up logistically takes time.

If you want to engage in Apple ARM A-series fantasies for the Mac Pro, the A-series isn't even particularly a full user of 3.0. There is zero momentum there for 4.0, let alone 5.0.
 
The problem is not the Intel CPUs. That is not so important right now.
The main concerns are about the presence of upgradeability, expansion, better thermal management, repairability, and of course the long-awaited availability of the new models.
 
That article is a bit overhyped. [...]

Not to mention... does it really matter? The members on this forum still hanging onto PCIe 2.0 Mac Pros are a pretty good demonstration that many GPUs and other devices aren't heavily hamstrung by the speeds of older PCIe revisions.
 
Don't think anybody has said Mac Pro on ARM.

No one has seen Apple's approach to an A-series desktop part. Considering the A-series team was built from the acquisition of P.A. Semi, which was known for the fastest custom PowerPC silicon, made specifically for the federal government, this is the last team I would underestimate.

If Apple did take a whack at an ARM-based A-series desktop chip, it would kick Intel in a way that AMD dreams of.
 
Not to mention... does it really matter? The members on this forum still hanging onto PCIe 2.0 Mac Pros are a pretty good demonstration that many GPUs and other devices aren't heavily hamstrung by the speeds of older PCIe revisions.

It matters. What you are pointing at is the demographic that is either dogmatically devoted to clinging to their old Mac Pro (for some, with 'religious' fervor) or for whom overall system performance doesn't really matter (for some, a 'religious' fervor primarily just over their GPU card properties).

Gaming, where often there is a 'lowest common denominator' target? Then sure. The spurt in x4 PCI-e v3.0 (x8 v2.0) may keep the x16 v2.0 (x8 v3.0) target baseline around even longer. But that isn't a market that Apple has traditionally chased with the Mac Pro.
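The lane-equivalence arithmetic above can be sketched out (a back-of-envelope only, using the nominal per-lane signaling rates and encoding overheads; real-world throughput is lower):

```python
# Approximate usable bandwidth per PCIe lane, by generation.
# v1/v2 use 8b/10b encoding (20% overhead); v3+ use 128b/130b.
GT_PER_LANE = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0, 5: 32.0}  # GT/s

def lane_gbps(gen):
    """Usable GB/s per lane after encoding overhead."""
    overhead = 8 / 10 if gen <= 2 else 128 / 130
    return GT_PER_LANE[gen] * overhead / 8  # effective Gbit/s -> GB/s

def link_gbps(gen, lanes):
    return lane_gbps(gen) * lanes

# The lane-equivalence claims above:
print(f"x4 v3.0:  {link_gbps(3, 4):.2f} GB/s")   # ~3.94 GB/s
print(f"x8 v2.0:  {link_gbps(2, 8):.2f} GB/s")   # ~4.00 GB/s
print(f"x16 v2.0: {link_gbps(2, 16):.2f} GB/s")  # ~8.00 GB/s
print(f"x8 v3.0:  {link_gbps(3, 8):.2f} GB/s")   # ~7.88 GB/s
```

Each generation roughly doubles the per-lane rate, which is why half the lanes on the next revision lands at about the same throughput.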

The folks in 2015, 2016, 2017, 2018, and possibly also 2019 who had real business requirements for PCI-e 3.0 x8-16 bandwidth for their cards, and had the budget, would have left. Doubtful the forum will actually measure those folks in any reasonably accurate way. [Apple probably has metrics about ownership via upgrade pulls... if they are bothering to look.]

Likewise, folks who needed the 2015-2018 updates in CPU opcode coverage in an overall system context, and had the budget... many of those folks probably left also, and the forum won't measure them very well.

On another dimension, if Apple's intention is to go into another 3-5 year Rip van Winkle slumber after updating the Mac Pro, it also would matter more. Similarly, if Apple put the old Mac Pro back into production (magically getting obsolete chip supply from somewhere) and was charging $2,100 - 2,800 for them, I'd bet there would be more than a few unhappy people. For some, the 'satisfaction' of their cMP performance is heavily driven by the fact that it is paid for (or was priced in the xMac price range 3-4 years ago when they picked it up).

If Apple came out with a $3,000+ system with PCI-e v3.0 and then proceeded to try to sell it in 2021, 2022, and 2023, it would increasingly matter. In 2019, not much.
No one has seen Apple's approach to an A-series desktop part.

There is little to no indication that Apple is even doing a desktop part at all. But bringing up the notion that they are doing a desktop part drives the fantasy.

2012 (there are 'walk back from inhaling too much hype' parts in the article too):

"... a report that surfaced on Monday from Bloomberg, which indicated that Apple's engineers are confident that the company's A-series custom chip designs will one day be powerful enough to run the company's desktop and laptop machines. ARM-based silicon in Apple devices is currently limited to iOS devices. ..."
https://appleinsider.com/articles/1...ble-but-apple-unlikely-to-switch-anytime-soon

Bloomberg again in 2018:
"... Apple’s decision to switch away from Intel in PC’s ... That would indicate Apple will start the transition with laptops before moving the designs into more demanding desktop models. Apple has to walk the fine line of moving away from Intel chips without sacrificing the speed and capabilities of its Macs. ..."

Macrumors in 2018
"... The question of custom Apple-designed CPUs destined for notebooks and desktop systems seems less a question of capability, and one more focused on will and perceived market advantage. ..."
https://www.macrumors.com/2018/06/01/apple-oregon-workforce-desktop-cpu-ambitions/


Considering the A-series team was built from the acquisition of P.A. Semi, which was known for the fastest custom PowerPC silicon, made specifically for the federal government, this is the last team I would underestimate.

Doing one chip for one customer doesn't mean you can do three substantially different designs for 3-5 major customers. The question is not doing "a custom" chip. It is doing several concurrently. The iOS product line mainly consists of "hand me down" chips, not multiple targeted versions. Apple has no substantive track record here at all. (The Watch is a mash-up that looks to be increasingly "hand me down".)


If Apple did take a whack at an ARM-based A-series desktop chip, it would kick Intel in a way that AMD dreams of.

They could blow a lot of money on some "bragging" hot rod, but would it make any sense business-wise? That is what is grossly missing from all of these "desktop ARM" articles: any real solid analysis.
 
What do you do with all of these cards, and where are you offloading your old 1080 Tis?!
Pulled a quad set of Titan X (Maxwell) out to pasture today (actually, into a storage cabinet) and moved the old 1080 Tis in.

add-both.jpg

add-new.JPG
 

Looks awesome! You don't have any fans attached to the CPU heatsinks?
There is an air baffle that channels the case fan air flow through the CPUs, memory and slots - the baffle was not in the picture. The eight fans are at the bottom of the picture (not shown).

System is an HPE ProLiant ML350 Gen9. Its power supply is 3200 watts (four 800W supplies with N+1 redundancy).
 
Last edited:
There is an air baffle that channels the case fan air flow through the CPUs, memory and slots - the baffle was not in the picture. The eight fans are at the bottom of the picture (not shown).

System is an HPE ProLiant ML350 Gen9. Its power supply is 3200 watts (four 800W supplies with N+1 redundancy).
Wow, I'd love to have your sponsors.
Meanwhile I'm proud of building a 16-core DIY with AMD TR + 2 RTX 2080s...
 
Next SIRI or Alexa more likely if you know a bit about Aiden's work.
I don’t have a clue about Aiden’s work. It was a rather poor attempt at humour.

To be honest I too am curious to know what he’s using such powerful hardware for.
 
What are you going to do with the old cards?
Some will end up upgrading systems with lesser cards - the rest will be eWaste. They are 8+6 dual slot cards, so many systems couldn't support them.


What are you rendering with that Aiden ?
Machine learning and other AI work on big data. (My standard disk purchases have been several 96TB arrays at a time, but that will soon be 144TB or 168TB now that 12TB and 14TB disks are entering the mainstream.)
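For reference, the capacity math is consistent with 12-bay arrays (an assumption on my part; the bay count isn't stated in the post):

```python
# Hypothetical sketch of the array-capacity arithmetic above,
# assuming 12-bay enclosures (96 TB / 8 TB disks = 12 bays).
BAYS = 12

def array_tb(disk_tb, bays=BAYS):
    """Raw array capacity in TB for a given disk size."""
    return disk_tb * bays

print(array_tb(8))   # 96 TB  (the standard purchase so far)
print(array_tb(12))  # 144 TB (with 12 TB disks)
print(array_tb(14))  # 168 TB (with 14 TB disks)
```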

I've ordered NVLink bridges so that the four Quadro RTX cards can be used as a pair of virtual CUDA GPUs with 48 GiB VRAM, 9216 CUDA cores, and 1152 Tensor cores.
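A sketch of the pooled-pair arithmetic: the quoted CUDA-core count matches two Quadro RTX 6000s bridged per pair (the exact model is my assumption, not stated in the post):

```python
# Per-card specs, assuming Quadro RTX 6000
# (24 GiB VRAM, 4608 CUDA cores, 576 Tensor cores each).
CARD = {"vram_gib": 24, "cuda": 4608, "tensor": 576}

def nvlink_pair(card):
    """Resources of two NVLink-bridged cards pooled into one virtual GPU."""
    return {k: 2 * v for k, v in card.items()}

pair = nvlink_pair(CARD)
print(pair)  # {'vram_gib': 48, 'cuda': 9216, 'tensor': 1152}
```

Four cards with two bridges gives two such virtual GPUs, i.e. the "pair" described above.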
what a freakin beast, unfortunately the MacPro will never look like this!
The GPUs from HPE don't even have fans. They look like "blower cards", but no blower. The front (away from the bulkhead) is open - the entire card is a flow-through heat sink, and the case fans are strong enough to push the air across the GPU and out the back. (There isn't a lot of work to make them quiet ;) .)

Unfortunately, HPE isn't selling the newest Quadro RTX cards, so I'm buying blower cards. The case fans are probably making the blowers spin even if the card has turned the blowers off.
the MacPro will never look like this!
But, if history is our guide, it will be pretty - and pretty useless.
 
Some will end up upgrading systems with lesser cards - the rest will be eWaste. They are 8+6 dual slot cards, so many systems couldn't support them.



Machine learning and other AI work on big data. (My standard disk purchases have been several 96TB arrays at a time, but that will soon be 144TB or 168TB now that 12TB and 14TB disks are entering the mainstream.)

I've ordered NVLink bridges so that the four Quadro RTX cards can be used as a pair of virtual CUDA GPUs with 48 GiB VRAM, 9216 CUDA cores, and 1152 Tensor cores.
The GPUs from HPE don't even have fans. They look like "blower cards", but no blower. The front (away from the bulkhead) is open - the entire card is a flow-through heat sink, and the case fans are strong enough to push the air across the GPU and out the back. (There isn't a lot of work to make them quiet ;) .)

Unfortunately, HPE isn't selling the newest Quadro RTX cards, so I'm buying blower cards. The case fans are probably making the blowers spin even if the card has turned the blowers off.
But, if history is our guide, it will be pretty - and pretty useless.

You guys can't sell off old cards? Seems like a waste of money and useful hardware to send them to e-waste.
 
You guys can't sell off old cards? Seems like a waste of money and useful hardware to send them to e-waste.
Corporate capital expense regulations make it impossible for me to eBay the surplus items. I believe that our IT eWaste processing does feed some stuff into the "refurbished" market (those "server pulls" that are supplying old CPUs for the cMPs). Whether they'd be aware that a Titan X had some value or not is a question.

Spinners and SSDs, however, are always shredded.
 
It matters.
Unless there's a reason that next-gen PCIe would strongly influence the "design" direction of the next MP, I still don't see how it matters.

I.e., would PCIe 4 or 5 change whether it has zero PCIe slots, 1 PCIe slot or multiple PCIe slots? Would it change whether it has serviceable drive bays? Would it change the form-factor of the computer? If it doesn't affect those things, then I'm not seeing how it matters.

The actual "real life" user base for the Mac Pro (i.e. not MacRumors) does not care whether it's PCIe 3, 4 or 5.
 
Unless there's a reason that next-gen PCIe would strongly influence the "design" direction of the next MP, I still don't see how it matters.

I.e., would PCIe 4 or 5 change whether it has zero PCIe slots, 1 PCIe slot or multiple PCIe slots? Would it change whether it has serviceable drive bays? Would it change the form-factor of the computer? If it doesn't affect those things, then I'm not seeing how it matters.

The actual "real life" user base for the Mac Pro (i.e. not MacRumors) does not care whether it's PCIe 3, 4 or 5.
Good points. Most of the "real life" users just want a couple of double-wide PCIe slots and a power supply with four 8-pin GPU power leads. (And some simple dongles to map dual 8-pin to 8+6 and dual 6.)
 