Care to elaborate?

From where I am standing, I can only see prices going up and the products barely getting better for what my money buys.

Compare cars from 2006 to current ones, compare their prices, and you might understand how progress is supposed to work.

That was exactly the situation in the 2003-2005 timeframe. IBM failed to deliver the performance they promised with the PowerPC architecture, so Apple looked elsewhere.

Now instead of looking at other solutions, they might as well vertically integrate, as they have done successfully with their iOS devices, and save money in the long run. Why spend hundreds of dollars per processor when you can build them to spec yourself and include features for your own vision of computing?

In a sense that has already started happening with the T2 coprocessor being a derivative of the A10.
 
Now I remember: back when many people and bloggers anticipated a bigger Mac mini, only a blatant leaker we know as "DNG" predicted, about a year ago, a more powerful Mac mini in the same pizza-box form factor with a space gray option, and he accounted for almost every detail (those quick with forum search, look for DarkNetGuy or DNG).

So the guy who said as recently as a few months ago it was going to have AMD processors? https://forums.macrumors.com/threads/waiting-for-mac-pro-7-1.1975126/page-372#post-26378139

Yeah, makes me think it's bull.
 
Gotta wonder how eager Apple would be to have AMD anything in their systems, given Papermaster is in a lead role there nowadays.
 
Gotta wonder how eager Apple would be to have AMD anything in their systems, given Papermaster is in a lead role there nowadays.

Apple brought in Papermaster and put him in charge of iPhone hardware. The phone. He helped build several successful semiconductor (CPU) products at IBM, and Apple, to some extent, moved him out of semiconductors. (If you look at the following article, Mansfield was technical lead on the SoC and Papermaster was moved up the ecosystem into supervisory duties.)

Just in time to be the fall guy for Antennagate too.

That move did foretell that Apple was going to put great resource emphasis on differentiating their iOS products on the SoC inside. However, the phone consists of far more than just a custom SoC.

It's a similar story with the Mac Pro product. The Mac Pro is more than just a CPU container. It is also more than just a GPU container. Is some xPU chip the end-all, be-all definer of the whole product? It isn't. Neither is it simply a container in which to fill "stuff". It is the balance of all of those that will make a good, successful Mac Pro, or not.

Is Apple extremely happy with AMD's timeliness in getting better-performing (perf/watt) GPUs out the door and into their hands? Probably not. Was that all Papermaster's fault? No. Is Apple so mad about AMD GPUs that they want to "blow up" the relationship? Probably not.

Is Apple extremely happy with the CPU progress at AMD? Probably not extremely happy. AMD CPUs are far more viable now. They are still a bit of a miss in the mobile space of the Mac laptop product line up (more so as you go down to the thinnest, lightest models). That happens to be the vast majority of Macs, so AMD still has work to do there. AMD's weighting has been on the higher-power, more profitable CPUs; frankly a sane move by them to keep the lights on and pay the bills, but that isn't Apple's biggest 'need'.

For the iMac (and iMac Pro) and Mac Pro space, AMD has some competitive offerings Apple can't offhand dismiss. They aren't necessarily far and away the best options, but Apple should be happy that they can use AMD as a design "bake off" competitor against Intel, at the very least to get more leverage on Intel in the pricing and R&D focus aspects of that context. Apple should also be happy that AMD is making enough progress that Apple doesn't "have to" shift the Mac line up over to some A-series derivative. It isn't necessary, because Intel and AMD aren't both screwing up at the same time. Apple may need to switch horses in a year or so, but they do not need to get off of the x86 ecosystem.

So there is no reason for Apple to be concerned about Papermaster's role at AMD in any significant way at all.
 
I'd bet we'll see an Apple-designed processor in Macs beyond the T chips before we see an AMD one. Moving to AMD is a lateral move that doesn't come with as many benefits for Apple. They've been burned repeatedly by their partners who don't move in the direction they want (from the PowerPC to Intel to AMD themselves with ever-hotter GPUs.)

I did think James Thomson and Dan Moren had an interesting point that while everyone has been (quite understandably) assuming Apple would shift their lower-end to ARM first, there's a certain amount of sense in switching their pro hardware first instead. People's memories are short but I remember the PowerMac G5s that were outclassed significantly in less than a year by the Mac Pro and were obsolete software-wise in less than four years. That timeframe seems even worse in today's landscape where computers have longer operational lifespans in a lot of ways.

It's in the higher-end performance space that Apple's processors are (outwardly) untested, but if that's the future it makes sense to try and push hard and not leave people with legacy machines in the new era.*

*Of course, Apple would probably still sell a Xeon SKU for compatibility for a period afterwards, the same way they kept native OS 9 machines around even when the G5s came out.
 
That was exactly the situation in the 2003-2005 timeframe. IBM failed to deliver the performance they promised with the PowerPC architecture, so Apple looked elsewhere.

Now instead of looking at other solutions they might as well vertical integrate it, as they have done successfully with their iOS devices and save money in the long run. Why spend hundreds of dollars per processor when you can build them to spec yourself and include features for your own vision of computing.

In a sense that has already started happening with the T2 coprocessor being a derivative of the A10.

Only back then, Macs had been depending on the OS to sell units for ages, and eventually Apple was in desperate need to catch up with the competition regarding CPU speed.

Today there is no single CPU manufacturer offering a vastly superior solution, so why force a change?

The iOS devices don't serve as an example either; not only are they not fit for demanding tasks, Apple is still charging a premium for any of the minuscule steps forward those things make.
It's also naive to believe Apple can just go and start making top-of-the-line chips themselves, and even more naive to think savings would be passed on to customers.

Finally, Apple's vision of computing is i-based. It's not much of a vision, it's not computing, and it's hit the ceiling already.
 
I'm seriously starting to feel like the second coming of Christ will happen before we get a new Mac Pro.... Apple is seriously over-complicating such a simple task. All we want is a beautiful tower PC that runs macOS! It's not that big of a deal!
 
...
...
I did think James Thomson and Dan Moren had an interesting point that while everyone has been (quite understandably) assuming Apple would shift their lower-end to ARM first, there's a certain amount of sense in switching their pro hardware first instead.

Those theories are far more click bait than grounded in reality.

Moren's mostly arm-flapping justification for ARM in the Mac Pro being the primary target is wrapped up with a bow in this rhetorical question.

"... And when has Apple ever been satisfied with the status quo? ..."

Err... a whole bunch of times they have.

  • 2009-2013: no major changes to Mac Pro baseline architecture. Status quo ran for 4 years there.
  • 2012-2016: non-Retina MacBook Pro. Status quo ran for 4 years there too.
  • 2014-2018: Mac mini in blissful status quo for another 4-year cycle.
  • 2013-2019: Mac Pro in Rip van Winkle-like status quo for 6 years.
  • 2015-2019+: iPod touch, at least a 4-year cycle of status quo.
  • 2013-2018: Apple Time Capsule + AirPort router, 5 years of status quo.
  • 2015-2018: MBA, minor speed bumps only, 3 years of status quo.
  • 2015-2019+: entry-level iMac (with an MBA-like processor and non-Retina screen), 4 years of status quo (one minor speed bump).
  • Thunderbolt Display docking station: zero updates across the entire Thunderbolt v1 lifetime before it was dropped. (De facto, the LG UltraFines were probably part of some effort to break the status quo, but Apple flaked on that and pragmatically brought nothing to market themselves.)


Most of this is hocus-pocus click bait because lots of Mac Pro fans are almost desperate for some positive news. So any story along the lines of "the Mac Pro is going to be super duper important to Apple again" is bound to generate a decent number of clicks. The theory that the Mac Pro is the most important system in Apple's line up is a fantasy. Pure and simple.

In a Mac Pro context, how can anyone seriously talk about Apple not being satisfied with the status quo? Apple is 100% in Rip van Winkle mode right now.


People's memories are short but I remember the PowerMac G5s that were outclassed significantly in less than a year by the Mac Pro and were obsolete software-wise in less than four years. That timeframe seems even worse in today's landscape where computers have longer operational lifespans in a lot of ways.

The A-series isn't even close to parity with the Intel W-class options, let alone passing them up with some hand-waved, substantive, dominating edge on anything but short-duration, single-core drag-racing sideshows.

The primary reason Apple is sales-pitching "faster than a mobile laptop PC" is to sell more iOS devices. Stop right there: that's the primary objective. Apple has an OS for its A-series chips, and it sells more instances of that OS every year than the size of the entire macOS installed base (never mind the smallish fraction of that user base that upgrades every year). More performance in the A-series will sell more iOS instances.

Apple doesn't need some 'new' operating system running on the A-series to be successful.


It's in the higher-end performance space that Apple's processors are (outwardly) untested, but if that's the future it makes sense to try and push hard and not leave people with legacy machines in the new era.*

When has Apple ever had a problem leaving folks on obsolete systems (and interfaces) behind? Unmodified Mac Pro 2009 getting macOS upgrades? Nope. Is there a variance in the Vintage and Obsolete rules for more expensive Macs versus less expensive ones? Nope.


*Of course, Apple would probably still sell a Xeon SKU for compatibility for a period afterwards, the same way they kept native OS 9 machines around even when the G5s came out.

Keeping old systems on the books at roughly the same prices and on sale for a long time. But isn't that following the status quo?

That was driven more by the software being immature, not because the software was ready to go but merely compiled for a new platform. In the last platform transition Apple did a "big bang" and promised to flip the entire Mac line up in a single year. (68K -> PPC was pretty much the same timeline; definitely less than 2 years, though incrementally more than a strict 12 months.) That was all driven by the fact that what they were moving to was generally, across the board, faster/better.

Apple dragging out a rollout over an extended period of time would be a tacit admission that it really isn't this time; that what users are being pulled onto is a less comprehensive architecture/platform. If it worked out better than the Xeon W, it probably would work out better than the desktop i5-i9 systems too (the i9 especially, since it is largely the same die implementation as the W).

Apple could split the desktop and laptop line ups of macOS into two camps. It probably doesn't work as well as most of the folks cheerleading this kind of notion think it will. To date, Windows on ARM hasn't really worked so well (and it also doesn't have access to the A-series chips). Windows, though, is much bigger: the Mac is about 7-10% the size of what Windows covers. So if Windows splits out 10-15% of its 90%, the x86 core market is still 75-80%, and that new branch alone is as big as or bigger than the whole Mac subset. Apple taking their 7% and chopping it up 75/25 only fragments further something that is already relatively small. That is likely going to have a number of negative side effects. There could be some positive ones too, but it isn't all going to be positive.
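A back-of-the-envelope sketch of that share arithmetic, using the rough percentages from the paragraph above (illustrative figures, not measured market data; the 12.5% ARM split is just a midpoint of the quoted 10-15% range):

```python
# Rough installed-base shares from the post (illustrative percentages, not data).
windows = 90.0   # approximate Windows share of the overall market, in percent
mac = 7.0        # approximate macOS share

# If Windows split out ~12.5% of its base to ARM, the x86 core is still 75-80%
# of the whole market, and the ARM branch alone rivals the entire Mac base.
windows_arm = windows * 0.125
windows_x86 = windows - windows_arm
print(windows_x86, windows_arm)   # 78.75 11.25

# Apple doing a 75/25 split on its ~7% fragments an already small platform.
mac_x86 = mac * 0.75
mac_arm = mac * 0.25
print(mac_x86, mac_arm)           # 5.25 1.75
```

Either way the numbers are cut, the hypothetical ARM-Mac slice ends up far smaller than any of the other camps.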


Over the last decade Apple has split iOS multiple times to cover new uses that spanned the A-series: iOS -> tvOS, watchOS. Another easy fork could be an iPad/iBook OS. The AnnX processors got a minor variant with slightly different "scaled up" silicon, while the other iOS 'forks' largely got "hand me down" processors and designs off the iPhone bow wave.


The Mac line up cannot survive on "hand me down" processors from iPads. (Apple could cherry-pick a few, but it generally won't work.) Similarly, talk of Macs getting their own processor is very often accompanied by woefully incomplete, or just plain hand-waving, analysis of how that works economically given the relatively low volumes and fragmented market the Mac currently covers.
 
Only back then, Macs had been depending on the OS to sell units for ages, and eventually Apple was in desperate need to catch up with the competition regarding CPU speed.

Today there is no single CPU manufacturer offering a vastly superior solution, so why force a change?

The iOS devices don't serve as an example either; not only are they not fit for demanding tasks, Apple is still charging a premium for any of the minuscule steps forward those things make.
It's also naive to believe Apple can just go and start making top-of-the-line chips themselves, and even more naive to think savings would be passed on to customers.

Finally, Apple's vision of computing is i-based. It's not much of a vision, it's not computing, and it's hit the ceiling already.

I have no desire to continue this discussion but nowhere did I say anything about savings being passed on to consumers.

The A12X Bionic is impressive with its small ~120 mm² die (10 billion transistors). Hopefully Andrei from AnandTech will do a deep dive soonish.
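For a rough sense of scale, those two figures imply a transistor density in the low eighties of millions of transistors per mm² (using the rounded numbers from the post; published die measurements differ slightly):

```python
transistors = 10e9        # ~10 billion transistors (figure from the post)
die_area_mm2 = 120.0      # ~120 mm^2 die (figure from the post)

# Density in millions of transistors per square millimetre.
mtr_per_mm2 = transistors / die_area_mm2 / 1e6
print(round(mtr_per_mm2, 1))   # 83.3
```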
 
The A-series isn't even close to parity with the Intel W-class options, let alone passing them up with some hand-waved, substantive, dominating edge on anything but short-duration, single-core drag-racing sideshows.

Have you seen the A series running at equivalent wattage with non-passive cooling systems?
 
The Mac line up cannot survive on "hand me down" processors from iPads. (Apple could cherry-pick a few, but it generally won't work.) Similarly, talk of Macs getting their own processor is very often accompanied by woefully incomplete, or just plain hand-waving, analysis of how that works economically given the relatively low volumes and fragmented market the Mac currently covers.

I'm not suggesting this as a critique of your points, maybe to further augment them, but I can't help noticing a dissonance between some speculative notions that:
  • Apple sees no future in macOS / desktop / pro (aside from laptops)
  • Apple is prepared to sink a metric fecal tonne of resources into porting vestigial (for whatever reasons) product lines to a new architecture.
I know it is counter to Apple's modus operandi, but in lieu of the (maybe only perceived) questionable attention given to their macOS product lines, perhaps publicized roadmaps would help maintain confidence among the professionals who seek future macOS offerings. As the calendar ticks over to 2019, I can't imagine that all the speculation and rationalization that seem to power megathreads like this one will satiate office managers enough to delay purchasing decisions for 'another year', especially after a notoriously less-than-performant previous offering.
 
Didn't Apple move away from "custom" CPUs and to Intel because they were unable to compete? Apple moving to ARM sounds like a move back to where they came from. Designing and producing competitive CPUs is hard; we've seen the industry consolidate to a small number of players.
Apple had IBM and Motorola (the AIM alliance) to design and manufacture the PowerPC. They couldn't keep up with Intel in design or manufacturing. That led to the wind-tunnel 800 MHz dual G4 Mac to try and compete, and the unreliable, leaky, liquid-cooled, dual 2.5 GHz G5 PowerPC embarrassment (the original cheese grater) after Jobs promised 3.0 GHz G5s but couldn't deliver. I got rid of my dual G5 after it leaked, but I still have the old dual G4 in my antiquated Apple collection, along with my 8-slot (fully loaded) Apple II.

ARM for MacBooks, and maybe low-end iMacs, is coming. But Apple has gone down the "Apple CPU" path for top-of-the-line Macs before. Been there, done that. They couldn't keep up with the design or manufacturability of high-end CPUs. They remember. It's not going to happen again.
 
Apple brought in Papermaster and put him in charge of iPhone hardware. The phone. He helped build several successful semiconductor (CPU) products at IBM, and Apple, to some extent, moved him out of semiconductors. (If you look at the following article, Mansfield was technical lead on the SoC and Papermaster was moved up the ecosystem into supervisory duties.)

Just in time to be the fall guy for Antennagate too.

That move did foretell that Apple was going to put great resource emphasis on differentiating their iOS products on the SoC inside. However, the phone consists of far more than just a custom SoC.

It's a similar story with the Mac Pro product. The Mac Pro is more than just a CPU container. It is also more than just a GPU container. Is some xPU chip the end-all, be-all definer of the whole product? It isn't. Neither is it simply a container in which to fill "stuff". It is the balance of all of those that will make a good, successful Mac Pro, or not.

Is Apple extremely happy with AMD's timeliness in getting better-performing (perf/watt) GPUs out the door and into their hands? Probably not. Was that all Papermaster's fault? No. Is Apple so mad about AMD GPUs that they want to "blow up" the relationship? Probably not.

Is Apple extremely happy with the CPU progress at AMD? Probably not extremely happy. AMD CPUs are far more viable now. They are still a bit of a miss in the mobile space of the Mac laptop product line up (more so as you go down to the thinnest, lightest models). That happens to be the vast majority of Macs, so AMD still has work to do there. AMD's weighting has been on the higher-power, more profitable CPUs; frankly a sane move by them to keep the lights on and pay the bills, but that isn't Apple's biggest 'need'.

For the iMac (and iMac Pro) and Mac Pro space, AMD has some competitive offerings Apple can't offhand dismiss. They aren't necessarily far and away the best options, but Apple should be happy that they can use AMD as a design "bake off" competitor against Intel, at the very least to get more leverage on Intel in the pricing and R&D focus aspects of that context. Apple should also be happy that AMD is making enough progress that Apple doesn't "have to" shift the Mac line up over to some A-series derivative. It isn't necessary, because Intel and AMD aren't both screwing up at the same time. Apple may need to switch horses in a year or so, but they do not need to get off of the x86 ecosystem.

So there is no reason for Apple to be concerned about Papermaster's role at AMD in any significant way at all.
One reason for Apple to pick AMD over Intel: semi-custom designs, a very competitive roadmap, and reliability compared to Intel.

And the latest speculation says that AMD will have more capacity available to them at TSMC on 7 nm than they had from GloFo under the WSA. Rightly so, because Rome is quite a hell of a design.
 
One reason for Apple to pick AMD over Intel: Semi-Custom designs, and very competitive roadmap, and - reliable, compared to Intel.

I hear enough problems from 3D animators about AMD CPUs being radically slower in some animation / 3D packages, because developers on Windows won't / can't optimise for a theoretically similar CPU under the same OS, that I can't see how macOS using a non-Intel / custom CPU is supposed to make things better.

"but developers could..."

Developers won't.

The Mac being just another flavour of standard PC (with a pretty, easy to use standard UNIX on top) was the best thing that ever happened to it - not the iMac, not the titanium / aluminium powerbook, not the G3 processor, not any particular innovation Apple has come out with.

I just spent the weekend at an AR & VR workshop - all the AR was done with Eyejack (authoring on mac, deploying on iOS / Android), for which ARKit is just a dumb pipe to hardware. All the VR was on PCs, of course. Laptops not significantly bigger than a macbook pro from a couple of years ago can comfortably drive a Vive Pro or Rift, no external GPU needed - you can do VR development on a PC mbp 15" sized device, with all your tools in VR to create animatable 3d models, motion capture with Vive Pro trackers etc. All that because of Nvidia graphics, pure and simple. And, correct me if I'm wrong, but there's nothing even remotely in the near future to put Vega 64 plus-level GPUs into Apple laptops, is there?

Ironically the only Apple product to have a place in this, is the iPhone X - whose dot-spraying camera can be used for facial capture streamed over wifi back to Unity, to puppet the face of a 3d character (which can also be the person being motion captured) in real time.
 
Yeah, Apple fought a very expensive battle with IBM to be able to employ him in the first place, and then publicly burned him at the stake and threw the charred remains under the bus for the iPhone 4 antenna design. Doesn't seem like a recipe for the two to work well together.

AMD (and Papermaster) isn't solely working for Apple. If AMD does well then several other system vendors in addition to Apple benefit (and AMD is more profitable). They are primarily a component supplier, so better parts make for more competitive and better systems. Being at AMD was a better match for his skill set. There is no rational reason at all for Apple to be 'mad' at someone going to a better-fit job.

Papermaster's recruitment wasn't all that expensive for the position he was in. Yeah, they paid some lawyers, but IBM's case was complete bull: it was entirely unenforceable in California, and IBM has a large R&D lab in San Jose, so they knew that before they started. Someone at IBM decided to waste money (rattle the saber to pause the bleed elsewhere). IBM blew more money on that match than Apple did. IBM and Apple are close partners now in enterprise sales; a win/win for both companies, and they do business (because they are businesses that exist to make money, not some dating relationship).

Apple screwing up the hiring of an interim retail chief led to them paying the current one mega: tens of millions to 'buy' her out of Burberry. The Papermaster thing was chump change in comparison to those two hires.
 
I hear enough problems from 3D animators about AMD CPUs being radically slower in some animation / 3D packages, because developers on Windows won't / can't optimise for a theoretically similar CPU under the same OS, that I can't see how macOS using a non-Intel / custom CPU is supposed to make things better.

"but developers could..."

Developers won't.

The Mac being just another flavour of standard PC (with a pretty, easy to use standard UNIX on top) was the best thing that ever happened to it - not the iMac, not the titanium / aluminium powerbook, not the G3 processor, not any particular innovation Apple has come out with.

I just spent the weekend at an AR & VR workshop - all the AR was done with Eyejack (authoring on mac, deploying on iOS / Android), for which ARKit is just a dumb pipe to hardware. All the VR was on PCs, of course. Laptops not significantly bigger than a macbook pro from a couple of years ago can comfortably drive a Vive Pro or Rift, no external GPU needed - you can do VR development on a PC mbp 15" sized device, with all your tools in VR to create animatable 3d models, motion capture with Vive Pro trackers etc. All that because of Nvidia graphics, pure and simple. And, correct me if I'm wrong, but there's nothing even remotely in the near future to put Vega 64 plus-level GPUs into Apple laptops, is there?

Ironically the only Apple product to have a place in this, is the iPhone X - whose dot-spraying camera can be used for facial capture streamed over wifi back to Unity, to puppet the face of a 3d character (which can also be the person being motion captured) in real time.

MacBook Pros are getting Vega GPUs this month. What SKUs or pricing for that is, we don't yet know.
 
I hear enough problems from 3D animators about AMD CPUs being radically slower in some animation / 3D packages, because developers on Windows won't / can't optimise for a theoretically similar CPU under the same OS, that I can't see how macOS using a non-Intel / custom CPU is supposed to make things better.

"but developers could..."

Developers won't.
Yeah, like AMD has a different x86 instruction set than Intel.

AMD is slower in those border situations because Zen 1 lacks a bit in the front end and in pipeline width. Zen 2 will alleviate a lot of those bottlenecks. And we are talking about custom designs based on Zen 2.

And, correct me if I'm wrong, but there's nothing even remotely in the near future to put Vega 64 plus-level GPUs into Apple laptops, is there?
Who knows? There have been a lot of shifts in the graphics industry over the years. People say that Nvidia was always the best, which could not be further from the truth.
MacBook Pros are getting Vega GPUs this month. What SKUs or pricing for that is, we don't yet know.
We know the specs for Vega 16 and Vega 20.

Vega 16: 192 GB/s memory bandwidth, 1024 cores, 1185 MHz core clock, 35W TDP.
Vega 20: 192 GB/s memory bandwidth, 1280 cores, 1300 MHz core clock, 35W TDP.

In essence, Vega 20 is at the same performance level as the Nvidia Quadro P3000.
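For rough context, theoretical peak FP32 throughput can be derived from those specs with the standard cores × 2 FLOPs × clock formula (spec figures as quoted above; real-world performance depends on far more than peak TFLOPS):

```python
def peak_fp32_tflops(cores: int, clock_mhz: float) -> float:
    """Theoretical peak FP32: each shader core retires 2 FLOPs (one FMA) per cycle."""
    return cores * 2 * clock_mhz * 1e6 / 1e12

vega16 = peak_fp32_tflops(1024, 1185)
vega20 = peak_fp32_tflops(1280, 1300)
print(round(vega16, 2), round(vega20, 2))   # 2.43 3.33
```

So on paper Vega 20 has roughly a third more compute than Vega 16 in the same 35W envelope.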
 
MacBook Pros are getting Vega GPUs this month. What SKUs or pricing for that is, we don't yet know.

It's Vega 16 & Vega 20. The Vega 64 is roughly equivalent to a mobile Nvidia GPU (the 1080) when it comes to 3D / VR performance. Unless AMD is being very weird with how it spaces out model numbers, with no relation to performance, these mobile parts sit so far below the desktop ones that I suspect another round of hardware that has to sit this out.
 
I hear enough problems from 3D animators about AMD CPUs being radically slower in some animation / 3D packages, because developers on Windows won't / can't optimise for a theoretically similar CPU under the same OS, that I can't see how macOS using a non-Intel / custom CPU is supposed to make things better.

"but developers could..."

Developers won't.

Not really. The vast majority of applications on the Mac are developed with Xcode. The vast majority of applications developed on Windows are not (they use Visual Studio). Two completely different sets of compilers. Some folks do critical sections with Intel's compilers on Windows/Mac, but most developers don't. [There are some cases where the Intel compiler optimizer pushes special CPU-targeted code paths into worse performance on AMD CPUs; in a few of these cases that may be one of the root causes.] There are smaller, hand-crafted x86 assembler kernels, but relative to whole-application libraries they are corner cases.

Some of that is just the compilers and just how hand-tuned the system libraries are. If Apple and AMD were on top of their games, they would augment the system math libraries and weave AMD's compiler optimizations into Apple's branch/fork of LLVM (and Xcode).

If Apple switched over to using AMD in most Macs then the density of AMD systems would be much higher there, and animation developers would face different demographics. It is easy to 'kick the can' when it is some side '10%' off in the corner. However, if you decide that side '10%' of macOS in the corner is good business, and it's 70% full of AMD processors, then the business choice is to get on with the work.

It doesn't make sense for Apple to just pick one Mac product. If they switch, it would be a wholesale switch: drop optical drives on all Macs, put Thunderbolt on all Macs as quickly as they can, put Retina screen support on all Macs. That's how you get better market adoption with the Mac, given its relatively smaller market standing. Fragmenting an already relatively small Mac space only leads to problems.

The Mac being just another flavour of standard PC (with a pretty, easy to use standard UNIX on top) was the best thing that ever happened to it - not the iMac, not the titanium / aluminium powerbook, not the G3 processor, not any particular innovation Apple has come out with.

AMD-based systems are another flavor of standard PC, so jumping whole hog into AMD would work just fine. However, the UNIX part of the above happened before the x86 change. x86 Macs have never been plain BIOS-focused commodity systems (never standard race-to-the-bottom boxes). This also overstates the product-mix emphasis that Apple chose versus the Intel processors magically being the whole story. Apple bet heavy on laptops dominating (e.g. moves like the MBA) and won. They didn't have 6-10 box-with-slots designs boat-anchoring them down.


I just spent the weekend at an AR & VR workshop - all the AR was done with Eyejack (authoring on mac, deploying on iOS / Android), for which ARKit is just a dumb pipe to hardware.

So the augmented reality (AR) tech that is deployable to hundreds of millions of users is completely viable to develop on a Mac. Yeah, that's a 'horrible' position for Apple and the Mac to be in. *cough* The widest, most commonly deployed platform is best (e.g., the standard PC), yet somehow this is some kind of disadvantage?

All the VR was on PCs, of course. Laptops not significantly bigger than a macbook pro from a couple of years ago can comfortably drive a Vive Pro or Rift, no external GPU needed - you can do VR development on a PC mbp 15" sized device, with all your tools in VR to create animatable 3d models, motion capture with Vive Pro trackers etc.

What is the operational battery life of these? Is it better than standard PC laptops? Or worse? If riding with the larger herd is better, then which is the better subsegment to be in overall?

Which laptop subsegment is growing faster at volume: lightweight, longer-battery-life laptops, or short-battery-life desktop replacements?

All that because of Nvidia graphics, pure and simple. And, correct me if I'm wrong, but there's nothing even remotely in the near future to put Vega 64 plus-level GPUs into Apple laptops, is there?

Are these in Windows laptops of the same size and design objectives as the MBP? No (your earlier quote admits that they are in bigger, heavier laptops). Then it really isn't all because of Nvidia. The AMD Vega stuff is just pure misdirection.


Ironically the only Apple product to have a place in this, is the iPhone X - whose dot-spraying camera can be used for facial capture streamed over wifi back to Unity, to puppet the face of a 3d character (which can also be the person being motion captured) in real time.

Which puts Apple more than slightly ahead of the competition on AR. And the primary development system for iPhone X apps is? Macs.

Apple putting the same camera on the iPad Pro and cranking the GPU on the A12X is going to have what kind of impact on AR over the next year or so?


To bring this back to the Mac Pro: if Apple doesn't put an empty slot in the next Mac Pro, then they'd have a problem in the VR space. But of the two, AR is a substantially bigger space than VR. Even for AR it would still be better if there were an empty slot, but it isn't necessarily the end of the world if there isn't. Any of the "even remotely near future" Vega 64-class GPUs should fit in a next Mac Pro without any great difficulty. (There is a Vega 64 in the iMac Pro, so a Mac Pro with a wider thermal envelope shouldn't be a problem at all.)
 
Apple and NVidia had a huge falling out (probably over the self-destructing mobile GPUs), and Apple has coded AMD dependencies into their products since then... I know Final Cut performs much less well on NVidia, and I am suspicious that at least some parts of Metal also depend on AMD. Apple's aggressive "no NVidia" stance isn't changing anytime soon.

On the other hand, a Ryzen or Threadripper finding its way into a Mac seems more likely... Apple already works with AMD on graphics, and will have to weigh the benefits they get from Intel exclusivity (they probably get special pricing, and they may get some semi-custom SKUs - some graphics-heavy processors in the 13" MBP never seem to show up anywhere else, and the Xeons in the iMac Pro seem to be slightly tweaked) against the advantages of a second CPU source.

ARM Macs are a MUCH bigger change, and all the benefits seem to be in the lightest notebooks. A few manufacturers have designed ARM-based servers that use a huge number of cores to overcome Intel's per-core speed advantage. They've never taken off. Yes, Apple's A-series are better chips than publicly available ARM processors, but are they better enough? Remember that parallelism is HARD; developers don't like being faced with a ton of low-power cores...
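A rough way to see why "a ton of low-power cores" is a hard sell is Amdahl's law: whatever fraction of a workload stays serial caps the speedup, so many slow cores can lose to a few fast ones. A minimal sketch; the core counts, relative per-core speeds, and the 20% serial fraction are illustrative numbers, not measurements of any real chip:

```python
def amdahl_speedup(serial_fraction: float, n_cores: int,
                   per_core_speed: float = 1.0) -> float:
    """Amdahl's law: speedup over a single 1.0x core for a workload with a
    fixed serial fraction, scaled by the relative per-core speed."""
    parallel_fraction = 1.0 - serial_fraction
    return per_core_speed / (serial_fraction + parallel_fraction / n_cores)


# Workload that is 20% serial:
fast_few = amdahl_speedup(0.2, n_cores=4, per_core_speed=1.0)   # 2.5x
slow_many = amdahl_speedup(0.2, n_cores=16, per_core_speed=0.5)  # 2.0x
```

Here four full-speed cores beat sixteen half-speed ones; only as the serial fraction approaches zero does the core-count advantage win out.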
 
With Apple not using standard M.2 slots in any Mac to date, I'm losing faith that the Modular Mac Pro will be anything other than a Cube 3.0.

I don't think Apple truly learned much from the 6,1 debacle. I think to be a success the 7,1 would have to have at least two internal 3.5" SATA drive bays, at least one standard M.2 slot, and at least two x16 PCIe double-wide slots, with maybe some x1 slots in between.

I don't think it would really need more than four Thunderbolt 3 ports, but it would be nice to have a 1000W-1200W PSU.

Six or eight USB-A ports, and one or two 10Gb Ethernet ports.

That system really wouldn't have to be that big; maybe half the height of the cMP.

Those are my must-haves, but I know Apple doesn't exist to please me, so I may end up skipping the 7,1 until Apple stops trying to reinvent the wheel and profit at every turn of it.
 
With Apple not using standard M.2 slots in any Mac to date, I'm losing faith that the Modular Mac Pro will be anything other than a Cube 3.0.

I don't think Apple truly learned much from the 6,1 debacle. I think to be a success the 7,1 would have to have at least two internal 3.5" SATA drive bays, at least one standard M.2 slot, and at least two x16 PCIe double-wide slots, with maybe some x1 slots in between.

I don't think it would really need more than four Thunderbolt 3 ports, but it would be nice to have a 1000W-1200W PSU.

Six or eight USB-A ports, and one or two 10Gb Ethernet ports.

That system really wouldn't have to be that big; maybe half the height of the cMP.

Those are my must-haves, but I know Apple doesn't exist to please me, so I may end up skipping the 7,1 until Apple stops trying to reinvent the wheel and profit at every turn of it.
If it will be reliable, and will provide even limited upgradeability down the road, it may not be that bad a thing.
 

x86 is a dinosaur with a **** load of legacy crap no one has any use for. It is even holding technology back at this point in time.

RISC could easily replace CISC again. We are a long way from 2006.


Yeah..... About that

Let's review software development cycles, shall we?

Most software is on an 18-24 month design cycle.

Let's assume that every software house dumps their current Mac development process so they can restart at the same time they get their hands on an ARM Mac. (That won't happen, but I'm giving the best-case scenario; in reality, software houses will run an analysis to see if the uptick in sales would justify the development costs, and every last one of them will remember the hoops they had to jump through and factor in the possibility that Apple may sabotage them like they did with Carbon-64.)

Version 1 software is anywhere from 18 to 24 months out (delivery dates: late 2020 to late 2021). This will be a straight port with no new features. Have fun being a beta tester for every piece of software you "upgrade" to.

Version 2, the first truly native version with new features (delivery dates: mid 2021 to late 2023). All for a shrinking market.
 
Apple and NVidia had a huge falling out (probably over the self-destructing mobile GPUs), and Apple has coded AMD dependencies into their products since then... I know Final Cut performs much less well on NVidia, and I am suspicious that at least some parts of Metal also depend on AMD. Apple's aggressive "no NVidia" stance isn't changing anytime soon.

That did happen for Nvidia GPUs in 2008 MacBook Pros, but since then the following has also happened:
  • The 2011 MacBook Pro GPU failures that resulted in an extended warranty campaign were AMD GPUs.
  • The 2013 Mac Pro GPU failures that resulted in an extended warranty campaign were also AMD GPUs.
So if they are upset about Nvidia meltdowns from 10 years ago, wouldn't it follow that they'd also be upset about the two AMD meltdowns, both of which are more recent?
 
  • Like
Reactions: Synchro3