Just test runs? hmmm...

I can't talk about certain things, but I can repeat that Apple made a public statement that they will "begin production soon at the same Austin facility where Mac Pro has been made since 2013." I know the cool thing to do in the forums is to not believe Apple and to talk about December 23rd pre-order dates, but people should really listen to that statement: "soon." It was made several weeks ago, and a lot can happen in that time period. That's all I can say on the matter.

I’m not saying anything about final production, just that preproduction units have existed and have been circulating since WWDC.

Also, they can always start taking orders any time they want, even if the ship time is a few weeks or even a few months.
 

There is nothing wrong with preferring macOS to Windows but you're behaving just as badly as those "PC fanboys" when you slide off the road into the ditch with hyperbole like this. There are countless audio and video professionals who are using Windows-based workflows and they're just as serious about their work as you are.

For the most part, they're doing it on Windows workstations that are priced similarly to the Mac Pro, so for the rest of your rant I'm on board. But let's not pretend that Windows isn't a perfectly valid platform choice for serious professionals in 2019. It absolutely is. I'd wager those Windows-using professionals have better hardware support and a lot less anxiety about the long-term survival of their platform. They probably spend less of their time bickering about operating systems on web forums as well.
 


I can only speak from my experience. Post-production engineers visibly show their frustration if you have a Windows machine. They literally moan about it until the work gets done. They aren’t comfortable with it at all.

I’m a musician, and Windows is absolutely terrible for music. I have nothing against PCs; I was just reacting to that terrible video.

“WE ABSOLUTELY HATE APPLE PRODUCTS BUT WE ARE GOING TO SHAMELESSLY COPY THEIR ENTIRE DESIGN TO THE CORE AND GET RICH” 😂
 

You're not describing behavior that I would call "professional," but maybe our industries differ in that regard. Welcome to MacRumors, btw.
 
Working in Windows means I spend more time running into problems and more time troubleshooting them than on the Mac. The former is just part and parcel of the platform; the latter would be helped if I were as familiar with Windows as I am with the Mac, but it's still something I avoid. Having to go to the console to try to get an update to work is a level beyond any troubleshooting I've ever had to do on the Mac.*

At the end of the day, though, it's your work that matters, not really how you do it. When I've had to use Windows, I deal.

(Besides, at this point most of my software complaints are probably not with Apple or Microsoft but with Adobe, who have absolutely given up on providing a great experience thanks to the loss of Macromedia and subscription pricing. And no matter whether I'm on Mac or Windows, I will always have to deal with Adobe.)

---
Anyhow, I guess the leaked MBP image in 10.15.1 is a sign we could still be on for the October/early-November event (or at least that this stuff is coming soon-ish).
 

Thanks 😊

I’m not dissing those who have Windows setups. But I highly doubt that, given a choice between Mac and Windows, they would choose Windows. They have PCs because the configuration they have isn’t affordable in a Mac Pro, which I understand, but that’s not Apple’s fault. Apple still has to cover operating expenses and R&D costs and make a profit as well. If you want quality, I’m afraid you have to pay for it. There’s no point in blindly blaming something just because it’s expensive. It helps no one.

Trust me, it’s soooo annoying when engineers moan about having to work on a PC, but they do have a point. There are constant crashes in the middle of a project, hours and hours of work lost, which is simply not acceptable in a professional environment. None of these problems even remotely exists in macOS. Yeah, it can be a little buggy during an OS upgrade cycle, but nothing game-stopping. Anyway, you should never update your OS immediately after Apple releases it. Give it a month or two.
 
Working in Windows means I spend more time running into problems and more time troubleshooting them than on the Mac. The former is just part and parcel of the platform; the latter would be helped if I were as familiar with Windows as I am with the Mac, but it's still something I avoid. Having to go to the console to try to get an update to work is a level beyond any troubleshooting I've ever had to do on the Mac.*

At the end of the day, though, it's your work that matters, not really how you do it. When I've had to use Windows, I deal.

(Besides, at this point most of my software complaints are probably not with Apple or Microsoft but with Adobe, who have absolutely given up on providing a great experience thanks to the loss of Macromedia and subscription pricing. And no matter whether I'm on Mac or Windows, I will always have to deal with Adobe.)

---
Anyhow, I guess the leaked MBP image in 10.15.1 is a sign we could still be on for the October/early-November event (or at least that this stuff is coming soon-ish).

I do understand what you’re saying. If you think Adobe is terrible, I urge you to deal with Avid. Pro Tools has caused me so many headaches I don’t even know where to start. Like you said, these problems are unavoidable, and only the software developers are to blame.

I think PCs are ideal for programming and gaming (which I have no clue about), and Macs are ideal for music mixing, mastering, surround mixing, and animation.
 
At least the new Mac Pro would be a good Linux workstation for those into OpenCL, SYCL, ROCm... but not for those demanding CUDA; those need to wait a bit more.
 
The job postings at Apple are very interesting: they are hiring a lot of people for ML/AI, requiring TensorFlow, PyTorch, CUDA, and OpenCL skills. It means either Apple's ML/Siri backend servers run on an Nvidia/Linux/ROCm platform (no job posting explicitly names ROCm, just CUDA and OpenCL) or they have Nvidia CUDA/OpenCL prototypes on macOS. Two of these job postings were immediately removed (but are still available in Google's search cache)...

I've heard high-level rumours of an Nvidia comeback to the Mac, specifically Volta/Turing GPUs.

Also, given OpenCL's importance in the non-CUDA ML/HPC frameworks and libraries, an OpenCL comeback would be another win for the mainstream versus the evil "illuminated" out-of-band attempts from Apple to impose its proprietary focus.

Apple's strategy failure in AI/ML is only comparable to Microsoft's failure in mobile phone OSes. Most (90%+) research papers on AI/ML use TensorFlow/PyTorch/MXNet with CUDA, OpenCL, or TPUs; meanwhile, almost no research relies on ROCm, and research done with Core ML and related frameworks is unicorn-rare or else seems sponsored by Apple (and sometimes seems to have been done in TensorFlow and rebuilt in Core ML/Metal).

This is the very sad state of ML/AI at Apple, the most capitalized technology corporation (but among the corporations with the least product diversity; only coal and oil companies have simpler product lines in the Fortune 500).
 

AMD's strategy failure in ARM is only comparable to Microsoft's failure in mobile phone OSes. Most (88%) of processors for mobile devices and PCs are ARM-based; meanwhile, almost no mobile devices use x86-based processors, and mobile devices that do are unicorn-rare or sponsored by mobile device companies (and sometimes seem to have been designed on ARM and redone in x86).

This is the very sad state of ARM at AMD, a highly capitalized technology corporation (but among the corporations with the least product diversity; with only two product categories at AMD [CPU and GPU], only coal and oil companies have simpler product lines in the Fortune 500).


...wow, it's easy to dig on companies for not supporting a technology. Amazingly, not every tech company supports every technology.
 
I've heard high-level rumours

Is it from your usual darknet sources? I ask because they were consistently inaccurate in the past.
IMO the only chance to see an Nvidia GPU on the Mac again is if they use Metal instead of CUDA. Both companies have very good reasons to support only their own proprietary standard, so CUDA on macOS is very unlikely to happen, especially now that Apple has started to push hard on Metal development.
 
The job postings at Apple are very interesting: they are hiring a lot of people for ML/AI, requiring TensorFlow, PyTorch, CUDA, and OpenCL skills. It means either Apple's ML/Siri backend servers run on an Nvidia/Linux/ROCm platform (no job posting explicitly names ROCm, just CUDA and OpenCL) or they have Nvidia CUDA/OpenCL prototypes on macOS. Two of these job postings were immediately removed (but are still available in Google's search cache)...

Any company trying to do anything remotely successful in AI is using CUDA. I said in a different post that even Apple uses CUDA/Linux internally for their machine learning. You'd be a fool not to. But I think it highlights a deep issue with Apple. They are a multi-billion-dollar computer company that can't even develop the necessary OS/hardware to do machine learning using their own brand. So they use what is better. How pathetic is that...
 
IMO the only chance to see an Nvidia GPU on the Mac again is if they use Metal instead of CUDA.

Not exactly, but I believe Nvidia could come back to macOS not as a GPU but as a compute-acceleration peripheral. They could build a non-GPU driver (as for headless servers) and introduce it without any graphics display capability but enabling 100% of the CUDA API without touching Metal. It would require the Mac to keep its incumbent GPU, but it would also free the Nvidia GPU from GUI duty, allowing 100% compute capability. Such a Mac Pro could fit an RX 580 plus two or even three Titan RTX or Quadro RTX cards and have 10x the ML training power of a dual Vega II Duo setup.

The new driver API does not require Apple approval for non-GPU peripherals.
 
Would anyone care to guess what a comparison of a 3.3GHz 8-core, otherwise fully loaded Mac Pro 6,1 vs. a base-model new Mac Pro would look like in terms of Geekbench scores?

And a graphics benchmark comparison of the 6,1's dual D700s vs. the new Mac Pro's base Radeon Pro 580X?

I am going to sell my current rig for the 7,1 and eventually upgrade the necessary parts over time, but I'll purchase the base model first. I just want to know what initial performance hit I may take for video editing.
 
Not exactly, but I believe Nvidia could come back to macOS not as a GPU but as a compute-acceleration peripheral. They could build a non-GPU driver (as for headless servers) and introduce it without any graphics display capability but enabling 100% of the CUDA API without touching Metal. It would require the Mac to keep its incumbent GPU, but it would also free the Nvidia GPU from GUI duty, allowing 100% compute capability. Such a Mac Pro could fit an RX 580 plus two or even three Titan RTX or Quadro RTX cards and have 10x the ML training power of a dual Vega II Duo setup.

The new driver API does not require Apple approval for non-GPU peripherals.

All this noise about Apple needing to support Nvidia because of ML/AI strikes me as a bit disingenuous. AI/ML and Nvidia are not synonymous. TensorFlow is not dependent on Nvidia CUDA. In fact Google Cloud AI - you know, the cloud offering from the company that created TensorFlow - doesn't run on Nvidia GPUs. GPUs have been convenient for training given that a lot of that training relies on big matrix math which benefits from a lot of parallelism. GPUs being readily available compute devices that support a lot of parallelism made for a good stopgap solution at early stages of this evolution. Going forward we'll see more ASIC development toward AI/ML tasks that will be much more efficient than repurposing a GPU. In fact that's what Google has already done with the TPU that powers Google Cloud AI (others are as well, including Apple). Small businesses that don't have the resources to invest in ASICs at this time can leverage cloud computing and have access to TPUs today. Not having an Nvidia branded card in your system doesn't prevent you from advancing your AI/ML path at all.
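To illustrate that point, here's a minimal sketch of checking what TensorFlow can see and running a small training-style computation with no Nvidia GPU present at all. It assumes a stock TensorFlow 2.x pip install (which ships with CPU support out of the box); it's an illustration, not anything Apple- or Google-specific.

[CODE]
import tensorflow as tf  # plain pip install; no CUDA toolkit required

# Enumerate whatever accelerators this build can see; an empty GPU list is fine.
print("GPUs visible:", tf.config.experimental.list_physical_devices("GPU"))

# A small training-style computation that runs on CPU (or GPU/TPU if one exists).
x = tf.random.normal([256, 512])
w = tf.Variable(tf.random.normal([512, 128]))
with tf.GradientTape() as tape:
    loss = tf.reduce_mean(tf.square(tf.matmul(x, w)))
grad = tape.gradient(loss, w)  # same code path regardless of device
print("loss:", float(loss), "grad shape:", grad.shape)
[/CODE]

The same script runs unchanged whether the backing device is a CPU, an Nvidia GPU, or a Cloud TPU, which is the point: the framework, not the GPU vendor, defines the programming model.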

I think a lot of these comments (but not all) come from people who are just looking for a reason to push Apple into supporting Nvidia chips. Who knows what their real motivation is in wanting that (games? benchmark envy?). In any case, none of us outside of Apple and Nvidia leadership know what is really going on, there's a dispute, possibly surrounding Metal, and there aren't drivers today and probably won't be for the foreseeable future. We'll be fine.
Would anyone care to guess what a comparison of a 3.3GHz 8-core, otherwise fully loaded Mac Pro 6,1 vs. a base-model new Mac Pro would look like in terms of Geekbench scores?

And a graphics benchmark comparison of the 6,1's dual D700s vs. the new Mac Pro's base Radeon Pro 580X?

I am going to sell my current rig for the 7,1 and eventually upgrade the necessary parts over time, but I'll purchase the base model first. I just want to know what initial performance hit I may take for video editing.

no need to guess...

 
AI/ML and Nvidia are not synonymous. TensorFlow is not dependent on Nvidia CUDA. In fact Google Cloud AI - you know, the cloud offering from the company that created TensorFlow - doesn't run on Nvidia GPUs.
You're right: Nvidia is not indispensable for TensorFlow, nor will it be the last way to achieve HPC compute. But today, and for the foreseeable mid-term, the fact is that nobody doing serious independent AI/ML/HPC research is doing it without CUDA, even Apple (look at the job postings). As for Google, I mentioned earlier in this thread that in time compute offload won't be a GPU matter at all but will move to purpose-built hardware like Google's TPU or Intel's NPU; Nvidia came in handy when no such hardware category existed. Nvidia was also wise enough to model its GPUs around CUDA and not the opposite, which gives CUDA more flexible (and sometimes crucial) programming capabilities, indispensable for many algorithms that cannot even run efficiently on other compute-enabled GPUs because they lack the required silicon (it seems AMD has caught onto the issue and will introduce similar features into its architecture, but even Vega II doesn't have them).

Just look at research papers: with few exceptions, they were all done using something with CUDA as the accelerator.

Maybe the next AMD architecture is being built around Metal 3, enabling Metal to fight side by side with CUDA, but that belongs to the future. Meanwhile, mere mortals don't have direct access to TPUs/NPUs (except as a USB dongle); these near-mythological unicorns are not available for purchase. At most you can rent them in Google's farms or, with some luck, see them in a supercomputer built by IBM or Cray. Maybe someday they will arrive.

But if you look at CUDA 10, Nvidia has already done the homework to enable compute acceleration in USER MODE, avoiding loading kernel (display) drivers. This is available on Linux today: you can boot a system with a main AMD or Intel GPU and do compute on a headless, user-mode Nvidia GPU. No need to wait, learn something new, or have access to privileged exotic hardware.
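As a rough sketch of that display/compute split, here's how it commonly looks from Python on a Linux box today: one GPU drives the display while a second card does compute only. The device indices (GPU 0 = display, GPU 1 = headless) are assumptions for the example; CUDA_VISIBLE_DEVICES is the standard CUDA environment variable for hiding a GPU from compute jobs.

[CODE]
import os

# Hypothetical setup: GPU 0 drives the display, GPU 1 is the headless compute card.
# Hiding GPU 0 before CUDA initializes keeps this job off the display GPU entirely.
os.environ.setdefault("CUDA_VISIBLE_DEVICES", "1")

import torch

if torch.cuda.is_available():
    dev = torch.device("cuda:0")  # index 0 within the *visible* set, i.e. the headless card
    print("compute device:", torch.cuda.get_device_name(dev))
else:
    dev = torch.device("cpu")     # graceful fallback when no CUDA runtime is present

# A throwaway matrix multiply standing in for real ML training work.
a = torch.randn(4096, 4096, device=dev)
b = torch.randn(4096, 4096, device=dev)
print((a @ b).sum().item())
[/CODE]

Nothing here needs the Nvidia card to be wired to a monitor; that separation is what a hypothetical compute-only peripheral on macOS would be formalizing.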
 
Any company trying to do anything remotely successful in AI is using CUDA. I said in a different post that even Apple uses CUDA/Linux internally for their machine learning. You'd be a fool not to. But I think it highlights a deep issue with Apple. They are a multi-billion-dollar computer company that can't even develop the necessary OS/hardware to do machine learning using their own brand. So they use what is better. How pathetic is that...

And their iCloud datacenters all use non-Apple hardware running Linux and Windows instead of Macs running macOS. They could have developed custom Xserves and server versions of macOS, but why spend resources (money, time, and people) to reinvent the wheel when perfectly serviceable alternatives already exist that can support the core business objectives (like serving customers' data and performing machine learning)?
 

I’m reasonably certain Apple has never run Macs in any of their data centers.
 