Vega is not in the Mac AMD drivers because some Radeon engineer tripped and fell on his keyboard and accidentally typed that in. C'mon now.

I've been telling you guys...
 
The E3 line usually is tragically limited in the amount of RAM that it can address.

E3 for entry would be OK, with E5-16xx and E5-26xx for the higher RAM and core counts.

Forget i5/i7 though - I don't buy workstations without ECC RAM. No ECC is a deal-breaker.
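
To put a very rough number on why, here's a toy back-of-envelope sketch. The error rate is purely hypothetical, picked for illustration only; published DRAM soft-error rates vary by orders of magnitude depending on the study:

```python
# Toy illustration of why ECC matters on big-memory workstations.
# NOTE: the rate below is hypothetical, for illustration only;
# real DRAM soft-error rates vary by orders of magnitude between studies.
flips_per_gb_per_month = 1 / 8  # assume 1 bit flip per 8 GB per month
ram_gb = 32

expected_flips = ram_gb * flips_per_gb_per_month
print(f"Expected bit flips per month on {ram_gb} GB: {expected_flips:.1f}")
# → Expected bit flips per month on 32 GB: 4.0
```

Without ECC those flips go undetected; with ECC, single-bit errors are corrected and logged. Whether the real number is 4 a month or 4 a year is exactly the uncertainty that makes the "no ECC, no sale" camp conservative.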

Thanks. What is the RAM limit for E3s... is it 32 GB?
 
IMHO, seeing KEXTs with mentions of POLARIS XT2 and VEGA10 can only mean a new Mac Pro is coming; it isn't for eGPU cages, nor for iMacs or the cMP 5,1.

That Polaris 12 should be for the next iMacs, and Polaris XT2 and Vega 10 for the nnMP.
If it is coming out then I got my gift card ready.
 
Because it has a good performance-per-watt ratio. It would be easy to cool (74W TDP). Plus I also don't need more than 4 cores, or more threads. (It would be nice to have a larger L3 cache though.)

I do though - and I am not a professional in the least.
 
After looking at threads about the MBP and other things, I can tell you right away that no matter what Apple comes up with for their desktop lineup, it will not be "Pro" enough for the majority of users on this forum.

Others, who will actually buy the hardware and use it, will be surprised by how powerful it is, whatever the naysayers come up with.

Thankfully, there are benchmarks that will save us from both of the aforementioned parties and reveal the cold truth, as they have before.
 
After looking at threads about MBP and other things, I can tell you immediately that no matter with what Apple will come up for their desktop computer lineup it will be not enough Pro for majority of users on this forum. [...]

This is arguably the most accurate comment in this thread. If the latest MBP is any indication, we will get an AMD GPU based "workstation" that will be powerful compared to Apple's current lineup, but average at best compared to real workstations. You can probably count on most if not all of the machine being non-upgradable, making it a terrible investment. Form over function, and not a single PCI slot in sight.

Whatever nMP they come out with is gonna disappoint most users. It will benchmark well against Apple stuff and probably get hammered by dedicated workstations. This is the biggest gripe of the nMP haters. Apple has tailored their machines to work well with their software, and that's it. If your workflow is heavily based on something other than FCPX, the nMP will likely struggle to compete. As long as you live in the Apple ecosystem, you are likely to be happy as long as you don't mind throwing out the hardware when it's time to upgrade.
 
No ECC is a deal-breaker.

What kind of work are you doing that needs it?

I'm in graphics/animation and moved away from the classic mac pro and Xeon PCs to non-ECC macs and i7 PCs and haven't noticed an issue for what I do, but I've got a limited view of what people outside my industry do.
 
anyone care to speculate what the meaning of "apple GPU" is in this slide from the presentation, as well as the performance compared to nVidia's current flagships?

The specs for AWS GPU are
High-performance NVIDIA K80 GPUs, each with 2,496 parallel processing cores and 12GiB of GPU memory
or
High-performance NVIDIA GPUs, each with 1,536 CUDA cores and 4GB of video memory

https://aws.amazon.com/ec2/instance-types/ types P2 and G2
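
Just comparing the per-GPU numbers quoted above (a rough sketch of scale only; raw CUDA-core counts ignore clocks and architecture generation, so this says nothing about actual throughput):

```python
# Per-GPU comparison of the two AWS instance types quoted above.
# Raw CUDA-core counts ignore clock speed and architecture, so this
# is a rough sketch of scale, not a performance claim.
p2_gpu = {"cores": 2496, "mem_gib": 12}  # K80-based P2 instances
g2_gpu = {"cores": 1536, "mem_gib": 4}   # G2 instances

core_ratio = p2_gpu["cores"] / g2_gpu["cores"]
mem_ratio = p2_gpu["mem_gib"] / g2_gpu["mem_gib"]
print(f"P2 vs G2 per GPU: {core_ratio:.3f}x cores, {mem_ratio:.0f}x memory")
# → P2 vs G2 per GPU: 1.625x cores, 3x memory
```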

(sorry about the hotlinking, I can only link to images, not upload them)

http://qz.com/856546/inside-the-sec...e-aapl-revealed-the-state-of-its-ai-research/

[attached image: slide from the presentation]
 
anyone care to speculate what the meaning of "apple GPU" is in this slide from the presentation, as well as the performance compared to nVidia's current flagships? [...]
From the article you linked, it seems to be about the server/web-service end of how well the GPUs are performing, likely not an indication of any consumer/prosumer/professional device. Whether custom-built servers or third party, I couldn't say. But it seems more geared to AI development, and if you ask me, Apple is in panic mode trying to court any talent to help them keep up with the competition in that field (read: not be left in the dust).
 
anyone care to speculate what the meaning of "apple GPU" is in this slide from the presentation, as well as the performance compared to nVidia's current flagships?

MXNet uses CUDA for GPU acceleration.

http://mxnet.io/get_started/setup.html#prerequisites

Unless Apple has ported MXNet to old OpenCL 1.2 or Metal—which I doubt—the "apple GPU" is just a name for an Nvidia-powered GPU cluster. At least that's where I'm placing my money in this bet. :D
 
Unless Apple has ported MXNet to old OpenCL 1.2 or Metal—which I doubt—the apple GPU is just a name for a nVidia powered GPU cluster. [...]


Oh well... At least it's proof Apple knows AI research is important and that Nvidia GPUs are the best for it. You'd hope this meant they'd release Pro machines that use them, but apparently we're not there yet.

I had hoped it was proof they'd made their own GPU that was useful for AI research (my next hobby). I had looked at the iPad PowerVR analysis a while ago and saw that they would have needed a lot of them to equal the D700.
 
What kind of work are you doing that needs it?

Until I discovered Autodesk cloud rendering I was gagging for a more powerful computer. Now I am able to run most things I need comfortably on a MacBook Pro, though ideally it would be best on an iMac. The same renders that made my fans blow out for 2.5 hours are now done in 10 minutes, at a higher resolution too. In fact I calculated that the cloud could render 5 images in 10 minutes that would take me 40 hours on the MacBook Pro. This is insane but at the same time fantastic, and it changes the way I look at things.

What I am really trying to say is that a couple of years ago I was ordering Xeon PC workstations for the team, and now they are not needed [this is in the design/architecture field]. It certainly appears to me that the requirements for powerful hardware are reducing each year, especially for what I need.

I understand others need more power, but this is just an example: where a computer as powerful as I could afford was once always the goal, it is now about mobility, ease of work, etc. The CPU power is just not needed for my role anymore [and the process in my job hasn't really changed in 20 years].
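
The arithmetic in that anecdote, using only the numbers from the post (5 images, about 10 minutes in the cloud versus about 40 hours locally):

```python
# Speedup implied by the post above: 5 images in ~10 minutes in the
# cloud vs ~40 hours on the MacBook Pro. Rough numbers, rough result.
local_minutes = 40 * 60  # 40 hours locally
cloud_minutes = 10

speedup = local_minutes / cloud_minutes
print(f"Cloud is ~{speedup:.0f}x faster for this batch")
# → Cloud is ~240x faster for this batch
```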
 
Until I discovered Autodesk cloud rendering I was gagging for a more powerful computer. [...] The CPU power is just not needed for my role anymore.

Well, that sounds freaking awesome. I'm on C4D and After Effects and don't know of a solution like that. IIRC you have to have a certain software package and/or maintenance plan to get access to that, right? Is it useful for animation?
 
Well, that sounds freaking awesome. I'm on C4D and After Effects and don't know of a solution like that. IIRC you have to have a certain software package and/or maintenance plan to get access to that, right? Is it useful for animation?

Autodesk A360 cloud rendering is just for still-image visualizations, though I believe they do model turntables as well.

There are a handful of cloud rendering solutions out there for pretty much all of the high end software (maya, max, c4d, nuke, ae, etc.). There are varied rates for render time. People have also set up render farms using Amazon cloud services as well.

However, even with these services you'll likely need a pretty high end machine locally to run test renders and simulations.
 
Well, that sounds freaking awesome. I'm on C4D and After Effects and don't know of a solution like that. IIRC you have to have a certain software package and/or maintenance plan to get access to that, right? Is it useful for animation?

It's Autodesk... it works on Revit, AutoCAD and Fusion 360. Probably Max too, I would imagine, but I'm not sure.
I was looking into C4D recently, as I have colleagues who use it, but I must say this solution is all over it.

Unfortunately it's only via subscription with an Autodesk account, so not great for Macs except for Fusion... I was doing it all on Boot Camp with Revit. The shame is that the best 3D is on PC, not Mac.
However, even with these services you'll likely need a pretty high end machine locally to run test renders and simulations.

I was running test renders for free at low res with the cloud rendering. For me, the cost to render is low compared to the time lost / investment in hardware. My usage is not rendering every day, so it works for me. It would all depend on how much visualisation you do, to make it stack up for each person.
 
Its Autodesk....... works on Revit, Autocad and Fusion 360. [...] I was running test renders for free on low res with the cloud rendering.

Yeah, it's a solid solution for visualization. But Jack had asked specifically about animation, which it doesn't do.
 
This could possibly be relevant to this sub-discussion, as far as virtualization/cloud workflows.

https://www.fra.me/products

I haven't used them myself, but am interested to try it with Revit (right now I Boot Camp, generally do all my models and plans, then export certain images for the cover page back to my Mac side to edit and composite for whoever it is designed for, and then toss it back). I just haven't had the time to seriously check it out and see if it is worth it, or at least worth the subscription model they have. It seems reasonable; I just want to have more than a few hours to try it out. Possibly over the holidays I'll try it.
 
This could possibly be relevant to this sub-discussion, as far as virtualization/cloud workflows. https://www.fra.me/products [...]

While related, I'm not sure that's a viable solution to what we're talking about. First, if you're using Revit, then why not just use Autodesk's services?

If not a360, then I would imagine a cloud render farm would be a better option.

The service you linked to seems best suited to those who have a severely underpowered system or mobile devices. And the price starts going up rapidly if you need any of their machines with dedicated GPUs.
 
Re: cloud rendering. It's a moot point for a lot of people with NDAs or who care about their data's security. I know I'm not able to use it due to contracts and agreements with customers.
 
While related, I'm not sure that's a viable solution to what we're talking about. [...] The service you linked to seems best suited to those who have a severely underpowered system or mobile devices.

Exactly. I've used Autodesk's cloud rendering and didn't find it worth much for what I was rendering. It would save some time, but it wasn't a great option in my workflow. Frame is interesting (even if just for discussion) for the apps many of us use that aren't native Mac options, when you look at Boot Camp and the virtualization options.
 
Exactly. I've used autodesks cloud rendering and didn't find it worth much more than what I was rendering. [...]


I think I'm confused as to what you're trying to do with Frame. Are you just looking at that kind of service so you don't have to boot into Windows?
 
I think I'm confused as to what you're trying to do with Frame. Are you just looking at that kind of service so you don't have to boot into Windows?
Kind of. I've tried both Boot Camp and Parallels, as well as different Citrix virtualization solutions through the web, very much like Frame. All have some strengths, but I'm not totally sold on virtualization; it eats up just as much disk space anyway. Years ago Autodesk (inadvertently) demoed Revit running through Firefox on OS X, and many were hoping there was an impending solution to do so officially. To the best of my knowledge it wasn't Frame, but nothing has come of it apart from the virtualization that different enterprises and institutions were using (essentially dialing into the network and remote-accessing a workstation or server/render farm). Right now the local college uses some Citrix-based solutions for various desktop environments, but they have their limitations.

When working multi-platform I am always trying to watch and observe what is available. I was sour on Parallels 8 when retina support was just coming around, and from all of my research that was really down to Windows 8 not being able to properly adjust/scale to normal standards.

Going back to the AI/AR slides, I think they most definitely point to a server/render farm, and are likely not an indication of anything we will see in an upcoming Mac Pro. Especially with all the data they are processing.

Right now I'm in a holding pattern for desktop options. I've invested in Thunderbolt and would very much like to have Thunderbolt 3 in my desktop. I really want a reasonably powerful system where I don't have to rely on being connected, one that is able to handle modern I/O (yes, I believe USB-A isn't going anywhere too soon, and Thunderbolt is backwards compatible, but it will only work as fast as your machine is capable). With that said, I'm looking for something powerful enough that, in a pinch, it can reasonably handle what I throw at it on its own, with modern I/O.

There will be some here who will direct me elsewhere, and if I were purely a Revit tech on Windows, I'd be sitting at an HP Z series. But there's a lot more I do on the Mac side, video and images, as well as working more in AutoCAD for Mac, mainly because I do believe macOS is a better product, more secure, and an overall better experience. Where we find ourselves in the 6,1 product cycle, I think I'd be a fool to buy when we are (or should be) a few months out, if it isn't discontinued altogether. And if that's the case, it'll be an easy decision on where I go from here.

I apologize for this lengthy post. This has become a home for many of us who have been trying to put the pieces together; there was another thread prior that was shut down because we wandered way off topic, but many of us continued its spirit here. Over that time (it has been several months, if not years) many have brought up different solutions/possibilities, as many of us are unsure of the future of the Pro. Hell yes we're hopeful, but hope doesn't will a new Mac Pro into existence (obviously, or maybe we need to sacrifice a few more goats).

Which is where I felt bringing up Frame was relevant, if only to hear other possible solutions that others may have found. Given there isn't a new Mac Pro, the future for Pro users falls into mainly 3 categories: upgrade/modify 5,1s (no Thunderbolt options); move to HP Zs or other workstation-class machines (a possibility, with limited Thunderbolt options, but still an option); or virtualization/server/render-farm solutions (privacy and trade secrets/intellectual property concerns). Given there is a new Mac Pro (and one that isn't castrated to save an ounce), it will likely (near certainty) have Thunderbolt 3 and a more modern GPU + CPU, to be celebrated by nearly all here.

The observations I have about Apple as a company really come from the trends and precedents of its own history and releases (the Buyer's Guide is a great review). Like many here I am surprised the 6,1 has lasted as long as it has; it really is unprecedented in Apple's history and personality, though prior to the 6,1 the 5,1 languished through a very long cycle as well. I wouldn't say I'm doom and gloom, but I am concerned. It's common knowledge that the "Titan" car project is on the ER table and could be nixed next, which is why I think we should be concerned: instead of keeping priorities on established products to keep them up with current tech and competition, it seems they are resting on their laurels while tossing their money into a pit.

Again, sorry for the long-winded post/response. Being an iOS user has turned me into an entitled, demanding customer/prosumer.
 