You should not blow up 1080 to 4K. Glad you asked. Hope that helps you out in the future.
How am I trolling?
I did NOT say I upscale 1080p to 4K. Where did I say that? How is that a troll when I corrected his statement?
But wait, I thought 4K was the standard for a while now? According to res0lve, 4K is the ABSOLUTE MINIMUM you should EVER WORK ON. 6K is apparently mainstream. And 8K is widely adopted now. How are those movies "fake" 4K then? Even the ones from 2016.
I need to go back to 480p
You should not blow up 1080 to 4K. Glad you asked. Hope that helps you out in the future.
Mandatory minimum is what many of us have to work under.
We don't all capture video at 720.
For the MAJORITY of us, 4K is usually the minimum. That's because WE POST IN 4K. It does not mean we DISTRIBUTE IN 4K. Subtle but major difference.
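To make the post-vs-distribute distinction concrete, here is a minimal sketch of the delivery step, assuming ffmpeg is on the PATH; the file names and encode settings are hypothetical, not anyone's actual pipeline:
[code]
# Finish/post in 4K, distribute in 1080p.
# Assumes ffmpeg is installed; file names are made up for illustration.
import subprocess

MASTER = "graded_master_4k.mov"    # the 4K master you posted/finished in
DELIVERY = "delivery_1080p.mp4"    # the copy you actually distribute

subprocess.run([
    "ffmpeg", "-i", MASTER,
    "-vf", "scale=1920:1080",         # downscale the 4K master to 1080p
    "-c:v", "libx264", "-crf", "18",  # a sane delivery quality setting
    "-c:a", "copy",                   # leave the finished audio untouched
    DELIVERY,
], check=True)
[/code]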
You're trolling hard at this point. And yes, the nMP still sucks for the above reasons in post #118.
And we don't all capture video at 4K or 6K or 8K. You guys know there are different workflows, right? I primarily use 720p. So the Mac Pro is good for me. It does not OVERALL SUCK like people here say.
I don't think the nMP is a bad idea, but this is not a winning argument.
What's not a winning argument? Everybody should be required to work on videos at 4K at a minimum?
Why not work in 1080p? Much less 4K?
And if you're working in 720p, why buy a Mac Pro? Just buy a MacBook Air at that point...
Arguing that the Mac Pro is really great for 720p workflow is a bad argument.
Is this real life?
And that is my point. Why was someone arguing with me that a $10,000 computer would make things even faster when my $3,000 Mac Pro is already overpowered for what I do? My 2010 Mac Pro is STILL not sweating with my workflow, and it is the 6-core single-CPU version.
Perhaps you need a slower computer then. I would also suggest a MacBook Air.
In other words, it sucks for a lot of people.
It is a real shame that the Mac Pro doesn't have a config that benefits a lot of those workflows.
In other words, it sucks for a lot of people.
Did you see what I did there?
I don't disagree - I've frequently said that the MP6,1 is a nice upgrade for the MiniMac.
I agree. But it is a nice system for a lot of people too.
I don't disagree - I've frequently said that the MP6,1 is a nice upgrade for the MiniMac.
And that it's a horrible step backwards for people with MP5,1 systems.
But wait, I thought 4K was the standard for a while now? According to res0lve, 4K is the ABSOLUTE MINIMUM you should EVER WORK ON. 6K is apparently mainstream. And 8K is widely adopted now. How are those movies "fake" 4K then? Even the ones from 2016.
Bottom of #114.
"Why should I blow up a raw 1080p footage to 4K?"
....
And that it's a horrible step backwards for people with MP5,1 systems.
Are you talking about gaming on your PC or Mac?
....
Here are some very recent Premiere benchmarks comparing the fastest machine Apple makes to a couple of standard, fairly lightweight non-Xeon workstations.
The performance gap is a total cluster with CUDA acceleration (rendering, NOT gaming), since you can't run any Nvidia cards on an nMP without external expansion, and you can't run any Pascal-series cards at all. eGPU solutions are also not officially supported by Apple, and they have intentionally made it this way.
If you just need to edit in FCPX, Apple has you covered, and they think you can get by just fine editing on an MBP instead of the trash can.
Poo-pooing FCPX for being locked into Apple and cheerleading CUDA at the same time is bizarro, hypocritical stuff.
To be a viable Mac software product means running on multiple Mac devices and multiple workloads. Applications that are "only good" for a couple thousand Macs typically have problems long term. The Mac is like 6-7% of the overall market; if you're aiming at 0.1% of that 6-7%, the user base is awfully small to support a long-term developer and support base. 0.1% of the other 90%+ is a more viable number, since it's about an order of magnitude bigger.
I'm not saying CUDA is better, but other platforms have a choice to use CUDA or OpenCL. Up until the 6,1, we had that choice too. For many applications, CUDA is faster. I'm not saying that the vendor lock-in is ideal, but when speed matters, it is what it is.
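To make that choice concrete, here is a rough sketch of the kind of backend probing an app on another platform can do - try CUDA first, fall back to OpenCL. The pycuda/pyopencl packages here are illustrative stand-ins, not any renderer's actual code:
[code]
# Sketch: probe for CUDA, fall back to OpenCL, else CPU.
# pycuda/pyopencl are assumed installed; purely illustrative.
def pick_gpu_backend():
    try:
        import pycuda.driver as cuda
        cuda.init()
        if cuda.Device.count() > 0:  # an Nvidia GPU answered
            return "cuda"
    except Exception:
        pass                         # no Nvidia runtime (e.g. a stock 6,1)
    try:
        import pyopencl as cl
        if cl.get_platforms():       # an OpenCL platform (e.g. AMD) is present
            return "opencl"
    except Exception:
        pass
    return "cpu"                     # last-resort software path

print(pick_gpu_backend())
[/code]
On a 6,1 the CUDA branch can never succeed without external expansion, so the OpenCL fallback is the only GPU path - which is exactly the complaint when an app's fast path is CUDA-only.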
The impression I've gotten from multiple developers of GPU renderers is that they want to support OpenCL, but limitations on AMD's and/or Apple's side prevent feature parity in the few that even bother to attempt an OpenCL version.
At least one dev waffles back and forth on what they believe can work on AMD cards. One week they say it's coming and they've had a breakthrough, and the next week they're throwing up their hands and giving up. As frustrating as that is to me, it's likely more so for the developers working on the problem.
The Cinema 4D owner base is roughly 50% Mac according to Maxon, the makers of C4D. If there were a great renderer that supported all of my rendering needs on AMD-based hardware, I'd stay on the Mac in a heartbeat, as my tasks vary greatly. I don't only do 3D. But when I do, I need the speed provided by multi-GPU rendering.
Sounds like AMD is dead.....
You know, it's funny you mention AMD failures and driver issues, because this same thing happened with Octane. Version 3.1 of Octane was supposed to support AMD cards, but it kept getting delayed. They finally came out and said that they were waiting on AMD.
I pulled this from the user forum: "We have an AMD branch we made for 3.0, but as I said earlier, we can't support a commercial release (or spend time optimizing) until AMD brings their driver stack to the level they have conceded they need to achieve for Octane to be as stable as it is on NVIDIA - so this is on them, and we are moving on to other features until this changes."
And
"i think AMD drivers will never be fixed. lol
I think they have to, and they know it, which is why they agreed to fix it. Otherwise, they are by default ceding the high end commercial GPGPU market to NVIDI. OpenCL 2 doesn't exists (Linux and MacOS are still on 1.2, and apple is only support Metal, so even 1.2 is not a sure thing in the future). We did all this crazy work to cross compile CUDA to AMD IL. We had AMD Octane 3 running at a demo machine at siggraph and would have done a first test release around them if they had addressed this when they were supposed to.
In any case at least headless rendering will bring some relief to fustated Mac users. We can still use the local GPU for OctaneImager or the host app raserized viewport."
I had high hopes for an AMD rendering option from Octane so I could keep my Mac; instead, Maxon integrated AMD's ProRender into their software. I have no doubt it was done to keep Mac users happy, but almost all mainstream GPU rendering is done on CUDA at this point. Arnold is even working on a GPU-based version.
Like I said, I hate the hardware lock-in, but at this point there just isn't much else to choose from.