Every walled garden has a secret back door.
Well, it's not really a secret, but it might as well be
to those who pass it by every day as if it weren't there.

I wonder how many of the photogs I sold Macs to 10 years ago
have slipped through that door lately. Prolly quite a few
since the Aperture fiasco and now the trash can farce.

 
Yep, that was the one. I would have dug up the quote that you did but couldn't find it.

I think many people here (including me) were hoping that AMD would be viable. AMD seems eager to win people over with reasonably priced hardware. But Siggraph was an nVidia show.

I'd even settle for an external GPU solution with a modern nVidia card. I'd pay a premium for that to stay on the Mac. But there are no nVidia drivers for Mac for the 10XX series cards.

Software. It's the darn software. Whether it's AMD giving devs what they need or nVidia releasing drivers for the external GPU market (and Apple supporting that!), people like me are painted into a corner with Apple.

It's a damn shame.
Sounds like AMD is barking like the Taco Bell dog: "yipe yipe."
 
Yep, that was the one. I would have dug up the quote that you did but couldn't find it.

I think many people here (including me) were hoping that AMD would be viable. AMD seems eager to win people over with reasonably priced hardware. But Siggraph was an nVidia show.

I'd even settle for an external GPU solution with a modern nVidia card. I'd pay a premium for that to stay on the Mac. But there are no nVidia drivers for Mac for the 10XX series cards.

Software. It's the darn software. Whether it's AMD giving devs what they need or nVidia releasing drivers for the external GPU market (and Apple supporting that!), people like me are painted into a corner with Apple.

It's a damn shame.


I gave it my best shot to stay with the Mac and held out for a supported eGPU solution for as long as I could. I ran an Amfeltec chassis on my 5,1 with decent results but it wasn't a good long term solution. Once the Pascal series cards came out, and we didn't get Pascal driver updates, I started looking at Windows. The lack of modern CUDA GPU options on Mac was one of the main reasons I switched.

Apple putting its head in the sand and pretending CUDA will go away while they engineer another solution is probably not the best strategy. Especially when that plan involves slower GPUs from AMD.
 
It will be many years before 4K is the standard everywhere. I even need to produce some 1280x720 footage for people.

"Never ever ever ever plan to work on *K footage"? What does that mean? 10 years from now I will obviously have a new computer. 4K might be the standard by then. There is NO 4K footage. There are no discs yet that have 4K.

Well, yeah, that is what I meant when I stated that the 2010 Mac Pro could be beaten by a Dell at the time.

I'm not sure what year this quote is from, but I haven't shot anything less than 4K for about 8 years now. Mastering is at minimum 4K.
 
Mastering is at minimum 4K.

I master all my work in 4k as well. Have for about 4 years or so.

Wish the major movie studios and networks felt that way. Half of Hollywood films are still shot and/or mastered in 2K, even if shot on 4K+ or film. Most network shows are 1080p. The only people really sticking to their guns on 4K are Netflix and Amazon.
 
I think many people here (including me) were hoping that AMD would be viable. AMD seems eager to win people over with reasonably priced hardware. But Siggraph was an nVidia show.

Would it matter what AMD released? It's not like we're going to get turnkey retail cards with user-installable drivers. At best, we get scavenged drivers specifically made for the gelded, soldered, cut-down versions of the proper cards, which Apple is putting in their knowledge-worker lifestyle-organiser devices.
I wonder how many of the photogs I sold Macs to 10 years ago
have slipped through that door lately. Prolly quite a few
since the Aperture fiasco

Bingo. Once Aperture stops working, it's over to Capture One or whatever other options there are (Serif's rumored Affinity DAM product), and a lesson I will never unlearn: never trust an app from an OS vendor.
I master all my work in 4k as well. Have for about 4 years or so.

Wish the major movie studios and networks felt that way. Half of Hollywood films are still shot and/or mastered in 2K, even if shot on 4K+ or film. Most network shows are 1080p. The only people really sticking to their guns on 4K are Netflix and Amazon.

I spoke recently to a friend who works at a major national/state TV broadcaster; they're already moving to 8K for master files.
 
To change the subject slightly...

“If there’s any doubt about that with our teams, let me be very clear: we have great desktops in our roadmap. Nobody should worry about that.” - Tim Cook


That plural is what gives me hope at the moment. Which "desktops" is he referring to?


Case #1 - (Timmy ain't a Pro-fan) - He refers to iMacs and the Mac mini (isn't the Mac mini dead?); the Mac Pro is dead.

Case #2 - (Timmy the reckless) - He refers to 27" iMacs and 21" iMacs (Mac Mini and Pro are dead)

Case #3 - (#comeback) - Whole new lineup of iMacs, Mac Mini, Mac Pro.


Hope aside, I imagine a guy of his caliber (also given his character) never says too much or too little. Right? :)
 
Would it matter what AMD released? It's not like we're going to get turnkey retail cards with user-installable drivers. At best, we get scavenged drivers specifically made for the gelded, soldered, cut-down versions of the proper cards, which Apple is putting in their knowledge-worker lifestyle-organiser devices.

An excellent point.
 
I gave it my best shot to stay with the Mac and held out for a supported eGPU solution for as long as I could. I ran an Amfeltec chassis on my 5,1 with decent results but it wasn't a good long term solution. Once the Pascal series cards came out, and we didn't get Pascal driver updates, I started looking at Windows. The lack of modern CUDA GPU options on Mac was one of the main reasons I switched.

Apple putting its head in the sand and pretending CUDA will go away while they engineer another solution is probably not the best strategy. Especially when that plan involves slower GPUs from AMD.

Apple lost another sale here too. I've been using a 5,1 for a long time that I upgraded as best I could. While it still works well for daily tasks, I needed something faster. I put the money towards a hackintosh-compatible build with the intention of running OS X on it. Four months later I'm still using Windows 10 without issue. I'm also one of those burned Aperture users looking for an alternative (not a fan of LR). There was absolutely no way I was going to shell out what Apple wants for clearly outdated technology WITH technical issues.
 
It's gone way beyond worrying. Everyone is past that stage. What's on our minds now is BOXX, HP, Dell, etc.

People don't worry about things they've dumped. New Mac Pro? Who effin cares anymore. For many, the romance is over.

Then why are you still on this forum? Just to cause trouble?
 
I'm not sure what year this quote is from, but I haven't shot anything less than 4K for about 8 years now. Mastering is at minimum 4K.

Why do you keep on saying that it needs to be the "minimum"? How can I master raw 720p or 1080p footage at 4K? That is bad news and something you should NOT do. So no, 4K should not be considered the "minimum". When will you guys get it? There are different workflows and jobs out there.

And not EVERY SINGLE MOVIE in 2016 was even in 4K.
 
Why do you keep on saying that it needs to be the "minimum"? How can I master raw 720p or 1080p footage at 4K? That is bad news and something you should NOT do. So no, 4K should not be considered the "minimum". When will you guys get it? There are different workflows and jobs out there.

And not EVERY SINGLE MOVIE in 2016 was even in 4K.

720 is like SD back when we used to lay stuff to tape. It will be the last to go and will hang around far longer than it should.

If you capture in 4K, what do you think he should do, down convert everything to 720?
 
720 is like SD back when we used to lay stuff to tape. It will be the last to go and will hang around far longer than it should.

If you capture in 4K, what do you think he should do, down convert everything to 720?

If you capture at 720p, what do you think I should do, upscale everything to 4K just because it should be the "minimum"?
 
If you capture at 720p, what do you think I should do, upscale everything to 4K just because it should be the "minimum"?
You didn't answer the question.

If you capture at 720, you can work effectively on a Mac mini.

No one is going to shoot 720 for modern distribution.
 
If you capture at 720p, what do you think I should do, upscale everything to 4K just because it should be the "minimum"?

Anything captured at 1080p or above should be upscaled to 4K (and if the 1080p is RAW, it actually upscales really well to 4K when graded correctly). Anything captured below 1080p should be mastered AT 1080p minimum.
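(Just to spell that rule out, here's a rough sketch in Python; the function name and the exact numbers are mine, not anyone's official workflow.)

    # Rough sketch of the mastering-floor rule described above.
    # Heights are in pixels; this is one reading of the rule, not a standard.
    def master_height(capture_height: int) -> int:
        if capture_height >= 1080:
            return 2160  # 1080p and up (especially RAW) gets upscaled/mastered at UHD
        return 1080      # anything below 1080p gets mastered at 1080p minimum

    print(master_height(720))   # -> 1080
    print(master_height(1080))  # -> 2160
    print(master_height(2160))  # -> 2160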
 
Why do you keep on saying that it needs to be the "minimum"? How can I master raw 720p or 1080p footage at 4K? That is bad news and something you should NOT do. So no, 4K should not be considered the "minimum". When will you guys get it? There are different workflows and jobs out there.

And not EVERY SINGLE MOVIE in 2016 was even in 4K.

I meant that for my work, I would never master at less than 4K, and it has been that way for a while now.

1080p is probably still the lion's share of the low end of the market where shelf life of the content isn't important.
 
Anything captured at 1080p or above should be upscaled to 4K (and if the 1080p is RAW, it actually upscales really well to 4K when graded correctly). Anything captured below 1080p should be mastered AT 1080p minimum.
No, it doesn't. It makes my videos look horrible because it is 4x the image size. You gain NOTHING. You don't magically gain resolution just by upscaling 1080p to 4K.
 
No, it doesn't. It makes my videos look horrible because it is 4x the image size. You gain NOTHING. You don't magically gain resolution just by upscaling 1080p to 4K.

Resolution isn't the only thing that makes an image look good. RAW 1080p footage (like from the Blackmagic Pocket Cinema Camera, for example) contains much more information than standard 1080p footage. The only way to keep that quality is to either keep everything in a visually-lossless codec (such as ProRes, DNx, or Cineform) OR to upscale. It's very common to upscale, especially for the web.
And you actually gain a LOT, especially for web delivery. The lower the resolution on YouTube, for example, the more compressed it looks. So when you upscale 1080p to UHD, it retains the full 1080p detail that would otherwise be lost to compression.
 
Resolution isn't the only thing that makes an image look good. RAW 1080p footage (like from the Blackmagic Pocket Cinema Camera, for example) contains much more information than standard 1080p footage. The only way to keep that quality is to either keep everything in a visually-lossless codec (such as ProRes, DNx, or Cineform) OR to upscale. It's very common to upscale, especially for the web.
And you actually gain a LOT, especially for web delivery. The lower the resolution on YouTube, for example, the more compressed it looks. So when you upscale 1080p to UHD, it retains the full 1080p detail that would otherwise be lost to compression.
Not necessarily. I have seen better-looking 720p footage than some 4K footage.

And I gain nothing. It makes my videos look like crap. And as I said, even I have to buffer 4K content on a 300Mbps connection. My clients average 5Mbps. How does spending time upscaling 1080p to 4K benefit me? It doesn't. And it makes the video look like crap. It is a blown-up image. The quality looks far worse than just keeping it at 1080p.
 
Not necessarily. I have seen better-looking 720p footage than some 4K footage.

And I gain nothing. It makes my videos look like crap. And as I said, even I have to buffer 4K content on a 300Mbps connection. My clients average 5Mbps. How does spending time upscaling 1080p to 4K benefit me? It doesn't. And it makes the video look like crap. It is a blown-up image. The quality looks far worse than just keeping it at 1080p.

I think you must just be doing something wrong... because none of this makes any sense whatsoever. A 50Mbps connection is sufficient for 4K YouTube videos... Also, blowing up an image doesn't make the image look worse. In an uncompressed codec, 1080p upscaled to 4K will look exactly the same as the uncompressed 1080p. However, if uploading to YouTube, the higher resolution combats YouTube's compression algorithms. This, in turn, makes the upscaled footage look less compressed than the non-upscaled 1080p footage.
This is a fact. There's no debate, haha.
Just try it yourself. Take your raw 1080p files, grade them in Resolve, and export two versions (in ProRes, because it's a visually lossless codec), one in 1080p and the other upscaled to UHD. Then upload them both to YouTube and compare. The UHD upscale WILL look better. It's a fact. Higher-resolution footage deals with YouTube's compression better than lower-resolution footage.
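If you'd rather not tie up Resolve for the exports, here's a rough sketch of the same two-version test driven from Python with ffmpeg (this assumes ffmpeg is installed with the prores_ks encoder, and graded_1080p.mov is a made-up filename standing in for your graded master):

    import subprocess

    SOURCE = "graded_1080p.mov"  # hypothetical graded 1080p master out of Resolve

    def export(output, scale=None):
        # Encode to ProRes 422 HQ (prores_ks profile 3); optionally upscale first.
        cmd = ["ffmpeg", "-y", "-i", SOURCE]
        if scale:
            cmd += ["-vf", f"scale={scale}:flags=lanczos"]
        cmd += ["-c:v", "prores_ks", "-profile:v", "3", "-c:a", "copy", output]
        subprocess.run(cmd, check=True)

    export("test_1080p.mov")                   # version 1: straight 1080p
    export("test_uhd.mov", scale="3840:2160")  # version 2: upscaled to UHD

    # Upload both to YouTube and compare what it serves back at each resolution.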
 
I think you must just be doing something wrong... because none of this makes any sense whatsoever. A 50Mbps connection is sufficient for 4K YouTube videos... Also, blowing up an image doesn't make the image look worse. In an uncompressed codec, 1080p upscaled to 4K will look exactly the same as the uncompressed 1080p. However, if uploading to YouTube, the higher resolution combats YouTube's compression algorithms. This, in turn, makes the upscaled footage look less compressed than the non-upscaled 1080p footage.
This is a fact. There's no debate, haha.
Just try it yourself. Take your raw 1080p files, grade them in Resolve, and export two versions (in ProRes, because it's a visually lossless codec), one in 1080p and the other upscaled to UHD. Then upload them both to YouTube and compare. The UHD upscale WILL look better. It's a fact. Higher-resolution footage deals with YouTube's compression better than lower-resolution footage.

Um, you are blowing up a 1920x1080 image to 3840x2160. The individual frames of a video are not vector graphics. They lose quality when you blow them up. I have seen it. Just take a simple AE project, produce it at 1080p, then go back and blow it up. I can tell the difference. It does not look as crisp and as good as the 1080p one did. Just like when I watch 1080p footage full screen on my 2560x1440 monitors, it looks worse than on a native 1080p display because it is blown up. You cannot gain image data just by blowing up the image. If you could, why not just take raw 720p footage and blow it up to 16K? It WILL NOT look crisp.

I have tried it. I upscaled one of my title sequences from 1080p to 4K and it looked HORRIBLE at 4K because... it did not magically generate extra pixel color information for the 4x resolution increase.

This is like saying "Generate a 100x100 image in Photoshop. Now blow it up to 200x200. SEE how it MAGICALLY gained those extra pixels and color data?" No, if you blow up an image beyond its original size, it looks bad. A 100x100 vector image CAN be blown up to 10,000x10,000 and look crisp.
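For what it's worth, the Photoshop example is easy to reproduce with a few lines of Python and Pillow (pip install Pillow; the filenames are made up). The resize only interpolates pixels that are already there, which is the "no new detail" point; whether the upscale survives YouTube's re-compression better is a separate question:

    from PIL import Image

    small = Image.open("test_100x100.png")   # any 100x100 raster image

    # Blow it up 2x with Lanczos resampling. Every pixel in the 200x200 result is
    # interpolated from the original 100x100 grid; no new detail is created.
    big = small.resize((200, 200), Image.LANCZOS)
    big.save("test_200x200.png")

    # A vector file (SVG/EPS) re-rasterized at 10,000x10,000 stays crisp because the
    # curves are re-evaluated at the new size; a raster has nothing to re-evaluate.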

Well, all I have is Spectrum internet, so that is probably why it sucks for me. But I cannot expect my clients to download a 4K video because it does not benefit them AT ALL, and I get a lot of requests to serve videos at just 720p due to the crappy internet in our country.
 