In terms of computer monitors, broadcast standards are irrelevant.

Really??? Do some research. I guess you missed my post above where I sampled a pro video supplier, a hugely popular web vendor, and a big-box store.

There are vastly more 1920x1080 monitors on the market than any other rez...and I mean by far! B&H has 200 1080p models; 1440 trails way behind at 30 models. Newegg: 1,000-plus 1080p monitors versus 231 at 1440. Best Buy: 496 at 1080 and a paltry 40 at 1440. Need I say more? Never a popular computer rez? Today, how about up to TEN times more popular than any other rez out there!

Although I compared only 1080 to 1440, you'll find the same overwhelming popularity of 1920x1080 monitors compared to ANY other rez. Reliable usage stats are hard to come by because they're tracked by only a few websites and lump in phones, tablets, laptops, etc. Split those out and 1080 is still a big chunk of the stats available from places like Steam. Include other 16:9 monitors and you're way up there, even on these biased, limited-sample stats.

Retailers do NOT stock products they don't expect to sell. 1920x1080 is five to ten times more available for purchase than any other rez...period. Now, let's think about it. Where in the world did 1920x1080 resolution and 16:9 aspect come from?

BROADCAST STANDARDS for HDTV

What is driving web consumption (and ISPs crazy due to bandwidth) these days? People watching videos, that's what. How many camcorders have 4:3 output now? How many new decent still cameras that do video don't offer 16:9 at 1080p? How many 4:3 monitors are still made compared to widescreen?

The future is here, and it is 4K, or more accurately, UHD. The former is a DCI acquisition format only, and the latter is a television BROADCAST standard. There are new UHD cameras, monitors, GPUs, and peripherals hitting the market on a weekly basis, and after CES and NAB early next year, you can triple the offerings at a minimum. Two weeks ago, one vendor I checked had 14 3840x2160 monitors. Today, they have twice that many.

There are TWO 5K displays, maybe really one, since it's been said they use the same panel. No single GPU supports that rez. Then you have all of this crazy up/downscaling going on to display the image. Oh, and let's not forget driver headaches. As for all the retina hoopla, depending on screen size and viewing distance, you can get a "Retina" image from almost any rez (look at your iPhone, for example). It's simple math, along with a wee bit of knowledge about what the eye can see and the brain can process.
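To make that "simple math" concrete, here's a rough Python sketch of the usual rule of thumb that a display reads as "Retina" once a pixel subtends about one arcminute of visual angle at your viewing distance. The sizes and resolutions in the loop are just illustrative assumptions, not figures from this thread:

```python
import math

# Rule of thumb: a display reads as "Retina" when one pixel subtends
# roughly 1 arcminute (1/60 degree) at the viewing distance.
ARCMIN = math.radians(1.0 / 60.0)

def ppi(h_px, v_px, diag_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(h_px, v_px) / diag_in

def min_retina_distance_in(pixels_per_inch):
    """Closest viewing distance (inches) at which individual pixels blur together."""
    pixel_size_in = 1.0 / pixels_per_inch
    return pixel_size_in / math.tan(ARCMIN)

# Illustrative examples (panel sizes assumed for the sake of the math):
for name, w, h, diag in [
    ("27in 1080p", 1920, 1080, 27.0),
    ("27in 1440p", 2560, 1440, 27.0),
    ("27in 5K",    5120, 2880, 27.0),
    ("4in iPhone-class panel", 1136, 640, 4.0),
]:
    p = ppi(w, h, diag)
    print(f"{name}: {p:.0f} ppi, 'Retina' beyond ~{min_retina_distance_in(p):.0f} in")
```

Run it and even a 27-inch 1080p panel comes out "Retina" from a few feet away, which is the point about viewing distance mattering as much as pixel count.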

For the past few years, 1080p has driven the market, and over the next few years, 2160p will balloon faster than you'd expect. Early sales numbers say adoption is far faster than HD was. Prices for TN UHD panels are below 500 bucks now. The SST Samsung is on sale for 470 right now, and there are additional 100-dollar coupons to be had if you look hard enough. The lack of content won't last long. More native UHD content is coming soon from Netflix, Amazon, Ultraflix, and more. And you can always roll your own in the meantime. Movies/TV shot in 4K/UHD and then rendered out at 2K/HD look substantially better than material originally shot at 2K; the difference is obvious. Check out pro vid/film forums. They're salivating over 4K, and rightly so.

Bottom line/TL;DR: Broadcast standards (resolution/aspect ratio) are not only relevant to computer monitors; today, and for some time now, they determine computer monitor resolution and aspect ratio. If that isn't so, call the VPs of purchasing at Newegg, B&H, and Best Buy and let them know they're making some awful decisions about what to stock their warehouses with.
 
Of course it's been around a while. It's still not a viable distribution resolution.

I agree, but you still need to be able to display it to work with it adequately.
You could always work with either a downsampled image or by zooming in and out, but it's more efficient to be able to display it at native res.

The end product can of course be downsampled to 2k for mass distribution.
 
Really??? Do some research. I guess you missed my post above where I sampled a pro video supplier, a hugely popular web vendor, and a big-box store. [...]

My point. You missed it. You also threw a bunch of words in my mouth I never said. By and large, broadcast standards are irrelevant to computer monitors. Of course the vast majority these days are 1920x1080, but that was born more out of convenience than necessity. You'll also notice that Newegg lists monitors under 22 other distinct resolutions. That's because computers aren't primarily used for viewing full-screen broadcast video.

----------

I agree, but you still need to be able to display it to work with it adequately.
You could always work with either a downsampled image or by zooming in and out, but it's more efficient to be able to display it at native res.

The end product can of course be downsampled to 2k for mass distribution.

Oh, of course. I'm still of the mind that having a full-res output monitor while using the other workspace monitor(s) at lower res is the best workflow.
 
My point. You missed it. You also threw a bunch of words in my mouth I never said. By and large, broadcast standards are irrelevant to computer monitors. Of course the vast majority these days are 1920x1080, but that was born more out of convenience than necessity. You'll also notice that Newegg lists monitors under 22 other distinct resolutions. That's because computers aren't primarily used for viewing full-screen broadcast video.

I tossed in some background info to address a few other posts, but I did not miss your point in saying that broadcast standards are irrelevant to computer monitors. That simply isn't true. While there are other factors, the 16:9 aspect at 1920x1080 was taken directly from HDTV standards proposed over ten years earlier; the gurus of display marketing and manufacturing recommended it for monitors in a 2008 white paper (and there are many more articles on the subject from that era if you know where to look). True, beyond HDTV compatibility, the computer industry jumped on it for familiarity as HDTV was adopted, better panel yields, and PC-specific advantages like having two apps visible at once. But there's not a snowball's chance in Hades you can manipulate the numbers of various monitor resolutions out there and say 1920x1080 isn't hands down numero uno, and it was undeniably the broadcast industry that adopted that rez early on, as they did UHD.

If that's not enough, LG and one or two others are now selling 19:10 C4K monitors, another broadcast (cinema, really) standard, but no one is grasping the fact that it is an acquisition resolution and gets cut way down either vertically or horizontally depending on the director's chosen aspect of 2.39:1 or 1.85:1. And yes, those ultra-wide screens popping up, like the other LG offerings, are 2.35:1 to 2.39:1, a CinemaScope aspect. Therefore, how you can possibly think cinema and broadcast standards aren't influencing, if not flat out dictating, the vast majority of computer monitor resolutions defies all logic and factual history. And I don't mean just in the last five years or so. Remember 320x240 monitors at 4:3 aspect, just like that big ole x-ray-emitting CRT TV sitting in that big wood console on the floor of your living room? They came out just a little before the computer monitor. Is the identical aspect and lines of rez merely pure coincidence?
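To put numbers on that "gets cut way down" point, here's a quick sketch of the crop arithmetic for a DCI 4K (4096x2160) container going to the two common theatrical aspect ratios. The helper function is just an illustration of the math, not anything from a real pipeline:

```python
def crop_to_aspect(width, height, target_aspect):
    """Crop a full container down to a target aspect ratio."""
    if target_aspect > width / height:
        # Target is wider than the container: keep width, crop height.
        return width, round(width / target_aspect)
    # Target is narrower than the container: keep height, crop width.
    return round(height * target_aspect), height

c4k = (4096, 2160)  # DCI 4K container, roughly 1.90:1
print("Scope 2.39:1:", crop_to_aspect(*c4k, 2.39))  # approx. (4096, 1714)
print("Flat  1.85:1:", crop_to_aspect(*c4k, 1.85))  # approx. (3996, 2160)
```

Either way, a sizable slice of the 4096x2160 acquisition frame never reaches the screen.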
 
I don't think people realize how much bigger 5K really is compared to UHD (most computer monitors aren't even true 4K; they're UHD).

[Image: Dell UltraSharp 27 Ultra HD 5K monitor (5120x2880) resolution comparison]
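For anyone who wants the raw numbers behind that comparison, here's a quick back-of-the-envelope in Python using only the standard resolution figures:

```python
# Total pixel counts for the resolutions discussed in this thread.
resolutions = {
    "1080p (Full HD)": (1920, 1080),
    "1440p (QHD)":     (2560, 1440),
    "UHD":             (3840, 2160),
    "DCI 4K":          (4096, 2160),
    "5K":              (5120, 2880),
}

five_k = 5120 * 2880
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP  (5K is {five_k / px:.2f}x)")
```

So 5K carries about 78% more pixels than UHD and exactly four times 1440p, which is why it maps 2:1 onto the old 27-inch iMac resolution.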
 
I tossed in some background info to address a few other posts, but I did not miss your point in saying that broadcast standards are irrelevant to computer monitors. That simply isn't true. [...]

Again, you missed the point. Television standards ARE irrelevant in regard to how it was being discussed in this thread. Some were questioning why the 5K res was chosen since it's not slated to be a standard. That doesn't matter. This is a computer monitor. Its primary function is not to play back full-screen video. Besides, as we've both mentioned, the 16:9 ratio is a more important attribute. No one said cinema/broadcast standards didn't influence monitor design. That wasn't the issue. I'm well aware of the history of television and film formats. I've been producing film and video for close to 20 years now. 16:9 will probably remain the most common ratio for the foreseeable future, but it doesn't negate the varied other ratios and resolutions that will be readily available. And those 2.35:1 monitors are certainly influenced by the film aesthetic. But it's not like that ratio was chosen for displaying 2.35:1 content. And that's the issue we were talking about.
 
Film is Dead?

Film has been dead for years; not sure why anyone would print out to 35.

There are some good reasons. The recent movie "Interstellar" was shot on film because it looks better, and some scenes were shot on IMAX, which is about 220 MP per frame. (At 24 or more fps, that adds up fast.)

But the world is changing of course. However, the problem with tape & digital formats is that there are so many of them, they are transient, and they rely on other devices & apps for playback.

You can pick up a piece of 35mm film decades old and just shine a light through it to see an image (even if slightly damaged). But good luck if it's 2" video tape, or even Betamax/SECAM, etc. The same is true for digital. (If you've ever tried to open an old legacy file from an expired application, then you'll understand.)

Which is why the preservation experts use (three-strip) film for long-term archival. Sometimes a stack of typed pages is preferable to a WordStar file. And sometimes film is better than AMAC.
 
For "Pro Pro PROOO" use you really need to be using a dedicated previewing monitor that supports the correct color gamuts anyway. If you're just making corporate/wedding/Internet videos then blast away, the iMac retina does the job.

But of course if you're doing any of these things you ARE NOT A PRO (no matter how much you get paid). It is so great we live in a world of open-minded people who believe there is only one way to do things. /sarcasm
 
But of course if you're doing any of these things you ARE NOT A PRO (no matter how much you get paid). It is so great we live in a world of open-minded people who believe there is only one way to do things. /sarcasm

I think you misunderstood the intent of his comment. The "pro" moniker certainly has spread across many different subsets of the industry, especially since tools that cost upwards of $100,000 a little over a decade ago can be had for cheap now (depending on your interpretation of cheap of course). But I took his "Pro Pro PROOO" designation as more of an exclamation, akin to something like "super high end." And in that case he's right.
 
I've just bought a new 4K Sony TV and I'm having a very hard time finding UHD content.

I bought a 1080p LED television yesterday. I'll buy a 4K television the week that 4K movies become available in iTunes. Regardless, I'll buy a 5K monitor (or two) as soon as I can use it with a Mac (no, I don't want an iMac).
 
I think you misunderstood the intent of his comment. The "pro" moniker certainly has spread across many different subsets of the industry, especially since tools that cost upwards of $100,000 a little over a decade ago can be had for cheap now (depending on your interpretation of cheap of course). But I took his "Pro Pro PROOO" designation as more of an exclamation, akin to something like "super high end." And in that case he's right.

Gonna disagree. Although part of my beef is the designation of "pro" applied to any equipment. It is a tricky enough label when attached to people. I still think his comment perpetuates a silly & elitist division.

As a related tangent: as part of the fall of the equipment barrier, there has been a change in workflow. Not for everyone, sure. But given that 4K mainstream adoption is likely going to be IP-based (at least in the USA), I think "the way things are done" will start to change rapidly. Yes, you'll want a nice monitor, but if you have any external video monitor you'll also want a crappy one and a device (software or hardware) that emulates how intense compression will destroy an image. THAT's how the viewer will see their media for the foreseeable future. That is where good editors (especially ones tasked with handling finishing as well) will head. As a video editor who moved over to audio, I've seen the rise of the MP3 change the way good producers mix audio. Now the same thing is already happening with video.
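As a rough illustration of that "emulate how intense compression will destroy an image" step, here's one way it can be done in software, sketched in Python around ffmpeg. This assumes ffmpeg is installed, and the bitrate, scale, and filenames are placeholder choices, not a recommended delivery spec:

```python
import subprocess

def make_delivery_preview(master, preview, kbps=3000):
    """Encode a heavily compressed H.264 proxy that approximates what a
    streaming viewer will actually see, for checking on a cheap monitor."""
    subprocess.run([
        "ffmpeg", "-y", "-i", master,
        "-c:v", "libx264",
        "-b:v", f"{kbps}k", "-maxrate", f"{kbps}k", "-bufsize", f"{2 * kbps}k",
        "-vf", "scale=1920:-2",          # downscale to a 1080p-width delivery size
        "-c:a", "aac", "-b:a", "128k",
        preview,
    ], check=True)

# Example (hypothetical filenames):
# make_delivery_preview("master_uhd.mov", "preview_web.mp4")
```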

A 5K iMac could fit into that world well. Not saying it is the best solution but an interesting one.
 
Gonna disagree. Although part of my beef is the designation of "pro" applied to any equipment. It is a tricky enough label when attached to people. [...]

You hit the nail on the head. For both video and audio, I produce it on the best the budget allows and generate the most pristine master possible for archiving. However, unless I know the client has decent equipment, I render and preview different versions and basically dumb it down to the lowest common denominator in terms of playback equipment, both displays and speakers. I still have a few clients asking for a DVD. My answer is a flat no, and I send them a cheap BR or HD media player. UHD, though, is going to present some problems except for a couple of good customers who already have a UHDTV and a decent laptop for playback like an rMBP.

...Oh, and don't get me started on folks asking for MP3 audio they can play on their iGizmos when the content requires critical listening, which is every single time in my biz.
 
Another 5k monitor coming

Another new 5K monitor coming in March.
The HP Z27q, and it's gonna be around $1299!!
 
ViewSonic is bringing out a 5K display too.
Also of interest is 5K over USB 3 'DisplayLink' (TFTCentral).
Looks like it's more than a gimmick!
 
ViewSonic is bringing out a 5K display too.
Also of interest is 5K over USB 3 'DisplayLink' (TFTCentral).
Looks like it's more than a gimmick!

I don't think USB3 has the bandwidth for uncompressed 5k, so I'd be worried about compression artifacts.
 
I don't think USB3 has the bandwidth for uncompressed 5k, so I'd be worried about compression artifacts.

The 3.1 spec offers faster speeds than baseline USB3... I'm still not sure if that's enough to drive 5k@60 (I'm guessing not since I don't think it's faster than TB2).

As to the topic: 5K is possible on a Mac Pro, assuming there are drivers for an MST monitor. I don't see them ever driving a single 5K display at 60fps given the limitations of TB2.

As to the digressions in this topic: 5K is a retina scaling of the 27 in iMac, so it's not surprising they used it. All that resolution has tons of uses, especially for still image manipulation. A lot of the comments in here are pretty damn stupid. Everyone has different workflows! I use multiple 1080p monitors because that's more useful for my work than one big one (and I plug consoles and other inputs into the other monitor as well.) But everyone's work and needs vary.
 
The 3.1 spec offers faster speeds than baseline USB3... I'm still not sure if that's enough to drive 5k@60 (I'm guessing not since I don't think it's faster than TB2).
Correct. USB 3.1 has a maximum bandwidth of 10Gbps, half that of Thunderbolt 2. USB 3.1 cannot drive a 4K display (never mind a 5K display) at 60Hz.

As to the topic: 5K is possible on a Mac Pro, assuming there are drivers for an MST monitor. I don't see them ever driving a single 5K display at 60fps given the limitations of TB2.
Correct. With DisplayPort 1.2 over Thunderbolt 2, the current Mac Pro should drive a 5K display at a maximum refresh of approximately 45Hz or 50Hz, certainly not 60Hz. Driving a 5K display at 60Hz will require DisplayPort 1.3 over Thunderbolt 3.
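For anyone who wants to check that arithmetic, here's a back-of-the-envelope sketch. It counts active pixels only and ignores blanking intervals and protocol overhead, so real-world limits are a bit tighter than these numbers suggest; the link payload figures are the commonly quoted approximations:

```python
GBPS = 1e9

def uncompressed_gbps(width, height, hz, bpp=24):
    """Raw video data rate for active pixels only (no blanking overhead)."""
    return width * height * hz * bpp / GBPS

# Approximate usable payload rates; the DP figures are after 8b/10b encoding.
links = {
    "USB 3.1 Gen 2 (10 Gbps signaling)": 10.0,
    "DP 1.2 / HBR2 over TB2":            17.28,
    "DP 1.3 / HBR3":                     25.92,
}

for name, (w, h) in [("UHD 3840x2160 @ 60Hz", (3840, 2160)),
                     ("5K 5120x2880 @ 60Hz", (5120, 2880))]:
    need = uncompressed_gbps(w, h, 60)
    print(f"{name}: ~{need:.1f} Gbps needed")
    for link, cap in links.items():
        print(f"  {link}: {'fits' if need <= cap else 'does NOT fit'}")

# Ceiling refresh rate for 5K within a DP 1.2 payload, before blanking overhead:
print("5K max on DP 1.2: ~%.0f Hz" % (17.28 * GBPS / (5120 * 2880 * 24)))
```

That lines up with the figures above: roughly 49Hz is the 5K ceiling on DP 1.2 before blanking overhead, and real timings push it into the mid-40s.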
 
HP Z27q 5K display now available through MacMall @ $1249

Early reports are "sensational" - still a dual-cable solution of course, but much cheaper than the Dell (and I personally think better looking as well).
 
Nice. I wonder if a nMP could drive two of these.

It won't. The Mac Pro can drive three 4K displays, each on a separate DisplayPort (DP) 1.2 'source'. If it takes two 'sources' to drive one 5K display, then you only have one left, and that one maxes out at 4K.

Because a single DP 1.2 'source' isn't enough, you have to use "overkill" to drive the 5K. That 'excess' works against being able to use it on something else. The dual cables' likely matched-length requirement is also a limitation that basically sinks a heavy physical port allocation into the 5K display.

However, a 5K primary plus a 4K 'augment'/'other stuff' screen setup is not hobbled, especially if the content relegated to the 4K screen is 'raw' image pixels (photo/video) that don't need 'doubling' to remove jaggy renderings.
 