True, but the problem is there is very little true 4k content available now, and it will be years before 4k is even available from a satellite provider, if at all. Most companies have just upgraded to 1080 HD cameras to shoot their content, and that was expensive; now they would need to buy 4k-capable cameras? Not a chance anytime soon. It's also too much data for a cable connection to handle; most cable isn't even full 1080, it's 720. I believe the majority of people in the US have cable, not satellite, so there is that. Not to mention how difficult it would be to upload 4k video at its native size to online websites, and then the bandwidth needed to serve it back out for consumption on a mobile device. I really don't see 4k ever taking over. If it does, it will be 5 years at least.
TV will probably never be broadcast in 4k. They don't even broadcast in 1080p because of the bandwidth it would take.
I believe DirecTV broadcasts in full 1080, but you are right, I can't see 4k ever becoming the norm.
But Blu-ray and all the 1080p content we see is compressed, either with MPEG-2 or AVC. We wouldn't expect them to give us uncompressed files of the movies for 4k when there is no storage for it. In fact, a lot of the movies we will get in 4k will be brought down in resolution from their original source of 6k or even 8k, since a lot of movies are filmed higher than 4k. So that is a compromise already. But we use what we can get. The studios do a great job at encoding and compressing Blu-ray films with an AVC encode, and that usually takes 40GB. They could possibly encode 4k films with AVC/h.264 and end up with 90GB, but the new HEVC/h.265 encode will likely use those 90GB and yield far better results than an AVC encode would. The 100GB to 125GB discs are there if they decide to use them. h.265 will be far more efficient than h.264 is, the same way that AVC was far more efficient than MPEG-2.

Even with h.265, still compressed, a full-length movie would be in excess of 100-150GB. Blu-ray, whether AVC or MPEG-2, can be 20-40GB with its extras, audio codecs, etc. Quadruple that... and there's your answer. Anytime a codec is compressed, it's lossy. RAW 4k on a RED cam (internally compressed with a proprietary algorithm) for the original cinema camera will fill up a 480GB SSD with approximately 50 minutes of footage. Shooting in ProRes (also lossy, Apple's FCP-native format for easy editing) gets four hours on the same 480GB SSD. The Production 4k cam recording ProRes will only allow about 60-70 minutes on the same 480GB SSD, and about the same time recording CinemaDNG RAW. For the Pocket Cinema Camera, ProRes on a 128GB card will allow 70-80 minutes of footage, and 30 minutes on the same card shooting RAW, which really isn't 'raw' but looks damn good.

Keep in mind, they're now developing 5k and 6k cams. These are massive, MASSIVE files. Even if h.265 is everything it's promising, we are still looking at 'lossy' discs/cards/downloads of 80-200GB per flick! We are a ways off. Even though RED offers a deck for playback, there isn't any content now, nor in the near future. 1080p has a LOT more going for it and can still be improved upon.
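Just to sanity-check those numbers, here's a quick back-of-the-envelope sketch in Python. The data rates are simply implied by the figures quoted above (480GB in ~50 minutes of RED RAW, ~4 hours of ProRes on the same SSD, ~40GB for a 2-hour AVC Blu-ray); treat them as rough assumptions, not measured specs:

```python
# Back-of-the-envelope math for the storage figures quoted above.
# All rates are assumptions derived from those quoted figures, not specs.

def rate_gb_per_min(total_gb, minutes):
    """Average data rate implied by filling total_gb in `minutes`."""
    return total_gb / minutes

def movie_size_gb(rate, minutes=120):
    """Approximate size of a feature film recorded/encoded at `rate` GB/min."""
    return rate * minutes

red_raw = rate_gb_per_min(480, 50)    # RED RAW: ~480 GB in ~50 min -> ~9.6 GB/min
prores  = rate_gb_per_min(480, 240)   # ProRes:  ~480 GB in ~4 h   -> ~2.0 GB/min
bluray  = rate_gb_per_min(40, 120)    # AVC Blu-ray: ~40 GB per 2-hour film

for name, r in [("RED RAW", red_raw), ("ProRes", prores), ("AVC Blu-ray", bluray)]:
    print(f"{name:12s} ~{r:4.1f} GB/min -> ~{movie_size_gb(r):5.0f} GB for a 2-hour film")
```

Run that and a 2-hour film comes out around 1.1TB at RED RAW rates versus ~40GB for a Blu-ray encode, which is why even a 100-125GB disc still implies very heavy compression.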
I think I'm with the majority... 'Why?' Why would an OEM manufacture a 5" 4k display? Talk about an energy suck. As well, if we are already at 440-450ppi on these current 1080p displays (the iPhone is at 326), and we can't see pixels, everything is sharp as a tack, why in the world would someone want to release an ~880ppi 5" display for ANYTHING other than spec/bragging rights?
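For reference, pixel density is just the diagonal resolution divided by the diagonal size, so the figures in this thread are easy to check. A quick sketch (the screen sizes are assumed round numbers):

```python
# Pixel density = diagonal resolution in pixels / diagonal size in inches.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

print(f"iPhone 5, 1136x640 @ 4.0\":  {ppi(1136, 640, 4.0):.0f} ppi")    # ~326
print(f"1080p @ 5.0\":               {ppi(1920, 1080, 5.0):.0f} ppi")   # ~441
print(f"2560x1440 @ 5.25\":          {ppi(2560, 1440, 5.25):.0f} ppi")  # ~559
print(f"4K (3840x2160) @ 5.0\":      {ppi(3840, 2160, 5.0):.0f} ppi")   # ~881
```

That's where the ~326, ~440 and ~880ppi numbers come from, and why the rumored 2560 x 1440 panel works out to ~559ppi.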
I don't disagree. I've played with, color corrected, transcoded and viewed a lot of 4k content on displays my wife would never let me purchase, shot with cameras that cost as much as my truck, editing on computers worth as much as my cabin! It's a very expensive endeavor right now, and without distribution channels, with limited bandwidth and no realistic speed to download a full movie, or a medium to sell it on, I'm not so sure we aren't at least 10-20 years from renting a 4k flick on demand or picking one up at the Redbox.
Precisely. But there are ways to negate a lot of the information (visually) that the human eye can't distinguish at 24 or 29.97fps. Just like audio. SACD sure didn't take off. Nor did HD-DVD. The sound is phenomenal... but can't people only hear 20Hz-20kHz? I'm with ya. I listen to vinyl at home, and the difference between analog and digital, to me as a sound geek, is night and day. Decay of a piano key, splash of a cymbal, overtones on guitars and horns... so much we might not be 'hearing' consciously, but subconsciously I believe our brains DO hear music and sounds below 20Hz and above 20,000Hz, even as we get older and our hearing can objectively be measured and shown to have degraded. RAW motion, stills and analog recordings are so much more 'real'. A demo at Best Buy on their 4k displays is incredible if you don't have access any other way... but on a phone? And the power in the SoC needed to make that work graphically without glitches would be astronomical.
But then again...what do I know. Samsung certainly knows how to make displays. I love my rMBP. That said, IMO....they should stick with their 350-450ppi displays, refine the OLED saturation issues and spend the other 98% of the time refining 'TouchWiz'
Why not? Isn't it great that they increased the resolution of the iPad mini 2 from the first gen? Or do you think that is a waste?

Smartphone screens do not need any more pixels, and the cameras don't need more megapixels.
Riiiiight. So let's see you figure that out without 4K being made available first. If the market was made up of people with your mentality, then we'd still be stuck on VHS tapes and the like. You can thank me and the rest of the early adopters for partly feeding the motivation of innovators to keep bringing out newer and better tech. We early adopters help bring the cost down so the closed-minded whiners can jump on the wagon down the road. It really is that simple.
TV will probably never be broadcast in 4k. They don't even broadcast in 1080p because of the bandwidth it would take.
https://en.wikipedia.org/wiki/High_Efficiency_Video_Coding
It requires about 35% less bandwidth, so if we're lucky, there will be 1080p broadcasts instead of 720p on our 4K TVs.
Then again, the world's first 4K broadcast in HEVC was made two weeks ago, so don't expect this anywhere prior to the year 2125.
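To put that ~35% figure in rough numbers, here's a tiny sketch. The AVC baseline bitrates are assumed example values for illustration, not actual broadcaster figures:

```python
# Rough broadcast-bitrate comparison. The AVC baselines are assumed example
# values; the ~35% saving is the HEVC figure quoted above.
HEVC_SAVING = 0.35

avc_bitrates_mbps = {"720p": 8, "1080p": 15, "2160p (4K)": 45}  # assumed AVC baselines

for res, avc in avc_bitrates_mbps.items():
    hevc = avc * (1 - HEVC_SAVING)
    print(f"{res:12s} AVC ~{avc:4.1f} Mbps -> HEVC ~{hevc:4.1f} Mbps")
```

On those assumptions, a 1080p HEVC channel lands close to the bandwidth budget of today's 720p AVC channel, which is the point being made above.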
So we can put it in context: the iPhone 5's display is slightly below 720p and at 4 inches. Record a video in 720p or 1080p and you won't see the difference when you play it back on an iPhone, for obvious reasons. Play back the same video on a 5-inch 1080p display phone and you will be able to see the difference between the two videos... Now, record a 2k/1080p video and a 4k/2160p video and play them back on the Note 3 and you will not see a difference on the Note 3's 5.7-inch 1080p display. Later on, play those two videos on a 5.2-inch 4k/2160p display and all of a sudden you get to see all the extra detail that wasn't visible and had to be downconverted on the lower resolution displays.

Clearly you know nothing about PPI. 1080p at 5" is already over 400 PPI. You won't even see a difference going higher than that, so it'll just be an extra burden on the GPU, the battery and overall performance without ANY benefit for the user.
HDMI 2.0 will be capable of 4k/60fps. Also, like early adopters of HDTVs which only had component connections to view HD content, they know these risks and will likely upgrade when the technology improves.

There is a place for early adopters. It's buying things before they're fully baked, just because they want it first. There isn't even an HDMI spec integrated into equipment that includes 2160p at the higher refresh rates that would make these sets functional for broadcast TV and gaming (50-60 Hz). Right now, 2160p tops out at 30Hz (see the rough bandwidth math below), which will be fine for Blu-ray based on 24p film sources, but you'll get your ass handed to you in a Call of Duty game!
Also, there is no way to see the full range of colors that Rec 2020 makes possible on these currently available UHDTVs! Early adopters will handicap themselves when they see how realistically Rec 2020-compliant sets can display a picture.
But, what do I know?
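On the 30Hz ceiling mentioned above: it's basically a link-bandwidth problem, and the arithmetic is simple enough to sketch. The 4400 x 2250 total (active plus blanking) is the standard CEA-861 timing for 3840 x 2160; treat the rest as approximations:

```python
# Why 2160p topped out at 30 Hz before HDMI 2.0: TMDS link bandwidth.
# Total frame size includes blanking; 4400x2250 is the standard CEA-861
# timing for 3840x2160. 8b/10b encoding adds a 10/8 overhead.

def tmds_gbps(h_total, v_total, refresh_hz, bits_per_channel=8, channels=3):
    pixel_clock = h_total * v_total * refresh_hz           # pixels per second
    return pixel_clock * channels * bits_per_channel * 10 / 8 / 1e9

print(f"2160p30: {tmds_gbps(4400, 2250, 30):.1f} Gbps  (fits HDMI 1.4's ~10.2 Gbps)")
print(f"2160p60: {tmds_gbps(4400, 2250, 60):.1f} Gbps  (needs HDMI 2.0's 18 Gbps)")
```

2160p60 needs roughly 17.8 Gbps of TMDS bandwidth, which is why it only fits once HDMI 2.0's 18 Gbps link arrives, while 2160p30 squeaks under HDMI 1.4's ~10.2 Gbps.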
----------
Oh, I can. But it won't be broadcast TV as we know it. It will more likely be some sort of internet delivery. Displays are driving the delivery mechanisms into different forms that can provide the bandwidth for these gorgeous new sets.
Netflix is, of course, already testing 4K streaming content, and says they'll begin offering 4K (2160p) content next year.
"Samsung Electronics reportedly will launch flagship smartphone models equipped with 64-bit CPUs, WQHD displays and 16-megapixel cameras in 2014, which will further heat up hardware competition in the smartphone segment, according to industry sources."
So we can put it in context: the iPhone 5's display is slightly below 720p and at 4 inches. Record a video in 720p or 1080p and you won't see the difference when you play it back on an iPhone, for obvious reasons. Play back the same video on a 5-inch 1080p display phone and you will be able to see the difference between the two videos... Now, record a 2k/1080p video and a 4k/2160p video and play them back on the Note 3 and you will not see a difference on the Note 3's 5.7-inch 1080p display. Later on, play those two videos on a 5.2-inch 4k/2160p display and all of a sudden you get to see all the extra detail that wasn't visible and had to be downconverted on the lower resolution displays.
But you're saying that only a certain size screen should only have a certain resolution? A 4-inch iPhone should only have a resolution of 1136x640 and not 1920x1080 because the difference can't be seen? A 5-inch phone shouldn't exceed 1920x1080? The only way to see a 1080p image is on a 50-inch screen and 4k on an 85-inch screen, otherwise you can't see the difference?

Wrong. To go from around 300 ppi, such as on the iPhone, to the 400s, there will be a small difference, yes. However, once you get past a certain PPI, the pixels become too small to be seen and there will be NO difference past that threshold. You will NOT see a difference, except perhaps if you use a magnifying glass.
It's not only because of the pixel count; it has a lot to do with the source. If you play a 720p or a 1080p video on an iPhone you can't see a difference because the output will be down-converted to 1136x640. But if the resolution of the iPhone was increased to 1920x1080, you would be able to see the difference between those two videos. Same thing with the Note 3: it records videos at 3840x2160 but the phone only displays a 1920x1080 image (see the downscale sketch below). A while from now, when they increase the resolution of the display, you really think you won't be able to see a difference?
I would understand your argument if we were talking about displaying 1080p content on a 4k display and arguing that there will be no difference, but we're talking about having the proper content. Even the OS of the phones can be given far more detail and the picture quality will improve from our current phones. Just compare the Moto X to the HTC One, same size displays just different resolutions, and you will be able to see the difference. Same way we will be able to see the difference between our current 1920x1080 phones and the 2560x1440 phones once they are released next year.
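The down-conversion described above is conceptually simple: the panel can only show a quarter of the recorded pixels, so the scaler has to merge them. A minimal, purely illustrative sketch using NumPy (not how any particular phone's scaler actually works):

```python
# Minimal 2x2 box downscale: a 3840x2160 frame averaged down to 1920x1080.
# Illustrative only; real scalers use fancier filtering.
import numpy as np

frame_4k = np.random.randint(0, 256, size=(2160, 3840, 3), dtype=np.uint8)

# Group pixels into 2x2 blocks and average each block into one output pixel.
frame_1080p = frame_4k.reshape(1080, 2, 1920, 2, 3).mean(axis=(1, 3)).astype(np.uint8)

print(frame_4k.shape, "->", frame_1080p.shape)   # (2160, 3840, 3) -> (1080, 1920, 3)
print(f"pixels discarded per frame: {frame_4k[..., 0].size - frame_1080p[..., 0].size:,}")
```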
I think I've been pretty clear about sitting distances from a TV and the distance at which you view a phone to appreciate the quality throughout this thread. I haven't ignored that.

You're really just missing the whole aspect of this where the proper screen sizes at the proper distances matter JUST AS MUCH as the number of pixels built into the panel and the quality of the content you're watching.
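The distance point can be made concrete with the usual 20/20-vision rule of thumb (about one arcminute of angular resolution); the viewing distances below are assumed examples:

```python
# Distance-vs-density rule of thumb: a 20/20 eye resolves roughly one
# arcminute, so pixels smaller than that angle blend together. The
# one-arcminute figure is the standard acuity assumption, not a hard limit.
from math import tan, radians

def max_useful_ppi(viewing_distance_in):
    """Pixel density beyond which a 1-arcminute eye can't separate pixels."""
    return 1 / (viewing_distance_in * tan(radians(1 / 60)))

for label, d in [("phone at 10 in", 10), ("phone at 15 in", 15), ("TV at 8 ft", 96)]:
    print(f"{label:15s} -> ~{max_useful_ppi(d):.0f} ppi")
```

By that rule, density beyond roughly 300-350ppi buys little at typical phone-holding distances, while a TV across the room needs far less, which is why distance and screen size matter as much as the raw pixel count.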
While it may not be curved, the display of Samsung's upcoming Galaxy S5 is certainly going to be a step up from the screen that we can all admire on the Galaxy S4.
Korean website DDaily is reporting that Samsung Display started mass-producing 5.25-inch QHD (Quad HD) AMOLED panels for the new Galaxy S5. If that's true, the next Galaxy flagship's screen will have an insane pixel density of 559.47 PPI, resulting from the 2560 x 1440 resolution. It's said that the new display uses a diamond-like PenTile pixel arrangement (similar to the AMOLED screens of the S4 and Note 3), thus eliminating the color reproduction issues that older PenTile screens have been blamed for.
Obviously, we must note that reports from Korea aren't always accurate, so for now nothing is certain. And we assume nothing will be certain until Samsung actually announces the S5. Still, it wouldn't be a surprise for the new Galaxy to sport a 5.25-inch screen with 2560 x 1440 pixels. China's Vivo already released a handset with a 2560 x 1440 pixel display (albeit a larger one, measuring 6 inches).
Sure, some will argue that 5.25-inch is too big for a smartphone. However, if you read our LG G2 review you'll see that a 5.2-inch handset can still be comfortable for one-hand usage as long as it's well-designed.
Samsung could unveil the Galaxy S5 as early as February (at MWC 2014). We'll be there to keep you in the know.
http://www.phonearena.com/news/5.25...-Samsungs-Galaxy-S5-now-in-production_id50668