Could you please provide the math on how you end up with 0.7 billion colors for 10 bit?
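A quick sketch of the arithmetic being asked for (an editor's illustration, in Python): per-channel bit depth is cubed across the three RGB channels, so 10 bit works out to ~1.07 billion colors, and a flat 0.7 billion (700 million) does not correspond to any whole per-channel depth.

```python
# Distinct colors at a given per-channel bit depth: (2**bits)**3 for RGB.
for bits in (8, 10):
    colors = (2 ** bits) ** 3
    print(f"{bits} bit: {colors:,} colors (~{colors / 1e9:.2f} billion)")

# 8 bit:  16,777,216 colors (~0.02 billion)
# 10 bit: 1,073,741,824 colors (~1.07 billion)
# "0.7 billion" (700 million) sits between the two: log2(7e8)/3 ≈ 9.8 bits.
```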
Someone posted this. Read previous comments....
[attached screenshot]
 
Ahahahah LG, Sony and Samsung are supporting 1.07 billion colors and Apple only 0.7. Sneaky bastards! 😂


It states 700 million colours, "and then it goes one better, recording in Dolby Vision". Meaning more... much more! Do some research before trolling forums. It records in the 700 million colours before Dolby Vision even kicks in and starts working.

You own an LG V50 ThinQ, trying to make yourself feel better because it's not as good as the latest from Apple?
 
Someone posted this. Read previous comments....
Sigh... you have to start understanding this stuff. What a chip captures has nothing to do with Dolby Vision or HDR in that first step. The result of what is captured is then converted to video with HDR or Dolby Vision and static or dynamic metadata, and can be, in the case of Dolby Vision, up to 12 bit.
Given that, I wonder what you say about LG using Sony's IMX sensors capturing LOG video with 8-bit and 4:2:0 subsampling. That's what's captured, and it has nothing to do with the fact that it's stored at 10 bit or even 12 bit in memory.
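A minimal numpy sketch of the distinction drawn here (illustrative values only, not LG's or anyone's actual pipeline): an 8 bit capture can be shifted into a 10 bit container, and the container then reports 10 bit even though only 256 distinct levels per channel carry information.

```python
import numpy as np

# Hypothetical 8 bit sensor readout: 256 distinct levels per pixel.
capture_8bit = np.random.randint(0, 256, size=(2160, 3840), dtype=np.uint16)

# Scale into a 10 bit container (0..1023) with a 2-bit left shift.
stored_10bit = capture_8bit << 2

print(stored_10bit.max() <= 1023)           # True: valid 10 bit range
print(len(np.unique(stored_10bit)) <= 256)  # True: still only 256 levels
```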
 
Sigh... you have to start understanding this stuff. What a chip captures has nothing to do with Dolby Vision or HDR in that first step. The result of what is captured is then converted to video with HDR or Dolby Vision and static or dynamic metadata, and can be, in the case of Dolby Vision, up to 12 bit.
Given that, I wonder what you say about LG using Sony's IMX sensors capturing LOG video with 8-bit and 4:2:0 subsampling. That's what's captured, and it has nothing to do with the fact that it's stored at 10 bit or even 12 bit in memory.
You're almost at the point of saying that Dolby Atmos is better than Quad DAC! 🤣
See my metadata. It's recorded by a 10 bit sensor in 10 bit HVC1 with BT.2020 (Rec. 2020) color, 1.07 billion colors. That's better than your Dolby 0.7, I'm telling you. You're at the point of saying next that Dolby Atmos is better than Quad DAC 😂
 
Are you actually thick?! Seems you are struggling to even read and understand what sentences mean.

It states 700 million colours, "and then it goes one better, recording in Dolby Vision". Meaning more... much more! Do some research before trolling forums. It records in the 700 million colours before Dolby Vision even kicks in and starts working.

You own an LG V50 ThinQ, trying to make yourself feel better because it's not as good as the latest from Apple?
What's to kick in, a software codec? 😂 The sensor tops out at 0.7 billion colors.
 
You're almost at the point of saying that Dolby Atmos is better than Quad DAC! 🤣
See my metadata. It's recorded by a 10 bit sensor in 10 bit HVC1 with BT.2020 (Rec. 2020) color, 1.07 billion colors. That's better than your Dolby 0.7, I'm telling you. You're at the point of saying next that Dolby Atmos is better than Quad DAC 😂
You still do not understand how this stuff works. Your metadata is showing what is stored after processing, not what is actually captured from the sensor. It's already pre-processed at this point. The camera sensor does not capture HVC1/HEVC, it captures raw data. In order to know what the sensor is capturing, you actually have to look at the signal output of the chip itself in a lab or, in this case, listen to what the LG engineers, and not their marketing department, are saying, which is 8 bit LOG video with 4:2:0 subsampling.
 
You still do not understand how this stuff works. Your metadata is showing what is stored after processing, not what is actually captured from the sensor. It's already pre-processed at this point. The camera sensor does not capture HVC1/HEVC, it captures raw data. In order to know what the sensor is capturing, you actually have to look at the signal output of the chip itself in a lab or, in this case, listen to what the LG engineers, and not their marketing department, are saying, which is 8 bit LOG video with 4:2:0 subsampling.
Meanwhile, back in 2017...
First 10 bit camera sensor. Are you blind?
It's in the first freaking post!
 
Meanwhile, back in 2017...
First 10 bit camera sensor. Are you blind?
Oh jeez... it doesn't mean that it's reading out at 10 bit. Is that so hard to understand? Have you actually ever worked in hardware or software development, so you can understand this without quoting marketing numbers? You sound worse than some of the first-semester know-it-all students I teach at the university. Here's a link to a video-related article on the V30, also explaining how it captures 8 bit log video, as I've said before. LG also confirmed this to the authors, similar to LG engineers confirming it to me and others. https://www.nextpit.com/in-depth-lg-v30-perfect-video-smartphone
 
And the V30, V40, V50, G8 and V60 are all using 10 bit sensors. I don't know where you got that 8 bit from. 😂
 
And the V30, V40, V50, G8 and V60 are all using 10 bit sensors. I don't know where you got that 8 bit from. 😂
From LG engineers and readouts in the lab. They capture in 8 bit. And guess what, I have access to 12 bit sensors in my research lab (and even higher)... surprise, the 12 bit sensors capture video at a shocking 10 bit. 😲
 
From LG engineers and readouts in the lab. They capture in 8 bit. And guess what, I have access to 12 bit sensors in my research lab (and even higher)... surprise, the 12 bit sensors capture video at a shocking 10 bit. 😲
10 bit sensor, 10 bit output file. What's next? Trust me, this video only plays on LG phones.
[attached screenshot]
 
10 bit sensor, 10 bit output file. What's next? Trust me, this video only plays on LG phones.
See the post above. Capture 8 bit, written to memory as 10 bit; the encoded file is 10 bit from an 8 bit capture. This is pre-processed. It's an MP4 file, which is already encoded, not RAW pixel data. At least you have confirmed multiple times now that you don't understand at all how this works. I can also take a 2k source and upconvert it to 4k or 8k. I can also take a Rec709 source and convert it to BT2020. And guess what, the properties of such a file would also read 8k resolution and BT2020 color. Shocking, isn't it?

Again, I would recommend taking the sensor to a lab and reading RAW data from the chip itself without any type of processing, but given you have absolutely no engineering knowledge, this is wasted time, as you would never be able to do it. And guess what, that is ok. But some people have the ability to learn from others who are more knowledgeable... you have demonstrated you're not one of them. Maybe in the future, if you try hard enough.

Edit: While I'm at it... your Atmos and Quad DAC comparison is very amusing, as these are completely different things for different purposes. Again, you don't understand this. It's like comparing a TV to a car. Both have cables and such, yet the use case is something completely different.
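A small numpy sketch of the conversion point made above (editor's illustration with made-up sizes): naively upscaling a 2k frame to 8k makes the file report 8k, but no new detail exists.

```python
import numpy as np

# Hypothetical 2k source frame (1080p, 8 bit RGB).
src = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

# Naive 4x nearest-neighbour upscale to "8k" (4320 x 7680).
up = src.repeat(4, axis=0).repeat(4, axis=1)

print(up.shape)  # (4320, 7680, 3): the file would now report 8k...
# ...but every 4x4 block is one source pixel; no detail was added.
assert np.array_equal(up[::4, ::4], src)
```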
 
So we have a 10 bit sensor, next the Snapdragon 855 capturing in 10 bit, not even the 865, then the 10 bit codec also, ALL CAPTURING IN 10 BIT. What's more now? SENSOR IS, SNAPDRAGON IS, CODEC IS!
[attached screenshot]
 
And when I think that everyone was saying how good the iPhone 11 Pro was... the best on any phone EVER... yet a February 2019 device can do what one not released until almost 2021 can... 😂
 
So we have a 10 bit sensor, next the Snapdragon 855 capturing in 10 bit, not even the 865, then the 10 bit codec also, ALL CAPTURING IN 10 BIT. What's more now? SENSOR IS, SNAPDRAGON IS, CODEC IS!
Read again, same problem as with Dolby Vision. You claimed DV is 12 bit. It is up to 12 bit. There's no 12 bit DV content. It's all 10 bit. What you posted here is exactly the same issue, an "up to 10 bits" video capture feature, yet in reality it's only capturing in 8 bit.
 
Read again, same problem as with Dolby Vision. You claimed DV is 12 bit. It is up to 12 bit. There's no 12 bit DV content. It's all 10 bit. What you posted here is exactly the same issue, an "up to 10 bits" video capture feature, yet in reality it's only capturing in 8 bit.
Yeah, right. All 3 things supporting it, my videos can't be played on iPhones or Samsungs, only on newer LG phones, yet somehow it's 8 bit, because you say so. Mr. Professor, what more can I say. Apple is the FIRST, EVER and on ANY. 🤣🤣🤣
P.S. Please do an exercise in imagination and reflect on whether the iPhone 11 was the best video recording phone, now that it has aged.
 
All 3 things supporting it, my videos can't be played on iPhones or Samsungs, only on newer LG phones, yet somehow it's 8 bit, because you say so. Mr. Professor, what more can I say.
Again, you demonstrate perfectly that you do not understand how it works. Your video is 10 bit in an encoded file, but it originates from 8 bit raw data. Do you actually believe the sensor is spitting out an MP4 file? Two completely different things. Same with the example I've already described: converting 2k Rec709 to 8k BT2020 works, and guess what, you need an 8k- and BT2020-capable device to play it once converted, even though it originated from 2k Rec709. Please educate yourself just a little before posting these things. At least it's a win for LG, because people who don't know any better are falling for the marketing, as you've demonstrated.
 
Dolby Vision *supports* 12-bit color depth, but there is no hardware on Earth that can display the full Dolby Vision color range at this time. Not even $45,000 reference monitors.

Adopting the Dolby Vision format means that Apple has plenty of overhead as hardware improves in the future.
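For scale, the arithmetic behind that headroom (editor's note): 12 bit per channel is 2^36 colors, 64 times the ~1.07 billion of 10 bit.

```python
# Headroom going from 10 bit to 12 bit per channel:
print((2 ** 12) ** 3)                      # -> 68719476736 (~68.7 billion colors)
print((2 ** 12) ** 3 // ((2 ** 10) ** 3))  # -> 64 (64x the colors of 10 bit)
```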
 
Dolby Vision *supports* 12-bit color depth, but there is no hardware on Earth that can display the full Dolby Vision color range at this time. Not even $45,000 reference monitors.
Dolby had two demonstrators displaying 12 bit, the PRM-4200 and 4220 if I remember correctly. But that's pretty much it. The Barco Thor supports 12 bit on the dual HD-SDI input cards, and the internal test patterns are 12 bit (at least some of them). I'm not sure what would happen internally when inputting a 12 bit 4:4:4 signal. Different price range though: the Barco Thor starts at around $600k and can go higher depending on configuration. Even the new Christie flagship is 10 bit; the smallest version (least brightness) starts at around $300k, and the brightest is more expensive than a Thor.
 
Your video is 10 bit in an encoded file, but it originates from 8 bit raw data.
Hi, again! I still don't understand why a 10 bit sensor would capture in 8 bit if it's in an LG phone, yet if it's in an Apple phone (which, BTW, also uses LG Innotek-assembled sensors) it will "miraculously" do it in 10 bit (which is in fact 8+2 bit)?
P.S. All this time I was referring to the Korean models; idk if the option is enabled in the US ones.
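"8+2 bit" usually refers to reaching 10 bit-like precision from an 8 bit base plus 2 bits' worth of dithering (FRC on displays). A toy numpy sketch of that idea (an editor's illustration, not a claim about what Apple's pipeline actually does):

```python
import numpy as np

rng = np.random.default_rng(0)

# A 10 bit level with no exact 8 bit representation.
target = 513 / 1023  # normalized intensity

# Temporal dithering: every frame is rounded to 8 bit with random dither,
# yet the average over many frames converges on the 10 bit target.
frames = np.round(target * 255 + rng.uniform(-0.5, 0.5, size=10_000))
print(frames.mean() / 255)  # ~0.5015, close to 513/1023 ≈ 0.50147
```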
 