JohnDoe12

macrumors member
Original poster
Nov 14, 2017
71
52
This should not be expected. H.265 should deliver comparable quality at half the bitrate of H.264.

If you set the same bitrate on both encoders, can you tell a difference in quality?
You're right that it should deliver similar quality, but it seems like that doesn't happen when you encode using hardware. The image below should clear things up:

[Image: encode.png]


You probably need to view this image separately and zoom in to tell a difference.

  1. This is a screen grab from a screen recording of MKBHD's video playing back at 1440p. Details are sharp but the file is extremely bloated, being an on-the-fly screen recording.
  2. After compressing, there is a noticeable decrease in quality and sharpness, but a good amount of detail still remains.
  3. This is hardware encoding at a bitrate that should match that of 2 (although it's slightly different, probably because of the way I did the math; see the sketch at the end of this post). Note that the quality is slightly worse here, but I presume that on average the quality should match 2, so in other frames it should be slightly better.
  4. This is pretty much as good as 2, which is impressive given that it's only 65% of the size of 2. It's what we expect from HEVC. However, it takes longer to encode.
  5. This is HEVC hardware encoding, and you can immediately tell it's far worse than any of the others. It's even more apparent when you're watching the video; it is very evidently "blocky" and distracting. This pretty much shows how bad HEVC hardware encoding is if you want a comparable file size.
What I find very surprising is that H.264 hardware encoding is significantly better than H.265 hardware encoding. Perhaps it's just my Intel Mac?

Edit: looks like this forum resized my original image, but you can still tell the differences quite easily.
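For reference, this is roughly the math I mean, written out as a quick sketch. The file size and duration are made-up placeholder values rather than the actual numbers from my test, and I ignored the audio track and container overhead (which is probably where my small discrepancy comes from):

```swift
import Foundation

// Hypothetical numbers standing in for encode #2; not my real measurements.
let referenceSizeBytes = 180_000_000.0   // size of encode 2 on disk, in bytes
let durationSeconds = 300.0              // length of the clip

// Average bitrate in bits per second. Audio and container overhead are
// ignored here, which is why the real number comes out slightly different.
let targetBitsPerSecond = referenceSizeBytes * 8.0 / durationSeconds
print("Target bitrate: \(Int(targetBitsPerSecond / 1000)) kb/s")   // ~4800 kb/s
```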
 

JohnDoe12

macrumors member
Original poster
Nov 14, 2017
71
52
I thought this was always a known tradeoff of hardware encoding -- much faster but at the expense of quality
Originally I thought hardware encoding meant that there was hardware within the CPU that allowed for significantly faster encoding at the same quality. Basically like how a CPU can perform GPU-style tasks itself, but the GPU acts as the dedicated "hardware accelerator" for them. But I guess I'm wrong.

Strange, however, that hardware encoding for H.264 seems to be almost as good as software encoding, but at 4x the speed. Can't say the same for H.265... weird
 

Gnattu

macrumors 65816
Sep 18, 2020
1,107
1,672
I thought this was always a known tradeoff of hardware encoding -- much faster but at the expense of quality
It should not be that bad. You do lose some quality, but you should not lose as much as shown in the picture.
 

Gnattu

macrumors 65816
Sep 18, 2020
1,107
1,672
Basically like how a CPU can perform GPU-style tasks itself, but the GPU acts as the dedicated "hardware accelerator" for them. But I guess I'm wrong.
Video encoders are written as "software", and the CPU is what executes software. In other words, video encoding is originally a CPU task.

As for GPU acceleration, there are two kinds:

The first is relatively old and I don't think it is widely used anymore: it uses the GPU's shaders to compute the encoding.

The second is to integrate dedicated hardware circuits into the GPU as a "media codec engine" and use that block of circuitry to perform the encoding. This is the most common hardware-accelerated encoding technique now.
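On macOS, that dedicated block is what you get when you ask VideoToolbox for a hardware encoder. A rough Swift sketch of requesting it explicitly (the resolution and bitrate are arbitrary example values, and real code would also feed frames in and handle errors):

```swift
import Foundation
import VideoToolbox

// Ask VideoToolbox for the dedicated media-engine encoder block.
var session: VTCompressionSession?

let encoderSpec: [CFString: Any] = [
    // Fail instead of silently falling back to a software encoder.
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true
]

let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 2560,
    height: 1440,
    codecType: kCMVideoCodecType_HEVC,   // or kCMVideoCodecType_H264
    encoderSpecification: encoderSpec as CFDictionary,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,
    refcon: nil,
    compressionSessionOut: &session
)

if status == noErr, let session = session {
    // Target average bitrate, in bits per second (example value).
    VTSessionSetProperty(session,
                         key: kVTCompressionPropertyKey_AverageBitRate,
                         value: NSNumber(value: 4_000_000))
    print("Got a hardware HEVC encoder")
} else {
    print("No hardware encoder available (status \(status))")
}
```

With the "require" key set, session creation fails outright when no hardware encoder block is available, instead of quietly falling back to software.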
 

rui no onna

Contributor
Oct 25, 2013
14,920
13,264
Originally I thought hardware encoding meant that there was hardware within the CPU that allowed for significantly faster encoding at the same quality. Basically like how a CPU can perform GPU-style tasks itself, but the GPU acts as the dedicated "hardware accelerator" for them. But I guess I'm wrong.

Strange, however, that hardware encoding for H.264 seems to be almost as good as software encoding, but at 4x the speed. Can't say the same for H.265... weird

That's because hardware H.265 encoding is relatively new, while they've already managed to refine H.264. HW-accelerated H.264 encodes using Intel Sandy Bridge QuickSync don't look that nice either.


The first is relatively old and I don't think it is widely used anymore: it uses the GPU's shaders to compute the encoding.

I remember trying this with Nvidia and it looked like garbage (macroblocks galore). QuickSync was still bad but at least it was watchable.
 

RulerK

macrumors newbie
Oct 6, 2009
3
0
I have two things to add. I've been hardware encoding on my Intel MacBook Pro for the past few years. It is MUCH faster than standard x264 or x265 encodes (obviously, I've been using x265 since it became available). However, I recently discovered that the video quality, while high, is EXTREMELY *muddy*, as if there's a gray filter applied to the whole image, and that's why it's able to deal with the bitrate so quickly.

Additionally, I just switched to a brand-new M1 iMac with the new Handbrake beta. The VideoToolbox encodes are still muddy as hell, and on top of that, at least so far, they are not encoding ANY faster than a standard x265 encode (~20 fps on a 4K file at 15-20 Mb/s). What's more, I'm running a long file right now and it has been running like absolute crap after an hour, I'm literally only getting 4 fps!!!! WTF!? This isn't what I just shelled out $2k for a 16GB/1TB iMac for! Can anyone shed some light? Other things **seem** to run much better, but I haven't been able to test them thoroughly. Also, I only **JUST** upgraded to Big Sur on my MacBook, so I have no experience comparing Catalina to Big Sur speeds there.
 