> This should not be expected. H.265 should deliver comparable quality at half the bitrate of H.264. If you set the same bitrate on both encoders, can you tell the difference in quality?

You're right that it should deliver similar quality, but that doesn't seem to happen when you encode using hardware. The image below should clear things up (commands to reproduce the test are sketched at the end):
You'll probably need to open the image separately and zoom in to see the differences.
1. This is a screen grab from a screen recording of MKBHD's video playing back at 1440p. Details are sharp, but the file is extremely bloated since it's an on-the-fly screen recording.
2. After compressing (software encoding), there is a noticeable decrease in quality and sharpness, but a good amount of detail still remains.
3. This is hardware encoding at a bitrate that should match that of 2 (it came out slightly different, probably because of the way I did the math; see the sketch after this list). Note that the quality is slightly worse here, but I presume that on average it should be the same as 2, so in other frames the quality should be slightly better.
4. This is pretty much as good as 2, which is impressive given that it's only 65% of the size of 2. It's what we expect from HEVC. However, it takes longer to encode.
5. This is HEVC hardware encoding, and you can immediately tell it's way worse than any of the others. It's even more apparent when you're watching the video: it is very evidently "blocky" and distracting. This pretty much shows how bad HEVC hardware encoding is if you want a comparable file size.
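
The bitrate matching in 3 is just the standard average-bitrate formula: file size in bits divided by duration in seconds. Here's a minimal sketch of that calculation, assuming ffprobe (which ships with ffmpeg) is on your PATH; the function name is my own:

```python
import json
import os
import subprocess

def average_bitrate_kbps(path: str) -> float:
    """Overall average bitrate in kbit/s: file size in bits / duration in seconds."""
    # ffprobe reports container-level metadata, including duration, as JSON
    probe = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json", "-show_format", path],
        capture_output=True, text=True, check=True,
    )
    duration = float(json.loads(probe.stdout)["format"]["duration"])
    return os.path.getsize(path) * 8 / duration / 1000
```

One plausible reason the numbers in 3 don't line up exactly: total file size includes audio and container overhead, while the encoder's target bitrate only budgets the video stream.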
Edit: looks like this forum resized my original image, but you can still tell the differences quite easily.
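
If you want to reproduce the test yourself, below is a rough sketch of how the four encodes could be produced with ffmpeg. To be clear, these are assumptions, not the exact settings behind the image above: the hardware encoder names assume macOS VideoToolbox (on NVIDIA you'd use h264_nvenc / hevc_nvenc instead), and the input file and target bitrate are placeholders.

```python
import subprocess

SOURCE = "screen_recording.mov"  # placeholder for file 1, the bloated capture
TARGET_KBPS = 12000              # placeholder H.264 target; derive it with the math above

# Output name -> encoder arguments. The HEVC targets mirror the ~65% size
# ratio mentioned in 4; -tag:v hvc1 keeps HEVC MP4s playable in QuickTime.
encodes = {
    "2_h264_software.mp4": ["-c:v", "libx264", "-b:v", f"{TARGET_KBPS}k"],
    "3_h264_hardware.mp4": ["-c:v", "h264_videotoolbox", "-b:v", f"{TARGET_KBPS}k"],
    "4_hevc_software.mp4": ["-c:v", "libx265", "-b:v", f"{int(TARGET_KBPS * 0.65)}k", "-tag:v", "hvc1"],
    "5_hevc_hardware.mp4": ["-c:v", "hevc_videotoolbox", "-b:v", f"{int(TARGET_KBPS * 0.65)}k", "-tag:v", "hvc1"],
}

for out_name, codec_args in encodes.items():
    # -an drops audio so file sizes reflect only the video streams
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *codec_args, "-an", out_name], check=True)
```

Note that single-pass `-b:v` is average bitrate control, so per-frame quality fluctuates around the target. That's why 3 can look slightly worse than 2 on this particular frame and slightly better on others.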