I know CDs use the Compact Disc Digital Audio standard, but are music CDs really lossless, or are they compressed like DVDs?
DVDs use lossy compression (MPEG-2), and Blu-rays also use lossy compression (typically MPEG-4). CDs are not compressed; they are raw 44.1 kHz / 16-bit LPCM.
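To put rough numbers on the "not compressed" part, here's a quick back-of-envelope sketch in Python (only the CD figures come from the Red Book spec; the lossy bitrates in the comment are just ballpark comparisons I'm assuming):

```python
# Back-of-envelope math: the data rate of uncompressed CD audio.
sample_rate = 44_100     # samples per second, per channel
bit_depth = 16           # bits per sample
channels = 2             # stereo

cd_kbps = sample_rate * bit_depth * channels / 1000
cd_mb_per_min = sample_rate * (bit_depth // 8) * channels * 60 / 1e6

print(f"CD LPCM: {cd_kbps:.1f} kbps (~{cd_mb_per_min:.0f} MB per minute)")
# -> 1411.2 kbps (~11 MB per minute), versus roughly 128-320 kbps
#    for typical lossy music formats like MP3/AAC.
```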
Does DVD MPEG-2 toss out more info than Blu-ray MPEG-4, since I hear MPEG-4 is a better, more efficient codec?
I think this is inaccurate. Most films are shot on, well, film, so you can't really say it's "lossy" the way they are encoded onto other media; it's just a different format, a digital one vs. an analog one. Lossy or non-lossy compression doesn't apply here.
Also, there is a lot of HD camera content that is itself in MPEG-2 or MPEG-4, so when it's transferred to DVD or Blu-ray you can't say it's a lossy process, since nothing is lost from the original, which is itself MPEG-2 or MPEG-4.
No, it's accurate. MPEG-2 and MPEG-4 are lossy. Raw, lossless, exactly-what-the-sensor-captured HD footage is HUGE. MPEG-4 can contain lots of different formats, but as far as I know they're all lossy.
I believe it's only consumer-level cameras that shoot "HD" footage in MPEG-2 or MPEG-4. I believe when writing to DVD / Blu-ray there's another encoding pass, so you're actually piling lossy upon lossy compression if you're starting with MPEG footage.
Of course they are lossy formats, there's no question about that. But film-to-digital conversion of any kind is by definition "lossy"; that's what I am saying.
Plus, like I said, surely there is material already recorded in MPEG-2 or MPEG-4 that just gets put on the DVD without any further transcoding.
CDs are not compressed, they are quantized.
Hey dxtc, thanks for these posts, very informative.
I had read about that loudness "plague" on Wikipedia some time ago. Does that mean, simplistically, that the loudest signals max out while the quieter ones are made louder disproportionately to the increase in the louder signals?
Transients also lose their attack/impact because everything is at the same level. It turns the signal into a brickwall with no dynamic contrast, which the brain interprets as noise, increasing listening fatigue and making it difficult to hear "through" the music. In my opinion, digital brickwall limiting/clipping is one of the worst things ever to happen in the music industry.
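If anyone wants to see that in numbers rather than words, here's a toy Python/NumPy sketch (a made-up signal, not any real master) showing how hard clipping collapses the crest factor, i.e. the gap between the peaks and the average level:

```python
# Toy sketch: hard-clip a boosted signal and watch the crest factor
# (peak-to-RMS gap) collapse, which is roughly what heavy brickwall
# limiting does to a track's dynamics.
import numpy as np

t = np.linspace(0, 1, 44_100, endpoint=False)
signal = 0.3 * np.sin(2 * np.pi * 220 * t)   # steady "music" bed
signal[::4410] += 0.7                        # sharp transients every 0.1 s

def crest_factor_db(x):
    peak = np.max(np.abs(x))
    rms = np.sqrt(np.mean(x ** 2))
    return 20 * np.log10(peak / rms)

# "Loudness war" move: crank the gain, then clip everything above full scale
clipped = np.clip(signal * 3.0, -1.0, 1.0)

print(f"original crest factor: {crest_factor_db(signal):.1f} dB")
print(f"clipped  crest factor: {crest_factor_db(clipped):.1f} dB")
# The transients that used to poke well above the average level are now
# flattened into the same ceiling as everything else.
```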
Also, in listening tests, would a higher sampling rate than the one used on CDs provide a better listening experience? Would that include older master tapes re-mastered at higher-than-CD sample rates?
Let's throw in another complicating factor, shall we?
Standard CD audio is 16-bit, 44.1 kHz PCM. Each sample is explicitly defined in the data; it is not approximated using a set of mathematical algorithms (which is what lossy compression does).
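For the curious, here's a tiny sketch using Python's standard wave module to show what "explicitly defined" means in practice: the 16-bit values you compute are exactly the values stored in the file (the tone.wav filename and the mono 440 Hz tone are just my example, not anything from a real disc):

```python
# Every 16-bit sample value is written to the WAV file verbatim;
# nothing is approximated or thrown away.
import math
import struct
import wave

samples = [int(32767 * 0.5 * math.sin(2 * math.pi * 440 * n / 44100))
           for n in range(44100)]          # one second of a 440 Hz tone

with wave.open("tone.wav", "wb") as f:
    f.setnchannels(1)          # mono, just to keep the example small
    f.setsampwidth(2)          # 2 bytes = 16 bits per sample
    f.setframerate(44100)      # CD sample rate
    f.writeframes(struct.pack(f"<{len(samples)}h", *samples))

with wave.open("tone.wav", "rb") as f:
    raw = f.readframes(3)
print(samples[:3], struct.unpack("<3h", raw))   # identical values back out
```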
Back when CDs first came out, that was one of the highest-resolution digital formats available to studios.
However, nowadays most studios-- even "bedroom" ones-- can easily record at 24-bit or even 32-bit depth, with sample rates often at 48 kHz or 96 kHz. The majors sometimes use 192 kHz, which of course sounds awesome at the mixing desk. The problem is this: the waveform data of the final mixdown must be resampled downward to fit the relatively old CD standard for mass retail distribution.
Thus, bits are lost, even before the product hits the shelves. Even on a CD, we're not hearing what the artist or engineers heard in the studio.
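As a rough illustration of that downward conversion (a toy sine in place of a real mixdown, resampling via SciPy and a simple TPDF dither, so treat it as a sketch rather than a real mastering chain):

```python
# Sketch of a hi-res mixdown on its way to CD: resample 96 kHz -> 44.1 kHz,
# then requantize to 16-bit with dither.
import numpy as np
from scipy.signal import resample_poly

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 96_000, endpoint=False)
mix_96k = 0.5 * np.sin(2 * np.pi * 1000 * t)       # stand-in for the mixdown

# 44100 / 96000 = 147 / 320, so resample by that rational factor
cd_rate = resample_poly(mix_96k, up=147, down=320)

# Requantize to 16-bit: add triangular (TPDF) dither of +/- 1 LSB, then round
lsb = 1.0 / (2 ** 15)                              # one 16-bit step at full scale
dither = (rng.random(cd_rate.size) - rng.random(cd_rate.size)) * lsb
cd_16bit = np.round((cd_rate + dither) / lsb) * lsb

print(f"samples: {mix_96k.size} -> {cd_16bit.size}")   # 96000 -> 44100
print(f"quantization error RMS: {np.sqrt(np.mean((cd_16bit - cd_rate) ** 2)):.2e}")
```

The error is tiny, but it's there: the extra resolution recorded in the studio simply doesn't fit on the disc.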
There's another "compression" on most recordings that hasn't been mentioned yet.
The studio's final mixed recording (aka the final mixdown), in whatever format, must be "mastered" for retail distribution-- a sort of post-processing, if you will. During this step, recordings are often run through a sound processor called a "compressor." This effectively makes low-volume sounds louder, and relatively loud sound peaks quieter without overloading the audio signal (clipping). This evens out the sound volume, at the cost of limiting dynamic range and detail in the final retail product.
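Here's a bare-bones sketch of the static gain curve of such a compressor (the threshold, ratio, and make-up gain are hypothetical values I picked to make the numbers obvious, not from any real mastering session):

```python
# Downward compression: level above a threshold is scaled down by a ratio,
# then make-up gain lifts the whole signal, so quiet parts end up
# relatively louder.
import numpy as np

def compress_db(level_db, threshold_db=-20.0, ratio=4.0, makeup_db=10.0):
    """Input level (dBFS) -> output level (dBFS) for downward compression."""
    over = np.maximum(level_db - threshold_db, 0.0)   # how far above threshold
    return level_db - over * (1.0 - 1.0 / ratio) + makeup_db

for level in (-40.0, -20.0, -10.0, 0.0):
    print(f"in {level:6.1f} dBFS -> out {compress_db(level):6.1f} dBFS")
# Quiet passages (-40 dBFS) come up by the full 10 dB of make-up gain, while
# 0 dBFS peaks actually come out 5 dB lower: 40 dB of input range is squeezed
# into 25 dB, with nothing pushed past full scale (no clipping).
```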
What's worse...
There's a solid difference between over-compressing for an effect and brickwall limiting a master so that it's "louder" than other CDs.
I don't see any benefit to clipping transients just for loudness' sake. Don't get me wrong, I love side-chain compression a la Daft Punk, but we're talking about two completely different things here. I think MGMT would have benefited from not being brickwalled. I believe the vinyl master isn't as compressed as the CD master for their latest record, but don't quote me on that. I can go check though...
It stands to reason that the vinyl version would not be as compressed - you simply can't limit to the same extent on vinyl as you can on CD.
I agree - passionately - that arbitrarily limiting CDs to compete in 'loudness wars' is a Bad Thing. I was merely pointing out that the 'artifacts' of heavy limiting (as distinct from compression over the mix) are not necessarily a bad thing, depending on the music and personal taste. Although I guess, given the choice, I'd rather live in a world where a few songs were under-compressed than the current world where lots and lots of songs are over-compressed.
I guess my point is that over-compressing for the sake of pure loudness is bad; over-compressing because you like the sound of it is fine.
I should point out that I'm not the compression fiend that these posts make me sound like! If anything, I tend to be fairly moderate unless the music particularly calls for it. I just have an aversion to the phrase "you should never ..." in music production.