For your information, here are some facts for those interested:

- In the audio world "lossless" typically means lossless relative to CD quality, which is 16/44.1: a bit depth of 16 bits and a sample rate of 44.1 kHz.
- The sample rate determines the maximum frequency you can represent. A digital signal is a discrete (made up of samples) representation of a continuous signal (waves). To reconstruct a sine wave you need a bit more than two samples per period, which means the maximum frequency you can reconstruct is just under half the sample rate. This reconstruction is exact and not an approximation (as it is for image pixels); see the sketch after this list. In other words, with a sample rate of 44.1 kHz we can accurately reconstruct frequencies up to 22 kHz, well above the limit of human hearing. For reference, the highest note on a piano is 4186 Hz, and most speakers will not be able to produce signals over 22 kHz either.
- Does playing at 192 kHz make sense? Yes, if you like to play music for your dog and you have very high-end speakers with no other bottlenecks in the connection chain. Otherwise, absolutely not.
- The bit depth determines the dynamic range: the number of different volume levels you can represent. With 16 bits you can represent signals from whispering up to over 90 dB, enough to cause damage to your ears after long exposure.
- Is there an advantage to a bit depth higher than 16 bits? Yes, if you want to accurately represent fine detail ranging from whispering to explosions. For most pop/rock music there is no difference at all.
- Recordings are mostly done at higher sample rates and higher bit depths. Why? Not because we can hear a difference in the recording, but because it gives additional headroom during production: changing a signal inevitably results in some losses, and this way they can be minimized.
- AirPlay does support ALAC 16/44.1; in fact, if I am not mistaken, it transcodes all input to this format for transmission. I don't see any reason why HomePods would not be able to play lossless input streams. Whether you will hear a difference is another question...
- Some people seem to believe everything lossy is the same. This is obviously not the case; the codec and bitrate make a huge difference.
- Currently Apple uses 256 kbps AAC. Truth is, most people don't hear a difference from lossless (CD quality) either, especially on low-end equipment like HomePods. However, there are definitely people who can hear a difference on high-end equipment. If you want to check for yourself with your own equipment, you can do an ABX test here: http://abx.digitalfeed.net/itunes.html
- There are many reasons why you can compress a PCM signal lossily without any perceptual difference at all. For example, our sensitivity depends not only on the signal intensity but also on the frequency: humans cannot hear sounds at 60 Hz below roughly 40 dB. While such signals are encoded in PCM, they can be removed without any perceptual difference for humans.
- Even though most people can't hear a difference in a scientific ABX test, they still believe they do. Why? One reason is that they usually don't test blind; the moment you have prior knowledge, you can't do an unbiased test. Tests have been done with exactly the same audio equipment but different logos (Bose vs B&O, for example), and the more premium brand will consistently be perceived as better even if the hardware is identical. Secondly, it is common for the signal to be decoded slightly differently: simply increase the volume by 1 dB and almost all test subjects will perceive this as higher quality. Third, tests are often done where other factors are at play, such as the DAC, connections and so on.
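
As promised in the sample-rate bullet above, here is a minimal sketch (my own illustration, not part of the original post) of why reconstruction below the Nyquist frequency is exact: it samples a 10 kHz tone at 44.1 kHz and rebuilds the continuous waveform with Whittaker-Shannon (sinc) interpolation. The tone frequency, duration and grid size are arbitrary illustration values.

```python
# Sketch only: reconstruct a band-limited sine from its 44.1 kHz samples using
# Whittaker-Shannon (sinc) interpolation. All values are arbitrary examples.
import numpy as np

fs = 44_100            # sample rate (Hz)
f = 10_000             # test tone, well below the 22.05 kHz Nyquist limit
duration = 0.01        # seconds of audio

n = np.arange(int(fs * duration))                 # sample indices
samples = np.sin(2 * np.pi * f * n / fs)          # the stored samples

# Evaluate the reconstruction on a much finer time grid:
# x(t) = sum_n x[n] * sinc((t - n/fs) * fs)
t = np.linspace(0.0, duration, 4_000, endpoint=False)
reconstructed = np.sum(samples * np.sinc((t[:, None] - n / fs) * fs), axis=1)

reference = np.sin(2 * np.pi * f * t)
mid = slice(len(t) // 4, 3 * len(t) // 4)         # ignore edge effects of the finite sum
print("max reconstruction error:", np.max(np.abs(reconstructed[mid] - reference[mid])))
# Prints a very small number: away from the edges, the waveform between the
# samples is recovered essentially exactly, as the bullet above claims.
```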
To be extremely pedantic (and annoying), the bit-depth doesn't strictly determine the dynamic range - it determines the number of discrete steps of loudness that can be reproduced. 8-bit samples give you 256 steps of loudness. 16-bit gives you 65,536 steps. 24-bit gives you 16,777,216 steps.

What a higher bit-depth gives you is the ability to represent a higher dynamic range without being able to hear distinct steps in loudness.
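
To put rough numbers on that (my own back-of-the-envelope sketch, not the poster's): every extra bit doubles the number of steps, and each doubling adds about 6 dB of dynamic range, since decibels here are 20·log10 of the amplitude ratio.

```python
# Rough dynamic range implied by bit depth: 20 * log10(2**bits) ≈ 6.02 dB per bit.
# (Dither and noise shaping change the practical picture; this is the naive figure.)
import math

for bits in (8, 16, 24):
    steps = 2 ** bits
    dynamic_range_db = 20 * math.log10(steps)
    print(f"{bits:2d}-bit: {steps:>10,} steps, ~{dynamic_range_db:.0f} dB")
# 8-bit:         256 steps, ~48 dB
# 16-bit:     65,536 steps, ~96 dB
# 24-bit: 16,777,216 steps, ~144 dB
```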

There's a direct analogy between this and digital images (and viewing devices such as TVs, monitors, etc) - non-HDR content would typically be displayed at 8 bits per color channel (RGB), giving us 256 steps between zero and maximum brightness. HDR content (usually 10 bits per channel) would give you 1024 steps of brightness.

The advantage of having a higher bit depth (or more steps) is less visible banding when showing a gradient from extreme brightness to extreme darkness.

If you have a TV or monitor capable of recreating a scene with a high dynamic range of brightness, as your eyes would see in real life - e.g. a shot of a campfire where you're able to look into the flames but also see the black cat hiding behind someone in the shadows - the 10-bit depth would allow enough steps of brightness to display a smooth gradient between the light and dark parts of the flames, and the texture of the cat's fur. If you only had 8 bits, the campfire would be a big white ball and the cat would look like a very dark gray stencil (if you could see it at all).
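
A tiny sketch of the banding point (again my own illustration, not from the post above): quantise an ideal smooth brightness ramp at 8 and 10 bits and look at how coarse the resulting bands are.

```python
# Quantise a smooth 0..1 brightness gradient at different bit depths and report
# how many distinct bands remain and how big each visible step is. Illustration only.
import numpy as np

ramp = np.linspace(0.0, 1.0, 100_000)     # an "ideal" continuous gradient

for bits in (8, 10):
    levels = 2 ** bits
    quantised = np.round(ramp * (levels - 1)) / (levels - 1)
    print(f"{bits}-bit: {len(np.unique(quantised))} bands, "
          f"each step ≈ {100 / (levels - 1):.2f}% of full brightness")
# 8-bit gives 256 bands (steps of ~0.39%); 10-bit gives 1024 bands (~0.10%),
# so neighbouring bands are four times closer together and far harder to see.
```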

Coming back to audio, if you had an 8-bit recording and still wanted to capture someone whispering next to a rocket, you could, but the whisper would sound like a bad Skype call with sudden jumps in loudness (really scratchy as it cuts in and out of the lower end of the dynamic range), while the rocket would also have weird jumps between loud and super loud (and would also sound distorted if it clipped the upper end of the dynamic range).

A 16-bit audio clip would be able to capture this range, but you'd probably want to record the source at 24 bits so you have headroom to make stuff louder or softer, or to compress the loudness in post-production to fit the 16-bit output file.
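
A quick way to measure the whisper-next-to-a-rocket problem (a sketch of my own, assuming a plain round-to-nearest quantiser with no dither): quantise a very quiet tone at 8, 16 and 24 bits and compare the error with the signal.

```python
# Quantise a quiet -60 dBFS tone ("a whisper in a loud mix") at several bit depths
# and report the signal-to-quantisation-noise ratio. Sketch only: real recordings
# use dither, which turns the gritty distortion described above into benign noise.
import numpy as np

fs = 44_100
t = np.arange(fs) / fs                              # one second of audio
whisper = 0.001 * np.sin(2 * np.pi * 440 * t)       # 440 Hz tone at -60 dBFS

def quantise(x, bits):
    scale = 2 ** (bits - 1) - 1                     # e.g. 32767 for 16-bit
    return np.round(x * scale) / scale

for bits in (8, 16, 24):
    error = quantise(whisper, bits) - whisper
    snr_db = 10 * np.log10(np.mean(whisper ** 2) / np.mean(error ** 2))
    print(f"{bits:2d}-bit: SNR of the quiet passage ≈ {snr_db:5.1f} dB")
# At 8 bits the -60 dBFS tone falls below one quantisation step and vanishes;
# at 16 bits it survives, though with audible roughness; at 24 bits the error
# is far below anything you could hear - which is the headroom mentioned above.
```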

I'll keep quiet now.
 
Why would you be plugging high-end headphones into a Lightning-to-3.5mm adapter, which has the tiniest, worst DAC ever? If you actually want to make use of lossless audio that you won't be able to hear anyway, at least invest in a £100+ DAC that'll retain the differences.
The DAC in the adapter is great: https://www.kenrockwell.com/apple/lightning-adapter-audio-quality.htm

"This tiny Apple device has better performance and more and cleaner output than many fancier "audiophile" devices I've tested. Apple has more resources to make better stuff than the smaller companies. Most 3rd-party headphone amps and DACs, all be they bigger and far more expensive, put out less clean power into 32Ω loads, and do it with more distortion, poorer sound and lousier frequency response.

If you're using regular headphones (under 100Ω), you can't do better than using this adapter or just plugging it into your iPhone, iPad or iPod. Only if you're running 240Ω or higher headphones are you likely to need a professional amp like the Benchmark DAC1 HDR to get more output."
 
I think this is in before Spotify does it, plus it puts a big nail in Tidal's coffin.
I tried Tidal and really wanted to like it, but their UI was profoundly bad. I tried it specifically because they advertised with a splash that they would have Atmos content available via their AppleTV app. They did, and it sounded good. Then I discovered that they made it impossible to actually search for Atmos content. Apart from a shortlist of featured albums and playlists, they buried the content. I sent a polite suggestion that they make that content easier to find and they politely responded that they'd mention the idea to their software engineers and did nothing about it. So really, Tidal's coffin came pre-self-nailed.
 
But to be fair - there's not really any such thing as a badly encoded file these days. You can't really do it wrong.
Apple seems to have managed it, though. For instance, if I stream an album from Apple Music, it often sounds dreadful compared with ripping the same album from the original CD - to AAC 256 kbps - and playing that. It seems especially noticeable with older music. Some of it sounds like they simply upscaled a bunch of 64 kbps MP3s to AAC 256 (!)

I’m hoping that a side-benefit of Apple re-encoding their music to lossless might be that they do it right this time!
 
I have no idea how Apple’s products are managed or coordinated, but it seems to me that the people behind the songs and the audio-playing devices ought to be collaborating closely, even if they are separate business units.

My iPhone, with small run-of-the-mill speakers, CAN play lossless, but a $550 ‘Pro’ set of headphones can’t. In fact, no headphone or independent speaker that Apple currently makes can play lossless. Did anyone at Apple consider that having a high-definition music source without a good speaker is kind of pointless? Unless this music quality upgrade was decided upon last week, I would think that the headphone division and the HomePods could have been physically updated to handle the higher bit rate before rolling out the better audio quality songs.
 
  • Like
Reactions: ekwipt
Well actually it’s pretty easy to know if you are among the 1% who are able to hear that kind of difference. Please have a hearing test and post it here to let us know the « spec sheet » of your amazing ears …
Then we’ll know for sure if you *can* hear anything above 20 kHz or if it’s just « in your head », as it is for so many audiophiles …

Because without proof, anyone who says « I can hear a difference » is just like « I can see dead people » to me … and hearing can be measured, so without the measurement it’s just a myth (one that helps sell very expensive hardware that makes people feel better and superior).
So there would be absolutely zero reason to record the original master in anything better than 256 kbps AAC, and yet they do? There would be absolutely no need for Apple to make a big song and dance about the launch of lossless, and yet they do?

This thread is made up of folks who didn’t even understand lossless was a thing (until Apple announced it), those who are angry with their devices because (though they were perfectly fine with them until yesterday) they can’t utilise it, and those like you who claim it’s a myth.

I promise you, if Apple stated tomorrow that their lossless versions were the best out there, that a device firmware update enabled compatibility, and that it required a new cable, this thread would be full of the exact opposite.

Buying headphones or speakers is a very personal thing, as is listening to music; I subscribe to Amazon HD and I can tell you it is certainly better to my ears. I guess you actually believed “retina” displays were beyond the capabilities of the human eye to detect?
 
The DAC in the adapter is great: https://www.kenrockwell.com/apple/lightning-adapter-audio-quality.htm

"This tiny Apple device has better performance and more and cleaner output than many fancier "audiophile" devices I've tested. Apple has more resources to make better stuff than the smaller companies. Most 3rd-party headphone amps and DACs, all be they bigger and far more expensive, put out less clean power into 32Ω loads, and do it with more distortion, poorer sound and lousier frequency response.

If you're using regular headphones (under 100Ω), you can't do better than using this adapter or just plugging it into your iPhone, iPad or iPod. Only if you're running 240Ω or higher headphones are you likely to need a professional amp like the Benchmark DAC1 HDR to get more output."
Ken Rockwell has an article about audiophiles:

Audiophiles are what's left after almost all of the knowledgeable music and engineering people left the audio scene back in the 1980s. Audiophiles are non-technical, non-musical kooks who imagine the darnedestly stupid things about audio equipment. Audiophiles are fun to watch; they're just as confused at how audio equipment or music really works as primitive men like cargo cults are about airplanes. An audiophile will waste days comparing the sound of power cords or different kinds of solder, but won't even notice that his speakers are out-of-phase. An audiophile never enjoys music; he only listens to the sound of audio equipment.

Since sound and music perception is entirely imaginary (you can't touch or photograph a musical image), what and how we hear is formed only in our brains and is not measurable. Our hearing therefore is highly susceptible to the powers of suggestion. If an audiophile pays $5,000 for a new power cord, he will hear a very real difference, even though the sound is unchanged. Audiophiles do hear real differences in power cords when they swap among them (the placebo effect), but just don't ask them to hear the difference in a double-blind test.

To an audiophile, the hobby is all about playing with equipment, not enjoying music.
Please, people, stop with all this marketing that just helps more people make money.

Do people really think that you can’t fully appreciate music if you don’t have the 0.001% difference that lossless brings to your ears …?

As a kid, the first Walkman I had made me so happy. Then the first MP3 player I had before the iPod had 64 MB of memory, and boy … I was so amazed to fit more than 30 songs on it, and I enjoyed every second of them on my way to school.

And you want me to believe that you now can’t appreciate music if it’s not 24-bit/192 kHz … Seriously, the state of technology - an unlimited catalogue of AAC 256 music playing instantaneously - blows my mind and makes me so happy, and that’s what’s important. Please stop saying the 0.001% is what really defines the whole experience.
 
Are you sure? I know that Apple TV supports "Apple Music Lossless", but when it comes to "Hi-Res Lossless" Apple does not mention Apple TV as a compatible playback device.
The ATV needs to be plugged into an AVR/DAC via HDMI for Hi-Res. Natively, the ATV passes 16/48 audio - lossless. With the Music app passing Hi-Res files on tvOS 14.6, the 24-bit 48/96/192 signal from the ATV needs an AVR/DAC to decode it and get it out to your speakers/headphones.
 
  • Like
Reactions: smulji
The DAC in the adapter is great: https://www.kenrockwell.com/apple/lightning-adapter-audio-quality.htm

"This tiny Apple device has better performance and more and cleaner output than many fancier "audiophile" devices I've tested. Apple has more resources to make better stuff than the smaller companies. Most 3rd-party headphone amps and DACs, all be they bigger and far more expensive, put out less clean power into 32Ω loads, and do it with more distortion, poorer sound and lousier frequency response.

If you're using regular headphones (under 100Ω), you can't do better than using this adapter or just plugging it into your iPhone, iPad or iPod. Only if you're running 240Ω or higher headphones are you likely to need a professional amp like the Benchmark DAC1 HDR to get more output."
Ken Rockwell was kind of a meme when I was more into photography, I didn't know he also reviewed audio stuff but it's hard for me to take him seriously.
 
  • Like
Reactions: peanuts_of_pathos
So there would be absolutely zero reason to record the original master in anything better than 256 kbps AAC, and yet they do? There would be absolutely no need for Apple to make a big song and dance about the launch of lossless, and yet they do?

This thread is made up of folks who didn’t even understand lossless was a thing (until Apple announced it), those who are angry with their devices because (though they were perfectly fine with them until yesterday) they can’t utilise it, and those like you who claim it’s a myth.

I promise you, if Apple stated tomorrow that their lossless versions were the best out there, that a device firmware update enabled compatibility, and that it required a new cable, this thread would be full of the exact opposite.

Buying headphones or speakers is a very personal thing, as is listening to music; I subscribe to Amazon HD and I can tell you it is certainly better to my ears. I guess you actually believed “retina” displays were beyond the capabilities of the human eye to detect?
That is why I said: please measure your hearing. We measure eyesight, and some people have better vision than others, right?
Science says that the human ear cannot hear anything above 20 kHz, and that limit goes down with age. Some people might be able to hear above that, but please look for the scientific material and you will see that it’s almost non-existent.
So I say it’s a myth up until the point where you show me your hearing test to attest that you can hear differences. Otherwise it’s just like me saying I can see a pixel on a Retina screen from 1 m away while I might have myopia … until I prove my eyesight with a test, I might simply not know what I’m talking about.

And I will let the engineers speak for themselves, but I think they work with these kinds of formats because … they are working files. Just like a 10,000 px canvas in Photoshop gives you all the finesse you need, when you export to the web you don’t use the original format but the compressed one. Even with bare-minimum compression the file will be smaller and thus easier to share, without people being able to see the difference other than by zooming in 100x.
 
  • Like
Reactions: tonyr6 and lars666
Ken Rockwell was kind of a meme when I was more into photography, I didn't know he also reviewed audio stuff but it's hard for me to take him seriously.
In this case, please tell me why the measurements from a Rohde & Schwarz R&S UPL (which sells for around $40,000 new) are not reliable, and since you are more serious and know better, show me the measurements that back up your claims.
It’s sound, so everything can be measured.
 
This makes sense for AirPods -- they are fundamentally Bluetooth devices first. HomePod is confusing though. I have pretty decent music gear, and I currently use AirPlay to stream music. With Deezer HiFi, I can get bit perfect 16 bit / 44.1khz audio to my receiver via AirPlay. I don't know why it wouldn't be the same for the HomePod.

Now, I don't think it actually matters for the HomePod, as I doubt it has the quality for lossless to actually matter (frankly, lossless likely isn't noticeable until you're into speakers and equipment in the $1K range). But I'm a bit concerned if this means AirPlay won't deliver lossless Apple Music more generally. I'd consider switching from Deezer, as Apple has an attractive price point here, but I'll have to wait for verification that I can get lossless CD-quality audio delivered via AirPlay before considering a switch.
 
This whole thread revived my interest in the Apogee Groove DAC. I’m surprised it hasn’t been brought up. I wonder if anyone here can comment on it with their experience with the product.
I've been getting huge enjoyment out of mine over the past 4 years. It offers fantastic sound quality in a super sturdy, very well-built metal case.

One drawback I can see is that it is pretty power hungry and will get surprisingly warm, which means it isn't ideally suited for battery-powered devices even though it works well plugged directly into the USB-C ports on iPads (using the appropriate cable or dongle). If you want to use it with iPhones/iPads that only have a lightning port, you'll need the Lightning to USB 3 Camera Adapter connected to a power adapter plugged into an outlet, iirc.

Some people may mind the fact that it uses bright LEDs to permanently display the volume level while playing music, which can't be turned off.
 
  • Wow
Reactions: jprmercado
I can't wait to listen to my Sonos Play5s via Airplay2 from my ATV and go "Ho Hum" and quickly forget about this feature.
 
So there would be absolutely zero reason to record the original master in anything better than 256 kbps AAC, and yet they do? There would be absolutely no need for Apple to make a big song and dance about the launch of lossless, and yet they do?

This thread is made up of folks who didn’t even understand lossless was a thing (until Apple announced it), those who are angry with their devices because (though they were perfectly fine with them until yesterday) they can’t utilise it, and those like you who claim it’s a myth.

I promise you, if Apple stated tomorrow that their lossless versions were the best out there, that a device firmware update enabled compatibility, and that it required a new cable, this thread would be full of the exact opposite.

Buying headphones or speakers is a very personal thing, as is listening to music; I subscribe to Amazon HD and I can tell you it is certainly better to my ears. I guess you actually believed “retina” displays were beyond the capabilities of the human eye to detect?

When the original HomePod was announced a couple of years ago and Apple was claiming that it was an audiophile-quality speaker, I complained then. But most people don’t actively listen to music; it’s something that they have on in the background.

Above a pretty low threshold, most people don’t care about sound quality. They never did, and they never will. It’s not that people like crummy-sounding music; it’s that for casual or background listening the quality can be lower because you aren’t intently listening to it. If it gets bad enough then yes, they won’t like it, but it has to be bad enough to draw their attention to how bad it sounds.

If Apple wanted to hype audiophile-quality sound, then lossless should have been where they started. And all of their premium headphones and EarPods and smart speakers should have had an audiophile version above the consumer-level ones. But audiophile is a very niche market. It always has been. Most people will buy some level of “good enough” speakers because they either can’t hear the difference or they don’t want to pay for it. A company will make a lot more money selling various levels of good-enough speakers than they will selling audiophile-level ones.
 
  • Like
  • Love
Reactions: smulji and jtjones3
The DAC in the adapter is great: https://www.kenrockwell.com/apple/lightning-adapter-audio-quality.htm

"This tiny Apple device has better performance and more and cleaner output than many fancier "audiophile" devices I've tested. Apple has more resources to make better stuff than the smaller companies. Most 3rd-party headphone amps and DACs, all be they bigger and far more expensive, put out less clean power into 32Ω loads, and do it with more distortion, poorer sound and lousier frequency response.

If you're using regular headphones (under 100Ω), you can't do better than using this adapter or just plugging it into your iPhone, iPad or iPod. Only if you're running 240Ω or higher headphones are you likely to need a professional amp like the Benchmark DAC1 HDR to get more output."
I really doubt the honesty of those comments. Look, DACs as semiconductor parts have specs; they don't magically perform better and produce exceptionally clean output from a 10-cent part, if the chip even sells for that much. The whole adapter is selling for $7.99 on Amazon.

I do think the adapter is quite usable, but this guy's claims are for the birds.

How about this comment:
Pretty impressive, but you have to realize that Apple has a lot more smart people and the world's best audio engineers that "audiophile" companies can't afford. Heck, most of today's audiophile and mainstream audio companies can't even afford the laboratory facilities I have.
What a load of malarkey. :D
If that were the case, why hasn't Apple utilized lossless audio in their product lines for the last few years?
 
Don't know all the details about spatial audio, but the article says you can enable it for other headphones.

So does any type of headphone support spatial audio as long as you manually enable the feature? And is it only for tracks that are made specifically to support spatial audio?
 
I really doubt the honesty of those comments. Look, DACs as semiconductor parts have specs; they don't magically perform better and produce exceptionally clean output from a 10-cent part, if the chip even sells for that much. The whole adapter is selling for $7.99 on Amazon.

I do think the adapter is quite usable, but this guy's claims are for the birds.

How about this comment:

What a load of malarkey. :D
If that were the case, why hasn't Apple utilized lossless audio in their product lines for the last few years?
I completely agree with you. The Apple headphone adaptor is OK for in-ear earphones, but that's it. It's practically useless with larger headphones due to a lack of amplification power.
 
  • Like
Reactions: peanuts_of_pathos