I’ve been a happy user of an M1 Mac Mini for a few months now. It is part of my dedicated stereo system. I run apps like Audirvana and Roon on it, upsampling to 768kHz or DSD256 as required, as well as convolution software for room correction. As expected, the Mini never breaks a sweat.
Looking at the chip temperatures as reported by iStat Menus, I’m a bit puzzled by what I see:
It looks like the CPU cores are the coolest part of the chip (about ambient temperature – this is Australia in summer, mind you). I can understand the GPU being a bit warmer: even though I’m running the Mini headless with remote screen sharing only, I do have a dummy HDMI plug attached for better remote screen performance. So far so good, and as expected.
What I don’t get is the Neural Engine temperature. Why would it be any warmer than the parts of the chip actually doing the work? What is the Neural Engine doing? This Mac isn’t exposed to a flurry of online, iCloud, or interactive activity; its only purpose is playing music. I’m fairly certain that neither Roon nor Audirvana uses machine learning, so what else is going on there?
This is not a serious concern, of course, but I’d really like to know what the NE is doing in this case, and also what its big mission is overall, since Apple puts so much emphasis on it.