I’ve always kinda wondered about this. I’m not an audio guy and really can’t tell the difference between most of the standards. That said, I definitely remember tons and tons of ‘experts’ telling me that no one can tell the difference between 720p and 1080p TV at a typical couch-viewing distance. And I absolutely could, and many of the people I know could. I can also tell the difference between 1080p and 4K at the same distances.
So I’m curious: is there just natural variance in individuals’ hearing ability, and audiophiles simply have a better-than-average range that does exceed CD quality?
Similar to this, I can tell the difference between 30fps and 60fps, but not 60 to 120, yet some people swear they can. Which I believe; I just know that I can’t. Seems like these guidelines are probably averages rather than hard biological limits.
It’s a fair question. Human hearing ability is a spectrum like anything else. However, when it comes to discerning differences in audio quality, the vast, vast majority of people cannot tell the difference between, say, AAC @ 256kbps and lossless in a double-blind test. And that includes people with equipment worth thousands of dollars.
Of those few who can, they can generally only tell by listening for the very specific characteristics of the specific encoder used, which takes a highly trained ear and a lot of practice.
The blind aspect is important because unblinded side-by-side comparisons (be they of different audio formats, or 60fps vs 120fps video) are highly unreliable: people will generally, subconsciously, prefer the one they know is supposed to be better.
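For anyone who wants to try this on themselves, here’s a minimal sketch of the ABX protocol in Python. The file names are hypothetical and playback is stubbed out with a print (a real test would hand the path to a player without revealing it), but the blinding and scoring logic is the real thing: you only ever guess which of A/B the hidden X is, and a binomial test tells you whether you beat chance.

```python
import random
from math import comb

def play(path: str) -> None:
    # Stand-in for real playback; a real ABX tool must not reveal the path.
    print(f"  [playing {path}]")

def abx_trial(a: str, b: str) -> bool:
    x = random.choice([a, b])          # hidden pick: this is the "blind" part
    for label, path in (("A", a), ("B", b), ("X", x)):
        print(f"Sample {label}:")
        play(path)
    guess = input("Is X the same as A or B? ").strip().upper()
    return (guess == "A") == (x == a)  # True if the listener identified X

def run_abx(a: str, b: str, trials: int = 16) -> None:
    correct = sum(abx_trial(a, b) for _ in range(trials))
    # One-sided binomial p-value: odds of doing at least this well by guessing.
    p = sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials
    print(f"{correct}/{trials} correct, p = {p:.4f} "
          "(p < 0.05 suggests you really hear a difference)")

if __name__ == "__main__":
    # Hypothetical file names; substitute your own lossless/lossy pair.
    run_abx("lossless.wav", "lossy_256k.m4a")
```

With 16 trials you need roughly 12+ correct before chance becomes an unlikely explanation, which is why a couple of lucky guesses in a casual comparison prove nothing.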
I think this is a case where certain people simply can’t see or hear the difference.
I collect video game and movie soundtracks, and the main difference I can hear between a 320kbps file and a FLAC in the 1000kbps range is not straight-up “clarity”, in the sense that something like an instrument sounds “clearer”, but rather the spacing: the ability to discern where instruments are coming from is much better in a hi-res file with some decent wired headphones (my pair is $200). All this likely doesn’t matter much, though, when most users stream via Spotify, which sounds worse than my 320kbps local files, and people are using Bluetooth headphones at lower bitrates since most don’t support better codecs like aptX and LDAC.
I think hi-res is for professional work.
If you’re going to process, modify, mix, or distort the audio in a studio, you probably want the higher bit depth or sample rate to start with, in case you amplify or distort something and end up with an unintended artefact that is human-audible.
But the output can be downsampled back to consumer levels before final broadcast.
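As a concrete sketch of that last step, assuming the common Python audio stack (soundfile + scipy) and a hypothetical 96kHz/24-bit master file:

```python
import numpy as np
import soundfile as sf
from scipy.signal import resample_poly

# Hypothetical master; sf.read returns float samples in [-1, 1].
data, rate = sf.read("master_96k_24bit.wav")

# 96 kHz -> 48 kHz. resample_poly low-pass filters before decimating,
# which is exactly what keeps ultrasonic content from aliasing down.
down = resample_poly(data, up=1, down=2, axis=0)

# ~1 LSB of TPDF dither (sum of two uniforms) before truncation to 16-bit,
# so quantization error becomes benign noise instead of correlated distortion.
lsb = 2.0 ** -15
dither = (np.random.uniform(-lsb / 2, lsb / 2, down.shape)
          + np.random.uniform(-lsb / 2, lsb / 2, down.shape))

sf.write("release_48k_16bit.wav",
         np.clip(down + dither, -1.0, 1.0), rate // 2, subtype="PCM_16")
```

The point being: all the headroom lives in the production chain, and the consumer file loses nothing audible in the conversion.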
Of course, if a marketing person finds out there is such a thing as “professional quality”. . . See also
“military spec”, “aerospace grade”
Yeah, to expand on this: in professional settings you’ll want a higher sampling frequency so you don’t end up with e.g. aliasing, but for consumer playback anything above a 44.1–48kHz sampling rate is pretty much pointless.
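A toy numpy-only illustration of the aliasing in question: a 30kHz tone sampled at 44.1kHz with no anti-alias filter folds down to an audible 14.1kHz ghost, while a 96kHz session captures it where it belongs.

```python
import numpy as np

def dominant_freq(tone_hz: float, rate: int, seconds: float = 1.0) -> float:
    # Sample a pure sine at the given rate and find the FFT peak.
    t = np.arange(int(rate * seconds)) / rate
    x = np.sin(2 * np.pi * tone_hz * t)
    spectrum = np.abs(np.fft.rfft(x))
    return np.fft.rfftfreq(len(x), d=1 / rate)[np.argmax(spectrum)]

print(dominant_freq(30_000, 96_000))   # ~30000 Hz: captured correctly
print(dominant_freq(30_000, 44_100))   # ~14100 Hz: folded into the audible band
```

That fold-down (44100 − 30000 = 14100) is why studios record high and filter before downsampling, and also why a 44.1/48kHz delivery format is fine for listeners: once properly filtered, there’s nothing above ~20kHz left to alias or to hear.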