I’m sure I’m wrong, but it’s hard to imagine this being better quality than what we can do digitally these days.
Resolution and color reproduction are still unmatched. Plus, there are a lot of things happening in the analog domain that our eyes perceive as beautiful.
Same thing is true for analog vs digital music production btw
I can’t speak for video, but for audio production that isn’t true. Digital audio can perfectly reproduce a signal up to a frequency determined by the sample rate and down to a noise floor determined by the bit depth. Set that frequency well beyond the range of human hearing and that noise floor below what tape can manage (or whatever else limits the chain), and you get perfect reproduction.
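To put rough numbers on that, here’s a quick sketch of the textbook limits (assuming an ideal converter with proper dither; real hardware lands short of the theoretical 24-bit figure, but still far beyond tape’s roughly 60–70 dB):

```python
# Textbook limits of ideal digital audio:
# - bandwidth: half the sample rate (Nyquist)
# - noise floor: 6.02 * bits + 1.76 dB below a full-scale sine

def nyquist_khz(sample_rate_hz: float) -> float:
    """Highest representable frequency, in kHz."""
    return sample_rate_hz / 2 / 1000

def snr_db(bits: int) -> float:
    """Theoretical SNR of an ideal N-bit quantizer."""
    return 6.02 * bits + 1.76

for rate, bits in [(44_100, 16), (48_000, 24), (96_000, 24)]:
    print(f"{rate} Hz / {bits}-bit: bandwidth {nyquist_khz(rate):.2f} kHz, "
          f"noise floor {snr_db(bits):.0f} dB down")
```

Human hearing tops out around 20 kHz, so even 44.1 kHz / 16-bit already clears both bars.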
See here. https://youtu.be/UqiBJbREUgU
You are in fact wrong lol. Actual film has a resolution equivalent of something like 18K.
Wasn’t normal 35mm film about the equivalent of somewhere between 4K and 8K, depending on the film stock?
Plus, the projector optics will always limit the sharpness of the picture. No lens is ideal, and even an ideal lens would have fundamental limitations due to diffraction.
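Both points are easy to put rough numbers on. A back-of-the-envelope sketch (the frame width, lp/mm ratings, and f-number are ballpark assumptions, not measured values):

```python
# 1) What a film stock's lp/mm rating means in "K" terms.
# 2) Where diffraction caps even an aberration-free lens.

WAVELENGTH_MM = 550e-6  # green light, 550 nm, in mm

def film_width_px(frame_width_mm: float, lp_per_mm: float) -> int:
    """Horizontal pixel equivalent: two pixels per line pair (Nyquist)."""
    return round(frame_width_mm * lp_per_mm * 2)

def diffraction_cutoff_lp_mm(f_number: float) -> float:
    """Spatial frequency where a perfect lens's contrast hits zero: 1/(lambda*N)."""
    return 1 / (WAVELENGTH_MM * f_number)

# A 35mm motion-picture frame is roughly 25 mm wide; fine-grained
# stocks resolve somewhere around 80-125 lp/mm in practice.
for lp in (80, 125):
    print(f"35mm at {lp} lp/mm: ~{film_width_px(24.9, lp)} px wide")

print(f"f/2.8 diffraction cutoff: ~{diffraction_cutoff_lp_mm(2.8):.0f} lp/mm")
```

That lands right in the 4K–8K band, and real lenses lose contrast long before the diffraction cutoff, so the whole chain (stock, camera lens, projector lens) only pulls the effective number down.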
Something like that.
As far as lens optics go, we’re really splitting hairs here. 70mm through a quality lens in an IMAX theater is going to look absolutely stunning. Digital is just more convenient, and at some point it will catch up to and surpass film.
My point was more that even IMAX film doesn’t quite reach an 18K equivalent; it’s more like 12K to 16K. Honestly, anything above 4K (for normal widescreen content) is barely noticeable even on big screens, if it’s noticeable at all. THX recommends that the screen cover 40° of your FOV; IMAX is what, 70°, so 8K is already good enough for it. Extra resolution is useless if the human eye can’t tell the difference; it just lands in the same meaningless bragging-rights territory as 192 kHz audio and DACs with 140 dB+ S/N ratios. Contrast, black levels, shadow detail, and color accuracy are IMO more important than raw resolution, where modern 8K cameras are already good enough and 16K digital cameras will be more than plenty.
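If you want to check that arithmetic, here’s a rough sketch using the common ~60 pixels per degree rule of thumb for 20/20 acuity (viewing distance and individual eyesight vary, so treat the results as ballpark):

```python
# 20/20 vision resolves about 1 arcminute, i.e. ~60 pixels per
# degree of field of view (a common rule of thumb, not a hard limit).

PIXELS_PER_DEGREE = 60

def useful_pixels(fov_degrees: float) -> int:
    """Horizontal pixels the eye can actually distinguish across a FOV."""
    return round(fov_degrees * PIXELS_PER_DEGREE)

for label, fov, format_px in [("THX 40° / 4K", 40, 3840),
                              ("IMAX ~70° / 8K", 70, 7680)]:
    print(f"{label}: eye tops out around {useful_pixels(fov)} px, "
          f"format delivers {format_px} px")
```

At 40° the eye saturates around 2400 pixels, so 4K already overshoots; at 70°, 8K delivers nearly double what the eye can use.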