I have often wondered this, as I'm sure we all have.
How much better is the technology going to get for some fields? Take audio, for example. Over the years we have had breakthroughs in technology: reel-to-reel recording at fast tape speeds, then digital audio, better sampling rates (44.1 kHz, then 48 kHz and beyond), and greater bit depths (16-bit, and now 24-bit). However, now that it's hit a "peak point," really, how much of a difference can the human ear hear between something recorded at 16-bit/44.1 kHz PCM and 24-bit/96 kHz? I would think differences like that would simply start becoming hard to discern. As a tech geek and audiophile myself (who still believes in the art of R2R recording at 15 ips...!), even I have realized that the difference is so small that the human ear can't even tell.
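To put some rough numbers on the "can't even tell" intuition above, here's a quick back-of-the-envelope sketch (my own illustration, not from the original post): the sample rate caps the highest frequency a recording can represent (the Nyquist limit), and the bit depth caps the dynamic range at roughly 6.02 dB per bit.

```python
def nyquist_khz(sample_rate_hz):
    """Highest representable frequency (kHz) is half the sample rate."""
    return sample_rate_hz / 2 / 1000

def dynamic_range_db(bits):
    """Approximate dynamic range of linear PCM: ~6.02 dB per bit."""
    return 6.02 * bits

# 44.1 kHz already covers the ~20 kHz upper limit of human hearing.
print(nyquist_khz(44_100))   # 22.05 kHz
print(nyquist_khz(96_000))   # 48.0 kHz -- far beyond the audible range

# 16 bits already gives ~96 dB; 24 bits exceeds the ear's usable range.
print(dynamic_range_db(16))  # ~96 dB
print(dynamic_range_db(24))  # ~144 dB
```

In other words, once the format comfortably exceeds what the ear can resolve, further gains mostly benefit production headroom (editing, mixing) rather than playback.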
I've wondered this about computer displays, too. Apple, for example, undeniably has some of the best computer displays on the market. They are using the most current technology, and are probably already developing the next "latest and greatest" product in the display field. However, short of having a holographic image hovering right over your desk, how much better can these LCD/LED/plasma displays and TVs get at this point before the "average user" stops having those "oooohh... ahhhh" moments?
These are things I have wondered about, and I would be interested to hear from other geeks like myself. And if this post happens to find Chris, it would be most interesting to hear his input, as someone so immersed in the field.
Happy New Year to you all!
Answer by TomMaxwell · Jan 04, 2011 at 05:56 PM
I had wondered these things too, until the Retina Display for the iPhone 4 came out and I was amazed! Audio may be harder to improve because of the limits of human hearing, but computer monitors can still get better. My iMac display doesn't have Retina-quality pixel density!
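The "Retina" idea in the answer above actually has a simple quantitative version, which I'll sketch here as my own illustration (the 1-arcminute figure is the conventional estimate for 20/20 visual acuity, not something from the original post): a display reaches the point of diminishing returns when one pixel subtends less than about one arcminute at the typical viewing distance.

```python
import math

def retina_ppi(viewing_distance_inches):
    """Pixel density (ppi) at which one pixel subtends ~1 arcminute,
    the conventional resolution limit of 20/20 vision."""
    # pixel size = distance * tan(1/60 degree); ppi is its reciprocal
    return 1 / (viewing_distance_inches * math.tan(math.radians(1 / 60)))

# Phone held ~12 inches away: ~286 ppi needed.
# The iPhone 4's 326 ppi exceeds this, hence the "Retina" claim.
print(round(retina_ppi(12)))   # ~286 ppi

# Desktop monitor viewed from ~24 inches: only ~143 ppi needed,
# which is why an iMac panel can look sharp at a much lower density.
print(round(retina_ppi(24)))   # ~143 ppi
```

So "how much better can displays get" has a distance-dependent answer: phones hit the acuity ceiling first, while desktops and TVs, viewed from farther away, reach it at much lower pixel densities.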