A very interesting discussion is developing on AVforums about the characteristics an HD television needs in order to display a "correct" HD picture, along with related issues concerning manufacturers and consumers. The original post is below:
"Full HD" is more than resolution!
I’ve been encouraged to spend a little time on this forum, and since I’ve been heartily welcomed both in the forums and in PMs, and have already been gaining very interesting info, I thought I’d throw in a little shout about one of my favourite topics… The importance of displaying accurate pictures.
This time, I’ll skip past the discussion of whether or not an accurate picture is the best picture. You’ve probably discussed this before; if not, I’ll be happy to take that discussion in a separate thread. In this thread, I’d like to talk a bit about how simple displaying an accurate picture actually is. Or rather, how simple it WOULD BE, if only the manufacturers would just try. If I’m missing something, or anyone disagrees with some of the points, by all means chime in.
In my opinion, picture quality can be split into two main groups: Displaying the picture according to standards, and issues affecting picture quality. I believe that any TV that is said to be a PAL TV should actually display PAL pictures according to PAL standards, and an HDTV should display HD pictures according to HD standards – within reasonable allowances of course. If anyone disagrees with me on this, I’d sure like to hear what arguments can be made against this.
The areas of the picture that are standardized can be fairly easily summarized. A given picture standard has the following references (not counting carrier-related stuff):
Resolution
Framerate/field rate
Black/white levels
Gamma (the curve of light output from black to white)
Color primaries (color gamut, saturation of colors)
White point
Color decoding (luminance/brightness of each color)
Note that there is no actual reference for light output or picture size. There are certain guidelines, but no actual reference included in the standard. Also, the output gamma isn’t necessarily equal to the input gamma; this has to do with human perception of light under different circumstances (sorry if I’m being too technical, I like to keep things correct when possible… Just skip if you don’t understand what I’m talking about! :) )
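The input/output gamma note above can be sketched numerically. This is a minimal illustration assuming pure power-law gammas (real transfer functions such as the Rec.709 camera curve also have a small linear segment near black):

```python
def encode(luminance, encoding_gamma=1 / 2.2):
    """Camera side: scene luminance (0..1) -> video signal (0..1)."""
    return luminance ** encoding_gamma

def decode(signal, display_gamma=2.2):
    """Display side: video signal (0..1) -> light output (0..1)."""
    return signal ** display_gamma

# With matching gammas the round trip is neutral:
mid_grey = decode(encode(0.5))  # ~0.5

# A display in a dim room is often tuned to a slightly higher gamma,
# so output gamma deliberately differs from input gamma:
dim_room = decode(encode(0.5), display_gamma=2.4)  # darker than 0.5
```

The 2.4 figure is only illustrative of why the two gammas may intentionally differ; the exact target depends on viewing conditions.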
Examples of quality-related issues that aren’t directly related to standards:
Contrast (!)
Edge enhancement/sharpness
Picture noise
Motion artifacts
Rainbows
Screen door effect
Hot-spotting
Viewing angle related effects (i.e. drop in contrast or color accuracy)
Etc.
Producing a high-quality picture is fairly simple: Adhere to the standards, and minimize the degradations caused by the quality-related issues. Adhering to standards is actually fairly simple. In most cases, it doesn’t cost a manufacturer a single penny more to adhere to the standards than to get it wrong. The key here is that you can _never_ fix a problem using quality-related enhancements if the problem is that standards haven’t been adhered to. For instance, if a TV clips white, no picture enhancement can ever recreate that information, no matter how good the _quality_ of the TV is. Quality can never “rescue” a TV that doesn’t adhere to standards. Once the picture has been degraded, we can never get back what’s lost. So, the road to better pictures starts with making sure we adhere to the standards, and then we can focus on the quality related stuff.
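The white-clipping argument can be seen in a couple of lines of Python (hypothetical 8-bit code values, with 235 as reference white):

```python
def clip_whites(codes, limit=235):
    """Collapse every code value above `limit` onto `limit`."""
    return [min(c, limit) for c in codes]

ramp = [225, 230, 235, 240, 245]  # near-white ramp with above-white detail
clipped = clip_whites(ramp)       # [225, 230, 235, 235, 235]
# The three highest values are now indistinguishable; no downstream
# "enhancement" can recover which was which.
```

Clipping is a many-to-one mapping, which is exactly why no later processing stage can undo it.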
Like I said, adhering to standards is actually quite simple. We’ll take an example, 1080P HD (an HD-DVD or Blu-Ray movie for instance). The standards are something along the following lines (these are just some technical numbers; you don’t need to understand the numbers, only to understand how few there actually are for a manufacturer to be aware of when designing a TV set):
1920x1080P 16:9 resolution
24 fps framerate
Black at digital 16, white at digital 235
Input gamma 2.2
Primaries:
Red x=0.640 y=0.330
Green x=0.300 y=0.600
Blue x=0.150 y=0.060
White point D65, x=0.3127 y=0.3290
Color decoding: ITU Rec.709
Then there are a few other things to consider when designing, so the above IS a bit simplified – but for now it will do. Get these fairly few points right, and what you have is basically an accurate picture.
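As an illustration of how short the checklist really is, here is a hypothetical Python sketch that compares measured chromaticities against the Rec.709 targets listed above. The 0.005 xy tolerance is an arbitrary illustrative figure; real calibrators work with perceptual delta-E metrics rather than raw xy differences:

```python
REC709_TARGETS = {
    "red":   (0.640, 0.330),
    "green": (0.300, 0.600),
    "blue":  (0.150, 0.060),
    "white": (0.3127, 0.3290),  # D65
}

def check_chromaticities(measured, tolerance=0.005):
    """Return the names of any measured points that miss their xy target."""
    failures = []
    for name, (tx, ty) in REC709_TARGETS.items():
        mx, my = measured[name]
        if abs(mx - tx) > tolerance or abs(my - ty) > tolerance:
            failures.append(name)
    return failures

# A set that measures exactly on target passes every point:
on_target = check_chromaticities(REC709_TARGETS)  # []

# An oversaturated green primary is flagged:
oversaturated = dict(REC709_TARGETS, green=(0.26, 0.65))
off_target = check_chromaticities(oversaturated)  # ["green"]
```

The point is not the code itself but how small the table is: four chromaticity pairs, plus levels, gamma, resolution and framerate, and the standards side of the job is done.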
The reason this is interesting? To my knowledge, NO flatscreen TV on the market currently gets all the above right! Certainly not pre-calibration. If you try to find the best TV on the market by starting at the top of the list and sorting out any TV that doesn’t fulfil each point, you won’t even get to the bottom of the standards listing before _every_ single TV on the market is ruled out. We don’t even get started with the “quality” list. To put it bluntly: If you want to say that any TV that doesn’t have 1080P resolution isn’t “full HD”, then I can just as well say that there is currently _no_ TV on the market that is actually capable of displaying HD. None, zip, nada. HDTVs don’t exist. There may be TVs that display HD in a quality that most people find acceptable or even remarkable, but it’s actually not HD, and impressed as the viewers may be, that picture could easily be improved simply by making sure the TV adheres to HD standards. Most of the job of a calibrator is to repair what you can of the manufacturers’ mess and try to get the TV as close to standards as you can, but I don’t know of any TV where it’s possible to get all the above points accurate (there are projectors on the market that will, however). A few can get pretty close, but there’s still something left to be done that could easily be corrected if only the manufacturers gave us the necessary adjustments.
Because of this, it would actually be fairly simple to manufacture a high quality TV set: Adhere to the standards, include fair quality de-interlacing/scaling, and try not to mess it up with unnecessary picture “enhancement” etc. Do this with a panel with decent basic properties (contrast, response time, viewing angle etc), and you will have a high-end picture that only a handful of current TVs would come close to.
So, if it’s that simple, why don’t the manufacturers do this? The answer is just as simple: Consumers don’t realize the importance of adhering to standards, and hence they don’t demand it from the manufacturers. The fact of the matter is that the few parts of the standards that consumers actually DO understand (resolution and framerate), which are the issues consumers actually demand from manufacturers, just aren’t nearly as important as the parts that most consumers don’t comprehend (color accuracy and gamma). So, even though these issues are incredibly important to the overall picture quality, manufacturers don’t care, because consumers don’t know that they should care.
It’s not even because of the well-known effect that manufacturers want their TVs to “scream” at you in the store. Starting out with an accurate picture would make it a lot _easier_ for the manufacturers to create a setting that scorches the sun, and then have a setting that actually works in your living room. Pixel Plus, Digital Motion, Digital Noise Reduction and whatever “enhancements” any manufacturer might want to implement would work even _better_ if the starting point was an accurate picture. And, like I said, in most cases it doesn’t even cost the manufacturer a penny. It doesn’t cost more money to set black level at 16 instead of 0; all you need is the actual knowledge that you have to do it, and the will to care.
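The black-level point really is just a matter of where the numbers sit. A minimal sketch of the standard 8-bit studio-range mapping (black at code 16, white at 235), assuming a simple linear stretch to a full-range panel:

```python
def studio_to_full(code):
    """Map an 8-bit studio-range code (16..235) to full range (0..255)."""
    return round((code - 16) * 255 / 219)

black = studio_to_full(16)    # 0   (reference black)
white = studio_to_full(235)   # 255 (reference white)
# A set that instead treats code 0 as black shows reference black
# (code 16) as dark grey and raises the whole picture above black.
```

Getting this right is one subtraction and one scale factor in the signal path, which is the sense in which it costs the manufacturer nothing.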
So, what can we do about this? For one, you as consumers need to speak with one voice. The main voice of the consumers is (or should be) the press. If you demand from any reviewer that they test these few points and state in every review whether or not the set adheres to these standards, maybe – just maybe – the manufacturers will realize that it’s worth the effort to spend a few hours in development to at least give their buyers the _possibility_ of displaying pictures according to standards. I don’t care if they do it in the standard setting, as long as the possibility is there somehow, perhaps through professional calibration, perhaps just by having a preset that’s reasonably close. But most of all, you need to realize that not displaying color according to HD spec is just as big a problem as (actually a bigger one than) not displaying resolution according to HD spec.
To sum it up: Can we please start to take color as seriously as resolution? There IS such a thing as HD color, just as there is HD resolution. You want “full HD”? Fine, go buy a 1080P TV set, but if it’s not HD color, it’s not full HD, no matter what the resolution is.