Originally posted by CarbonFire:
In theory, yes, loss is possible with analog signals; however, that is not always the case. Nor is digital always a better bet.
The argument often made for the DVI or HDMI signal formats is the "pure digital" argument--that by taking a digital recording, such as a DVD or a digital satellite signal, rendering it directly into digital form as a DVI or HDMI signal, and then delivering that digital signal straight to the display, there is a sort of perfect, no-loss, no-alteration signal chain. If the display itself is a native digital display (e.g., an LCD or plasma display), the argument goes, the signal never has to undergo digital-to-analog conversion and is therefore less altered along the way.
That might be true, were it not for the fact that digital signals are encoded in different ways and have to be converted, and that these signals have to be scaled and processed to be displayed. Consequently, there are always conversions going on, and these conversions aren't always clean. "Digital to digital" conversion is no more a guarantee of signal quality than "digital to analog," and in practice may be substantially worse. Whether it's better or worse will depend upon the circuitry involved--and that is something that usually isn't practical to determine.

As a general rule, with consumer equipment, one simply doesn't know how signals are processed, and one doesn't know how that processing varies by input. Analog and digital inputs must either be scaled through separate circuits, or one must be converted to the other to use the same scaler. How is that done? In general, you won't find an answer anywhere in your instruction manual, and even if you did, it would be hard to judge which scaler is better without viewing the actual video output. It's fair to say that, even in very high-end consumer gear, the quality of signal-processing and scaling circuits is quite variable.
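To make the scaling point concrete, here's a toy sketch in Python (my own illustration, not any product's actual scaler circuitry): even a "pure digital" path alters the pixel data whenever the source resolution doesn't match the panel. The snippet linearly resamples a made-up scanline of pixel values up to a wider "panel" and then back; the round trip does not return the original samples, because interpolation and re-quantization have already discarded information--all without any analog stage involved.

[CODE]
# Hypothetical example: linear resampling of one scanline of pixel values.
# Shows that "digital to digital" scaling does not preserve the samples.

def resample(line, new_len):
    """Linearly interpolate a list of pixel values to new_len samples."""
    old_len = len(line)
    out = []
    for i in range(new_len):
        # Map the output position back into the input's coordinate space.
        pos = i * (old_len - 1) / (new_len - 1)
        lo = int(pos)
        hi = min(lo + 1, old_len - 1)
        frac = pos - lo
        value = line[lo] * (1 - frac) + line[hi] * frac
        out.append(round(value))  # re-quantize to integer pixel levels
    return out

original = [16, 30, 200, 235, 60, 16]   # a made-up 6-pixel scanline
upscaled = resample(original, 10)       # "scale" it to a 10-pixel panel
round_trip = resample(upscaled, 6)      # scale back down to compare

print(original)    # [16, 30, 200, 235, 60, 16]
print(round_trip)  # differs: interpolation plus rounding altered the data
[/CODE]

Real scalers use far more sophisticated filters (polyphase, bicubic, and so on), but the principle is the same: every digital-to-digital resize or format conversion is a lossy transformation, and how lossy it is depends entirely on the circuitry doing it.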