View Full Version : Shortchanging HDTV resolution
edtyct
11-25-2005, 09:40 AM
Gary Merson, a long-time writer about consumer electronics and the publisher of HDTV Insider on the web, recently completed a study revealing that many manufacturers have been bobbing 1080i signals--that is, upconverting each 540-line field to a display's native resolution (720p, 768p, 1080i, or 1080p) rather than buffering the two 540-line fields to construct a proper 1080p frame before releasing them to the TV screen. The result can be a shortfall in the detail that consumers expected to get, even though they may not have noticed it (Merson claims that he can see the difference). The situation has apparently improved with the 2005 sets, but 2004 was rife with this sort of bobbing. As you might expect, 1080p sets that go this route are particularly at fault, since consumers often bought them to convert 1080i signals, not 540 vertical lines, to 1080p (Sony's LCoS sets don't bob). The companies that failed the test consistently were Sharp and LG; the mixed bags were Sony, Panasonic, and Samsung (all of its DLPs). JVC, Hitachi, Pioneer, and Toshiba had no failures (except for one out-sourced Toshiba). Faroudja/Genesis and PixelWorks "admitted" that some of their chipsets worked in this way. A few companies have disputed Merson's findings for certain sets--Sony and LG, for example.
The seriousness of this problem is up for grabs. I certainly don't think that it falls squarely into the category of false advertising. After all, scaling and deinterlacing are almost as much art as exact science. But I don't think that it is just a semantic problem either, or a choice between equal methods. A 1080i signal is not simply two independent 540-line signals; it is one coherent unit. To divide it up into discrete pieces to make it easier to deal with is to distort it. Merson's study does indicate that cost is a factor; the sets that don't bob are more expensive. At the very least, manufacturers should reveal which kind of scaling/deinterlacing option they use. After all, we know the difference between TI's HD2, HD3, and HD4 chips. We should be equally aware of the disadvantages, as well as any advantages (like cost), of the bobbing chips.
Ed
Anyway, more detail on the study is available at hdtvinsider.com, and in this installment of The Perfect Vision. I'm not sure whether TPV's website, AVguide.com, contains the info.
GMichael
11-25-2005, 10:46 AM
Can you tell me what the difference is between bobbing 1080i signals and buffering them?
edtyct
11-25-2005, 11:03 AM
You mean visually? Bobbed signals would be less sharp, and they might have more motion artifacts, like jagged lines, because of the degree of separation from the integral frame. If you mean technically, what happens in the buffer with true 1080i processing is that the TV processes both 540-line fields together, combining them as if they were to be displayed as a single progressive 1080 frame, before the scaler takes that frame and turns it into the display's native resolution. Bobbing, however, scales each 540-line field independently of its legitimate complement. In other words, bobbing treats each half of the 1080 frame as it should treat the whole 1080 frame, thereby reducing the effective resolution to 540 (which eventually gets scaled to 720p, 768p, 1080i, or 1080p). Each 540-line field, not the entire 1080 lines of information, is processed to reach the native resolution of the display.
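If it helps to see the distinction in code terms, here is a very rough sketch--just illustrative Python with numpy, with a generic resampler standing in for a TV's real scaler. The function names, the 768-line panel, and the luma-only simplification are my own assumptions, not anything from Merson's study or an actual chipset:

```python
import numpy as np
from scipy.ndimage import zoom  # generic resampler standing in for the TV's scaler


def weave_then_scale(field_odd, field_even, native_h=768, native_w=1366):
    """Buffer both 540-line fields, rebuild the full 1080-line frame, then scale once."""
    frame = np.empty((field_odd.shape[0] * 2, field_odd.shape[1]), dtype=float)
    frame[0::2, :] = field_odd    # odd lines go back to their original positions
    frame[1::2, :] = field_even   # even lines interleave between them
    # All 1080 lines of information reach the scaler.
    return zoom(frame, (native_h / frame.shape[0], native_w / frame.shape[1]))


def bob_and_scale(field, native_h=768, native_w=1366):
    """Scale one 540-line field straight to native resolution (the shortcut)."""
    # Only 540 real lines ever reach the scaler, so vertical detail is effectively halved.
    return zoom(field, (native_h / field.shape[0], native_w / field.shape[1]))
```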
GMichael
11-25-2005, 11:29 AM
Thanks, that makes sense.
Hey Ed, I don't know the differences between TI's chips. All I have heard is that the HD2+ is better than the HD3, and I haven't heard anything about the HD4 chip's improvements over the HD2+. Could you take the time to explain to the less knowledgeable?
evil__betty
11-25-2005, 07:14 PM
The differences between the new TI chips are virtually none - at least not visibly. They have tweaked the chip slightly and fixed anything that they found wrong with the previous year's chip. There is a big difference between the regular '720p' chip and the new '1080p' chip: the DLPs that are 1080p use a method called "wobulation" (see S&V's explanation here (http://www.soundandvisionmag.com/article.asp?section_id=3&article_id=861&page_number=3)). I wouldn't lose sleep over any of the minor differences between the HD2+, HD3, or HD4 chips that are being used. The TV's own processing engine will make more difference in the PQ than the chip will.
edtyct
11-27-2005, 07:02 PM
The good things about the HD3 chip are that it reduces the price of DLPs, that it goes darker, and that it is smaller than the HD2 chip. I believe that it also has an extra segment on the color wheel. The consensus, however, is that the HD2+ chip is the jewel in TI's crown at the moment, as far as contrast and distinct resolution go. The HD3 chip wobulates to 1280x720 from 640x720, which explains the reduction in size and price. I'll leave it up to individuals whether they can see the difference between the wobulators and the true 1280x720s. The HD4 chip wobulates to 1920x1080p using half the horizontal resolution. I haven't seen enough of them yet, but I'm not sold that this type of chip will look as good as DLPs, LCDs, or LCoS sets that have the full complement of pixels.
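For anyone curious what wobulation means in practice, here's my own toy illustration in Python/numpy, based loosely on the S&V description linked above rather than on TI documentation. The idea is that a 1280-wide target frame gets split into two 640-wide sub-frames, and the second is flashed shifted over by half a mirror width so the eye integrates the pair into something close to the full width:

```python
import numpy as np


def split_into_subframes(target):
    """Split a 720x1280 target frame into the two 720x640 sub-frames the chip flashes."""
    sub_a = target[:, 0::2]   # columns 0, 2, 4, ... shown in the unshifted position
    sub_b = target[:, 1::2]   # columns 1, 3, 5, ... shown half a mirror width over
    return sub_a, sub_b


def perceived_image(sub_a, sub_b):
    """Roughly what the eye integrates: the two rapidly alternated sub-frames re-interleaved."""
    out = np.empty((sub_a.shape[0], sub_a.shape[1] * 2), dtype=float)
    out[:, 0::2] = sub_a
    out[:, 1::2] = sub_b
    return out


target = np.random.rand(720, 1280)                   # stand-in 720p frame
a, b = split_into_subframes(target)
print(np.allclose(perceived_image(a, b), target))    # True in this idealized model
# (A real DMD's shifted mirrors physically overlap, so the reconstruction
#  isn't this clean -- which is why people debate whether they can see it.)
```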
Thanks, Ed. You state that the HD4 chip wobulates to 1920x1080p, but I checked out Toshiba's website and they have the HD4 chip on all their 720p DLPs also.
edtyct
11-28-2005, 06:03 AM
Cam,
I'm not quite sure that I get what HD4 is all about. The xHD4 is a wobulating 1080p chip, but I can't tell whether the HD4 without the "x" wobulates at 720p or not. In other words, I don't know whether the distinguishing characteristic of any set with HD4 technology is wobulation at either resolution or a particular change to the mirrors and/or electronics. The contrast numbers go up with every generation; maybe that's the identifying characteristic for all "HD4" sets. As I said, I don't know. Maybe you do.
Ed
recoveryone
11-28-2005, 09:23 AM
Hmmm, goes back to "you get what you pay for." Any time I see a deal too good to be true, I wonder if the item is really a quality item. Not to pick on any one company, but when did LG get into the TV biz? In recent years this company has come out of left field and has its name on everything from cell phones to washers and dryers. When I make a big-ticket buy, I need a track record of how this company or product has done.
evil__betty
11-28-2005, 09:42 PM
I wouldn't say that they have come out of left field with crazy things; it's just that North America really hasn't known LG by the name 'LG' for all that long. LG is now the biggest flat-panel TV maker in the world! They are a HUGE Korean company that makes all sorts of stuff. If you ever travel to Korea, you will find LG bank machines, gas stations, toothpaste, deodorant, TVs, appliances... and all sorts of other fun stuff. They don't have many DLPs this year, as their focus has been plasmas and LCD panels, but they do make a really good product and warranty all TVs for 2 years (twice as long as the big guys like Sony or even Hitachi). So it shows that they stand behind their product. I wouldn't be shy about them - they make good stuff at reasonable prices. Can you get better? Sure, but you'll pay for it. They definitely fill a niche in the market (people who want quality, but really don't want it bad enough to pay Pioneer Elite prices). Here's info from their website found here (http://www.lge.com/about/rnd/html/rndarea_digitald.jsp):
Major R&D Achievements
- First in the world to develop the largest-size 60" HD PDP
(Won the Presidential Award for Multimedia Technology)
- First in Korea to develop the LCD projector [Won Chang Young-sil Prize / acquired the KT mark
(A Korean new technology accreditation)]
- First in the world to develop the high-definition LCD projection TV
- First in Korea to develop the high-definition DLP projection TV
- First in the world to develop the lowest-height (56mm) DLP projector
- Won TL2005 Grand Prize (for developing the PDP low-voltage-driven method and driving IC)
Geoffcin
11-29-2005, 04:41 PM
LG is also the parent company of Zenith, one of the oldest names in TV manufacturing.
Woochifer
11-29-2005, 06:37 PM
You might better remember LG by the Goldstar brand that they used to use in the U.S. market. At that time, Goldstar, as well as Samsung, was synonymous with cheap junk. But they doggedly kept improving their products until now they've caught up with the longer-established Japanese brands like Panasonic, Toshiba, and Sony. Hard to believe that 20 years ago, consumers viewed Samsung with the same kind of derision that they heap upon current bottom-feeding brands like Apex.
westcott
12-10-2005, 09:43 PM
Any 1080p signal is going to be contrived from some other video format based on 24fps for now.
There are no 1080p cameras or editing equipment available at this time, that I am aware of.
It is all provided using 3:2 pulldown of 24fps material, which is the long-standing standard of the movie industry.
The only true progressive signals are those recorded for HD networks like ESPN and other sporting events (very general, but basically true) using 720p native cameras. Even 480p DVDs are a product of 3:2 pulldown and not a true progressive signal.
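Just to illustrate what that 3:2 pulldown cadence looks like, here is a quick Python sketch of my own (the frame labels and field names are purely illustrative): four film frames at 24fps become ten interlaced fields at roughly 60 fields per second.

```python
def three_two_pulldown(frames):
    """Map 24fps film frames to 60Hz interlaced fields in a 3, 2, 3, 2, ... cadence."""
    fields = []
    parity = 0  # 0 = top field next, 1 = bottom field next
    for i, frame in enumerate(frames):
        count = 3 if i % 2 == 0 else 2           # repeat a frame for 3 fields, then 2, then 3...
        for _ in range(count):
            fields.append((frame, "top" if parity == 0 else "bottom"))
            parity ^= 1                          # fields always alternate top/bottom
    return fields


print(three_two_pulldown(["A", "B", "C", "D"]))
# -> [('A','top'), ('A','bottom'), ('A','top'), ('B','bottom'), ('B','top'),
#     ('C','bottom'), ('C','top'), ('C','bottom'), ('D','top'), ('D','bottom')]
```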
edtyct
12-11-2005, 05:14 AM
Hey, westcott, good morning. This may be a semantic issue, but why do you insist that signals derived from such processing as reverse 3:2 pulldown aren't truly progressive? Film undergoes processing only to make the frame rate compatible with our DVD standard, but the result is truly progressive, even though its immediate predecessor was interlaced. The anomaly is simply the cadence of the sequence and any judder that it introduces; its progressive nature isn't controversial. Obviously, the same can't be said about material that originates and displays as 1080i, but any broadcast at 1080i that upconverts to 1080p is truly progressive, even if it contains artifacts of the conversion and/or would look soft compared to the same broadcast in 1080p proper. By whatever method 1080p arrives on a display, the result is a progressive scan, even if it's produced by an external video processor, and until displays can match a good processor's performance, externally created 1080p is likely to be our best source for some time to come. It remains to be seen whether Blu-ray, if it manages to gain a foothold at all, delivers on the promise of 1080p in a subsequent generation.
Ed
westcott
12-11-2005, 07:16 AM
I would liken the comparison to native versus upscaled. I think all of us would rather watch something at the highest native resolution or format possible. I think we would all agree also that upscaling introduces artifacts and compromises the picture quality.
The same goes for interlaced versus progressive signals. If it is filmed at 24fps and upconverted to 60fps, we are missing a great deal of data, or are presented with data that is just stolen from the previous frame. This "upconversion" is a poor substitute for actually filming at 60fps like some of the new 720p cameras used for HD sports and other similar events. The whole idea behind a progressive signal for sports and action scenes is to catch the fast movements. That cannot be done unless you actually film the material at a higher frame rate. Otherwise, it is just a facsimile.
Here is a link that I think will explain this far better than I can.
http://www.tvtechnology.com/features/Tech-Corner/f_randy_hoffner.shtml
By the way, thank you for the kind words. It makes all the hard work worthwhile when your peers show appreciation (they are the only ones who can appreciate what it took to hide all those wires from the A/V credenza behind the couch to the speakers and TV!)
edtyct
12-11-2005, 07:52 AM
The problem is worse with video processing, since the original fields have no inherent relationship with each other. The extrapolation has to be informed by a great deal of subsequent and consequent data in order for the progressive transitions to look acceptable on good-quality monitors. In this context, the risk that the result will contain flaws not in the original stream intensifies. With film, you have to contend with the conversion to a different frame rate; film itself is a progressive format. The original interlacing artifacts can theoretically disappear when the frames are reassembled for progressive display, again in a compatible cadence, but the degree of success depends on how good the chip is, as well as other factors. Any errors that enter the process are likely to be retained and exaggerated later. As sites like Secrets show, DVD players' ability to handle the deinterlacing varies, and the signal streams are complex.
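To illustrate the film side of this--the reassembly that can happen once a chip has correctly locked onto the 3:2 cadence--here's a toy Python sketch of my own. It skips the hard, error-prone part (detecting the cadence from the pixels themselves, which is where chips differ) and just shows how a clean field stream collapses back into the original progressive frames:

```python
def inverse_telecine(fields):
    """Rebuild progressive frames from a clean 3:2 field stream.

    Each element is (source_frame, field_parity). A real chip has to infer the
    source frame by analyzing the pixels and locking onto the cadence; here the
    provenance is handed to us, so only the reassembly step is illustrated.
    """
    frames = []
    for source, _parity in fields:
        if not frames or frames[-1] != source:
            frames.append(source)   # first field of a new film frame starts a new output frame
        # repeated fields from the same film frame are simply dropped
    return frames


# The ten fields that 3:2 pulldown makes from four film frames collapse back
# to the original four progressive frames:
stream = [("A", "top"), ("A", "bottom"), ("A", "top"),
          ("B", "bottom"), ("B", "top"),
          ("C", "bottom"), ("C", "top"), ("C", "bottom"),
          ("D", "top"), ("D", "bottom")]
print(inverse_telecine(stream))   # -> ['A', 'B', 'C', 'D']
```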
One admirable thing, among many, about your HT is its illustration that a fixed screen in an elegant living space is not always a visual intrusion. Nice going.