Is 720P format dead? [Archive] - Audio & Video Forums

PDA

View Full Version : Is 720P format dead?



Smokey
06-04-2010, 10:09 PM
720p was one of the HD formats adopted when the broadcast standard was established. But with 720p TVs disappearing from the market and available only on el-cheapos and smaller sets, one wonders if 720p is a dying dinosaur.

The argument was that 720p is progressive and better for sports and high-action programs, but with 1080p-capable TVs, that argument no longer holds. 720p was the preferred format for the ABC and Fox networks, but I am not sure whether they have moved on to a 1080 format or not.

markw
06-05-2010, 04:09 AM
I know broadcast TV can do 1080i, but can they do 1080p?

AFAICT, only Blu-ray does 1080p, but things do change.

Rich-n-Texas
06-05-2010, 07:09 AM
As far as I know, no broadcaster is sending 1080p. I thought I read somewhere that this is due to compression issues at the local carrier's end. Don't quote me on that though.

...But with 720p TVs disappearing from the market and available only on el-cheapos and smaller sets, one wonders if 720p is a dying dinosaur.
This one's got pixeldumbass written all over it! :1:

Sir Terrence the Terrible
06-05-2010, 07:49 AM
Nobody in broadcast is doing 1080p, for several reasons. The first is bandwidth. Broadcasters are under pressure from the cell phone and broadband concerns as more wireless products hit the market. The push to digital compressed the broadcast bandwidth as it is, but now the wireless concerns are pressing for even more spectrum, and are likely to get it sooner or later.

The second reason is infrastructure. 1080p takes up quite a bit of storage even when compressed. Most television stations have upgraded their storage to accommodate MPEG-2, but not much more than that. They have built their infrastructure, from the transmitters and fiber optics to the video encoders, to handle either 720p or 1080i, but not 1080p. The move to digital television has cost broadcasters a bundle, and with the economic downturn and the consistent loss of viewers and advertiser money, it is not likely many of them can afford a complete infrastructure upgrade to 1080p.

720p and 1080i in consumer disc, television, and projector technology are pretty much done. They were a way for manufacturers to transition up from 480p while the processing chips for 1080p had not yet been developed, or were super expensive and in short supply. Neither HD DVD nor Blu-ray was around when 720p and 1080i sets were selling like hotcakes, but once both were introduced, the industry pushed toward 1080p for home video. The next push up in resolution will be toward 4K.

pixelthis
06-05-2010, 08:54 AM
Smaller TV sets are still 720p, because it doesn't make that much difference
with the smaller screen.
BUT I SAW A 1080P THE OTHER DAY, 32", so it's just a matter of time.
1080p TV and 1080i broadcast is a match made in heaven, because the TV just has to
deinterlace, no conversion.
THE PIC is almost as good as BLU-RAY (but not quite).
720p used to have its place, but with 1080p getting cheaper and cheaper, soon it will
no longer be a savings to make a 720p set in any size, and that will be IT.
AS FOR 1080P broadcast, why?
Today's TV sets can do an excellent job of deinterlacing 1080i, so you really don't need it.:1:
SO YES, 720P IS dead

pixelthis
06-05-2010, 08:57 AM
As far as I know, no broadcaster is sending 1080p. I thought I read somewhere that this is due to compression issues at the local carrier's end. Don't quote me on that though.

This one's got pixeldumbass written all over it! :1:

Wouldn't quote you on anything, except maybe the going price of crack ho's and
anti-depressants.
The DISH has advertised 1080p in its PPV offerings, but this is not a regular thing.:1:

Rich-n-Texas
06-05-2010, 09:17 AM
Wouldn't quote you on anything, except maybe the going price of crack ho's and
anti-depressants.
The DISH has advertised 1080p in its PPV offerings, but this is not a regular thing.:1:
Please. You're an ex-cop. You know damn well what the price of Prozac is. :rolleyes:

Smokey
06-05-2010, 08:38 PM
Thanks everybody for chiming in. I don't have HD yet, but I figured that broadcasting 1080p does occupy a lot of bandwidth. Sir TT said the two main reasons are bandwidth requirements and TV stations' lack of storage.

But Pixelthis had a good point about why broadcast a 1080p signal at all when 1080i can be deinterlaced and converted to 1080p by the TV or cable box. Just like when a DVD player's signal was converted to 480p by the TV or the DVD player, couldn't the same thing be done with a 1080i broadcast signal, so that a 1080p broadcast would not be necessary?

I'm figuring that film-based 1080i broadcasting (where the source is 24 frames per second) can be converted to 1080p by the TV/cable box by deinterlacing or by reversing the 3:2 pulldown, without losing any field/frame information. But the question is: can the conversion be done correctly with non-film-based 1080i material?
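To make the pulldown cadence concrete, here is a minimal Python sketch of 3:2 pulldown and its reversal. The frame labels and the simplified field handling are illustrative assumptions, not any broadcaster's actual pipeline:

[code]
# A minimal, simplified sketch of the 3:2 pulldown cadence and its
# reversal (inverse telecine). Frame labels are illustrative only.

film = ["A", "B", "C", "D"]              # four frames of 24 fps film

# Telecine: film frames are held for 3 fields, then 2, alternately,
# turning 4 film frames into 10 interlaced fields (24 fps -> 60 fields/s).
fields = []
for i, frame in enumerate(film):
    fields += [frame] * (3 if i % 2 == 0 else 2)
print(fields)   # ['A','A','A','B','B','C','C','C','D','D']

# Inverse telecine: every field still comes from a complete film frame,
# so collapsing the repeats recovers the original frames losslessly --
# which is why film-based 1080i can become 1080p with nothing lost.
recovered = [f for i, f in enumerate(fields) if i == 0 or fields[i - 1] != f]
print(recovered)  # ['A', 'B', 'C', 'D']
[/code]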


Smaller TV sets are still 720p, because it doesn't make that much difference with the smaller screen. BUT I SAW A 1080P THE OTHER DAY, 32", so it's just a matter of time.

The time is already here. If you go to the Best Buy web site, over 75% of their 32-37 inch TVs are 1080p, and most of them are under $600.

pixelthis
06-06-2010, 10:04 PM
Thanks everybody for chiming in. I don't have HD yet, but I figured that broadcasting 1080p does occupy a lot of bandwidth. Sir TT said the two main reasons are bandwidth requirements and TV stations' lack of storage.

But Pixelthis had a good point about why broadcast a 1080p signal at all when 1080i can be deinterlaced and converted to 1080p by the TV or cable box. Just like when a DVD player's signal was converted to 480p by the TV or the DVD player, couldn't the same thing be done with a 1080i broadcast signal, so that a 1080p broadcast would not be necessary?

I'm figuring that film-based 1080i broadcasting (where the source is 24 frames per second) can be converted to 1080p by the TV/cable box by deinterlacing or by reversing the 3:2 pulldown, without losing any field/frame information. But the question is: can the conversion be done correctly with non-film-based 1080i material?

I once had a 1080i set (two, actually) and the difference between that and 1080p is
night and day, as big a jump as 480p deinterlaced from 480i.
So why broadcast 1080p? You can use the bandwidth you save with 1080i
for extra content.
As for broadcasting at 24 fps, that probably won't happen, because you'd need a set that could handle it at the decoder stage, or a separate tuner to feed 24 fps into an HDMI input; a lot of bother, really.:1:

GMichael
06-07-2010, 06:15 AM
720p is nice. At the time of my purchase, 1080p was five times as much. If I had it to do over at today's pricing, I would not consider 720p.

kevlarus
06-07-2010, 08:39 AM
Smaller TV sets are still 720p, because it doesn't make that much difference
with the smaller screen.
BUT I SAW A 1080P THE OTHER DAY, 32", so it's just a matter of time.


Matter of time for what, most sets to be 1080p regardless of screen size?

I use a 32" since the room is small, but it's 1080p -- wouldn't settle for anything less. And it does just fine sending DD 5.1 to the receiver. Not sure why some HDTVs only transmit in stereo; seems silly.

pixelthis
06-07-2010, 03:01 PM
Matter of time for what, most sets to be 1080p regardless of screen size?

I use a 32" since the room is small, but it's 1080p -- wouldn't settle for anything less. And it does just fine sending DD 5.1 to the receiver. Not sure why some HDTVs only transmit in stereo; seems silly.

Matter of time, yes.
I am curious about your setup; if you have a "box" (sat or cable), you will get better results feeding the sound directly to the receiver.
Most TVs only send stereo from the optical out (if they have one).
THE REASON IS SIMPLE, they are cheap.:1:

Smokey
06-07-2010, 07:31 PM
As for broadcasting at 24 fps, that probably won't happen, because you'd need a set that could handle it at the decoder stage, or a separate tuner to feed 24 fps into an HDMI input; a lot of bother, really.:1:

My concern was: since 1080p TVs convert all of their input signal (1080i) to their native resolution (1080p), does any information get lost in the fields or frames of the image in the upconversion process?

I am asking because 1080i only produces 60 alternating even and odd fields per second (30 full frames per second), while 1080p produces 60 full frames per second.

kevlarus
06-08-2010, 09:14 AM
Matter of time, yes.
I am curious about your setup; if you have a "box" (sat or cable), you will get better results feeding the sound directly to the receiver.
Most TVs only send stereo from the optical out (if they have one).
THE REASON IS SIMPLE, they are cheap.:1:

Six of one and all that. Since it's HDMI from the cable box to the TV, I just use the optical cable to pipe sound from the TV to the receiver. It also means that if I connect anything else via HDMI, the receiver will automatically get that sound too, since the receiver doesn't have HDMI -- only component ins.

Not sure why you say "you will get better results" going from cable box direct to receiver.

Downside for blu, the HDTV prolly only passes DD and not DTS, but I'll experiment with that when the time comes...

pixelthis
06-08-2010, 09:45 AM
Six of one and all that. Since it's HDMI from the cable box to the TV, I just use the optical cable to pipe sound from the TV to the receiver. It also means that if I connect anything else via HDMI, the receiver will automatically get that sound too, since the receiver doesn't have HDMI -- only component ins.

Not sure why you say "you will get better results" going from cable box direct to receiver.

Downside for blu, the HDTV prolly only passes DD and not DTS, but I'll experiment with that when the time comes...

Sorry 'bout that; what I MEANT was that most cable boxes have digital outs, so
you are better off hooking the receiver input you are using for cable up to that
one.
Not all are 5.1 (or 5.0, which a few of mine are), but whatever you get will be a
grand sight better than what comes outta the TV.
You can run the HDMI to the TV, and when your receiver's off, you can still use the TV
speakers.:1:

pixelthis
06-08-2010, 09:57 AM
My concern was: since 1080p TVs convert all of their input signal (1080i) to their native resolution (1080p), does any information get lost in the fields or frames of the image in the upconversion process?

I am asking because 1080i only produces 60 alternating even and odd fields per second (30 full frames per second), while 1080p produces 60 full frames per second.

That's the great thing about 1080i broadcast.
It was intended to be the main format; indeed, I have had several 1080i sets, but
with the advent of 1080p sets you get a monumental increase in quality without any
cost.
It's like the boost in quality you get from a DVD, deinterlacing 480i to 480p.
THERE IS NO "UPCONVERSION", so you are not dependent on how good a processing
chip you have; you just have to take the 30 pairs of alternating fields (interlaced) and stitch them
together.
And you get a 1080p picture pure from the broadcaster. Now, it's broadcast as 1080i,
but even so the pic is still a lot closer to the BLU standard.
I don't know if broadcasting in 24 fps would work; I doubt it.
Most 1080p sets that take 24 fps take it from the HDMI input.
It would take an infrastructure change to broadcast at anything other than 60 Hz.
120 Hz sets get their 120 by chicanery, their picture depending on whatever magic chip
resides in their innards.
Even so, I THINK 120 HZ IS worth it; I have seen several, and the quality is there, and when
laymen shopping with their wives tell you they can see the difference...well!
Don't know about 240; seems to be pushing it a bit.:1:

Sir Terrence the Terrible
06-08-2010, 11:39 AM
Matter of time for what, most sets to be 1080p regardless of screen size?

I use a 32" since the room is small, but it's 1080p -- wouldn't settle for anything less. And it does just fine sending DD 5.1 to the receiver. Not sure why some HDTVs only transmit in stereo; seems silly.

TVs only transmit in stereo because manufacturing a television requires paying licensing fees for the various technologies within them (i.e., MPEG decoders). Dolby licenses (and there are two of them, if I am not mistaken) just add to the cost, and then you have to add the signal path as well.

Sir Terrence the Terrible
06-08-2010, 12:16 PM
My concern was: since 1080p TVs convert all of their input signal (1080i) to their native resolution (1080p), does any information get lost in the fields or frames of the image in the upconversion process?

I am asking because 1080i only produces 60 alternating even and odd fields per second (30 full frames per second), while 1080p produces 60 full frames per second.

All you are doing is combining the sequential fields (you are not upconverting anything) from 1080i (the two separate fields of information) into one 1080p frame. The television receives the first field and holds it while waiting for the other field to arrive. Some earlier televisions just line-doubled a single field and didn't really combine the two fields of information.
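In code terms, that weave amounts to interleaving the two field arrays. A minimal sketch, assuming numpy and illustrative array shapes; a real deinterlacer also handles cadence detection and motion, which this ignores:

[code]
import numpy as np

# A minimal sketch of the weave described above, assuming each field
# arrives as a 540-line array. Shapes are illustrative only.

odd_field = np.random.rand(540, 1920)    # scan lines 1, 3, 5, ... (held first)
even_field = np.random.rand(540, 1920)   # scan lines 2, 4, 6, ... (arrives next)

frame = np.empty((1080, 1920))
frame[0::2] = odd_field                  # drop the odd lines into place
frame[1::2] = even_field                 # interleave the even lines

# Every output pixel came straight from one of the two fields; nothing
# was scaled or interpolated, which is why this isn't "upconversion".
print(frame.shape)                       # (1080, 1920)
[/code]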

kevlarus
06-08-2010, 12:44 PM
TVs only transmit in stereo because manufacturing a television requires paying licensing fees for the various technologies within them (i.e., MPEG decoders). Dolby licenses (and there are two of them, if I am not mistaken) just add to the cost, and then you have to add the signal path as well.


Sony thought far enough ahead to add a Toslink out... of course, it's probably just Dolby, but it's definitely 5.1. It's one of their XBR series, so that may be the reason for the feature.

jvc
06-08-2010, 01:17 PM
I know broadcast TV can do 1080i, but can they do 1080p?

AFAICT, only Blu-ray does 1080p, but things do change.
DirecTV and some cable companies (don't know about Dish Network) have some "On Demand" or PPV channels that broadcast in 1080p.

Sir Terrence the Terrible
06-08-2010, 02:56 PM
Sony thought far enough ahead to add a Toslink out... of course, it's probably just Dolby, but it's definitely 5.1. It's one of their XBR series, so that may be the reason for the feature.

It's Dolby Digital, and because it's the top of Sony's line, yes, it has one. Good luck finding a Toslink out on their V and Z series.

Sir Terrence the Terrible
06-08-2010, 02:59 PM
DirecTV and some cable companies (don't know about Dish Network) have some "On Demand" or PPV channels that broadcast in 1080p.

It's encoded in 1080p, but the bits are not there for true 1080p. The bandwidth is not there for true perceptual 1080p, as that would require close to 35 Mbps delivery, and you are not going to get that bit rate from either DirecTV or cable. It's pure marketing at work.

Sir Terrence the Terrible
06-08-2010, 03:05 PM
That's the great thing about 1080i broadcast.
It was intended to be the main format; indeed, I have had several 1080i sets, but
with the advent of 1080p sets you get a monumental increase in quality without any
cost.


1080i and 1080p carry the same information; it is just presented differently, so there is no monumental increase in quality. 1080p is just 1080i deinterlaced. Nothing more is added during deinterlacing. Your television just weaves the sequential fields together to form one single frame.

I thought you knew everything about televisions? Or not........

Smokey
06-08-2010, 07:26 PM
THERE IS NO "UPCONVERSION", so you are not dependent on how good a processing chip you have; you just have to take the 30 pairs of alternating fields (interlaced) and stitch them together. And you get a 1080p picture pure from the broadcaster. Now, it's broadcast as 1080i.

All you are doing is combining the sequential fields (you are not upconverting anything) from 1080i (the two separate fields of information) into one 1080p frame. The television receives the first field and holds it while waiting for the other field to arrive. Some earlier televisions just line-doubled a single field and didn't really combine the two fields of information.

Thanks guys.

So basically there is no need for 1080p broadcasting if the TV deinterlaces the 1080i signal to 1080p. I am guessing it will look as good as a 1080p broadcast.

Sir Terrence the Terrible
06-09-2010, 11:23 AM
Thanks guys.

So basically there is no need for 1080p broadcasting if the TV deinterlaces the 1080i signal to 1080p. I am guessing it will look as good as a 1080p broadcast.

With good processing, you would be hard-pressed to tell the difference between 1080i and 1080p. So no, there is no reason to broadcast in 1080p.

blackraven
06-09-2010, 03:51 PM
720p was one of the HD formats adopted when the broadcast standard was established. But with 720p TVs disappearing from the market and available only on el-cheapos and smaller sets, one wonders if 720p is a dying dinosaur.

The argument was that 720p is progressive and better for sports and high-action programs, but with 1080p-capable TVs, that argument no longer holds. 720p was the preferred format for the ABC and Fox networks, but I am not sure whether they have moved on to a 1080 format or not.


Does a Duck Fart in Water?

Rich-n-Texas
06-09-2010, 03:53 PM
Do ducks fart at all?

Ya know, I don't think my TV converts 1080i broadcasts to 1080p. If I press the TV's info button, it shows 1080i. The only time I see 1080p is when I watch BDs.

jvc
06-09-2010, 04:57 PM
When you press the TV's info button, it shows the signal it's receiving, not what it's displaying. TVs these days are supposed to display at their native resolution. That's why a lot of people say the TV upscales their DVDs.

Rich-n-Texas
06-09-2010, 05:47 PM
OIC. Thanks for clearing that up for me.

Smokey
06-09-2010, 06:39 PM
Ya know, I don't think my TV converts 1080i broadcasts to 1080p. If I press the TV's info button, it shows 1080i. The only time I see 1080p is when I watch BDs.

You better not push your TV too hard :D

pixelthis
06-10-2010, 10:03 AM
Do ducks fart at all?

Ya know, I don't think my TV converts 1080i broadcasts to 1080p. If I press the TV's info button, it shows 1080i. The only time I see 1080p is when I watch BDs.

That's true, because outside of some Dish offerings, only BLU has 1080p.
Unless your TV's native format is 1080i.
Any TV will show only one format, its native one; if your set is 1080i, then that's all you will see.
Whatever your set's resolution, that is what it will display; it will scale non-native
formats to fit.
Your set is just telling you what the input rez is.:1:

(If I AM "PIXELDUMBASS" and still know more than you, what does that make you,
carpetbagger?)

pixelthis
06-10-2010, 10:07 AM
OH, and your 1080p set won't convert 1080i to 1080p; it will deinterlace 1080i to 1080p, which is a real increase in resolution, whereas "upscaling" a resolution will actually provide no increase in picture quality, outside of the slightly more stable pic a progressive picture provides over an interlaced one.:1:

kelsci
06-10-2010, 11:51 AM
I do not know whether a duck "farts", but they sure take "cwaps" around my neighborhood; the species is Muscovy.

I have a 720p Toshiba Regza that is a year and a half old. I am very pleased with this set. Picture quality is, IMHO, determined by the quality of the material broadcast. I have noticed that some films shown in high-def made during the '50s and '60s look quite awesome, like looking through a glass window. In fact they look better than the current lot of more recent films, at least to me. Some examples that looked impressive to me were HATARI, THE SEARCHERS, TRAPEZE, THE MAGNIFICENT SEVEN, and LITTLE BIG MAN.

I have had only one opportunity to observe a 1080p 40-inch television made by Proscan that a friend bought from Costco. Broadcasts looked good, but his upconverting DVD player made DVDs look astonishing. If I live long enough, my next set will be 1080p, but there is no rush to upgrade over the performance of the Toshiba Regza I own now.

Smokey
06-10-2010, 06:19 PM
I have had only one opportunity to observe a 1080p 40-inch television made by Proscan that a friend bought from Costco. Broadcasts looked good, but his upconverting DVD player made DVDs look astonishing.

I wonder how these Proscan TVs hold up. They are really cheap. Walmart has a 40-inch Proscan 1080p LCD for $448, and a 42-inch LED-LCD model for $649. Proscan used to be the brand name for the higher end of RCA's TV models, but now I think it is just a bought-out name.

http://www.walmart.com/browse/TV-Video/TVs/All-TVs/Proscan/_/N-94iwZ1yzkyshZaq90ZaqceZ1yzo9en/Ne-aq6s?ic=48_0&ref=125875.425768+500500.4292397089+1000075.4292550815&tab_value=All&clicked_tab_value=All&catNavId=1060825&waRef=+500500.4292397089&depts=&catNavId=1060825&clicked_tab_value=All

Rich-n-Texas
06-11-2010, 04:37 AM
(If I AM "PIXELDUMBASS" and still know more than you, what does that make you,
carpetbagger?)
That makes me one who doesn't sit in front of a monitor at the hospital, watching granny walk around with the back of her gown open, while reading HT magazines.

Thank you for continuing down the low road, pix. :rolleyes:

pixelthis
06-11-2010, 11:07 AM
That makes me one who doesn't sit in front of a monitor at the hospital, watching granny walk around with the back of her gown open, while reading HT magazines.

Thank you for continuing down the low road, pix. :rolleyes:

You say that and accuse me of going down the "low" road?
I AM THE ONE ON THE HIGH ROAD, AS YOU WILL
find out when you look up and see me up there, on top of the world,
tut-tutting your latest idiocy.:1:

Sir Terrence the Terrible
06-11-2010, 05:46 PM
OH, and your 1080p set won't convert 1080i to 1080p; it will deinterlace 1080i to 1080p, which is a real increase in resolution, whereas "upscaling" a resolution will actually provide no increase in picture quality, outside of the slightly more stable pic a progressive picture provides over an interlaced one.:1:

Pix, you are a professed TV expert who has seen all, owned all (at least all of the cheapo stuff), and nobody can tell you anything. That said:

1080p has no more resolution than 1080i. In fact, they have the same resolution. The only difference is that 1080p is painted in one complete pass, while 1080i requires two sequential fields of information. 1080i paints those two sequential fields so quickly that the naked eye cannot see the process. So perceptually, 1080p and 1080i are essentially the same thing. Bad processing of 1080i may show line twitter and some stair-stepping of lines (cheap sets), but really good processing allows 1080i to look just like 1080p.

Get your facts straight, or stop posting... you are spreading misinformation and you really need to stop.

Sir Terrence the Terrible
06-11-2010, 05:47 PM
You say that and accuse me of going down the "low" road?
I AM THE ONE ON THE HIGH ROAD, AS YOU WILL
find out when you look up and see me up there, on top of the world,
tut-tutting your latest idiocy.:1:

Pix, your version of the high road puts you just below curb level... that is apparent to everyone here.

Woochifer
06-11-2010, 08:56 PM
720p was one of the HD formats adopted when the broadcast standard was established. But with 720p TVs disappearing from the market and available only on el-cheapos and smaller sets, one wonders if 720p is a dying dinosaur.

It's still commonplace among TVs under ~42". Below that size, it's actually hard to distinguish between the 768- and 1,080-line grids unless you get in fairly close.
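A back-of-the-envelope way to see why, using the common one-arcminute visual-acuity rule of thumb; the 40-inch size and the rule itself are assumptions for illustration, not measurements:

[code]
import math

# Rough sketch: at what distance does one scan line on a 16:9 screen
# shrink below one arcminute (the usual 20/20 acuity rule of thumb)?

def farthest_resolvable(diag_in, lines):
    """Distance (inches) beyond which one scan line subtends less than
    one arcminute, i.e. blurs into its neighbors."""
    height = diag_in * 9 / math.hypot(16, 9)        # screen height
    line_pitch = height / lines                     # height of one line
    return line_pitch / math.tan(math.radians(1 / 60))

for lines in (768, 1080):
    d = farthest_resolvable(40, lines)
    print(f"{lines} lines on a 40-inch set: detail visible within ~{d/12:.1f} ft")
# 768 lines: ~7.3 ft; 1080 lines: ~5.2 ft. Sit farther back than about
# 5 ft and the finer 1,080-line grid stops being distinguishable.
[/code]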

720p's not a dying dinosaur, given how many broadcast networks use 720p as their default format. This has been discussed previously.

http://forums.audioreview.com/showpost.php?p=269232&postcount=66


The argument was that 720p is progressive and better for sports and high-action programs, but with 1080p-capable TVs, that argument no longer holds. 720p was the preferred format for the ABC and Fox networks, but I am not sure whether they have moved on to a 1080 format or not.

Nope. I'm not aware of any networks that have switched to a different resolution. As others have noted, no broadcast networks use 1080p, and 1080p is not a standard for HDTV broadcast transmissions.

Figure it this way: 1080p would roughly double the bandwidth needed, assuming equal data compression. 1080i and 720p are not that different in the bandwidth they require.

1080i renders a 1,920 x 1,080 pixel map at a frame rate of 30 frames/sec, which gives you a data rate of 62.2 million pixels/sec.

720p renders a 1,280 x 720 pixel map at a frame rate of 60 frames/sec, giving you a data rate of 55.3 million pixels/sec.

1080p renders that 1,920 x 1,080 pixel map at 60 frames/sec, for a data rate of 124.4 million pixels/sec.
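Those figures are easy to sanity-check with a quick Python run of the raw, uncompressed pixel rates; chroma subsampling, bit depth, and audio are all ignored here:

[code]
# Quick check of the raw data rates quoted above (pixels/sec, before
# compression -- chroma subsampling and bit depth are not counted).

formats = {
    "720p":  (1280, 720, 60),   # progressive, 60 full frames/sec
    "1080i": (1920, 1080, 30),  # interlaced, 30 full frames/sec
    "1080p": (1920, 1080, 60),  # progressive, 60 full frames/sec
}

for name, (w, h, fps) in formats.items():
    print(f"{name}: {w * h * fps / 1e6:.1f} million pixels/sec")
# 720p: 55.3 | 1080i: 62.2 | 1080p: 124.4 -- 1080p roughly doubles the
# bandwidth, while 720p and 1080i sit within ~12% of each other.
[/code]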

Cable, satellite, and online providers are now advertising 1080p movies for rent/download. I wouldn't trust the picture quality, given that those files are highly compressed. Even at that resolution, a lot of other factors go into picture quality.


I wonder how these Proscan TVs hold up. They are really cheap. Walmart has a 40-inch Proscan 1080p LCD for $448, and a 42-inch LED-LCD model for $649. Proscan used to be the brand name for the higher end of RCA's TV models, but now I think it is just a bought-out name.

Yep, that's exactly what it is. High-end TVs don't sell for under $500. And the LED-backlit TVs in this price range are going to be edge-lit models, which don't perform any differently than the CCFL models except for lower energy consumption and thinner panels.


Thanks everybody for chiming in. I don't have HD yet, but I figured that broadcasting 1080p does occupy a lot of bandwidth. Sir TT said the two main reasons are bandwidth requirements and TV stations' lack of storage.

Good gawd Smoke, how many more HDTV threads are you gonna start before you actually run out and get one? Bite the bullet, get the TV! :cool:

Woochifer
06-11-2010, 09:06 PM
I have had only one opportunity to observe a 1080p 40-inch television made by Proscan that a friend bought from Costco. Broadcasts looked good, but his upconverting DVD player made DVDs look astonishing. If I live long enough, my next set will be 1080p, but there is no rush to upgrade over the performance of the Toshiba Regza I own now.

FYI, upconversion is one of those misnomers in home theater. There's nothing magical about it.

Think about it this way -- EVERY signal that gets fed into a 1080p TV MUST be rescaled/deinterlaced to 1080p just for the TV to display an image. All that happens with an upconverting DVD player is that the rescaling and deinterlacing occur inside the DVD player. A non-converting DVD player will send a 480i/p signal to the TV, and the TV will do the rescaling and deinterlacing.

A difference in picture quality will show up only if the TV or the DVD player uses a noticeably better video processor. Nowadays, the video processors built into most TVs are comparable in quality to those used in upconverting DVD players.
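Here is a minimal sketch of that rescaling step. Nearest-neighbor scaling is shown only because it is the simplest to read; real TVs and players use much better filters:

[code]
import numpy as np

# Whatever a 1080p panel receives, something must rescale it to
# 1920x1080 before display. Nearest-neighbor is the simplest version.

def rescale_to_native(image, native_h=1080, native_w=1920):
    src_h, src_w = image.shape
    rows = np.arange(native_h) * src_h // native_h   # source row per output row
    cols = np.arange(native_w) * src_w // native_w   # source col per output col
    return image[rows][:, cols]

dvd_frame = np.random.rand(480, 720)      # a 480p DVD frame (luma only)
shown = rescale_to_native(dvd_frame)      # what the panel actually displays
print(shown.shape)                        # (1080, 1920)
# Whether this runs in the DVD player ("upconverting") or in the TV,
# the operation is the same -- only the processor quality differs.
[/code]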

IMO, you really see the biggest difference in picture quality with Blu-ray. Basically, the smaller the screen, the less you'll notice the improvement from 1080p. 40" is right at the point where it's not that noticeable.

Invader3k
06-12-2010, 04:56 AM
Proscan used to be considered a professional-grade television. When I first started working in an electronics store in 2002, they still had some Proscan 4:3 CRT sets used for display purposes that actually displayed 540p.

They sold those sets off not long after I started there, and went with actual HD flat-panel sets. I believe Proscan sets are now made by some Korean company.

Smokey
06-12-2010, 05:52 PM
720p's not a dying dinosaur given how many broadcast networks use 720p as the default format. This has been discussed previously.

http://forums.audioreview.com/showpost.php?p=269232&postcount=66


Thanks Wooch. That is good information to have and worth repeating:

720p: ABC, Fox, ESPN Networks, A&E Networks (A&E, History, Biography), Fox Sports Net, Fox News, Fox Business, CBS College Sports, MLB Network, Disney Channels (Disney, Toon Disney, ABC Family)

1080i: NBC, CBS, CNN, HDNet, NFL Network, Discovery Networks, National Geographic, HGTV, Food Network, Weather Channel, HBO, Showtime, TNT, USA, TBS, MTV Networks, Nickelodeon Networks, CNBC

With 1080p TVs, theoretically the 1080i channels should look better than the 720p channels, since neither the cable box/satellite receiver nor the TV has to do any rescaling, just deinterlacing. But as you said, other factors also go into picture quality.


Good gawd Smoke, how many more HDTV threads are you gonna start before you actually run out and get one? Bite the bullet, get the TV!

We don't have Costco around here :D

I still watch a lot of SD material from DVD and cable, so I haven't felt the urge yet. But on the side, I've been eyeing 37-inch LCD TVs, which I have a perfect space for.


Proscan used to be considered a professional-grade television. When I first started working in an electronics store in 2002, they still had some Proscan 4:3 CRT sets used for display purposes that actually displayed 540p.

Was the name of the store Sears, since that is the only place I used to see Proscan TVs?

pixelthis
06-13-2010, 07:36 AM
Pix, you are a professed TV expert who has seen all, owned all (at least all of the cheapo stuff), and nobody can tell you anything. That said:

1080p has no more resolution than 1080i. In fact, they have the same resolution. The only difference is that 1080p is painted in one complete pass, while 1080i requires two sequential fields of information. 1080i paints those two sequential fields so quickly that the naked eye cannot see the process. So perceptually, 1080p and 1080i are essentially the same thing. Bad processing of 1080i may show line twitter and some stair-stepping of lines (cheap sets), but really good processing allows 1080i to look just like 1080p.

Get your facts straight, or stop posting... you are spreading misinformation and you really need to stop.

Pot calling the kettle black.
Been over this time and time again; this is what happens when lawyers try to do electronics.
Progressive is inherently better than interlaced.
A 1080i pic does have 1080 lines, but it's shown in two fields of 540 lines laced together;
any res above 540 is an illusion.
And you have the artifacts associated with interlaced pictures.
But the most important point is... progressive just looks better.
AS WELL IT SHOULD, with twice as much res at any given time as an interlaced
pic.
I would rather watch a 720p picture than a 1080i, because the 720 lines on the screen
at any given time are more than the 540 you get from the chicanery of interlaced 1080i.
Anybody just eyeballing the two side by side can see that any given progressive
pic is more stable and better than any given interlaced pic.
And for the last time, any illusion of 1080 lines disappears when there's movement in an interlaced field.
A 480I WILL ONLY GIVE 240 LINES OF RESOLUTION.
This falls under the heading of "no such thing as a free lunch".
And a 1080i, where there's movement, will only give 540 lines of res; nice, but nowhere
near the 1080 lines from 1080p.
My electronics teacher first told me this, and Joe Kane said the very same thing in several writings for Widescreen Review, citing it as the reason he favored 720p over 1080i.
NOW WE HAVE 1080p sets and 1080i broadcast, which is a good combo, as the 1080p sets stitch the two 540-line fields together, giving a true 1080p pic, with no interlace artifacts or res loss.
And I REALLY DON'T CARE ABOUT YOUR OPINION, old sot.
I HAVE A 1080P SET right in front of me, and I am going to enjoy the glorious pic, no matter what your misinformed opinion is.:1:

Invader3k
06-13-2010, 09:20 AM
"Was the name of store Sears since that is only place I used to see Proscan TVs."

No. This was an audio company store that rhymes with "Blows".

pixelthis
06-13-2010, 09:39 AM
I saw a Proscan at Sears (rhymes with "tears") and almost bought it; glad now I DIDN'T.
Because it's a Korean knock-off, basically buying a defunct name and slapping it on a
generic Asian TV (is there any other kind?).
The picture was great, but what about longevity?
BTW, all this talk of 720p being "dead" -- it occurred to me that 1080i is DEAD.
At least from the receiving end; you can't buy one anymore, really.
However, as a broadcast medium it's going great; looks like we're heading to a 1080i
world, with 1080p sets deinterlacing it to 1080p.
Which provides excellent pictures, and really, isn't that what it's all about?
BTW Smoke, this is what a color pic looks like, for when you do upgrade.:1:

Sir Terrence the Terrible
06-13-2010, 11:06 AM
Pot calling the kettle black.
Been over this time and time again; this is what happens when lawyers try to do electronics.
Progressive is inherently better than interlaced.
A 1080i pic does have 1080 lines, but it's shown in two fields of 540 lines laced together;
any res above 540 is an illusion.
And you have the artifacts associated with interlaced pictures.
But the most important point is... progressive just looks better.
AS WELL IT SHOULD, with twice as much res at any given time as an interlaced
pic.
I would rather watch a 720p picture than a 1080i, because the 720 lines on the screen
at any given time are more than the 540 you get from the chicanery of interlaced 1080i.
Anybody just eyeballing the two side by side can see that any given progressive
pic is more stable and better than any given interlaced pic.
And for the last time, any illusion of 1080 lines disappears when there's movement in an interlaced field.
A 480I WILL ONLY GIVE 240 LINES OF RESOLUTION.
This falls under the heading of "no such thing as a free lunch".
And a 1080i, where there's movement, will only give 540 lines of res; nice, but nowhere
near the 1080 lines from 1080p.
My electronics teacher first told me this, and Joe Kane said the very same thing in several writings for Widescreen Review, citing it as the reason he favored 720p over 1080i.
NOW WE HAVE 1080p sets and 1080i broadcast, which is a good combo, as the 1080p sets stitch the two 540-line fields together, giving a true 1080p pic, with no interlace artifacts or res loss.
And I REALLY DON'T CARE ABOUT YOUR OPINION, old sot.
I HAVE A 1080P SET right in front of me, and I am going to enjoy the glorious pic, no matter what your misinformed opinion is.:1:

1080i has more resolution than 720p, and the same resolution as 1080p. These are indisputable facts.

A 1080p flat panel may display a full 1920x1080 image when static (some 1080p sets don't even do that), but when objects move across the screen, its resolution drops dramatically. Let's take your beloved Vizio set. When objects move on 1080p Vizio sets (all models), resolution drops to 330 lines. A 1080i RPTV will have 540 lines of information during motion. This is also a fact you cannot dispute.

LCD panels still suffer from motion blur, so even during fast motion, blurring will lower resolution even if the screen is painted in one pass. A 1080i CRT RPTV has no motion blur, so any advantage that progressive scanning has on motion is lost while the pixels refresh themselves.

The amount of detail you get on the screen is the same for both 1080p and 1080i. Though the amount of information (data throughput) transmitted through the cable is higher in the case of 1080p, the number of pixels you actually see is the same: 1920x1080.

You cannot see the two alternating 540-line fields being painted on the screen separately. The whole process of combining the two 540-line fields is so quick that it is transparent to the eye. If it were not, you would see flickering, and you don't. Both fields are on the screen within 1/30th of a second (each taking 1/60th of a second), which is faster than you can blink. Your eyes only see the combined fields, not the alternating ones.

Joe Kane's comments on 1080p and 1080i are based on unprocessed images. With very good processing, 1080i looks exactly like 1080p, as line twitter and stair-stepping (his two major points against 1080i) are totally eliminated. Motion processing can get rid of motion artifacts with 1080i. All of your assumptions are based on single-gun CRTs, not high-end projection systems and good outboard processing.

You can continue to deny these facts, but they don't go away because of your denial. In motion tests done by Displaydata, your coveted Vizio finished at the bottom of the pack in motion resolution, finishing lower than a single odd field of the interlace format (330 vs. 540 lines of information). So this whole "progressive is better than interlaced" argument, when looked at carefully, is not as simple as you say it is.

audio amateur
06-13-2010, 12:03 PM
Sir T, I don't get the part about 1080p dropping to 330p. Why is this?

Sir Terrence the Terrible
06-13-2010, 04:54 PM
Sir T, I don't get the part about 1080p dropping to 330p. Why is this?

What this refers to is the ability of the television's pixels to mechanically twist and update to the correct brightness frame by frame. LCD panels use a backlight, and the pixels determine how much light reaches the screen with each frame by twisting themselves to varying degrees to allow light through (called rise and fall). With static images, there is no movement between frames, so there is no need for the pixels to move or flex to new positions. Once there is movement, the pixels have to twist and update very quickly, especially when 120 and 240 Hz refresh rates are deployed; hence the difference in resolution between static images and moving ones. The refresh rate may be high, but if the pixel response is slow, it mitigates the benefit of the refresh rate. Once you get the response rate of the pixels and the refresh rate in sync (as Samsung and Sony have in their higher-end models), this effect becomes unnoticeable. How well an LCD panel does this is measured by its motion resolution, which tells how quickly either the processing or the pixels are correctly updating frame by frame with moving objects.

Samsung and Sony's upper lines of televisions maintain their full resolution with moving images. Vizio's televisions across their entire line were all at 400 lines of resolution (on their 1080p models) or less during motion, the slowest and worst of all televisions tested. The results were the same when they moved to the XVT line of televisions and higher refresh rates. The response of the pixels cannot keep up with the refresh rate, so the resolution goes down until they can fully update and twist into position.
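A back-of-the-envelope illustration of why slow pixel response costs motion resolution; every number below is made up for the example, not a measurement of any actual set:

[code]
# Toy model: a pixel that is still mid-transition when the next frame
# lands smears a moving edge across the screen. Illustrative numbers only.

frame_time_ms = 1000 / 60          # one frame at 60 Hz is ~16.7 ms
pan_speed_px_per_frame = 12        # object moving 12 pixels every frame

for response_ms in (8, 20):        # a fast panel vs. a slow panel
    # Fraction of each frame the pixel spends still transitioning:
    smear_fraction = min(response_ms / frame_time_ms, 1.0)
    smear_px = pan_speed_px_per_frame * smear_fraction
    print(f"{response_ms} ms response: ~{smear_px:.1f} px of smear per edge")
# A 20 ms pixel never finishes settling inside a 16.7 ms frame, so moving
# edges stay smeared and measured motion resolution drops, even though
# the static pixel grid is still a full 1920x1080.
[/code]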

pixelthis
06-14-2010, 02:25 AM
Sir T, I don't get the part about 1080p dropping to 330p. Why is this?

Because it's BS is why.
LCD pixels are like a shutter; they turn on and off, allowing light to pass.
These "magic twisting pixels" are just the latest from the plasma crowd (primarily PANASONIC) MAKING AN EFFORT to slam LCD in general and Vizio in particular.
EVEN IF IT DOES EXIST, it makes no difference; my set has been measured at
FULL resolution.
BASICALLY this is a way of saying that I am an idiot who can't tell the diff between
330 lines of res and 1080p.
Insults such as this from this hateful, vindictive, personality-challenged bureaucrat
are the main reason this site is such a ghost town; people get tired of putting up with it.
As long as plasma backers like PANASONIC try to propagandize the virtues of plasma, trying in vain to save their idiotic format (something to do with a 300 million dollar factory),
shills like talky are going to be around.
THE SPEED at which pixels open and close only affects response time, BTW;
it has nothing to do with rez.
And my brand of TV (Vizio) is number one because everybody's an idiot except talkie and his made-up "tests" that have nothing to do with the real world.
And I AM STILL going to be enjoying my set, along with millions of others, tomorrow,
and when I AM READY, will probably be buying my third VIZIO.
A great monitor at a great price.:1:

pixelthis
06-14-2010, 03:01 AM
1080i has more resolution than 720p, and the same resolution as 1080p. These are indisputable facts.

It's an indisputable fact that you can't do simple math.
A 1080i screen produces the illusion of 1080 lines by displaying two fields of 540 lines,
just like a 480i pic shows 480 lines by showing first one field and then another.
Your eye isn't fast enough to see the first field fade before the second is "interlaced"
between it, and this gives a great picture, but when there's movement of any
kind the picture collapses back to 540 lines.
This is why still graphics, etc., look so good on interlaced sets.
Not that it matters; 1080i is a DEAD format. You can't buy a 1080i set anymore,
excepting some esoteric crap.



A 1080p flat panel may display a full 1920x1080 image when static (some 1080p sets don't even do that), but when objects move across the screen, its resolution drops dramatically. Let's take your beloved Vizio set. When objects move on 1080p Vizio sets (all models), resolution drops to 330 lines. A 1080i RPTV will have 540 lines of information during motion. This is also a fact you cannot dispute.

SO you finally agree with me about the true res of 1080i being 540 lines.
STILL haven't explained how that beats 720p, which has 720 lines at all times.
THE effect of a sharp drop in res only affects interlaced pictures; it doesn't affect progressive pictures.
When you think about it, there's no way it can.
On a 1080p set, one thousand and eighty lines are in each frame.
Half of those aren't going to "disappear" just because something's moving.
It will disappear on an interlaced set because there's only half the lines on the screen at any given time, and interlaced resolution is an illusion at best.


LCD panels still suffer from motion blur, so even during fast motion, blurring will lower resolution even if the screen is painted in one pass. A 1080i CRT RPTV has no motion blur, so any advantage that progressive scanning has on motion is lost while the pixels refresh themselves.

Motion "blur" used to be a problem, but once we got into 8 mill, it passed a point where the human eye could detect any blur, and most sets these days are 4mil, and 120 hz sets have even less of a problem.
LCD sets have gotten a lot of blame from cheap cable and sat boxes, with their underpowered procs.


The amount of detail you get on the screen is the same for both 1080p and 1080i. Though the amount of information (data throughput) transmitted through the cable is higher in the case of 1080p, the number of pixels you actually see is the same: 1920x1080.

Unless there's movement in the 1080i pic.
NOT THAT IT MATTERS, as 1080i sets are just about dead.



You cannot see the two alternating 540-line fields being painted on the screen separately. The whole process of combining the two 540-line fields is so quick that it is transparent to the eye. If it were not, you would see flickering, and you don't. Both fields are on the screen within 1/30th of a second (each taking 1/60th of a second), which is faster than you can blink. Your eyes only see the combined fields, not the alternating ones.

THIS is funny as he**.
By the time the second field is painted, the first is already fading.
THE EYE CAN'T DETECT IT, but when there's movement the res drops; it has nothing to do with what the eye can "detect".
And "flicker" has plagued interlaced pics since the dawn of TV.



Joe Kane's comments on 1080p and 1080i are based on unprocessed images. With very good processing, 1080i looks exactly like 1080p, as line twitter and stair-stepping (his two major points against 1080i) are totally eliminated. Motion processing can get rid of motion artifacts with 1080i. All of your assumptions are based on single-gun CRTs, not high-end projection systems and good outboard processing.

Which is ridiculously expensive, even more so as the CRT is mostly a museam
piece, and anything built to fix its shortcomings has to be custom made.
A display with a native 1080p is cheaper and better, not to mention more
versatile.
You can keep a horse, most people don't.
AUDIOPHILES CAN HANG ONTO THEIR OBSOLETE turntables; that's relatively
inexpensive. Very few are going to be sticking with tubes.


You can continue to deny these facts, but they don't go away because of your denial. In motion tests done by Displaydata, your coveted Vizio finished at the bottom of the pack in motion resolution, finishing lower than a single odd field of the interlace format (330 vs. 540 lines of information). So this whole "progressive is better than interlaced" argument, when looked at carefully, is not as simple as you say it is.


No, but after months of hitting you in the head with a rock, I HAVE FINALLY
convinced you about the true resolution of interlaced formats.
Like they said in the movie "Contact"...
SMALL STEPS... SMALL STEPS.
My set has been measured, but I AM GOING TO MY AUDIO DEALER
(who also sells TV sets, and has a full service bay) to see if I can get a test pattern on it.
You can keep your "twisting pixels".:1:

Smokey
06-14-2010, 06:31 PM
Your eye isn't fast enough to see the first field fade before the second is "interlaced" between it, and this gives a great picture, but when there's movement of any kind the picture collapses back to 540 lines.

I really would not say "movement of any kind". I don't know how to measure movement, but if the speed of the movement is slower than the refresh rate of the interlaced picture (1/30th of a second per frame), then I don't think there would be any loss of resolution.

You get a better idea of the timing if you look at the frame rate of film (24 fps). At 24 frames per second, film captures most (if not all) of the motion in a scene pretty smoothly. So it is a pretty good assumption that the movement was slower than what 24 frames per second can capture. So why would you think it would be any different with the 30 frames per second rate of an interlaced picture?

The kind of movement speed you are talking about, which would cut resolution to 540 lines (faster than 1/30th of a second), would be hard even for film at 24 fps to capture without looking choppy.

Sir Terrence the Terrible
06-14-2010, 06:52 PM
It's an indisputable fact that you can't do simple math.
A 1080i screen produces the illusion of 1080 lines by displaying two fields of 540 lines,
just like a 480i pic shows 480 lines by showing first one field and then another.
Your eye isn't fast enough to see the first field fade before the second is "interlaced"
between it, and this gives a great picture, but when there's movement of any
kind the picture collapses back to 540 lines.
This is why still graphics, etc., look so good on interlaced sets.
Not that it matters; 1080i is a DEAD format. You can't buy a 1080i set anymore,
excepting some esoteric crap.

You are a straight-up lying pile of crap. There is no fade while the odd and even fields combine. The odd lines are displayed first, then the even. The screen refreshes as the alternating fields are displayed. There is no fading when the screen is refreshed. For the English-challenged:

Refresh:

To renew (the image on a display screen) by renewing the flow of electrons from the cathode-ray tube.

You cannot fade and refresh at the same time, dummy.


SO you finally agree with me about the true res of 1080i being 540 lines.

No I don't. The true resolution of 1080i is 1080 lines, not 540. The two separate fields each carry 540 lines, but they are combined into a 1080-line image. 1080i is 1080i; if it were 540, it would be called 540i, and there is no such animal as 540i.


STILL haven't explained how that beats 720p, which has 720 lines at all times.
THE effect of a sharp drop in res only affects interlaced pictures; it doesn't affect progressive pictures.
When you think about it, there's no way it can.
On a 1080p set, one thousand and eighty lines are in each frame.
Half of those aren't going to "disappear" just because something's moving.
It will disappear on an interlaced set because there's only half the lines on the screen at any given time, and interlaced resolution is an illusion at best.

You still are piping up the lies, huh Pix. Facts....

720p = 921,600 pixels
1080i/p = 2,073,600 pixels

720p = 720 lines of information
1080 = 1080 lines of information

720p's data rate is 55.3 million pixels/sec

1080i's data rate is 62.2 million pixels/sec

If you cannot see that 1080i is more resolution than 720p, then you are a technical neophyte, or a card-carrying ignoramus.


Motion "blur" used to be a problem, but once we got into 8 mill, it passed a point where the human eye could detect any blur, and most sets these days are 4mil, and 120 hz sets have even less of a problem.
LCD sets have gotten a lot of blame from cheap cable and sat boxes, with their underpowered procs.

You are a gullible idiot. Motion blur is still a problem, and only the very highest-performing sets do 8 ms or even a number close to that. Manufacturers always overstate the specs of their sets, ALWAYS. When Displaydata measured your beloved Vizio line of panels, their panel response times (which have nothing to do with refresh rate) were 20+ ms on average. Vizio was using gray-to-gray measurements to represent black (active pixel) to white (inactive pixel) transitions. The best Sony and Samsung televisions have 8 ms, not 4. The refresh rate has nothing to do with the rise and fall times (pixel response) of the pixels themselves. You are trying to mix two unrelated processes together to make your misinformed point.


Unless there's movement in the 1080i pic.
NOT THAT IT MATTERS, as 1080i sets are just about dead.

I guess if you keep repeating that 1080i sets are dead, you think you don't have to present correct information on it... dead or not. Since there are millions of CRT-based RPTVs still in operation, it ain't dead till they are. Secondly, if there is movement, then your prized high-performing Vizio panel does even worse than a CRT-based television in that regard.


THIS is funny as he**.
By the time the second field is painted, the first is already fading.
THE EYE CAN'T DETECT IT, but when there's movement the res drops; it has nothing to do with what the eye can "detect".
And "flicker" has plagued interlaced pics since the dawn of TV.

Your nose reaches from your hometown (Hickville) to Paris and back. After each field is displayed on the screen, it is refreshed. A refreshed image does not fade, or it would defy the very meaning of refreshed. When there is movement, the process does not change: each field is displayed and refreshed. Each 540-line field is alternately displayed. When images move on your high-performing Vizio panel, resolution drops from a static 1080p down to 330 lines because of the slow rise and fall times of each pixel. Test after test after test has shown this, and your denying it does not change the facts one bit.
Flicker was a problem in the first few years of B&W television; it has not been a problem since televisions went to 60 Hz refresh rates.



Which is ridiculously expensive, even more so as the CRT is mostly a museam
piece, and anything built to fix its shortcomings has to be custom made.
A display with a native 1080p is cheaper and better, not to mention more
versatile.

Oh really. My museam piece (just to copy your wrong spelling) can display 480i, 480p, 540p, 576p, 720p, 768p, 1080i and 1080p, all natively. Can your Vizio do that? I don't think so; a panel has only one native resolution. So much for your versatility argument. Panels have never been cheaper than CRT-based sets when the screen sizes were equal. Don't let facts trip you up, liar!



You can keep a horse, most people don't.

I know quite a few people who own horses. And the folks that don't, don't need them or ride them.


AUDIOPHILES CAN HANG ONTO THEIR OBSOLETE turntables; that's relatively
inexpensive. Very few are going to be sticking with tubes.

There are a lot of people right on this website that prove you wrong daily.



No, but after months of hitting you in the head with a rock, I HAVE FINALLY
convinced you about the true resolution of interlaced formats.
Like they said in the movie "Contact"...
SMALL STEPS... SMALL STEPS.
My set has been measured, but I AM GOING TO MY AUDIO DEALER
(who also sells TV sets, and has a full service bay) to see if I can get a test pattern on it.
You can keep your "twisting pixels".:1:

The only thing you have managed to do is lie to yourself. You haven't convinced me of anything, because everyone knows that a 1080i television is not 540 lines.

Your set has been measured... 1080p with static images, and 330 lines with moving ones.

When you go to your audio dealer to get your television measured, tell your dealer he will need a CCD camera, a single-point light sensor to trigger the camera, and a PC-controlled video signal generator to do the test. They'll need a piece of software called LCDacquire, which controls the camera: it includes a transition generator and is in charge of image acquisition. The second piece of software they will need is called LCDprocess. It performs the data processing by computing the LCD response times from the images acquired. Tell them the test consists of black-to-white-to-black transitions, so they won't cheat the test like Vizio does (measuring the black-to-white transition only). When he finishes spending the $10-15,000 just to test your television, let me know how it works out. Static test patterns won't tell you anything about the response time of your panel.

If the pixels don't twist and untwist, what do you think they do, the funky chicken?

Woochifer
06-14-2010, 07:20 PM
With 1080p TVs, theoretically the 1080i channels should look better than the 720p channels, since neither the cable box/satellite receiver nor the TV has to do any rescaling, just deinterlacing. But as you said, other factors also go into picture quality.

As T points out, this depends greatly on the processing. The 720p-to-1080p conversion involves only rescaling, while the 1080i-to-1080p conversion involves deinterlacing, a more complex process that demands more from the processor.

In my experience with HD broadcast channels, the 1080i picture quality is a lot more variable. Cable and satellite companies alike reduce the bandwidth on each channel, and from what I can see, the 720p stations are more consistent. But when a 1080i channel has a good signal, the picture quality can be astonishing.


We don't have Costco around here :D

They do have a website, y'know ... :1:


I still watch alot of SD materials from DVD and cable, so haven't felt the urge yet. But on the side, been eyeing 37 inch LCD TVs which I have a perfect space for.

You still watch a lot of SD material because you can't watch HD! Check your channel selections. The vast majority of channels are now available in HD, and your cable company probably carries most of the top viewed HD channels. Shortage of HD programming got taken off the list of excuses more than two years ago.

Time is way past to stop eyeing and start buying!


Was the name of store Sears since that is only place I used to see Proscan TVs.

Proscan TVs were available in most of the higher end electronics stores in my area.

Woochifer
06-14-2010, 07:34 PM
Because it's BS is why.
LCD pixels are like a shutter; they turn on and off, allowing light to pass.
These "magic twisting pixels" are just the latest from the plasma crowd (primarily PANASONIC) MAKING AN EFFORT to slam LCD in general and Vizio in particular.

Hmmm, so why are Samsung and Sony citing the same motion resolution test results for their high end LCD TVs? Are they part of this anti-LCD, anti-Vizio conspiracy too? :out:


EVEN IF IT DOES EXIST, it makes no difference; my set has been measured at
FULL resolution.

Using which test? And whose benchmarks?


BASICALLY this is a way of saying that I am an idiot who can't tell the diff between
330 lines of res and 1080p.

For once, the truth.


Insults such as this from this hateful, vindictive, personality-challenged bureaucrat
are the main reason this site is such a ghost town; people get tired of putting up with it.

Right, so people with the highest reputation points are the ones creating trouble on this site. :out:


As long as plasma backers like PANASONIC try to propagandize the virtues of plasma, trying in vain to save their idiotic format (something to do with a 300 million dollar factory),
shills like talky are going to be around.

Obsess much about what others prefer?


THE SPEED at which pixels open and close only affects response time, BTW;
it has nothing to do with rez.
And my brand of TV (Vizio) is number one because everybody's an idiot except talkie and his made-up "tests" that have nothing to do with the real world.

As opposed to your claims about FULL resolution on your TV, which aren't backed up by any benchmark tests whatsoever?


and when I AM READY, will probably be buying my third VIZIO.
A great monitor at a great price.:1:

Yet another poor TV that you're about to kick to the curb. If they're so great, why do you keep buying new ones?

markw
06-15-2010, 09:10 AM
I can't remember when I last saw a throw-down this good here!

ding... Round two!

(Good subject, smoke!)

pixelthis
06-15-2010, 10:42 AM
No I don't. The true resolution of 1080i is 1080, not 540 lines. The two separate fields are 540 lines each, but they are combined into a 1080i image. 1080i is 1080i; if it were really 540 lines, it would be called 540i, and there is no such animal as 540i.

No, a more accurate label would be 540p, since there are only 540 lines on the screen at any given time, interlaced with rapidly fading lines from the last field painted.



You are still piping up the lies, huh Pix. Facts...

720p = 1280 x 720 = 921,600 pixels
1080i/p = 1920 x 1080 = 2,073,600 pixels

720p = 720 lines of information
1080 = 1080 lines of information.

720p data rate is 55.3 million pixels/sec

1080i data rate is 62.2 million pixels/sec.


If you cannot see that 1080i is more resolution than 720p, then you are a technical neophyte, or a card-carrying ignoramus.

And you are a technical know-nothing.
The data rate difference between 720p and 1080i that you stated was 7.1.
Seven point one!
But a 1080i pic has (supposedly) 2,073,600 pixels to a 720p set's 921,600, so the data rate should be a lot higher, huh?

Having trouble figuring that one out?
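For anyone actually trying to figure it out, the arithmetic is short. A sketch assuming standard ATSC timings (720p at 60 full frames per second, 1080i at 60 fields, i.e. 30 full frames per second):

px_720p = 1280 * 720             # 921,600 pixels per frame
px_1080 = 1920 * 1080            # 2,073,600 pixels per frame
rate_720p = px_720p * 60         # 60 full frames/sec = 55,296,000 px/s
rate_1080i = px_1080 * 30        # 60 fields/sec = 30 full frames/sec = 62,208,000 px/s
print(rate_1080i - rate_720p)    # 6,912,000 px/s -- the gap being argued over
print(rate_1080i / rate_720p)    # 1.125 -- not 2.25

1080i halves the temporal rate (30 full frames instead of 60), which is what cancels most of its 2.25x per-frame pixel advantage in the data-rate figure.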





You are a gullible idiot. Motion blur is still a problem, and only the very highest-performing sets do 8 ms, or even a number close to that. Manufacturers always overstate the specs of their sets, ALWAYS. When Displaydata measured your beloved Vizio line of panels, its panel response times (which have nothing to do with refresh rate) were 20+ ms on average. Vizio was using gray-to-gray measurements to represent black (active pixel) to white (inactive pixel) transitions. The best Sony and Samsung televisions have 8 ms, not 4. The refresh rate has nothing to do with the rise and fall times (pixel response) of the pixels themselves. You are trying to mix two unrelated processes together to make your misinformed point.

I am not the "gullible idiot" you hope for with your skewed "tests" (propaganda) from this website that is funded by, let me guess... Panasonic, the number one maker of plasma screens.
Excuse me if I ignore the propaganda coming from Panasonic's marketing dept.
In other words, go peddle your dog and pony show somewhere else.


I guess if you keep repeating that 1080i sets are dead, you think that you don't have to present correct information on them... dead or not. Since there are millions of CRT-based RPTVs still in operation, it ain't dead till they are. Secondly, if there is movement, then your prized high-performing Vizio panel does even worse than a CRT-based television in that regard.

Not really.
Yeah, I guess there are a few RPTVs and tube sets still out there in the various trailer parks, bought at the local thrift store for 50-100 bucks.
But that doesn't matter; you can't buy a 1080i in a store, rendering the form factor dead (like your pre-frontal cortex).





Your nose reaches from your hometown (Hickville) to Paris and back. After each field is displayed on the screen, it is refreshed. A refreshed image does not fade, or it would defy the very meaning of refreshed. When there is movement, the process does not change: each field is expressed and refreshed. Each 540-line field is alternately displayed. When images move on your high-performing Vizio panel, resolution drops from a static 1080p down to 330p because of the slow rise and fall times of each pixel. Test after test after test has shown this, and your denying it does not change the facts one bit.


Tests performed by a company desperately trying to keep their outdated plasma form factor alive long enough to amortize their huge, brand-new plasma panel factory.


Flicker was a problem in the first few years of B&W television; it has not been a problem since televisions went to 60 Hz refresh rates.
Not if you're blind.




Oh really. My museam piece (just to copy your wrong spelling) can display 480i, 480p, 540p, 576p, 720p, 768p, 1080i and 1080p, all natively. Can your Vizio do that? I don't think so; a panel has only one native resolution. So much for your versatility argument. Panels have never been cheaper than CRT-based sets when the screen sizes were equal. Don't let facts trip you up, liar!

Actually it can.
It just scales the resolution to its native resolution, like all new panels can.
When scalers and deinterlacers were expensive that was a problem; not any more.
Yeah, CRT can handle multiple resolutions; my first HDTV could handle 480p and 1080i.
Big deal. A horse can wade through a swamp; a Prius can't.
So what's your point?




I know quite a few people who own horses. And the folks that don't, don't need them or ride them.
Well, I know nobody with a 1080i TV set.
Get with it.



There are a lot of people right on this website that prove you wrong daily.

I am wrong on occasion, but I don't lie and quote obviously biased tests in order to slander things for a big corp with ulterior motives.



The only thing you have managed to do is lie to yourself. You haven't convinced me of anything because everyone knows that a 1080i television is not 540 lines.

I couldn't care less what everyone "knows".
1080i is a good format, but with some problems.
And everyone used to "know" the earth was flat. Was it?


Your set has been measured...1080p with static images, and 330p with moving ones.

Progressive sets don't lose resolution when there is movement, and the ones doing this "measurement" have a 300 million dollar plasma factory to pay for; not much credibility there.



When you go to your audio dealer to get your television measured, tell your dealer he will need a CCD camera, a single-point light sensor to trigger the camera, and a video signal generator controlled by a PC to do the test. They'll need a piece of software called LCDacquire, which controls the camera: it includes a transition generator and is in charge of the image acquisition. The second piece of software they will need is called LCDprocess. It performs the data processing by computing the LCD response times from the images acquired. Tell them the test consists of black-to-white-to-black transitions, so they won't cheat the test like Vizio does (measuring the black-to-white transition only). When he finishes spending the $10-15,000 just to test your television, let me know how it works out. Static test patterns won't tell you anything about the response time of your panel.
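For what it's worth, the data-processing step described there boils down to something simple. A minimal sketch with made-up sample data (real rigs capture the luminance trace with a CCD at far higher sample rates):

# Response time from a sampled luminance trace of one black -> white -> black
# transition, quoted as 10-90% rise time plus 90-10% fall time.
def first_crossing(trace, level, rising, dt_ms):
    for i, value in enumerate(trace):
        if (value >= level) if rising else (value <= level):
            return i * dt_ms

def bwb_response_ms(trace, dt_ms):
    lo, hi = min(trace), max(trace)
    p10 = lo + 0.1 * (hi - lo)
    p90 = lo + 0.9 * (hi - lo)
    peak = trace.index(hi)
    rise = first_crossing(trace[:peak + 1], p90, True, dt_ms) \
         - first_crossing(trace[:peak + 1], p10, True, dt_ms)
    fall = first_crossing(trace[peak:], p10, False, dt_ms) \
         - first_crossing(trace[peak:], p90, False, dt_ms)
    return rise + fall

trace = [0, 5, 40, 80, 100, 100, 95, 60, 25, 8, 0]   # hypothetical, 2 ms/sample
print(bwb_response_ms(trace, 2.0))                   # -> 8.0 ms

Gray-to-gray figures run the same computation between intermediate levels, where liquid crystals are at their fastest, which is how a slow panel can still advertise a quick number.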

Great thing about you, talky: you outargue yourself.

Basically my set measures 1080 lines when measured in established tests for consumer items.
This "test" of yours is for data-grade lab equipment; it's nothing short of ridiculous to measure consumer sets this way, like saying a Ford Mustang comes up short because it is slower than a Corvette by .00005 seconds.
Not to mention that the "transitions" (LCD pixels opening and closing) have nothing to do with resolution whatsoever, and have very little tolerance.
Basically you're saying my set has 330 lines because it doesn't meet data-grade tests; you haven't actually measured the moving resolution, because it's impossible to test moving resolution. You are inferring something you can't measure, in other words using a "test" that has nothing to do with resolution.
Misleading at best, intellectually dishonest certainly.
A nice way of saying you're "lying like a rug".



If the pixels don't twist and untwist, what do you think they do, the funky chicken?
Your brain needs to untwist.
People understand the concept of a shutter opening and closing; most don't understand what an LCD pixel really does, so call it a shutter.:1:

Sir Terrence the Terrible
06-15-2010, 07:58 PM
I see this issue has got your prostate all swollen up. You can bury your head in the sand, but I can guarantee it does not taste so hot, and it does not erase the facts.

An independent company that does not make televisions did the testing. They are not owned by any CE company, so this whole conspiracy point you keep bringing up shows that you have zero confidence in what you believe, or you don't know $hit about televisions.

I have never heard of a pixel being called a shutter. A shutter is just one piece that makes up a sub-pixel, and you need three sub-pixels to make a single pixel. An LCD television's resolution is measured by its pixel count, not its shutter count.

The alternating fields do not fade; that is also made up. The screen refreshes after each alternating field is displayed. That is a fact; you can look it up (if you can actually read).

I never mentioned anything about a data difference of 7.1 in any of my posts. If you can find it, post it. If you cannot, you lied again, liar.

The data rate AND the pixel count show that there is more resolution in a 1080i set than there is in a 720p set. So we both agree on that, and can put that to rest.

No one "skewed" the test(skewers are for kabobs), Displaydata testing methodology is well understood. They used the setup I recommended to your "audio" dealer. LOLOL

Panasonic did not run the tests, Displaydata did. They are a completely impartial third party that has no relationship with any CE company. They just test televisions and report the data. Your conspiracy argument that Panasonic did the testing is not truthful at all. If you don't like the results, too damn bad; you cannot change a thing. Maybe in the future you will purchase a quality set that does not send you into denial over its tested performance.

Tube sets are alive and well, and are still selling quite well. Maybe not in America, but they are in a lot of countries.

If flicker was a problem with CRT's, nobody would buy them. It would give them a headache. The fact that they actually were the number two selling display device in the first quarter of 2009 shows there is still much kick in their sales. Since nobody is reporting headaches or dizziness, even mentioning flicker is disingenuous.

There is no panel on the market that has multiple native resolutions. What they can accept and what they display may be different (480p to 1080p), but they have only one native resolution. Higher-end CRTs do not have a native resolution. They are only limited by the processing in the television, so they can display multiple resolutions. That was my point, and it does not surprise me that the point escapes you.

It does not surprise me that you don't know anyone that owns a 1080i set. A person living alone in a trailer probably does not have many friends(if any).

You said you don't lie, which is a lie. See how quickly you can betray yourself! You said you were wrong on "occasion". That is another lie, so you have told two lies in the same response. You have told at least two dozen in this thread, which makes you a perpetual and consistent liar (or a chain liar, or a habitual liar... shall I go on?)

LCD panels do lose resolution when images are moving. It is a proven fact. There is something called a response time, and unless that response time is zero (and no panel's is), the panel is going to have blur. When a panel is properly measured using the industry-accepted protocol (black to white to black), the panel response time is quite a bit longer than 4 ms. Vizio is using gray-to-gray transitions to measure the response times of their panels, and everyone knows that cheats the entire test. Rise times are much faster than fall times, and Vizio does not measure the fall time, which skews the test in their favor. That is what you do to hide a low-budget, low-performing panel's real response time.


Basically my set measures 1080 lines when measured in established tests for consumer items.

Displaydata's testing is for consumer sets, and your set does not even measure 1080 lines in a static test. It is more like 900 lines for your set. The newer XVT line is the only Vizio line that resolves 1080 lines using static images. Even that line (Vizio's top-of-the-line sets) has resolution dropping to between 300-400 lines of information during movement.


This "test" of yours is for data grade lab equipment, its nothing short of rediculous
to measure consumer sets this way, like saying a ford mustang comes up short because it is slower than a Corvette by .00005 SECONDS.

The equipment the test uses is data-grade lab equipment; the televisions the equipment measures are consumer brands, all of them. Your Ford Mustang analogy is piss poor at best, and sorry as hell at worst.


Not to mention that the "transitions" (LCD pixels opening and closing) have nothing to do with resolution whatsoever, and have very little tolerance.

Oh, but you are wrong on this. With moving objects, the twisting and untwisting of the pixels has everything to do with resolution. If they cannot respond quickly enough, resolution drops and blurring occurs. Look it up rather than lying about it.


Basically you're saying my set has 330 lines because it doesn't meet data-grade tests; you haven't actually measured the moving resolution, because it's impossible to test moving resolution. You are inferring something you can't measure, in other words using a "test" that has nothing to do with resolution.
Misleading at best, intellectually dishonest certainly.
A nice way of saying you're "lying like a rug".

Is it impossible? I don't think so. All the camera has to do is snap the moving images, and you can see the blurring occurring. The software looks at how much blurring is occurring and measures the resolution of the image. That is not difficult at all when you have the right equipment. You need to keep up with technology, pix; that way you do not have to worm your way out of a technical hole when you fall into one (which you do too frequently).
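The back-of-the-envelope version of that measurement looks something like this (the numbers are purely illustrative, not Displaydata's actual protocol):

# A pattern scrolling at `speed` px/s smears each edge across roughly
# speed * response_time pixels; line pairs narrower than the smear
# stop being resolvable.
def motion_resolution(native_lines, screen_px, speed_px_s, response_s):
    blur_px = speed_px_s * response_s        # width of the smear
    line_pitch = screen_px / native_lines    # height of one test-grid line
    if blur_px <= line_pitch:
        return native_lines                  # smear hides inside one line
    return int(screen_px / blur_px)          # resolvable lines that remain

# 1080-line grid on a 1080-px-tall panel, scrolling at half a screen
# height per second, with a 6 ms effective response time:
print(motion_resolution(1080, 1080, 540, 0.006))   # -> 333

Which is why a panel can test at full resolution on a static pattern and at a few hundred lines the moment the pattern moves.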


No, a more accurate label would be 540p, since there are only 540 lines on the screen at any given time, interlaced with rapidly fading lines from the last field painted.

That is not even accurate. For that to be accurate, we would have to actually see the odd or even fields displayed separately, not combined. Since we never see the odd and even fields displayed separately (we only see them combined), 540p is not accurate at all. And since no physical device can fade and refresh at the same time, you are completely wrong about the idea of a fading field. It never fades; it is continually refreshed after each field is painted.

For a person to proclaim themselves an expert, they have to actually know the basics first. Since you haven't even grasped the basics quite yet, you are a neophyte posing as an expert. Recognize your role, and conform. That means: realize that you know very little, and go back to your box (or trailer) until you learn more.

pixelthis
06-16-2010, 01:11 PM
Hmmm, so why are Samsung and Sony citing the same motion resolution test results for their high end LCD TVs? Are they part of this anti-LCD, anti-Vizio conspiracy too? :out:

If they drink from the same tainted well (the propaganda website backed by Panasonic) and use those skewed tests for extrapolated results, it would explain a lot.
Testing any consumer device with this idiotic "test" is ridiculous.
Guess I need to kick my 12-year-old Mercury to the curb; it can't keep up with a 2010 Vette.
Which is just as ridiculous as slamming a consumer monitor because it can't meet data-grade "tests" that really don't prove anything, because the peeps who came up with said "tests" have a HUGE financial stake in plasma, and in making LCD look bad by any means possible.
This test has been rigged to showcase one of the few advantages of plasma at the expense of LCD, an advantage that doesn't matter at all.


Using which test? And whose benchmarks?

Well, I'll tell ya, it wasn't in a million-dollar lab using cleanroom conditions and computers programmed by people who wanted a certain result.
You can't measure "motion resolution" (no such thing), so they rigged a test and extrapolated a result from the skewed results.



For once, the truth.
Need some around here.



Right, so people with the highest reputation points are the ones creating trouble on this site.

Like those "rep" points mean squat, since you hand them out to each other for your own purposes.
They are worth about as much as the word of a ten dollar crack ho who professes her chastity.



Obsess much about what others prefer?
No, but I am worried about people who don't know better, who fall for the tech doublespeak nonsense, and might make a costly mistake as a result.
Talky, for instance, scared someone looking for information, talking about non-existent "sync" problems when using a non-HDMI receiver. He had the poor chap so worried he was planning to run component for this ghost "problem", which means he wouldn't have enjoyed all of the advantages of Blu, until I explained it was not a problem and saved him some trouble, although if he'd followed talky's advice he probably deserved it.
So where's my greenie?



As opposed to your claims about FULL resolution on your TV, which aren't backed up by any benchmark tests whatsoever?

Nope, just a test pattern, and other methods that have been in use for decades,
which you and the other corp shills on this board hate, because they don't produce the results you want, so you can demonize one form factor over another, one brand over another



Yet another poor TV that you're about to kick to the curb. If they're so great, why do you keep buying new ones?
My folks wanted my last one, and I want to upgrade to 1080p; now there's LED backlighting and 120 Hz, not to mention a larger screen (at least a 46"), so I am saving my pennies.
Anybody that knows me will tell you that I change sets every two or three years. I don't get emotionally attached to an inanimate object, and I have plenty of peeps who will give me top dollar for my old set; they know I take care of them, and know what I am buying.
With the rapid rate of improvement that's the only way to stay on top, and not that expensive.
Maybe I can get talky's "people at work" to cobble me one together like they did for him (supposedly).
Naw, most likely I will just get a real monitor, probably at my local TV/audio shop, so they can ISF it before I pick it up.
Then I can read talkie's nonsense about how it has 100 lines of resolution because it's not a 1080i projector using top-secret CRT tubes and circuit boards hustled from the CIA.:1:

E-Stat
06-16-2010, 03:13 PM
No one "skewed" the test(skewers are for kabobs)
That's what I was thinkin' :)


If flicker was a problem with CRT's, nobody would buy them. It would give them a headache.
I might disagree here at least in the computer display context. As one who has spent the last thirty years viewing a computer monitor, I was delighted to move to LCDs for that application. For me, it was an effect that was only noticed when it was absent. You are most certainly correct, however, when it comes to the question of native resolution with LCD monitors. Unfortunately, for some of my web presentations, I must dumb down the laptop below that setting and everything gets a bit grainy. Similarly, standard 480p cable video content looks a bit worse than with the old tube televisions. It is only when they are displaying HD they are really in their element.


It does not surprise me that you don't know anyone that owns a 1080i set. A person living alone in a trailer probably does not have many friends(if any).
Hey, thanks for the reminder as I am one of those individuals with a five and a half year old Samsung 61" 1080i DLP! With the old Oppo DVD player, I was limited to the 720p setting. After reading your post, however, I changed the Samsung Blu-Ray player's setup to 1080i. I'm also finding that indeed internet connectivity can be handy. While running CAT-5 to the family room where the player lives wasn't practical, I was able to reconfigure one of my access points as a bridge and tucked it behind the monitor. That fooled the BR player into thinking it had a wired connection. While I was updating the resolution setting today, it told me I had yet another firmware update. I'm all about getting the latest fixes. :)

Thanks again for all your video tips. You 'da man!

rw

Woochifer
06-16-2010, 07:16 PM
If they drink from the same tainted well (the propaganda website backed by Panasonic) and use those skewed tests for extrapolated results, it would explain a lot.
Testing any consumer device with this idiotic "test" is ridiculous.

"rediculous" only because you don't like the results. There's nothin "extrapolated" about that test. It's a very simple scoping test pattern that's uses the same grid pattern as a static resolution test. CNET, Home Theater, HDTV Test, and the manufacturers themselves now use those test results in their benchmark specs. The only thing "idiotic" is your persistence in denying the results.


Which is just as ridiculous as slamming a consumer monitor because it can't meet data-grade "tests" that really don't prove anything, because the peeps who came up with said "tests" have a HUGE financial stake in plasma, and in making LCD look bad by any means possible.
This test has been rigged to showcase one of the few advantages of plasma at the expense of LCD, an advantage that doesn't matter at all.

Like I keep pointing out, if the test is so rigged, then how come the sequential LED-backlit LCD models perform just fine in that test? If 60 Hz LCD had no issues with motion resolution, then why have all of the LCD manufacturers continued to bump up the refresh rates, and gone with scanning backlights and motion interpolation, if motion resolution was not an issue, as you claim? Geez, even Vizio is claiming that their 240 Hz TVs reduce motion blur. Are they lying too?
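As a side note on what those 120/240 Hz sets are doing, here is a deliberately naive sketch of frame interpolation (real sets estimate per-block motion vectors and shift pixels along them; straight averaging like this would ghost on fast motion, but it shows where the extra frames come from):

def double_frame_rate(frames):
    # Naive 60 Hz -> 120 Hz: insert one blended frame between each
    # original pair. Frames here are 2-D lists of pixel values.
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        mid = [[(p + q) / 2 for p, q in zip(row_a, row_b)]
               for row_a, row_b in zip(a, b)]
        out.append(mid)
    out.append(frames[-1])
    return out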


Well, I'll tell ya, it wasn't in a million-dollar lab using cleanroom conditions and computers programmed by people who wanted a certain result.
You can't measure "motion resolution" (no such thing), so they rigged a test and extrapolated a result from the skewed results.

Of course you can measure it. Most of the major HDTV review sites already do.


Like those "rep" points mean squat, since you hand them out to each other for your own purposes.

Like the motion resolution tests, you make this claim only because you drew the short stick, as usual.


They are worth about as much as the word of a ten dollar crack ho who professes her chastity.

I'll defer to your expertise on the subject of crack hos.


No, but I am worried about people who don't know better, who fall for the tech doublespeak nonsense, and might make a costly mistake as a result.
Talky, for instance, scared someone looking for information, talking about non-existent "sync" problems when using a non-HDMI receiver. He had the poor chap so worried he was planning to run component for this ghost "problem", which means he wouldn't have enjoyed all of the advantages of Blu, until I explained it was not a problem and saved him some trouble, although if he'd followed talky's advice he probably deserved it.

No, you're worried about people figuring out that you don't know what the hell you're talking about. I got news for you -- people already know! :cool:


So where's my greenie?

I thought reputation meant squat to you. Tell you what, here's the sound of one hand clapping.


Nope, just a test pattern, and other methods that have been in use for decades,

So, you've actually used a test pattern on your own TV? I find that claim rather laughable, since you've made a point of ridiculing people who use calibration discs on their TVs.


which you and the other corp shills on this board hate, because they don't produce the results you want, so you can demonize one form factor over another, one brand over another

Speak for yourself. Like you don't shill for Vizio LCDs? :lol:


My folks wanted my last one, and I want to upgrade to 1080p; now there's LED backlighting and 120 Hz, not to mention a larger screen (at least a 46"), so I am saving my pennies.
Anybody that knows me will tell you that I change sets every two or three years. I don't get emotionally attached to an inanimate object, and I have plenty of peeps who will give me top dollar for my old set; they know I take care of them, and know what I am buying.
With the rapid rate of improvement that's the only way to stay on top, and not that expensive.
Maybe I can get talky's "people at work" to cobble me one together like they did for him (supposedly).
Naw, most likely I will just get a real monitor, probably at my local TV/audio shop, so they can ISF it before I pick it up.
Then I can read talkie's nonsense about how it has 100 lines of resolution because it's not a 1080i projector using top-secret CRT tubes and circuit boards hustled from the CIA.:1:

Like I keep saying, if your TVs are so great and you're so satisfied with what you see, then why do you keep planning your next upgrade before you're even halfway through the warranty?

bobsticks
06-16-2010, 08:20 PM
Hey!!!!!

I am NOT a ten dollar crack ho who professes her chastity!!

I pimp them out...yeah, I said it...the worst of the worst being dealt to the worst of the worst...oh, the humanity...I mean, seriously, sometimes I'm ashamed...you remember
Mara Salvatrucha and the MS-13's...yeah, I couldn't stop that one...

...be it Kings or Eight Tray Gangstas or even old toofless, methhead, drunken bums I've provided them all with the worst ten dollar crack hoes, though let it be known that the Eight Trays really were quite gentlemanly 'bout things.

The worst combination I ever engineered was between Nightingale Flora and Richard Milhouse Worthington the third. Flora had done some time in the donkey shows down in Matamoros...yeah, the ol' "Villa del Refugio", where she was quite a hit despite being 19 and appearing 67...Worthington was a sad case having done 18 months on a white collar racketeering charge and suffered such cruel prison rape and beatings that he required 11 surgeries and another 36 months of rehab.

Things started going horribly wrong when Richard, or "Tricky Dick" as Flora liked to taunt him with, demanded that she drink paint whilst he dipped her toes and hair in lye... savage and venal indeed, and very nearly not worth my $8 fee.

As they dragged him away he screeched in that horrifying way only a man who knows he's doomed can..." The donkey told me...it's all in the resolution!"

So that's the story. I may be just a pimp for ten dollar crack hoes...I may ply 'em full of the white and the sherm and I may collect my $8 and lil' stag nuzzle now and then...but just remember this...even the donkey from Tamaulipas knew that LCD panels do lose resolution when images are moving...

...I'm just sayin'...

kevlarus
06-17-2010, 05:57 AM
That's what I was thinkin' :)

Hey, thanks for the reminder as I am one of those individuals with a five and a half year old Samsung 61" 1080i DLP! With the old Oppo DVD player, I was limited to the 720p setting. After reading your post, however, I changed the Samsung Blu-Ray player's setup to 1080i. I'm also finding that indeed internet connectivity can be handy. While running CAT-5 to the family room where the player lives wasn't practical, I was able to reconfigure one of my access points as a bridge and tucked it behind the monitor. That fooled the BR player into thinking it had a wired connection. While I was updating the resolution setting today, it told me I had yet another firmware update. I'm all about getting the latest fixes. :)

Thanks again for all your video tips. You 'da man!

rw


That's the nice thing about that. As long as the player can request an IP address from the network via DHCP, it neither knows nor cares that you converted from wired Ethernet (10/100/1000) to wireless; both carry the same IP traffic regardless of the speed (a/b/g/n) of the wireless network. The packets stay the same.

pixelthis
06-17-2010, 07:51 AM
[QUOTE]"rediculous" only because you don't like the results. There's nothin "extrapolated" about that test. It's a very simple scoping test pattern that's uses the same grid pattern as a static resolution test. CNET, Home Theater, HDTV Test, and the manufacturers themselves now use those test results in their benchmark specs. The only thing "idiotic" is your persistence in denying the results.

When your average TV shop can't run it, it's ridiculous.



Like I keep pointing out, if the test is so rigged, then how come the sequential LED-backlit LCD models perform just fine in that test? If 60 Hz LCD had no issues with motion resolution, then why have all of the LCD manufacturers continued to bump up the refresh rates, and gone with scanning backlights and motion interpolation, if motion resolution was not an issue, as you claim? Geez, even Vizio is claiming that their 240 Hz TVs reduce motion blur. Are they lying too?

If well done, 240 Hz can smooth out a picture.
That doesn't mean there's anything wrong with 60 Hz.
And most are going with edge-lit backlights, BTW.



Of course you can measure it. Most of the major HDTV review sites already do.

Name a few



Like the motion resolution tests, you make this claim only because you drew the short stick, as usual.

Nope, made this claim because it's true.
You remember the truth, don't you?


I'll defer to your expertise on the subject of crack ho's.

Richie taught me everything he knows.



No, you're worried about people figuring out that you don't know what the hell you're talking about. I got news for you -- people already know! :cool:

Which is why you corp shills are defaming me all the time.
If you weren't worried about me you'd ignore me.


I thought reputation meant squat to you. Tell you what, here's the sound of one hand clapping.

A greenie means squat from someone who doesn't know what he's talking about.
I argued with a member for days about how ridiculous the thought was that a power cord could do anything to change the sound.
If someone so clueless wants to give a greenie, what is it worth?


So, you've actually used a test pattern on your own TV? I find that claim rather laughable, since you've made a point of ridiculing people who use calibration discs on their TVs.

Not ridiculing anybody; it's just that TV calibration is increasingly worth less and less, as more and more sets come from the factory pretty well set up already.


Speak for yourself. Like you don't shill for Vizio LCDs? :lol:

Vizios are cheap TV sets with great performance for the price.
Most of the time you are paying for the "name", since most TV parts come from the same place; that is especially true for panels.
The more TV sets Vizio sells (number 1), the more the corporate propaganda machines try to slander them, with idiotic, skewed tests and outright bald-faced lies put out by corp shills pretending to be enthusiasts, in some cases.
This amuses me, the fear such as yourself have for this upstart company.
Afraid that you might actually have to compete?
I know you have nothing but contempt for Joe Sixpack, but in the end it's HIS vote (with his dollars) that will determine the issue of quality, or how much of a balance between quality, price, and build there should be.
They will decide which products make it, not some elitist shill such as yourself, and I just know that drives you crazy, since they are voting for products you shun, products with a lower profit margin.:1:


Like I keep saying, if your TVs are so great and you're so satisfied with what you see, then why do you keep planning your next upgrade before you're even halfway through the warranty?

I am always planning my next "upgrade", not just for TV sets, but for everything I own.
Always room for improvement.
Speakers are the only long-lived component these days.
One of the reasons I like Vizio so much: you can afford to upgrade them more often, and you don't suffer reduced PQ, in spite of what your "tests" designed by a marketing dept say.
Vizio and the way they build sets is the wave of the future, and I think that is what really has you shills worried.:1:

Sir Terrence the Terrible
06-17-2010, 01:57 PM
I might disagree here at least in the computer display context. As one who has spent the last thirty years viewing a computer monitor, I was delighted to move to LCDs for that application. For me, it was an effect that was only noticed when it was absent. You are most certainly correct, however, when it comes to the question of native resolution with LCD monitors. Unfortunately, for some of my web presentations, I must dumb down the laptop below that setting and everything gets a bit grainy. Similarly, standard 480p cable video content looks a bit worse than with the old tube televisions. It is only when they are displaying HD they are really in their element.

I can fully understand flicker being a problem with computer monitors; they display static images constantly, and although the refresh rate can be controlled through the settings menu, it is often not set high enough for static-image viewing. For instance, the PAL standard in Europe uses a 50 Hz refresh rate to coincide with their 50 Hz alternating-current standard. Flicker is sometimes noticeable (I always noticed it) at that refresh rate. With the 60 Hz we have in the States, flicker is not a problem.

For television (as opposed to computer applications), the constant changes in the fields combined with the 60 Hz refresh rate make flicker impossible to see.

The thing about standard-definition television and flat panels has to do with analog-to-digital conversion. Much like with audio, analog-to-digital conversion is not a perfect process, and in audio applications it is much more forgivable (the inner ear's hair cells smooth over the artifacts, and hearing loss plays its part) than it is to our eyes. If the television never has to perform an A/D conversion, the result is video with much higher effective resolution. This is the case with digital television and high-definition television, much like a recording done digitally and reproduced in an all-digital format.


Hey, thanks for the reminder as I am one of those individuals with a five and a half year old Samsung 61" 1080i DLP! With the old Oppo DVD player, I was limited to the 720p setting. After reading your post, however, I changed the Samsung Blu-Ray player's setup to 1080i. I'm also finding that indeed internet connectivity can be handy. While running CAT-5 to the family room where the player lives wasn't practical, I was able to reconfigure one of my access points as a bridge and tucked it behind the monitor. That fooled the BR player into thinking it had a wired connection. While I was updating the resolution setting today, it told me I had yet another firmware update. I'm all about getting the latest fixes. :)

Thanks again for all your video tips. You 'da man!

rw

You are welcome. Anything to help a bruddah out! I figure this: television manufacturers have been offering 1080p sets since 2004 or 2005. There is no way they have sold enough 1080p (or 720p) sets in five years to outnumber the interlaced televisions sold over the last 15 years, let alone 20 or 30. That interlaced CRT televisions outsold plasma at the beginning of 2009 shows there is still strong demand for interlaced CRTs in emerging third-world countries.

E-Stat
06-17-2010, 02:48 PM
Flicker is sometimes noticeable (I always noticed it) at that refresh rate. With the 60 Hz we have in the States, flicker is not a problem.
I hear what you are saying, but at the time I had a pretty good video card that supported a higher refresh rate (I think I was able to go to 80 Hz), but still, the move to an LCD monitor removed a minor annoyance that I had not really thought about - until it was gone.


For television (as opposed to computer applications), the constant changes in the fields combined with the 60 Hz refresh rate make flicker impossible to see.
I don't notice any difference with TV output - other than the higher resolution since my tubes were the basic NTSC flavor.


You are welcome.
While we may have different preferences when it comes to audio, I have always appreciated your depth of video knowledge.


That interlaced CRT televisions outsold plasma at the beginning of 2009 shows there is still strong demand for interlaced CRTs in emerging third-world countries.
I confess complete ignorance of the practical distinctions between interlaced and non-interlaced output. I watched Avatar last night after making the switch from 720p to 1080i and was amazed by the differences. You could see far more texture in the Na'vi skin. In Sigourney Weaver's eyes. Color gradations were more distinct. Computer screens in the background were rendered with better clarity. There is no question in my mind that 1080 is decidedly superior to 720p using a BR source. Is 1080p any better?

rw

Sir Terrence the Terrible
06-17-2010, 05:58 PM
I watched Avatar last night after making the switch from 720p to 1080i and was amazed by the differences. You could see far more texture in the Na'vi skin. In Sigourney Weaver's eyes. Color gradations were more distinct. Computer screens in the background were rendered with better clarity. There is no question in my mind that 1080 is decidedly superior to 720p using a BR source. Is 1080p any better?

rw

Aside from a slightly sharper picture, you are not going to notice much difference between 1080i and 1080p. They have exactly the same amount of information; it is just presented differently.

It is said that 720p is better for sports programming, and 1080i is better for movies. I agree with that assessment.

E-Stat
06-17-2010, 08:14 PM
It is said that 720p is better for sports programming, and 1080i is better for movies. I agree with that assessment.
That may well be, but at present I'm just basking in the glow of 1080i. I have a Blu-ray set of the entire first season of the original Star Trek series. I grew up watching the program's intro shot with the ship emerging from the field of white stars. Except that they aren't all white. Some are red. Some are blue. As the ship approaches from the left and is still barely visible as a separate shape, you make out the pinkish hue of the anti-matter pods. As the ship comes into full view, you see the whirling multi-color character of the pods. In the episode Mudd's Women, there is a scene which involves a hearing in the ship's board room. With the monitors located on the center table still off, the camera pans the room. In the surface of the still powered-off monitors, you can clearly make out the shapes of individuals as they walk by.

That is high def in the resolution of the film that has been there since 1967 - but rarely, if ever, viewed by folks in the broadcast world. Incredible.

rw

Smokey
06-17-2010, 09:10 PM
There is no question in my mind that 1080 is decidedly superior to 720p using a BR source.

Not only a BR source, but probably any [1080-capable] HD source. 720p will give you a smoother picture, but 1080 will give you more resolution, not only in scan lines (1080 vs 720) but also in more pixels horizontally (1920 vs 1280) :)

kelsci
06-18-2010, 03:13 AM
My Direct TV receiver has a choice of resolutions that can be changed through a menu. Lately, I have been fooling with that menu, outputting resolutions between 720p and 1080i. I think I am seeing a difference. Rather than go into what I think I may be seeing, I will have to do a photographic evaluation with a digital camera to see what is really going on. I am going to need a picture that is constant and does not change in order to be able to switch the receiver and do the photographs. If something noteworthy comes up, I will put the results on my electronics blog and post a link here.

recoveryone
06-18-2010, 07:35 AM
I posted my own findings on this about 3-4 years ago when I bought a 32" 720p LCD and compared it to my old 55" 1080i Mits. The 720p LCD was better with cartoons and analog broadcasts, but the 1080i was sharper, with more resolution, on HD broadcasts and DVD movies. I used the same cable boxes and DVD players for my little in-home test, with the same content, and back then each unit was connected the same way, with component cables. With my newer 1080p set the difference is even more pronounced, and again both sets are set up the same, via HDMI from the cable box. I'm sure some will say the size of the units is an issue, but the 720p LCD did win at first with cartoons. Now I have set up my parents' 32" 1080p, and the PQ is on par with my 55" 1080p LCD when playing DVDs; their DVD player is an upconverting model connected via HDMI, and mine is still component into my AVR, which outputs via HDMI to my set. By the way, all of the LCDs are Vizios. Now back to the original post about whether the 720p format is dead:

I feel it was a marketing failure, based on the fact that when HDTV was introduced to the public the standard was 1080i (and still is today), but through compromises with the FCC, broadcasters were able to create different levels of HD: 480p, 720p, 1080i (correct me if I got those wrong). With this new watered-down version of HD in place, broadcasters were able to cut upgrade costs, which led TV manufacturers to do the same. We all remember seeing all types of TVs advertised as "HD ready," which really meant that the TV did not have an HD tuner built in; with no regulation on the wording, companies could put out anything and slap on the HD-ready tag as long as the set could reproduce a minimum 480p picture. So you had to educate yourself by coming to forums like this one and getting real information, not tit-for-tat BS about which lab test showed what. All I say is: speak to what you know and what you have done. If you have access to more detailed information on an issue, feel free to share it so we all become more informed. I just remember the days on here when one simple acronym would stave off the flaming: IMHO (In My Humble Opinion). Maybe we should get back to such simpler times.

pixelthis
06-18-2010, 11:22 AM
I posted my own findings on this about 3-4 years ago when I bought a 32" 720p LCD and compared it to my old 55" 1080i Mits. The 720p LCD was better with cartoons and analog broadcasts, but the 1080i was sharper, with more resolution, on HD broadcasts and DVD movies. I used the same cable boxes and DVD players for my little in-home test, with the same content, and back then each unit was connected the same way, with component cables. With my newer 1080p set the difference is even more pronounced, and again both sets are set up the same, via HDMI from the cable box. I'm sure some will say the size of the units is an issue, but the 720p LCD did win at first with cartoons. Now I have set up my parents' 32" 1080p, and the PQ is on par with my 55" 1080p LCD when playing DVDs; their DVD player is an upconverting model connected via HDMI, and mine is still component into my AVR, which outputs via HDMI to my set. By the way, all of the LCDs are Vizios. Now back to the original post about whether the 720p format is dead:

I feel it was a marketing failure, based on the fact that when HDTV was introduced to the public the standard was 1080i (and still is today), but through compromises with the FCC, broadcasters were able to create different levels of HD: 480p, 720p, 1080i (correct me if I got those wrong). With this new watered-down version of HD in place, broadcasters were able to cut upgrade costs, which led TV manufacturers to do the same. We all remember seeing all types of TVs advertised as "HD ready," which really meant that the TV did not have an HD tuner built in; with no regulation on the wording, companies could put out anything and slap on the HD-ready tag as long as the set could reproduce a minimum 480p picture. So you had to educate yourself by coming to forums like this one and getting real information, not tit-for-tat BS about which lab test showed what. All I say is: speak to what you know and what you have done. If you have access to more detailed information on an issue, feel free to share it so we all become more informed. I just remember the days on here when one simple acronym would stave off the flaming: IMHO (In My Humble Opinion). Maybe we should get back to such simpler times.

It got beat in the market.
720p was considered superior to 1080i, because progressive is always superior due to artifacts inherent in interlaced programming.
1080p was a ways in the future, but DLP sets started coming out with 1080p using a method called "wobulation", which was cheap but worked.
Never forget the first one I saw.
So panel makers had to get off the pot, so to speak.
Once 1080p started coming out, there was no real niche for 720p, except in smaller screen sizes where cost was a consideration, and the difference was not seen as important on a smaller screen.
Hope you like your Vizio panels; bet you didn't know they were only 330 lines of rez!!!
Yeah, I know, ridiculous.
This is using a test designed to showcase the one area where plasma has a slight advantage over LCD, and that doesn't matter.
Propagandists tend to hurt their own case when they get so overzealous that they distort reality too much.
Nobody seriously believes this nonsense, but they might believe 900 or some other arbitrary number if it was a little higher.:1:

Sir Terrence the Terrible
06-18-2010, 02:10 PM
It got beat in the market.
720p was considered superior to 1080i, because progressive is always superior due to artifacts inherent in interlaced programming.

The problem with interlacing does not occur at 1080i; it occurs at 480i. Progressive is not always superior to interlaced, as 720p and 480p are not superior to 1080i in any way.



1080p was a ways in the future, but DLP sets started coming out with 1080p using a method called "wobulation", which was cheap but worked.

Wobulation introduced artifacts, very visible ones that are easy to see. Dithering noise is noticeable, especially in dark image areas, and there are error-diffusion artifacts caused by averaging a shade over different pixels, since one pixel cannot render the shade exactly.


So panel makers had to get off the pot, so to speak.
Once 1080p started coming out, there was no real niche for 720p, except in smaller screen sizes where cost was a consideration, and the difference was not seen as important on a smaller screen.

If what you are saying is true, then why the dearth of under-40" 1080p televisions? Here is a reality: very few people sit close enough to a 1080p set to really see every pixel in action. This is especially so with screens 40" and under. For most people, visually, 720p is just as good as 1080p, because they don't sit close enough to notice the difference in resolution. 720p sets are very popular in China, India, Eastern Europe, and several other regions of the world. They are not gone, just not popular here, where the numbers game always plays a big role in what people decide is quality.


Hope you like your Vizio panels; bet you didn't know they were only 330 lines of rez!!!
Yeah, I know, ridiculous.

This is where you get stupid as hell. Your Vizio model resolves 900 lines of information using a static image. It is only when the image moves that you get 330 lines of information. When you don't take your medicine, you have these stupid lapses of detail you forget to add.



This is using a test designed to showcase the one area where plasma has a slight advantage over LCD, and that doesn't matter.

This test does not give any display an advantage over another. The inherent nature of LCD's pixel structure gives it a disadvantage. Technologies that do not rely on moving mechanical parts (like the shutters and the twisting liquid crystals) fare better. Plasmas and CRTs do not have response-time problems because they have no mechanical moving parts within their imaging processes. All the test does is take pictures of a moving object. It reveals whether there are image trails or not, and measures the actual resolution of the moving object. This test does not favor any panel but the one that does not blur, and maintains its resolution whether the object is stationary or in motion. Some LCDs do quite well (Samsung, Sony, and Panasonic panels), and some do very poorly (Vizio, Toshiba, and Sharp, in that order).



Propagandists tend to hurt their own case when they get so overzealous that they distort reality too much.
Nobody seriously believes this nonsense, but they might believe 900 or some other arbitrary number if it was a little higher.:1:

People who bury their heads in lies don't help their case much either. If you really believe your Vizio does not lose resolution with moving images, your ignorance is your bliss - believe on.

Sir Terrence the Terrible
06-18-2010, 02:37 PM
I posted my own findings on this about 3-4 years ago when I bought a 32" 720p LCD and compared it to my old 55" 1080i Mits. The 720p LCD was better with cartoons and analog broadcasts, but the 1080i was sharper, with more resolution, on HD broadcasts and DVD movies. I used the same cable boxes and DVD players for my little in-home test, with the same content, and back then each unit was connected the same way, with component cables. With my newer 1080p set the difference is even more pronounced, and again both sets are set up the same, via HDMI from the cable box. I'm sure some will say the size of the units is an issue, but the 720p LCD did win at first with cartoons. Now I have set up my parents' 32" 1080p, and the PQ is on par with my 55" 1080p LCD when playing DVDs; their DVD player is an upconverting model connected via HDMI, and mine is still component into my AVR, which outputs via HDMI to my set. By the way, all of the LCDs are Vizios. Now back to the original post about whether the 720p format is dead:

Now you know you cannot make these kinds of evaluations without using the same programming on each set. Cartoon A cannot be compared to a different cartoon B, for reasons that are pretty obvious. Each has unique characteristics the other does not always have.


I feel it was a marketing failure, based on the fact that when HDTV was introduced to the public the standard was 1080i (and still is today), but through compromises with the FCC, broadcasters were able to create different levels of HD: 480p, 720p, 1080i (correct me if I got those wrong).

First, 480p is not HD by any measure. Secondly, 1080i is not a marketing failure, as it was not created for marketing purposes. It was where we were technologically, as there were no 1080p sets even close to being invented at the time. Thirdly, the whole purpose of the switch to digital television (and HD as well) was to reduce the broadcasting footprint, to give more room to wireless technology. Even after the introduction of 1080p sets, there is still no benefit in broadcasting in 1080p when all of the information you need is already in 1080i. Your television can deinterlace, and you have 1080p after the process is done. The bandwidth required to transmit 1080p is twice that of 1080i, which means that if television stations broadcast it, their transmission costs would go through the roof, not to mention the infrastructure costs to create and store content before broadcasting (those would be off the charts). 1080i made, and still makes, perfect sense from a technological perspective.
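The raw-pixel arithmetic behind the "twice the bandwidth" point, assuming 1080p at 60 frames per second (compression changes the absolute numbers, but not the ratio):

px_frame = 1920 * 1080            # 2,073,600 pixels
rate_1080i = px_frame * 30        # 60 fields/sec -> 62,208,000 px/s
rate_1080p = px_frame * 60        # 60 full frames/sec -> 124,416,000 px/s
print(rate_1080p / rate_1080i)    # 2.0 -- exactly twice the raw pixel rate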


With this new watered-down version of HD in place, broadcasters were able to cut upgrade costs, which led TV manufacturers to do the same. We all remember seeing all types of TVs advertised as "HD ready," which really meant that the TV did not have an HD tuner built in; with no regulation on the wording, companies could put out anything and slap on the HD-ready tag as long as the set could reproduce a minimum 480p picture.

An HD-ready tag required that the television have the necessary pixel structure for 720p or 1080i, not 480p, which no HD source was encoded in. DVD is encoded in 480 interlaced, and the player deinterlaces it to 480 progressive. DVD is not HD, even when you upscale it. HD as we have it now is far from watered down.



So you had to educate yourself by coming to forums like this one and getting real information, not tit-for-tat BS about which lab test showed what.

Real information (as opposed to what people make up while in denial) comes from lab tests. Without lab tests, people like pix would get away with telling folks their Vizio LCD panel retains full resolution when images move. Tests proved the opposite. Testing proves the frequency response of a loudspeaker, and actual watts per channel at full bandwidth instead of 1 kHz measurements with one channel driven; it reveals whether color decoders are accurate, and tells whether a panel blurs or not via its true response time (instead of flattering factory measurements). Actual testing keeps people like pix in check, which makes testing vital to sites just like this one, where people can lie or twist the facts instead of dealing with the fact that they purchased something based solely on its price, completely bypassing its performance.


All I say is: speak to what you know and what you have done.

Then Pix would have to be mute, and not participate on this site.


If you have access to more detailed information on an issue, feel free to share it so we all become more informed. I just remember the days on here when one simple acronym would stave off the flaming: IMHO (In My Humble Opinion).
Maybe we should get back to such simpler times.

To those of us who have access to the testing results, IMHO is totally unnecessary. It is not your opinion (which for some is pretty damn useless); it is verified results using real testing equipment, rather than our faulty and often useless perspectives.

pixelthis
06-20-2010, 09:57 AM
That may well be, but at present I'm just basking in the glow of 1080i. I have a Blu-ray set of the entire first season of the original Star Trek series. I grew up watching the program's intro shot with the ship emerging from the field of white stars. Except that they aren't all white. Some are red. Some are blue. As the ship approaches from the left and is still barely visible as a separate shape, you make out the pinkish hue of the anti-matter pods. As the ship comes into full view, you see the whirling multi-color character of the pods. In the episode Mudd's Women, there is a scene which involves a hearing in the ship's board room. With the monitors located on the center table still off, the camera pans the room. In the surface of the still powered-off monitors, you can clearly make out the shapes of individuals as they walk by.

That is high def in the resolution of the film that has been there since 1967 - but rarely, if ever, viewed by folks in the broadcast world. Incredible.

rw

Yep, and added in.
Digital enhancements of the original series were a part of the Blu edition.
A lot of what you are marvelling at are digital insertions.
This was also done to several episodes of Dr. Who; in "The Ark in Space" they added digital effects, and you can choose the new edition or the old one, using seamless branching, I would guess.
One thing about watching the original series on Blu: the cheapness of the original production shines through (photos on the "monitors", etc.), but it doesn't seem to matter.:1:

pixelthis
06-20-2010, 10:04 AM
Now you know you cannot make these kinds of evaluations without using the same programming on each set. Cartoon A cannot be compared to a different cartoon B, for reasons that are pretty obvious. Each has unique characteristics the other does not always have.

You should be an expert on cartoons



First, 480p is not HD by any measure. Secondly, 1080i is not a marketing failure, as it was not created for marketing purposes. It was where we were technologically, as there were no 1080p sets even close to being invented at the time. Thirdly, the whole purpose of the switch to digital television (and HD as well) was to reduce the broadcasting footprint, to give more room to wireless technology. Even after the introduction of 1080p sets, there is still no benefit in broadcasting in 1080p when all of the information you need is already in 1080i. Your television can deinterlace, and you have 1080p after the process is done. The bandwidth required to transmit 1080p is twice that of 1080i, which means that if television stations broadcast it, their transmission costs would go through the roof, not to mention the infrastructure costs to create and store content before broadcasting (those would be off the charts). 1080i made, and still makes, perfect sense from a technological perspective.

That is not why 1080i was created at all.
HDTV first got its start in the late '80s, was pushed by Zenith, and was a response to the Japanese.
It might not be obvious now, but the Japanese were on top in the late eighties, and already had an analog HD system broadcast from satellite.
The US HDTV ATSC standard was a response to that.
Being largely political, they actually came up with seventeen standards, 1080i being one of them.
But the whole thing got started because economic domination by the Japanese was feared.:1:







Real information (as opposed to what people make up while in denial) comes from lab tests. Without lab tests, people like pix would get away with telling folks their Vizio LCD panel retains full resolution when images move. Tests proved the opposite. Testing proves the frequency response of a loudspeaker, and actual watts per channel at full bandwidth instead of 1 kHz measurements with one channel driven; it reveals whether color decoders are accurate, and tells whether a panel blurs or not via its true response time (instead of flattering factory measurements). Actual testing keeps people like pix in check, which makes testing vital to sites just like this one, where people can lie or twist the facts instead of dealing with the fact that they purchased something based solely on its price, completely bypassing its performance.

I am reminded of what SAM CLEMENS said about lies: there are lies, damn lies, and statistics.
The same applies to "tests" which are rigged from the start.
THE RIGGED TEST that tallies the separate pixels in a panel, and the way they open and close,
and a screen resolution is extrapolated using a rigged formula and rigged software.
How does a pixel's response time affect screen resolution?
IT DOESN'T.
THE WHOLE THING comes from a site called "Displaydata", which is an arm of the PANASONIC
marketing dept.
The whole thing was designed to make LCD look bad. Why?
Panasonic has a brand new plasma panel factory, and if plasma keeps dying, it's gonna be a white
elephant, a 300 million dollar white elephant.






To those of us who have access to the testing results, IMHO is totally unnecessary. It is not your opinion (which in some cases is pretty damn useless), it is verified results using real testing equipment, rather than our faulty and often useless perspectives.
A perfect description of your "perspective":1:

E-Stat
06-20-2010, 10:46 AM
Yep, and added in.
Yes and no. The spinning lights on the Bussard collector were slightly enhanced and more color was added to the intro.

rw

Sir Terrence the Terrible
06-20-2010, 01:21 PM
You should be an expert on cartoons

Yeah, I have to read your posts every day. That is a real cartoon in and of itself.


That is not why 1080i was created at all.
HDTV first got its start in the late '80s, was pushed by ZENITH,
and was a response to the Japanese.
Might not be obvious now, but the JAPANESE WERE ON TOP IN THE LATE EIGHTIES,
and already had an analog HD system broadcast from satellite.
The US HDTV ATSC standard was a response to that.
The process being largely political, they actually came up with seventeen standards, 1080i being one of them.
But the whole thing got started because economic domination by the Japanese was
feared.:1:

Major history rewrite here Pixeltotallyout. First, it was NHK that introduced the MUSE system of analog HD in Japan, and it was first developed in 1969, not the eighties. It was introduced to the US in 1981. While everyone was impressed by it (it could transmit 30 Mbps worth of information instead of ATSC's 19 Mbps), the bandwidth requirements of MUSE were too high for the American market, as the FCC wanted (and received) a much more efficient system for HDTV. The MUSE system required four times the bandwidth of NTSC broadcast, and despite NHK's attempts to pare the bit rate down, it was never adopted outside Japan.

It was the Grand Alliance consortium that brought our current ATSC system to fruition. While Zenith was one of the sponsors on the board of the Grand Alliance, it was one of seven companies involved. AT&T, Philips, MIT, and Thomson Consumer Electronics (RCA) were also huge players within the Grand Alliance.

There was no fear that the Japanese would take over, as the MUSE system had too many drawbacks that hindered its adoption. First, it could only be delivered by satellite, and second, its infrastructure was cost-prohibitive.




I am reminded of what SAM CLEMENS said about lies: there are lies, damn lies, and statistics.
The same applies to "tests" which are rigged from the start.
THE RIGGED TEST that tallies the separate pixels in a panel, and the way they open and close,
and a screen resolution is extrapolated using a rigged formula and rigged software.
How does a pixel's response time affect screen resolution?
IT DOESN'T.
THE WHOLE THING comes from a site called "Displaydata", which is an arm of the PANASONIC
marketing dept.
The whole thing was designed to make LCD look bad. Why?
Panasonic has a brand new plasma panel factory, and if plasma keeps dying, it's gonna be a white
elephant, a 300 million dollar white elephant.

This kind of argument is exactly what deniers come up with when they don't want to face the truth. Everything is rigged (to support their denial of the testing methodology, which is widely accepted within the industry), so there is no way to test for anything. Convenient when you don't want a weakness uncovered.

Continued claims of affiliation that cannot be supported by any facts whatsoever. Displaydata is not owned by Panasonic, and never has been. It is not a website, it is a legitimate stand-alone testing lab with a website.

Plasma is not dying, as not only is Panasonic in the market, but so is Samsung. It turns out that plasma televisions are excellent at 3D, and a couple of the major CE companies (and several marketing concerns) are sticking with it for just that purpose. Everyone already knows its superiority with 2D as well.

The testing methodology is not rigged. It cannot be, because its concept is so simple. Just take high resolution, ultra quick snapshots of a screen showing a moving object. It is just that simple, and each television is tested exactly the same way with no exceptions. Every television has to stand up to the testing equally, and the performance rating rests strictly on its ability to maintain its full image resolution whether an image is stable or moving. It just so happens that plasma televisions, CRT televisions, and LCoS-based panels have better results than most LCD panels. Of the LCD panels, Samsung and Sony panels performed perfectly, while Vizio's were always at the bottom of the pack. If Samsung and Sony LCD panels were able to maintain a full 1080 lines of information while objects are moving, then there is no way this test could be rigged against LCD panels.
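
As a toy illustration of that snapshot idea - not Displaydata's actual procedure, which is not documented here, just a sketch of the concept under simplified assumptions (a 1-D test pattern, a box blur standing in for slow pixel response):

```python
import numpy as np

# Toy model: "photograph" a moving line pattern, then ask how fine a pattern
# still shows usable contrast. Speeds and response times are made-up numbers.

def blur_from_response(pattern, speed_px_per_frame, response_frames):
    """Smear a 1-D pattern horizontally to mimic slow pixel response
    while the pattern moves speed_px_per_frame each frame."""
    smear = max(1, int(speed_px_per_frame * response_frames))
    kernel = np.ones(smear) / smear  # box blur as a crude hold-type smear
    return np.convolve(pattern, kernel, mode="same")

def resolvable(captured, threshold=0.25, trim=32):
    """A pattern 'resolves' if its peak-to-trough contrast, away from the
    convolution edges, stays above the threshold."""
    core = captured[trim:-trim]
    return (core.max() - core.min()) >= threshold

# Alternating black/white line pairs, coarse to fine.
for line_width in (8, 4, 2, 1):
    pattern = np.tile([0.0] * line_width + [1.0] * line_width, 64)
    captured = blur_from_response(pattern, speed_px_per_frame=6, response_frames=1.5)
    verdict = "resolved" if resolvable(captured) else "blurred away"
    print(f"{line_width}px lines while moving: {verdict}")
```

With these made-up numbers the coarse 8-pixel lines survive the smear while the finer patterns wash out, which is exactly the "full resolution when static, reduced resolution in motion" effect being described.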

Panasonic's new factory was actually Pioneer's new factory before they got out of the business of plasma manufacturing. It was designed to make more Kuro panels, as Pioneer was going to expand the Kuro line. Panasonic paid quite a bit less for the site than it cost to build, so your $300 million figure is quite a bit higher than reality.

So much for your spin.


A perfect description of your "perspective":1:

So, are you going to throw your ball and jacks at me too?

Rich-n-Texas
06-20-2010, 02:02 PM
Major history rewrite here Pixeltotallyout.
I can't compete with that. I give up. You win. :incazzato:

So, are you going to throw your ball and jacks at me too?
He can't. Nurse Ratched took them away.

recoveryone
06-20-2010, 03:19 PM
Now you know you cannot make these kinds of evaluations without using the same programming for each set. Cartoon A cannot be compared to a different cartoon B for reasons that are pretty obvious. Each has unique characteristics the other does not always have.

The cartoons were the same ones, but thanks for pointing out that for a real comparison you need the same content.


First, 480p is not HD by any measure. Secondly, 1080i is not a marketing failure, as it was not created for marketing purposes. It was where we were technologically, as there were no 1080p sets even close to being invented at the time. Thirdly, the whole purpose of the switch to digital television (and HD as well) was to reduce the broadcasting footprint, so as to give more room to wireless technology. Even after the introduction of 1080p sets, there still is no benefit in broadcasting in 1080p, when all of the information you need is already in 1080i. Your television can deinterlace, and you have 1080p after the process is done. The bandwidth required to transmit 1080p is twice that of 1080i, which means if television stations broadcast it, their electricity bills would go through the roof, not to mention the infrastructure costs to create and store content before broadcasting (they would be off the charts). 1080i made (and makes) perfect sense from a technological perspective.

It was 720p that was a failure, not 1080i. I bought my first HD set back in 2000, anticipating the switch in '01, then pushed back to '03, and then the '06-'07 time frame. 1080i was the first standard I ever heard of back in the late nineties, and the other formats came along a few years later.


Real information (as opposed to what people make up while in denial) comes from lab tests. Without lab tests, people like pix would get away with telling folks their Vizio LCD panel retains full resolution when images move. Tests proved the opposite. Testing proves the frequency response of a loudspeaker, actual watts per channel full bandwidth instead of 1kHz measurements with one channel, reveals the accuracy of color decoders, and tells whether a panel blurs or does not via its true response time (instead of fake factory measurements). Actual testing keeps people like pix in check, which makes testing vital to sites just like this, where people can lie or twist the facts instead of dealing with the fact that they purchased something based solely on its price, completely bypassing its performance.

The reason I'm not a big fan of testing labs/reports is because many are backed by companies, which tints the results to favor those companies. The same has happened in the audio industry, so it's hard to really believe what is posted/published. I know, Sir T, of your access and value your opinion due to that access, for you rarely quote lab results but more often state what you have seen in the industry. (State what you've seen/done.)

Sir Terrence the Terrible
06-20-2010, 06:59 PM
The cartoons were the same ones, but thanks for pointing out that for a real comparison you need the same content.

What would have really been beneficial is if you could have somehow sync'd the two displays side by side to get a real comparison. Our visual memory is not that good (neither is our audio memory), so it is hard to detect real differences outside that kind of comparison.




The reason I'm not a big fan of testing labs/reports is because many are backed by companies, which tints the results to favor those companies. The same has happened in the audio industry, so it's hard to really believe what is posted/published. I know, Sir T, of your access and value your opinion due to that access, for you rarely quote lab results but more often state what you have seen in the industry. (State what you've seen/done.)

Displaydata is a private standalone firm that takes no money from the industry and is not owned by any company within the industry. They test all televisions, and you pay for the testing report. That is how they make money, along with creating testing methodology, creating testing material, and setting standards within the post production community for displays and projection systems. They gain absolutely nothing from loading tests, so they don't do it. The CE companies pay big money to get the testing results so they can improve their displays.

I usually buttress my arguments with both test results and my experience. I think having both gives an opinion credibility, rather than just relying on the biases and opinions of other individuals. Pix is the prime reason why testing is needed, as he would let folks believe that a low performing panel was the greatest thing ever introduced to the market.

audio amateur
06-21-2010, 07:42 AM
What this refers to is the ability of the television's pixels to mechanically twist and update to the correct brightness frame by frame. LCD panels use a backlight, and the pixels determine how much light is supposed to be on the screen with each frame by twisting themselves at varying degrees to allow light through (called rise and fall). With static images, there is no movement between frames, so there is no need for the pixels to move or flex to their correct positions. Once there is movement, the pixels have to twist and update very quickly, especially when 120Hz and 240Hz refresh rates are deployed, hence the difference in resolution between when images are static and when they are moving. The refresh rate may be high, but the pixel response is slow, which mitigates the speed of the refresh rate. Once you get the response rate of the pixels and the refresh rate in sync (like Samsung and Sony have in their higher end models), this effect becomes unnoticeable. The ability of LCD panels to do this well is measured by motion resolution, as it tells how quickly either the processing or the pixels are correctly updating frame by frame with moving objects.

Samsung and Sony's upper line of televisions maintain their full resolution with moving images. Vizio's televisions across their entire line were all at 400 lines of resolution (with their 1080p models) or less during motion, the slowest and worst of all televisions tested. The results were the same when they moved to the XVT line of televisions and higher refresh rates. The response of the pixels cannot keep up with the refresh rate, so the resolution goes down until they can fully update and twist into position.
Thanks dude.
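
A quick bit of arithmetic makes that refresh-vs-response point concrete. The 8 ms response figure below is a hypothetical gray-to-gray number chosen purely for illustration, not a measured value for any particular set:

```python
# Rough arithmetic on refresh rate vs. pixel response.
# The response time is an assumed, illustrative figure.

refresh_rates_hz = [60, 120, 240]
pixel_response_ms = 8.0  # hypothetical rise+fall (gray-to-gray) time

for hz in refresh_rates_hz:
    frame_ms = 1000.0 / hz
    settled = pixel_response_ms <= frame_ms
    state = "pixels can settle" if settled else "pixels still mid-twist when the next frame arrives"
    print(f"{hz:3d} Hz: frame lasts {frame_ms:.2f} ms -> {state}")
```

With an 8 ms response the pixels just barely keep up at 120 Hz and fall behind at 240 Hz, which is the "response rate and refresh rate in sync" condition described above.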

pixelthis
06-21-2010, 12:36 PM
Yeah, I have to read your posts every day. That is a real cartoon in and of itself.



Major history rewrite here Pixeltotallyout. First, it was NHK that introduced the MUSE system of analog HD in Japan, and it was first developed in 1969, not the eighties. It was introduced to the US in 1981. While everyone was impressed by it (it could transmit 30 Mbps worth of information instead of ATSC's 19 Mbps), the bandwidth requirements of MUSE were too high for the American market, as the FCC wanted (and received) a much more efficient system for HDTV. The MUSE system required four times the bandwidth of NTSC broadcast, and despite NHK's attempts to pare the bit rate down, it was never adopted outside Japan.

Not telling me anything I don't know.
The Japanese standard required TWO channels for all of its content, and there were
worries about it taking over in the late eighties, don't tell me what I WITNESSED
didn't happen.
Actually, most think the threat of the JAPANESE HDTV system was just a scarecrow
that Zenith used to get HDTV started in this country; it never really was much of a threat.
If it had been up to the broadcasters we still wouldn't have HD.




It was the Grand Alliance consortium that brought our current ATSC system to fruition. While Zenith was one of the sponsors on the board of the Grand Alliance, it was one of seven companies involved. AT&T, Philips, MIT, and Thomson Consumer Electronics (RCA) were also huge players within the Grand Alliance.

But Zenith was a big influence.
In any group there is one that really pushes things, and Zenith was a major influence in
creating ATSC.



There was no fear that the Japanese would take over, as the MUSE system had too many drawbacks that hindered its adoption. First, it could only be delivered by satellite, and second, its infrastructure was cost-prohibitive.

BUT IT WAS A START.
In the eighties there was a great fear of the Japanese; this was before their collapse.
Their gear was the best on the planet.
Worries about them dominating broadcast TV on the engineering side were used to help get HDTV started, every article I READ AT THE TIME MENTIONED THEM.




This kind of argument is exactly what deniers come up with when they don't want to face the truth. Everything is rigged (to support their denial of the testing methodology, which is widely accepted within the industry), so there is no way to test for anything. Convenient when you don't want a weakness uncovered.

The truth is the truth, no matter how much anyone tries to lawyer it out of existence
with made-up "tests" that extrapolate instead of proving anything.


Continued claims of affiliation that cannot be supported by any facts whatsoever. Displaydata is not owned by Panasonic, and never has been. It is not a website, it is a legitimate stand-alone testing lab with a website.

Supported by PANASONIC, I bet.
Just like the guy Wooch quoted, until I was able to prove that he had worked for 20 years for Panasonic.


Plasma is not dying, as not only is Panasonic in the market, but so is Samsung. It turns out that plasma televisions are excellent at 3D, and a couple of the major CE companies (and several marketing concerns) are sticking with it for just that purpose. Everyone already knows its superiority with 2D as well.

Unless you want to watch it in a lit room.
Making the phosphors small enough to fit on a 1080P screen lowers the surface area of each, limiting the brightness a great deal.
Not to mention that a plasma panel is just a squished CRT, a relic of the past.
LCD is already its better, as the market shows; another five years or less and plasma will be gone, and no amount of rigged tests will change that.
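
For what it's worth, the geometry behind that surface-area claim is easy to check. A rough sketch, assuming an idealized 42" 16:9 panel with no cell walls or margins (real plasma cell structure is more complicated than this):

```python
import math

def cell_area_mm2(diag_in, h_pixels, v_pixels, aspect=16 / 9):
    """Idealized area of one pixel cell on a flat 16:9 panel, in mm^2."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)
    height_in = width_in / aspect
    return (width_in * 25.4 / h_pixels) * (height_in * 25.4 / v_pixels)

for label, (h, v) in {"768-line panel": (1366, 768), "1080p panel": (1920, 1080)}.items():
    print(f'42" {label}: {cell_area_mm2(42, h, v):.2f} mm^2 per pixel')
# The 1080p cell comes out at roughly half the area - less room per phosphor cell.
```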



The testing methodology is not rigged. It cannot be, because its concept is so simple. Just take high resolution, ultra quick snapshots of a screen showing a moving object. It is just that simple, and each television is tested exactly the same way with no exceptions. Every television has to stand up to the testing equally, and the performance rating rests strictly on its ability to maintain its full image resolution whether an image is stable or moving. It just so happens that plasma televisions, CRT televisions, and LCoS-based panels have better results than most LCD panels. Of the LCD panels, Samsung and Sony panels performed perfectly, while Vizio's were always at the bottom of the pack. If Samsung and Sony LCD panels were able to maintain a full 1080 lines of information while objects are moving, then there is no way this test could be rigged against LCD panels.

Know what you're measuring when you're "measuring" a "snapshot"?
A still image.
Samsung and Sony passed the "test" for a simple reason: both are five hundred pound gorillas, and any aspersions on their sets would have led to a lawsuit, and since this idiotic "test" wouldn't hold up in a court of law, surprise!
They passed the "test"!


Panasonic's new factory was actually Pioneer's new factory before they got out of the business of plasma manufacturing. It was designed to make more Kuro panels, as Pioneer was going to expand the Kuro line. Panasonic paid quite a bit less for the site than it cost to build, so your $300 million figure is quite a bit higher than reality.

PANASONIC STILL HAS QUITE an investment in PLASMA.
Not that I CARE, plasma is DEAD


So much for your spin.
Only thing spinning is your head



So, are you going to throw your ball and jacks at me too?
Somebody has to be the grownup around here.:1:

bobsticks
06-21-2010, 12:53 PM
...and the donkey still has the goods on resolution.

Sir Terrence the Terrible
06-21-2010, 03:09 PM
Not telling me anything I don't know.
The Japanese standard required TWO channels for all of its content, and there were
worries about it taking over in the late eighties, don't tell me what I WITNESSED
didn't happen.

What you witnessed and reality are often headed in two divergent directions. For example, you say that MUSE required two channels for all of its content. That is patently false; MUSE required just one channel that had huge bandwidth requirements (4 times the bandwidth of SD television, which they managed to get down to two times). It was never at any point in its development or implementation a two channel system.


Actually, most think the threat of the JAPANESE HDTV system was just a scarecrow
that Zenith used to get HDTV started in this country; it never really was much of a threat.
If it had been up to the broadcasters we still wouldn't have HD.

This is also false. The FCC put the Grand Alliance together, and though Zenith was a major player, it was not the only major player in the game. The MUSE system was not just a threat; it could very well have been a reality here if it didn't have several drawbacks. First, it was not designed for a country the size of the US; it was much more suitable to smaller countries that relied on satellite for distribution of all TV signals. The second would be its bandwidth requirements.

Methinks you have been watching too much of the boogey man.



But Zenith was a big influence.
In any group there is one that really pushes things, and Zenith was a major influence in
creating ATSC.

Zenith contributed VSB to the ATSC standard, and was basically first to create a workable digital broadcast system. The FCC created ATSC, not Zenith. Dolby and MIT, as well as AT&T, were as big a player in the game as Zenith was. Zenith was big in getting stereo audio incorporated into the analog broadcast system via the MTS stereo system.

I think you are overselling Zenith's position in all of this, or you are mixing up the MTS stereo introduction with HDTV. I am trying to give you the benefit of the doubt for the obvious lack of correct information being thrown out here.



BUT IT WAS A START.
In the eighties there was a great fear of the Japanese; this was before their collapse.
Their gear was the best on the planet.
Worries about them dominating broadcast TV on the engineering side were used to help get HDTV started, every article I READ AT THE TIME MENTIONED THEM.

Yes, I do remember the panic and concern that Japanese technology was going to overtake us, but it wasn't because of the MUSE system, that is for sure. The MUSE system, while great for Japan, would never work in the US, and everyone knew it. It was analog, and the US really wanted a digital system; what drove them was that the US wanted to be the first to adopt a digitally based HDTV system. They got what they wanted, and it was not because of Zenith only.




The truth is the truth, no matter how much anyone tries to lawyer it out of existence
with made-up "tests" that extrapolate instead of proving anything.

Your truth is really just plain good ole fashioned denial, and nothing more. Tests reveal the truth that people like you hate to see.




Supported by PANASONIC, I bet.
Just like the guy Wooch quoted, until I was able to prove that he had worked for 20 years for Panasonic.

Displaydata's testing is not supported by anyone but Displaydata. Your constant harping on conspiracy theories does not erase that. What Wooch used to do, and what he does today, has not marred his ability to tell the truth one bit. I have known him for a long time, and he has never exhibited a Panasonic bias like you exhibit a Vizio one.



Unless you want to watch it in a lit room.
Making the phosphors small enough to fit on a 1080P screen lowers the surface area of each, limiting the brightness a great deal.

This is why three 9" CRT guns are used in projection systems. Since one tube's surface area is 3 times the size of a single CRT gun in a direct view, it puts out much more light than a single gun CRT can. That is why you have never seen a 1080p CRT television, but there are dozens of CRT based projection systems that can easily do 1080p.

Movies are compressed, transferred, authored, and designed for exhibition in dark rooms, not lit rooms. It's funny, you quote Joe Kane to bolster a point, but you never quote him when it doesn't fit your argument. Joe Kane has always said that movies are to be watched in a darkened room, just like you do in the theater.


Not to mention that a plasma panel is just a squished CRT, a relic of the past.
LCD is already its better, as the market shows; another five years or less and plasma will be gone, and no amount of rigged tests will change that.

The only thing the market has shown is that the cheaper technology usually wins the race. Beta was better than VHS, but lost. DTS sounded better than DD, but it lost out on DVD. Film projection is better than digital cinema, but it is being replaced by it. Plasma beat out LCD in just about every area except ultimate brightness (which is hardly necessary when a set is properly calibrated). Both CRT and plasma are better technologies than LCD, but LCD is cheaper to produce, and cheaper in the marketplace. People are not looking for performance, they are looking for value these days. That is the reason why companies like Westinghouse, Vizio, Apex, and Coby are thriving, and high performance monitors by Pioneer are gone. Nobody wants to pay for performance except those of us (yes, that's me) who put that before price.





Know what you're measuring when you're "measuring" a "snapshot"?
A still image.

While the image is a still, what is on the television is a moving image. The camera catches the blur, and the computer measures the resolution of the capture. Nothing rigged about that; Vizio is just a plain failure in this area (as well as several more).


Samsung and Sony passed the "test" for a simple reason: both are five hundred pound gorillas, and any aspersions on their sets would have led to a lawsuit, and since this idiotic "test" wouldn't hold up in a court of law, surprise!
They passed the "test"!

Sony and Samsung wouldn't have a case, because the entire industry has accepted the testing protocol that Displaydata developed. Their televisions are just plain better than Vizio's, and only the most ardent of fanboys would deny it, Mr. Viziofanboy.

Since the testing methodology has been accepted by the industry, no lawsuit could happen. More conspiracy theories, but no truth.


PANASONIC STILL HAS QUITE an investment in PLASMA.
Not that I CARE, plasma is DEAD

You wouldn't make this statement if you didn't care. It wouldn't have been necessary.



Only thing spinning is your head

It is spinning along with your lies, conspiracy theories and endless misinformation. Anyone would be dizzy from that!




Somebody has to be the grownup around here.:1:

You are right; who will babysit you when your mother goes to work?

E-Stat
06-21-2010, 04:10 PM
Joe Kane has always said that movies are to be watched in a darkened room, just like you do in the theater.
To me, that is a no brainer not requiring *expert* commentary. My wife used to think it was strange that I always switched out all the lights in the room (and those visible in the adjoining kitchen as well) when we watched a movie. Now I find her doing the same when she is watching the big TV alone.


Sony and Samsung wouldn't have a case, because the entire industry has accepted the testing protocol that Displaydata developed. Their televisions are just plain better than Vizio's, and only the most ardent of fanboys would deny it, Mr. Viziofanboy.
Having models from all three, I would have to agree. The Vizios, however, are inexpensive and work just fine in the office and bedroom. They deliver mid-fi receiver quality in the video sense and are noticeably better than the standard CRTs they replaced when viewing HD content. Good enough for most folks / applications.

rw

Sir Terrence the Terrible
06-21-2010, 06:17 PM
To me, that is a no brainer not requiring *expert* commentary. My wife used to think it was strange that I always switched out all the lights in the room (and those visible in the adjoining kitchen as well) when we watched a movie. Now I find her doing the same when she is watching the big TV alone.

This is also a no brainer for me, but obviously not for Pix. He is constantly touting the maximum brightness that LCD can deliver. The problem with that argument is that you don't need the maximum brightness of LCD when it is properly calibrated, and when the sources viewed on it are in the right environment.

It seems to me that when people learn that having the lights on destroys the purity of the color gamut (you paid for that), washes out the light levels on the screen thereby reducing the contrast (an Achilles heel with LCDs anyway), destroys shadow detail (also a problem with LCD panels), and drops the perceived (not actual) resolution lower (something that you also paid for), you would think they would turn out the lights. Not Pix though; it seems that the light levels emanating from his Vizio have burnt out his eyes, and his brain.



Having models from all three, I would have to agree. The Vizios, however, are inexpensive and work just fine in the office and bedroom. They deliver mid-fi receiver quality in the video sense and are noticeably better than the standard CRTs they replaced when viewing HD content. Good enough for most folks / applications.

rw

Your last words are a perfect description of the Vizio. "Good enough" for most folks/applications, (but I add) not for critical viewing or for high performance applications such as gaming. I am not knocking Vizio at all, I just realize it has certain built-in performance weaknesses, much like the Coby, Magnavox, Apex, Dynex, and Insignia brands have. Vizio does no in-house research and development, so what do you expect. They don't even design their own sets! (And neither do these other third-tier brands.)

Just like most things, you get what you pay for.

Smokey
06-21-2010, 06:48 PM
It seems to me that when people learn that having the lights on destroys the purity of the color gamut (you paid for that), washes out the light levels on the screen thereby reducing the contrast (an Achilles heel with LCDs anyway), destroys shadow detail (also a problem with LCD panels), and drops the perceived (not actual) resolution lower (something that you also paid for), you would think they would turn out the lights.

What about the light (ambient) behind the [non projection] TV?

I always thought that professionals recommended putting a low light (6500K) behind the TV to enhance viewing. And it does. I could not watch my CRT TV without the ambient light behind it in a dark room, or it would give me an eye ache, especially if there is a lot of action on screen.

I also have an ambient light behind my computer monitor :)

Sir Terrence the Terrible
06-21-2010, 07:03 PM
What about the light (ambient) behind the [non projection] TV?

I always thought that professionals recommended putting a low light (6500K) behind the TV to enhance viewing. And it does. I could not watch my CRT TV without the ambient light behind it in a dark room, or it would give me an eye ache, especially if there is a lot of action on screen.

I also have an ambient light behind my computer monitor :)

If your screen takes up your field of vision, you don't need a backlight. This recommendation was made when the average screen size was more like 32 and 40", and people sat far away from the set. Now that it is approaching 50", a backlight is not necessary for most folks. If you get eyestrain these days, it is from having your monitor too bright (or not calibrated), or you are not sitting close enough to it for it to cover your field of vision. I sit exactly 7 feet from my smallest television (a 52" XBR), and I don't need a backlight (it's properly calibrated as well). My computer screens are 26 and 30", and I don't need a backlight there either (they are also calibrated, believe it or not). Now when my computer screens were 17 and 19", I absolutely needed a backlight. When my largest TV was 40", I most definitely had a backlight. A properly calibrated set viewed from the proper distance needs no ambient lighting in the room (it also has to be a certain size as well).
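
The field-of-vision point reduces to simple trigonometry. A small sketch of the arithmetic (the exact "fills your vision" threshold is a matter of taste and isn't claimed here; the 52"-at-7-feet case is the one mentioned above, the second case is a made-up comparison):

```python
import math

def horizontal_fov_deg(diag_in, distance_ft, aspect=16 / 9):
    """Horizontal viewing angle of a 16:9 screen seen from a given distance."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)
    return math.degrees(2 * math.atan(width_in / 2 / (distance_ft * 12)))

print(f'52" from 7 ft:  {horizontal_fov_deg(52, 7):.0f} degrees wide')   # ~30
print(f'40" from 10 ft: {horizontal_fov_deg(40, 10):.0f} degrees wide')  # ~17
# The bigger, closer screen dominates the field of view; the smaller, farther
# one leaves plenty of surrounding darkness where a bias light still matters.
```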

Keep in mind, ambient light is designed for placement in back of the set - a place where it should not change any viewing parameters. Unfortunately the light does not stay exactly behind the television, and can splay on the walls to the sides of the set. That will reduce the contrast of the set.

pixelthis
06-22-2010, 12:12 PM
What you witnessed and reality are often headed in two divergent directions. For example, you say that MUSE required two channels for all of its content. That is patently false; MUSE required just one channel that had huge bandwidth requirements (4 times the bandwidth of SD television, which they managed to get down to two times). It was never at any point in its development or implementation a two channel system.

Right, two times the bandwidth, which was broadcast on TWO CHANNELS.
Every article I READ SAID THAT IT REQUIRED TWO CHANNELS.
YOU said it required TWO channels!!!
What is the argument?
Oh, right, the troll argues just to argue


This is also false. The FCC put the Grand Alliance together, and though Zenith was a major player, it was not the only major player in the game. The MUSE system was not just a threat; it could very well have been a reality here if it didn't have several drawbacks. First, it was not designed for a country the size of the US; it was much more suitable to smaller countries that relied on satellite for distribution of all TV signals. The second would be its bandwidth requirements.

Zenith was a major player, indeed, the push that got the FCC off of its a**.
You gonna keep quoting stuff that I said?



Me thinks you have been watching too much of the boogey man.

And methinks you've been eating too many boogers!




Zenith contributed VSB to the ATSC standard, and was basically first to create a workable digital broadcast system. The FCC created ATSC, not Zenith. Dolby and MIT, as well as AT&T, were as big a player in the game as Zenith was. Zenith was big in getting stereo audio incorporated into the analog broadcast system via the MTS stereo system.

Like I said.
And GENERAL INSTRUMENT WAS key in developing codecs.
And without a campaign for it, we never would have had MTS stereo.



I think you are overselling Zenith's position in all of this, or you are mixing up the MTS stereo introduction with HDTV. I am trying to give you the benefit of the doubt for the obvious lack of correct information being thrown out here.

And I AM GIVING YOU a handicap for having an IQ that matches your hat size.
Zenith was a major player in developing HDTV...and MTS.
If I am incorrect then every magazine of the day was incorrect; every article I READ tied
Zenith closely to HDTV.
Didn't do them any good, they still went outta business.




Yes, I do remember the panic and concern that Japanese technology was going to overtake us, but it wasn't because of the MUSE system, that is for sure. The MUSE system, while great for Japan, would never work in the US, and everyone knew it. It was analog, and the US really wanted a digital system; what drove them was that the US wanted to be the first to adopt a digitally based HDTV system. They got what they wanted, and it was not because of Zenith only.

No, mostly because of GENERAL INSTRUMENT and their work on codecs.
But Zenith was a major player, and without breakthroughs in codecs we would have wound up with an analog system; it was just fate that we didn't.





Your truth is really just plain good ole fashioned denial, and nothing more. Tests reveal the truth that people like you hate to see.
A "test" will show anything you rig it to show.
Only truely independent testers will come up with accurate results , and then only
with time honored proceedures, not extrapolated crap that goes against
everything anybody knows about the way TV sets display pictures





Displaydata's testing is not supported by anyone but Displaydata. Your constant harping on conspiracy theories does not erase that. What Wooch used to do, and what he does today, has not marred his ability to tell the truth one bit. I have known him for a long time, and he has never exhibited a Panasonic bias like you exhibit a Vizio one.
Oh please, he was practically building a shrine in his back yard.
And Displaydata's "data" ISN'T SUPPORTED BY ANYONE ELSE.
Finally got one right.




This is why three 9" CRT guns are used in projection systems. Since one tube's surface area is 3 times the size of a single CRT gun in a direct view, it puts out much more light than a single gun CRT can. That is why you have never seen a 1080p CRT television, but there are dozens of CRT based projection systems that can easily do 1080p.

Three guns are used to show red, green, and blue; it has nothing to do with light production.
And the reason you will never see a direct view 1080p CRT (your words) is the same
reason you will never see a 2000p CRT PROJECTION SET.
It's at the end of the road, old man, time to get with the future.


Movies are compressed, transferred, authored, and designed for exhibition in dark rooms, not lit rooms. It's funny, you quote Joe Kane to bolster a point, but you never quote him when it doesn't fit your argument. Joe Kane has always said that movies are to be watched in a darkened room, just like you do in the theater.
Even theaters keep some light going.
AND A THEATER EXPERIENCE IS NOT THE SAME as a home experience.
A theater experience is two to three hours; sometimes a home experience can last for hours, even days.
AND WATCHING a screen in a darkened room causes eye fatigue, as your pupils open and close in response to varying degrees of light.
You need a backlight to even things out and save your eyesight.
But don't worry, a CRT setup (or a plasma) will be so dim it probably won't cause too much damage.



The only thing the market has shown is that the cheaper technology usually wins the race. Beta was better than VHS, but lost. DTS sounded better than DD, but it lost out on DVD. Film projection is better than digital cinema, but it is being replaced by it. Plasma beat out LCD in just about every area except ultimate brightness (which is hardly necessary when a set is properly calibrated). Both CRT and plasma are better technologies than LCD, but LCD is cheaper to produce, and cheaper in the marketplace. People are not looking for performance, they are looking for value these days. That is the reason why companies like Westinghouse, Vizio, Apex, and Coby are thriving, and high performance monitors by Pioneer are gone. Nobody wants to pay for performance except those of us (yes, that's me) who put that before price.

Nobody can afford it.
You snobs are all the same, you don't get that everybody having a system that is 95%
IS BETTER THAN A FEW RICH BUTTWIPES having a 98% system.
People live in the real world, which is getting more real with that incompetent boob
at the helm every day.
PEOPLE HAVE KIDS, wives who don't want the one entertainment area to be covered up with crap from a Frankenstein movie.
And the marginal results you cite are not all that obvious, and usually not worth the cost,
as that cost puts them outta reach of most, which in turn removes them from the market.
AND PRICE DIDN'T KILL BETAMAX, it was the arrogance of SONY, who, secure in having a better system, refused to put out a two-hour cassette.
Only too late did they realize their error, BUT COST WAS only a minor factor.






While the image is a still, what is on the television is a moving image. The camera catches the blur, and the computer measures the resolution of the capture. Nothing rigged about that; Vizio is just a plain failure in this area (as well as several more).

BALONEY.
There IS no "moving" picture on a TV, just a series of 60 frames.
On a progressive set a complete frame is painted one after another.
Only on an interlaced picture does resolution fall apart.
On 120Hz sets every other frame is interpolated; the set has time to create new frames and paint them, and just showing the 60 frames it gets from its source is child's play by comparison. The only thing "blurry" is your reasoning.
And measuring the "blurriness" is impossible, if indeed there is any; taking a picture with
a CCD off of an LCD is going to be problematic at best.
All this "test" is, is a slander.




Since the testing methodology has been accepted by the industry, no lawsuit could happen. More conspiracy theories, but no truth.

I doubt it, but if so it will only lead to embarrassment in the future



You wouldn't make this statement if you didn't care. It wouldn't have been necessary.
Trolls and industry whores aren't necessary either, but we have them.




It is spinning along with your lies, conspiracy theories and endless misinformation. Anyone would be dizzy from that!
Don't blame me for what seems a chronic condition.





You are right; who will babysit you when your mother goes to work?
Yeah, keep to the low road, talky:1:

pixelthis
06-22-2010, 12:20 PM
Yet another point: how can Displaydata's "test" tell the difference between blurriness
caused by natural motion, which is an intended part of the material, and blurriness injected by the TV?
And if a set is removing blurriness that is part of the material, how could that produce a better picture?
Such a "test" would favor fake-looking images that look computer generated.
And what do you want to bet that Displaydata uses cartoons for
its "testing"?
Wouldn't surprise me.:1:

Sir Terrence the Terrible
06-22-2010, 03:11 PM
Right, two times the bandwidth, which was broadcast on TWO CHANNELS.
Every article I READ SAID THAT IT REQUIRED TWO CHANNELS.
YOU said it required TWO channels!!!
What is the argument?
Oh, right, the troll argues just to argue

You are right, troll. It never required two channels; admit it, you just can't read very well. Bandwidth and channels are two different things, and only an idiot would try and mux the two together.

I never said that MUSE required two channels, liar. Pix, you lie just for the sake of it. This is what I said about MUSE:

(4 times the bandwidth of SD television, which they managed to get down to two times). It was never at any point in its development or implementation a two channel system.

How you got me saying it required two channels after reading this shows that you can't read, PERIOD!







Zenith was a major player, indeed, the push that got the FCC off of its a**.
You gonna keep quoting stuff that I said?

Another false statement. Zenith never pushed the FCC off their butts; it was the Grand Alliance that did that. Zenith is not the Grand Alliance. The Grand Alliance was made up of 7 very large players. You act like Zenith was the entire GA, but it was just one of many big players in the game. You never said that, and nobody was quoting your lame remarks anyway.





And methinks you've been eating too many boogers!

Is this your trip back into childhood, or did you ever leave?






Like I said.
And GENERAL INSTRUMENT WAS key in developing codecs.
And without a campaign for it, we never would have had MTS stereo.

Zenith developed the codec, General Instrument provided the wrapper. Still can't get your facts straight. MTS was a lousy carrier of stereo, by the way.





And I AM GIVING YOU a handicap for having an IQ that matches your hat size.
Zenith was a major player in developing HDTV...and MTS.
If I am incorrect then every magazine of the day was incorrect; every article I READ tied
Zenith closely to HDTV.
Didn't do them any good, they still went outta business.

One can only gather one thing from your comment. The magazines were not incorrect, you just can't read. Zenith was one of many players in the GA, and one of many contributors to the ATSC standards.



No, mostly because of GENERAL INSTRUMENT and their work on codecs.
But Zenith was a major player, and without breakthroughs in codecs we would have wound up with an analog system; it was just fate that we didn't.

GI does not do audio codecs, and they never have. We would never have ended up with an analog system, because the FCC didn't want it. They wanted an all-digital system that would fit in the same broadcast spectrum they currently had.



http://en.wikipedia.org/wiki/General_Instrument

AT&T was a major player, MIT was a major player, Thomson Consumer Electronics was a major player, General Instrument Corporation was on the committee (though not a major player), and Philips Consumer Electronics and the David Sarnoff Research Center were also major players. Zenith was the least powerful among these companies because it was basically a failing company in deep financial trouble. LG bought a stake in Zenith just to keep it alive (and gain access to their patents), and ended up buying Zenith whole.




A "test" will show anything you rig it to show.
Only truely independent testers will come up with accurate results , and then only
with time honored proceedures, not extrapolated crap that goes against
everything anybody knows about the way TV sets display pictures

You sound desperate as hell, grasping every straw you can. Displaydata is a truly independent tester, and just because you don't like the results of their testing does not change that fact one bit. If Displaydata's testing was not "accurate", then Vizio would have a lawsuit victory in the bag. Have you heard any news that Vizio was suing Displaydata? Nope, because they can't; the industry has approved DD's method of testing, and so does Joe Kane, by the way. Joe Kane had a projector he worked with Samsung to produce tested by Displaydata. The results he got back helped him tweak the performance of that projector, and it ended up being one of the finest on the market.

http://news.cnet.com/8301-17938_105-10352083-1.html



Oh please, he was practically building a shrine in his back yard.
And Displaydata's "data" ISN'T SUPPORTED BY ANYONE ELSE.
Finally got one right.

Pix, everyone knows that you have never left your trailer park except to run from a tornado, so how do you know what Wooch has in his backyard?

Displaydata is a standalone company not owned or supported by any CE company. Its testing criteria and methodology are accepted by every CE company, and the Hollywood studios as well. None of your spin can deny this, and none of your lies will change it either.



Three guns are used to show red, green, and blue; it has nothing to do with light production.

Oh really, and just how does color get on the screen if there is no light? LOL. It just so happens, idiot, that peak white is a combination of all three guns at full bore.


And the reason you will never see a direct view 1080p CRT (your words) is the same
reason you will never see a 2000p CRT PROJECTION SET.
It's at the end of the road, old man, time to get with the future.

Irrelevant to the issue at hand, dumbbell.



Even theaters keep some light going.
AND A THEATER EXPERIENCE IS NOT THE SAME as a home experience.
A theater experience is two to three hours; sometimes a home experience can last for hours, even days.

Sheesh, the only light you see in a theater comes from the exit signs, guide lights, and projector light spill coming from the view hole in the booth. Because of the large space that light is very dim, and when compared to the amount of light on the screen, it is nothing. It certainly does not compare to watching television in full light at home.

Movies are the same length whether played at home or in the movie theater. Only you would sit in front of a television for days; everyone else goes to sleep and to work. Regardless, movies are edited in the dark, authored and compressed in the dark, QC'd in the dark, and should be played back in the dark, whether in the theater or at home. Joe Kane's words in a home theater magazine that you often quote from.



AND WATCHING a screen in a darkened room causes eye fatigue, as your pupils open and close in response to varying degrees of light.
You need a backlight to even things out and save your eyesight.
But don't worry, a CRT setup (or a plasma) will be so dim it probably won't cause too much damage.

Eye fatigue is only a problem when the display does not take up a significant portion of your field of view, or when the brightness of the set is too high. If you calibrate your set, and sit the proper distance away from it, eye fatigue will not happen.

A properly calibrated set, whether it be LCoS, LCD, plasma, or CRT, will not cause eyestrain, and will not display peak white anywhere near the top of its range.

The only thing that is dim is you.


Nobody can afford it.
You snobs are all the same, you don't get that everybody having a system that is 95%
IS BETTER THAN A FEW RICH BUTTWIPES having a 98% system.
People live in the real world, which is getting more real with that incompetent boob
at the helm every day.


Nobody can afford it? Right.

If you think for a moment your Vizio is 95% of the performance of a Kuro, you really are stupid as hell. It ain't even close to that performance level; try 60%. That 38% performance premium is well worth the money, but you wouldn't really know, because you need a job to pay for that kind of performance.

Pix, you have got to be one of those stupid tea party types that vote against their own best interests. You are unemployed, which means you probably don't have health insurance. He got you health insurance, so rather than calling him a boob, you should thank him, ungrateful twit. Buy a clue, stupid...ooops, you need a job to do that.



PEOPLE HAVE KIDS, wives who don't want the one entertainment area to be covered up with crap from a Frankenstein movie. And the marginal results you cite are not all that obvious, and usually not worth the cost,
as that cost puts them outta reach of most, which in turn removes them from the market.
AND PRICE DIDN'T KILL BETAMAX, it was the arrogance of SONY, who, secure in having a better system, refused to put out a two-hour cassette.
Only too late did they realize their error, BUT COST WAS only a minor factor.

How would you know that buying a performance-based television brings only marginal improvements over your low-performing Vizio? You have never owned a performance-based set!

Beta III mode could do five hours of recording; so much for your two-hour comment.








BALONEY.
There IS no "moving" picture on a TV, just a series of 60 frames.

The frames are not stationary, are they? What a stupid statement!


On a progressive set a complete frame is painted one after another.
Only on an interlaced picture does resolution fall apart.
On 120Hz sets every other frame is interpolated; the set has time to create new frames and paint them, and just showing the 60 frames it gets from its source is child's play by comparison. The only thing "blurry" is your reasoning.
And measuring the "blurriness" is impossible, if indeed there is any; taking a picture with
a CCD off of an LCD is going to be problematic at best.
All this "test" is, is a slander.

So objects do not move at all on a professive set? Since I have already explained the testing, there is no need to go into that again.

Since the industry has already accepted and approved Displaydata's testing procedure, your denial and dismissal of this test is meaningless. You are the lone ignorant dissenter on this issue.


I doubt it, but if so it will only lead to embarrassment in the future

You cannot predict the future, can you?




Trolls and industry whores aren't necessary either, but we have them.

Glad you know who you are!





Don't blame me for what seems a chronic condition.

The only chronic condition we have around here is your lies and denials.






Yeah, keep to the low road, talky:1:

I can't, you leave me no room.

pixelthis
06-23-2010, 09:33 AM
You are right, troll. It never required two channels; admit it, you just can't read very well. Bandwidth and channels are two different things, and only an idiot would try and mux the two together.

So it required the bandwidth of two channels, but they never used two channels
to broadcast; guess they just broadcast half a signal.
What a maroon.



I never said that MUSE required two channels, liar. Pix, you lie just for the sake of it. This is what I said about MUSE: (4 times the bandwidth of SD television, which they managed to get down to two times). It was never at any point in its development or implementation a two channel system.
How you got me saying it required two channels after reading this shows that you can't read, PERIOD!


No, you said it required twice the bandwidth; I SAID THAT IT REQUIRED TWO CHANNELS, AND WAS PRIMARILY BROADCAST BY SATELLITE,
which was the case.
You'll argue with the devil when he comes for your soul.







Another false statement. Zenith never pushed the FCC off their butts; it was the Grand Alliance that did that. Zenith is not the Grand Alliance. The Grand Alliance was made up of 7 very large players. You act like Zenith was the entire GA, but it was just one of many big players in the game. You never said that, and nobody was quoting your lame remarks anyway.

Zenith was the prime mover behind the Grand Alliance; practically everything I READ ABOUT the Grand Alliance cited Zenith as a major player.





Is this your trip back into childhood, or did you ever leave?
You ever graduate from JR high?







Zenith developed the codec, General Instrument provided the wrapper. Still can't get your facts straight. MTS was a lousy carrier of stereo, by the way.
GENERAL INSTRUMENT developed MPEG; never said they developed MTS, you just inferred that.
A lot of GI's work on codecs helped develop digital TV.





One can only gather one thing from your comment. The magazines were not incorrect, you just can't read. Zenith was one of many players in the GA, and one of many contributors to the ATSC standards.

I CAN READ, YOU JUST DIDN'T.



GI does not do audio codecs, and they never have. We would never have ended up with an analog system, because the FCC didn't want it. They wanted an all-digital system that would fit in the same broadcast spectrum they currently had.
DIDN'T SAY THEY DID; SAID THEY WORKED ON video CODECS, which they did.


http://en.wikipedia.org/wiki/General_Instrument


AT&T was a major player, MIT was a major player, Thomson Consumer Electronics was a major player, General Instrument Corporation was on the committee (though not a major player), and Philips Consumer Electronics and the David Sarnoff Research Center were also major players. Zenith was the least powerful among these companies because it was basically a failing company in deep financial trouble. LG bought a stake in Zenith just to keep it alive (and gain access to their patents), and ended up buying Zenith whole.
Their push for HDTV was the last major thing this company ever did, but they did do it.




You sound desperate as hell, grasping every straw you can. Displaydata is a truly independent tester, and just because you don't like the results of their testing does not change that fact one bit. If Displaydata's testing was not "accurate", then Vizio would have a lawsuit victory in the bag. Have you heard any news that Vizio was suing Displaydata? Nope, because they can't; the industry has approved DD's method of testing, and so does Joe Kane, by the way. Joe Kane had a projector he worked with Samsung to produce tested by Displaydata. The results he got back helped him tweak the performance of that projector, and it ended up being one of the finest on the market.

Don't care who's behind what; that "test" as you described it is BOGUS.






Pix, everyone knows that you have never left your trailer park except to run from a tornado, so how do you know what Wooch has in his backyard?

SATCOM CHECKED IT OUT, after astronauts complained about the light


Displaydata is a standalone company not owned or supported by any CE company. Its testing criteria and methodology are accepted by every CE company, and the Hollywood studios as well. None of your spin can deny this, and none of your lies will change it either.

There is no such thing as "motion" resolution, and nothing will ever change that.




Oh really, and just how does color get on the screen if there is no light? LOL. It just so happens, idiot, that peak white is a combination of all three guns at full bore.

Who's the idiot?
Sure the light comes from the tubes, but that is not the primary reason you have three.
Each is a primary color that, when mixed on screen, produces a picture.
This provides better sharpness than using one tube; it has nothing to do with light production, although I GUESS CRTS need all of the help they can get.


Irrelevant to the issue at hand, dumbbell.
You're "irrelevant"




Sheesh, the only light you see in a theater comes from the exit signs, guide lights, and projector light spill coming from the view hole in the booth. Because of the large space that light is very dim, and when compared to the amount of light on the screen, it is nothing. It certainly does not compare to watching television in full light at home.

IT DOES NOT COMPARE TO TELEVISION, which is why TV watching requires a BACKLIGHT.
And if you don't use one it will hurt your eyes, not that you see well enough to tell what a decent picture is anyway.



Movies are the same length whether played at home or in the movie theater. Only you would sit in front of a television for days; everyone else goes to sleep and to work. Regardless, movies are edited in the dark, authored and compressed in the dark, QC'd in the dark, and should be played back in the dark, whether in the theater or at home. Joe Kane's words in a home theater magazine that you often quote from.

YOU DON'T just watch movies on TV; the average for TV watching is four hours a
day.
You need a backlight.




Eye fatigue is only a problem when the display does not take up a significant portion of your field of view, or when the brightness of the set is too high. If you calibrate your set, and sit the proper distance away from it, eye fatigue will not happen.

It will if you don't have a backlight; you can't keep your eyes glued to the screen all of the time.



A properly calibrated set, whether it be LCoS, LCD, plasma, or CRT, will not cause eyestrain, and will not display peak white anywhere near the top of its range.

DOESN'T NEED TO, but bursts of brightness will cause the retina to respond;
this is reduced by a backlight.






Nobody can afford it? Right.

GLAD you agree with me

If you think for a moment your Vizio is 95% of the performance of a Kuro, you really are stupid as hell. It ain't even close to that performance level, try 60%. That 40% performance premium is well worth the money, but you wouldn't really know, because you need a job to pay for that kind of performance.

THE PERFORMANCE of a Kuro is zero compared to my Vizio, because they don't make it anymore, which is what you get when you try to sell a $5,000 TV that maybe is 2 to 3 percent better than the worst on the market.
ANY set will turn out 90%. 90 on up is what you pay for in a premium set, that and better processing.
Most can't tell the diff between a high-end and a low-end set, and quite a few who can't just lie, and keep their noses in the air


Pix, you have got to be one of those stupid tea party types that vote against their own best interests. You are unemployed, which means you probably don't have health insurance. He got you health insurance, so rather than calling him a boob, you should thank him, ungrateful twit. Buy a clue, stupid...oops, you need a job to do that.
BEEN WORKING FOR A YEAR, have health insurance and other benefits, in spite of OBOZO'S best effort to destroy the economy.
What you liberal twits don't realize is that when you vote in your own interest, at the expense of the country's interest, long term you will suffer, as will your children.
But of course if you had any brains you wouldn't be a liberal puke in the first place




How would you know buying a performance-based television has only marginal improvements over your low-performing Vizio? You have never owned a performance-based set!
You're one to talk, YOUR "SET" IS A COBBLED-TOGETHER POS from the last century

Beta III mode could do five hours of recording, so much for your two-hour comment.
You don't know anything, do you?
BETA III WAS WAY DOWN THE ROAD, AFTER BETA ceased as a consumer format.
Only pros and pro-ams had any interest in it.
When Beta died as a consumer format it had a running time of one and a half hours, which was why I didn't buy one.
A year later you couldn't find one outside of the hobbyist area









The frames are not stationary, are they? What a stupid statement!
Of course they are stationary, where are they going?
They look like they are moving, but that is just a trick of the eye



So objects do not move at all on a professive set? Since I have already explained the testing, there is no need to go into that again.

A "proffessive " set?
Dont know bout those, but on a [I]progressive[I] set there is no resolution loss with movement, the picture doesn "blur" or any other nonsense, and a new 1080p set doesnt
have the resolution of a 1989 EMERSON vcr, OR ANY OTHER DELUSIONS YOU ENTERTAIN in that small cranium of yours


Since the industry has already accepted and approved Displaydata's testing procedure, your denial and dismissal of this test is meaningless. You are the lone ignorant dissenter on this issue.
THE VERDICT IS STILL OUT ON THAT ONE.
Just know that if everybody says the world is flat, that doesn't mean it's flat



You cannot predict the future, can you?
Sure I can: ruin for this poor country for electing a commie; free gas, just park next to the gulf with a siphon hose; bad eyesight for those who abuse their eyes; and a scandal involving some corporate hack outfit that sells dubious "tests" of dubious value





Glad you know who you are!
YOU'LL BE SAD when you figure out who you "are"; everybody else already knows






The only chronic condition we have around here is your lies and denials.


AND YOUR TROLLING





I can't, you leave me no room.
PUT YOUR FAT ASS ON A DIET:1:

GMichael
06-23-2010, 09:47 AM
As a very wise bear once asked, "You two don't really come here for the hunting, do you?"

E-Stat
06-23-2010, 09:49 AM
What a maroon
That is funny on multiple levels. :)

rw

Sir Terrence the Terrible
06-24-2010, 11:29 AM
That is funny on multiple levels. :)

rw

He is funny on multiple levels

pixelthis
06-24-2010, 12:42 PM
That is funny on multiple levels. :)

rw

It's not supposed to be.
When calling someone a moron, spell it like they would (not to mention that my hero, Bugs, spells it that way).:1:

Sir Terrence the Terrible
06-24-2010, 12:52 PM
It figures a cartoon would be his hero. A cartoon for a cartoon, how about that!

pixelthis
06-24-2010, 12:55 PM
It figures a cartoon would be his hero. A cartoon for a cartoon, how about that!

It's better to have a cartoon hero than to be a cartoon; at least Bugs is possessed of some intelligence.:1:

Sir Terrence the Terrible
06-24-2010, 01:56 PM
It's better to have a cartoon hero than to be a cartoon; at least Bugs is possessed of some intelligence.:1:

Maybe he does, but you don't. I just thought it was fitting that a cartoon admires another cartoon. Or should that be a clown admires a cartoon. Whatever...they both fit.

E-Stat
06-24-2010, 02:22 PM
spell it like they would (not to mention that my hero, Bugs, spells it that way)
Do a search and you will find four pages of posts that provide the answer. Hint: there's one individual who overwhelmingly dominates the (mis)usage (as opposed to the term being found via quoted text by someone else).

BTW, will you ever figure out when to properly use the carriage return key?

rw

Sir Terrence the Terrible
06-24-2010, 02:41 PM
BTW, will you ever figure out when to properly use the carriage return key?

rw

Baby steps E, baby steps. He has to learn to think first, then the carriage return key.

kevlarus
06-25-2010, 10:36 AM
BTW, will you ever figure out when to properly use the carriage return key?

rw

Now that you told him about it, he'll overuse it just like the caps key.

Woochifer
06-25-2010, 08:38 PM
Supported by PANASONIC I bet.
Just like the guy Wooch quoted until I was able to prove that he worked for 20 years for Panasonic

What guy would that be? If you're referring to Gary Merson of HD Guru.com, cite your source on that. First, you were claiming that the blue color of that website proves that he's a paid Panasonic shill. At least that claim had the oh-so-slightest measure of truth to it -- indeed HD Guru's website is colored blue!

This latest claim that Merson worked for Panasonic for 20 years is just laughably false. Just like all your claims that motion resolution tests don't exist, that photographic evidence of LCD's inferior viewing angles is imaginary, that every test result is somehow conspired against Vizio, etc.

The info from Merson's website about motion resolution, viewing angles, and color accuracy coincides with what several other professional review sites say. It's not just one guy, but a whole body of tests and reviews that support his conclusions.


PANASONIC STILL HAS QUITE an investment in PLASMA.
Not that I CARE, plasma is DEAD

Of course you don't care. You don't care so much that you never talk about plasma ... EVER! :out:

RGA
06-25-2010, 09:59 PM
Holy crap you video guys sure take this seriously :)

Anyway - I am looking for the cheapest possible decent 40 inch LCD - I'll be playing video games on it a lot and I'm not really a videophile. Yikes, I have a Sony 1080i (that seems to be nice enough).

The LCDs are coming down in price and this is the one I'm looking at. http://www.bestbuy.ca/en-CA/product/samsung-samsung-40-1080p-lcd-hdtv-ln40b540-ln40b540/10122398.aspx?path=7e1044fa877ba626a29aadc8e2272f00en02

I'll check out some competitors, but Best Buy usually has the best prices. Blu-ray will be handled by a PS3 - but I tend to download and watch from my laptop, so quality is never that great - I'd probably want a forgiving TV if there is such a thing. My Sony has 120hz while the Samsung is 60hz. But I'm not really sure any of this is visible.

I remember years ago when I was buying a printer and looking at the dots per inch or whatever, and there were two models and one had twice the pixels, but as the salesman noted - that's all fine and good, but no human eye can detect the added pixels anyway, so you're paying for something that uses more ink and tends to clog up more, for better numbers nobody can see. (And of course I see the link with the audio world, where just because something has better numbers doesn't mean anyone can hear it.)

I figure Samsung and Sony are pretty solid brands - LG is said to be good but a salesman I trust said they seem to get a lot more of them back for failing - though he liked the picture. Anyway, any thoughts on the above linked Samsung for good or ill?

Woochifer
06-25-2010, 11:59 PM
Holy crap you video guys sure take this seriously :)

Anyway - I am looking for the cheapest possible decent 40 inch LCD - I'll be playing video games on it a lot and I'm not really a videophile. Yikes, I have a Sony 1080i (that seems to be nice enough).

The LCDs are coming down in price and this is the one I'm looking at. http://www.bestbuy.ca/en-CA/product/samsung-samsung-40-1080p-lcd-hdtv-ln40b540-ln40b540/10122398.aspx?path=7e1044fa877ba626a29aadc8e2272f00en02

Can't really go wrong with a Samsung. Generally, they will at least give you solid performance and reliability, and for the most part, they make the right decisions with how they design their sets.

That particular model has 1080p resolution, but it lacks a 120 Hz refresh rate, which means that you'll lose more resolution with moving images and get some juddering with film-based sources.


I'll check out some competitors, but Best Buy usually has the best prices. Blu-ray will be handled by a PS3 - but I tend to download and watch from my laptop, so quality is never that great - I'd probably want a forgiving TV if there is such a thing. My Sony has 120hz while the Samsung is 60hz. But I'm not really sure any of this is visible.

Downloads will be heavily compressed, with a data rate barely higher than with a DVD. It might be technically 720p or even 1080i/p resolution, but you'll see a lot of visible macroblocking and other visual artifacts at that level of compression. No TV can make up for lack of bits.

If you're already watching a 120 Hz TV, you might want to reconsider going to 60 Hz. The juddering effect is a byproduct of the 2:3 pulldown process that's used to render a 24-frame film segment onto a video format that natively displays at 60 frames. That juddering will be visible on any 60 Hz TV, while a 120 Hz TV with true 5:5 frame repeating will eliminate that (as far as I know, Sony has correctly implemented it with all of their 120 Hz TVs). Also, if you use the motion interpolation feature (Sony calls it Motionflow) and like it, the absence of that smoothing effect will be very noticeable.
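
To make that cadence arithmetic concrete, here is a minimal sketch of 2:3 pulldown versus 5:5 frame repeating. It's illustrative Python only - real sets do this in hardware - but the frame counts are the point:

    # 2:3 pulldown: alternate film frames are held for 2 and 3 video
    # frames, so 24 film frames fill 60 video frames (24 x 2.5 = 60).
    def pulldown_2_3(film_frames):
        video = []
        for i, frame in enumerate(film_frames):
            video.extend([frame] * (2 if i % 2 == 0 else 3))
        return video

    # 5:5 repeat: every film frame is shown exactly five times,
    # filling 120 Hz evenly (24 x 5 = 120) -- no uneven cadence.
    def frame_repeat_5_5(film_frames):
        video = []
        for frame in film_frames:
            video.extend([frame] * 5)
        return video

    film = list(range(24))              # one second of film frames
    print(len(pulldown_2_3(film)))      # 60 -- 2,3,2,3 hold times = judder
    print(len(frame_repeat_5_5(film)))  # 120 -- even hold times, no judder

The uneven 2,3,2,3 hold pattern is the judder; the even 5,5,5 pattern is why a correctly implemented 120 Hz set doesn't need interpolation to get rid of it.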

Why are you upgrading from the Sony?


I remember years ago when I was buying a printer and looking at the dots per inch or whatever, and there were two models and one had twice the pixels, but as the salesman noted - that's all fine and good, but no human eye can detect the added pixels anyway, so you're paying for something that uses more ink and tends to clog up more, for better numbers nobody can see. (And of course I see the link with the audio world, where just because something has better numbers doesn't mean anyone can hear it.)

Yes and no. The difference between 1080p and 720p is visible, but it diminishes in importance as the screen gets smaller and your distance from the screen increases. But the difference between HD (720/1080) and SD (480) is quite visible. You're right in that the specs don't tell the full story, but there's simply no substitute for resolution. And with a broadcast-quality HD signal or with a Blu-ray, the picture quality will simply blow away any SD source.
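
The screen-size/distance tradeoff can be put in numbers using the usual 20/20 acuity rule of thumb of about one arcminute per pixel. A rough sketch (hypothetical Python, assuming a 16:9 panel and that one-arcminute rule):

    import math

    # Distance beyond which a single pixel subtends less than one
    # arcminute, i.e. extra resolution stops being visible.
    def max_useful_distance_ft(diagonal_in, horizontal_pixels):
        width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 screen width
        pixel_in = width_in / horizontal_pixels          # one pixel's width
        one_arcmin = math.radians(1 / 60)                # ~0.00029 rad
        return (pixel_in / one_arcmin) / 12              # inches -> feet

    for pixels, label in [(1280, "720p"), (1920, "1080p")]:
        print(label, round(max_useful_distance_ft(40, pixels), 1), "ft")
    # ~7.8 ft for 720p and ~5.2 ft for 1080p on a 40" set: sit farther
    # back than that and the finer pixel grid stops being resolvable.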


I figure Samsung and Sony are pretty solid brands - LG is said to be good but a salesman I trust said they seem to get a lot more of them back for failing - though he liked the picture. Anyway, any thoughts on the above linked Samsung for good or ill?

LG makes some good sets, but they can be hit and miss. For example, some of their 120 Hz sets don't allow you to disable the motion interpolation feature (which makes everything look like it was shot on a camcorder). IMO (and in the opinion of most professional reviewers), a 120 Hz set works best when it just does the 5:5 frame repeating to reduce the juddering effect with film sources. When the motion interpolation is added to the mix, movies will look like bad soap operas. If you like that smoothing effect, then great. But I think it's a design flaw if a TV does not give you the option of bypassing the motion interpolation.

Samsung and Sony share a production line, with Samsung producing many of the LCD panels for Sony. Each company though uses its own designs for the video processing.

Enochrome
06-26-2010, 08:32 AM
Yeah, 720p is yesterday's numbers. It does not make sense for companies to make that resolution anymore in order to keep up with consumers' need for "better numbers". I personally just bought a 42" Panasonic plasma, because I missed the richness and smooth aspect of plasmas. I started out with a plasma, went to LCD (the plasma was defective) and now am back to plasma 720p. 720, because it is well known that at 50" or less, you cannot tell the difference between 720 and 1080 after 9 ft.

On the topic of 1080i with a 1080p set: I watched the NBA Finals (Lakers!!!) on my friend's 1080p Vizio LCD on Dish ESPN and it blew my plasma away: the picture looked amazing!!

I know that I am a minority here, but I think 1080p LCD sets with Blu-ray look terrible. Well, not terrible, but not as good as 1080i. Why? The motion is so unnatural that it ruins whatever you are watching, and it seems that actors are standing in front of green screens when I know that they are not. The technology is just mismatched; it goes for clarity over naturalness. Looking back, people will laugh, or maybe they will laugh at me.

Blu-ray pick of the week: Days of Heaven. The cinematography is beautiful.

RGA
06-26-2010, 10:11 AM
Woochifer

I'm not upgrading the Sony - it was more of a family TV so my mom is keeping it. She is hard of hearing and uses the headphone outputs. It has 120hz - it's a Bravia but I forget the model number - it's a couple of years old but is a 1080i, not a 1080p. I think it's fine but I'm not a videophile. It's not perfect. I suppose I could look for a similar TV. Motion is probably important if I am going to play games on it - so I may hold off a bit for 120hz.

Here is a Philips which claims 240hz refresh and 1ms response time. It's a bit more than I want to spend, but it's not on sale and I can wait. http://www.futureshop.ca/en-CA/product/philips-philips-40-1080p-240hz-lcd-hdtv-40pfl5505-40pfl5505/10145790.aspx?path=e99bbfcfd03961e7d24900aff6bbb9ceen02

From what I have read the Philips uses a Samsung panel but that is only stated on a forum so who knows?

pixelthis
06-28-2010, 04:37 PM
Yeah, 720p is yesterday's numbers. It does not make sense for companies to make that resolution anymore in order to keep up with consumers' need for "better numbers". I personally just bought a 42" Panasonic plasma, because I missed the richness and smooth aspect of plasmas. I started out with a plasma, went to LCD (the plasma was defective) and now am back to plasma 720p. 720, because it is well known that at 50" or less, you cannot tell the difference between 720 and 1080 after 9 ft.

On the topic of 1080i with a 1080p set: I watched the NBA Finals (Lakers!!!) on my friend's 1080p Vizio LCD on Dish ESPN and it blew my plasma away: the picture looked amazing!!

I know that I am a minority here, but I think 1080p LCD sets with Blu-ray look terrible. Well, not terrible, but not as good as 1080i. Why? The motion is so unnatural that it ruins whatever you are watching, and it seems that actors are standing in front of green screens when I know that they are not. The technology is just mismatched; it goes for clarity over naturalness. Looking back, people will laugh, or maybe they will laugh at me.

Blu-ray pick of the week: Days of Heaven. The cinematography is beautiful.

Well, the problem is that a high resolution source is very unforgiving.
For instance, how do you know the actors are not standing in front of green screens?
You would be surprised at how much green screening goes on (SANCTUARY is almost entirely green screen).
AND OF COURSE THE VIZIO BLEW YOUR PLASMA AWAY. Funny, it was a fast action sports program, which the Vizio haters on this site claim Vizio is worst at, but it was excellent, actually, and of course the pic was better. Plasma fanboys (and corporate shills) can scream over and over about the superiority of plasma; it doesn't change the fact that most, when given a choice, choose LCD.
Which is demonstrated in the marketplace daily.:1:

Woochifer
06-28-2010, 04:47 PM
Yeah, 720p is yesterday's numbers. It does not make sense for companies to make that resolution anymore in order to keep up with consumers' need for "better numbers". I personally just bought a 42" Panasonic plasma, because I missed the richness and smooth aspect of plasmas. I started out with a plasma, went to LCD (the plasma was defective) and now am back to plasma 720p. 720, because it is well known that at 50" or less, you cannot tell the difference between 720 and 1080 after 9 ft.

Well, 720p is still out there because it's less expensive and as you note, in the smaller screen sizes, you won't tell much of a difference.


On the topic of 1080i with a 1080p set: I watched the NBA Finals (Lakers!!!) on my friend's 1080p Vizio LCD on Dish ESPN and it blew my plasma away: the picture looked amazing!!

I know that I am a minority here, but I think 1080p LCD sets with Blu-ray look terrible. Well, not terrible, but not as good as 1080i. Why? The motion is so unnatural that it ruins whatever you are watching, and it seems that actors are standing in front of green screens when I know that they are not. The technology is just mismatched; it goes for clarity over naturalness. Looking back, people will laugh, or maybe they will laugh at me.


Sounds to me like you're watching a set with the motion interpolation switched on (Vizio calls it "Smooth Motion") and like that effect better. This has nothing to do with 1080p/i or LCD/plasma. It's simply a feature that smooths out the motion.

1080p Blu-ray is pretty much superior to everything else because 1) it has by far the highest bitrate; and 2) natively displaying a 1080p signal on a 1080p TV requires the least amount of processing (which alters the image, and introduces all kinds of potential artifacts). Scaling a 1080p signal down for a 720p TV is not nearly as involved a process as deinterlacing a 480i/1080i signal -- so the Blu-ray should give you superior results whether you're watching a 720p or 1080p HDTV.
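
The bitrate point is easy to quantify. The bitrates below are rough ballpark assumptions for circa-2010 sources, not measured figures, but they show why a "1080p" download can still look worse than a DVD on a per-pixel basis:

    # Bits available per pixel per frame at ~30 fps for typical sources.
    # Bitrates (Mbps) are rough ballpark assumptions for illustration.
    SOURCES = {
        "DVD (480 MPEG-2)":     (720, 480, 6),
        "HD download (720p)":   (1280, 720, 5),
        "Broadcast HD (1080i)": (1920, 1080, 15),
        "Blu-ray (1080p)":      (1920, 1080, 30),
    }

    for name, (w, h, mbps) in SOURCES.items():
        bpp = (mbps * 1e6) / (w * h * 30)
        print(f"{name:22s} {bpp:.3f} bits/pixel/frame")
    # The download spreads fewer bits over far more pixels than the
    # DVD does -- hence the macroblocking no TV can scrub out.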

Personally, I hate motion interpolation, because the heavy-handed processing required to produce that smoothing effect makes the image look fake.

Sir Terrence the Terrible
06-28-2010, 04:58 PM
Yeah, 720p is yesterday's numbers. It does not make sense for companies to make that resolution anymore in order to keep up with consumers' need for "better numbers". I personally just bought a 42" Panasonic plasma, because I missed the richness and smooth aspect of plasmas. I started out with a plasma, went to LCD (the plasma was defective) and now am back to plasma 720p. 720, because it is well known that at 50" or less, you cannot tell the difference between 720 and 1080 after 9 ft.

You are correct on the distance factor.


On the topic of 1080i with a 1080p set: I watched the NBA Finals (Lakers!!!) on my friend's 1080p Vizio LCD on Dish ESPN and it blew my plasma away: the picture looked amazing!!

The picture may have looked amazing, but you would have to do a side-by-side comparison with the same material to make the claim that it blows your plasma away. Plus, this statement is in direct contradiction to the one you make below.


I know that I am a minority here, but I think 1080p LCD sets with Blu-ray look terrible. Well, not terrible, but not as good as 1080i.

Here is your contradiction. First, you don't know if any of these sets were calibrated. Second, 1080p has a higher data rate than 1080i, so unless the mastering and compression were poorly done, 1080p and 1080i look absolutely the same. Thirdly, you cannot compare two different programs and proclaim that one resolution is better than the other. You have to have the same programming mastered identically for each resolution. If it is viewed on the same set (1080p), it will be deinterlaced and end up being 1080p in the end anyway. If you are comparing a 1080p set with a 1080i set, they would have to be calibrated to give the same visual result. Good luck finding the difference between these resolutions, as they are essentially the same information presented differently.
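
For film-sourced material the "essentially the same" claim is literal: 1080i just splits each frame into two fields, and a weave deinterlacer re-pairs them losslessly. A minimal sketch (illustrative Python, treating a frame as a list of scanlines and ignoring the 2:3 field-cadence bookkeeping):

    # Split a progressive frame into the two fields 1080i transmits.
    def split_fields(frame):
        return frame[0::2], frame[1::2]   # even lines, odd lines

    # Weave: re-interleave two fields taken from the same film frame.
    def weave(even, odd):
        frame = []
        for e, o in zip(even, odd):
            frame.extend([e, o])
        return frame

    original = [f"line{i}" for i in range(1080)]
    top, bottom = split_fields(original)
    assert weave(top, bottom) == original  # perfect reconstruction
    # Only video-originated 1080i, where the two fields come from
    # different instants in time, needs lossy motion-adaptive tricks.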



Why? The motion is so unnatural that it ruins whatever you are watching, and it seems that actors are standing in front of green screens when I know that they are not.

This effect comes when the television is not calibrated, and has nothing to do with 1080p or 1080i.

Another problem with your statement is that most LCD televisions can accept the 24fps cadence of film. That means it should look, motion-wise, the same as film looks in the theater. What makes motion look so unnatural is when anti-judder processing (poorly implemented) is turned on full or high.


The technology is just mismatched; it goes for clarity over naturalness. Looking back, people will laugh, or maybe they will laugh at me.

Well, you'll be shocked to know that most Blu-ray discs are QC'd on professional LCD monitors, so the technology is not mismatched at all. Clarity comes with the bump up in resolution, and the lack of 3:2 pulldown for 120hz sets and plasmas that do 96hz processing.


Blu-ray pick of the week: Days of Heaven. The cinematography is beautiful.

I'll have to check that out.

Woochifer
06-28-2010, 05:11 PM
Woochifer

I'm not upgrading the Sony - it was more of a family TV so my mom is keeping it. She is hard of hearing and uses the headphone outputs. It has 120hz - it's a Bravia but I forget the model number - it's a couple of years old but is a 1080i, not a 1080p. I think it's fine but I'm not a videophile. It's not perfect. I suppose I could look for a similar TV. Motion is probably important if I am going to play games on it - so I may hold off a bit for 120hz.

Actually, if that set is an LCD, it's not technically 1080i, because flat panel TVs are natively progressive. The motion interpolation IMO ruins the picture quality, regardless of the source. But the 5:5 frame repeating on a 120 Hz TV (which Sony implements correctly) will reduce the juddering effect with film-based sources. With video-based sources such as live TV or video games, I don't think that the motion interpolation will help much.


Here is a Philips which claims 240hz refresh and 1ms response time. It's a bit more than I want to spend, but it's not on sale and I can wait. http://www.futureshop.ca/en-CA/product/philips-philips-40-1080p-240hz-lcd-hdtv-40pfl5505-40pfl5505/10145790.aspx?path=e99bbfcfd03961e7d24900aff6bbb9ceen02

From what I have read the Philips uses a Samsung panel but that is only stated on a forum so who knows?

You need to watch out for the Philips sets going to the North American market, because I believe that they are rebadged sets from Funai (which sells off-brand sets under several different names). These sets are separate from the ones Philips sells elsewhere in the world.

Samsung is the largest OEM panel producer in the world, so they supply a lot of different panels for different markets. Just because Samsung makes the LCD panel does not mean that they have any input into other aspects of the TV design.

Sony's relationship with Samsung is a joint venture -- meaning that many of their sets are produced in the same factory. And the panels that Samsung makes are made to Sony's specs.

pinto79
06-29-2010, 02:33 PM
I don't think I have ever used 720p. I have 2 Sharp Aquos 32" LCDs that I use on my BD- and HD DVD-capable computer, and a 46" 120hz Aquos in the living room.

My cable box puts out 1080i for HD and has a 480p mode for non-HD content.

It seems to me that everything on the screen would look bigger than it would at 1080, considering the lower resolution... Am I wrong?

Sir Terrence the Terrible
06-29-2010, 02:51 PM
I don't think I have ever used 720p. I have 2 Sharp Aquos 32" LCDs that I use on my BD- and HD DVD-capable computer, and a 46" 120hz Aquos in the living room.

My cable box puts out 1080i for HD and has a 480p mode for non-HD content.

It seems to me that everything on the screen would look bigger than it would at 1080, considering the lower resolution... Am I wrong?

No, that is only the case with computers. 1080 looks more like film, and 480p looks more like video (I haven't seen many DVDs that truly look like film).

bobsticks
06-29-2010, 02:55 PM
It seems to me that everything on the screen would look bigger than it would at 1080, considering the lower resolution... Am I wrong?

Yes, you're wrong. Resolution ratings (a can-of-worms topic in its own right) are based on micro detail, not the scale of the figures/items on the screen. That scale could be altered by modifying the "Wide Mode", "Full Mode", "Vertical Center" or other similar category on your machine's menu.

Don't worry about it though, if'n ya never ask a question ya never learn. :smile5:

Amidst the sound and fury of this bravado-infected thread there have been a few facts shared. Let me summarize:

1) 720p(rogressive) can be utilized on fast/action-based programming for smoother motion sequences. Your monitor, depending on the model, may give you the option to change between the two. Experiment for yourself, and remember that results will often vary among televisions.

2) Every set will intuitively rescale the programming to its native resolution. The issue becomes whether the source material is best output by the source mechanism (Sat/Receiver/DVP/BRP) or whether the monitor's chips will supersede it.

pinto79
06-29-2010, 08:51 PM
I just tried it on the PC... To select 720p, the resolution changes to 1280 x 720... doesn't fit the screen properly.

That's probably the video card doing that.

Next time I watch a movie, I'll give 720 a try on the 46"...

pixelthis
06-30-2010, 10:42 AM
I just tried it on the PC... To select 720p, the resolution changes to 1280 x 720... doesn't fit the screen properly.

That's probably the video card doing that.

Next time I watch a movie, I'll give 720 a try on the 46"...

It hardly matters.
The native resolution of your set is what you'll see; in the case of your 46", you will always see 1080p, regardless.
1080i will deinterlace to 1080p, and will look best on your set.
You probably have the option on your cable box for 480i material: 480p, thru, and stretch (upconvert).
480i is usually the best, because it gives you aspect control on your TV.
Finally, if your computer can handle 1920x1080, choose that; can't understand why anyone would choose anything else, except to get bigger text, etc.
Some older stuff runs better at 600x400.:1:
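
The "everything ends up at native resolution" point boils down to a pair of scale factors. A crude sketch (hypothetical Python; real scalers also handle aspect ratio and overscan):

    # Scale factors a 1080p panel applies to common source formats.
    NATIVE_W, NATIVE_H = 1920, 1080
    SOURCES = {"480 DVD": (720, 480), "720p": (1280, 720),
               "1080i/p": (1920, 1080)}

    for name, (w, h) in SOURCES.items():
        print(f"{name:8s} x{NATIVE_W / w:.2f} horizontal, "
              f"x{NATIVE_H / h:.2f} vertical")
    # 480-line material gets stretched 2.25x vertically, which is why
    # where and how well that scaling happens matters so much.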

pinto79
07-01-2010, 08:42 PM
It hardly matters.
The native resolution of your set is what you'll see; in the case of your 46", you will always see 1080p, regardless.
1080i will deinterlace to 1080p, and will look best on your set.
You probably have the option on your cable box for 480i material: 480p, thru, and stretch (upconvert).
480i is usually the best, because it gives you aspect control on your TV.
Finally, if your computer can handle 1920x1080, choose that; can't understand why anyone would choose anything else, except to get bigger text, etc.
Some older stuff runs better at 600x400.:1:

The computer actually handles 1920x1080 beautifully on both of my screens. I love running 2 monitors. I don't know how anyone operates with a single.

I'll try switching the cable box back to 480i for non-HD content and see if it looks any better.

RGA
07-03-2010, 01:00 PM
I have found more info on the Philips set. What intrigued me was the 90,000:1 contrast, 1ms response and 240hz. I went and saw it at Future Shop and it looks as good as if not better than all the other sets in the store - trouble is, once things are split it's really not necessarily a fair comparison. Dynex, which is an off brand, looked better than usual too.

The poster MavrickRohan works for the company making the Philips sets and has tested them. Interesting commentary about the different models. http://www.avsforum.com/avs-vb/showthread.php?t=1256402

"Philips in USA & Canada is the same. It is run by P&F USA, which is a new company mostly consisting of ex Philips employees which is a subsidiary of Funai."


The 40 inch is $849. It's a while off for me, but I'm keeping it on the radar.

Woochifer
07-03-2010, 10:34 PM
I have found more info on the Philips set. What intrigued me was the 90,000:1 contrast, 1ms response and 240hz. I went and saw it at Future Shop and it looks as good as if not better than all the other sets in the store - trouble is, once things are split it's really not necessarily a fair comparison. Dynex, which is an off brand, looked better than usual too.

Problem with comparisons at stores is that the demo units are in all likelihood uncalibrated and either using the default settings or "showroom" levels designed to compensate for the extreme flood-lit conditions in a big box store. Either way, they're not comparable to each other, or comparable to what you'll see at home.

What you should do, if you haven't already done so, is calibrate your Sony using a calibration DVD. That will familiarize you with what the reference color balance, brightness, sharpness, and contrast should look like. A side benefit is that the calibration will improve the picture quality of your TV over the factory defaults.

The reason for doing this is that every TV uses a different default setting, even models from the same manufacturer. Unless the demo units in a showroom are calibrated (in my experience, only higher end stores with dedicated demo rooms will do this), you're not really doing true A-B comparisons. This is the equivalent of trying to compare speakers with all of the amps using different EQ settings -- it's just not comparable. Doing a somewhat comparable demo will require that you do some tweaking with the settings so that you're not merely looking at which TVs give you the best factory settings.

With my Panny, the factory settings look awful. But with the settings in Cinema mode (this is usually the preset that's closest to reference levels), and dialing down the brightness and sharpness, the set looks quite good. Because I had calibrated my previous TV, I knew what a reference level is supposed to look like. With any store demo, I try to reset the TV so that it's as close a match as possible. With the levels as comparable as possible, THEN you can look for qualitative differences in picture quality.

The other thing to watch out for is the overly bright settings used in showroom demos. The high-level torch mode is far brighter than you'll likely want to use at home. It's fine for standing out in a crowd (which is why so many factory defaults purposely use a high brightness and lamp level), but that's like a speaker with the treble cranked way up. It might sound good in a crowded noisy warehouse, but be intolerable under normal home conditions.


The poster MavrickRohan works for the company making the Philips sets and has tested them. Interesting commentary about the different models. http://www.avsforum.com/avs-vb/showthread.php?t=1256402

"Philips in USA & Canada is the same. It is run by P&F USA, which is a new company mostly consisting of ex Philips employees which is a subsidiary of Funai."


The 40 inch is $849. It's a while off for me, but I'm keeping it on the radar.

Yep, but that still makes the Philips sets in North America different from what they sell elsewhere in the world. Funai is a decent off-brand manufacturer, but they also make HDTVs for a wide range of brands such as Sylvania and Sansui.

RGA
07-04-2010, 12:38 AM
I suppose my question with Philips, or for that matter the off brands, is: just because it's a name brand, does that make it a superior product?

It's not like Sony or Samsung is any great shakes when it comes to supreme quality in other facets of their products. I get that they are said to be in the leading category for televisions but at the same time I wonder where the objective evidence would be for this.

We grow to expect certain brands to be quality and certain brands to be lesser, and it becomes as comfortable as old shoes, even when they're all using "common" parts made in the same Chinese plants. As Xotic PC noted, people may buy an HP or Dell or Apple laptop, but there is not one piece of the machine, or screw, in the entire machine that is actually made by HP etc. They get screens from one maker and motherboards from Asus or what have you, but the logo on the front is just that - a logo. If Sony doesn't make a single part of their laptops, I am not quick to assume that their televisions are automatically better than the no-name brand, since the no-name brand may be the identical television for half the price. Even the engines may be better, but that could be let down by poor refresh rates or some other issue. One laptop had a terrific video card, but the machine was too slow to take advantage of it. They got to advertise how great the video card was, but you needed to be cautious.

Here is an example of what I am concerned over.
http://www.xoticpc.com/really-makes-laptops-industries-best-kept-secret-a-47.html

I remember when I bought my Pioneer LaserDisc player - there was an exact replica under the Hitachi name badge for $200 more, even though both were made by Pioneer. Indeed, Pioneer had two machines that were identical, but one had the rosewood panels and piano black finish and the "Elite" name - exact same machines however.

This is why I sometimes wonder about names like Insignia and Dynex sold at the big box chains for a couple hundred dollars less.

Alternatively, if one TV is 240hz, has a very fast response time and higher contrast etc., then it should be better than the 60hz, 8ms, lower-contrast TV from a bigger name, especially if they're both using the same screens.

In other words - the big household name brand can get away with upping the price because there is a big name logo on the front - while a TV which may be just as good or even better may be selling for less because the name appeal isn't there yet.

I think this makes it difficult for the average person to go into stores to buy a television, because he has to get permission to recalibrate the televisions - have them set up side by side. I don't know if that is at all practical. The smaller stores might, but they can't compete remotely on price or warranty.

Like I said I am no videophile. I never bought LaserDisc for the picture quality - I bought it for Wide Screen. Having said that if I buy another TV for a given price I'd like to get the best one I can - but I am not totally convinced by just "going with the name brand".

Sir Terrence the Terrible
07-04-2010, 12:37 PM
I suppose my question with Philips, or for that matter the off brands, is: just because it's a name brand, does that make it a superior product?

The short answer is no


It's not like Sony or Samsung is any great shakes when it comes to supreme quality in other facets of their products. I get that they are said to be in the leading category for televisions but at the same time I wonder where the objective evidence would be for this.

Almost every review comes with measurements of certain necessary parameters. How well the panels do on that testing is how they are rated.

For instance, I get my testing information from Displaydata. Displaydata takes all of the individually tested sets from various testing labs, and combines them together into one large report. This has been done on every new model release since the year 2000. Color accuracy, native contrast along with mechanically enhanced contrast, black levels, grey scale accuracy, and the newest test, panel response time. Using standard testing procedures that the industry has adopted, all of the scores are compiled, and a rating and placement are given. There is no subjective evaluation, so no bias. The same testing methods and equipment are used for all sets, so no biasing towards a specific technology.

The Digital Testing Center in Hollywood is probably the most thorough testing center in America. They test hundreds of panels, pull them apart, and, through evaluating the individual parts, the processing, and how they work together, they can tell if a set has been designed from the ground up to work with all of its parts, or whether it was just thrown together with all the parts it takes to make a set, without any consideration of how they work together.


We grow to expect certain brands to be quality and certain brands to be lesser, and it becomes as comfortable as old shoes, even when they're all using "common" parts made in the same Chinese plants. As Xotic PC noted, people may buy an HP or Dell or Apple laptop, but there is not one piece of the machine, or screw, in the entire machine that is actually made by HP etc. They get screens from one maker and motherboards from Asus or what have you, but the logo on the front is just that - a logo. If Sony doesn't make a single part of their laptops, I am not quick to assume that their televisions are automatically better than the no-name brand, since the no-name brand may be the identical television for half the price. Even the engines may be better, but that could be let down by poor refresh rates or some other issue. One laptop had a terrific video card, but the machine was too slow to take advantage of it. They got to advertise how great the video card was, but you needed to be cautious.

Here is an example of what I am concerned over.
http://www.xoticpc.com/really-makes-laptops-industries-best-kept-secret-a-47.html

The same situation that happens in the computer industry happens with televisions. The panel is sourced from one OEM, the individual parts from another. What separates Sony, Panasonic, Pioneer (when they were in the business), Samsung and the other first-tier manufacturers from Vizio, Insignia, Coby, and some of the other third-tier manufacturers is that the first tier's panels are designed in house, use panels and parts designed to work together, and keep the same parts from the introduction of the product line till it is discontinued. They design their own processing algorithms to work specifically with the panel and the parts that make it up. The third tier's panels are not designed in house, do not have consistent suppliers or parts throughout their product lines, and don't have panels and parts designed to work together. They don't even develop their own processing; it's off-the-shelf stuff. Sony sources its panels from Samsung, hence why they have the best performing LCD panels. After the panel, Sony is number one with LED backlit and edgelit panels, consistently scoring the highest with these two product categories across their entire line. Panasonic is the number one plasma in terms of testing now that Pioneer has left the market.

You may source your panel from the same OEM, but what you do with it after that is what distinguishes you from your competition.



I remember when I bought my Pioneer LaserDisc player - there was an exact replica under the Hitachi name badge for $200 more, even though both were made by Pioneer. Indeed, Pioneer had two machines that were identical, but one had the rosewood panels and piano black finish and the "Elite" name - exact same machines however.

Pioneer made every LaserDisc player ever sold, even if their brand name was not on it. That is a fact. Once again, it is what you do after the basic player has been assembled that distinguishes it. Better lasers, better D/A conversion, drive stabilizing, better analog and digital circuitry, advanced hookups (coaxial), and audio format support all distinguish a Hitachi from a high-end Pioneer.

While casually scrutinizing the two players they may have seemed identical, but look under the hood. Pioneer Elite players always had better internal parts, better D/A converters and analog stages, advanced drive stabilization, advanced hookups, and advanced audio format support. They were the first to support the AC-3 and DTS Coherent Acoustics audio formats. These are not always things that are recognized with casual viewing. The Elites always tested better than non-Elite models.


This is why I sometimes wonder about names like Insignia and Dynex sold at the big box chains for a couple hundred dollars less. Alternatively, if one TV is 240hz, has a very fast response time and higher contrast etc., then it should be better than the 60hz, 8ms, lower-contrast TV from a bigger name, especially if they're both using the same screens.

Just to make some things clear, 240hz is the refresh rate. Secondly, it all depends on the quality of the panel itself. If the 240hz set has better measurements, then yes, your statement is true. However, this year's 240hz LED-lit top-of-the-line panel from Vizio does not outperform last year's 120hz edge-lit Sony and Samsung mid-tier panels. So specs listed by the manufacturers rarely tell the whole story till closer scrutiny and testing occur.


In other words - the big household name brand can get away with upping the price because there is a big name logo on the front - while a TV which may be just as good or even better may be selling for less because the name appeal isn't there yet.

Not if they don't test better. I have been getting Displaydata testing results since they first offered them, and I have never seen a third-tier manufacturer's panels test better than any first tier's. Being a first-tier manufacturer does not come with the name at all. It comes from consistent quality, best performance, and best features implemented in the best way. Sony, Panasonic and Samsung didn't get there because of their names (Samsung has not always been synonymous with quality), they got there because of their superior products.


I think this is difficult for the average person to go into stores to buy a television because he has to get permission to recalibrate the televisions - have them set them up side by side. I don't know if that is at all practical. The smaller stores might but they can't compete remotely on price or warranty.

You have some work to do before you enter the store. Look at reviews from reputable rags like Widescreen Review and Sound and Vision, and hit online sources such as HDguru.com and UltimateAV.com, which publish testing measurements. Then you hit the stores and decide from the sets that test well and fit your budget and space requirements. A lot of times the store already has certain brands side by side, and some don't mind you adjusting the sets to give you a better idea of how good they really look.


Like I said I am no videophile. I never bought LaserDisc for the picture quality - I bought it for Wide Screen. Having said that if I buy another TV for a given price I'd like to get the best one I can - but I am not totally convinced by just "going with the name brand".

This is where you have some things quite twisted. The name brands have the R&D money to produce the best performing panels. That is a fact. They also have the repair and customer service infrastructure that the non-name brands don't have. The name brands produce televisions that are better constructed, have better processing that is better implemented, better features, and panels with better performance. In ten years of published testing, I have never seen a non-name brand set outperform a set from the major brands. Even the bottom-tier sets from the majors can equal, or do better than, the higher-tiered products from the non-name brands (this is especially true for Sony and Samsung).

A name can only take you so far; then you have to actually produce high quality products that earn that name. You will not be doing yourself a favor by choosing anything from the third-tier manufacturers, unless the application you will be using the set for is not critical viewing. Considering the overall price erosion from last year, you can get a mid-tier panel from a major for nearly the same price as a similar-screen-size Vizio at some of the big box stores.

RGA
07-04-2010, 04:21 PM
Thanks for the information regarding the televisions. It's probably just easier to go with Sony/Samsung. They make less expensive ones than the Philips model. They're 8ms and 60hz, but I can't find a site that sits a bunch of the models down and compares them. The Euro magazines do it with amps and speakers, and probably TVs, but the model numbers are different so it doesn't help much.

pixelthis
07-05-2010, 11:10 AM
Thanks for the information regarding the televisions. It's probably just easier to go with Sony/Samsung. They make less expensive ones than the Philips model. They're 8ms and 60hz, but I can't find a site that sits a bunch of the models down and compares them. The Euro magazines do it with amps and speakers, and probably TVs, but the model numbers are different so it doesn't help much.

Sony and Samsung are basically the same thing (the same panels, anyway).
With 240hz machines, remember, all TV signals are 60hz; the TV makes up the rest, so with a 240hz set, 75% of the pic is interpolated, or fake.
120hz looks great, but I THINK WITH 240HZ YOU ARE ENTERING THE REALM of diminishing returns.
ALSO, there was a rumour that Philips sets were FUNAI, but every Philips set I have seen has been excellent, and a friend, who bought a 42" Philips for his bedroom, likes the pic better than the SAMSUNG he has in his HT.
SO IF THEY are Funai, then FUNAI MAKES SOME GREAT STUFF.:1:

Sir Terrence the Terrible
07-05-2010, 01:48 PM
Thanks for the information regarding the televisions. It's probably just easier to go with Sony/Samsung. They make less expensive ones than the Philips model. They're 8ms and 60hz, but I can't find a site that sits a bunch of the models down and compares them. The Euro magazines do it with amps and speakers, and probably TVs, but the model numbers are different so it doesn't help much.

Richard,
If you are sensitive to judder like I am, get a 120hz television. 3:2 pulldown does not eliminate all the judder that you get when 24fps film gets encoded at 60hz.

This crap about 240hz being fake is nonsense. My Sony XBRpro can do 60hz, 120hz, 240hz, and 480hz. Each of these settings smooths the picture into a more film-like presentation. This is not the same thing as ClearMotion, TruMotion, or any of the other judder processing. 60hz, 120hz, and 240hz are refresh rate speeds, and describe how many times the screen updates itself in a second. This is not the same as interpolating frames designed to remove judder. Refresh rates don't add anything to the picture that is not already there.

Joe Kane puts it nicely:

“Refresh rates of incoming video signals are anywhere from about 24 pictures per second to 60 pictures per second. When displayed at their native rates on LCD displays you’ll often see lag in the motion. It looks as if detail is being smeared. This is because images from prior frames are still present as later frames come along. If the refresh rate is increased and frames of black added, you can often force the old image to go away before the new image comes along."

This is theory, of course, as the problem with image smearing (or blur) is not a refresh rate issue; it is a pixel issue. You can refresh a million times a second, but if the pixels cannot update themselves that fast, blurring will still occur. This is why I recommend panels from the first-tier manufacturers, as they have put in the R&D to improve their panels in the response time area, whereas a third-tier player has no control over that parameter, because they are just buying panels on the open market.
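
Sketched crudely, that pixel-versus-refresh point looks like this. The model treats a pixel transition as an exponential settle; the response times and the tau = response/3 assumption are invented for illustration, not measured from any real panel:

    import math

    # Fraction of a transition a pixel completes within one frame,
    # modeling the settle as exponential with tau = response_ms / 3
    # (so the quoted response time is roughly the ~95% point).
    def settled_fraction(response_ms, refresh_hz):
        frame_ms = 1000.0 / refresh_hz
        tau = response_ms / 3.0
        return 1 - math.exp(-frame_ms / tau)

    for hz in (60, 120, 240):
        for resp_ms in (8, 2):            # slow panel vs. fast panel
            f = settled_fraction(resp_ms, hz)
            print(f"{hz:3d} Hz, {resp_ms} ms panel: {f:6.1%} settled/frame")
    # An 8 ms panel at 240 Hz (4.2 ms frames) doesn't finish its
    # transition in time -- refreshing faster can't fix slow pixels.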

It always helps to make decisions when seeing sets side by side. Our visual memory only lasts seconds, so any comparison made beyond that time is completely unreliable.

And never trust anecdotal information from someone who does not believe in calibrating a set, or in watching it in optimal conditions. And never take advice from someone who does not know the difference between judder-prevention algorithms and refresh rates.
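
For what it's worth, the arithmetic the last two posts are arguing over separates cleanly: interpolation synthesizes new frames, plain frame repetition does not. A hypothetical sketch:

    # Fraction of displayed frames that are synthesized rather than
    # taken from the source -- zero whenever interpolation is off.
    def interpolated_fraction(source_fps, refresh_hz, interpolate):
        if not interpolate:
            return 0.0        # frame repeat shows only real frames
        return 1 - source_fps / refresh_hz

    print(interpolated_fraction(60, 240, True))   # 0.75, the "75% fake" figure
    print(interpolated_fraction(24, 120, True))   # 0.8 with film sources
    print(interpolated_fraction(60, 240, False))  # 0.0 -- repeats, not fakes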

RGA
07-05-2010, 09:02 PM
Thanks again for the input. So I guess now I want to try and figure out if Philips is a first-tier maker. The European models are supposed to be top-rate, and "certain" models in the West are as well.

The 5 series is the one I posted about. Here is what that guy who works for Philips has said:

I have tested the 5505 (Pre Prod) since November 2009 ... and also did the FMF (Fast Market Feedback) test on that model ... If you don't care about internet on the TV, the 5505 has really good picture quality. Mainly because of 1D Dimming and the Scanning Backlight which makes it a 240Hz class panel. There were many changes made to improve the PQ during FMF, it also has decent speakers.

It has a static contrast ratio of 7000:1 & 10 bit (8 bit + 2 bit dithering) Color Support.

Also, FYI ... all Philips 2010 5 & 7 series TVs have Samsung panels, not LG. 2009 6 & 7 series had LG panels.

AND

The Samsung & Philips are almost identical in terms of PQ performance, its upto you really.

AND

"Digital Natural Motion" i.e. TV interpolates video frames vs. doing a simple frame repeat to achieve the panels 120Hz refresh rate is something you eventually get used to.

Film content used to be shot at 24fps simply because of technological limitations back in the day. I wish studios would start shooting movies at a higher frame rate, simply because real 60 fps looks way better than 24 interpolated to 120.

You will get used to DNM, and once you do, you'll start hating the 24fps judder (which I have no clue why people still like) ...

AND

Well, I work for P&F USA, so I have access to the entire line-up in the lab so I have tested all of them. I also personally own the 5505.

In terms of pure picture quality and black levels, the 5000 series (120Hz panel + scanning backlight = 240Hz class & 1D local dimming) beats the 7 series (120Hz panel with edge-lit LED) hands down.

The only reason why the 7000 series is more expensive is because it is an LED panel, which also means the boards used inside the TVs are more expensive to manufacture, and as of now all LED-LCDs cost more than CCFL-LCDs.



The problem for me is that I can't see any rating sites that make direct comparisons. When I look at the specs, the Philips seems to beat the hell out of the other models at similar prices at Future Shop, and they seem to have their own uniquely designed systems, but I have no clue, because my learning curve on this is so far non-existent. If the specs don't matter and it boils down to design that is not quantifiable in numbers, then it makes it very difficult.

It's not like the Philips is cheaper - it's actually $150 more than some of the same-sized Sony and Samsung models. The screen is first-tier Samsung, so it seems to come down to the quality of the picture, but without some sort of fair comparison all that is left is the specs, and they seem to be considerably better.

1ms response time - lower is better, no? That suggests that motion would be considerably better, and just from looking at it in store it is a lot better than my Sony Bravia 120hz 8ms model. But Woochifer noted that there is not an equal calibration - maybe mine is off and the Philips is spot on out of the box?

Then there are the Aquos sets that people seem to like, and the Panasonic supporters.

I am not opposed to buying Korean - I just traded my Toyota Corolla, which was a breakdown machine, for a Kia. I love my LG phone and I lived in Korea for 2 years - so I'm quite happy to own a Samsung - but the Philips is using the same panel - and so does my Sony, I believe.

Sir Terrence the Terrible
07-06-2010, 10:04 AM
Thanks again for the input. So I guess now I want to try and figure out if Philips is a first-tier maker. The European models are supposed to be top-rate, and "certain" models in the West are as well.

The 5 series is the one I posted about. Here is what that guy who works for Philips has said:

I have tested the 5505 (Pre Prod) since November 2009 ... and also did the FMF (Fast Market Feedback) test on that model ... If you don't care about internet on the TV, the 5505 has really good picture quality. Mainly because of 1D Dimming and the Scanning Backlight which makes it a 240Hz class panel. There were many changes made to improve the PQ during FMF, it also has decent speakers.

It has a static contrast ratio of 7000:1 & 10 bit (8 bit + 2 bit dithering) Color Support.

Also, FYI ... all Philips 2010 5 & 7 series TVs have Samsung panels, not LG. 2009 6 & 7 series had LG panels.

AND

The Samsung & Philips are almost identical in terms of PQ performance, its upto you really.

AND

"Digital Natural Motion" i.e. TV interpolates video frames vs. doing a simple frame repeat to achieve the panels 120Hz refresh rate is something you eventually get used to.

Film content used to be shot at 24fps simply because of technological limitations back in the day. I wish studios would start shooting movies at a higher frame rate, simply because real 60 fps looks way better than 24 interpolated to 120.

You will get used to DNM, and once you do, you'll start hating the 24fps judder (which I have no clue why people still like) ...

AND

Well, I work for P&F USA, so I have access to the entire line-up in the lab so I have tested all of them. I also personally own the 5505.

In terms of pure picture quality and black levels, the 5000 series (120Hz panel + scanning backlight = 240Hz class & 1D local dimming) beats the 7 series (120Hz panel with edge-lit LED) hands down.

The only reason why the 7000 series is more expensive is because it is an LED panel, which also means the boards used inside the TVs are more expensive to manufacture, and as of now all LED-LCDs cost more than CCFL-LCDs.



The problem for me is that I can't see any rating sites that make direct comparisons. When I look at the specs, the Philips seems to beat the hell out of the other models at similar prices at Future Shop, and they seem to have their own uniquely designed systems, but I have no clue, because my learning curve on this is so far non-existent. If the specs don't matter and it boils down to design that is not quantifiable in numbers, then it makes it very difficult.

It's not like the Philips is cheaper - it's actually $150 more than some of the same-sized Sony and Samsung models. The screen is first-tier Samsung, so it seems to come down to the quality of the picture, but without some sort of fair comparison all that is left is the specs, and they seem to be considerably better.

1ms response time - lower is better, no? That suggests that motion would be considerably better, and just from looking at it in store it is a lot better than my Sony Bravia 120hz 8ms model. But Woochifer noted that there is not an equal calibration - maybe mine is off and the Philips is spot on out of the box?

Then there are the Aquos sets that people seem to like, and the Panasonic supporters.

I am not opposed to buying Korean - I just traded my Toyota Corolla, which was a breakdown machine, for a Kia. I love my LG phone and I lived in Korea for 2 years - so I'm quite happy to own a Samsung - but the Philips is using the same panel - and so does my Sony, I believe.

Pay absolutely no attention to manufacturers' response times for their panels. They are using the results of half of the testing (the easiest transition for a pixel to accomplish) to arrive at their response time figures. They are using the off/on measurement instead of the less marketable and less flattering off/on/off measurement. There are no panels in existence (not even my high-priced Sony XBRpro model) that can achieve a 1ms panel response time.
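
A toy illustration of the spec game being described. The rise/fall numbers are invented for the example; the point is only that the quoted figure depends on which half of the transition you time:

    # A pixel that (hypothetically) rises fast but falls slowly.
    rise_ms, fall_ms = 1.0, 7.0

    on_spec    = rise_ms                  # time only black->white: "1 ms!"
    round_trip = rise_ms + fall_ms        # off/on/off: 8 ms
    average    = (rise_ms + fall_ms) / 2  # nearer what motion looks like

    print(f"marketing spec:  {on_spec} ms")
    print(f"off/on/off:      {round_trip} ms")
    print(f"averaged:        {average} ms")
    # The same panel can be sold as "1 ms" or measured at 8 ms,
    # depending on which transitions (and gray levels) you time.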

RGA
07-06-2010, 11:36 AM
Sounds to me like Philips is being rather dishonest, using numbers as a marketing tool. I checked CNET and some other sites, and I think I am probably better off with a 120hz Samsung. They and Sony have slimmer televisions out as well, and the prices are not that far off.

Thanks.