View Full Version : Blu-Ray Players
What Blu-Ray player in the $250.00 to $300.00 range is a good bang for the buck as far as picture and sound quality, and of course ease of use?
Bob
pixelthis
04-05-2011, 01:55 PM
What Blu-Ray player in the $250.00 to $300.00 range is a good bang for the buck as far as picture and sound quality, and of course ease of use?
Bob
I have a thread going on a Funai player I purchased from WALMART for $68.
IF you want to pay through the nose and get a bunch of "features" of questionable value,
get something like an OPPO.
Or go to Walmart and get a Magnavox (Funai) for 68 bucks. Or splurge and get an
LG for $80, or a Sony for $119, etc.
My first player was $400, a Sony that was slow as CHRISTMAS and out of date in
six months. My second was a SHARP, died in less than a year. My fault, shoulda known better.
I love my new $68 player: it looks and sounds great, and if it dies, I trashcan it and get one
up to date. I HAVE SPENT ENOUGH ON BLU PLAYERS.
The day of the "250 to 300" BLU player is over.
Been over.:1:
Mr Peabody
04-05-2011, 04:45 PM
I haven't really looked deeply into BDPs for a while. I'd start by deciding what features you need/want as opposed to those you'd never use. Like: are you interested in 3D, do you want WiFi capability, do you want streaming features to receive Netflix or other internet movie services, do you need it to play music formats (CD/SACD), etc.? If you even think you'd want to play any games, the PS3 is still a main contender. Oppo is always mentioned as a good value. I'm not sure if either is in your price range. I still have two 2nd-gen Samsungs that are going strong as far as reliability. I'm not sure what price Marantz starts at, but I am happy with mine and it's easy to use; it even automatically loads the best audio track. I feel Panasonic or Denon would also be good choices. Since I have a decent-size DVD collection, upscaling video is an important feature for me, as I don't care to replace all my titles with BD. WiFi is also a nice feature for receiving firmware updates.
Although prices have dropped, I still believe quality parts and chips cost money, and generally you get what you pay for. I really need to read some current reviews to see how these cheap players fare in video performance. I know my daughter's Samsung 1500 didn't have the picture quality of my older 1200. It's like with DVD: you could buy one for $40.00, but the PQ was not up to par with a $250.00 player. Of course, with BDP-level performance, maybe even poor is still very good.
02audionoob
04-05-2011, 05:45 PM
The LG BD570 is nice, easy to use and is about as fast as average. I haven't seen any that are notably faster. It plays the music formats and streams YouTube, Netflix, etc. They look pretty and have a good remote. I think I like the remote on my Sony better, but it's a close race.
bfalls
04-05-2011, 07:19 PM
The Sony BDP-S570 is a very good general-purpose BD player and within your price range. It has built-in WiFi with a number of movie and music sites available. It's 3D capable with the firmware upgrade. It has very good picture and sound. I've read several articles praising it. There's a newer model, but I don't know much about it (BDP-S580). Being that the S570 is an older model, it should be discounted nicely. I've had one for about 6 months and really enjoy it.
recoveryone
04-06-2011, 06:54 AM
This list could become endless with suggestions, but you need to know some facts about how to get the most out of a BDP regardless of the make or model.
1. Do you have HDMI? (for max resolution and PQ)
2. Does your AVR support or pass through the newer audio formats, Dolby TrueHD and DTS-HD (7.1)?
3. Does your TV support 24fps (film standard)?
All the rest is bells and whistles: WiFi, Netflix, Vudu, YouTube support. If you do not have the basic supporting gear to get the most out of it, you will always wonder "what if."
Woochifer
04-15-2011, 11:14 AM
This list could become endless with suggestions, but you need to know some facts about how to get the most out of a BDP regardless of the make or model.
1. Do you have HDMI? (for max resolution and PQ)
2. Does your AVR support or pass through the newer audio formats, Dolby TrueHD and DTS-HD (7.1)?
3. Does your TV support 24fps (film standard)?
All the rest is bells and whistles: WiFi, Netflix, Vudu, YouTube support. If you do not have the basic supporting gear to get the most out of it, you will always wonder "what if."
Very very true!
For anyone without an HDMI connection on their HDTV, they'd better decide which Blu-ray player they want ... and fast! The first step of the analog video phaseout went into motion at the start of this year. Any Blu-ray players introduced this year will have NO analog component video outputs, so anyone looking for something with component video outputs will have to look for an older model.
http://forums.audioreview.com/showthread.php?t=36227
As for the other stuff, Netflix is rapidly becoming standard issue on Blu-ray players and those functions might finally gain some traction for internet TV services in the living room (up to this point, they've had relatively limited usage because they were largely confined to PCs).
pixelthis
04-15-2011, 11:53 AM
Very very true!
For anyone without an HDMI connection on their HDTV, they'd better decide which Blu-ray player they want ... and fast! The first step of the analog video phaseout went into motion at the start of this year. Any Blu-ray players introduced this year will have NO analog component video outputs, so anyone looking for something with component video outputs will have to look for an older model.
http://forums.audioreview.com/showthread.php?t=36227
As for the other stuff, Netflix is rapidly becoming standard issue on Blu-ray players and those functions might finally gain some traction for internet TV services in the living room (up to this point, they've had relatively limited usage because they were largely confined to PCs).
Cart before the horse.
I HAD a receiver that was obsolete about six months or so after I bought it (maybe a little
longer), because it had component video switching.
I COULDN'T use that connection after seeing just how silly good HDMI is.
You need a monitor with an HDMI connection, not only because it's the best connection, but
because any monitor that has all the new up-to-date stuff will have HDMI.
If your set only has component, it is missing stuff like 1080p, 120Hz, etc.
BUY AN OLDER BLU and you wind up with an obsolete (well, not up to date) BLU
player. This gives you two devices that need updating.
WHEN YOU get the BLU, get a new monitor with HDMI with it. COMPONENT is so
inferior to HDMI I wouldn't even bother hooking a BLU up with it.
Solve the real problem, update your monitor.:1:
Woochifer
04-15-2011, 05:40 PM
Cart before the horse.
I HAD a receiver that was obsolete about six months or so after I bought it (maybe a little
longer), because it had component video switching.
I COULDN'T use that connection after seeing just how silly good HDMI is.
You need a monitor with an HDMI connection, not only because it's the best connection, but
because any monitor that has all the new up-to-date stuff will have HDMI.
If your set only has component, it is missing stuff like 1080p, 120Hz, etc.
BUY AN OLDER BLU and you wind up with an obsolete (well, not up to date) BLU
player. This gives you two devices that need updating.
WHEN YOU get the BLU, get a new monitor with HDMI with it. COMPONENT is so
inferior to HDMI I wouldn't even bother hooking a BLU up with it.
Solve the real problem, update your monitor.:1:
If someone already owns an HDTV, and all they want is to add a Blu-ray player, why wouldn't they just buy a Blu-ray player? Buying a new TV doesn't address the original inquiry of which Blu-ray player they should buy, especially if their budget is under $300.
And even with HDMI connections, many of the older HDMI-enabled HDTVs have noticeably worse picture quality using the digital connection than with the component video connector.
pixelthis
04-17-2011, 11:27 AM
If someone already owns an HDTV, and all they want is to add a Blu-ray player, why wouldn't they just buy a Blu-ray player? Buying a new TV doesn't address the original inquiry of which Blu-ray player they should buy, especially if their budget is under $300.
And even with HDMI connections, many of the older HDMI-enabled HDTVs have noticeably worse picture quality using the digital connection than with the component video connector.
I have never seen an HDMI connection that's worse than component.
If someone already owns an HDTV set, why would they want a new one just to "add" a
BLU player? Maybe because a BLU player is just not effective with an inferior set.
WHY buy a PORSCHE for a dirt road?
Back when they had live video clerks instead of a Redbox, I would rent a BLU and
get the same comment, mainly that they had heard that BLU doesn't look any better than a DVD. And true, even tho there is an improvement with a 720p set and other inferior devices, it's not that much different than a decent DVD image.
It's only with a 1080p set that BLU really shines, and since those sets have, to the last one,
HDMI, it stands to reason that you only need that on a BLU player.
If your monitor only has component, then you need a new set anyway.
BUYING an older, and probably inferior, BLU player to match your older, inferior
monitor is throwing good money after bad.
I have an older set at the house, about five years old. And it has HDMI.
Using component is like the singer who has a great voice but "tears it up getting it out".
Get a new BLU player, and use the composite until you can afford a decent set
that doesn't belong in a MONGOLIAN VILLAGE. You will find that there's not much diff,
and you won't be compromising by getting a BLU player that's older and that you will be stuck
with for a while. COMPONENT is dead, involves two digital-to-analog conversions,
and you buy stuff to upgrade your system, not to accommodate obsolete tech, IMHO.
Swallow hard and join the 21st century.:1:
pixelthis
04-17-2011, 11:45 AM
Or perhaps your monitor does have HDMI, and you just want to use the older component
switching on an older receiver.
This would be a horrible decision. I had an excellent 1200-dollar receiver that had component
video switching. I USED a direct connection to my monitor, and a very good universal
remote with macros that made up for my receiver's shortcomings. All of this was better
than compromising my system with an inferior connection. I tossed a 1200-dollar receiver
after only four years (cry cry) so that I could get the new codecs and video
switching. WELL WORTH IT.
BLU is the best video delivery system on the planet, and deserves some accommodation.
A BLU image thru even a cheap 1080p set is more addictive than crack mixed with
crystal meth...silly good. PAY THE MAN.:1:
Mr Peabody
04-17-2011, 12:12 PM
Come on Pix, you are being a bit dramatic. Component is capable of 1080i, and with decent cables the connection is not far off from HDMI. It could be off with standard video, as most modern sets don't upscale via analog connections due to copy protection. Component is not as bad as you want to make it out to be though.
pixelthis
04-17-2011, 12:36 PM
Come on Pix, you are being a bit dramatic. Component is capable of 1080i, and with decent cables the connection is not far off from HDMI. It could be off with standard video, as most modern sets don't upscale via analog connections due to copy protection. Component is not as bad as you want to make it out to be though.
CAPABLE of 1080i? That's like saying a girl has a great personality.
I know that you have a 1080i DLP, and didn't mean to insult you, Mr P.
But the time has come for you to upgrade to 1080p, and not just for BLU.
I watched 2001 on ONDEMAND this morning in 1080i, which my set deinterlaced to
1080p. Deinterlacing gives a real improvement as opposed to upconversion,
progressive being so much better than interlaced. This doesn't mean 2001 was as good
as BLU, but darn, it was close. WATCH it on your set and you start out with half the
rez of a 1080p, and it's downhill from there, as you lose a great deal whenever there's movement, and that's after two D/A conversions. SORRY, but the industry has passed
interlaced by. TIME to start shopping.:1:
Mr Peabody
04-17-2011, 01:03 PM
I also have a 1080p Toshiba 40" and have seen 1080p; I still don't feel it's as large a difference as you make it. I do use HDMI for my 62".
pixelthis
04-18-2011, 10:03 AM
I also have a 1080p Toshiba 40" and have seen 1080p; I still don't feel it's as large a difference as you make it. I do use HDMI for my 62".
Well, most people do, which is why 1080i is dead. Try finding one. Technology has
passed interlaced formats by, and good riddance. They were always a con anyway.:1:
Woochifer
04-18-2011, 01:28 PM
I have never seen an HDMI connection that's worse than component.
This was the case with plenty of the early HDMI-enabled HDTVs. HDTV reviews used to run the tests using both connections and advise readers on which one provided the best picture quality, which was not always the HDMI connection.
If someone already owns an HDTV set, why would they want a new one just to "add" a
BLU player? Maybe because a BLU player is just not effective with an inferior set.
WHY buy a PORSCHE for a dirt road?
But, it's still more effective than sticking with DVD, right?
Again, if somebody has a set budget of $300, why would they want to upgrade the monitor before adding a Blu-ray player, if all they want to add is the Blu-ray capability?
If your monitor only has component, then you need a new set anyway.
BUYING an older, and probably inferior, BLU player to match your older, inferior
monitor is throwing good money after bad.
These so-called "older, and probably inferiour [sp]" Blu-ray players you're referring to are less than a year old and still being manufactured. Basically, anything introduced before the end of last year will still include component video. Are you saying that people should avoid the Oppo BDP-93 because it came out last year and includes component video outputs?
The difference in playback quality and disc compatibility between last year's models and this year's models is minimal. And that's been the case for at least the last two years. The primary changes are with the network connectivity features.
Get a new BLU player, and use the composite until you can afford a decent set
that doesn't belong in a MONGOLIAN VILLAGE. You will find that there's not much diff,
and you won't be compromising by getting a BLU player that's older and that you will be stuck
with for a while.
So, you're now claiming that there's "not much diff" between 480i (composite) and 1080p (component)? Nice advice :rolleyes:
Woochifer
04-18-2011, 01:42 PM
Come on Pix, you are being a bit dramatic.
You noticed that too, huh? :cool:
Component is capable of 1080i, and with decent cables the connection is not far off from HDMI. It could be off with standard video, as most modern sets don't upscale via analog connections due to copy protection. Component is not as bad as you want to make it out to be though.
Actually, component video is fully capable of carrying a 1080p signal, and in the early days of HDMI (before signal equalization and boosting switches became available), was preferred over HDMI for long cable runs.
Component video is well within the capabilities of existing HD video standards. It's getting phased out with Blu-ray players starting this year because of copy protection agreements with the studios.
Mr Peabody
04-18-2011, 04:49 PM
I have always heard component maxed out at 1080i, as in http://www.buzzle.com/articles/hdmi-vs-component.html but apparently there is now a digital version of component that can support 1080p: http://en.wikipedia.org/wiki/Component_video
I just did a quick search; if you have anything to show analog component does 1080p, I'd be interested in seeing it. Thanks for the clarification. At this point it looks like what component is capable of depends on whether it's digital or analog.
Woochifer
04-18-2011, 05:23 PM
I have always heard component maxed out at 1080i, as in http://www.buzzle.com/articles/hdmi-vs-component.html but apparently there is now a digital version of component that can support 1080p: http://en.wikipedia.org/wiki/Component_video
I just did a quick search; if you have anything to show analog component does 1080p, I'd be interested in seeing it. Thanks for the clarification. At this point it looks like what component is capable of depends on whether it's digital or analog.
1080p over analog component video is supported by several manufacturers, including Samsung.
Blue Jeans Cable also has a good explanation outlining some of the reasons (http://www.bluejeanscable.com/articles/dvihdmicomponent.htm) why you can't automatically presume that HDMI is always preferable. Keep in mind that BJC is generally very critical of HDMI because of its unreliable signal integrity over long distances and the poor basic design of the connector (that last point, I'm very much in agreement with).
The most important point is that with most sources, you're not looking at a pure unadulterated signal. At some point, the video signal is going to get rescaled, deinterlaced, or otherwise altered. And that will vary considerably between different HDTVs and HD devices. Even Blu-ray is a native 1080p24 format that needs either a 5:5 frame repeat (on a 120Hz display) or 2:3 pulldown applied just to display properly on a native 1080p60 display. In the early days of HDTV, more emphasis was given to the analog video path. That's why reviews of early HDMI-enabled HDTVs would point out instances where the component connection rated better than the HDMI connections.
Consequently, there are always conversions going on, and these conversions aren't always easy going. "Digital to digital" conversion is no more a guarantee of signal quality than "digital to analog," and in practice may be substantially worse. Whether it's better or worse will depend upon the circuitry involved--and that is something which isn't usually practical to figure out on paper. As a general rule, with consumer equipment, one simply doesn't know how signals are processed, and one doesn't know how that processing varies by input. Analog and digital inputs must either be scaled through separate circuits, or one must be converted to the other to use the same scaler. How is that done? In general, you won't find an answer to that anywhere in your instruction manual, and even if you did, it'd be hard to judge which is the better scaler without viewing the actual video output. It's fair to say, in general, that even in very high-end consumer gear, the quality of circuits for signal processing and scaling is quite variable.
So, which is better, HDMI or component? The answer--unsatisfying, perhaps, but true--is that it depends. It depends upon your source and display devices, and there's no good way, in principle, to say in advance whether the digital or the analog connection will render a better picture. You may even find, say, that your DVD player looks better through its HDMI output, while your satellite or cable box looks better through its component output, on the same display. In this case, there's no real substitute for simply plugging it in and giving it a try both ways.
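A quick illustration of the frame-rate arithmetic behind that 1080p24 point (a little Python sketch of my own, not anything from the Blue Jeans article): a 24fps source gets expanded by 2:3 pulldown to fill a 60Hz display, or by a 5:5 frame repeat to fill a 120Hz display evenly.

from itertools import cycle

def pulldown_23(film_frames):
    """Expand 24fps film frames to 60Hz with the 2:3 cadence:
    each frame is held for 2 refreshes, the next for 3, repeating."""
    out = []
    for frame, repeats in zip(film_frames, cycle([2, 3])):
        out.extend([frame] * repeats)
    return out

def frame_repeat_55(film_frames):
    """Expand 24fps film frames to 120Hz by holding each frame for 5
    refreshes; 120 is an even multiple of 24, so the cadence is uniform."""
    return [f for f in film_frames for _ in range(5)]

one_second = list(range(24))             # one second of film: frames 0..23
print(len(pulldown_23(one_second)))      # 60  -> fills a 60Hz display
print(len(frame_repeat_55(one_second)))  # 120 -> fills a 120Hz display

The uneven 2-3-2-3 hold pattern is what produces the slight judder people associate with film on 60Hz sets, which is why the uniform 5:5 repeat on a 120Hz display is considered the cleaner path.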
pixelthis
04-19-2011, 01:07 PM
This was the case with plenty of the early HDMI-enabled HDTVs. HDTV reviews used to run the tests using both connections and advise readers on which one provided the best picture quality, which was not always the HDMI connection.
ANCIENT HISTORY
But, it's still more effective than sticking with DVD, right?
Not by much.
YOU MIGHT AS WELL buy BLU, but that's just because BLU is down to 68 bucks.
But my point is that an older BLU player, one with component, is not going to be as up to date as a newer one. So when you do get around to upgrading your monitor, you will
need a new BLU player, also.
BUT the most important thing is that a monitor without HDMI will be old as the sticks.
I have a five-year-old set that has HDMI; you would have to get one a lot older to have a monitor without one. I doubt BLU would be worth the trouble, really.
Again, if somebody has a set budget of $300, why would they want to upgrade the monitor before adding a Blu-ray player, if all they want to add is the Blu-ray capability?
BECAUSE why bother putting a CORVETTE ENGINE in a '59 STUDEBAKER?
What possible good would BLU capability be with a set so old that it doesn't have HDMI?
With a budget of three hundred bucks, you are three hundred bucks away from a 42"
1080p and a BLU player. EVER hear of penny wise and pound foolish?
These so-called "older, and probably inferiour [sp]" Blu-ray players you're referring to are less than a year old and still being manufactured. Basically, anything introduced before the end of last year will still include component video. Are you saying that people should avoid the Oppo BDP-93 because it came out last year and includes component video outputs?
No, but read the instructions, and they will tell you that component is an inferior
way to connect your BLU player. That's because it involves two D/A conversions.
AND AGAIN, if the reason to go component is that your TV doesn't have HDMI, that is an old honkin TV. But you lose the argument over quality.
Doesn't matter if component can carry 1080p (it can), nobody is going to say it's as good as HDMI, because it isn't.
And a set that old is going to be 720p or 1080i, so it's a moot point that component can carry 1080p, because it won't be 1080p at the end of the day.
I got burned with a 1200-dollar receiver with component switching, because I A/B'd the
HDMI/COMPONENT out from a cable box...a cable box.
The increase in PQ was so blatant that I COULD NOT STAND to use the component switching on my new receiver, and that was a flippin cable box.
WHY BOTHER buying a compromised device to accommodate a device that needs
replacing anyway? MAKES NO SENSE.
The difference in playback quality and disc compatibility between last year's models and this year's models is minimal. And that's been the case for at least the last two years. The primary changes are with the network connectivity features.
And you still wind up using an inferior connection.
So, you're now claiming that there's "not much diff" between 480i (composite) and 1080p (component)? Nice advice :rolleyes:
Quite a difference, but I COULD LIVE WITH COMPOSITE for a while until I got a new
monitor, better than I could live with an out-of-date BLU player for years.
A monitor is one of the more important pieces of your HT, and you need to replace it
before you bother with a BLU player. BUYING a BLU player for a monitor so old
that it doesn't even have HDMI is like buying $2,000 worth of wheels for a fifteen-year-old MERCURY. Cart before the horse.:1:
Woochifer
04-19-2011, 04:57 PM
ANCIENT HISTORY
But, relevant if someone owns one of those sets, and is perfectly content with the HD picture quality they already get.
Not by much.
YOU MIGHT AS WELL buy BLU, but that's just because BLU is down to 68 bucks.
But my point is that an older BLU player, one with component, is not going to be as up to date as a newer one. So when you do get around to upgrading your monitor, you will
need a new BLU player, also.
But, that's ALWAYS going to be the case with consumer electronics. People don't all upgrade everything at the same time. And if someone wants to ONLY upgrade the video player, then the availability of component video outputs is relevant.
BUT the most important thing is that a monitor without HDMI will be old as the sticks.
I have a five-year-old set that has HDMI; you would have to get one a lot older to have a monitor without one. I doubt BLU would be worth the trouble, really.
My parents have a five-year-old LG. They're perfectly happy with it. Their set is one of those models where the picture quality is simply better with the component connections than the HDMI connections. They're not going to buy a new TV if all they want to add is a Blu-ray player.
BECAUSE why bother putting a CORVETTE ENGINE in a '59 STUDEBAKER?
Still much for hyperbole, I see. :rolleyes:
What possible good would BLU capability be with a set so old that it doesn't have HDMI?
With a budget of three hundred bucks, you are three hundred bucks away from a 42"
1080p and a BLU player. EVER hear of penny wise and pound foolish?
Again, why spend more when all you're looking to add is the Blu-ray player? Even a five-year-old HDTV is still going to look much better with a Blu-ray player than a DVD player.
That's because it involves two D/A conversions.
AND AGAIN, if the reason to go component is that your TV doesn't have HDMI, that is an old honkin TV. But you lose the argument over quality.
But, if the TV uses a different scaler/deinterlacer on the analog path than on the digital path, then it's entirely possible to have a better picture using the component video connections. That's how a lot of HDTVs were designed, and in the early days of HDMI, you had a lot of really bad video processing and unreliable connections.
Doesn't matter if component can carry 1080p (it can), nobody is going to say it's as good as HDMI, because it isn't.
Again, not "nobody" and not always.
And a set that old is going to be 720p or 1080i, so it's a moot point that component can carry 1080p, because it won't be 1080p at the end of the day.
Not all of them. By 2006, there were 1080p HDTVs and not all of them handled the HDMI inputs very well.
I got burned with a 1200-dollar receiver with component switching, because I A/B'd the
HDMI/COMPONENT out from a cable box...a cable box.
The increase in PQ was so blatant that I COULD NOT STAND to use the component switching on my new receiver, and that was a flippin cable box.
WHY BOTHER buying a compromised device to accommodate a device that needs
replacing anyway? MAKES NO SENSE.
That's what happened in YOUR case. With my parents' TV, it would be a downgrade in picture quality to use the HDMI input. I know because I've actually tried it.
Quite a difference, but I COULD LIVE WITH COMPOSITE for a while until I got a new
monitor, better than I could live with an out-of-date BLU player for years.
A monitor is one of the more important pieces of your HT, and you need to replace it
before you bother with a BLU player. BUYING a BLU player for a monitor so old
that it doesn't even have HDMI is like buying $2,000 worth of wheels for a fifteen-year-old MERCURY. Cart before the horse.:1:
In other words, you'd rather watch 480i on a HD-capable TV than simply add a Blu-ray player and enjoy full HD. Knock yourself out!
Good gawd, it's not like these choices are mutually exclusive. If someone buys a Blu-ray player right now, it will work perfectly fine if they upgrade their HDTV later on. And aside from networked video features, it's not like the Blu-ray players from this year are going to perform any differently than last year's models.
pixelthis
04-20-2011, 12:05 PM
But, relevant if someone owns one of those sets, and is perfectly content with the HD picture quality they already get.
Yes, and according to you they can buy a dumbed-down BLU player to match, locking them into inferior tech for YEARS.
But, that's ALWAYS going to be the case with consumer electronics. People don't all upgrade everything at the same time. And if someone wants to ONLY upgrade the video player, then the availability of component video outputs is relevant.
No reason to upgrade the "video player" if they are not going to get maximum use out of it.
If their set has no HDMI then it is an antique. NEED TO CONCENTRATE on that
first.
My parents have a five-year-old LG. They're perfectly happy with it. Their set is one of those models where the picture quality is simply better with the component connections than the HDMI connections. They're not going to buy a new TV if all they want to add is a Blu-ray player.
SO NOW you are reduced to quoting your parents. My parents are constantly watching SD when a HD channel is available, and wouldn't know component from HDMI, as your parents probably don't.
And component is never going to be as good as HDMI, too many technical challenges.
Again, why spend more when all you're looking to add is the Blu-ray player? Even a five-year-old HDTV is still going to look much better with a Blu-ray player than a DVD player.
THE IMPROVEMENT is going to be slight, and for the thousandth time, if your set is so old that it has no HDMI inputs you have no business buying a BLU player,
YOU NEED A NEW MONITOR. You are saying to buy a BLU player with compromised tech to accommodate an obsolete TV when you should be concentrating on a
decent TV.
But, if the TV uses a different scaler/deinterlacer on the analog path than on the digital path, then it's entirely possible to have a better picture using the component video connections. That's how a lot of HDTVs were designed, and in the early days of HDMI, you had a lot of really bad video processing and unreliable connections.
Even more reason to get a new set.
I CAN'T BELIEVE IT, this is the first time I HAVE EVER SEEN ANYBODY argue that
component is better than HDMI!!! Next thing you will be arguing about the reality of unicorns.
A BLU player operates in the digital domain; you need a D/A conversion to use component, which is analog, then you need another conversion when you get to the set.
THE ABILITY TO carry 1080p is moot, since no set that old is 1080p,
so you automatically lose a lot of the advantage of BLU.
Basically, you hook a BLU player up with component to accommodate an older set,
and it will be operating in 720p or 1080i, or worse. And you will need an older player,
which you will be stuck with when you do upgrade your monitor.
UPGRADE YOUR MONITOR FIRST. Only thing that makes sense.
Not all of them. By 2006, there were 1080p HDTVs and not all of them handled the HDMI inputs very well.
My five-year-old set looks fine for what it is, namely 720p.
That's what happened in YOUR case. With my parents' TV, it would be a downgrade in picture quality to use the HDMI input. I know because I've actually tried it.
THEN YOU NEED TO SHOP FOR NEW GEAR
In other words, you'd rather watch 480i on a HD-capable TV than simply add a Blu-ray player and enjoy full HD. Knock yourself out!
That is you. I WOULD RATHER HAVE a decent modern set that can take full advantage
of BLU before I STARTED INVESTING IN IT.
BLU didn't excite me much when it first came out, because the diff between DVD and
a 720p BLU was not that much, not enough to justify the high price.
Now a 1080p BLU disc is simply spectacular on a 1080p set, enough to make
someone watching it a BLU fan for life. Not so with inferior equipment.
Good gawd, it's not like these choices are mutually exclusive. If someone buys a Blu-ray player right now, it will work perfectly fine if they upgrade their HDTV later on. And aside from networked video features, it's not like the Blu-ray players from this year are going to perform any differently than last year's models.
NO THEY ARE NOT, and if your DVD player conks out, get a BLU of course.
But there is a right and wrong way to do things, is all.
And it just makes more sense to upgrade your monitor either with or before your
BLU upgrade.
MAYBE not a few years ago, but now a really nice monitor can be had for not too bad a price.
WHEN a nice 42" set can be had for less than $600, and a 47" for a grand or less, why
live with an inferior set?:1:
Mr Peabody
04-20-2011, 07:28 PM
Pix, you aren't being rational here. I'd rather buy a BDP for $100.00, give or take, as opposed to dropping $600.00, give or take, for a new HDTV when I have an HDTV with just not the latest technology. Component inputs on a TV are valuable even with a progressive-scan DVD player, not to mention a BDP. People were impressed with High Definition at 720p or 1080i, which by the way is all you can get from broadcast networks. The big step is not 1080i to 1080p; that may not even be that important at all with a smaller screen.
I think your big step was going from Vizio to Sharp and you think it was the 1080p. You really need to read the link posted to the Blue Jeans article and pay attention.
Worf101
04-21-2011, 04:49 AM
I have a thread going on a Funai player I purchased from WALMART for $68.
IF you want to pay through the nose and get a bunch of "features" of questionable value,
get something like an OPPO.
Or go to Walmart and get a Magnavox (Funai) for 68 bucks. Or splurge and get an
LG for $80, or a Sony for $119, etc.
My first player was $400, a Sony that was slow as CHRISTMAS and out of date in
six months. My second was a SHARP, died in less than a year. My fault, shoulda known better.
I love my new $68 player: it looks and sounds great, and if it dies, I trashcan it and get one
up to date. I HAVE SPENT ENOUGH ON BLU PLAYERS.
The day of the "250 to 300" BLU player is over.
Been over.:1:
Do you HAVE to trash everything you don't own or use? The OPPO players are great pieces of gear; even the basic ones are marvelous. True, I don't use SACD or some of the other audio formats, BUT when that machine UPDATES itself it's an amazing thing to watch. Also, if I do run across a Blu-ray it won't play, you go online to their forums and you'll probably find a solution. I don't own a $68.00 Funai, but I'm not going to disparage gear I don't own or use, that's just stupidity.
Worf
recoveryone
04-21-2011, 06:53 AM
Well said, Worf.
BadAssJazz
04-21-2011, 11:00 AM
My first player was $400, a Sony that was slow as CHRISTMAS and out of date in six months. My second was a SHARP, died in less than a year. My fault, shoulda known better. I love my new $68 player: it looks and sounds great, and if it dies, I trashcan it and get one up to date. I HAVE SPENT ENOUGH ON BLU PLAYERS.
Pix, I'm just hoping that your luck with Blu-ray players has finally improved. Reminds me of that commercial catchphrase from decades ago, "You could have had a V8!"
Granted, most have upgraded their BD players due to technological improvements, but that's some seriously bad voodoo that you've had to replace as many as you have because of flawed units. I'm not one to shop another guy's wallet, but I would definitely consider an Oppo if I had the same experiences as you, no matter if it cost me a couple hundred more than another manufacturer's player. Dependability and reliability would trump all other concerns.
After all, in the long term, you're only shelling out more money by constantly replacing price-conscious BR players than you would have spent if you had purchased, say, an Oppo BDP-80. :mad2:
pixelthis
04-21-2011, 11:33 AM
Pix, I'm just hoping that your luck with Blu-ray players has finally improved. Reminds me of that commercial catchphrase from decades ago, "You could have had a V8!"
Granted, most have upgraded their BD players due to technological improvements, but that's some seriously bad voodoo that you've had to replace as many as you have because of flawed units. I'm not one to shop another guy's wallet, but I would definitely consider an Oppo if I had the same experiences as you, no matter if it cost me a couple hundred more than another manufacturer's player. Dependability and reliability would trump all other concerns.
After all, in the long term, you're only shelling out more money by constantly replacing price-conscious BR players than you would have spent if you had purchased, say, an Oppo BDP-80. :mad2:
Only one (the SHARP) failed, and I managed to get it back with a memory wipe.
MY FIRST, a SONY, only had LPCM, couldn't tell what was coming out of it.
Gave it to my best friend; he had cancer, but wanted one. Couldn't afford one.
I then bought the SHARP, paid too much, but it was all they had, only place I could buy one.
I LOVE my SHARP. But it got to where it would not play any of the newer discs, couldn't find a firmware update (thanks Sharp), so I bought the MAGNAVOX, and was pleasantly surprised, very nice. But cheap.
BEFORE throwing it in the dumpster, I THOUGHT I would try, in desperation, a reset
on the SHARP, and it worked, and without wiping the one firmware upgrade I could find for it. NOW it works great, the MAGNAVOX went back, happy ending.
THE Funai-made MAGNAVOX was a very capable player, especially for 68 bucks,
but not even close to the SHARP AQUOS (a $232 price diff, after all).
THE AQUOS was a serious piece of kit: detachable power cord, full chassis, and most importantly, better DVD up-conversion. GLAD I could get it working, even tho I will
probably need a reset every once in a while, and SHARP'S service stinks on ice. BIG SURPRISE.
But they do make a great piece of kit. AND I have spent enough on BLU players,
although the first superfecta I hit at the track, I'M getting an OPPO.:1:
BadAssJazz
04-21-2011, 12:47 PM
All of this talk about Blu-ray players made me want to buy another one. And so I did. It's a sickness, I tell you! :)
Woochifer
04-21-2011, 04:58 PM
Yes, and according to you they can buy a dumbed-down BLU player to match, locking them into inferior tech for YEARS.
How is a Blu-ray player that came out only a few months ago "dumbed down"? If the Blu-ray players that come out "YEARS" from now are that much better, then what's to stop someone from buying a new player at that time? In the meantime, they'll get years of HD viewing from their Blu-ray player.
SO NOW you are reduced to quoting your parents. My parents are constantly watching SD when a HD channel is available, and wouldn't know component from HDMI, as your parents probably don't.
You really need to buy some new reading glasses. Where am I "quoting" my parents? I know about my parents' TV because I used both component and HDMI-based video sources on that TV. No substitute for hands-on experience.
And component is never going to be as good as HDMI, too many technical challenges.
And if that's the case, then why did HDTV reviews need to cite cases in which the component video connectors produced a better picture than the HDMI connection?
THE IMPROVEMENT is going to be slight, and for the thousandth time, if your set is so old that it has no HDMI inputs you have no business buying a BLU player,
YOU NEED A NEW MONITOR. You are saying to buy a BLU player with compromised tech to accommodate an obsolete TV when you should be concentrating on a
decent TV.
And yet those HDTVs are still perfectly capable of rendering a reference-spec HD resolution picture. Just because you buy a TV every few months doesn't mean that everybody else should follow your example.
Even more reason to get a new set.
I CAN'T BELIEVE IT, this is the first time I HAVE EVER SEEN ANYBODY argue that
component is better than HDMI!!! Next thing you will be arguing about the reality of unicorns.
And where do I say that "component is better than HDMI"? All that I've pointed out is that there are cases where the component video connector will produce a better picture than HDMI. You're the only one arguing the unreality that HDMI is always better, no matter what anyone else's real world experience says.
UPGRADE YOUR MONITOR FIRST. Only thing that makes sense.
The only thing that makes sense is buying a Blu-ray player, if that's all you're shopping for. If someone's happy with their existing HDTV, then why would they want to stay with 480i sources?
My five-year-old set looks fine for what it is, namely 720p.
And yet, you're arguing that a "720p" TV is only entitled to play 480i sources, because it's too old for HD sources.
And it just makes more sense to upgrade your monitor either with or before your
BLU upgrade.
And again, if somebody's happy with their existing HDTV, it makes no sense to buy a new TV if all they need to add is a Blu-ray player. Might as well tell somebody that they need to buy a new house, when all they're looking for is a new car.
Woochifer
04-21-2011, 05:09 PM
Pix, you aren't being rational here. I'd rather buy a BDP for $100.00, give or take, as opposed to dropping $600.00, give or take, for a new HDTV when I have an HDTV with just not the latest technology. Component inputs on a TV are valuable even with a progressive-scan DVD player, not to mention a BDP. People were impressed with High Definition at 720p or 1080i, which by the way is all you can get from broadcast networks. The big step is not 1080i to 1080p; that may not even be that important at all with a smaller screen.
And we're not yet even talking about the rest of the home theater system. If someone also uses an older receiver that doesn't have HDMI switching, then that's yet another component that needs to get upgraded if HDMI is the primary priority. For someone who only wants to add a Blu-ray player, it is indeed irrational to expect them to upgrade the HDTV and AV receiver at the same time, if they already have those components in place.
Do you HAVE to trash everything you don't own or use? The OPPO players are great pieces of gear; even the basic ones are marvelous. True, I don't use SACD or some of the other audio formats, BUT when that machine UPDATES itself it's an amazing thing to watch. Also, if I do run across a Blu-ray it won't play, you go online to their forums and you'll probably find a solution. I don't own a $68.00 Funai, but I'm not going to disparage gear I don't own or use, that's just stupidity.
Oh man, why did you have to bring common sense into the discussion? I think I'm gonna pack my ball and go home! :cool:
pixelthis
04-22-2011, 12:14 PM
All of this talk about Blu-ray players made me want to buy another one. And so I did. It's a sickness, I tell you! :)
"Come on come on, get down with the sickness"...(DAWN of the dead).:1:
pixelthis
04-22-2011, 12:33 PM
How is a Blu-ray player that came out only a few months ago "dumbed down"? If the Blu-ray players that come out "YEARS" from now are that much better, then what's to stop someone from buying a new player at that time? In the meantime, they'll get years of HD viewing from their Blu-ray player.
I WATCHED a disc from 2009 yesterday that's 1080i; this tech is changing fast.
You really need to buy some new reading glasses. Where am I "quoting" my parents? I know about my parents' TV because I used both component and HDMI-based video sources on that TV. No substitute for hands-on experience.
What I SAID,
And if that's the case, then why did HDTV reviews need to cite cases in which the component video connectors produced a better picture than the HDMI connection?
WHY DO YOU KEEP bringing up stuff from years ago?
And yet those HDTVs are still perfectly capable of rendering a reference-spec HD resolution picture. Just because you buy a TV every few months doesn't mean that everybody else should follow your example.
EVERY FEW YEARS.
And the one I had five years ago is dull compared to my new one. AND NO SET
more than a few years old can produce a "reference" picture; the tech is changing too fast.
And where do I say that "component is better than HDMI"? All that I've pointed out is that there are cases where the component video connector will produce a better picture than HDMI. You're the only one arguing the unreality that HDMI is always better, no matter what anyone else's real world experience says.
Real world "experience from several years ago. BUT YOU KEEP on believing the opposite
of what everybody knows, mainly that component is obsolete.
BEEN that way for years, requires two D/A conversions, that alone hobbles it.
The only thing that makes sense is buying a Blu-ray player, if that's all you're shopping for. If someone's happy with their existing HDTV, then why would they want to stay with 480i sources?
If your old DVD player breaks then sure, get a new BLU player, nothing else makes sense.
But it also doesn't make sense to upgrade a perfectly good DVD player when you have a 1080i set, or maybe a 720p.
BUT A SET with component only? THAT PUPPY is gonna be old.
AND YOU WILL need a BLU with a component out. My 2009 model has that,
but is showing its age. Already had to do a memory reset once.
When I bought my MAGNAVOX, not one player I LOOKED AT HAD COMPONENT!!
Not one. Buying one with component is a compromise, and that is all I AM SAYING.
And yet, you're arguing that a "720p" TV is only entitled to play 480i sources, because it's too old for HD sources.
I am just saying that compromising your BLU purchase to accommodate an obsolete
set is dumb. A 720P with HDMI should be quite nice, BTW.
And again, if somebody's happy with their existing HDTV, it makes no sense to buy a new TV if all they need to add is a Blu-ray player. Might as well tell somebody that they need to buy a new house, when all they're looking for is a new car.
What are they teaching in school these days? A car has nothing to do with a house.
HOWEVER, a new car won't be as much "fun" if you only drive it on dirt roads.:1:
BadAssJazz
04-22-2011, 12:44 PM
"Come on come on, get down with the sickness"...(DAWN of the dead).:1:
Good one. :thumbsup:
The Oppo BDP-93 has landed! Can't wait to get home and fire it up. Tron, Harry Potter, Unstoppable, Black Swan, The Fighter...not sure what to watch first.
pixelthis
04-22-2011, 12:57 PM
Good one. :thumbsup:
The Oppo BDP-93 has landed! Can't wait to get home and fire it up. Tron, Harry Potter, Unstoppable, Black Swan, The Fighter...not sure what to watch first.
DO YOURSELF a favor and try some SACD and (if you can find any) DVD-A discs.
And of course some BLU music and music vid discs.
I played my DVD-A of the DOOBIES' The Captain and Me yesterday. Listening
to "South City Midnight Lady" was fantastic. WHENEVER I think of the way
the industry pooch-scr***d high-res formats...GAWD :1:
Woochifer
04-22-2011, 01:06 PM
I WATCHED a disc from 2009 yesterday that's 1080i; this tech is changing fast.
Nothing about the tech is changing. ANY Blu-ray disc done using 50/60 Hz HD video rather than 24fps film (or 1080p24 digital cinema cameras) is going to have a native output of 1080i60.
Blu-ray's native 1080p output format is 1080p24. Any output to 1080p60 requires deinterlacing or 2:3 pulldown.
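To put a concrete picture on the deinterlacing half of that statement (a rough Python sketch of my own, with toy data, not anything from this thread): a 1080i60 signal delivers 60 half-height fields per second, each carrying every other scan line, and the player or TV has to reassemble them into full 1080-line frames. For film-sourced material, that can be as simple as weaving matched field pairs back together:

def weave(top_field, bottom_field):
    """Interleave a top field (even scan lines) and a bottom field
    (odd scan lines) back into one full progressive frame. Matched
    field pairs from film-sourced 1080i recombine losslessly this way."""
    frame = []
    for even_line, odd_line in zip(top_field, bottom_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

# Toy 4-line "frame" split into two fields:
top = ["line0", "line2"]     # even scan lines
bottom = ["line1", "line3"]  # odd scan lines
print(weave(top, bottom))    # ['line0', 'line1', 'line2', 'line3']

Video-sourced 1080i is the harder case: consecutive fields come from different moments in time, so naive weaving produces combing artifacts, and that's exactly where the quality of a set's deinterlacer starts to matter.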
What I SAID,
Nope, you were claiming that I was "quoting" my parents. Thanks for agreeing with me that there are exceptions to your claim that HDMI is always superior.
WHY DO YOU KEEP bringing up stuff from years ago?
Because many of those TVs are still in use, and their very existence debunks your presumption that HDMI always guarantees better performance. My advice is relevant to those people who still have a need, and prefer to meet that need.
EVERY FEW YEARS.
And the one I had five years ago is dull compared to my new one. AND NO SET
more than a few years old can produce a "reference" picture; the tech is changing too fast.
Of course an older TV can produce a reference picture; why do you think production houses still use CRTs as mastering references? Technology doesn't change the HD benchmark standards that are currently used (and were largely adopted back in 1992).
Real world "experience from several years ago. BUT YOU KEEP on believing the opposite
of what everybody knows, mainly that component is obsolete.
BEEN that way for years, requires two D/A conversions, that alone hobbles it.
But, if the scaler/deinterlacer used in the digital path is inferior to the one used on the analog path (common practice on early HDMI TVs), then the component video connection will result in a better picture. Again, my advice is aimed towards people who'd rather enjoy what they have, than obsess over what they don't.
But it also doesn't make sense to upgrade a perfectly good DVD player when you have a 1080i set, or maybe a 720p.
By that logic, then you're saying that anybody with a "720p" HDTV shouldn't bother with HD cable/satellite service either. And along those same lines, nobody should bother with watching HD on Fox or ABC (both 720p) either, since there's no difference between 480i and 720p in your view.
Hey, you were the one claiming that you couldn't tell the difference between DVD and Blu-ray on your old "720p" HDTV. I would assume that you don't bother with HD broadcasts either, right?
I am just saying that compromising your BLU purchase to accommodate an obsolete
set is dumb. A 720P with HDMI should be quite nice, BTW.
Yet, my parents' LG looks better when connected via component video than HDMI. That's something that I confirmed with a calibration disc, and reviews of other LG TVs from that era said the same thing.
What are they teaching in school these days? A car has nothing to do with a house.
HOWEVER, a new car won't be as much "fun" if you only drive it on dirt roads.:1:
Yeah, and a Blu-ray player has nothing to do with buying a new TV, if someone wants the same HD picture they currently get via broadcast and cable/satellite. Like I keep saying, if someone's happy with their current HDTV, why should they buy a new TV when all they need is a Blu-ray player?
BadAssJazz
04-22-2011, 01:38 PM
DO YOURSELF a favor and try some SACD and (if you can find any) DVD-A discs.
No worries, I used to own an Oppo DV981 (which has SACD capabilities) and I still have my John Coltrane, Miles Davis, Stan Getz, Concord Jazz, Stevie Wonder, Police and a few other SACDs in a box somewhere. The music will be heard eventually. But first on deck are the movies.
Mr Peabody
04-22-2011, 05:33 PM
"Come on come on, get down with the sickness"...(DAWN of the dead).:1:
How Disturbing
pixelthis
04-24-2011, 06:02 PM
How Disturbing
Out of context, maybe.
WITH SERIOUS actors like Sarah Polley and Ving RHAMES (among others),
DAWN OF THE DEAD was one of the best zombie movies, and that was some of the music
played. Thought it fit.
BTW Mr PEA, you're such a DLP fan, they have been offering a special value on
the HSN shopping network: two DLP sets, a 73" and an 82" (stands extra).
THE $2,600 (or so) 82" has me salivating at the mouth. MIGHT want to check it out
(along with DAWN OF THE DEAD). While you're at it, try ZOMBIELAND.:1:
Mr Peabody
04-24-2011, 06:53 PM
Pix, you should check your movie to see who the band is that does that song.
pixelthis
04-24-2011, 06:56 PM
I WATCHED a disc from 2009 yesterday that's 1080i; this tech is changing fast.
Blu-ray's native 1080p output format is 1080p24. Any output to 1080p60 requires deinterlacing or 2:3 pulldown.
And what does this have to do with the price of eggs? I HAVE TWO 1080 interlaced
BLU discs, 1080i 60p. And my set handles 1080p24 quite nicely, as did my last one.
No pulldown for either type of disc. HOWEVER, a component connection
quite often requires a downconversion to an inferior format, 1080i or less.
MY friend's older SAMSUNG would not pass 1080p over component, neither would
my old SONY. And none have said that component is "superior"...all
have said that it's inferior. AND OF COURSE (yet again) any set old enough to
have no HDMI is not going to be 1080p.
Nope, you were claiming that I was "quoting" my parents. Thanks for agreeing with me that there are exceptions to your claim that HDMI is always superior.
WITH MUCH OLDER SETS!!!
GET out of the last decade, why don't you? AND QUIT TALKING ABOUT the fact that
"some" antique sets have "better" component than HDMI...
THAT IS NOT THE POINT!!!
The point is that those older sets DON'T HAVE HDMI AT ALL!!!
The only way you can hook up a BLU player is WITH component, so you
need an OLDER BLU player, since newer ones don't have component, which will
render it obsolete out of the box. AND MAYBE you should have your parents post,
maybe they make more sense.
Because many of those TVs are still in use, and their very existence debunks your presumption that HDMI always guarantees better performance. My advice is relevant to those people who still have a need, and prefer to meet that need.
THOSE sets are in use because joe six has different priorities than your average HT nut.
THE reason these sets are still in use is primarily monetary, not technical.
THE WAF (wife acceptance factor) allowed some guys to get one of those sleek new flat panels, but trade up to a 1080p? HARDLY. And your ignorance of the common man
(of which I AM ONE) is showing. WHERE HAVE YOU BEEN THE LAST FEW YEARS?
During the major recession that has been ongoing? HAVING a set that even runs has been
problematic for a lot of people starting their second year of unemployment, much less
upgrading for reasons not as compelling as groceries.
Of course an older TV can produce a reference picture; why do you think production houses still use CRTs as mastering references? Technology doesn't change the HD benchmark standards that are currently used (and were largely adopted back in 1992).
SO WHY DON'T we abandon 1080p, since it's so much easier to produce 1080i sets?
PRODUCTION houses use CRTs because they are cheap, and as these wear out they will be replaced by panels. But anybody who thinks 1080i is even close to 1080p
needs to have their head examined, as well as their eyes.
But, if the scaler/deinterlacer used in the digital path is inferior to the one used on the analog path (common practice on early HDMI TVs), then the component video connection will result in a better picture. Again, my advice is aimed towards people who'd rather enjoy what they have, than obsess over what they don't.
THIS is why your "advice" is nonsense. PEOPLE with a set old enough to not have HDMI
are not going to benefit from BLU very much. They buy a BLU player, hook it up to
their obsolete set, and they will wonder what the fuss is about. And you are right, people with such an older set should not obsess about what they don't have(a blu player)
if their antique set can't get the proper use out of it, better to wait and upgrade their set,
glad you finally agree with me on something
By that logic, then you're saying that anybody with a "720p" HDTV shouldn't bother with HD cable/satellite service either. And along those same lines, nobody should bother with watching HD on Fox or ABC (both 720p) either, since there's no difference between 480i and 720p in your view.
YOU OBVIOUSLY slept through any logic class you might have had.
BLU is the best way to watch HD on the planet. WATCHING broadcasts has nothing to do
with watching BLU, and you need to stop trying to compare the two. AND STOP MISQUOTING ME, there is a huge difference between 480 interlaced and 720p,
but not so much between 480 progressive and 720p. I said the latter, not the former.
Hey, you were the one claiming that you couldn't tell the difference between DVD and Blu-ray on your old "720p" HDTV. I would assume that you don't bother with HD broadcasts either, right?
Apples and oranges. My first 1080i sets looked tons better than NTSC, and 720p looked better than them. And my first 1080p blew all of them away.
WATCHING BLU on a lesser set didn't mean much, didn't look much better, really,
than a decent DVD.
Yet, my parents' LG looks better when connected via component video than HDMI. That's something that I confirmed with a calibration disc, and reviews of other LG TVs from that era said the same thing.
Riiiiight...from that era. NOT LOOKIN FOR SETS from that era, which makes what you're saying totally irrelevant
Yeah, and a Blu-ray player has nothing to do with buying a new TV, if someone wants the same HD picture they currently get via broadcast and cable/satellite. Like I keep saying, if someone's happy with their current HDTV, why should they buy a new TV when all they need is a Blu-ray player?
BECAUSE if their TV doesn't have HDMI then it's way past time to upgrade.
WHY buy a BLU player except as a replacement for a busted DVD player if your set
can't get the full use out of it? Why put the cart before the horse? IF your set is so
old that it doesn't have HDMI then that is what you need to concentrate on.
And while in the store buying that new set, shell out a few bucks on a new BLU player;
they will be even cheaper then (if that's possible).:1:
Woochifer
04-24-2011, 09:19 PM
And what does this have to do with the price of eggs?
You're the one claiming that 1080i Blu-ray titles from 2009 have something to do with technology changing fast, and I'm pointing out that Blu-ray isn't capable of native 1080p60.
I HAVE TWO 1080 interlaced BLU discs, 1080i 60p.
No such thing as "1080i 60p." It's either interlaced or progressive. Get your terminology straight before you persist in these trolling adventures of yours.
And my set handles 1080p24 quite nicely, as did my last one.
No pulldown for either type of disc.
Unless either TV was 120 Hz, they have to use 2:3 pulldown to display a 1080p24 output.
WITH MUCH OLDER SETS!!!
GET out of the last decade, why don't you? AND QUIT TALKING ABOUT the fact that
"some" antique sets have "better" component than HDMI...
Thank you again for admitting that you were wrong on that point.
THAT IS NOT THE POINT!!!
That's exactly the point that I've been making all along, that there are exceptions to your inane rantings about HDMI.
The point is that those older sets DON'T HAVE HDMI AT ALL!!!
And my point was about specific TVs that have both component and HDMI connections.
The only way you can hook up a BLU player is WITH component, so you
need an OLDER BLU player, since newer ones don't have component, which will
render it obsolete out of the box.
And those "OLDER blu p[layer [sp]" include all of the current Oppo players, the PS3, and the majority of Blu-ray players still found in retail stores. Are you saying that those players are obsolete?
AND MAYBE you should have your parents post, maybe they make more sense.
Yes, they make a lot more sense than you do, thank you.
THOSE sets are in use because Joe Six has different priorities than your average HT nut.
THE reason these sets are still in use is primarily monetary, not technical.
So, then why would somebody who has a tight budget need to buy a new HDTV when all they want is a Blu-ray player?
THE WAF (wife acceptance factor) allowed some guys to get one of those sleek new flat panels, but trade up to a 1080p? HARDLY. And your ignorance of the common man
(of which I AM ONE) is showing.
Based on your rantings, your views and purchasing habits are rather uncommon, thankfully.
During the major recession that has been ongoing? HAVING a set that even runs has been problematic for a lot of people starting their second year of unemployment, much less upgrading for reasons not as compelling as groceries.
So, your solution for the "common man" is to have them spend more than they've budgeted on their home entertainment. I hear a Copland fanfare playing on the world's smallest violin.
YOU OBVIOUSLY slept through any logic class you might have had.
And you obviously don't know what logic is.
BLU is the best way to watch HD on the planet. WATCHING broadcasts has nothing to do
with watching BLU, and you need to stop trying to compare the two. AND STOP MISQUOTING ME, there is a huge difference between 480 interlaced and 720p, but not so much between 480 progressive and 720p. I said the latter, not the former.
So, if there's this huge difference between 480i and 720p, then why are you recommending that people who own 720p TVs stick with DVD rather than upgrade to Blu-ray? The mental gymnastics you play to justify all of these contradictions in your rantings are rather fascinating.
WATCHING BLU on a lesser set didn't mean much, didn't look much better, really,
than a decent DVD.
And yet you're claiming that there's this "huge difference" between 480i and 720p. So, which is it?
Riiiiight...from that era. NOT LOOKIN FOR SETS from that era, which makes what you're saying totally irrelevant
Nobody cares if you're "NOT LOOKIN FOR SETS from that era." My original post was directed towards somebody who might own an older set with inferior HDMI connections, or none at all.
WHY buy a BLU player except as a replacement for a busted DVD player if your set
can't get the full use out of it? Why put the cart before the horse? IF your set is so
old that it doesn't have HDMI then that is what you need to concentrate on.
But, once again, if someone has an HDTV that they're happy with, then it makes no sense to buy a new TV when all they want to add is a Blu-ray player. Any Blu-ray player that they buy now will work perfectly fine when they are ready to upgrade the TV.
pixelthis
04-25-2011, 01:29 PM
Pix, you should check your movie to see who the band is that does that song.
I knew but forgot. SORRY, will take a look next time I view this little gem, which will probably be on BLU.:1:
pixelthis
04-25-2011, 02:14 PM
You're the one claiming that 1080i Blu-ray titles from 2009 have something to do with technology changing fast, and I'm pointing out that Blu-ray isn't capable of native 1080p60.
DOESN'T matter, it is capable of 1080p24, and with a 1080p set, 1080i is deinterlaced A LOT BETTER, MUCH LIKE BROADCASTS IN 1080I.
No such thing as "1080i 60p." It's either interlaced or progressive. Get your terminology straight before you persist in these trolling adventures of yours.
WHAT DO YOU THINK THE "i" STANDS FOR?
And I REALLY DON'T UNDERSTAND why you insist on stating the obvious, of course
there are only two types of signal, interlaced or progressive. Again, what the frack
does that have to do with the price of eggs?
Unless either TV was 120 Hz, they have to use 2:3 pulldown to display a 1080p24 output.
WHICH TV? My set displays a 1080p 24fps signal perfectly, no "pulldown."
Thank you again for admitting that you were wrong on that point.
And thank you for admitting that you are talking about TV sets at least a decade old, which has nothing to do with anything.
That's exactly the point that I've been making all along, that there are exceptions to your inane rantings about HDMI.
NOT IN THIS DECADE.
EVERY set with HDMI on the market is better with that connection.
TV sets from a decade ago are of no matter.
And my point was about specific TVs that have both component and HDMI connections.
NO, you said that if your set had NO HDMI that you needed to buy a BLU player with component, which is moronic in the extreme, because it ties you into an older BLU player, if you can find one, and good luck on that.
If your TV doesn't have HDMI then it's not current; in fact, it's quite old.
WASTE OF TIME to watch a BLU disc on such an old set, doesn't matter if component is better than HDMI (it's not), doesn't matter if you buy a compromised player with component, the pic on such an old set is not going to be worth a BLU player.
SIMPLE AS THAT.
And those "OLDER blu p[layer [sp]" include all of the current Oppo players, the PS3, and the majority of Blu-ray players still found in retail stores. Are you saying that those players are obsolete?
Now that you mention it, ever notice that EVERY one of those players has instructions that state that component is the inferior connection?
These players are just offering an inferior connection for obsolete TV sets, no reason to use that inferior connection.
Yes, they make a lot more sense than you do, thank you.
AND YOU MAKE NO SENSE AT ALL.
So, then why would somebody who has a tight budget need to buy a new HDTV when all they want is a Blu-ray player?
Why would anybody want to waste their time with a BLU player when their TV is obsolete?
AND WASTE THEIR MONEY ON obsolete gear.
Based on your rantings, your views and purchasing habits are rather uncommon, thankfully.
And based on your postings you are totally clueless.
So, your solution for the "common man" is to have them spend more than they've budgeted on their home entertainment. I hear a Copland fanfare playing on the world's smallest violin.
And you obviously don't know what logic is.
If you do, you certainly don't practice it.
So, if there's this huge difference between 480i and 720p, then why are you recommending that people who own 720p TVs stick with DVD rather than upgrade to Blu-ray? The mental gymnastics you play to justify all of these contradictions in your rantings are rather fascinating.
BECAUSE there is not that big a difference, really. AND when I had a 720p SET
I didn't waste my time on BLU, which was quite expensive back then, and not worth the trouble for the expense involved. And sorry if you don't understand, but I AM TALKING ABOUT 480 PROGRESSIVE, which you get when you deinterlace 480i.
Not that much diff between 480p and 720p. PLEASE stop misrepresenting what I SAY,
or do you just have comprehension problems?
And yet you're claiming that there's this "huge difference" between 480i and 720p. So, which is it?
NOT "HUGE " but it is there. But this is another of your worthless arguments, since
480i is extinct for all practical purposes. Unless you have an obsolete old CRT that
can show interlaced programing, and in that case, why bother?
Nobody cares if you're "NOT LOOKIN FOR SETS from that era." My original post was directed towards somebody who might own an older set with inferior HDMI connections, or none at all.
Your original post offered the lame advice that you need to waste money on a BLU player
that has component, and is therefore obsolete, unless they want to spend several hundred on a PS3 or OPPO, in order to accommodate an obsolete TV, which is probably in need of replacement. IN OTHER WORDS, use an inferior connection to see a compromised picture, which is not going to show the full image that a BLU disc is capable of.
GEE, is your system a load of compromised junk?
But, once again, if someone has a HDTV that they're happy with, then it makes no sense to buy a new TV when all they want to add is a Blu-ray player. Any Blu-ray player that they buy now will work perfectly fine when they are ready to upgrade the TV.
AND WHEN THEY FINALLY GET a decent TV that is actually capable of showing a decent BLU picture, they are going to wonder why you advised them to waste their time.
AND THAT IS THE LAST thing I HAVE TO SAY ON IT.
Never argue with a birdbrain, it wastes your time and annoys the bird.:1:
pixelthis
04-25-2011, 02:33 PM
Ever since I HAVE COME ON THIS SITE I have heard that you need to buy compromised gear in order to accommodate obsolete gear you already have.
If you have a stack of old 8-tracks, sure, you need an 8-track player.
BUT IF YOUR MONITOR has no HDMI, then by its very nature it's obsolete.
ADVISING SOMEONE to buy a BLU player for such an old set is like advising them to buy hubcaps FOR THEIR HORSE, when what they need is an automobile.
IF your DVD player dies, then you might as well get a BLU player; BLU plays DVD better than a DVD player itself.
But don't waste your time even thinking about a BLU player if your set doesn't have HDMI.
Sure, it's a free country, waste your time and money if you want.
IF your TV is that old, you need a new TV, one that has a decent picture, at least 720p,
1080p IS PREFERABLE.
And I am not being a "snob", that is a simple fact of life.
It's not just for BLU; 1080i broadcasts look spectacular in 1080p, with deinterlacing producing real improvement.
So get a BLU player, sure, but while you're at it, get a 1080p TV to go with it.
I SAW A 42" 1080P LED for less than five large today, if you can't afford that
thats not a crime, but why waste your money on BLU players and BLU discs
when your set is so old?
MAKES no sense. IT NEVER DOES to put the cart before the horse.:1:
Woochifer
04-29-2011, 04:35 PM
DOESN'T matter, it is capable of 1080p24, and with a 1080p set, 1080i is deinterlaced A LOT BETTER, MUCH LIKE BROADCASTS IN 1080I.
And guess what, this deinterlacing happens regardless of the connection standard.
WHAT DO YOU THINK THE "i" STANDS FOR?
And I REALLY DON'T UNDERSTAND why you insist on stating the obvious, of course
there are only two types of signal, interlaced or progressive. Again, what the frack
does that have to do with the price of eggs?
Because from your responses, it's obvious that you don't understand the obvious.
WHICH tv? My set displays a 1080p 24fps signal perfectly, no "pulldown"
Case in point (see above).
If you don't have a 120/240 Hz set that does a 5:5 frame repeat to display a native 1080p24 signal, it's doing 2:3 pulldown. Just because you don't know anything about this doesn't mean it's not happening.
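As a rough illustration of the arithmetic (a hypothetical Python sketch, not anything a real player or TV actually runs), only refresh rates that divide evenly by 24 can repeat every film frame the same number of times; 60 Hz cannot, which is exactly why 2:3 pulldown exists:

    # Hypothetical sketch: how many times each of the 24 film frames gets
    # repeated to fill one second at a given display refresh rate.
    def repeat_counts(refresh_hz, film_fps=24):
        base, extra = divmod(refresh_hz, film_fps)
        # 'extra' frames get shown one additional time each (alternated in
        # practice, giving the familiar 2-3-2-3 cadence of 2:3 pulldown)
        return [base + 1 if i < extra else base for i in range(film_fps)]

    print(repeat_counts(60))   # twelve 3s and twelve 2s -> 2:3 pulldown judder
    print(repeat_counts(120))  # twenty-four 5s -> even 5:5 repeat, no judder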
And thank you for admitting that you are talking about TV sets at least a decade old, which has nothing to do with anything.
Pixelthis says 2006 = "a decade old" :out:
NO, you said that if your set had NO HDMI that you needed to buy a BLU player with component, which is moronic in the extreme, because it ties you into an older BLU
And if that "older BLU" includes the CURRENT Oppo players and the PS3, then how is this moronic to buy a CURRENT player that includes both component and HDMI outputs? Just in case this is news to you, the inclusion of component video outputs does not exclude HDMI outputs.
if you can find one, and good luck on that
Let's see ... any Oppo player, any PS3, ... yeah, good luck finding one of those. :rolleyes:
WASTE OF TIME to watch a BLU disc on such an old set, doesn't matter if component is better than HDMI (it's not), doesn't matter if you buy a compromised player with component, the pic on such an old set is not going to be worth a BLU player.
SIMPLE AS THAT.
So, tell me again how the Oppo players are compromised, given that they are the highest performing units on the market?
Now that you mention it, ever notice that EVERY one of those players has instructions that state that component is the inferior connection?
But, again that doesn't apply to every TV.
AND YOU MAKE NO SENSE AT ALL.
Not my fault that logic's not your strong suit.
Why would anybody want to waste their time with a BLU player when their TV is obsolete?
Let's see, a functional HDTV that can display full HD images and can accommodate a Blu-ray player is obsolete? Mmmm hmmm
AND WASTE THEIR MONEY ON obsolete gear.
Again, how's the Oppo or PS3 obsolete?
BECAUSE there is not that big a difference, really. AND when I had a 720p SET
I didn't waste my time on BLU, which was quite expensive back then, and not worth the trouble for the expense involved.
In other words, you never actually tried a Blu-ray player on your "720p" TV.
And sorry if you don't understand, but I AM TALKING ABOUT 480 PROGRESSIVE, which you get when you deinterlace 480i.
And again, DVD is native 480i format. Progressive scan does nothing to change that.
Not that much diff between 480p and 720p. PLEASE stop misrepresenting what I SAY,
or do you just have comprehension problems?
I'm not misrepresenting anything. You're the one claiming that a native 1080p format has no visible advantage over a 480i format when viewing on a 720p TV.
NOT "HUGE " but it is there. But this is another of your worthless arguments, since
480i is extinct for all practical purposes.
480i is extinct? I didn't know DVDs and SD video feeds no longer exist.
Your original post offered the lame advice that you need to waste money on a BLU player
that has component, and is therefore obsolete, unless they want to spend several hundred on a PS3 or OPPO, in order to accommodate an obsolete TV, which is probably in need of replacement. IN OTHER WORDS, use an inferior connection to see a compromised picture, which is not going to show the full image that a BLU disc is capable of.
GEE, is your system a load of compromised junk?
How's it a waste of money to buy a CURRENT Blu-ray player that also happens to meet an immediate need?
AND WHEN THEY FINALLY GET a decent TV that is actually capable of showing a decent BLU picture, they are going to wonder why you advised them to waste
their time.
How's it a waste of time when the Blu-ray player that they buy today will work perfectly with the TV that they buy later? In the meantime, they'll enjoy their Blu-ray player and still enjoy it when they're actually ready to upgrade to a new TV. I don't know why incrementally adding according to need is such a difficult concept with you.
AND THAT IS THE LAST thing I HAVE TO SAY ON IT.
If only we were so lucky, as evidenced by the continued incoherency on display in your latest rant.
pixelthis
05-06-2011, 12:33 PM
Because from your responses, it's obvious that you don't understand the obvious.
And from your response, you don't "comprehend" the obvious.
If you don't have a 120/240 Hz set that does a 5:5 frame repeat to display a native 1080p24 signal, it's doing 2:3 pulldown. Just because you don't know anything about this doesn't mean it's not happening.
WRONG (as usual). In order to avoid 2:3 pulldown, BLU has a rate of 72 Hz.
EACH frame is shown three times (72 divided by three = 24). This gives a frame rate of 24fps without pulldown.
WHEN THIS happens my set displays 1080p24.
Pixelthis says 2006 = "a decade old".
Still irrelevant, like most of what you say.
And if that "older BLU" includes the CURRENT Oppo players and the PS3, then how is this moronic to buy a CURRENT player that includes both component and HDMI outputs? Just in case this is news to you, the inclusion of component video outputs does not exclude HDMI outputs.
BECAUSE a "modern" player does not have component.
AND YOU WANT TO BUY A PLAYSTATION, and pay for a hacker's crack addiction, go right ahead.
Let's see ... any Oppo player, any PS3, ... yeah, good luck finding one of those. :rolleyes:
YEAH, for less than three hundred bucks.
So, tell me again how the Oppo players are compromised, given that they are the highest performing units on the market?
THEY COST THREE HUNDRED BUCKS. Yeah, this would be a great first player.
WHY BUY A three-hundred-dollar player when your TV is as old as dino bones?
But, again that doesn't apply to every TV.
JUST every one ever made.
Not my fault that logic's not your strong suit.
And not mine that average comprehension skills are above your skill set.
Let's see, a functional HDTV that can display full HD images and can accommodate a Blu-ray player is obsolete? Mmmm hmmm
IF IT'S NOT 1080P. This is where BLU shines, and if your set won't display it, then you're wasting your time with a BLU player.
In other words, you never actually tried a Blu-ray player on your "720p" TV.
ACTUALLY I DID...looked like really good cable. IF YOUR SET IS 720P, WHY PAY 300+ FOR A BLU player so you can watch the equivalent of basic cable?
And again, DVD is native 480i format. Progressive scan does nothing to change that.
It doubles the resolution. THIS is the one time (deinterlacing) that you can get a res increase on a signal. DEINTERLACING does not change the fact that it starts out as an interlaced signal, but does improve it immensely.
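What deinterlacing mechanically does can be sketched in a few lines of Python (a toy example with made-up field lists, not any player's actual code): the woven frame puts twice as many lines on screen at the same instant, while every one of those lines still comes from the original interlaced fields.

    # Toy weave deinterlace: interleave two half-height fields into one frame.
    def weave(top_field, bottom_field):
        frame = []
        for even_line, odd_line in zip(top_field, bottom_field):
            frame.append(even_line)  # lines 0, 2, 4, ...
            frame.append(odd_line)   # lines 1, 3, 5, ...
        return frame

    top = [f"line {i}" for i in range(0, 480, 2)]     # 240 even lines
    bottom = [f"line {i}" for i in range(1, 480, 2)]  # 240 odd lines
    print(len(weave(top, bottom)))  # 480 lines on screen at once, no new detail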
I'm not misrepresenting anything. You're the one claiming that a native 1080p format has no visible advantage over a 480i format when viewing on a 720p TV.
Because it doesn't. 1080i is downconverted and 480i is upconverted.
BOTH look rather decent. And similar.
480i is extinct? I didn't know DVDs and SD video feeds no longer exist.
Not on my cable system. Without an adapter you can't get a signal.
AND DVD does still exist, and if you really wanted you could watch it on an old NTSC set, but why bother? And any set you watch it on that is HD is going to upconvert it whether you want it to or not. SO SURE it's natively 480i, but it would cost you to actually watch it in 480i. IF YOU ENJOY wasting your time, go right ahead.
How's it a waste of money to buy a CURRENT Blu-ray player that also happens to meet an immediate need?
If your set is not 1080p, why waste your time? You need to be saving your coin to
buy a decent monitor.
How's it a waste of time when the Blu-ray player that they buy today will work perfectly with the TV that they buy later? In the meantime, they'll enjoy their Blu-ray player and still enjoy it when they're actually ready to upgrade to a new TV. I don't know why incrementally adding according to need is such a difficult concept with you.
Yes, they will enjoy a picture that looks like a really good DVD; they will have to pay 300+ TO GET A SONY OR AN OPPO to see this incredibly average picture on their obsolete set with component, when if they bought a new set, they could see a much better pic with a 100-buck (or less) BLU player. AND SPENDING 300 plus on a BLU player when your monitor is obsolete is not unwise, IT'S STUPID.
If only we were so lucky, as evidenced by the continued incoherency on display in your latest rant.
If anybody follows your lame advice, they will be decidedly unlucky.:1:
harley .guy07
05-06-2011, 07:23 PM
I have owned an Oppo BDP-83 for around a year now and I still love it. It works very well with Blu-ray, with little lag time loading up, has a very good video processing circuit that upscales my DVDs to 1080p so they look great, and to boot the unit makes an awesome transport for playing CDs, SACDs, and DVD-Audio discs to my DAC. So in my opinion this unit is very good and probably one of the best things going at its price point for doing all of this with the quality that it does. I would only hope the 93 is as good and probably better at this as well. And also the build quality on these units is some of the best I have seen in the under-1000-dollar price class as well.
pixelthis
05-07-2011, 06:26 PM
I have owned an Oppo BDP-83 for around a year now and I still love it. It works very well with Blu-ray, with little lag time loading up, has a very good video processing circuit that upscales my DVDs to 1080p so they look great, and to boot the unit makes an awesome transport for playing CDs, SACDs, and DVD-Audio discs to my DAC. So in my opinion this unit is very good and probably one of the best things going at its price point for doing all of this with the quality that it does. I would only hope the 93 is as good and probably better at this as well. And also the build quality on these units is some of the best I have seen in the under-1000-dollar price class as well.
WELL, it's common knowledge that without expensive equipment you don't get a resolution increase when upscaling VIDEO; however, I have noticed with all of the BLU players I have used that the DVD video is quite good.
Might just be the total lack of any kind of flaw in the playback, but it usually looks
mighty good.:1:
dingus
05-07-2011, 10:01 PM
i went for the Oppo BDP-83 for various reasons. SACD, DVD-A, plus the emerging trend of music on blu-ray. sure you can get a universal player for less than the Oppo (i paid $330 to my door for a used unit from Audiogon), but it was the least expensive player i found that did all of the above and also did PAL, which is important for me because i have several concert vids in the format.
BadAssJazz
05-09-2011, 10:41 AM
I have owned an Oppo BDP-83 for around a year now and I still love it. It works very well with Blu-ray, with little lag time loading up, has a very good video processing circuit that upscales my DVDs to 1080p so they look great, and to boot the unit makes an awesome transport for playing CDs, SACDs, and DVD-Audio discs to my DAC. So in my opinion this unit is very good and probably one of the best things going at its price point for doing all of this with the quality that it does. I would only hope the 93 is as good and probably better at this as well. And also the build quality on these units is some of the best I have seen in the under-1000-dollar price class as well.
I've only owned the BDP-93 for a few short weeks now, but have absolutely no complaints with BR or audio (SACD) performance. In fact, I'm kicking myself for having waited so long to pull the trigger on a purchase. Time to find a new home for the Pioneer Elite Blu-ray player that it replaced.
It's also time to consider picking up either the Marantz AV8003 & MM7055 or Integra 80.2 & DTA 70.1. Getting back to separates is also long overdue, especially now that I have the space for it.
Sir Terrence the Terrible
05-09-2011, 11:53 AM
Wow Pix, your ignorance of the Bluray format is staggering. You really need to take a Bluray 101 course - you don't know your ass from a hole in the ground.
harley .guy07
05-09-2011, 01:29 PM
I've only owned the BDP-93 for a few short weeks now, but have absolutely no complaints with BR or audio (SACD) performance. In fact, I'm kicking myself for having waited so long to pull the trigger on a purchase. Time to find a new home for the Pioneer Elite Blu-ray player that it replaced.
It's also time to consider picking up either the Marantz AV8003 & MM7055 or Integra 80.2 & DTA 70.1. Getting back to separates is also long overdue, especially now that I have the space for it.
Yeah, I have not seen too many people out there that don't like what the Oppo players do for them in their setups. Any format I put in my player plays well and either sounds great through my DAC (audio) or looks good, whether it's DVD or Blu-ray. I know that DVD upscaling will not add any content to the material; anyone with any kind of knowledge knows that, just like taking an MP3 and making it FLAC or WAV to transfer to a CD will not add anything to the original data. But the video processors in the Oppo players are top notch in their class for taking DVDs and making them look as good as they possibly can on a 1080p LED TV. There might be better out there, but at much higher prices, and that is probably why Lexicon tried to take a BDP-83, put their own faceplate on it, and charge over 3 grand for it. While it was a stupid move on Lexicon's part, it does show you that the Oppo players are great, and that is pretty much common knowledge at this point.
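The "upscaling adds no content" point can be shown with a toy example (hypothetical Python, nearest-neighbour scaling for simplicity): every output pixel is a copy of an input pixel, just as re-encoding an MP3 as FLAC copies the same data into a bigger container.

    # Toy nearest-neighbour upscale of one scanline: more pixels, same information.
    def upscale(row, factor=2):
        return [pixel for pixel in row for _ in range(factor)]

    dvd_row = ["p0", "p1", "p2", "p3"]
    print(upscale(dvd_row))  # 8 pixels out, but still only 4 distinct values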
pixelthis
05-09-2011, 01:40 PM
Wow Pix, your ignorance of the Bluray format is staggering. You really need to take a Bluray 101 course - you don't know your ass from a hole in the ground.
If you are ever in a war, you will be shooting at the people standing next to you.
What I KNOW is learned honestly, not through some press release.:1:
Sir Terrence the Terrible
05-09-2011, 02:15 PM
If you are ever in a war, you will be shooting at the people standing next to you.
What I KNOW is learned honestly, not through some press release.:1:
Please be standing next to me, pleeeeeeeze.
If I learned what I have learned from press releases, then I need to hand them over to you. You don't know dooookey about audio or video.
harley .guy07
05-09-2011, 06:47 PM
I have put my input on this subject strictly as a view of what I have experienced, and the knowledge war is yours as far as I am concerned. I know my stuff, but trying to outdo each other with knowledge is not my cup of tea. I agree with the things that are correct from each individual, and the things that aren't correct will show to the people that know.
pixelthis
05-10-2011, 09:07 AM
Please be standing next to me, pleeeeeeeze.
If I learned what I have learned from press releases, then I need to hand them over to you. You don't know dooookey about audio or video.
I KNOW audio AND video from actually using and working with it on a daily basis.
MY first HD set, a 47" RPTV PANNY, I went into, worked over with the service menu,
you name it.
I HAVE INSTALLED sat TV and home theater, studied volumes on resolution and how video actually works, took a three-year class, and have taken sets apart and reassembled
them.
The problem you have with me is that you are a total ignoramus on all things video,
you know just enough to come across as a complete buffoon, you don't know
an oscillator from an escalator. NOR do you understand the basics of how a
set displays resolution, which is why you buy into cons like "motion" resolution,
etc.
YOU, SIR, ARE a perfect example of someone who has had the common sense
educated out of them. If there is an instruction book you might be okay, but try
to color outside the lines and you will be lost at sea, because you have never
been educated in the basics, you just know what your masters tell you, which
is not surprising for a PR HACK.
I AM A LAYMAN with quite a bit of experience in these things, and don't pretend to be anything else.
YOU ARE a lawyer or other kind of PR type that has been trained in just enough
basic electronics to understand elemental terms.
WHAT gets your goat about me is that I SEE RIGHT THROUGH YOU, you are a
fraud, a creation of a corporation, and you know just enough, with the help of a search
engine, to fool the ignorant most of the time.
But you don't fool me; I HAVE never pretended to be more than I AM.
You have, and that's why you're FAKE.:1:
Sir Terrence the Terrible
05-10-2011, 10:00 AM
I KNOW audio AND video from actually using and working with it on a daily basis.
MY first HD set, a 47" RPTV PANNY, I went into, worked over with the service menu,
you name it.
I HAVE INSTALLED sat TV and home theater, studied volumes on resolution and how video actually works, took a three-year class, and have taken sets apart and reassembled
them.
The problem you have with me is that you are a total ignoramus on all things video,
you know just enough to come across as a complete buffoon, you don't know
an oscillator from an escalator. NOR do you understand the basics of how a
set displays resolution, which is why you buy into cons like "motion" resolution,
etc.
YOU, SIR, ARE a perfect example of someone who has had the common sense
educated out of them. If there is an instruction book you might be okay, but try
to color outside the lines and you will be lost at sea, because you have never
been educated in the basics, you just know what your masters tell you, which
is not surprising for a PR HACK.
I AM A LAYMAN with quite a bit of experience in these things, and don't pretend to be anything else.
YOU ARE a lawyer or other kind of PR type that has been trained in just enough
basic electronics to understand elemental terms.
WHAT gets your goat about me is that I SEE RIGHT THROUGH YOU, you are a
fraud, a creation of a corporation, and you know just enough, with the help of a search
engine, to fool the ignorant most of the time.
But you don't fool me; I HAVE never pretended to be more than I AM.
You have, and that's why you're FAKE.:1:
WAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA...hehehehehehehehehehehehe... gotta pee......LOLOLOLOLOLOLOLOLOL...
Pix, after saying all of that, let's revisit one of your stupid comments in this thread.
SO WHY DON'T we abandon 1080p since it's so much easier to produce 1080i sets?
PRODUCTION houses use CRT because they are cheap, and as these wear out they will be replaced by panels. But anybody who thinks 1080i is even close to 1080p needs to have their head examined, as well as their eyes.
1080p and 1080i have all of the same information, just presented differently. 1080i presents the information in two fields, and 1080p presents it in one frame. At common viewing distances you will not see any difference between the two. The resolution is exactly the same, no difference.
Post houses still use CRTs because they are more accurate than the current professional panels on the market. The grey scale is more accurate, color rendition is more accurate, blacks are blacker, and there are no motion trails. The professional CRTs used in post houses cost far more than your panel costs - which means there is nothing "cheap" about them.
So if you think there is such a dramatic difference between 1080p and 1080i, YOU need to get YOUR head examined.
Somebody needs to call the men in the white coats. Pix is so delusional, he is dangerous to himself.
Good joke Pixie, I hope you have fully convinced yourself that you are a legend in your own empty head. Whatever you are taking, stop.
pixelthis
05-11-2011, 11:22 AM
WAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA...hehehehehehehehehehehehe... gotta pee......LOLOLOLOLOLOLOLOLOL...
Pix, after saying all of that, let's revisit one of your stupid comments in this thread.
SO WHY DON'T we abandon 1080p since it's so much easier to produce 1080i sets?
PRODUCTION houses use CRT because they are cheap, and as these wear out they will be replaced by panels. But anybody who thinks 1080i is even close to 1080p needs to have their head examined, as well as their eyes.
1080p and 1080i have all of the same information, just presented differently. 1080i presents the information in two fields, and 1080p presents it in one frame. At common viewing distances you will not see any difference between the two. The resolution is exactly the same, no difference.
Post houses still use CRTs because they are more accurate than the current professional panels on the market. The grey scale is more accurate, color rendition is more accurate, blacks are blacker, and there are no motion trails. The professional CRTs used in post houses cost far more than your panel costs - which means there is nothing "cheap" about them.
So if you think there is such a dramatic difference between 1080p and 1080i, YOU need to get YOUR head examined.
Somebody needs to call the men in the white coats. Pix is so delusional, he is dangerous to himself.
Good joke Pixie, I hope you have fully convinced yourself that you are a legend in your own empty head. Whatever you are taking, stop.
This is an excellent example of your ignorance in basic knowledge of anything video.
THANKS for the demo of your ignorance, outmatched only by a dino walking into a tar pit.
No "difference"? THEN WHY CAN'T YOU BUY a 1080i anywhere?
BECAUSE side by side there is a marked difference between 1080i and 1080p.
You just have to compare the two, basically.
This is because a 1080p picture has twice the information onscreen at any given time as a 1080i.
This is because the two "fields" in a 1080i picture are deinterlaced to create a
progressive picture with a true 1080 lines, as opposed to the 540 lines that are on a
1080i screen at any given time.
ALSO 1080p has fewer artifacts, while 1080i loses up to half its resolution every time there's movement. This is common knowledge, but of course you don't have a clue about it. This is why the entire industry went progressive, because you don't lose res when there's movement, and there are fewer other artifacts.
Any progressive image is going to be more solid than any given interlaced pic.
THIS is why the computer industry went to progressive scan displays early, so that text could be read more easily on computer screens.
A system has gradually evolved where the broadcast standard for HD is 1080i, which saves bandwidth, and then 1080p sets deinterlace the two fields into complete frames in a progressive 1080p picture, which looks outstanding.
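The bandwidth half of that is simple arithmetic; here is a rough, purely illustrative sketch of the raw (uncompressed) pixel rates involved:

    # Back-of-the-envelope pixel-rate comparison, for illustration only.
    def pixels_per_second(width, lines_per_pass, passes_per_second):
        return width * lines_per_pass * passes_per_second

    print(pixels_per_second(1920, 540, 60))   # 1080i60: 60 half-height fields
    print(pixels_per_second(1920, 1080, 60))  # 1080p60: 60 full frames, twice the pixels
    print(pixels_per_second(1920, 1080, 24))  # 1080p24: film-rate BLU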
EVENTUALLY THERE WON'T be any interlaced displays; they are obsolete.
Anything else you care to display your ignorance on?:1:
Sir Terrence the Terrible
05-12-2011, 09:14 AM
This is an excellent example of your ignorance in basic knowledge of anything video.
THANKS for the demo of your ignorance, outmatched only by a dino walking into a tar pit.
No "difference"? THEN WHY CAN'T YOU BUY a 1080i anywhere?
It is not because 1080p is better than 1080i. It's because most of the sources we use are progressive, and progressive going to a progressive display reduces the amount of processing you have to include in the set.
BECAUSE side by side there is a marked difference between 1080i and 1080p.
You just have to compare the two, basically.
This is because a 1080p picture has twice the information onscreen at any given time as a 1080i.
This is because the two "fields" in a 1080i picture are deinterlaced to create a
progressive picture with a true 1080 lines, as opposed to the 540 lines that are on a
1080i screen at any given time.
If I am refreshing each field 30 times a second, you are not going to see 540 lines - our eyes do not process the information that fast. If they did, you would see flickering as each field is sequentially painted on the screen. With each field painted 30 times a second, to the eyes we see exactly the same thing as a progressive screen painting the full frame 60 times a second. It's one thing to talk about theory; it's another to see how that theory works with the eyes and brain.
Learn something:
http://www.hometheater.com/geoffreymorrison/0807061080iv1080p/
Notice these words from the article
There Is No Difference Between 1080p and 1080i
I suppose you are going to claim he doesn't know what he is talking about as well.:rolleyes:
ALSO 1080p has fewer artifacts, while 1080i loses up to half its resolution every time there's movement. This is common knowledge, but of course you don't have a clue about it. This is why the entire industry went progressive, because you don't lose res when there's movement, and there are fewer other artifacts.
Even progressive displays lose resolution when there is movement.
http://hdguru.com/will-you-see-all-the-hdtv-resolution-you-expected-125-2008-model-test-results-hd-guru-exclusive/287/
Note this line:
An HDTV may resolve a stationary test signal at full bandwidth, displaying all the detail within the 1920 individual pixels that appear across the screen, but not necessarily when motion is introduced, which on some sets causes a resolution drop.
So you are wrong again!
Any progressive image is going to be more solid than any given interlaced pic.
THIS is why the computer industry went to progressive scan displays early, so that text could be read more easily on computer screens.
A system has gradually evolved where the broadcast standard for HD is 1080i, which saves bandwidth, and then 1080p sets deinterlace the two fields into complete frames in a progressive 1080p picture, which looks outstanding.
EVENTUALLY THERE WON'T be any interlaced displays; they are obsolete.
Anything else you care to display your ignorance on?:1:
Actually, based on the links it's you that is displaying the ignorance. You're the kind of idiot that does not know how to separate hype and theory from real-world conditions. Theory is one thing, how our eyes and brain process that theory is another.
Here is some more information for you if you can actually understand it.
http://www.gamespot.com/forums/topic/26585312
I will look at Best Buy's lineup and go from there. Thanks for all of your replies.
I will look at Best Buy's lineup and go from there. Thanks for all of your replies.
Bob, it's a shame a few guys crapped all over your post but if you read every other response there is a lot of good info to be had.
I know from your original post your budget was not at $500 but if you can swing it, the OPPO 93 is a killer universal player that will have great Blu playback as well as 3D and awesome DLNA streaming capabilities from over the network and attached drives.
pixelthis
05-12-2011, 01:40 PM
It is not because 1080p is better than 1080i. It's because most of the sources we use are progressive, and progressive going to a progressive display reduces the amount of processing you have to include in the set.
1080P IS "better, has twice the information onscreen at any given time.
AND AS FAR AS "processing" goes, stitching two fields together is hardly processing at all.
AND most of the sources we use are interlaced , the only native progressive is
a few pay-per-view choices and BLU ray.
PROGRESSIVE simply uses too much bandwidth.
If I am refreshing each field 30 times a second, you are not going to see 540 lines - our eyes do not process the information that fast. If they did, you would see flickering as each field is sequentially painted on the screen. With each field painted 30 times a second, to the eyes we see exactly the same thing as a progressive screen painting the full frame 60 times a second. It's one thing to talk about theory; it's another to see how that theory works with the eyes and brain.
If there is movement in an interlaced pic, the illusion falls apart, and resolution drops,
sometimes fifty percent.
I USED to wonder at how still images looked so good on old 480i NTSC; the reason is that they are still images, with a true 400+ lines of resolution.
AND, NO, our brains do not process information that fast (our eyes don't process anything), but on an interlaced pic the illusion of res that's not there falls apart whenever there is movement, and the drop in res is noticeable.
Learn something:
http://www.hometheater.com/geoffreymorrison/0807061080iv1080p/
Notice these words from the article
There Is No Difference Between 1080p and 1080i
I suppose you are going to claim he doesn't know what he is talking about as well.:rolleyes:
DEPENDS on what he's talking about.
If he's talking about a 1080p pic and a 1080i pic that has been deinterlaced, there is no difference. HOWEVER, the non-deinterlaced pure 1080p pic that BLU delivers does seem better to me than the deinterlaced pic from cable, although technically they are the same (or close).
Even progressive displays lose resolution when there is movement.
SOME, but it's nowhere near the almost (or more) fifty percent that an interlaced pic loses.
Note this line:
An HDTV may resolve a stationary test signal at full bandwidth, displaying all the detail within the 1920 individual pixels that appear across the screen, but not necessarily when motion is introduced, which on some sets causes a resolution drop.
So you are wrong again!
Not really. THE RES DROP of a 1080p or other progressive picture is slight compared to the almost fifty percent of an interlaced pic.
APPLES AND ORANGES, really.
Actually, based on the links it's you that is displaying the ignorance. You're the kind of idiot that does not know how to separate hype and theory from real-world conditions. Theory is one thing, how our eyes and brain process that theory is another.
Our "eyes and brains" don't process "theory", but theory explains what our eyes and brains see under certain circumstances.
AND YOU ARE THE one displaying "ignorance", as your "quotes" back up what I'm
saying, not you. YOU DON'T UNDERSTAND that because of your ignorance.
Here is some more information for you if you can actually understand it.
I could teach them a thing or two.
WE HAVE ARGUED ABOUT THIS FOREVER, AND ALL IT BOILS DOWN TO
is your inferiority complex over your antique CRT-based projection system,
which is quite obsolete.
Maybe you will figure that out one of these days.:1:
pixelthis
05-12-2011, 01:50 PM
BTW the LG "42 that passed your weird little "motion" res test in your gamespot link
(the only one that passed) is the same model I HAVE.
I didn't need some fake "test" to see what an excellent pic it produces; I could tell
by eyeballing it in the store.
THANKS for verifying the basic goodness of my panel, knew you'd be good for
something one of these days.
AND what does it say about my "expertise" when the set I picked out by eyeballing it
is the only one that passed your little "motion" res test?:1:
pixelthis
05-12-2011, 01:55 PM
ALSO the consensus on that forum page was that LCD looked better than
plasma.
KINDA shot yourself in the foot with that one, there.:1:
Sir Terrence the Terrible
05-13-2011, 12:39 PM
1080P IS "better, has twice the information onscreen at any given time.
AND AS FAR AS "processing" goes, stitching two fields together is hardly processing at all.
AND most of the sources we use are interlaced , the only native progressive is
a few pay-per-view choices and BLU ray.
PROGRESSIVE simply uses too much bandwidth.
If our eyes don't notice two fields being painted on the screen sequentially, then progressive is not better or worse than interlaced - especially when you get to full HD resolution.
If there is movement in an interlaced pic, the illusion falls apart, and resolution drops,
sometimes fifty percent.
I USED to wonder at how still images looked so good on old 480i NTSC; the reason is that they are still images, with a true 400+ lines of resolution.
AND, NO, our brains do not process information that fast (our eyes don't process anything), but on an interlaced pic the illusion of res that's not there falls apart whenever there is movement, and the drop in res is noticeable.
Sorry, but you are wrong again.
http://www.gamespot.com/forums/topic/26585312
Notice the static resolution versus the moving resolution of the LCD models. Almost all of them lose half of their resolution with moving images. The plasma models do much better than the LCD models in this respect, maintaining at least 900 lines of resolution with moving images.
If moving images fall apart with interlaced sets, that would make their images unwatchable. Since that is not the case, perhaps you should spare us the hyperbole in your responses.
DEPENDS on what he's talking about.
If he's talking about a 1080p pic and a 1080i pic that has been deinterlaced, there is no difference. HOWEVER, the non-deinterlaced pure 1080p pic that BLU delivers does seem better to me than the deinterlaced pic from cable, although technically they are the same (or close).
Since we are mostly working with 1080i/p images, there is no "depends" here (except what YOU wear in the name of underwear). The world is basically moving past 480i as a standard resolution; only DVD supports that.
SOME, but it's nowhere near the almost (or more) fifty percent that an interlaced pic loses.
Either you cannot read, or you cannot see. Look at the static versus moving resolution of all of the LCDs. They lose between 40 and 60 percent of their resolution with moving images. The chart does not lie, Pixie.
Not really. THE RES DROP of a 1080p or other progressive picture is slight compared to the almost fifty percent of an interlaced pic.
APPLES AND ORANGES, really.
Slight?? The LG LCD goes from 1040 lines of static resolution down to 340 lines when images move. The Toshiba goes from 1040 lines of static resolution down to 320 lines with moving images. The Sharp 52" goes from 1080 lines to 330 lines. My dino CRT does better than this by a looooooong way. The stats don't lie, Pixie, but you apparently do.
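Running the figures quoted above makes the percentages explicit (a quick, illustrative Python check of the same numbers):

    # Motion-resolution loss computed from the figures cited in this post.
    sets = {"LG LCD": (1040, 340), "Toshiba": (1040, 320), 'Sharp 52"': (1080, 330)}
    for name, (static_lines, moving_lines) in sets.items():
        loss = 100 * (1 - moving_lines / static_lines)
        print(f"{name}: about {loss:.0f}% of static resolution lost in motion")
    # All three drop roughly 67-70 percent -- hardly a "slight" loss.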
Our "eyes and brains" don't process "theory", but theory explains what our eyes and brains see under certain circumstances.
AND YOU ARE THE one displaying "ignorance", as your "quotes" back up what I'm
saying, not you. YOU DON'T UNDERSTAND that because of your ignorance.
Pix, is it so hard for you to admit that you don't know what you are talking about? ...wait, that takes maturity, and everyone knows you lack that.
I could teach them a thing or two.
Based on what I have seen so far, I highly doubt that.
WE HAVE ARGUED ABOUT THIS FOREVER, AND ALL IT BOILS DOWN TO
is your inferiority complex over your antique CRT-based projection system,
which is quite obsolete.
Maybe you will figure that out one of these days.:1:
LOLOLOLOLOLOLOLOLOLOL...this statement is beyond stupid, but par for the course for you. There is no need for me to have an inferiority complex over something that outperforms what you have, and any flat panel on the market today.
Try making less stupid statements in the future, if that is possible.
Sir Terrence the Terrible
05-13-2011, 12:40 PM
ALSO the consensus on that forum page was that LCD looked better than
plasma.
KINDA shot yourself in the foot with that one, there.:1:
Tests prove otherwise, and I choose the testing over the opinion any day.
Sir Terrence the Terrible
05-13-2011, 12:52 PM
BTW the LG "42 that passed your weird little "motion" res test in your gamespot link
(the only one that passed) is the same model I HAVE.
I didn't need some fake "test" to see what an excellent pic it produces; I could tell
by eyeballing it in the store.
THANKS for verifying the basic goodness of my panel, knew you'd be good for
something one of these days.
AND what does it say about my "expertise" when the set I picked out by eyeballing it
is the only one that passed your little "motion" res test?:1:
It passed the deinterlacing test, but failed the 3:2 pulldown. I'd rather have the test than your eyeballs, because it has already been proven that you don't know what image quality really is.
pixelthis
05-15-2011, 01:09 PM
It passed the deinterlacing test, but failed the 3:2 pulldown. I'd rather have the test than your eyeballs, because it has already been proven that you don't know what image quality really is.
I have a lot better idea than you do; at least my monitor is not an antique (much like yourself).:1:
Sir Terrence the Terrible
05-16-2011, 09:08 AM
I have a lot better idea than you do; at least my monitor is not an antique (much like yourself).:1:
So what are you if I am younger than you....a fossil???
pixelthis
05-16-2011, 12:17 PM
So what are you if I am younger than you....a fossil???
SOME of us get seasoned as the years go by, some just get "old".:1:
Woochifer
05-16-2011, 03:28 PM
WRONG (as usual). In order to avoid 2:3 pulldown, BLU has a rate of 72 Hz.
EACH frame is shown three times (72 divided by three = 24). This gives a frame rate of 24fps without pulldown.
WHEN THIS happens my set displays 1080p24.
And which of your TVs has a 72 Hz refresh rate? Do the math. Unless you owned one of Pioneer's plasmas, there aren't any HDTVs out there with a native 72 Hz refresh rate.
Blu-ray is not a 72 Hz format. You keep repeating this nonsense over and over. You might believe this, but that doesn't make it true.
Pixelthis says 2006 = "a decade old".
Still irrelevant, like most of what you say.
And "a decade old" still doesn't apply to anything made in 2006. Nice try.
BECAUSE a "modern" player does not have component.
So, you're saying that your Funai player is the "modern" player, and the top-rated Oppo player isn't? :lol:
AND YOU WANT TO BUY A PLAYSTATION, and pay for a hacker's crack addiction,
Judging from the responses here, us PS3 owners aren't the ones on crack. :cool:
THEY COST THREE HUNDRED BUCKS. Yeah, this would be a great first player.
WHY BUY A three-hundred-dollar player when your TV is as old as dino bones?
And those models aren't the only ones still being made that have component outputs. If you're cheap, there are still plenty of current options to choose from.
IF IT'S NOT 1080P. This is where BLU shines, and if your set won't display it, then you're wasting your time with a BLU player.
So, anyone buying a 32" HDTV shouldn't bother with Blu-ray either? At that screen size, any difference between 720p and 1080p is negligible in normal viewing.
It doubles the resolution. THIS is the one time (deinterlacing) that you can get a res increase on a signal.
Utter BS. Deinterlacing does not do anything to the native resolution of the DVD format.
Because it doesn't. 1080i is downconverted and 480i is upconverted.
BOTH look rather decent. And similar.
And one is SD and the other is HD. Nothing at all similar about the two signals.
Not on my cable system. Without an adapter you can't get a signal.
And yet, the source signal getting fed to the cable system is still native 480i
AND DVD does still exist, and if you really wanted you could watch it on an old NTSC set, but why bother? And any set you watch it on that is HD is going to upconvert it whether you want it to or not. SO SURE it's natively 480i, but it would cost you to actually watch it in 480i.
And again, this is a native 480i signal. All of the processing in the world doesn't change this fundamental fact.
And an upscaled 480i signal is nowhere near the picture quality of a 1080p signal downscaled to 720p.
If your set is not 1080p, why waste your time? You need to be saving your coin to
buy a decent monitor.
Because if someone already has an HDTV and simply wants to enjoy Blu-rays, it's ridiculous to upgrade the TV first, given that whatever Blu-ray player they buy today will already have the HDMI output that you feel is so mandatory.
If they upgrade the TV next year or the year after, any Blu-ray player they buy right now for their older TV will work perfectly fine with their new TV.
Yes, they will enjoy a picture that looks like a really good DVD; they will have to pay 300+ TO GET A SONY OR AN OPPO to see this incredibly average picture on their obsolete set with component, when if they bought a new set, they could see a much better pic with a 100-buck (or less) BLU player. AND SPENDING 300 plus on a BLU player when your monitor is obsolete is not unwise, IT'S STUPID.
No, actually it's far more stupid to buy something that you don't need (i.e., a new TV) instead of something that you do need (i.e., a Blu-ray player). Plenty of Blu-ray choices in the $100 range will support that.
If anybody follows your lame advice, they will be decidedly unlucky.:1:
Yeah, most of us are unlucky to be happy enough with our TVs and video devices to enjoy them for several years before upgrading them. The lucky ones are those like you who buy a new TV or Blu-ray player every few months, right? :19:
Sir Terrence the Terrible
05-17-2011, 11:05 AM
SOME of us get seasoned as the years go by, some just get "old".:1:
If this is the case, you are just getting old. And these statements prove it.
WRONG (as usual). In order to avoid 2:3 pulldown, BLU has a rate of 72 Hz.
EACH frame is shown three times (72 divided by three = 24). This gives a frame rate
of 24fps without pulldown.
WHEN THIS happens my set displays 1080p24.
If you stated that your 42" LG set was the one that passed the 3:2 pulldown, that set is a 768p set, not a 1080p - and that set is only capable of a 60 Hz refresh rate, so you MUST have 3:2 pulldown and all the judder that goes with it. Blu-ray does not have a rate of 72 Hz; it does not have anything in any hertz. Blu-ray is a 24p 1080p carrier of video; it has no refresh rate, that is left up to the set.
AND YOU WANT TO BUY A PLAYSTATION, and pay for a hacker's crack addiction
These guys were not trying to feed a drug habit, they were trying to hurt and embarrass Sony. It worked on both counts, but you do need to get your facts straight. Those hackers did not diminish my use of my PS3 one bit; it does not depend on the PSN to function, and it can and has functioned quite well in the absence of the PSN. Your statement shows your ignorance on the subject.
It doubles the resolution. THIS is the one time (deinterlacing) that you can get a res increase on a signal.
Deinterlacing increases resolution??? That's rich! You cannot get more signal coming out than went in. Deinterlacing just brings two separate sequential fields into one frame. That does not increase resolution.
AND DVD does still exist, and if you really wanted you could watch it on an old NTSC set, but why bother? And any set you watch it on that is HD is going to upconvert it whether you want it to or not. SO SURE it's natively 480i, but it would cost you to actually watch it in 480i.
What would it cost you? The signal is 480i, and the display is 480i, so where is the loss? There is none; you are confused as hell.
Wooch has done very well in rebutting your misinformation. The fact that you think you are correct shows just how uninformed you are. Pix, you're a legend in the empty space between your ears.
pixelthis
05-17-2011, 02:12 PM
If this is the case, you are just getting old. And these statements prove it.
Not older, just better.
If you stated that your 42" LG set was the one that passed the 3:2 pulldown, that set is a 768p set, not a 1080p - and that set is only capable of a 60 Hz refresh rate, so you MUST have 3:2 pulldown and all the judder that goes with it. Blu-ray does not have a rate of 72 Hz; it does not have anything in any hertz. Blu-ray is a 24p 1080p carrier of video; it has no refresh rate, that is left up to the set.
MORE nonsense that doesn't matter.
Everything I HAVE READ ON THE PLANET...everything, states that 24p is achieved by a frequency of 72 Hz; each frame is shown three times for a total frame rate of 24p.
I DON'T care how they do it, it looks great. AND BLU HAS A RATE OF 24P?
What about all of the video I HAVE ON BLU THAT IS 60 Hz?
These guys were not trying to feed a drug habit, they were trying to hurt and embarrass Sony. It worked on both counts, but you do need to get your facts straight. Those hackers did not diminish my use of my PS3 one bit; it does not depend on the PSN to function, and it can and has functioned quite well in the absence of the PSN. Your statement shows your ignorance on the subject.
And your statement shows your ignorance of the criminal mind.
THESE morons saw an opportunity in the Japanese earthquake and all of the confusion it created, took advantage, and were wildly successful.
Deinterlacing increases resolution??? That's rich! You cannot get more signal coming out than went in. Deinterlacing just brings two separate sequential fields into one frame. That does not increase resolution.
You need to quit talking about things you don't have any comprehension of.
A 1080i pic has less than half of the res of a 1080p pic at any given time.
WHEN AN interlaced pic is deinterlaced (two fields stitched together), twice the picture info is shown on screen, and that pic is not subject to artifacts, like res loss when there's movement.
One of the reasons I BOUGHT MY FIRST 1080I HD set was to see DVD in 480p (my set had two native resolutions, 480p and 1080i).
THE 480P DVD pic, while SD, was a lot better than the 480i, because it had twice the picture info onscreen at any given time.
No, there was no "new" resolution created, just more pixels onscreen, creating a more detailed picture.
This is why progressive has replaced interlaced; progressive has more picture information on screen at any given time.
DENY reality all you want, pull any layman off of the street, ask ANYBODY on this board, and they will ALL tell you that a progressive pic is sharper, just plain better.
BECAUSE IT HAS MORE PICTURE INFORMATION THAN AN INTERLACED PIC!!!
What would it cost you. The signal is 480i, and the display is 480i, so where is the loss? There is none, you are confused as hell.
And you are ignorant as "hell".
My electronics teacher told me about forty years ago that any interlaced pic was going to lose up to half its resolution whenever there is movement. JOE KANE and others have said
it; it's common knowledge, it's something I HAVE TAKEN FOR GRANTED.
You don't even have to take my word for it: set a progressive panel next to an interlaced CRT, run a 480i DVD picture to the CRT and a progressive 480p to the panel.
EVERYBODY looking at the demo will tell you that the progressive is sharper and all-around better.
What I CAN'T BELIEVE is that I am arguing with a ninny that refuses to face the obvious.
Wooch has done very well in rebutting your misinformation. The fact that you think you are correct shows just how uninformed you are. Pix, you're a legend in the empty space between your ears.
"Wooch" knows juat enough to get himself into trouble.
IS THIS THE SAME Wooch that is running around saying that a component cable is better than an HDMI? Is this your champion?
FIGURES:1:
amarmistry
05-17-2011, 04:48 PM
After much consideration, last November I bought the Panasonic DMP-BD85K Blu-ray player, primarily because it was one of only two Blu-ray players under $200 with 7.1-channel analog audio.
I hooked it up directly to my projector and connected my audio (only 5.1 at present) to my receiver. If I had an HDMI receiver, then 7.1-channel analog audio in the Blu-ray player would not be as important. It does an excellent job at upconverting DVDs, and the sound is fantastic. Moreover, it comes with built-in wifi (actually it's an external USB adapter which comes free with the player).
Sir Terrence the Terrible
05-18-2011, 12:33 PM
Not older, just better
You are just slight..and I mean slightly better than an idiot.
MORE nonsense that doesn't matter.
Everything I HAVE READ ON THE PLANET...everything, states that 24p is achieved
by a freq of 72Hz; each frame is shown three times for a total frame rate of 24p.
I DON'T care how they do it, it looks great. AND blu HAS A RATE OF 24P?
What about all of the video I HAVE ON blu THAT IS 60HZ?
This is what I mean by your ignorance of this technology. 72Hz is a refresh rate. There is nothing within the video stream coming from the player that needs refreshing; that is the job of the set itself. 24p is a video stream - constant bits that, when unpacked, represent 24 frames a second. 72Hz is used on some plasmas (my Kuro has it) to reduce motion judder. There are some televisions with a 48Hz rate, but flicker is a problem with them.
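The relationship is easy to check for yourself: a set can show 24p by simple frame repetition only if its refresh rate is an integer multiple of 24; otherwise pulldown is forced. A minimal Python sketch (illustrative only; the function name is made up, not from any TV's firmware):

# Which refresh rates can show 24p by plain frame repetition?
def needs_pulldown(source_fps, refresh_hz):
    return refresh_hz % source_fps != 0

for hz in (48, 60, 72, 96, 120):
    if needs_pulldown(24, hz):
        print(f"{hz}Hz: 24p needs 3:2-style pulldown")   # only 60Hz here
    else:
        print(f"{hz}Hz: each frame repeated {hz // 24}x, no pulldown")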
and your statement shows your ignorance of the criminal mind.
THESE morons saw an opportunity after the Japanese earthquake and all of the confusion it created, took advantage, and were wildly successful.
This is not what happened. Anonymous is a hacker group that hacked into PSN as payback for Sony going after one of its members for releasing the PS3 jailbreak on the internet. This had nothing to do with the earthquake, because the servers they broke into were in San Diego, not Japan.
You should stay with issues you know about, which means you will have to leave this site permanently.
You need to quit talking about things you don't have any comprehension of.
A 1080i pic has less than half of the res of a 1080p pic at any given time.
WHEN AN interlaced pic is deinterlaced (two fields stitched together), twice the
picture info is shown on screen, and that pic is not subject to artifacts, like res loss when there's movement.
I already provided two links that prove that you don't know what you are talking about. So I won't bother to address this AGAIN!
One of the reasons I BOUGHT MY FIRST 1080I HD set was to see DVD in 480p (my
set had two native resolutions, 480p and 1080i).
THE 480P DVD pic, while SD, was a lot better than the 480i, because it had twice the picture info onscreen at any given time.
First, there are ZERO sets that sport a 480p native resolution. 1080i sets have 540p as an alternate display rate. If 480p were the native rate, the set would be throwing pixels away when trying to display full-screen DVDs. 540p is either the first or second sequential field of a 1080i image on a CRT set.
No, there was no "new" resolution created, just more pixels onscreen, creating a
more detailed picture.
This is why progressive has replaced interlaced, progressive has more picture information on screen at any given time.
If you cannot see the sequential fields being painted on the screen, then there is really no difference between interlaced images and progressive ones. Once again, the links I provided state pretty clearly that when it comes to Bluray, there is no difference between 1080i/p.
DENY reality all you want, pull any layman off of the street, ask ANYBODY on this board,
and they will ALL tell you that a progressive pic is sharper, just plain better.
BECAUSE IT HAS MORE PICTURE INFORMATION THAN AN INTERLACED PIC!!!
Unless you can see the television painting the sequential images on the screen, your eyes cannot tell the difference between 1080i and 1080p. The screen refreshes faster than we can see, so we never see two fields painted separately, or there would be too much flicker. When coming from a Bluray disc, there is no more "information" in 1080p than there is in 1080i. All 1080i does (if the set can take the native 1080p straight) is interlace the image into two sequential fields; it throws nothing away in the process. If your television cannot take the 1080p video stream direct, then the player does the interlacing and sends it to the set. No matter which way you slice it, there is no more information in 1080p than in 1080i. A sharper picture is not a result of more information; it is a result of not having to perform any post-processing on the incoming stream.
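The "throws nothing away" claim can be demonstrated mechanically. A small Python sketch (my own illustration, modeling a frame as a simple list of scan lines) splits a progressive frame into two fields and weaves them back; the caveat is that this round trip is only lossless when both fields come from the same instant, as with film-sourced content:

# Split a 1080p frame into two interlaced fields, then weave them back.
frame = [f"scanline {n}" for n in range(1080)]

top_field = frame[0::2]      # 540 odd-numbered lines
bottom_field = frame[1::2]   # 540 even-numbered lines

rebuilt = [None] * 1080
rebuilt[0::2] = top_field    # weave the fields back together
rebuilt[1::2] = bottom_field

assert rebuilt == frame      # the round trip discarded nothing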
And you are ignorant as "hell".
So what does that make you? Stupid and retarded as hell?
My electronics teacher told me about forty years ago that any interlaced pic was going to lose up to half its resolution whenever there is movement. JOE KANE and others have said
it, it's common knowledge, it's something I HAVE TAKEN FOR GRANTED.
40 years ago we were not talking about 1080i or 1080p. Now that we are, no, interlaced images do not lose half of their resolution with moving objects. It is all about what processing the 1080i set uses. My old dino, when placed into its interlace modes, only loses 100 lines of information with moving objects, as a result of using advanced motion processing. That is far better than all LCD panels, and equal to the best plasma panels. Part of why you take these things for granted is that your information is outdated and does not represent current technology.
You don't even have to take my word for it, set a progressive panel next to an interlaced
CRT, run a 480i DVD picture to the CRT and a progressive 480p to the panel.
EVERYBODY looking at the demo will tell you that the progressive is sharper and all
around better.
What I CAN'T BELIEVE is that I am arguing with a ninny that refuses to face the obvious.
Your testing method is pretty stupid, because you would have to account for the quality of the deinterlacing chips in the test. You are going to dismiss this, but considering that not all progressive players can accurately deinterlace an image, your test would be more about the quality of the deinterlacing chips than the actual capability of the set or the images.
The real test is to run two identical 1080p video streams to a 1080p and a 1080i set, and see if we can see a difference. I have already done this using a single set in "butterfly" (split-screen) mode, and nobody could see a difference between the two on my set. They can't, because "perceptively" they are seeing identical images (they cannot see the refresh rate in action), and the amount of resolution going to each input is exactly the same. As the link I posted says, there is no difference between 1080p and 1080i when presented a 1080p signal.
"Wooch" knows juat enough to get himself into trouble.
IS THIS THE SAME Wooch that is running around saying that a component cable is better than an HDMI? Is this your champion?
FIGURES:1:
Wooch takes many things into consideration, and goes through great detail to explain his reasoning. You don't. You gloss over all detail in favor of absolutes, and absolutes don't exist when you actually address the detail.
Woochifer
05-18-2011, 01:09 PM
"Wooch" knows juat enough to get himself into trouble.
Actually, I know more than enough to get you wound up into knots by pointing out the errors and contradictions in your posts. I'm not the one trying to claim that deinterlacing a 480i source "doubles" the resolution, or claiming that there's no difference between a 480i source and 1080p source when viewed on a "720p" TV.
IS THIS THE SAME Wooch that is running around saying that a component cable is better than an HDMI?
Ah, back to distortions and lies to win a strawman argument. Bravo! Either that or you got me confused with some other guy named Wooch. :lol:
E-Stat
05-18-2011, 01:32 PM
72hz is a refresh rate...
Here's a genuine thanks for the detailed and informative posts. More greenies to you! When it comes to video matters, I'm the first to admit that I know what I don't know. Apparently, Pix isn't there yet. :)
rw
pixelthis
05-22-2011, 09:36 AM
Actually, I know more than enough to get you wound up into knots by pointing out the errors and contradictions in your posts. I'm not the one trying to claim that deinterlacing a 480i source "doubles" the resolution, or claiming that there's no difference between a 480i source and 1080p source when viewed on a "720p" TV.
Oh, gee, then every DVD player ON EARTH has a totally unnecessary feature, namely progressive playback.
IF a 480p picture is no improvement, then why bother?
FACT is that with a 480i pic you have 240 lines on the screen
at any given time, then another 240 is painted between
them.
IF THERE'S MOVEMENT, resolution drops to as low as 240 lines.
Deinterlace the pic and 480 lines are painted onscreen one after
the other in a progressive manner.
No, there is no "increase" in res, but there is more resolution,
because the full resolution is displayed.
Really, why do you insist on showing your ignorance?
EVEN ANY LAYMAN, looking at a 480i and a 480p pic side by side, will be able to tell which is superior.
BUT YOU ARE stating there is no difference, basically like stating the world is flat.
SO why have deinterlacing DVD players at all?
Ah, back to distortions and lies to win a strawman argument. Bravo! Either that or you got me confused with some other guy named Wooch. :lol:
NO, YOU ARE the same know-nothing who believes that a
component cable is better than HDMI, but I bet you use HDMI on your setup. WHICH makes you a hypocrite and a liar,
not to mention uninformed.:1:
pixelthis
05-22-2011, 09:39 AM
Here's a genuine thanks for the detailed and informative posts. More greenies to you! When it comes to video matters, I'm the first to admit that I know what I don't know. Apparently, Pix isn't there yet. :)
rw
THE FACT that you applaud a know-nothing like talky shows
that you most certainly are not "there" yet.
TALK about the ignorant leading the unknowing.:1:
THE FACT that you applaud a know-nothing like talky shows
that you most certainly are not "there" yet.
TALK about the ignorant leading the unknowing.:1:
Maybe if you took as much time to google corroborating links to your claims as you do for gifs and jpegs for your posts, along with not typing like you still ride the Short Bus, someone might just take you seriously.
You realize just how easy it is to figure out whose claims have more credible backing than the other, don't you?
pixelthis
05-22-2011, 11:14 AM
You are just slight..and I mean slightly better than an idiot.
WHICH PUTS me about seven levels above you in the evolutionary chain.
This is what I mean by your ignorance of this technology. 72Hz is a refresh rate. There is nothing within the video stream coming from the player that needs refreshing; that is the job of the set itself. 24p is a video stream - constant bits that, when unpacked, represent 24 frames a second. 72Hz is used on some plasmas (my Kuro has it) to reduce motion judder. There are some televisions with a 48Hz rate, but flicker is a problem with them.
The "refresh" rate is the frequency (the rate that a picture is refreshed. EVERY video source on the planet has a frequency, usually 60hz THIS is the number of frames a second,
THIS is pretty basic stuff, and all you have to do is look on the
back of a BLU ray box with video content to see that its
60hz. OR MAYBE YOU HAVE READING COMPREHENSION SKILLS.
Any BLU box that contains a film states on the back that
the frame rate is 24p.
This is each frame three times, 72hz, which divided by three
is 24, which is shown in progressive format, 24p.
This is because the freq of a film is 60hz, has to be to be compatible with most sets. BUT IF a set with 24p is detected
a freq of 72hz is used, with each frame shown three times.
This is shown as 24p.
This is what I HAVE READ from every bit of info concerning this subject, BLU uses a frequency rate of 72hz to show 24p, or
24x3, which is shown as 24p.
Video source material is shown at 60hz.
This is not what happened. Anonymous is a hacker group that hacked into PSN as payback for Sony going after one of its members for releasing the PS3 jailbreak on the internet. This had nothing to do with the earthquake, because the servers they broke into were in San Diego, not Japan.
But the company affected was Japanese; it doesn't matter if the servers were on MARS.
Turns out that Sony was using outdated Apache software, with
no firewall, making them a target of opportunity.
You need to stick with what you know about, which is nothing.
You should stay with issues you know about, which means you will have to leave this site permanently.
You should try using your brain every once in a while, and come back to this site in about fifty years.
I already provided two links that prove that you don't know what you are talking about. So I won't bother to address this AGAIN!
Thanks, I HAVE HEARD ENOUGH OF YOUR NONSENSE.
First, there are ZERO sets that sport a 480p native resolution. 1080i sets have 540p as an alternate display rate. If 480p were the native rate, the set would be throwing pixels away when trying to display full-screen DVDs. 540p is either the first or second sequential field of a 1080i image on a CRT set.
Here's a clue, ace: nobody gives a hoot about what an antique
like CRT does.
My last CRT (about half a decade ago) had native resolutions of 480p and 1080i, sorry if my info on obsolete tech is a bit scant.
I DON'T HAVE TIME TO WASTE ON HORSE AND BUGGY.
If you cannot see the sequential fields being painted on the screen, then there is really no difference between interlaced images and progressive ones. Once again, the links I provided state pretty clearly that when it comes to Bluray, there is no difference between 1080i/p.
YOU have been stating this nonsense ever since I have been on this site.
It's mostly a justification for your obsolete 1080i CRT rig.
BUT if there were "no" difference between an interlaced and
a progressive pic, then there would not be an army of DVD players
out there with progressive scan.
THE ACTUAL TRUTH is that most sets are 1080p these days,
and it doesn't matter much what the res of the source material
is. 480i is going to be upconverted to 1080p or 720p.
1080i is going to be deinterlaced to 1080p or 720p.
AND YOUR INTERLACED gear is going to be just as obsolete
tomorrow as it is today, and when there's movement the res is
going to be reduced by up to half.
Unless you can see the television painting the sequential images on the screen, your eyes cannot tell the difference between 1080i and 1080p. The screen refreshes faster than we can see, so we never see two fields painted separately, or there would be too much flicker. When coming from a Bluray disc, there is no more "information" in 1080p than there is in 1080i. All 1080i does (if the set can take the native 1080p straight) is interlace the image into two sequential fields; it throws nothing away in the process. If your television cannot take the 1080p video stream direct, then the player does the interlacing and sends it to the set. No matter which way you slice it, there is no more information in 1080p than in 1080i. A sharper picture is not a result of more information; it is a result of not having to perform any post-processing on the incoming stream.
Total nonsense.
THERE IS A FUNDAMENTAL difference between an interlaced pic and a progressive pic.
An interlaced pic loses up to half its res whenever there's movement.
I KEEP SAYING THIS, along with the likes of Joe Kane
and others who actually know something about how video acts
and its nature.
IS AN INTERLACED and a progressive pic the same?
YEP, until there's movement. You can deny this, but a
1080p set has two million pixels at any given time, twice as much
as an interlaced pic.
Interlaced picture tech was a good way to get around limited
broadcast spectrum in its time; today it's a great way to send video and save
space while doing it, but as a display tech, it's obsolete.
EVEN 720 LINES progressive is better than 1080 lines interlaced,
which is why Fox and ABC broadcast in 720p; at the time they started, most sets were still interlaced, and the 720p
was an improvement. PROGRESSIVE IS ALWAYS AN
IMPROVEMENT over interlaced.
That's why most sets on the planet (except for a few antiques
like yours) are progressive, and why most DVD players have
progressive scan.
ANYBODY looking at a progressive and an interlaced pic side
by side who is honest will pick the progressive every
time.
GET OVER IT.
So what does that make you? Stupid and retarded as hell?
If I AM THEN YOU ARE IN BAD SHAPE, because I HAVE FORGOTTEN MORE ON THIS than you have ever known.
40 years ago we were not talking about 1080i or 1080p. Now that we are, no, interlaced images do not lose half of their resolution with moving objects. It is all about what processing the 1080i set uses. My old dino, when placed into its interlace modes, only loses 100 lines of information with moving objects, as a result of using advanced motion processing. That is far better than all LCD panels, and equal to the best plasma panels. Part of why you take these things for granted is that your information is outdated and does not represent current technology.
Keep whistling past the graveyard, ace, you know I AM
RIGHT.
Really, what's the point of using thousands of dollars of tech
to bring an obsolete CRT system barely up to date, when a
thousand-dollar panel from WALMART still outperforms
it by a mile? Nostalgia?
AND YOU STILL HAVE TO LOOK AT YOUR SYSTEM IN THE DARK to even see the darn thing.
Totally irrelevant to 99% of HT enthusiasts, who can't see spending megabucks on an inferior system that can't
outperform an average VIZIO.
Your testing method is pretty stupid, because you would have to account for the quality of the deinterlacing chips in the test. You are going to dismiss this, but considering that not all progressive players can accurately deinterlace an image, your test would be more about the quality of the deinterlacing chips than the actual capability of the set or the images.
Deinterlacing chips are of course cheap, but that's more a function of mass production than quality.
DEINTERLACING is very basic and not difficult anyway, and the most poorly deinterlaced pic is still going to look better than
any given interlaced pic.
The real test is to run two identical 1080p video streams to a 1080p and a 1080i set, and see if we can see a difference. I have already done this using a single set in "butterfly" (split-screen) mode, and nobody could see a difference between the two on my set. They can't, because "perceptively" they are seeing identical images (they cannot see the refresh rate in action), and the amount of resolution going to each input is exactly the same. As the link I posted says, there is no difference between 1080p and 1080i when presented a 1080p signal.
Of course there's no difference between a deinterlaced 1080i and a native 1080p signal, because both are shown progressive.
AND YOU CAN split-screen an image all you want; the total image will either be interlaced or progressive.
1080p SETS were more expensive than interlaced sets, but they
still won in the marketplace, because any unbiased person will
see the obvious superiority of the progressive picture.
You can talk nonsense all day long about interlaced and progressive being the "same", but nobody sides with you.
WHEN MAKING THE MOST IMPORTANT VOTE,
with their wallet, they chose progressive every time, which
put interlaced displays on the scrap heap.
ARGUE nonsense all you want, you can't argue with that.
Wooch takes many things into consideration, and goes through great detail to explain his reasoning. You don't. You gloss over all detail in favor of absolutes, and absolutes don't exist when you actually address the detail.
THAT'S because his "reasoning" is perception and opinion;
my statements are from training and stated fact.
WOOCH IS QUITE GOOD at blowing sunshine up your skirt,
but most of what he says falls apart on close inspection,
like the nonsense that a "component cable" is better than an HDMI cable.:1:
Sir Terrence the Terrible
05-22-2011, 03:52 PM
WHICH PUTS me about seven levels above you in the evolutionary chain.
You are not even in the evolutionary chain.
The "refresh" rate is the frequency (the rate that a picture is refreshed. EVERY video source on the planet has a frequency, usually 60hz THIS is the number of frames a second,
THIS is pretty basic stuff, and all you have to do is look on the
back of a BLU ray box with video content to see that its
60hz. OR MAYBE YOU HAVE READING COMPREHENSION SKILLS.
Any BLU box that contains a film states on the back that
the frame rate is 24p.
This is each frame three times, 72hz, which divided by three
is 24, which is shown in progressive format, 24p.
This is because the freq of a film is 60hz, has to be to be compatible with most sets. BUT IF a set with 24p is detected
a freq of 72hz is used, with each frame shown three times.
This is shown as 24p.
This is what I HAVE READ from every bit of info concerning this subject, BLU uses a frequency rate of 72hz to show 24p, or
24x3, which is shown as 24p.
Video source material is shown at 60hz.
Since you have firmly established that you cannot read, it's no wonder that you have this all wrong. You are so stupid, you cannot separate what the Bluray player does from what the television does. The disc is encoded with a 24fps 1080p visual encode. The player reads the disc, and passes the 24fps/1080p data stream to the television set. Since a 60Hz television cannot reproduce a 24p frame rate, 3:2 pulldown must be used. If the LCD set is a 120Hz set, it will frame-interpolate or frame-repeat (24x5=120) the 24fps material to match the television's refresh rate of 120Hz. If the television is a plasma, it can double the frame rate to 48Hz. The problem with that is that it introduces flicker. Some plasmas use 72Hz, and some use 96Hz, and both of these rates avoid flicker issues. The Bluray PLAYER sends out a pure 24p frame rate, and the TELEVISION adds the necessary refresh rates that have been programmed within it. No Bluray player sends out a 72Hz frame rate, NOT ONE IN THE WORLD!
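The 3:2 cadence itself is simple enough to sketch. Here is a minimal Python illustration (my own sketch, not anything from a player's firmware) of how 24 film frames become the 60 fields per second a 60Hz set expects:

# 3:2 pulldown: hold each 24fps film frame for 3 fields, then 2,
# alternating, to fill the 60 fields per second a 60Hz set draws.
def pulldown_32(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

one_second_of_film = [f"F{n}" for n in range(24)]
print(len(pulldown_32(one_second_of_film)))   # 60: twelve frames x3 plus twelve x2

The uneven hold times in that cadence are exactly where pulldown judder comes from, which is why refresh rates that are clean multiples of 24 (72Hz, 96Hz, 120Hz) can skip it entirely.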
But the company affected was Japanese; it doesn't matter if the servers were on MARS.
Turns out that Sony was using outdated Apache software, with
no firewall, making them a target of opportunity.
You need to stick with what you know about, which is nothing.
Japan is not a company, egghead. Sony is the company; its headquarters are in Japan. No servers in Japan were affected; it was the servers in San Diego. Specifically, Sony Online Entertainment servers were the ones attacked, not any in Japan. Get your sh!t straight, foo.
Remember, stupid, you were the one that mentioned an earthquake and tsunami, events that had ZERO to do with the break-in.
You should try using your brain every once in a while, and come back to this site in about fifty years.
Even if I did use it once in a while, it would be used far more than yours.
Thanks, I HAVE HEARD ENOUGH OF YOUR NONSENSE.
Since I have not spoken a damn thing to ya, you have not heard anything. It seems that READING the truth causes your airhead to lose its pressure. Looks like you will have a raisin head pretty soon.
Here's a clue, ace: nobody gives a hoot about what an antique
like CRT does.
My last CRT (about half a decade ago) had native resolutions of 480p and 1080i, sorry if my info on obsolete tech is a bit scant.
I DON'T HAVE TIME TO WASTE ON HORSE AND BUGGY.
BS. No CRT has ever had a native resolution of 480p. It would be throwing away information when displaying a full-screen 16x9 image. All rear-projection CRT televisions have 540p and 1080i as native scan rates, as 540p is exactly one field of the 1080i image.
YOU have been stating this nonsense ever since I have been on this site.
It's mostly a justification for your obsolete 1080i CRT rig.
I don't have a 1080i CRT rig. My CRT-based rear-projection set does not have a native rate. It can take an incoming 480i/p to 1080i/p signal and display it at that resolution. 480i/p is displayed as 480i/p. 720p is displayed as 720p. 1080i/p is displayed as 1080i/p. That is why it is not obsolete and never will be. It can display signals up to 1440p, so it will never be obsolete. It has more resolution than your little 42" 768p LCD set, that is for sure. So whose set is obsolete? It would probably be yours - it cannot even display full HD (that would be 1080p).
BUT if there were "no" difference between an interlaced and
a progressive pic, then there would not be an army of DVD players
out there with progressive scan.
For your information, stupid, DVD should always be displayed progressively, or line-doubled. Its resolution is so low that aliasing, line crawl, and line twitter become a problem if it remains interlaced. That is why most DVD players are progressive. When you get to 1080 lines of resolution, it does not matter if it is interlaced or deinterlaced; you don't get line twitter or aliasing crawl. The offset of the two interlaced fields is too fine to see.
THE ACTUAL TRUTH is that most sets are 1080p these days,
and it doesn't matter much what the res of the source material
is. 480i is going to be upconverted to 1080p or 720p.
1080i is going to be deinterlaced to 1080p or 720p.
In your particular case, your set has to throw away lines, and downconvert 1080p to 768p. My Dino set has more resolution than that!
AND YOUR INTERLACED gear is going to be just as obsolete
tomorrow as it is today, and when there's movement the res is
going to be reduced by up to half.
I don't have any interlaced gear, and even progressively scanned sets lose up to half of their resolution during motion. Your set drops down to 330 lines (from 768) when images move. Tests prove this, and oh look, that's more than half!
Total nonsense.
THERE IS A FUNDAMENTAL difference between an interlaced pic and a progressive pic.
An interlaced pic loses up to half its res whenever there's movement.
I KEEP SAYING THIS, along with the likes of Joe Kane
and others who actually know something about how video acts
and its nature.
You are a bald-faced liar; Joe Kane never said this. The only way an interlaced set would lose half of its resolution during movement is if you could see each field being painted separately. Since we cannot, it does not. The reason that LCD panels lose so much information during moving images stems from the fact that the pixels do not switch on and off fast enough to keep up with the motion. Since plasmas use fast-switching phosphor technology, they do not lose much resolution at all.
IS AN INTERLACED and a progressive pic the same?
YEP, until there's movement. You can deny this, but a
1080p set has two million pixels at any given time, twice as much
as an interlaced pic.
Until your eyes can see each field being updated (which they cannot), 1080i and 1080p are perceptively the same resolution. My links prove this, and prove you to be the dumb idiot that you are.
Interlaced picture tech was a good way to get around limited
broadcast spectrum in its time; today it's a great way to send video and save
space while doing it, but as a display tech, it's obsolete.
EVEN 720 LINES progressive is better than 1080 lines interlaced,
which is why Fox and ABC broadcast in 720p; at the time they started, most sets were still interlaced, and the 720p
was an improvement.
ABC and Fox are major sports programmers; that is why they chose 720p. When it comes to fast-moving objects, progressive is better. 1080i is better for stations that mostly show films, because you are not losing resolution just to support fast-moving objects.
720p = 1280x720, or 921,600 pixels
1080i/p = 1920x1080, or 2,073,600 pixels
720p has far fewer pixels even when all are shown at once, which means less resolution than 1080i/p.
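For anyone who wants to verify the multiplication, a trivial Python check (illustrative only):

# Pixels per frame for the two formats being argued about.
print(1280 * 720)    # 921600
print(1920 * 1080)   # 2073600 -- about 2.25x the pixel count of 720p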
PROGRESSIVE IS ALWAYS AN
IMPROVEMENT over interlaced.
That's why most sets on the planet (except for a few antiques
like yours) are progressive, and why most DVD players have
progressive scan.
ANYBODY looking at a progressive and an interlaced pic side
by side who is honest will pick the progressive every
time.
GET OVER IT.
Once again, a response devoid of detail and context. 480p is definitely an improvement over 480i. At such low resolution, the images should be deinterlaced or line-doubled to keep on-screen diagonal lines straight. When you get to 1080i/p, one is barely sharper than the other. Since I have already "butterflied" 1080i and 1080p images on the same set, I can tell you for a fact you can barely tell them apart. So you are once again a liar; people will not always be able to tell them apart. It is not as easy as you are making it, and that is for sure. I seriously doubt you would pass a DBT on this.
So now you get over it, liar.
If I AM THEN YOU ARE IN BAD SHAPE, because I HAVE FORGOTTEN MORE ON THIS than you have ever known.
Based on your responses, pix, you have not even learned what you claimed to have forgotten. 72Hz to describe a native frame rate? Bluray disc is a 72Hz technology? LOLOLOLOLOLOLOL, oh yeah, you truly have forgotten something. You don't even know a refresh rate from a frame rate, nor your ass from a hole in the ground.
Keep whistling past the graveyard, ace, you know I AM
RIGHT.
Really, what's the point of using thousands of dollars of tech
to bring an obsolete CRT system barely up to date, when a
thousand-dollar panel from WALMART still outperforms
it by a mile? Nostalgia?
If my CRT can display a full 1080p image, and your panel can only display a 768p image, which of these is really obsolete? No thousand-dollar panel can display a 1440p image, but my CRT can. With moving images, my CRT maintains 1000 lines of resolution, your panel a measly 330 lines. Your set has very mediocre black levels, and only fair dynamic contrast. My set has such deep blacks that with the lights off, you cannot see the set at all. It also has excellent dynamic contrast, as all CRTs have over LCD. My set can display a totally accurate Rec. 709 color gamut; your set cannot, and no LCD can.
I think you have this twisted. My dino set can outperform any consumer based LCD set easily.
AND YOU STILL HAVE TO LOOK AT YOUR SYSTEM IN THE DARK to even see the darn thing.
Totally irrelevant to 99% of HT enthusiasts, who can't see spending megabucks on an inferior system that can't
outperform an average VIZIO.
Since all films are made to be viewed in the dark, you have no point here. When you go to a movie theater, do you watch films with the lights on? No you don't, so the only point you made sits conveniently on your head.
A Vizio cannot even outperform your mediocre LG set, let alone my custom CRT big screen.
Deinterlacing chips are of course cheap, but that's more a function of mass production than quality.
DEINTERLACING is very basic and not difficult anyway, and the most poorly deinterlaced pic is still going to look better than
any given interlaced pic.
If deinterlacing is so basic, then how come so many televisions don't do it well? The Toshiba 46UX600U and at least 14 other LCD panels tested by DisplayMate all show mediocre to poor deinterlacing performance with 1080i and 480i test material. It is not as easy as you think, foo.
Of course there's no difference between a deinterlaced 1080i and a native 1080p signal, because both are shown progressive.
AND YOU CAN split-screen an image all you want; the total image will either be interlaced or progressive.
1080p SETS were more expensive than interlaced sets, but they
still won in the marketplace, because any unbiased person will
see the obvious superiority of the progressive picture.
You can talk nonsense all day long about interlaced and progressive being the "same", but nobody sides with you.
WHEN MAKING THE MOST IMPORTANT VOTE,
with their wallet, they chose progressive every time, which
put interlaced displays on the scrap heap.
ARGUE nonsense all you want, you can't argue with that.
Sorry pix, but the whole point of a butterfly test is to show each image at its native rate. So on my set, the left image can be progressive, and the right split image can be interlaced. When viewed that way, nobody in the room could reliably guess which was progressive and which was interlaced - the two looked too similar for that.
I am not interested in who sides with me - I am interested in exposing your lies, misinformation, and just general BS. Since this constitutes all of your posts, I am a busy man.
THAT'S because his "reasoning" is perception and opinion;
my statements are from training and stated fact.
Exactly what training and facts have you had on Bluray, or 1080i/p? They weren't even thought of when you were in high school. All of your training and supposed stated fact is based on the single-gun CRT and 480i images - two of the lowest forms of image and display technology there are. You have ZERO training with Bluray, and that is demonstrated by your 72Hz comment and explanation. The only 1080p images you have seen are at the electronics store, because your display cannot do it.
Wooch has posted links to support his arguments; you have posted your uninformed opinion and that is all. Any person with an ounce of critical thinking would choose Wooch's posts over yours. You have not provided enough proof to support a training bra.
WOOCH IS QUITE GOOD at blowing sunshine up your skirt,
but most of what he says falls apart on close inspection,
like the nonsense that a "component cable" is better than an HDMI cable.:1:
He never said any such thing, liar. Pix, I just saw your nose go past my window and down the street. From this thread alone, you have told enough lies for that beak of yours to stretch from Alabama to San Leandro, California. It seems that your nose needs the same kind of work that Bristol Palin's jaw got. Perhaps those tornadoes have more of an effect than you are willing to admit.....
pixelthis
05-22-2011, 05:19 PM
You are not even in the evolutionary chain.
Since you have firmly established that you cannot read, it's no wonder that you have this all wrong. You are so stupid, you cannot separate what the Bluray player does from what the television does. The disc is encoded with a 24fps 1080p visual encode. The player reads the disc, and passes the 24fps/1080p data stream to the television set. Since a 60Hz television cannot reproduce a 24p frame rate, 3:2 pulldown must be used. If the LCD set is a 120Hz set, it will frame-interpolate or frame-repeat (24x5=120) the 24fps material to match the television's refresh rate of 120Hz. If the television is a plasma, it can double the frame rate to 48Hz. The problem with that is that it introduces flicker. Some plasmas use 72Hz, and some use 96Hz, and both of these rates avoid flicker issues. The Bluray PLAYER sends out a pure 24p frame rate, and the TELEVISION adds the necessary refresh rates that have been programmed within it. No Bluray player sends out a 72Hz frame rate, NOT ONE IN THE WORLD!
Your keepers need to educate you better.
My set is 60Hz, but displays 24p.
A BLU player might output 24p, but does it by means of a 72Hz
frame rate, eliminating the need for 3:2 pulldown.
The entire reason for a 24p frame rate was to eliminate 3:2 pulldown, which sometimes caused artifacts in DVD playback.
No need for 24p if you need 3:2 pulldown anyway.
VIDEO-sourced material from BLU is 60Hz; 24p is 72Hz,
which is 60Hz in a 60Hz set, 24p in a 24p-capable set.
This may be wrong, but if it is, then the dozens of articles and
net pages I HAVE READ ON THE SUBJECT ARE WRONG ALSO.
Japan is not a company, egghead. Sony is the company; its headquarters are in Japan. No servers in Japan were affected; it was the servers in San Diego. Specifically, Sony Online Entertainment servers were the ones attacked, not any in Japan. Get your sh!t straight, foo.
On the net it doesn't matter where anything is "located".
THE SERVERS COULD BE LOCATED in someone's closet
under some dirty clothes... DOESN'T MATTER.
BS. No CRT has ever had a native resolution of 480p. It would be throwing away information when displaying a full-screen 16x9 image. All rear-projection CRT televisions have 540p and 1080i as native scan rates, as 540p is exactly one field of the 1080i image.
Again you show your ignorance.
MY FIRST HD set was a 47" PANASONIC RPTV with two native resolutions, 1080i and 480p.
HD was rare then; I PRIMARILY purchased it to watch DVDs in 480p. THAT WAS THE MAIN REASON FOR the 480p resolution,
so once again you are exposed for the know-nothing that you
really are.
I don't have a 1080i CRT rig. My CRT-based rear-projection set does not have a native rate. It can take an incoming 480i/p to 1080i/p signal and display it at that resolution. 480i/p is displayed as 480i/p. 720p is displayed as 720p. 1080i/p is displayed as 1080i/p. That is why it is not obsolete and never will be. It can display signals up to 1440p, so it will never be obsolete. It has more resolution than your little 42" 768p LCD set, that is for sure. So whose set is obsolete? It would probably be yours - it cannot even display full HD (that would be 1080p).
What's sad is that you really think any of this is relevant.
THE ONE ADVANTAGE OF CRT is that it has no native resolution,
and this matters NOT AT ALL.
This is because my 1080p (not 768p) set upconverts or deinterlaces all resolutions to its native resolution.
TRUTH IS your security blanket - excuse me, museum piece
of a science project - is as obsolete as you are.
The future of projection video is DLP; for most panels, OLED.
The future of CRT... the junkyard.
For your information, stupid, DVD should always be displayed progressively, or line-doubled. Its resolution is so low that aliasing, line crawl, and line twitter become a problem if it remains interlaced. That is why most DVD players are progressive. When you get to 1080 lines of resolution, it does not matter if it is interlaced or deinterlaced; you don't get line twitter or aliasing crawl. The offset of the two interlaced fields is too fine to see.
And the fact that the quality advantage of a 1080p set is so superior that people paid more for it, rendering 1080i obsolete,
affects you not at all. YOU DO REALIZE THAT MOST people READING
this at one time had to make that decision... AND CHOSE 1080P. Because a progressive image will always be better than an interlaced one. AND those who chose 1080p to the point of making 1080i obsolete realize just what a know-nothing you actually are.
In your particular case, your set has to throw away lines, and downconvert 1080p to 768p. My Dino set has more resolution than that!
WHY DO YOU INSIST on calling my 1080p set 768p when even you know better? Just to get my goat? OR IS it just ignorance
on your part? I GO WITH THE IGNORANCE.
I don't have any interlaced gear, and even progressively scanned sets lose up to half of their resolution during motion. Your set drops down to 330 lines (from 768) when images move. Tests prove this, and oh look, that's more than half!
Actually they don't, which is why the world went progressive.
AGAIN WITH THE 768; you are even more delusional than usual.
You are a bald-faced liar; Joe Kane never said this. The only way an interlaced set would lose half of its resolution during movement is if you could see each field being painted separately. Since we cannot, it does not. The reason that LCD panels lose so much information during moving images stems from the fact that the pixels do not switch on and off fast enough to keep up with the motion. Since plasmas use fast-switching phosphor technology, they do not lose much resolution at all.
NOT JUST JOE KANE, but everybody else. I READ IT IN FREQUENT ARTICLES IN WIDESCREEN REVIEW
and about half a dozen other places.
Joe KANE was a champion of 720p at the start of the HD
transition, because after allowing for the resolution loss from
1080i, 720p actually had more real resolution.
IT WAS JOE KANE and others that influenced ABC and FOX
to go 720p, for the very reason of resolution loss among interlaced formats.
GET A CLUE; do I CONSTANTLY have to educate you out of ignorance?
Until your eyes can see each field being updated (which they cannot), 1080i and 1080p are perceptively the same resolution. My links prove this, and prove you to be the dumb idiot that you are.
AND THIS IS HOGWASH, as I SAID EARLIER.
Not that it matters, as INTERLACED sets are EXTINCT.
So what the heck does it matter now?
ABC and Fox are major sports programmers; that is why they chose 720p. When it comes to fast-moving objects, progressive is better. 1080i is better for stations that mostly show films, because you are not losing resolution just to support fast-moving objects.
WHEN it comes to "fast-moving objects" AND IN EVERY OTHER AREA.
720p = 1280x720, or 921,600 pixels
1080i/p = 1920x1080, or 2,073,600 pixels
720p has far fewer pixels even when all are shown at once, which means less resolution than 1080i/p.
Not really.
1080i is two FIELDS interlaced to make one frame at any given time.
OF THE 1080 "LINES" you see on a 1080i pic, 540 are rapidly fading "image retention" and the others are the last "field".
On an interlaced pic, at any given time only 540 lines are available,
so there are only about a million or so pixels onscreen at
any given moment, and the whole illusion falls apart when there's movement.
AND again... it doesn't matter, because 1080i is, for all
practical purposes... EXTINCT.
Once again, a response devoid of detail and context. 480p is definitely an improvement over 480i. At such low resolution, the images should be deinterlaced or line-doubled to keep on-screen diagonal lines straight. When you get to 1080i/p, one is barely sharper than the other. Since I have already "butterflied" 1080i and 1080p images on the same set, I can tell you for a fact you can barely tell them apart. So you are once again a liar; people will not always be able to tell them apart. It is not as easy as you are making it, and that is for sure. I seriously doubt you would pass a DBT on this.
You know, one of these days even you will figure out that it's
IRRELEVANT, because 1080i is EXTINCT.
Pretty much like your dino set.
Based on your responses, pix, you have not even learned what you claimed to have forgotten. 72Hz to describe a native frame rate? Bluray disc is a 72Hz technology? LOLOLOLOLOLOLOL, oh yeah, you truly have forgotten something. You don't even know a refresh rate from a frame rate, nor your ass from a hole in the ground.
EXCUSE me if I BELIEVE every HT magazine I HAVE EVER
read on the subject instead of you.
If my CRT can display a full 1080p image, and your panel can only display a 768p image, which of these is really obsolete? No thousand-dollar panel can display a 1440p image, but my CRT can. With moving images, my CRT maintains 1000 lines of resolution, your panel a measly 330 lines. Your set has very mediocre black levels, and only fair dynamic contrast. My set has such deep blacks that with the lights off, you cannot see the set at all. It also has excellent dynamic contrast, as all CRTs have over LCD. My set can display a totally accurate Rec. 709 color gamut; your set cannot, and no LCD can.
Well, you start out with the wrong res of my panel and go downhill from there.
My set is a full 1080p; you can watch it in a lit room if you like,
and 1440p would be relevant if you could rent 1440p FROM
the video store, which you can't.
BRAG ON your antique all you want; the measurements you
cite as spectacular are actually a very slight improvement over a
panel from WALMART. IT HELPS you massage your ego as you explain it to those who don't notice and don't really care.
Truth is, any cheap DLP (the true comparison for your piece of junk) will blow its doors off quite handily.
I think you have this twisted. My dino set can outperform any consumer based LCD set easily.
In ways nobody gives a hoot about.
Since all films are made to be viewed in the dark, you have no point here. When you go to a movie theater, do you watch films with the lights on? No you don't, so the only point you made sits conveniently on your head.
This statement is so ignorant I AM GOING TO GIVE IT A PASS.
A Vizio cannot even outperform your mediocre LG set, let alone my custom CRT big screen.
It can't outperform my LG, but it can outperform your science project
in all the ways that matter - like, you can actually buy one.
If deinterlacing is so basic, then how come so many televisions don't do it well? The Toshiba 46UX600U and at least 14 other LCD panels tested by DisplayMate all show mediocre to poor deinterlacing performance with 1080i and 480i test material. It is not as easy as you think, foo.
Easier than it is for you to spell "fool".
Sorry pix, but the whole point of a butterfly test is to show each image at its native rate. So on my set, the left image can be progressive, and the right split image can be interlaced. When viewed that way, nobody in the room could reliably guess which was progressive and which was interlaced - the two looked too similar for that.
WELL, your display is so dim it's no wonder they couldn't tell.
I am not interested in who sides with me - I am interested in exposing your lies, misinformation, and just general BS. Since this constitutes all of your posts, I am a busy man.
SLANDER all you want, you know I AM TELLING THE TRUTH.
And cry for your CRT glory days; they are gone forever,
obsoleted by more advanced tech.
Exactly what training and facts have you had on Bluray, or 1080i/p? They weren't even thought of when you were in high school. All of your training and supposed stated fact is based on the single-gun CRT and 480i images - two of the lowest forms of image and display technology there are. You have ZERO training with Bluray, and that is demonstrated by your 72Hz comment and explanation. The only 1080p images you have seen are at the electronics store, because your display cannot do it.
AND YOUR BRAIN CANNOT process a single intelligent thought.
Wooch has posted links to support his arguments; you have posted your uninformed opinion and that is all. Any person with an ounce of critical thinking would choose Wooch's posts over yours. You have not provided enough proof to support a training bra.
It is not my job to train simple-minded children.
He never said any such thing, liar. Pix, I just saw your nose go past my window and down the street. From this thread alone, you have told enough lies for that beak of yours to stretch from Alabama to San Leandro, California. It seems that your nose needs the same kind of work that Bristol Palin's jaw got. Perhaps those tornadoes have more of an effect than you are willing to admit.....
ACTUALLY HE DID, and on this thread. Guess the "unimpeachable" source he used?
HIS PARENTS.
They thought that a signal from component was better, so it must be so!
BWHAHAHAHAHA!!!!!
I would say he's just about as ignorant as you are, but I DON'T WANT TO INSULT THE MAN; he actually has a personality.:1:
Woochifer
05-22-2011, 05:21 PM
Oh, gee, then every DVD player ON EARTH has a totally unnecessary feature, namely progressive playback.
IF a 480p picture is no improvement, then why bother?
Because it's another feature to add to the checklist.
If every DVD player on the planet only output the native 480i resolution, an HDTV would display the output by simply rescaling and deinterlacing it to the native 768p or 1080p resolution.
FACT is that with a 480i pic you have 240 lines on the screen
at any given time, then another 240 is painted between
them.
IF THERE'S MOVEMENT, resolution drops to as low as 240 lines.
Deinterlace the pic and 480 lines are painted onscreen one after
the other in a progressive manner.
No, there is no "increase" in res, but there is more resolution,
because the full resolution is displayed.
Really, why do you insist on showing your ignorance?
EVEN ANY LAYMAN, looking at a 480i and a 480p pic side by side, will be able to tell which is superior.
BUT YOU ARE stating there is no difference, basically like stating the world is flat.
SO why have deinterlacing DVD players at all?
Man, you are a marketer's dream customer. Throw a bunch of fancy sounding features and specs on a sheet, and you're ready to junk whatever you bought just a few months ago for a brand new one. Like you've done so many times before.
The part that you're missing is that EVERY HDTV will do the deinterlacing to 480p and then do the rescaling to 768p or 1080p when fed a non-native signal. Progressive scan on a DVD player is a redundant feature, and always has been. The only value that it would offer is in those specific cases where the DVD player uses a video processor superior to the one inside the TV.
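A toy Python model of that point (the function and names are mine, purely illustrative): whether the disc's 480i goes to the set untouched or the player deinterlaces it first, a fixed-pixel HDTV ends at its native progressive resolution either way.

# A fixed-pixel display always lands at its native progressive resolution,
# so a player's progressive output only matters if its processor is better.
def hdtv_pipeline(lines, progressive, native_lines=1080):
    if not progressive:        # the set deinterlaces interlaced input
        progressive = True
    if lines != native_lines:  # then rescales to the panel's native grid
        lines = native_lines
    return lines, progressive

print(hdtv_pipeline(480, False))   # 480i straight from the disc -> (1080, True)
print(hdtv_pipeline(480, True))    # the player's 480p output    -> (1080, True)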
NO, YOU ARE the same know-nothing who believes that a
component cable is better than HDMI, but I bet you use HDMI on your setup. WHICH makes you a hypocrite and a liar,
not to mention uninformed.:1:
Anyone who wants to check who the liar on this thread is can simply check post #29 (http://forums.audioreview.com/showpost.php?p=358803&postcount=29) for themselves and see what I actually wrote. Seems that distorting and misquoting is all you got left. :17:
ACTUALLY HE DID, and on this thread. Guess the "unimpeachable" source he used?
HIS PARENTS.
They thought that a signal from component was better, so it must be so!
BWHAHAHAHAHA!!!!!
And again, check post #29 (http://forums.audioreview.com/showpost.php?p=358803&postcount=29) to see what was actually posted everywhere other than in pix's head. :out:
Woochifer
05-22-2011, 05:51 PM
Maybe if you took as much time to google corroborating links to your claims as you do for gifs and jpegs for your posts, along with not typing like you still ride the Short Bus, someone might just take you seriously.
You realize just how easy it is to figure out whose claims have more credible backing than the other, don't you?
Shhhh! You're gonna spoil the fun for the rest of us ... :cool:
Sir Terrence the Terrible
05-22-2011, 07:55 PM
Your keepers need to educate you better.
My set is 60Hz, but displays 24p.
A BLU player might output 24p, but does it by means of a 72Hz
frame rate, eliminating the need for 3:2 pulldown.
The entire reason for a 24p frame rate was to eliminate 3:2 pulldown, which sometimes caused artifacts in DVD playback.
No need for 24p if you need 3:2 pulldown anyway.
VIDEO-sourced material from BLU is 60Hz; 24p is 72Hz,
which is 60Hz in a 60Hz set, 24p in a 24p-capable set.
This may be wrong, but if it is, then the dozens of articles and
net pages I HAVE READ ON THE SUBJECT ARE WRONG ALSO.
You still don't have it right, stupid. Your display cannot reproduce 24fps, not with a 60Hz refresh rate. There is no multiple of 24 that equals 60Hz, so 3:2 pulldown MUST be used. So based on your assumptions, Bluray outputs at 72Hz and your set refreshes at 60Hz? It would have to discard 12fps to meet your set's refresh rate; does that sound right to you, idiot? The picture would be severely degraded if that happened.
Film content=24fps
Video content=30fps
Your set can process programs shot on video without 3:2 pulldown, as 60Hz is a multiple of the 30fps frame rate. The problem is that all film material encoded to Bluray disc is done so at 24fps, so you are still stuck with 3:2 pulldown no matter what.
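That divisibility point is a one-liner to verify (Python, purely illustrative):

print(60 % 30 == 0)   # True: 30fps video fits a 60Hz set without pulldown
print(60 % 24 == 0)   # False: 24fps film forces 3:2 pulldown on a 60Hz set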
http://www.tech-evangelist.com/2008/10/15/hdtv-refresh-rate-frame-rate/
Find where it states ANYWHERE that Bluray uses a 72Hz "refresh" rate. If you cannot, then stop repeating the lie over and over again.
On the net it doesn't matter where anything is "located".
THE SERVERS COULD BE LOCATED in someone's closet
under some dirty clothes... DOESN'T MATTER.
Look stupidpixel, these are your words.
and your statement shows your ignorance of the criminal mind.
THESE morons saw an opportunity after the Japanese earthquake and all of the confusion it created, took advantage, and were wildly successful.
Can you explain to me what confusion was happening in San Diego after an earthquake in JAPAN? The servers that were attacked were in San Diego, not Japan. Get your facts straight, or shut the hell up!
Again you show your ignorance.
MY FIRST HD set was a 47" PANASONIC RPTV with two native resolutions, 1080i and 480p.
HD was rare then; I PRIMARILY purchased it to watch DVDs in 480p. THAT WAS THE MAIN REASON FOR the 480p resolution,
so once again you are exposed for the know-nothing that you
really are.
Pixelidiot, do you realize that there is no way to get to the maximum resolution of a 1080i RPTV from 480p? The only way a consumer RPTV can be progressive is if it is displaying one of the two fields it sequentially displays. 540p is one field of a sequentially displayed 1080i image. Using your logic, your set would only be capable of 960i resolution, not 1080i. No 1080i RPTV has a native resolution of 480p, not one; it would be impossible to do on consumer RPTVs.
Whats sad is that you really think any of this is relevant.
THE ONE ADVANTAGE OF crt is that it has no native resolution,
and this matters NOT AT ALL.
This is because my 1080p(not 768) set upconverts or deinterlaces all resolutions to its native resolution.
TRUTH IS your security blanket-excuse me museum piece
of a science project is as obsolete as you are.
The future of projection video is DLP, for most panels...OLED.
The "future of CRT...the junkyard
Pix, either you are lying here, or you lied when you stated this.
BTW the LG 42" that passed your weird little "motion" res test in your gamespot link
(the only one that passed) is the same model I HAVE.
The LG 42" that passed the deinterlacing test was a 768p set. As a matter of fact all of LG's 42" models in that test were 768p. So either you are lying about owning the set that past the test, or you are lying to yourself about getting 1080p which is it?
http://www.gamespot.com/forums/topic/26585312
And the fact that the quality advantage of a 1080p set is so superior that people paid more for it, rendering 1080i obsolete,
affects you not at all. YOU DO REALIZE THAT MOST people READING
this at one time had to make that decision... AND CHOSE 1080P. Because a progressive image will always be better than an interlaced one. AND those who chose 1080p to the point of making 1080i obsolete realize just what a know-nothing you actually are.
This link says you are a liar, plain and simple.
http://www.hometheater.com/geoffreymorrison/0807061080iv1080p/
Once again, notice these words, stupidpixel:
There Is No Difference Between 1080p and 1080i
You can repeat your lies to yourself over and over again, but they are still lies.
WHY DO YOU INSIST on calling my 1080p set 768p when even you know better? Just to get my goat? OR IS it just ignorance
on your part? I GO WITH THE IGNORANCE.
It is a 768p set, or you are lying about your set passing the test. You said this:
BTW the LG 42" that passed your weird little "motion" res test in your gamespot link
(the only one that passed) is the same model I HAVE.
Well, according to gamespot, the LG that passed the deinterlacing test was the 42LB5D, which is a 768p set. Once again, either you lied with that statement, or you are lying now. Which is it, pixelliar? Your goat is your lying tongue.
Actually they don't, which is why the world went progressive.
AGAIN WITH THE 768; you are even more delusional than usual.
Your nose just went by and is halfway to China by now.
NOT JUST JOE KANE, but everybody else. I READ IT IN FREQUENT ARTICLES IN WIDESCREEN REVIEW
and about half a dozen other places.
Joe KANE was a champion of 720p at the start of the HD
transition, because after allowing for the resolution loss from
1080i, 720p actually had more real resolution.
IT WAS JOE KANE and others that influenced ABC and FOX
to go 720p, for the very reason of resolution loss among interlaced formats.
GET A CLUE; do I CONSTANTLY have to educate you out of ignorance?
Pix, I have been a Widescreen Review subscriber since the first day it was printed. I have every issue they have ever printed in my AV library. Joe Kane never said any of what you state. I have been to over 20 of his workshops over the last 15 years, and I have never heard him make that claim. You are lying about Joe Kane influencing ABC to go 720p. The company I work for owns ABC, and I know for a fact that it was internal testing with sports material that convinced ABC to choose 720p over 1080i. David Session, who is the head of ABC technical operations, convinced the top brass at ABC to choose 720p over 1080i.
Do you have any more lies you want to tell and have holes shot in them?
AND THIS IS HOGWASH, as I SAID EARLIER.
Not that it matters, as INTERLACED sets are EXTINCT.
So what the heck does it matter now?
I hate to bring this to you, but 10 years of panel sales cannot overwhelm 50+ years of interlaced television sales. Flat panels are all they are selling now, but tons of folks still own the venerable single-gun CRT sets out there. Based on the number of converter boxes sold during the DTV transition (hundreds of millions of them), a lot of people have kept them and have never purchased a new panel. Extinct means they don't exist; Smokey will probably tell you differently, and so would my grandmother.
WHEN it comes to "fast moving objects" AND IN EVERY OTHER AREA.
Not when it comes to displaying film content it's not. Not with 921,600 pixels out of 2,073,600 pixels.
720p = 1280x720, or 921,600 pixels
1080i/p = 1920x1080, or 2,073,600 pixels
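Those frame sizes are easy to verify; a quick check in Python, plain arithmetic with nothing assumed beyond the standard resolutions:

def pixels(width, height):
    # Total pixels in one full frame at the given resolution.
    return width * height

print(f"720p frame:      {pixels(1280, 720):,}")   #   921,600
print(f"1080p frame:     {pixels(1920, 1080):,}")  # 2,073,600
print(f"one 1080i field: {pixels(1920, 540):,}")   # 1,036,800

The 540-line field count is what the interlacing argument below turns on.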
Not really.
1080i is two FIELDS interlaced to make one frame.
OF THE 1080 "LINES" you see on a 1080i pic, 540 are rapidly fading "image retention" and the others are the last "field".
On an interlaced pic, only 540 lines are available at any given time,
so there are only about a million or so pixels onscreen,
and the whole illusion falls apart when there's movement.
AND again...it doesn't matter, because 1080i is, for all
practical purposes...EXTINCT.
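The field mechanics being argued over here fit in a few lines. Below is a minimal "weave" deinterlacing sketch in Python with NumPy, assuming a static scene; the arrays are dummy data standing in for two captured fields, and real sets use more sophisticated motion-adaptive logic:

import numpy as np

# Two 540-line fields of a 1080i signal (dummy luma data).
top_field = np.zeros((540, 1920), dtype=np.uint8)    # odd scan lines
bottom_field = np.ones((540, 1920), dtype=np.uint8)  # even scan lines

# "Weave": interleave the two fields into one 1080-line frame.
frame = np.empty((1080, 1920), dtype=np.uint8)
frame[0::2] = top_field
frame[1::2] = bottom_field

# If the scene moved between the instants the two fields were captured,
# adjacent lines come from different moments in time and the woven frame
# shows combing artifacts - the "falls apart when there's movement" point.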
If what you say is true (which it isn't), one million pixels is still more than the 921,600 pixels of 720p. If the image falls apart during movement, then the set would be unwatchable 90 percent of the time. That is not the case, is it, liar?
1080i is still used on concert videos, so it is not extinct. Do you know what extinct means? I guess not.
You know, one of these days even you will figure out that it's
IRRELEVANT, because 1080i is EXTINCT.
Pretty much like your dino set.
My dino set is not a 1080i set, but your panel is a 768p panel, or you are a liar. How's that clothes hanger of a nose, Pixliar?
EXCUSE me if I BELIEVE every HT magazine I HAVE EVER
read on the subject instead of you.
Your nose just landed in China, liar; no magazine ever stated that Blu-Ray has a 72 Hz refresh rate. Post a link that proves this, or be the liar that you are.
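If there is a real number behind the "72" being fought over, the most plausible origin is display-side frame repetition rather than the disc format; this is a guess, not something either poster stated. Film titles on Blu-Ray are 1080p24, and some displays repeat each frame three times (3:3 pulldown) to refresh at a flicker-free multiple of 24:

FILM_FPS = 24  # Blu-Ray film content frame rate
REPEATS = 3    # 3:3 pulldown on displays that support it

# The display refreshes at 72 Hz, but the disc is still 24 frames/second.
print(FILM_FPS * REPEATS)  # -> 72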
Well, you start out with the wrong res of my panel and go downhill from there.
My set is a full 1080p, you can watch it in a lit room if you like,
and 1440p would be relevant if you could rent 1440p FROM
the video store, which you can't.
BRAG ON your antique all you want, the measurements you
cite as spectacular are actually a very slight improvement over a
panel from WALMART. It HELPS you massage your ego as you explain it to those who don't notice, and don't really care.
Truth is any cheap DLP (the true comparison for your piece of junk) will blow its doors off quite handily.
Your set is a 768p set, and movies are to be watched in the dark. They were edited in the dark, QC'd in the dark, and presented in a dark theater.
The rest of your statement is pure bull sh!t, and as stupid as you are, you know this.
This statement is so ignorant I AM GOING TO GIVE IT A PASS.
You have lied enough, so perhaps you should.
It can't outperform my LG, it can outperform your science project
in all the ways that matter, like, you can actually buy one.
Your LG is a piece of crap compared to my old dusty custom RPTV. It doesn't even do full HD, and my RPTV does and then some. You are out of your league here, but you are so delusional you will never admit it, and you don't have to.
Easier than for you to spell "fool"
Calling you a fool is too generous. You are not even smart enough to earn the L.
WELL, your display is so dim it's no wonder they couldn't tell.
You have never seen my set, liar, so you cannot make this statement. That nose is still growing.
SLANDER all you want, you know I AM TELLING THE TRUTH.
And cry for your CRT glory days, they are gone forever,
obsoleted by more advanced tech.
You have been caught in so many lies in this thread, you are nothing more than a common low-life liar, and that is all there is to it. When you stop your lying, you'll just be a dummy instead of a dumb liar.
AND YOUR BRAIN CANNOT process a single intelligent thought
And you have yet to present one, liar.
It is not my job to train simple-minded children.
A simple-minded old man does not have the capacity to do so anyway.
ACTUALLY HE DID, and on this thread. Guess the "unimpeachable" source he used?
HIS PARENTS.
They thought that a signal from component was better, so it must be so!
BWHAHAHAHAHA!!!!!
I would say he's just about as ignorant as you are, but I DON'T WANT TO INSULT THE MAN, he actually has a personality.:1:
Read his post, he caught your lies and posted a link to them. You have now proved my point: you can't read. LOLOLOLOLOLOLOL, you make this so easy!
Shhhh! You're gonna spoil the fun for the rest of us ... :cool:
Tastes Great... Less Filling... this is getting old for sure, but it is some of the best fun we have had here in a while.
pixelthis
05-23-2011, 12:53 PM
Tastes Great... Less Filling... this is getting old for sure, but it is some of the best fun we have had here in a while.
And the vast majority of it is irrelevant.
WHEN just about every TV on the planet is either 1080p or 720p,
any discussion of 1080i is a total waste of time.
THE TRANSITION phase is just about over; TV is going to be 1080p
eventually, with 1080i as a transmission medium.
AS FOR "LINKS", it might seem hard to believe, but how do you provide a "link" to the dozens of magazine articles read during the late nineties/early 2000s, the dozens of websites surfed over the years?
Widescreen Review is, I BELIEVE, DEFUNCT; if they still have a website, check it out. GOOGLE Joe Kane, video displays, etc.
None of it matters, really.
DURING THE EARLY years of the auto, there were three major types: electric, steam, and internal combustion.
Internal combustion won out.
And we are closing down a similar period with video displays.
THE DIFFERENT formats and displays have sorted themselves
out, and two form factors have won out...
LCD and DLP.
I saw THOR in IMAX lite projected with DLP, simply magnificent.
DLP is it for theater and home projection, with LCD as the standard display for everybody, soon to be replaced by OLED.
CRT is destined for the trash heap, and no amount of pining away
and nonsense like "1440" displays will change that.
THE AGE of CRT is over, as is the age of its bastard child, PLASMA, which is really just a flattened CRT.
ANY talk of anything outside of 1080p is quite useless, really.
Might as well be arguing about how many angels can dance on
the head of a pin...:1:
Sir Terrence the Terrible
05-23-2011, 01:12 PM
Pix, you have already reduced yourself to a common liar; how much more damage to your rep do you want to do? (As if it wasn't sh!t already!)
Link after link has proven that you don't know crap, so it might be wise to back away from this thread, and call it a day.
Widescreen Review is not defunct; I got my latest issue two weeks ago. Their website is also alive and well. Any more lies you want to tell?
http://www.widescreenreview.com/
amarmistry
05-24-2011, 09:46 AM
OMG!
This is one of those few times I am thankful that I do not know a whole lot about something!
Glad you had time to attach a stupid pic and no links. If you do a simple Google search, you should be able to provide thousands of links to support your story... or maybe not. The rest of us have all used that great creation called Google throughout this post and have just been sitting back watching the p!ssing match continue.