Quote Originally Posted by hermanv
Now I'm confused, since 1080i requires two passes to complete a picture, the update data rate is equivalent to 540p. No? So I think 720p requires a greater bandwidth than 1080i because the entire screen is updated twice as often. By the end of two screens 1080 pixels have been sent for 1080i, but during the same time interval I think 1440 pixels have been sent for 720p.
You forgot to account for the smaller number of pixels per frame in a 720p image. 1080i renders a 1,920 x 1,080 pixel map at a frame rate of 30 frames/sec (delivered as 60 interlaced fields/sec, each carrying half the lines), while 720p renders a 1,280 x 720 pixel map at 60 frames/sec. Do the math: 1080i gives you a pixel rate of approximately 62.2 million pixels/sec, while 720p gives you about 55.3 million pixels/sec.
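To make that arithmetic concrete, here's a minimal Python sketch using just the standard resolutions and rates quoted above (nothing else is assumed):

```python
# Pixel throughput for the two broadcast formats.
# 1080i sends 60 fields/sec of 1920x540, which works out to the
# same pixel rate as 30 full 1920x1080 frames/sec.
formats = {
    "1080i": (1920, 1080, 30),  # width, height, full frames/sec
    "720p":  (1280, 720, 60),
}

for name, (w, h, fps) in formats.items():
    rate = w * h * fps
    print(f"{name}: {rate / 1e6:.1f} million pixels/sec")

# Output:
# 1080i: 62.2 million pixels/sec
# 720p: 55.3 million pixels/sec
```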

In addition, all else being equal, both the MPEG-2 and MPEG-4 codecs compress progressive video more efficiently than interlaced video. More data + less compression efficiency = greater bandwidth.

Because of this, 1080i requires more bandwidth, but whether that equates to a "better" picture is debatable. On my TV, I can see that 720p sources have less image detail (which could also be a byproduct of the rescaling needed to display them on a 1080p set), but they also have smoother motion in the backgrounds than 1080i.