View Full Version : 1080p Source With 720P projector
Robert-The-Rambler
05-16-2007, 08:09 AM
Lately I've been experimenting with my Sanyo PLV-Z5 projector. I was about to sell it, but I gave up that idea and I'm pretty sure that was a good decision. I read the online manual and stumbled across the section that says the Z5 will accept a 1080p input via HDMI, so I quickly ordered a 16-foot DVI-to-HDMI cable to run from the PC to the projector.
My question is whether a 1080p source is supposed to give better image quality than sticking with the native 720p resolution. I ask because I've been building a budget gaming PC in my bedroom, where I use the projector for late-night gaming in bed with a 92" wide Grayhawk screen. (Naturally I don't fill the full screen; I use about 60" widescreen.) For video sources I wouldn't even be asking, since it's simple to just use 1080p when framerate isn't an issue. The catch with real-time 3D rendering is that 1080p has more than twice the pixels of 720p, so it takes roughly twice the video horsepower to render. I tried an older game, Unreal 2, and it looked amazing set at 1080p with FSAA and all the other filtering, like anisotropic filtering, maxed out on an ATI X1950 Pro graphics card.
How does this 1080p compatibility work? When I switch from 720p to 1080p it sure looks like an increase in resolution. To 1080p or not, 'tis the question.
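As a quick sanity check on the "twice the horsepower" figure, here's a back-of-the-envelope pixel count (plain Python, nothing projector-specific assumed):

# Rough pixel-count comparison between 720p and 1080p render targets.
res_720p = 1280 * 720      # 921,600 pixels per frame
res_1080p = 1920 * 1080    # 2,073,600 pixels per frame

ratio = res_1080p / res_720p
print(f"1080p pushes {ratio:.2f}x the pixels of 720p")   # -> 2.25x

# At 60 fps, the per-second fill-rate difference:
print(f"720p  @ 60 fps: {res_720p * 60:,} pixels/s")
print(f"1080p @ 60 fps: {res_1080p * 60:,} pixels/s")

So the GPU has to shade about 2.25 times as many pixels per frame at 1080p, which is why the "twice the horsepower" rule of thumb holds.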
Rich-n-Texas
05-16-2007, 10:40 AM
"I've been building a budget gaming PC in my bedroom where I use the projector for late night gaming in bed..."
LOL! Isn't that statement an oxymoron?!?!? :confused:
I can't answer the crux of your post, RTR, but I do have some questions about your video card selection. Does 1920 x 1080 out from your ATI card equate to 1080i or 1080p at your projector? How often does the game crash at 720 or 1080? Which ATI drivers are you using with your X1950 Pro?
I'm asking these questions because I'll soon be updating and integrating my computer with my DLP TV, but I was leaning towards nVidia because of all the past disasters I've had to deal with in the ATI world.
Sorry, don't mean to steal your thread, but every time I see ATI, I cringe.
Robert-The-Rambler
05-16-2007, 06:14 PM
The performance is good and it almost never crashes; the stability is there. I just had Half-Life 2 running at 1920x1080 (1080p) @ 60 Hz with every detail maxed out and fairly consistent performance. Maxed out for CrossFire setups is 14x FSAA with 16x anisotropic filtering. Some chugging occurs when the HDR lighting kicks in, but with some tweaking of the FSAA that should be kept in check. For a shade over or under $150, depending on which brand you get at Newegg.com, you get a solid, stable, fairly high-performing card. I also spotted a Radeon X1950 XT 256MB for just $170 that will kill a single X1950 Pro in performance, though I'm not sure you can buy two of them and run them in CrossFire mode, which would be disappointing in that case.
With my setup I can run 1080i via DVI converted to VGA, and 1080p via DVI converted to HDMI into the PLV-Z5. The video cards are HDCP compliant, so the HDMI handshake is not a problem.
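For anyone wondering why a single-link DVI-to-HDMI connection carries 1080p60 without trouble, here's a rough bandwidth check; the 148.5 MHz figure is the standard CEA-861 pixel clock for 1080p60, and 165 MHz is the single-link TMDS limit (the cable itself doesn't change either number):

# Rough check: does 1080p60 fit in a single TMDS link (single-link DVI or HDMI)?
PIXEL_CLOCK_1080P60_MHZ = 148.5    # standard CEA-861 timing for 1920x1080 @ 60 Hz
SINGLE_LINK_TMDS_LIMIT_MHZ = 165   # max pixel clock for single-link DVI / early HDMI

headroom = SINGLE_LINK_TMDS_LIMIT_MHZ - PIXEL_CLOCK_1080P60_MHZ
print(f"1080p60 needs {PIXEL_CLOCK_1080P60_MHZ} MHz; "
      f"a single link allows {SINGLE_LINK_TMDS_LIMIT_MHZ} MHz "
      f"({headroom:.1f} MHz to spare)")

1080i60 fits with even more room (74.25 MHz), which is why it also works over the DVI-to-VGA path at the analog end.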