All HDMI not created equal?
-
I hear the words ghosting and HDMI and think of the HDMI 1.4 specification for 3D. When I took off my polarised glasses during my outing to watch Avatar in 3D at the cinema I saw lots of ghosting, but certain points appeared to be almost in sync. If the entire screen seems to display an equal amount of ghosting then this is not likely to be the cause, but if there are parts that appear normal-ish then it could be. In that case I would suggest getting a 3D-capable monitor and glasses to see what is going on.
-
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
I've output HDMI from machines with both nVidia and ATI cards and I've never had a problem. I've gone through HDMI to a 720p LCD TV (with the slightly odd resolution it uses), a 1080p LCD TV, a PC monitor and a 1080p projector. On my TV, VGA provided a crisper image than HDMI because the TV was trying to be clever and apply some filters and whatnot to any HDMI input, but I found the option to turn that off.
My current favourite quote is: Punch them in the face, see what happens!
-SK Genius
-
I've connected my Windows PC to my 26" HDTV; the quality seemed slightly degraded, though the audio was OK. I haven't got round to trying VGA, as I haven't had the need. However, I use the same HDMI cable from my PS3 and Xbox to the TV; it was about £5 and does a brilliant job there.
He who makes a beast out of himself gets rid of the pain of being a man.
One possibility for the quality hit is that by default HDTVs strip the outermost two-dozen-ish pixels from the image and upsample the remainder. This had a technical reason in the analog TV era: the edge of the picture was hidden by the CRT's bezel anyway, and was a perfect place to hide sub-channel data like closed captioning. Some TV broadcasters are still crapping up the normally hidden part of their video even though digital broadcasts make the old reasons a moot point. TVs do the same when playing DVDs because, at least in theory, Joe Moron will think a TV that shows a slightly blurred image with bigger heads, etc. looks better than one that is sharper and shows the edge of the recorded video. To make it more fun, the wording of the options in the TV menus is often deceptive, with the one you'd think would force it into 1:1 pixel mapping actually being the one that chops and upscales. X|
3x12=36 2x12=24 1x12=12 0x12=18
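The chop-then-stretch behaviour described above can be sketched roughly like this (a simplified illustration only; the 24-pixel border is an assumption for the example, not any particular TV's actual firmware):

```python
def simulate_overscan(width, height, border=24):
    """Model a TV's overscan: discard a border of pixels on every
    edge, then note the scale factor needed to stretch the rest
    back up to the panel's native size."""
    visible_w = width - 2 * border
    visible_h = height - 2 * border
    # Because the cropped picture is resampled back up, source
    # pixels no longer map 1:1 onto panel pixels -- fine detail
    # such as desktop text gets blurred.
    scale_w = width / visible_w
    scale_h = height / visible_h
    return (visible_w, visible_h), (scale_w, scale_h)

# A 1920x1080 desktop loses 48 pixels in each dimension and is
# then resampled back to full size:
visible, scale = simulate_overscan(1920, 1080)
```

That roughly 2-3% stretch is why a PC desktop can look soft over HDMI while the "PC" (VGA) input, which skips the overscan path on many sets, stays sharp.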
-
Overscan. TVs for 720p actually have 768-line displays to accommodate the overscan, whereas 1080p displays don't, so the overscan is hidden. It gets complicated after that, partly because VGA inputs on 1080p TVs don't have overscan enabled even at the same resolution.
Join the cool kids - Come fold with us[^]
-
Most 1080p TVs operate at 1920x1200 resolution, not the expected 1920x1080, though.
-
Yeah, you should expect that when you try to look at a Windows desktop via an HDMI connection. Sometimes the video card/TV combination can adjust a lot of it out for you, but that's pretty much the price you pay for the convenience of a single cable for both audio and video. I have the same issue with my HTPC: if I drop to the Windows desktop for anything, the visual quality degrades, but when I have Media Portal running, it's fine.
Here's another little tidbit for ya: BEFORE bringing your HTPC up (or waking it from a sleep condition), make sure your TV is turned on and your home theater receiver is set to the appropriate source. If you don't, the PC won't know how to negotiate the resolution, and your display will not be set correctly, forcing you to drop to the desktop to reset the display resolution to the proper/desired size. I did find that Weven is less troublesome in this regard, but it's better to follow that procedure anyway to avoid potential visual issues.
.45 ACP - because shooting twice is just silly
-----
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997
-----
"The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001
Makes sense on the startup order. I'm a little concerned about the desktop quality, since I'm watching Netflix streaming stuff which, of course, is through a browser. I knew it wouldn't be Blu-ray quality stuff, but was hoping for something at least watchable. Hell, I may have to use the TV's VGA input if this is as good as it gets. Yuck.
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services -
Ghosting means a hardware fault. Simply put, HDMI uses the same data links as DVI, so you should get the same performance. [Edit] Have you set the graphics resolution to match the display? There are some odd modes supported by smaller TVs.
Join the cool kids - Come fold with us[^]
Yeah, I'm pretty sure I have stuff matched up. However, this is an inexpensive, small Philips set, so I should probably reserve judgement until I try it with the Panasonic 65 that's coming in. I'd whine about longing for simpler times, but I have vivid memories of black & white TVs and countless hours of fiddling with rabbit ears (and the requisite aluminum foil strip) to get good reception, so I guess it's all the same. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services -
When the Panasonic comes in, I'll experiment with that and look for the kind of filtering stuff you're talking about. Thanks, man.
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services -
Why, because you have to pay for it? Just kidding, Harold, just kidding... :-D
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services -
You mean HDCP; either it works or it doesn't.
Join the cool kids - Come fold with us[^]
If HDCP cannot be negotiated, the HDMI source can output image and video in degraded quality... but this almost always applies only to Blu-ray movies, and then through special player software. Bare Windows should never produce ghost images etc. due to an HDCP negotiation failure. After all, the Windows desktop isn't copy-protected material in the same sense as Blu-ray movies. :)
-- Kein Mitleid Für Die Mehrheit
-
I don't stream from the net. I copy the DVD and put the ISO file on my HTPC box, and watch it that way. I haven't actually tried a Netflix movie that way yet, but I'll give it a shot and let you know how it looks for me if you're interested.
.45 ACP - because shooting twice is just silly
-----
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997
-----
"The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001 -
I've been using HDMI for several years now, with HTPCs, media players, set-top boxes etc., and never seen 'ghosting'. Still waiting. The only thing I can think of is aliasing due to overscan, meaning that the source-to-display mapping isn't 1:1, but I would take 'ghosting' to mean multiple echoes of an image.
Join the cool kids - Come fold with us[^]
-
Wow, after reading this thread, I hope my CRT TV will still hold on a bit, since I just got lost in all the HDMI / 720p / nVidia / DVI / VGA stuff.
-
I use a cheap onboard shared-memory HDMI output for my HTPC to power my 47" LG 47SL80 LCD and it works just fine. I had a hell of a time getting my HTPC up and running initially with HDMI, but my issue was no picture at all (not the same as your problem). I even fried a motherboard on the PC at one point while connecting cables.
Eventually it all came down to a cheap cable. I had bought a pack of 3 HDMI cables for $10 from China based on reviews from some friends. Several friends have the cheap cables and they worked fine; during troubleshooting, I even gave them one of my cables and it worked fine for them. As soon as I went down to the local PC store (not an electronics store) and bought a new cable for $10 (so still not a "super high quality" cable), everything started working perfectly. I guess the cheap cables (even though they were rated for HDMI 1.3) simply weren't good enough for my higher quality TV.
-
Someday in the future, when the technology is a bit more stable, I'll probably go the same route as you for the movies I buy. It appeals to my inner geek, but I'm just not ready to fight the wars yet.
As for Netflix, I cancelled my Dish subscription when I realized I was paying for 250 channels and only watching the movies, so my impulse / something-to-watch-while-I-eat-dinner viewing is the instant streaming stuff from Netflix, a deciding factor in my going with them. I knew that watching a Silverlight / browser-based source wouldn't be true HD; I'm just going for the best bang for the buck between convenience and quality. All of which is my typically long-winded way of saying yeah, if you want to fire up some instant streaming stuff via the browser, I'd be most interested in what you learn in terms of optimizing the quality of the experience.
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services -
Ummm, wait, what? Am I misunderstanding what's going on? My understanding was that the overscan behavior cropped the 1280x720 (or 1920x1080) image down to 1260x700 (1900x1060) and then upsampled it to whatever the panel's native resolution was.
3x12=36 2x12=24 1x12=12 0x12=18
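For what it's worth, that crop-then-upsample reading works out numerically like this (a back-of-envelope sketch; the 10-pixel per-edge border implied by the 1260x700 figure and the common 1366x768 "720p" panel are assumptions for illustration):

```python
src_w, src_h = 1280, 720
border = 10                  # per-edge crop implied by 1260x700
vis_w = src_w - 2 * border   # visible width after the crop
vis_h = src_h - 2 * border   # visible height after the crop

# Stretched back up to a typical 1366x768 panel, each surviving
# source pixel covers slightly more than one panel pixel, so the
# result can never be a 1:1 mapping:
ratio_w = 1366 / vis_w
ratio_h = 768 / vis_h
```

Either way the picture gets resampled, which is consistent with the softness people see on the desktop over HDMI.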