All HDMI not created equal?
-
Most 1080p TVs operate at 1920x1200 resolution, not the expected 1920x1080, though.
-
Ummm, wait, what? Am I misunderstanding what's going on? My understanding was that the overscan behavior cropped the 1280x720 (1920x1080) image down to 1260x700 (1900x1060) and then upsampled it to whatever the panel's native resolution was.
3x12=36 2x12=24 1x12=12 0x12=18
No, 720p panels are 1366x768: 1280x720 for the main area, with the rest to encompass the overscan. Yes, I know, but that's what they do. Add to that, some small TVs are 1440x900, while 1080p TVs are 1920x1080 and scale the image broadcast at 1920x1080 to push the overscan outside the visible area... I always turn the 1080 overscan off and it looks better because you get 1:1 mapping. :cool: There is an awful lot more but I'll save you the pain.
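To make the 1:1 point concrete, here's a throwaway Python sketch (purely illustrative, nothing TV-specific) of which common source/panel pairings avoid interpolation:

    # Illustrative only: a source maps 1:1 onto a panel only when the
    # resolutions match exactly; anything else means every output pixel
    # is interpolated from several source pixels (i.e. softened).
    panels  = {"720p panel": (1366, 768), "1080p panel": (1920, 1080)}
    sources = {"720p broadcast": (1280, 720), "1080p broadcast": (1920, 1080)}

    for pname, pres in panels.items():
        for sname, sres in sources.items():
            verdict = "1:1 mapping" if pres == sres else "scaled (interpolated)"
            print(f"{sname} -> {pname}: {verdict}")

Only the 1080p broadcast on the 1080p panel comes out 1:1, and then only with overscan switched off.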
Join the cool kids - Come fold with us[^]
-
It puts roadblocks in place for people who think they should be able to make unrestricted copies of digital media without having to pay for any of it. :rolleyes:
3x12=36 2x12=24 1x12=12 0x12=18
-
When the Panasonic comes in, I'll experiment with that and look for the kind of filtering stuff you're talking about. Thanks, man.
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
Well, on the Samsung TV I was using, you had to plug into HDMI slot 2 and then set the name of the source to 'PC', so it wasn't exactly intuitive. But it certainly did the trick.
My current favourite quote is: Punch them in the face, see what happens!
-SK Genius
-
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
Christopher Duncan wrote:
So, exactly how stupid and off track am I in this regard?
*bites tongue* That was a setup and I am NOT taking it. :) I have too much respect for you to fall for that one....
Christopher Duncan wrote:
Does the video card in question simply suck
Which card? PNY used to have a Quadro contract, meaning they made the premium boards for nVidia based on a history of good board fabrication methods; they lost it because of poor practices, bad customer service, and poor-quality boards. They have been trying to restore their reputation, so quality has come up... but it is entirely possible you got a bad card. I have been running HDMI on nVidia cards for well over a year now.
Christopher Duncan wrote:
It goes without saying that something sucks. I'm just not sure what,
When in doubt, blame someone else... ;P picking randomly works too.
_________________________ John Andrew Holmes "It is well to remember that the entire universe, with one trifling exception, is composed of others." Shhhhh.... I am not really here. I am a figment of your imagination.... I am still in my cave so this must be an illusion....
-
No, 720p panels are 1366x768: 1280x720 for the main area, with the rest to encompass the overscan. Yes, I know, but that's what they do. Add to that, some small TVs are 1440x900, while 1080p TVs are 1920x1080 and scale the image broadcast at 1920x1080 to push the overscan outside the visible area... I always turn the 1080 overscan off and it looks better because you get 1:1 mapping. :cool: There is an awful lot more but I'll save you the pain.
Join the cool kids - Come fold with us[^]
I'm still not 100% sure I follow. Is this correct? 720p: screen size 1366x768; normal video resolution 1280x720; video resolution with overscan 1366x768. 1080p: screen size 1920x1080; normal video resolution 1920x1080 minus the overscan area; video resolution with overscan 1920x1080.
3x12=36 2x12=24 1x12=12 0x12=18
-
I'm still not 100% sure I follow. Is this correct? 720p: screen size 1366x768; normal video resolution 1280x720; video resolution with overscan 1366x768. 1080p: screen size 1920x1080; normal video resolution 1920x1080 minus the overscan area; video resolution with overscan 1920x1080.
3x12=36 2x12=24 1x12=12 0x12=18
Almost: most 1080p TVs scale the image up by about 5% to hide the overscan area. Why? This is what happens when standards are developed by committees with people from several industry sectors, each with their own proprietary solutions in place. It comes under various names, but 1:1 mapping actually looks better, so switch overscan off if you have the option. There is even an option for 1920x1088 plus a pan vector to say which bit to crop vertically!
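If you want that ~5% in actual pixels, here's the back-of-the-envelope arithmetic (Python; the 5% is just the rough figure above, not anything from a spec):

    # What a 1080p set does when overscan is on: scale up ~5%,
    # then crop back to the panel size, throwing the edges away.
    SRC_W, SRC_H = 1920, 1080
    OVERSCAN = 0.05  # assumed ~5% scale-up

    scaled_w = round(SRC_W * (1 + OVERSCAN))  # 2016
    scaled_h = round(SRC_H * (1 + OVERSCAN))  # 1134
    lost_w = scaled_w - SRC_W                 # 96 px total, ~48 per side
    lost_h = scaled_h - SRC_H                 # 54 px total, ~27 top/bottom
    print(f"Scaled to {scaled_w}x{scaled_h}; {lost_w}x{lost_h} px cropped off")

So roughly 48 pixels on each side and 27 at the top and bottom never reach the screen, and everything that remains has been resampled.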
Join the cool kids - Come fold with us[^]
-
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
Most TVs do a lot of things for optimal TV and movie viewing:
- 100/200 Hz modes and motion compensation
- removing possibly ugly borders
- blurring/sharpening to make the picture look better
Newer TVs often have a PC or native mode, where each pixel from the PC corresponds to one pixel on the display. The already-mentioned trick of having the TV on while booting the first time makes sure the PC sends a 1080p picture, provided the TV reports the correct info to it. If not, set it to 1920x1080 manually on a full HD TV. To get the correct viewing mode, however, you have to enter the TV's menu: on mine you select Picture Mode 'PC' and Picture Size 'Native'. TV models more than two years old may not have the correct modes, or may have them tied to the VGA and DVI inputs. Regards, Rune B.
-
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
I purchased an ATI HD 4350 card with HDMI to connect my HTPC (half-height case) to my Samsung 32" Series 6 and got exactly the same results you describe. Using HDMI was gawd-damn awful, even when viewing DVD-quality video. If you have the option, connect using VGA. My Samsung (True HD @ 1920x1080) displays the VGA signal in near-HD quality at all times, compared to the barely viewable HDMI connection, and with no problems with resolution settings etc. Interestingly, the Samsung has a dedicated HDMI socket for PC/DVI connections; I tried all the HDMI sockets with the same results. A VERY disappointing experience to say the least, given the hype regarding HDMI. As an aside, my friend just purchased a 42" Panasonic LCD TV with a free Panasonic Blu-ray player. Connecting the Blu-ray player via HDMI produced a beautiful HD picture but, IMHO, the picture quality was no better than my Samsung/VGA combination. I'm guessing that HDCP has a lot to do with the problem, degrading the signal because of a lack of the proper codes/certificates etc. that are needed when viewing HD content on anything. Fortunately, it appears that VGA bypasses this for now.
Regards Andy M
-
I don't think there is that much difference in HDMI cable quality past a certain price point ($20-30 max); I would not get the cheapest of the cheap, but one from a more reputable vendor (if such a thing exists). There is probably one area where "better" quality could mean something: if you want a longer cable than the usual norm. Anyway, I'm still on regular cable and a tube TV, so my opinion counts for not that much. Max.
Watched code never compiles.
HDMI is a digital connection. There are different specifications (1.2, 1.3, etc.) that will make a difference in the features supported, but brand to brand does not make a difference in quality. Since interference either happens or doesn't (it's digital), you either get correct data to the TV or you don't; it's not "blurry" or "staticky". A blurry image is more likely caused by a subpar analog/digital conversion at some point, or by upscaling.
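A toy Python sketch of the distinction, if it helps (not a model of any real link, just the two failure modes):

    import random

    def analog_sample(v, noise=3.0):
        # Analog: every sample arrives slightly wrong -> soft, blurry picture.
        return v + random.uniform(-noise, noise)

    def digital_sample(v, bit_error_rate=1e-9):
        # Digital: a sample is either exact or, very rarely, badly corrupted
        # -> occasional sparkles/dropouts, never gradual blur.
        return v if random.random() > bit_error_rate else v ^ 0x80

    print(analog_sample(128))   # e.g. 126.4 -- always a little off
    print(digital_sample(128))  # almost always exactly 128

So if an HDMI picture looks soft, suspect scaling or a conversion step, not the cable.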
-
Yeah, you should expect that when you try to look at a Windows desktop via an HDMI connection. Sometimes the video card/TV combination can adjust a lot of it out for you, but that's pretty much the price you pay for the convenience of a single cable for both audio and video. I have the same issue with my HTPC. If I drop to the Windows desktop for anything, the visual quality degrades, but when I have Media Portal running, it's fine. Here's another little tidbit for ya - BEFORE bringing your HTPC up (or waking it from a sleep condition), make sure your TV is turned on and your home theater receiver is set to the appropriate source. If you don't, the PC won't know how to negotiate the resolution, and your display will not be set correctly, forcing you to drop to the desktop to reset the display resolution to the proper/desired size. I did find that Windows 7 is less troublesome in this regard, but it's better to follow that procedure anyway to avoid potential visual issues.
.45 ACP - because shooting twice is just silly
-----
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997
-----
"The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001John Simmons / outlaw programmer wrote:
Yeah, you should expect that when you try to look at a Windows desktop via HDMI connection
I completely disagree. The issues that both of you are having are abnormal; something is not right somewhere. Windows should see the TV as simply a monitor with (assuming it's 1080p) a resolution of 1920x1080 and otherwise treat it as a normal display. Yes, you do have to make sure the TV is on before starting Windows (7, anyway) in order for it to recognize it as an audio device (and you'll have to select it to use it if you have built-in audio on the PC) and, if it's your primary display, for setting the resolution. Perhaps there are some issues with overscan or HDCP... All I can say is I have a 42" plasma connected to my system using ATI HD4850s and it's flawless, whether watching video or at the desktop. Try the Hulu desktop client. I run it full screen and I cannot tell I'm not watching my satellite broadcast. Awesome!
-
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
As mentioned before, it could be an overscan/upsampling issue. It could also be that, for whatever reason, your computer/graphics card is insisting on an HDCP-encrypted HDMI connection, and since it isn't getting an HDCP negotiation from the monitor (if HDCP isn't supported by it), the graphics card is downsampling the HDMI output to the equivalent of TV resolution, 480i/p. It is my understanding that the Windows media encryption chain only requires HDCP when it is playing media that is flagged as DRMed. But if for some reason the HDCP flag is being set all the time, then you'll get terrible resolution over HDMI.
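The fallback behaviour described above would boil down to something like this (a minimal Python sketch of the logic only; the function and parameter names are hypothetical, no real driver exposes anything like this):

    # Hedged sketch of the described HDCP fallback: if content is flagged
    # as protected and the display can't complete an HDCP handshake,
    # output is constrained to roughly DVD/480 quality.
    def negotiated_resolution(content_drm_flagged: bool,
                              display_supports_hdcp: bool,
                              native=(1920, 1080)):
        if content_drm_flagged and not display_supports_hdcp:
            return (720, 480)  # constrained image
        return native

    # If the DRM flag were stuck on, even the plain desktop would degrade:
    print(negotiated_resolution(True, False))   # (720, 480)
    print(negotiated_resolution(False, False))  # (1920, 1080)

That would be consistent with the symptom: unprotected desktop output should normally be unaffected, so a permanently set flag could explain blur everywhere.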
-
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
My 15 months of experience with an ATI HD4850 and a Sharp lc-le42dh77e TV:
1) ATI cards (all of them) are set by DEFAULT to scale the 1080p picture down to 80% or so; you HAVE to install CCC to adjust it to 100%. Otherwise: blurriness or unnatural colors on some parts of the screen, on almost every flavor of Windows. 720p set to 100% actually doesn't fit on the screen and needs to be shrunk (wth?).
2) Windows XP - the thing someone described earlier: you need to turn the TV on first and set the source, otherwise no picture. Same with waking from sleep or the monitor turning off and back on due to energy saving.
3) Vista - pissed me off, too slow, but no above-mentioned issue. I've been using it for like a week only.
4) Got Win 7 x64 (first the RC and now the release) - all good. It still requires CCC, but the scaled-down image isn't SO deteriorated. You might notice slight picture errors and black margins; that will tell you to scale the picture up properly, and then you're done.
All in all - cables don't matter. I have the cheapest ones and they work with the Xbox 360, a satellite receiver, a PC and a laptop. My bro has a 10-meter one, also the cheapest possible, and it works too. Sometimes, like once per day, the picture will flicker for a split second. You'd think the video had an artifact, like missing the lower half of one frame, but when you rewind it doesn't happen - so that makes me believe it is a hardware fault, cosmic rays, birds chirping too loudly - iono.
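For anyone wondering what that 80% default means in actual pixels, the arithmetic looks like this (Python; the 80% is just the figure quoted above, adjusted via CCC's overscan/underscan scaling slider):

    # How much of a 1080p panel an underscanned desktop actually covers.
    PANEL_W, PANEL_H = 1920, 1080
    SCALE = 0.80  # the ~80% default mentioned above; CCC lets you set 100%

    img_w, img_h = round(PANEL_W * SCALE), round(PANEL_H * SCALE)  # 1536x864
    border_x = (PANEL_W - img_w) // 2  # 192 px black bar left and right
    border_y = (PANEL_H - img_h) // 2  # 108 px black bar top and bottom
    print(f"Desktop rendered at {img_w}x{img_h}, "
          f"with {border_x}/{border_y} px borders -- then the TV rescales it")

At 100% the desktop fills the panel pixel-for-pixel and the blur goes away.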
-
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
I'm not sure about nVidia cards but, with ATI, there is a selection in the Catalyst Control Center to enable some of the hi-def resolutions and formats. These are usually grayed out because a lot of the earlier computer monitors didn't support them (as in: you will destroy your monitor if you try it!). That being the case, you might be attempting to force your TV/monitor into a resolution that it can emulate but doesn't natively support. See if you can select one of the HD resolutions - preferably 1080p (1920x1080). You also said that you were trying it on a smaller display. Does that display support 1080p or is it just 720p? That would make a big difference. Just my 2c.