All HDMI not created equal?
-
Makes sense on the startup order. I'm a little concerned about the desktop quality since I'm watching Netflix streaming stuff which, of course, is through a browser. I knew it wouldn't be Blu-ray quality stuff, but was hoping for something at least watchable. Hell, I may have to use the TV's VGA input if this is as good as it gets. Yuck.
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
I don't stream from the net. I copy the DVD and put the ISO file on my HTPC box, and watch it that way. I haven't actually tried a Netflix movie that way yet, but I'll give it a shot and let you know how it looks for me if you're interested.
.45 ACP - because shooting twice is just silly
-----
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997
-----
"The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001 -
If HDCP cannot be negotiated, the HDMI provider can output image and video in degraded quality... But this almost always only applies to Blu-ray movies, and then only through special player software. Bare Windows should never produce ghost images, etc. due to HDCP negotiation failure. After all, the Windows desktop isn't copyright-protected material in the same sense as Blu-ray movies. :)
-- Kein Mitleid Für Die Mehrheit
I've been using HDMI for several years now, with HTPCs, media players, set-top boxes, etc., and never seen 'ghosting'. Still waiting. The only thing I can think of is aliasing due to overscan, meaning that source and display mapping isn't 1:1, but I would take 'ghosting' to mean multiple echoes of an image.
Join the cool kids - Come fold with us[^]
-
Wow, after reading this thread, I hope my CRT TV will still hold on a bit, since I'd gullibly bought into all the HDMI / 720p / nVidia / DVI / VGA stuff.
-
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
I use a cheap onboard shared-memory HDMI output for my HTPC to power my 47" LG 47SL80 LCD and it works just fine. I had a hell of a time getting my HTPC up and running initially with HDMI, but my issue was no picture at all (not the same as your problem). I even fried a motherboard on the PC at one point while connecting cables. Eventually it all came down to a cheap cable. I had bought a pack of 3 HDMI cables for $10 from China based on reviews from some friends. Several friends have the cheap cables and they worked fine. During troubleshooting, I even gave them one of my cables and it worked fine for them. As soon as I went down to the local PC store (not an electronics store) and bought a new cable for $10 (so still not a "super high quality" cable), everything started working perfectly. I guess the cheap cables (even though they were rated for HDMI 1.3) simply weren't good enough for my higher quality TV.
-
I don't stream from the net. I copy the DVD and put the ISO file on my HTPC box, and watch it that way. I haven't actually tried a Netflix movie that way yet, but I'll give it a shot and let you know how it looks for me if you're interested.
.45 ACP - because shooting twice is just silly
-----
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997
-----
"The staggering layers of obscenity in your statement make it a work of art on so many levels." - J. Jystad, 2001
Someday in the future, when the technology is a bit more stable, I'll probably go the same route as you for the movies I buy. It appeals to my inner geek, but I'm just not ready to fight the wars yet. As for Netflix, I cancelled my Dish subscription when I realized I was paying for 250 channels and only watching the movies, so my impulse / something-to-watch-while-I-eat-dinner viewing is the instant streaming stuff from Netflix, a deciding factor in my going with them. I knew that watching a Silverlight / browser-based source wouldn't be true HD; I'm just going for the best bang for the buck between convenience and quality. All of which is my typically long-winded way of saying yeah, if you want to fire up some instant streaming stuff via the browser, I'd be most interested in what you learn in terms of optimizing the quality of the experience.
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services -
Overscan. TVs for 720p actually have 768-line displays to accommodate the overscan, whereas 1080p displays don't, so the overscan has to be hidden by scaling. It gets complicated after that, partly because VGA inputs on 1080p TVs don't have overscan enabled even at the same resolution.
Join the cool kids - Come fold with us[^]
Ummm wait, what? Am I misunderstanding what's going on? My understanding was that the overscan behavior cropped the 1280x720 (or 1920x1080) image down to 1260x700 (1900x1060) and then upsampled it to whatever the panel's native resolution was.
3x12=36 2x12=24 1x12=12 0x12=18
-
OK, how?
Join the cool kids - Come fold with us[^]
-
Yeah, I'm pretty sure I have stuff matched up. However, this is an inexpensive, small Philips set, so I should probably reserve judgement until I try it with the Panasonic 65 that's coming in. I'd whine about longing for simpler times, but I have vivid memories of black & white TVs and countless hours of fiddling with rabbit ears (and the requisite aluminum foil strip) to get good reception, so I guess it's all the same. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
Christopher Duncan wrote:
I have vivid memories of black & white TVs and countless hours of fiddling with rabbit ears (and the requisite aluminum foil strip) to get good reception, so I guess it's all the same
3 15 27 95 436 channels and there's nothing on that's worth watching.
Software Zen:
delete this;
-
Christopher Duncan wrote:
I have vivid memories of black & white TVs and countless hours of fiddling with rabbit ears (and the requisite aluminum foil strip) to get good reception, so I guess it's all the same
3 15 27 95 436 channels and there's nothing on that's worth watching.
Software Zen:
delete this;
Yeah. Hence my cancellation of Dish. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services -
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
I'm using an nVidia GeForce 8500-series card for my media PC. Picture quality is fine with HDMI (Vizio 32" 1080i/720p TV), but because I don't have the little cable that runs from the onboard sound card (and Dell didn't solder the connection on the motherboard), I don't get any sound. Windows 7 (and Windows Vista) cuts off analog sound output when you plug in an HDMI connection. My DVI-HDMI cable does the same. So, in order to use the HDMI cable, I have to purchase a retail sound card with the connecting cable. When we get ready to replace the system in the coming months, it will be worth it, as the HDMI output looks much better - the system correctly identifies the display and resolution. With the VGA cable, I can set the TV to either 1280x720 (which looks okay, but too soft around the edges) or 1360x768. The display is rated at 1366x768, and I can't seem to get the card/display to that resolution with the VGA cable. Flynn
-
Most 1080p TVs operate at 1920x1200 resolution, not the expected 1920x1080, though.
-
Ummm wait, what? Am I misunderstanding what's going on? My understanding was that the overscan behavior cropped the 1280x720 (or 1920x1080) image down to 1260x700 (1900x1060) and then upsampled it to whatever the panel's native resolution was.
3x12=36 2x12=24 1x12=12 0x12=18
No, 720p panels are 1280x720 for the main area and 1366x768 to encompass the overscan. Yes, I know, but that's what they do. Add to that, some small TVs are 1440x900, and 1080p TVs are 1920x1080 and scale to put the overscan outside the visible area from an image that is broadcast at 1920x1080... I always turn the 1080 overscan off and it looks better because you get 1:1 mapping. :cool: There is an awful lot more but I'll save you the pain.
Join the cool kids - Come fold with us[^]
-
It puts roadblocks in place for people who think they should be able to make unrestricted copies of digital media without having to pay for any of it. :rolleyes:
3x12=36 2x12=24 1x12=12 0x12=18
-
When the Panasonic comes in, I'll experiment with that and look for the kind of filtering stuff you're talking about. Thanks, man.
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
Well, on the Samsung TV I was using, you had to plug into HDMI slot 2 and then set the name of the source to 'PC', so it wasn't exactly intuitive. But it certainly did the trick.
My current favourite quote is: Punch them in the face, see what happens!
-SK Genius
-
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
Christopher Duncan wrote:
So, exactly how stupid and off track am I in this regard?
*bites tongue* That was a setup and I am NOT taking it. :) I have too much respect for you to fall for that one....
Christopher Duncan wrote:
Does the video card in question simply suck
Which card? PNY used to have a Quadro contract, meaning they made the premium boards for nVidia based on a history of good board fab methods; they lost it because of poor practices, bad customer service, and poor-quality boards. They have been trying to restore their reputation, so quality has come up... but it is entirely possible you got a bad card. I have been running HDMI on nVidia cards for well over a year now.
Christopher Duncan wrote:
It goes without saying that something sucks. I'm just not sure what,
When in doubt, blame someone else... ;P picking randomly works too.
_________________________ John Andrew Holmes "It is well to remember that the entire universe, with one trifling exception, is composed of others." Shhhhh.... I am not really here. I am a figment of your imagination.... I am still in my cave so this must be an illusion....
-
No, 720p panels are 1280x720 for the main area and 1366x768 to encompass the overscan. Yes, I know, but that's what they do. Add to that, some small TVs are 1440x900, and 1080p TVs are 1920x1080 and scale to put the overscan outside the visible area from an image that is broadcast at 1920x1080... I always turn the 1080 overscan off and it looks better because you get 1:1 mapping. :cool: There is an awful lot more but I'll save you the pain.
Join the cool kids - Come fold with us[^]
I'm still not 100% sure I follow. Is this correct? 720p: screen size 1366x768; normal video resolution 1280x720; video resolution with overscan 1366x768. 1080p: screen size 1920x1080; normal video resolution 1920x1080 minus the overscan area; video resolution with overscan 1920x1080.
3x12=36 2x12=24 1x12=12 0x12=18
-
I'm still not 100% sure I follow. Is this correct? 720p: screen size 1366x768; normal video resolution 1280x720; video resolution with overscan 1366x768. 1080p: screen size 1920x1080; normal video resolution 1920x1080 minus the overscan area; video resolution with overscan 1920x1080.
3x12=36 2x12=24 1x12=12 0x12=18
Almost. Most 1080p TVs scale the image up about 5% to hide the overscan area. Why? This is what happens when standards are developed by committees of people from several industry sectors, each with their own proprietary solutions in place. It comes under various names, but 1:1 mapping actually looks better, so switch the overscan off if you have the option. There is even an option for 1920x1088 plus a pan vector to say which bit to crop vertically!
Join the cool kids - Come fold with us[^]
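For anyone who wants to see the arithmetic, here's a rough sketch of what that ~5% overscan does to the pixel mapping. This is a toy calculation based on the figures quoted in this thread (5% overscan, the 1280x720 / 1366x768 panel sizes), not on any official spec:

```python
# Rough overscan arithmetic, using the numbers from this thread.
# Not taken from any standard -- purely illustrative.

def overscan_crop(width, height, percent=5.0):
    """Return the visible region after the TV crops `percent` overscan,
    plus the upscale factor needed to stretch it back to full panel size."""
    crop_w = round(width * percent / 100)
    crop_h = round(height * percent / 100)
    visible = (width - crop_w, height - crop_h)
    # The cropped picture is scaled back up to the panel's native
    # resolution, so source pixels no longer map 1:1 to panel pixels.
    scale = width / visible[0]
    return visible, scale

visible, scale = overscan_crop(1920, 1080)
print(visible)            # the part of the 1080p frame you actually see
print(round(scale, 3))    # the upscale factor that blurs 1:1 detail

# A 720p panel goes the other way: the 1280x720 signal is stretched to
# the 1366x768 panel so the overscan falls outside the visible area.
print(round(1366 / 1280, 3))
```

Which is why turning overscan off (1:1 / "just scan" / native mode) looks sharper: the roughly 5% rescale disappears and each source pixel lands on exactly one panel pixel.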
-
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
Most TVs do a lot of things for "optimal" TV and movie viewing: 100/200 Hz modes and motion compensation, removing (probably ugly) borders, and blurring/sharpening to make the picture look better. Newer TVs often have a PC or native mode, where each pixel from the PC corresponds to one on the display. The already-mentioned trick of having the TV on while booting the first time makes sure that the PC sends a 1080p picture, provided the TV sends the correct info to it. If not, set it to 1920x1080 on a full HD TV. To get the correct viewing mode, however, you have to enter the TV's menu. On mine you select Picture mode: PC and Picture size: Native. TV models more than 2 years old may not have the correct modes, or may have them tied to the VGA and DVI inputs. Regards, Rune B.
-
I installed a new PNY / Nvidia video card in the box that will be serving the plasma TV yesterday. As the TV hasn't arrived, I tested the HDMI output configuration on a smaller display. To my surprise, while the VGA output tested reasonably crisp and clear, the HDMI output was terrible, with a great deal of ghosting. Admittedly, I haven't done much research on this and just assumed (ah, we've located the source of the trouble now, haven't we?) that HDMI would be superior hi def quality in comparison to VGA. So, exactly how stupid and off track am I in this regard? Does the video card in question simply suck, does HDMI in general suck, or are there other considerations I should bear in mind when purchasing a video card to drive a plasma TV? It goes without saying that something sucks. I'm just not sure what, and thus couldn't provide the standardized subject for this post. :)
Christopher Duncan
www.PracticalUSA.com
Author of The Career Programmer and Unite the Tribes
Copywriting Services
I purchased an ATI HD 4350 card with HDMI to connect my HTPC (half-height case) to my Samsung 32" Series 6 and got exactly the same results that you have described. Using HDMI was gawd-damn awful, even when viewing DVD-quality video. If you have the option, connect using VGA. My Samsung (true HD @ 1920x1080) displays the VGA signal in near-HD quality at all times, as compared to the barely viewable HDMI connection, and no problems with resolution settings, etc. Interestingly, the Samsung has a dedicated HDMI socket for PC/DVI connections. I tried all the HDMI sockets with the same results. A VERY disappointing experience to say the least, given the hype regarding HDMI. As an aside, my friend just purchased a 42" Panasonic LCD TV with a free Panasonic Blu-ray player. Connecting the Blu-ray player via HDMI produced a beautiful HD picture but IMHO, the picture quality was no better than my Samsung/VGA combination. I'm guessing that HDCP has a lot to do with the problem, degrading the signal because of a lack of the proper codes/certificates, etc., that are needed when viewing HD content on anything. Fortunately, it appears that VGA bypasses this for now.
Regards Andy M
-
I don't think there is that much difference in HDMI cable quality past a certain price point ($20-30 max); I would not get the cheapest of the cheap, but one from a more reputable vendor (if such a thing exists). There is probably one area where "better" quality could mean something: if you want a longer cable than what is usually the norm. Anyway, I'm still on regular cable and a tube TV, so my opinion doesn't count for that much. Max.
Watched code never compiles.
HDMI is a digital connection; there are different specifications (1.2, 1.3, etc.) that will make a difference in the features supported, but brand-to-brand does not make a difference in quality. Since interference either happens or doesn't (it's digital), you either get correct data to the TV or you don't; it's not "blurry" or "static-y". A blurry image is more likely caused by a subpar analog/digital conversion at some point, or by upscaling.
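To illustrate that point with a deliberately silly toy model (this has nothing to do with how HDMI/TMDS actually encodes data, it's just the analog-vs-digital failure modes in miniature):

```python
import random

random.seed(1)  # deterministic for the example

# Analog: noise nudges every sample a little, so the whole picture gets
# uniformly softer/noisier. Digital: each sample either arrives
# bit-perfect or is corrupted outright (sparkles/dropouts) -- it is
# never "slightly blurry".

def analog_link(pixels, noise=5):
    """Every pixel arrives, but each one is perturbed slightly."""
    return [max(0, min(255, p + random.randint(-noise, noise))) for p in pixels]

def digital_link(pixels, error_rate=0.01):
    """Each pixel either arrives exactly, or is lost entirely."""
    return [p if random.random() > error_rate else None for p in pixels]

src = [128] * 1000
analog = analog_link(src)
digital = digital_link(src)

print(sum(1 for p in analog if p != 128))    # most samples slightly off
print(sum(1 for p in digital if p is None))  # a handful lost outright
print(all(p == 128 for p in digital if p is not None))  # survivors exact
```

The surviving digital pixels are always exact, which is why a genuinely blurry picture over HDMI points at scaling or conversion somewhere in the chain rather than at the cable.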