Computer hooked to TV via HDMI looks bad


Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

MikeK
Patron of SPCR
Posts: 321
Joined: Mon Aug 18, 2003 7:47 pm
Location: St. Louis, USA

Computer hooked to TV via HDMI looks bad

Post by MikeK » Wed Jan 12, 2011 9:22 am

Continuing discussion started in this thread.
CA_Steve wrote:
arjunr wrote:does anyone have an idea why playing video with integrated graphics or 90% of graphics cards via hdmi to a TV doesn't look as good as, say, a ps3 or wdtv high def media player? i would love to build one of these intel integrated graphics machines for a htpc, but in my experience they don't even come close to standalone players.
This is headed off-topic...but...
It could be a couple of things.
- DRM/Hollywood getting in the way. For example, Netflix streams higher data rate to an appliance (PS3, TV, BD player) than via Silverlight on your PC.
- Your TV settings for the PC input. Some TVs, like Samsung LCD, reduce their horrible input lag by removing most of the video processing that makes the picture look good.
- Your gpu doesn't support/has awful support for the file type/resolution you want to stream.
In my experience doing HDMI from a Radeon HD 4670 to my 720p Toshiba TV, it looks like crap compared to using the VGA PC input. I have researched this and I'm not completely sure yet exactly why. It may be something to do with the 1:1 pixel mapping capability of your TV. On VGA it looks crystal clear, like a computer monitor should, and it picks the native resolution, 1366x768 I believe (yes, that's not 720p, is it). No matter what resolution I pick over HDMI, I can't get it to look as good and fill the screen. There are one or two resolutions that look native but oddly end up as 4:3 or something, with black bars on the sides. I'm switching to a 5770 soon, so we'll see how that goes over HDMI. I don't believe that card has VGA at all, so I'll be forced into HDMI or a DVI -> VGA adapter, since my TV doesn't have a DVI input. Hopefully the HDMI looks good; I'll report back how it goes. If this is something related to HDMI DRM I will be pretty annoyed. How do others find hooking their computers to their TVs... or HTPCs obviously?

So any ideas for this situation...? arjunr, I'm not sure my situation is the same as yours but it's possible. Have you tried using other inputs?

arjunr
Posts: 13
Joined: Sun Dec 05, 2010 11:02 am

Re: Computer hooked to TV via HDMI looks bad

Post by arjunr » Thu Jan 13, 2011 1:32 pm

it's really a strange issue... i've built a few of the SPCR build guides... and for the silent home server with the zotac h55-c-e board, the onboard video looks pretty bad compared to a standalone player... i can't figure out why _exactly_ it's bad... the picture isn't as vibrant, looks slightly degraded... i chalked it up to onboard video sucking and moved on...

for the gaming machine, i put in a 5870 and thought "this is one of the most expensive cards on the market so it should look great!" well, it looked just like the zotac board did... i should mention that in both cases the picture took up the whole screen and 1:1 pixel mapping was enabled on my TV (sony xbr4 52inch lcd)...

so I returned that card thinking it was bad and got a gtx460... that happened to look great on my TV, not perfect in displaying windows text and such, but for video I would say that it's as good as most standalone players, I can't notice a difference between it and my wdtv live plus box (which looks great on every tv).

i tried a radeon hd 6850 later and it looked poor as well though not as poor as the zotac. i've come to the conclusion that nvidia just makes better cards for TV performance, don't know why. Drivers on all cards were updated to latest (260.99 on nv, 10.12 on radeon).

MikeK
Patron of SPCR
Posts: 321
Joined: Mon Aug 18, 2003 7:47 pm
Location: St. Louis, USA

Re: Computer hooked to TV via HDMI looks bad

Post by MikeK » Thu Jan 13, 2011 2:04 pm

Hmm that's really disappointing. With HDMI/DVI I thought TVs worked just like large computer monitors now but apparently not. I guess I'll have to do some research on my TV model. I'll let you know how the 5770 goes too. I have a feeling I may be looking for a VGA adapter to use which is ridiculous. Digital to Analog to Digital all over again.

MelissaToshiba
-- Vendor --
Posts: 1
Joined: Wed Jan 19, 2011 6:41 am

Re: Computer hooked to TV via HDMI looks bad

Post by MelissaToshiba » Wed Jan 19, 2011 6:43 am

If your TV is able to display a maximum resolution of 720p, then make sure the resolution on your laptop is set to output to the TV at 1280x720 (720p). Depending on the native resolution of your laptop (if less than 1280x720) you may have to output to the TV only (not duplicate screens) to actually get 1280x720 (720p).

You will also need to turn on “Gaming Mode” to allow the graphics card on your laptop to do the video processing instead of the TV. You will see a significant difference in sharpness on desktop fonts and more.

Melissa, Community Manager for Toshiba Canada
Melissa@toshiba.ca

MikeK
Patron of SPCR
Posts: 321
Joined: Mon Aug 18, 2003 7:47 pm
Location: St. Louis, USA

Re: Computer hooked to TV via HDMI looks bad

Post by MikeK » Thu Jan 20, 2011 12:27 pm

Wow thanks for replying Melissa!

I just made a discovery. With the new Radeon 5770 installed, I plugged my computer into the TV over HDMI once again instead of VGA. Over VGA, the recommended resolution, and the clear one, was 1360x768 - it looked great. So I started going through the resolutions over HDMI with no luck. Most were blurry, most didn't take up the whole screen, and one was clear but looked like it was squeezed into 4:3. I had the TV's Pic Size set to Natural while doing this. Then I tried Native (which I had tried before). I happened to be on 1360x768 when I changed it, and I saw a new option - Dot By Dot! It had taken the place of the Native option. This must be the 1:1 pixel mapping. It looked great. I also tried the other available option, Full, and that also looked great; the only difference was that the picture got slightly bigger than the screen. I guess this is the effect I read about, where HDTVs have overscan built in just because TVs have always had it.
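The "slightly bigger than the screen" effect of Full mode is overscan: the TV crops a few percent off each edge and rescales. A minimal sketch of the arithmetic; the 2.5% figure is an illustrative assumption, not this TV's spec.

```python
# Sketch of overscan: the TV crops overscan_pct off each edge of the
# source image and stretches what remains to fill the panel, so part
# of the picture ends up "bigger than the screen".
def visible_area(width: int, height: int, overscan_pct: float):
    """Return the portion of the source image actually shown after overscan."""
    crop_w = round(width * overscan_pct / 100)
    crop_h = round(height * overscan_pct / 100)
    return width - 2 * crop_w, height - 2 * crop_h

# With a hypothetical 2.5% overscan, a 1360x768 desktop loses its edges:
print(visible_area(1360, 768, 2.5))  # (1292, 730)
```

Dot By Dot (1:1 pixel mapping) is simply `visible_area(w, h, 0)` - no crop, no rescale, which is why desktop text looks sharp in that mode.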

I don't think I'd tried Full before since I stay away from it with TV and Bluray, and I don't think I saw the Dot By Dot option last time I tried HDMI with my other video card. I'm pretty sure it wasn't there as an option but I guess I can't be sure.

Anyway, things are great over HDMI now. So apparently the native resolution of the TV is 1360x768.
I checked and my TV model is a Toshiba 37AV502U. The specs just say its native resolution is 720p.
http://cdgenp01.csd.toshiba.com/content ... u_spec.pdf

I have not tried Game Mode so I'll have to do that. I wonder if I'll have to change the Pic Size or Game Mode when switching between using the computer and using other sources.

"Gaming Mode enhances the gaming experience by reducing the amount of time it takes for the signal to travel from the game controller to the TV"
So from reading this and what you said, Melissa, it sounds like it's turning off the TV's video processing to save on response time from the video source to the display. I will definitely try that!

I'm actually using a desktop tower computer so no laptop monitor involved. We have a wireless mouse and keyboard and we have been using it for games, internet browsing, video watching etc. and it's great. It is in the living room though so the plan is to put the computer in the basement right under the TV area and run HDMI and USB from it so we have video/audio and the wireless dongles. The HDMI working improves things since I can avoid VGA and might be able to avoid using audio cables, though probably not since my audio receiver just has an HDMI switcher and doesn't take audio from it. So I'm working my way to the HTPC but mine could be used for gaming more than everything else.

lodestar
Posts: 1683
Joined: Fri Aug 05, 2005 3:29 am
Location: UK

Re: Computer hooked to TV via HDMI looks bad

Post by lodestar » Thu Jan 20, 2011 1:48 pm

MikeK wrote:So apparently native resolution of the TV is 1360x768.
It's actually 1366x768 - the 1360 arises because ATI cards can only display resolutions divisible by 8. So the 768 part is fine, but 1366 is not, and it gets reduced to 1360, which is divisible by 8. nVidia cards can do 1366x768 because they use a divisor of 1.

The combination of a 1366x768 native resolution panel and a 720p designation was quite common at one time. It means that 720p, which is 1280x720, is displayed at non-native resolution with some loss of quality. There were some TVs and TV/monitors with DVI inputs, which meant they could be fed a native-resolution image from a graphics card, with some other arrangement being necessary for the sound.
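The 1366-to-1360 rounding described above can be sketched as a one-line calculation; the divisor values (8 for ATI, 1 for nVidia) are taken from the post itself, not verified driver behaviour.

```python
# Sketch of the rounding described above: a driver that only accepts
# horizontal resolutions divisible by some granularity rounds the
# panel's native width down to the nearest acceptable value.
def rounded_width(native_width: int, granularity: int) -> int:
    """Round a horizontal resolution down to a multiple of `granularity`."""
    return native_width - (native_width % granularity)

# With an 8-pixel granularity (as claimed for ATI cards), a 1366-wide
# panel gets a 1360-wide signal; with a divisor of 1, the width survives:
print(rounded_width(1366, 8))  # 1360
print(rounded_width(1366, 1))  # 1366
```

Note the 768 half is unaffected either way, since 768 is already a multiple of 8.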

MikeK
Patron of SPCR
Posts: 321
Joined: Mon Aug 18, 2003 7:47 pm
Location: St. Louis, USA

Re: Computer hooked to TV via HDMI looks bad

Post by MikeK » Thu Jan 20, 2011 2:51 pm

Thanks for that info. Interesting - I thought I had seen 1366 before. Maybe it's an option in Windows when I plug it in with VGA.
*edit - it isn't an option over VGA either. I can't remember where I saw that; maybe I just remembered it was a standard resolution. I guess I'll stick with VGA for now, since my TV only has 2 HDMI inputs and I'm using those for cable and bluray. My audio receiver has an HDMI switcher with 2 inputs, so I could use another if I had to, but then I'd have to switch inputs on both the TV and the receiver for video, plus the normal audio switching on the receiver as usual.

danimal
Posts: 734
Joined: Mon Jun 08, 2009 2:41 pm
Location: the ether

Re: Computer hooked to TV via HDMI looks bad

Post by danimal » Fri Jan 21, 2011 8:16 pm

MelissaToshiba wrote:If your TV is able to display a maximum resolution of 720p, then make sure the resolution on your laptop is set to output to the TV at 1280x720 (720p).
great post.

the moral of the story is that you should always set the video card to the native resolution of the monitor; win7 will read the make and model of the monitor and usually default to the correct native resolution... vga is never better than the digital input.

wrt gaming on a monitor/tv: frame lag can be measured, don't depend on mfg specs, and panel type is a big factor:
http://www.avsforum.com/avs-vb/showthread.php?t=1131464

also, whether the monitor can display 4:4:4 color sampling can be checked by taking a macro shot of the screen, but few cheap monitors have that capability... i just ordered a 32lg450 that will do it, but others didn't want it because it can't do 120hz :roll: people who don't understand technology don't know what they're missing.

lastly, the digital port you use can affect the 4:4:4 display capability; people are reporting that dvi to hdmi is the way to go, for both nvidia and ati cards.
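Why 4:4:4 matters for desktop text: when a link sends subsampled chroma instead, color detail is averaged over blocks of pixels. A minimal model of 4:2:0 subsampling (one common alternative to 4:4:4), assuming a toy chroma plane rather than any real video pipeline:

```python
# Minimal model of 4:2:0 chroma subsampling: luma is kept per pixel,
# but chroma is averaged over each 2x2 block, so single-pixel color
# detail (like the fringes of colored text) gets smeared.
def subsample_420(chroma):
    """Average a chroma plane (list of equal-length rows) over 2x2 blocks."""
    out = [row[:] for row in chroma]
    for y in range(0, len(chroma) - 1, 2):
        for x in range(0, len(chroma[0]) - 1, 2):
            avg = (chroma[y][x] + chroma[y][x + 1]
                   + chroma[y + 1][x] + chroma[y + 1][x + 1]) // 4
            for dy in (0, 1):
                for dx in (0, 1):
                    out[y + dy][x + dx] = avg
    return out

# A sharp one-pixel chroma edge...
sharp = [[0, 255], [0, 255]]
# ...comes back as a uniform mid value after 4:2:0 averaging:
print(subsample_420(sharp))  # [[127, 127], [127, 127]]
```

The macro-photo test danimal mentions works because this smearing is visible on single-pixel colored patterns; with true 4:4:4 the pattern survives intact.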

MikeK
Patron of SPCR
Posts: 321
Joined: Mon Aug 18, 2003 7:47 pm
Location: St. Louis, USA

Re: Computer hooked to TV via HDMI looks bad

Post by MikeK » Wed Jan 26, 2011 8:04 am

I tried using Game Mode on my tv and found out that option is grayed out in the menu when connected to the PC Input (VGA). I'm guessing this means that it's enabled when using PC Input. I'm planning to continue using the VGA now since I don't have enough HDMI ports to use easily. It looks really good still so I'm happy. Now to cut a hole in the wall so I can put the computer in the basement :twisted:

jhhoffma
Posts: 2131
Joined: Mon Apr 25, 2005 10:00 am
Location: Grand Rapids, MI

Re: Computer hooked to TV via HDMI looks bad

Post by jhhoffma » Thu Jan 27, 2011 12:39 pm

Color space is also an issue, depending on the OS and drivers used. Windows XP has to be tricked into sending IRE-standard black and white levels via HDMI. Otherwise, the full 0-255 range (PC levels) gets compressed into the 16-235 video levels, which lifts blacks so they look gray and washed out, and lowers whites so they look dim.

Vista and Win7 should not suffer from this, as the newer Catalyst drivers handle it better. Regardless of OS, there is a setting in the Catalyst drivers for "Pixel Format" (I don't remember where, off-hand) that should be set to "full RGB 4:4:4" when using HDMI to send content produced to IRE standards (DVDs, BluRay, HDDVD, etc).
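The level compression described above is a simple linear remap; a sketch of the standard full-to-limited mapping (this is the generic video-levels formula, not anything specific to Catalyst):

```python
# Compress full-range PC levels (0-255) into limited video levels
# (16-235): black is lifted to 16 and white is lowered to 235, so a
# display expecting full range shows gray blacks and dim whites.
def pc_to_video_level(value: int) -> int:
    """Map a full-range 0-255 level to the limited 16-235 video range."""
    return 16 + round(value * 219 / 255)

print(pc_to_video_level(0))    # 16  (black is lifted)
print(pc_to_video_level(255))  # 235 (white is dimmed)
```

If the TV then interprets that signal as full range, the remap is applied without being undone, which is the washed-out picture described above.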

Not sure if that's the problem you're having, but it's definitely something that comes up a lot as more people convert older systems into HTPCs...
HTPC: OrigenAE X11|Gigabyte GA-MA785GPMT-UD2H|Phenom II x3 740BE w/AC Freezer 7|150GB Velociraptor|Corsair VX450
Main: Antec 300 (SlipStream @ 800rpm/140mm @ 5v)|Asus M4A88TD-M|Phenom II x4 945 (Mugen2 pass.)|Asus EAH6850|Samsung 830 128GB|Antec TP750
WHS: DF-85|P8H67-M Pro|I5-3450S/Hyper 212+|Corsair AX650|Sandisk Extreme 240GB, 2xWD20EARS, 2x WD15EARS, WD15EADS

MikeK
Patron of SPCR
Posts: 321
Joined: Mon Aug 18, 2003 7:47 pm
Location: St. Louis, USA

Re: Computer hooked to TV via HDMI looks bad

Post by MikeK » Sat Jan 29, 2011 1:44 pm

Good info. I have Windows 7 on it now. Once I got the HDMI working properly, I actually didn't notice much difference between VGA-to-VGA and HDMI-to-HDMI, but I didn't take the time to compare closely. Of course digital-to-digital is always better than lots of conversions.
