where have the Home Theatre MOBOs gone?

Got a shopping cart of parts that you want opinions on? Get advice from members on your planned or existing system (or upgrade).

astrayan
Posts: 45
Joined: Fri Jun 17, 2005 5:54 am
Location: astraya

where have the Home Theatre MOBOs gone?

Post by astrayan » Fri Mar 28, 2008 10:59 pm

I'm having a hell of a job replacing my A8N-VM-CSM, which got hit by lightning. I thought it would be easy to replace, but finding TV-out on a mobo is hard, because they assume you want HDMI, and the boards with HDMI, such as the P5E-VM, only have HDMI and no DVI for your LCD! They have a spare stupid d-sub port which nobody will use.

Then when I look at Graphics Card reviews, Tom's Hardware says I'll be lucky to find any card that idles below 19W. SCREW THAT!

http://www.tomshardware.com/2007/08/21/ ... page5.html

I assume that anything which translates VGA to tv-out will be pretty bad.

jaganath
Posts: 5085
Joined: Tue Sep 20, 2005 6:55 am
Location: UK

Post by jaganath » Sat Mar 29, 2008 2:19 am

astrayan wrote:when I look at Graphics Card reviews, Tom's Hardware says I'll be lucky to find any card that idles below 19W.
THG is full of s**t. 7300GS/GT, 7600GS/GT, 7100GS, 6200, 6600, 6800, 8500GT all idle <19W. link

lobuni
Posts: 73
Joined: Thu Aug 23, 2007 2:33 am

Post by lobuni » Sat Mar 29, 2008 2:24 am

If your TV has an RGB SCART input, you can experiment with a VGA to RGB-SCART cable.

Would an HDMI to DVI cable/dongle work with that motherboard?

http://www.google.com/search?hl=en&q=as ... tnG=Search

The answer is: it should. (Haven't read the whole review, though.)

http://www.silentpcreview.com/article785-page1.html
"Accessories: 1 x HDMI-to-DVI conversion adapter"

mcoleg
Posts: 410
Joined: Fri Feb 23, 2007 11:55 pm

Post by mcoleg » Sat Mar 29, 2008 3:09 am

HDMI to DVI works perfectly; both are pretty much the same thing.

astrayan
Posts: 45
Joined: Fri Jun 17, 2005 5:54 am
Location: astraya

Post by astrayan » Sat Mar 29, 2008 4:05 am

I've figured out part of the puzzle. All the home theatre mobos are AM2. However, I know nothing about the ATI X1250 and whether it will look OK on a TV. The GeForce 6150 drivers only worked properly in one version, and by the next version they had f^cked it up again.

I'm disappointed that the Intel chipsets don't have DVI.

astrayan
Posts: 45
Joined: Fri Jun 17, 2005 5:54 am
Location: astraya

Post by astrayan » Sat Mar 29, 2008 4:07 am

mcoleg wrote:HDMI to DVI works perfectly; both are pretty much the same thing.
Yeah, it just seems really weird that they made this expensive board (P5E-VM) and you can run an LCD with an adapter, OR an HDTV, but not both. Haven't we moved into an era where the d-sub is obsolete yet? Esp on this mobo.

astrayan
Posts: 45
Joined: Fri Jun 17, 2005 5:54 am
Location: astraya

Post by astrayan » Sat Mar 29, 2008 6:18 am

Thanks for the link, Jaganath. It would appear that if you are going to build an HTPC based around an Intel chipset, then the best gutless choices of graphics card are:

HD3450 (DirectX 10.1): 7-10W
HD2400 Pro (DirectX 10): 7-14W
HD2400 XT (DirectX 10): 7-19W

The specs for the HD2400 XT say it is quite fast, but there are obviously hyped fan versions and dumbo passive versions.
http://www.hothardware.com/Articles/ATI ... ve/?page=2

The thing I like about the 3450 is that it goes from 7W-10W. Not a lot of dynamic range.

The HD3650 could be run without a fan in 2D mode, but I know that if I do that, I will forget, and blow it up playing a game.
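
Just to put those idle numbers in perspective, a rough back-of-envelope sketch (the 24/7 on time and the $0.15/kWh rate are assumptions of mine, not figures from the article):

# Back-of-envelope: annual running cost of an always-on HTPC at a given
# idle power draw. Both the 24/7 duty cycle and the $0.15/kWh rate are
# assumptions -- plug in your own tariff.
HOURS_PER_YEAR = 24 * 365
RATE_PER_KWH = 0.15  # assumed electricity price, $/kWh

def annual_cost(idle_watts):
    kwh_per_year = idle_watts * HOURS_PER_YEAR / 1000.0
    return kwh_per_year * RATE_PER_KWH

for watts in (7, 10, 19):
    print(f"{watts:>2} W idle -> ~${annual_cost(watts):.2f}/year")

# 7 W -> ~$9.20/year, 19 W -> ~$24.97/year. The dollars are modest;
# the bigger deal for a quiet HTPC is the extra heat you have to move.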

mcoleg
Posts: 410
Joined: Fri Feb 23, 2007 11:55 pm

Post by mcoleg » Sat Mar 29, 2008 6:31 am

astrayan wrote:
mcoleg wrote:HDMI to DVI works perfectly; both are pretty much the same thing.
Yeah, it just seems really weird that they made this expensive board (P5E-VM) and you can run an LCD with an adapter, OR an HDTV, but not both. Haven't we moved into an era where the d-sub is obsolete yet? Esp on this mobo.

really have no good answer for that. i guess most of the time designers are too afraid to take a leap - look at the a/v amps, for example. would it kill them to put a unit on the market that has only hdmi and component switches built-in and drop all the old-style connectors? no, has to be backward-compatible... shame, really.

Moogles
Posts: 315
Joined: Thu Mar 22, 2007 10:28 am

Post by Moogles » Sat Mar 29, 2008 8:52 am

astrayan wrote:Haven't we moved into an era where the d-sub is obsolete yet? Esp on this mobo.
No, but at least TV-out is!

:D

astrayan
Posts: 45
Joined: Fri Jun 17, 2005 5:54 am
Location: astraya

Post by astrayan » Sun Mar 30, 2008 4:50 am

Moogles wrote:No, but at least TV-out is!
:D
That may seem like a good joke, but it doesn't work. TV-out is not just an adapter; it's a whole graphics-to-analogue interpolation and clipping scheme, whereas d-sub can be driven straight from DVI with a passive adapter. The newer graphics cards with two DVI connectors supply the adaptors.

On my A8N board, the only way I ever used the second graphics output was via TV-out. That blocked the d-sub output. d-sub can't be used for anything except old analogue monitors. If you have a board with a d-sub and a DVI, they are really asking you to plug an LCD monitor into it in analogue mode (for no good reason).

DVI can be adapted to both d-sub and HDMI. TV-out is still needed for people who have hi-def TVs with heaps of S-video inputs.

astrayan
Posts: 45
Joined: Fri Jun 17, 2005 5:54 am
Location: astraya

Post by astrayan » Thu Apr 03, 2008 6:28 pm

aww, crud, the numbers in that article/forum post have changed

HD 2400 XT: 7W-15W-19W
HD 2400 Pro: 9W-15W
HD 3450: 18W-36W

I can't see how the 3450 can take that much power, because it is passively cooled with a modest sink.
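
Rough sanity check on that, using temp rise = power x thermal resistance; the 2.5 degC/W for a modest passive sink and the 30 degC case ambient are guesses of mine, not measurements:

# Sanity check: crude steady-state estimate, temp_rise = power * R_theta.
# R_THETA is a guessed thermal resistance for a small passive sink with no
# direct airflow; real values depend on the sink and the case airflow.
R_THETA = 2.5   # degC per watt (assumption)
AMBIENT = 30    # degC inside a quiet case (assumption)

for watts in (10, 18, 36):
    rise = watts * R_THETA
    print(f"{watts} W -> +{rise:.0f} degC, sink around {AMBIENT + rise:.0f} degC")

# With these guesses, 36 W puts the sink near 120 degC, which no modest
# passive cooler will survive -- so the 18W-36W figure does look suspect.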

Spare Tire
Posts: 286
Joined: Sat Dec 09, 2006 9:45 pm
Location: Montréal, Canada

Post by Spare Tire » Thu Apr 03, 2008 11:01 pm

How about getting one of those D-sub to component dongles?

EDIT: Just put more research into it, and that won't work with a TV, only with specific proprietary projectors. The signals aren't the same.
