where have the Home Theatre MOBOs gone?
I'm having a hell of a job replacing my A8N-VM-CSM, which got hit by lightning. I thought it would be easy to replace, but finding TV-out on a mobo is hard, because they assume you want HDMI, and the boards that do have HDMI, such as the P5E-VM, only have HDMI and no DVI for your LCD! They have a spare stupid D-sub port which nobody will use.
Then when I look at graphics card reviews, Tom's Hardware says I'll be lucky to find any card that idles below 19 W. SCREW THAT!
http://www.tomshardware.com/2007/08/21/ ... page5.html
I assume that anything which translates VGA to TV-out will be pretty bad.
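To put a number on why 19 W idle grates: here's a rough back-of-envelope script for what an always-on HTPC card's idle draw costs per year (the $0.15/kWh tariff and 24/7 uptime are just my assumptions; plug in your own):

```python
# Rough annual running cost of a graphics card's idle power draw.
# Assumes the HTPC idles 24/7; $0.15/kWh is a placeholder tariff.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.15  # USD, assumed; adjust for your utility

def annual_cost(idle_watts: float) -> float:
    kwh = idle_watts * HOURS_PER_YEAR / 1000  # watt-hours -> kWh
    return kwh * PRICE_PER_KWH

for watts in (7, 10, 19):
    print(f"{watts:>2} W idle -> {annual_cost(watts):.2f} USD/year")
```

At those assumptions a 19 W idler costs roughly $25/year versus about $9 for a 7 W card, before you even count the PSU's conversion losses.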
If your TV has an RGB SCART input, you can experiment with a VGA-to-RGB-SCART cable.
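One caveat before anyone wires one up: a passive cable only works if the card and driver can be coaxed into outputting TV-rate (~15 kHz) RGB with composite sync; standard VGA timings won't display. The usual wiring, as I understand it (from memory, not a datasheet, so verify against a real pinout before soldering):

```python
# Commonly cited VGA (DE-15) -> SCART RGB wiring. VERIFY before building;
# this is a sketch from memory, not a datasheet.
# A passive cable also needs the driver to output ~15 kHz RGB with
# composite sync -- standard 31 kHz VGA timings won't display on a TV.
VGA_TO_SCART = {
    1: 15,   # Red   -> SCART red input
    2: 11,   # Green -> SCART green input
    3: 7,    # Blue  -> SCART blue input
    13: 20,  # HSync -> SCART composite sync (H and V must be combined)
}
# SCART pin 16 must be pulled to roughly +1..3 V so the TV selects RGB
# mode, and all the ground returns need connecting too (omitted here).
```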
Would an HDMI-to-DVI cable/dongle work with that motherboard?
http://www.google.com/search?hl=en&q=as ... tnG=Search
The answer is: it should (I haven't read the whole review, though).
http://www.silentpcreview.com/article785-page1.html
"Accessories: 1 x HDMI-to-DVI conversion adapter"
I've figured out part of the puzzle: all the home theatre mobos are AM2. However, I know nothing about the ATI X1250, or whether it will look OK on a TV. The GeForce 6150 drivers only worked properly in one version, and by the next version they had f^cked it up again.
I'm disappointed that the Intel chipsets don't have DVI
Thanks for the link, Jaganath. It would appear that if you are going to build an HTPC based around an Intel chipset, then the best gutless choices of graphics card are:
HD3450 (DirectX 10.1): 7-10 W
HD2400 Pro (DirectX 10): 7-14 W
HD2400 XT (DirectX 10): 7-19 W
The specs for the HD2400 XT say it is quite fast, but there are obviously hyped fan versions and dumbo passive versions.
http://www.hothardware.com/Articles/ATI ... ve/?page=2
The thing I like about the 3450 is that it only goes from 7 W to 10 W. Not a lot of dynamic range.
The HD3650 could be run without a fan in 2D mode, but I know that if I do that, I will forget, and blow it up playing a game.
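If anyone does try the fanless-in-2D route, a crude safety net is a script that watches temperatures and yells before things cook. A minimal sketch, assuming a Linux box where the GPU temp shows up under /sys/class/hwmon (which sensor is the GPU varies by card and driver; on Windows you'd lean on a tool like SpeedFan instead):

```python
import glob
import time

ALARM_C = 95  # assumed safety threshold; check your card's rated maximum

def read_temps():
    """Return {sensor_path: degrees C} for every hwmon temperature input."""
    temps = {}
    for path in glob.glob("/sys/class/hwmon/hwmon*/temp*_input"):
        try:
            with open(path) as f:
                temps[path] = int(f.read()) / 1000  # sysfs reports millidegrees
        except (OSError, ValueError):
            pass  # sensor went away or returned junk; skip it
    return temps

while True:
    for sensor, celsius in read_temps().items():
        if celsius > ALARM_C:
            print(f"\aWARNING: {sensor} at {celsius:.0f} C -- stop the game!")
    time.sleep(5)
```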
astrayan wrote: Yeah, it just seems really weird that they made this expensive board (P5E-VM) and you can run an LCD with adapter, OR an HDTV, but not both. Haven't we moved into an era where the D-sub is obsolete yet? Esp on this mobo.

mcoleg wrote: HDMI to DVI works perfectly; both are pretty much the same thing.
I really have no good answer for that. I guess most of the time designers are too afraid to take a leap; look at the A/V amps, for example. Would it kill them to put a unit on the market that has only HDMI and component switching built in and drop all the old-style connectors? No, it has to be backward-compatible... a shame, really.
Moogles wrote: No, but at least TV-out is!

That may seem like a good joke, but it doesn't work. TV-out is not just an adapter; it's a whole graphics-to-analogue interpolation and clipping scheme, whereas D-sub can be adapted to from DVI. The newer graphics cards with two DVI connectors supply adapters.
On my A8N board, the only way I ever used the second graphics output was via TV-out, which blocked the D-sub output. D-sub can't be used for anything except old analogue monitors. If you have a board with a D-sub and a DVI, they are really asking you to plug an LCD monitor in via analogue mode (for no good reason).
DVI can be adapted to both D-sub and HDMI. TV-out is still needed for people who have hi-def TVs with heaps of S-video inputs.
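To sum up the adaptation rules as I understand them, as a simplified sketch ("DVI-I" is the variant with the analogue pins, and "passive" means a dumb pin-adapter with no electronics in it):

```python
# Which outputs can be turned into which connectors with a passive
# adapter. Simplified model of the discussion above: DVI-I carries both
# an analogue (VGA-compatible) and a digital (HDMI-compatible) signal,
# so it adapts to either. TV-out needs its own scan-converter hardware,
# so nothing adapts passively to or from it.
PASSIVE_ADAPTS_TO = {
    "DVI-I": {"D-sub", "HDMI"},   # dual-signal connector
    "DVI-D": {"HDMI"},            # digital only, no analogue pins
    "HDMI":  {"DVI-D"},           # same TMDS digital signal
    "D-sub": set(),               # analogue only, dead end
    "TV-out": set(),              # composite/S-video, separate encoder
}

def can_drive(source: str, target: str) -> bool:
    return target == source or target in PASSIVE_ADAPTS_TO[source]

print(can_drive("HDMI", "DVI-D"))   # True: the P5E-VM's bundled adapter trick
print(can_drive("D-sub", "TV-out")) # False: why a VGA-to-TV dongle is no good
```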