
All-in-Wonder x1900 - Could be the way we want it

Posted: Wed Feb 01, 2006 6:31 pm
by ~El~Jefe~
I was looking at the card specs and reviews. I haven't seen one that shows wattage draw or heat, but...

Check it out: it has much lower-clocked RAM and core than the X1900 XT (and of course the XTX), BUT:

It still gets close to 7800GT speeds and even seems to beat it at times. Supposedly, once programmers lean harder on textures and shaders, i.e., make things prettier, the X1900 will outshine the 7800GTX in performance even at its low clocks.

Well, regardless, it's mad money, so it's kind of crazy to buy one at the moment.

But it is what I have my sights set on. No Theater 550 on it yet, which is odd, but the Theater 200 is great for a regular analog signal. It seems that to get HD on it you would have to get a PCI-E or PCI HDTV Wonder card anyway.

Well, thought I would throw this card out there. I expect it not to be screaming hot.

Right now the performance-per-watt winner is the 7800GT from eVGA, I would say.

Posted: Wed Feb 01, 2006 11:14 pm
by Weldingheart
Don't most of the newer ATI cards have monstrous wattage?
Btw, why aren't the NVIDIA cards with an integrated TV tuner (forgot the name, some 'Cinema' or something) as popular as the AIW solutions?

Posted: Wed Feb 01, 2006 11:40 pm
by frostedflakes
It only seems like the 512MB cards have super-high power consumption. For example, the X1800XT (625/1500MHz, 512MB) pulls 112W according to Xbit-labs, while the X1800XL (500/1000MHz, 256MB) only uses 59W. Considering the small difference in GPU clock and the large difference in memory capacity and speed, I don't think it's unreasonable to conclude that the memory is responsible for most of that ~2x increase in power consumption. :shock:

Really it's the same deal with the 512MB GTX, although not quite as severe as with the Radeon X1800 series (maybe it has something to do with ATI's 512-bit ring-bus memory controller?).
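A rough back-of-envelope split of that 53W gap. To be clear, only the 59W/112W totals are from the Xbit-labs numbers above; the "2/3 of the XL's draw is the core" share and the linear core-power-vs-clock scaling are my own assumptions, purely for illustration:

```python
# Rough back-of-envelope split of the X1800XT vs. X1800XL power gap.
# Assumptions (mine, not measured): ~2/3 of the XL's total draw is the
# core, and core power scales roughly linearly with core clock.

xl_total = 59    # W, X1800XL (500MHz core / 1000MHz mem, 256MB)
xt_total = 112   # W, X1800XT (625MHz core / 1500MHz mem, 512MB)

xl_core = xl_total * 2 / 3            # assumed core share of the XL's draw
xt_core_est = xl_core * (625 / 500)   # scale core power linearly with clock

core_delta = xt_core_est - xl_core                 # ~10W from the core clock bump
mem_and_rest = (xt_total - xl_total) - core_delta  # ~43W left for memory/board

print(f"Core clock bump: ~{core_delta:.0f}W of the {xt_total - xl_total}W gap")
print(f"Memory (2x capacity, 1.5x clock) and board: ~{mem_and_rest:.0f}W")
```

Under these admittedly hand-wavy assumptions, roughly four fifths of the extra draw lands on the doubled, faster memory, which is at least consistent with the memory being the main culprit.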

Posted: Fri Feb 03, 2006 2:44 pm
by ~El~Jefe~
The X1900 AIW is not an XT or XTX; it's just the plain X1900. It has much lower clock speeds on the RAM and on the core.

It DOES however kick a lot of butt for a few reasons:

1. It is fast enough for any game out now at higher resolutions with AA enabled, so it hits the good-enough mark for most anyone's gaming needs. It is the lowest-wattage unit for the most technology. That SHOULD make it a great SPCR type.

2. It has real technology for future advances in artwork for better-looking games. I dunno if it is just me, but even though I love BF2, it looks kind of ordinary in its level of artwork, texture design and imagination. The amount of shader options and such makes it seem as if it could do limitlessly complex designs without being hindered performance-wise. Whether it is faster for current games or not, it should be faster in the future than many presently faster solutions, if that makes sense (from what I have been reading and such).

3. It isn't NVIDIA. I would buy their cards if their analog text and 2D images looked super sharp, but they never seem to (I have tested this over the past 8 years). ATI is acceptable. Matrox is the best, but they make low-tech gaming cards. SLI is noted for producing crappy image quality, rarely spoken about by anyone who runs a paid-off review site, which is almost all of them. pwnd.

4. It uses a very small heatsink and fan. From what a review said, it uses the excessively finned small ATI fan. I have seen that one before; it is terrible for noise. While this sounds bad, I mean, who here would really keep the original HSF on their machine anyway? The point is that it is the only card that has super-fast gaming speed yet doesn't need a large heatsink. Hm. Has potential. I wish Mike C got one of these to actually test it vs. the paid-off sites with their crappy wattage-measuring techniques.
Why it sux:
It is like 500 dollars. Er. No vid card is worth that. Even if you can afford it, that is thievery.

The AGP version probably won't exist. PCI-E is still a bunch of crap; no card has yet maxed out the AGP bus. The bidirectional ability of PCI-E is completely useless for gaming, the only real need for such a video card.

NVIDIA, Hauppauge, etc., do not have the clarity of an ATI Theater 200 TV tuner for analog. HDTV seems not to really matter in terms of which brand you get. I have been reading about these things and have used three of them (a Hauppauge, a no-name USB one, and the ATI AIW), and the ATI always wins out in picture quality. TV being a picture, that would be the only real aspect worth rating a tuner by, I would guess.

I seem like an ATI fanboy, but I am not. I just need my TV and gaming needs met in one machine.

Jeez I am bored at my job.

Posted: Fri Feb 03, 2006 6:04 pm
by warriorpoet
The thing that really caught my eye with the X1K series is the ability to use the two-way bandwidth advantage of PCI-E to do things other than graphics with spare clock cycles. For example, I heard of a demonstration where they used an X1800XT to render a moving ocean scene in real time while simultaneously calculating the physics of the water. Supposedly it ran much better than with the CPU handling the physics and the card left alone to render.

Just one of the many wonders of these x1k cards. I could go on and on about how much I like the R5xx design, but then I'd sound like a fanboy (I'm not).

My personal vidcard history: Voodoo 3, eVGA GeForce3 Ti200, ATi 9500PRO, Leadtek 6800GT, ATi x1900xt :D

Posted: Fri Feb 03, 2006 9:13 pm
by ~El~Jefe~
I totally agree with you about the great two-way ability of PCI-E.

Unfortunately, NOTHING you will buy in the next 3-4 years will be made to use this function as far as gaming goes.

The physics-calculating PPU only needs a PCI slot for its massive boost to game physics. Yes, I hear of a PCI-E version, but I just wonder whether that actually does anything or is just something to fill the slot with.

I am totally a proponent of such designs; I just know how apathetic the programming world is about taking advantage of great technology (SmartShader 3.0 and HDR!!! What % of modern games use them??!!?)

PCI-E graphics was eventually supposed to work like this:
the memory on the vid card was supposed to be CUT, and the card would use system memory instead.

Yeah, right. Only one card I know of does that now, and it's very low end; I read about it a week ago. It was a good idea, but it hasn't worked out. I think the only thing that can be said is that the power delivered through the slot has increased... BUT here's why that is useless: PSUs with an extra 4-pin connector make it unnecessary. Really, it just puts more load on the board, which means more heat, which means more noise, etc. A PSU that takes the load and the current directly is more efficient. Just a thought; it might not be a big factor in actuality, though.

Posted: Sun Feb 05, 2006 3:17 am
by gustavs_a
~El~Jefe~, I have had the same experience with NVIDIA: bad image quality over an analog connection. But with DVI, I think image quality should no longer be an issue.

Posted: Sun Feb 05, 2006 9:48 pm
by ~El~Jefe~
I would rather gag on a chicken than use an LCD monitor. I do thank you for your comment; I thought I was going crazy. Everyone said that NVIDIA is fine for 2D. Ick. Terrible! At 640x480 it was clear, and that's about it! (no joke)

I have a mitsubishi 22" diamond pro 2070 SB monitor. I have yet to see any LCD, even ones costing over 3 grand, that can even look as good as this one, one that costs only 700 dollars.

I wish it were so, but I returned all the LCDs I bought. I have a brand-new 19-inch ViewSonic Pro Series at my mom's house. It looks great and was only $445 from Newegg, but heck, small text just looks crappy on it over the ATI DVI.

Now the Matrox, that looks great on analog. The 550. They actually care about 2D graphics and text, it appears. A cheap card for what it does; it makes you wonder what crap they put on ATI and NVIDIA cards if a sub-$90 card outclasses them threefold in image quality. I use this card in conjunction with whatever gaming/TV card I have, though I wind up hardly ever using it; it is a pain to switch with an ATI as the main card. ATI's Catalyst is really f'ing obnoxious and tries to "help" even when I turn it off. It once almost made my computer unusable; I had to open the case and take the cards out.

Posted: Mon Feb 06, 2006 5:30 am
by gustavs_a
~El~Jefe~, you are not going crazy. You can find information about this on Google, but I still think consumers have been too quiet. When I had the NVIDIA card I was using a 15" CRT at 800x600. The picture was blurry and the colors bleak compared to ATI. Totally unacceptable. Never had a Matrox card.

If you have a chance, look at the Samsung SyncMaster 193P+ (19" LCD): awesome picture quality and a high-quality, beautiful design. I have it. I did the research for the best 19" LCD a few months ago and this was my pick.

These days I'm much more careful about buying new hardware; I read tons of reviews and, if possible, test it personally, because I know there is a lot of crap out there and where I live it's usually impossible to get a refund (bought it, your problem). Still, I have been burned lately.

Posted: Mon Feb 06, 2006 6:28 am
by mattek
Gustavs_a, I also have the 193P+. I'm very pleased with the picture quality. I work a lot with photos, and color clarity and reproduction are very important to me. The 193P+ delivers in these respects, in my opinion.

If I had the money I'd probably upgrade to an EIZO.

These beasts are unbeatable when it comes to picture quality. I've seen a few in real life and they are amazing.

Bottom line: the 193P+ gets the job done without costing a fortune and can even be used for gaming. (I play BF2 a lot.)

Posted: Mon Feb 06, 2006 6:45 pm
by AZBrandon
I'm using a 21.3" ViewSonic VP211b with a 7800GT at 1600x1200 over the DVI connector. I noticed the color reproduction definitely isn't the same as my 21" CRT's, but the LCD over DVI is much crisper than any of the three different cards I used with my 21" CRT. I realize many graphic artists and such still need a CRT's faithful color reproduction, but for the rest of us, an LCD with DVI is as sharp as it gets and "good enough" for color.

Posted: Thu Feb 09, 2006 9:34 pm
by ~El~Jefe~
Yeah, see, gotta have the CRT.

I want to see whatever I am playing or watching, and with a CRT I never have to worry. In fact, they are so much cheaper for the size and resolution and everything that I am considering buying another one just to hold as a backup in case they stop making this model.

I also do not like fixed pixel sizes. I suspect OLED is going to help a lot with that; maybe then I would switch.