New Matrox videocards: The M-Series

zistu
Posts: 76
Joined: Thu Jun 28, 2007 1:22 pm

New Matrox videocards: The M-Series

Post by zistu » Tue Aug 19, 2008 6:23 am

I'm not sure if I missed a topic on this; if I did, then I am sorry. Matrox recently introduced a bunch of new cards, dubbed the M-Series. The series contains multiple cards with different specifications, but all of them are fanless and cooled by a heatsink on the card.

Short summary:
  • M9120 PCIe x16 (Dual DVI Single Link only)
  • M9125 PCIe x16 (Dual DVI Dual Link)
  • M9120 Plus LP PCIe x16 (Dual DVI Single Link only with Quad monitor upgrade option)
  • M9120 Plus LP PCIe x1 (Dual DVI Single Link only with Quad monitor upgrade option)
  • M9140 LP PCIe x16 (Quad DVI Single Link only)
For a full list of specifications: Matrox M-Series data sheet (PDF)

I've always been a fan of Matrox ever since their G series, but recently I too switched to NVidia because it offered more functionality and performance for the same price.

I recently built a machine for a client using the M9120 card, and I have to say I was pretty impressed. I did not have the time or the tools to do a full-scale test, but it ran Vista Aero without any problems and played a DVD with good video output. Hardly the level of testing it needed, but I was in a hurry. I tried to play Cloverfield on it, but this was refused due to a copy protection issue. I did not explore the issue any further, but it may have been due to the lack of a proper video player installed on the system.

The performance of the card seems better than most onboard video solutions, plus all the cards have at least two true DVI connectors, which was one of the main reasons for me to buy one for the machine I had to build.

Price/performance-wise, the cards are still expensive. For the price of the simplest card you can buy an 8800GT or 9800GT with a custom cooling solution. Since only a short time has passed since their introduction, I hope the prices will come down some in the future.

Without any fans, you do get a silent solution straight out of the box, and it's available for both full and low profile hardware ecosystems (thank MS for putting that term into my head). I'm interested in the quad DVI/monitor solutions myself, but the price has to come down first.

Emyr
Posts: 91
Joined: Wed Jun 06, 2007 10:48 am
Location: Cardiff, UK

Post by Emyr » Tue Aug 19, 2008 6:54 am

So what advantages do Matrox cards have over a similarly priced ATI card?

I get that they're not meant for gaming, and are good for quiet workstations, but so are some of ATI's lower specced fanless cards...

zistu
Posts: 76
Joined: Thu Jun 28, 2007 1:22 pm

Post by zistu » Tue Aug 19, 2008 7:02 am

Probably none, unless you are looking for a very specific solution, like a quad monitor setup.

One of the main reasons for me to go with Matrox in the past was that their cards had dual DVI connectors, where other cards usually had a single DVI and a SUB-D connector.

Matrox was also praised for their excellent 2D performance in the past, specifically color reproduction and image clarity, but nowadays that may not be valid anymore.

The low profile dual DVI may also be a reason to go with Matrox, though I have to admit I have no clue if ATI or NVidia offers any solutions in that specific direction.

lodestar
Posts: 1683
Joined: Fri Aug 05, 2005 3:29 am
Location: UK

Post by lodestar » Tue Aug 19, 2008 9:47 am

I used Matrox graphics cards for years. Once DVI connections on other manufacturers' cards and flat panel displays became commonplace, the case for buying the over-priced, under-specified and under-performing Matrox products simply disappeared.

The M series is better than the previous Parhelia range in that it apparently supports DirectX 9, but its true competition in the same price bracket is, in my opinion, the nVidia Quadro NVS 440. True, it uses a derivative of the nVidia 6600 GPU and only has 256MB of graphics memory, but for workstation business apps it is arguably the industry standard.

And like the Matrox, the Quadro NVS 440 is a silent solution.

lm
Friend of SPCR
Posts: 1251
Joined: Wed Dec 17, 2003 6:14 am
Location: Finland

Post by lm » Tue Aug 19, 2008 12:18 pm

zistu wrote:Probably none, unless you are looking for a very specific solution, like a quad monitor setup.
It's probably cheaper to get a mobo that has 2 PCI-E GPU slots, and get 2 GPUs with 2 dvi ports on each, than to get a single matrox gpu with quad dvi ports.
zistu wrote: One of the main reasons for me to go with Matrox in the past was that their cards had dual DVI connectors, where other cards usually had a single DVI and a SUB-D connector.
For example, the ATI HD3650 has 2 dual-link DVI ports. And it's cheap.
zistu wrote: Matrox was also praised for their excellent 2D performance in the past, specifically color reproduction and image clarity, but nowadays that may not be valid anymore.
If you use a digital DVI connection and an LCD display, then the GPU no longer affects the image quality in that sense, since the data goes bit-perfect to the monitor. Only the monitor affects the image quality with that setup.

And no, I am not talking about 3D graphics now, where differences in 3D acceleration implementation might bring small differences. Talking purely about 2D here. Matrox would lose the 3D competition anyway.

As a disclaimer, I used to use only Matrox cards in the past, but since image quality is no longer affected by the GPU, and given Matrox's lack of motivation towards Linux support and their being totally useless for any 3D, I switched away from Matrox.
zistu wrote: The low profile dual DVI may also be a reason to go with Matrox, though I have to admit I have no clue if ATI or NVidia offers any solutions in that specific direction.
TBH I'm not sure what low profile means for GPUs, but it seems only the expensive GPUs take up two slots.

lm
Friend of SPCR
Posts: 1251
Joined: Wed Dec 17, 2003 6:14 am
Location: Finland

Re: New Matrox videocards: The M-Series

Post by lm » Tue Aug 19, 2008 12:22 pm

zistu wrote:
  • M9120 PCIe x16 (Dual DVI Single Link only)
  • M9125 PCIe x16 (Dual DVI Dual Link)
  • M9120 Plus LP PCIe x16 (Dual DVI Single Link only with Quad monitor upgrade option)
  • M9120 Plus LP PCIe x1 (Dual DVI Single Link only with Quad monitor upgrade option)
  • M9140 LP PCIe x16 (Quad DVI Single Link only)
Single Link port can only give you 1920x1200, which is good for 24". Dual Link gives you 2560x1600 which is good for 30".
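As a rough sanity check of those limits, here is a small sketch assuming the 165 MHz single-link TMDS pixel-clock ceiling and roughly 12% blanking overhead (so the exact MHz figures are only estimates):

Code:
# Rough single- vs dual-link DVI check: single link tops out at a 165 MHz
# pixel clock, dual link doubles that. Blanking overhead is estimated (~12%).
SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ

def pixel_clock_mhz(width, height, refresh_hz=60, blanking_overhead=1.12):
    # Active pixels per second, padded by the estimated blanking overhead.
    return width * height * refresh_hz * blanking_overhead / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clk = pixel_clock_mhz(w, h)
    if clk <= SINGLE_LINK_MHZ:
        verdict = "fits single link"
    elif clk <= DUAL_LINK_MHZ:
        verdict = "needs dual link"
    else:
        verdict = "exceeds even dual link"
    print(f"{w}x{h}@60: ~{clk:.0f} MHz -> {verdict}")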

Instead of two 24" monitors, get a single 30". Instead of 4 24" monitors, get 2 30" monitors. That way you only need a single GPU with 2 dual link dvi ports, so even a cheap ATI HD3650 would do.

And 30" monitor rocks 2x24" any day.

zistu
Posts: 76
Joined: Thu Jun 28, 2007 1:22 pm

Post by zistu » Tue Aug 19, 2008 3:02 pm

Don't get me wrong, I am not trying to convince you all to go out and buy Matrox cards now :)

I searched this area and found no info on them yet, maybe rightfully so. They are still expensive and they offer little extra over other solutions. For the system I built, I had a card on order which never arrived; I needed to find an alternative and found the Matrox to be in stock and available, so I went with that. It did what I needed it to do and was silent, which made me happy and made my client happy, so it all worked out.

To answer some of the points still open: low profile means that the card and the bracket mounted to the rear of the machine (backplate or whatever) are half the height of normal cards, so they can be used in small machines or server cases.

I personally prefer a triple monitor setup over a dual wide-screen setup. While the dual wide-screen setup may be cheaper and offer the same number of pixels as a triple monitor setup, it does not offer the same functionality to me. I mostly run applications full screen on one of the monitors, and with 3 monitors I can run 3 applications at once. If I tried to do the same on a dual wide-screen setup, one application would always be cut in half by the edge where the monitors meet. But, again, this is my personal preference.

As for your comment on getting two GPUs and a mainboard that supports it, you are right; it would most likely be cheaper and offer more flexibility in the end as well. The only counter-argument I can think of that may be valid is energy consumption, but even that would have to be tested and verified properly.

NyteOwl
Posts: 536
Joined: Wed Aug 23, 2006 7:09 pm
Location: Nova Scotia, Canada

Post by NyteOwl » Tue Aug 19, 2008 3:55 pm

While such things are often quite subjective, I have not seen a video card that can beat Matrox for quality of image (all other things being equal). The old 3dLabs cards were their only competition for 2D performance and image quality.

The only thing that ever kept them from a larger market share was their relatively poor 3D gaming performance as compared to cards from nVidia and ATI.

Vicotnik
*Lifetime Patron*
Posts: 1831
Joined: Thu Feb 13, 2003 6:53 am
Location: Sweden

Post by Vicotnik » Tue Aug 19, 2008 4:00 pm

NyteOwl wrote:While such things are often quite subjective, I have not seen a video card that can beat Matrox for quality of image (all other things being equal).
Even today with DVI? If that is the case, how is that even possible?

lm
Friend of SPCR
Posts: 1251
Joined: Wed Dec 17, 2003 6:14 am
Location: Finland

Post by lm » Wed Aug 20, 2008 6:22 am

NyteOwl wrote:While such things are often quite subjective, I have not seen a video card that can beat Matrox for quality of image (all other things being equal).
2D performance is pretty much a non-issue; it's just scaling images, rendering fonts and drawing boxes. The tasks for 2D acceleration have not changed, there are only more pixels on monitors. Consider 640x480: that's something you might have had on a 14" CRT monitor over 10 years ago. Now consider the largest single-screen resolution, 2560x1600. That's just about 13 times the number of pixels. But processing power has increased much, much more than 13 times during this time period. So if fast 2D was possible then, pretty much anything should be able to do fast 2D now.
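Just to show the arithmetic behind that "about 13 times" figure:

Code:
# Pixel count of an old 640x480 desktop vs a 2560x1600 30" panel.
old_pixels = 640 * 480      # 307,200
new_pixels = 2560 * 1600    # 4,096,000
print(new_pixels / old_pixels)  # ~13.3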

However this is just my subjective reasoning, so please prove me wrong with benchmark data.

But your second point, image quality. If you have a digital connection to an LCD monitor, then your 2D images should be IDENTICAL using any GPU (assuming you have the same brightness, contrast, gamma, etc. settings in your display driver).

Image quality was a problem with analog signalling, but bit-perfect digital signals to the monitor mean that only the monitor can affect image quality.

On this you really need to show me hard proof, e.g. digicam shots of the same LCD using a digital DVI connection, showing the same picture, but from 2 different GPUs.

NyteOwl
Posts: 536
Joined: Wed Aug 23, 2006 7:09 pm
Location: Nova Scotia, Canada

Post by NyteOwl » Wed Aug 20, 2008 1:03 pm

What part of "subjective" got missed?

[rest deleted - not worth the bother]

Vicotnik
*Lifetime Patron*
Posts: 1831
Joined: Thu Feb 13, 2003 6:53 am
Location: Sweden

Post by Vicotnik » Wed Aug 20, 2008 1:59 pm

NyteOwl wrote:What part of "subjective" got missed?
Have you perhaps confused subjective with placebo? Subjective implies a difference.

E.g. if Matrox output a bit warmer picture and nVidia output a colder picture, then it would perhaps be subjective as to which is best.
But if the output is the same and there is no difference at all, how can anything about it be subjective?

If asked the question "is A = A?", would the answer "I dunno, it's subjective" be correct? No, it would not. A is A is an axiom. It's true by definition.

NyteOwl
Posts: 536
Joined: Wed Aug 23, 2006 7:09 pm
Location: Nova Scotia, Canada

Last comment on this topic

Post by NyteOwl » Wed Aug 20, 2008 8:48 pm

But if the output is the same and there is no difference at all, how can anything about it be subjective?
In legal circles that's known as "assuming facts not in evidence".

You seem to want to be contentious on the issue. I chose not to play that game. I stated my opinion that Matrox has consistently produced better 2D image results than other cards in my experience, and that such differences can rightly be considered subjective.

If you have a different opinion and experience, fine. That's what having a subjective opinion is all about.

Doomer
Posts: 69
Joined: Mon Dec 23, 2002 11:44 pm
Location: Finland

Post by Doomer » Thu Aug 21, 2008 12:24 am

I have fond memories of my Matrox G400. Actually I first went with Nvidia and their TNT2 back in 1999. I sold it and bought the G400 after less than a month because of the TNT2's bad 3D and 2D quality. The G400 was maybe 20% slower than the Voodoo 3 and TNT2, but reviewers ignored picture quality and only looked at how many 3DMarks a card scored.

The experience with Nvidia's 3D rendering quality left a bitter taste, and I've never owned one of their cards since.

Vicotnik
*Lifetime Patron*
Posts: 1831
Joined: Thu Feb 13, 2003 6:53 am
Location: Sweden

Post by Vicotnik » Thu Aug 21, 2008 12:37 am

NyteOwl: I cannot see how "assuming facts not in evidence" can be applied to things that in fact are evident.

I'm not debating you about the picture quality of the Matrox cards. All I'm saying is that IF the DVI output is bit-by-bit identical then there can be no debate over which bitstream is better than the other. They are the same.
Are you with me so far?

One might prefer Matrox over nVidia or the other way around. That's subjective. There is nothing about either brand that makes one better than the other in every possible way. But given that the outputs from the cards are identical, you cannot prefer one over the other based only on the output.

Do you understand what I'm saying?

aztec
Posts: 443
Joined: Mon Dec 12, 2005 5:01 am
Location: Foster City, CA

Post by aztec » Thu Aug 21, 2008 2:09 am

Matrox did use to have superior 2D, because of the filters/capacitors they used on the card.

I remember extensive reports about this back in the G450 and TNT days!

However, that is long gone since DVI, and especially since the nVidia 88xx cards and most ATI cards made in the last 5 years.

jfweaver
Posts: 30
Joined: Mon Jan 17, 2005 8:05 pm

Post by jfweaver » Fri Aug 22, 2008 8:21 pm

lm wrote:
zistu wrote:Probably none, unless you are looking for a very specific solution, like a quad monitor setup.
It's probably cheaper to get a mobo that has 2 PCI-E GPU slots, and get 2 GPUs with 2 dvi ports on each, than to get a single matrox gpu with quad dvi ports.
One of the big things, for traders etc., is driver support. Matrox is the only company I know of that makes hardware and software which support 16 screens. (Windows Display Properties maxes out at 10.) 14-16 heads simply can't be done in a normal PC with dual-head cards. Sure, Colorgraphic did make the 8-head Xenteras, but they've since switched to just selling ATI FireMV cards. Matrox and ATI both make quad-head low-profile cards now, something that others haven't done. One of the other things Matrox has going for them is their PCIe x1 cards, which are still on the rare side. ( http://forums.2cpu.com/showthread.php?t=88749 )

Pretty much, if you're not doing video editing/capture, medical imaging, stock trading, or some other wacky thing like video walls, etc, Matrox isn't for you.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz

Post by ~El~Jefe~ » Tue Aug 26, 2008 6:07 am

I have their pci 550 card.

I used it when I had an analog monitor, a Mitsubishi 2070SB 22". It was a beautiful monitor.

The Matrox was the only card to offer a clear picture for text and photos. No other card beat it.

With DVI and an IPS monitor... you can use just about anything.

lm
Friend of SPCR
Posts: 1251
Joined: Wed Dec 17, 2003 6:14 am
Location: Finland

Re: Last comment on this topic

Post by lm » Tue Aug 26, 2008 10:25 am

NyteOwl wrote:You seem to want to be contentious on the issue. I chose not to play that game. I stated my opinion that Matrox has consistently produced better 2D image results than other cards in my experience, and that such differences can rightly be considered subjective.
I totally agree on the part where Matrox used to have better image quality. I had Matrox cards and other cards, and I saw this difference with my own eyes. But it was with analog connections.

The point is, with a digital connection, there CANNOT BE a difference, or one of the cards is BROKEN.

jfweaver
Posts: 30
Joined: Mon Jan 17, 2005 8:05 pm

Post by jfweaver » Tue Aug 26, 2008 5:18 pm

~El~Jefe~ wrote:I have their pci 550 card.

I used it when I had an analog monitor, a Mitsubishi 2070SB 22". It was a beautiful monitor.

The Matrox was the only card to offer a clear picture for text and photos. No other card beat it.

With DVI and an IPS monitor... you can use just about anything.
2070SB, I loved it. :)

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz

Post by ~El~Jefe~ » Thu Aug 28, 2008 7:58 am

Yeah, the 2070SB was so nice :) It's just huge and no one wanted to pick it up. I refused to insure it and pack it; it was unshippable, as it was so heavy it could crush itself.

The Planar PX2611W beats it in every way now, but it cost a lot of $$$.

well, beats it except for watching movies in darkness.

where is OLED?

IsaacKuo
Posts: 1705
Joined: Fri Jan 23, 2004 7:50 am
Location: Baton Rouge, Louisiana

Re: New Matrox videocards: The M-Series

Post by IsaacKuo » Thu Aug 28, 2008 11:23 am

lm wrote:Single Link port can only give you 1920x1200, which is good for 24". Dual Link gives you 2560x1600 which is good for 30".

Instead of two 24" monitors, get a single 30". Instead of 4 24" monitors, get 2 30" monitors. That way you only need a single GPU with 2 dual link dvi ports, so even a cheap ATI HD3650 would do.

And 30" monitor rocks 2x24" any day.
For the price of one 30" monitor, you can get three 1920x1200 28" monitors. (see I-Inc 28" 1920x1200 for $400)

Although, you'll probably need a new desk to fit them all, and a new swivel chair to be able to see them all...

Seriously, though, a more practical solution would probably be to flank a single 28" 1920x1200 monitor with a couple 20" 1400x1050 monitors.

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Sat Aug 30, 2008 3:17 pm

Just because the video output is digital doesn't mean that there can't be differences between two digital solutions. Digital signals suffer from jitter and probably other issues. How noticeable is the effect of jitter to the naked eye? Maybe negligible to most people, but some people have unusual perception in certain domains. Is this measurable or objective? Who knows or even cares!!! Life is subjective, and if spending extra cash on a Matrox card boosts your perceived appreciation of the images on your TFT, then enjoy your experience, I say.

lm
Friend of SPCR
Posts: 1251
Joined: Wed Dec 17, 2003 6:14 am
Location: Finland

Post by lm » Wed Sep 03, 2008 3:18 pm

Smilingcrow: Jitter on LCDs is eliminated because they buffer a large part of the whole image in their internal memory before they actually show it. There's also the blanking time, which they can use to catch up, so they don't slowly drift in one direction either.
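To get a rough sense of how much catch-up time that blanking interval gives per frame, here is a quick sketch (the 2080x1235 totals are CVT reduced-blanking figures for 1920x1200@60, used purely as an illustration, not taken from the post):

Code:
# Per-frame timing slack from the blanking interval, using CVT reduced-blanking
# figures for 1920x1200@60 as an example.
active_h, active_v = 1920, 1200
total_h, total_v = 2080, 1235   # totals per line / per frame, incl. blanking
refresh_hz = 60

frame_time_ms = 1000 / refresh_hz
blanking_fraction = 1 - (active_h * active_v) / (total_h * total_v)
print(f"frame time: {frame_time_ms:.2f} ms, blanking: {blanking_fraction:.1%}"
      f" (~{frame_time_ms * blanking_fraction:.2f} ms of catch-up time per frame)")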

swaaye
Posts: 61
Joined: Tue Sep 16, 2008 3:11 pm
Location: WI, USA

Post by swaaye » Wed Sep 17, 2008 7:19 pm

I have a Matrox Millennium G200 that is rather blurry, actually. ;) You don't really want to run it above 1280x960.

I've run a Mystique 220, Millennium II, Mystique/Millennium G200, and Millennium G400. The Millennium II and G400 have nice quality, but frankly so does some of ATI's hardware from those days. The Radeon 9700 had some problems with noise, though. On several of them I've seen faint lines on the screen while the picture was otherwise nice and sharp.

Matrox got such a good rep because not only were they pretty good, but some other vendors were terrible. NVIDIA cards prior to GeForce 4 were frequently about as bad as can be. And lots of the PCI cards from the 15-17" CRT era were obviously not designed for uber res+refresh rates.

It has been a long time since I've had to deal with blur, though. It usually comes from cheap, low-end cards with obviously simple PCBs, and from IGPs. DVI has definitely solved most of it. I've never seen a bad picture come out of a DVI connection, although there have been compatibility problems between card and LCD (ATI had some with the 8500/7500).

David Cole
Posts: 239
Joined: Mon May 12, 2003 3:52 pm
Location: UK

Post by David Cole » Thu Sep 18, 2008 4:19 am

I have the Matrox M9125 and the image quality is high - it fully meets my needs - and the IQ is certainly as good as that of my Gigabyte 9600GT.

Pecorino
Posts: 13
Joined: Fri Dec 14, 2007 12:59 pm

Post by Pecorino » Mon Sep 22, 2008 2:01 am

Do these new cards support hardware acceleration of H.264, MPEG-2 and XviD streams?
This is an important part of the question: to buy, or not to buy.

Volken
Posts: 1
Joined: Wed Oct 08, 2008 2:31 pm
Location: Giethoorn

Post by Volken » Wed Oct 08, 2008 4:06 pm

Vicotnik wrote:NyteOwl:I'm not debating you about the picture quality of the Matrox cards. All I'm saying is that IF the DVI output is bit-by-bit identical then there can be no debate over which bitstream is better than the other. They are the same. Are you with me so far?
The world of theory makes many people (those who lack an understanding of proper implementation) draw incomplete conclusions when exploring it. I wrote a very extensive piece on this subject long ago. Sadly, it is in my native language and out of your reach. So, in short:

The quality of the 2D filters is just as important for DVI output as for analog output. Theoretically, all DVI output signals should be identical; in the real world they certainly are not.

The main ingredients that affect DVI quality on the graphics card:

- the DVI transmitter
- the filters & capacitors used
- and, above all, the quality and execution of the PCB design.

Implementation determines the real merit of the final result and, mind you, not every DVI output delivers the same quality. Far from it. Particularly under serious scrutiny of pictures and fonts, especially at the highest resolutions. This is where the big boys are always separated from the rest. Does it matter for every user? Certainly not.

The world of audio, for example, is filled with failed promises from theoretical transmission standards that entered the audio domain from the IT world simply on the promise of perfect transmission. In reality, the nature of an audio signal is not so simple and cannot be treated with that designated simplicity. That is why, when one looks at many consumer and high-end audio products since the early '90s, many of the I/O interfaces make industry people laugh today. Again, this is where theory never delivered its glory and only remained a raw declaration.

I won't even go into the satellite sphere, filled with failed promises, again based on an abundance of theory. To simplify things even more... one should place very little significance on his new digital camera; after all, every 10MP camera would yield the same picture quality. Right? Well, I think you know the answer to that question :)

On my main working machine, which is also partly a file server, I always use only Matrox; at the moment a Parhelia and an additional G450 PCI card. Would it be fair to say that the present implementation of 2D filters is more than adequate on many presently available (even cheaper) cards? Yes, it will most likely please even more-than-average users. This is not an exercise in elitism where I would blindly defend the virtue of Matrox. Far from it! On other working machines, I always use ATI cards. My main problem with Nvidia is that 2D quality varies slightly on their consumer cards, while the pro range tends to be more consistent.

On the other hand, ATI 2D quality (a few bad apples aside) was always consistent, enough to show it was a targeted priority for ATI. For many users the difference between the best Matrox output and ATI would be completely immaterial; even the most economical range would be adequate in this domain. When looking at video and some other areas, I found many Nvidia cards... not quite handling things with the same aplomb in the quality department as ATI and Matrox. Yes, this is a rather general comment; after all, there are so many models from Nvidia. But this inconsistency among some models always bothered me and steered my choices away from Nvidia.

Judging 2D quality at the highest level, there is a micro, but still visible, refinement in the way my Parhelia renders fonts compared to any other card. I've really tried to convince myself that I had simply romanticised my relationship with Matrox. No, the quality of the fonts is a notch above any other card, and that is a fact. Many would not be able to perceive it, but it is there and yes, it is a small notch better. Sure, one can get close with settings on other adapters, but a certain flair of sharpness comes with Matrox.

Precision of adjustment is another trademark that keeps many owners loyal to Matrox for life. With all those utilities and gadgets, I never have the elegance and power of the Matrox range when setting up my ATI machine.

Toward a more realistic prospect... justifying the price of the M9140 LP is surely mission impossible. Nevertheless, considering that no Matrox card has ever died on me (I can't say the same for the ATI and Nvidia cards I've owned), I will most likely consider this card for my next upgrade.

Forgive my lengthy introduction, since this is my first post here.
Thanks for your patience. :D
Last edited by Volken on Wed Oct 08, 2008 5:42 pm, edited 1 time in total.

Vicotnik
*Lifetime Patron*
Posts: 1831
Joined: Thu Feb 13, 2003 6:53 am
Location: Sweden

Post by Vicotnik » Wed Oct 08, 2008 5:09 pm

An interesting post, Volken. Thank you and welcome to SPCR. :)

The next question would then be how big the differences really are. Can they be measured objectively or are they subjective? Is there need for double blind tests? etc.

I guess the differences must be pretty small, since no-one really talks about 2D image quality anymore. I know I've never noticed any difference at all when switching graphics cards on my current TFT. I've used mainly ATi and nVidia cards, but also a Matrox G550. It seems unlikely to me that the quality of the filters is just as important with digital as with analog transmission.

About the quality of the cards themselves, in my experience it's pretty good. I've owned a great many cards from a lot of manufacturers and since '93 I've had one card die on me (a GF2MX, and it was a DOA really).

As for hifi and digital cameras I cannot see the connection to this specific area. A very central thing in those areas is the conversion from analog to digital or the other way around. With a modern graphics card it's digital all the way.

qviri
Posts: 2465
Joined: Tue May 24, 2005 8:22 pm
Location: Berlin

Post by qviri » Fri Oct 10, 2008 3:01 pm

(The following is a simplification of an admittedly more complicated issue.)
Vicotnik wrote:With a modern graphics card it's digital all the way.
A digital, on/off, 0/1 distinction is nice in theory. It works on the most basic levels; for instance, ignoring the measurement difficulties and weird quantum stuff I don't know enough about, you can sense the presence or absence of a single proton.

As you scale up, stuff stops being discrete. You can't have a purely 0-or-1 electric potential difference (voltage). Wires don't have infinite conductance; electronic components modelled as perfect aren't. We deal with this by establishing ranges; on a 5 V "digital" signal line, we may consider a signal to be "off" if the voltage is lower than 1 V, and conversely "on" for voltages above 4 V.

Consider a graphics card which by design outputs 4.8 V as "on". Through variation in the manufacturing of subcomponents and the circuit board, a particular sample may actually output around 4.5 V. You use a cable with thinner conductor wires, and perhaps one of the conductors is starting to fray a little where you bent it; the conductance drops, and a 4.8 V input is seen as only 4.3 V at the output (roughly a 10% drop).

If the component controlling the red signal of the pixel carried by that particular conductor is not that great, and responds to a 2% dip in the supply voltage with a 10% dip in the output voltage, your digitally-born pixel-perfection just went bye-bye. Same with a sensor on the receiving (monitor) side that reads a little low (again, sample variation).
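To put the threshold idea in sketch form (using the illustrative 1 V / 4 V / 4.8 V figures from this post; real DVI uses low-voltage differential TMDS signalling, so these values are purely illustrative):

Code:
# Toy model of the threshold ranges described above (NOT real DVI/TMDS levels).
ON_THRESHOLD_V = 4.0    # above this the receiver reads "1"
OFF_THRESHOLD_V = 1.0   # below this the receiver reads "0"

def receiver_reads(voltage):
    if voltage >= ON_THRESHOLD_V:
        return "1"
    if voltage <= OFF_THRESHOLD_V:
        return "0"
    return "undefined"  # dead zone: behaviour depends on the actual hardware

nominal = 4.8                     # what the card is designed to output for "on"
weak_sample = 4.5                 # manufacturing variation on a particular card
after_cable = weak_sample * 0.90  # ~10% loss in a marginal cable -> 4.05 V
for label, v in [("nominal", nominal), ("weak sample", weak_sample),
                 ("weak sample + bad cable", after_cable)]:
    # 4.05 V still reads as "1", but only barely; a slightly weaker part or a
    # lossier cable would land it in the undefined zone.
    print(f"{label}: {v:.2f} V -> {receiver_reads(v)}")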

Digital isn't, especially if you need to leave the relatively highly ordered world of your own PCB, and doubly so if you need to perform any amplification of the signal while doing so.

This of course is still orders of magnitude better than the situation with analog.
