I own the Dell 1905FP. Beautiful for text, good for images. Absolutely god-awful for games and movies.
I recently saw the AnandTech review of my monitor, and they said the texture blurring in games was minimal. Reviews of the Samsung 193P, which uses the same internal panel, say the same thing. I don't understand how my monitor can look so much worse to my eyes. I'm aware that my statement about the 1905FP contradicts conventional wisdom, but for now I'm not changing my opinion of the screen.
I'll do a side-by-side comparison of the 1905FP and my roommate's brand-new 8ms VP171b-2 when I get a chance. I think his has much less screen lag, but the colors are a bit off and the blacks aren't as dark. But if the 1905FP's response time really is nearly as good as it gets, then I guess there's no reason to pass it over in favor of other screens.
Simple: you're more sensitive to, or more capable of detecting, motion blur artifacts than the average person; would that surprise you at all? For example, it's a known problem that many people can see "rainbow" effects on single-chip DLP projectors as a result of the color wheel. A small group of people are so sensitive that they can't use single-chip DLP displays at all, because the effect gives them headaches or nausea. I personally have no idea what this rainbow artifact even looks like; try as I might, every time I visit Best Buy and stare at the DLP rear-projection displays, I cannot detect it.
This is the same reason some people can't even see the difference between the analogue output of a craptastically filtered nVIDIA card and that of a Matrox card; everyone's eyes are different, as is the way everyone's brain deciphers what the eyes are feeding it. Clearly, you have higher-than-average visual acuity and can easily detect poor display quality or strong display artifacting. Many people don't notice that on poorly filtered cards, raising the refresh rate too high causes the image to blur; to my own eyes it's clear and obvious, and I'm sure you've noticed it as well. That's why I often leave my refresh rate at only 70Hz or so even if the display can do 85 or 100Hz at that resolution: when a video card has poor filters on its analogue output, the image stays sharper and more stable at lower refresh rates (ones that don't visibly flicker yet, unlike 60Hz).
DVI avoids this problem entirely, and buying a card with good filtering is another way around it; Matrox uses good filters even on their cheapest cards. It's not nVIDIA's fault when board manufacturers skimp on filter capacitor quality; they can set guidelines or minimum standards, but things like this are difficult to measure and enforce. Since Matrox builds the cards themselves, they have direct control over the components that go onto the board. I believe Apple has similar control over exactly what components go into the filter stage of their computers' analogue graphics outputs, which is why Apples have always had such stellar visual output.
Heck, there are also color acuity issues. Different people's minds interpret color very differently: some can hardly tell the difference between an LCD's and a CRT's color accuracy, while others can easily detect the extremely slight mis-tint of the red/orange band on plasma displays. And this doesn't have any particular link to a person's ability to detect differences in sharpness, or their ability to detect motion blur.