an X800 would be totally wasted on a 2100+.
i have a P4 2.4 right now; in literally any game i can switch from the lowest res and crap IQ settings up to 10x7 or 12x10 with max AA/AF etc and the fps will be the same.

it's not so much that the CPU is a bottleneck as that something will always be limiting your performance, and the new cards all have insane fill rates.
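a crude way to picture it (the millisecond numbers here are completely made up, just to show the shape of the thing - this isn't how any particular engine measures it):

# each frame effectively waits on whichever of the CPU or the GPU is slower
cpu_ms = 12.0                            # made-up per-frame CPU time: game logic, draw calls, etc.
for gpu_ms in (4.0, 8.0, 11.0, 14.0):    # made-up GPU time as res/AA/AF get cranked up
    fps = 1000.0 / max(cpu_ms, gpu_ms)   # the slower side sets the frame rate
    print(f"gpu {gpu_ms:4.1f} ms -> {fps:.0f} fps")

as long as the card finishes its part of the frame before the CPU does, cranking the settings doesn't move the fps at all; only once the GPU becomes the slower side (the 14 ms case above) does the frame rate actually drop.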
as for cost, the X800XT has 160 million transistors and the 6800U has 222 million. almost none of the X800 die is used for cache - i think there are several small L1-ish caches for each pipeline, but nothing else; the 6800 does have a larger cache, though i'm not sure how much. compare that to an A64 FX53 at 106 million or a P4 XE at 178 million... but about half of a CPU's transistor budget is spent on cache, presumably even more for the P4 with its on-die L3. so these cards are actually a lot more powerful than a CPU, just nowhere near as flexible (rough numbers further down). then, on top of the GPU itself, you're paying for at least 128-256MB of generally awesome RAM and all the other odds and ends the card needs.
then on top of that, the newest cards have to recoup their R&D costs, and it's a volume thing too - i'd be blown away if ATI and nVidia sold even 10% as many X800s/6800s as AMD and Intel sell typical-but-not-ludicrously-expensive CPUs. we may have it lucky; SCSI users pay double or more for modest benefits.
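to put rough numbers on the transistor comparison (the cache fractions below are guesses consistent with what i said above, not measured figures):

# rough logic-transistor comparison; cache fractions are assumptions, not spec-sheet numbers
chips = {
    "X800XT":   (160_000_000, 0.05),   # almost no cache on the die
    "6800U":    (222_000_000, 0.10),   # somewhat more cache, exact amount unknown to me
    "A64 FX53": (106_000_000, 0.50),   # roughly half the budget on cache
    "P4 XE":    (178_000_000, 0.60),   # on-die L3 pushes the cache share higher
}
for name, (total, cache_frac) in chips.items():
    logic = total * (1 - cache_frac)
    print(f"{name:9s} ~{logic / 1e6:.0f}M logic transistors out of {total / 1e6:.0f}M total")

even with those guesses the GPUs come out well ahead on logic transistors, which is the point i was getting at.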

tack on another $100 or so just for the bragging rights of having the absolute best of any new card line - the people who need (heh) them won't care, same as we see with new CPUs etc.
edit: reread your post - yeah, i've always been amazed at how cheap motherboards are too, for everything they do and facilitate, especially boards like Asus's and MSI's nForce3 250Gb ones with onboard everything and a half dozen IDE/SATA controllers. mb manufacturers get stuck with the blame for almost everything too.
