Speculation: GPU in a CPU-like socket?

The forum for non-component-related silent PC discussions.

Filias Cupio
Posts: 81
Joined: Sun Oct 09, 2005 7:53 pm

Speculation: GPU in a CPU-like socket?

Post by Filias Cupio » Sun Sep 24, 2006 8:37 pm

A major problem (from our point of view) with GPUs is that, because they come on an expansion card, you can't attach a Ninja-like heatsink. In a modern gaming rig, the GPU generates more heat than the CPU but is restricted to a much smaller heatsink.

So - wouldn't it be wonderful if the motherboard just had a socket for a GPU chip, like we have for CPUs now?

I don't know enough about GPUs to know how feasible this is. You'd have video RAM on the motherboard (either soldered directly or in memory slots). How similar is the support circuitry for various GPUs? Could you make a single socket which could efficiently support both an entry-level GPU and a macho gamer's GPU? Are there advantages other than cooling? If the companies were willing to support a standard, could ATI and nVidia chips be made to work in the same socket, or are they just too different? (In which case your motherboard would determine your brand of GPU, like it does for CPUs now.)

EsaT
Posts: 473
Joined: Sun Aug 13, 2006 1:53 am
Location: 61.6° N, 29.5° E - Finland

Post by EsaT » Sun Sep 24, 2006 11:20 pm

System RAM is way too slow for them; a GPU would still need big, fast local memory, which would mean either a big die (embedded memory) or a card-on-a-socket module (taking up a lot of motherboard space).
Otherwise it would be like the consoles: a small amount of faster memory (putting fast enough memory in quantities like 2 GB would be very expensive) on a memory bus whose bandwidth would still be well under that of a current top card.
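For scale, here's a rough peak-bandwidth comparison in Python; the parts and clocks below are assumed, typical 2006-era figures rather than exact vendor specs.

[code]
# Rough peak-bandwidth comparison: system RAM vs. video RAM.
# All part numbers and clocks are illustrative assumptions.

def bandwidth_gb_s(bus_width_bits, transfers_per_sec):
    """Peak bandwidth = bus width in bytes x transfer rate."""
    return bus_width_bits / 8 * transfers_per_sec / 1e9

# Dual-channel DDR2-800 system RAM: 128-bit bus at 800 MT/s.
system_ram = bandwidth_gb_s(128, 800e6)   # ~12.8 GB/s

# High-end card (GeForce 7900 GTX class): 256-bit GDDR3 at 1.6 GT/s.
video_ram = bandwidth_gb_s(256, 1.6e9)    # ~51.2 GB/s

print(f"system {system_ram:.1f} GB/s vs. video {video_ram:.1f} GB/s")
[/code]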

Also, chips at different market levels use vastly different memory architectures, and ATI currently has a much more advanced memory controller architecture.
So putting the GPU in a motherboard socket would only work with lower-end chips; it would slow high-end chips considerably and badly hinder future development.


Also, if the makers could get away with increasing heat output further, they would use that headroom.
Remember how Intel tried to introduce the BTX case standard because cooling its super-inefficient CPUs had become hard in ATX cases; if BTX had taken off, we would probably still have hotter-than-ever P4s coming out.

Filias Cupio
Posts: 81
Joined: Sun Oct 09, 2005 7:53 pm

Post by Filias Cupio » Mon Sep 25, 2006 1:05 pm

I'm unimpressed by your first objection - you'd either have fast VRAM soldered onto the motherboard, or a new socket for VRAM DIMM modules.

However, the second objection about memory architectures is the sort of thing I was looking for, thanks. So you might be able to make a motherboard which would (e.g.) take a plug-in AM2 CPU and an nVidia series 7 GPU, but it would no more be able to take an ATI GPU than it could take an Intel CPU?

BillyBuerger
Patron of SPCR
Posts: 857
Joined: Fri Dec 27, 2002 1:49 pm
Location: Somerset, WI - USA
Contact:

Post by BillyBuerger » Tue Sep 26, 2006 6:50 am

I always thought something like this would be a good idea from a cooling perspective, but it would get really ugly for compatibility. Think of the motherboard choices:

- AM2 CPUs with DDR2 and ATI GPUs with GDDR3.
- AM2 CPUs with DDR2 and nVidia GPUs with GDDR3.
- Intel CPUs with DDR2 and nVidia GPUs with GDDR2.
- Intel CPUs with DDR2 and ATI GPUs with GDDR3.
...

And so on. GPU architectures and memory are too dynamic at the moment. You'd have to have standards for something like this, and that would probably limit the growth of the GPU market. I don't see that happening. I think it would be better for now for ATI and nVidia to concentrate on efficiency and cooler-running chips that don't have so many problems being cooled in a PCIe slot.

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Tue Sep 26, 2006 7:02 am

IMO it's more likely that we'll get external GPUs, with their own PSUs, connected by Torrenza (external HT), while a coprocessor (or a second CPU) would use the second socket, connected by coherent HT. An external GPU dissipating 250-300 W would be relatively easy to cool quietly (in 2D at least) with two 92 mm fans and a lot of heatpipes and fins. Mainstream systems, meanwhile, will get an integrated CPU/GPU - see the AMD-ATI announcement about production on a 45 nm process in 2008.

Rusty075
SPCR Reviewer
Posts: 4000
Joined: Sun Aug 11, 2002 3:26 pm
Location: Phoenix, AZ
Contact:

Post by Rusty075 » Tue Sep 26, 2006 9:10 am

Within a few years there will be no such thing as a standalone GPU chip, either on a card or on the motherboard. The GPU is going the way of the FPU and the MCU: it will be integrated into the CPU itself.

Five years from now we will be talking about 32-core CPUs. As the core count goes up, the individual core complexity goes down. Currently we have one or two complex cores tied together, each full of specialized components. Eventually you will have dozens of simple cores whose functionality can be reassigned on the fly. Doing something FPU-intensive? Then more of your cores will switch to handling FP tasks. Rendering lots of 3D? Then more cores will be devoted to what are now GPU tasks. You won't shop for different brands of GPU any more than you now shop for different brands of FPU. (FPUs used to be a fairly big industry.)
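Purely as an illustration of that idea (not anything any vendor has described), here's a toy Python sketch of a pool of identical cores whose roles get redealt to match the workload; every name in it is hypothetical.

[code]
# Toy sketch: identical cores whose role (general / FP / graphics)
# is reassigned on the fly to match demand. Hypothetical names only.
from collections import Counter

class CorePool:
    def __init__(self, n_cores):
        self.roles = ["general"] * n_cores

    def rebalance(self, demand):
        """Deal cores out in proportion to demand per task type."""
        total = sum(demand.values())
        n = len(self.roles)
        roles = []
        for task, share in demand.items():
            roles += [task] * round(n * share / total)
        self.roles = roles[:n]   # guard against rounding overshoot

pool = CorePool(32)
pool.rebalance({"general": 1, "fp": 1, "gpu": 2})   # heavy 3D load
print(Counter(pool.roles))   # Counter({'gpu': 16, 'general': 8, 'fp': 8})
[/code]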

Cells are the future. That's (partly) why AMD bought ATI. That's why Intel is hiring every GPU engineer it can steal away from ATI and Nvidia. And that's (supposedly) why Nvidia is looking at buying VIA.

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Wed Sep 27, 2006 1:07 am

I had the idea of an integrated CPU/GPU about 3 years ago and mentioned it on the OpenGL forums - most people there practically laughed at me.
I was expecting it to happen in 2006, though; that was overly optimistic. There are several problems with an integrated CPU/GPU, like:
1) Memory bandwidth: even if in 2008 it becomes possible to put a dual-core K8L CPU and an 8800-class GPU on the same die, they will have to share the memory bus, which is currently only 128 bits wide for CPUs. I suppose a 256-bit bus with fast GDDR4 memory would be enough to feed both the CPU and the GPU.
2) CPU and GPU designs use different silicon structures, making it problematic to get high yields on a CPU/GPU combo; you could have high yields on the CPU part and low yields on the GPU part, or vice versa (a rough yield model is sketched below). ATI has licensed a technology for building GPUs in a CPU-like manner, but I don't know whether they have made progress.
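To make point 2 concrete: under the classic Poisson defect model, die yield falls off exponentially with area (Y = exp(-D*A)), so fusing a CPU and a GPU onto one die roughly multiplies their separate yields. A quick Python sketch; the defect density and die areas are made up for illustration.

[code]
import math

def poisson_yield(defects_per_cm2, area_cm2):
    """Classic Poisson yield model: Y = exp(-D * A)."""
    return math.exp(-defects_per_cm2 * area_cm2)

D = 0.5                         # defects per cm^2 (illustrative)
cpu_area, gpu_area = 1.5, 3.0   # die areas in cm^2 (illustrative)

y_cpu = poisson_yield(D, cpu_area)               # ~47%
y_gpu = poisson_yield(D, gpu_area)               # ~22%
y_fused = poisson_yield(D, cpu_area + gpu_area)  # ~11%, i.e. y_cpu * y_gpu

print(f"CPU {y_cpu:.0%}, GPU {y_gpu:.0%}, fused die {y_fused:.0%}")
[/code]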

jaganath
Posts: 5085
Joined: Tue Sep 20, 2005 6:55 am
Location: UK

Post by jaganath » Wed Sep 27, 2006 2:05 am

Rusty075 wrote:Five years from now we will be talking about 32-core CPUs.
I'll believe it when I see it. :wink: We'll be talking about them, but they're probably a bit further off in the future. I don't think AM3 will support more than quad-core?

http://www.anandtech.com/cpuchipsets/sh ... spx?i=2565
Anandtech wrote:It is no surprise that talks of multi-core are up next, although the desktop will remain predominantly dual core for the foreseeable future.
Through the use of extensions to the AMD64 architecture, Hester proposed that future multi-core designs may be able to treat general purpose cores as almost specialized hardware, but refrained from committing to the use of Cell SPE-like specialized hardware in future AMD microprocessors. We tend to agree with Hester's feelings on this topic, as he approached the question from a very software-centric standpoint; the software isn't currently asking for specialized hardware, it is demanding higher performance general purpose cores, potentially augmented with some application-specific instructions.
I know the link is almost a year old, but it is on-topic.
Rusty075 wrote:Within a few years there will be no such thing as a standalone GPU chip, either on a card or on the motherboard. The GPU is going the way of the FPU and the MCU: it will be integrated into the CPU itself.
Thermal problems? Let's say power consumption of CPUs and GPUs does not come down significantly between now and then; you will be looking at dissipating up to 250 W from a bit of silicon about an inch square.
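For scale, the implied heat flux, as a quick Python check (assuming the full 250 W spread evenly over a die exactly one inch on a side):

[code]
# Heat flux if 250 W must leave a die one inch (2.54 cm) on a side.
die_side_cm = 2.54
area_cm2 = die_side_cm ** 2                 # ~6.45 cm^2
power_w = 250.0
print(f"{power_w / area_cm2:.0f} W/cm^2")   # ~39 W/cm^2
[/code]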

Rusty075
SPCR Reviewer
Posts: 4000
Joined: Sun Aug 11, 2002 3:26 pm
Location: Phoenix, AZ
Contact:

Post by Rusty075 » Wed Sep 27, 2006 12:16 pm

Re @ Tzupy:

1) A 256-bit bus, a faster memory interface, and larger inclusive and exclusive caches should help the bandwidth problem. But also consider that with the CPU and GPU communicating within the same chip, there's less data to transfer across the chipset than there is currently.

2) Current CPU and GPU designs use different silicon structures… but once they're just identical multifunction cells on the same chip, that problem doesn't exist. Yes, with current designs, throwing an Nvidia core in with a C2D core would be very complicated.


Re @ jaganath:
jaganath wrote:I'll believe it when I see it. We'll be talking about them, but they're probably a bit further off in the future. I don't think AM3 will support more than quad-core?
I was being conservative. At IDF, Intel suggested they'll be shipping 80-core CPUs by the end of the decade. Four cores will be out this year, eight are projected for Q2 of next… 32 in 5 years seems pretty reasonable.
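As a rough sanity check on that cadence (just arithmetic, not a roadmap):

[code]
import math

# If core counts go from 4 this year to 32 in five years,
# how often must they double?
start, target, years = 4, 32, 5
doublings = math.log2(target / start)   # 3 doublings
print(f"one doubling every {years / doublings:.1f} years")   # ~1.7 years
[/code]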
jaganath wrote:I know the link is almost a year old, but it is on-topic.
A lot changes in a year. AMD's purchase of ATI in particular springs to mind, as does C2D. Plus there's all the talk of Intel hiring up ATI and Nvidia engineers, and Nvidia's rumored move to buy VIA.
jaganath wrote:Thermal problems? Let's say power consumption of CPUs and GPUs does not come down significantly between now and then; you will be looking at dissipating up to 250 W from a bit of silicon about an inch square.
You're still thinking in terms of a discrete CPU core and a discrete GPU core being crammed onto one chip. These super-multicore chips aren't going to have a dozen 200-million-transistor cores on them; they are going to have a bunch of simple cores tied together dynamically. The thermals could be very different.


I don't want to make this sound like my own pet theory… I'm just regurgitating what I've been reading elsewhere. I have my own theories about where the PC industry will be in 5 years, but most of them involve aliens and tinfoil hats. :wink:

Chocolinx
Posts: 311
Joined: Thu Jan 19, 2006 5:14 am
Location: Toronto
Contact:

Post by Chocolinx » Wed Sep 27, 2006 3:25 pm

What would be really cool is if the PCIe slot were the last slot in the case; then, with a pillar of some sort supporting the unsupported end, and with the GPU on the opposite side of the card from where it usually sits, we could add a SUPER huge cooler like the Ninja to it. :D People have already mounted cheap CPU coolers (like Cooler Master heatsinks) on their GPUs and it works :shock: though in some pictures it looks like the card is ready to snap in half :shock: They say the card is perfectly fine, though.

Oooh, and on another note, with my idea we could all build lower-chamber cooling ducts in our ATX cases! LONG LIVE ATX AND OUR MODS!

zenboy
Posts: 65
Joined: Fri Dec 17, 2004 10:25 pm

Post by zenboy » Fri Sep 29, 2006 10:01 am

You could actually do that now, with a piece of all-thread (threaded rod) running down to the bottom of the case. I don't have any expansion cards in my machine anywhere near as long or as large as my video card, so I doubt it would foul anything. Maybe a pair of nuts holding something that clamps onto the card, threaded into a nut epoxied to the bottom of the case? That way it would be removable.

Rusty075
SPCR Reviewer
Posts: 4000
Joined: Sun Aug 11, 2002 3:26 pm
Location: Phoenix, AZ
Contact:

Post by Rusty075 » Fri Sep 29, 2006 11:49 am

Or more case makers could design cases like Fong Kai did, with a built-in bracket to support large/heavy cards:

[image: Fong Kai case with built-in graphics-card support bracket]

I really liked that idea; too bad nobody else adopted it.

Shining Arcanine
Friend of SPCR
Posts: 502
Joined: Sat Oct 23, 2004 2:02 pm

Post by Shining Arcanine » Fri Sep 29, 2006 1:13 pm

EsaT wrote:System RAM is way too slow for them; a GPU would still need big, fast local memory, which would mean either a big die (embedded memory) or a card-on-a-socket module (taking up a lot of motherboard space).
Otherwise it would be like the consoles: a small amount of faster memory (putting fast enough memory in quantities like 2 GB would be very expensive) on a memory bus whose bandwidth would still be well under that of a current top card.

Also, chips at different market levels use vastly different memory architectures, and ATI currently has a much more advanced memory controller architecture.
So putting the GPU in a motherboard socket would only work with lower-end chips; it would slow high-end chips considerably and badly hinder future development.

Also, if the makers could get away with increasing heat output further, they would use that headroom.
Remember how Intel tried to introduce the BTX case standard because cooling its super-inefficient CPUs had become hard in ATX cases; if BTX had taken off, we would probably still have hotter-than-ever P4s coming out.
I believe you forgot to consider the possibility of chip stacking, which would eliminate the need for external or on-die memory (it would be above/below-die memory) and which, according to Intel, is a very realistic option:

http://www.anandtech.com/tradeshows/sho ... i=2367&p=3

A graphics processor made with die-stacking technology would have the memory, the GPU, and the memory controller interface all in one package, in a rather small footprint, making a GPU socket a feasible option.

RasmusseN
Posts: 27
Joined: Fri Oct 27, 2006 11:08 am
Contact:

Post by RasmusseN » Sat Oct 28, 2006 11:04 am

Since AMD bought ATI, I've heard rumors of them eventually putting a GPU and a CPU on the same chip, so that's kinda what you're talking about. It seems kinda pointless to me, but whatever. The thing is, it's nice to be able to take a card out and replace it with another one; plus, a card carries more than just a GPU - there's fast memory and such on it.

What I really think they should be working on is multi-core GPUs, with cores taking care of different tasks. All this talk about PPUs and such could probably be handled inside a GPU.

One thing I wish could be fixed is that they keep making everything use more power, and more power = more energy = more heat. This is why I picked ATI most of the time: it seemed like they cared about the power the card drew and the heat it put out. Well, times are changing; nVidia has never cared, all they care about is performance (sorry nVidia fans, but that might be why you like them). I can now say that ATI is just as bad as nVidia: their new generation of cards uses a lot of power and gives off a lot of heat.

I don't know if you've seen the new G80 pictures at XtremeSystems; the card has four 4-pin Molex connectors to power it, it's about the size of a bus, and the heatsink is probably bigger than your head. Haha, jk - but it is fucking huge.

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Sat Oct 28, 2006 11:56 pm

RasmusseN wrote:Since AMD bought ATI, I've heard rumors of them eventually putting a GPU and a CPU on the same chip, so that's kinda what you're talking about.
That will primarily be a product for the sorts of markets currently served by integrated graphics. It's unlikely ever to be a replacement for high-end discrete graphics, probably not even for the mid-range.

There are too many compromises in such a design for it ever to represent the best solution possible with a given amount of silicon.
RasmusseN wrote:Well, times are changing; nVidia has never cared, all they care about is performance (sorry nVidia fans, but that might be why you like them).
Err... I own an NVIDIA card because at the time I bought it (about four months ago) it had the best performance-to-power ratio. ATI's last generation sucked big-time power-wise, and their next generation is going to be worse; in recent history ATI have been the criminals. Which is a shame, because I quite liked the one ATI card I've owned (an X800XL). Fundamentally, though, it's not the fault of the IHVs, it's the fault of the consumers: the graphics market is driven by people who think that replacing i's with 1's in a word like 'fatality' is k3wl.

Filias Cupio
Posts: 81
Joined: Sun Oct 09, 2005 7:53 pm

Post by Filias Cupio » Thu Nov 16, 2006 7:23 pm

Here's an article about AMD's Fusion plans: http://www.sci-tech-today.com/story.xht ... 300CLPZM6F

However, it isn't terribly informative; in particular, we're left in the dark as to how powerful a GPU they'll be fusing with the CPU.

Thosser
Posts: 2
Joined: Fri Nov 17, 2006 4:19 am
Location: Wales

Post by Thosser » Fri Nov 17, 2006 4:47 am

Chocolinx, I had a similar idea, but it may be better to lay the card flat against the motherboard (much like a laptop graphics card). That way the card could support a large heatsink, and it could even be ducted so that the air is exhausted out the back of the case.

andyb
Patron of SPCR
Posts: 3307
Joined: Wed Dec 15, 2004 12:00 pm
Location: Essex, England

Post by andyb » Fri Nov 17, 2006 1:08 pm

This might be of interest to you people.

http://www.theinquirer.net/default.aspx?article=35818

The Inq is right far more often than it is wrong; take note.


Andy

MachManX
Posts: 1
Joined: Sun Dec 17, 2006 10:49 pm

Post by MachManX » Sun Dec 17, 2006 11:21 pm

Anything's possible; it depends on how you look at it. Forget all the technical mumbo-jumbo and look at what Filias wants: to mount a big cooler on the GPU. Now, hypothetically, if you were to move the PCIe x16 connector to the side of the motherboard and attach the video card there, you could have the card lying flat, just like the motherboard, and mount a big heatsink on it. See? That wasn't so hard.

Of course this would ring so many alarm bells, like "it will take up more room" and "it's at the far end of the case", but hey, it's an idea. :)

Reality: you can get "L-extenders" (riser adapters) for every slot from ISA to PCIe x16, letting you mount your video card parallel to the motherboard; you can choose left or right versions. The problem is that a left extender would keep you from using a few PCI slots, and a right one might float near or over the CPU. But it gets the job done, and now you can mount your big heatsink on the card. DON'T FORGET TO STRUCTURALLY SUPPORT THE VIDEO CARD!

Now let's talk technicalities. As AMD has proven, an integrated or "closer to the core" memory controller is much faster than leaving it on the northbridge. But then you must also consider: when I upgrade my processor to the next technology, I upgrade my motherboard with it. The same goes for a graphics core - you would be better off upgrading the graphics board along with it. That's why we use cards.

Well, I'm sure the future holds better solutions to make this idea work. We'll have to wait and see. ;)

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Mon Dec 18, 2006 12:13 am

MachManX wrote:But then you must also consider: when I upgrade my processor to the next technology, I upgrade my motherboard with it. The same goes for a graphics core.
You just described a games console.
