A Question of History: Why is the Graphics Card Market the way it is?

esplin2966

Nov 22, 2014
Aside from the power supply and the storage, a computer is essentially 4 things:

1) Board
2) Processing Unit
3) RAM
4) Cooler

A graphics card is also composed solely of the above-mentioned 4 parts. However, the computer market is drastically different from the graphics card market.

For a computer, you can buy a motherboard (1), CPU (2), RAM (3), and cooler (4) separately, from different companies, and then assemble them. The ability to do this is what makes the PC-building community possible.

For a graphics card, you CANNOT buy 1) 2) 3) and 4) separately. You HAVE to buy them in one package, from companies like Asus, Gigabyte, EVGA, etc. In fact, if you try to modify 1), 2), 3), or 4) on a graphics card, you void the warranty. Only a select few enthusiasts build their own graphics cards.

Why is it that a computer's parts can be bought separately and assembled, while a graphics card has to be bought essentially "prebuilt"? I am looking for a historical explanation of how the graphics card market developed into "buy prebuilt only" while the PC market offers the option to buy components and assemble them yourself.

Thanks.
 
Solution
It is only obsolete when that particular PC no longer does what you need it to do.

This century, I have had 3.5 main PCs. Two of those were P4 space heaters/jet engines.

"The upgrade time frame is 0 sec. I'm not sure how that came to be."
Marketing. You can choose to follow it, or not.
Integrated graphics from APUs depend on:
Board
CPU (APU)
RAM (amount and speed)
Cooler

So what you said above is only true concerning dedicated graphics. Short answer - because of:
fragmentation
the way GPUs are designed so they have the best efficiency and compatibility
advancements in graphics that surpassed those in CPUs
 
@emdea22: So you're basically saying that the technology in GPUs far surpasses that of CPUs, and they're so cutting-edge that making GPU components modular is infeasible without significantly sacrificing performance, is that correct?

@USAFRet: I mean, you can't change USB 2.0 to USB 3.0 on a motherboard, but you can buy a different motherboard. Why can't I buy a different "GPU board", take the chip out of my old GPU, put it on the new GPU board, and have it work? I can do that for PCs, which have the exact same set of 4 components, yet the GPU market somehow developed in a way that doesn't allow that, and I'm not sure how or why.
 
Because that's what the market wants. If there were a lot of users modifying their graphics cards, the market would change. There have been some recent changes in graphics cards to let people easily modify the cooling, because a lot of users are using water cooling and the card makers want to allow these modifications to keep their customers happy. Some cards are easier than others to set up water cooling on, but if more users demand the capability to make certain modifications, the market will adjust.
 
Heck, why don't we buy some silicon and hardwire our own CPUs! Lol, I can only assume that there is a lot of stuff soldered onto the graphics card and that it would be too much of a challenge for one person to do easily. Machinery is quick. I love this topic though, round of applause to you.
 
@ss202sl: I think your explanation is that not enough people want to assemble their own GPUs. I guess that spawns a different question, which is: how does the market tell the big companies what they want? Like, before PC building was an option, how did people push the big companies to sell components separately? Did people just start taking computers apart en masse?
 
@turkey3_scratch: Thanks for the support haha. I'm sure CPUs used to be soldered onto motherboards, but somehow things developed in such a way that companies now design them to be removable. I'm curious about how that happened, and why it didn't happen for GPUs.
 
The further issue is that even with CPUs, they are only changeable within a certain line.
I can't take the CPU from my old P4 and plug into my new Gigabyte motherboard to gain the benefit of the new board.

Everything works together, and the rest of the board, be it motherboard or graphics board, needs to know how to take advantage of a different chip.
 


That happened early on. But again, not universally changeable. And building a PC from parts in, say, 1995 was far more difficult than it is today. You strolled through a 600-page Computer Shopper magazine, and went to your local computer show. Bought a crapload of parts, then spent 2 weeks getting it working. Tweaking expanded memory, extended memory, IRQs, the exact order of what loads first, etc, etc....
Just to get a game running.
Today, I can hand a 10-year-old a box of parts, and say Go...2 hours later, a booting PC.

Graphics, OTOH, were/are dedicated cards. No different from RAM sticks.
 
@USAFRet: I totally agree. The LGA1150 socket is gonna be dead in half a year or so, so my Asrock Z97E-itx motherboard is gonna be obsolete in half a year too. But I still have at least a year or so to upgrade before it becomes obsolete. The upgrade time frame is 0 sec. I'm not sure how that came to be.

To your second response: so you're saying that it used to be super difficult to build a PC, but people did it anyway. Enough people started doing it, which prompted companies to make parts more modular because they saw a market. That's cool. Maybe when enough people start modifying their own GPUs, companies will start making them modular too.
 


Yes, it was far more difficult. For one, there was no Tom's...:) Or internet, with a multitude of reviews, performance comparisons, basic builds, etc, etc....
All that stuff was embryonic.

People just figured it out.