BGA usage on High-END Video chips?

antielite

Jul 28, 2012
I want to ask some questions about the BGA (Ball Grid Array) packaging standard.
As you all know, BGA is very popular everywhere, for example on:
* RAM memory chips,
* north bridges,
* south bridges,
* other chips,
* GPUs.

Why is BGA used on GPUs? A GPU heats up very fast, and those balls break after about 2-3 years because of overheating or sudden changes in temperature.
[Image: BGA defect]

Is there no other possibility for high-temperature GPUs than BGA, for example switching to an LGA socket?
I have searched for this for a long time, but I only found material about fixing it, not about WHY they use only BGA.
Everyone knows that solder balls will break sooner or later if the chip is always overheating, or simply runs hot.
P.S. We are talking only about alternative GPU mounting techniques, not about how to fix or protect against this, for example with flux.
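
To put a rough number on why big, hot packages crack their balls, here is a minimal back-of-the-envelope sketch in Python, using the classic distance-from-neutral-point shear strain estimate. Every constant (CTE values, ball height, package size, temperature swing) is an assumed ballpark figure for illustration only, not a number from this thread or any datasheet.

```python
# Rough estimate of thermal-cycling shear strain on a corner BGA ball.
# All numbers below are assumed ballpark values for illustration only.

alpha_pcb = 17e-6       # CTE of FR-4 PCB, 1/K (typical in-plane value)
alpha_substrate = 7e-6  # CTE of the package substrate, 1/K (assumed)
delta_t = 60.0          # temperature swing per cycle, K (idle -> load, assumed)
l_dnp = 0.020           # distance from package centre to corner ball, m (~40 mm package)
h_ball = 0.0005         # solder ball height, m (0.5 mm, assumed)

# Distance-from-neutral-point shear strain estimate:
#   gamma ~= (alpha_pcb - alpha_substrate) * delta_T * L_DNP / h_ball
gamma = (alpha_pcb - alpha_substrate) * delta_t * l_dnp / h_ball
print(f"shear strain per cycle: {gamma:.4f} ({gamma * 100:.2f} %)")

# Doubling the temperature swing roughly doubles the strain per cycle.
gamma_hot = (alpha_pcb - alpha_substrate) * (2 * delta_t) * l_dnp / h_ball
print(f"with twice the swing:   {gamma_hot:.4f}")
```

The exact values don't matter, only the trend: a bigger package (larger distance to the corner balls) and a bigger temperature swing mean more strain per cycle, so a large, hot GPU die that cycles between idle and load is exactly the worst case for solder joint fatigue.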

In my opinion, there are only a few possible reasons why everyone uses BGA on high-end GPUs:

1. No better technique has been developed; anything other than BGA dramatically decreases performance.

2. The video cards break because of the BGA after the warranty is over, so the customer must buy a new one, which supports the companies. High-quality video cards are bad for business.

3. It is the fastest and cheapest way to solder a GPU.

So if anyone has a strong opinion and experience, please share your ideas.

Additional Details:
As you can see, the internet is full of "how to fix BGA", "how to reflow BGA", "BGA reballing", "reflow GPU (like the 8800 GTX)", "Xbox fix", and "DV6000-9000 GPU fixing".

Here is short and accurate info about BGA:
http://www.webopedia.com/TERM/B/BGA.html

So, as I asked before, speaking technically: if BGA breaks on high-temperature chips, why do companies keep using it?


More info about BGA:
* [Image: DV-series nVidia GPU]
* BGA vs PGA:
http://forum.notebookreview.com/hardware-components-aftermarket-upgrades/370183-bga-vs-pga.html
* BGA vs LGA:
http://m2m.gemalto.com/products-and-services/innovations-technology/lga-technology/comparison-lga-vs-bga.html

I hope someone is interested in it.. :)
 
BGA is used to reduce footprint and lower latency. It also has to do with RAM architecture: it is needed in order to use GDDR5, which, for instance, will be used in AMD's Kaveri APU. The incoming i5/i7 will be BGA more or less because of the 128 MB eDRAM.

I would love to see an LGA GPU for upgrades.
 
You can't pack LGA or PGA pins in as tightly - you need more space between them.
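
As a rough illustration of that density argument, here is a small sketch comparing how many contacts fit under a 45 x 45 mm package at different pitches. The pitches are assumed "typical" values for illustration, not figures from any datasheet.

```python
# Rough contact-count comparison for a 45 x 45 mm package footprint.
# Pitches are assumed typical values for illustration, not datasheet numbers.

def contacts_on_grid(package_mm: float, pitch_mm: float) -> int:
    """Contacts on a full square grid at the given pitch (ignores depopulated areas)."""
    per_side = int(package_mm // pitch_mm)
    return per_side * per_side

package_mm = 45.0
for name, pitch in [("BGA (1.0 mm pitch)", 1.0),
                    ("LGA (~1.1 mm pitch)", 1.1),
                    ("PGA (2.54 mm pitch)", 2.54)]:
    print(f"{name:22s}: ~{contacts_on_grid(package_mm, pitch)} contacts")
```

With those assumed pitches the BGA grid holds a couple of thousand contacts in the same area where a classic through-hole PGA manages only a few hundred, which is the whole point: a modern GPU simply does not fit its pin count into a PGA footprint of sane size.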

Also, BGA isn't always as bad as you make it sound; yes, some fail, but many survive, and the failures are largely due to quality-control issues.

LGA also has issues with the force required to keep it attached to the board over very large chips, and high-pin-count ZIF sockets (for PGA) are quite complex and expensive, plus space consuming. GPUs also require much larger footprints than CPUs, as they draw more current and have much wider memory interfaces, both major users of pins.
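
To see why the pin budget balloons, here is a rough sketch of how many balls a 200 W GPU with a 384-bit memory bus might need just for power delivery and the memory interface. Every constant (per-ball current limit, core voltage, signal overhead per data bit) is an assumed ballpark guess, not a vendor spec.

```python
# Rough ball-count estimate for power delivery and the memory interface.
# Every constant here is an assumed ballpark figure, not a vendor spec.

tdp_w = 200.0          # board power, W (assumed)
v_core = 1.0           # core voltage, V (assumed)
i_per_ball = 0.5       # safe current per solder ball, A (assumed)

core_current = tdp_w / v_core
# Each supply ball usually needs a matching ground return ball.
power_balls = 2 * core_current / i_per_ball

mem_bus_bits = 384     # e.g. a wide GDDR5 interface
signal_overhead = 1.8  # address/command/strobe/clock lines per data bit (assumed)
memory_balls = mem_bus_bits * signal_overhead

print(f"power/ground balls : ~{power_balls:.0f}")
print(f"memory I/O balls   : ~{memory_balls:.0f}")
print(f"total (these alone): ~{power_balls + memory_balls:.0f}")
```

Even with generous rounding you end up well past a thousand contacts before counting display outputs, PCIe, and control signals, and a ZIF or LGA socket with that many contacts would be huge and expensive.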

GPUs are also a lot more differentiated than CPUs - they have varying numbers of memory channels and outputs, and everything changes between generations. You'd need a VRM capable of handling a 7970 GHz Edition, and a heatsink capable of cooling it, on a 7750. Not going to happen.

Also, add another 5mm to the thickness of your GPU for the socket.

EDIT: Minor clarity/wording changes.

EDIT #2: According to Intel's LGA1155 spec sheet, the socket itself is attached to the motherboard using a BGA. http://www.intel.com/content/www/us/en/processors/core/3rd-gen-core-lga1155-socket-guide.html
 
I know there are some issues with the BGA package, but it is still the best package for large chips, because the size of the chip and of the PCB is the deciding factor in PCB cost. That is why companies prefer BGA: it makes the PCBs smaller and cheaper.
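
As a rough feel for that cost argument, here is a small sketch of how much board area the same contact count would claim as a BGA versus a classic socketed PGA, and what that area difference alone might do to board cost. The pitches and the cost-per-area rate are assumed illustration values, not real quotes.

```python
import math

# Rough board-area comparison for the same contact count, BGA vs socketed PGA.
# Pitches and cost-per-area are assumed illustration values, not real quotes.

contacts = 1500
pitch_bga_mm = 1.0     # assumed BGA pitch
pitch_pga_mm = 2.54    # assumed classic PGA pitch

def footprint_mm2(n_contacts: int, pitch_mm: float) -> float:
    """Area of the square grid needed to hold n_contacts at the given pitch."""
    side = math.ceil(math.sqrt(n_contacts)) * pitch_mm
    return side * side

area_bga = footprint_mm2(contacts, pitch_bga_mm)
area_pga = footprint_mm2(contacts, pitch_pga_mm)

cost_per_cm2 = 0.05    # assumed PCB cost per cm^2, arbitrary units
extra_cm2 = (area_pga - area_bga) / 100.0
print(f"BGA footprint : {area_bga:.0f} mm^2")
print(f"PGA footprint : {area_pga:.0f} mm^2")
print(f"extra PCB cost: ~{extra_cm2 * cost_per_cm2:.2f} per board (assumed rate)")
```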
