
What good does LGA do?


madcow235

Member
Joined
May 27, 2002
Location
Purdue University, IN
The new Prescotts and Tejas will be based on LGA packages, and I'm wondering WHY? I was thinking maybe higher pin density, but that doesn't really make sense since the pins on the mobo take up the same amount of room. Then I was thinking a better connection, but again that doesn't make much sense. Then I discovered the reason: so you can't use dielectric grease in the socket to supercool your processor. SEE, IT'S ALL INTEL PLOTTING AGAINST US. Does anyone actually know what the use of LGA is?

FYI: LGA=Land Grid Array
 
I was reading through an article over at Anand earlier today, and they did mention that Intel moved to Land Grid Array due to increasing pin densities. Article can be found here (The reference to LGA and pin density is made on page 2.)

While we're on the subject of LGA and the like, Anand also posted pics of an early LGA Tejas in this article. Pretty interesting looking thing.
 
What does Intel have to gain by plotting against us? We're the ones going out and buying the bleeding edge technology every 6 months and we're the ones that burn our components and need new ones (which makes them money provided people don't RMA their overclocking victims). They really have nothing to gain by plotting against us and disenfranchising overclockers.

Having these contacts on the bottom of the processor rather than pins would most likely increase efficiency. It would take less power (which is something Tejas could benefit from), and you could put more contact areas on the CPU.
 
Shorter trace lengths help preserve signal integrity at higher frequencies as well... not to mention less chance of crosstalk and EM interference.
 
From what I've read not having pins on the substrate allows them to pack the contacts closer together.
 
From what I've read not having pins on the substrate allows them to pack the contacts closer together.
I have also read that moving the pins to the motherboard "socket" allows for more power to be transmitted to the chip, something which would be useful for the power-hungry Prescott and Tejas processors.
xb-70
 
Shade00 said:
You also don't have to worry about bending pins when you're removing the processor from the package or installing it. ;)


But it will be much easier to install the processor the wrong way round - assuming that it's square, you would be able to put it in any orientation in the socket, whereas with conventional pins you have certain ones keyed to prevent this happening. Crispy :eek:
 
In the article I linked to, it shows that the pre-production units have 2 angled corners so that the processor can only be aligned one way.
 
xb-70 said:

I have also read that moving the pins to the motherboard "socket" allows for more power to be transmitted to the chip, something which would be useful for the power-hungry Prescott and Tejas processors.
xb-70
They are planning to go from 100 W to 150 W. For that much juice, they introduced LGA775. The contacts have a lot of surface area because they are flat. It is like one huge metal wire going into the proc now, like a heavier gauge of electrical cord. Each pin reminds me of 16-gauge wire; each of those can carry 10 amps (16 AWG wire can carry 10 amps max). The current socket has 478 pins and the new one will have 775, so that's 297 more pins. Since the bus hasn't changed, I will assume only new power lines were added. That would be about 148 wire pairs (1 GND pin, 1 Vcc). Each can carry about 5 amps I think, so that's 740 more amps to the proc. I think Intel will have one amp per pin, so that's 148 more amps. I don't know the voltage of the P4 off the top of my head, but I'll assume 1.1 V; at 150 W and 1.1 V the proc needs about 136 amps.

Anyways, LGA775 should pave the way for OEM liquid cooling and 110 W+ procs.

That is, if Intel doesn't realize their microarchitecture for the P4 sucks and goes back to the P3/Pentium M/Centrino series.
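If you want to check the math in that post, here's a quick sketch. The pin counts are from the sockets themselves, but the 5 A per pair, the 150 W TDP, and the 1.1 V core voltage are the post's own rough assumptions, not datasheet figures:

```python
# Rough sanity check of the numbers in the post above.
# Assumed values (from the post, not from an Intel datasheet):
# - extra LGA775 contacts used as power/ground pairs, ~5 A per pair
# - ~150 W TDP at ~1.1 V core voltage

extra_contacts = 775 - 478            # contacts added vs. Socket 478
power_pairs = extra_contacts // 2     # one Vcc + one GND per pair
extra_amps = power_pairs * 5          # at an assumed 5 A per pair

tdp_watts = 150.0
vcore = 1.1
current_needed = tdp_watts / vcore    # I = P / V

print(extra_contacts)         # 297
print(power_pairs)            # 148
print(extra_amps)             # 740
print(round(current_needed))  # 136
```

So the 740 A figure only works if every new pair really carries 5 A; at 1 A per pair you get the 148 A number instead, which is closer to the ~136 A the assumed 150 W / 1.1 V chip would actually draw.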
 