E510 Upgrade Video Card: NewEgg


Feb 4, 2010
Okay, so I have this clunky old E510 with the standard 128MB x~~~ series card in it. I know it's PCI-E 16x, but since I'm not that tech savvy I have no idea if it's just "PCI-E 16x" or "PCI-E 16x 2.0, 2.1", etc., since that's what the cards seem to be broken down into category-wise when you browse NewEgg.

I'm guessing it's just PCI-E 16x without the 2.0, right? Anyone who knows about these old clunkers care to let me know :D? Don't want to get a card that this old pile can't really make use of.

And yup, I know I'm limited by my PSU.


As long as you don't spend over $350 on your card, your PCIe bus will be able to handle it. As you may know, when you use the new P55 mobos for the new Intel CPUs like the i5-750 and put two cards in the mobo, each slot maxes out at PCIe 2.0 x8, which has the same bandwidth as the PCIe 1.0 x16 you have.
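The PCIe 2.0 x8 = PCIe 1.0 x16 equivalence is easy to check with a quick back-of-envelope calculation: PCIe 1.x signals at 2.5 GT/s per lane and PCIe 2.0 at 5.0 GT/s, and both generations use 8b/10b encoding (10 raw bits on the wire per usable byte).

```python
# Back-of-envelope PCIe bandwidth: PCIe 1.0 x16 vs. PCIe 2.0 x8.
# PCIe 1.x = 2.5 GT/s per lane, PCIe 2.0 = 5.0 GT/s per lane.
# With 8b/10b encoding, 10 transferred bits carry 1 usable byte,
# so usable MB/s per lane = (GT/s * 1000) / 10.

def pcie_bandwidth_mb_s(gt_per_s, lanes):
    """Usable one-direction bandwidth in MB/s for a PCIe 1.x/2.0 link."""
    mb_per_lane = gt_per_s * 1000 / 10  # 8b/10b: 10 raw bits per byte
    return mb_per_lane * lanes

gen1_x16 = pcie_bandwidth_mb_s(2.5, 16)  # PCIe 1.0 x16
gen2_x8 = pcie_bandwidth_mb_s(5.0, 8)    # PCIe 2.0 x8
print(gen1_x16, gen2_x8)  # both 4000.0 MB/s
```

Both links work out to about 4 GB/s each way, which is why splitting a PCIe 2.0 slot to x8 in a dual-card setup leaves you with the same bandwidth as the older single x16 gen-1 slot.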

The other question is what CPU you have - but you did not disclose that. These articles show how various graphics cards are constrained by the CPU - and how it also depends on which games you play:


They also might give you an idea of when it is time to upgrade the system.

If you want to upgrade just the graphics card without a PSU upgrade, you can improve game performance by purchasing this card if it fits in your budget:

ASUS ENGT240/DI/1GD3/A GeForce GT 240 1GB 128-bit DDR3 PCI Express 2.0 x16 HDCP Ready Video Card - Retail - $78 after rebate with free shipping

Of particular note, the GT 240 has a minimum power requirement of 300w - so it should work fine in your system. ASUS is a good brand and the price is discounted.

Here is a review on the card that also shows its performance. I don't think you will find any faster card with a 300w or less minimum power requirement. The 4650 and 4670 both carry a 400w minimum power requirement. It's important to note that as PSUs approach capacity, heat increases, noise level increases, and efficiency decreases.
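To make the headroom point concrete, here is a rough sketch of the kind of math involved. The 305w figure is the E510's stock PSU rating mentioned later in the thread; the per-component draws are hypothetical round numbers for illustration, not measurements of any specific system.

```python
# Rough PSU headroom estimate (component draws are hypothetical guesses,
# only the 305w PSU rating comes from the thread). Heat, noise, and
# efficiency all get worse as total draw approaches the PSU's capacity.

PSU_CAPACITY_W = 305  # Dell E510 stock PSU rating

system_draw_w = {
    "cpu": 90,                       # hypothetical peak draw
    "motherboard_ram_drives": 70,    # hypothetical combined draw
    "gpu_gt240": 70,                 # roughly the GT 240's measured class
}

total = sum(system_draw_w.values())
load_pct = 100 * total / PSU_CAPACITY_W
print(f"Estimated load: {total}W ({load_pct:.0f}% of {PSU_CAPACITY_W}W)")
```

With these assumed numbers the system sits around three-quarters of the PSU's rating under load, which is why a card with a higher stated requirement leaves uncomfortably little margin on a 305w unit.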


That's a power recommendation, not a requirement. The fact that a 4670 uses about 8 watts more power than a GT 240 isn't going to make it "require" a PSU that's 100 watts more powerful.


The difference between the 4670 and ASUS GT 240 I listed, as shown in the review I linked, is 21w not 8w. And when you are pushing the limits on 300w - that is significant. I should think the 4670 manufacturers would not be putting 400w minimum recommendations on their cards - and thus losing sales - if they were comfortable with them running at 300w.

Why do you think you know the cards better than the manufacturers? Oh, right - unlike them, you bear no costs if your ideas don't work out.

Further, I think it is very disingenuous to say it is just a power recommendation and not a requirement when the manufacturer lists it as "minimum power requirement":


and "system requirements"


and "requires at least 400 Watt power supply".


You lose all credibility when you misrepresent that way.

8 to 11 watts difference depending on which version of the card

Newegg specs mean squat; read the manufacturer's "System Requirements" tab in the XFX link.

Here's an 8800 GTS (with its 425-watt "power recommendation" and 106-watt actual consumption) running on the Dell 305w.


And the thread it's from


It has long been known that video card manufacturers inflate their wattage "recommendations" in order to make up for the fact that there are crappy PSUs out there that may have the wattage but not the amperage.



For the 240 I used the specific power usage measured by THG in their review for the specific card I mentioned, which performed better than the reference card. My point is valid and the difference is significant.

I quoted from the specifications tab in the XFX link I posted and it states "minimum power requirement".
And there are the other similar examples of "requirements" posted for other vendors' 4670 cards.
Sure, it's true they cannot force a customer, who can put in anything he wants - they can only recommend. So perforce their requirement has to be a recommendation. There are no graphics card police going around checking and throwing customers who violate it in jail. So because they can only recommend, not force, you consider their guidance less important?

Again I ask, why do you think you know more about the system requirements than the vendors who design, test, manufacture, and repair the cards and have a long history of support and customer interactions to further inform them?

First, please stop misleading. Here is the power requirement from the EVGA site (that is an EVGA card in the picture) listing the various versions of the 8800 GTS - ALL with a power requirement of 400w NOT 425w.


Second, that is just a photo.
Third, anecdotal information about one case carries little weight compared to multiple manufacturers' stated requirements.

Again - very misleading. Watts = Volts x Amps. If they have the watts and are running within spec on voltage - as most do - then by the laws of physics they have the amperage. Now what you might be thinking of is that less of the power goes to the 12v rail - which was a particular problem when graphics cards' 12v power requirements grew quickly and many PSU manufacturers were slow to reallocate power to the 12v rail. It is not so much a problem today, but one does need to watch for it.
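The 12v-rail point above can be illustrated with a trivial calculation. The amperage figures below are hypothetical examples, not the specs of any real PSU; they show how two units with the same total wattage can offer very different power on the rail the graphics card actually draws from.

```python
# Watts = Volts x Amps, applied per rail. Amperage values are
# hypothetical, chosen to illustrate a weak vs. strong 12v rail
# in two PSUs with the same nominal total wattage.

def rail_watts(volts, amps):
    """Power available on a single PSU rail."""
    return volts * amps

# Hypothetical 400w PSU with a weak 12v rail (18A):
budget_12v = rail_watts(12, 18)   # 216w on the 12v rail
# Hypothetical 400w PSU with most capacity on the 12v rail (30A):
quality_12v = rail_watts(12, 30)  # 360w on the 12v rail

print(budget_12v, quality_12v)
```

A card that needs, say, 150w from the 12v rail (plus the CPU's share) could starve on the first unit and run comfortably on the second, even though both carry the same wattage sticker.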

Still, it does not change the basic fact that the vendors are stating SYSTEM REQUIREMENTS of 400w, or 100w more than the poster has. And as you get close to PSU capacity, heat increases, noise levels increase, and efficiency decreases. I had a 6600 GT with a minimum requirement of 350w on my Dell 8400 with a 350w PSU and was frequently annoyed with the noise level when the fan on the video card would speed up. Oops - now I am giving anecdotal information. Forget that, poster - stick with the experts.



I have some questions about the reliability of the power chart you posted. I went to look up the power on my old 9800 GT and it shows it at 124w. But then it shows one OC at 83w. Why would wattage decrease - and by almost a third - when you OC? Shouldn't it be increasing instead?

Wattage for the GTS 250 increases from 77w for the 512MB version to 81w for the 2GB version - which is reasonable, a little more for the added memory. But why then does it increase almost 100%, from 81w for the 2GB version to 158w for the 1GB version?




But this is your source - the one you provided that in some way was supposed to bolster your allegations. If you consider it a reliable source you should be able to explain the apparent discrepancies. If you can't, then why are you using as a source a table that either you have not really looked at or that has some strange results?

I have no desire to spend a lot of time trying to figure out why your source also seems to have misleading data.
You are the one asking us to believe it, not me. So then I guess you are saying you don't know if your source is reliable?