Upgrade to Geforce GT 640 2GB OC


Khaleal
Jan 19, 2014
Hey there,

I'm thinking of upgrading the graphics card on my little brother's old gaming desktop.
The current system specs are:
CPU: INTEL E4600 2.4GHZ
MB: Intel 945GC chipset
RAM: 2x1GB KINGSTON 667MHZ (MB's maximum)
HDD: 1xHITACHI 160GB SATAII + 1xWD BLUE 500GB SATAIII
GC: Geforce 9400GT 512MB
PSU: 280W ATX Power Supply; Robust; WW

I found a used GeForce GT 640 2GB OC (clocked at 1050MHz) for about 80USD including shipping on eBay. I saw some gameplay videos on YouTube with this card and they were impressive. But before I go ahead and buy that card, I have some questions:
1. Will this graphics card work with the installed PSU (280W)?
2. The installed CPU and RAM are the best the motherboard supports, so I'm limited on that front. If I install the mentioned graphics card, will I suffer from CPU or RAM bottlenecking? If yes, what is the best card that suits the current system, so I don't pay for power I can't actually use?
3. Any suggestions for other equal/better/more suitable graphics cards? This is an old desktop and I don't want to put too much money into this upgrade. My budget for this upgrade is 80USD.

Thanks guys in advance :)
 

Correct me if I'm wrong: the CPU takes 65W from the 280W, leaving about 215W for other components; fans, hard drives, memory, etc. take about 80W, leaving 135W; the GTX 750 takes 65W, leaving about 70W.
If I'm left with 70W of extra power, why shouldn't I go with the OC version, which takes only 10W more?
 
Yes, you're wrong. You're assuming the full 280W is available on the +12V rail, which is not correct. The 280W is the combined rating of all five rails: +12V, +5V, +3.3V, +5Vsb, and -12V.
The wattage available on the +12V rail is not 280W but 216W, since it's rated for 18A (18 × 12 = 216).
Also, the GTX 750 is a 55W card, not 65W, so the calculation becomes 65W + 55W + 80W = 200W against that 216W.
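If it helps to see it written out, here's the same headroom arithmetic as a quick Python sketch; the 18A rail rating and per-component wattages are the rough figures quoted in this thread, not measured values:

```python
# Rough +12V headroom check using the figures quoted in this thread.
# The 18A rail rating and per-component wattages are estimates, not measurements.

RAIL_VOLTAGE = 12   # volts
RAIL_CURRENT = 18   # amps on the +12V rail label (assumed from the 216W figure)

available = RAIL_VOLTAGE * RAIL_CURRENT  # 216W usable on the +12V rail

loads = {
    "CPU (E4600)": 65,
    "drives/fans/board (rough guess)": 80,
    "GPU (GTX 750 class)": 55,
}

total = sum(loads.values())   # 200W
headroom = available - total  # only 16W of margin

print(f"+12V capacity: {available}W, estimated load: {total}W, headroom: {headroom}W")
```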

Another thing is that you're not supposed to push your power supply right to the limit.
The values I've given you are theoretical. The minor components may require more power depending on what you have in your system.
Ah, since you have almost the same system as mine, I can say this:

There is (or was, since the site doesn't seem to load anymore) a PSU calculator on the web known as the eXtreme Power Supply Calculator. It gives a rough estimate of the power supply wattage you'll need if you're planning to buy one. I put my system in without the GPU and it said 236W.

That is without the GPU. With a 650 Ti GPU, which draws 110W by itself, it recommended 364W, which is 128W more than without the GPU.
A PSU's deliverable wattage also decreases with time and usage, so since your system is old and this power supply is probably the original one, you shouldn't assume you're getting the full rated wattage.
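To put that aging point in numbers, here's a small sketch comparing the calculator's recommendations against a derated 280W unit; the 236W/364W figures are the calculator estimates quoted above, and the 20% derating for age is purely an assumed illustration:

```python
# Sketch: compare recommended wattages against an aged 280W PSU.
# The 236W/364W figures come from the calculator estimates above; the 20%
# derating for age and capacitor wear is an assumption for illustration only.

LABEL_WATTAGE = 280
ASSUMED_DERATING = 0.80  # assume the old unit delivers ~80% of its label rating

effective = LABEL_WATTAGE * ASSUMED_DERATING  # ~224W realistically available

recommendations = {
    "system without GPU": 236,
    "system with 110W GTX 650 Ti": 364,
}

for config, recommended in recommendations.items():
    verdict = "within" if recommended <= effective else "exceeds"
    print(f"{config}: calculator recommends ~{recommended}W, "
          f"aged 280W unit delivers maybe ~{effective:.0f}W -> {verdict} that budget")
```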

I'm still saying this, go for the stock one. It's already pretty powerful, and you can always overclock it yourself if you feel like it.
 


Thank you for your reply.
I ended up going for a used AMD Radeon HD 7750 1GB GDDR5 for $50 from Amazon (core clock: 880MHz, memory clock: 5000MHz). It's a 55W TDP card, and many people reported running it smoothly on their 280-300W stock PSUs without having to upgrade.
If we want to compare it to an NVIDIA card, well, it's about equivalent to the GTX 650 and way better than the GT 640 I was originally trying to upgrade to.
It's my first AMD desktop graphics card, so we'll see. Anyway, $50 is not a huge investment. If I run into power issues, I'll see if I can get a 450W Zalman PSU to replace the current one.
BTW, the eXtreme Power Supply Calculator is loading fine for me; just use Internet Explorer.
 
Hmm. The 7750 is a good card, actually the best one in terms of performance per watt before the 750 Ti came along.
It's slightly slower than the GTX 650, but it's good enough for this system.
Let me know how it goes. I was also recommended a 7750, but I decided to stick with NVIDIA, especially since I'd just gotten a new power supply (the old one had a broken fan).
 
Update: I got my HD 7750 today.
When I replace my old 9400GT with the new HD 7750, I get no output on the screen, although the computer boots normally (I can hear the Windows sounds). I can see the fan spinning on the graphics card. What could be the problem?
 


If I switch my monitor cable to the integrated graphics output (while the HD 7750 is still connected), I do get a picture through the integrated graphics.
I made sure the BIOS is set to PEG/PCI, but I still can't get any output through the dedicated card. Your help is appreciated.
 


Yes, my previous NVIDIA 9400GT works perfectly. I also tested the integrated graphics route:
With the old 9400GT connected, switching the monitor to the integrated output gave no picture.
With the new HD 7750 connected, switching to the integrated output works.

What could be causing this?
 


I've tested this card in another system and it worked fine! So what could be causing this issue: the motherboard, the PSU, or some other component?
 

