Fallout NV, Dead Island not working after upgrading from Nvidia to a Radeon

cheezstik

Distinguished
Aug 5, 2011
40
0
18,530
Today I upgraded from my old Nvidia GT 220 to a Gigabyte Radeon HD 6850, which is a huge improvement, but now some games don't work.

With Dead Island, the display switches to the game's resolution and the whole screen just goes black in fullscreen mode. The only way to do anything is to press Ctrl+Alt+Del and end the process in Task Manager. There's no config exe, and I don't know which .ini or .cfg/.txt to edit to make it work again, so any ideas?

With Fallout: New Vegas, I launch from the mod manager, and right before launching I can play around with the graphics settings. The only way it will run is in windowed mode, which looks ugly and horrible. In fullscreen it switches to the game resolution and the screen goes black, though I can still hear the sound. Again, the only thing that works is Ctrl+Alt+Del and ending the process.

Please help, guys. And to any mod: please don't lock or delete this thread. If it's in the wrong section, move it to the right one; I thought this was the right spot, but I'm not 100% sure.

My specs are:
AMD Phenom II X6 1055T
4 GB RAM, 1333 MHz
Gigabyte Radeon HD 6850
2 TB HDD, 7,200 RPM
Windows 7 64-bit
1080p monitor
 

vitornob

Distinguished
Jun 15, 2008
988
1
19,060
- Brainless fanboy answer: AMD sucks, Nvidia better!

- Correct answer: yes, I agree with @longliverock1974 and @Murissokah.
Try a clean install: uninstall all graphics drivers through Control Panel.
Restart the PC.
Get the latest Catalyst version, install it, and reboot.
Then launch ALL the games through their launchers so they can auto-detect the new card; this might help.

- Do some games work?
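Also, since you mentioned not knowing which .ini to edit: New Vegas keeps its display settings in FalloutPrefs.ini under Documents\My Games\FalloutNV. You could try forcing your monitor's resolution there. This is just a sketch for a 1080p monitor like yours, not a guaranteed fix, and back the file up first; if the values are already set like this, the problem is elsewhere:

```ini
; FalloutPrefs.ini (Documents\My Games\FalloutNV)
; Example values for a 1080p display -- adjust to match yours
[Display]
iSize W=1920
iSize H=1080
bFull Screen=1
```

If fullscreen still blacks out, try setting bFull Screen=0 to confirm the game itself renders fine in a window at that resolution.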
 

cheezstik

Distinguished
Aug 5, 2011
40
0
18,530
Thanks for the answers, guys! :) I uninstalled all the old Nvidia drivers, restarted when prompted, then installed the latest ATI drivers and restarted again. Dead Island works now, which is great, but Fallout: New Vegas still has the same issue as before, even when starting from the normal launcher and running the graphics hardware auto-detect. My PSU is the one that came inside my Thermaltake Element T case; it appears to be a 430 W Thermaltake Litepower. Any ideas on how I could get it working?
 

cheezstik

Distinguished
Aug 5, 2011
40
0
18,530


I've got a 720 W PSU, but it's not installed and I probably won't install it, since the one I have now still works and is much higher quality than the 720 W unit. I would have installed it, but the box for the 6850 said minimum 400 W, and I've seen people online run a 6850 and an i5 on 370 W. Does anyone have ideas for Fallout other than patching? I have a lot of mods installed and patching would break most of them; patching also makes the game less moddable, so I never patch single-player moddable games. Fallout NV is boring without mods, so no patch.
 
The thing is, it's better safe than sorry. He might be able to run his system on a 370 W PSU if he doesn't have much in it and isn't overclocking, but if he has tons of HDDs, fans, and other components in his case, there's a chance he's overloading his PSU, which can lead to a blown PSU, and by then it's too late. About Fallout NV: maybe try reinstalling the game. I can't give any other advice since you're insisting on keeping the game unpatched to allow the mods.
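To put rough numbers on it, here's a quick back-of-the-envelope power budget for this build. The TDP figures are approximate public specs and the "everything else" line is a guess, so treat this as a sketch, not a measurement:

```python
# Rough power-budget sketch for this system.
# TDP figures are approximate public specs; the "other" line is an estimate.
parts = {
    "Phenom II X6 1055T (125 W TDP)": 125,
    "Radeon HD 6850 (~127 W TDP)": 127,
    "Motherboard, RAM, HDD, fans (est.)": 60,
}
total = sum(parts.values())
headroom = 1.3  # leave ~30% margin for load spikes and PSU aging
recommended = total * headroom
print(f"Estimated draw: {total} W, recommended PSU: {recommended:.0f} W")
```

By that estimate a quality 430 W unit is cutting it close under a full gaming load, which would be consistent with the shutdowns in Crysis 2.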
 

cheezstik

Distinguished
Aug 5, 2011
40
0
18,530


No, I've never had that problem. Yeah, Fallout supports my monitor's native resolution, but at either native or 1920x1080 it still has the same issue. It only seems to work in windowed mode, which looks horrible on my 1360x768 desktop; the only time it looks good is in games running at 1920x1080.
 

cheezstik

Distinguished
Aug 5, 2011
40
0
18,530


OK, that guy with the 370 W PSU probably doesn't play very graphically demanding games; I was playing Crysis 2 on max settings at 1920x1080 and my computer turned itself off :/ The old PSU still works, though, so it's still good. Got the new 720 W PSU in.