Does Your AMD FX Platform BSOD with Steam? Read This.

[citation][nom]sarinaide[/nom]Well, I will still give the overall win to Nvidia, but the AMD card advanced GPU technology a lot; the 7970 practically obliterated the prior standing champion card (the 580) across the board. The issue was pricing: the GTX 580 was around $530 at the time, and AMD perhaps overpriced the HD 7970. With Kepler's release, the GTX 680 has the advantage of superior core and memory speed, which is the reason for the better FPS, but I will wait on the Radeon GHz Edition cards; that way we can see just how good the GTX 680 is when, clock for clock, it's on an even par. Processor-wise, the modular design and GPU/CPU integration is revolutionary, it's just not perfected. I am willing to bide my time; the AMD architecture suits my future needs more than power efficiency does.[/citation]

The 7950 has way more memory bandwidth than the 680 does, let alone the 7970, and that is in fact how the 7900 cards beat the GK104 cards in some games: strictly because of GK104's memory bandwidth bottleneck. The Kepler cards only have an advantage in GPU performance and performance per watt, and even then it's not a big advantage.
 
[citation][nom]blazorthon[/nom]The 7950 has way more memory bandwidth than the 680 does, let alone the 7970, and that is in fact how the 7900 cards beat the GK104 cards in some games: strictly because of GK104's memory bandwidth bottleneck. The Kepler cards only have an advantage in GPU performance and performance per watt, and even then it's not a big advantage.[/citation]

I am not disputing that. I was waiting for Kepler, and when it arrived it was very underwhelming; it needed to be at least 10% better than the 7970, and overall it is more along the lines of a 2-3% margin. Again, that is why I am looking to the GHz Edition Radeons to send a shock straight back to Nvidia. I have grown irritated by Nvidia and the claims that they sandbagged the GK104.
 


For gaming performance, the only problem with the GK104 that keeps its performance lower than it should be is its memory bandwidth and, in some less common situations, memory capacity. Its memory bandwidth is too low for its GPU performance (GK104 was given the same amount of bandwidth as the GTX 580, despite having a GPU that is more than 50% faster than the GF110 in the GTX 580).
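To put a rough number on "more than 50% faster", here is a back-of-the-envelope sketch in Python using the reference specs as I remember them (1536 shaders at ~1006MHz for GK104, 512 shaders at ~1544MHz for GF110) and the usual peak-FP32 formula of shaders x 2 ops x clock. Theoretical throughput isn't the same as game performance, so treat it as an illustration only:

[code]
# Back-of-the-envelope peak FP32 throughput, using reference specs from memory.
# Peak GFLOPS = shader count * 2 ops per clock (FMA) * shader clock in GHz.
cards = {
    "GTX 580 (GF110)": (512, 1.544),   # shaders, shader clock (GHz)
    "GTX 680 (GK104)": (1536, 1.006),
}

gflops = {name: shaders * 2 * clock for name, (shaders, clock) in cards.items()}
for name, value in gflops.items():
    print(f"{name}: ~{value:.0f} GFLOPS peak FP32")

ratio = gflops["GTX 680 (GK104)"] / gflops["GTX 580 (GF110)"]
print(f"GK104 vs GF110 raw shader throughput: ~{ratio:.2f}x")  # roughly 1.95x
[/code]

On paper the shader throughput nearly doubles, while the memory bandwidth (see the figures further down) stays essentially the same, which is exactly the mismatch being described.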
 

I think what he meant was that it is always good to have a backup.

Trying to defuse a potential flame war here.
 

pwnbroker2

blppt,
Yes, that makes sense; the 08xx version was the beta for 0901, which I didn't flash either. I just don't like loading beta code unless there is a severe problem that the beta is a known fix for.

The only issue I had with 0705 is that it was kind of sluggish with Win7.
 

blppt

[citation][nom]Pyree[/nom]I think what he meant was that it is always good to have a backup. Trying to defuse a potential flame war here.[/citation]

Not to mention, the BIOS flash may not do anything at all for his BSODs, as the CEG problem does not affect pre-Zambezi CPUs (1055T). However, there is always the possibility that the newest BIOS would fix his problems because some other issue was rectified.
 
[citation][nom]sarinaide[/nom]That is rather interesting; do you have synthetics which show the memory bottlenecking and relative GPU performance of GK104 versus GK110? This seems like an interesting thing to investigate.[/citation]

No, sorry. However, the GK104 cards are beaten by the Tahiti cards in bandwidth-heavy games (not always at 1080p, but consistently at resolutions above 1080p), and other cards with a lot of bandwidth, such as the GTX 580, also do well in those games. In fact, the GK104 cards practically match the GTX 580 in such games, which is an interesting phenomenon considering that the GK104 cards and the GTX 580 have nearly identical memory bandwidth, separated by less than a few hundred MB/s. I don't think that it can mean anything but a memory bandwidth bottleneck. I keep asking for benchmarks to be done to prove it, but I'm sure it's the problem regardless, because all of the signs point to it and nothing else.
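For anyone who wants to check the "nearly identical" claim, the arithmetic is just effective memory clock times bus width. A small Python sketch using the reference memory clocks and bus widths as I recall them (so treat the exact figures as approximate):

[code]
# Memory bandwidth from reference specs (my recollection; check the spec sheets).
# Bandwidth (GB/s) = effective memory clock (MT/s) * bus width (bits) / 8 / 1000
cards = {
    "GTX 580": (4008, 384),   # effective memory clock (MT/s), bus width (bits)
    "GTX 680": (6008, 256),
    "HD 7950": (5000, 384),
    "HD 7970": (5500, 384),
}

for name, (mem_mts, bus_bits) in cards.items():
    gb_per_s = mem_mts * bus_bits / 8 / 1000
    print(f"{name}: {gb_per_s:.1f} GB/s")
# The GTX 580 and GTX 680 land within roughly 0.1 GB/s of each other,
# while the Tahiti cards have about 25-37% more bandwidth.
[/code]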
 
[citation][nom]blppt[/nom]Are you sure it's just the bandwidth and not the fact that the 7970 has an extra 1GB of frame buffer? (2GB 680 vs 3GB 7970)[/citation]

Increased VRAM capacity does not increase performance. VRAM capacity bottlenecks are different from other bottlenecks because they drop performance like a rock instead of gradually; it's like falling off a cliff instead of rolling down a hill. If capacity were the problem, it would almost certainly show itself by making the game more or less unplayable. Besides, 2GB of frame buffer is not limiting in any modern game at 2560x1600 or lower resolutions; it only becomes a problem at around 6MP resolutions (in some games, higher in others) such as triple 1080p. I'm sure that it is the bandwidth.

Increased frame buffer capacity only increases the headroom for the GPU so that it has enough memory to do the job you give it; it doesn't actually increase performance like some people would have you believe. In fact, it usually decreases performance very slightly, except when the smaller frame buffer would become a VRAM capacity bottleneck.
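To give a feel for why capacity needs scale with resolution even though capacity doesn't buy performance, here is a deliberately crude Python estimate of render-target memory alone. The per-pixel costs and the four-target G-buffer below are assumptions for illustration; real games are dominated by textures and driver overhead, which this ignores entirely:

[code]
# Crude render-target size estimate. Assumes RGBA8 colour, D24S8 depth, a
# four-target G-buffer and MSAA sample storage; ignores textures, geometry,
# compression and driver overhead, which usually dominate real VRAM usage.
def render_targets_mb(width, height, msaa=1, gbuffer_targets=4):
    color_bytes, depth_bytes = 4, 4
    main = width * height * (color_bytes + depth_bytes) * msaa
    gbuf = width * height * color_bytes * gbuffer_targets
    return (main + gbuf) / 1024 / 1024

resolutions = {
    "1920x1080, 4x MSAA": (1920, 1080, 4),
    "2560x1600, 4x MSAA": (2560, 1600, 4),
    "5760x1080 (triple 1080p), 4x MSAA": (5760, 1080, 4),
}
for label, (w, h, msaa) in resolutions.items():
    print(f"{label}: ~{render_targets_mb(w, h, msaa):.0f} MB of render targets")
[/code]

Even this toy model shows the fixed per-frame buffers roughly tripling from 1080p to triple 1080p; stack high-resolution textures on top and that is where a 2GB card starts to run out of headroom while a 3GB card does not.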
 
Guest
I'm running an AMD FX-8150 on a Gigabyte GA-970A-D3 with dual AMD Radeon HD 7970 graphics cards. I initially had a BSOD with Total War: Shogun 2 - Fall of the Samurai, but flashing my BIOS to F7 seemed to fix it. That worked for about a month, then I started to have BSODs on every Steam game. I flashed F8 and even F9 without resolving the issue. I have given up using Steam for this reason. You get what you pay for, and I curse the day I decided not to go with Intel and Nvidia. AMD can't even get their graphics card updates to work (search for issues with the Catalyst 12.4 update).
 
[citation][nom]dores893[/nom]I'm running an AMD FX-8150 on a Gigabyte GA-970A-D3 with dual AMD Radeon HD 7970 graphics cards. I initially had a BSOD with Total War: Shogun 2 - Fall of the Samurai, but flashing my BIOS to F7 seemed to fix it. That worked for about a month, then I started to have BSODs on every Steam game. I flashed F8 and even F9 without resolving the issue. I have given up using Steam for this reason. You get what you pay for, and I curse the day I decided not to go with Intel and Nvidia. AMD can't even get their graphics card updates to work (search for issues with the Catalyst 12.4 update).[/citation]

The graphics cards aren't the problem at all and should work just fine. I know many people who use 7970s and several who use them in CrossFire. They don't always work in Eyefinity and CrossFire at the same time in DX11 with the Catalyst 12.4 drivers, but they do work if you use an older driver. That problem is supposed to be fixed in Catalyst 12.5, so you won't have to use an older driver. Regardless, they can and do work if you know what you're doing. Intel and Nvidia are not inherently better; they both have problems too. For example, Intel's SATA degradation bug and Nvidia's Kepler stuttering bug are two fairly recent yet substantial examples among many others. All of these companies have problems.

Buying an FX-8150 for a gaming computer wasn't a very smart move anyway. It's no better than an FX-6100 for gaming at all (if you just overclock the 6100 to 3.6GHz), and even that is hardly any better than the FX-4100 outside of BF3 multiplayer. Also, the FX-8150 is just an 8120 with its multiplier raised; it's not even better binned, so buying the 8150 instead of the 8120 doesn't make sense. You did not research your build very well if it's a purely gaming build, especially since you paired a mid-range CPU for gaming with extremely high-end graphics.

EDIT: Also, if you think that it's so bad, then why not sell your build and make a new one? That would make more sense than complaining about it, especially when you're not even complaining about the correct problems and then go on to imply that the alternative companies don't have problems of their own.
 

blppt

[citation][nom]blazorthon[/nom]Besides, 2GB of frame buffer is not limiting in any modern games at 2560x1600 or lower resolutions, it only becomes a problem at around 6MP resolutions (in some games, higher in others) such as triple 1080p.[/citation]

Dunno about that... Grand Theft Auto 4 would use 1600+ MB of VRAM at 1080p maxed out, and that was back in 2009. I would think at least SOME A-list titles nowadays would use 2GB maxed out. Not a whole lot, I'm sure, since an awful lot of today's games are ported from the Xbox 360 and PS3 without significant texture upgrades, LOL.
 


GTA4 had a frame buffer bug that went unfixed. No modern game today maxes out 2GB at 2560x1600 unless you use some huge amount of AA that would call for a graphics setup you would expect to see driving a much higher-end display configuration. Playing at 2560x1600 with heavy SSAA and multiple GTX 680s might manage to max out 2GB in the most intensive modern games, since SSAA effectively multiplies the rendered pixel count, but not much else will.
 

blppt

[citation][nom]blazorthon[/nom]GTA4 had a frame buffer bug that went unfixed. No modern game today maxes out 2GB at 2560x1600 unless you have some huge AA that would need a graphics setup that you would expect to be used in a much higher end display configuration. Playing at 2560x1600 with some big SSAA and multiple GTX 680s might be able to max out 2GB in the most intensive modern games, but not much else.[/citation]

Ummm, no... I'm pretty sure that it was intended to use 1600+ MB of VRAM; it even told you how much VRAM certain features would use if you turned them on. I'm pretty sure it's not a real-time monitor, but an estimate not based on any rendering bug. If you maxed everything and used the best reflection and shadow quality, draw distance, etc., at 1920x1080, you got north of 1600MB of VRAM on the configuration screen.

Now, in LATER patched versions the VRAM usage did come down on that very page with all features maxed (I think the max you can use at 1080p with all features is around ~1200MB, IIRC, with the latest patch), but the game's draw distance was reduced (one reason I still use patch 1.2 to this day), and it generally doesn't look/run as well as the older versions. The game is coded poorly anyway, with texture glitches and shadowing bugs even in the latest patch. My *guess* is that Rockstar got so many complaints about crashes from people maxing the game out on 1GB cards and the like that they limited exactly how far Draw Distance = 100 would actually get you versus the older patches (for example, Draw Distance = 100 in the latest patch might be equal to something like 50 in the 1.2-1.4 patches).

That's just been my experience with GTA4, anyway.
 

blppt

Also, doesn't Shogun 2 in DX11 hit a wall at 1080p with mild MSAA (4x) on 1GB cards? I could have sworn I heard something about that using north of 1GB too.

BTW---that 1600MB figure doesn't even include anti-aliasing on GTA4, which is not supported, and which I could never get to work by forcing it through AMD's or Nvidia's drivers either. One memory-intensive, or poorly coded, game, LOL.
 
[citation][nom]blppt[/nom]Ummm, no... I'm pretty sure that it was intended to use 1600+ MB of VRAM; it even told you how much VRAM certain features would use if you turned them on. I'm pretty sure it's not a real-time monitor, but an estimate not based on any rendering bug. If you maxed everything and used the best reflection and shadow quality, draw distance, etc., at 1920x1080, you got north of 1600MB of VRAM on the configuration screen. Now, in LATER patched versions the VRAM usage did come down on that very page with all features maxed (I think the max you can use at 1080p with all features is around ~1200MB, IIRC, with the latest patch), but the game's draw distance was reduced (one reason I still use patch 1.2 to this day), and it generally doesn't look/run as well as the older versions. The game is coded poorly anyway, with texture glitches and shadowing bugs even in the latest patch. My *guess* is that Rockstar got so many complaints about crashes from people maxing the game out on 1GB cards and the like that they limited exactly how far Draw Distance = 100 would actually get you versus the older patches (for example, Draw Distance = 100 in the latest patch might be equal to something like 50 in the 1.2-1.4 patches). That's just been my experience with GTA4, anyway.[/citation]

It was just a buggy game and is not representative of how most games perform. The fact that they kept lowering its VRAM usage over time, at the cost of quality, tells us it was not supposed to use that much VRAM.
 

A Bad Day

[citation][nom]doron[/nom]A fanboy and a troll walk into a restaurant..[/citation]

If you read one of the wireless networking threads I posted months ago, you'll understand.

Maybe it's because I've come into contact with too many things that self-destructed after an update.

Or because I just bricked my family's internet connection this weekend after installing an update for our new wireless router, which required a lengthy tech support call to get our internet service restarted.
 

blppt

[citation][nom]blazorthon[/nom]It was just a buggy game and is not representative of how most games perform. The fact that they kept lowering its VRAM usage over time, at the cost of quality, tells us it was not supposed to use that much VRAM.[/citation]

I don't agree... they obviously wanted the game to be able to look that good from the start; otherwise why give the gamer the capability to do so? (And when it did run properly, it DID look light-years better than the console versions.)

My view is that the simplest, money-saving solution was to limit the amount of graphical goodies the end user could turn on to (possibly) cause problems with the game. I believe they were up to 8 official patches before the compromised visual quality came into play, and obviously it costs R* a lot of cash to keep releasing patches.

The game was certainly buggy though---no arguments there.
 
[citation][nom]blppt[/nom]I don't agree... they obviously wanted the game to be able to look that good from the start; otherwise why give the gamer the capability to do so? (And when it did run properly, it DID look light-years better than the console versions.) My view is that the simplest, money-saving solution was to limit the amount of graphical goodies the end user could turn on to (possibly) cause problems with the game. I believe they were up to 8 official patches before the compromised visual quality came into play, and obviously it costs R* a lot of cash to keep releasing patches. The game was certainly buggy though---no arguments there.[/citation]

PC games that use half as much memory look better (the PC version looked better than the console versions because the consoles are, at best, 720p without AA, among other deficiencies). It used so much because of how badly it was made and ported. Its programmers were probably inept, which explains not only why it was poorly coded in the first place, but also why they couldn't fix the problem and had to trade off quality to pretend to have fixed it. Of course, maybe it just couldn't be properly ported for some reason and that is why it was coded so poorly, and it wasn't just the programmers' fault, but who knows?
 


While I hate DRM, this user is both right and wrong. Steam isn't responsible and shouldn't take the blame... and shouldn't take responsibility, since that hurts them as a company and issuing a fix would imply it was their fault... but it would have been sick if they had stepped up and issued one.
 


Totally agree with you. AMD is not God. Clock for **** they are ahead on GPUs. They should have done better binning/pricing, but Nvidia couldn't compete, so....

They get murdered on CPUs, but they do good work. We all need them, and the ridiculous GPU bias is old.
 

blppt

"While I hate drm. this user is right/wrong. while steam isn't responsible and shouldnt take the blame... and shouldn't take responsibility since that hurts them as a company and doinga fix would imply it was them... it would gave been sick.if they stepped up and issued one"

Short of actually removing the DRM entirely, what could they have done? It took AMD patching the CPU microcode to solve the BSOD problem with Valve's CEG, and it seems to me that if it didn't occur on any other processor (including AMD's own Phenoms!), the ball is squarely in AMD's court on this one.
 

pandelta

I have the Asus M4A89GTD/USB3 board running an FX-4100 CPU; it's not a 900-series chipset but an 890 one. I originally had an older CPU and it ran Steam games just fine, but when I upgraded to the FX processor, all Steam games blue screen every time. I have been waiting six months for Asus to release a BIOS update and keep calling them, but I am getting nowhere. Asus pushed PR that this board was FX-compatible, and it is, as long as you don't play Steam games! Does anyone have any contacts to get them to fix this? If not, I may have to take the class action lawyer route because I am getting nowhere.
 

oxford373

From the first time I saw the Bulldozer review, there wasn't any benefit from the Bulldozer architecture at all. The idea behind Bulldozer is to drive efficiency, like Hyper-Threading does, but the problem is that Bulldozer (with its paired cores) has 75% lower IPC, so a Core i5 works about as fast as almost seven AMD cores at the same clock. I would be happy if AMD just forgot about the Bulldozer architecture and made native cores like Phenom, but I don't think that will happen after three years of researching and developing the Bulldozer architecture.
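As an aside, the "almost seven cores" figure only adds up if it is read as the i5 core having roughly 75% higher per-clock throughput than a Bulldozer core. Here is a quick Python sketch of that reading, where the 1.75x advantage is an illustrative assumption rather than a measured result:

[code]
# Sanity check on the "one Core i5 ~ almost seven Bulldozer cores" figure.
# The 1.75x per-core advantage is an illustrative assumption, not a benchmark.
i5_cores = 4
i5_per_core_advantage = 1.75   # assumed IPC advantage of an i5 core at the same clock

fx_core_equivalents = i5_cores * i5_per_core_advantage
print(f"4 i5 cores ~ {fx_core_equivalents:.0f} Bulldozer cores at equal clocks")  # ~7
[/code]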
 