BSOD/Motherboard Issues

Sorry it took so long: here is my BIOS. The only thing listed differently from what you said is my Normal CPU Vcore, which is higher than what you said it should be running at.

CPU Clock Ratio - 2800 MHz
CPU NorthBridge Freq - 2000 MHz
CPU Host Clock Control - Auto
CPU Frequency - 200 MHz
HT Link Frequency - Auto (2000 MHz)
Set Memory Clock - Auto
Memory Clock - x6.66 (1333 MHz)
System Voltage Control - Auto
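
For reference, here's how I think those numbers fit together (a quick sketch, assuming the core clock is just the 200 MHz host clock times the multiplier - correct me if that's wrong):

```python
# Back-of-the-envelope clock math for the readout above.
# Assumption: core clock = host clock x multiplier, and the DDR
# "Memory Clock" figure is host clock x the x6.66 memory multiplier.

host_clock_mhz = 200.0    # "CPU Frequency - 200 MHz"
cpu_multiplier = 14.0     # 2800 / 200 = 14x stock ratio
mem_multiplier = 6.66     # "Memory Clock - x6.66"

print(f"CPU core clock: {host_clock_mhz * cpu_multiplier:.0f} MHz")  # 2800
print(f"Memory clock:   {host_clock_mhz * mem_multiplier:.0f} MHz")  # 1332, nominal DDR3-1333
```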

Now, here's where things get different from what you said. The next five items say "Auto"; I figured that wasn't a big issue, but my "Normal CPU Vcore" is showing 1.3250V. Is that too high compared to what it should be, or does that show it's running stable?

Also, in terms of overclocking: I've never done it before and am not sure what exactly goes into it, but I'd love to learn. Now, as for adequate cooling: I am modding my case to add another 92 mm case fan that spins at 4800 RPM, and a PCI exhaust that pulls hot air out. My video card is known for running hot, so I like to keep myself prepared.
 
I'm very sorry - I typo'd that - it's supposed to be exactly what you've got - that's the CPU's rated voltage! I've got a bit of yard work to attend to for my dad - I'll be back in an hour or so, to get started...

Bill
 
Good deal, looking forward to hearing back.

Also, I've noticed a decrease in FPS in games since yesterday. What're the odds that has to do with heating issues? If not that, then the system restore? Not sure what else I've done, but in L4D yesterday I was averaging 120 in an open area with lots of AI; now I'm averaging 90. That's a significant drop.
 
"Also, I've noticed a decrease in FPS since yesterday in games. What're the odds that has to do with heating issues? If not that, then the system restore? Not sure what else I've done but in l4d yesterday I was averaging 120 in an open area with lots of AI, now I'm averaging 90. That's a significant drop."

Yes, it is, but this is a subject I'm not familiar with, as I don't game; so long as Windoze's 'Aero' works crisply, and I can do an occasional 3-D render, I'm a happy camper! I certainly can try to find out what's all involved - and I do mean 'all'; I believe it's, at its heart, a 'systems interaction' issue: to get good throughput, everything involved has to be pretty well optimised...

Before we start changing things, I want to point out a board feature that will save you tons of time and grief while 'tweaking': notice, at the bottom of the main BIOS page, the <F11> "Save CMOS to BIOS" item. Hit this, and you should get a menu showing a number of empty 'slots' (the count varies by BIOS), each of which will store an entire set of BIOS parameters, to be re-loaded from the corresponding <F12> "Load CMOS from BIOS". This is a wonderful overclocker's feature.

What I do with it is save my 'baseline' working parameters, so if I change something that 'irritates' the board and forces a reset of all the parameters to defaults, or, even worse, get so screwed up I need to do a 'clear CMOS', I can get back to my starting point with no effort, and without having to remember 85 separate settings! Another thing it prevents is two hours' troubleshooting after having forgotten a change to a crucial parameter - like, "wait a minute - didn't I have the tRD at seven?!"

It's pretty self-explanatory, and I always urge people to start right away by taking the time to give the 'slots' names that mean something: in two hours, "Try2" and "Try3" will not be very helpful, but "450@+10MCH" and "450@+15MCH" will! Another use is for 'green' settings: overclocks, as a rule, do not 'play well' with green features such as 'down-clocking' and 'down-volting'. With the storage slots, you can set up one profile, say "Green", with all the settings at 'stock' values and all the 'green' features enabled, and another, say "Balls2Wall", with a full overclock and all the 'green' stuff turned off...

Another neat feature of this 'slot' system is that, on most BIOS, the mechanism itself will keep track of which slots have booted successfully, and how many times (up to, I believe, a max of five)! Most BIOS will also allow you to save parameters to a disk or USB stick, so if you're forced to re-flash a BIOS (or, on some boards, a CMOS reset erases these 'storage slots'), all is not lost...

Start by saving your current settings to one of those 'slots', named, say, "BaseLine"...

Then, on the "MB Intelligent Tweaker(M.I.T.)" page of the BIOS, try:

"CPU Clock Ratio" to "17.0"
"PCIE Clock (MHz)" to "100" [not 'auto']
"NorthBridge Volt Control" to "+0.1V"
"CPU Voltage Control" to "1.4275"
note that "Normal CPU Vcore" will likely stay the same - I believe it's read off a CPU register, and merely indicates the default voltage...

Then, on the "Advanced BIOS Features" page:

"AMD K8 Cool&Quiet control" to "Disabled"

and, on the "Integrated Peripherals" page:

"Legacy USB storage detect" to "Disabled"

And, you're ready to do an "<F10> Save & Exit" to try it!
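
For the curious, here's what that recipe works out to (a quick sketch, assuming the host clock stays at its stock 200 MHz and only the multiplier changes):

```python
# What the settings above work out to, assuming core clock =
# host clock x multiplier, with the host clock left at 200 MHz.

host_clock_mhz = 200.0
stock_ratio = 14.0    # stock: 2800 MHz
new_ratio = 17.0      # "CPU Clock Ratio" set to 17.0

stock_mhz = host_clock_mhz * stock_ratio
target_mhz = host_clock_mhz * new_ratio
gain_pct = (target_mhz / stock_mhz - 1.0) * 100.0

print(f"stock {stock_mhz:.0f} MHz -> target {target_mhz:.0f} MHz "
      f"(+{gain_pct:.0f}%)")   # 2800 -> 3400 MHz, about +21%
```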

Lemme know what happens...

Good luck!

Bill


 
The CPU Voltage Control doesn't have a "1.4275" option; it only goes in increments of .025. That being said, I added .150 to make my Normal CPU Vcore add up to 1.4275; it of course read the same, but I'm not sure if it was a typo or what, because I don't have the option to enter anything like those numbers.

That being said, I am running now and will post results.

Also, I'm curious as to why we didn't increase the PCI-E MHz - is that very dangerous? What about RAM MHz and timings? I'd love to learn, if you don't mind teaching.
 
So far, what I've done is stable, and I got a 0.2 increase in the Windows test score for processor calculations. I'd like to see what the video card can do in tests/performance.

I saved a profile like you said, just in case I notice any issues, but so far it's running smoothly.
 
"The CPU Voltage Control doesn't have a '1.4275' option; it only goes in increments of .025. That being said, I added .150 to make my Normal CPU Vcore add up to 1.4275; it of course read the same, but I'm not sure if it was a typo or what, because I don't have the option to enter anything like those numbers."
You got it figured out! The reason I don't quote these things as '+ such-and-such' is that I hardly ever know which way they'll be presented (and it's a good point - I should adapt my narratives to cover both possibilities), as the manuals typically only show the default, or 'Auto', settings. Some MOBOs show everything as an 'incremental' increase; some show everything as 'overall' voltages. The only standardization I've really seen is RAM voltages - all JEDEC 800 is 1.8V, and it's almost always shown as incremental, i.e., if your RAM is 2.1V, it'll be shown as +.3V - and I just found an exception to this, too: on some AMD boards, apparently because, as on the new i7s, the CPU rather than the northbridge is 'handling' the RAM, it's set with an overall voltage...
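
To see why your 1.4275 target wasn't selectable as an increment, a quick arithmetic sketch (my own illustration of the mismatch; the only inputs are your 1.3250 stock Vcore and the .025 step size):

```python
# Why "1.4275" isn't a selectable increment: from a 1.3250 V stock
# Vcore in +0.025 V steps, the needed offset isn't a whole number
# of steps, so the BIOS can only land on the neighboring values.

stock_vcore = 1.3250
step = 0.025
target = 1.4275

offset = target - stock_vcore
print(f"needed offset: {offset:.4f} V = {offset / step:.2f} steps")  # 4.10 steps

reachable = [round(stock_vcore + n * step, 4) for n in range(7)]
print("reachable:", reachable)  # 1.325 ... 1.425, 1.45, 1.475
```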

Um - we set the PCIe to '100' because, on some northbridges, the 'Auto' setting causes the PCIe clock to come off a fixed divisor of the FSB, so when we start 'bumping' the FSB, the PCIe can easily get to an unworkably high frequency. If you want to, you can actually just increase it a bit at a time 'til the thing just quits working, and then back off a couple MHz; it will depend on the video card - usually, they'll 'drop out' somewhere between 105 and 115 MHz (my 3850s will work at 108, and 'die' at 109). But there is no actual benefit to doing so: the PCIe x16 channel is so capacious that, for everything but a pair of dual-GPU 4870x2s in Xfire, the 'pipe' is nowhere near full. The data rate of a PCIe card is so small compared to the spec that I continually point out, to people whose PCIe slots 'kick down' to x8 operation when two are in use, that they'll never see the difference between x8 and x16...
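
To picture why 'Auto' can bite you here, a little sketch (the divisor of 2 is purely hypothetical, for illustration - the real ratio varies by chipset; the 108 MHz threshold is just where my 3850s give up):

```python
# Why we lock PCIe at 100 MHz: on some northbridges, 'Auto' derives
# the PCIe clock from the host clock by a fixed divisor, so bumping
# the FSB drags the PCIe clock up with it.

pcie_divisor = 2.0   # hypothetical ratio

for host_clock in (200, 220, 240, 260):
    pcie = host_clock / pcie_divisor
    note = "fine" if pcie <= 108 else "video card may 'drop out'"
    print(f"host {host_clock} MHz -> PCIe {pcie:.0f} MHz ({note})")
```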

 


This was a VERY interesting post, I appreciate it. I'd like to learn more about voltages, increases, speeds, and things like that. So you're saying the northbridge increase can actually increase the speed of my memory, because on some AMD boards it's actually handled through that? Interesting.
And I wasn't aware that my graphics card wasn't even going to need to be OC'd; that's kind of relieving to know.
Now, you posted something I have made multiple posts about in different forums: CrossFire at x8 vs. x16. Would I benefit from running two cards at x8 over one at x16? Because my mobo is stated to run CrossFire at x8/x8. I know how much better x16/x16 is at really high resolutions, but that's not exactly what I need to know - people insist on telling me that anyway.
If I will notice a decent increase in performance, I'd like to try it out, considering my graphics card is under 100 bucks. That's a steal. If not, and I should just keep the one card at x16, I will. I just would like more info on the subject. Also, at what point do you know that x8 or x16 is too little or too much bandwidth for a specific card?
 
The northbridge voltage increase contributes to the probability of getting several sticks of RAM working at one time - sometimes not needed to run two sticks, almost always needed to run four; the increases in memory and CPU voltages contribute to their ability to actually clock faster.

Your 4850 is pretty much at the edge of using the bandwidth; you will certainly see an increase in frame rates if you CrossFire a pair, and you may or may not see a difference between running that pair in x8 versus x16 slots.

"Also, at what point do you know that x8 or x16 is too little or too much bandwidth for a specific card?"

Mostly, I go by reviews that are technical and thorough enough to deal with the bandwidths involved - if some spare time pops up, I'll try to hunt down a couple recent examples...
 
Okay, that's interesting. How did you learn this about the northbridge? I feel like I'm not looking in the right places for knowledge.

So, my 4850 being at the edge of bandwidth - meaning it's using up close to the full x16? Or were you referring to the x8?
I was told that if I use the x8, I will be bottlenecking my GPU.
And since I don't have x16/x16 capabilities, I "have" to stick to the x8; but if I will notice an increase, I'd really like to give it a try. In layman's terms: you're suggesting I get the extra 4850 to Xfire at x8, because I will notice an increase in performance, and it won't restrict anything / will increase my overall performance?
 
"How did you learn this about the northbridge?"

I'm giving away my secrets here, but:
[screenshot: my collection of Gigabyte motherboard manuals]

The first secret is: I read 'em - most people don't!

Second one:
[screenshot: my i7/x58 document collection]

This one happens to be my new 'collection' for the i7/x58 combo; I'm reading my way through roughly fifty megabytes of Intel's hardware and software developer's documents. These tell you, once you 'worry' it out of 'em, what happens at the hardware level in, say, the x58 when you adjust the QPI...

Part of the trick with the GB manuals is that they come in three 'families': the Intels, the AMDs, and the 'red-headed bastard child' - the nVidias. The boards are 'evolutionary' - they're all derived from common roots - so once you know what's problematic in one BIOS, it's likely the same in all the Intels, and often the same for both the Intels and the AMDs; the nVidias are, to put it mildly, just bizarre!

I also pick up a lot of info from a really astute GB forum over at TweakTown; some of these guys are deadly serious overclockers - they're posting board modifications and patched BIOS for the x58s:
http://forums.tweaktown.com/f69
Especially valuable are the 'stickies' on memory timing, overclocking, and BIOS flashing...

"I was told that if I use the x8, I will be bottlenecking my GPU."

I don't know, offhand, whether this is true; I do know that a 4850 is sort of on the 'ragged edge' of using the whole x16, but I don't know which side of the edge. (I'm pretty sure I read that the only currently available setup that will 'show' the x8 slowdown is a pair of dual-GPU 4870x2s, but as I don't game, it kind of 'came & went' - I'm an old, decrepit fart, and can only hold so much in my head at a time; you'd be surprised at how often I stand at the bottom of the basement steps wondering "what the hell did I come down here for?") I'll try to get some actual test numbers and get back to you...
 
Haha, holy Jesus, you have a lot of information piled onto your computer. I'm definitely going to check the TweakTown forums and everything else for information. I'm still trying to learn whether I should stick with one video card or Xfire at x8; I can't seem to find promising answers anywhere. Tests appear to be 'wishy-washy', just like other forums' answers. I was told that even though I'd be bottlenecking my GPU, if I run dual cards I will basically get more power almost instantly, just due to the fact that two are running, which can show an increase.

People still insist that I won't be getting peak performance, but at the same time, running two GPUs is a win/win.
Where do you read about your video cards? I really haven't found much information on the 4850.
 
Seems to me there was a really good 'comparo'-type article right here on Tom's within the last month or so - I think they started with the 3850 and went right up to the 4870x2 cards. I'll try to find it, but, for some reason, I always seem to have the damnedest time finding anything here other than by accident :)
 
I was thinking of something here - why, exactly, are you worried about bandwidth, video performance, etc.? Is it just for gaming? Are you aware that the human eyeball/visual cortex has a response rate of roughly twenty frames per second?
Look here:
http://en.wikipedia.org/wiki/Persistence_of_vision
or google "persistence of vision"; most modern video has a frame rate of 29.976 frames per second (it's a little off 30, to prevent 'beat' frequency interference from 60Hz lighting); once you're past that, the whole thing is moot! I've seen articles where people are torturing their hardware to get frame rates that exceed their monitor's refresh rate - which means that, besides your eye can't respond, you're 'throwing out' one frame in ten or eight because your screen can't 're-paint' any quicker... My lousy 3850s handle four screens and WinAero well enough to show artifact-free HiDef video streams - what more??
 
Well, I'm curious just because I'm interested in getting the best performance possible; while I might get great framerates, certain things happen in gaming that may cause a "hiccup", so to speak. Plus, all of this is interesting to me, especially overclocking. I just changed my major to Information Technology and couldn't be happier - studying is a whole lot more enjoyable than what I WAS doing. And I just like seeing how high my FPS can get 😛
 
Ah-Ha! A noble cause - my aim is to get my whole system interface to be as close to telepathy as is physically possible! Just wanted to make sure you were aware that, at a certain point - "you can't see this!"

As for the 'hiccups', try this:
http://www.thesycon.de/deu/latency_check.shtml
This is a major problem in handling video streams; if the DPC (Deferred Procedure Call) queue gets too deep, you lose 'real time' response, and you get a 'hiccup'...
The tool checks the DPC response, shows you the approximate 'real-time' cost, and the cited page will show you some techniques for identifying and dealing with the 'misbehaving' thread...
Besides Task Manager and the Control Panel 'Services' app, a good tool to find out what all crap is being loaded into your machine unbeknownst to you (and to stop it) is:
http://www.glarysoft.com/qs.html
QuickStartUp from GlarySoft...
 
Okay, I ran both things; my latency ranges from 112-189 µs. Sounds a bit high to me, but it's all green, so I thought I was good to go.

And that other program showed nothing that wasn't supposed to be there, so I guess I'm good to go in that respect.
 
Also: I was looking through my hardware one more time, printing off the things I need for a mail-in rebate, and noticed something:

http://www.newegg.com/Product/Product.aspx?Item=N82E16813128378
That's my motherboard; if you look, it will tell you the CrossFire specs. Am I reading this incorrectly, or does it say one slot will run at x16 while the other runs at x8 in CrossFire? I was under the impression that the x16 would be brought down to x8, lowering the bandwidth to match the other PCI-E slot. I get the feeling that's not the case, and I am able to run another card at x8 while the original still holds x16.
Was curious if you had any more insight into that? Thanks!
 
Earlier post:

"most testers who bothered with stock voltage testing seem to have gotten in the neighborhood of 3.3GHz, without any overvolting..."

There really is no risk with minimal overvolting; most of the overclocks I recommend here are of the 'very conservative' type. People are running that CPU at 1.55V 24/7; that I wouldn't do, or recommend. Modern CPUs have plenty of 'headroom': with halfway decent cooling, they can usually pick up 20-25% with no risk whatsoever to the long-term survival of the equipment. The thing that kills equipment is severe overvoltage, not marginal bumps - and it isn't the heat, it's the voltage itself: the thing to google is "electromigration" - high voltages/currents actually degrade the transistor junctions themselves over time...
 
Oh, okay - so it's only overdoing it, really? I understand.
Learned a lot from this thread so far thanks for the help.

I'm trying to find more information on my motherboard in regard to the x16 and x8.
 
PCI-SIG PCI-Express Base 2.0 specification: PCIe 2.0 doubles the bus standard's per-lane bandwidth from 0.25 GByte/s to 0.5 GByte/s, meaning a ×32 connector - or two ×16 slots combined, as with two video cards (SLI, Xfire, etc.) - can transfer data at up to 16 GByte/s.

Sapphire lists their twin-GPU 4870x2 vidcard at a 3.6 GByte/s bandwidth, so two of them in CrossFire will be cooking around 7.2 GB/s total - 3.6 GB/s per slot, less than half of the 8 GByte/s a PCIe 2.0 ×16 slot provides - which also implies that one of the slots being at x8 will have little or no effect. And, you have to realize, the Xfire interconnect takes some of the load off the PCIe slots and passes it across the dedicated connection.

The majority of people don't understand the difference between available bandwidth and the actual transfer rate being used. When you use a PCI Express 2.0 slot with a PCI Express 2.0 video card, you double the available bandwidth compared to PCI Express 1.x, but that doesn't mean the performance will double, or that the communications between the video card and the motherboard will run at double the speed. What happens is that the video card doesn't use the full bandwidth available - it probably won't reach half of it - so when you increase the available bandwidth, nothing happens, because the bus wasn't causing a bottleneck in the first place. The performance will remain, at least observably, the same.

It's the same as the hose analogy: if you're dribbling a stream of water from your kitchen faucet through a hose down onto the floor, it doesn't matter if it's a half-inch, a five-eighths, or a one-inch hose - until the hose gets small enough to impede the water flow, the same amount of water will keep leaking onto your floor!
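
And if you want to run the numbers yourself, here's the arithmetic from the figures above (a quick sketch using Sapphire's quoted 3.6 GB/s and the per-lane rates from the spec):

```python
# Link bandwidth vs. what the card actually moves, per the figures
# quoted above: PCIe 2.0 at 0.5 GB/s per lane, and Sapphire's
# 3.6 GB/s for a 4870x2 (one card, one slot).

PER_LANE_GBS = 0.5      # PCIe 2.0 (1.x was 0.25)
card_gbs = 3.6

for lanes in (8, 16):
    link_gbs = lanes * PER_LANE_GBS
    print(f"PCIe 2.0 x{lanes}: {link_gbs:.1f} GB/s link; "
          f"a 4870x2 uses about {card_gbs / link_gbs:.0%} of it")
```

Note that even the monster 4870x2 only starts to crowd an x8 link; an x16 slot is barely half used - which is exactly why the x8/x16 difference so rarely shows up in testing.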
 
That was a great analogy. I think I am starting to understand it now, thanks.

I do have a question, though: I was reviewing the Gigabyte software once more, getting familiar with it, and it says my CPU is running at 1.312 V. I know for a fact it has said 1.324 V (the stock voltage for my CPU) before, but I haven't changed anything in the BIOS since the OC. I even put my old settings back on, so it's not OC'd. Is it bad that it's reading a lower voltage? Did something get messed up in the BIOS, or does it just do that from time to time?

I was thinking about putting on the "optimal settings" to see if it changes it, but I didn't want to really change anything, even though I do have the BaseLine saved for what I originally had.
 
What you are seeing is called 'vDroop': the real-time response of the on-board voltage regulation circuitry adjusting to varying processor loading. On some northbridge/CPU combinations, this will be improved (become more stable - closer to 'nominal') by setting the "Loadline Calibration" item on the "MB Intelligent Tweaker(M.I.T.)" page of the BIOS to "Enabled"; for mine (X48/Q9550), it doesn't work worth a damn, and actually seems to aggravate the problem...

Fairly comprehensive article here:
http://www.thetechrepository.com/showthread.php?t=126
 
