Workstation dual-CPU build used for gaming as well

jordank_boro

Dec 26, 2018
hey guys, the thread title says it all: i'll be talking about a workstation PC build with two CPUs that's also used to play games. this post is in no way an angry one, in fact quite the opposite. i'm new to workstation PCs, so any (real) info people can lay on me would be amazing. here's what's up.
BATTLEFIELD V USES ALL 16 OF MY CPU THREADS! that's right, guys. i have a 3 GB R9 280X graphics card and a dual-CPU setup totaling 8 cores / 16 threads, and i'm able to play at a locked 60 fps on high settings. right now my CPUs only run at 2.4 GHz per core, which isn't much, but i've upgraded to two 6-core/12-thread processors that boost to 3.4 GHz (for 50 bucks), and i assume the game will have all the CPU headroom it needs. i've always understood that Battlefield is very CPU intensive, so it makes sense they would program it to spread the load fairly evenly across all your cores, which i absolutely adore.

i'm just wondering if more games will be doing this as well. is it really that hard? i know everything takes work, but will other games follow suit now too, considering the release of ridiculous CPUs like Threadripper? this build handles a lot of other games really well (the new Resident Evil 2 remake, Sleeping Dogs, GTA V), but some games run like garbage (any Far Cry game, Just Cause 3, and others) because they only utilize a few cores, maxing those out and dropping fps a lot.
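for anyone wondering what "spreading the load evenly" actually looks like, here's a toy sketch (this is NOT Frostbite's real job system, just my own illustration of the idea) of how an engine can split per-frame work across every hardware thread:

```cpp
// Toy sketch of a job-system-style parallel update loop.
// Purely illustrative -- real engines use persistent worker pools
// and task graphs, but the load-spreading idea is the same.
#include <thread>
#include <vector>
#include <cstddef>
#include <cstdio>

void update_entity(std::size_t i) {
    // stand-in for per-entity game logic (AI, physics, animation...)
    volatile double x = 0;
    for (int k = 0; k < 1000; ++k) x += i * 0.001;
}

int main() {
    const std::size_t num_entities = 100000;
    // 16 on an 8-core/16-thread box like mine
    const unsigned workers = std::thread::hardware_concurrency();

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([=] {
            // each worker takes an even slice of the entities,
            // so no single core ends up doing all the work
            for (std::size_t i = w; i < num_entities; i += workers)
                update_entity(i);
        });
    }
    for (auto& t : pool) t.join();
    std::printf("frame updated on %u threads\n", workers);
    return 0;
}
```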

but here's where the crappy part comes in. it seems that if the devs are lazy (cough cough, Far Cry franchise), the game only utilizes maybe 4 cores and maxes them out, causing my fps to drop to an abysmal 35 in some places while my GPU usage drops to about 60 percent; the GPU is maxed out while playing BFV. it seems that once a CPU core is maxed out, the graphics card has a tough time working at its full potential. and i understand why: the processor has to process the information before the graphics card can turn it into an image.

so i'm wondering why the heck devs would create games that utilize only up to 4 cores, maxed out. if your objective is fps, then you should spread the load among all cores and threads so the CPU isn't working as hard on any single thread, causing a single-thread bottleneck. this truly seems like a money grab going on between the game devs and the people making computer parts. i can't see why you'd keep a game less efficient, the only reason being that they want you to think you have a bottleneck and buy new parts, mainly a CPU. lots of newer processors have 4 cores at up to 4 GHz, and even they are starting to struggle a bit with new titles.

so is anyone out there thinking this too? BFV uses all the cores and runs so smooth (especially for my shitty 3 GB graphics card), so why haven't more companies done this? i'm so confused, and i can't find a single reason except that they're doing a money grab, hoping people will buy newer CPUs because their old one looks obsolete when really it's not at all and the programming is just shit.
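to put rough numbers on that single-thread bottleneck: Amdahl's law says the speedup from n cores is 1 / (serial_fraction + parallel_fraction / n). here's a quick sketch (the parallel fractions below are assumed values i picked purely for illustration, not measurements from any real game):

```cpp
// Back-of-the-envelope Amdahl's law: how much a frame speeds up
// when a fraction p of its work is spread over n cores.
// The 0.60 and 0.95 fractions are assumptions for illustration only.
#include <cstdio>
#include <initializer_list>

double speedup(double parallel_fraction, int cores) {
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores);
}

int main() {
    for (int n : {1, 4, 16})
        std::printf("p=0.60, %2d cores -> %.2fx\n", n, speedup(0.60, n));
    for (int n : {1, 4, 16})
        std::printf("p=0.95, %2d cores -> %.2fx\n", n, speedup(0.95, n));
    return 0;
}
```

with only 60 percent of the frame parallelized, 16 cores get you about 2.3x over one core and barely beat 4 cores (1.8x); at 95 percent parallel, 16 cores pull way ahead (about 9x vs 3.5x). which, from where i'm sitting, is roughly the difference between a 4-core-max engine and BFV.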
 
latency from and to which parts? if it's because of the dual CPUs, should i just turn one off in the BIOS? also, i'm noticing weird things. when i'm playing Battlefield, my GPU usage stays in the 80-90 percent range and my frames don't really drop below 45, usually staying around 55 to 60. but in different games that aren't even as CPU intensive, my GPU won't even run at 100 percent. playing Far Cry 4, it only runs at around 70 percent and i get maybe 50 fps if i'm lucky; if i enter a town i get around 35, and it suuuuucks. lowering the graphics settings gives me a bit more fps out in the open, back up to around 50, but in a town it's still around 37, so i gain like 2 fps. is this game-to-game optimization, or is there a setting stopping my card from reaching its full potential? the card is only 3 GB too, so it should be maxed out pretty much all the time. and it's weird that a brand-new title like BFV runs so damn well, even with DirectX 12 enabled, while older games and some newer ones just run like a potato lol.
 
Dual-socket workstations have never been meant for gaming. A lot of their latency problem is due to NUMA memory access. Your GPU is old and could cause problems today, as could your CPUs: server/workstation hardware isn't meant for gaming, and the low clocks can really hinder it in games.
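If you want to test whether NUMA is the culprit without disabling a socket in the BIOS, you can pin the game to one node's cores. Here's a minimal Win32 sketch, assuming a 64-bit build and 64 or fewer logical processors (the limit of the legacy NUMA API it uses); it's just to illustrate the idea, adapt as needed:

```cpp
// Minimal sketch: restrict the current process to the cores of
// NUMA node 0, so it never pays cross-socket memory latency.
// Uses the legacy Win32 NUMA API (fine for <= 64 logical CPUs);
// build as x64 so the full affinity mask fits in DWORD_PTR.
// Child processes launched from here inherit the affinity mask.
#include <windows.h>
#include <cstdio>

int main() {
    ULONG highest = 0;
    if (!GetNumaHighestNodeNumber(&highest)) {
        std::printf("NUMA query failed\n");
        return 1;
    }
    std::printf("highest NUMA node: %lu\n", highest);

    ULONGLONG node0_mask = 0;
    if (!GetNumaNodeProcessorMask(0, &node0_mask)) {
        std::printf("mask query failed\n");
        return 1;
    }

    // Pin this process (and any children it spawns) to node 0 only.
    if (!SetProcessAffinityMask(GetCurrentProcess(),
                                (DWORD_PTR)node0_mask)) {
        std::printf("affinity set failed\n");
        return 1;
    }
    std::printf("pinned to node 0, mask 0x%llx\n", node0_mask);
    return 0;
}
```

You can get the same effect without code: use Task Manager's "Set affinity" on the game's process, or launch it with `start /affinity <hexmask> game.exe` from a command prompt. If the fps problem shrinks when the game is confined to one socket, cross-socket latency is part of your answer.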
 
oh yeah, i'm definitely aware of the lower clock speed. the CPU i'm running is the Xeon X5675, and it actually has a boost clock of 3.4 GHz, which isn't too bad at all. if i find another mobo that allows overclocking, i'm sure i could hit almost 4.0 GHz, since my temps at a full 3.39 GHz are in the 57 °C range. this was all a system i sourced for nearly free, minus the GPU (bought when it was still pretty good) and the new CPUs ($50 USD); the GPU i've had for over 5 years haha. i'm looking into getting a new one. but as it stands, it's running BFV at 60 fps (with dips to 52) on high settings at 1080p, and as far as i can tell from having an overlay enabled, my GPU is the only thing holding me back from getting more fps lol. that, and the lack of an overclock setting in my BIOS. if you know of a semi-safe (because i know no OC method is truly safe haha) way to overclock without using the BIOS, or maybe even a third-party program that you would stand by yourself, i'd love to know about it.
 
is this somewhere in windows settings??? if it has anything to do with my BIOS, my mobo's settings and chipset don't allow overclocking, so i'd need to do it through software, either windows itself or a third-party program made for this exact scenario. some mobos, believe it or not, don't even have the option to set the CPU settings to manual; mine is locked on auto, and there is literally no manual option haha. i've even read online that this mobo can't be overclocked through the BIOS at all. but like i said, if anyone knows of a well-trusted third-party program they'd stand by, i'd love to hear about it =)