Battlefield 1 Performance In DirectX 12: 29 Cards Tested

Status
Not open for further replies.


this problem might be isolated to the AMD platform, because in Tom's Hardware's initial GTX 1050 Ti & GTX 1050 review, the GTX 1050 2GB is nowhere near this bad in Battlefield 1 using DX12. the tests back then used an Intel platform (6700K):

https://img.purch.com/r/600x450/aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9HL1cvNjIyNjg4L29yaWdpbmFsL2JmMS1mci5wbmc=


 


Do you ever have anything positive to contribute?
 
When BF1 was released, I was only getting 11-12 FPS on Low with my i5-3570K and Fury Nano! Only got good frame rates after I OC'd my proc 400 MHz over base. Odd.
 


most reviewers use a reference 980 Ti for testing. the 1070 was supposed to be faster than the Titan X (Maxwell), according to Nvidia's marketing.
 


At launch, OCed 980 Tis were either beating or tied with OCed 1070s, depending on the game. I was just wondering whether this changed with driver updates, whether it was a result of DX12 on two different architectures, or whether it was just a reference 980 Ti.

 
Good, but why didn't you throw in a quad-core for the scaling test? A huge number of people are still using quad-cores.
Also, how about a triple-screen gaming benchmark?
 
The CPU makes a HUGE difference in 64-player multiplayer; an i7 is vastly superior to an i5 in that case. I know because I swapped my i5-4670K @ 4 GHz for an i7-4770K @ 4 GHz and my FPS is much more stable now.
 
As an R9 390 owner, I am now very happy with my purchase.

I mean, I was before, but wow, beating the RX 480 and GTX 1060 on ultra?

And the 390X was within striking distance of the 1070.
 


"30-35% more performance" Which doesn't translate to realistic scenarios. Stop lying.

"3rd-party GPUs produced ran at 1150 MHz and boosted up to 1250-1300 MHz" Stop lying.

28% OC, 18% actual gain: https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_780_Ti/30.html

7% OC, 6% actual gain: https://www.techpowerup.com/reviews/EVGA/GTX_780_Ti_Classified/27.html

"boosted up to 1250-1300 MHz" Stop lying.

https://www.bjorn3d.com/2014/01/gigabyte-gtx-780-ti-ghz-edition-review-super-overclock-now-sleeker-cooler/4/

https://www.eteknix.com/gigabyte-gtx-780-ti-ghz-edition-3gb-graphics-card-review/19/

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/64491-gigabyte-gtx-780-ti-ghz-edition-review-8.html
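The gap between those overclock percentages and the actual gains can be checked with simple arithmetic. A minimal sketch (the clock and FPS figures below are illustrative assumptions chosen to reproduce the 28%/18% numbers cited above, not values taken from those reviews):

```python
def pct_increase(base, new):
    """Relative increase from base to new, in percent."""
    return (new - base) / base * 100

# Assumed reference boost clock vs. an overclocked card (~28% core OC).
base_clock, oc_clock = 928, 1188
print(f"Clock OC: {pct_increase(base_clock, oc_clock):.0f}%")   # ~28%

# Assumed average FPS before/after the OC: the game only gains ~18%,
# because memory bandwidth, power limits, and CPU limits keep
# performance scaling sub-linear with core clock.
base_fps, oc_fps = 60.0, 70.8
print(f"FPS gain: {pct_increase(base_fps, oc_fps):.0f}%")       # ~18%
```

The point is that core clock is only one input to frame rate, so a 28% OC rarely shows up as a 28% FPS gain.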


 


I gotta agree. I went from my 2500K @ 4.4 GHz to an i7-7700K @ 4.5 GHz. Wow. With the same GPU, a 980 Ti, I saw massive improvements. In 64-player maps I went from 75 FPS on High settings to 90 FPS on Ultra. And now the GPU runs at 100% instead of being variable (i.e., bottleneck solved).
 


i think you got better performance not just because it was i5 vs. i7, but also because you got a CPU with a much better architecture. we did not see massive improvements from one generation to the next since Sandy Bridge, but from Sandy Bridge to Kaby Lake there is definitely quite a jump. for games that don't hit the CPU hard, it might not be that noticeable, but for games that do, it really shows how old Sandy Bridge is vs. Kaby Lake. even a stock 7600K (3.8 GHz base with turbo boost up to 4.2 GHz) will beat a 2500K at 4.5 GHz!

http://www.gamersnexus.net/hwreviews/2792-intel-i5-7600k-review-ft-2500k-3570k-more/page-3
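That claim comes down to per-clock throughput (IPC) outweighing raw frequency. A rough sketch of the comparison (the IPC figures are illustrative assumptions, not measured values):

```python
def relative_perf(clock_ghz, ipc):
    """Crude single-thread performance proxy: clock times per-clock throughput."""
    return clock_ghz * ipc

# Assumed IPC: Sandy Bridge normalized to 1.00, Kaby Lake ~25% higher.
sandy_2500k = relative_perf(4.5, 1.00)   # 2500K overclocked to 4.5 GHz
kaby_7600k  = relative_perf(4.2, 1.25)   # 7600K at its 4.2 GHz turbo

print(kaby_7600k > sandy_2500k)  # True: the IPC gain outweighs the clock deficit
```

Under those assumptions the 7600K comes out ahead despite running 300 MHz slower, which matches the GamersNexus result linked above.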
 