For Honor Performance Review

While it's nice to see how last-gen cards like the 970 and 390 compare to current-gen cards, I would have liked to see cards outside this performance bracket tested as well. How do the RX 460 and GTX 1050 do at 1080p High, for example?
 

They had it enabled on both cards. Are you telling me Nvidia's TAA implementation is inferior? I'd believe you if you told me that, but you have to use your words.
 


As has been stated, the settings were the same for both cards; maybe Nvidia is sacrificing detail for the higher FPS score? I also found it interesting that Nvidia used more system resources than AMD. I was confused why the 3 GB version of the 1060 did so poorly considering the low VRAM usage, but after looking it up, Nvidia cut the chip down. It seems a bit misleading for them both to be called the 1060; if you didn't look it up, you'd assume they're the same graphics chip with just differing amounts of RAM.
 
^ Yeah, it's a pretty shifty move, probably marketing related. The 1060 3 GB and 6 GB have a similar relationship to the GTX 660 and 660 Ti from a few gens back. Only back then they were nice enough to differentiate the two with the Ti moniker. Not so much this time.
 


I've been calling this a dirty trick since release, and it's part of the reason I recommend people get the RX 480 4 GB instead. Just a shady move on nVidia's part, but they're known for such things.

The odd thing, though: the 1060 3 GB in laptops doesn't have a cut-down chip; it has all its cores enabled and just runs at lower core and memory clocks.
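For anyone curious which GP106 variant they actually ended up with, a minimal sketch using the CUDA runtime can tell you, assuming the CUDA toolkit is installed. Consumer Pascal has 128 CUDA cores per SM, so the cut-down desktop 3 GB should report 9 SMs (~1152 cores), while the 6 GB card and the laptop 3 GB variant should report 10 (~1280):

```cpp
// check_gp106.cpp - query the SM count through the CUDA runtime API.
// Build with: nvcc check_gp106.cpp -o check_gp106
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        std::printf("No CUDA device found.\n");
        return 1;
    }
    // Consumer Pascal (compute capability 6.1) packs 128 CUDA cores per SM,
    // so SM count * 128 gives the number of enabled cores.
    const int coresPerSM = 128;
    std::printf("%s: %d SMs -> ~%d CUDA cores\n",
                prop.name, prop.multiProcessorCount,
                prop.multiProcessorCount * coresPerSM);
    return 0;
}
```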
 
Blurrier and using less VRAM on the Nvidia cards... I'd probably move away from the presets and try to figure out what's wrong; there's no direct performance comparison if the cards are effectively rendering different settings. Nvidia's AA techniques lately seem to often produce a blurry image.
 
Seems like an optimized and scalable game where performance is in line with the visuals. Not something we can say about most ports these days!
 
Memory usage only a bit smaller on "High" than on "Extreme High"? I'm only guessing, but switching on certain settings may allow the CPU/GPU to skip some calculations for shadows or for rendering behind well-placed objects (unintentional and very situational).
 
To me it's quite obvious that the differences in CPU and RAM usage come down to the frame rate.
Since the GTX 1060 can produce more frames, the CPU needs to work harder to keep up with the demand, and while doing so it also uses more RAM to store intermediate data.
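A back-of-the-envelope sketch of that reasoning (all numbers made up for illustration, not measured from For Honor): if the CPU spends a roughly fixed amount of time simulating and building draw calls for each frame, its total load scales with however many frames the GPU can push, and so does the intermediate per-frame data allocated along the way.

```cpp
// toy model - illustrative numbers only.
#include <cstdio>

int main() {
    const double cpuMsPerFrame = 5.0;  // assumed CPU time to simulate + submit one frame
    const double rates[] = {60.0, 100.0};

    for (double fps : rates) {
        // Fraction of one CPU core spent on per-frame work at this frame rate.
        double coreLoad = fps * cpuMsPerFrame / 1000.0;
        std::printf("%3.0f fps -> ~%2.0f%% of one core on per-frame work\n",
                    fps, coreLoad * 100.0);
    }
    // The same scaling applies to RAM churn: more frames prepared per second
    // means more intermediate buffers allocated and held by the engine/driver.
    return 0;
}
```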
 


Higher frames but lower quality. Something is off; I'm starting to question nVidia's performance numbers. Who cares if you're getting 100 frames if the image is garbage.
 