Rumor: Radeon R9-290X-X2 to Bring Double the Hawaii?


And the GTX 780 SLI breaks 5 ms of frame time variance more often than the 290X Crossfire does in Bioshock Infinite.

Now, if you were bothered by, e.g., 1 ms of frame variance, then the tables would be turned. But that is pretty much impossible for humans to perceive.
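To make "breaks 5 ms" concrete, here's a minimal sketch (illustrative only, not either site's actual tooling) of counting how often the frame-to-frame variance exceeds a threshold, assuming the raw data is just a list of per-frame times in milliseconds and that "variance" here means the difference between consecutive frame times:

import numpy as np

def count_variance_breaks(frame_times_ms, threshold_ms=5.0):
    # Frame-to-frame differences, i.e. how much each frame's time deviates from the previous one.
    diffs = np.abs(np.diff(frame_times_ms))
    return int(np.sum(diffs > threshold_ms))

# Made-up example: a steady ~20 ms cadence with one hitch around the third frame.
times = [20.0, 20.3, 26.5, 20.1, 20.4]
print(count_variance_breaks(times, 5.0))  # 2: the spike up and the drop back
print(count_variance_breaks(times, 1.0))  # 2: the same two transitions also exceed 1 ms

Run that over each card's log and you get the "how often does it break X ms" comparison being argued about here.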
 


Your argument here doesn't make sense.

SLI is a different implementation than xfire, yet we've been comparing them using FCAT this whole time.

Following your logic to its conclusion, FCAT would be almost worthless. If you can't compare the xfires to each other using FCAT, you certainly can't compare either xfire with SLI.

 


Half of the available evidence refutes this claim. We can't make reasonably conclusive statements about Bioshock's frame latency in any capacity.

Tom's review shows the Titans with better latency at every point, and neither solution was over 5 ms.

Didn't we already cover this? Am I more senile than I thought?
 

I don't know what you mean. FCAT is a tool for analyzing performance, it's not a measure of performance.


Tom's review does not show how often each configuration exceeds 5 ms of variance. PC Perspective's review does. That's because Tom's review only shows a couple of data points and an average, while PC Perspective has a full graph.
 


The issue is that the argument that we can't use FCAT to compare the two xfires falls apart immediately.

We've been using FCAT to compare SLI with Xfire this whole time. SLI is a different implementation than either xfire.

If we can use FCAT to compare SLI with Xfire, then we can certainly use it to compare the two different xfires. One can't be true without the other, so comparing the 7850's latency with the 290's is completely valid.
 


The plot points of the PC Perspective review don't agree with the ones in Tom's. That is what invalidates the game as a reference.

If you take the 75th and 95th percentile points from the PC Perspective chart, you see that it completely reverses who has better latency compared to Tom's. Tom's points also imply different curves than the PC Perspective results.

It doesn't matter that Tom's doesn't use the full chart. The data from each review that should coincide don't.
 
Just to give you a hard numbers example....
At the 95th percentile

PC Perspective: SLI ~4 ms, xfire ~1.5 ms
Tom's: SLI 0.78 ms, xfire 2.64 ms

The two reviews are completely flipped on who has the better frame variance at the 95th (and at the 75th, and overall). This makes Bioshock invalid until considerable corroborating evidence can be obtained.
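Just to make the flip explicit with those same numbers (nothing new here, this is only the figures quoted above):

# 95th-percentile frame variance (ms) as quoted above for each review.
pcper = {"SLI": 4.0, "xfire": 1.5}
toms = {"SLI": 0.78, "xfire": 2.64}

best = lambda review: min(review, key=review.get)
print(best(pcper), best(toms))  # prints: xfire SLI -- each review picks the other winner

Same two configurations, same game, opposite conclusions.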
 

So? Of course we can compare Crossfire results from different configurations. That doesn't mean that the results of one configuration can be applied to a different configuration though, especially when there are different implementations of Crossfire involved.
 

No, it invalidates nothing apart from the decision to only publish 3 data points in Tom's review. Bear in mind their test setups were different, so different results were to be expected. But Tom's review doesn't contain enough information, so we have to rely on PC Perspective.
 

They tested at different resolutions. Tom's numbers are at 3840x2160. You're comparing apples and space stations.
 


Would you explain yourself better here? To me it sounds like you're talking in a circle.

Once again, I'm comparing the 7850's latency to the 290's in the same way as we're comparing the 780's to the 290's.

If we can't use FCAT to compare the 7850's to the 290's then we can't compare SLI to xfire either, which would make FCAT nearly useless, and this entire thread pointless.

 


While there is some change, the frame latency gap between cards does not statistically vary that wildly between resolutions.
http://www.tomshardware.com/reviews/geforce-gtx-titan-performance-review,3442-3.html


Guru3D's review tested at 2560x1440. The 290X Crossfire had worse average latency than the 780 SLI, as well as more runt/dropped frames (the spikes).

http://www.guru3d.com/articles_pages/radeon_r9_290x_crossfire_vs_sli_review_benchmarks,11.html
 


Besides the differing-resolution claim (correct me if I'm wrong), you also seem to be arguing that results can't be compared because the two sites didn't publish exactly the same chart, even though there are matching sample points. That would put a serious damper on experiment reproduction in any field of science were it true. Guru3D uses yet another chart format.
 

If x = 7 and y = 9, I can compare x and y, but I cannot say that y < 8 just because x < 8.
 

4.0 - 5.1 - 6.2 first turning into 14.1 - 15.2 - 16.7 and then into 18.5 - 26.7 - 29.7 is a pretty "wild" variation. And that's just the GTX 690. As for the gap between the different cards: well, the 7970 stayed put at practically the same frame variance.

(You mention statistics, so I should perhaps note that technically what I'm calling frame variance is not a variance, σ², at all.)
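To spell that out (just a notation note, nothing from the reviews themselves): the statistical variance of N frame times t_1, ..., t_N would be

\sigma^2 = \frac{1}{N}\sum_{i=1}^{N}\left(t_i - \bar{t}\right)^2, \qquad \bar{t} = \frac{1}{N}\sum_{i=1}^{N} t_i

whereas what these charts call "frame variance" is a per-frame time difference (roughly |t_{i+1} - t_i|) reported at a given percentile, not a squared deviation from the mean.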
 

It's really difficult to get much out of a comparison of just a couple of points from a graph to an entire graph. We can assume that each of the curves in the former graph would have looked roughly like the ones in the second graph, but we can't know for sure. So we're left with just a few little islands of certainty in a large sea of uncertainty. It's a lot easier and more practical just to have a big old continent of data stretching from horizon to horizon (or at least from zero to the 100th percentile).
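As an aside, producing that continent is cheap once the raw frame times exist. A minimal sketch, assuming nothing more than a list of per-frame times in ms (again, not any site's actual pipeline):

import numpy as np

def variance_percentile_curve(frame_times_ms):
    # Full 0th-100th percentile curve of the consecutive-frame differences, one value per percentile.
    diffs = np.abs(np.diff(frame_times_ms))
    percentiles = np.arange(0, 101)
    return percentiles, np.percentile(diffs, percentiles)

Publishing only, say, the 75th and 95th points throws the rest of that curve away, which is exactly the problem with comparing two isolated samples against a full plot.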
 


I'm sorry, it's just ludicrous to say that comparing SLI to the 290 xfire is OK, but comparing the 7850 xfire to the 290 xfire in the same manner as we're comparing SLI is not.
 


You misunderstand. While the frame latency for any given card changed between resolutions, they ALL changed in a similar fashion. The 680 always had worse latency than the Titan, which in turn had worse latency than the 7970 (and of course, the 690 had the worst latency). This is true at all resolutions.

At no point did it vary so much that the 680 suddenly had better latency than the 7970.
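The claim is about the rank order being preserved, which is easy to state precisely. A toy check with purely illustrative numbers (deliberately not taken from any review):

# Hypothetical frame-variance figures (ms) per card at two resolutions, chosen only
# to illustrate "the ordering never flips"; the actual values are beside the point.
variance = {
    "1920x1080": {"HD 7970": 1.5, "Titan": 2.0, "GTX 680": 3.0, "GTX 690": 6.0},
    "2560x1600": {"HD 7970": 1.6, "Titan": 4.0, "GTX 680": 7.0, "GTX 690": 16.0},
}

rank = lambda res: sorted(variance[res], key=variance[res].get)  # best to worst
print(all(rank(r) == rank("1920x1080") for r in variance))  # True: same ordering at both resolutions

The absolute numbers move a lot between resolutions; the ordering of the cards is what stays put.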

 


Moot point. The Guru3D data provides a completely different result from either Tom's or PC Perspective's. At this point, Bioshock is definitely not a valid game to use in claiming the 290 xfire has better latency.
 

But they clearly didn't?! They changed in very different ways.

 


 

alidan
with all the problems that sli/crossfire has, i can't understand why anyone would do that.

1920x1200 monitors: any $300 card can handle them
2560x1600 monitors: a $500 card can deal with it
multi-monitor: why are you even doing that?

and multi-monitor setups will hopefully die with the oculus rift and others like it.
the problems that you gain with multi-monitor and then multi-card just aren't worth the hassle.
 

because we want moar. :) people with one gfx card find it easier to add one more of the same to boost gfx performance no matter how cheap/costly it is, from dual radeon 6570 cfx to dual 7990/tri-780 ti.

any? current cards can do that (after multiple price drops) now. the older gfx cards can't without compromises. however, most current gpus can handle multi-monitor for basic tasks, except ones that tax the gpu and vram, e.g. gaming.

because of a wider area of vision, multi-tasking, (this is gonna piss off some "real gamers and enthusiasts") power saving, etc. iirc 3-4x 768p/900p (and some 1080p, even ips) monitors were cheaper than one 1440p/1600p. syncing and calibrating would be issues, but those can be solved with some effort.
when i tried dual monitors, i realized that it was better than having multiple virtual desktops on one 1080p screen, or using software like dexpot.

oculus has a Looooong way to go before becoming ubiquitous. even then, people will prefer multi-display to a v.r. headset stuck to their faces (wide planar surface vs 3d v.r.).
 