Star Trek Online: Game Performance Analyzed And Benchmarked


jennyh

Splendid
Except it does.

What you're showing people is what the game is like now. How do you think it will look in two months' time, after ATI makes a driver fix to undo Nvidia's dirty work?
 

blevsta

Distinguished
I like these game-specific reviews. This way we can see how well the game takes advantage of our PCs. What we've learned today:

Two CPU cores only; anything more isn't utilized.
Optimized for Nvidia, most probably due to "The Way It's Meant To Be Played" money thrown their way.

It would have been nice to see how well this game utilizes SLI and CrossFire, 64-bit vs. 32-bit, and 2GB of memory vs. 8GB.
 

cleeve

Illustrious
[citation][nom]jennyh[/nom]How do you think it will look in two months' time, after ATI makes a driver fix to undo Nvidia's dirty work?[/citation]

I don't know how it will look in two months; I'll have to wait a couple of months to find out.

I'm certainly not going to assume it's going to change because you say it will.



 

jennyh

Splendid
[citation][nom]Cleeve[/nom]I don't know how it will look in two months; I'll have to wait a couple of months to find out. I'm certainly not going to assume it's going to change because you say it will.[/citation]

And neither should you, because the truth is ATI might not do anything.

That's not what I'm asking, though. What I'm asking is that you apply a bit of common sense when you get benchmarks like today's.

Star Trek is a huge name, obviously. I read it already has something like a million subscribers, although many are a bit underwhelmed by the game. What Nvidia has done here is make sure their cards look a lot better than they are, to catch newcomers who basically don't know any better.

Those people who play and then quit the game are going to be left with inferior cards that cost more... not that Nvidia cares, because they've already sold the cards... and you don't think that's worth pointing out?

You really don't think it's worth telling people that the results for this game are wildly inconsistent with just about every other game you can find right now?

We aren't talking about a minor discrepancy here, Don; we are talking about an absolute joke of a benchmark.
 

cleeve

Illustrious
[citation][nom]jennyh[/nom]Those people who play and then quit the game are going to be left with inferior cards that cost more... not that Nvidia cares, because they've already sold the cards... and you don't think that's worth pointing out?[/citation]

My god, Jenny. You talk like folks who own or purchase a GTS 250 or GTX 260 will be pulling their eyes out in the streets in two months.

Neither is that horrible. The GTX 260 has become very overpriced lately, but the GTS 250 is a card I'd recommend to a buyer looking for something to play STO on exclusively.

If they want something more versatile, that's what the monthly Best Graphics Cards For The Money articles are for. This article focuses on STO, though, and rightfully so. That's the point of it.
 

jennyh

Splendid
OK, fair enough in some respects.

I do believe you have a duty to this website's readers to point out serious discrepancies in benchmarks, though, even in a title-specific review. The fact is, very few people play only one game - even when I was addicted to WoW I still had other stuff I played, and by all accounts STO is not quite so addictive.

Interesting discussion regardless.
 

cleeve

Illustrious
Fair enough indeed. I'll keep that in mind in the future.

I will say that I don't think I was as disturbed as you were by the Radeon results. A lot of games show a preference for either Radeon or GeForce cards, but the 4890, 5850, and GTX 260 all had very playable frame rates at 1920x1200 (especially high minimum FPS compared to the averages), and they all failed to reach 30 fps at 2560x1600. But how many people actually have 30" monitors, anyway?

For me, the only really disturbing result was the 5770. Frankly, I'll be surprised if that doesn't improve over time, but I'm just reporting the results I got.

For the record, I find STO more addictive than WoW. But like I said, I'm a Trek fan. :p
 

kettu

Distinguished


Why don't you care if a company is meddling with benchmarks? If I were a hardware reviewer, I'd want to make sure my testing methodology was sound, and that includes all the software, even if I were concentrating on a single title. If reviewers don't call out the "politics, BS, and questionable conduct," it won't ever stop. Sure, complaining about it doesn't help anyone play this particular game any better, but maybe complaining now puts pressure on game developers to avoid this in future titles. Transparency isn't a bad thing.
 

fencer55

Distinguished
I pre-ordered this game, and boy, was I disappointed. It isn't terrible, but it isn't great either - just a mediocre MMORPG set in the Star Trek universe. You can see that the game was designed with the Xbox 360 in mind, not the PC gamer. I've been playing Fallen Earth for the last six months; now that is an MMO worth playing!
 
Despite what some people say, the game is quite good. I love and prefer the space part of it. The ships look amazing, as does everything else.

I have a Q6600 @ 3GHz and an HD 4870 clocked to 4890 speeds, and I run the game at 1600x900 maxed out. I do get the ground-mission crashes, which would make sense if they're due to dynamic lighting: with all the phaser/disruptor fire going off, each beam has its own light source trying to cast shadows on each person. Imagine 10 or so beams of light doing that.
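To put rough numbers on that theory: with per-light shadow maps, shadow work grows with lights times casters. Here's a minimal sketch of that scaling, assuming a hypothetical renderer that does one depth pass per caster per shadow-casting light - the counts are illustrative guesses, not anything from Cryptic's engine.

```python
# Hypothetical cost model for the dynamic-lighting theory above: with
# per-light shadow maps, shadow work scales with lights x casters.
# The counts below are illustrative guesses, not engine internals.

def shadow_map_renders(num_shadow_lights: int, num_casters: int) -> int:
    # Each shadow-casting light needs every caster rendered into its depth map.
    return num_shadow_lights * num_casters

# A busy ground fight: ~10 beam effects, each its own light, ~10 characters.
print(shadow_map_renders(10, 10))  # 100 extra depth-only renders per frame
```

A hundred extra depth renders every frame, on top of the normal scene, would explain why busy ground fights are where things fall over.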

I do hate that it's subscription-based, and I probably won't continue it; I might buy a month at a time now and then. If only NCSoft, the company that made Guild Wars, had done the game: buy it once, then free to play.

It's a good game that has a lot of potential. I enjoy searching for people from the older shows/movies. In fact, the human-looking Klingons actually get explained in the game.

As for the Klingons, I can understand why they aren't a full starting faction for now. Star Trek has always been a look at humanity and how much potential we have to work together; the Klingons are the opposite in a way - they prefer war. The Borg aren't playable, but hey, maybe they'll add a quest where you can be converted into one.

It does seem, however, that Cardassians and Romulans might become playable factions in the future. It would be interesting to see them and have a four-way PvP setup.

Oh, and Tribble breeding is pretty cool. Depending on the food you let them eat, they come out different. Basic Tribbles self-heal on ground missions; I have a regen one, a damage-buff one, and a damage-resistance one.

Overall, the game is great. It still has work to do, but when that gets sorted out, it will probably be a pretty large MMO, even compared to WoW.

 
You know, there was a time when, if one card from nV did better than ATI, it was because the game was optimized for it. Well, now only nV does that. They don't do it specifically to hurt ATI, but to actually get better performance so they can sell cards.

ATI did it too, before AMD bought them, but we all know how AMD is with their optimizing/marketing.

jenny, Batman: AA was an amazing game no matter what. Sure, it was made better for nV, but who the hell cares? It plays great on mid-range hardware, and it's the God Damn Batman! In fact, it is probably the BEST superhero game out to date.

I can see a GTX 260 beating an HD 5850; it's not that hard to understand. The GTX 260 might have a lower shader count, but the two have the same TMU count, and GeForces have a much, MUCH faster shader clock, since Radeons are limited to their core clock for the shader clock.

Considering that the 5850 even falls behind the 4890 in some cases, despite having roughly 2x the TMUs and SPs, this suggests it's not an nV optimization, but simply that either the drivers for the HD 5K series are really immature or the architecture just isn't that great in this type of game. The 4890 barely outdoes the GTX 260 as well. ATI is ahead for now, but not always; in a lot of games, the 5870 still can't beat a GTX 285.
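To put rough numbers behind that argument, here's a quick back-of-the-envelope calculation using commonly cited specs for these cards (treat the figures as assumptions, not vendor-confirmed data). On paper the Radeons win big on shader throughput, but the texture fill rates are much closer, so a texture-bound engine could plausibly narrow the gap.

```python
# Theoretical throughput from commonly cited specs (assumed, not confirmed).
# GT200 shaders are usually rated at 3 flops/clock (MAD+MUL); RV770/Cypress at 2.
cards = {
    #                  SPs, shader MHz, flops/clk, TMUs, core MHz
    "GTX 260 (216SP)": (216, 1242, 3, 72, 576),
    "HD 4890":         (800,  850, 2, 40, 850),
    "HD 5850":         (1440, 725, 2, 72, 725),
}

for name, (sps, sclk, flops, tmus, cclk) in cards.items():
    gflops = sps * sclk * flops / 1000   # theoretical shader throughput
    gtexels = tmus * cclk / 1000         # theoretical texture fill rate
    print(f"{name}: ~{gflops:.0f} GFLOPS, ~{gtexels:.1f} GTexels/s")
```

That works out to roughly 805 GFLOPS and 41.5 GTexels/s for the GTX 260 against 2088 GFLOPS and 52.2 GTexels/s for the 5850: a huge shader gap, a modest texture gap.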

Maybe all those extra SP units don't have any effect, or maybe they will. One thing is for sure, though: you are making a big deal over nothing.
 

minijedimaster

Distinguished
I refuse to shell out $50-$60 for 30 days of play time without knowing whether I'll like the game. Once the entry price comes down, or more likely when they offer a free 14-day trial (read: no upfront purchase, like Eve Online), I will consider this game. Until then, I'm not paying that kind of cash for a possible 30-day rental.

And to Jenny... STFU already. No one cares except you. I came here to see how this game might play on my system specs. I applaud the author for giving us some useful info on how this game performs in the REAL WORLD on different PC specs.
 

cleeve

Illustrious
[citation][nom]minijedimaster[/nom]Once the entry price comes down, or more likely when they offer a free 14-day trial (read: no upfront purchase, like Eve Online), I will consider this game.[/citation]

I believe there's already a free five- or ten-day trial with no upfront purchase; you might want to look into that.
 

cleeve

Illustrious
[citation][nom]kettu[/nom]Why don't you care if a company is meddling with benchmarks? If I were a hardware reviewer I'd want to make sure my testing methodology was sound. [/citation]

My methodology is absolutely sound, and the results are valid. What you see is what you get.

But 'meddling with benchmarks' is a pretty wide-open statement. Both AMD and Nvidia have been sponsoring game titles since the dawn of graphics cards, lending expertise to projects and doubtless getting an inside track on how to optimize. I don't think I need to rehash that every time I run a game benchmark. It's part of the business - not something I personally love, but this is a capitalist country, and if companies want to form strategic relationships, it's really not my place to say they can't or shouldn't.

This is the status quo. Frankly, I wouldn't boycott a game I'm interested in just because it's been sponsored by either AMD or Nvidia: if it's a good game, they deserve some kudos, and if it sucks, they deserve to be told that, too.

Once again, all of this is irrelevant to someone who wants the best Star Trek Online experience, and that's who this article is for.
 
jimmysmitty

[citation][nom]Cleeve[/nom]My methodology is absolutely sound, and the results are valid. What you see is what you get. But 'meddling with benchmarks' is a pretty wide-open statement. Both AMD and Nvidia have been sponsoring game titles since the dawn of graphics cards, lending expertise to projects and doubtless getting an inside track on how to optimize. I don't think I need to rehash that every time I run a game benchmark. It's part of the business - not something I personally love, but this is a capitalist country, and if companies want to form strategic relationships, it's really not my place to say they can't or shouldn't. This is the status quo. Frankly, I wouldn't boycott a game I'm interested in just because it's been sponsored by either AMD or Nvidia: if it's a good game, they deserve some kudos, and if it sucks, they deserve to be told that, too. Once again, all of this is irrelevant to someone who wants the best Star Trek Online experience, and that's who this article is for.[/citation]

Actually, AMD doesn't. ATI did for a while; HL2 was the first game to feature it.

And I'm not sure how it could even be used to "hinder" the competition, since it's not a compiler. It's nV helping them optimize the game for nV cards.
 

graill

Distinguished
Really, really poorly made game. A stock GX2 is more than enough to run this game on max graphics; I really don't see a reason for the horsepower comments.

The bottom line is this: the game was released as-is, meaning in the state Cryptic got it from CBS when the other company ran dry.

You definitely get what you pay for. I made it to RA in 12 days - that's 4 hours a day, doing everything, reading everything. The JJ Abrams universe simply blows. In this universe there are no playable races other than the Feds, ships are heavily restricted movement-wise in space, and both ground and space exploration are restricted to cubicles and the letters O and I, meaning you move in a circle or you move in a straight line. Very ignorantly done.

PvP is a joke, in the flavor of LOTR. You get what you pay for; that's the only warning you get. I cancelled before they could charge my card. In three years, when they finish the game, I will see what they have. Be warned.
 

reasonablevoice

Distinguished
TWIMTBP strikes again. Anyway, Nvidia has ALREADY lost this round of the graphics game; they haven't had a product out in time for incorporation into any OEM designs, and many consumers have already purchased ATI solutions.

Fermi is going to be hot, expensive, and a paper launch. Nvidia has already given up on it and is focusing on Fermi 2.
 

cleeve

Illustrious
[citation][nom]jimmysmitty[/nom]Actually, AMD doesn't.[/citation]

Actually, they do. AMD has told me directly that they have sponsored a number of DX11 games. If memory serves, AvP and DiRT 2 are among them.
 
jimmysmitty

[citation][nom]graill[/nom]Really, really poorly made game. A stock GX2 is more than enough to run this game on max graphics; I really don't see a reason for the horsepower comments. The bottom line is this: the game was released as-is, meaning in the state Cryptic got it from CBS when the other company ran dry. You definitely get what you pay for. I made it to RA in 12 days - that's 4 hours a day, doing everything, reading everything. The JJ Abrams universe simply blows. In this universe there are no playable races other than the Feds, ships are heavily restricted movement-wise in space, and both ground and space exploration are restricted to cubicles and the letters O and I, meaning you move in a circle or you move in a straight line. Very ignorantly done. PvP is a joke, in the flavor of LOTR. You get what you pay for; that's the only warning you get. I cancelled before they could charge my card. In three years, when they finish the game, I will see what they have. Be warned.[/citation]

This isn't the JJ Abrams universe... this is the Prime ST universe that Spock left, continuing on.

[citation][nom]reasonablevoice[/nom]TWIMTBP strikes again. Anyway, Nvidia has ALREADY lost this round of the graphics game; they haven't had a product out in time for incorporation into any OEM designs, and many consumers have already purchased ATI solutions. Fermi is going to be hot, expensive, and a paper launch. Nvidia has already given up on it and is focusing on Fermi 2.[/citation]

DX11 probably won't be viable until next year anyway.

[citation][nom]Cleeve[/nom]Actually, they do. AMD has told me directly that they have sponsored a number of DX11 games. If memory serves, AvP and DiRT 2 are among them.[/citation]

Ahhh, so they have the only DX11 GPU and decided to get back into it, eh? Well, I guess you won't see people complaining when an HD 4650 beats a GTX 285 in an AMD-optimized game...
 

kettu

Distinguished


You missed the point by narrowly focusing on the first two sentences of my post.
 

quizzical

Distinguished
My, what absurd timing. One might guess that something is amiss from a GeForce GTX 260 beating a Radeon HD 5850. Having played Champions Online (same company, basically the same game engine) for a while, I can tell you what's amiss.

Basically, the game engine is getting some big changes at the moment. Performance today isn't indicative of performance a month ago and probably won't be indicative of performance a month from now. This isn't just routine updating of a game; it's the sort of thing that even most MMORPGs don't do after launch.

What happened in Champions Online is that the zone Lemuria somehow became unplayable. People who had a character in that zone and tried to log in would quickly be disconnected. They couldn't even get the character out of the zone to go play elsewhere in the game. The rest of the game worked basically just fine.

So Cryptic basically panicked. They tried to fix the problem to make the zone playable, or at least to let players get their characters out. They pushed a build ("Kitchen Sink") live very shortly after putting it on the public test server, ignoring a lot of other problems the build created. It's debatable whether the problems the patch introduced (e.g., sporadic instance crashes, erratic and bugged mob difficulty, anti-aliasing completely disabled on ATI cards, etc.) are game-breaking, but the "can't log in" problem from before the patch surely was game-breaking for people who had made it to Lemuria.

The player base logged in at a given time has fallen by about half since the Kitchen Sink patch, after being fairly stable for months before that, as measured by the number of public instances open at a given time. Presumably some players who find the changes game-breaking won't play until they're fixed, and some will play anyway.

So what this article does is take a game engine in a temporarily broken state, benchmark it using old drivers from before the changes started, and try to declare the performance indicative of something or other. I honestly can't think of any other time I've played a game for which benchmarking various hardware would be less meaningful. I could probably come up with some examples if I played a lot of early betas, but I don't - and the reasons why beta versions of games shouldn't be used for benchmarking apply here more strongly than they do to most betas.

What seems particularly odd is that a lot of players with GeForce cards on the 195 drivers (which this review uses) couldn't get the game playable at all under those drivers. I don't just mean that a G210 can't run the game; I mean players with a GeForce GTX 295 were getting frame rates in the single digits. Turn all settings to minimum and it was marginally playable. The 196 drivers apparently improved performance greatly for a lot of GeForce users, while others had been rolling back to the 190 drivers or earlier.

So is this an instance of a Radeon HD 4650 beating a GeForce GTX 295? No, it's a temporary glitch, and irreproducible. Just like the benchmarks in this article.
 

quizzical

Distinguished
It's unclear where your frame rate numbers come from. The game engine has at least two different ways to give players frame rate information, and they often contradict each other. When that happens, /fpsgraph is right and /showfps is wrong. It really is about that simple.

Going from one area to another or tweaking settings can often qualitatively make a huge difference in frame rates. /fpsgraph picks this up quite well, and the data it outputs corresponds pretty flawlessly with the intuitive feel of how good the frame rate seems. Sometimes /showfps picks up the changes and sometimes it doesn't. That makes /fpsgraph by far the superior tool for a player wanting to monitor his frame rates as he tweaks settings, while /showfps is nearly worthless for this purpose. I'm not sure if /showfps is just bugged, or if it's tracking something that doesn't correlate very well with how the frame rate "feels".

The trouble with /fpsgraph, from a benchmarking view, is that it doesn't give numerical output. It draws bars showing how long each frame took to render, and they slide by and off the graph within seconds. One could probably get meaningful numbers out of it by taking lots of screenshots and coding a program to count the blue and purple pixels in the bottom-right corner of the screen, but it would be a major pain, and the article gives no indication of having done this.
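As a sketch of how that screenshot approach might look (assuming the Pillow imaging library; the crop region, bar colors, and color threshold are guesses at how /fpsgraph draws its bars, not values from the game):

```python
# Rough sketch of the screenshot idea above: crop the assumed /fpsgraph
# region from each screenshot and count bar-colored pixels as a proxy for
# frame time. All the constants here are hypothetical.
import glob
from PIL import Image

GRAPH_REGION = (1720, 1000, 1920, 1200)  # assumed bottom-right corner at 1920x1200

def looks_like_bar(pixel):
    r, g, b = pixel
    return b > 150 and g < 100  # crude test for blue/purple bar pixels

for path in sorted(glob.glob("screenshots/*.png")):
    graph = Image.open(path).convert("RGB").crop(GRAPH_REGION)
    bar_pixels = sum(1 for p in graph.getdata() if looks_like_bar(p))
    print(f"{path}: {bar_pixels} bar pixels")  # more bar pixels = longer frame times
```

Even then you'd only get relative frame times, and you'd need to calibrate the pixel count against a known frame rate, which is exactly why it would be such a pain.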

It's more likely that the article used /showfps, in which case the numbers presented are flatly wrong, and indicative of nothing but the idiosyncrasies of a peculiar command. The fact that there's virtually no difference between the reported average and minimum frame rates also points toward /showfps, as that command tends not to pick up such swings, even when changes in frame rate are glaringly obvious to a player.

-----

Also, trying to pick out "maximum" settings can be awkward. There is a main video-settings slider that adjusts a lot of settings at once, but it doesn't affect some of the individual settings; for example, the slider doesn't touch anti-aliasing. Presumably the author of the article did put that slider at the rightmost position, which the game describes as "maximum," but that leaves it unclear what other settings were used.

But turning all settings to maximum is just a silly thing to do, no matter what hardware you have. The difference in image quality between maximum and one notch to the left (what shows for me as "recommended," though the recommendation might be customized to the player's hardware) is basically zilch, while the difference in frame rate is quite considerable. Moving two or three notches to the left rather than just one leads to glaringly obvious differences in image quality (compared to each other, or to recommended/maximum), so the slider does work.
 

cleeve

Illustrious
[citation][nom]Ehsan W[/nom]"Once again, we're seeing the GeForces stand up well to the Radeon competition." Ehm, WTF. This is clearly tweaked for Nvidia.[/citation]

Once again in the STO benchmarks, Ehsan, not in general.

Way to keep an eye out for those conspiracies, tho. :D
 