Games, graphics, and the future of them...


Aug 5, 2009
honestly i would love to upgrade my graphics card, and get into playing pc games again.

the card i currently have is what almost every game has as its minimum, or at least close to it.

i want to run new games at 1920x1200, and at that resolution, i found out you would never need aa.

from what i can tell, the card i'm looking at, the 5770, is capable of playing anything i would throw at it at 1920x1200 at a playable frame rate.

but i have one major question. well, 2.

are there any games in the foreseeable future that will... well... have great graphics and be more than a glorified tech demo?

for an example, look at crysis. that came out and couldn't be played at its max settings, and if i remember right, can barely be played now at those settings. all in all, crysis was more or less an average fps that no one would still be talking about if it wasn't for those graphics. games like that are excluded.

however, let's say diablo was just screwing with us, and the graphics turned out to be uber realistic. that would be a game to mention, because it's more or less the best game of its kind.

i'm sorry if what i said was confusing. basically, i want to know if there is going to be any game releasing within the next 2 years that will blow the 5770 away at 1920x1200 res with no aa or anything like that.

i'm asking because i can wait until the holidays and get a better psu and a better card.

and also, are there any dx11 games coming out that won't support dx9? i don't really want to move over to windows 7 until some things happen.

I personally do not find this true. Oftentimes I find the jaggies really stand out, even at 1920x1200.

No one can say for sure what the future will bring, but there is a good chance a 5770 will not be very viable in 2 years. At 1920x1200, it already does not play all games at max settings, but if you are ok with medium and occasionally low settings, then it has a decent chance.

It's hard to say what will happen in the future. It's possible that in a couple of years a few games won't support DX9, but I find it unlikely. What is more likely is that they won't be optimized for DX9, or will miss quite a few visuals in the DX9 version.


Jun 16, 2006
The 5770 should be okay until (and a lot of people are going to hate me for saying this) the next generation of consoles comes out. A large chunk of PC games are also put out on the consoles, so their system requirements are linked in a way.

I say DX11 games with tessellation enabled will make a 5770/5750 useless at that resolution.

I use as much AA/AF as my cards can handle to give me the best gaming experience I can get.

Game demands are going to get tougher and tougher on graphics cards. Not all games, but remember what happened when Crysis came along... next it'll be 4 or 5 games that nobody will be able to play with decent settings... unfortunately, game developers are hooked on this stupid multi-platform crap, which really slows this down, or more people would be hurting right now. This comes down to particle physics to an extent... smoke/dust/dirt/clouds/explosions/foliage... etc.

The mighty 8800GTX... great card, and some people still think they're the greatest thing going, but let's not deny that they too have trouble keeping up with some of this stuff. If you say they don't... do you read sign language?

I don't want to be the guy playing DX11 games in DX10/DX9 because my card can't handle it... I want to see the great advances that DX11 has to offer.


Aug 5, 2009

ok, let me be clear on what i want for gaming.

to me, 20fps is very playable for single player. if it's an fps, i try to get as close to 40+ as i can in single player, but in multiplayer i turn everything off so the game runs as fast as it can. the last thing you ever want is to be killed by your hardware slowing you down.

and also, i'm looking at the benchmarks: fear 2, fallout 3, far cry 2, left 4 dead 2. all are very playable even at the highest settings at that resolution, and far cry 2, the worst of the 4 fps-wise, still gets over 30fps.

i don't mean to sound like i'm saying you are wrong, but i believe we both have different ideas of playable.

from what you said, there are no known games right now that will kill a 5770, at least in the next few years. and i also see that no games are confirmed dx11 only.

that's more or less true, but even then, i don't think games are going to graphically evolve much. i'm guessing that the next console hardware push will be true 1080p at 60fps.

but beyond that, it's not even the consoles that are holding pc games back. it's the fact that making something believable is always better than making something look real.

it's a concept i learned in the early days of gaming: when you couldn't have physics because of practical issues, you had to make things believable. they weren't real, but they looked real.

on the consumer end, will they care if a cloud of smoke isn't fully interactable? like if you run through it, the smoke won't be pulled along with you for a bit? sure it will look pretty, but that alienates so many people without a rig that can push that kind of graphics and physics.

i'm thinking that even with the next console bump, you won't see games push anything until quad core cpus become the standard. not dual or single core, but when you think new cpu, the WORST you can come up with is a quad core. because look at crysis: graphics for now don't need to be better than that. the physics, however, is where games could greatly improve, making everything more believable.

i'm looking at battleforge for this, so there is a good chance i'm overall wrong. but the difference between dx10 and dx11 is 10fps: dx11, doing the same stuff dx10 does, pushes the game 10fps faster.

aa and af: i don't know what af does exactly, but aa smooths out the jaggies, which used to be such a big part of games for so long. with high resolution gaming possible now, without the most beefy of beefy cards, the jaggies that remain are all but nonexistent. and so you know, i'm basing this off of older games, like kotor, and off of torchlight, which my current geforce 6800 can handle at 1920x1200. the difference between 1024x768 and 1920x1200 is a complete wow.

and also, didn't crysis sell badly because of the stigma that the game is unplayable unless you have such an overkill computer? not many companies want that. they may make the game look nice on the best of the best, but they're taking a more blizzard and valve approach to it, making the games playable on even lesser computers.

i'm sorry if i got a little off topic with this. see, i don't mind upgrading my graphics card in 2-3 years, i just want my money's worth and don't want to have to replace it sooner than that.

I said that it won't play all games at max now, I didn't say it wouldn't play any games at max now. The games you listed are typically among the highest-scoring at max settings of the commonly benched games.

I also don't consider 30 fps playable for a 1st person view game, but if you can live with 20, more power to you. I believe 40+ is playable. Below 40 and I get a bit of motion sickness.

This review shows a few more games, including Crysis, the most difficult to run (they did choose high, instead of very high settings to make it more reasonable):,2446.html
I don't mean to be argumentative with this last thought. I just thought I'd pass on something I learned over the years that you might notice yourself, but haven't really realized.

I also used to be happy with FPS in the high 20's and low 30's. It seems smooth, it looks pretty good. However, whenever I got into a game, especially 1st person view games, I'd always feel a bit queasy after a few minutes, and the longer I played, the stronger this got.

I'd usually take a few breaks, and eventually, if I was determined enough, I could withstand that queasiness. Eventually I started using better hardware, or turning down graphics settings, and have come to realize that playing at over 40 fps just feels a ton better on the body and mind. Now I can't stand anything below 40 fps.

I'm not saying you experience this, but many people I know do. When you're used to lower end hardware, you learn to live with it, but if you are in a position to afford hardware that will prevent this, it is worth trying to stay over 40 fps and worth some extra $'s.


Aug 5, 2009
i do, and i get it worse than you. back in the 2d console era, i never got sick or had a headache, but then i got an n64. and holy crap, some games weren't bad, but some were murder on me. headaches that were borderline migraines, feeling like i would throw up, though that may have been a side effect of the bad headaches. and i forget the game, but one time i didn't want to stop playing a game i got, and apparently i passed out... don't know why. i think it was from the pain, because i have passed out that way other times, not video game related, but it may have been something else.

my body can't handle games often, or i should say, the engines. old video game engines, back when 3d was new, looked like crap and ran even worse. combine that with a crt monitor... i don't really need to explain any more than that, do i?

at least on the consoles, this happened a lot. but the saving grace is my body also builds an immunity, kind of. with every new graphics engine on the consoles in the n64 era, and the early era after it, i had to rebuild an immunity. but one console was the worst: the ps1.

the mix of jaggies and the overall crap look of the games. a typical n64 game took 5-6 hours to build an immunity for, but a playstation game, closer to 10-15 hours. and unlike with the n64, if i didn't play the games for a long time, i'd lose the immunity i built up.

now this brings me to the pc, and how for 7 years i had no idea what a refresh rate was, and that my crt was set to 60hz.

6 hours staring at that and i had a headache, which i built an immunity to, and soon i got to the point where 12 hours did nothing to me but annoy me.

then i learned i could set the refresh rate higher, and all this pain in my head i had blocked out just went away.

and then i went to an lcd, something i never wanted to do until they got real blacks. holy ***, i haven't had a headache or eye strain in almost a full month.

games on the pc never gave me any problem though, even the worst of the worst looking games, so long as it wasn't something that ANYONE would get a headache over. and this is coming from someone who played everquest luclin on 300mb of ram, a pos graphics card, and a 333mhz p2, getting less than 5 frames a second.

more or less, for me, the fps on the pc doesn't mean crap, it's the monitor's refresh rate. i'd like to have a 120hz monitor, but that isn't happening anytime soon.

i may have went on a little long there...
I remember that issue too, but it's different from the low fps issue. I think I needed a 75hz refresh rate to not have eye strain.

The low FPS issue is more of a motion sickness thing and mostly only an issue if you are in first person and controlling your aim with the mouse. For me, that requires 40+ fps.

It might not be a problem for you, but I just thought I'd bring it up, because a lot of people just push through it and eventually forget about it. I used to, but if you are in position to always have over 40 fps, it does feel a lot better.


Aug 5, 2009
lol, ok i see.

i used to play doom when i got my first computer.

4 of my 5 friends got motion sickness from the game. i personally never understood why.

i can't wait until 3d tech takes over. most of the people who still talk to me get motion sickness from that too, but i love the feeling, like you are falling. when i get something capable of it, i will set my monitor to a looping 3d falling scene, lay on my bed, and look down or up at it and ride that feeling, but for now that is too expensive.

but low frame rates never got to me in any way other than being annoying. like in goldeneye, playing 4 person splitscreen where all we have are explosives: that crap couldn't have been more than 1fps most of the time, but we played it for hours.
