[citation][nom]jcb82[/nom]Thats what I love about PC gaming, zillions of ways to tweak and get the most out of your unique hardware.[/citation]
Personally, I just wish every game worked right the moment I put it in.
I mean, feed the game what I have, then give me the options and show me a projected FPS.
Do I want the best graphics?
Do I want graphics better than my card can really handle at speed?
Do I want the best it can do at 30 FPS?
Do I want 60 FPS?
Do I want the lowest settings so I can go really, really fast...
And I'm not really talking about dumbing the graphics interface down. What I'm talking about is that a game developer should have presets for each card... It can't be that much work; you more or less know how cards perform, so make the game know what would look best at certain frame rates...
I hate how, every time I'm given the choice, my graphics settings get defaulted to mid range, when I have yet to play a game that I couldn't max on my 5770 at 1920x1200 with shadows turned off and no AA. (I don't have Battlefield 3, but that's one exception, and if you don't use a ridiculously far draw distance in Skyrim, you can basically max that too.)
I mean, look at the Tom's Hierarchy chart. I had a 6800 Ultra and Oblivion played like crap on it, so let's assume that's the minimum.
That would be about 17 graphics levels pre-tweaked for your gaming experience, and the developers know better than anyone what their engine can do, so you don't spend an hour or so figuring out the best settings you can play at while still getting decent frame rates.
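Something like this rough sketch, just to show the idea (the card names, tier numbers, preset names, and FPS figures are all made up for illustration, not taken from any real engine or the actual hierarchy chart):

```python
# Made-up sketch: the game ships a hierarchy-style tier table plus pre-tweaked
# presets, detects the card, and shows a suggested preset with a projected FPS.
# Every card, tier, preset, and FPS number below is invented for illustration.

GPU_TIERS = {                      # card -> hierarchy tier (1 = the assumed 6800 Ultra minimum)
    "GeForce 6800 Ultra": 1,
    "Radeon HD 5770": 9,
    "GeForce GTX 580": 17,
}

PRESETS = [                        # (minimum tier, preset name, projected FPS at that tier)
    (1,  "Lowest",                    30),
    (5,  "Medium",                    40),
    (9,  "High (shadows off, no AA)", 60),
    (13, "Very High",                 60),
    (17, "Ultra",                     60),
]

def suggest_preset(card: str):
    """Return the highest pre-tweaked preset the detected card's tier qualifies for."""
    tier = GPU_TIERS.get(card, 1)          # unknown card: assume the minimum tier
    eligible = [p for p in PRESETS if p[0] <= tier]
    _, name, fps = max(eligible, key=lambda p: p[0])
    return name, fps

if __name__ == "__main__":
    name, fps = suggest_preset("Radeon HD 5770")
    print(f"Suggested preset: {name}, projected ~{fps} FPS")   # e.g. High, ~60 FPS
```

The options screen could still let you override everything afterward; the table just saves you the initial hour of guessing.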
Games not working 100% of the time and the initial tweaking are probably the only two things I don't like about gaming on the PC.