Until a couple of years ago I'd never much liked the PC as a
games platform, or as a work platform either (my main system is
an SGI) though I do now use a PC for video encoding. For gaming I
bought a PS2 and enjoyed the games a lot, especially the later
releases such as Black which had surprisingly nice visuals. I was
looking forward to the PS3, with titles like, "Resistance: Fall
of Man", GTA IV, etc. I like 1st-person shooters such as
Mercenaries, Call of Duty, etc., and highly explorative/flexible
'open' games such as "Drakan: The Ancients' Gates", Summoner 2,
the GTA series, etc. There were plenty of games of this type on
the PS2, and I certainly reveled in never having to worry about OS
nonsense, driver bugs, virus hell, etc.; game crashes/freezes were
very rare.
But in the end, once the PS3 launched, I was not impressed by the
high pricing (450 UKP in the UK, ie. $900) and the high cost of
the games (50 UKP was typical, ie. $100), so I didn't buy one.
I already had a Xeon-based Dell Precision 650 I was using for general
work stuff (an AGP system) which didn't actually cost me anything
(bought two, sold one at a profit that covered both) so I decided
after reading lots of online articles on Tom's Hardware, AnandTech, etc.
to replace the existing Quadro4 900XGL with a modern gfx card,
namely an X1950 Pro AGP (this was in Dec/2006). I was very
impressed with the results and was delighted at being able to
easily play exactly the kind of game I really like (I bought
Oblivion and Stalker). A few months later I sold the Dell and
used the money to more than cover the cost of a new set of base
parts, keeping the X1950 card and the disks (4 x 147GB 15K U320
SCSI). The mbd was cheap (Asrock AM2/DDR2-800 AGP, only 35 UKP),
the CPU purchase was perfectly timed (Athlon64 X2 6000+, 156 UKP,
the very week the price was halved by AMD so it was massively
cheaper than an equivalent Core2Duo), RAM was much cheaper by
then (I bought 4GB), got a nice case (Centurion Plus 534),
reasonable PSU, and the results were fantastic. The faster RAM
(DDR2/800 instead of DDR266) enabled framerates as much as 6X
better than with the same gfx card used in the Dell. I ended up
getting better results than online review sites were seeing with
the PCIe version of the X1950.
:D Indeed, for an Athlon64 X2
system with an X1950 card, I had the no. 6 spot on 3DMark06 (5583).
Finally this year for my bday (so the cost to me was zero), I
switched mbds again to an ASUS M2N32 WS Pro (so I could have
proper PCI-X and PCIe SCSI RAID) and replaced the gfx with
a GF8800GT PCIe (gf bought me CoD4 for xmas, still not yet used).
Now the games run with all detail settings maxed-out at 2048x1536
(using no AA but 16x AF). I'm really pleased with the results,
and I have CoD4 sitting unopened for when I feel I've made enough
progress with the games I already have.
Do I miss having a PS3? No. It would be nice to play GTA IV,
but I don't have time to cope with more than 2 games at any
one time and it'll be ages before I'm bored with Oblivion and
Stalker. Heck, I still have PS2 games I haven't finished, and
a couple I've not even started (my brother bought me several
way back).
In the past I was never interested in PCs as a gaming platform.
Friends of mine who used PCs always seemed to be spending time
reinstalling Win98, fighting virus woes, constantly upgrading,
etc. But things have changed; XP is a good platform for gaming,
and by not always getting the very latest gfx/CPU every time
something new comes out, one can stay nicely up to date without
breaking the bank (I'd say upgrade once every 18 months at most,
aim for the 100 to 150 UKP price point for the gfx). In my case,
through a bit of luck, I've been able to maintain a good system
at no net cost at all, whereas if I'd bought a PS3 for 450 UKP
I'd be pretty peeved at the degree of devaluation and changes
in disk capacity, etc.
My 1st console was actually an N64 (I had the 1st-ever web site
on the Ultra64, as it was called before launch), then I bought a
PS2. But now, for all the arguments that rage about APIs,
multiplatform availability of certain games, etc., I find it hard
to ignore the simple cost difference: PC games cost half as much
as PS3 games, I'm able to run the games at mad levels of detail
(twice the vertical line res of many PS2 games), and messing
around with the whole hw overclocking business is certainly a lot
of fun (783MHz core with the GF8800GT). In that sense, the PC is
actually more than just a gaming system: I'm pretty sure the kind
of people who like to play games on good-spec PCs at high detail
are very much the same people who are into overclocking, etc.
It's a natural match. I'll probably buy a
PS3 at some point purely for its HD playback capability (far
cheaper than any dedicated player), but not until I have a decent
HD TV, and there's no point in buying an HD TV until the display
technology has matured (I'm waiting for OLED systems) and there
are plenty of HD channels worth watching with suitable content.
One other thing that persuades me of the continuing life of
the PC: the mod scene. I was so impressed by the plethora of
extensions available for Oblivion and Stalker. One could never
run out of new things to try, new maps, etc. But with the
consoles, the use of mods is very restricted and it's difficult
to sort out patch fixes for a particular game; the online
services for the consoles are nowhere near as open as I would
like, and in some cases are not even free.
I know someone who bought the GoTY edition of Oblivion for his
PS3 and although he enjoyed it a lot, he kept finding quirks and
bugs, some quite bad (eg. an entire side mission not being
available), which he was unable to fix because there was no way
to download patches. In the end he sold the GoTY version and
replaced it with the standard edition, which did not have most
of those bugs. Indeed, it seems these days that many
console games are nowhere near as rigorously tested as they used
to be.
It's hard to say where the API 'war' is heading; the hardware
changes so fast. A single technical revolution could change
everything, eg. a GPU design using memristors and qubits that
has true volumetric effects which currently do not exist at
all in any genuinely native form (fire, water, mud, smoke, snow,
ice, rain, etc.). Something like that would permit entirely new
kinds of games to be written.
It's certainly a pity OGL's relevance to modern hw has slipped,
but it's no surprise given the infighting between consumer companies
and those supplying technical markets. Car companies have a lot
of influence anyway, so no wonder issues around CAD applications
have thrown the odd spanner in the works. SGI's legacy came out
of supplying visual simulation markets, which explains much of
OGL's early design (see my SGI site for details).
But all this makes me wonder about a key aspect of consoles
that is definitely a good thing: they make far better use of
their gfx hardware than PCs do.
Given the generation of GPU in the PS3, the realism in the
latest games is very impressive. On paper, the latest PC gfx
cards should be _massively_ faster, so why aren't the latest
PC games correspondingly more visually realistic?
Much is made of Crysis, used everywhere as a benchmark because
it taxes gfx cards so severely. Yet how do we know the fps
results with Crysis aren't lower purely because it's not written
very efficiently? Who has ever measured just what Crysis is
doing in terms of triangles/second, how much it's getting out
of the cards? I did try asking someone I used to know at NVIDIA
about this (Ujesh Desai), but he didn't answer; too busy, I guess
(when I was doing stuff for SGI he was the guy looking after the
O2 system I borrowed). Still, I do wish some site would look into
whether games are using gfx hw efficiently. A senior 3D developer
at Virtual Presence once told me that few PC apps ever get more
than about a third of peak performance out of any PC gfx card
before the technology has moved on to the next product. There's
never enough time to evolve the drivers to a decent degree. By
contrast, consoles traditionally end up squeezing every ounce of
possible speed out of the hw, though MS seems to be breaking this
somewhat by turning around products much faster than in the past,
ie. the Xbox 360 won't have been out that long before the next MS
console is released.
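Just to illustrate the kind of measurement I mean (a rough sketch
of my own in C against OpenGL, not anything Crytek or NVIDIA
actually do, and the function names are invented for illustration),
an engine could total up the triangles it submits each frame and
divide by elapsed time:

    /* Rough sketch only: wrap the draw calls, total the triangles
       submitted, and print an approximate triangles/sec once a second.
       Assumes a GL context already exists; wrapper names are made up. */
    #include <stdio.h>
    #include <time.h>
    #include <GL/gl.h>

    static long g_tris = 0;   /* triangles submitted since last report */

    /* Hypothetical wrapper used instead of calling glDrawElements directly. */
    static void draw_indexed_tris(GLsizei index_count, const GLuint *indices)
    {
        g_tris += index_count / 3;   /* three indices per triangle */
        glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_INT, indices);
    }

    /* Call once per frame to print a running triangles/sec estimate. */
    static void report_throughput(void)
    {
        static time_t last = 0;
        time_t now = time(NULL);
        if (last == 0) last = now;
        if (now - last >= 1) {
            printf("approx %ld triangles/sec\n", g_tris / (long)(now - last));
            g_tris = 0;
            last = now;
        }
    }

Comparing a figure like that against a card's quoted peak would at
least give a feel for how much of the hw a given game really uses,
which is exactly the sort of data I wish a review site would publish.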
No doubt gfx cards will get ever faster, but to me it feels like
game developers are getting lazy, or at least that's my suspicion. And
meanwhile, for all that games continue to look ever nicer with
more visual effects, I still so strongly wish for a revolution
in _how_ games work rather than just visual realism, eg. properly
modifiable environments in the way Red Faction tried to do (a
grenade should blow up stuff no matter where one is, not just
where the game world 'allows' things to be damaged), a more
consistent continuous game world rather than simple 'spheres of
existence', more fluid AIs instead of the typical "every entity
is either friend or foe", and so on. Crysis makes some progress
in a few of these areas, but is still very lacking. My fear is
that with so much focus on 3D speed and realism, we may end up
with a gaming slump when people decide that the gameplay itself
is not that good even though the game looks incredible. I even heard
a comment recently on a TV gaming show; the presenter said of
the latest racer, "Yeah, yeah, we're all bored to death with
ultra-realistic driving sims. What's new??" Such an irony, given
that only a couple of years earlier all the focus for driving
games was on ever-improving realism.
See this old article for a good discussion of these issues:
http://www.sgidepot.co.uk/reflections.txt
At the end of the day though, all this effort is designed to make
money out of us consumers, so don't expect every change to be in
our interests. But as long as I can play games like Oblivion and
Stalker, I'll be happy.
Sorry for the long rant, just hope some of you find it interesting.
Ian.
PS. Enhanced gaming experience: watch Band of Brothers, *then*
play Call of Duty. Hot damn.... 8)
---
mapesdhs@yahoo.com