Discussion: First GPU That Made the Difference?

Status
Not open for further replies.

jnjnilson6

My first GPU was a GeForce2 MX 400 (64 MB memory). However, it was not the first GPU that made a big difference for me.

Back then, the first GPU I acquired that really scaled things up within the vigorous spectrum of graphics and computing was the Sapphire Radeon HD 6770 1 GB GDDR5 (775 MHz core / 1000 MHz memory). Before that I had had a GeForce GT 555M 1 GB GDDR5, which was blighted by frame-skipping and an intermittent subdued-framerate issue, and before that came the Mobility Radeon HD 4530 with 512 MB DDR3. There were many GPUs before those, as well as many after. After the Radeon HD 6770 I got a Sapphire Radeon HD 7870, then another one, and ran them in CrossFire.

But the giant step forward was the 6770. That was the first time I felt true power underneath my fingertips. What was it for you? What was the GPU that, when you got it, made you feel masterful, the one before which even games like Crysis saw their perky requirements capitulate in submission...

Originally I ran the 6770 on a Celeron G530, then on a Core i7-3770K. In games like Crysis the CPU made a huge difference when hosting Multiplayer servers. On the Celeron, the framerate dropped considerably after the fifth player joined in; the Core i7-3770K, however, maintained the same high framerate even with 32 players in.

So what was it? The GPU that made you feel, for the first time, the glamorous tinges within the spheres in which you were indulged? The glitter of framerates and the assuredness...
 
A little earlier for me, but the Voodoo 2 opened up an entire genre of true 3D games. Quake and Unreal in Glide were pretty impressive back then compared to software rendering. Though most of the games I played then just ran on my 1 MB 2D card.

The Voodoo 3 was a significant boost over my Voodoo 2 SLI setup and got rid of the need for three GPUs. But I wanted it so I could keep running Glide titles.
I had a Voodoo 3 for a while, till around 2001 or so. Multiple ones, even, in different systems.

I also moved to an MX 400; I think Half-Life and its mods were the driving force to upgrade there. I had a few of the lower/mid-range FX cards; the 5200 and 5600 come to mind. I don't think I ever had one of the original Ti cards, but a lot of my friends did.

A GeForce 6600 after that. Not for anything in particular, just to keep things up to date.

My first big GPU was an 8800 GTS, bought solely for DX10 support in a few upcoming titles. Crysis did indeed run.
I've pretty much had 80-class Nvidia cards since, so I haven't recaptured that amazing graphics improvement since.
 
I had had a GeForce 6200 TurboCache 256 MB and a P4 520 (2.8 GHz w/ HT), and Crysis was a slideshow. The Radeon 4530 was OK on Low in Single Player, not so much in Multiplayer. And Medium on that card felt like a privilege (if 17-20 FPS satisfied you in tough scenarios). The 4530 did run Crysis 2, but barely: at 800x600 on Low you were happy to get 20-22 FPS.

All games brightened up significantly with the 3770K and the 6770. I had friends at the time running an AMD Phenom II X6 1090T with an HD 5850, a Core i5-2400 with an HD 6850, and a third-gen Core i5 with an HD 7770. Of course, my CPU was faster, but they were getting the higher framerates; that changed when I got the first 7870. Those were the days in which I played seriously and had fun over the sun-strewn forests in Crysis, battling with lasers over forgotten mesas, the light illuminating uncharted shards of polygons and tanks and VTOLs, intermittent chattering hashing out languorous sun-filled moods and exquisite delights in this aspect or that, passing like silent breaths before a corner was fragmentarily skirted and gunfire ignited. Too bad that with the death of GameSpy the Multiplayer died away in 2014 and was not revived in the Remastered series...
 
The Voodoo 2 12 MB was $299, about $550 adjusted for inflation. I inherited it, so I didn't quite have it in 1999 (Unreal Tournament was also 1999). It was pretty much a top-of-the-line system when my brother built it, but he moved on to a Matrox card after that.

I definitely picked up the Voodoo 3 after they were in the bargain bin.

All those in-between cards were bought quite cheaply.

I don't remember what I paid for the 6600. It was fairly reasonable; Google says $200.

The 8800 GTS 640 MB was $450.

It makes a lot of sense that the times I saw the most improvement were the most expensive.
 
I never really cared for Crysis, but I did try it out with the 8800 GTS. It must have been at 1280x1024 by then, and it would get a good 40 FPS if I recall, but that era of card really liked to run hot; it would push 100 C in that title. I never tried it on the 6600 at 1024x768; I'm sure it wouldn't have been great.
 
I remember when I had the 4530, cards like the 8800 series looked temptingly desirable. There was a video of a rig from 2007/2008 harboring four 8800 Ultra (I think) cards together, and it seemed monstrous.

I had a friend back in the day who originally had an i7-2600 and a GTX 560 Ti (I don't remember exactly whether it was the Ti or non-Ti, but I'll say Ti). He then moved to an i5-3570K and a Radeon HD 7990. The 7990 was supposed to be a glorified talisman of a card in its day. It should, theoretically at least, be faster than my current 3050 Ti (though I do not use that one for gaming).
 
I think the first graphics card I purchased was the S3 ViRGE, but I have to say, the first time I was really wowed by graphics was with my original Amiga 500 and the game F/A-18 Interceptor. I don't think I even purchased a graphics card for my AST 486DX2-50, even when playing Doom II or Leisure Suit Larry. A few years later, I built my first system (PIII 800 MHz, 128 MB RAM, later upgraded to 256 MB). A few years after that, I heard of a new game coming out that even had my non-PC friends excited: an MMORPG called City of Heroes. Unfortunately, my computer was severely limited and not even up to the minimum system requirements. With what limited funds I had, I purchased an extra 512 MB RAM module and an ATI Radeon 9200. I later upgraded that system to an Athlon X2 4800+ and later still to an 8800 GTS AGP.

I was never really one to push the graphics or even purchase/play AAA games, so when it was time to upgrade my card, I wanted something that could do more than just play games, and that's when I went with an ATI All-In-Wonder X800 XL. That was probably the first time I could really push the graphics (wasn't there a Crysis demo mission you could download?). That's when I decided I'd never spend more than $300 on another graphics card. If I'm running through the jungle looking for or evading enemies, I don't need to see just how many leaves are on that branch or how many blades of grass there are.

Of course, now you can't even get a mid-range card for under $300...

-Wolf sends
 
Have you run games like Counter-Strike 1.6 on the 6600? If so, how did it run?
I remember when I shifted to building a desktop PC (that happened after the problematic GT 555M experience), I originally got a Celeron G530 machine and basically used its Intel HD 2000. (That was right before I got the HD 6770; the machine had 2 GB RAM, then upgraded to 4 GB, and after the 3770K came along the memory shifted to 16 GB in Q4 2012.)

So: CS 1.6 lagged with many players on the Intel HD 2000 and Celeron G530, which quite amazed me. No such thing on the 6770, of course. I used to think CS 1.6 would surrender completely to a fast Pentium 4 and a 5xxx or 6xxx card, yet the Intel HD 2000 built into a Sandy Bridge CPU from 2011 could not handle it very well with many players, so many years after the game's release.
 
I remember my first GPU was an RX 550 4 GB. It was the first GPU that allowed me to play 3D games at all. The difference between my Intel HD 500 graphics and the RX 550 was huge. I went from running Fortnite on the HD 500 at 18 FPS at 800x600 with 33% resolution scale (so really 264x198) to running at 1080p competitive settings at 100 FPS.
 
I personally bought my first video card in 1991. It was an ISA-based card that I installed in a Tandy 1000 TL. I wish I could remember the make/model.

What it taught me was that a discrete GPU would always be the better video option for gaming; before that installation, I had no idea what I was missing.
 
The RX 550 is roughly comparable to the Radeon HD 7850. I remember back in the day, after the Radeon HD 6000 series and the Nvidia 500 series, the Radeon HD 7000 series was viewed as godlike. The 7850 was every gamer's dream, presenting a vivid utopia of color, performance and saturation. Back then it cost around the 250 EUR mark, and that was considered expensive. We really thought those cards were quite expensive, but they did provide quite a good bang for the buck. Today we'd view them as almost 'cheap,' this said in all honesty.

It is funny how an HD 7850 from 2012 supports DX12 and is still quite a good card today, 12 years later, whilst a card from the year 2000 would not have held a candle in the gaming world of 2012.
 
Well, that says more about API standardization and support paths than anything else. There used to be a new DX version practically every year. That is still true, but they are minor revisions instead of make-or-break features.
When will there be DirectX 13? I know that DX12 Ultimate was released in 2021, but I guess I don't know what features would be interesting to add.
 

I don't think I was playing much Counter-Strike with my 6600, but maybe at a LAN party around then. I really didn't have any performance issues around that time. Likely an Athlon XP 1800+ at worst and a 2800+ at best.
 
I remember Crysis 1, Warhead and Wars used to run on DX9 and DX10. Windows XP had DX9, Windows Vista DX10, and Windows 7 DX11.

One capability of DirectX 10 in Crysis: the time of day could be seen changing on Multiplayer servers, first abounding with surreal light, then briskly shimmering through the mellowed-out shades and beauty of noon and afternoon, and finally partaking sparingly of the blue, lingering darkness of night.

The Windows XP days were magical. The Windows Vista and 7 days too, each in their own way. The latter systems were beautiful and brought along an ethereal effect to the mind that only lingers on those mellow afternoons in those bygone times like a hidden embrace or a fervid memory.
 
It really was the jump away from integrated graphics for me (everyone says that). I didn't have the money for a real desktop of my own until around 2019, as I was entering high school. I got an Inspiron 3670 with a 1050 2 GB included and played Titanfall 2 and Team Fortress on it to death. To put this into perspective, the laptop I had previously could not run Unreal Tournament 2004 on the lowest possible settings at a playable framerate. These were Nehalem/Sandy Bridge laptop chips we're talking about here.

I think that 1050 was the first time I played any game above 30 FPS that wasn't some indie or mobile game, because what I had before really was just that horrible. It was a budget card, so it definitely was not amazing, but it was the first time I could install something without having to ask myself whether it would crash and burn the second I opened it.

After that it was just incremental upgrades. I used a 1060 for about three years before upgrading to a used 1080 last year and building a completely new PC around it. I was hoping the 40 series would be a major improvement over the 1080 when I got my 4070, and to its credit it was roughly a 200% speed boost, but I think the whole RTX thing is overrated. Developers treat DLSS as a crutch to leave games unoptimized, which almost defeats the point. The 4070 did give me what I needed to push 144 Hz comfortably though, so I guess that counts? That's a new thing I've tried this year, and I'm loving it so far.
 
What CPU are you currently running? How much RAM are you harboring? :) It would be cool if you'd share.

The RTX 4070 is a hell of a mighty chip. You should be on the crest of the wave with it. Something like the Radeon HD 7950 back in the day, nearing the 'celestial heights' in hardware.
 
Right now, an i7-12700K and 64 GB DDR4. The old desktop was an i5-8400, which I upgraded to 16 GB. I think the laptop was a ThinkPad T510 with an i5-480M and 6 GB of DDR3.
 
That's a wonderful setup; you can hardly do better. Your choice of RAM is very cool too. A lot of RAM opens up innumerable doors: the more, the better. A very strong machine that could serve for many years as a high-end example and leave a large portion of current machines wanting. That CPU is stronger than strong too.
 
That's why I picked up a Voodoo 5 5500 PCI on eBay over a decade ago. I can still fire up Windows 98/2000/XP and DOS and run native Glide titles.

I have been meaning to give PCem a try; it's supposedly good enough to fully emulate Glide.
Wasn't Glide a precursor to DirectX? I keep thinking it was for some reason. I know I've asked this question before, but it felt so long ago, I don't remember.
 
No, Glide was proprietary to 3dfx. OpenGL was an alternative standard that quickly gained adoption. DirectX was more of a suite of APIs for Windows 95 gaming and multimedia; it didn't have any relation to Glide as far as I know. Prior to that a lot of games still ran under DOS, though there were plenty of exceptions and full games that ran under Windows 3.1.

3dfx's assets were sold to Nvidia, but Nvidia already had a better GPU by that point, so I'm not sure anything ever really made it into their stuff. SLI was reused, though the exact method and acronym were changed. To this day I still can't believe Nvidia hasn't used the Voodoo name for anything. Titan V? It stood for Volta, but a man can dream...
 