> For something good: the GeForce 8800 GT/GTS 512MB. Granted, it's cheating a little since those two cards were the launch cards for the new G92 GPU, as opposed to the G80 that the original GeForce 8000 series cards used. The only other card I would say that remotely came close was the GTX 1080 Ti. There's never been a card since that:
> - Performs almost as good as the flagship and halo cards
> - Costs half as much as those cards
> - With one of them using a much smaller cooler (the 8800 GT was a single-slot design)

Those were the days when having "GT" in the model name of a GPU didn't automatically mean it was a glorified video adapter.
> Shock price of the Hercules Color Graphics Card, which rivaled the price of some cars in 1981.

And I thought the 40 series was expensive!
> For something good: the GeForce 8800 GT/GTS 512MB. There's never been a card since that: […]

Agree, that was quite the surprise. Incredibly cheap, and within the same generation it had performance that came close to and sometimes even beat the bigger cards.
> What are the most shocking (in a good or bad way) GPU launches or first impressions you have ever seen? Ex: The size of the 4090.

Most shocking GPU launch in a bad way, one that should never have been released: the GTX 1630 was an outright embarrassment.
> What are the most shocking (in a good or bad way) GPU launches or first impressions you have ever seen? Ex: The size of the 4090.

The shock I received when GPUs hit $1500.00. Took 2 or 3 populated slots and weighs as much as my first-born daughter. GPUs in this form have reached caricature proportions.
> Most shocking GPU launch in a bad way, one that should never have been released: the GTX 1630 was an outright embarrassment.
> Released during the chip shortage, it was one of the most idiotic cards I've ever seen released.
> It was overpriced, launching at $199 USD (let that sink in), and is still overpriced; the lowest it goes for is $134.99.
> For context, the GT 1030 GDDR5 launched at $79 USD and still sits at that price. (We don't talk about the DDR4 version.)

You just reminded me that the RX 6500 XT still exists, and that junker still costs $170. You can get RX 6600s that trash it for 20 dollars more, sometimes even for the same price or less if open-box or on sale. Also Intel's Arc GPUs: they've since matured into some decent chips, but the drivers were a mess at launch. Then there was the previously mentioned RX 6600 at launch, because of its ridiculous MSRP, and the RTX 3050 that decided to play the same greedy game, but worse. Times have been weird as of late; things haven't been this weird in the GPU market since the early 2000s.
> The shock I received when GPUs hit $1500.00. Took 2 or 3 populated slots and weighs as much as my first-born daughter. GPUs in this form have reached caricature proportions.

Yeah, when I first saw the 4090, I was like, wow, it is thick, but I didn't realize how big it was until I saw a comparison to the 3090.
Started laughing my rear end off after receiving my EVGA 3080 Ti OC. It works... I still have the humorous device in one of the PCIe slots. It lives with a 5950X and 32 gigs of RAM.
> Yeah, when I first saw the 4090, I was like, wow, it is thick, but I didn't realize how big it was until I saw a comparison to the 3090.

Yep, it's honestly become kind of a problem for it. They very commonly break during shipping, especially if they're already installed in a PC, even if that PC has bracing and everything for it. That cooler is so big and heavy that it's ripping apart the PCB, popping BGA balls, or breaking traces with any sudden impacts. Honestly, it may be safer as a cooler-sandwich type card, something we haven't seen since the Radeon 9700 days, though the poor socket would take the brunt of the impacts if everything else is solidly constructed and braced. There's not really a good answer for this with the current ATX standard; well, not unless you move the GPU outside the case into its own box.
> And I thought the 40 series was expensive!

You haven't lived until you've paid $55.00 a meg for static-column RAM for the Amiga A3000. Good thing it wasn't gigs of RAM.
> So hot that the Fermi was an apt name, also delayed for over 6 months and couldn't compete with the HD 5870.

I have seen the images where you can literally fry an egg on one.
> Radeon HD 2900 XT - Hot and slow, opened the door for nVidia to take the lead.

What made it so slow? How much better was Nvidia's competition?
> Radeon R9 280X

Was 3GB not enough in 2013? (I am genuinely curious.)
> GeForce GTX 970 - Marketed as having 4GB of VRAM but only 3.5GB were usable (class-action lawsuit).

I remember hearing about that. Why did this bring about a lawsuit? What was Nvidia's response to this class-action lawsuit?
> Radeon R9 Fury-X - I still don't know what ATi was thinking when they made this.

What do you mean?
> Radeon Vega - Delayed, over-priced and couldn't compete with the GTX 1080 Ti.

How does this compete now?
> RTX 2060 - Marketed as an RT card, except that it couldn't actually do it.

Isn't the 3050 worse at RT?
> Radeon RX 6500 XT - What you release when you're grasping at straws and want to make a crap product look good by adding the "XT" suffix.

This card would have been decent if it had 8GB of VRAM and a wider bus. What was AMD thinking, releasing a card with only 4GB in 2022?
> GeForce RTX 3050 - Can't do RT to save its life but somehow manages to be more expensive than the faster RX 6600.

At least it had more VRAM than the other card in its class, the 6500 XT.
> GeForce RTX 4060 Ti 8GB - 1080p gaming with 8GB for $400 USD; could it get any worse?

Just ask the 4060 Ti 16GB if it could get any worse. This GPU makes no sense when the 6800 exists at $390. Oh wait, I forgot, you already did.
> GeForce RTX 4060 Ti 16GB - I'm sorry I asked.

I agree, it is the worst-value GPU of this generation. Maybe it wouldn't be if it gave significantly more performance than the 8GB version (50%+), but then the 4070 wouldn't make sense.
> GeForce 8800 GTX - Absolute game-changer.

What was so game-changing about it?
> Radeon HD 4870 - A GTX 260 for $150 USD less than a GTX 260.

Could it run Crysis well though?
> Radeon HD 4870x2 - ATi finally takes the performance crown back with the first dual-GPU that just works.

What do you mean by "just works"? I thought that dual-GPU cards had pretty much the same issues as running multiple GPUs in SLI/CrossFire.
> GeForce GTX 295 - nVidia takes the crown back with the second dual-GPU that just works.

Same as above.
> Radeon HD 5870 - ATi finally manages the fastest single-GPU in the world again.

Do you think AMD can do it again soon?
> Radeon HD 6850 - One of the top-3 best-value cards ever made.

What made it such good value?
> Radeon HD 7970 - ATi's new crowning achievement.

What was Nvidia's competition?
> GeForce GTX 980 Ti

How does it compare with modern low-end cards like the 6500 XT?
> GeForce GTX 1060 - Mainstream monster.

It had a good run as the most popular GPU, and still holds up decently today. 8GB of VRAM really helps it now.
Nvidia's "mistake" still holds up well today so long as the game doesn't require mesh shaders.GeForce GTX 1080 Ti - Too good for nVidia's own good.
> Radeon RX 5700 XT - RTX 2070 performance for RTX 2060 price.

Not to mention 8GB of VRAM as well.
> Radeon RX 6800 XT - Overall, the best card of the RX 6000 / RTX 30 generation.

I would agree. Do you think the 6800 is second best?
> GeForce RTX 4090 - Lots of people with more money than brains lining up for this thing.

The generational improvement was impressive; unfortunately, it had a bad case of melting adapters.
> Radeon RX 7900 XTX - Sold out for 2-3 months solid without the help of scalpers.

The best-value high-end card. Wish I had the budget for one.
> @Order 66 I have an RTX 2060 in a 3600X rig my bro uses; it can do RT in Metro Exodus Enhanced with DLSS at 60 FPS at 1440p.

What settings? Also, that is with DLSS; the reason why @Avro Arrow said that it couldn't actually do it is that without DLSS it couldn't. I dare you to try running the game at 1440p max with max RT; even with DLSS it won't be playable. 6GB of VRAM isn't enough for 1080p native at ultra, let alone 1440p native at ultra.
> I have seen the images where you can literally fry an egg on one.

Yup, and it also created a meme, because nVidia's motto at the time was "The way it's meant to be played." and someone came up with the ingenious "The way it's meant to be delayed."
> What made it so slow? How much better was Nvidia's competition?

I don't remember what made it so slow, but it was a disappointing architecture all around.
> Was 3GB not enough in 2013? (I am genuinely curious.)

It was kinda iffy. At that time, 4GB was becoming the norm for higher-end GPUs. It was OK in 2011 when the HD 7970 came out.
> I remember hearing about that. Why did this bring about a lawsuit? What was Nvidia's response to this class-action lawsuit?

It brought about a lawsuit because, even if the card has 4GB of VRAM, being able to use only 3.5GB of it (which wasn't advertised) is false advertising. IIRC, nVidia was forced to settle.
> What do you mean?

In an attempt to get the absolute most out of the Fiji GPU that they could, ATi paired it with 4GB of very expensive HBM and made it liquid-cooled. It resulted in a card that was faster than the GTX 980 (but not the 980 Ti), was expensive to make and a pain in the butt to install.
> How does this compete now?

It completely depends on the pricing. I actually should've included the Radeon VII with it.
> Isn't the 3050 worse at RT?

I'm pretty sure that the 3050 is better. Remember that the RTX 20-series was the first RTX series and the RTX 2060 was the weakest of them IIRC.
> This card would have been decent if it had 8GB of VRAM and a wider bus. What was AMD thinking, releasing a card with only 4GB in 2022?

It's a mobile GPU that they stuck on a card just to have something to sell at the low end during the mining boom. That's why it only had 4GB of VRAM and a PCIe 4.0 x4 interface. There's not much you can do with a mobile GPU.
> At least it had more VRAM than the other card in its class, the 6500 XT.

The RX 6500 XT isn't in the RTX 3050's class in price or performance. The RTX 3050 is 31% faster and costs 59% more money. The RTX 3050 costs 10% more than the RX 6600 while the RX 6600 is 26% faster than the RTX 3050. The RTX 3050 has no saving grace, so there is no "at least" when it comes to the RTX 3050. The problem with the RTX 3050 is that it's too expensive for what it is. It should cost somewhere around $160 to not be a total rip-off. It is literally $60 too expensive.
> Just ask the 4060 Ti 16GB if it could get any worse. This GPU makes no sense when the 6800 exists at $390. Oh wait, I forgot, you already did.
> I agree, it is the worst-value GPU of this generation. Maybe it wouldn't be if it gave significantly more performance than the 8GB version (50%+), but then the 4070 wouldn't make sense.

If it did have more performance, then it wouldn't be an RTX 4060 Ti anymore, because more VRAM doesn't automatically equal more performance. VRAM doesn't increase GPU performance; it just avoids performance drops and allows for better textures to be used.
> What was so game-changing about it?

The 8800 GTX was a huge leap forward in GPU performance and caught ATi completely off-guard. Hell, I think that it caught nVidia off-guard as well. ATi wasn't able to match the performance of the 8800 GTX for two entire generations (HD 2000, HD 3000) and because of this, nVidia just kept using the same GPU, giving it different names.
> Could it run Crysis well though?

Yep, IIRC it was the second video card that could do so with only a single GPU. The GeForce GTX 260 was the first.
> What do you mean by "just works"? I thought that dual-GPU cards had pretty much the same issues as running multiple GPUs in SLI/CrossFire.

By that time, CrossFire and SLI had been refined pretty well. Hell, I ran twin HD 4870s in CrossFire and it was perfect. The thing is that previous iterations of multi-GPU cards, like the GeForce 9800 GX2 and Radeon HD 3870x2, had a lot of teething issues, issues that were solved with the HD 4870x2 and GTX 295. I absolutely LOVED CrossFire because I could upgrade my PC for much less money by just buying another card like the one I already had. In this way, I could still leverage the power of the card I had already spent money on. By purchasing a second HD 4870 before they left the market, I got it for way cheap, and I had performance that well exceeded that of the 4870x2 (dual-GPU cards are never as performant as two separate cards, for cooling reasons). So I went from the performance of a single HD 4870 to roughly the same performance as a GeForce GTX 295 for a small fraction of the price. I never had any issues using CrossFire in any game I played at the time. I was so impressed with CrossFire's performance that, when ARMA III came out, instead of going broke on a GTX TITAN, I just bought twin HD 7970s (which were on sale at the time because they were being re-branded as the R9 280X).
> Same as above.
> Do you think AMD can do it again soon?

Nope. They have no desire to, because the people with more brains than money always buy nVidia anyway, and developing something like that requires a lot of investment.
> What made it such good value?

Low price with excellent performance. That's what always makes something a great value.
> What was Nvidia's competition?

The second iteration of Fermi.
> How does it compare with modern low-end cards like the 6500 XT?

The RX 6500 XT isn't really a "modern low-end card" because it's a mobile GPU bolted onto a discrete card, which makes it something altogether different. Having said that, the GTX 980 Ti is 16% faster than the RX 6500 XT.
> It had a good run as the most popular GPU, and still holds up decently today. 8GB of VRAM really helps it now.

Technically, I don't think that it was the most popular GPU, as I'm pretty sure that the RX 580 outsold it. The problem was that the RX 580 was being bought up by Ethereum miners, so gamers couldn't get their hands on one (I got my R9 Fury to avoid paying $800 for an RX 580). The RX 580 was so good at mining that they were even used in the second Ethereum boom. That's why there are so many of them suddenly available on the used market. Of course, a lot of them died while mining over the years, and so the number of them in the hands of gamers will never match the GTX 1060 for that reason.
> Nvidia's "mistake" still holds up well today, so long as the game doesn't require mesh shaders.

Absolutely. The reason that nVidia considers the GTX 1080 Ti to be a mistake is the fact that a lot of people who bought one still haven't upgraded from it, because of its performance and 11GB of VRAM.
> Not to mention 8GB of VRAM as well.
> I would agree. Do you think the 6800 is second best?

Either that or the RX 6700 XT. There's a case to be made for either one.
> The generational improvement was impressive; unfortunately, it had a bad case of melting adapters.

There's no questioning the performance of the RTX 4090, just the sanity of people who were chomping at the bit to pay its astronomical price. When compared to the RX 7900 XTX, it's about 70% more expensive but only 24% faster, with the same 24GB of VRAM. The only reason that I listed it was because it sold like mad.
> The best-value high-end card. Wish I had the budget for one.

For what you do, your RX 6800 is absolutely perfect. See, I have a 4K display and wanted to see what 4K gaming was like before the next GPU shortage made it no longer feasible. Minecraft RTX isn't going to change anytime soon, and 60 FPS is more than enough for that game.
> This post took me about 40 minutes to write. My longest post by far.

I stopped paying attention to things like that because I'm sometimes afraid to know.... 😆
> What settings? Also, that is with DLSS; the reason why @Avro Arrow said that it couldn't actually do it is that without DLSS it couldn't. I dare you to try running the game at 1440p max with max RT; even with DLSS it won't be playable. 6GB of VRAM isn't enough for 1080p native at ultra, let alone 1440p native at ultra.

It played it just fine on high settings with maxed RT at 1440p with DLSS. I ran it on the Gigabyte M27Q I used to have.