What do you guys think of the state of AMD and Nvidia as GPU manufacturers?

Status
Not open for further replies.

thedeathclox

First of all, I would like to make it clear that this is NOT an attempt to reignite the ancient "AMD versus Nvidia - who is better?" war, but rather a discussion of how people FEEL about them personally (no me-versus-you type crap, please).

TL;DR Below.

WARNING: The following is my personal PERCEPTION and OPINION. I am looking for all of yours too, including conflicting / totally different ones that go against everything I've said - I just want to know what the general consensus here is.

So basically, I've been a long-time AMD user, starting with the ATI Rage 128. I've stuck with them through driver changes, ownership changes, speed bumps, driver overhauls, and new beefy cards right up to now. I was never big on Nvidia, but I understood that oftentimes their drivers were superior and more reliable, and their tech was generally faster but more expensive. AMD / ATI was always kind of an underdog and seemed to lean towards being closer to the community (as far as a corporation of that size can), so I always perceived them as more of a gentle giant. Nvidia, on the other hand, has always struck me as a bit cold and manipulative, with a penchant for sticking their hands into game studios' projects with proprietary tech (like GameWorks), with either the intention of, or disregard for, causing games to be horribly unoptimized on competitors' hardware. Needless to say, given my perception of them and my financial status until more recently, I could neither afford their gear at a price that gave me the performance I sought, nor did I feel any particular draw to support the company, given how I perceived their attitude towards consumers and competitors.

Important to note: I LOVED AMD's products without wavering right up until I stopped using my HD 6770, and I really found the 280x to be a dream (for the most part - see below).

I started to really wonder if AMD was pulling its weight properly around 2013 - 2015 or so, when the R9 series came out. As soon as I was able to, I got my hands on an MSI R9 280x, and I thought it was pretty swell. My games ran really well, overclocking was easy and very effective, and the new Crimson drivers really felt like AMD taking a swing back at Nvidia by finally cobbling together coherent drivers and promising to do better. I was hoping to leverage Virtual Super Resolution, but quickly found out that because the 280x was based on an older GCN architecture, a lot of the features that cards like the 285 and 290 gained from the update would not be made available. It stung because the 280x, while a rebadge of an older GPU, was still marketed as a great new GPU with full support (as in, nowhere near end of life, not by a long shot). Based on this, I decided to bump the performance up another notch and nab those features for myself, and upgraded to the R9 290x. Urrggghh.

My software experience with the 290x, in a nutshell: as time went on, the drivers eventually became pretty fantastic, in my opinion. The ShadowPlay-esque recording and the other extras were very nice, the control center was super easy to navigate, and all of the features I cared about were laid at my fingertips.

My hardware experience with the 290x was pretty bad, though. I was dumb and bought a reference design, thinking it would be a bit loud but would still dissipate heat effectively and look nice in my case (I hoped it would be sleek like the GTX 970 and GTX 980 reference designs - clean lines, colors, you name it). Unfortunately, it was freaking HUGE (thankfully it still fit my case) and, once installed, just looked like a dark mass. The blower was pretty damn bad at dissipating heat quickly enough. Putting your hand near the I/O on the back of the computer while a game was running almost necessitated wearing an oven mitt, lest you give yourself third-degree burns from the heat cranking out of the back of the freaking thing (I'm not serious about the oven mitt, but the hot air actually did get bad enough to hurt a bit if you got too close). Yet even while kicking impressively hot air out of the back of my PC, it still couldn't prevent the GPU from climbing to 95 degrees Celsius under load, at which point the card would give up and start throttling down, causing noticeable FPS dips in games. On top of this, if I wanted to control the GPU fan manually, I needed third-party software to actually set fan speeds (otherwise, the driver more or less takes your fan speed setting as a suggestion, unlike older cards). And once you forced the fan above 60 percent - which appears to be the driver's preferred maximum, even if you tell it 100% should be the maximum - the thing sounded like a tiny jet engine (I suppose it IS a blower design). It could get so LOUD that from beside the desk, about 3.5-4 feet away, I needed to turn up my sound to hear dialogue.

Also, overclocks (lol, couldn't even try that with the reference cooler) would not lock to a set value, because the card prefers to treat your provided clock as a suggested maximum rather than as a direct command to run at that speed or die trying.

Boy, that was dumb - I shouldn't have bought a reference design thinking I'd be able to run my GPU at even stock speeds without throttling (*sarcasm, kinda getting salty just thinking about it right now*). The answer to my prayers came with the realization that the early aftermarket R9 290 / R9 290x cards were basically a reference board with a big cooler strapped on. So I went to eBay, sourced a Tri-X cooler (heatsink and fans), and paid $60 plus shipping. When it arrived, I peeled off the stock cooler, moved the VRM heat pads over to the new one, slapped on a bit of the ol' Arctic Silver 5, and bolted 'er down. Immediately, my temps were a solid 15 degrees lower, and I could even achieve a 150MHz overclock without breaking a sweat - kinda. The card no longer pegged at 95 degrees and throttled; it ran at a comfortable 85-87 MAXIMUM while overclocked, usually refrained from throttling at all, and even with the fans cranked to 100 percent, the noise was completely tolerable. But a new problem arose. Thankfully, my case has very good airflow, so my CPU (which did not have liquid cooling at the time) did not suffer from the increased heat inside the case. However, since the case pushed the heat out so quickly, gaming in the summertime was comparable to sitting in a sauna playing video games. The card ran so hot that the best-case scenario was it acting as a space heater; putting my hand over the top exhaust, you could feel the heat churning out of that bastard. A boon in the winter, but truly terrible in the summer.

Eventually, 2017 rolled around, and in April I decided to upgrade. New GPUs were coming out all the time, and their performance was going well beyond what my 290x could do while consuming a fraction of the power - that means less wear and tear, less noise, and less heat, all in a faster package! Yay! I looked at my options and - oh... AMD, y u no offer significantly faster GPU at good price and lower power? Sure, I could get an RX 480 and enjoy some great performance per watt, but its raw performance just didn't stack up. I could wait for Vega, but AMD was taking forever, despite promising a really competitive product.

I took a gamble, and I leapt. Only one brand offered a product that suited someone who wanted the fastest gaming performance without setting fire to the furniture.

I sold my AMD R9 290x, and I purchased my first Nvidia graphics card. I bought an EVGA Nvidia GeForce GTX 1070 Superclock, with the ACX 3.0 cooler.

Here is what I noticed:
-> The card is leaps and bounds faster than my 290x, and holds its clock speeds (I haven't even cared to try to overclock - it's so damn fast dude).
-> The card, despite this massive leap in performance, is actually significantly shorter, and even under load I barely notice any heat escaping through the exhaust.
-> My room gets slightly warmer after several hours of gameplay, but is totally mitigated by opening a door or cracking the window- basically, like any electronic that runs in a closed space for too long. I didn't think about the heat output at ALL this summer.
-> Drivers are fuuuuugly. Really dated and confusing-looking, but functional and frequently updated. I do need a separate tool (Nvidia Inspector) to limit FPS to 74, to avoid screen tearing without vsync (and also to keep Fallout 4 from running so fast that my character sprints across the Commonwealth at comical speed). All in all, a good experience.

I thought recently: "Well, I still don't love the drivers, and the company is still '...ehhh...' in my opinion. I should see if Vega is out, and if it spanked Nvidia's Pascal GPUs. That would be a good excuse to go back."

-> Vega GPUs are basically just AMD counterparts to Nvidia's existing lineup.

All of that waiting, and AMD popped out basically a direct competitor. That is 100% not worth switching back for.

So, where I am now is wondering what the heck AMD is doing. I was a pretty hardcore fanboy, but the longer I use my Nvidia card, the more I enjoy the simple, hassle-free raw performance, and the setbacks for me are so minor that I couldn't go back, at least not soon. I'm really confused as to why they keep doing weird crap, like releasing the R9 Fury, a strange, halfway response to the GTX 9xx series (even though the R9 290x was already so close to the 980 that there was no real point), which cost way too much, cranked out heat like an oven, and still didn't manage to beat Nvidia.

Now they release a middle-to-low-end lineup (Polaris), then follow it up with underwhelming responses to gaming tech that has been available for like a year and a half. Meanwhile, AMD just spanked Intel with the Ryzen processors, matching or beating Intel's performance and features for literally half the price. It's like they're focused and on point in the CPU market now, but their GPU team is dazed, confused, and out of touch.

Again, I know this is lengthy and suuuper opinionated, but that's how I have perceived things from the perspective of a dude who wanted so badly to stay with AMD, but eventually jumped because the grass really did get greener on the other side (so to speak).

I would really, really love to read what you guys think, and how you perceived the recent history of AMD / Nvidia and their competition for market relevance and share. Maybe my take is super skewed and biased by blind allegiance to a corporate entity, or maybe you see it the same way. Please let me know: what do you think has happened, and where do you think things are going?

TL;DR, I used to be an AMD diehard, but AMD drivers and hardware got ridiculous and shitty, so I reluctantly went to the green team - super happy, but don't know where AMD went wrong, and have no idea where they're headed in the future. Would love to see how others have experienced this recent history.
 
AMD is a "small" company compared to both NVIDIA and Intel. AMD has to battle those giants, and it's not an easy thing.
AMD is also a poor company, with over $1bn of debt and no profits despite Ryzen and Polaris sales. The other two companies are very profitable.
AMD relies on GlobalFoundries (which was AMD's own fab business not so long ago, spun off to keep it healthy amid losses in other departments) to produce their CPU and GPU dies. And even GF is not competing well against Intel and TSMC/Samsung.
So with the limited resources AMD have, they do a very good job. But it's simply not good enough. Vega's performance and power consumption were such a facepalm after all the hype AMD built up over the last couple of years. Just for fun: the Vega 56 is more expensive to produce than a 1080 Ti, and yet it barely beats the much cheaper-to-produce GTX 1070. The same goes for CPUs: only above 8 cores does AMD have a more flexible and efficient architecture that is cheaper than Intel's monolithic CPUs. That's especially true for Threadripper and the server chips.
The problem is that in a year or two Intel will fix that, and AMD will be kicked out again, like what happened when Intel released the Core architecture, which was just a bit better than AMD's, and then the Core i series ruined AMD for almost a decade.
 

thedeathclox

That is very true, AMD is relatively poor by comparison. Most of my confusion has been with their recent moves, such as hyping everyone up for Vega and then falling suuuper flat on their faces. As far as I remember, they haven't pulled this kind of stuff with previous cards: they weren't always the fastest, but they always managed to do more than you'd expect for the dollar, and their hype train never went as hard as the one for Vega. I also can't remember the last time they announced a new architecture and then waited that long before pushing it out the door.

Very good point about TSMC / Samsung and Intel versus GlobalFoundries. I didn't know things were quite that dire for them.

I'm rooting all the way for AMD here. Without competition, Intel and Nvidia would surely do exactly what Intel has done for the last 4+ years: release products that introduce a tiny margin of generational improvement while skyrocketing in price. Nvidia and Intel both need a competitor, otherwise we will all be screwed. I don't care if that competitor is AMD or some other new entry into the market; we need at least someone to stand against them, keep innovation rolling forward, and keep market prices grounded on planet Earth.

Thank you for the well-written and level-headed response. Honestly (despite all of my disclaimers and hints that this was entirely a biased opinion), I expected some angry flames for being naive or hating on a company or something, haha. Good stuff, dude.

*Edited for elaboration and re-wording
 