How AMD Could Disrupt The Graphics Card Market In 2019

Performance is key. While Turing is priced higher for the RTX features, it does perform the best. AMD's biggest goal should be to match or beat RTX in rasterization performance. Ray tracing is nothing to be concerned with now; support it, yes, but the hardware for it won't be viable for most enthusiasts for a few more generations.

I think the biggest shakeup will come if Intel finally delivers a viable discrete GPU. A third real option would shake things up a ton.
 
AMD need only make a very affordable (sub-$160) GPU that can handle modern titles at medium settings @ 1080p 60fps to win the lion's share of the market and thereby upset Nvidia's dominance. The margins are much lower, but the average consumer doesn't want or need more than that level of performance to be satisfied. This should be GPU Practice 101.
 


That already exists in the RX 560; it can play most anything you throw at it at 1080p medium at 60 fps.

But I agree with your line of thought. They just need to beat Nvidia at the bread and butter. They don't need to compete with the 2080 Ti, as long as they have an answer to everything else in the range, within the same performance envelope, for a better price.
 
I think XGMI could be the key if AMD can all but guarantee two things (there's a quick scaling-arithmetic sketch after this list):
- nearly 100% scaling in multi-card setups across most modern titles.
- finally vanquishing micro-stuttering.
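As a back-of-the-envelope illustration of what "nearly 100% scaling" would mean, here's a tiny Python sketch; the frame rates are made-up placeholders, not benchmarks:

```python
# Hypothetical illustration: dual-GPU scaling efficiency from average
# frame rates. All numbers are placeholders, not measured results.
def scaling_efficiency(fps_one_card: float, fps_two_cards: float) -> float:
    """Fraction of the ideal 2x speedup a second card actually delivers."""
    return fps_two_cards / (2.0 * fps_one_card)

single = 60.0  # placeholder avg fps, one mid-range card
dual = 114.0   # placeholder avg fps, two of the same card over XGMI

print(f"{scaling_efficiency(single, dual):.0%}")  # -> 95%
# "Nearly 100% scaling" means this ratio staying close to 1.0 across
# most modern titles, not just in a handful of best-case games.
```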

AMD isn't likely to win back the performance crown anytime soon, but I could see 7nm Navi slightly trailing a 1080 Ti at much lower prices. And if they can eliminate the two downfalls that have plagued CrossFire/SLI, then they can sell extra mid-range cards to more people. Nvidia's decision to remove SLI/NVLink from all but the most expensive cards allows AMD to gain market share where the bulk of sales are, while still providing a path to enthusiast-level performance.

Combine those with low power consumption and high yields from a small chip, and AMD is looking at a winner (or at minimum, a stopgap). It only needs to stay around $230 or less... which will be the hard temptation to resist, with how high Nvidia has pushed prices. Of course, I expect typical RTG marketing to mess things up. :/
 
I think we still have to see what Raja Koduri did for AMD. Just as Jim Keller helped develop the Zen architecture and left the company way before it was released to the public, Koduri has probably been working on an overhaul of the GCN architecture from the ground up to allow it to compete against Nvidia. Except we won't see that end result any time soon. Neither Polaris, nor Vega, nor Navi (or Arcturus) is Koduri's work. Those are just the Bulldozers of AMD's GPU division: architecturally outdated and lacking punch. But they must be working hard on the next architecture, and I guess we will see its results a couple of years from now. My two cents!
 
I would use AMD cards more often, but it seems AMD cards always need more power (watts) for equal performance. An example is the GTX 1050 Ti, which runs strictly on PCIe slot power; to use a comparable AMD card, I need an additional 6-pin power connector. Millions of computers were manufactured with PSUs unable to deliver anything beyond PCIe slot power. AMD literally gave away the biggest possible market for video cards by not having a direct competitor to the 1050 Ti.
 


Those "millions" of PCs dont need 1050Ti class cards, thats the 1030s market, which AMD has the RX 550 for.
 
AMD needs to cut Nvidia out of the mainstream market and push them exclusively into enthusiast territory. You know what, I see this coming. It doesn't seem like a far-fetched fantasy, especially with the console business.

Nvidia's RTX bet is going to be their big mistake, like Intel's 10nm venture.
 


Koduri was obsessed with discrete graphics. Don't fool yourself: Vega was his baby... and we all know how it turned out.

He tried to make an all-around graphics architecture. It works incredibly well at low power in APUs, but it doesn't scale well to big dies. In all honesty, if it were not for mining, the Vega 64 would have been a failure.

 
I would have bought an AMD card; instead I had to get an overpriced, end-of-line GB 1080. Nvidia has stacked the deck against its consumers, and that is not good business practice. Trust is key, whether you have one client or millions. Just make one great card at a time, supersede it with a better one, and back it with stellar driver support. Instead they atomize the market and move in tiny increments. That is expensive and WILL fail, like a thousand robber-baron plots before it. Feeding a marketing department is expensive; it's not R&D, and it's not future-oriented.
 

The RX 560 doesn't really fill that void very well. It's good enough for the $110-$130 range, and that's about it. It's got half the horsepower and bandwidth of a 570. There really should be something in AMD's lineup between the 560 and 570... but there isn't, since they haven't overhauled their mainstream GPUs since the RX 400 series hit a couple of years ago. Oh well, maybe next year.
 
Give me a break... This author is living in LaLaLand if he actually believes any of this crap he wrote. AMD is so far behind Nvidia right now that there is NO HOPE their next GPU could possibly compete with the RTX series. They would be lucky to beat the GTX 1080 Ti, tbh... And unless they have put serious development into ray tracing, they have no hope there either... Nvidia has brought us a decade ahead where ray tracing is concerned, and they did it with VERY custom hardware: Tensor cores and RT cores... Where has anyone seen that AMD has anything like that in its pipeline? Nowhere, that's where. I get it, the RTX series is expensive... It also has very cutting-edge technology that is actually a deal for what it is... The Titan V was $3,000, and the RTX 2080 Ti is 4.2x faster at ray tracing and 10-15% faster in rasterized gaming... It's expensive, true. But look at what it has brought to the table. People forget very quickly that before RTX, ray tracing was still 10-15 years away...
 
The problem with AMD has never been hardware; it's the software, drivers, and APIs. I hate the way Nvidia digs its claws into software like it did with the Mercury Playback Engine, HairWorks, GameWorks, and now RTX. But the effort they put into software pays dividends in reliability, frame times, functionality in both pro and gaming apps, and the market penetration that occurs because software developers adopt these standards. I wish AMD would put more effort into the software side and play proprietary hardball with it like Nvidia does.
 
If we can get some juicy info on the upcoming PS5 / next Xbox GPU architecture, that would be great!

We know that Radeon Navi is designed specifically with the PS5 in mind, and RTG just has to PC-ify it...
 

I agree that the RX 560 isn't exactly a direct competitor to the 1050 Ti, since it tends to perform more like a 1050, despite offering more VRAM. However, either the 1050 or RX 560 can arguably run games pretty well at medium settings, which is what that prior poster specified. They might not be able to stay above 60fps all the time, but they should at least average frame rates relatively close to that in most existing games, again, with settings lowered. Plus, the availability of FreeSync on some inexpensive monitors means that dropping below 60fps isn't quite as bad when an RX 560 is paired with one of those displays.

But you also have the RX 570 and 580 priced not all that high above the 1050 Ti, which are at a completely different performance level, more comparable to the 1060 3GB and 6GB. Going by PCPartPicker's current US pricing, the lowest-priced 1050 Tis currently start at $170 (or $160-$165 after mail-in rebate), while RX 570s start at just $200, and you can even find an RX 580 4GB for $200 at Amazon right now. Or even better, an RX 580 8GB for just $220 after rebate at Newegg, that also comes bundled with download codes for three newly-released games (Assassin's Creed Odyssey, Strange Brigade and Star Control Origins). Or a 1060 3GB for $205 after rebate, if one considers lower power-draw to be important. Unless someone is limited by a very low-end OEM power supply, going with a 1050 Ti currently doesn't make much sense when you can get upward of 70% more graphics performance from an RX 580 for only around $30-$40 more. Or with double the VRAM and a decent bundle of games for not much more than that.
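To put rough numbers on that, here's a quick sketch of the performance-per-dollar math; the prices are the ones quoted above, and the ~1.7x performance figure is the "upward of 70%" estimate from this comment, not an independent benchmark:

```python
# Performance-per-dollar comparison using the prices quoted above.
# Performance is normalized to the 1050 Ti = 1.0; the 1.7 figure is
# the "upward of 70% more" estimate from the comment, not a benchmark.
cards = {
    "GTX 1050 Ti": {"price": 170, "relative_perf": 1.0},
    "RX 580 4GB":  {"price": 200, "relative_perf": 1.7},
}

for name, c in cards.items():
    print(f"{name}: {c['relative_perf'] / c['price']:.4f} perf per dollar")
# The RX 580 works out to roughly 45% more performance per dollar,
# which is the core of the value argument above.
```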

And this was even true back around when the 1050 Ti first launched, when RX 480s and 3GB 1060s were readily available below $200, with some RX 470s and 480s going on sale for very close to the 1050 Ti's price. The 1050 Ti was arguably a poor value for anyone who wasn't PSU-limited up until miners caused the price of anything better to skyrocket outside the range most people were willing to pay for a graphics card. And now that the prices of most of those cards have come back down to reasonable levels, the 1050 Ti is once again of questionable value. AMD's 14nm Polaris architecture wasn't efficient enough to match the 1050 Ti in a card not requiring external power, so it definitely still has that going for it for those upgrading a pre-built system, but otherwise there are better options available.

As for their next generation of cards, if they are all on 7nm, that could help a lot with improving efficiency. It's most important for them to compete at the price levels that most people actually buy, but lower power draw could also potentially allow them to be quite competitive at the enthusiast level as well. Even if they don't create a single huge chip to compete with the 2080 Ti, they have done dual-GPU cards in the past, and a pair of 7nm chips each with performance near that of the existing Vega cards could theoretically reach a competitive performance level at a similar power draw to the 2080 Ti, particularly if they could work together in the background and not depend on something like CrossFire support to divide up the rendering process. Of course, we have yet to see the effects of DLSS or RTX, and it's possible those could provide some additional edge to Nvidia's cards, but we'll have to wait and see on that.



No, this was written by the guy who wrote the "Why You Shouldn’t Buy Nvidia’s RTX 20-Series Graphics Cards (Yet)" article that was posted some days prior to that one.


Welcome to the Internet, new member whose username appears nowhere else online! Your first post was an interesting read, and I'm sure you'll continue to use this freshly-made account to engage in thoughtful discussions, and that you clearly don't have a vested interest in swaying public opinion about one company or another. : P
 


Did you read the article, or did you just decide from the title to make up a bunch of stuff?

"Lucky to beat the 1080ti", true they have nothing that can compete now, but they do have a card (Vega 64) that goes toe to toe with the 1080, so the reality is if they have a new GPU arriving, and the 1080ti isn't THAT much faster, the logic of this falls completely short.

You talk about ray tracing as if it were a huge issue before. There are currently NO games that support it; there will be, but all you are spewing is marketing numbers.

10-15 years away? No.

I love these articles; they bring the one-hit wonders like you out of the woodwork.
 


He asked for a very affordable sub-$160 GPU that can handle modern titles at 1080p medium. The RX 560 can do that; whether there is an Nvidia GPU that does it better is somewhat irrelevant. I agree there is no direct competitor between the 1050 Ti and it, although an RX 570 4GB on sale is VERY close to the same price and eats it for lunch. The point of the article is that they are overhauling the line; we just don't know yet with what or when.
 


Thanks for your feedback, Kris. I wanted to reply and set a few things straight for the record. For starters, I am not from LaLa Land - I hail from a famous shore town on the grizzled coast of New Jersey (I can literally see Dr. Weird's laboratory from the nearby beach). Although it can sometimes feel like La La Land (the movie with Ryan Gosling and Emma Stone dancing and singing everywhere), with all the local theater companies and high school children skipping and flash-mobbing about town singing musical numbers, it is a relatively sane place of origin.

To your points on the article, I do believe the things I wrote (despite my geographical location), and I tried to provide tangible context for my assertions in the form of linked articles on the various topics I touched on. If I wasn't clear in the article, I'd like to clarify that AMD's current flagship, the Radeon Vega 64, is a direct competitor to the GeForce GTX 1080 (check out our GPU Hierarchy chart). The RTX 2080 is competitive with the GTX 1080 Ti in rasterized game performance (check out our review). It is completely reasonable to hypothesize that with a smaller lithography and an improved architecture, AMD could close the performance gap with its next-gen graphics products.

As far as ray tracing is concerned, AMD has been a prominent figure in the professional market with Radeon Rays (formerly AMD FireRays) for some time now (I can't source its original launch date, but anything old enough to be referred to as "formerly known as" has likely been around for a few years). It was designed for workstation content creators (ray tracing for film and PC game scenes), but with recent upgrades to the software (which I detailed in the article), it seems to be moving in a direction where it could be adapted for real-time ray tracing in PC games.

To say AMD has "no hope" with ray tracing, that ray tracing in games was "10-15 years away" otherwise, or that AMD is "so far behind Nvidia" in performance is outright inaccurate. Just because AMD's top-end GPU stops at a certain performance and price point in Nvidia's product stack doesn't mean AMD is far behind. It simply means AMD isn't focused on those price points. That may change because of Nvidia's price hike for its top end card (RTX 2080 Ti is almost double the previous gen 1080 Ti in price), but traditionally, AMD has been competitive in performance and price in every market segment below $500 (original MSRPs) for some time now (albeit somewhat late to the party, playing catch up to Nvidia).

To your point on Tensor and RT cores... you're not wrong. It is cutting-edge technology. No other company has brought a consumer gaming GPU to market that has proprietary co-processors (which is a gross oversimplification of what they actually are, yes, but follow me here) that can perform ray tracing and deep learning algorithms in parallel with traditional rasterization. I am in no way taking away from that achievement when I say that ray tracing isn't a thing (yet). It simply needs to take off in other areas (more games with both the DLSS and RTX features, and getting featured in a mainstream console would also help with widening adoption) before the hardware is worth the purchase for a large portion of the PC gaming population. Ray tracing is no doubt cool, but it's just not a priority for the majority of PC gamers... yet.

I think the reason you haven't seen something like that in AMD's pipeline is that the company tends to keep its features open source.

Two quick comparisons:

1) Nvidia G-Sync requires displays outfitted with a proprietary chip to function; AMD's variable refresh rate display technology is open source, and compatible monitors tend to be far less expensive because manufacturers don't have to buy a special co-processor the way they do for G-Sync.

2) You need Turing or Volta GPUs for ray tracing Nvidia's way, and those cards are more expensive than previous GeForce offerings largely because of those proprietary features; Radeon Rays is open source and runs on the OpenCL 1.2 standard, so it doesn't need anything but the GPU shaders (see the device-enumeration sketch below).
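To illustrate the "nothing but the GPU shaders" point in item 2: any GPU whose driver exposes OpenCL 1.2 is, in principle, a candidate. Here's a minimal sketch using the third-party pyopencl bindings (assuming you have pyopencl installed; this only enumerates devices, it doesn't run Radeon Rays itself):

```python
# Minimal sketch: enumerate GPUs and the OpenCL version their drivers
# report. Radeon Rays targets the OpenCL 1.2 standard, so any device
# reporting 1.2 or newer is in principle a candidate, regardless of vendor.
import pyopencl as cl  # assumes the pyopencl package is installed

for platform in cl.get_platforms():
    for device in platform.get_devices(device_type=cl.device_type.GPU):
        # device.version is a string like "OpenCL 1.2 ..." or "OpenCL 2.0 ..."
        print(f"{device.name}: {device.version}")
```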

I'm not saying either implementation of those features is right, wrong, or better than the other. Nvidia is clearly ahead of the pack for real-time ray tracing, but AMD has a few cards (both proverbial and literal graphics cards) up its sleeve should it decide to go that route as well. The possibility of ray tracing taking off is wonderful for PC gaming enthusiasts. But wouldn't it be even more wonderful if there were two companies in the game?

Now that RTX is available and we see where it lands against the previous generation cards in traditional gaming performance and pricing, it's not outlandish to hope or conjecture that AMD, given the company's documented technologies, practices, and what we know about its currently available GPUs (everything that I detailed in the article), could and can compete in rasterized game performance at multiple price points with its next-gen graphics tech.

After that arrives, Intel will likely have something to say about discrete graphics cards, too. The cycle never ends.

Yes... my article depends almost entirely on the terms "if," "could," and "possibly." I was quite clear on that. It all boils down to what AMD brings to market and how it competes. But the company wasn't doing a terrible job of competing on price and performance (power consumption and heat, we know, could definitely be better for Radeon cards) in the GPU market to begin with; they just have less market share. Nvidia's focus on ray tracing (which contributes to the high price and only moderate gains for rasterized games) is an opportunity for AMD to catch up. All I did was lay out how and why it could happen.

Thanks for reading!

 
AMD has ALWAYS been competitive, with every generation (more or less), since they launched the Radeon HD 4850/4870 that took Nvidia "by surprise" coming from the 3xxx generation. Some generations were very competitive, some less so, but to flat-out state that AMD was not competing and that their power draw (watts) and drivers (software) were always worse than what Nvidia offered is an outright fallacy.

Many, many times over the past generations, Radeons were actually using LESS power than competing Nvidia cards, which is FACT. It was truly only when Pascal launched that Radeons were in "worse shape" in this regard, BUT they also had much more "under the hood" (in hardware) than the Nvidia cards did. AMD also wisely chose to always use high-grade VRM capacitors (115°C and 125°C rated) instead of the "normal" average grades that Nvidia has used for many years (85°C and 105°C, sometimes 95°C and 105°C).

The higher-rated capacitor generally means that at higher load the cap will have a longer lifespan, because it is not running crazy hot just to "keep up." Of course there are exceptions: when a card is not running at optimal clock rates, power use goes up, which means more heat (via power draw) goes through the caps. Amps and volts always matter; generally, Radeons favored lower external amps and volts but higher watts, where Nvidia favored higher amps and volts at lower watts.
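For context on why those temperature ratings matter, there's a common rule of thumb for electrolytic capacitors: expected life roughly doubles for every 10°C the part runs below its rated temperature. A quick illustrative sketch (the rated-life and operating-temperature figures are placeholders, not measurements from any actual card):

```python
# Rule-of-thumb (Arrhenius-style) estimate of electrolytic capacitor
# life: lifespan roughly doubles for every 10 C of headroom below the
# rated temperature. All numbers here are illustrative placeholders.
def estimated_life_hours(rated_hours: float, rated_temp_c: float,
                         operating_temp_c: float) -> float:
    return rated_hours * 2 ** ((rated_temp_c - operating_temp_c) / 10)

# The same 2,000-hour-rated cap running at 75 C on the board:
print(estimated_life_hours(2000, 105, 75))  # 105 C part: ~16,000 hours
print(estimated_life_hours(2000, 125, 75))  # 125 C part: ~64,000 hours
```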

It is "amazing" how well the various GPU from as far back as Radeon 7xxx generation toe current Vega generation do with a "devolt" helps them maintain clock speeds better AND vastly reduces power consumption needed/required

I wish it were "easier" to lock in the downvolting.

-------------------------------------

That XGMI, or whatever you want to call it (at least the 4th generation of it actually was, and is, called XDMA; I will call it "bridgeless CrossFire"), was actually first enabled/released for the Radeon 290/290X. Granted, using Infinity Fabric might be a good bullet point, considering the speed it operates at as it stands is more than ample.

The largest difference between AMD and Nvidia, IMHO, is that AMD tries to give the best they possibly can (that is, well built) and earnestly drives open-source initiatives that can benefit themselves and other companies, at a fair (justifiable) price point. Nvidia, on the other hand, tends to build as "cheap as possible," so very, very rarely do they do anything that can benefit anyone or anything other than their own products (no open source), and they also generally charge more of a "premium" than they need to, often disgustingly so.

Business is business, no doubt about that, but there is much to be said for trying to be "upstanding" in one's actions, even if it means it may not always JUST benefit yourself.

----------------------------------

I like how folks "forget" how Nvidia was before Pascal: the sheer crud they pulled, the vast amount of power they used (and heat they pushed) compared to their Radeon counterparts, and the prices they wanted for their cards. It always seems to be "Nvidia is the best and AMD cannot compete because of reason X, Y, or Z," and yet not that long before, it was "Yes, Nvidia does X, Y, or Z, but that is OK because it is Nvidia and they have feature W."

People have such short and terrible memories of what others have done; I do not know why. IMO, I do not support a company whose ethics and business practices often come down to screwing people over, never admitting fault for doing something terrible, and basically "laughing" about it all the way to the bank, because they know people are stupid and will continue to drink their Kool-Aid anyway.
 
HA! The idea that Intel could come up with a powerful discrete card and compete with Nvidia or AMD is ludicrous. They don't innovate; they steal others' ideas and throw more engineers and silicon at things. Their stuff should be light-years ahead of everyone else, yet they suck, continue to suck, and take lots of money from people who don't know better.