News No, Raja Koduri Didn't Say Intel's Discrete GPUs Will Debut at $200

I doubt Intel would come out swinging like that, but what a market shakeup it would be if they entered at equal performance for a hundred dollars or so less than the competition.

One can only dream...
 
Nvidia has stagnated in GPUs, in my mind. The 2080 offers equal performance to the 1080 Ti while having 3GB less VRAM and costing more money! What a steal! Sure, they have RTX, but by the time that feature is relevant, the original 20-series cards won't be.

This leaves AMD and Intel room to swoop in and grab some market share.

I just watched an interesting video from Gamer Meld. According to it, Intel said all of their GPUs, from data center parts to AIB cards to iGPUs, would be based on one architecture.
AMD did this with GCN and it turned out to be a limitation, which is why gaming GPUs are now based on RDNA while everything else is still GCN.

Also, I saw something from Cortex that said (paraphrased) the 5700 die is tiny and can overclock to well over 2GHz with unlocked power. So it makes sense that AMD would enlarge the die and add a bunch more stream processors. If they shipped that GPU at the 2GHz we know the die can hit, it would have incredible performance.
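
For a rough sense of the scaling, peak FP32 throughput is usually estimated as 2 ops (one FMA) per shader per clock. A quick back-of-the-envelope sketch; the enlarged die below is purely my assumption:

```python
# Peak FP32 throughput estimate: 2 ops (one FMA) per shader per clock.
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000  # GFLOPS -> TFLOPS

# RX 5700 XT as shipped: 2560 stream processors at ~1.9 GHz boost.
print(peak_tflops(2560, 1.9))   # ~9.7 TFLOPS

# Hypothetical enlarged Navi (my assumption): 3840 SPs at the 2 GHz
# the current die can reportedly sustain with unlocked power.
print(peak_tflops(3840, 2.0))   # ~15.4 TFLOPS
```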
 
Most GPUs are based on similar uArchs. Nvidia's are typically HPC-world uArchs trickled down to consumer parts. We never got Volta, except for the Titan V, but the RTX cards are similar to Volta, just without the high-speed interconnects and such.

Also, the current RDNA, as far as I can find, is still closely tied to GCN. If that's so, it has a stream processor limit of 4096, which is what Vega 64 already has. Having more isn't always the answer either; Nvidia has had fewer SPs most generations while performing better.
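
As a rough illustration (boost clocks from memory, so treat them as approximate), you can compare peak FP32 throughput the same way:

```python
# Peak FP32 throughput estimate: 2 ops (one FMA) per shader per clock.
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000

print(peak_tflops(4096, 1.55))  # Vega 64:  ~12.7 TFLOPS
print(peak_tflops(2560, 1.73))  # GTX 1080:  ~8.9 TFLOPS
# Yet the two trade blows in games; utilization and the rest of the
# pipeline matter as much as the raw shader count.
```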

I think the reason Navi isn't clocked that high, though, is the power draw it would take. It wouldn't look good if it sucked vastly more power.
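
The usual first-order rule for CMOS dynamic power is P ≈ C·V²·f, and higher clocks generally need higher voltage, so power climbs super-linearly. A sketch with voltages I'm just assuming for illustration:

```python
# First-order CMOS dynamic power: P ~ C * V^2 * f.
# Taking a ratio cancels out the unknown capacitance C.
def power_scale(f1_ghz: float, v1: float, f2_ghz: float, v2: float) -> float:
    return (f2_ghz / f1_ghz) * (v2 / v1) ** 2

# Assumed figures: ~1.75 GHz at 1.00 V stock vs 2.10 GHz needing ~1.20 V.
print(power_scale(1.75, 1.00, 2.10, 1.20))  # ~1.73x the dynamic power
```

So a roughly 20% clock bump could plausibly cost around 70% more dynamic power, which is exactly the kind of number that wouldn't look good on a spec sheet.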

Intel is no beginner in GPUs, either. They have a ton of GPU patents; they're just new-ish to the discrete GPU market. If they leverage their patents and process technology correctly and get their driver team up to par, they have a good shot at competing with both AMD and Nvidia. I read somewhere once that Intel has the largest software team in the industry, even larger than Microsoft's.

I am all for three full-on competitors in GPUs. I just want to see top-end GPUs back down to $500 like my 9700 Pro and 9800 XT were.
 
And AMD couldn't keep selling cards with the same old crappy blower, although I don't think they will for the higher-end Navi cards AMD confirmed are in the works.

My assumption on those cards is more SPs rather than higher clocks, and possibly a Vega 64/56 replacement that uses HBM (less of it than the Radeon VII) instead of the GDDR6 the 5700 has, for more memory bandwidth. But that would also raise the cost quite a bit.
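
The bandwidth side of that is easy to put numbers on, since bandwidth is just bus width × data rate ÷ 8. The 5700 and Radeon VII figures are the shipping specs; the two-stack HBM2 card is my speculation:

```python
# Memory bandwidth in GB/s: bus width (bits) * data rate (Gbps) / 8.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits * gbps / 8

print(bandwidth_gbs(256, 14.0))   # RX 5700 XT, GDDR6:        448 GB/s
print(bandwidth_gbs(4096, 2.0))   # Radeon VII, 4-stack HBM2: 1024 GB/s
print(bandwidth_gbs(2048, 2.0))   # Speculative 2-stack HBM2:  512 GB/s
```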
 
Yeah, they jumped on the bandwagon with that. As my boss always says, you want to be on the leading edge, not the bleeding edge. The bleeding edge always adds cost, and you're basically the guinea pig.
Reminds me of a little thing Nvidia did. Ya know, that thing called RTX.

Too little, too soon, and too much at the same time.

Too little software support and too little performance, caused by too soon a launch. To add insult to injury, it also carried too high a cost.
 
Which is why, even though I wanted one, I didn't buy a 2080. My plan is to wait for the next release or the one after. Although I think we need more than just Nvidia in the hardware ray-tracing game before we see good results.
 

AnimeMania

If I were Intel's strategist, I would release the highest-end gaming card I could make, letting relative performance dictate the price, and then incorporate the best graphics possible into the new CPUs. That would allow Intel to enter the high-end market and essentially eliminate the low end of the graphics card market through high-quality on-board CPU graphics. Sprinkle in a few cards built from the partially defective chips to create the mid-range market (this is a new process, so the number of defective chips will be high at first and go down as the process matures). By the time they are ready for their next high-end release, the previous high-end card is now the mid-range card and should have decent yields, solid drivers, and a known reputation. They would have a solid footing in all the markets by their second high-end graphics card release.
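
The yield part of that strategy can be made concrete with the classic Poisson die-yield model, Y = e^(−A·D₀), where A is the die area and D₀ is the defect density. The numbers below are purely illustrative assumptions, not Intel data:

```python
import math

# Poisson die-yield model: Y = exp(-area * defect_density).
def die_yield(area_mm2: float, defects_per_cm2: float) -> float:
    area_cm2 = area_mm2 / 100
    return math.exp(-area_cm2 * defects_per_cm2)

# Assumed 500 mm^2 high-end die:
print(die_yield(500, 0.5))  # ~0.08 on an immature process
print(die_yield(500, 0.1))  # ~0.61 once defect density drops
```

On the immature process most dies come back with defects, which is exactly the supply of salvage parts you would bin into a mid-range card.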
 
I just want to see top-end GPUs back down to $500 like my 9700 Pro and 9800 XT were.
But what is considered a "top-end" card? The highest-end cards today arguably target a more niche market than those cards did when they were new. The 9800 XT had a 60-watt TDP, which is what you might expect from low-end parts now, and those cards typically had tiny little heatsinks with tiny fans. As for the graphics chip itself, the processors in those cards were just 218 square millimeters, similar in size to what you can now find in a GTX 1650 or 1060.

You should see a trend here. Today's "equivalents" to those cards actually cost far less. In fact, if you account for inflation, $500 when those cards were new works out to nearly $700 today. If you have a look back at their reviews, they too were described as "expensive".
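
The arithmetic is simple; here is a sketch using approximate US CPI-U annual averages (the index values are my assumption, so look up the exact ones if it matters):

```python
# Adjust a 2003 price into 2019 dollars using a CPI ratio.
# Index values are approximate US CPI-U annual averages (assumed).
CPI_2003 = 184.0
CPI_2019 = 255.7

def to_2019_dollars(price_2003: float) -> float:
    return price_2003 * CPI_2019 / CPI_2003

print(round(to_2019_dollars(500)))  # ~695, i.e. the "nearly $700" above
```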

If we look at what today's $200 graphics cards have to offer, they can maintain upward of 60fps at high settings in almost all titles at the most common resolution currently in use, 1080p. Or you can get cards in the $300-$400 range that will handle 1440p rather well. Today's "top-end" cards are only necessary to push 4K resolution smoothly, but native 4K is arguably not a practical resolution for gaming yet, and the benefits of targeting that resolution over 1440p seem questionable at this point. For those willing to pay substantially more for a bit sharper image, the option is there, but those highest-end cards are hardly a necessity.
 

bigdragon

GPU prices have gotten wildly out of control. I don't like buying GPUs that cost more than gaming consoles. There isn't enough good* software to really push gaming GPUs today.

I would like to see Intel shake up the market. Undercutting AMD and Nvidia significantly would make me pay attention. However, we all know that Intel is gonna Intel: they'll price their product slightly higher than the competition at the same performance benchmark, and the Intel graphics cards will only work with Intel CPUs and chipsets. If Intel is still making graphics cards in 4 to 5 years, then maybe I'll pay attention. Until then, Intel doesn't make any inroads simply by reaching price and performance parity.

*Good is defined as not crippled by microtransactions and progression systems.
 
This isn't really a true statement, though. Intel has been cheaper than their competition before: Core 2 launched outperforming the Athlon 64 X2 while costing less for the performance. The closest you could get to a quad core from AMD at the time was QuadFX, which required a dual-socket board that cost way more than a normal board, a much beefier PSU, two dual-core FX CPUs, and of course two decent coolers.

It all depends on the market.