News Low-End AMD RX 6300 Surfaces on Second-Hand Marketplace

Hmm. I think with better encoding options, this would be an amazing GPU for many, many use cases. Without it, though, it feels kinda superfluous except for fringe cases, especially considering that APUs exist and every non-F Intel CPU has integrated graphics, too, which will be more than enough for everyday office tasks.
This comment seems weird to me, as I've never used a dGPU for encoding, nor do I think many people have.

As for needing a dGPU for desktop graphics, keep in mind that Xeon W 2000 and 3000 don't have iGPUs, nor do Threadrippers. And AM4 systems are a relative bargain (like this card), but you still need a dGPU for the non-APU processors.

A good competitor to this is the nearly 10 year old Quadro K620, which does the same job for most and runs about half the price used.
That's a garbage DDR3 card. Not GDDR3, but regular DDR3!

Given its age, does it even support 4k60? I'm pretty sure it won't support HDMI 2.0 or DisplayPort 1.4, either.

Utter trash, IMO. Its only saving grace is that it's old enough to have decent support in the Linux Nouveau driver. Even so, the RX 6300 is much better supported in Linux.

For your sake, I'm going to pretend you didn't mention it.

There is also the A310, which is hard to come by in the US; it has a higher power draw, but its display capabilities are top-end for any GPU.
It won't be half-height, though. That, TDP, and price are really where the RX 6300 basically stands alone.
 
I just noticed there are some RX 6400 cards that are half height, half length, and single slot. Not cheap, but all with 4 GB, and TDP seems to range from 43 W to 55 W.


This one is half-height, half-length, and bus-powered, but 2 slots (also, 2 fans):
 
That's a garbage DDR3 card. Not GDDR3, but regular DDR3!

Given its age, does it even support 4k60? I'm pretty sure it won't support HDMI 2.0 or DisplayPort 1.4, either.

Utter trash, IMO. Its only saving grace is that it's old enough to have decent support in the Linux Nouveau driver. Even so, the RX 6300 is much better supported in Linux.

For your sake, I'm going to pretend you didn't mention it.
The K620 is the semi-Maxwell one, like the GTX 750 Ti was, so it doesn't have the compatibility issues that older Kepler has with games, for what that's worth. It only has DP 1.2, so it tops out at 4K60 with 10-bit 4:4:4 chroma, just one step up from HDMI 2.0. But with an active DP 1.2 to HDMI 2.0 adapter, it can do HDMI 2.0 output just fine.
I used to have an RX 550, an HP 2 GB version. Compared to the K620, its video display quality at high resolution, power consumption, drivers, and overall smoothness were utter trash.

I'm surprised you haven't heard about DP 1.2.
 
Even a potato can do 4k60, albeit in YUV 4:2:0 output with all of the garbled details chroma subsampling entails, such as anti-aliased fonts looking smudged.
4:2:0 chroma subsampling wasn't officially specified until HDMI 2.0. That's also when HDMI had enough bandwidth to do 4k60. Before that, even 4:2:2 chroma subsampling wouldn't save enough bandwidth to do 4k60 - you'd be limited to 4k30.

As for DisplayPort, it didn't add support for 4:2:0 until 1.3. But DisplayPort 1.2 has enough bandwidth that it can do 4k60 @ 8-bpc.
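If anyone wants to sanity-check those bandwidth claims, here's a quick back-of-envelope sketch in Python. The 594 MHz pixel clock (standard 4K60 timing, blanking included) and the post-coding link budgets are assumed round figures, not exact spec quotes:

# Rough 4K60 link-bandwidth check. Assumed figures: 594 MHz pixel clock and
# effective data rates after 8b/10b coding. A sketch, not spec-exact math.
PIXEL_CLOCK_HZ = 594e6

BITS_PER_PIXEL = {   # 8 bits per component
    "4:4:4": 24,
    "4:2:2": 16,
    "4:2:0": 12,     # only officially specified from HDMI 2.0 / DP 1.3 onward
}

LINK_BUDGET_GBPS = {
    "HDMI 1.4": 8.16,   # 10.2 Gbps TMDS
    "HDMI 2.0": 14.4,   # 18 Gbps TMDS
    "DP 1.2": 17.28,    # 21.6 Gbps HBR2
}

for fmt, bpp in BITS_PER_PIXEL.items():
    needed_gbps = PIXEL_CLOCK_HZ * bpp / 1e9
    fits = [link for link, budget in LINK_BUDGET_GBPS.items() if needed_gbps <= budget]
    print(f"4K60 {fmt}: {needed_gbps:.1f} Gbps -> fits: {', '.join(fits)}")

# Output: 4:4:4 needs ~14.3 Gbps (squeaks onto HDMI 2.0, comfortable on DP 1.2),
# 4:2:2 needs ~9.5 Gbps (still too much for HDMI 1.4), and 4:2:0 needs ~7.1 Gbps,
# which would fit HDMI 1.4's bandwidth but isn't part of its spec.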

The problem with potatoes is that a lot of them only have HDMI. For instance, the Raspberry Pi couldn't do any form of 4k until gen4.
 
The K620 is the semi-Maxwell one, like the GTX 750 Ti was,
Oh, if it was Maxwell, then forget what I said about Nouveau compatibility. I guess it's complete rubbish, then.

I used to have an RX 550, an HP 2 GB version. Compared to the K620, its video display quality at high resolution, power consumption, drivers, and overall smoothness were utter trash.
LOL at "video display quality". Digital is digital. If you used the same mode, then you get the same output.

Anyway, nobody in this thread is proposing that people buy an RX 550. The subject is the RX 6300.
 
Actually cheap gaming GPU!
The gamers are saved by this product!

🤣

In reality, these are also needed in the market, even if they aren't the fastest of the lot.
I'm surprised, though. I was expecting these to be more expensive based on material and shipping costs alone... I'm quite sure they're being sold at a discount right now.
 
A good competitor to this is the nearly 10 year old Quadro K620, which does the same job for most and runs about half the price used.
There is also the A310, which is hard to come by in the US; it has a higher power draw, but its display capabilities are top-end for any GPU.
The easily obtained A380 still needs external power though.

None of these are gaming GPUs, btw.

I'm going to assume that an RX 6300 is faster than Rembrandt's 680M graphics and the A310, since the A380 is around the RX 6400 (slower at launch). So the RX 6300 could deliver an... experience spanning 720p/900p/1080p gaming, falling flat whenever VRAM is in demand.

A310/A380 would be better for HTPC. I don't see an Arc A310 at Newegg or Amazon, so thinking about it is an academic exercise, much like the non-existent RX 6300. RX 6500 XT hit $100 in November, so hopefully the remaining A380 cards are dumped for less than that eventually.

Intel should put their top mobile graphics in a desktop APU (it's 96 EUs mobile vs 32 EUs desktop this generation). AMD has neglected desktop APUs (5700G is the best to date) and Intel can absolutely flood the market with better iGPUs if it wants to.
 
RX 6500 XT hit $100 in November,
Wow! Well, they're not going for that now! Newegg's best price is $150. Are you sure you weren't looking at a used, refurb, or open box?

so hopefully the remaining A380 cards are dumped for less than that eventually.
IMO, it's unrealistic to expect the A380 to drop much below the RX 6500 XT. It's got 50% more memory and a 47% larger die that's made on the same TSMC node. That's going to significantly increase its price floor, and I'm not sure Intel is so desperate to move existing Alchemist inventory that they're willing to sell them at a negative margin.
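For what it's worth, the die-size gap is easy to ballpark. Assuming the commonly cited areas of roughly 107 mm² for Navi 24 and 157 mm² for ACM-G11 (both TSMC N6):

# Rough die-area comparison; the areas are assumed figures, not official numbers.
navi24_mm2 = 107    # RX 6500 XT / 6400 die
acm_g11_mm2 = 157   # Arc A380 die
print(f"A380 die is ~{(acm_g11_mm2 / navi24_mm2 - 1) * 100:.0f}% larger")  # ~47%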

Intel should put their top mobile graphics in a desktop APU (it's 96 EUs mobile vs 32 EUs desktop this generation).
Meteor Lake is rumored to upgrade the GPU to a TSMC N5 die-shrink of Alchemist @ 128 EU, which should be a nice upgrade. However, I'm not clear why you think they should upgrade their desktop iGPU. 32 EU is enough for desktop graphics, while even 128 EU isn't going to deliver a compelling gaming experience and yet adds a non-trivial cost. So, what's the use case you have in mind?
 
Meteor Lake is rumored to upgrade the GPU to a TSMC N5 die-shrink of Alchemist @ 128 EU, which should be a nice upgrade. However, I'm not clear why you think they should upgrade their desktop iGPU. 32 EU is enough for desktop graphics, while even 128 EU isn't going to deliver a compelling gaming experience and yet adds a non-trivial cost. So, what's the use case you have in mind?

Direct competition to AMD's desktop APUs.

Meteor Lake 128 EU (Alchemist) could be similar in performance to Rembrandt's 680M graphics, if not Phoenix's 780M. AMD could put one of those on the AM5 socket within a year or so (there is a leak suggesting a new desktop APU around late 2023/early 2024). Are you saying there is no use case for an Intel desktop "APU" when the 5700G is a thing?

I'm not asking for every desktop model to get a large iGPU. Just 1-2 models would be enough, kind of like how 5700G and 5600G were the only Cezanne chips to come to the DIY market (5300G is OEM only).
 
Are you saying there is no use case for an Intel desktop "APU" when the 5700G is a thing?
Well, let's look at the 5700G. It came about in an era when AMD needed a more entry-level spec for desktops, hence the iGPU. The most natural & cheapest thing for them to do was reuse their monolithic laptop APUs for the AM4 platform. I think that explains why it has the iGPU it does, not that they necessarily had a long think and decided such an iGPU was optimal for the entry-level socketed market.

In an era of chiplet-based laptop processors, it's a lot easier and cheaper to swap out the GPU tile for a smaller one, which is something Intel has talked about. If AMD moves to chiplets for their laptop processors, we could well see them switch the GPU die, if/when repurposing it for the entry-level desktop.
 
Meteor Lake is rumored to upgrade the GPU to a TSMC N5 die-shrink of Alchemist @ 128 EU, which should be a nice upgrade. However, I'm not clear why you think they should upgrade their desktop iGPU. 32 EU is enough for desktop graphics, while even 128 EU isn't going to deliver a compelling gaming experience and yet adds a non-trivial cost. So, what's the use case you have in mind?
128 EUs is basically an A380. If Intel sorts out its driver performance issues and whatever hardware quirks may be slowing down performance improvements, that would be a decent amount of performance for people who want more than a tiny IGP but don't want to pay $150+ for a comparable dGPU. In a tile/chiplet format, the G-suffix big-IGP SKU swap shouldn't cost much more than $20 extra to manufacture, and if you don't want it, simply buy the non-G small-IGP or -F no-IGP variants.
 
128 EUs is basically an A380.
Except with shared DDR5 @ 128-bit, instead of dedicated GDDR6 @ 96-bit. That's about 1/3rd the memory bandwidth. And, given how forthcoming Intel has been about the floor plan of Meteor Lake, I think it's safe to say there won't be any in-package DRAM.
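Rough numbers, assuming DDR5-4800 in dual channel for the iGPU versus the A380's 15.5 Gbps GDDR6 on a 96-bit bus:

# Peak memory bandwidth sketch. Assumed figures: DDR5-4800 on a 128-bit (dual
# channel) bus vs. 15.5 Gbps GDDR6 on a 96-bit bus.
ddr5_gbs  = 4800e6 * 128 / 8 / 1e9    # ~76.8 GB/s, shared with the CPU
gddr6_gbs = 15.5e9 * 96 / 8 / 1e9     # ~186 GB/s, dedicated to the GPU
print(f"iGPU pool: {ddr5_gbs:.1f} GB/s vs A380: {gddr6_gbs:.0f} GB/s "
      f"(~{ddr5_gbs / gddr6_gbs * 100:.0f}% of it)")
# Peak-to-peak that's ~41%, and the iGPU still has to split its pool with the
# CPU, which is where the "about a third" estimate comes from.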

Anyway, it's a moot point if the rumors are true about desktop Meteor Lake being cancelled.
 
Except with shared DDR5 @ 128-bit, instead of dedicated GDDR6 @ 96-bit. That's about 1/3rd the memory bandwidth. And, given how forthcoming Intel has been about the floor plan of Meteor Lake, I think it's safe to say there won't be any in-package DRAM.
Taking a page from Lakefield and putting a DRAM chip on top of the IGP tile wouldn't change the floor plan by much if at all, though it would add another $15-20 to manufacturing costs.
 
Taking a page from Lakefield and putting a DRAM chip on top of the IGP tile wouldn't change the floor plan by much if at all, though it would add another $15-20 to manufacturing costs.
Yeah, the thought crossed my mind. But, what about the issue of affecting the height of just that tile? Could they shave enough off to compensate for it? I doubt they'd want to add shims to the other tiles - especially the CPU tile.
 
Yeah, the thought crossed my mind. But, what about the issue of affecting the height of just that tile? Could they shave enough off to compensate for it? I doubt they'd want to add shims to the other tiles - especially the CPU tile.
How are AMD's 3D V-Cache CCDs made? They grind the CCD's back to expose the TSVs the cache chip connects to, then shims are added to bring the CCD's height back up so heat from the CPU cores can reach the IHS. The main reason AMD's 3D V-Cache chiplets need shims is that the cache chiplet is only about half the CCD's size.

If necessary, the assembled substrate can be ground down to target thickness too. The exact height of each silicon stack isn't critical as long as there is enough spare silicon to grind everything down to the desired flatness plane. With HBM, the individual chips are ground so thin that they can squeeze an 8-tall stack + base die in the same thickness as a standard chip. Stacking things 2-high shouldn't be much of a challenge, especially if the IGP tile/chiplet is made to match the DRAM chip's footprint or vice-versa so shims can be avoided.