News Radeon RX 5500 Crushes the GeForce GTX 1650, According to AMD's Numbers

King_V

Illustrious
Ambassador
There's a bit that says: "There's a good chance that AMD created the document before the GeForce GTX 1650 Super's release." but the link is to the 1660 Super review.

The following sentence is: "The GeForce GTX 1650 Super will arrive with significant improvements, including 384 more CUDA cores and the upgrade from 8 Gbps GDDR5 memory to 12 Gbps GDDR6 memory."

The 1650 Super is meant to launch in late November, if I'm not mistaken. That first sentence I quoted is therefore confusing/misleading. That paragraph needs to be looked over and edited.
 

hannibal

Distinguished
So the lower mid-range is coming, maybe at some undefined point in the future.
Heh... but it's good to get the new architecture into that segment too.
It will be interesting to see the power balance after these releases.
 

srimasis

Distinguished
Who gives a crap about the GTX 1650? The real comparison for the RX 5500 should be with the RX 570. As of now, an RX 570 costs about $120. So if AMD wants a winning GPU, the RX 5500 should cost about $150 and deliver better performance than the RX 570.
 

hannibal

Distinguished
Who gives a crap about the GTX 1650? The real comparison for the RX 5500 should be with the RX 570. As of now, an RX 570 costs about $120. So if AMD wants a winning GPU, the RX 5500 should cost about $150 and deliver better performance than the RX 570.

Eventually... I am sure the 570 and 580 will remain cheaper than the 5500 until almost all of the 570 and 580 stock has been sold. Anything else would be a bad economic choice...
Until then, the 570 and 580 will remain the superior option.
 

King_V

Illustrious
Ambassador
Who gives a crap about the GTX 1650? The real comparison for the RX 5500 should be with the RX 570. As of now, an RX 570 costs about $120. So if AMD wants a winning GPU, the RX 5500 should cost about $150 and deliver better performance than the RX 570.

I would agree - though the 1650 Super, depending on its performance, could throw this equation completely off.

Eventually... I am sure the 570 and 580 will remain cheaper than the 5500 until almost all of the 570 and 580 stock has been sold. Anything else would be a bad economic choice...
Until then, the 570 and 580 will remain the superior option.

Agreed - the 5500 will be released, but might be in an odd position. If it performs like the 570 or the 580 (or 590?), it would be good, but if it's priced competitively with the 5x0 cards (and I imagine it would be, to compete against Nvidia in that tier), then the Polaris cards would naturally have their prices driven downward until supplies dry up. Might be a while, though, since it seems like there's still a plentiful supply of Polaris cards.
 

revengeyo

Prominent
Nov 10, 2019
AMD fails again.
They compare it to the GTX 1650 😂
I bet the RX 5500 can barely keep up with a reference RX 470. Tech from 2016 😂

 

Olle P

Distinguished
Apr 7, 2010
Who gives a crap about the GTX 1650? The real comparison for the RX 5500 should be with the RX 570.
Assuming the RX 570 gets discontinued, it's just about uninteresting.
As I wrote in another thread, the GTX 1650 is very interesting to those who don't have a PCIe power connector. The RX 5500 just seems too power hungry!
I was hoping for a better power-to-performance ratio, but it will still be interesting to see how well it actually performs once available.
 

InvalidError

Titan
Moderator
Agreed. 4GB is barely enough for 1080p gaming. I can think of plenty of titles that love more VRAM.
Increasing resolution while leaving everything else the same has relatively little impact on VRAM usage. The bulk of it is consumed by more, higher-resolution textures and models - newer games put more stuff on-screen in general, that extra stuff gets incrementally more detailed and more varied even at lower detail levels, and all of it has to be stored somewhere.
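Some rough back-of-the-envelope math makes the point (ballpark assumptions, not measurements from any particular game):

Code:
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    # Approximate memory for the color buffers (e.g. triple buffering).
    return width * height * bytes_per_pixel * buffers / 2**20

def texture_mb(size, bytes_per_texel=4, mip_overhead=4 / 3):
    # Approximate memory for one square texture with a full mip chain.
    return size * size * bytes_per_texel * mip_overhead / 2**20

print(f"1080p framebuffers:    {framebuffer_mb(1920, 1080):6.1f} MB")  # ~23.7 MB
print(f"4K framebuffers:       {framebuffer_mb(3840, 2160):6.1f} MB")  # ~94.9 MB
print(f"one 4096x4096 texture: {texture_mb(4096):6.1f} MB")            # ~85.3 MB
# Going from 1080p to 4K adds roughly 70 MB of buffers, while a scene
# with a few hundred large textures eats gigabytes: assets dominate.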

At least there are no 2-3GB models... yet.
 
  • Like
Reactions: alextheblue

King_V

Illustrious
Ambassador
AMD fails again.
They compare it to the GTX 1650 😂
I bet the RX 5500 can barely keep up with a reference RX 470. Tech from 2016 😂

Did you create this account 3 minutes before your post just for the purpose of making unsubstantiated, speculative statements understating what the performance will be? If not, then I'm dying to see your citations for these claims.
 
Increasing resolution while leaving everything else the same has relatively little impact on VRAM usage. The bulk of it is consumed by more, higher-resolution textures and models - newer games put more stuff on-screen in general, that extra stuff gets incrementally more detailed and more varied even at lower detail levels, and all of it has to be stored somewhere.

At least there are no 2-3GB models... yet.

I understand that. My point is AMD is touting this as the best 1080p gaming card. I feel the best would sport at least 8GB of VRAM to account for all the extra "stuff" developers like to throw in.

It also depends on how the developer codes the game. TF2, for example, and probably most any Source engine game, has the ability to not render what's not on screen even if it exists, which would save VRAM. GTA IV, however, was a beast, and 4GB was easily recommended - well, at least more than 2GB. I wouldn't run GTA V on anything less than 8GB personally, since there is always so much going on.
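For illustration, a toy sketch of the kind of "don't render what's off-screen" culling I mean (hypothetical code, not how Source actually implements it):

Code:
import math

# Toy visibility cull: skip draw calls for objects outside the camera's
# field of view (simplified here to a 2D angle check).
def in_view(camera_pos, camera_deg, fov_deg, obj_pos):
    dx, dy = obj_pos[0] - camera_pos[0], obj_pos[1] - camera_pos[1]
    angle = math.degrees(math.atan2(dy, dx)) - camera_deg
    angle = (angle + 180) % 360 - 180  # wrap to [-180, 180)
    return abs(angle) <= fov_deg / 2

scene = {"house": (10, 0), "tree": (0, 10), "rock": (-5, -5)}
visible = [name for name, pos in scene.items() if in_view((0, 0), 0, 90, pos)]
print(visible)  # ['house'] - the tree and rock are culled, skipping their draw calls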
 

InvalidError

Titan
Moderator
I understand that. My point is AMD is touting this as the best 1080p gaming card. I feel the best would sport at least 8GB of VRAM to account for all the extra "stuff" developers like to throw in.
I'd be happy with 6GB; too bad neither AMD nor Nvidia wants to put a 192-bit memory controller in the budget mainstream range.

Not rendering off-screen stuff reduces the rendering workload but not necessarily VRAM: you don't want to wait on system RAM to reload assets whenever they temporarily go off-screen, so you want to leave assets in VRAM for as long as possible while there is still a chance they'll be needed again, unless the VRAM would be better used for pre-loading other assets that are more likely to be needed soon. Of course, the more spare VRAM there is to work with, the more opportunities there are to not compromise and do both.
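Something like this toy eviction policy, in spirit (a simplified sketch with made-up names; real engines use far more sophisticated heuristics):

Code:
from collections import OrderedDict

class VramAssetCache:
    # Toy LRU policy: assets stay resident after going off-screen and are
    # only evicted when space is needed for new loads.
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.assets = OrderedDict()  # asset_id -> size_mb, in LRU order

    def request(self, asset_id, size_mb):
        if asset_id in self.assets:
            self.assets.move_to_end(asset_id)  # hit: mark as recently used
            return "hit"
        # Evict least-recently-used assets until the new one fits.
        while self.used + size_mb > self.capacity and self.assets:
            _, evicted_size = self.assets.popitem(last=False)
            self.used -= evicted_size
        self.assets[asset_id] = size_mb
        self.used += size_mb
        return "loaded from system RAM"  # the slow path you want to avoid

cache = VramAssetCache(capacity_mb=4096)
print(cache.request("rock_texture", 85))  # loaded from system RAM
print(cache.request("rock_texture", 85))  # hit - it stayed resident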
 

hannibal

Distinguished
Most likely there will be 4GB and 8GB versions. The price difference is there, and some people really care about that $20-30 price difference.
Let's see when this really comes to market. My prediction is after Black Friday, so they can sell as many 580s as possible during the Black Friday sales. If not, then after Christmas.
 

alextheblue

Distinguished
It also depends on how the developer codes the game. TF2, for example, and probably most any Source engine game, has the ability to not render what's not on screen even if it exists, which would save VRAM.
Having lots of VRAM still means you can pack more eye candy in, so yeah, I agree with the bulk of what you're saying. That being said, culling exists in various forms in almost every engine, and has been in use for many moons. Not all implementations are equal, but (as Invalid already said) it's not going to save much VRAM. Texture streaming, well, that's a different story.
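Texture streaming being, roughly, the idea of only keeping the mip levels a surface's on-screen footprint actually needs resident in VRAM - a hypothetical sketch, not any particular engine's code:

Code:
import math

# Toy mip selection: pick the coarsest mip whose resolution still matches
# the object's on-screen footprint; finer mips can stay in system RAM.
def needed_mip(texture_size, screen_pixels_covered):
    max_mip = int(math.log2(texture_size))
    if screen_pixels_covered <= 0:
        return max_mip  # off-screen: only the tiniest mip stays resident
    ratio = texture_size / math.sqrt(screen_pixels_covered)
    return max(0, min(int(math.log2(max(ratio, 1.0))), max_mip))

# A distant object covering 64x64 pixels only needs mip 6 of a 4096x4096
# texture, so nearly all of its ~85 MB full-size footprint can be skipped.
print(needed_mip(4096, 64 * 64))  # -> 6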
 
Sep 13, 2019
Unsure where they came from with "best" for 1080p; to my understanding the 1660 Ti is actually the best for 1080p gaming, so it would need the same or better performance than the 1660 Ti to actually be called the best 1080p card.

IDK man, I was really hoping for the 5500 to be good enough to upgrade from a 1050 Ti; now it doesn't seem to be. I think I will just have to wait and see for myself when benchmarks hit YouTube; until then I deem this card pointless.

It makes me wonder what GPU will be in the newest consoles. Betting hard on the 5500 actually being inside the new consoles.
 

InvalidError

Titan
Moderator
It makes me wonder what GPU will be in the newest consoles. Betting hard on the 5500 actually being inside the new consoles.
Based on the last couple of XB/PS consoles, console GPUs are typically at the higher end of mainstream PC graphics, and the coming ones are also supposed to support ray-tracing to some extent, so we're likely looking at something in the neighborhood of an RTX 2070. The Radeon 5500 will be nowhere close.

If the 5500 is supposed to replace the 570-590, it should be faster than the 570-590, which are already considerably faster than a 1050 Ti. With AMD switching product branding to break price points, though, I wouldn't be surprised if the 5500 had much worse performance per dollar than the RX 570/580, which you can sometimes get new for $120-150 including one or two free games... effectively a free GPU if you were going to buy both games at full retail price anyway.