Rumble in the budget graphics card segment.
Radeon RX 5500 Crushes the GeForce GTX 1650, According to AMD's Numbers : Read more
Who gives a crap about the GTX 1650? The real comparison for the RX 5500 should be with the RX 570. As of now, an RX 570 costs about $120. So if AMD wants a winning GPU, the RX 5500 should cost about $150 and give better performance than the RX 570.
Eventually... I am sure the 570 and 580 will remain cheaper than the 5500 until almost all 570s and 580s have been sold. Anything else would be a bad economic choice...
Until then, the 570 and 580 will remain the superior option.
"Who gives a crap about the GTX 1650? The real comparison for the RX 5500 should be with the RX 570."

Assuming the RX 570 gets discontinued, it's just about uninteresting.
If the 5500 is supposed to replace the 570/580/590, then I hope we'll see 8GB variants. 4GB is getting a little tight.
"Increasing resolution while leaving everything else the same has relatively little impact on VRAM usage. The bulk of it is consumed by having more higher resolution textures and models - newer games put more stuff on-screen in general, that extra stuff gets incrementally more detailed and more varied even at lower detail levels, and all of it has to be stored somewhere."

Agreed. 4GB is barely enough for 1080p gaming. I can think of plenty of titles that love more VRAM.
AMD fails again.
They compare it to the GTX 1650 😂
I bet the RX 5500 can barely keep up with a reference RX 470. Tech from 2016 😂
Increasing resolution while leaving everything else the same has relatively little impact on VRAM usage. The bulk of it is consumed by having more higher resolution textures and models - newer games put more stuff on-screen in general, that extra stuff gets incrementally more detailed and more varied even at lower detail levels, and all of it has to be stored somewhere.
At least there are no 2-3GB models... yet.
"I'd be happy with 6GB, too bad neither AMD nor Nvidia wants to put a 192-bit memory controller in the budget mainstream range."

I understand that. My point is AMD is touting this as the best 1080p gaming card. I feel the best would be sporting 8GB of VRAM minimum to account for all the extra "stuff" developers like to throw in.
"It does also depend on how the developer codes the game. TF2 for example, probably most any Source engine game, has the ability to not render what's not on screen even if it exists, which would save VRAM."

Having lots of VRAM still means you can pack more eye candy in, so yeah, I agree with the bulk of what you're saying. That being said, culling exists in various forms in almost every engine and has been in use for many moons. Not all implementations are equal, but (as Invalid already said) that's not going to save much VRAM. Texture streaming, well, that's a different story.
"It makes me wonder what GPU will be in the newest consoles. Betting hard on the 5500 actually being inside the new consoles."

Based on the last couple of XB/PS consoles, console GPUs are typically at the higher end of mainstream PC graphics, and the coming ones are also supposed to support ray tracing to some extent, so we're likely looking at something in the neighborhood of an RTX 2070. The Radeon 5500 will be nowhere close.