It massively depends on the game you play.
The difference between PCIe 3.0 and 4.0 on a RX 6500 XT can be as low as 10%
Or as high as 46%
And those tests were done with a 5950X, so on your end the difference should be smaller, even in CPU-demanding games.
Note that in Doom Eternal, where the impact is massive, the 6500 XT beats the normal 1650 when running at PCIe 4.0. In Hitman 3, where the difference is barely noticeable, the two are pretty much equal with the 6500 XT on PCIe 3.0, and with 4.0 the 6500 XT sits slightly ahead.
The real cause is that the 6500 XT only has 4 PCIe lanes, which will impact performance. That is only 1/4 of what a "proper" card has. So running it at PCIe 3.0 vs 4.0 is like comparing a x16 card at PCIe 1.1 vs 2.0 in terms of bandwidth (bandwidth roughly doubled with each generation).
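To put rough numbers on that analogy, here is a quick sketch. The per-lane figures below are the commonly cited approximations (real-world throughput is a bit lower due to protocol overhead), and the helper function is just for illustration:

```python
# Approximate usable one-direction bandwidth per PCIe lane, in GB/s.
# These are the commonly quoted ballpark figures, not exact spec numbers.
PER_LANE_GBPS = {"1.1": 0.25, "2.0": 0.5, "3.0": 0.985, "4.0": 1.969}

def bandwidth(gen: str, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# The 6500 XT's x4 link compared to a typical x16 card:
print(f"6500 XT @ PCIe 4.0 x4: {bandwidth('4.0', 4):.1f} GB/s")   # comparable to 2.0 x16
print(f"6500 XT @ PCIe 3.0 x4: {bandwidth('3.0', 4):.1f} GB/s")   # comparable to 1.1 x16
print(f"PCIe 2.0 x16:          {bandwidth('2.0', 16):.1f} GB/s")
print(f"PCIe 1.1 x16:          {bandwidth('1.1', 16):.1f} GB/s")
```

So dropping the 6500 XT from 4.0 to 3.0 cuts its already narrow link roughly in half, which is why games that stream a lot of data over the bus suffer so much.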
Unless you plan to upgrade to a much beefier CPU with that card, it shouldn't make any real difference. What matters more is which games you want to play, how those are impacted, and whether AMD or Nvidia does better in them. Oh, and power consumption and other features, obviously:
- Neither card will support DLSS, but both support FSR
- The GeForce supports G-Sync and FreeSync (under the moniker "G-Sync Compatible"); the Radeon only supports FreeSync
- The Radeons can do FreeSync over HDMI; the GeForce can't do G-Sync over HDMI
- The 1650 will draw about 40W less under load, but 5W more in idle.
- The 6500 XT has better Linux support, with open-source drivers
- The 1650 has Reflex and Ansel; the 6500 XT has Radeon Anti-Lag and Radeon Chill
- For HDMI and DP versions, you would have to check the specific cards you would get.