Geef
Bah, no need for all those new techs! Let's go back to colors for textures, like crayons. 🖍️ An 8-pack for low-end cards and a 24-pack for high-end cards!
> AMD isn't implementing Tensor or RT cores in their GPUs,

AMD has had hardware RT acceleration since last gen (RDNA2/RX 6000), and added matrix-multiply acceleration (analogous to tensor cores) in their latest generation.
> Even if you don't buy AMD products, you can't deny that AMD is quite a beneficial force. Do you remember how many years Intel put out new CPUs with just 5-10% performance improvements? Or how Nvidia canceled the RTX 4090 Ti this generation because AMD didn't put out a stronger card? AMD performing well is good both for the prices we pay and for the performance increases we get. Any company in too dominant a position has no incentive to push out big upgrades.
> So I'm personally very happy to have AMD here, and I hope they're successful in pushing new advancements, because it'll mean better and cheaper products for us all.

Nothing is worth watching the cult dumb down the internet day after day, year after year. Let AMD crash & burn while taking the cult with them.
> No, what don't you understand about Nvidia being a monopoly? With AMD gone a 50-tier GPU will be $500 and a 90-tier will be $5000. No, thank you. Without AMD Nvidia's market share would increase to 90%+.

Good.
I guess there is some truth to DLSS being here to stay. In my mind, there are two main reasons:
1. Advanced nodes are hitting a wall where it is becoming increasingly difficult to shrink transistors further. Hence, it will be difficult for GPUs to keep "brute forcing" performance by spamming more cores. The jump from Ampere to Ada was significant because Nvidia went from a cheap, mature Samsung node to TSMC's cutting-edge node. Going forward, I don't think we will see such a big jump in CUDA core count, even on the flagship GPU. Therefore, performance increases will mostly be driven either by higher power requirements to push the hardware further, or by software like DLSS.
2. As a side effect of the transistor-shrink difficulties in point 1, costs will go up. To protect margins, the workaround is, again, software, which is cheaper.
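As a quick back-of-the-envelope on point 1: upscalers like DLSS buy performance mainly by shading fewer internal pixels. A minimal sketch of that arithmetic — the per-axis render scales below are the commonly cited DLSS 2 mode values, used here as assumptions rather than an official spec:

```python
# How many pixels does an upscaler actually shade per frame?
# Per-axis internal render scales for each mode (assumed values,
# matching the commonly cited DLSS 2 presets; not an NVIDIA spec table).
OUTPUT = (3840, 2160)  # 4K target resolution

MODES = {
    "Quality":     2 / 3,   # ~1440p internal at 4K output
    "Balanced":    0.58,
    "Performance": 1 / 2,   # ~1080p internal
}

def internal_pixels(scale, output=OUTPUT):
    """Pixels shaded per frame at the given per-axis render scale."""
    w, h = output
    return round(w * scale) * round(h * scale)

native = OUTPUT[0] * OUTPUT[1]
for mode, scale in MODES.items():
    shaded = internal_pixels(scale)
    print(f"{mode:12s} shades {shaded / native:.0%} of native pixels "
          f"(~{native / shaded:.1f}x fewer)")
```

At 4K, "Performance" mode shades a quarter of the native pixel count — which is why reconstruction can claw back performance that silicon scaling no longer delivers, even though shading cost is not perfectly linear in pixel count.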
Fortunately for me, I've pretty much quit AAA PC gaming. PC gaming is losing its appeal because the hardware is getting costly and the games are getting boring. It seems like all game developers focus on is graphics. While the games look good, gameplay and storylines are mostly lacking. I feel that's also why we keep seeing remasters: they save developers time and money in getting a game to market.
> Even if you don't buy AMD products, you can't deny that AMD is quite a beneficial force. Do you remember how many years Intel put out new CPUs with just 5-10% performance improvements? Or how Nvidia canceled the RTX 4090 Ti this generation because AMD didn't put out a stronger card? AMD performing well is good both for the prices we pay and for the performance increases we get. Any company in too dominant a position has no incentive to push out big upgrades.
> So I'm personally very happy to have AMD here, and I hope they're successful in pushing new advancements, because it'll mean better and cheaper products for us all.

I 100% agree with this statement. However, that doesn't mean I'm an AMD fanboy; in fact, quite the opposite: I used to be an Nvidia fanboy until the 40 series changed that. So thanks, Nvidia, for opening my eyes and allowing me to choose AMD.
> DLSS needs to be implemented to improve graphics fidelity and circumvent the otherwise low gen-on-gen performance improvements seen in today's graphics hardware.

Thank you for admitting the 4000 series is crap.
> No thanks, nVidia.
> I had a longer rant, but you know what... it boils down to: I do not want a future where nVidia is the sole provider of technologies that make games look good. If nVidia wants a pass, they need to make this not just accessible to other GPU vendors, but maybe include it as standardised API access across all engines. The industry does not need another "Glide 3D" moment.
> Been there, done that, and hated it.
> Regards.

In that case, open your own wallet to help fund the research, development, and highly knowledgeable manpower needed to create a new graphics technology.
> I don't mind paying higher prices if it means AMD going out of business within the next ten years or less. It would be due justice for the AMD cultists who've dumbed down the internet tech sites for the past 15+ years with their cult-like behavior.

Why is Nvidia not dominant in consoles? For the OG Xbox they never gave Microsoft a discount, so Microsoft went with AMD for the Xbox 360 just three years later; same story from PS3 to PS4. The Switch is expensive and underpowered, and so will be the Switch 2. I don't want Nvidia dictating how I can play (proprietary technologies) and how much I should pay for it. This AI and Tensor hardware was made for the enterprise, with gamers a second thought, yet gamers have to pay for its development; it should be optional, not mandatory.
> People really expect rasterization to keep on reigning when development is at its worst and few games can work correctly on day 1?

The suits figured out that they can launch an MVP (minimum viable product) and receive punishment from the masses equivalent to a tap on the wrist. The money walks right on in regardless... why should they be incentivized to do better when they are so easily forgiven?
> We all want better and more realistic picture quality and fast, responsive frametime.

This major push for realism is something I've yet to grasp since the 20 series came out.
> DLSS is a great tool that will help extend the lifespan of GPUs by years...

Nope, the corpos won't allow that hit to their profits. If something has great potential, it's more likely to be abused in some manner.
> Once the reconstruction becomes even more realistic, and at a higher fps, than the "real deal" (assuming you mean rasterisation), how will you tell them apart? With a good enough model, a scene can be solid and accurate.

Prove to me that DLSS can outdo downsampling from a higher resolution in terms of image quality.
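Claims like "DLSS vs. downsampling" can at least be made measurable against a shared reference frame. A minimal sketch using PSNR, assuming numpy is available and using synthetic noise in place of real captures (purely illustrative; perceptual metrics like SSIM track human judgement better):

```python
import numpy as np

def psnr(reference, candidate, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-shape images.

    Higher PSNR = closer to the reference. A real comparison would use
    a supersampled ground-truth frame plus the two upscaled candidates.
    """
    ref = reference.astype(np.float64)
    cand = candidate.astype(np.float64)
    mse = np.mean((ref - cand) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * np.log10(peak ** 2 / mse)

# Toy stand-ins for captured frames: the less-corrupted candidate
# scores higher against the "ground truth".
rng = np.random.default_rng(0)
truth = rng.integers(0, 256, size=(64, 64, 3))
noisy = np.clip(truth + rng.normal(0, 8, truth.shape), 0, 255)
noisier = np.clip(truth + rng.normal(0, 20, truth.shape), 0, 255)
print(f"noisy: {psnr(truth, noisy):.1f} dB, "
      f"noisier: {psnr(truth, noisier):.1f} dB")
```

This only settles fidelity against a reference, not which image people prefer; that part of the argument stays subjective.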
> Larger chips with multichip modules ...

Nvidia and AMD are already doing this: 650-watt 4090s, and the new Radeon GPUs / Ryzen CPUs. Even with that, AMD and Nvidia are still pushing their DLSS/FSR technologies.
> Nothing is worth watching the cult dumb down the internet day after day, year after year. Let AMD crash & burn while taking the cult with them.

Just wait until Jensen abandons the GPU market because you expect too much for too little $$$.
> No, what don't you understand about Nvidia being a monopoly? With AMD gone a 50-tier GPU will be $500 and a 90-tier will be $5000. No, thank you. Without AMD Nvidia's market share would increase to 90%+.

You haven't been paying attention for the last 10 years if you think that's how Nvidia operates. Nvidia doesn't need competition to continue innovating. AMD hasn't been competitive with Nvidia at the high end outside of RDNA2 over the past decade. How many years did it take them to beat a 1080 Ti? That didn't stop Nvidia from innovating. During AMD's awful Polaris/Vega era, Nvidia developed ray tracing cores and tensor cores along with DLSS.
> They have spent a ton of time and money on research in order to create a technology that puts them way ahead of the competition. Why would they ever want to throw away such a strategic advantage? Would you do it if you were them?

Nvidia has to play it delicately, or they will strangle their competitors and then have to deal with anti-trust law.