News: Balking at AMD's Claims, Nvidia Says Smartphones Aren't Ready for Ray Tracing

thGe17

That's a strange point of view. AMD has never been limited by Nvidia; they have no relevant cross-licence agreement or anything else that could bind their hands. So far AMD has always limited itself, for example by developing its current GPU architecture exclusively for the consoles. In the end it is simply a matter of resources, and AMD's resources are much more limited than Nvidia's.

Maybe something good comes out of the cooperation with Samsung and they can deliver a chip that can really support at least a little ray tracing on mobile devices, not just something for marketing purposes on a spec sheet. Nevertheless, the power envelope on mobile devices is quite tight ... let's wait and see ...
 

d0x360

That's a strange point of view. AMD has never been limited by Nvidia; they have no relevant cross-licence agreement or anything else that could bind their hands. So far AMD has always limited itself, for example by developing its current GPU architecture exclusively for the consoles. In the end it is simply a matter of resources, and AMD's resources are much more limited than Nvidia's.

AMD isn't saying phones are ready for ray tracing; they are saying the hardware Samsung is licensing is capable of it. That's different.

Also, AMD has been massively limited by nVidia. Short memories... For years (like 15 years ago) nVidia cheated in benchmarking software. The drivers would detect the benchmark and overclock the GPU, ignoring thermal throttling (to a point), just so they would always win benchmarks. This increased their sales, which hurt ATI, which in turn limited their R&D budget.

Then they got caught, and not long after we saw the birth of GameWorks: code so closed-source that nVidia didn't even let the game devs see it, which meant they couldn't optimize for it. Meanwhile AMD was releasing effects and making them open source so anyone could improve them, which is why AMD's hair solution (TressFX) looks better than nVidia's HairWorks (by a significant margin) and runs on average 12-20% better.

AMD's driver team would also rewrite GameWorks shader code. They would intercept something like HBAO+ and replace it with their own functionally identical but better-running solution. That's where the "fine wine" effect came from. It would usually take a few weeks for the updated drivers, but the games would get a good performance leap and often run better than they did using GameWorks on an nVidia GPU of similar power.
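Conceptually, that kind of substitution is just fingerprint-and-swap. Something like this rough C++ sketch — where the function names, the lookup table, and the FNV-1a hash are all made up for illustration, not actual driver internals:

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

using Bytecode = std::vector<uint8_t>;

// FNV-1a: a simple, stable fingerprint for incoming shader bytecode.
uint64_t HashBytecode(const Bytecode& code) {
    uint64_t h = 1469598103934665603ULL;
    for (uint8_t b : code) {
        h ^= b;
        h *= 1099511628211ULL;
    }
    return h;
}

// Hypothetical table: fingerprints of known effect shaders (e.g. HBAO+)
// mapped to hand-tuned, functionally equivalent replacements.
std::unordered_map<uint64_t, Bytecode> g_replacements;

// Called for every shader a game submits for compilation: if the bytecode
// matches a known effect, the driver compiles the tuned variant instead.
Bytecode SelectShader(const Bytecode& submitted) {
    auto it = g_replacements.find(HashBytecode(submitted));
    return it != g_replacements.end() ? it->second : submitted;
}
```

The game never knows the difference, which is why the output stays identical while the frame rate improves.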

AMD still has a fraction of the R&D budget of both Intel and nVidia, but despite that they are absolutely crushing it in the CPU space and VERY quickly catching up to nVidia in the GPU space. I wouldn't be shocked to see them on par or better when RDNA3 is released.
 

spongiemaster

Some food for thought: what good is ray tracing on mobile phones if a lot of its detractors already balk at the performance loss on high-end hardware? Sure, maybe AMD can slip a single-CU RDNA2 GPU in there, but ray tracing will just be a dancing bear.

Besides, PowerVR beat both AMD and NVIDIA to the punch years ago.
AMD's ray tracing is pretty useless on current-gen consoles and is barely usable on their halo $1000 GPU. How are they going to get it working on a phone?
 
"The data set is quite large, and there will be a time for it. When the time is right we might consider it."

That's why nVidia decided to gimp the Ampere generation with piss-poor VRAM capacities, so you have to upgrade when RT is actually something you can run! THE NERVE!

Also, Jensen is salty he can't put an nVidia GPU on mobile and their only real client is Nintendo. But hey, I'm sure as soon as Nintendo wants to put just a bit more eye candy in their Switch, Jensen will be all over that cake singing praises to RT on mobile then.

And for the record, I don't disagree with the premise: RT on mobile devices, unless they're using low-resolution sampling, is kind of overkill for the hardware's capabilities. I'd love for AMD to prove me wrong and make Jensen eat his words, though.

Cheers!
 
"The data set is quite large, and there will be a time for it. When the time is right we might consider it."

That's why nVidia decided to gimp the Ampere generation with piss-poor VRAM capacities, so you have to upgrade when RT is actually something you can run! THE NERVE!
Given that the quote started with "Ray tracing games are quite large, to be honest.", it just sounds like a non-answer. Or like he's only looking at AAA titles for some reason.
 

Eximo

Nvidia was kind of right not to jump on the HBM bandwagon for consumer cards, particularly HBM1. Their track record for predicting when people would actually want something has been decent so far.

Introducing ray tracing lets developers get on with it, so I don't fault them too much for the RTX 20 cards.

People have been complaining about Nvidia shorting them on VRAM every generation when AMD had the bigger buffer. It didn't really help with longevity then, either: you start piling the features on, and the GPU croaks before the buffer is full.

Also, I have a Tegra phone that still works, so there. It also has Beats audio from before Apple bought them.
 
Nvidia was kind of right not to jump on the HBM bandwagon for consumer cards, particularly HBM1. Their track record for predicting when people would actually want something has been decent so far.

Introducing ray tracing lets developers get on with it, so I don't fault them too much for the RTX 20 cards.

People have been complaining about Nvidia shorting them on VRAM every generation when AMD had the bigger buffer. It didn't really help with longevity then, either: you start piling the features on, and the GPU croaks before the buffer is full.

Also, I have a Tegra phone that still works, so there. It also has Beats audio from before Apple bought them.
While I don't disagree with what you said, I do have a slightly different take on why: nVidia has been right because they usually have a big enough marketing machine to make each feature "a thing". Look at what happened with tessellation, T&L, and Shader Model 1.0, 2.0, and 3.0 support. Now the same is happening with upscaling, where AMD has had a somewhat mediocre way of doing it, but it still had it!

Also, I still remember the huge arguments against Fury when it was introduced with "only" 4GB of VRAM, when the Titan and 980 Ti had 6GB (I think). I don't know what to tell you there; you can just go back and read the huge rant paragraphs from people back then saying how it would not be enough. Ironically, back then, and for the games of the time, I was pretty OK with 4GB, but I am not now: my RX 480 4GB has shown me how cheaping out on capacity can affect the real long-term life of a GPU, even when the GPU itself may not be fast enough anyway, so looking back I should have also complained about it having "only" 4GB. I'm guessing that's also why they jumped to 8GB and 6GB, and have now doubled that as standard.

And no arguing on Tegra and Tegra 2; they were fantastic ARM SoCs in their day. I actually miss them making tablets, and I'd like a proper Shield successor, but in tablet form.

Regards.
 
Mobile games with RT? Imagination has been trying that since 2014. The thing is, on mobile, developers try to make their games work on as much hardware as possible. I still remember when OpenGL ES 3.0 became a thing on mobile; in the end, the majority of mobile developers stuck with OpenGL ES 2.0. RT is already very hard to do even on PC and console; on mobile the restrictions will be even bigger due to power limitations.
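To be clear on what "stick with OpenGL ES 2.0" looks like in practice, it's usually a runtime check along these lines — a rough C++ sketch where the enum and function names are made up, and which assumes a GLES context is already current:

```cpp
#include <cstring>
#include <GLES2/gl2.h>  // core GLES2 header; assumes a context is current

enum class RenderPath { ES2Basic, ES3Enhanced };

// Query the driver's version string, e.g. "OpenGL ES 3.2 ..." or
// "OpenGL ES 2.0 ...", and pick the fancier path only when ES 3.x
// is actually available on this device.
RenderPath ChooseRenderPath() {
    const char* version =
        reinterpret_cast<const char*>(glGetString(GL_VERSION));
    if (version && std::strstr(version, "OpenGL ES 3.") != nullptr) {
        return RenderPath::ES3Enhanced;  // opt-in extras for ES 3.0+ devices
    }
    return RenderPath::ES2Basic;  // default path that runs everywhere
}
```

When most of your installed base only guarantees the ES2Basic path, that's the one you spend your effort on, which is exactly why the fancier features languish.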
 
While I don't disagree with what you said, I do have a slightly different take on why: nVidia has been right because they usually have a big enough marketing machine to make each feature "a thing". Look at what happened with tessellation, T&L, and Shader Model 1.0, 2.0, and 3.0 support. Now the same is happening with upscaling, where AMD has had a somewhat mediocre way of doing it, but it still had it!

Also, I still remember the huge arguments against Fury when it was introduced with "only" 4GB of VRAM, when the Titan and 980 Ti had 6GB (I think). I don't know what to tell you there; you can just go back and read the huge rant paragraphs from people back then saying how it would not be enough. Ironically, back then, and for the games of the time, I was pretty OK with 4GB, but I am not now: my RX 480 4GB has shown me how cheaping out on capacity can affect the real long-term life of a GPU, even when the GPU itself may not be fast enough anyway, so looking back I should have also complained about it having "only" 4GB. I'm guessing that's also why they jumped to 8GB and 6GB, and have now doubled that as standard.

And no arguing on Tegra and Tegra 2; they were fantastic ARM SoCs in their day. I actually miss them making tablets, and I'd like a proper Shield successor, but in tablet form.

Regards.

People most often generalize when it comes to the VRAM situation between AMD and Nvidia, but in reality it is a bit more complex. The short version is that Nvidia, to a certain extent, can get away with less VRAM than AMD.
 

spongiemaster

Just like how Nvidia's RT was pretty much useless on all its GPUs except maybe the high-end 2070s and 2080s?
Sure, at launch. It isn't 2018 anymore, though. Nvidia has DLSS 2.0 working properly now, and pretty much all ray-traced games being released now also support DLSS. As seen below, it's pretty much a bloodbath with DLSS enabled. It's not on the chart, but we already know the lowly 2070 you mentioned is faster than a 3060, so a 2.5-year-old midrange 2070 with DLSS would crush AMD's current-generation $1000 flagship 6900 XT. If that doesn't whet your appetite for what AMD will bring to cellphones, I don't know what would.

[attached benchmark chart: ray-traced game performance with DLSS enabled]
 

zodiacfml

Sour grapes. 😅 Nvidia has no presence on mobile or consoles yet. Considering that mobile gaming has lower resolutions and much simpler graphics, RT could be a lot simpler and less intensive, making it more worthwhile than their RTX 2000-series implementation. 😂
 

bignastyid

Sour grapes. 😅 Nvidia has no presence on mobile or consoles yet. Considering that mobile gaming has lower resolutions and much simpler graphics, RT could be a lot simpler and less intensive, making it more worthwhile than their RTX 2000-series implementation. 😂
I wouldn't say no presence. Ever heard of the Nintendo Switch?
 

Conahl

Sure, at launch. It isn't 2018 anymore, though. Nvidia has DLSS 2.0 working properly now
And the same could be said about AMD's FSR and its own RT come RDNA 3: with FSR 2 and their RT 2, AMD will probably improve both on version 2 of each. Point is? Either way, there are a few people I know who, believe it or not, couldn't care less about DLSS and RT for now. One friend, and I'm not sure if he was joking or not, even said Nvidia should have called DLSS what it is: DLUS.
 

UnCertainty08

I've been a PC gaming enthusiast for 30 years. I love Nvidia; I've tried ATI/AMD, but stability and driver issues have always made me an Nvidia user. Before anyone tries to debate this, just name one generation when AMD didn't admit to driver issues.
Again, I'm an enthusiast, so price/performance doesn't matter to me. I want frame rates in AAA titles and fast-paced FPS games; that's my number one priority.

Ray tracing has been lackluster since day one. They keep making marketing videos that make it look impressive, but I've never seen it myself.
I game on a 49" ultrawide, a Samsung C49RG90 (5120x1440 @ 120Hz, HDR1000). I had a 2070 Super on this monitor at first, and I kept ray tracing turned off.
It barely made any visual improvement, and it destroyed frame rates.
So a couple of weeks after launch I got a 3090 FTW3 Ultra. In brand-new games like COD Black Ops, Control, etc., I still kept ray tracing turned off. It didn't make that much of a difference, and for fast FPS games it lowered frame rates too much.
I now have a 3090 KingPin. With a very quick, light overclock immediately upon purchase, I can play COD Black Ops with ray tracing maxed, and frame rates are good enough that I don't feel the need to turn it off.
Even now, though, I really don't think it looks much better. This is on a 49" 5120x1440 display with 1000 nits of brightness and a 3090 KingPin.

SO, given all this, why on earth are they even talking about ray tracing on phones????
It's gimmicky marketing-buzzword BS for now. Hopefully they soon start making games that look amazing with ray tracing, but having a phone with RT in the next six months to a year seems like a complete waste.

I hope they prove me wrong. I do.
 

jkflipflop98

I think the ray tracing acceleration in a smartphone will be completely different from what we're used to on PC.

You don't have to use ray tracing for displaying graphics; you can also use it to determine positional data and for a great many other utilitarian things.
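For example, a simple hit query — plain CPU math in this C++ sketch, with no RT hardware or particular API assumed and all names made up for illustration. Cast a ray, say from a touch point, and ask where it first hits an object, the kind of query used for object picking or line-of-sight checks:

```cpp
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Where does a ray (origin o, unit direction d) first hit a sphere?
// Returns the world-space hit point, or nothing on a miss.
std::optional<Vec3> RaySphereHit(Vec3 o, Vec3 d, Vec3 center, float radius) {
    Vec3 oc = o - center;
    float b = dot(oc, d);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - c;                // quadratic discriminant
    if (disc < 0.0f) return std::nullopt;  // ray misses entirely
    float t = -b - std::sqrt(disc);        // nearest intersection distance
    if (t < 0.0f) return std::nullopt;     // sphere is behind the origin
    return o + d * t;                      // positional data, not a pixel
}
```

Dedicated RT hardware just accelerates exactly this kind of intersection test at scale, whether or not the result ever ends up shading a pixel.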
 

Deleted member 2851593

Sure, at launch. It isn't 2018 anymore, though. Nvidia has DLSS 2.0 working properly now, and pretty much all ray-traced games being released now also support DLSS. As seen below, it's pretty much a bloodbath with DLSS enabled. It's not on the chart, but we already know the lowly 2070 you mentioned is faster than a 3060, so a 2.5-year-old midrange 2070 with DLSS would crush AMD's current-generation $1000 flagship 6900 XT. If that doesn't whet your appetite for what AMD will bring to cellphones, I don't know what would.

[attached benchmark chart: ray-traced game performance with DLSS enabled]

Three years later, the technology they used to justify the most insulting price increase in recent history is finally usable and present in more than two games with terrible implementations of it. Now that's a good investment.
 