News: Nvidia Ampere Purportedly 50% Faster Than Turing At Half The Power Consumption


TJ Hooker

Titan
Ambassador
I can't say, though, if this RT will be AMD's actual hardware-designed RT or just their software-based RT that utilizes inactive shader cores. My bet is the latter, as hardware RT would take time and need quite a bit of a design change to implement, and I doubt even Big Navi had that in mind when being designed. Maybe RDNA 2.0 will be hardware RT for AMD.
As I said in my previous post, RT hardware is confirmed for PS5.

"There is ray-tracing acceleration in the GPU hardware”

It seems highly unlikely (and misleading) that he would phrase it that way if it were just running RT on regular shader cores.
 

joeblowsmynose

Distinguished
As I said in my previous post, RT hardware is confirmed for PS5.

"There is ray-tracing acceleration in the GPU hardware”

It seems highly unlikely (and misleading) that he would phrase it that way if it were just running RT on regular shader cores.

Considering that is more or less marketing speak, I'd say it's not impossible that "RT hardware" is just a couple of extra regular shader cores dedicated to the task. But yeah, if that's the case, that would be a bit misleading compared to NVidia's "dedicated" solution, and wouldn't be nearly as powerful.
 

TJ Hooker

Titan
Ambassador
Considering that is more or less marketing speak, I'd say it's not impossible that "RT hardware" is just a couple of extra regular shader cores dedicated to the task. But yeah, if that's the case, that would be a bit misleading compared to NVidia's "dedicated" solution, and wouldn't be nearly as powerful.
If that were the case then that statement would be a lot worse than "a bit misleading" in my eyes. More like an outright lie. Plus having regular shader cores that are restricted to only doing RT doesn't make sense to me. Why not have them broadly available, such that they can be beneficial no matter what you're running?
 
If that were the case then that statement would be a lot worse than "a bit misleading" in my eyes. More like an outright lie. Plus having regular shader cores that are restricted to only doing RT doesn't make sense to me. Why not have them broadly available, such that they can be beneficial no matter what you're running?

Considering AMD used fake renders of the next Xbox, and as we have seen in the past, marketing teams can embellish products even to the point of making them look worse than they are, I wouldn't be surprised if it is a bit of a misleading statement.

But I guess we will see once the hardware is released and we can see the real specs.
 

TJ Hooker

Titan
Ambassador
Considering AMD used fake renders of the next Xbox, and as we have seen in the past, marketing teams can embellish products even to the point of making them look worse than they are, I wouldn't be surprised if it is a bit of a misleading statement.

But I guess we will see once the hardware is released and we can see the real specs.
The quote was from MS employees, not AMD. Pretty big difference between AMD's marketing department getting carried away with trying to show everyone pretty pictures and failing to properly verify an image's source, and the Xbox system architect making a statement about the system's technical capabilities.
 

joeblowsmynose

Distinguished
The quote was from MS employees, not AMD. Pretty big difference between AMD's marketing department getting carried away with trying to show everyone pretty pictures and failing to properly verify an image's source, and the Xbox system architect making a statement about the system's technical capabilities.

I think MS wouldn't give them any real renders, so AMD just used a fan-made 3D model and rendered that for their presentation. Anyway, they admitted they downloaded it from TurboSquid - a 3D asset-sharing marketplace.

There's nothing wrong with doing that, but showing it in detail without at least some small print disclaimer on the slide was a bit of a gaffe.

I think Smitty's point was just that "marketing" info often doesn't reflect detailed reality, for a variety of reasons - nefariousness doesn't even need to be one of them.

I personally am 60% skeptical of anything that I perceive as "marketing", for these reasons.

But let's hope AMD is working on some spectacular new ray-tracing HW ... NVidia needs more competition. :)
 

TJ Hooker

Titan
Ambassador
I think MS wouldn't give them any real renders, so AMD just used a fan-made 3D model and rendered that for their presentation. Anyway, they admitted they downloaded it from TurboSquid - a 3D asset-sharing marketplace.

There's nothing wrong with doing that, but showing it in detail without at least some small print disclaimer on the slide was a bit of a gaffe.

I think Smitty's point was just that "marketing" info often doesn't reflect detailed reality, for a variety of reasons - nefariousness doesn't even need to be one of them.

I personally am 60% skeptical of anything that I perceive as "marketing", for these reasons.
Yes, I'm aware of all that. Although I suppose we'll never know for sure if the marketing people knew the images weren't real when they used them or if someone was just lazy and basically took whatever they could find on an image search without bothering to check out the source.

I just really don't think it makes sense to liken that situation to the situation I've been referring to, namely an explicit statement made by a technical professional, i.e. the system architect.
 

joeblowsmynose

Distinguished
Yes, I'm aware of all that. Although I suppose we'll never know for sure if the marketing people knew the images weren't real when they used them or if someone was just lazy and basically took whatever they could find on an image search without bothering to check out the source.

TurboSquid - where AMD says they got the source for that image - is mostly a 3D model marketplace. That means they would have had to purchase and download the 3D file, set up a scene in Blender / Max / Maya - whatever - with lighting and all that goodness, then do a render for the final image. (I'm a casual 3D artist and I have used TurboSquid before.) So I think someone would have had to have known in order to go through all that trouble - it's a lot of work for no one to know about. The presenters probably didn't know or care, and the slide guy just needed an image for the slide. No one probably thought anything of it at all.


I just really don't think it makes sense to liken that situation to the situation I've been referring to, namely an explicit statement made by a technical professional, i.e. the system architect.
Yes, that might be different. But just being a system architect doesn't mean you don't have to play the marketing game. His words were chosen to neither 100% confirm nor deny dedicated RT hardware. That may have been on purpose.

Although secretly I fully believe that AMD is working hard and making strides on dedicated RT cores, I just don't want to get my hopes too high. :)
 
  • Like
Reactions: TJ Hooker

none12345

Distinguished
Apr 27, 2013
431
2
18,785
"Therefore, stock shouldn't be a major concern for Nvidia."

Supply is tight at TSMC, with a 12-month lead time on new orders. On top of that, Samsung is rumored to be suffering from very poor yields on 7nm. So I'd say there is cause for concern.
 

joeblowsmynose

Distinguished
"Therefore, stock shouldn't be a major concern for Nvidia."

Supply is tight at TSMC, with a 12-month lead time on new orders. On top of that, Samsung is rumored to be suffering from very poor yields on 7nm. So I'd say there is cause for concern.

Agreed.

AMD is now cited as TSMC's biggest customer for 2020, ahead of Apple ... I imagine that in case of a conflict, AMD and Apple will take precedence over NVidia.
 
I guess what I am saying is that I am having some doubts about what level of RT prowess AMD will be able to cram into a low-cost console chip. NVidia's 2080 costs more than the entire console likely will...

With raytracing, it's really hard to tell what the performance might be like, since AMD hasn't really shown any examples of what they will be doing for hardware raytracing acceleration in a desktop card yet, and their solution might be quite different from Nvidia's from a hardware standpoint.

However, back when the RTX cards first came out, I came to the conclusion that no more than 10% of those graphics chips is likely dedicated to RT. So it's not like Nvidia is dedicating a huge chunk of silicon to the technology. And that's on a 12nm process. At 7nm, the space dedicated to that amount of RT performance would likely be rather tiny, and the power use and heat output minimal as well.

You can't really base the cost of console hardware on the cost of a graphics card either. There's undoubtedly a very large markup on something like a 2080 over what the card actually costs to manufacture, whereas consoles are typically sold at or below cost, with profits coming from software, service and peripheral sales. There's also a fair amount of overlap with other hardware in the system. The CPU cores are likely to be part of the same processor as the GPU, and both will share the same pool of memory, the same cooling setup, and so on, making the design a lot more cost-efficient.

And of course, the graphics hardware may not offer RTX 2080 levels of traditional graphics performance. Based on the information presented so far, I wouldn't expect much more performance than a 5700 XT from the new Xbox, and the Playstation 5 probably won't be too different. But it's possible that RT performance could be higher than what a 2080 offers, or maybe even what a 2080 Ti offers, since again, it probably wouldn't cost an excessive amount to add that level of RT performance to a card, at least going by Nvidia's current implementation. That amount of RT performance might not add up to much more than 5% of a console's total cost.

Also, it's already been nearly two years since Nvidia first announced RTX, so presumably AMD might have had time to add something similar to RDNA2, even if they didn't have any forewarning about it prior to that.
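
To put rough numbers on the die-area point above, here is a back-of-envelope sketch in Python. All figures are illustrative assumptions, not confirmed specs: the ~10% RT share is the poster's own estimate, ~545 mm² is roughly a TU104 (RTX 2080)-class 12nm die, and the 2x density gain for 7nm is the ballpark factor mentioned later in the thread.

```python
# Back-of-envelope sketch of the die-area argument above.
# All inputs are assumptions for illustration only.

turing_die_mm2 = 545.0      # assumed 12nm die size (TU104/RTX 2080-class)
rt_share = 0.10             # assumed fraction of the die devoted to RT hardware
density_gain_7nm = 2.0      # assumed transistor-density gain from 12nm to 7nm

rt_area_12nm = turing_die_mm2 * rt_share
rt_area_7nm = rt_area_12nm / density_gain_7nm

print(f"RT hardware on 12nm: ~{rt_area_12nm:.1f} mm^2")
print(f"Same RT capability on 7nm: ~{rt_area_7nm:.1f} mm^2")
# Roughly 54 mm^2 shrinking to about 27 mm^2: a small slice of a console APU,
# which is why the post argues the cost and power impact would be modest.
```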
 
With raytracing, it's really hard to tell what the performance might be like, since AMD hasn't really shown any examples of what they will be doing for hardware raytracing acceleration in a desktop card yet, and their solution might be quite different from Nvidia's from a hardware standpoint.

However, back when the RTX cards first came out, I came to the conclusion that no more than 10% of those graphics chips is likely dedicated to RT. So it's not like Nvidia is dedicating a huge chunk of silicon to the technology. And that's on a 12nm process. At 7nm, the space dedicated to that amount of RT performance would likely be rather tiny, and the power use and heat output minimal as well.

You can't really base the cost of console hardware on the cost of a graphics card either. There's undoubtedly a very large markup on something like a 2080 over what the card actually costs to manufacture, whereas consoles are typically sold at or below cost, with profits coming from software, service and peripheral sales. There's also a fair amount of overlap with other hardware in the system. The CPU cores are likely to be part of the same processor as the GPU, and both will share the same pool of memory, the same cooling setup, and so on, making the design a lot more cost-efficient.

And of course, the graphics hardware may not offer RTX 2080 levels of traditional graphics performance. Based on the information presented so far, I wouldn't expect much more performance than a 5700 XT from the new Xbox, and the Playstation 5 probably won't be too different. But it's possible that RT performance could be higher than what a 2080 offers, or maybe even what a 2080 Ti offers, since again, it probably wouldn't cost an excessive amount to add that level of RT performance to a card, at least going by Nvidia's current implementation. That amount of RT performance might not add up to much more than 5% of a console's total cost.

Also, it's already been nearly two years since Nvidia first announced RTX, so presumably AMD might have had time to add something similar to RDNA2, even if they didn't have any forewarning about it prior to that.

I would assume that at 7nm, Nvidia will be able to dedicate more die space to Tensor cores to increase their hardware RT power. Again, TSMC's 7nm is at least double the density of 12nm, so they should be able to cram in more logic if they go with a similar die size to Turing.
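
As a rough sanity check of that density point, here is a small sketch. The ~18.6 billion transistors and ~754 mm² figures correspond approximately to TU102 (the RTX 2080 Ti die) on 12nm; the 2x density factor is the post's own assumption for TSMC 7nm, so treat the result as a ballpark rather than a prediction.

```python
# Rough sketch of the "similar die size at double the density" point above.
# ~18.6B transistors / ~754 mm^2 is approximately TU102 on 12nm; the 2x
# density gain is the post's assumption for TSMC 7nm.

turing_transistors_b = 18.6   # billions (TU102-class, approximate)
turing_die_mm2 = 754.0        # 12nm die size in mm^2 (approximate)
density_gain = 2.0            # assumed 12nm -> 7nm density improvement

density_12nm = turing_transistors_b * 1e3 / turing_die_mm2   # MTr/mm^2
budget_same_die_b = turing_transistors_b * density_gain      # billions

print(f"12nm density: ~{density_12nm:.1f} MTr/mm^2")
print(f"7nm budget at the same die size: ~{budget_same_die_b:.1f}B transistors")
# Roughly double the logic budget at the same area - the headroom the post
# suggests could go toward more tensor/RT hardware.
```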
 

joeblowsmynose

Distinguished
...

Also, it's already been nearly two years since Nvidia first announced RTX, so presumably AMD might have had time to add something similar to RDNA2, even if they didn't have any forewarning about it prior to that.

Yes, AMD has been developing their raytracing render engine for much longer than that - so they have decent software now that uses OpenCL and is hardware agnostic (CPU / GPU / any brand). I can't help but believe that this product has been part of the development for hardware raytracing - I know they love showing it off rendering photorealistic scenes in near real time with their Pro Vega cards and Epyc CPUs. And these are usually quite complex scenes.

They even have this little ad:
View: https://www.youtube.com/watch?v=P2Jq4EcV3xk


They are working on raytracing with Vulkan, apparently, but mainly just as a game development tool at this time to preview advanced lighting effects that use raytracing - so they are providing software that already allows game devs who don't have an RTX card to preview raytracing effects in their products on any hardware.

View: https://www.youtube.com/watch?time_continue=175&v=C49IYlJy_oY&feature=emb_logo


... no talk of dedicated hardware yet though ... they're pretty tight-lipped on that. (Well, on everything lately.)
 

mradr

Distinguished
Oct 12, 2011
12
2
18,515
a bunch of marketing nonsense you bought, nothing is cheaper to make
You've never made a game, I can tell - it takes TIME, and you have to PAY someone to create the effects that we love to see. At times MORE than one person, because textures, colors, environment, etc. all have to match to get the best effect out of the lighting that you would otherwise have to code out, place, and adjust to make it look correct and functional. With RT, a lot of that work is done for you. What we are seeing at the moment is just the engines not having RT fully coded out, so we're seeing a lot of "we have to do it by hand for now". It's already a lot easier to place light posts within engines that do support some RT, only playing with some numbers here and there for performance reasons. In other words, I could have Joe Blow play with numbers versus having to pay a high-class programmer to correctly place lighting - around $20/hr vs $60/hr. As customers we won't see it in terms of price, but we will see it in how games look in the future. Less time spent on lighting means more time they can focus on creating better games.

What's fun is that this is just lighting - it'll also be used in VR for eye tracking along with VRS, so the "nothing is cheaper" comment is more BS than anything. Why? Because it opens the door for more users to start using VR, in terms of hardware support at scale, features, and fluid eye motion.

Like most features, though, a single feature isn't going to make or break anything - it's the overall collection of tools at someone's hands that helps push things that little bit further so they can work on other stuff. Even then, it takes close to 2-4 years before we see a feature in major use in triple-A games.

Either way, you didn't read the whole thing and just copy-pasted the one line you disagree with. :) Maybe it's just time for you to start reading a book instead of just copy-pasting. Lighting can be a major game changer once we have both the hardware and software ready for it - RT and RT lighting features are the future.

Overall, I think you are reading RT as RTX, which is different. :)
 
Jan 20, 2020
4
0
10
What are you on about?

Turing was claimed to be 6x faster than Pascal, and we all know that that claim was spot on.[/s]

I'll wait for the reviews.

Well, given that Volta is already 5.9x faster in tensor/half-precision calculations, you are correct - it's spot on.
:)

There are way bigger marketing lies than this one, especially if consumers aren't reading all the data.
 

joeblowsmynose

Distinguished
Well, given that Volta is already 5.9x faster in tensor/half-precision calculations, you are correct - it's spot on.
:)

There are way bigger marketing lies than this one, especially if consumers aren't reading all the data.

Well then that's like saying "the Radeon VII is 10X faster than an RTX 2080!"

Because it is ... !

At double precision floating point calculations ...

It's all bullshit ... just different flavours of it, and like I said earlier - the claims are there to make headlines, bait clicks, and impress potential investors ... not to relay real-world performance metrics. But if you can twist the words to try to fool the suckers, then I guess that's how you spend your marketing money ... being disingenuous toward the same people who buy your products. That's the great thing about fanbois - you can lie to them all day and they'll still happily give you money.

Granted, much of such "marketing" comes from the "journalists" trying to create click-baity headlines for their own purposes.
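
For what it's worth, the double-precision comparison above does roughly check out on paper. The figures below are approximate published peak-throughput numbers (Radeon VII at a 1:4 FP64 rate, RTX 2080 at 1:32), used only to illustrate how a cherry-picked metric can produce a "10x" headline:

```python
# Peak FP64 throughput comparison using approximate published figures.
# These are theoretical peaks, not measured gaming performance.

radeon_vii_fp64_tflops = 3.36   # ~13.4 TFLOPS FP32 at a 1:4 FP64 rate
rtx_2080_fp64_tflops = 0.31     # ~10.1 TFLOPS FP32 at a 1:32 FP64 rate

ratio = radeon_vii_fp64_tflops / rtx_2080_fp64_tflops
print(f"Radeon VII vs RTX 2080, peak FP64: ~{ratio:.1f}x")
# Roughly 10x on paper, while saying nothing about gaming performance -
# which is exactly the point about cherry-picked marketing metrics.
```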
 
Well then that's like saying "the Radeon VII is 10X faster than an RTX 2080!"

Because it is ... !

At double precision floating point calculations ...

It's all bullshit ... just different flavours of it, and like I said earlier - the claims are there to make headlines, bait clicks, and impress potential investors ... not to relay real-world performance metrics. But if you can twist the words to try to fool the suckers, then I guess that's how you spend your marketing money ... being disingenuous toward the same people who buy your products. That's the great thing about fanbois - you can lie to them all day and they'll still happily give you money.

Granted, much of such "marketing" comes from the "journalists" trying to create click-baity headlines for their own purposes.

I agree that marketing is mostly crap, but I am not sure you can blame Nvidia, AMD, or Intel for poor journalistic integrity these days. In the old days, the title would be what was actually said, along with the information, and not some click-bait article.

But times change. From what I can tell, it's that we have journalists, rather than people passionate about technology, at the forefront.
 

joeblowsmynose

Distinguished
I agree that marketing is mostly crap, but I am not sure you can blame Nvidia, AMD, or Intel for poor journalistic integrity these days. In the old days, the title would be what was actually said, along with the information, and not some click-bait article.

But times change. From what I can tell, it's that we have journalists, rather than people passionate about technology, at the forefront.

Yeah, I agree (as my last sentence noted). Well, there are things we can easily blame on Intel, NVidia, and AMD, but a lot of it these days is just crap that spreads around like stepped-on fresh dog doodoo and had almost no basis in reality to begin with.
 
Jan 20, 2020
4
0
10
Yeah, I agree (as my last sentence noted). Well, there are things we can easily blame on Intel, NVidia, and AMD, but a lot of it these days is just crap that spreads around like stepped-on fresh dog doodoo and had almost no basis in reality to begin with.

That's why you shouldn't let yourself get influenced by marketing or media, especially when one of the red, green, or blue products is about to hit the market and you are interested in buying one. Other than that, let natural selection happen. I'm a big fan of removing all warning signs and letting the idiots decimate themselves. :)

And you are absolutely correct that fanboys will always favor one over the other. But for some (especially enterprises), reliability and experience are much more valuable than money. A company which has been running on Xeons for many, many years will probably only consider switching to Epyc if the advantages are major.
Same goes for software and the like.

(Just in case some AMD fanboys start crying and trying to teach me what I have to do to compare two CPUs ... I don't care about others' opinions. I've got my own.)

I've got an i5 6600 and a Ryzen 1600 at home and am waiting for Intel's 10th gen to compare. I will take the CPU that fits my needs best. I would have to buy a new motherboard in either case, as none of the 3rd-gen Ryzens will work on the board I have for the 1600. I've got 32GB of dual-rank 16GB RAM with Micron dies lying around, so I will use the maximum the AMD can handle as the reference point when the new Intel hits the market.
I don't buy a CPU because I have to; I buy it because I want to, so I can wait as long as I like.
 

joeblowsmynose

Distinguished
That's why you shouldn't let yourself get influenced by marketing or media, especially when one of the red, green, or blue products is about to hit the market and you are interested in buying one. Other than that, let natural selection happen. I'm a big fan of removing all warning signs and letting the idiots decimate themselves. :)

And you are absolutely correct that fanboys will always favor one over the other. But for some (especially enterprises), reliability and experience are much more valuable than money. A company which has been running on Xeons for many, many years will probably only consider switching to Epyc if the advantages are major.
Same goes for software and the like.

(Just in case some AMD fanboys start crying and trying to teach me what I have to do to compare two CPUs ... I don't care about others' opinions. I've got my own.)

I've got an i5 6600 and a Ryzen 1600 at home and am waiting for Intel's 10th gen to compare. I will take the CPU that fits my needs best. I would have to buy a new motherboard in either case, as none of the 3rd-gen Ryzens will work on the board I have for the 1600. I've got 32GB of dual-rank 16GB RAM with Micron dies lying around, so I will use the maximum the AMD can handle as the reference point when the new Intel hits the market.
I don't buy a CPU because I have to; I buy it because I want to, so I can wait as long as I like.

Indeed.

I wish more people would just have the ability to say, "I bought Intel because that's the one I wanted to get," or "I went with AMD this time to give them a try," or "For what I do, the xxxxx works great for me."

But instead, for some reason, most people need to make up some really strong "story" or "justification", often with falsities thrown in, because it provides their own personal support for their stance -- because heaven forbid they didn't have a good enough "reason" to convince themselves.

One doesn't really need reasons and justifications for everything they do in life, yet people seem to have to "convince" themselves of their choices, sometimes to the point of plain lying about it to support their alignments.

It's almost like a disease of the mind ...
 
  • Like
Reactions: King_V
Indeed.

I wish more people would just have the ability to say, "I bought Intel because that's the one I wanted to get," or "I went with AMD this time to give them a try," or "For what I do, the xxxxx works great for me."

But instead, for some reason, most people need to make up some really strong "story" or "justification", often with falsities thrown in, because it provides their own personal support for their stance -- because heaven forbid they didn't have a good enough "reason" to convince themselves.

One doesn't really need reasons and justifications for everything they do in life, yet people seem to have to "convince" themselves of their choices, sometimes to the point of plain lying about it to support their alignments.

It's almost like a disease of the mind ...

It's just the way most people work. The best thing is you could easily point out the hypocrisy in any person.

I am not afraid to admit that I like Intel's platform more than AMD's, but just slightly.

I did at one point swear off Nvidia products until they had the better-performing part for, well, gaming. I went from the 9700 Pro to an HD 7970 GHz, only buying ATI/AMD. Then the 980 Ti was the best part at the time, so I got it.
 
  • Like
Reactions: joeblowsmynose

joeblowsmynose

Distinguished
It's just the way most people work. The best thing is you could easily point out the hypocrisy in any person.

I am not afraid to admit that I like Intel's platform more than AMD's, but just slightly.

I did at one point swear off Nvidia products until they had the better-performing part for, well, gaming. I went from the 9700 Pro to an HD 7970 GHz, only buying ATI/AMD. Then the 980 Ti was the best part at the time, so I got it.

My reasoning for leaning toward AMD (which I think is obvious) is that what they produce fits what I do exactly.

Partly, I got pissed with Intel for not offering more than four cores at a price any average-income person could afford. I came real close to buying into Intel HEDT at the end of 2016 with a 6800K. It would have cost me over $600 CAD plus an expensive mobo.

I heard Zen was incoming, so I waited a few months. I ended up snagging a $140 CAD mobo, an R7 1700 for $375 CAD, and an M.2 SSD for less than the price the 6800K would have cost alone. And it performs way better to boot, and OCs pretty well (considering the base clock is 3.0GHz, you can get almost a GHz of overclock out of these things).

That hit every nail on the head for me and left a bitter taste in my mouth for Intel. I still use Intel exclusively for work though, and I am responsible for all IT purchases at the office. I've been thinking about giving AMD a try there too, but generally I prefer to stick with one vendor at a time for various IT-related reasons.

I only casually game so Nvidia and Intel offer no advantage to me there.

I used to be a big Nvidia fan, but now I don't like them as a company that much. Ultimately, if AMD became the "premium" priced brand across the board, but NV and Intel still had good products, I'd probably consider the switch back, as I really am a fan of bang for buck.

Also, my first computer that I purchased with my own money was a Thunderbird 1000 ... so a bit of nostalgia as well.


Anyone else have a good "reason for alignment" story that isn't just a "justification" for their alignment, but actually good, sound reasoning (and maybe interesting)? I'm sure there's a good spread from all sides.