"As I said in my previous post, RT hardware is confirmed for PS5."

I can't say, though, whether this RT will be AMD's actual hardware-designed RT or just their software-based RT that utilizes otherwise idle shader cores. My bet is the latter, as hardware RT would take time and quite a bit of a design change to implement, and I doubt even Big Navi had that in mind when it was being designed. Maybe RDNA 2.0 will bring hardware RT for AMD.
... whereas PC can ray trace everything.
As I said in my previous post, RT hardware is confirmed for PS5.
"There is ray-tracing acceleration in the GPU hardware”
It seems highly unlikely (and misleading) that he would phrase it that way if it were just running RT on regular shader cores.
Considering that is more or less marketing speak, I'd say it's not impossible that "RT hardware" is just a couple of extra regular shader cores dedicated to the task. But yeah, if that's the case, that would be a bit misleading in comparison to Nvidia's dedicated solution, and wouldn't be nearly as powerful.
If that were the case then that statement would be a lot worse than "a bit misleading" in my eyes. More like an outright lie. Plus having regular shader cores that are restricted to only doing RT doesn't make sense to me. Why not have them broadly available, such that they can be beneficial no matter what you're running?
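To make the hardware-vs-shader-core distinction concrete, here is a minimal sketch of the kind of work involved (Python standing in for shader code; everything here is illustrative, not AMD's or Nvidia's actual implementation). A "software" RT path runs tests like this on the same cores that do regular shading; "hardware" RT moves them into dedicated fixed-function units.

```python
# Minimal sketch of what "RT on regular shader cores" means: the core of ray
# tracing is ordinary arithmetic like this ray/triangle test, which any
# general-purpose core can run. Dedicated RT hardware instead implements this
# test (and the BVH traversal around it) in fixed-function units.

EPSILON = 1e-8

def ray_triangle_intersect(origin, direction, v0, v1, v2):
    """Moller-Trumbore test; returns the hit distance t, or None on a miss."""
    def sub(a, b):   return [a[i] - b[i] for i in range(3)]
    def dot(a, b):   return sum(a[i] * b[i] for i in range(3))
    def cross(a, b): return [a[1]*b[2] - a[2]*b[1],
                             a[2]*b[0] - a[0]*b[2],
                             a[0]*b[1] - a[1]*b[0]]

    edge1, edge2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(direction, edge2)
    det = dot(edge1, pvec)
    if abs(det) < EPSILON:              # ray is parallel to the triangle
        return None
    inv_det = 1.0 / det
    tvec = sub(origin, v0)
    u = dot(tvec, pvec) * inv_det       # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, edge1)
    v = dot(direction, qvec) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(edge2, qvec) * inv_det      # distance along the ray
    return t if t > EPSILON else None

# One ray against one triangle; a real frame needs billions of these, which is
# why doing it on general shader cores eats into regular shading performance.
print(ray_triangle_intersect([0, 0, -1], [0, 0, 1],
                             [-1, -1, 0], [1, -1, 0], [0, 1, 0]))  # -> 1.0
```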
Considering AMD used fake renders of the next XBox, and, as we have seen in the past, marketing teams can embellish products even to the point of making them look worse than they are, I wouldn't be surprised if it is a bit of a misleading statement.
But I guess we will see once the hardware is released and we can see the real specs.
The quote was from MS employees, not AMD. Pretty big difference between AMD's marketing department getting carried away with trying to show everyone pretty pictures and failing to properly verify an image's source, and an Xbox system architect making a statement about the system's technical capabilities.
I think MS wouldn't give them any real renders, so AMD just used a fan-made 3D model and rendered that for their presentation. Anyway, they admitted they downloaded it from TurboSquid, a 3D asset sharing marketplace.
There's nothing wrong with doing that, but showing it in detail without at least some small print disclaimer on the slide was a bit of a gaffe.
I think Smitty's point was just that "marketing" info often doesn't reflect detailed reality, for a lot of different reasons - nefariousness doesn't even need to be one of them.
I personally am 60% skeptical of anything that I perceive as "marketing", for these reasons.
Yes, I'm aware of all that. Although I suppose we'll never know for sure if the marketing people knew the images weren't real when they used them or if someone was just lazy and basically took whatever they could find on an image search without bothering to check out the source.
"I just really don't think it makes sense to liken that situation to the situation I've been referring to, namely an explicit statement made by a technical professional, i.e. the system architect."

Yes, that might be different. But just being a system architect doesn't mean you don't have to play the marketing game. His words were chosen to neither 100% confirm nor deny dedicated RT hardware. That may have been on purpose.
"Therefore, stock shouldn't be a major concern for Nvidia."
Supply is tight at TSMC, with a 12-month lead time on new orders. On top of that, Samsung is rumored to be suffering from very poor yields on 7nm. So I'd say there is cause for concern.
I guess what I am saying is that I have some doubts about what level of RT prowess AMD will be able to cram into a low-cost console chip. NVidia's 2080 costs more than the entire console likely will...
With raytracing, it's really hard to tell what the performance might be like, since AMD hasn't really shown any examples of what they will be doing for hardware raytracing acceleration in a desktop card yet, and their solution might be quite different from Nvidia's from a hardware standpoint.
However, back when the RTX cards first came out, I came to the conclusion that no more than 10% of those graphics chips are likely dedicated to RT. So it's not like Nvidia is dedicating a huge chunk of silicon to the technology. And that's on a 12nm process. At 7nm, the space dedicated to that amount of RT performance would likely be rather tiny, and the power use and heat output minimal as well.
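For some rough numbers behind that estimate (the die size is public; the 10% RT share is the guess above, and the density gain and APU size are assumptions):

```python
# Back-of-the-envelope: how much 7nm silicon would Turing-level RT cost?
# The TU102 die size is public; the 10% RT share is the estimate from the
# post above, and the density gain and APU size are rough assumptions.

tu102_die_mm2 = 754    # RTX 2080 Ti die, TSMC 12nm
rt_share      = 0.10   # assumed fraction of the die spent on RT hardware
density_gain  = 2.0    # assumed transistor density improvement, 12nm -> 7nm
apu_mm2       = 350    # assumed size of a console APU

rt_area_12nm = tu102_die_mm2 * rt_share
rt_area_7nm  = rt_area_12nm / density_gain

print(f"RT area at 12nm: ~{rt_area_12nm:.0f} mm^2")                    # ~75 mm^2
print(f"RT area at 7nm:  ~{rt_area_7nm:.0f} mm^2")                     # ~38 mm^2
print(f"Share of a {apu_mm2} mm^2 APU: ~{rt_area_7nm / apu_mm2:.0%}")  # ~11%
```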
You can't really base the cost of console hardware on the cost of a graphics card either. There's undoubtedly a very large markup on something like a 2080 over what the card actually costs to manufacture, whereas consoles are typically sold at or below cost, with profits coming from software, service and peripheral sales. There's also a fair amount of overlap with other hardware in the system. The CPU cores are likely to be part of the same processor as the GPU, and both will share the same pool of memory, the same cooling setup, and so on, making the design a lot more cost-efficient.
And of course, the graphics hardware may not offer RTX 2080 levels of traditional graphics performance. Based on the information presented so far, I wouldn't expect much more performance than a 5700 XT from the new Xbox, and the Playstation 5 probably won't be too different. But it's possible that RT performance could be higher than what a 2080 offers, or maybe even what a 2080 Ti offers, since again, it probably wouldn't cost an excessive amount to add that level of RT performance to a card, at least going by Nvidia's current implementation. That amount of RT performance might not add up to much more than 5% of a console's total cost.
Also, it's already been nearly two years since Nvidia first announced RTX, so presumably AMD might have had time to add something similar to RDNA2, even if they didn't have any forewarning about it prior to that.
...
"a bunch of marketing nonsense you bought, nothing is cheaper to make"

You've never made a game, I can tell. It takes TIME, and you have to PAY someone to create the effects that we love to see. Sometimes more than one person, because textures, colors, environment, etc. all have to match to get the best effect out of the lighting that you would otherwise have to code, place, and adjust by hand to make it look correct and functional. With RT, a lot of that work is done for you. What we are seeing at the moment is just engines not having RT fully built out, so there's still a lot of "we have to do it by hand for now". It's already a lot easier to place lights in engines that do support some RT, with only some numbers to play with here and there for performance reasons. In other words, I could have Joe Blow play with numbers instead of paying a high-class programmer to correctly place lighting: $20/hr vs $60/hr. As customers, we won't see it in terms of price, but we will see it in terms of how games look in the future. Less time spent on lighting means more time they can focus on creating better games.
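A toy illustration of that workflow difference (all names and numbers here are made up, not any real engine's API): in the hand-tuned approach every fill light is a human decision that has to be revisited whenever the scene changes, while in the traced approach the bounce term is computed from the scene itself.

```python
import math

def dist2(a, b):
    """Squared distance between two 3D points."""
    return sum((a[i] - b[i]) ** 2 for i in range(3))

# Hand-tuned workflow: fake the bounce light with extra fill lights that an
# artist has to place, balance, and re-tweak whenever the scene changes.
hand_placed_lights = [
    {"pos": (2.0, 3.0, 1.0),  "intensity": 0.8},  # the actual key light
    {"pos": (-1.5, 2.0, 0.5), "intensity": 0.3},  # fake bounce off left wall
    {"pos": (0.0, 0.5, -2.0), "intensity": 0.2},  # fake bounce off floor
]

def shade_hand_tuned(p):
    return sum(l["intensity"] / (1.0 + dist2(p, l["pos"]))
               for l in hand_placed_lights)

# RT-style workflow: one real light; the fill comes from tracing rays into
# the scene. The bounce term below is a toy stand-in for that trace.
key_light = {"pos": (2.0, 3.0, 1.0), "intensity": 0.8}

def traced_bounce(p):
    # Stand-in for gathering bounced light by tracing rays against geometry.
    return 0.1 * math.exp(-0.5 * dist2(p, (0.0, 0.0, 0.0)))

def shade_traced(p):
    direct = key_light["intensity"] / (1.0 + dist2(p, key_light["pos"]))
    return direct + traced_bounce(p)

p = (0.0, 1.0, 0.0)
print(shade_hand_tuned(p), shade_traced(p))
```

The point is the authoring cost: two of the three "lights" in the first version exist only to be placed and tweaked by a person.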
What are you on about?
Turing was claimed to be 6x faster than Pascal, and we all know that claim was spot on. [/s]
I'll wait for the reviews.
Well, given that Volta is already 5.9x faster in Tensor/half-precision calculations, you are correct: it's spot on.
There are way bigger marketing lies than this one, especially if consumers aren't reading all the data.
Well, then that's like saying "a Radeon VII is 10X faster than an RTX 2080!"
Because it is ... !
At double precision floating point calculations ...
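Funny thing is, that 10x figure really does fall out of the spec sheets if you pick the right metric. A quick check with approximate published peak-throughput numbers (ballpark values, treat them as assumptions):

```python
# How a true-but-useless "10x faster" claim is built: quote the one metric
# where the ratio is huge. Figures below are approximate peak TFLOPS from
# public spec sheets, not measured performance.

radeon_vii = {"fp32": 13.4, "fp64": 3.36}   # FP64 runs at 1:4 of FP32
rtx_2080   = {"fp32": 10.1, "fp64": 0.31}   # FP64 runs at 1:32 of FP32

fp64_ratio = radeon_vii["fp64"] / rtx_2080["fp64"]
fp32_ratio = radeon_vii["fp32"] / rtx_2080["fp32"]

print(f"FP64 ratio: {fp64_ratio:.1f}x  <- the headline number")
print(f"FP32 ratio: {fp32_ratio:.1f}x  <- the number games actually care about")
```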
It's all bullshit ... just different flavours of it, and like I said earlier - the claims are to make headlines, bait clicks, impress potential investors ... not to relay performance metrics in the real world. But if you can twist the words to try to fool the suckers, then I guess that's how you spend your marketing money ... to be disingenuous toward the same people who buy your products. That's the great thing about fanbois - you can lie to them all day and they'll still happily give you money.
Granted, much of such "marketing" comes from the "journalists" trying to create click-baity headlines for their own purposes.
I agree that marketing is mostly crap, but I am not sure you can blame Nvidia, AMD, or Intel for poor journalistic integrity these days. In the old days, the title would be what was actually said, together with the information, and not some clickbait article.
But times change. From what I can tell, we now have journalists, rather than people passionate about technology, at the forefront.
Yeah, I agree (as my last sentence noted). Well, there are things we can easily blame on Intel, NVidia, and AMD, but a lot of it these days is just crap that spreads around like stepped-in fresh dog doodoo despite having almost no basis in reality to begin with.
That's why you shouldn't let yourself be influenced by marketing or media, especially when one of the red, green, or blue products is about to hit the market and you are interested in buying one. Other than that, let natural selection happen. I'm a big fan of removing all warning signs and letting the idiots decimate themselves.
And you are absolutely correct: fanboys will always favor one over the other. But for some (especially enterprises), reliability and experience are much more valuable than money. A company which has been running on Xeons for many, many years will probably only consider switching to Epyc if the advantages are major.
The same goes for software and the like.
(Just in case some AMD fanboys start crying and trying to teach me what I have to do to compare two CPUs: I don't care about others' opinions. I've got my own.)
I've got an i5 6600 and a Ryzen 1600 at home, and I'm waiting for 10th gen to compare; I will take the CPU that fits my needs best. I would have to buy a new motherboard in both cases, as none of the 3rd gen Ryzens will work on the board I have for the 1600. I've got 32GB of dual-rank RAM (16GB sticks) with Micron dies lying around, so I will use the maximum the AMD can handle as the reference when the new Intel hits the market.
I don't buy a CPU because I have to; I buy it because I want to, so I can wait as long as I like.
Indeed.
I wish more people would just have the ability to just say, "I bought Intel, because that's the one I wanted to get." or "I went with AMD this time to give them a try" or "For what I do, the xxxxx works great for me"
But instead, for some reason, most people need to make up some really strong "story" or "justification", often with falsities thrown in, because it provides their own personal support of their stance -- because heaven forbid they didn't have a good enough "reason" to convince themselves.
One doesn't really need reasons and justifications for everything they do in life, yet people seem to have to "convince" themselves of their choices, sometimes to the point of plain lying about it to support their alignments.
It's almost like a disease of the mind ...
It's just the way most people work. The best thing is that you could easily point out the hypocrisy in any person.
I am not afraid to admit that I like Intel's platform more than AMD's, but only slightly.
I did at one point swear off Nvidia products until they had the better-performing part for, well, gaming. I went from the 9700 Pro to an HD 7970 GHz, only buying ATI/AMD. Then the 980 Ti was the best part at the time, so I got it.
From the same company that hasn't done anything worth mentioning in the three years since the 1080 Ti launched. It is most likely 50 percent more performance OR half the power consumption at best. Don't fall for it.