Understandable that they really want all the Fortnite kids to buy their GPUs, but it's not exactly a demanding title...
It's about to get ray tracing.
Understandable that they really want all the Fortnite kids to buy their GPUs, but it's not exactly a demanding title...
All the eye candy in the world doesn't matter if the game sucks. I still get on my N64 and play Zelda Ocarina of Time. Still an amazing game more than 20 years after its release. Sure, it would be nice if you could get it on Switch in 1080p with upgraded textures, but the lack of graphical quality doesn't affect the fun of the game.

IMHO most of us don't care about ray tracing - the only reason we think about it so much is because Nvidia is always hyping it up and portraying it as the next best thing. I mean, I like to see more graphical effects and stuff - but I care more about raw performance. I'd rather see a 10% increase in performance from one card to the next than, say, a 30% increase in ray tracing performance. But I do agree - I don't think AMD will match Nvidia in RT - performance maybe, though. If I had to guess, they'll probably have a form of ray tracing that barely impacts performance compared to Nvidia's RT, but it won't look as good.
Yeah, they barely hyped up Navi, and I think that this is because Lisa Su finally put a leash on AMD's marketing division - and that's a good thing. As for why they didn't send it to Gamers Nexus... Steve tends to be brutally honest in a very entertaining way (did you see his review of Intel's 11th-gen mobile launch? It's priceless! LOL) and nVidia has a tendency to be the antithesis of that. I'm forced to wonder what concessions DF made to get their hands on that early hardware, because I'm sure that they made some.

I'd like to believe that they'll release a very good card. All the same, why'd they underhype Zen anyway? They've barely hyped up Navi - especially compared to how many RTX 3000 rumors we've got, and stuff like that. Excluding the reveal, the RTX 3000 series has been one of the hot topics since like May - on most tech outlets.
I can see why you're hesitant. Actually, though, it makes sense that they only offered a preview to one reviewer (it'll get the same press regardless, but it simplifies things for Nvidia). What I don't know is why Digital Foundry specifically - why not Gamers Nexus or something? All the same, Digital Foundry is pretty respected.
I'd like to think so, but I felt the same way for each generation, and I was sorely disappointed when the RX 4xx/5xx series came out. I already had twin Gigabyte Windforce HD 7970s (when CrossFire still worked) and the performance difference wasn't worth the upgrade cost. I got a nice reprieve when Newegg had that insane sale on the Sapphire R9 Fury Nitro OC, so that was my upgrade instead. I'm not looking to upgrade because I just got my XFX RX 5700 XT Triple-Dissipation for $480 CAD ($360 USD at the time), so I'm satisfied with what I got as it does all that I want it to. I just want to see prices fall even further (if that's even possible).

I look forward to seeing what AMD has coming. Prior to purchasing my Nvidia 1070, I had an AMD 7950. Specs matter more than some brand loyalty. There have been random leaks throughout the year that show AMD should have something competitive with the 3080.
You'd think so, but the PS4 and Xbox One both used AMD CPUs with ATi GPUs and it didn't seem to make any difference in that respect.

Having AMD so dominant in the console space also helps, given that most games are just console ports.
Yeah, especially since Fortnite is an nVidia title that will be getting RTX enabled. I'm actually surprised by this, because I would've expected it to be put into Counter-Strike Potato Offensive, where reflections could help see enemies better. Well, that and the fact that the graphics in Fortnite look like something from 2005. They are even worse than PUBG. CS:GO has much more realistic graphics and would have been a more natural choice IMO.

I'm always suspicious of streamer or Fortnite marketing though.
Posts like this are inflammatory and don't paint you in the best light. You claim that you don't want to start a flame war in exactly the kind of post that starts flame wars. Your open disdain for a brand that has been making video cards for far longer than any other company in existence (my first build in 1988 had an ATi EGA Wonder) only demonstrates your ignorance. If ATi made GPUs that are as bad as you think they are (which means that you never actually OWNED one), they'd never have survived this long. Sure, AMD bought ATi, but it paid a pantload for it because ATi has a serious name and serious tech with the history to back it up.

Once again AMD shows why they're second best in video cards in 2020. nVidia launches their 3xxx series when even their 2xxx series already trounced the 5700 XT - Big Navi, call it what you want, PCIe 4.0 BS and all that nonsense. Now the best they can do, just days after nVidia's launch, is to plant an Easter egg in Fortnite for all the kids to see. Come on now. Not to start a flame war, but AMD is always trailing, so they must sell for cheap and people buy.
Yeah, I'm definitely suspicious of streamer/Fortnite advertising. Specs definitely matter more than brand loyalty - although I wouldn't necessarily say specs, more real-world performance (but I think that was what you were getting at anyway). We're somewhat past the stage where AMD was infamous for drivers and that stuff - although that whole VBIOS affair was bad. It's more the fact that for the last two generations (I guess three now that Ampere is kinda out) Nvidia has dominated the market, and AMD hasn't really had a good top-end/mid-range variety. Sadly. Competition is always good.

I look forward to seeing what AMD has coming. Prior to purchasing my Nvidia 1070, I had an AMD 7950. Specs matter more than some brand loyalty. There have been random leaks throughout the year that show AMD should have something competitive with the 3080. Having AMD so dominant in the console space also helps, given that most games are just console ports.
I'm always suspicious of streamer or Fortnite marketing though.
You know what the funny thing was about the driver issues? A good amount of them were probably user-caused, especially on AMD rigs. People weren't updating their chipset drivers, and that was probably causing the vast majority of issues. Since I'm always on top of that stuff, I never encountered the issues, and review sites never encounter them because they're always using up-to-date drivers. I bet that the average person installs the chipset drivers that come with their motherboard and never thinks of it again.

Yeah, I'm definitely suspicious of streamer/Fortnite advertising. Specs definitely matter more than brand loyalty. Although I wouldn't necessarily say specs - more real-world performance (but I think that was what you were getting at anyway). We're somewhat past the stage where AMD was infamous for drivers and that stuff - although that whole VBIOS affair was bad. It's more the fact that for the last 2 generations (I guess three now that Ampere is kinda out) Nvidia has dominated the market - and AMD hasn't really had a good top end/mid-range variety. Sadly. Competition is always good.
It's going to be a sad launch if AMD isn't willing to be aggressive on pricing.
AMD hasn't been aggressive on GPU prices in years. No reason to believe they're suddenly going to start now.

Lol - you did witness what Zen 2's "aggressive pricing" forced Intel to do with their pricing?? (you DO know who makes Zen 2... right??)
So they can play at below 1080p resolution and all low settings with ray tracing enabled? As long as it's 300000000fps!

It's about to get ray tracing.
That's really strange. I would've expected AMD to be intimidated by nVidia's amazing launch of the RTX 3000 series but this would seem to be the antithesis of that. Did ATi pull a rabbit out of their hat? If ATi cooked up something good and there's a mole at AMD, then nVidia's sudden "conscience" when it comes to pricing makes sense. I'm not getting my hopes up though. If they do, great. If they don't, then we're no worse off than we were before so why worry?
IMHO, there isn't a need for AMD to compete with Nvidia for the performance crown. If Big Navi can perform on par with the 3070 but at a lower price, we've got ourselves a winner. Coming close to the 3080 will be a bonus.
Nvidia will already have a good feel as to whether AMD will have competitive cards or not, and is pricing accordingly so that they don't look like idiots in a couple of months. Nvidia's aggressive pricing is because they believe AMD can compete at those tiers.
We know that AMD's RDNA 2 / Big Navi is good because of Nvidia's pricing, not despite Nvidia's pricing.
Nvidia will already have a good feel as to whether AMD will have competitive cards or not, and is pricing accordingly so that they don't look like idiots in a couple of months. Nvidia's aggressive pricing is because they believe AMD can compete at those tiers. It's no surprise that the 3070/3080 have relatively low prices while the 3090 is comparatively high: Nvidia thinks AMD can't hit the 3090, but that they will be trading blows with the 3070/3080. In contrast, when Turing was launched AMD had nothing and Nvidia knew that - and so the cards were priced high. When RDNA 1 came out, the areas where AMD could compete saw a reshuffle (the Super cards) to maintain dominance and not be undercut.
AMD will be lucky if their flagship can keep pace with the 3060. Maintaining the status quo is hard enough, given the benefits Nvidia is reaping from a node transition.
It looks like Navi 2 will be slightly faster than a 2080 Ti, which is pretty much what many people already thought was the best-case scenario. Just like it has been for years, it's going to be Nvidia at the top by themselves and AMD a few years behind.
He's talking about nVidia, not AMD. Read his post again. You've just called exactly what companies have always done "backwards logic". He said that nVidia has a feel for how good RDNA 2 is, and they don't dare charge high prices because AMD will just come along and undercut them. That would hurt nVidia's sales and make them look like greedy rip-off artists AGAIN.

Your logic is backward. If AMD could compete at the high end, it'd choose to charge high-end prices. Improving profit margin is one of the company's stated goals. It'd only use price as leverage if it can't compete on performance.
That's a pretty lame excuse, because ALL new architectures are hugely expensive to develop, and if that were the case, we'd be paying far more for Navi cards than we are. After all, it's the first non-GCN GPU that ATi has designed since 2012. However, it was priced rather nicely and AMD still turned a good profit on it, so your excuse goes right out the window. The fact is that nVidia charged what they did because they know that nVidia fanboys (and there are a ton of them) will buy GeForce cards no matter what nVidia does. They've proven this over and over again. Turing cards cost what they did because nVidia knew that they could get away with it, plain and simple.

Nvidia is a profit-margin-driven company. That's why Turing prices were what they were. That was a hugely expensive architecture to develop, and they passed those costs on to the customer so that they could maintain their margins while still covering all the R&D costs while the generation was current.
No, he didn't. He said 3060, not 2060. On that front, I'm fully expecting AMD's flagship to have slower ray tracing performance than a 3060. It's definitely going to be faster than a 3060 in rasterized graphics - most likely somewhere between the 3070 and 3080.

Not that it matters - you said AMD would be "lucky to reach RTX 2060 levels of performance" when the APU in the Xbox Series X already shows 12 teraflops. There's no way that your claim is anywhere close to being true.
I'll start off by saying that you're right, I had a typo, because why would I compare the Xbox Series X to the RTX 2060, a card that loses handily to Navi? Now, I don't know how you're changing "performance" to "ray tracing", but he didn't say that. Most people understand that when someone says "performance" they mean "rasterization", because that's what all modern GPUs have in common. If you're here posting about it, you should already know this. I think that you do, but are deliberately misunderstanding the word "performance" when it comes to GPUs, because ray tracing is nothing but a pretty "frill" with no real impact on gaming and is NEVER used to define a card's level of performance.

No, he didn't. He said 3060, not 2060. On that front, I'm fully expecting AMD's flagship will have slower ray tracing performance than a 3060. It's definitely going to be faster than a 3060 in rasterized graphics. Most likely somewhere between the 3070 and 3080.
Everything after the first sentence was my opinion, not an interpretation of what Chung was saying, which is why the second sentence says "I'm fully expecting..."

I'll start off by saying that you're right, I had a typo, because why would I compare the Xbox Series X to the RTX 2060, a card that loses handily to Navi? Now, I don't know how you're changing "performance" to "ray tracing", but he didn't say that.