News AMD Plants Big Navi Radeon 6000 Easter Egg in Fortnite

Page 2 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
IMHO most of us don't care about ray tracing - the only reason we think about it so much is because Nvidia is always hyping it up and portraying it as the next big thing. I mean, I like to see more graphical effects and stuff - but I care more about raw performance. I'd rather see a 10% increase in performance from one card to the next than, say, a 30% increase in ray tracing performance. But I do agree - I don't think AMD will match Nvidia in RT - raw performance, maybe. If I had to guess, they'll probably have a form of ray tracing that barely impacts performance compared to Nvidia's RT, but it won't look as good.
All the eye candy in the world doesn't matter if the game sucks. I still get on my N64 and play Zelda Ocarina of Time. Still an amazing game more than 20 years after its release. Sure it would be nice if you could get it on Switch in 1080p and upgraded textures. But the lack of graphical quality doesn't affect the fun of the game.
 
I'd like to believe that they'll release a very good card. Then again, they underhyped Zen too. They've barely hyped up Navi - especially compared to how many RTX 3000 rumors we've gotten, and stuff like that. Excluding the reveal, the RTX 3000 series has been one of the hottest topics since around May - on most tech outlets.

I can see why you're hesitant - actually, though, it makes sense that they only offered a preview to one reviewer (it'll get the same press regardless, but it simplifies things for Nvidia). What I don't know is why specifically Digital Foundry - why not Gamers Nexus or something? All the same, Digital Foundry is pretty respected.
Yeah, they barely hyped up Navi and I think that this is because Lisa Su finally put a leash on AMD's marketing division and that's a good thing. As for why they didn't send it to Gamers Nexus... Steve tends to be brutally honest in a very entertaining way (did you see his review of Intel's 11th-gen mobile launch? It's priceless! LOL) and nVidia has a tendency to be the antithesis of that. I'm forced to wonder what concessions DF made to get their hands on that early hardware because I'm sure that they made some.
I look forward to seeing what AMD has coming. Prior to purchasing my Nvidia 1070, I had an AMD 7950. Specs matter more than some brand loyalty. There have been random leaks throughout the year that show AMD should have something competitive with the 3080.
I'd like to think so but I felt the same way for each generation and I was sorely disappointed when the RX 4xx/5xx series came out. I already had twin Gigabyte Windforce HD 7970s (when Crossfire still worked) and the performance difference wasn't worth the upgrade cost. I got a nice reprieve when newegg had that insane sale on the Sapphire R9 Fury Nitro OC so that was my upgrade instead. I'm not looking to upgrade because I just got my XFX RX 5700 XT Triple-Dissipation for $480CAD ($360USD at the time) so I'm satisfied with what I got as it does all that I want it to. I just want to see prices fall even further (if that's even possible).
Having AMD so dominant in the console space also helps given that most games are just console ports.
You'd think so, but the PS4 and XBox One both used AMD CPUs with ATi GPUs and it didn't seem to make any difference in that respect.
I'm always suspicious of streamer or Fortnite marketing though.
Yeah, especially since Fortnite is an nVidia title that will be getting RTX enabled. I'm actually surprised by this because I would've expected it to be put into Counter-Strike Potato Offensive where reflections could help see enemies better. Well, that and the fact that the graphics in Fortnite look like something from 2005. They are even worse than PUBG. CS:GO has much more realistic graphics and would have been a more natural choice IMO.
 

Turtle Rig

Once again AMD shows why they're second best in video cards in 2020. nVidia launches their 3xxx series when even the 2xxx series already trounced the 5700XT - big navi, call it what you want, PCIe 4.0 BS and all that nonsense. Now the best they can do, just days after nVidia's launch, is to plant an easter egg in Fortnite for all the kids to see. Come on now. Not to start a flame war, but AMD is always trailing, so they must sell cheap and people buy. Sure, a 3950X and a 5700XT are both top-of-the-line consumer CPU and GPU as of now, and that sounds good, but a 10900K with a 2080 Ti sounds much better even with 6 fewer cores. TLDR 😂☮👩‍🦲🚔👌💯

Slow cores, mind you, compared to 5.2GHz across the board, so it will really behave like a 14-core, so it's close with AMD. OK, AMD has 7nm on their CPU die, but 7nm or 14+++nm, it's all the same to me. Call me when there is 1nm, then I will say fine. Also, AMD's introduction of PCIe 4.0 was crummy. Saying nonsense like it's 60 percent faster, and then SSDs coming out with PCIe 4.0 and m.2 drives and whatnot. Sure, it all sounds good on paper - the price point, the performance - but in the real world you will not get these so-called 60 percent faster speeds. Once again, call me when PCIe 6.0 comes out, then maybe I will say there is a reason. As for NAND drives, they're all the same: OS boot time is the same, app load time is the same. These figures they dish out are not real-world figures. A 500MB/s SATA SSD and an m.2 drive will see no difference in real-world performance, except if you transfer from m.2 to m.2 or m.2 to SSD or something like that, but for real-world stuff you won't see a difference, and you won't see a difference with PCIe 4.0 m.2 drives nor PCIe 4.0 video cards like the 5700XT, which barely beats a 2060 on a good day. nVidia is not stupid; they know 80 percent of their users are on PCIe 3.0, so they built a product for that. If you have PCIe 4.0 then fine, but you won't see a real-world difference; only in benchmarks will you see a difference. 😲✌👍🎗✝🤷‍♀️
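On the "60 percent faster" point: the raw PCIe link rate really does double from 3.0 to 4.0, even though app-level load times barely move - which is exactly the gap between paper specs and real-world figures being complained about here. A rough sketch of the theoretical numbers (assuming the standard 128b/130b encoding used by PCIe 3.0 and later; the `pcie_gbps` helper is just for illustration):

```python
def pcie_gbps(gtps, lanes):
    # transfer rate (GT/s) per lane x lane count x 128b/130b encoding
    # efficiency, divided by 8 bits/byte -> usable GB/s
    return gtps * lanes * (128 / 130) / 8

gen3_x4 = pcie_gbps(8, 4)    # ~3.94 GB/s (typical PCIe 3.0 x4 NVMe link)
gen4_x4 = pcie_gbps(16, 4)   # ~7.88 GB/s (PCIe 4.0 x4 NVMe link)
print(gen4_x4 / gen3_x4)     # 2.0 - double on paper, far less in practice
```

So the marketing number is real at the link level; it just doesn't translate into boot or app-load times, which are bottlenecked elsewhere.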
 
Once again AMD shows why they're second best in video cards in 2020. nVidia launches their 3xxx series when even the 2xxx series already trounced the 5700XT - big navi, call it what you want, PCIe 4.0 BS and all that nonsense. Now the best they can do, just days after nVidia's launch, is to plant an easter egg in Fortnite for all the kids to see. Come on now. Not to start a flame war, but AMD is always trailing, so they must sell cheap and people buy.
Posts like this are inflammatory and don't paint you in the best light. You claim that you don't want to start a flame war in exactly the kind of post that starts flame wars. Your open disdain for a brand that has been making video cards for far longer than any other company in existence (my first build in 1988 had an ATi EGA Wonder) only demonstrates your ignorance. If ATi made GPUs that were as bad as you think they are (which suggests that you never actually OWNED one), they'd never have survived this long. Sure, AMD bought ATi, but it paid a pantload for it because ATi had a serious name and serious tech with the history to back it up.

If you think that ATi has been behind nVidia for so long, then you must be a kid, because the last time Radeons were actual challengers at the very top against nVidia was only five years ago - a very short time for someone who has been an adult for a while. In fact, during the terrible FX years, the ATi side of AMD is what kept AMD solvent. Without ATi, AMD wouldn't exist today to give us such great things as Ryzen CPUs. ATi suffered because its R&D budget was cut in favour of developing Zen (it turns out that was the right move), but it does have some serious catching up to do as a result. ATi literally held AMD back from falling into the chasm of bankruptcy, and for that, we owe them some gratitude. The way you talk, it sounds like you'd be perfectly fine with paying $600+ for a quad-core Intel CPU.

ATi has only been solidly behind nVidia (and only at the very top level) since the release of the GTX 10xx series, because the R9 Fury-X, while more primitive than the GTX 980 Ti, was very much its equal, with the biggest fps "win" for the 980 Ti being a measly 10fps. Here's a "massive" 2fps win for the 980 Ti (the margin was usually around 4-6fps):
(attached benchmark chart: Primal-p.webp)

Even if you say (and it would be mostly true) that the Fury-X was the second-most-potent video card at the time, its far more popular little brother, the R9 Fury was third:
"The Fury occupies its own niche in the high-end graphics market. It's comfortably able to beat the GTX 980 in benchmarks, but it can't match the GTX 980 Ti and Fury X for performance. That pattern repeats at the checkout: it's more expensive than the GTX 980, but significantly cheaper than the other two cards. " - trustedreviews.com: October, 2015

So in 2015, two of the top three video cards in the world were ATi Radeons - a mere five years ago. That's not a long time (for an adult); it's just two GPU generations. If you want a long time to be far behind the competition, look at AMD being behind Intel for 13 years. Remember that even when ATi was at its farthest behind nVidia, its products were still usable by the majority of people (who don't spend upwards of $500 on video cards) - a much smaller gap than the one between Intel and AMD.

By the way, where is AMD now? Kicking Intel's ass in desktop, data centre and now even mobile? Yeah sure, ATi should just quit, eh?
 

Shadowclash10

I look forward to seeing what AMD has coming. Prior to purchasing my Nvidia 1070, I had an AMD 7950. Specs matter more than some brand loyalty. There have been random leaks throughout the year that show AMD should have something competitive with the 3080. Having AMD so dominant in the console space also helps given that most games are just console ports.

I'm always suspicious of streamer or Fortnite marketing though.
Yeah, I'm definitely suspicious of streamer/Fortnite advertising. Specs definitely matter more than brand loyalty. Although I wouldn't necessarily say specs - more real-world performance (but I think that was what you were getting at anyway). We're somewhat past the stage where AMD was infamous for drivers and that stuff - although that whole VBIOS affair was bad. It's more the fact that for the last 2 generations (I guess three now that Ampere is kinda out) Nvidia has dominated the market - and AMD hasn't really had a good top end/mid-range variety. Sadly. Competition is always good.
 
Yeah, I'm definitely suspicious of streamer/Fortnite advertising. Specs definitely matter more than brand loyalty. Although I wouldn't necessarily say specs - more real-world performance (but I think that was what you were getting at anyway). We're somewhat past the stage where AMD was infamous for drivers and that stuff - although that whole VBIOS affair was bad. It's more the fact that for the last 2 generations (I guess three now that Ampere is kinda out) Nvidia has dominated the market - and AMD hasn't really had a good top end/mid-range variety. Sadly. Competition is always good.
You know what the funny thing was about the driver issues? A good amount of them were probably user-caused, especially on AMD rigs. People weren't updating their chipset drivers, and that was probably causing the vast majority of issues. Since I'm always on top of that stuff, I never encountered the issues, and review sites never encounter them because they're always using up-to-date drivers. I bet that the average person installs the chipset drivers that come with their motherboard and never thinks about them again.

There's a guy on YouTube from Portugal who runs a channel with a really strange (but cool) sounding name, "Ancient Gameplays", and he posted a 7-minute video about how to easily fix all the driver issues related to the RX 5xxx series. When I came across it, I watched it out of curiosity, and it was a real lightbulb moment: I finally understood why I never had driver issues with Radeons while others often have had terrible ones:
View: https://www.youtube.com/watch?v=F1dQoJtkI-c&list=WL&index=6
 

escksu

Imho, there isn't a need for AMD to compete with nvidia for the performance crown. If big navi can perform on par with the 3070 but at a lower price, we've got ourselves a winner. Coming close to the 3080 will be a bonus.

No doubt the 3090 will be the fastest gaming GPU on earth. Unfortunately, the price is out of reach for most of us. The 3080 is not exactly affordable either. Only the 3070 would be something closer to what many of us could cough up. $399 at 2080 Ti performance would be an awesome combination.
 

Vesperan

That's really strange. I would've expected AMD to be intimidated by nVidia's amazing launch of the RTX 3000 series but this would seem to be the antithesis of that. Did ATi pull a rabbit out of their hat? If ATi cooked up something good and there's a mole at AMD, then nVidia's sudden "conscience" when it comes to pricing makes sense. I'm not getting my hopes up though. If they do, great. If they don't, then we're no worse off than we were before so why worry?

We know that AMD's RDNA 2 / Big Navi is good because of Nvidia's pricing, not despite Nvidia's pricing.

Nvidia will already have a good feel for whether AMD will have competitive cards or not, and is pricing accordingly so that they don't look like idiots in a couple of months. Nvidia's aggressive pricing is because they believe AMD can compete at those tiers. It's no surprise that the 3070/3080 have relatively low prices while the 3090 is comparatively high: Nvidia thinks AMD can't hit the 3090, but that it will be trading blows with the 3070/3080. In contrast, when Turing launched, AMD had nothing and Nvidia knew it - and so Turing was priced high. When RDNA 1 came out, the areas where AMD could compete saw a reshuffle (the Super cards) to maintain dominance and avoid being undercut.

Competition is good for everyone no matter your preferred brand.
 

Chung Leong

Imho, there isn't a need for AMD to compete with nvidia for the performance crown. If big navi can perform on par with the 3070 but at a lower price, we've got ourselves a winner. Coming close to the 3080 will be a bonus.

AMD will be lucky if their flagship can keep pace with the 3060. Maintaining the status quo is hard enough, given the benefits Nvidia is reaping from a node transition.
 

Chung Leong

Nvidia will already have a good feel for whether AMD will have competitive cards or not, and is pricing accordingly so that they don't look like idiots in a couple of months. Nvidia's aggressive pricing is because they believe AMD can compete at those tiers.

Your logic is backward. If AMD can compete at the high end, it'd choose to charge high-end prices. Improving profit margin is one of the company's stated goals. It'd only use price as leverage if it can't compete on performance.
 

spongiemaster

We know that AMD's RDNA 2 / Big Navi is good because of Nvidia's pricing, not despite Nvidia's pricing.

Nvidia will already have a good feel for whether AMD will have competitive cards or not, and is pricing accordingly so that they don't look like idiots in a couple of months. Nvidia's aggressive pricing is because they believe AMD can compete at those tiers. It's no surprise that the 3070/3080 have relatively low prices while the 3090 is comparatively high: Nvidia thinks AMD can't hit the 3090, but that it will be trading blows with the 3070/3080. In contrast, when Turing launched, AMD had nothing and Nvidia knew it - and so Turing was priced high. When RDNA 1 came out, the areas where AMD could compete saw a reshuffle (the Super cards) to maintain dominance and avoid being undercut.

Nvidia is a profit-margin-driven company. That's why Turing prices were what they were. That was a hugely expensive architecture to develop, and they passed those costs on to the customer so that they could maintain their margins while still covering all the R&D costs while the generation was current. Ampere, despite being much faster, is an evolution of Turing, and development costs were likely much lower. It's also likely Nvidia got a pretty sweet deal to switch to Samsung, and combined with the smaller dies vs. Turing, the cards could be cheaper to produce as well. Nvidia isn't going to price their cards at levels that will negatively affect their margins in any meaningful way.

The leaks have already started suggesting that Navi 2 can't compete with the top-end 3000 series.

AMD Reportedly Dropping “Big Navi” Prices Following GeForce RTX 30 Series Announcement

AMD wanted to charge $600 for the 16GB card and has decided to drop it to $550. If this card were as fast as or faster than the 3080, and had 60% more memory while already being $100 cheaper, why would the announcement of the 3080 persuade AMD to drop the price by $50? It looks like Navi 2 will be slightly faster than a 2080 Ti, which is pretty much what many people already thought was the best-case scenario. Just like it has been for years, it's going to be Nvidia at the top by themselves and AMD a few years behind.
 

Chung Leong

It looks like Navi 2 will be slightly faster than a 2080 Ti, which is pretty much what many people already thought was the best-case scenario. Just like it has been for years, it's going to be Nvidia at the top by themselves and AMD a few years behind.

The RTX 2060 has around 70% of the performance of the 2080, while the GTX 1660 has around 50%. If the 3080 really does double up on the 2080 Ti and Nvidia maintains the same distance between tiers, even the budget 3050 could be at the former flagship's level. This is like an extinction-level event.
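Taking the ~70%/~50% tier ratios above and the "doubles the 2080 Ti" claim at face value (rough marketing-level figures, not measurements), the arithmetic sketches out like this:

```python
# Share of the x80-class card's performance for each lower tier,
# per the ballpark ratios quoted above (assumptions, not benchmarks).
x60_share, x50_share = 0.70, 0.50
claimed_3080_vs_2080ti = 2.0  # the (optimistic) doubling claim

hypo_3060 = x60_share * claimed_3080_vs_2080ti  # 1.4x a 2080 Ti
hypo_3050 = x50_share * claimed_3080_vs_2080ti  # 1.0x - former-flagship level
print(hypo_3060, hypo_3050)   # 1.4 1.0
```

Of course, if the real 3080 uplift is smaller than 2x, every number here shrinks proportionally.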
 
Your logic is backward. If AMD can compete at the high end, it'd choose to charge high-end prices. Improving profit margin is one of the company's stated goals. It'd only use price as leverage if it can't compete on performance.
He's talking about nVidia, not AMD. Read his post again. You've just called exactly what companies have always done "backwards logic". He said that nVidia has a feel for how good RDNA2 is and they don't dare charge high prices because AMD will just come along and undercut them. That would hurt nVidia's sales and make them look like greedy rip-off artists AGAIN.

Not that it matters, you said AMD would be "lucky to reach RTX 2060 levels of performance" when the APU in the Xbox Series X already shows 12 teraflops. There's no way that your claim is anywhere close to being true.
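For what it's worth, the 12-teraflop figure follows directly from the Series X's published GPU specs (52 CUs, 64 shaders per CU, 1.825 GHz, two FP32 ops per shader per clock via fused multiply-add). A quick sanity check:

```python
# Theoretical FP32 throughput from the published Xbox Series X GPU specs.
cus, shaders_per_cu, clock_ghz = 52, 64, 1.825
ops_per_clock = 2  # one FMA counts as two floating-point operations

tflops = cus * shaders_per_cu * ops_per_clock * clock_ghz / 1000
print(round(tflops, 2))  # 12.15
```

Theoretical teraflops aren't directly comparable across architectures, but they do put a floor under what RDNA 2 silicon can deliver.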
 
Nvidia is a profit-margin-driven company. That's why Turing prices were what they were. That was a hugely expensive architecture to develop, and they passed those costs on to the customer so that they could maintain their margins while still covering all the R&D costs while the generation was current.
That's a pretty lame excuse because ALL new architectures are hugely expensive to develop and if that were the case, we'd be paying far more for Navi cards than we are. After all, it's the first non-GCN GPU that ATi has designed since 2012. However, it was priced rather nicely and AMD still turned a good profit on it so your excuse goes right out the window. The fact is that nVidia charged what they did because they know that nVidia fanboys (and there are a ton of them) will buy GeForce cards no matter what nVidia does. They've proven this over and over again. Turing cards cost what they did because nVidia knew that they could get away with it, plain and simple.

You sound like a parent trying to explain away their child's bad behaviour because they love them and don't want to admit that they're not angels. The difference here is that you're defending one of the most indefensible corporations in tech history with a track record that is nothing short of terrible. What on Earth makes you love nVidia enough to do this?
 

spongiemaster

Not that it matters, you said AMD would be "lucky to reach RTX 2060 levels of performance" when the APU in the Xbox Series X already shows 12 teraflops. There's no way that your claim is anywhere close to being true.
No, he didn't. He said 3060, not 2060. On that front, I'm fully expecting AMD's flagship will have slower ray tracing performance than a 3060. It's definitely going to be faster than a 3060 in rasterized graphics. Most likely somewhere between the 3070 and 3080.
 
No, he didn't. He said 3060, not 2060. On that front, I'm fully expecting AMD's flagship will have slower ray tracing performance than a 3060. It's definitely going to be faster than a 3060 in rasterized graphics. Most likely somewhere between the 3070 and 3080.
I'll start off by saying that you're right, I had a typo - why would I compare the Xbox Series X to the RTX 2060, a card that loses handily to Navi? Now, I don't know how you're changing "performance" to "ray tracing", but he didn't say that. Most people understand that when someone says "performance" they mean rasterization, because that's what all modern GPUs have in common. If you're here posting about it, you should already know this. I think that you do, but you're deliberately misunderstanding the word "performance" when it comes to GPUs, because ray tracing is nothing but a pretty "frill" with no real impact on gaming and is NEVER used to define a card's level of performance.

I've read several of your posts and you're a total nVidia fanboy. You defend nVidia at every turn and do your best to smear AMD. You even say absolute BS like "AMD hasn't been aggressive on GPU prices in years" when that's exactly what AMD has been doing for a very long time. Aggressive pricing means LOW pricing, because it's an aggressive tactic to increase sales. Either you have a very limited understanding of the tech industry or you have some other reason for being GREEN TEAM, GREEN TEAM, RAH-RAH-RAH! God knows it isn't because nVidia has been particularly GOOD to you, because they haven't been GOOD to anyone except themselves.

I'm guessing that you're a kid because someone who has been around for a long time has seen the industry evolve and knows that while no company is deserving of love, some are definitely deserving of hate. Your love for nVidia is nothing that anyone over the age of thirty would have because as we get older, we get far less emotionally involved and become more logical. Over the years, I've seen the crap that nVidia has pulled. I was so disgusted by them that I stopped buying their cards. I did this because I've had many cards made by ATi in the past and I knew that they were just fine. The same thing happened with Intel. If AMD had ever committed terrible actions like Intel and nVidia then I wouldn't care and would still buy Intel and nVidia but they haven't. My rig is all-AMD, not because I love them (I don't), but because they haven't done anything in all their decades of existence to turn me off of them.

Your "Green Team" membership doesn't make you look good. It makes you look inexperienced. People with actual experience don't have a "Team" that they cheer for. They want competition between all teams because that's ultimately better for all of us. One day, you'll figure that out but it appears that today is not that day.
 

spongiemaster

I'll start off by saying that you're right, I had a typo - why would I compare the Xbox Series X to the RTX 2060, a card that loses handily to Navi? Now, I don't know how you're changing "performance" to "ray tracing", but he didn't say that.
Everything after the first sentence was my opinion, not an interpretation of what Chung was saying, which is why the second sentence says "I'm fully expecting..."

As for the rest of it, the video card in the first PC I personally owned was an STB Powergraph 64 in a Quantex Pentium 133, which was brand new at the time. My first real 3D accelerator was the original 3dfx Voodoo paired with a Matrox Millennium. So, basically, everything you were so sure you knew about me was wrong. It's odd that you don't see the hypocrisy, and the laughable nature, of your "don't get emotional" statement while inventing a three-paragraph bio for someone you don't know at all, as if that were a shining example of mental stability.