AMD teases possible Radeon RX 6000 “Big Navi” via Fortnite Easter Egg in a custom map

It appears that AMD worked with Fortnite Creative Island modder @MAKAMAKES to create a custom Battle Arena map in Fortnite Creative. This leak comes via HotHardware.

Now, AMD-sponsored Fortnite streamer @MissGinaDarling has discovered an Easter egg in one of the new Fortnite maps. The Easter egg appears to be a hidden quest that requires the code 6000 to be entered.

Players can choose between three different game modes, and one of the maps appears to contain a baked-in Easter egg referencing Big Navi’s model numbering, or at least that appears to be the case for now. At the end, a message is shown that reads: “SOMETHING BIG IS COMING TO THE AMD BATTLE ARENA!” This could be a reference to the upcoming Big Navi GPU, an indirect confirmation from AMD.

To quote TomsHardware, “AMD’s Scott Herkelman, CVP & GM at AMD Radeon, congratulated Gina on finding the easter egg, which surely has to mean something.”

The 6000 numbering makes sense if we compare it with the AMD Radeon RX 5000 series of cards. Fortnite’s AMD Easter egg is hidden in the Battle Arena map, which is basically an AMD-themed stadium. We expect the Big Navi consumer graphics cards to launch before the next-generation consoles, which are based on the same RDNA 2 architecture, hit the market. The next-gen consoles are scheduled to launch in November, so we expect AMD to announce the Big Navi cards around that timeframe.

According to HotHardware:

“To figure out how this Easter egg was found, you also have to get into the AMD Battle Arena. The AMD Battle Arena map is a Fortnite map where AMD fans get to duke it out in an AMD-themed stadium. You must get into either a Free-For-All or Capture The Flag Map to find the Easter egg.” “In any case, if you want to get into these games and find the Easter egg for yourself, you can do so by going into creative mode and entering the code 8651-9841-1639 to get to the map”.

Aside from the Easter egg, a contest is also being run: users who capture an in-game action clip can share it on social media with the tag #AMDSweepstakesEntry for a chance to win a Maingear PC powered by AMD.

AMD Battle Arena – A Fortnite Creative Map

55 thoughts on “AMD teases possible Radeon RX 6000 “Big Navi” via Fortnite Easter Egg in a custom map”

  1. I doubt AMD will make that big of a jump to catch up to NVIDIA. That being said, I’d love it if they had counters to every NVIDIA card announced to date. We’d be in a glorious consumer spot.

    Crossing fingers

    1. The 5700 XT is an amazing card. Of course they can bump the performance easily; it was just a test-run mid-range card with a new architecture and it performed great. Imagine a high-end card by them, bro lol.

    2. Nvidia doesn’t have everything. They have the performance crown, but not everyone is looking for that kind of $/performance ratio. Mainstream gaming isn’t at the $1,400 USD price point; it’s more likely to be at the $500 USD price point, which puts the 3070 Ti against whatever AMD will have (let’s hope something that beats it for the same if not a lower price). We’ll see. AMD could very well surprise us like they did with their CPUs, but crushing the 3090 at this point is very unlikely. Who shall live shall tell 🙂

      1. According to Steam hardware surveys, AMD is in a really poor state, and that’s what matters in PC gaming. Nvidia doesn’t only have the high-end cards. No one cares if AMD gets their stuff into consoles, since that does not help the competition on the PC side one bit.

        1. That’s because people are gullible. Nvidia is known to have the performance crown, hence people think Nvidia is better in every case: performance, $/fps ratio, etc. I bet if AMD had the performance crown you’d see much more red in those hardware surveys; they wouldn’t be the top dog, they’d be in the middle to low range, because that’s just how people are. We want the best for what we have (money).

          2 cents.

        2. It’s all about the mindshare. Even if AMD came up with a superior GPU, most people would buy Nvidia because they’ve always done so and their drivers are generally better.

          1. Yes. I bought one AMD GPU. When it worked properly it was nice but the drivers were horrible.

            Also, the fan broke after a few months, but that’s just bad luck.

      2. Very, very few gamers can afford a $1,500 card or even need one. The vast majority are running entry level to midrange cards. I honestly don’t know why gamers judge AMD a failure just because they don’t offer competition for the less than 1% of gamers buying at the high end.

      1. I will answer some of your questions properly afterwards, but I don’t think AMD’s ray tracing is going to be very different. As you know, DirectX Raytracing (DXR) is actually a feature of DirectX 12 that allows for “hardware” real-time raytracing. It’s a compatible extension to DirectX 12. We have seen this on GeForce 20 series GPUs, and in the future AMD might have their own compatible HW for this.

        Also, DXR has never been an NVIDIA-exclusive thing. Every DX12-capable card can access and use it; it’s just slow (depending on the HW). DXR adds a raytracing pipeline, which can be compared to the existing graphics/compute pipeline states. So compatible AMD hardware should also be able to take proper advantage of this feature, IMO (I’ve put a small capability-check sketch at the end of this comment).

        Apart from that, some DirectX Raytracing applications which are accelerated by RTX might also require Microsoft’s DXR developer package, which consists of DXR-enabled D3D runtimes, the HLSL compiler, and headers, depending on the case. So AMD might have an answer.

        BTW, AMD has already claimed that all of its current DX12 graphics cards support ray tracing via Microsoft’s DXR fallback layer. On the driver level, the support will be there. But this fallback layer is just an emulation layer provided by MS that can run on any D3D12-compatible GPU. It was originally meant so that developers could learn the API (without needing DXR-capable hardware), and it was hardly intended to run actual games.

        Once Turing’s ray tracing HW came out on the market, the fallback layer’s development was more or less halted, as it was deemed unnecessary. That it is technically supported was never in question; the question is how fast they can do it, and my guess is not very fast, otherwise they would’ve already talked about it and shown some examples.

        But AMD is still free to provide DXR support through their D3D12 drivers. Any D3D12 GPU is actually capable of running this DXR code, since it is just an extension of DirectCompute.

        Slow performance remains a totally different issue though (imagine running the same operations on a GPU without specialized processors/cores). But as per one recent post, “The fallback layer isn’t maintained anymore and it’s unlikely that developers will use the codebase for ray tracing support under GPUs which don’t support DXR directly.” Performance, however, appears to be underwhelming via this “emulation/software” method.

        This could explain why AMD has not enabled the real-time ray tracing fallback layer in its drivers, as its GPUs currently lack hardware components that could accelerate the ray tracing calculations. Theoretically, AMD’s DX12 GPUs can support real-time ray tracing.

        Most importantly, I think MS’s DXR requires a certain hardware feature level as well, which I presume is 12_1. If I’m not wrong, some AMD GPUs are still on 12_0, except Vega. Right now I’m just writing all this from memory, so there might be some mistakes as well, lol.

        But anyway, Nvidia’s Pascal and Turing cards support this feature level, with Turing probably having an even higher version of it, IMO. Call it 13_0, or rather 12_2? So I’m wondering whether the 12_2 hardware feature level would be required to support DXR, or whether it can also run on 12_1, though NOT without dedicated RT cores (on 12_1)? That seems a bit doubtful from a performance perspective. My bet is still on 12_2 though.

        Which means there might be a HIGH performance cost if we implement DXR on anything lower than 12_2/13_0, which is where AMD’s Big Navi GPU also comes into the picture.

        We can expect AMD’s next-gen architecture to have proper support for the above hardware feature level, i.e. 12_2, most probably on the high-end Navi 21 GPU and beyond. I suppose from a financial standpoint it makes sense for AMD to follow this roadmap, and not just rush to implement DirectX Ray Tracing in games.
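
        To make the “DXR is just a D3D12 capability” point concrete, here is a rough sketch (purely my own illustration, not anything AMD or Microsoft has published for Big Navi) of how an application can ask the driver at runtime, via CheckFeatureSupport, for its raytracing tier and maximum feature level. Whether the driver reports a tier at all is what separates hardware DXR from the deprecated fallback layer:

        // Rough capability-check sketch (illustration only). Vendor-agnostic:
        // it behaves the same on AMD and NVIDIA hardware. Needs a recent
        // Windows 10 SDK (D3D_FEATURE_LEVEL_12_2 is only defined in newer SDKs).
        #include <d3d12.h>
        #include <dxgi1_6.h>
        #include <wrl/client.h>
        #include <cstdio>
        #pragma comment(lib, "d3d12.lib")
        #pragma comment(lib, "dxgi.lib")

        using Microsoft::WRL::ComPtr;

        int main()
        {
            ComPtr<IDXGIFactory6> factory;
            if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory)))) return 1;

            ComPtr<IDXGIAdapter1> adapter;
            factory->EnumAdapters1(0, &adapter);  // first adapter only, for brevity

            ComPtr<ID3D12Device> device;
            if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_12_0,
                                         IID_PPV_ARGS(&device)))) return 1;

            // DXR support is reported through the OPTIONS5 capability block.
            D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
            device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));
            if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
                std::printf("Hardware/driver DXR exposed (tier enum value %d)\n",
                            int(opts5.RaytracingTier));
            else
                std::printf("No DXR tier exposed; only compute-based emulation would apply\n");

            // Highest feature level the device claims (12_0, 12_1, 12_2, ...).
            const D3D_FEATURE_LEVEL requested[] = {
                D3D_FEATURE_LEVEL_12_2, D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0 };
            D3D12_FEATURE_DATA_FEATURE_LEVELS fl = {};
            fl.NumFeatureLevels        = _countof(requested);
            fl.pFeatureLevelsRequested = requested;
            device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &fl, sizeof(fl));
            std::printf("Max supported feature level: 0x%x\n", unsigned(fl.MaxSupportedFeatureLevel));
            return 0;
        }

        A game engine would do essentially the same check and then decide whether to enable its DXR path, which is why the tier/feature-level question matters more than the vendor.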

        1. Ah sorry, I think I went a bit off-topic with regard to some of your questions, lol. I overlooked your query. I will answer again properly; right now my head has been spinning pretty badly since morning!

          But anyway, you are correct, it will all depend on how the game engine sees Nvidia HW (RTX APIs and drivers) and AMD HW. Maybe some game patches and/or driver updates will be needed to make the current RTX games compatible with AMD’s hardware? Might be possible.

          The DX12 feature level will also play a role, apart from the hardware, and game patches might also be required. First of all, we don’t have any info on how AMD is going to implement ray tracing in Big Navi’s hardware. We also never got any leaked block diagram of Big Navi’s GPU core and die.

          I mean, will they place the dedicated RT cores on a separate die from the stream processors, and will the RT cores do all the ray tracing/triangle intersection work separately? In Nvidia’s design, CUDA cores and RT cores each have their own work to do.

          But regardless of how AMD implements ray tracing in their hardware, the DX12 feature levels and support for DXR should help.

    1. I don’t really believe that’s the new cooler. If they wanna pull 2080 Ti performance they have to use something else… Or is it a mid-range card?

      1. I don’t know. Yeah, that might just be a placeholder Image. I don’t think that’s a Big Navi design though, seems unlikely.

        Also, we don’t know whether AMD’s reference Big Navi cards will again have a blower-type cooler like the one shown in the image, so that’s probably not a Big Navi card.

      2. Coreteks is wrong all the time. I have no idea where this guy gets his info, but his recent claims of AMD being surprised by Nvidia’s performance and having to change their plans are ridiculous.

        He also said that there will be no high end Navi despite AMD saying they will compete in the high end.

    1. As much as I hate big business & Jeff “The Leftists” Bezos, there is one thing I’ve always admired the man for, and it has led to Amazon’s success. He said, “I never worried about what the competition is doing. The only thing I care about is the consumers’/customers’ happiness/satisfaction.” All you people ever do is talk about how AMD needs to do WHAT Nvidia is doing. I don’t give a rat’s a$$ about AMD being like Nvidia. There is a reason why I buy AMD, and it’s because IT’S NOTHING LIKE NVIDIA. I love what AMD does; they do what they do at their own pace.

      You want Nvidia-like tech, then go BUY NVIDIA. Oh wait, you’ll STILL buy NVIDIA no matter what. Because like many people, you actually think that if AMD “competes” with NVIDIA, you’ll get a cheaper Nvidia card… Uhm, no you won’t. But y’all are hell-bent on thinking that garbage. And if AMD does what you want, you won’t see a cheaper AMD card; they’ll jack their prices up as well, because no one is into giving away their bread & butter. Besides, you’ll just buy NVIDIA anyway. You can always tell who has actually owned an AMD card and who hasn’t. Vice versa for Nvidia as well.

      I love AMD and where they are at. I don’t give a rat’s a$$ if they “compete” with Nvidia, because they really don’t have to; well, they do compete, but not in the way y’all would like. And I’m GLAD THEY DON’T!

      1. Cool fanboy rant. I don’t think most people are actually fanboys, as evidenced by AMD selling 40% more CPUs than Intel in the last quarter. The reason I have a Ryzen processor in my PC right now is that it offers better performance for what I do at a better cost than Intel’s processors. I’ve gone back and forth the same way with ATI/AMD and Nvidia GPUs.

        I do favor certain companies based on my user experience, and from my experience throughout the years I will say Nvidia has better driver support. While I do like to support the underdog, AMD just hasn’t been able to compete much with their GPUs outside of the mid-range tier, and I purchase high-end to enthusiast-tier cards nowadays.

        When ATI was around, the two companies competed to top each other’s performance every generation, with stupidly priced flagship cards released for the bragging rights of having the fastest GPU. Since AMD doesn’t even bother to compete with Nvidia’s Titan ultra-enthusiast tier, the perception that Nvidia makes the fastest GPUs will remain the general one, but when it comes time to purchase a card, people are going to look at benchmarks, features, and cost.

        Right now AMD needs to address performance, ray tracing, and DLSS, and if they have nothing that can compete with an RTX 3080 out of the gate come November, I will once again be upgrading to an Nvidia card. If I were looking to buy a 3070 or a 3060, I would be excited by AMD’s announcement. I am just not, and Nvidia already has a 3070 Ti with 16 GB of RAM sitting on the shelf, waiting for AMD to target the 3070, because under AMD, ATI is just not as aggressive when it comes to competing.

        It’s not an Nvidia fanboy thing; it’s a customer satisfaction thing, a performance thing, an exclusive game features thing, and a marketing thing. Nvidia beats AMD at all of these, but if I were looking for a mid-range card late into the previous generation, I would buy a 5700 XT, because as I said, I like supporting the underdog. I purchased a Phenom X4 955, which was a pretty bad processor by all accounts, but it served me for a few years.

  2. Why? You don’t know if it will; you have no idea.

    AMD didn’t fail with the 5700 XT. It was a very good card for its pricing and performance.

    1. Solid a$$ cards, man. I wanted to get one so bad. I got a 5600 XT for my woman and I love the card, but I’m waiting for Big Navi for myself. I was very impressed with the 5600 XT.

    2. It’s slightly slower than the 2070S (while being a lot less advanced), with a slightly cut-down price in some parts of the world; in others it’s the same money.

      Now they are gonna have to cut it to $200 if they wanna sell those things.

      1. Haha, yeah boi. I knew you’d deliver. Told you, man, the first thing I do when I wake up is open DSOG & GoG. Thanks, homie. I’m ready for a 16 GB card at a reasonable price. I hope AMD delivers on that; these 8 GB cards ain’t cutting it no more. They are getting maxed out even at 1080p, because AMD will use as much memory as it has. That’s what I like about AMD cards: they don’t care, if the memory is there it’s gonna use it. 16 GB should keep me good for a while. Been using my woman’s new projector that I got her with the new comp I built for her, and at 120 Hz @ 150 inches, I can’t go back, bruh.

        My biggest wish is that Optoma and other projector companies start making more 21:9 lenses, which is 2.35:1. I’m kinda feeling 21:9 gaming these days. Fingers crossed, brother; AMD will make us happy.

        1. Yeah, I want a 3070, but the low VRAM is giving me serious pause. I remember what happened to all the 2 GB cards when the last-gen consoles got going. Stutter city.

          1. They know what they are doing. They are making sure you always come back. A card with that much speed should not have such low VRAM. With these new consoles coming along, we know these devs are just gonna dump everything into VRAM, and it’s basically going to be the same stuttering fest we saw at the last console-generation switch all over again.

  3. It’s about time AMD showcases more of Big Navi, though I have a bad feeling about this. Nvidia has already outed a serious contender to the RTX 2080 Ti in the form of the RTX 3080, at a much more affordable cost.

    Can AMD match their cards, or offer the same performance at a lower price point?

  4. I’m not buying anything until AMD launches their stuff. If nothing else, they’ll likely bring Nvidia’s pricing down a couple of notches again.

  5. I want to get the 3080, but geez, with everything going on with AMD as of late, I just have this weird feeling that waiting an extra month or two might be worth it for performance and a little more VRAM.

          1. Yup. Apparently they didn’t like people choosing FreeSync monitors over theirs, so they allowed AMD cards to use FreeSync on their G-Sync monitors lol.

  6. I hope they reveal something before the 3070 launch, otherwise they are putting themselves in the hole again.

  7. The RX 5**** were great cards; why would Big Navi be any different?

    Stop wasting your money on Nvidia paper launches before you see the whole market.
