NVIDIA header image

First screenshots of the NVIDIA GeForce RTX 2060 surface

Videocardz has shared the first screenshots of the upcoming NVIDIA GeForce RTX 2060. According to reports, this rumoured GPU will launch on January 15th alongside the GTX 1160, another graphics card based on the Turing architecture that will not support any ray tracing effects.

As Videocardz reported, the Founders Edition of the RTX 2060 will feature two fans, a single 8-pin power connector and a full-cover backplate. It will also feature two DisplayPort outputs, plus HDMI, DVI-I and USB Type-C connectors.

Although there aren’t any further details at this point, we’ll be sure to keep you posted. Our guess is that more details – alongside the specs for this graphics card – will leak in the next couple of weeks, so stay tuned for more.

On a related note, AMD is rumoured to be launching its brand new Radeon gaming graphics card at CES 2019, so it will be interesting to see whether this upcoming AMD GPU will be powerful enough to compete with NVIDIA’s latest offerings.

Kudos to our reader Metal Messiah for bringing this to our attention!

42 thoughts on “First screenshots of the NVIDIA GeForce RTX 2060 surface”

      1. Yes, I saw it when looking at the screenshot initially. We were waiting for that guy to respond with an answer, but I think this is a known issue caused by Alt-tabbing while the AOTS benchmark is running.

        Maybe he took that screenshot at that time.

        It should say 1440p; however, when I ran my own test on both Ultra settings and quad HD resolution, the result was displayed as ‘0 x 0’ as well, which makes this leaked screenshot a little more plausible, imo.

        Take it with a grain of salt though.

  1. Btw John, the specs for this upcoming GPU can easily be guessed based on past leaks/rumors/marketing materials.

    The card will be using the TU106-300 GPU core, a slightly cut-down variant of the RTX 2070’s chip, featuring 1920 CUDA cores, 240 Tensor cores, 30 RT cores, 120 TMUs and 48 ROPs, along with 6 GB of GDDR6 memory on a 192-bit bus interface running at 14 Gbps.

    I think we can guess the ray tracing performance to be around 4-5 Giga Rays/second, since it has six fewer RT cores than the 2070 (so maybe it can offer 1080p/Medium settings ray tracing performance)? We will see.
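For what it’s worth, the rumored bus width and memory speed also imply the card’s peak memory bandwidth. A minimal sketch of that arithmetic, assuming the leaked 192-bit / 14 Gbps GDDR6 figures are accurate (they are not confirmed):

```python
# Peak memory bandwidth implied by the rumored RTX 2060 specs.
# Note: the 192-bit bus and 14 Gbps GDDR6 figures are leaks, not confirmed.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

print(peak_bandwidth_gb_s(192, 14.0))  # 336.0 GB/s (rumored RTX 2060)
print(peak_bandwidth_gb_s(256, 14.0))  # 448.0 GB/s (RTX 2070, for comparison)
```

By this estimate the narrower bus would give the 2060 about 75% of the 2070’s memory bandwidth, in line with the cut-down-chip theory above.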

      1. Indeed. Hopefully AMD will deliver something with the consumer NAVI GPU, or even a Vega gaming card, if rumors are anything to go by; otherwise Nvidia cards are going to keep selling at a higher price.

        But this has been AMD’s motto: “WAIT”. They always promise, but fail to deliver the end product on time, though this time I’m a bit optimistic. Only time will tell.

        The main detrimental factor for this upcoming 2060 card would be its pricing. If the price/perf ratio isn’t that good, then it won’t be much of a worthy contender in the mainstream category, imo, but this is just my guess.

        1. I remember AMD being competitive with the R9 290, good times. It could beat the GTX 770’s and GTX 780’s performance at times, which was the main reason I bought an R9 290, although I remember its aftermarket versions reaching 80 and 90 degrees, which wasn’t going to cut it for me, so I installed an AIO CPU cooler on it. Temps were the only thing that stopped the R9 290 from being a godly underrated GPU.

    1. I think because they need to keep this as a “mainstream card”, and cut some of the production cost/price as well.

      Since this GPU is using a cut-down variant of the full RTX 2070 chip, it makes sense to use 6GB of VRAM on a 192-bit bus interface, if this is how Nvidia is planning to design this chip.

      I think 6GB of VRAM is still okay for playing modern demanding games, but I’m not talking about 4K/120 or 144Hz performance, rather the 1080p/1440p gaming segment (regardless of the refresh rate used), imo.

      But the performance will still depend on the game being played, the in-game graphics settings, and any texture packs applied. So this whole VRAM debate sometimes becomes a bit subjective.

      Still, 6GB of VRAM is a sweet spot for this card.

  2. Hey troll, stop using others’ profile names and pics… get a life. This guy is a fake, beware all!

    This guy is @memescausebraindamage, as MM told before.

  3. The entire 2000 line has been clunker after clunker. Price for performance has been ridiculously poor. I see no reason to upgrade in the foreseeable future. Not at these new prices.

  4. What’s wrong with 6GB? This isn’t a high-end card, and nobody is going to use it for heavy 4K gaming, so this amount of memory should be sufficient, in my opinion…

        1. Haven’t seen any game take more than 6GB at 1440p, and even then that’s a game plus some hi-res texture mod. 6 gigs are enough; it’s the absolute sweet spot of 2018, which is what x60 cards are all about: the sweet spot of the year they come out in.

        2. But remember that VRAM used isn’t always the same as VRAM needed. TechPowerUp covered that topic about 3 1/2 years ago when reviewing the Maxwell Titan X:

          https://uploads.disquscdn.com/images/23ffb2466453a9d726946f5afa562830f4fada916a3deca60152873a41971542.gif

          W1zzard had this to say:

          “Modern games use various memory allocation strategies, and these usually involve loading as much into VRAM as fits even if the texture might never or only rarely be used. Call of Duty: AW seems even worse as it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything in the hope that it might need whatever it puts there at some point in the future, which it does not as the FPS on 3 GB cards would otherwise be seriously compromised.”

          VRAM requirements are certainly higher today, but 6GB is still more than enough for 1080p, which is what this card is primarily targeted at.

    1. Probably more for the high end variant of the 2060 with 6 GB GDDR6 VRAM.

      The GTX 1060 released at $300 but the RTX 2060 high end variant should be far superior to the 1060 even in games that don’t use ray tracing. The 2060 should have 1920 CUDA cores whereas the 1060 had 1280 CUDA cores.

      There’s the cost to manufacture these chips to consider as well. The 2060 is more than twice the size of a 1060:

      1060: 200 mm², 4.4 billion transistors

      2060: 445 mm² (even on a smaller process node), 10.8 billion transistors

      The GDDR6 VRAM is more expensive as well.

      Right now the cheaper 2070s are selling for $500, so I don’t see the high-end variant of the 2060 pushing too close to that, but it will probably be more than $350.

      Imo the entire RTX lineup is overpriced, and if a person can wait to upgrade then they should see what AMD brings in 2019 and what Intel and Nvidia bring in 2020, and decide then.

        1. I’m certain that you already know this, but for people who don’t understand the costs involved: TSMC manufactures the GPUs for Nvidia, and Nvidia pays TSMC by the wafer. This is what a wafer looks like:

          https://uploads.disquscdn.com/images/f103e36ee5f7900774c2b47765ffe1d68ef383c808a3618929f36529a87905ce.jpg

          Each square is an Nvidia GPU. When you more than double the size of a GPU, you cut the number of GPUs per wafer by more than half. Nvidia passes that cost along to the consumer.
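To make that concrete, here is a rough sketch using a standard first-order dies-per-wafer approximation with the die sizes quoted above. These are illustrative estimates only, not actual TSMC numbers; scribe lines and defect yield are ignored:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order estimate: wafer area / die area, minus dies lost along
    the wafer edge. Ignores scribe lines and defect yield."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

print(dies_per_wafer(200))  # GTX 1060 die, ~200 mm² -> roughly 300 candidate dies
print(dies_per_wafer(445))  # RTX 2060 die, ~445 mm² -> well under half as many
```

So more than doubling the die area cuts the candidate dies per wafer to well under half, before yield even enters the picture.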

          btw it’s always good to see you back on here.

  5. 6 GB is plenty for 1080p. Sometimes a game will load textures into VRAM even though they probably won’t be used, or won’t delete textures when they no longer apply to the level, just because the available VRAM can hold them. This was shown in a TPU review of the Maxwell Titan X 3 1/2 years ago:

    https://uploads.disquscdn.com/images/23ffb2466453a9d726946f5afa562830f4fada916a3deca60152873a41971542.gif

    W1zzard had this to say back then:

    “Modern games use various memory allocation strategies, and these usually involve loading as much into VRAM as fits even if the texture might never or only rarely be used. Call of Duty: AW seems even worse as it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything in the hope that it might need whatever it puts there at some point.”

    The 3 GB variant of the 2060 is the one that I wonder about at 1080p. I suspect it will have problems in some modern games at that resolution.

    imo a customer looking to buy a card to last for 4 years or so should skip the 3 GB variant.

  6. It’s the low-end variant of the 2060 with 3 GB VRAM that I wonder about. It will probably have issues even at 1080p in some modern games, and going forward. If a person is looking to buy a card for around 4 years of use then they should probably skip that variant.

  7. Hello @JOHN,

    UPDATE this article.

    Some more official info on the RTX 2060’s pricing and performance has been leaked.

    These fps numbers are a bit too early to draw any conclusions from, so we need to wait for third-party benchmarks. But this leak states that the RTX 2060 has performance similar to the GeForce GTX 1070 Ti.

    These numbers have been taken from Nvidia’s reviewers guide. I would rather wait for actual gaming benchmarks from other tech sites though:

    https://videocardz.com/79505/nvidia-geforce-rtx-2060-pricing-and-performance-leaked
