
NVIDIA GeForce RTX 2080Ti can run the Unreal Engine 4 Infiltrator tech demo with 78fps in 4K

While NVIDIA did not reveal any benchmarks for its new RTX graphics cards, we got a taste of what the high-end model, the RTX 2080Ti, can achieve in a specific current-gen tech demo. As NVIDIA’s Jensen Huang claimed, the NVIDIA GeForce RTX 2080Ti was able to run the Unreal Engine 4 Infiltrator tech demo in 4K at 78fps, at a “quality that has never been seen before”.

In order to give you an idea, an NVIDIA GeForce GTX 1080Ti can run the Infiltrator tech demo with the very same settings at 30-something frames per second. Or at least that’s what Huang claimed at NVIDIA’s GeForce Gaming Celebration event.

Now, we don’t know whether this special tech demo was using the RT cores of the NVIDIA GeForce RTX 2080Ti in order to accelerate its performance (something that could very well explain the performance difference between the RTX 2080Ti and the GTX 1080Ti). Unfortunately, since NVIDIA has not provided any performance figures for current-gen games that do not use ray tracing, we don’t know how much faster the RTX 2080Ti will be compared to the GTX 1080Ti.

Still, it’s pretty cool witnessing a single GPU run the Unreal Engine 4 Infiltrator tech demo in 4K at more than 60fps.

Below you will find a video showing the Infiltrator tech demo running on a GTX 1080Ti (do note that SLI is not working in this video, so only one GPU was being used, and that the tech demo NVIDIA showcased was running at higher settings than those featured below).

Unreal Engine 4 - Infiltrator Demo (4K)

104 thoughts on “NVIDIA GeForce RTX 2080Ti can run the Unreal Engine 4 Infiltrator tech demo with 78fps in 4K”

    1. Like you were gonna preorder it
      Like you were gonna buy the reference card

      Nah

      Wait till Christmas
      Buy the Palit or Galax version for 700-800€
      ????
      Enjoy those rays

        1. Yep, the standard MSRP for the FE is $1000, that’s 870€. Regular versions of the card will be slightly cheaper, say $900, that’s 780€. Then add the price fall 3-4 months past release and there you have it. Just common logic.

          1. No, you know it doesn’t work like that. With these things $ = €; you don’t convert the price. Sadly, $1000 will be €1000 or more, like it always has been, and that’s the price for custom AIBs too. ASUS already listed their video card for preorder at $1199, just like NVIDIA.

          2. That’s why you buy your cards in the land of the free, home of the brave 🙂
            No kidding, I know people in the US, and I used to send them money to buy hardware and ship it discreetly to me as gifts.

            “asus already listed for preorder their video card at 1199$”

            Yeah, custom cards of the higher tiers from companies like ASUS, Gigabyte and EVGA are always gonna be quite expensive. That’s why you buy Galax or Palit non-overclocked entry cards. Anyway, your initial comment about 1300€ is for preordering the more expensive FE card, and a comment like that is just ignorant. I get it, NVIDIA is expensive. But they have every right in the world to be, because they don’t have any competition; it’s just how the market works. My guess is that the 2080 will be the best choice, and around Christmas or just after New Year that card will be quite affordable and still beat the living cr*p out of the 1080Ti = worthy upgrade.

          3. Well, if you can do that, good for you. I have a friend who goes to Canada sometimes, but it’s not easy to do what you say. I’ll ask him next time he goes and see if he can buy me stuff and bring it back with him….
            Personally, since I only buy a video card once in a while, I’ll buy a good model, like I did last time (bought the 1060 Gaming X from MSI).

            How is that ignorant? That card costs that much no matter what, preorder or not; even after the preorder period the FE will still cost that.

          4. I suggest so, because prices in Europe are outrageous. Or just wait until the prices start to drop a little. Never buy directly from the NVIDIA store, that’s just dumb.

            It’s ignorant because you picked the worst-case scenario to aid your point in bashing NVIDIA’s prices. Furthermore, with a comment like that you come across as childish and lacking the ability to think and reason in a fair and realistic way. 1300€, yes, IF you can’t wait and you live in Europe. It’s like buying fast-passes at big amusement parks to skip the line and not have to queue; it’s gonna cost you.

            Most people who have some brains will wait a little and order the cheaper brands of non-FE variants. Initially MSRP $1000, yes, and taxes and fees will make that €1000 in Europe. Still, a price falloff and a cheap brand will get you down to maybe €800-900, which is around the same amount of money the 1080Ti cost when it came out. Thus rendering your earlier comments about Turing being overly expensive useless and, de facto, ignorant.

          5. I’ve not picked the worst-case scenario, I’ve picked THIS case, which is the one we’re talking about. And no, they won’t even come close to 800€; the lowest they will get is probably above 900€, and that’s like 1-2 years from now. Just look at a good 1080Ti: it still costs around 800€. So I think what you say is a dream; don’t mix dreams with reality. Those prices will stay like that for a long time, unless AMD does something unimaginable, say, next year, which is probably not happening.
            I really don’t know which planet you live on, but it’s surely not Earth. The 1080Ti costs 800€ NOW, not at launch; go check prices. And don’t look at sh***y models, go look at decent stuff, because the FE is always around those prices.

          6. More and more ignorance from you. I’m getting used to it already.

            You picked the worst-case scenario.
            1. You live in Europe.
            2. Price based on the FE.
            3. Being an early adopter.

            “I’ve picked THIS case”
            What case? There are several options if you want to obtain the 2080Ti.

            “won’t even come close to 800€”

            Yes it will.

            “that’s like 1-2 years from now. Just look at a good 1080Ti: it still costs around 800€”

            Ignorance yet again. Ever heard about mining? The 1080Ti was down to 700€ and would have continued to drop if it wasn’t for the mining hysteria; prices still haven’t recovered yet.

            If you look at the prices of top-tier cards, they always drop by around 100€ a few months after release.

          7. Seriously, you can’t be this stupid; you’re absolutely incredible.
            You basically contradicted yourself: “Prices still haven’t recovered yet.”
            And that’s exactly why prices won’t fall like you said, and that’s why NVIDIA dared to price these new cards like they did: because they’re distanced from the old cards’ pricing, so neither the old nor the new cards will drop in price unless AMD does something.
            Again, I talked about NVIDIA’s 2080Ti, and that’s not a worst-case scenario, because that’s not the most expensive card. Or maybe it is now, but it won’t be when custom models come out. You really don’t know sht about this and you’re stupid enough to claim you do; you’re really incredible.

          8. Ahahaha, you lost your composure for sure. Typing faster and faster, getting angrier and angrier, and letting your little emotions get in the way of decent discussion.

            Prices haven’t recovered yet from the mining hysteria, but they are slowly doing so. New products from NVIDIA will drop in price because there’s still an excess of Pascal cards on the market. Prices are adjusted based on availability and demand. High demand and low availability result in higher prices; that is what happened during the mining hysteria. If there is high availability and mediocre demand, then prices will be lowered. Things are always gonna be more expensive right after a product launch, since the product is hyped up and there’s a high demand for it. You don’t seem to understand how the market works at all.

            Again, you’re embarrassing yourself. That is the moneywise worst-case scenario for obtaining the 2080Ti, and you used it to fuel your initial comment, while only managing to come across as someone with limited brain capacity. The regular cards that will come out later will have a lower price than the FE from NVIDIA’s own store, do you really not understand that? Sure, some aftermarket cards will be more expensive, those that feature custom PCBs and custom cooling solutions. But we’re talking regular cards here. Again, you talked about the 1300€ FE card from the NVIDIA store as viewed from a location in Europe. That is the worst-case scenario. End of discussion.

          9. You can damn well tell I lost my composure talking with a brat like you. You are telling ME I don’t understand something? Jesus, that’s laughable.

            You stupid fck, do you understand that what you’re saying has never happened? Cards always cost as much as the FE does, or around that. What you’re saying NEVER HAPPENED, you magnificent idiot.

            Those cards’ prices are going to hover around the MSRP of the FE version for months. They might start to drop after that, but slowly, and they’ll never reach those prices unless something happens from AMD; otherwise there’s no way the 2080Ti is going to get under 900€ in less than 2 years. I’m saving this conversation, and I’ll come back at you with every kind of insult possible in 2 years.

          10. Ahaha, more fuel to the fire. You’re uncontainable. And you have no insight into how the market works. The funny thing is, you can never prove your claims or back them up with evidence. It’s just a fart in the wind; your claims are nothing but air from your a$$.

            Again, you’re dropping the discussion when you realize that you’ve lost and don’t have anything to counter with. Instead you replace arguments with pathetic insults and foul language.

            “cards always cost as much as the FE does”

            Nvidia is putting the FE prices well above the MSRP of the regular cards for a reason. A reason they explained some time ago. Go look it up. I’m tired of teaching you stuff.

            2080Ti MSRP: $1000
            2080Ti FE: $1200

            Enough said.

            What I predict will happen to our discussions going forward:

            1. You will keep using foul language.
            2. You won’t have anything to counter me or my claims with.
            3. You will provide no sources for your claims.
            4. You will dodge the parts where I prove you wrong.
            5. You will avoid the parts of the discussion that don’t aid you.
            6. You will keep making things up and pulling claims out of your a$$ with no support for them whatsoever.

            Been there, done that. I’m a master at this.

          11. WHAT? LMAO, I proved all my points by showing proof from Wikipedia and from YouTube. I’ve proven EVERYTHING. It’s you who hasn’t proven sh**, so if anyone is farting in the wind, that’s you, not me, lmao.
            Dropping the discussion? Lmao, you’re damn right; I did my homework correctly and there’s nothing more to add, especially when talking to a braindead individual like yourself.
            1. Possible, since I already tried with manners, but apparently not even insults work on you, because your situation is THAT desperate.
            2. Counter? ROFL, I’ve already proven EVERYTHING. It’s just you saying BS without even a hint of proof.
            3. Already done that; the Gamescom keynote contains everything I said.
            4. I’m not dodging sh**, lol.
            5. This is basically the fourth point with different wording. You’re like a middle-school kid to me.
            6. Again, I’ve proven everything; the facts are in the keynote video and on Wikipedia. Easy.

            AHAH, a master? You’re not even a master at wiping your own butt. You can go on as much as you want; I’m in the right and there’s nothing I fear from lowly individuals like yourself. Bring it on!

            Also, it’s funny, because Sun Streaker had your exact mentality, but at a certain point he stopped answering, even though he claimed he’d keep going forever. I’m unemployed atm, so I assure you I’ve got MUCH time to waste, including on this.

            A MASTER. Jesus Christ, you crack me up. You can’t even understand what you read and hear, and you think you’re a master at this?

          12. You didn’t prove anything noteworthy. You proved some false information from a Wikipedia article that anyone can edit. And that YouTube link only proves that the video was uploaded in 4K, which has nothing to do with the discussion we had. You didn’t provide any legitimate sources for your other claims, or for the parts where you tried to prove me wrong or counter what I said. So, congratulations, you just fulfilled paragraphs number 1 and 3.

            Yes, you dropped the parts of the discussion where I crushed you. You didn’t comment on them, or even try to counter them. Again, congrats, you just fulfilled paragraphs number 1, 4 and 5.

            “1 Possible”
            No, you started with the foul language as soon as I trapped you in a corner regarding specific parts of our discussion that you didn’t have anything to counter with.

            “2 Counter”
            Yes, you didn’t have anything to counter some parts of the discussion with, so you left them out completely. And no, you didn’t prove anything. You gave two useless “proofs”: one that was false information and one that wasn’t relevant to the discussion.

            “3 keynote contains everything i said”
            No, you meant to say that DLSS is nothing but an AA filter and is used in place of TAA, which isn’t the case. The keynote specifically explains how DLSS works: it turns a low resolution into a high resolution by adding pixels that weren’t there initially. Even the NVIDIA blog post about DLSS explained it, which I provided a source link for.

            “4 I’m not dodging”

            Yes, you were dodging the parts of the discussion where I proved you wrong. Some examples: 1. Worst-case scenario. 2. Aftermarket cards vs Founders Edition pricing. 3. Wikipedia. 4. DLSS functionality. 5. RTX implementation.

            “5 This is basically the fourth point”
            No, dodging parts where someone proves one wrong and avoiding parts that don’t aid one’s arguments are not quite the same thing. Perhaps you should redo middle school.

            “6 Again, proven everything”
            Again, you didn’t bring proof for the majority of what you said. A Wikipedia article with wrong information and a YouTube video proving a video’s resolution aren’t proof by any means, especially in this discussion.

            Yes, I’m a master at this. What defines a master:
            1. Has proof to back up their claims.
            2. Can conduct a discussion in a mature and sensible way.
            3. Proves points valid with logic, facts and sources.
            4. Doesn’t dodge questions or leave out parts of the discussion.
            5. Doesn’t let emotions interfere with the discussion.
            6. Has the upper hand and stays one step ahead of their opponent.

            I fulfill those requirements. You, on the other hand, don’t even come close!

          13. You ain’t no master at doing sht, man. You keep claiming to be right based on what YOU understood, which isn’t completely wrong in some cases, but still isn’t exact.
            I proved everything I said with Wikipedia and YouTube (linking the exact moment Huang talks about what I’m saying), and Wikipedia contents need to be checked and approved, otherwise the claim will be flagged as having no source and nothing to confirm it. Basically everything Wikipedia has on it is correct, unless it’s marked as having no source, because Wikimedia double-checks the info users post.

            Again, you never crushed anything besides a dog turd while walking down your street, because what you said is incorrect and you showed no proof besides stuff I already knew, linking NVIDIA blog posts and other things.
            I started insulting you because of the way you started arguing and the way you intruded into my other arguments with other users regarding different stuff, claiming they were right just to piss me off (and you managed it), without even understanding what me and the other party were talking about, and what he and I meant to say.
            Which parts? Just show me, and if I missed something I’ll teach you about that too, because unlike you I know the matter being treated here.
            Then your understanding of English needs some training, because you got it all wrong: there’s no higher output resolution, and consequently no higher pixel count, just correctly guessed pixels, and that’s what Huang talks about. Just go watch the part I linked you again.
            I haven’t dodged that. As I said more than once, this isn’t the worst-case scenario, because NVIDIA FE cards never cost more than good AIB versions; those are always around the price of the FE version, and that can be seen with every card of the Pascal gen. The 1080Ti launched at $699, and good AIBs could be found for, at the very least, a $100 premium, even before the mining craze. On Wikipedia I already answered you and told you why you’re wrong. What about RTX implementation?
            No real proof of the BS you said whatsoever, and no maturity, because you’re still answering me in 4 different posts when we started with only 1, and you expanded your BS to 3 others, intruding into other people’s discussions just to join the shttalk (2v1). Makes you feel stronger, huh, kiddo? Doesn’t work with me; I could handle an army of people like you without even blinking once.
            The rest is BS you keep posting about things I already replied to. Again, man, you picked the wrong opponent this time. You are the wrong party in this discussion and you might even have realized that; you just keep going out of pride. Bad kiddo, bad.

          14. “You ain’t no master”

            Yes I am, let me remind you:

            What defines a master:
            1. Has proof to back up their claims with.
            2. Can practice discussion in a mature and civil way.
            3. Proves points valid with logic, facts and sources.
            4. Doesn’t dodge questions or leave out parts of the discussion.
            5. Doesn’t let emotions interfere with the discussion.
            6. Has the upper hand and stays one step ahead of their opponent.

            “you keep claiming to be right based on what YOU understood, which isn’t completely wrong in some cases, but still isn’t exact.”

            No, since I provided sources and quotes to back up my claims.

            “and Wikipedia contents need to be checked and approved”

            No, not all Wikipedia articles get checked and approved, and the ones that need to be checked quickly pile up, because there are so many article edits and not enough people to check and approve them. Anyway, the information was wrong; if it had been checked, it wouldn’t have been approved. Besides, the MSRP for the 1080 specifically had no source; thus, anyone could have edited the numbers. Someone put in the wrong numbers, there was no source, and no one checked it. My point is, it’s unreliable to cite Wikipedia.

            “Again, you never crushed anything”

            Yes I did. Especially in the parts about the ray-tracing implementations, where you just said “just a few stupid light effects”.
            And also when you stated that a $100 difference meant “prices were milked much more” while a $50 difference was considered “they didn’t milk it that much”, in the argument about Pascal also being overpriced. Your side of the argument was that NVIDIA didn’t milk prices even though AMD was absent, which isn’t true. I argued that Pascal was also overly priced, which it was, around $100 more than the previous generation. So yeah, I did some crushing alright. The funny thing is that you didn’t counter the above things; you just left them out, or dodged them.

            Then I said the following:

            “It’s ignorant because you picked the worst-case scenario to aid your point in bashing NVIDIA’s prices”

            Which is correct. You used the worst-case scenario. You’re arguing that aftermarket non-FE cards bought from stores that compete for customers cost as much as the FE from NVIDIA’s page, which is totally wrong, with the exception of the top-tier cards with custom PCBs and top-of-the-line cooling.
            Secondly, you used euros, and in Europe cards are obviously gonna cost more. Also, being an early adopter and pre-ordering from the NVIDIA store is gonna cost more than buying the same cards from online retailers who compete for customers. To answer my argument you had this to say:

            “Again, I talked about NVIDIA’s 2080Ti, and that’s not a worst-case scenario, because that’s not the most expensive card. Or maybe it is now, but it won’t be when custom models come out.”

            First of all, I’m talking about the worst-case scenario NOW. Your initial comment stating “1300€ to run the Infiltrator demo” also reflected the current price of the 2080Ti, which invalidates your whole argument.
            Here’s the thing: if someone wants the 2080Ti, they should wait for the aftermarket cards and get it then, because it’s not gonna be 1300€. As I said (and something you dodged), there are several ways to obtain the 2080Ti, and only someone stupid, very rich, or very impatient will pre-order the 2080Ti from NVIDIA’s store; doing that is the worst-case scenario from a price point of view. And I’ll include a link to archive dot org to prove this further down. Again, I crushed you.

            “because what you said is incorrect”

            But you didn’t have anything to counter my argument with. The only thing you had was “it’s not the most expensive card”, “wait, maybe it is”, “it won’t be when the custom models come out”.

            That’s just pure guesswork with a total lack of logic. Not proof at all.

            “I started insulting you”

            Well, you shouldn’t have. You should have kept the discussion mature and civil. But because you’re not, that didn’t happen. And then you claim that I am the immature one. Talk about a perfect example of a hypocrite.

            “there’s no higher output resolution, and consequently no higher pixel count, just correctly guessed pixels, and that’s what Huang talks about”

            No. And let me provide you with another source image:
            images.anandtech(.)com/doci/13266/TuringVsPascal_EditorsDay_Aug22_2.png

            Do you honestly think that all that performance gain is due to removing TAA?

            To further support my claim, here’s PUBG anti-aliasing ranging from Very Low (off) to Ultra (TAA), and the performance difference is almost none: /watch?v=TYoZDEJ3S1A
            Here’s another one, also using UE4: images.nvidia(.)com/geforce-com/international/images/gears-of-war-4/new/gears-of-war-4-anti-aliasing-quality-performance-640px.png

            No! The performance gain is due to the game being initially rendered at a lower resolution; DLSS then supersamples the game and fills in the missing pixels to make the output higher quality, thus saving performance. Try to get that into your thick skull.

            Furthermore:

            “Nvidia’s Deep Learning Super Sampling”

            “Another full resolution set of frames with at least 64 samples per pixel acts as the reference that DLSS aims to achieve”

            “Now that technology is coming back to games, with NVIDIA harnessing banks of supercomputers to train network, such as the NVIDIA Deep Learning Super Sampling, which turn low resolution into high resolution ones, and can run on Turing’s Tensor Cores.”
            blogs.nvidia(.)com/blog/2018/08/20/gamescom-rtx-turing-real-time-ray-tracing/

            It’s not just a more effective AA solution. A supersampling technique doesn’t boost performance over TAA. If both were running in 4K, one with TAA and one with DLSS, there wouldn’t be such a big performance gain. It comes from starting at a low resolution and inserting information gathered from a higher resolution without adding load to the GPU.

            “I haven’t dodged that. As I said more than once, this isn’t the worst-case scenario, because NVIDIA FE cards never cost more than good AIB versions; those are always around the price of the FE version”

            Ok, check this:

            web.archive(.)org/web/20160415000000*/https://www.newegg(.)com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20601194948

            The link works; it’s just that there are two sets of parentheses in the URL that need to be edited out. Then click 2016, then click on JUNE 26, 2016, which is roughly when the aftermarket 1080s hit the shelves. Now look at that! 600-650 dollars for a good AIB non-FE 1080! That’s not the $700 you’d pay at the NVIDIA store. I could certainly have found cheaper ones, but working with the archive dot org site is kinda tricky when you want to do specific filtering on sites like Newegg. Anyway, I proved my point. FE cards bought from the NVIDIA store before the aftermarket cards hit the shelves are gonna be more expensive. Thus, my argument of the worst-case scenario is valid. And you asked me what points I crushed you on? Well, that’s one.

            “On Wikipedia I already answered you and told you why you’re wrong”

            Wrong about what? My argument was to not use Wikipedia as a source, because it isn’t 100% reliable. And I was right: the information in there regarding the MSRP of the 1080 was wrong. You, and the article, claimed $549 when in reality it’s $599: cdn.wccftech(.)com/wp-content/uploads/2016/05/2016_NVIDIA_Pascal_FINAL_published-page-032.jpg

            “What about RTX implementation?”

            If you read carefully further up, I explained what I meant.

            All in all, regarding the Wikipedia and RTX implementation points that I commented on: you dodged them both previously, and your answer was filled with pathetic foul language and attempts at insulting:

            “No that’s BS and your mirror climbing is reaching retrded levels, come one show me how can anyone can edit that, you stupid idiot, you’re a fkin bad troll, i proved wrong every time and then you jumped on other posts that weren’t yours just to flame you damn idiot, showing over and over you have no clue of what you’re talking about a reading bs

            The 1080 could be 599 but surely isn’t 699 you turd

            RTX is nothing now, it’s just a fkin gimmick to bait braindead people like yourself, it’ll be a decent feature for the generations to come, not now, it’s not worth it for the cost of the card, period.”

            Furthermore, you invalidated my claim on performance benefits. Let me provide a source where Jensen talks about it, and also mentions that trying to achieve similar results with traditional rasterization techniques would drag down performance: /watch?v=Mrixi27G9yM&t=19m56s

            My argument was: there can be performance benefits to running ray tracing instead of traditional rasterization graphics in some situations. An example of such a situation is the elevator scene in the Star Wars demo. Whoops, looks like you tried to invalidate my claim without any reference or anything to back that up. I, on the contrary, had a source. Whoops, looks like I crushed you once more. Wasn’t it you who said the following: “you never crushed anything”?

            Apparently I did, on several occasions. Especially in this very post I’m writing now…

          15. Hmmm… I’m from Canada and the prices are just plain BAD…

            Newegg prices: RTX 2080 = $750 USD but… $1100 CAD

            Fact is that $750 USD = $975 CAD

  1. Who actually cares ?

    The majority of gamers, if not all, won’t be able to reap the full benefits of purchasing such an expensive card as the RTX 2080Ti, let alone play games in 4K.

    This is just a tech demo. We need to see how good the performance will be when it comes to actual gaming.

  2. Well, Jensen did say that the 2070 outperforms the Titan XP; whether he meant real-life game performance or ray-tracing capability, I’m not sure. I also read somewhere that the supposed 2060 performs similarly to the 1080. So I don’t know.

  3. “As an example, Huang showed off the Infiltrator demo, based on the UE4 engine, running at 78fps at 4K, instead of 38fps on a GeForce 1080 Ti. The main reason for the increase is due to the Tensor cores filling in the resolution through neural network training.”

    At GTC Huang said that with DLSS (Deep Learning Super-Sampling) they can output a higher resolution than what is inputted.

    1. If the tensor (RTX) cores can be used the way you described, then we will see a revolution comparable to unified shaders, and performance in games should skyrocket.

          1. Maybe you should have paid attention to the presentation. I already explained it to you in the other comment.

          2. “Or maybe you should’ve.”

            Stop being so cocky; it won’t serve you any good and definitely won’t drive the discussion forward. Putin explained it to you already, and he is most likely right. The big performance advantage of the 2080Ti is due to the demo initially being rendered at a lower resolution.

          3. No, idiot, they’re both running at 4K. Go check, and as I already said, go educate yourself before talking on websites, and stop bothering me with your ignorance and stupidity.

          4. Haha, “ignorance and stupidity”.

            Good job, mr hypocrite, because that’s exactly what most of your messages lately have consisted of.

            Yes, it’s running in 4K. But what me and Putin are trying to feed your brain with is that the rendering output of the GPU on the rasterization level is based on a lower initial resolution. The tensor cores then reconstruct pixels based on an AI algorithm to output a higher resolution. Thus it takes less real-world performance than brute-forcing a true 4K output in that particular scenario.

            If you still don’t understand, then don’t participate in discussions that are above your mental capacity.

          5. Speak for yourself, idiot.

            “Wtf are you talking about? I just explained how they got “double” the performance.

            2080Ti is X amount faster than 1080Ti but they also rendered the game at lower resolution and used DLSS to super sample it to 4K using tensor cores.”

            This sounds exactly like what he meant, out of ignorance, and you’re following him because you’re probably as stupid and ignorant as he is.

            What they showed is simply an AA filter that uses AI to reconstruct some of the pixels, while also using supersampling.

            Stop trying to make yourself out to be the one who’s right, because you’re not, and everybody reading this can see that. You just got it all wrong and you’re trying to save your a$$. Absolutely ridiculous.

          6. Ahahaha. Wow you’re persistent aren’t you?

            Ok here we go:

            “and you’re following him”
            No I am not, lol. In fact I hate him! That doesn’t mean he isn’t right or isn’t clever, because he is.

            Quote from nvidia’s blog:

            “NVIDIA Deep Learning Super Sampling, which turn low resolution into high resolution ones, and can run on Turing’s Tensor Cores”
            blogs.nvidia(.)com/blog/2018/08/20/gamescom-rtx-turing-real-time-ray-tracing

            It’s not just a simple AA method. It relieves compute load from the main GPU cores by using a supersampling technique, known as neural graphics acceleration, to reconstruct a higher-resolution output through a convolutional autoencoder, all running on the Tensor cores.

            The side effect of increasing the resolution is smoother edges and more detail; it’s not a standalone anti-aliasing technique on its own.

            “Fkin disappear.”

            After you.

          7. NO, FKIN NO, those were both at 4K. You’re denying reality; fckn have your brain checked, because you have some serious problems.

            It’s outputting a slightly denser image because it creates some pixels where standard AA methods fail to, and it’s basically a supersampling AA, not much different from it. Again, go learn stuff and then maybe you can have a chance against me, you donkey.

            Get lost, you ugly idiot troll, or otherwise keep answering; you will lose this too. The last guy who tried this (SunStreaker) on this very website gave up after a couple of days.
            I’m right, I know I’m right, I proved I’m right. You keep ignoring that, and instead of admitting it you keep going out of pride. Aight, just keep doing it. I’m relentless in these kinds of situations; I’ll take the last word, I promise.

          8. Aahaha, you’re so pathetic. You’re so convinced that you’re right even though you’re not. I have proof and sources to support my claims, unlike you. So you’re automatically losing this discussion, because you don’t have anything, just a foul mouth.

            I never claimed that they weren’t both 4K resolutions. They are; the final output is. But it’s upscaled from a lower rendering resolution, and AI is filling in the missing pixels and making it 4K; the only reason for that is to gain performance and get superior image quality. If you listen to the keynote, Jensen explains how the convolutional autoencoder and the neural graphics accelerator work. They are using supercomputers to teach the AI inside the Tensor cores what a better version of a lower-resolution image looks like; by using mathematics and algorithms, the AI can increase the resolution to a specific output in real time while the game is initially rendered at a lower resolution. The corrected imperfections are there due to TAA being turned off. The superior AA is there because the AI smooths out the jaggies. And even at 4K there are jaggies, but you don’t have to rely on traditional AA systems to counter them.

            “just keep doing it. I’m relentless in these kinds of situations; I’ll take the last word, I promise”

            What I predict will happen to our discussions going forward:

            1. You will keep using foul language.
            2. You won’t have anything to counter me or my claims with.
            3. You will provide no sources for your claims.
            4. You will dodge the parts where I prove you wrong.
            5. You will avoid the parts of the discussion that don’t aid you.
            6. You will keep making things up and pulling claims out of your a$$ with no support for them whatsoever.

            Been there, done that. I’m a master at this.

          9. A copy-paste of the same BS, with some other dogsht in it… You understood nothing, lol. Nothing. DLSS works similarly to an SSAA filter, and could have performance improvements over SSAA, but that just isn’t happening now. Learn to understand what you read and hear, because you have a serious problem with that. The AI behind DLSS is used to recreate non-existent pixels, much like what the “fill” feature of Photoshop does, but much more intelligently. The image is 4K; there’s no lower resolution of anything, it’s just different filters. I really don’t know what language to use with donkeys.

          10. Congratulations, you just fulfilled paragraph number 1, again.

            First of all, SSAA, or super-sample anti-aliasing, is NOT a filter. The anti-aliasing properties are just a side-effect of doing supersampling. Supersampling works by rendering the image at a much higher resolution than the one being displayed, then shrinking it to the desired size, using the extra pixels for calculation. The result is a downsampled image with smoother transitions from one line of pixels to another along the edges of objects.
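
            To make that concrete, here’s a rough numpy sketch of the box-filter flavour of that downsampling step (illustrative only; actual drivers use fancier sample patterns and resolve filters):

            ```python
            import numpy as np

            def ssaa_downsample(frame: np.ndarray, factor: int = 2) -> np.ndarray:
                """Average each factor-x-factor block of a supersampled frame
                down to one display pixel (a simple box-filter resolve)."""
                h, w, c = frame.shape
                return frame.reshape(h // factor, factor,
                                     w // factor, factor, c).mean(axis=(1, 3))

            # Stand-in for a 2x2-supersampled render of a 4x4 target image:
            hi_res = np.random.rand(8, 8, 3)
            lo_res = ssaa_downsample(hi_res, 2)   # shape (4, 4, 3)
            ```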

            Anti-aliasing filters work by analyzing the frequency content of the input signal relative to the sampling rate, applying a low-pass filter, and then adding blur. A few examples where filtering is used are MLAA, FXAA, SMAA and TAA.

            “the image is 4K; there’s no lower resolution of anything”

            The image is 4K; I never stated otherwise. The output resolution isn’t lower than 4K, but the initial resolution is lower than the final result, and in between is where NVIDIA’s new AI magic comes in.

            “it’s just different filters”

            No, DLSS is not based on frequency filters.

            Here’s a quote from the nvidia blog:

            “DLSS requires a training set of full resolution frames of the aliased images that use one sample per pixel to act as a baseline for training. Another full resolution set of frames with at least 64 samples per pixel acts as the reference that DLSS aims to achieve. Training also requires a variety of scenes from the game in order to generate the best results”

            So, DLSS works like supersampling, yes, but it doesn’t have the performance impact that SSAA has. It uses a higher-resolution reference to output a higher resolution from a lower-resolution counterpart. That saves resources and at the same time offers better anti-aliasing countermeasures than, say, TAA. The whole idea of using DLSS is to achieve a higher-quality image without having to perform the supersampling in real time using traditional methods or increasing the resolution, allowing the 3D application to run at a lower initial level of quality/resolution. So it’s not just used for edge detection.

            ~1800p + DLSS supersampling = 4K output, but without the performance penalty of doing 2xSSAA or going full 4K; this also increases the fidelity past 4K + TAA.
            2160p + TAA = true 4K output, with the penalty of doing full 4K rendering.
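
            For what it’s worth, the raw pixel math behind that comparison looks like this (the ~1800p internal render resolution is an assumption, not a confirmed figure):

            ```python
            # Shaded pixels per frame at the usual 16:9 resolutions.
            # The ~1800p internal render resolution is an assumption, not a confirmed figure.
            native_4k = 3840 * 2160       # 8,294,400
            approx_1800p = 3200 * 1800    # 5,760,000

            print(approx_1800p / native_4k)   # ~0.69 -> roughly 30% less rasterization work
            ```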

          11. Anti-aliasing is a filter in electronics, and it’s considered a filter in computer programming too, as it’s a denoising filter in both. The rest of what you said is correct.
            Both the “input” and “output” images are 4K, but they showed 2 different cases using the 2 different AA methods, with DLSS (similar to SSAA) displaying a more accurate “guess” of anti-aliased pixels. The density and resolution stay the same, but DLSS just “guesses better” the pixels it needs to create a more accurate reconstruction of the “missing” pixels.
            I know DLSS isn’t as demanding as SSAA; I said that too. If you were more careful reading my posts, you would’ve seen it.

            The input and output images in both DLSS and TAA are 4K; the quality, however, is superior using DLSS because of the more intelligent, AI-driven algorithm it works on. There’s no magic, and there’s no output with more pixels or a higher resolution, just rightly guessed pixels (DLSS) versus wrongly guessed pixels (TAA). Simple as that.

          12. You’re wrong about that. It’s like saying supersampling has nothing to do with resolution when it does. A game running at 1440p with 2xSSAA can look similar to native 4K, because they’re effectively outputting the same amount of pixels. If the 2xSSAA then has less performance impact, it’s a win. It’s like the consoles running 1440p or 1800p and upscaling/outputting to 2160p, but better. All in all, both supersampling and DLSS are gathering pixel information from a higher resolution. It’s simply not just an AA technique. To prove that, here’s a picture from the Infiltrator demo: abload(.)de/img/infiltratordlaa002a4dsb.jpg. See how DLSS resolves more detail, not just smoother edges? If it were solely a superior AA technique for resolving aliasing on edges, it wouldn’t look more detailed like that.

            It wouldn’t make sense to use DLSS merely as a superior AA solution (to smooth out edges) over the best solutions we have today in a 3D application running at 4K. It’s used to construct a high-resolution version of the 3D scene, and for that the AI needs to “learn” the scene, as I quoted before. If you’re using a technique that works similarly to supersampling, while also being capable of scaling and outputting to a higher resolution, and does it with a significantly smaller performance impact, it makes sense to lower the initial rendering resolution to push a higher framerate. The problem with regular supersampling is two things: performance, and the fact that it doesn’t scale or output to a specific higher resolution. Those are problems DLSS doesn’t have.

          13. It does have something to do with the resolution, but it doesn’t just output double the pixels (in the case of x2) or quadruple the pixels (x4) of the resolution in use. It’s not resolution scaling, man, you’re mistaking the two. Resolution scale actually multiplies the pixels output by the video card (based on what percentage you use and how the game scales). Of course, if you have a 4K monitor, the monitor can only output that resolution and those PPI (based on its size in inches); it’s not going to output more, it just can’t, even if the card is sending more; the monitor just uses what it can and ignores the rest, but of course the video card will still compute all of it, so you’ll likely see a huge loss in performance while seeing basically no difference whatsoever, unless the game uses a particular way of scaling.
            This is basically the same, but a bit more complex and intelligent, and of course, when supported by the card, less heavy compared to both SSAA and resolution scaling; but basically it’s still very similar.

            Besides, 1440p isn’t half of UHD, so doubling that won’t give you what 4K gives you. Fact is, SSAA isn’t like the real thing. If you quadruple 1080p it will theoretically give you the same amount of pixels 4K/UHD does, but it’ll still be a filter; in practice, real 4K (with no SSAA) will always give you a better result, even if 1080p with SSAA set to x4 will probably have less impact on performance, if only slightly. But all these things depend on the game and how the game engine is made; it’s not like every game is going to respond the same to this stuff, unless you use filters directly from the video card panel, which may cause problems in games that need particular algorithms to properly use a certain filter. In a few words, setting SSAA to 4x in a game (from the game settings) probably won’t give you the same result as setting SSAA to 4x from the control panel, or at least not in all games.

            I can’t see the thing you linked; find a better way to link stuff, because I couldn’t check a single one of your links (which, as you claim, should prove you right) the way you link them.

            DLSS will have the same problems; it’ll just work in a slightly different way and supposedly impact performance less on cards that fully support it. Also, NVIDIA does the comparison with TAA, which is a poor a** AA method. They should’ve shown comparisons with multiple AA methods, SSAA included, and also shown the difference in framerates, not just quality output.

            Oh, and I just remembered: when Huang says something like “It shows pixels that have never been there”, it’s not like it’s adding pixels; it just corrects the existing ones to give a more accurate image.

          14. The reason you failed to open my links is that you forgot to edit out the parentheses surrounding the dot in “dot com”. Sorry that I was unclear about that.

            “If you quadruple 1080p it will theoretically give you the same amount of pixels 4K/UHD does, but it’ll still be a filter; in practice, real 4K (with no SSAA) will always give you a better result”

            I will do some testing regarding 2880×1440 + 2xSGSSAA vs native 4K, because I own a 4K 24″ computer monitor. So I will test this to show you, instead of you just taking my word for it.

            “1440p isn’t half of UHD”

            Thanks for pointing that out. 2880×1440 it is.
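
            Quick sanity check on that corrected figure:

            ```python
            # 2x supersampling of 2880x1440 shades exactly as many pixels as native UHD.
            print(2880 * 1440 * 2)   # 8294400
            print(3840 * 2160)       # 8294400
            ```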

            “even if the card is sending more; the monitor just uses what it can and ignores the rest, but of course the video card will still compute all of it, so you’ll likely see a huge loss in performance while seeing basically no difference whatsoever, unless the game uses a particular way of scaling”

            Well, unless you’re referring to different sampling patterns when you say “a particular way of scaling”, I must tell you that forcing true SSAA above native 4K yields far better quality, detail and fidelity than running just native 4K. In old games I usually force 2 or 4 times sparse-grid supersampling on top of my native 4K resolution. It’s particularly impressive in a decade-old game like Crysis.

            Regarding the whole DLSS situation: I’m gonna keep digging into what exactly this new technology does in games and how it fundamentally works. But as far as I can tell, it does affect performance positively, and the only explanation I can think of is that the game is initially rendered at a lower resolution, and the DLSS AI, after being trained on that 3D scene, learns what the higher resolution looks like and reconstructs it on top of the initial resolution (filling in). The anti-aliasing properties are just a side-effect, and something Jensen highlighted during the RTX launch event.

            “Oh, and I just remembered: when Huang says something like “It shows pixels that have never been there””

            He also said: “Because we can take a lower resolution image, and because we can train a neural network with all kinds of super high resolution and super high quality images, this neural network it runs on tensor cores, it could then in real-time enhance images in realtime. In realtime generate pixels it has never seen before.”

            If you watch from this timestamp, Jensen talks about NGX:
            /watch?v=Mrixi27G9yM&t=54m17s

            He specifically talks about games using this technology, even with a picture of Tomb Raider in low resolution on the left and one running at high resolution on the right, hinting that it’s more than just fighting aliased edges.

            Good night!

          15. Then how do you explain the performance gains when DLSS is activated on the 2080, compared to it being turned off?

          16. My theory is that either they compared it with the theoretical performance DLSS would have on a 1080, resulting in the 2080 being “2X faster” than the 1080 because the 1080’s performance would fall very hard, so it’s all theoretical and just for show; or I’m wrong, and DLSS is somehow lighter than TAA, but will probably look worse overall, except for a few things in the frame which will be supersampled and hence look better, and you’re right that it’s 1080p with DLSS, which only supersamples some textures of the frame. But then again, why write “Data measured at 4K” at the bottom right of the graphs picture? I mean, I know it’s NVIDIA, but that would be very misleading.

            Any way you put it, nobody has a clear idea of how DLSS works, and how it works might make 0 sense. Anyway, throughout the years I’ve learnt to never trust the graphs these companies release to show performance. Ever.

          17. I wasn’t referring to the 1080 vs the 2080; I was talking about the 2080 vs the 2080+DLSS. What is causing that big performance lead when DLSS is activated? Your theory that DLSS is “lighter” than TAA wouldn’t make sense, since deactivating TAA or swapping it for something else wouldn’t result in that big a performance gain. The only logical explanation must be that the 2080+DLSS image is somehow rendered at a lower resolution initially, then upscaled using deep learning. As Huang also mentioned, the final image quality does look better than native 4K + TAA, which can also be seen in the comparison images released over the past couple of days.

          18. I think Huang actually said that it looks almost as good, not better, partly because it’s not full-frame SS, just per-object SS, so some of the stuff would look better but overall the image would look worse than 4K+TAA. But I still can’t explain why they wrote “Data measured at 4K” when it might not be (if what you say is true).

            Also, my theory was that the 2080 is double the 1080 with DLSS because the 1080 with DLSS would be much slower than it is without DLSS (although DLSS probably won’t be working on older cards, for obvious reasons, but that’s why I said they might have theorized the performance a 1080 would have with DLSS and compared it to that of the 2080 with DLSS). Anyway, I don’t know how it actually works; the only thing I’m almost sure about is that DLSS won’t do full-picture SS.

          19. I’m gonna quote Huang on that:

            “This is infiltrator, running on one GPU, at 78 FPS at 4K, at a quality that has never been seen before”

            Here’s my theory:

            Basically they use an unknown number of DGX-2 servers (each with 16 Tesla V100 GPUs) to render and sort of “deep learn” all the game’s visuals, and run algorithms where the Teslas’ tensor cores essentially attempt to upscale the game. If they upscale the images correctly, nothing happens; if they upscale incorrectly, a “ground truth” image is supplied to teach the tensor cores how to do it better. They basically do that for hours and hours. We don’t know how long this process takes or how many DGX-2 servers they use, but the end result is that after many cycles of doing this the tensor cores become good enough at it that NVIDIA considers the code ready to send out as a driver update (there’s a rough sketch of this training idea below).

            So then, on something like an RTX 2080, when you run DLSS, the tensor cores use that game’s specific DGX-2-based code to upscale the images as close to ground truth as possible, while requiring a very low amount of rasterization work on the GPU’s part, so you could get 1440p-style performance with a 4K final image (or better?). It could end up being extremely useful for the advent and adoption of high-framerate, high-resolution monitors, like 4K 120Hz ones: you could still take advantage of the 4K tech while also being able to hit 120 fps, or near that number.

            So unfortunately it’s not just a feature you can activate like the MFAA option in the NVIDIA control panel; it’s actually a specific thing that needs to be supported in individual titles, because each title needs to be run through the DGX-2 servers and processed.
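
            If that guess is roughly right, the training side would conceptually look something like this toy PyTorch sketch (all names, sizes and hyper-parameters are made up for illustration; it is definitely not NVIDIA’s actual pipeline):

            ```python
            import torch
            import torch.nn as nn

            class TinyUpscaler(nn.Module):
                """Toy 2x upscaler: a few convs, then pixel-shuffle to double the resolution."""
                def __init__(self):
                    super().__init__()
                    self.net = nn.Sequential(
                        nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
                        nn.Conv2d(32, 3 * 4, kernel_size=3, padding=1),  # 4 = 2x2 upscale factor
                        nn.PixelShuffle(2),  # rearrange channels into a 2x-resolution image
                    )

                def forward(self, x):
                    return self.net(x)

            model = TinyUpscaler()
            optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
            loss_fn = nn.L1Loss()

            for step in range(100):
                # Stand-ins for real data: a low-res render and its "ground truth" high-res pair.
                low_res = torch.rand(4, 3, 128, 128)
                ground_truth = torch.rand(4, 3, 256, 256)

                upscaled = model(low_res)
                loss = loss_fn(upscaled, ground_truth)  # penalize deviation from the reference

                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
            ```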

            Anyway, what’s your theory on what causes the performance gains of DLSS?

          20. Yeah, I also remember that quote. Actually I’m not sure if he said what I mentioned or someone else from NVIDIA did, but they said it, I’m 100% sure. Also, if you look at the first image comparison of the Infiltrator demo, the one with 4K TAA and 4K DLSS, and pause, you’ll notice TAA has better quality except for the gun, which is where the supersampling occurs. If I find the quote I mentioned I’ll post it here.

            Yeah, that could be how it actually works, but it wasn’t clear initially; that’s why I thought it worked like a “fill” instead of using the AI, or at least not in that way. Seems like I was wrong, but well, I hadn’t gotten a clear picture of the exact way DLSS worked, and I still don’t have one now.

            As I already told you, I don’t know. I’m not sure we can even trust that graph NVIDIA showed. If it’s like you say, so lower-resolution footage upscaled to 4K, then it’s explained, but I’d honestly rather play real 4K at 20 fps less than a fake one, because another thing I’m pretty sure of is that the supersampled footage won’t look as good as real 4K for the most part; some objects and textures would probably look the same, as shown in that image.
            But my theory on why the 2080 + DLSS comes out (in NVIDIA’s graph) faster than the 2080 without DLSS is that they probably compared both to the performance of the 1080. In the no-DLSS case, the 2080 is like 50% faster than the 1080. In the DLSS case, they compared the 2080 + DLSS to what a 1080 would run with DLSS enabled (which is probably only theoretical). To put it simply:

            4K, DLSS disabled

            1080 = 50 fps
            2080 = 75 fps (50% more)

            4K, DLSS enabled

            1080 (with theorized DLSS enabled) = 30 fps
            2080 = 60 fps

            This is the initial thought I had: since they probably won’t make DLSS possible on the 1080, they calculated mathematically the performance it would have with a supposed DLSS enabled. This is my theory; it might sound crazy, but it’s based on the idea that DLSS is heavier than TAA, so it should do a better job.
            On the other hand, if you’re right, then DLSS has to be “lighter” and will probably do a worse job visually, and that’s how the better performance with DLSS enabled is explained.
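
            Putting that theory into numbers (all fps values here are the hypothetical ones from above, not measurements):

            ```python
            # Hypothetical figures from the theory above, not measured data.
            fps = {
                ("1080", "no DLSS"): 50,
                ("2080", "no DLSS"): 75,
                ("1080", "DLSS, theorized"): 30,
                ("2080", "DLSS"): 60,
            }

            print(fps[("2080", "no DLSS")] / fps[("1080", "no DLSS")])       # 1.5 -> "50% faster"
            print(fps[("2080", "DLSS")] / fps[("1080", "DLSS, theorized")])  # 2.0 -> "2x faster"
            ```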

          21. NVIDIA never claimed they did and never showed the tech demo using Ray Tracing.

          22. You said none of it was in the tech demo, but nobody ever claimed it was, so why say it wasn’t? They must have shown that tech demo for a reason; I just forgot why as I watched it.

          23. “Of which, none is used in that infiltrator tech demo…”

            Wrong, they use AI to implement DLSS.

          24. Correct; in fact, if you check better, you’ll discover I corrected myself in another post.

          25. Haha, “check better”. I’m not obligated to check all your posts. And I don’t want to, because frankly, they mostly consist of nonsense.

          26. Fine, then at least have the decency not to talk and not to interfere in other people’s discussions.

          27. Says who? A little clueless hypocrite somewhere in Europe called oxidized? Don’t make me laugh! I participate in any discussion I feel like; that’s the beauty of it. Welcome to DSOG!

    2. The main reason is probably that the die size is much bigger compared to the 1080Ti; all that new stuff isn’t used in the demo, since it’s a 2013 tech demo…

      1. There is a reason why they compared native 4K + TAA on the 1080Ti to 4K DLSS on the RTX.

        The game is being rendered at a lower resolution, but the Deep Learning Super-Sampling fills in the missing pixels and removes aliasing even more effectively, so you end up with a render that looks identical to or better than native 4K.

        1. How much better? Just go look at it again… It’s probably not worth the performance expense to gain that subtle enhancement in texture quality. Besides, DLSS alone isn’t going to show any valuable benefit of ray tracing or AI; it’s just a filter. Building a game from the ground up with those, though… Actually, ray tracing isn’t even remotely touched by DLSS.

          1. Wtf are you talking about? I just explained how they got “double” the performance.

            2080Ti is X amount faster than 1080Ti but they also rendered the game at lower resolution and used DLSS to super sample it to 4K using tensor cores.

          2. HE IS running the demo at 4K; he’s just using different anti-aliasing filters. DLSS is a supersampling filter, much like SSAA, probably optimized to work a bit better and use the AI potential of the chip, but it’s nothing that big, especially in a 2013 tech demo.

          3. It is, but what I mean is, a filter based on their new technology alone isn’t going to make the game look completely different. They used this on a 5-year-old tech demo, and still, with their 1300€ card, they reached just slightly above the framerate required to make something play smoothly (and they couldn’t even show it properly because of the vsync).

    1. I have a feeling it IS too good to be true.
      Why didn’t they show any other demo? If the performance in normal, non-ray-traced gaming were much higher than the 1080Ti, they would have hyped that to the nth degree. Remember, they are trying to sell these cards at never-before-seen prices. It is an INSANELY high price increase. Even worse if the 2080Ti is just 20% faster and the 2080 maybe even slower than the 1080Ti while COSTING MORE…

      This all seems so suspect, like highway robbery…
      They are hiding something for as long as they can: offering pre-orders for cards with no reviews. When do the reviewers get the cards? Do they have them NOW, or do they have to wait until the cards are out on September 20th? If it’s the latter, then I can tell you now the cards are REALLY bad…

      When something is good, they have no problem showing it…
      I think they are trying to pull a fast one and then release new cards at the end of 2019, and hence piss in the face of every 2000-series owner. They already do that to Titan owners…

      But I hope I am wrong.

      1. Hexus confirmed Infiltrator was running at a much lower resolution than native 4K, but it was reconstructed to 4K with amazing results. So DLSS is basically 4K for free: imagine 4K picture quality at 1080p or even 1440p framerates. If NV can bring this DLSS technology to many new games, not to mention all games, it will be groundbreaking technology.

        1. So in other words, a big fat LIE from Nvidia. Who the F**K cares… I am not playing games at 1080p using 4X super-scaling to “look like 4K”… I AM AT 4K…

          We know a 1080Ti was close to 50-60 fps at REAL 4K in the Infiltrator demo, and this ultra-crappy Turing was playing at 1080p while pretending it was 4K at 78 fps.

          This looks to be the worst launch in history, and I’ve seen them ALL. I literally had the first GeForce card in the ’90s…

          All smoke and mirrors. The stupid ray tracing runs at 1080p BELOW 60 fps… Why the F**K would I do that when I have a 4K monitor? That is just trading one kind of image quality for another, with WORSE FPS…

          Native is native. A reconstructed “AI” approximation is not native…
          It is not apples to apples either. The 1080Ti was actually running at 4K natively in the demos I’ve seen.

          DLSS is yet another thing they came up with that will not be in most games. So there you are with a $1,200 GPU getting 10% more performance than a 1080Ti in MOST games…

          Makes sense? NO…

          1. I doubt it will be just 10% in older games. RTX cards feature a new architecture, more CUDA cores, and much higher memory speed. According to NV’s chart the RTX 2080 is 45-50% faster than the GTX 1080, so the RTX 2080 Ti will probably offer around 70% more performance.

            When it comes to native 4K: on my Strix 1080Ti at 2 GHz I could run Rise of the Tomb Raider at 4K at just 45-60 fps, never mind newer and more demanding games. Because even high-end GPUs are too slow for real 4K, we need either much faster cards (not possible now with current technology) or SLI. So a feature like DLSS can make a huge difference. Checkerboard rendering already looked great without any advanced calculations (have you seen the Watch Dogs 2 performance review where John talked about it? A bare-bones sketch of the idea follows below), and DLSS has the tensor cores dedicated to picture-quality improvement. If the end result looks close to 4K picture quality, then the RTX series will be a total revelation.

            And BTW, I think the 9800 GTX was the worst launch in NV history. In some scenarios the 8800 GTX, and especially the 8800 Ultra, was faster than the new 9800 GTX. The GTX 580 and the Kepler GTX 680 also offered very small performance improvements.
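
            Since checkerboard rendering came up, here is a hedged, bare-bones illustration of the core idea, assuming a NumPy-style frame buffer. checkerboard_merge() is a hypothetical helper; real implementations (like the console technique mentioned above) add motion-vector reprojection and edge-aware filtering on top of this.

            ```python
            import numpy as np

            def checkerboard_merge(prev_frame, new_shading, parity):
                # Shade only the pixels whose (x + y) parity matches this
                # frame, and carry the other half over from the last frame.
                h, w = prev_frame.shape[:2]
                yy, xx = np.mgrid[0:h, 0:w]
                mask = (xx + yy) % 2 == parity
                out = prev_frame.copy()
                out[mask] = new_shading[mask]
                return out

            prev = np.zeros((4, 8, 3), np.float32)    # last frame (all black)
            shaded = np.ones((4, 8, 3), np.float32)   # this frame's shading
            frame = checkerboard_merge(prev, shaded, parity=0)
            print(frame[..., 0])  # alternating 1s and 0s: half the pixels updated
            ```

            The appeal is the same as with DLSS: each frame only pays shading cost for half the pixels, and reconstruction hides the gap.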

  4. I like the designs of the new cards! They don’t have that typical tacky “gaming” look. Too bad third-party manufacturers are still gonna deliver their hackneyed, overdone designs.

    1. To each his own. Personally I’m not digging the new look they adopted for these cards. It lacks any kind of character compared to the previous generations, especially the polygonal look of the GTX 10 series. Not exactly sure why it’s deemed a bad thing for manufacturers to want their cards to stand out from the rest.

      1. The thing is, this flashy/tacky look doesn’t make them stand out. It’s a typical appearance of GPUs nowadays.

        1. Perhaps. I’ve also seen some designs that look somewhat overdone/exaggerated, but there have been a few that looked quite alright. The basic rectangular shape with two fans resembles a VHS cassette. Who knows, maybe it will grow on me eventually, but at the moment I’m more concerned about the price/performance ratio.

          1. But it has that classic clean look to it. It doesn’t look gaudy, like “hey, I’m 14 years old, look at my extreme-ultra-teenage-gamer-looking card”.
            Most of us are adults. Clean, professional-looking cards are more appealing. At least IMO.

          2. I agree with you on that, but as with computer cases, I don’t mind components that display some kind of flair, at least without overdoing it, which I know some cards do. For me personally it is a step down from Nvidia’s previous design. But hey, if it trades style for other beneficial factors like grip and airflow, then I’m all for it.

  5. They didn’t compare the actual performance difference between the 1080 and the 2080 like last time, when they announced the 1080 as being twice as fast as the 980. Maybe that’s because it was all about the ray tracing. Well, they have a point, because it’s RTX, not GTX: NVIDIA wants people to buy into the ray tracing, not the raw performance difference.

    1. Exactly.

      This is a completely different generational jump, and people who keep asking “yeah, but how much faster is it?” are completely missing the point Nvidia is trying to convey. The entire focus of these new graphics cards is to finally introduce real-time ray tracing into videogames, not just to increase framerate/resolution like previous generations. Hence why they’re now called RTX, not GTX. This is the real-time ray tracing generation, and Nvidia is emphasizing that heavily.

      I understand people want to see raw performance differences, but like I said, they have to realize that that’s not the focus of this new generation of cards; ray tracing is. If they want raw performance benchmarks, they can just wait a bit. There’s no point in speculating so heavily on something that has practically no benchmarks out yet.

      1. That’s good. Ray tracing has potential, but not all games are going to implement it; it requires a pretty advanced engine, and you mostly see that in triple-A games. So what about the rest? What if I’m one of those who buys a couple of triple-A titles and the rest is smaller stuff with no support for ray tracing?
        I mean, not that these cards are slow, but the performance gained in classically rendered games comes only from the increased transistor count and the overall optimization of the uarch. Or at least that’s what it sounds like to me.

        1. True, but not all of those use every feature the engine offers; hell, not even Epic fully utilizes their own engine in that crap of a game that is Fortnite. But I see your point. I’d also like to see it used more overall; I hope it’s not going to die like tessellation.

          1. Yeah, and what about console videogames? Do you think they’ll have something like that implemented in the next gen? I don’t quite think so…

          2. That’s why I said what I said: imagine all the multiplatform games looking very good on PC and just alright on console, because such features can’t be implemented on the low-power, low-cost hardware consoles use. That’s why this thing, unless handled perfectly by Nvidia, could be another fail.

          3. Could be, but I wouldn’t count that much on it. AMD has much to work on atm; they’re an ocean away from Nvidia. They’ll have to pull the same stunt they pulled on Intel, and that will require hard work and brilliant minds.

          1. The less powerful cards will have the same capabilities as the more powerful ones, just not at the same level.

    2. Well, at least in Unreal Engine, the 2080Ti seems about twice as fast as the 1080Ti. That’s an indication, but we can’t know for sure until benchmarks arrive. Besides, anyone pre-ordering the FE is dumb and impatient. Just get the freaking Galax in a couple of months instead.

    1. All those jokes about selling a kidney to be able to afford new GPUs don’t sound so funny anymore.

  6. So is it marketing trickery by Huang? What settings was the 2080Ti running at? Does the Infiltrator demo even have settings other than resolution?
    Is the Infiltrator demo Vsync locked at all times? Huang said 78 fps, but they showed a Vsync-locked demo. Sure, it was a steady 60 fps… But that area wasn’t exactly the outside part where it dipped to 32-35 fps for the 1080Ti in the video above, at whatever settings.

    What can be made of this claim? Is there something more to these cards beyond the FP32 single-precision “Tflops” we are used to in gaming? Can they use the Tensor/RT cores in general non-ray-traced gaming to boost performance?
    The 2080Ti is listed at ~14 Tflops, and that is with the 1630 MHz extra boost. A stock 1080Ti is 11.2 Tflops…

    I mean, they are charging $1,000 MSRP (every card will be higher) when stock-vs-stock Tflops work out to roughly 20-25% more performance (quick math at the end of this comment). That is just NOT an appetizing gain.
    I sincerely hope there is more to these cards… and I am not talking about ray tracing, which will likely impact performance in the NEGATIVE when enabled.

    When will reviews be out? Did reviewers get any samples, or do they have to buy them when they launch on September 20th? That would be an EXTREME warning signal of a crappy card lineup…
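
    To put numbers on the stock-vs-stock comparison, here is the quick math promised above, using the public CUDA core counts and the clocks quoted in this comment (treat them as assumptions; shipping boost clocks vary):

    ```python
    # FP32 throughput = 2 ops (one FMA) per CUDA core per clock.
    def tflops(cores, boost_mhz):
        return 2 * cores * boost_mhz * 1e6 / 1e12

    old = tflops(3584, 1582)  # GTX 1080 Ti, reference boost clock
    new = tflops(4352, 1630)  # RTX 2080 Ti, boost figure quoted above
    print(f"{old:.1f} vs {new:.1f} Tflops -> {new / old - 1:+.0%}")
    # -> roughly +25% raw FP32, in the ballpark complained about above
    ```

    Raw Tflops of course ignore any per-clock architectural improvements, which is exactly the unknown this comment is pointing at.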

  7. Wow, you expect me to be impressed by your foul mouth and the random insults thrown around? You just copied my words there too. Too hard to come up with something original? I guess so. Your imagination couldn’t stretch that far.

    Surely more mature? I could see that. You lost it and started with pathetic insults and foul language, left out parts of the discussion, and started bringing up stuff that wasn’t relevant. That’s truly a mature way of holding a conversation.

    1. Copying words? From a middle-school kid? Why would I? Also, what are you talking about? That makes even less sense than the rest of the blabbering you’re vomiting from your mouth.
      You’re probably behaving like this because of some past experience with me. Have I insulted your favorite game or brand? How come you’re so keen and mad to fight me? You think you can win an argument with me? Lmao, again, go pick an opponent more at your level, you’re wasting my time (and yours, when you could actually study something, and maybe have a clue when you pick an argument).

      1. “Copying words? From a middle-school kid? Why would I?”

        Well you just did xD

        “Also, what are you talking about? That makes even less sense than the rest of the blabbering you’re vomiting from your mouth.”

        I’m sorry if I was unclear; it’s in my nature to use mature and complex language instead of resorting to foul language and insults like you 🙂

        “How come you’re so keen and mad to fight me”

        I’m not, I actually like you.

        “you think you can win an argument with me?”

        You know what they say: try hard or die trying xD I’ll just skip the latter part 🙂

        “go pick an opponent more at your level”

        Well, to tell you the truth, most of the discussions I have on DSOG revolve around civil conversation, where both parties are respectful and humble toward each other. So it’s hard to find a little golden nugget like you.

        “you’re wasting my time (and yours, when you could actually study something”

        Oh, don’t say that. I’m learning tons and tons of stuff from these conversations, especially in the research I do regarding Nvidia’s new stuff. I also learn a lot about human behavior and interaction. Don’t worry pal, I’m enjoying this 🙂

        1. Good thing you are; I’m not, because I feel like it’s all wasted time. While you may have learned something, I haven’t. Well, actually, I’ve learned how to deal (just a bit) with insane people.
