It appears that NVIDIA will release a new Titan card based on the Turing architecture, called NVIDIA GeForce Titan RTX. Although we don’t know much, we expect this GPU to be more powerful than the RTX 2080 Ti, and even more expensive (so we’re basically looking at a $3,000 GPU?).
This new GPU was teased/leaked by Linus on his latest livestream. Below you can find an image from that livestream in which we can see Linus holding the package.

Our guess is that this is a coordinated leak, as we’ve seen other content creators also sharing pictures of this rumoured Titan graphics card. As such, we can safely say that the NVIDIA GeForce Titan RTX is real and will be announced shortly (perhaps next week?).
Stay tuned for more everyone!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform eventually won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
What? I thought they specifically said that there would no longer be Titans, and that the 2080 Ti was meant to be the new “Titan” for this gen (hence the price point many people complain about).
I could be wrong and remembering incorrectly of course. If so, then I now understand why so many people were pissed at the price, because this is just… it’s just f**k.
NVIDIA never said that. It’s just what everyone else was assuming since there wasn’t a Titan prior to the 2080 Ti and it was priced as much as the Titan. I cringe to think of how much that GPU costs. I stomached the 2080 Ti because I still needed more performance than the 1080 Ti provided, but that Titan will be outrageous… I’m wondering if it’s going to be more of a Titan V type card, geared for scientific workloads at the $3,000 price point? Or maybe it lets you play RTX games at 60 FPS without lowering your resolution to 1080p, in which case, a lot of customers are going to be pissed.
1080p @31 fps is more likely
Do you have any common sense? As far as gaming goes, it will only be a hair faster than the 2080 Ti, as it’s likely nothing more than a fully enabled TU102.
Why do you have to be rude to him?
You could have said that in a polite way and it would not have lost any impact.
Exactly what part of my comment lacks common sense? We’re having a technical discussion anyway, so common sense is hardly anywhere to be found.
Better be waaaaay more than 4k 60 for that price. Not sure how long the charade can last if this Titan doesn’t unanimously blow minds across the industry. None of the attempted cleverness and misleading marketing like last round.
Oh, who am I kidding? Their day-one adopters who are ready to click “preorder now” are either salivating scalpers or the same in-debt Apple-fanatic drones buying everything on credit to look like a pampered Kardashian, so Nvidia will be fine.
There are a few who will pay the price trying to be “future proof,” but there is just no such thing as future proof with PCs, and there never has been. They would be better off buying a 2080 Ti and selling it when the 3080 Ti comes out, then selling the 3080 Ti and buying the 4080 Ti when it comes out.
Nvidia tends to not support drivers very well for older cards.
You’re right that the Titan could be aimed as a faux workstation card that sits just above the 2080 Ti and has more science/AI potential for workstation users, but I can’t see it being a massive boost over the 2080 Ti, and it will be nowhere near full Turing.
Nvidia going uncontested has to stop… AMD or Intel, do something! Your customers are crying!
AMD can’t do anything… They almost went bankrupt… The biggest thing AMD has done is the Zen CPU architecture. Where were people buying AMD GPUs when AMD was ahead of Nvidia? Oh, that’s right, they were falling for Nvidia’s marketing…
I AM SORRY! I AM SO SORRY! I DIDN’T WANT THIS! PLEASE FORGIVE ME!
That’s the problem with you people: you think there were many times when AMD/ATi was ahead of Nvidia, but in fact it happened no more than three times, and all three times were when Nvidia screwed up.
The AMD 4000 series was great: they were the first to have GDDR5 and were great performers. I loved my 4870.
They were, but again, it’s when nvidia was underperforming.
Edit: Actually no, I compared it with the 400 series from Nvidia; I just mixed it all up.
The Radeon 4000 series was surely a good one, but Nvidia was ahead with their GTX 200 series, and they even cost less!
That isn’t entirely true; there were times when, even though Nvidia had the best card on the market, AMD offered better performance for your dollar. Even the Vega 56 offered better performance than the GTX 1070 at the same price, at least until Nvidia released the 1070 Ti to compete with it.
Considering it came out a year later, and its cost went above the 1070’s right away… Also, the 1080 is a faster card, and for a good while it cost less than or as much as the Vega 56 while consuming less power.
You are thinking of the Vega 64. Vega 56 came out at the same price as the 1070 and outperformed it a bit.
No, they were both launched a year later, and the price went up instantly due to mining.
Haha, Intel? That’s like asking EA to save you from Ubisoft, or vice versa. Madness is what you are asking for, son.
The madness I speak of is the pricing of the 20 series, and Intel did grab AMD’s lead GPU designer, so anyone offering some kind of competition would benefit consumers!
So when is an x60 coming out? Oh, that’s right, the 2016 1060 is $300, so why release a new one?
Probably when the stock of 1070s sells out. With the 2070 being a $500 to $600 card, I don’t expect the price to be cheap on a 2060. Probably at least $400. Right now is just a bad time to be buying the RTX cards.
RTX ON
I called it…
Bet it will cost $1,999.99.
So Appl… er, ehem… Nvidia has their frothing-at-the-mouth fans ready to choke down even more cheap hardware and apply for more credit cards to pay for it all. Betting 2080 Ti SLI owners are pretty pissy now, given their gotta-have-it-this-very-millisecond, middle-schooler finance habits. lol! It would be hilarious if Nvidia started a proprietary “Support Bar” just to jack them further, since they are running things exactly by Apple’s playbook.
Meanwhile, my 1080 Ti is still kicking along with no problems, and I don’t see that changing. I might even skip next year given how small the boost in performance was this round relative to the 1080 Ti during its time.
With an MSRP of $2,000.
We need competition in the GPU market, or Nvidia will soon start selling us the xx50 and xx60 cards at $400 to $600.
Omg, now I can see the 2080 Ti owners being pissed. Glad I decided to just get the 2080, since it was the same price as the 1080 Ti.
I don’t expect the Titan RTX to be much more powerful than the 2080 Ti, and it will cost significantly more. I may be wrong about this, but the 2080 Ti is already pulling down around 275 watts in average gaming. How much more can they push it with the Titan RTX? It will no doubt have more cores, but they will probably have to lower the clocks on it to keep things reasonable.
I don’t know. But what I do know is that Nvidia almost never released the Ti/Titan alongside the normal variants. Why? Because the normal variants were 15–20% faster than the older Tis. But this gen, lmao, the 2080 is roughly on par with the 1080 Ti, which is a bummer, and the 2080 Ti is around 20–30% faster than the 1080 Ti at such a premium price tag that it’s turning away customers. And now this Titan news… Hmm, I wonder what Nvidia’s game is here, besides making money of course.
The RTX 20 series itself really isn’t stellar at all. The tech is interesting, though.
I believe the “up to 70%” figure is with DLSS, which is not to be found in many, if not most, games right now.
2080 TI VS 1080 TI, 2K RESOLUTION:
ASSASSIN’S CREED ORIGINS: 97 FPS / 80 FPS (21.25%)
F1 2018: 162 FPS / 122 FPS (32.78%)
FAR CRY 5: 130 FPS / 104 FPS (25.00%)
OVERWATCH: 220 FPS / 162 FPS (35.80%)
SHADOW OF THE TOMB RAIDER: 106 FPS / 82 FPS (29.26%)
GTA V: 112 FPS / 111 FPS (0.90%)
FORTNITE: 148 FPS / 110 FPS (34.54%)
PUBG: 132 FPS / 119 FPS (10.92%)
BFV: 145 FPS / 110 FPS (31.81%)
WITCHER 3: 122 FPS / 96 FPS (27.08%)
VERMINTIDE 2: 131 FPS / 94 FPS (39.36%)
DEUS EX MANKIND DIVIDED: 85 FPS / 68 FPS (25.00%)
FIRESTRIKE (SCORE, 1080P): 26388 / 23733 (11.18%)
FIRESTRIKE (SCORE, 1440P): 15691 / 12838 (22.22%)
FIRESTRIKE (SCORE, 2160P): 8239 / 6822 (20.77%)
TIMESPY (SCORE, 1440P): 12636 / 9502 (32.98%)
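In case the bracketed figures are unclear: they’re just the relative uplift of the 2080 Ti over the 1080 Ti. A minimal sketch of the arithmetic (the function name here is mine, purely illustrative), using the Far Cry 5 numbers from the list:

```python
def uplift(new_fps: float, old_fps: float) -> float:
    """Relative performance uplift in percent: (new / old - 1) * 100."""
    return (new_fps / old_fps - 1) * 100

# Far Cry 5 from the list above: 130 fps (2080 Ti) vs 104 fps (1080 Ti)
print(f"{uplift(130, 104):.2f}%")  # -> 25.00%
```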
Got those from a YouTube reviewer. Although it was at 2K; granted, 4K could yield different results. Still, I don’t see anything that makes me say “oh look, it’s worth the price.” Not at 2K, that’s for sure.
Oh? Do you think so? It crossed my mind, but I didn’t think much of it since it was 2K, not 1080p.
I have given an explanation above.
Not fully true. It would be an exaggeration to say that the majority of games are bottlenecked by the CPU at 2K.
You’re not ‘always’ necessarily bottlenecked by the CPU when playing below 4K, i.e. at 1080p/2K.
But the reason CPU bottlenecking typically matters less at higher resolutions is that at high resolution you get low frame rates. If you can get HIGH frame rates at high resolutions, you are still going to get the same CPU bottlenecking.
CPUs usually don’t care whether it’s 480p or 12K; they only care about the frame rate/FPS.
At high resolution, because your GPU has to work roughly 4x as hard, the bottleneck on your CPU is STILL there; there’s simply a “stronger” bottleneck on the GPU instead.
In other words, when gaming at high resolution, no, the load does not actually “MOVE” to the GPU. The load simply “increases” on the GPU side without changing the CPU side (resolution does not affect CPU load).
So the LOAD becomes more heavily weighted towards GPU power than CPU power, but not because the demand has been redistributed away from the CPU toward the GPU; it’s because the GPU demand has “increased” while the CPU demand has stayed the same.
**edited my post for typo errors**
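If it helps, here’s a minimal toy frame-time model of the idea (all numbers below are made up purely for illustration): the CPU cost per frame stays fixed while the GPU cost scales with the pixel count, and whichever side is slower sets the delivered frame rate.

```python
# Toy frame-time model (hypothetical numbers, for illustration only).
# CPU work per frame (game logic, draw calls) is resolution-independent;
# GPU work scales with the number of pixels rendered.

CPU_MS_PER_FRAME = 8.0        # assumed fixed CPU cost per frame, in ms
GPU_MS_PER_MEGAPIXEL = 3.0    # assumed GPU cost per million pixels, in ms

RESOLUTIONS = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "2160p": 3840 * 2160,
}

for name, pixels in RESOLUTIONS.items():
    gpu_ms = GPU_MS_PER_MEGAPIXEL * pixels / 1e6
    frame_ms = max(CPU_MS_PER_FRAME, gpu_ms)  # the slower side sets the pace
    side = "CPU" if CPU_MS_PER_FRAME >= gpu_ms else "GPU"
    print(f"{name}: {1000 / frame_ms:.0f} fps ({side}-bound)")

# With these made-up numbers: 1080p is CPU-bound at 125 fps, while 1440p and
# 2160p are GPU-bound. Raising resolution only adds GPU work; the CPU demand
# stays the same while the GPU bottleneck "strengthens".
```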
Oh, this is quite a good explanation of the subject. I don’t have time right now, but I should dig into benchmarks of the 2080 Ti at 2K vs 4K. That should give a good idea.
When I have the time -_-
lol, please stop posting here on DSOG, Mr. Luckynumber8. enough of your BS comments.
didn’t your mom suck your D*** yet ? Lmao, she must be hungry w**** for sure.
you are nothing but a pile of crap.
Who cares about the jump in performance at 4K when the pricing of these new RTX cards is just ridiculous? End of discussion.
You are defending/supporting this whole RTX release way too much, as is evident from your reply.
What else can you expect from him? He is “LuckyNumber8”, a former regular member of DSOG who just created a new account/alias.
That says it all. Just move on… /s
That’s the heck of it. We really don’t know for certain how much Nvidia is taking advantage of the situation. The 2080 Ti is a large chip, and the cards use faster VRAM than the 1080 Ti, which I imagine adds something to the cost as well. We don’t know the yield rates at TSMC. The R&D costs have to be recouped, and we don’t know how many engineers had to be paid for this tech.
2080 Ti
18.6 billion transistors
754 mm²
4,352 shader cores
544 tensor cores
68 RT cores
1080 Ti
11.8 billion transistors
471 mm²
3,584 shader cores
It would be nice if someone in Nvidia or TSMC leaked the cost to manufacture the 2080 Ti.
Lol. Who cares, and why is this an article here?
Hardly any gamer is going to spend this much money on a card, at least for GAMING, unless they are also doing some other heavy compute workload. And we all know that the TITAN has always targeted a different audience.
AMD may be able to compete better with Navi next year, or possibly Intel will bring something in 2020, but right now Nvidia is the only game in town in the upper mid-range and high end. Nvidia really takes to heart the philosophy of “whatever the market will bear.”
Depends on your point of view. Honestly, I have no interest in anything over $300, and around that price point, AMD is very competitive.
99% of the gamers in the world will not buy anything more expensive than that. So, really, is the high end that big of a deal? Only to a select few.
I can only see the uber tech-dork hobbyists who love ripping these things apart for extreme benchmarking buying this garbage.
Ngreedia
I wonder if this card will die after 3 weeks too.
If a bunch do die under warranty, it will cost Nvidia a fortune; they have to honor their warranty. Nvidia only offers a one-year warranty, though.
Last I heard, the cards from the early production run of the 2080 Ti were the ones suffering problems. It makes sense that Nvidia would be quick to react to the issues. I’ve heard of some people who had to send the card back twice for replacement. That was very expensive for Nvidia.
Maybe it’s something with the holidays or a lack of supply, but the RMAs on the dead cards are taking forever. From everyone I’ve talked to, and from personal experience, it’s taking weeks to get replacement cards. Only EVGA seems to be getting cards out to people quickly.
I would hope/assume Nvidia wouldn’t let another round of defective chips “escape” QA. But I’m not sure how much Nvidia even cares at this point.
They are all blowing up. Just go check the reviews on Newegg. The 2070 and 2080 are dead as a doornail.
I only heard of one card catching on fire. Never heard of one blowing up.
There does seem to be a high number of cards dying. I read some reviews, and what I saw was artifacts on screen before they died. That’s usually a VRAM issue. I did hear that Nvidia switched to a different memory type. Maybe that will help.
Nvidia RTX Titanic
Lmao, I laughed so hard. Oh my, you did it.
Oh God, say it ain’t so… This will be $8,000, I’m sure. Nvidia currently has no budget cards, and this has nothing to do with inflation. This is 200% greed, coupled with ignorant consumers with more money than brains. Consumers who are basically aiding the destruction of the market.
The milking continues…
Nvidia has to get its pricing under control… I mean, I like money, so I can’t blame them for liking money… I blame AMD for not being competitive… Remember the 9700 Pro??? We need another one of those, then the market will get tasty for all… Thanks AMD, I still love you.
Monday morning meeting at Nvidia HQ:
‘GUYS, we have a problem. With the current graphics cards, gamers can play games at 4K 60+ fps… ambient occlusion, ultra settings, max quality shadows, particles, lighting… photorealistic texture packs… 3D… VR… MSAA 16x… they can do anything; there is basically nothing that can make them need to buy a more powerful card… they have all the power they need, and even more, to run all games at max resolution, max framerate, max quality…
…we need something, a new tech that will run games at low resolutions, low framerates, low quality settings, etc., so we can have another 3–4 year cycle where each year, with each new card, we can have more power to increase the resolution and framerate…
..like the era where we would review games at 768p… then 1280…. 1600×1200… then 1080p… 1440p…
…we need something like that… any ideas?
Let’s do RAY TRACING!
With ray tracing, a new race for power will be born… first, low resolutions… then framerate… then, we can do ray tracing on 20% of the game… then 30%… 50%… 60%… For each step, gamers will need a new high-end card… every 6 months, they will be able to run their 20% ray-traced games at 1024… then 1280… full HD… 1440p… 4K… Then they will need new cards to reach 60 fps at full HD… then 1440p… then 4K… Then we will unlock full-frame ray tracing with low antialiasing… then they will need more power for better antialiasing…
…
… and so on and so on…
We are good for another 5 years and 7 or 8 card models/reviews..
Just imagine all the gamers that will spend $1,500 to $3,000 just to go from 1080p 30 fps 50% ray-traced 4x MSAA to 4K 50 fps 80% frame ray tracing at 8x MSAA…
We’re going to sell dozens of millions of cards. Yeahhh.
We at Nvidia, we love ray tracing.’
*end of meeting
…and out come the droves of ‘more money than common sense’ and ‘gotta have it now because it’s new’ folks. This gen is over-rated, guinea-pig garbage. Pass, thanks.
I’m more than happy with my new 1080 Ti, which is perfectly capable of playing the ceaseless plethora of unoptimized PC games.