NVIDIA will reveal its RTX 50 series at CES 2025 and it appears that Inno3D may have leaked their key new features. So, let’s see what NVIDIA has been cooking all this time, shall we?
Inno3D claims that the big new feature of the RTX 50 series will be Neural Rendering Capabilities. This is said to revolutionize how cards process and display graphics.
My guess is that this is related to the Neural Texture Compression that NVIDIA detailed at SIGGRAPH 2023. This AI-powered compression reduces texture sizes while keeping their quality high. NVIDIA claimed back then that this new format can deliver better results than established image formats like AVIF and JPEG XL.
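To give a rough idea of the concept, here is a purely illustrative sketch in Python (made-up names and sizes, not NVIDIA's actual NTC implementation): a neural texture codec stores a small latent grid plus the weights of a tiny decoder network, and reconstructs texels on demand instead of shipping the full-resolution texture.

# Hypothetical sketch of the neural texture compression idea (not NVIDIA's
# actual NTC code): store a small latent grid plus tiny MLP weights,
# and decode texels on demand instead of a full-resolution texture.
import numpy as np

rng = np.random.default_rng(0)

# "Compressed" representation: a 16x16 grid of 4-dim latent features
# plus the weights of a tiny 2-layer MLP decoder.
latent_grid = rng.standard_normal((16, 16, 4)).astype(np.float32)
w1 = rng.standard_normal((6, 32)).astype(np.float32) * 0.1   # latent(4) + uv(2) -> hidden
w2 = rng.standard_normal((32, 3)).astype(np.float32) * 0.1   # hidden -> RGB

def decode_texel(u: float, v: float) -> np.ndarray:
    """Decode one RGB texel at normalized coordinates (u, v) in [0, 1)."""
    # Nearest-neighbour lookup into the latent grid (real systems interpolate).
    gy = min(int(v * 16), 15)
    gx = min(int(u * 16), 15)
    feat = np.concatenate([latent_grid[gy, gx], np.array([u, v], dtype=np.float32)])
    hidden = np.maximum(feat @ w1, 0.0)          # ReLU
    return 1.0 / (1.0 + np.exp(-(hidden @ w2)))  # sigmoid -> RGB in [0, 1]

# Reconstruct a 64x64 texture from the much smaller latent grid + weights.
texture = np.array([[decode_texel((x + 0.5) / 64, (y + 0.5) / 64)
                     for x in range(64)] for y in range(64)])
print(texture.shape)  # (64, 64, 3)

The point of the sketch is only that the heavy lifting shifts from storing pixels to evaluating a small network, which is what makes this kind of approach attractive for VRAM-constrained GPUs.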
Inno3D also claims that we should expect improvements to DLSS. The RTX 50 series GPUs will offer even better quality and higher framerates. However, there is no mention of anything regarding Frame Generation. So, those rumors about DLSS Next-Gen using FG to generate two frames may not come to fruition.
Also, we can expect the visual improvements of DLSS to be universal, meaning that pretty much all RTX GPUs will benefit from them. Or at least that’s my assumption.
Finally, Inno3D claims that the RTX 50 series will have enhanced Ray Tracing performance. These new GPUs will have improved RT cores that will deliver more realistic lighting, shadows, and reflections in games.
Since this is a rumor, I suggest taking it with a grain of salt. NVIDIA may have kept something hidden even from its AIB partners. Still, this may give us an idea of what the green team will reveal at CES 2025.
Stay tuned for more!
Thanks, Videocardz.

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles ever. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”

Hard to get excited: the price, the power draw, how awful and unoptimized modern games are, fake frames with AI, fake everything. I just can't be excited.
It's insane how we used to get new GPUs every year and now it's like every 3 years. It feels like forever since the 40xx series. I really want to upgrade, but I'm not gonna buy a 4 year old card at this point.
finally capitalism is taking a chill pill cause they're sucking us dry every year.
Upgrade into what? You need more VRAM and better performance, and no one is gonna give it to you due to Unreal Engine 5 sucking.
I'm gonna buy a 5080 and enjoy 2025 games and beyond. PC gaming is really exciting these days and worth every penny. Path tracing is becoming normal on PC, wow.
"pc gaming is really exciting these days"
maybe if you play indies.
Are you referring to the tech side of things or the games?
If you are referring to the AAA games they are mostly mediocre, uninspired, soulless, pushing a BS Liberal politically correct (woke) agenda, usually not ready for release, sometimes broken, overpriced at $70, mostly boring.
The state of gaming today is sh*t.
Just the tech makes me excited, not some deep geeky RPG elements.
Are most of you going to get a 5070, 5080, 5090 or are you going to go with AMD or not upgrade at all?
Personally, my machine runs most games just fine with a very aging 2080 Ti. If I spring for a 5090, I will have to really consider a CPU upgrade as well. I am very disappointed with Intel's latest and greatest, and I have always bought Intel… not sure I can this time unless there is some magic update to improve the performance of Intel's flagship chip.
I will definitely get my hands on one of those. I have used both Nvidia and AMD GPUs over the years and I will never spend a dime on any AMD card again.
Why, if you don't mind me asking?
In all my time in PC gaming, if someone's having an issue with a game on a forum, it can often be traced back to them having a Radeon GPU.
absolute truth. Forums are full of whining AMDrones while Nvidia owners just play games while AMgays foam on forums.
I wish I could punch you in the face just because of your attitude. Motherf*ker.
truth hurts, right? AMDf@g
Long story short: Nvidia is just better than AMD. From features to drivers, productivity to games, Nvidia has a clear lead and advantage in every area. More than that, Nvidia is consistent.
NVIDIA moving graphics rendering in the wrong direction again? Faking everything (resolution, frames, high-resolution textures, etc.) hoping they can make everyone forget that AMD exists? Quit ruining gaming…
https://uploads.disquscdn.com/images/ee091d1098168375c892e11453531b98b28526eb28724cbca22b1268ec473a8b.gif
That reaction by King Linus is more than a decade old by now, though.
Over the years, NVIDIA's attitude towards Linux has changed quite dramatically.
Granted, the main reason for that attitude change is that they are making more money from Linux installations than they do from Windows nowadays (think of all the enterprise data centers using NVIDIA GPUs for computational tasks).
Still, the recent push for gaming on Linux driven by Valve (among other developments) has motivated NVIDIA enough to properly adapt their driver for modern Wayland compositors like Valve's Gamescope used on SteamOS.
In fact, with a yet unreleased driver branch from NVIDIA I'm already seeing better frametimes than on their highly-optimized code-path for the old-school X11 display server.
2025 is really shaping up to be a pivotal year for the future of gaming on Linux…
I am aware of the age of the video of Linus telling off NVIDIA, but I still enjoy it. It's become a timeless meme.
All frames are fake; they consist of nothing but 1s and 0s. How you process those ones and zeros has no relation to whether a frame is "real" or not.
If you actually understood the technology, you'd understand that AI and neural network processing is the only real way forward, because we are hitting the limits of how small we can make transistors. I've been saying this for over a year: to make them go even faster, you have to come up with new methods of processing those ones and zeros that are inherently faster and more efficient, and that's exactly what a neural network does for tensors, vectors, and scalars. In a couple more years you are going to see neural networks in CPUs too, like Intel is already doing for laptops, and also in ARM-based CPUs like the Snapdragon X.
What an unbelievably idiotic take. Yes, there are fake frames:
there are frames which are generated from a very specific yet wide-ranging set of instructions, and then there are the frames that an """ai""" "thinks" is appropriate to put in the middle, based on what it "estimates" to be an "acceptable" average from limited information, which are basically crapshoots.
All pixel positions are based on estimates, just like branch prediction in CPUs is based on estimations. All the frame-to-frame motion vectors in video processing/compression technology are also estimations, which are necessary to make encoding fast and efficient. Upscaling and frame generation are actually based on already existing video encoding technology that made streaming video over the internet a reality and made HDTV transmission (which is also a streaming technology now) possible.
You act as if estimations are a bad thing, but that's just not the reality, and hasn't been for quite some time in how both GPUs and CPUs work and how video processing works. Pretty much everything is based on estimations, and the only difference is how accurate those estimations are. The reason everyone is moving to AI/neural-network-based processing is that it can give more accurate as well as faster estimations, and at the end of the day that is what really counts.
You remind me of the people who were upset when all games started using shader technology and hardware T&L, and complained that they had to buy a new video card that supported shaders and hardware T&L because their old one did not. Shaders and hardware T&L made all the Voodoo cards that people paid big bucks for obsolete, and they were not happy campers…
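To make the estimation point above concrete, here is a deliberately simplified Python sketch (hypothetical code, not how DLSS Frame Generation or any real codec is actually implemented): block-based motion vectors are estimated between two rendered frames, and an in-between frame is assembled from them, which is the same basic idea video compression has relied on for decades.

# Illustrative sketch (not DLSS Frame Generation's real algorithm): build an
# "estimated" in-between frame from two real frames using per-block motion
# vectors, the same basic idea video codecs use for inter-frame prediction.
import numpy as np

H, W, BLOCK = 64, 64, 8
rng = np.random.default_rng(1)
frame0 = rng.random((H, W, 3)).astype(np.float32)   # rendered frame N
frame1 = rng.random((H, W, 3)).astype(np.float32)   # rendered frame N+1

def estimate_motion(f0, f1):
    """Brute-force block matching: for each block, find the shift that best
    explains how it moved between f0 and f1 (the 'estimation' step)."""
    vectors = {}
    for by in range(0, H, BLOCK):
        for bx in range(0, W, BLOCK):
            block = f0[by:by+BLOCK, bx:bx+BLOCK]
            best, best_err = (0, 0), np.inf
            for dy in range(-2, 3):
                for dx in range(-2, 3):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= H - BLOCK and 0 <= x <= W - BLOCK:
                        err = np.sum((f1[y:y+BLOCK, x:x+BLOCK] - block) ** 2)
                        if err < best_err:
                            best, best_err = (dy, dx), err
            vectors[(by, bx)] = best
    return vectors

def interpolate_midpoint(f0, f1, vectors):
    """Place each block halfway along its motion vector and blend both frames."""
    mid = np.zeros_like(f0)
    for (by, bx), (dy, dx) in vectors.items():
        y = min(max(by + dy // 2, 0), H - BLOCK)
        x = min(max(bx + dx // 2, 0), W - BLOCK)
        mid[y:y+BLOCK, x:x+BLOCK] = 0.5 * (f0[by:by+BLOCK, bx:bx+BLOCK] +
                                           f1[by:by+BLOCK, bx:bx+BLOCK])
    return mid

mid_frame = interpolate_midpoint(frame0, frame1, estimate_motion(frame0, frame1))
print(mid_frame.shape)  # (64, 64, 3) -- the "generated" in-between frame

Whether you call the result "fake" or "estimated", the sketch shows that the generated frame is derived from real rendered data plus a prediction step, which is exactly the trade-off being argued about here.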
Well spoken 🙌
When I see comments like his, it just shows how stupid they really are – total f*ckN clueless idiots who don't even know, in the most basic form, how rendering works!
Yes, Nvidia stopped making true raw-power GPUs long ago.
Their excuse is the processing nodes, now at 5nm.
In order to have more raw power, the GPU's board has to be bigger and the cooling has to be bigger, or they have to move to 3nm or 2nm… but I guess those nodes have terrible yields (more fails than passes on production lines), and let's be honest, who wants a GPU the size of a case?
So, their solution is to have the same number (or maybe more) "transistors" and better software, as with today's RTX 4060 still at 8GB. 8GB is so 2014/15, and no DLSS or Frame Gen can solve the fps drops.
I think we've reached a point of "we want power, but as small as possible, costing as little as possible, and consuming as little as possible".
It's time to get bigger and better at rasterized rendering and overall raw power.
With the RTX 5000 series, 16GB should be the minimum for the RTX 5060, 24GB the standard, and 32GB future-proof, with a 384-bit bus as the minimum and 512-bit as the standard (the Asus R9 390 OC had 8GB on a 512-bit bus in 2015).
But then again, what good is a GPU you buy in 2025 that lasts you until 2031/32, viewed from Nvidia's business side?
your mouth smells like AMD butt
I have an RTX 3070 Ti. The last time I purchased a GPU that wasn't NVIDIA, it was still ATI, and the drivers were so buggy I swore I'd never buy from them again.
Just because I dislike where NVIDIA has been going since the advent of TAA doesn't mean I'm a fanboy for a competitor. AMD and Intel are making the same mistakes right now (FSR 2 and XeSS).
ok, your mouth actually smells fresh. Sorry but the AMD fangaying on this site is surreal, I picked a wrong target
Better to skip this generation and buy a nicer monitor that suits your current GPU, so you can enjoy your games more.
Good point. I have always been an Asus monitor fan, but I may have to look seriously at that MSI QD-OLED that just came out with DP2.1. Never bought anything MSI before, but the specs on that 32inch monitor look "impressive".
Uh??!! Depends on what gen and tier you're at now
Nvidia is like Apple now, selling the same iPhone every year.
More fake upscale features are the future
Texture compression technology WHICH AAA DEVELOPERS WILL NEVER USE. Games will continue to be 100+ GB, thanks to uncompressed assets!
The days of competent programmers like John Carmack in AAA gaming are long gone. Now we just have lazy fxxks.
I hate to break it to you, but most programmers back then weren't all that competent either. Carmack is a once-in-a-generation type of person, not the norm.
The devs want 100GB+ games to make it harder to download them from other sources. So you can get nailed by the ISPs.