AMD is, once again, caught playing catch-up. After revealing alternatives to NVIDIA’s DLSS Super Resolution and DLSS Frame Generation, the red team has announced an alternative to NVIDIA’s Neural Texture Compression.
For those unaware, at SIGGRAPH 2023, NVIDIA presented its Neural Texture Compression (NTC) solution. This AI-powered compression reduces texture sizes while keeping their quality high. NVIDIA claimed back then that NTC compresses textures better than modern image formats like AVIF and JPEG XL.
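To give a rough idea of what AI-powered texture compression means in practice, here is a minimal sketch of the general recipe: store a small latent grid plus a tiny decoder network instead of raw texels, and reconstruct the texture by decoding sampled latents. This is purely illustrative and assumes nothing about NVIDIA's actual NTC architecture; the grid size, decoder shape and byte estimates below are made-up placeholders.

```python
# Illustrative sketch only -- NOT NVIDIA's actual NTC pipeline.
# Idea: fit a small latent grid + tiny MLP decoder to one texture, then
# store those (much smaller) instead of the raw texels.
import torch
import torch.nn as nn
import torch.nn.functional as F

H = W = 256                        # texture resolution (stand-in data)
texture = torch.rand(1, 3, H, W)   # pretend this is a real RGB texture

# "Compressed" representation: an 8x-downsampled latent grid + a decoder.
latents = nn.Parameter(torch.randn(1, 8, H // 8, W // 8) * 0.1)
decoder = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 3))

opt = torch.optim.Adam([latents, *decoder.parameters()], lr=1e-2)
for step in range(500):
    # Bilinearly upsample the latents to full resolution, decode per texel.
    feats = F.interpolate(latents, size=(H, W), mode="bilinear",
                          align_corners=False)
    rgb = decoder(feats.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)
    loss = F.mse_loss(rgb, texture)
    opt.zero_grad(); loss.backward(); opt.step()

# Rough storage comparison (assumes 8-bit latents and fp16 weights).
raw_bytes = texture.numel()   # 8-bit RGB texels
comp_bytes = latents.numel() + 2 * sum(p.numel() for p in decoder.parameters())
print(f"raw: {raw_bytes} B, compressed: ~{comp_bytes} B, loss: {loss.item():.5f}")
```

The research versions are far more sophisticated (multiple mip levels, material channels compressed together, GPU-side decoding while sampling), but the storage trade-off above is the core idea.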
And earlier today, AMD announced its plans to reveal its own Neural Texture Block Compression at EGSR 2024. Like NVIDIA, AMD will be using neural networks to compress textures.
AMD promises that this technology will be easy to implement, as runtime execution remains unchanged. However, there is no word on when – and if – video games will take advantage of it.
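AMD has not published details yet, but "unchanged runtime execution" strongly suggests the neural network runs offline and emits ordinary block-compressed (BCn) textures that GPUs already decode in hardware, hence the name Neural Texture Block Compression. As a point of reference for that target format, here is a toy, non-neural BC1-style block encoder. It is a sketch under that assumption, not AMD's method, and it omits details of the real format such as RGB565 endpoint packing.

```python
# Toy BC1-style block encoder (non-neural), to illustrate the block format
# a neural compressor could target offline. NOT AMD's actual method; real
# BC1 packs endpoints as RGB565 and the 16 indices into 32 bits.
import numpy as np

def encode_bc1_block(block):                   # block: (4, 4, 3) uint8 RGB
    pixels = block.reshape(-1, 3).astype(np.float32)
    # Toy endpoint choice: brightest and darkest texels in the block.
    lum = pixels @ np.array([0.299, 0.587, 0.114])
    c0, c1 = pixels[lum.argmax()], pixels[lum.argmin()]
    # BC1 palette: the two endpoints plus two interpolated colors.
    palette = np.stack([c0, c1, (2 * c0 + c1) / 3, (c0 + 2 * c1) / 3])
    # One 2-bit index per texel = nearest palette entry.
    idx = np.argmin(((pixels[:, None] - palette[None]) ** 2).sum(-1), axis=1)
    return c0.astype(np.uint8), c1.astype(np.uint8), idx.astype(np.uint8)

rng = np.random.default_rng(0)
c0, c1, idx = encode_bc1_block(rng.integers(0, 256, (4, 4, 3), dtype=np.uint8))
print(c0, c1, idx)   # 2 endpoint colors + 16 two-bit indices per 4x4 block
```

Because the output is a standard BCn texture, a game engine's loading and sampling code would not need to change at all; only the offline compressor would.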
For what it’s worth, NVIDIA’s Neural Texture Compression hasn’t been used in any game. However, as we’ve noted in a previous article, NVIDIA may find a way to implement this tech in future versions of DLSS. Or at least that’s what NVIDIA’s CEO Jensen Huang hinted at.
There is nothing to add at this point about this new texture compression solution. To be honest, I expect it to be as good as NVIDIA’s solution. That will certainly be a good thing. But, since there aren’t any games that can benefit from either of them, they are – at least for now – a “nothing-burger” for most of us.
Stay tuned for more!
We’ll present “Neural Texture Block Compression” @ #EGSR2024 in London.
Nobody likes downloading huge game packages. Our method compresses the texture using a neural network, reducing data size.
Unchanged runtime execution allows easy game integration. https://t.co/gvj1D8bfBf pic.twitter.com/XglpPkdI8D
— AMD GPUOpen (@GPUOpen) June 25, 2024

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform eventually won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher-degree thesis on "The Evolution of PC Graphics Cards."
John…. Hahahaha
When will you make an article about Intel catching up?
😂
AMD's X3D CPUs slightly beat out Intel's CPUs in specific games thanks to their cache design. So if you do nothing but gaming and have a lot of money, an X3D CPU makes sense.
But the flip side of the coin is that X3D chips are horrible at everything non-gaming, and they get destroyed in single-threaded and other workloads, where even an i5 that costs a fraction of the price easily beats them. If you do other things besides gaming, or are looking for good value, Intel makes much more sense than AMD.
https://uploads.disquscdn.com/images/d3ffde1b145f2c774a8396c1c355cae893067dd2b756ff6d3981d131e5d230b8.png
Kinda irrelevant reply, though…
Until you run into Intel 13th-gen and 14th-gen stability issues and have to underclock your CPU to solve them, negating any performance advantage the chip may have had.
Or until Intel has to release another round of performance-destroying security patches, which for some reason never seem to hit AMD as hard (I suspect because Intel chips may have more speculative-execution features).
That said, now is not a great time to be buying a CPU anyway. You will want to wait until NPUs with at least 40 TOPS come out, if you are even the slightest bit interested in using AI in the future.
The focus should be on writing, art style and such; focusing on hardware and shiny "refwectuns" is just selling hardware banality. This tech isn't going to save a stinker; it'll prop it up for initial sales, I guess. I'm so glad that there are so many games, past and present, that we don't really need any of this bloated scam tech for the most part.
Then they turn their own scam logic of supply and demand on its head: unlimited supply should mean zero value, free, but in this system it's the opposite, games go UP in price, ridiculous. And GPUs go up in price to justify garbage faux innovation and tech like this. Ngreedia and its "competitor", which it likely owns on the sly, needed to focus on doing more with less wattage and other things, but instead focuses on addictive, shiny stuff to get fools to pay for, etc.
Pretty sure there is some kind of agreement between AMD and Nvidia.
AMD's discrete GPUs are total garbage (performance/watt is terrible, their raytracing is terrible, AMD drivers are terrible, their FSR is inferior, and their NVENC alternative is pure bugged junk).
AMD doesn't even seem to care; they don't even lower prices, and they seem fine with having only 10% market share on PC.
But AMD keeps regulators off Nvidia's back. Nvidia can claim there is competition because 10% of users buy discrete AMD GPUs.
In return, Nvidia probably agreed not to compete with AMD for PlayStation/Xbox.
Competition between AMD and Nvidia is a sham.
Nvidia carpet-bombed any relationship with Sony and MS years ago, during the seventh console generation.
6th gen for MS
If Nvidia got involved in the console market, you could expect way overpriced consoles, and both Sony and MS know that pig won't fly.
My RX 6800 XT runs very nicely and the drivers are solid.
I don't know what you're smoking with all this BS.
Good for you? An RTX 4070 has much better performance/watt and outperforms an RX 6800 XT in raytracing, DLSS, NVENC, CUDA, AI, etc. There's a reason everyone is buying Nvidia GPUs.
AMD should be undercutting Nvidia on price, but they don't; AMD somehow maintains price parity with Nvidia, yet AMD GPUs are objectively worse. AMD's GPU market share has crashed into irrelevance. It makes zero sense for a company like AMD not to lower prices, which leads me to believe there is some kind of deal between the two companies.
I have 16GB of VRAM, and the price for that card was good, at least in the EU.
It cost me 370 Euros when the RTX 4070 cost 500, and that card only has 12GB of VRAM.
I don't give a f@ck about ray tracing, upscaling, or streaming.
AVIF and JPEG XL = Fake Frames
It's like Nvidia is the owner of everything. "Catching up" on something that was only vaguely mentioned.
Nvidia's edge isn't their hardware; it's their software/firmware development teams. All the best programmers from gaming were snatched up by Nvidia, given increased pay and significantly better benefits, and treated with the respect they deserved.
Nvidia's software stack is wide and deep, as anyone who has used Omniverse for mechanical and electronics design can attest. Their software allows you to string together hundreds of GPUs and turn them into supercomputers with processing power only dreamed of a couple of decades ago. No one else, especially AMD, has anything that comes even remotely close to Omniverse.
Nvidia hasn't even caught up to ATI image quality 30 years later, LOL. Not to mention Nvidia is behind in MCM.
My 980 Ti, 1080 Ti, 3080 and 4090 prove otherwise. There's an R9 290 and a 7950 (IceQ) up there as well. Shame the last innovation we saw from AMD's GPUs was TrueAudio.
The latest series joins two chunks of silicon directly. Perhaps Nvidia just decided to skip it entirely.
Lol, it's not a real GPU designed for graphics; it's just two AI chips glued together, like Intel's poor attempts. Nvidia and Intel are both behind AMD in MCM, and yes, Nvidia is behind AMD in image quality, look on YouTube, lol. Nvidia defaults to lower image quality out of the box.
MCM, Multi-Chip Module, meaning more than one chip. Nvidia have taken two GPUs and connected them directly to act as one, effectively avoiding the overhead incurred by other vendors.
Past image-quality differences between AMD and Nvidia have been debunked many times. Perhaps you refer to the colour range being reduced (to provide a more pleasing image on cheap displays) in the default Nvidia install? Indeed, if you look at modern reviews of AMD, Intel and Nvidia, you will see that AMD come last in terms of image quality.
I find the number of tech experts that start a post with 'lol' to be quite surprising.
Multi-chiplet design = worse performance than monolithic. There is simply no way around this engineering FACT.
Its purpose is to lower manufacturing costs, NOT increase performance, which it DOES NOT do.
MCM is a flop for AMD because they still priced their GPUs like they were monolithic, and that's why the 7000 series FLOPPED big time and actually lost them market share. Instead of passing the cost savings to the consumer, they got greedy and tried to keep them as increased profits.
John, John… at this point it's very shortsighted to mock them; on the contrary, we should celebrate every single time they make a step towards staying relevant in any way.
John is blowing Jensen again. How predictable.
Not at all. AMD really are that far behind as their focus is console/mobile.