By now, we all pretty much know that NVIDIA will shortly release the GeForce GTX1080Ti. And while the green team has not officially announced anything yet, a lot of people believe that it will reveal (and perhaps release) this GPU during a GeForce event at this year’s GDC 2017. Well, we now know for a fact that the GeForce GTX1080Ti is real, as the official packshot of Halo Wars 2’s retail version has revealed this brand new, unannounced graphics card.
As we can see below, Microsoft and THQ Nordic list NVIDIA’s GTX1080Ti GPU for the game’s Ultra Requirements.
Do note that the retail version of Halo Wars 2 will be released only in Europe as its North American release has been cancelled.
So there you have it everyone, the NVIDIA GeForce GTX1080Ti is indeed coming (duh).
According to rumours, the GeForce GTX1080Ti will pack 12GB of GDDR5 memory, feature 3328 CUDA cores, be clocked at 1503MHz and deliver 10.8 TFLOPs of compute performance.
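As a rough sanity check on those rumoured numbers (assuming the usual FP32 estimate of two operations per CUDA core per clock), the 10.8 TFLOPs figure lines up with a boost clock of roughly 1.6GHz rather than the 1503MHz base clock:

```python
# Rough FP32 throughput estimate: 2 ops (one FMA) per CUDA core per clock.
# All figures below are the rumoured specs, not confirmed ones.
cuda_cores = 3328
base_clock_ghz = 1.503                      # rumoured base clock

tflops_at_base = 2 * cuda_cores * base_clock_ghz / 1000
print(f"At base clock: {tflops_at_base:.1f} TFLOPs")         # ~10.0 TFLOPs

# Working backwards from the rumoured 10.8 TFLOPs figure:
implied_boost_ghz = 10.8 * 1000 / (2 * cuda_cores)
print(f"Implied boost clock: ~{implied_boost_ghz:.2f} GHz")  # ~1.62 GHz
```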
Stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform eventually won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”

I mean cmon. With all the buzz around AMD, one has to have come across an article talking about the 1080ti.
Unless you live under a rock OR you are Zatara’s Stalker :O:O:O:O
I’m actually confused by what you mean ha ha. I personally couldn’t care less about the Ti, as I feel GPUs have been progressing throughout the years while CPUs have just been standing still. Very boring for people wanting more CPU performance per dollar over the years.
Yea it was a little cryptic. The point of my post was to state that this article is clickbait. Every goddamn body knows about the 1080Ti. With everything that’s happening in the GPU market these days, everybody stumbled across a 1080Ti article six months ago. I believe this site makes money out of clicks, so yea…
I clicked it.
All sites do it to be honest, but yeah I agree, the 1080Ti is probably going to be released at GDC. I could personally see people buying it with Ryzen, since Vega is still not coming until the 2nd quarter.
https://uploads.disquscdn.com/images/96bef2b9018f614a46c770ee74419b469cb8822f88fcd7504c691b71ff87f61d.gif
Lel
Who’s Zatara?
He was a frequent commenter on this site who mysteriously disappeared; I can neither confirm nor deny that Zatara is tied up in my basement next to my Zatara shrine.
So either it’s you, they changed names to avoid you, or you did something to Zatara. The first 2 don’t count.
lol what a bunch of BS. If you need a 1080 TI for Ultra it better be the best looking game in the f’king universe. And notice on the AMD side it’s a Fury X… So pretty much they are saying they want you to have the best you can get right now from AMD / Nvidia. And when companies do that… That is normally a sign of a trash port…
Ultra = 4K@60.
looks like a 1080 does just fine…. Funny how the box says 1080 TI…
WCCFtech did a review on the game.. It’s P2W lmao just pathetic
1080p:
https://uploads.disquscdn.com/images/4e0bb3eebd200b3e9ebb7e404ce75e19f3233e2c811aa56abe8f28460b63eec1.png
1440p:
https://uploads.disquscdn.com/images/8ac299bbf2607b73ee5744c66aa10e212886011a058a479e7c85ac1e79173259.png
4K:
https://uploads.disquscdn.com/images/61ebb2553a5d0d81ca915476a87802ce6d9b0f88a1c3034c3780be78327c5d66.png
The benchmarks below completely demolish your reasoning.
I take it you don’t understand the reason for what I said… They put high-end GPUs… And BTW, those benchmarks use a CPU that costs almost twice as much as 90% of the GPUs in the benchmark.
cpu scaling:
https://uploads.disquscdn.com/images/12822da05a0bd2eb06435a29bb6222779849954a2b2d189f324a42c71c6d34f8.png
cpu cores scaling:
https://uploads.disquscdn.com/images/2526ff19f48f94c75385b4ce5100d0591092cd5dc590b3c6e10cd24b707339b3.png
I take it you don’t understand benchmarks. The CPU is really high-end in order to eliminate bottlenecks during testing so that a fair assessment of pure GPU performance can be obtained.
Your argument was that they said it needed a 1080Ti because it was probably a trash port. This is simply not true, because “Ultra” to them means 4K@60fps, which is a very demanding standard. If you actually look at the benchmarks, even a 1050Ti can handle 1080p@60fps, so your reasoning is simply illogical.
And how do you know that ULTRA for them means 4k60fps? I mean ultra is ultra, 4k is 4k, 60fps is 60 fps, it’s not something you just have to stick together.
Also, why would they write 1080@60 after recommended, and not 3840@60 or 4k@60 after ultra?
Just look at the benchmarks, ffs. All of the benchmarks are run at VHQ (Very High Quality), and at 1080p a 1050Ti never drops below 60fps.
If they didn’t write anything after Ultra but wrote 1080p@60 for recommended, that can only mean that Ultra to them is higher than 1080p 60fps, which proves my point.
You can save yourself a lot of trouble if you just look at the benchmarks.
I don’t know what benchmarks you are talking about, but that still doesn’t prove you right; for all we know, ULTRA might mean 2K, or it might mean 1080p with ultra settings.
I’m referring to the benchmarks above. The 1080 and 1070 get waayy over 100 FPS so a 1080ti isn’t even necessary.
which benchmarks above? I see nothing, if someone posted them, it might be someone i blocked, repost that for me please.
Just search “Halo Wars 2 GAMEGPU Bench” and they should come up in google images.
Found them. Anyway, we’ll see; nothing has been proven yet, “ULTRA” might still not mean 4K.
Wait what? The benchmarks specifically state that the game was run at 4K at VHQ (Max settings)
Oh wait, I just found the 4K ones. Wtf, if a 1070 can run it at 60fps at 4K at max settings, why the fck did they write 1080Ti then?
Hell, even a 980Ti can run it at 58 fps at 4K. The devs probably did that to play it safe. It’s industry practice to add a bit of overkill to system requirements so that users who have the listed hardware can’t come after the company when the game runs poorly. But in reality, hardware much lower than the specified requirement will suffice. This is why it’s not a good idea to make assumptions about the game based on the requirements.
Yeah, but that’s so stupid. I’d understand if they had put a 1080, but a 1080Ti? Which will probably be at least 20% faster than the normal 1080, which is itself another 20-30% faster than the 1070, and that already suffices for 4K. So stupid.
Yeah I agree, it’s wayyy too much overkill
And looking at the GAMEGPU benchmarks, they are terrible on performance. I mean wow, really. I hope they don’t try to say it’s a DX12 game… because if they do, it’s an even bigger joke.
Yeah I played the beta as well. And it’s nothing special in terms of visuals… Yet another DX12 Flop.
What are you smoking? The benchmarks show that the 980Ti can still get a 58 fps average at 4K. If anything, it’s pretty well optimized.
lol a 980 TI should destroy this game at 4K… 58 avg? I don’t think so. It should be well over 60+
lol I’ve been PC gaming since way before you were even typing.
And the game does not need such a CPU at all.
And it’s Ultra for 4K, lmao. Give me a break, it’s an RTS, not a fully open-world game with high-grade textures.
And the box says a 1080 TI on Ultra. To me, that’s a giveaway that it’s generally a trash port when a company pairs a super high-end GPU with the CPU it wants for 4K, since at 4K a game is more GPU dependent than CPU dependent.
You seem to not understand the logic at all.
GameGPU CPU scaling benchmarks:
Core i7 5960X (Amazon: $1,199): min. 87 fps, avg. 98 fps
Core i3 6100 (Amazon: $116): min. 86 fps, avg. 92 fps
Yeah and that’s terrible..
Judging by the way you type, you haven’t been typing very long, and you pretend to know my age, which you have no way of knowing, so try again. You might have a YT channel, but based on what you’re saying, I feel bad for anyone who gets their info on CPU/GPU utilization from you.
“The game does not need such a CPU at all”
You still don’t understand, do you? The CPU is overkill on PURPOSE to avoid CPU BOTTLENECKS. THAT’S HOW FAIR BENCHMARKING WORKS.
Usually devs put overkill requirements to play it safe, but like I said before, even cards like the 1050 Ti perform just fine (60 fps) at 1080p Ultra.
This
“Usually devs put overkill requirements to play it safe”
^
Play it safe… lmao, so now MS devs are in the ranks of Ubisoft Kiev devs?
If I were a publisher or dev or whatever, I would also put higher requirements than what’s actually needed so that expectations are controlled. I don’t know why you both fought over this, but it seems like a normal thing to do.
In my PoV that is. I’m not the only one on earth.
Publishers have nothing to do with setting the min/rec specs… It’s clearly another Windows 10 MS disaster. And the game is also P2W… Such a shame.
Microsoft isn’t the first one to do this. I think they might’ve followed a trend. And how is putting overkill specs a “disaster”?
I think this has gone too far.
Yeah I do fully understand. And the game is a joke and you are sounding like a typical M$ s h i l l that found out about DSO.
And fair benchmarking works when you use a CPU in the same price range as the GPU. That’s how benchmark scaling works.
I guess we will see how the final game works. But so far nothing looks impressive at all.
The most shocking thing of all is that a Halo game is published by THQ? What in the hell?
THQ went bankrupt; this is THQ Nordic, I think.
Only the retail copies.
This is only for 4K 60fps at max settings; however, for 1440p at 30+ fps maxed out, my 970 or a similar card will be enough!
Needing a 1080Ti to run a console-based RTS at ultra makes for such a good joke.
No rebindable keys, a console-based UI command wheel on PC, and now a far more expensive, higher-end GPU that isn’t even out yet as the game’s “ultra” requirement; the whole game is a joke.
I believe ultra includes 4k resolution and 4k textures.
The keys are rebindable.
Still, I played the beta… it’s nothing that great to look at. No way should you need a 1080 TI, heck even a 980 TI, to play this game at 4K…
‘leaked’
Phil Spencer, 17th February:
“Gaming on PC is something that we are committed to”
19th February:
“the retail version of Halo Wars 2 (on PC aka UWP) will be released only in Europe as its North American release has been cancelled”
That is up to THQ Nordic, they are the ones taking care of the retail copies.
What a dumb comment.
The game is a “Play Anywhere” title. If you buy the XBone version, it comes with a code for the Windows Store too.
GDDR5 instead of GDDR5X? Even the GTX 1080 uses GDDR5X.
GDDR5 and GDDR5X… the difference isn’t too big. Also, the 1080 was the first card to have it. I believe only the Titan XP and the 1080 currently use GDDR5X. Somehow I doubt many people are aware of what GDDR5X is, and that’s why it’s not mentioned in the rumoured specs.
Ummmm, yes it is… My Titan X’s GDDR5X memory runs at 11800MHz… Good luck getting close to that with GDDR5. I would say 20-25% is “big” when it comes to memory performance… But I still think that the 1080Ti WILL get GDDR5X… It makes very little sense not to have it. I think the difference memory-wise will be the bus width. Likely the 1080Ti will be 256-bit, where the Titan X is 384-bit. My Titan X has 566.8GB/s of memory bandwidth! Anyway… I think we should wait for the “official” specs before we start deciding what is what…
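For what it’s worth, that bandwidth figure checks out if you plug the numbers into the usual formula; a quick sketch, using the overclocked 11800MHz rate quoted above rather than a stock spec:

```python
# Memory bandwidth = (bus width in bits / 8) * effective data rate per pin.
# The 11.8 Gbps rate is the overclocked figure quoted above, not a stock spec.
bus_width_bits = 384
data_rate_gbps = 11.8            # 11800 MHz effective = 11.8 Gbit/s per pin

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gb_s:.1f} GB/s")   # ~566.4 GB/s, in line with the ~566.8 GB/s quoted
```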
The difference doesn’t seem to show up much in gameplay. Sorry, I wasn’t being specific.
of course the 1080TI is real, and it’s coming
but what about the PRICE
i couldn’t give a f**k about release dates, I want to know how much Nvidia will charge us for their top-of-the-line card.
For a second there I read this as “Halo Wars’ minimum requirement is a GTX1080Ti”. With the nonsense that MS is doing and the incompetence they’re constantly displaying, it wouldn’t surprise me if that were true.
I have this game early and I can say it is not a stunning game in terms of graphics; textures are mid to low quality.
And the battlefield is not filled with enough enemies to call the game CPU and GPU hungry.
I love how, for the “Ultra” spec, it lists the Fury X from AMD’s camp, but right beside that it states 8GB VRAM… lol… Too bad for AMD fans I guess… BTW, no hate here for AMD, I just thought it was stupid that they listed the card required to play the game on “Ultra” (the Fury X, which only has 4GB VRAM) and then right beside it stated that for “Ultra” settings you need 8GB VRAM…
Does the writer mean it has 12GB GDDR5X? I doubt the TI would have inferior VRAM to the 1080
Well, the 1080 only has 8GB of VRAM, so even if it did turn out to be GDDR5, it would still have 4 more GB of VRAM.
????????????
The AMD Fury card must be pretty good.
Not really, it’s AMD’s only card that can kinda play games at an enthusiast level. And it’s still limited by 4GB of VRAM in 2017. lol
Then why not show rx490 or whatever? Since 1080ti has not been launched yet?
Two months later, the 1080Ti has launched, with Nvidia’s board partner cards to follow. No sign of the RX 490 or Vega.
I like how they set an actual performance target of 1080p/60 for the recommended HW. Having real-world performance targets is so much better than the arbitrary Min/Rec/Ultra system that is currently the standard.
People tend to look at performance reqs in too simplistic a fashion, then assume their card should get Ultra settings in game “X” because it was a high-end card. But what does Ultra even mean? It varies from dev to dev.
Is the Ultra req max settings at 4K 60fps? Or 1440p? etc. If the perf target is 1080p/60 Ultra, then a much lower card would probably be listed vs. the others. Not to mention that Ultra in some games may just be 1080p-level textures and 4x AA, while another game’s Ultra is 4K textures and 4x supersampling; once again, a huge difference in the expected card range. All of which leads to premature fanboy rage.