The Witcher 3: Wild Hunt has finally been released and CD Projekt RED was kind enough to provide us with a review code. This highly anticipated RPG was mainly hyped via its incredible 2013 in-game trailer, and it’s time now to see how the final version looks and performs on the PC platform.
Before continuing, let’s get this out of the way. Yes, the game has been downgraded. Yes, CD Projekt RED should have admitted it. Yes, its 2013 in-game trailer can now be considered a target trailer. Yes, the PC version – even with all its config tweaks – cannot come close to it. And yes, The Witcher 3: Wild Hunt – even in its current ‘downgraded’ state – is one of the best looking games to date.
As always, we used an Intel i7 4930K (turbo boosted at 4.2GHz) with 8GB RAM, NVIDIA’s GTX690, Windows 8.1 64-bit and the latest WHQL version of the GeForce drivers. NVIDIA has already included an SLI profile for this game that offers exceptional SLI scaling, meaning that PC gamers won’t have to mess around with third-party tools – like the NVIDIA Inspector Tool – in order to enable it.
The Witcher 3: Wild Hunt is an open-world title and, surprisingly enough, it does not require a high-end CPU to shine. In order to find out whether the game scales well across multiple CPU cores, we simulated a dual-core and a quad-core CPU. To our surprise, even our simulated dual-core system was able to push a constant 60fps on almost Ultra settings (albeit using High Foliage Visibility and without Hairworks, as those settings were stressing our GPU rather than our CPU).
The Witcher 3: Wild Hunt is powered by REDengine 3. In our interview with CD Projekt RED, Greg Rdzany claimed that CDPR’s technology already uses the full potential of multi-core CPUs. Well, we are happy to report that Greg was right: REDengine 3 scales incredibly well, even across twelve threads. The engine is definitely future-proof and, given that scaling, we are certain that AMD CPU owners won’t encounter any major performance issues with The Witcher 3.
And while The Witcher 3 does not require a high-end CPU, it demands a really powerful GPU in order to enable all its bells and whistles and retain a constant 60fps framerate at 1080p.
Our GTX690 (therefore, a single GTX970) was unable to offer a constant 60fps experience with Ultra settings at 1080p. Our framerate was averaging around 40-45fps, and that was without NVIDIA’s Hairworks effects. In order to come close to that 60fps experience, we had to lower Foliage Visibility to High. By doing that, we were able to enjoy the game, even though there were some minor drops to 51fps in certain places.
Those with weaker GPUs can tweak a number of graphical effects in order to adjust the game’s performance and visuals to their liking. During our tests, we noticed that Foliage Visibility, Ambient Occlusion, Detail Level and Shadow Quality were the options with the biggest performance hits. Naturally, lowering the resolution will also bring massive performance improvements (at the cost of somewhat “blurrier” visuals).
In order to find out whether GPUs equivalent to NVIDIA’s GTX680 can run The Witcher 3, we tested the game in Single-GPU mode. With our above settings, PC gamers can lock the framerate to 30fps and enjoy an almost constant 30fps experience (there were minor drops to 27fps, so you may also need to lower some other options). However, we could not hit a constant 60fps at 1080p no matter what settings we used – even with everything at its lowest values. To come anywhere close, we had to drop our resolution to 1680×1050 (with Low settings). This basically means that a 60fps experience is out of the question for those owning GPUs equivalent to the GTX680.
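For reference, the average and “minor drop” figures we quote are derived from logged per-frame times. The helper below is a simplified sketch of that arithmetic (the function name is ours, not from any monitoring tool):

```python
def fps_stats(frame_times_ms):
    """Return (average fps, 1% low fps) from a list of per-frame times in ms."""
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # The "1% low" averages the slowest 1% of frames - it captures the
    # dips that a plain average hides.
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    low_fps = 1000.0 * n / sum(worst[:n])
    return avg_fps, low_fps

# 99 frames at 10ms plus one 40ms hitch: the average still looks high
# (~97fps), but the 1% low exposes the 25fps dip.
avg, low = fps_stats([10.0] * 99 + [40.0])
```

This is why a game can “average 40-45fps” yet still feel uneven if the low percentiles dip far below that.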
As you may have noticed, we’ve not talked much about Hairworks. And that’s because this setting comes with a big performance hit. And to be honest, we’d take a 60fps experience over some fancy hair effects any day. There is a workaround to decrease the performance hit of Hairworks by tweaking the game’s “rendering.ini” file. All you have to do is find “hairworksMSAA”, and change its value to 2. Still, we’re recommending this option only to those owning cards that are more powerful than NVIDIA’s GTX980.
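For reference, the tweak looks like this in “rendering.ini” (only the hairworksMSAA key and the value 2 come from our testing; the comment about the default is an assumption to verify against your own file):

```ini
; rendering.ini - lower the MSAA applied to HairWorks geometry to
; reduce its performance hit (the shipped default is reportedly higher)
hairworksMSAA=2
```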
Those with even better GPUs (or SLI systems) can go ahead and further tweak the game’s options via its configuration files. NVIDIA has provided a great guide for most options, so make sure to give it a go. For what it’s worth, we believe that PC gamers should definitely use the Shadow Quality tweaks that are mentioned in that article, as we noticed major shadow pop-in even on Ultra settings.
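As a sketch, those shadow tweaks live in the game’s “user.settings” file under its [Rendering] section. The key names and values below are assumptions based on commonly cited tweaks, not confirmed by us – treat them as placeholders and check them against the NVIDIA guide and your own file:

```ini
[Rendering]
; Larger shadow maps and longer cascade distances reduce shadow pop-in
; (key names and values are illustrative, not recommendations)
CascadeShadowmapSize=3072
CascadeShadowDistanceScale0=2
CascadeShadowDistanceScale1=2
CascadeShadowDistanceScale2=2
```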
As we’ve already said, the game is not up to the standards set by its 2013 in-game trailer. The lighting system in particular seems ‘inferior’ (even though it now uses PBR). This can be easily noticed in almost all interiors. Not only that, but the smoke and fire effects are simpler, and the water effects are not as good as those featured in NVIDIA’s The Witcher 3 Tech Demo.
Still, The Witcher 3 is a beautiful game that will suck you into its world. CD Projekt RED has created an incredible world with highly detailed characters (though the overall lip-sync is average by today’s standards). There is bendable grass, there are a lot of high-resolution textures (despite the fact that the game does not require more than 2GB of VRAM), and its day-night cycle is gorgeous. Thanks to NVIDIA’s GameWorks players can destroy various objects (that’s of course only if you enable Hairworks), and PC gamers can use SweetFX configs in order to desaturate its visuals (a common complaint stemming from the game’s new artistic direction). As we’ve said, the final version of The Witcher 3 may not match its 2013 in-game trailer, but it is one of the most beautiful games we’ve ever seen.
All in all, CD Projekt RED has delivered a beautiful game that requires a mid-range CPU and a really high-end GPU. Those with weaker graphics cards can adjust a number of settings, though most of them will also be forced to lock their game to 30fps. Owners of high-end GPUs and SLI systems can stress their cards by enabling NVIDIA’s Hairworks effects, and by increasing a number of settings via the game’s config files. And now imagine what it would take to run the more advanced graphical effects shown in its “in-game” 2013 trailer (it would have been really cool to actually have them, as that would have made the game truly future-proof).
To summarize, we have to congratulate CD Projekt RED for offering one of the most beautiful open-world games of 2015. We also have to congratulate the team for REDengine 3’s incredible multi-core CPU scaling, and for the fact that the game’s textures look better and require less VRAM than those found in other open-world titles. However, we have to criticize the studio’s decision to pass a target trailer off as an “in-game” trailer, especially when today’s high-end GPUs are being stressed by this ‘downgraded’ version. We also have to criticize its obsessive stance against the “downgrade” claims. Yes, we get it; it’s a business thing. However, it would have been so much better – and we believe a lot of PC gamers would have appreciated it – if CD Projekt RED had told everyone the truth instead of lying.
Enjoy! So, Dragon Age: Inquisition or The Witcher 3 – which one looks better in your opinion? Tell us in the comment section.







John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES one of the best consoles ever. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Contact: Email








Time to upgrade the GPU, John!
The GPU is not the problem. Nvidia is being shady, not bothering to optimize for Kepler.
As in every new GameWorks title. Long-term fact. It’s a problem if customers let developers use black-boxed middleware with no way to optimize it.
Read what I told Icy… Nvidia pulled a fast one by having GameWorks lean on DC, and since Kepler lacks DC performance it makes Maxwell look much more dominant.
But AMD’s 200 series is good with DC, in fact higher than the current 980 and below. So if AMD had optimization for it, a 290X would beat a 980 with GameWorks on at any resolution. That’s my guess at least.
It’s all about fast render paths in the end and games like TW3, ACU run better on 900series Maxwell plus they have better Compute performance than previous gens of GPUs. My guess is they give the fastest render path to the dev for that GPU.
Well, it’s not that they really can. It’s because Kepler lacks the DirectCompute performance that GameWorks effects rely on. Maxwell has more DC power than Kepler, so future GameWorks games are just going to be garbage on Kepler cards.
Kinda like TressFX on Kepler cards next to Maxwell cards, since TressFX runs on DirectCompute.
The 750Ti is a first-gen Maxwell – does it have good DC, or is that exclusive to second-gen Maxwell (the 9xx series)?
Not sure, but I did find these PassMark DirectCompute scores:
GeForce GTX 750 Ti – 1,677
GeForce GTX 960 – 2,849
Radeon R9 270X – 2,362
I guess first-gen Maxwell sits between Kepler and second-gen Maxwell, thanks.
TressFX runs beautifully on Kepler cards because it’s not some bloated GameWorks POS that NVidia uses to degrade performance for AMD users and their own customers with older-gen NVidia cards. TressFX is optimized to run great on any decent Kepler card. NGreedia won’t see my money again.
It’s still garbage on Kepler cards due to the lack of DC.
The grass is so fake. The grass in Crysis 3 looks way better. And the in-game anti-aliasing is ineffective – forcing FXAA through the Nvidia CP works way better. Hopefully the Enhanced Edition will improve the graphics.
Crysis 3 isn’t open world, and its only heavy grass scenes drop the fps to half.
It’s Temporal AA, much better than FXAA, FXAA shimmers, it’s just bad.
The only thing I don’t like about the game is the grass. It just does not feel right – decade-old games like Far Cry 1 have much better grass. But I fully understand that in an insanely huge game like The Witcher 3, high-quality grass would have completely killed the performance for everyone except people with SLI setups, and then players would have complained about the game being very badly optimized. CDPR chose the safer of two evils by downgrading some stuff, especially the foliage system, but players would have been pissed either way.
The game itself is actually much better than I expected but the PC visuals did not blow me away as they should have. Overall it does live up to the hype but not in the way I expected it to.
The changes to the foliage happened because they changed from static/scripted movement to collision detection, I’ve been told that making previous foliage like that would have killed even the most top end rigs and I concur because even right now my 980 SLI is pushed to the extreme.
I’m not really sure about 980 SLI being pushed to the extreme…
my pc specs:
i7 2700k (4.5ghz)
ZOTAC GeForce GTX 970 AMP! Extreme Core Edition (clock 1228MHz//1380 MHz boost)
ram corsair vengeance 4x4gb ddr3-1600
samsung ssd 840 pro
I pushed everything up to ultra settings; the only thing I have disabled is motion blur, as I get sick of this effect.
It runs between 58-64 fps all the time; the only time it went below 50 fps was in Novigrad, where it was 40-46 fps.
I still think SLI is not working/optimized properly on all cards. I’m so happy that I gave up my GTX 770 SLI setup for a single GTX 970.
Well yes, SLI has its own share of problems. I said “pushed to the extreme” because I always see both of my cards utilized at around 95%-97% when running this game; I have never seen them this high with other games.
Still I am getting great performance so far, 60 fps with every thing on ultra + HairWorks + 1440p so it’s great to have SLI for this game.
Yeah, and people just can’t stand a game you can’t max out even with the top-end cards in SLI… I remember when that was normal, so you were also excited to replay your games after upgrading your GPUs and finally see the maxed-out graphics. I mean, why delete/change something when you could keep it and say, as the Attila guys did, that the setting is meant for future, much more powerful GPUs? Or for when DX12 comes?
Well CDPR are known to support their games for long periods and this game is a big deal for them so who knows maybe we get a DX12 patch down the line with upgraded graphics.
Sadly we will never know if that’s true or not, right?
Yes, of course we don’t have inside knowledge or official word on this, though a few guys did a comparison of how the foliage went from static to partially animated/scripted to full collision detection. I forgot the thread, but it’s on the official forums where they posted gif images comparing the movements; they said this could be the major reason behind the changes in foliage.
“CDPR chose the safer of two evils.”
“Evil is evil. Lesser, greater, middling makes no difference. The degree is arbitrary. The definition’s blurred. If I’m to choose between one evil and another, I’d rather not choose at all”
heheen 🙂 …
but the only interesting part is choosing 🙂
black and white is boring
OK, but that’s why there are SETTINGS. Don’t have SLI? Set Foliage to LOW.
The question is why 2013 foliage is not there?
You’re stretching quite a bit there with FC1 grass. I just played that at 4K dsr recently… it’s rendered perfectly but it’s not nearly as dynamic as TW3.
I just think it’s a bit too wildly animated. FC1 did have a better subtle animation.
I just hope it runs decently on AMD. Guys, you need to do a performance analysis on AMD GPUs too.
And I entirely agree with that last paragraph.
Runs very well on my AMD processor and my 7950: 45-60 fps with the tweaked shadows and foliage. 45 fps feels smooth with no stuttering. Advice to all: enable hardware mouse.
I have the same setup as you: FX 8350 @ 4.7, two 7950s not in Crossfire. I have a mixture of high and medium settings – no blur, motion blur, bloom, or sharpening. I’m getting 45 to 60 fps as well. Can’t wait for a Crossfire profile!!!
Good to know, thanks.
Thanks for this small review on looks and performance John, you’re great! I am gonna (as I most often do these days) wait about a month or so before getting this game, so that patches, mods, tweaks, etc can be released for best gaming experience. I am also waiting for AMD’s new monstercard 🙂
Release the 980ti already.
Foliage looks like crap, eh? GTA V foliage looks much much betta 🙁
I’m getting 40-45 fps with the 970 and an i5 4460 with everything maxed; looks like I’ll go medium with shadows and maybe let go of the hair physics.
Medium with shadows?… Look at this guide:
http://www.geforce.com/whats-new/guides/the-witcher-3-wild-hunt-graphics-performance-and-tweaking-guide
People complained about the downgrade, but now they’re complaining about performance. Remember, before the day 1 patch a 770 was able to play on ultra, but now it struggles to. After the patch, which actually made the ultra settings much more demanding due to some increased graphical features, performance has dropped substantially. Basically, people will never be pleased: the game gets downgraded, people complain about the graphics; the graphics get increased, people complain about poor performance.
That’s not what’s going on here at all.
Many PC gamers do it all the time though. If a game has lower visuals but high performance, they complain that the games are being held back by consoles. If it has great graphics and low performance, they complain that it’s an unoptimized console port.
That’s because 99% of games are. There are very few modern PC games that are really optimized for PC. I think I can count them on two hands (or one?). Either the game can’t utilize the cores of the CPU or it can’t utilize the full potential of the GPU.
On the top of my mind –
1. Max Payne 3 – Because it’s not a port. It’s a game developed alongside the console versions
2. GTA V – Runs great on all tiers
3. Crysis 2/3 – CryEngine’s great PC compatibility, I suppose ?
4. Tom Clancy’s Splinter Cell: Blacklist – So they say. Never played it myself, so I’m not sure
Can’t name another anymore 🙁
Better graphics =/= poor performance in all cases. The original E3 demo was played on a single GTX 680. Yes, I agree that back then the game’s map wasn’t finished, but I still think that if they had spent as much time optimizing the game on PC as they did on consoles, the results could have been better.
The difference between MGS V PC and PS4 is huge and it runs smooth as silk even on low end hardware. There is no reason for a game to be THAT demanding when the difference isn’t huge. That’s what people are complaining about. If it had 2013 visuals then fine, why not.
Yeah, Witcher 3 isn’t that graphically impressive. Besides Hairworks and some particle effects, it’s pretty much the same as GTA V. And GTA V runs great
I remember that they said it was running on a Titan, and it was struggling to hold 30fps. I don’t think it’s easy for them to just make the pre-downgrade graphics ultra and the current graphics low or medium. Ground Zeroes runs well, yes, but the performance is mostly in line with what you’d expect – it’s nothing magical. To run it at PS4 level, you need a 7850/750 Ti or higher, which just so happens to be equivalent to the GPU power of the PS4. Its requirements are realistic.
If you guys are talking about the E3 2014 demo at Microshi*t’s conference, it was confirmed to be running on Xbox One at 900p.
On Xbox one before the “toning down.” That’s a really sad fact 🙁
Optimization is not some magic wand trick, and to be honest the game is very well optimized within the capacity of the DX11 API – haven’t you seen how well it scales on the CPU? (DSOG performance analysis). It’s just graphically demanding, so the only way it will get a significant boost is by changing the API to DX12, and even then the difference will be questionable as this is not a CPU-bound title.
And you’re comparing it with the Fox engine that has amazing foliage, wow man simply wow !
When a GTX 960 performs equal to or better than a GTX 780 on ultra at 1080p with no Nvidia GameWorks, it’s obvious people are gonna complain… The game works well for people with 900-series cards, but for anything else the game is a mess… Just talking about Nvidia, not sure about AMD – I heard AMD users are having issues as well…
http://www.pcgameshardware.de/The-Witcher-3-PC-237266/Specials/Grafikkarten-Benchmarks-1159196/
http://www.gamersnexus.net/game-bench/1947-witcher-3-pc-graphics-card-fps-benchmark
“is one of the best looking games to date.”
Where do people keep coming up with this stuff? Crysis 1 looks a lot better in many ways and that came out 8 years ago.
This game is borderline cartoonish, with oil-like water, inconsistent-as-hell textures, tree trunks made of rubber, and some of the worst grass and foliage I’ve seen in a long time.
Don’t get me wrong, I don’t think the game looks BAD…not at all…but it’s certainly not worthy of being put on some pedestal.
And this is coming from someone that can actually play it near max settings at 4K
Yeah I agree. While the game looks good, I don’t think it is one of the best looking games ever. It looks good and in some places really impressive but that’s it. I think character models are really impressive for an open world game.
I think for such a big game it looks gorgeous… better than most linear games. Hell, how many games look as good as or better than Crysis? The Order: 1886, Crysis 3 maybe… yeah, the foliage isn’t great looking, but it’s better this way – at least most people can actually play the game. No one is going to make a game only 1% of people would be able to play. Even GTA V is very inconsistent in quality, and that’s a game from a much bigger company with a ton of budget compared to CD Projekt.
It’s way better grass than Crysis 3’s 30Hz-locked sh*t.
Crysis 3’s grass is truly bad, and somehow many folks overlooked that. Witcher 3 doesn’t deserve so much condemnation for its grass when “the benchmark game” Crysis 3 got a free pass with that BS.
Crysis 1’s graphics are beginning to show their age. People saying it’s still one of the best looking games right now are just being blind.
You know buzzy, we know you are mad because people can’t see what you see…
Or they are like me and just upgraded from their old Core2Quad era system and can finally play the game at full speed with max options 😉
People don’t actually remember what Crysis 1 looked like; it’s all nostalgia goggles. Fact is, Witcher 3 is gorgeous and really does outshine most games.
Have you played it? I have to say that this looks much better in motion on your screen. Sure the water is rather bad but you don’t see it that much.
Also you are remembering the best modded crysis scenes, in actuality it doesn’t look that good.
Agree so much. So many “PC master race” players hype the graphics because it’s the only game to somewhat try to actually use the power of the PC.
Stop talking nonsense. PC master race players hype the graphics because it’s the only game to somewhat use the power of the PC? How do you come up with nonsense like this? So anyone who has a good PC and enjoys visuals is a master race player? Jealous much?
This is the only game to somewhat use the power of the PC? So we just ignore every other game that launches on PC with a raft of features that take advantage of PC hardware, do we? I don’t think any PC gamer wants to play pish like The Order, with 33% of the screen missing and dull linear corridors with a few poor enemies with no artificial intelligence to shoot at, at 30fps – all because everything was sacrificed just to pump out the best possible visuals.
I agree, that foliage looks so weird actually.
All I can say is I’m glad now that I didn’t have my new rig built in time for this; by the time I get it finished, maybe someone will have modded some of the features from the 2013 trailer back in. I’m not a complete graphics whore, but I really wish developers would stop showing us one game and then delivering another in the final product. I’m sure the game is still epic and I can’t wait to play it, but hopefully mods or the EE will restore it to the game it was meant to be.
I think walls look flat because of lack of Parallax Occlusion Mapping.
http://www.d3dcoder.net/Images/Samples/ParallaxDemo.jpg
The sad thing is, Crysis from 2007 had that tech too, as far as I know. It’s a shame that the current consoles aren’t that powerful, which is bad news for all of us because we’ll have to wait another 7 years for the next baby step.
http://www.pcgameshardware.com/screenshots/original/2008/07/Crysis_Parallax_Occlusion_Mapping.jpg
The only way those lying F*CKS can get some redemption would be if they release a DX12 patch/enhanced edition later on that looks as good as – or even better, surpasses – the 2013 trailer!
That’s the only way they can regain some trust; right now, those dipshits know they are the new Ubisoft and rightfully hated.
So right now they don’t deserve one penny; all they deserve is to get their crap pirated. Don’t give a cent to these lying A-holes, I say – they DON’T deserve it!
Please, calm down. We all get it – CDPR did lie about the game but still I doubt that even 1/10 of all gamers who bought TW3 are able to max it out and still play at 60fps. Hell, my 670 is way below the minimum requirements and can’t hit 60fps even at ultra low settings. Yes, I think the way this game runs in terms of GPU performance needed to maintain locked 60fps is kinda bullshit. TW3 is the first game that basically refuses to reach 60fps at any settings.
I’m in the 70-80fps range on high, 1080p, HBAO+; I can also get 60fps quite easily with my GTX 970 on ultra.
As long as you turn off hairworks. I have a 970 and 4790k and can attest that I wouldn’t want anything less powerful for TW3.
The anti-aliasing is screwed in TW3 as well, so you really need to run at 4K DSR, and for that I’m sure something like dual 980s or higher would be needed.
There is something to be said for just having CLEAN graphics in a game. What good is all the extra effects if you can’t run them well?
They announced not long ago that they are not going to make an Enchanted Edition for TW3 but, to be honest, I don’t believe them.
They lied once, which means they will lie twice. They just didn’t say that they will make an Enchanted Edition for TW3 because this would negatively affect the current edition’s sales, which of course they do not want.
My suspicion about them releasing an Enchanted Edition later on comes from CDPR themselves, since they said 2 months ago that when DX12 gets released they may release a DX12 patch for TW3 – it may not be a patch, it could be a new edition with all 16 DLCs and stuff.
You mean ‘Enhanced’. And no, they aren’t lying; they released enhanced editions for both of the previous titles so it’s not a stretch.
Did you read my post properly? It was known that the company would release Enhanced editions for 1 and 2, but they said prematurely that we won’t have an Enchanted edition for 3 – which I doubt is going to happen.
Source or it didn’t happen
True, I didn’t understand your point (not a clear post) – but Google does not turn up any such statement from CDPR. Even if your assumption were true, it would be a nice surprise rather than a negative one. It’s not as if this enhanced edition would suddenly cost extra.
It did come from their forums – a guy from their staff made a post about it – but since the release of the game, if you google about the Witcher, all you get is how to buy it lol.
However, not getting an Enhanced edition – it is known. There are reddit posts which verify that: http://www.reddit.com/r/witcher/comments/36be1m/wait_for_the_witcher_3_enhanced_edition/crcg47i
Also, W3 may get DX12 when it gets released – http://wccftech.com/witcher-3-cdpr-dx12-availabletemporal-aa-consoles/
Because Directx 12? Are you serious?
I wish there were a way to have HairWorks not on Geralt and ONLY on animals. Having HairWorks on Geralt drops frames by 20, but if wolves or a griffin come in, it surprisingly doesn’t drop any lower. I’m pretty sure the penalty for running HairWorks only on animals is pretty low, but boohoohoo, I’m pretty sure we won’t get an option for this.
Also, a single GTX 970 can run the game fine on ultra without in-game AA. Well, a normal peasant 970 won’t, but a 1550MHz 970 will. In-game AA has a nice 5fps penalty.
Desperately waiting for mods – I want pretty animals but not Geralt’s hair. Maybe a mod which replaces models can fix this. When playing as Ciri, the griffin she sees uses HairWorks and the horse she rides uses HairWorks, yet my fps drops by barely 5fps tops. Which is a shame. I will only enable HairWorks when playing as Ciri.
Is it only me, or is there no visual difference between HIGH and ULTRA textures, apart from memory usage?
What comes to my mind is that Unity looks much better and runs the same as The Witcher 3, yet everyone yells UNOPTIMIZED because it’s Ubi. This is how you play tennis without a net.
Well, that game was very buggy – pop-ins, pop-offs, glitchy. Also, that game killed more frames to enable FXAA than any other setting, while FXAA in most other games comes at basically zero performance loss.
Just so you know, I ran the game on an AMD card so there’s that.
Let’s hope there is an enhanced edition like The Witcher 2 got.
John, I read some articles saying the GTX 600-700 series is not well optimized for this game – even a GTX 960 can destroy the original Titan.
OK, thanks for clearing things up, but the GTX 770 has more bandwidth and a wider memory bus; the only thing the GTX 960 wins at is power consumption.
You know what I hate most? An ugly-looking game that performs badly.
Well this is beautiful and performs ok.
Played a bit. Well, yes, it’s not 2013 quality, but it looks way better than YouTube/screenshots once you play it on your PC. Great textures on models, the lip-sync is mediocre as John mentioned already, and the two things I don’t like so far are the grass (it looks blurry) and Geralt’s movement (it looks fake). As a game, it looks really good so far.
So at ultra settings the game looks almost identical to the console version, and yet it requires at least a 970 at 1080p? Oh dear……
I’m guessing the game runs at a mix of medium and high settings on the consoles. It’s just that there aren’t major differences between the presets. Medium, high and ultra all look good.
The game does use more than 2GB of VRAM.
Vegetation brought to you by xbox one…
My god…
Man, I’m not going to play this game just because of this shitty vegetation.
You are right, they should not have released it at all on PC. Graphics are too sh*t for the “masterrace” lmao.
Yeah even GTA 5 has POM.
I did a 4-minute video on the performance on ultra, GTX 970.
https://www.youtube.com/watch?v=P8EgXDHHrcI
BTW guys, the consoles look bad – cutscenes go as low as 20fps, terrible pop-up in the city, missing clouds, chunks of the walls pop in, textures don’t stream in properly.
http://i.imgur.com/cEZzDqR.jpg
http://i.imgur.com/Wqjmvwk.jpg
http://i.imgur.com/ERXpo2A.jpg
http://i.imgur.com/LKdkLP3.jpg
“thanks to NVIDIA’s GameWorks players can destroy various objects (that’s of course only if you enable Hairworks)”
Does this mean that every GameWorks feature is under the Hairworks toggle? Can anyone confirm this?
No, it’s purely Hairworks. PhysX runs on the CPU, so anyone can use it and there is no option.
I am still waiting for a Kepler single-GPU-ready driver!
I just want better lighting – the current one is just too flat. Also, the game really needs SweetFX; it is over-saturated to insanity.
I am sure the old grass would make my graphics card puke blood; the Skyrim grass mods already make it do so.
What software is used to monitor FPS and RAM consumption? MSI Afterburner or something else? Anyone, please reply.
It shows in the screenshot that it’s MSI Afterburner. lol
Are you sure the 960 has better tessellation than the 780? I know the Maxwell architecture enhanced tessellation, but not by so much that a weaker card would drastically outperform a higher-end model. I think there needs to be a new driver, because the 960 and 780 should be quite far apart, with the 780 way out in front.
50-60 fps with Hairworks ON
65-75 fps Hairworks OFF
1080p. All settings Ultra
This is with a GTX 980 G1 and i5 3570k, both stock
It seems that all other GPU solutions, including SLI and all non-Maxwell cards, need optimized drivers to approach similar numbers. The game was clearly made for the Maxwell chips; whether that happened for financial reasons or because they failed to fully test everything else, it is what it is.
That said… AMAZING GAME
Upgrade your CPU to an i7 5960X, 5930K or 5820K.
You can’t please everyone, especially some. Personally, it plays great on PS4 and looks great. Brilliant game – if you’re not playing it, you’re missing out. Simple.
Don’t blame Nvidia.
http://img11.hostingpics.net/pics/704586TheWitcher3Analysis.png
It’s an architecture limitation, not a lack of optimization in Nvidia’s drivers. Same issues with FC4. This is why Maxwell is more powerful in pixel fillrate and less in texture fillrate. Actually, around 100 GTexels is enough for now…
If the GTX 780 is behind or ahead of the GTX 960, it’s because the GTX 780 has multiple configurations (32, 36 or 40 pixels per cycle), so its fillrate can be worse, equal or better. That’s random. If you have performance issues with a GTX 780 in TW3 or FC4, this is the reason.
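The fillrate arithmetic behind this argument is simple to check. The sketch below uses approximate reference specs from memory (TMU counts and clocks are assumptions to verify, not measured values):

```python
def texel_fillrate_gtexels(tmus, clock_mhz):
    # Texture fillrate = texture units x core clock, in GTexels/s
    return tmus * clock_mhz / 1000.0

# Approximate reference specs (assumptions, not measured):
gtx960 = texel_fillrate_gtexels(64, 1178)   # GTX 960: ~75 GTexels/s
gtx780 = texel_fillrate_gtexels(192, 900)   # GTX 780: ~173 GTexels/s
```

Raw texture fillrate clearly favors the 780, which is why per-game results that flip the ranking point at architecture and driver paths rather than paper specs.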
Happy to get this game 19$ on sale from GOG
In other words, it didn’t happen. Every post in that thread uses ‘they’. No one even pretends to claim that they are from CDPR.
A DX12 patch would be welcome either way, at least the draw distance could possibly return, but the DX12 release date is too early for an enhanced edition. These will likely be separate updates unless they delay the DX12 implementation.
I’m going crazy with this article.
This is cool. Seems like the devs made everything good with the PC version. It only took some days.
What the heck has the world come to – a bunch of people on the PC talking about rendered blades of grass. I’ve played every major game that has ever been made; The Witcher 3 is by far the best looking as a whole. Period. I would like to see side-by-side comparisons of any game in the world that looks better.
I’m into W3 for the stories; that aspect I can vouch for being among the best in gaming yet.
There are a few visual aspects of W3 that excel, but it’s far fetched to say that it’s ‘by far’ the best looking game ever made (it was worth my PC upgrade anyway). Even the Witcher 2 had more love in the modelling department for most types of objects simply because they didn’t need to be so modular. In terms of engine, Ground Zeroes has dynamic weather on top of a comparable lighting system to Frostbite and it runs on the same engine as that used in MGS 5. There isn’t a single thing I can recall from Ground Zeroes that was jarringly unrefined. I can’t say the same for the W3.
Fortunately, that hasn’t prevented it from being my favorite game yet because that’s not the point in playing it.
Awesome game! Bought the PC version here for only 26€ ;P
https://www.g2a.com/r/gstore