Nvidia announced yesterday that its upcoming DX11 driver, which improves performance in CPU-bound titles, will be released this Monday, and you can view its complete changelog below. According to the release notes, the new driver introduces key DirectX optimizations that result in reduced game-loading times and significant performance increases across a wide variety of games.
Here are the release notes for this new driver (GeForce 337.50 Beta):
Performance
Introduces key DirectX optimizations which result in reduced game-loading times and significant performance increases across a wide variety of games. Results will vary depending on your GPU and system configuration. Here are some examples of measured gains versus the previous 335.23 WHQL driver:
GeForce GTX 700 Series (Single GPU):
- Up to 64% in Total War: Rome II
- Up to 25% in The Elder Scrolls V: Skyrim
- Up to 23% in Sleeping Dogs
- Up to 21% in Star Swarm
- Up to 15% in Batman: Arkham Origins
- Up to 10% in Metro: Last Light
- Up to 8% in Hitman Absolution
- Up to 7% in Sniper Elite V2
- Up to 6% in Tomb Raider
- Up to 6% in F1 2013
GeForce GTX 700 Series (SLI):
- Up to 71% in Total War: Rome II
- Up to 53% in Sniper Elite V2
- Up to 45% in Aliens vs. Predator
- Up to 31% in Sleeping Dogs
- Up to 20% in CoD: Black Ops 2
- Up to 10% in Hitman Absolution
- Up to 9% in F1 2013
- Up to 7% in Far Cry 3
- Up to 6% in Metro: Last Light
- Up to 6% in Batman: Arkham Origins
SLI Technology:
- Total War: Rome II – added profile
- War Thunder – added profile
- Watch Dogs – updated profile
- Diablo III – updated profile
Gaming Technology
- Supports GeForce ShadowPlay™ technology
- Supports GeForce ShadowPlay™ Twitch Streaming
SHIELD
- Supports NVIDIA GameStream™ technology
3D Vision
Supports new “3D Compatibility Mode” for 3D Vision that enables us to improve the 3D experience for many key DirectX 10 and 11 games
3D Vision Profiles
- Path of Exile – rated “Good”
- KickBeat – rating now “Excellent”
3D Compatibility Mode Profiles
- Assassin’s Creed Liberation – previously “Not Recommended”, now rated as “Excellent”
- Sniper Elite: Nazi Zombie Army – previously “Good”, now rated as “Excellent”
- Sniper Elite: Nazi Zombie Army 2 – previously “Good”, now rated as “Excellent”
- Strike Suit Zero – previously “Not Recommended”, now rated as “Good”
- Watch Dogs – rated as “Good”

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Hardly a “wonder” driver but welcome nevertheless. I wonder if there will be a DirectX update.
Maybe M$ will try to buy Nvidia’s DEV team so DX12 will be everything they say it will be
Nvidia > AMD come at me brehs.
Don’t try to start a fight. You’re a $ony fanboy and you’re not welcome here, so don’t try to pit PC gamers against each other (they’ll eat you alive)
LOL. Delusional. I bet my PC is more powerful than yours, and I know more about tech than you.
Dumb fanboys here are ignorant of PS4’s architecture changes/advantages, how devs will still spend years optimizing it, and how it can punch above its weight in ways PCs still have to brute force. You refuse to admit both PC and PS4/consoles have their advantages and disadvantages, instead preferring to scream like ignorant fanboy children about peasants and such.
Time to admit I was right, as usual.
still alive, huh?
Yeah curious how this new Nvidia driver is going to improve things.
i have Rome II installed and i have my benchmarks from it; i’ll wait for the driver and then test it again
This is not the right time to discuss your love for the PS4, though at the moment you seem to talk about nothing else but the same sh*t.
That f*ck lives in a dreamworld, don’t know what that f*cker is doing over here anyway.
Delusional console Sony turd fanboy is all he is, totally blind to the truth heh!
So apparently I can’t appreciate the architectural advantages, console optimization, and great price/performance of a PS4 because a GTX Titan Black just craps all over it. Got it.
I’m right with facts on my side, you’re a dumb and delusional fanboy.
psst I own a GTX 780 Ti and i7-4770k
Consoles have architectural and low-level API advantages, happy now? We knew that, but it doesn’t change the fact that consoles are weaker this generation compared to a budget gaming PC.
Nvidia is indeed better than AMD, but AMD is cheaper, so that’s probably why they’re always arguing.
Yes because Microsoft, Nintendo and Sony wanted a cheap deal so they didn’t have to spend years losing money on their consoles. Ironically Nintendo and Sony are losing money anyway.
Consoles are nowhere near top-end PCs like the PS3/Xbox 360 were, because they went cheap this time.
Yeah, this generation of consoles is kind of disappointing tbh. They got 8 years out of the old ones because they were really powerful at launch, they’d have gotten a couple more if they had put more RAM in them. But the PS4/One are pretty ‘meh’ :/
Just saw a laptop with an NVIDIA GTX 880M running BF4 at 1080p on high settings, beating a PS4 (which runs at 900p) and destroying the XB1.
Laptop mobile GPUs beating consoles, who’d have thought it. LoL
Well, someone’s got to fight the good fight, I guess. Good luck!
This would be nice… if I could have more than ONE game on my computer. Titanfall is still taking up all my space reserved for gaming (50 goddamned gigs).
Elder Scrolls Online is saying 60GB minimum. :p
If that’s all the space you have reserved for game installs on a PC capable of running Titanfall… you need to rethink your priorities or buy an extra freaking hard drive lol
Haha – yeah, I really wish SSD prices would plummet already.
Yeah, I need too much space to deal with SSDs. Distributing your data across drives helps, though. Windows is on one drive for me, I have two 1TB drives for data, and I have a 3TB drive that is solely game installs. My load times are just fine, really. I only notice much of a wait in Skyrim’s initial load, which is probably due to all the HD textures and mods 😡
Hmm, I wonder if this will have much impact on Arma III.
I think Arma III’s big issue is poor CPU optimization. I can adjust the graphics settings up and down in those titles and it doesn’t seem to do anything for my framerate either way.
Well, it’s still a beta; I just wonder how much performance they can squeeze out with the final release, since not all of us can afford $300-$700 GPUs every year.
You never need to do that unless you’re obsessed with maxing every single title.
yeah that’s true. I just wonder how these drivers will work on my setup. It’s really interesting that they’re making DX11 games perform better. Kind of a shame they don’t do the same with games like Guild Wars 2 that are DX9 and very CPU demanding. Oh well, no biggie, free performance is better than none.
They said the same for Dx9 games, it was in the description of the driver.
aaaaah yes I see I seee. can’t wait to test that out
I just hope we’ll see a good performance increase in 64-player modes in Battlefield 4.
BF4 is hardly a CPU-bound game; plus, they should release benchmarks with slow CPUs to see the difference. Most games on the list are from AMD’s Gaming Evolved program, so it looks more like they optimized the drivers for specific games, but benchmarks will tell more.
We’ll put these drivers to the test with our Q9650, so it will be interesting to see whether there are actual performance gains in games like Tomb Raider, THIEF, Sleeping Dogs, Resident Evil 6 and Battlefield 4.
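If you want to sanity-check this kind of claim on your own machine, here is a minimal sketch of how you could compare two driver runs, assuming FRAPS-style frametime logs (a CSV of frame index and cumulative time in milliseconds); the file names are placeholders, not real captures:

```python
# Minimal sketch: compare average FPS between two driver runs using
# FRAPS-style frametime logs (CSV rows of: frame index, cumulative time in ms).
# File names are placeholders - point them at your own captures.
import csv

def avg_fps(path):
    # Average FPS over a capture = frames rendered / elapsed seconds.
    with open(path, newline="") as f:
        rows = [r for r in csv.reader(f) if r and r[0].strip().isdigit()]
    elapsed_s = (float(rows[-1][1]) - float(rows[0][1])) / 1000.0
    return len(rows) / elapsed_s

old = avg_fps("rome2_335.23_frametimes.csv")   # previous WHQL driver
new = avg_fps("rome2_337.50_frametimes.csv")   # new beta driver
print(f"335.23: {old:.1f} fps | 337.50: {new:.1f} fps | gain: {100 * (new - old) / old:+.1f}%")
```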
Yeah, thanks John. Can’t wait to see the results of testing.
That’s why nVidia rules, everything they do they do better than the so-called competition.
They leave the crappy AMD in the dust!
It’s simple get nVidia or GTFO 😉
Or get what you like and enjoy the technology from whoever brings out what, as it’s all great competition.
Yeah thats also true 🙂
I just wish AMD were better, as right now nVidia can pretty much charge whatever they want for their cards :/
Alright, maybe not that bad, but you get me.
I almost bought an AMD card after Mantle came out, but after seeing what Nvidia were bringing out (Maxwell) and MS hopefully getting their act together with DX12, I went out and bought a 750ti. I’m glad I did, it’ll keep me happy until the 8 series of Maxwell is released. Whoever you go with, PC gaming is where it’s at 🙂
I did the same, went with a 750 Ti and i will never look back, thing is a beast for what it is and the price! Not to mention it overclocks really well.
You’re right, I’ve been telling people it’s a beast too 🙂 I bought the KFA2 OC edition for £100, it doesn’t even have a six pin connector like the other models, but it’s still great. What model did you get?
Mantle and DX12 have very little to do with the GPU lol, but sure, it’s probably a reason for someone to get a particular brand.
Yes obviously has nothing to do with it lol
What gave nvidia and ms a kick up the backside? amd’s mantle. Does anyone know if this will do much for my 670?
Apparently this is meant to increase CPU performance with Nvidia cards, freeing up some kind of bottleneck. If you have an older CPU it should give you a performance boost in some games that use lots of CPU calculations, like Rome Total War II.
If you have an overclocked i7 and a GTX 670 I’d doubt you would notice a performance boost.
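If you want to check for yourself whether a particular game is actually CPU-bound on your rig, here is a rough sketch (assuming the psutil and pynvml Python packages; the 90%/80% thresholds are arbitrary guesses, not anything Nvidia publishes) that watches per-core CPU load against GPU utilization while the game runs:

```python
# Rough CPU-bottleneck check: if one CPU core is pegged while the GPU sits
# well below full utilization, the CPU/driver side is likely the bottleneck.
# Requires: pip install psutil nvidia-ml-py
import psutil
from pynvml import nvmlInit, nvmlDeviceGetHandleByIndex, nvmlDeviceGetUtilizationRates

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)

for _ in range(30):                                   # sample ~30 seconds while playing
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    gpu_util = nvmlDeviceGetUtilizationRates(gpu).gpu
    cpu_bound = max(per_core) > 90 and gpu_util < 80  # thresholds are guesses
    verdict = "likely CPU-bound" if cpu_bound else "GPU-bound or balanced"
    print(f"busiest core {max(per_core):5.1f}%  GPU {gpu_util:3d}%  {verdict}")
```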
It’s a bit of a trick really. AMD FX CPUs are so weak in CPU-bound games that they need to draw attention away from that fact with Mantle. You can blame devs for making their games so CPU bound: the last-gen consoles had very strong CPUs, so they shifted the workload more to the CPU, which works well at the low resolutions consoles run at (720p), plus the low-level access makes that work better.
I usually stick with EVGA – they’ve never let me down. I think EVGA is exclusively Nvidia, now.
I hear really good things about EVGA. I’ve had Gigabyte for my last three cards (HD5570, HD6870, GTX760) and I really like them. The custom coolers they use are great.
Nvidia has always had better driver support and typically uses less power than a comparable AMD card. Really the only thing I don’t like about Nvidia is their naming conventions. They name their cards by the technology that goes into them, NOT by how powerful they are. For example, a 650 is just as powerful as a 480 but costs a lot more (just a random example, not specifically true) – consumers don’t usually know these kinds of things, and review sites don’t usually compare cards that are a couple years apart.
Oh, one more thing I don’t like about Nvidia: they killed PhysX. It used to be a stand-alone card that went in a PCI slot, but they did away with that in order to sell consumers another graphics card that is more expensive and far less efficient at calculating physics. A discrete physics chip is an amazing concept and Nvidia killed it so they could make a buck. I’ll never forgive them for that.
I think most GameWorks stuff works on any card (pretty sure the hair effects use DirectCompute, for example). Crazy of them, providing some nice tools to save developers time.
And AMD does the same. fucking. thing. I had an AMD card at the time (thank goodness), but do you remember what a clusterfuck Tomb Raider was for Nvidia users? It had barely been tested on Nvidia cards at all since they were in bed with AMD for the title. I’m kind of sick of that thing from both companies.
There’s really no way to stop that kind of thing from happening. Game devs need the money, plus the architecture for both AMD and Nvidia chips is a bit different, so some form of proprietary code is needed in order to maximize performance. It will probably only get worse from here since AMD now has a monopoly on the consoles. Thankfully it doesn’t make all THAT much difference. Mostly you can fix any of these ‘exclusive’ features by turning off or down one of the graphics options – it’s never anything that will prevent you from enjoying the game.
I don’t mind a bit of optimization, and I know it’s probably helpful for a developer to have Nvidia or AMD on board helping a bit. I don’t think that’s an excuse for a game to ship that hasn’t been properly tested on the other company’s cards, though. At that point it’s just a big insult to consumers. And do they really want people saying “optimized for AMD/Nvidia? Guess I’ll pass, since it will run like crap on my card”?
I’m not that worried about AMD being in consoles hurting Nvidia game performance since that hardware is going to be left behind rapidly anyway. I think the lasting effect will be those slow 8-core CPUs forcing developers to multithread their games better, something that should help everyone on PC since even quad-cores aren’t getting fully utilized at this point.
Yeah, but throwing more cores at the problem doesn’t help much if your architecture is weak like the FX CPUs’. Intel CPUs destroy AMD CPUs in CPU-bound games because their architecture is good.
It helps in heavily threaded games. The FX 8350 matches i7’s and i5’s in heavily threaded games like Crysis 3 and BF4. It will probably continue in the future.
Nope, clock for clock the AMD FX loses to the lower clocked Intel CPUs with only 4 cores in BF4. AMD CPUs still need raw MHz to get near Intel’s.
Price-wise, yeah, the AMD CPUs are great, but they lose to the Intels so badly sometimes that the price advantage is somewhat negated.
It does help. Just check out some benchmarks of BF4 and Crysis 3.
AMD have closed the gap for sure with Piledriver, i’m getting a FX 6300 next week. 🙂
Good choice. That’s a great CPU, especially if you are on a budget.
You’re kind of missing the point. Intel beats AMD core-to-core, there’s no doubt about that. But AMD gives you more cores for your money. Their octocores aren’t as good for gaming because games rarely use more than 3-4 cores, something that is probably going to change as devs have to optimize for the consoles’ weak octocores.
I mean, there’s quite a bit of horsepower there- it’s just distributed over more cores usually. And price/performance wise, AMD still gives you a decent deal. Though I’m wondering where the HELL their new enthusiast CPUs are. I’d like to give them a chance to keep me (I have a Phenom II x6 1090T atm), but I’d like to see something fresh and new from them.
It depends on the architecture of how the cores work, like I said, more cores doesn’t mean sh*t if you don’t use them right or provide a good architecture around them and the FX series doesn’t.
AMD doesn’t have any influence over consoles, so it’s not a monopoly and customers have a choice in consoles; the hardware is transparent to console users since it has no relevance to them either.
NVIDIA are more interested in the mobile platform with Tegra which is much more profitable.
Well, I guess the last-gen consoles have a massive install base, so I should have said that AMD has a monopoly on “next-gen” consoles.
It’s not a monopoly, console makers had a choice in who they went to for their hardware. If AMD had a monopoly, they would have been the only choice and supplier for the console hardware.
Haha, I’m not talking about an economic monopoly. There are numerous definitions of the word.
“the exclusive possession or control of the supply of or trade in a commodity or service”
Very good, that’s one definition! There are others, I’m sure you can find them.
There isn’t one to what you’re claiming, you’re just using the wrong word.
“The exclusive control of possession of something.”
I’m not saying you’re wrong for questioning what I meant, but I’m not wrong either – I just used the word in a way that you hadn’t thought to.
AMD doesn’t have exclusive control or possession of any of the consoles, or of the businesses and developers making them.
Monopoly is an absolute state, but he meant it as a matter of degree. Same as I did. There is probably a word for it that I don’t know. Point is, you got it ^^
The new consoles show us a lot of things. For the first time a console uses an SoC, for the first time it uses a low-end CPU (Jaguar is used in ultrathins/tablets), and for the first time we have unified memory. Mainly, for the first time we have very low-end hardware specs (compared to the previous gen) capable of running quite demanding games. And no, the console makers didn’t have much choice about who to get the SoC from. No one else could do it, unless they went for ARM.
Tegra? Profitable? Tegra was supposed to bring in 50% of the company’s profit this year, and it is not even close to that. Tegra is not bad, but it is not great either; the competition on Android (Qualcomm, Samsung, ARM, etc.) is a totally different story from the PC GPU market. Tegra is definitely not profitable right now. Hopefully it will be in the future.
TressFX was made in cooperation with Crystal Dynamics. The reason TressFX runs worse on Nvidia (GK104 and lower especially) is that small Kepler is much worse at general compute, OpenCL or DirectCompute, similar to how AMD is slower at heavy tessellation.
Actually, NVIDIA gave their customers the AGEIA technology on all NVIDIA cards with CUDA; a second card is optional and anyone with SLI can use the second card for PhysX if they want. AGEIA would need two cards to even run their tech.
PhysX uses a portion of the CUDA cores on the card, and yet people think they can get free performance with CGI-quality physics and particles from a small portion of CUDA cores, LoL. Be thankful NVIDIA even made it so CUDA cores can access it; they could have made you buy another NVIDIA card just to run PhysX. People should stop whining about PhysX; Apple and other proprietary companies don’t let others use their tech without licence agreements, so welcome to the proprietary working world.
If you want free and open tech, move to Linux and then you can stop whining.
Be thankful that we only have that one option? I don’t think so.
CUDA is a GPGPU tool set, NOT a dedicated CPU for physics calculations.
An extra GPU for PhysX may get the job done, sort of, but it’s been years since AGEIA was bought out and destroyed – their hardware, if it had continued to grow at the same pace as graphics cards, would have blown away a $400 SLI’d PhysX card at a small fraction of the cost. But that’s not what Nvidia wants – they want you to buy a whole new graphics card and an SLI-capable motherboard.
I know that complaining about hardware that could-have-been is almost pointless – there’s no guarantee that dedicated physics hardware would have been financially viable – but it was never given the chance. AGEIA was ahead of their time – it’s an amazing technology that never had time to grow because they needed support from either ATI or Nvidia, both of which saw AGEIA as a competitor, or at least stepping on the toes of giants in the gaming hardware ring.
The whole thing just pisses me off. When there are only two companies (or factions) in charge of anything in this world then competition is stifled – neither company is willing or has room to take chances because they’re so intent upon competing with each other, not allowing themselves to give or take an inch. I know this is a sad factor of life and is to be expected but that does NOT mean I need to like it one bit.
You need to get over the fact that companies buy startups, just like Facebook bought Oculus Rift. AMD bought ATI, after all, and is now using their tech to boost CPU sales via their APUs the same way.
ATI, 3DFX and S3 are dead because they didn’t make the right business decisions.
Don’t be silly – I don’t need to get over anything. I already explained to you why I don’t like such a practice, and I’m justified in my distaste for it.
If you always accept things simply because they exist then you’re never going to change the world, Sean.
Good luck with that then, maybe you should start with using Linux.
I don’t know what you’re talking about now.
Try changing the world by continuing to support proprietary companies and technologies which you admit to doing.
Um… okay.
That’s why I said use Linux, you’re basically putting up a middle finger to that sort of thing which you admit you don’t like. 🙂
Man, this article is about gaming. I do not like Microsoft for many reasons, and if that company were starting now it would go under very soon after opening. Unfortunately Windows is the only platform on PC where you can play most games; I hope Steam or Google change that. But since this is about gaming, using Linux isn’t the best option ATM. Therefore there is no point in bringing Linux up.
^^
Nvidia is very stupid in many ways, since PhysX could be a viable API. But through bad optimization and blocking multi-vendor GPU rigs, they closed off so many potential selling points. With FLEX, though, it seems they understand that now. Proprietary features are candy that many want to try out, but only a few succeed with them; you need to own the platform for that. And given the Intel monopoly, and also because Nvidia has no x86 CPU at the moment, there is no way they can achieve it. Nvidia actually has brilliant marketing and gets much further than others, but it is no good anyway.
” It used to be a stand-alone card that went in a PCI slot, but they did away with that in order to sell consumers another graphics card that is more expensive and far less efficient at calculating physics.”
Did you think about why game developers didn’t use PhysX in the AGEIA days? Why should they, if there were only a few customers using it? When NVIDIA bought AGEIA and implemented PhysX on their GPUs, developers had millions of players instead of a few thousand. That’s a big difference. And I didn’t even mention the big amount of work they did over the last few years to improve this technology. Is this really something you call killing? It’s only ignorance from people who don’t have any knowledge and constantly criticise PhysX only because of NVIDIA.
You are really naive. Behind the GameWorks code there are many developers, not only from NVIDIA. And if you have been developing some tech for many years, would you give it to your competition just like that? Your whole work? AMD never did something like that either. So if you bought an AMD GPU because you think you’re supporting the better company, you should wake up to the real world.
Google it pls. They do not share the code. It is a fact.
I said exactly the same. 🙂
They’ve done it for ages. It’s weird, since AMD shares TressFX and other stuff with them, right? You’d think they would have gotten it by now, and still nothing lol
What does that have to do with anything? Companies buy up other companies and their tech startups all the time. AMD bought ATI; it wasn’t made by them, so stop being so judgemental about something you seem to apply only to NVIDIA.
PhysX is successful and is in many games, get over it.
PhysX isn’t useless, game engines are already implementing CPU accelerated Physics and particles but they’re way behind PhysX because it has dedicated CUDA cores to do it.
The problem with hardware PhysX is that since it’s Nvidia-exclusive, it can only ever be *aesthetic* in games. Which is unfortunate: what if you could use all those effects to influence gameplay? I mean, I do like PhysX in some games (omg Borderlands 2), but it’s almost always something that’s obviously been tacked on.
Actually, not being part of DirectX is a good thing; it doesn’t require a particular version of DirectX or the OS to run.
I wouldn’t mind seeing a shift to OpenGL, but I don’t really want to leave Windows unless it’s unavoidable (eg Microsoft really starts making it a closed system). Even if every game coming out from today onward supported Linux/SteamOS, what about all my old titles? Or all my *software* that I use daily? I’m not going to boot into a different OS just for gaming, that kills the convenience of having a PC to do everything.
CUDA is cross-platform but it’s mainly used to accelerate CGI work. I can’t see why PhysX couldn’t come to Linux.
Actually Nvidia mostly implements GPU PhysX itself; almost all companies using PhysX use CPU PhysX (471 titles), while GPU PhysX (AGEIA/Nvidia demos included) is in 29 titles. Almost all of them use 2.8.x, with terrible CPU optimization using the decades-obsolete x87 instruction set. That is the reason why, regardless of what GPU you have (dedicated or shared), you can never get stable performance: most of the code is still computed by the CPU anyway. The new FLEX is promising, though. But pretty late.
PhysX was also really impressive in Arkham Origins. Not a great game but Nvidia did a great job with them on it in terms of the technical extras. TXAA is impressive as well.
TressFX is DirectCompute so it works on Nvidia cards just fine. I think Nvidia’s new hair solution is also open like that, which is wonderful. PhysX is neat, but I much prefer these solutions that work on both cards.
One would argue whether TressFX worked well on NVIDIA cards; it wasn’t optimised for NVIDIA cards, and as usual AMD, like NVIDIA, uses its own render paths for optimisations while the other one suffers. NVIDIA and ATI have been playing this game for a long time, and it continues with AMD.
yeah, TressFX hair still sucks in Tomb Raider. Her hair should not take up that much performance. No freaking way.
PC gaming, at least for me, is 60+ FPS. Name me one game with PhysX that can get at least close to that! Only the new Batman can, and that is not enough ^^
All of them if you have a good enough GPU with loads of CUDA cores.
TressFX, same as HairWorks, runs on DirectCompute; even the new PhysX 3.4 should run on DirectCompute, but the old versions didn’t, so the difference is obvious.
Unfortunately AMD destroys NVIDIA in compute/OpenCL; NVIDIA wins with raw power.
“technology called PhysX, which by the way wasn’t made by them.”
PhysX nowadays is miles away from what NVIDIA bought years ago. They reimplemented the whole SDK, improved and optimized its functionality and brought in new features. Only people without knowledge of PhysX can write something like you do. If you want something to criticise, you should have more than your own imaginary illusions.
So why do they keep using v2.8.x in their supported titles? No TWIMTBP title ever had v3+, for a very obvious reason. The new Batman was very well optimized, but again not by Nvidia.
NVIDIA doesn’t optimise games; they just help devs with the graphics technologies and fast render paths for them. Batman Origins runs well because of tricks, not because it’s a well-optimised graphical beast; it even fakes particle lighting and has a lot of baked low-quality shadows.
I was talking about the third-party optimizations to PhysX 2.8 that the devs did in the new Batman. Nvidia mostly de-optimizes it for other vendors, but that’s not what I meant.
Warframe and Planetside 2 use PhysX 3.2
Because those games were built on old game engines with SDK 2.x support. There are only small exceptions, like Sean said. All new games (TWIMTBP included) developed with Unreal Engine 4 or Unity 5 will use the new PhysX SDK 3.x because it’s built into the core of the game engine.
PhysX 3.3 is the best physics API (compared to Bullet; can’t say whether Havok is better or not, Intel doesn’t allow it to be benchmarked). The problem is it isn’t used, and Nvidia doesn’t push it into games because it runs much better on CPUs.
It’s up to game devs which PhysX runtime they implement, and it depends on at what point in the development process PhysX was integrated.
Yes, everything Nvidia does is proprietary, and therefore almost no one uses/wants it. The point is the PC is an open platform – partially, but it is – and if you want a proprietary platform, get Mac OS. AMD is a mature company that realizes that and does open solutions; even Mantle will be open when final.
DirectX and Windows are proprietary. Open doesn’t mean what you think it means. Actually Mac OS X uses OpenGL.
Windows is open because it lets 3rd-party software, APIs, etc. be installed without restrictions, the same as Mac OS is open to 3rd-party software but closed to its proprietary hardware spec. Windows, on the other hand, can be installed on a variety of hardware not made by Microsoft. DX is proprietary, but Windows supports OpenGL and now Mantle. What you mean is open-source. OpenGL, Linux, OpenCL and so on are open-source. Open basically means open, like a room which is open for others to come in.
Remember Linus xD
https://www.youtube.com/watch?v=iYWzMvlj2RQ
If all those percentages were real…imagine this…with each driver you “gain” 60% performance…With that logic all games should be running at 4000FPS with all the drivers Nvidia releases…
I wonder where they get those percentages and compared to what… an Nvidia 8600GT?
Whoever believes in this “percentage” stuff knows nothing about computing.
Not 60% with each driver, and not all games. It clearly says a 64% Rome II boost from 335.23 to 337.50b, not a 64% boost with every driver in every game. The next driver might boost nothing, so it stays at 64%, not doubled. This is how AMD & nVidia driver gains work, and they are real.
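To spell out the arithmetic with made-up numbers (hypothetical FPS figures, purely for illustration): a one-off 64% jump between two specific drivers is very different from 64% compounding on every release:

```python
# Hypothetical numbers, just to illustrate the difference between a one-time
# gain (what the release notes claim) and a gain that compounds every driver.
base_fps = 40.0                      # pretend Rome II result on 335.23
print(base_fps * 1.64)               # 65.6 fps on 337.50b: a single step up
print(base_fps * 1.64 ** 10)         # ~5630 fps: the absurd result if every
                                     # one of ten drivers really added 64%
```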
Cue idiot comments from AMD and Nvidia fanboys. Seriously, you morons make console fanboys look mature.
When it comes to AMD and nVidia, I just buy the card that gives me the best performance in the price range. People really need to stop being mindless fanboys.
I got the Zotac 750 Ti single-fan cheapest model, and it sings like a beast; it likewise has no 6-pin connector (my previous Zotac 650 Ti had one) and this thing blows it out of the water at the same power wattage!
Overclocked it from a 1033 core clock to 1200, and memory from 1350 to 1600, on stock voltage, easy, and it gained me 5-10 fps in total on benchmarks of all types over stock numbers. People playing with voltage are getting even bigger jumps, like 1400 or higher core clocks!
I went with a Zotac 750 Ti 2GB single-fan model; it also doesn’t have a 6-pin connector (unlike my previous Zotac 650 Ti) and it’s crazy!
I overclocked mine from the 1033 stock core clock to 1200, and memory from 1350 to 1600, on stock voltage, easy! Other people who are playing with voltage are hitting 1400 and more on the core clock!
I absolutely love this card!
Nice, glad you’re enjoying it. I have mine overclocked to 1237 on the core clock and 1400 on the memory; let’s see what these beta drivers can do today 🙂