AMD vs NVIDIA by MAHSPOONIS2BIG

AMD Posts Infographic, Comparing NVIDIA’s G-Sync With Its Very Own Free-Sync

AMD has shared an infographic comparing its very own Free-Sync tech with NVIDIA’s G-Sync. According to the infographic, Free-Sync enjoys wider market adoption, is more cost effective, provides more choices and is supported by more partners than NVIDIA’s G-Sync tech.

AMD claimed that, as of September 2015, there were 10 G-Sync monitors and 17 Free-Sync monitors. In addition, AMD claimed that there were five G-Sync partners and seven Free-Sync partners.

AMD also claimed that all Free-Sync monitors provide 2 or more inputs, whereas only 30% of the G-Sync monitors provide such a luxury.

But what is also interesting is the last chart AMD included in its infographic. While both G-Sync and Free-Sync support DisplayPort and anti-ghosting tech, Free-Sync is the only one that has zero licensing costs, comes with zero performance penalty, and supports open standards and standard display hardware.

75 thoughts on “AMD Posts Infographic, Comparing NVIDIA’s G-Sync With Its Very Own Free-Sync”

      1. The range over which it works, in terms of supported framerates / refresh rates. Also, in tearing tests it wipes the floor with FreeSync. You can find side-by-side tests of G-Sync’s pendulum demo versus AMD’s windmill demo, which is meant to demonstrate how well tearing is resolved. G-Sync has virtually none, while with FreeSync it’s quite apparent.

          1. Yeah. If I were an AMD customer, I’d get FreeSync. It’s cheap and “somewhat” does the job. Now now… I am a green man and I love my G-Sync monitor. And why is $ always a subject in comparisons? Many of us don’t even look at the $ of things when shopping. Not everybody does that, I admit.

          2. I take my hat off to AMD for producing a cheaper alternative that still does a good job overall, but yeah, I paid for my G-Sync happily knowing it cost more.

            I just bought a 980 Ti FTW in the Black Friday sale, and even had the Fury X been £50 cheaper, I’d not have bought it, just because I have owned AMD before and was happier when I went back to Nvidia.

            PhysX (barely used these days) and GameWorks aren’t massive game changers, but a nice touch at times. Driver support is better, and GeForce Experience, the control panel etc. are more user friendly and work better.

            AMD is improving for sure, and does provide cheaper alternatives and decent bang for buck, but I’d rather invest a bit more in the Nvidia equivalent.

          3. Only an idiot wouldn’t consider price as one of the parameters when shopping. Some consider a higher price as proof of better quality, but even they somehow factor price into their decision.

          4. Because some people have more money and price isn’t a problem, they’re idiots? I tend to check in-depth reviews from trusted sites, then go buy whatever I want. Price doesn’t matter as long as it’s not 25k for a monitor.

          5. Only an idiot would buy AMD at this point. Drivers killing cards by setting their fans to 10%?

            Yeah no. Gonna pass on that thanks.

          6. How many cards have died, exactly?
            If overheating, the cards throttle down to prevent themselves from frying.

            And if we are talking about driver failures, when Windows 10 launched, wasn’t there a GeForce driver that killed notebook displays because it overdrove the laptop screens?

        1. I saw that demo and unfortunately cannot confirm your finding; from what I saw there was no tearing at all. Also, anti-ghosting and low-FPS compensation were introduced in the new driver released just a few days ago. So there should be virtually no benefit to G-Sync (if you don’t count the logo as a benefit, of course).

        2. So you haven’t heard about the firmware update that fixed that many months ago?
          The VRR range could be considered a problem only with some FreeSync monitors, but that won’t last for long.

          2016 will be a great year, as Intel will finally launch an Adaptive-Sync compatible GPU, and AMD with their new GPUs will increase the number of compatible cards.

        3. Learn your facts. Actually it’s the opposite. G-Sync has input delay and a fixed range of 30Hz to 144Hz, while the FreeSync spec is capable of refresh rates that range from 9Hz to 240Hz.

          1. With FreeSync it’s up to the monitor manufacturer, and who said that 9Hz is even playable? I was talking about what is possible. There was a problem with FreeSync where, if your fps dropped below the monitor’s minimum supported range, the monitor operated at its minimum Hz. This was fixed with driver updates, and now when you hit the minimum the monitor switches to the maximum supported refresh rate. Both technologies do the same thing equally well. It’s just that with fanboyism you limit your options.

          2. Wrong.
            AMD spun that BS on their main FreeSync site. The NVidia figure was the state of current hardware, and they already have monitors that go to 200Hz; the limitation is solely due to the panel, not G-SYNC (and AMD knew that). The AMD spec was just a paper spec, and when the first FreeSync monitors came out they were only 40Hz to 60Hz, so AMD was advertising that they were “better” when places like PCPer proved that was just BS.

            The input delay for G-SYNC is insignificant. In fact, while I’m glad AMD just recently sorted out their support at the low end (below 30Hz, for example) by using software frame doubling, that’s done via their drivers, whereas NVidia’s minimal latency comes from a hardware lookaside buffer in the G-SYNC module.

            Also, the AMD fix needs the upper refresh to be at least 2.5x the lower to work properly (such as 30Hz to 75Hz). NVidia’s G-SYNC module doesn’t have that issue.
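
            Just to make that ratio point concrete, here is a rough sketch (illustrative Python; the monitor ranges and function name are made up, not AMD’s or NVidia’s actual code) of how driver-side frame multiplication can work:

                # Rough sketch of low-framerate-compensation-style frame multiplication.
                # The numbers and names are made up for illustration only.

                def pick_refresh(fps, vrr_min, vrr_max):
                    """Choose a panel refresh rate for a given game frame rate."""
                    if fps >= vrr_min:
                        # Inside the variable-refresh window: just match the frame rate.
                        return min(fps, vrr_max)
                    # Below the window: repeat each frame an integer number of times
                    # until the effective refresh climbs back above the minimum.
                    multiplier = 2
                    while fps * multiplier < vrr_min:
                        multiplier += 1
                    refresh = fps * multiplier
                    # With a narrow window (max not comfortably above 2x min), some low
                    # frame rates have no integer multiple that lands inside the window,
                    # which is why a wide min-to-max ratio helps.
                    return min(refresh, vrr_max)

                # 40-144Hz panel, game dips to 24 fps: show each frame twice at 48Hz.
                print(pick_refresh(24, 40, 144))   # 48
                # Narrow 40-60Hz panel at 35 fps: 2 x 35 = 70 overshoots the window.
                print(pick_refresh(35, 40, 60))    # clamped to 60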

          3. The specs are not set by AMD. The monitor manufacturer can implement it right or wrong. The 30–75Hz range is not an issue; it’s a recommendation.

        4. Superior in that it puts that pole polisher Jensen closer to purchasing his next Ferrari with more overpriced Ngreedia tech.

  1. Yes, and after all those great points (and they really are), it manages to leave out that it’s not as good as G-Sync.

  2. As an NVIDIA fanboy and owner of a G-Sync monitor… I’d prefer FreeSync to win… it’s just greed from NVIDIA…

    #EVGAftW

      1. Only they don’t own 80% of the market share, so try again lol. They may own most of the discrete card share, but in reality that’s a very low number. The actual fact is that Nvidia is beating AMD, yes, but in the greater scheme they’re losing horribly to Intel. We enthusiasts tend to forget that we make up a very, very small slice of the PC users pie. The sad, simple fact is that which tech is superior is irrelevant to which tech will be adopted and last. Intel jumped on board with Freesync; sorry to say, but that day Freesync took the win. Gsync, superior or not, will live on a few more years; Nvidia will add support for Freesync and Gsync will fade out.

        1. So true for Intel. I only meant dGPUs, not on-board GPUs. As for G-Sync, let me phrase it like this. Every company has their own proprietary tech. Do you see Lamborghini adopting Honda’s tech? No? (Now don’t come at me with a stupid counter-example, you get the point.) Better or not, if the user base is there, it is more profitable for any company to use their own tech over another’s, otherwise the R&D would be a huge failure financially speaking. So you might be right; maybe in 5 years Freesync will be the standard, but saying right now that Nvidia will dump G-Sync soon is like saying AMD will never get back on their feet, and I don’t believe that.

          1. I know you didn’t 🙂 I wasn’t really directing it at you, just making a general statement. Every company does have proprietary tech to a point, but that is kinda irrelevant to the convo. I’m not saying that Nvidia will give up on Gsync or that they will kill it. I am saying that they will support Freesync; they will have their pride party and deny it, and they may fight it for a bit, but in the end they will support it.

            Once that happens Gsync will die. It already is dying in the short time that Freesync has been out, and Freesync already has more supported monitors, from bigger monitor companies. Samsung, Eizo, Intel, they are all on board with Freesync; those are whale companies and their support can make or break a tech. Freesync being an open standard and royalty free means that even Apple can hop on (which they most likely will).

            Gsync will stay around and die-hard fans will use it, but monitor manufacturers will work with the larger population, which will be Freesync. It’s Nvidia’s job to sell more GPUs, just as it’s monitor makers’ job to sell more monitors. Going with Gsync will sell less than choosing the open standard (by the way, Freesync is a standard already; that battle is over). Thus they will choose to make Freesync screens. No Gsync screens = no Gsync, and Nvidia has zero control over that unless they start making their own screens, or they make an adapter for Gsync to be compatible with Freesync monitors. If they do the latter it may live on; not sure that’s possible, however.

            TBH I am actually a very sad Nvidia fanboy 🙁 as I am currently using AMD cards, which, while all right, miss the features I have grown accustomed to. I switched this gen strictly for Freesync (and better 4K performance at the time, though that has changed now). The second Nvidia pulls its head out, drops their Betamax redux and supports Freesync, I’ll get some 980 Tis (or whatever their flagship is).

          2. Hey, you have some very good points in there. I think I now agree with you, although I think NV is going to “try the market” and see if it can live with G-Sync being in a handful of monitors only (if financially viable). Nvidia having the biggest user base (compared to AMD in dGPUs), they might have a shot with, let’s say, a G-Sync 2.0 or whatever update they might bring. They could overhaul the market a second time :). But NV are greedy, they won’t make it royalty free, so in the end I think your call is the good one. What pisses me off is that in blinded subject tests G-Sync, from what I remember, won by a large margin on the “feel” and “quality” of the tearless experience. That was tested on gamers. Now I am a very happy G-Sync customer, but if Nvidia were to implement Freesync, on my next monitor I’d probably get Freesync, because like you said, if they end up supporting Freesync, it’s bye-bye G-Sync. Sad :(. I’m so satisfied with my monitor….

        2. What are you on about? They can only dominate a market they’re competing in, and they are dominating that handsomely.

          1. What are you on about? That has zero to do with the convo. Intel makes GPUs as well, just not discretes. Fact is, people who use discrete GPUs make up a small fraction of PC users. I never said they aren’t winning the dGPU war; I said that Gsync will lose because Intel and VESA have adopted Adaptive-Sync.

    1. Well, they developed the better product, and it requires the use of extra components. That costs money, and they have the right to charge for those costs and make a profit.

      AMD developed a cheap alternative with no extra components, though it still requires a specific type of monitor, but the product ultimately isn’t as good.

  3. I hate to see these comparison panels; clearly they’re using the info to their advantage, and that’s normal. Nvidia could use the same info to THEIR advantage. It’s just stupid.

  4. AMD has always had stronger cards than Nvidia; Nvidia has always had the better drivers.
    But with DX12 out now, it looks like AMD has the upper hand.

      1. When it’s fact, people call you a fanboy… when it’s truth, they call you a troll. I’m used to it. And if you know anything about the two GPU companies and did some googling, you could easily find it out.

        You just want to leave a dumb remark and think you’re funny; that’s a troll.

    1. Indeed, for asynchronous compute AMD is out in front compared to NV. I guess we’ll have to wait and see, when a triple-A title comes out, who ends up on top.

    2. Define stronger? And so far the DX12 benchmarks, such as Ashes of the Singularity and Fable, have still performed better on the 980 Ti (not overclocked) rather than the Fury X (which struggles to overclock according to reviews, despite water cooling).

    3. Except for the fact that Nvidia cards pretty much destroy AMD in every price bracket, with better drivers and higher stability, plus less power consumption and heat to boot.

  5. Even if everything AMD said were true and Free-Sync were superior to G-Sync, I don’t want to use a high-end AMD GPU to play my games on, as I prefer NVIDIA, so I couldn’t care less.

    The only thing I do care about is AMD’s constant digs at NVIDIA, which are not only petty but make them act more like an internet fanboy than a large business should.

    1. I am an AMD fan, but I have to agree. These constant digs at Nvidia make them look childish.
      Nvidia just shrugs it off and goes along minding its own business, while AMD looks like it’s all bark and no bite. 🙁

      1. I am not a fan of AMD, as I don’t favour them over NVIDIA; however, I do like AMD and currently have two AMD PCs. One is an A10-7850K build and the other is a Gigabyte Brix with an APU and a discrete 275X GPU, and I like them both, to be fair.

        I am always interested in AMD though as there’s something appealing about being able to have your PC with all major parts from the same brand.

        Yes, the constant petty digs aren’t upsetting or antagonising; I just feel they smack of unprofessionalism. Also, the old saying “the weakest dog barks the loudest” comes to mind, so I think they would be much better off if they didn’t troll rivals like that.

  6. You don’t really have an option if you’re already an Nvidia/AMD fanboy, since G-Sync won’t work with AMD cards and FreeSync won’t work with Nvidia cards.

  7. I wish it were true and it were better, and I wish it worked with my Nvidia cards as well… The price of G-Sync is way too high. I would pay it if I knew for sure that I’d get a good monitor, but nearly everyone says how poorly made all the high-end monitors are.

    You go through 4-5 bad Acers until you FINALLY manage to get one with no dead pixels, no bad bleed and of course no dust inside the panel… sometimes it takes more (WTF? You pay 700-800 for a crappy monitor :O).

    Same problem with Asus. The screens come from the same factory from what I was reading, and that factory has POOR quality control, even after the HUGE backlash it keeps getting. It’s like they don’t even care haha. My cheap $200 60Hz monitors ARE better. I’m so happy I returned my Acer and Asus and kept my money.

    There is nothing worse than having horrible light bleed or an uneven screen. My friend has 4 dead pixels in the middle, and yes… that’s after exchanging 7 monitors with WORSE problems. 4 DEAD pixels in the MIDDLE, people. It’s bloody annoying.

    P.S. No, not every IPS monitor has the light bleed or uneven screen problem. I have 4 cheap monitors and none of them have it. My friends have a ton of IPS monitors too. The worst is a slight glow in 1 or 2 corners. Google bad IPS glow. That’s what most of the G-Sync monitors seem to have. Good luck playing any kind of horror game, any kind of game with dark places, AND movies.

    P.S.2. Yep, don’t even tell me how you got a good one on your first try. I know it happens. People win the lottery too! So spare me all of that. Literally everyone I know with a G-Sync monitor went through hell to find a good one, even myself… sadly, after 5 tries I gave up.

    1. Yeah, you do know that the bad ones are the vast minority, right? Also, the ASUS ROG and Predator models are just about perfect G-Sync IPS monitors in the same $700-$800 price range you’re talking about. And I guarantee you that your cheap 60Hz monitors are NOT better.

      1. Hey, I got the Asus monitor (well, had it), and yeah, my cheap IPS monitor has a really good dark screen and zero backlight problems or bleed. Sure, the colors and details are better, and you get G-Sync… but I can’t stand the horrible dust, the dead pixels and the light bleed.

        I guarantee you that the cheap 60Hz monitor is better for every single horror game and movie. I can’t wait until we get some better ones, or just OLEDs for perfect blacks.

        Here is a pic of what I’m usually getting, so you can understand me better.

        http://www.consumerreports.org/content/dam/cro/news_articles/Electronics/CRO_Electronics_OLED_Star_Wars_10-14.png

        Which one do you think looks better? You be the judge. This is the same thing I got with both the Acer and the Asus. I exchanged a ton of monitors and I just couldn’t get a good one.

  8. After what AMD did with its Fury X benchmarks vs the 980 Ti, making it look faster, I don’t give two $hits what AMD’s marketing says until I test out both techs right next to each other and see for myself.

  9. As long as FreeSync works at any Hz, I’m fine with calling it even in terms of technology compared to G-Sync. Until then, no. And I definitely don’t want V-Sync ever being used in the process.

  10. I don’t care whether Nvidia’s version may look better; they need to drop in price. They’re taking the mick with their price-gouging methods. I’m interested in the technology, but there is no way I’m paying more than £300 for the tech. With the Black Friday deals hardly any of the Nvidia tech prices shifted much; I’m disappointed with their capitalist ways.

    1. Yes, how dare they make money for putting in the time and effort to make superior technology! *shakes fist*

      The cheaper, less effective alternative is always there for you

    2. The Nvidia version has hardware built into the monitor to make it work better. You don’t get that silicon for free, sammison.

  11. If FreeSync worked with Nvidia GPUs, it would be no contest. But I left AMD for Nvidia due to generally better hardware performance and much, MUCH better driver support. My R9 290s used to crash to desktop due to a driver error on a regular basis. Since I’ve had my GTX 980, it’s never happened.

    1. That’s not fair. The 290s were a CrossFire setup while the GTX 980 is a single GPU, isn’t it? Dual-GPU setups are always more prone to errors; SLI isn’t any better than CF.

      1. I owned a single R9 290 before I went CrossFire. In fact, before the GTX 980 I always had AMD: HD 7770, HD 7870. I always just put up with the driver issues as I assumed it was the same for everyone. When the GTX 980 was launched, I bought one, simply because its performance beat out its AMD rival. I was so impressed I bought a second one for my second rig. The only issue I’ve ever had was an odd scaling issue with my newly released 4K TV, and Nvidia fixed it within a fortnight. Don’t get me wrong, I want to see AMD succeed (my editing rig is still using an FX-8350 and I don’t intend to upgrade until I at least see what Zen offers), but the truth is they simply don’t compete with Nvidia when it comes to their software. It’s a mess.

        edit: As if I needed backup for my claim, a news story just broke about AMD’s new Crimson software breaking users’ GPUs by limiting fan speed and causing them to overheat.
        http://www.dsogaming.com/news/users-claim-crimson-software-killed-their-amd-gpus-amd-acknowledges-software-issue/

  12. I run games at 144Hz and I try to get the framerate up there; at a bare minimum I never drop under 60. And guess what, I never notice any tearing. If you’re seeing torn frames, your framerate is way too low to qualify as playable/smooth. The last time I noticed tearing was in 2007 when I played Crysis at 20 fps to see the revolutionary new shaders. Not long after, I got some sense and started prioritizing framerate over eye candy.

    1. Yeah, but do you want to know a funny fact? 🙂 My old LCD monitor has the same kind of blacks. I can take a picture and show you. I’ve got 5 different monitors and the new IPS monitors all have this kind of problem. Also the terrible bleed…
      It’s because the panel isn’t properly treated, which according to people costs more and makes IPS more expensive. Old IPS monitors used to have no bleed on them, but they cost WAY more too.

      Again, my LG from 2005 has pretty amazing blacks, and almost everyone I show it to thinks it’s amazing or that it’s OLED (when I take pics in a dark room, so they can’t see the cheap-looking frame). I wish I could get the same kind of monitor again, but with better viewing angles and 1080p. That LG is 900p and it kicks the ass of all IPS monitors in terms of blacks. I’m forced to play horror games on it.

  13. Did you really just try to pass off an OLED as a PC monitor here? Lol not everyone in these comments is display illiterate.

  14. You’re missing the point. At 80 fps a torn frame is only visible for 1/80 of a second. It’s not on the screen long enough for you to really take note; that’s why tearing becomes a non-issue at high framerates.
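
    For anyone who wants the arithmetic spelled out, here is a quick back-of-the-envelope sketch (illustrative Python, assuming a torn frame persists until the next refresh):

        # How long a torn frame stays on screen if it persists until the
        # next refresh; purely illustrative numbers.
        for refresh_hz in (60, 80, 144):
            print(f"{refresh_hz} Hz: tear visible for roughly {1000 / refresh_hz:.1f} ms")
        # 60 Hz: 16.7 ms, 80 Hz: 12.5 ms, 144 Hz: 6.9 ms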
