NVIDIA claims that GPU prices will continue increasing through Q3 2018

Things are not looking good for anyone hoping to build a new gaming PC. As we all know, the prices of all modern graphics cards have skyrocketed over the past few months. And according to NVIDIA, this situation will not be resolved soon, as prices will continue increasing through Q3 2018.

As NVIDIA told Massdrop, the pricing in the market will continue to go up through Q3 of this year. In other words, don’t expect to buy a cheap high-end graphics card anytime soon.

According to the green team, the two main reasons causing the GPU prices to increase each and every month are the mining craze and the vRAM shortage.

Miners are currently purchasing every new high-end GPU they can get their hands on, and as a result, all NVIDIA and AMD partners are having a hard time keeping them in stock.

On the other hand, Apple and Samsung are willing to pay more for the memory that will be used in their smartphones. Factories are using the same production lines for the memory that is used in graphics cards and in smartphones and this has created a shortage of memory for companies like MSI, Gigabyte, Asus, and EVGA to make graphics cards.

Bottom line is that things are not going to improve in the next couple of months, so this will be a tough period for PC gaming enthusiasts!

159 thoughts on “NVIDIA claims that GPU prices will continue increasing through Q3 2018”

  1. SAD news for those looking to upgrade, or build an entirely NEW PC system this year. I doubt this mining trend is going to end soon.

    By the way, the NVIDIA GPUs being sold to miners were based mostly on the Pascal architecture, and didn’t the company state earlier that they would be ramping up production to address the supply shortage that has been affecting the market?

    Which might also help reduce prices as well?

    To QUOTE Nvidia’s CEO, Jen-Hsun Huang:

    “We’re working really hard to get GPUs down to the marketplace for the gamers and we’re doing everything to advise retailers and system builders to serve the gamers. And so, we’re doing everything we can, but I think the most important thing is we just got to catch up with supply.”

    1. Translation:
      “We’re working really hard to get GPUs down to the marketplace for… the Miners, where the Real Money comes from.”

      “According to the green team, the two main reasons causing the GPU prices
      to increase each and every month are the mining craze and the vRAM
      shortage”.
      VRAM… “shortage”.

      Jensen Huang said that this year they will continue to maintain production of the Pascal chip, because there is still demand on the market (aka miners) and it is now a cheap chip to produce at high profit.

      Why should they bring a new video chipset to the consumer market, when the Pascal chip sells very, very well on the mining market?

      1. Because when the mining mania dies down, they know how much they’re still going to depend on gamers for their business. This mining high won’t last forever.

    2. Still almost every comment I leave is getting “Detected As Spam” and deleted. Look at the comment I left here. There is nothing spam in it. If you can please restore it. Thanks.

      1. Yeah, I just read your previous comment, and I was about to reply, but it got deleted!

        Btw, I’m not deleting any of the comments. Only the MOD/Admin has the full right to delete comments. Ask JOHN, not me.

        Btw, are you sure you didn’t delete any of your own comments by mistake?

        1. Yeah I knew it wasn’t you. I just commented under your comment because that’s where my comment was for John to know where to restore it. Looks like he did. You can’t delete your own posts here so it’s not me. It’s something with the spam filter being set too strict I think. I emailed John about it a while back but it’s still happening.

          1. Btw, YES I think we can delete our OWN comments as well. I’ve done this a few times here on Disqus, lol.

            Don’t know whether this is time-constrained or not though.

      2. Yeah, don’t know why this happened; approved your comment. Disqus has listed you as Low Rep, so perhaps this has something to do with it.

        1. Thanks

          Edit: I assume Low Rep means Low Reputation. Not sure how I got that. My posting history shows different but at least I know why my comments keep getting deleted.

      1. You think AMD is any different? Guess what? They still rely on Samsung, Micron and SK Hynix for their VRAM, because all system memory comes from the same exact silicon wafers on the same production lines.

        1. True, but overall Nvidia is the bigger a**hol*. And knowing them, they’re gonna milk it as much as they can. Maybe for years, like DDR4.

          1. If you’re going to blame anyone, blame Samsung. Earlier this month they announced they’re cutting back on capital investments, which means cutting back on system memory production. You know what that means? Less VRAM for all GPUs. In business terms, that’s a form of price fixing, which is exactly what China is investigating them for. Android and Apple devices are also suffering for it, prices for those will be rising and production for mobile devices will be declining (Apple already announced) because of the memory prices, which are already hitting their profits as well. If anyone is milking the issue as much as they can, it’s the manufacturer of the system memory themselves.

      2. NVIDIA’s fault in the UK I suppose, always NVIDIA’s fault, NVIDIA’s fault we pay lower prices for their GPUs than AMD’s.

        GTX 1060 6GB = £329
        GTX 1070 = £539
        GTX 1080 = £599
        GTX 1080 Ti = £839

        RX 580 = £349
        Vega 56 = £749
        Vega 64 = £983

          1. Those prices are from an hour ago from the website. On Overclockers you can get an Inno3D GTX 1070 for £499 right now.

          2. VAT is a goddamn joke.

            I shipped a $500 reference GTX 970 from the US to a girl in France and the post office demanded that she pay 25% of the value to them before they would release the item to her. This is what we call a shake-down. I was angry for months about this. Not because I didn’t have the additional money, but strictly out of principle.

            If you’re ever searching for reasons to value the US over other “modern countries”, add the absence of VAT to the top of your list of why the US doesn’t suck.

          3. Yeah well, one reason the English split from England to America was that they didn’t want to pay tax LOL. Yes, Americans pay taxes, but it’s not the same, is what I mean. We pay higher taxes mainly because we’re more socialist; things are a lot more publicly funded over here.

          4. Is everyone in socialist society truly as happy as they all pretend to be, or is someone else off-screen always making threatening gestures toward them or holding their families hostage when they go on camera in case they say something negative about their country?

          5. High taxes, more government control in services, bigger government, more control over people’s lives in laws, take care of single mothers, rule the unemployed’s life in various ways, more government debt, no ownership of debt, it’s everyone elses and the government’s problem, no choice in public services and private services, you have to pay the tax end of story or it’s prison. Sounds great doesn’t it?

          6. No it sounds horrific… I fear that a mass of ignorant people in America are wanting to push our country in that direction for some reason though.

            It’s a desire to disassociate from personal responsibility. If it’s all “the government’s” fault, some big nameless mysterious faceless force, then people can relieve themselves of their own responsibilities and expect “the government” to sort everything out. They’re like the kid that thinks his Pop-tarts and orange juice magically appear in the refrigerator each morning, without any concern as to how they really arrived there.

            Western ideology has had SUCH a great run…. The best even, no doubt, but sadly, people want to see an end to the best thing human society has ever seen.

          7. Yes, I’ve been interested in U.S. politics more since America elected Trump, because of the bias against him and the lying about him. It really has brought the socialists out, because Trump wants them to work, not live on welfare. Our Labour is America’s Democrats; they seem to want people reliant on government, a good way to get a voting bloc, especially when you import people who want and come for free stuff.

        1. Plus NVIDIA don’t even make the VRAM. AMD could do it, they make HBM, funny how he doesn’t mention that. But no, no, AMD would never do a thing like that, they have ethics. LMAO

          1. Sells VEGA at £399 for a month and claims it’s the MSRP when it’s not, lies to their shareholders,
            “AMD. Four class-action CPU flaw lawsuits filed”

            AMD, the ethics champion company, but (insert non-argument) NVIDIA/Intel are worse.

          2. In the sense that HBM is AMD’s tech, they came up with it; obviously someone else can manufacture it for them.

          1. AMD are doing the same by your logic, and their GPUs are more expensive in, say, the UK. You make no sense at all; you have no sense of why NVIDIA should increase production and the consequences of it. You’re just looking to blame NVIDIA and not the miners, because you said nothing about AMD. You don’t have a clue what you’re talking about.

          2. Yes, AMD is doing the same.

            BUT
            History taught us that Nvidia are a bunch of a**ho**s, and they will milk the market and lie to us just like before.

          3. Well, you clearly seem to have made up your mind on that, just like a lot of your comments; you just assume guilt by history. Whatever you do, don’t give up your day job and become a judge, as you seem to be judge, jury and executioner.

    3. Shows how little you know about the way business works. The mining fad won’t last forever, and each cryptocurrency will eventually become unprofitable to mine without an ASIC. That being said, they still recognize their main market is gamers, developers and consoles, because that’s one demand that will never go away. Ever. The miners have a tendency to hurt the market because they end up pushing the PC enthusiasts out of PC building. That causes the intended target market for NVIDIA AND AMD to slowly fade, which leaves them less profitable in the long run.

      1. Just dropping this.

        Not every cryptocurrency is “ASIC” compatible, and many cryptos are turning to STAKING instead of MINING (mining as we know it is not sustainable).

        1. Oh, I’m well aware of that. I’ve already made some investments in crypto, which have already given me a solid return. However, there are other currencies that have recently hit the Coin Market Cap system, which are indeed ASIC compatible, or at least PC component (CPU, GPU, memory) compatible.

          1. True, there are. And that’s a problem. Soon the energy needed to compute cryptos will be gigantic, hence cost a lot. We’ll see even more top-100 cryptos forking to hybrid mining/staking, then 100% staking, most probably. But for the time being there’s money to be made mining ETH, for example. And it hurts gamers sadly.. hurts me 🙁
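            As a rough back-of-the-envelope sketch of why that energy cost matters (every figure below is a hypothetical placeholder, not real market data):

            ```python
            # Rough GPU mining profitability estimate. All numbers are
            # hypothetical placeholders -- plug in your own card, coin and
            # electricity rate.
            hashrate_mhs = 30.0          # hash rate in MH/s (hypothetical)
            power_draw_w = 150.0         # card power draw in watts (hypothetical)
            revenue_per_mhs_day = 0.10   # USD earned per MH/s per day (hypothetical)
            electricity_usd_kwh = 0.15   # electricity price per kWh (hypothetical)

            revenue_per_day = hashrate_mhs * revenue_per_mhs_day
            power_cost_per_day = power_draw_w * 24 / 1000 * electricity_usd_kwh
            print(f"Revenue/day:    ${revenue_per_day:.2f}")
            print(f"Power cost/day: ${power_cost_per_day:.2f}")
            print(f"Profit/day:     ${revenue_per_day - power_cost_per_day:.2f}")
            ```

            As network difficulty rises, the revenue-per-MH/s figure falls while the power cost stays fixed, which is exactly why pure proof-of-work mining gets squeezed over time.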

    4. LOL, just shows how much companies care for the master race. Overpriced garbage gaming; don’t buy games, pirate them.

    5. Remember, Nvidia only makes the GPUs. It’s up to the companies assembling the video cards to purchase the rest of the components, like vRAM modules, which are one of the key factors in the current price increase. Nvidia can pump out as many GPUs as they want; the vRAM shortage is just going to get worse in that case, causing the prices to go even higher…

    1. -GPU prices rising, so he doesn’t upgrade.
      -buys a console that costs as much as a mid-range card so he can play games on low/med settings, because consoles are 4-5 year old PCs at this point.

      Makes perfect sense.

      YO DAWG I KNOW YOU CAN UPGRADE AND SHEIT BUT BUY A CONSOLE TO PLAY GAMES WITH WORSE GRAPHICS THAN YOUR CURRENT PC THAT IS 4-5 YEARS OLD.

      Makes perfect sense.

    1. Good for you, but which video card/GPU do you have?

      Mine is the RX 480 4GB. I game at 1080p/60Hz, but I can’t upgrade to a higher-end 2K/4K monitor with a higher refresh rate either, until GPU prices become normal.

      At 1080p, the RX 480 pretty much does the job, but it still struggles in some of the more graphically intensive games (not a big deal though).

      For 2K/4K gaming, I do need a better Video Card.

      1. I know what you mean, and you’re right.

        I bought a new 34″ 3440 x 1440 120 Hz ultrawide recently, and coming from 16:9 1440p, I’m feeling the performance drop even on a 1080 Ti. I still keep above 60 FPS at higher settings, but I’d become very accustomed to 90+ FPS, so it was a bit of a drop for me to go down into the 60s in some places.

        I need a card that is 150% of the 1080 Ti to be where I want right now, which doesn’t actually exist yet so there’s not much I could do anyways, even if the prices weren’t so high. And SLI isn’t even an option for me anymore. I’m done with those days.
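        To put rough numbers on that drop (a simple sketch that assumes the game is fully GPU-bound and that performance scales linearly with pixel count, which is only an approximation):

        ```python
        # Rough estimate of the frame-rate hit from moving 2560x1440 -> 3440x1440.
        # Assumes a fully GPU-bound game and linear scaling with pixel count.
        old_w, old_h = 2560, 1440
        new_w, new_h = 3440, 1440
        old_fps = 90  # example frame rate at the old resolution

        pixel_ratio = (new_w * new_h) / (old_w * old_h)
        estimated_fps = old_fps / pixel_ratio

        print(f"Pixel ratio:   {pixel_ratio:.2f}x")   # ~1.34x more pixels
        print(f"Estimated FPS: {estimated_fps:.0f}")  # ~67 FPS
        ```

        That back-of-the-envelope math lines up with dropping from 90+ FPS into the 60s, and it gives a feel for why a card roughly a third faster is needed just to get back to the old frame rate, before any extra headroom.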

        1. Your problem is the distance from the “native” refresh rate of the monitor: having 90+ FPS is one thing, having around 60 on the same 120Hz monitor is another. Also, I’m not sure what refresh rate your previous monitor had; if it was a 60Hz one, that also justifies the big difference you see now. If it was also a 120Hz monitor, well, that’s explained earlier.

          1. If my last monitor weren’t 144 Hz with G-Sync (the new is G-Sync as well), then I’d agree with you.

            But as far as refresh rate goes, I’m doing about the same. My issue is the drop in frame rate due to the increase in pixel count combined with the increased field of view and on-screen rendering.

          2. The higher you go with native refresh, the worse it’ll look if the FPS stays the same or around the same.

            For example:
            I have a 1080p 60Hz monitor, and let’s assume my hardware gives me around 70fps in a certain game. Now if I just change the monitor, just the monitor, keeping the same resolution but going to 120 or 144Hz, I’d still have around 70fps, but it’d look much worse on a 120Hz monitor compared to my previous one, and even worse on a 1080p 144Hz monitor.
            It’s just how it works. Never wondered why, when you go to the cinema, you really have no problem with movies’ framerate? Keep in mind practically all movies are filmed at around 24 FPS, yet you have no problem watching them; it’s just because there’s no monitor, it’s just a projector, projecting something onto a blank panel.

          3. Without adaptive refresh rate, you’d be correct. But as I mentioned, I have G-SYNC on both monitors.

            My native refresh rate IS THE SAME AS my frame rate at all times, because the refresh rate changes in real time to match the frame output of my card. At no point is my monitor at 120 Hz while my GPU outputs any less than 120 FPS, or vice versa. If my frame rate is 72 FPS, then at that exact moment, my monitor refresh rate is 72 Hz.

            This is not the same as a static 120 Hz refresh rate running content at 70 FPS. There are no duplicate/repeated frames with G-SYNC. The display only redraws when a new frame is provided by the GPU, unlike a static refresh rate redrawing the last frame over and over until it gets a new one.

            The “cinema looks fine” argument is based upon variability of frame rate, which, yes, even with a G-SYNC monitor, I will have unless I cap my frame rate. A constant 30 FPS from beginning to end is easier on the eyes than 92 FPS, then 63 FPS, then 49, then 78, and so on. I understand this. But it has nothing to do with native refresh rate when you’re talking about a display with variable refresh rate like G-SYNC.
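            A toy model of the difference (this is a simplified illustration of variable refresh in general, not NVIDIA’s actual implementation):

            ```python
            # Which frame a fixed 120 Hz panel scans out at each refresh, given a
            # 72 FPS feed where frame i is delivered at time i/FPS. Simplified model.
            FPS = 72   # GPU output frame rate
            HZ = 120   # fixed panel refresh rate

            shown = [(tick * FPS) // HZ for tick in range(10)]
            print("Fixed 120 Hz shows frames:", shown)
            # -> [0, 0, 1, 1, 2, 3, 3, 4, 4, 5]
            # Some frames are scanned out twice, others once, so on-screen frame
            # times alternate between ~8.3 ms and ~16.7 ms (judder). A variable
            # refresh panel instead refreshes when each frame arrives, so every
            # frame is held for the same ~13.9 ms.
            ```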

          4. G-Sync doesn’t do miracles, it’s not capable of such flexibility. 60FPS capped would still look better on a normal 60Hz than on a 144Hz with G-Sync, pretty sure.

          5. How exactly did you come to that conclusion, and what are you factoring in here? V-sync on? There’s more tearing on a 60Hz than on a 144Hz?

          6. V-sync is cancer. Also, I’m not talking about tearing. All I’m saying is, the farther you are from the monitor’s native refresh rate, the worse, no matter how high or how low, and no matter G-Sync; each one will give you a different problem. G-Sync doesn’t give you 100% accuracy on matching frequency, so if you manage to keep a stable 60 in a certain game, it’ll probably look better on a classic 60Hz monitor than on a 144Hz one, even with G-Sync; at the end of the day that monitor will still have 144Hz as its native refresh rate, even if it’s variable.

          7. Well, that’s not the point of G-Sync anyway, and everyone gets variable FPS; G-Sync will make the experience much smoother. I mean, if you get 60fps on a 60Hz monitor all the time, you don’t even need G-Sync anyway.

          8. That’s why G-Sync is ideal for 144Hz monitors: no one can get 144fps in every game, so the variable FPS would look nasty on a fixed refresh monitor, while a 60Hz screen with 60fps all the time is doable, hence no need for G-Sync. Depends on the GPU though, of course; fast sync is a good alternative.

          9. Thing is, even if you don’t reach 144Hz, you probably reach 100+, otherwise why buy it at all, and 100FPS is probably within G-Sync’s range of operation; that’s probably why it helps in those conditions.

          10. Well, games will drop below 60fps even at 1080p on ultra, so that’s one reason. If you’re getting wide variation in FPS and not hitting your top refresh often, then G-Sync is for you, plus no tearing and much lower input lag than V-sync.

          11. Yeah, right, but how wide? You don’t really expect it’ll make 20FPS look like 60, right? I’m still confident that 60FPS would look better on a classic native 60Hz panel rather than a 144Hz with G-Sync.

          12. 60Hz on a 144Hz monitor is 60Hz; with G-Sync there’s no such thing as “not native”, and it doesn’t interpolate anything. No idea where you got that idea from.

          13. I’ll ask a friend of mine to do that test.

            60Hz on a 144Hz is like running a 60Hz monitor at 25Hz; I really have my doubts about how it’d work.

          14. Nope, I have a G-Sync monitor; there is no screen flashing like low Hz gives, because it’s refreshing the screen fewer times.

          15. Well, for a start I can’t see any screen flashing when the FPS hits 60fps or lower; I can’t tell the difference between 30fps and 60fps in terms of flashing because of low Hz, no.

          16. The screen can flicker, I suppose that would be a better word, because of the low Hz. It’s hard to pick up, but some people are more affected by it. It’s not like 60Hz CRTs of course.

          17. No, it wouldn’t. It would look the same.

            Because as long as your frame rate is capped and G-Sync is enabled, 60 FPS capped on the g-sync 144 Hz display would function identically to 60 FPS capped on a 60 Hz static. Why? Because under those conditions they ARE identical. They are re-scanning the pixels on the display 60 times in one second. There is no other distinction to be made.

            G-Sync isn’t a miracle, no. It’s just very well-invented technology that makes lower frame rate experiences on higher refresh rates as good as they would be on lower refresh rates, because it is physically changing the refresh rate of the monitor to what it would be on a lower refresh rate monitor to match the frame rate. G-Sync does exactly what you’re saying it does not do.

          18. It surely has a range. I don’t own a G-Sync monitor at the moment, but I know a few people who do, and what they tell me is that if they tried playing at 60FPS on that same monitor, it’d look like sh*t, and not 50FPS-on-a-60Hz-monitor sh*t, but like 20FPS-on-a-60Hz-monitor sh*t. So I’m not sure it does its job that well, or at least not as well as you’re saying, anyway. You should try and run the same game with the same settings, same everything, on a G-Sync 144Hz monitor, and then on a classic 60Hz monitor, and see how it does.

          19. I absolutely have done this, and I can confirm, in no way is adaptive refresh rate inferior to static refresh rate and non-native frame rate.

            I have had a few games go into borderless window mode before when I forgot to enable G-Sync in non-fullscreen apps as well and it looked pretty bad, but this was strictly the same phenomenon of a lower frame rate on a higher refresh rate monitor, which was immediately alleviated by properly enabling G-SYNC.

            It’s possible they did the same thing and didn’t have G-Sync configured properly (there isn’t much to it), but I don’t know how they could see it to be worse than 50 FPS on a 60 Hz monitor.

            Does anyone else have any ideas as to what he’s referring to? I don’t seem to understand.

          20. What is it exactly that you believe G-Sync does if it isn’t adaptively changing the refresh rate to match the frame rate of the GPU? Interpolation?

          21. It is doing that, but not as well as you think, or not as well as Nvidia says. I’m pretty sure there are several situations where G-Sync would actually make it look worse.

          22. But you’re saying things you can’t even verify. Also, it would depend on the monitor; some 60Hz monitors actually look better and are easier on the eyes than other 60Hz monitors anyway.

          23. I’m more than happy to reconsider if you can demonstrate said scenarios in which G-Sync will make frame rates lower than the native refresh rate look WORSE than V-Sync or nothing at all. I can’t fathom such a scenario though, other than G-Sync being disabled in the control panel lol.

            I’ve spent tons of time trying to recreate exactly the phenomenon you’re discussing with G-Sync on and off, and I can confirm, it does exactly what it’s advertised to do. Nothing more, nothing less. Sure, it may be a bit overhyped from NVIDIA’s marketing, as though it revolutionizes the gaming experience, but for someone like me who is very sensitive to motion and frame sensitivity, it is revolutionary.

            But back to your specific argument:

            1) What PRECISELY does NVIDIA say about G-Sync that YOU say is not true? Provide at least one verifiable example if you can.

            2) In what situation would G-Sync make rendered frames on a given display look worse than without any G-Sync at all? Again, provide an example, and if possible, an explanation of why matching the refresh rate to the frame rate would look worse than a static refresh rate monitor and a matching frame rate (assume frame cap @ refresh rate; no V-Sync)

            And by the way, this isn’t me being an NVIDIA fan boy, you can honestly interchange the term “G-Sync” with “variable refresh rate” for all intents and purposes and the same principles apply.

          24. I have no proof to show you as of now; what I’m saying is just theoretical, and I’m basing this on the fact that “old” normal monitors don’t really like it when the refresh frequency is modified. That’s why most of them have a fixed one, or have very little room. But if these G-Sync monitors are built with G-Sync in mind from the start, so that G-Sync isn’t something that gets added later, the monitor would probably have no problem switching refresh rates in a matter of ms. If it’s some kind of module that gets added later to a simple monitor, I still have my doubts. That’s why I asked my friend to do this test (he’ll do it at a later time), and I will ask him what he thinks of both settings, and which one looks and runs smoother to him. Anyway, I started the whole argument because I assumed you had a simple 120Hz monitor, the one you were talking about earlier, but you later clarified you now own a 144Hz G-Sync monitor, so yeah, the situation changes a lot, but I want to see to what point it’s accurate.

      2. I play at 1080p 144Hz; sometimes I downsample from 1440p for better anti-aliasing in some games.
        My specs:
        i7 3770K @ 4.2GHz
        16 GB DDR3 2400MHz RAM
        980 Ti G1

  2. “Shortage”, haha! Tell us a new lie, Nvidia. Intentionally producing low numbers of GPUs and then calling it a shortage.

    1. It’s not NVIDIA’s fault as a whole, or in a direct way. VRAM shortage is one of the main causes for the price hike to some extent.

      It all comes down to TSMC, where the die production takes place. If Apple/Samsung are willing to pay more for the memory, then they will get first priority, instead of Nvidia or AMD.

      We can’t just blame NVIDIA alone, IMO.

  3. Yeah, and they’ll continue rising the way you want them to for when you release your new line of GPUs.

    You don’t care if the general consumer cannot buy what you’re selling, as long as you’re getting mining money at the end of the day.

  4. You can’t really blame any gpu manufacturer. The bottom line of those companies is to sell out all their stock. They don’t care who it gets sold to. I only really blame the cryptocurrency market.

  5. I’m good with my current gear. If any new people want to get into PC gaming, I suggest going budget with a 1050/1050 Ti, buying prebuilt, or getting a gaming laptop. If you have anything near a 970/1060 in your system, keep it for 2-3 years.

  6. Oh, if you didn’t know, you can buy GPUs straight from Nvidia’s store. Click “notify me when back in stock” and track your email. MSRP 1070s were in stock Feb 14th; two of my buddies bought one each.

    1. I was wondering if it might be possible to pick up a card with the “Notify Me” function or if you weren’t watching your email every hour then the few they had available would be gone immediately. I’ve been watching their store for about a month now and everything stays out of stock except the Titan Xp is available sometimes.

      Btw, to all those that are blaming Nvidia for the shortages and raised prices: you can still pick up a 1080 Ti for 699, a 1080 for 549, a 1070 for 399 and a 1060 for 299, just like before the mining craze sent prices through the roof, if you are lucky enough to catch them in stock. You won’t find anything close to those prices with retailers, because the retailers are the ones raising prices to take advantage of the shortages, just like always.

  7. 64K, that’s not entirely true. NVidia stated they were going to temporarily stop production on Pascal to bring out their next architecture, however that was BEFORE they made their promise to increase production.

  8. I’m not actually blaming Nvidia, because the vRAM shortage seems to be one of the main causes of the price hike.

    And, like you have mentioned, Apple should be getting more priority from TSMC for sure.

    AMPERE is going to face the same fate just like Pascal, if this mining trend continues.

    By the way, I know Pascal is reaching EOL status, but I don’t think Nvidia is in any hurry to release next-GEN architecture, because they don’t sense any serious threat from AMD, at least in the high-end GPU market.

    AMPERE coming out next, just seems to be a speculation, as of now IMO.

    Nvidia might release a new PASCAL “refresh”, if they really want to cater to the gaming segment, or just try increasing previous Pascal GPU production.

    1. It’s all just rumor right now. We may have more solid info in April but if Nvidia sticks with their release schedule from the past then we should get the next gen GPU this year.

      The problem with the shortages whether they stick with a Pascal refresh or release a next gen gaming GPU and a mining only GPU is that there is no way to increase production. TSMC can only manufacture so many wafers a day and that’s it. Nvidia gets what TSMC can manufacture for them and no more.

      The only way to increase production is for TSMC to build more fabs and they aren’t going to invest billions for new fabs when mining could go down significantly in the future and demand dry up and they would be stuck with useless extremely expensive fabs.
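      To get a feel for that ceiling, here’s a crude dies-per-wafer estimate using the standard approximation (the die size and the monthly wafer allocation below are hypothetical placeholders, and defect yield is ignored):

      ```python
      import math

      # Crude gross dies-per-wafer estimate: pi*(d/2)^2/S - pi*d/sqrt(2*S).
      # Die area and wafer allocation are hypothetical; real output also
      # depends on defect density, edge exclusion and binning.
      wafer_diameter_mm = 300
      die_area_mm2 = 314        # roughly a GP104 (GTX 1070/1080) class die
      wafers_per_month = 10000  # hypothetical foundry allocation

      dies_per_wafer = (math.pi * (wafer_diameter_mm / 2) ** 2 / die_area_mm2
                        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))
      print(f"Gross dies per wafer: {dies_per_wafer:.0f}")
      print(f"Gross dies per month: {int(dies_per_wafer) * wafers_per_month:,}")
      ```

      Whatever the real numbers are, the point stands: output is bounded by the wafer allocation, so short of building new fabs there is no dial to turn to suddenly make more chips.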

  9. Current UK prices on Scan by cheapest model:

    GTX 1060 6GB = £329
    GTX 1070 = £539
    GTX 1080 = £599
    GTX 1080 Ti = £839

    RX 580 = £349
    Vega 56 = £749
    Vega 64 = £983

  10. Nvidia simply doesn’t care.
    Let the stocks be low and the prices be high; they only profit from it.
    And with AMD being as useless as they are, nobody is there to ramp up the competition.

    So we either endure scalped prices
    or simply refuse to upgrade.
    I dunno about you, but when a GTX 1080 Ti is spiked to the point of $1,050 (Amazon US),
    I don’t really see the point in paying that much.
    The MSRP was supposed to be $699-$750;
    nobody is gonna pay a grand for a $700 card.

      1. The buy button is purely for notification for the pre-orders;
        the Founders Edition cards are not being produced anymore.

        1. Right, but guess what: it’s the AIB cards having the issues. At least you can still get cards from Nvidia at a decent price; if they did not care, they would not have them as an option to buy at all.

          All the AIB companies are having problems getting memory. Nvidia does not make memory… and miners on top of it do not help the problem either.

  11. Any type of RAM in general is freaking expensive. Even DDR3 kit prices are out of control. If anything, miners are only a minor factor in terms of supply/demand. I think the biggest problem is with DDR/GDDR memory shortages in general.

    Can’t blame Nvidia / AMD for them. They don’t make memory.

  12. I got my 1080Ti last November and don’t plan to upgrade anytime soon. Sucks for everyone else, though.

    edit: I’m not racist towards peasants. Just saying it sucks to be them right now in this political climate of high GPU prices. All I’m saying.


  14. Look at the bright side: Devs are forced to optimize their engines and make their games work on existing hardware from 5+ years ago. And AMD gets to sell APUs like hotcakes. Even I bought one and am building a sort of “PC console” with OC’d 2200G, just for fun. High demand on APUs should increase the R&D in that area and we’ll get more powerful APUs sooner than we would have if GPU prices were normal. Progress on APUs means that the future of small, heat-efficient desktop computers comes sooner.

    1. That actually makes a lot of sense. Huh, guess there is something good coming out of this mining craze I suppose.

    2. The 2200G runs games at 1080p on medium settings only. Meanwhile, my GTX 970, which was released in September 2014, with my super monster Ryzen 1700 CPU and super duper fast 16 GB of DDR4 3200MHz RAM, runs all games at max settings 1440p at either 30-35 fps, 40-50 fps, 50-55, or 60+ (depends on the game). So who will buy a 2200G to play games at medium 1080p? No one! No one except those who don’t have a PC yet and want one just for internet surfing, using Word and Excel, and watching 1080p movies.

      1. A 970 still or again costs 300 bucks. No APU is ever going to compete with a 300 dollar dedicated GPU, that much should be obvious.

      1. Yes. The case I’m using has less volume than the original Xbox and I will hook the computer up to my TV. It’s essentially a current gen console, except multiplayer is free which makes it cheaper in the long run. And I get to run whatever I want on it.

    3. 10 years ago PC hardware was actually progressing and affordable. Lately progress has come to a halt, and now prices are through the roof on top of that. Right now there’s really not much excess performance to take advantage of on the average PC. The average gaming PC isn’t much better than a current console. It’s better for sure, but not by orders of magnitude like it would have been in 2010. In 2010 PC gamers were starting to play Crysis the way it was intended (medium to high settings, 60+ fps) while console peasants were amazed that COD got real-time shadows. Years later console peasants ended up getting a gimped port of Crysis running at sub-720p, sub-30 fps. There’s some perspective.

      Nowadays we’re supposed to be amazed when a game throws 10% more units/NPCs/polygons on the screen. Gameplay-wise we haven’t surpassed Crysis, Bad Company 2, GTA 4. Quite alarming since two of those are actually last-gen console games. That just shows you how hard we’re stagnating. Games are now prettier versions of the same stuff we were playing 10 years ago. It wasn’t like that in 2008. Compare 1998, 2008, 2018. The first jump is massive, the second is purely cosmetic. No progress.

  15. I’m sitting here with my humble GTX 1050 Ti which I bought for 150 euros last Summer… that now goes for over 200 euros…

    What the forkin’ hell?!?

    1. Nothing; their GPU prices are much higher than NVIDIA’s in the UK, and the UK is a big market. He’s just AMD biased. AMD are making a sh*t ton more from their higher GPU prices.

      1. Question: will this ordeal result in low-end and mid-range cards becoming more powerful? Because if you ask me, the progress of technology over the last 3 years has been pathetic. A low-end or mid-range video card isn’t much more powerful than the 750 Ti, and most GPUs still have 2-3GB of RAM.

        I expected technology to advance faster, considering how powerful the GTX 770 4GB was back in… whenever it came out, it was a long time ago.

        1. Well, the die shrink helps a lot. Pascal was just that, and that’s probably why, as it’s not a new architecture; it’s Maxwell with a die shrink, but even that is pretty decent on its own, and you get lower power.

          1. So a die shrink and a new architecture would probably see more gains, but then again a new architecture has to mature. That’s why the whole AMD “FineWine” thing is bullsh*t: Polaris and Vega are new architectures while Pascal isn’t, so Polaris and Vega will get better, and have, not because of “FineWine”.

      2. I’ll retract that, my mistake. I saw the meme but not the name, and thought it was our meme guy who is AMD biased.

        1. Yeah, I know what you mean, but NVIDIA were making a ton of profit anyway even before the mining made prices go up, and it seems AMD are actually the ones making a profit now after that shady move they did with the Vega prices: £100 off for a month, but claimed it was the actual price.

  16. That’s why I am happy that my GTX 970 G1 Gaming still runs everything at max settings at 1440p, so I don’t need to upgrade it yet! While not all games hit 60 fps, it doesn’t matter as long as it is more than 30 fps.

  17. Nvidia will use this as an excuse to bump up their new GPU launch prices even further than last time. What I paid for a GTX 980 in the Maxwell generation only got me a GTX 1070 with Pascal, and that was before all these bonkers price rises over the past 8 months.

    All I can say is TTF for G-Sync and variable refresh monitors in general; they make it so that you can squeeze as much life out of a GPU generation as possible, as there’s no arbitrary FPS target to hit.

  18. nVidia has no reason whatsoever to bring a new architecture; it would be more profitable to keep manufacturing Pascal. They might bring one new chip but not a whole line, not at the moment anyway. It would be stupid, and Nvidia is a lot of things but definitely not that.
    For very much the same reason, AMD is in no rush to bring anything new. They both sell everything they throw at the market.

    1. DX12 and Vulkan are a good reason. The Titan V (Volta) is much better in DX12 and Vulkan, so they clearly changed the architecture. Pascal can only do so much, and it’s clear it’s not even near AMD’s GCN in that regard, because GCN was designed like that in the first place.

      1. Not really! GCN was designed to be very computationally versatile; obviously it benefits more if you have lower-level control rather than a highly abstracted API, but it cannot have been made for APIs that didn’t exist at the time. AMD actually push low-level APIs more probably because of their CPUs than their GPUs.

        1. Well, they made async compute units years before DX12 and Mantle, clearly made for consoles because they can use it, and the async compute units were made before the PS4/XB1 came out, so the timing was right.

    2. tbh I don’t know what Nvidia is planning. You’re right, Nvidia sells every Pascal gaming GPU that they get their hands on except for the 1050 (non Ti) version because they aren’t liked by miners and AMD offers no competition on the midrange and above GPUs for gaming. Even the 580 which is around a 1060 6GB doesn’t really compete in efficiency and is a power hog.

      That’s not unprecedented though. Maxwell was much more efficient than Fiji and beat it in performance. So why do we have Pascal? The obvious reason is that Nvidia doesn’t just sell gaming GPUs. Look at Volta with the Tensor Cores. It’s not a gaming GPU. It’s a poor man’s version of a Tesla intended for AI. The profit that Nvidia makes from professional cards is immense.

  19. I just think that it is useless for Nvidia to make separate architectures for mining and gaming if both can still mine, even if one is not as efficient, because once the Turing supply dries up miners will go for Ampere. It is similar to what we have right now: miners can’t get their hands on Radeons, so they turn their attention to GeForce, even if Nvidia GPUs don’t mine as well as AMD cards. So that’s why I’m thinking the talk about Nvidia “gimping” Ampere mining performance on purpose might have some logic to it. But it’s all speculation at this point. We don’t even have any solid rumor about Nvidia’s next-gen GPU right now.

  20. You can buy them direct from Nvidia for their normal retail price. I’ve bought two 1080 Tis for £680 this month. They are out of stock quite a lot though, for obvious reasons, so you may end up waiting a bit longer.

      1. It’s true that the blower coolers are more noisy and less effective. I have had a few noisy cards over the years but they have never bothered me. I don’t hear the fans over the sounds from my speakers anyway and even in quiet moments in game I somehow tune the fan noise out but I can certainly understand an aversion to blower fans if a person is sensitive to that type of noise.

      2. You can buy coolers separately if it means that much to you. The stock coolers really aren’t that bad, at least for the 1080 Ti. You can also buy from partner sites directly linked from the Nvidia site if you want an EVGA or ZOTAC brand with pretty LED lights and different fans/coolers.

  21. The bad thing about this is also that it will deter more people from getting into PC gaming. PC gaming has evolved so much this past decade that most of the arguments made to prefer a console are essentially moot at this point. We can game on a couch if we want to, thanks to Steam; there’s a large community; third-party exclusives are basically dead at this point, etc. The only real argument for a console (besides exclusives) is the price of a gaming PC. This will just further strengthen that argument. You guys should see how much the prices have gone up here in South Africa. It’s ridiculous. I paid R4300 (rand) for my 1060, which in dollars is 371.69. The very same card is now R7149, which is 617 dollars. So yeah, while third-world countries don’t make up the majority of the sales, it really sucks for us. But I do know the prices have become ridiculous all over the world though.

    1. My normal philosophy is “live and let live” but miners have gone from being a pest at times in the past by causing shortages of AMD cards to being a serious threat to PC gaming. I like PC gaming and have been doing it for a long, long time and if I have to I can pay $1,200 for a high end card but most can’t so that threatens 4K adoption going forward.

      By far most people buy entry level or midrange cards but those are now too expensive for the average gamer. I know most people aren’t needing to upgrade right now but eventually they will as their cards age and video games get more demanding as they always have and probably always will.

      If this mining craze blows over in a few months then it’s not a serious problem and supplies and prices will improve but if present mining demands drag on year after year and shortages continue and prices remain so absurdly high then it is going to harm our hobby. It won’t happen quickly but over time PC gaming will drop and that affects all of us because Developers give what consideration they do to the quality of PC ports based on what they think they can make in sales. Right now PC gaming is number 1 of the 3 major platforms in terms of sales but that could easily change if GPU prices remain the way they are for years.

        1. I don’t know very much about prebuilt gaming PCs but I would imagine that the manufacturers of those buy their hardware in bulk to get better prices and once their inventory is depleted they will have to buy more GPUs and RAM and surely they will have to pay more now even in bulk quantities. What’s available to buy right now in prebuilts was probably built from hardware bought months ago when prices were reasonable. I would think what is for sale in the near future will have to cost more. Costs of hardware increases have to get passed down to the consumer eventually or they will go out of business.

          1. Most of the price increases from GPUs is retailers marking up the prices to take advantage of the demand, Nvidia and AMD aren’t seeing any of that extra money from cards being sold at more than their MSRP. The natural price increases of hardware simply costing more to manufacture will naturally happen but it won’t be as severe as how cryptomining doubled the price of almost every GPU. And with RAM most PC manufacturers make their own RAM so that is an easy way for them to keep prices down in comparison to the DIY RAM market which is also facing price gouging for reasons similar to GPUs.

          2. Yes, I agree, Nvidia and AMD aren’t gouging people on prices and I have been posting that a lot but some people are stuck on blaming Nvidia and AMD. It seems that some people just want to blame Nvidia and AMD for the shortages and price gouging anyway but you are correct it’s the retailers gouging and making huge profits and possibly also the non reference card manufacturers. Some people seem to think that Nvidia can increase production to meet demand if they wanted to but that’s because they think Nvidia has full control over production quantities when they actually have almost no control.

            TSMC manufactures Nvidia’s GPU chips and they also make chips for Apple and Apple takes top priority. TSMC’s Fabs are running at maximum production and they can’t increase production any more without building new fabs which would cost billions of dollars.

            I suppose if the prebuilt manufacturers are buying reference cards directly from Nvidia then they aren’t getting gouged but for non reference cards bought from a non reference card manufacturer or distributor they probably will get gouged because they are going to sell to the highest bidder whether it is prebuilt manufacturers or retailers.

  22. We only have to pray for the bitcoin bubble to explode so that all those miners start selling their worn-out GPUs for less than 50 dollars.

    Then AMD will have to drop the Vega 64 price to 300 dollars new.

    =) really a dream

  23. This and RAM prices make PC gaming a terrible option for new buyers. Freaking sad that as soon as we get competition in the CPU market, this happens.

    Something has to be done. Increase freaking production!

  24. nVidia doesn’t care; they will continue to make record profits and will try to damage-control the situation by offering their sympathy for the gamers out there.

    1. You can be assured that Nvidia does care about gamers because Nvidia cares about making money. Over half of their revenue comes from gaming GPUs. Mining is unstable but Nvidia knows that gamers have been the backbone of gaming GPU purchases for a long, long time.

      Nvidia can’t increase production to bring prices down or they certainly would. They don’t make their chips. Nvidia only engineers them. TSMC makes Nvidia’s chips. Nvidia isn’t charging a lot more for their GPU chips to take advantage of shortages. It’s the retailers that are gouging. Don’t you think that if Nvidia could increase production then they would because they would make far more money due to greater sales of chips?

  25. F’ing shortages, dude. I was barely able to secure the last 6 cards for my 24-card GTX 1070 miner; the price was $30 more per card than 3 weeks ago, and I barely got all 6 of them.

  26. I don’t even use Twitter so that’s no argument anyway, and yes, President Trump does have the ability to make people work. It’s easy: remove the welfare system or reform it. Welfare keeps people poor and under the thumb of the government. I suppose it was only a matter of time before an anti-Trump person would butt in.

  27. Are you American? And in what statistic are you comparing the US to European countries that makes the US “A** last”?

    Welfare in America is not “a couple a hundred a month”. Try “a couple hundred a WEEK”, PER CHILD!!!

    If you give someone free money for having a fatherless child, and they can get more money by having more fatherless children, exactly what do you think more people on average are going to do? Especially when they themselves were taught by their family that electing a Democrat into office is the only way they can make a living? When they’re told that a prejudiced society and institutional racism are holding them back and that they will never make it on their own without Uncle Sam’s teat, what exactly are they going to believe?

    The facts are, institutional racism in America ended decades ago, and you would be incredibly hard-pressed to provide an example of how it truly still exists. No wait…. There is institutional racism… There were government MANDATES known as the “Community Reinvestment Act” put in place by Obama’s administration that forced banks to provide loans and credit to underprivileged minorities that they knew could NOT pay the loans back, which caused the financial crisis of 2008. That IS institutional racism, actually, so forgive me, I was mistaken. But it was the first example I can think of since Jim Crow laws.

  28. Samsung needs to start selling the ASIC miners. Then there is no reason to mine on a GPU. Once there is no money in it, GPUs will be for gaming again.
