• Carlos Solís@social.azkware.net · 10 points · 2 years ago

    Considering that both Nvidia and AMD have been constantly pushing the prices of baseline GPUs well beyond the gold standard of the 1060, even long after the Big Crypto Spike of 2020? Yeah, barely anyone would bother spending a small fortune on a GPU

    • Communist@beehaw.org · 5 points · 2 years ago

      Not only that, but the used market is skyrocketing, which is just gonna push these numbers even lower.

      • Onihikage@beehaw.org · 2 points · 2 years ago

        There must have been people in marketing who knew this would happen and were overruled by bean-counting executives. The top card of each generation outdoes the top of the previous gen, but for a couple of generations the price has been increasing in almost lock-step with the performance increase. Often the newer card will have less VRAM than the previous generation's equal-performing card, because you're comparing an older top-spec card against a newer midrange one, and midrange cards always have less VRAM. With AAA games now starting to really want more VRAM for better visuals, the older cards wind up actually being the better option long-term.
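        The tradeoff described above can be sketched as a toy decision rule. This is purely an illustration; the cards, numbers, and the 5% tolerance are all invented for the example.

```python
# Toy illustration: when a last-gen flagship and a current-gen midrange
# card perform about the same, the one with more VRAM tends to age better.
# All specs below are made up for the sake of the example.

def better_long_term(card_a, card_b, perf_tolerance=0.05):
    """Prefer more VRAM when the performance gap is within tolerance."""
    perf_gap = abs(card_a["perf"] - card_b["perf"]) / max(card_a["perf"], card_b["perf"])
    if perf_gap <= perf_tolerance:
        # Roughly equal performers: VRAM decides long-term value.
        return max(card_a, card_b, key=lambda c: c["vram_gb"])
    # Clearly unequal performers: raw performance wins.
    return max(card_a, card_b, key=lambda c: c["perf"])

old_flagship = {"name": "last-gen flagship", "perf": 100, "vram_gb": 12}
new_midrange = {"name": "new midrange", "perf": 102, "vram_gb": 8}

print(better_long_term(old_flagship, new_midrange)["name"])  # last-gen flagship
```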

    • gumpy@beehaw.org · 2 points · 2 years ago

      barely anyone would bother spending a small fortune on a GPU

      well, except datacenters. they can't get enough of them, and the datacenter card prices would make you cry.

    • GalaxyGamer@sh.itjust.works · 1 point · 2 years ago

      Then there is also buying used, which tends to be so much cheaper than buying directly from either company that it makes sense their sales are falling off.

  • Rhabuko@feddit.de · 8 points · 2 years ago

    It’s even worse if you do creative work on your PC. Nvidia dominates this field completely because of the performance difference. My GPU is old and I really, really need a new card for my 3D work, but Nvidia is such an awful company…

    • mayooooo@beehaw.org · 1 point · 2 years ago

      I stopped buying new a long time ago, it doesn’t make sense financially or ecologically. It also doesn’t help that I live in a part of Europe where all PC parts are more expensive by default. But used or refurbished is the way to go: get a generation-older Quadro (or whatever they are called now, A-something?) and you and your wallet will be happier.

  • GhostMagician@beehaw.org · 7 points · 2 years ago

    They don’t understand that the average consumer looking to buy a desktop GPU is not the same as a crypto miner. Once the miners exited the market, so did the main reason an unusual number of units were being sold at high prices in the first place; to miners, a GPU is a business expense.

    • vegemash@lemmy.one · 1 point · 2 years ago

      I think Nvidia at least has its eyes on the ML market. They just don’t care about even the mid range. The decision not to put a decent amount of VRAM on these cards seems like a deliberate move to prevent them from running many ML workloads.
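      A quick back-of-envelope shows why VRAM is the gate for ML workloads: model weights alone need parameter count times bytes per parameter, before activations and framework overhead. The 7B model size here is just an illustrative assumption.

```python
# Rough estimate of VRAM needed to hold a model's weights.
# Real usage is higher once activations and framework overhead are added.

def weights_vram_gb(params_billion, bytes_per_param):
    """GiB required to store the weights of a model of the given size."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

# A 7B-parameter model in fp16 (2 bytes per parameter):
print(round(weights_vram_gb(7, 2), 1))    # 13.0 -- won't fit on an 8 GB card
# The same model quantized to 4 bits (0.5 bytes per parameter):
print(round(weights_vram_gb(7, 0.5), 1))  # 3.3 -- fits easily
```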

  • zzz711@lemmy.ml · 7 points · 2 years ago

    Not surprising since the last gen was impossible to find due to crypto and the current gen is overpriced.

  • nezumi@beehaw.org · 6 points · 2 years ago

    i built my pc when gpu prices were sky high in 2020. i settled for a 980 at the time, and it’s served me really well, actually. it had been years and years since i built my own pc, and it really made me understand how meaningless having the highest-tier, current-gen hardware is. i thought by now i would have upgraded it for sure, but i’ve never had any reason to other than maybe wanting to experiment with self-hosted ai. nvidia thinks the average consumer is gonna shell out some ridiculous amount of cash for their newest product when it’s barely better than the previous gen and you really don’t need it unless you’re chasing clout or tie your self-worth to your 3dmark score.

    • CmdrShepard@lemmy.one · 2 points · 2 years ago

      I built mine in 2018 when prices were also high (but not the peak, I don’t believe) and settled on an RX570, since I really only needed it for transcoding with Plex and because I bought a second-gen Ryzen 7 that doesn’t have onboard graphics. I ran that thing up until last week, when I was given both a 1080 Ti and a Quadro P4000 for free from two different people. Now I have more GPUs than I know what to do with. I stuck the 1080 Ti in since the P4000 only had DVI outputs (though it would be the better card for Plex), but now I have the option to do a little bit of gaming if I want (I mostly play console) or do PCVR with my Quest 2.

  • luckless@beehaw.org · 6 points · 2 years ago

    I wish Intel got more attention in this field. I have an A380 in my home server and it’s great for lighter tasks such as transcoding a handful of streams at a time. They’ve been putting a lot of effort into improving the drivers for gaming as well as general use.

    • GreenCrush@lemmy.world · 4 points · 2 years ago

      They are priced pretty competitively too. I would just hate to run into a game I want to play that’s held back specifically because of an Intel GPU.

    • RandomException@sopuli.xyz · 1 point · 2 years ago

      I’m rooting for Intel now. Neither Nvidia nor AMD seems to care about the average consumer, who would be completely happy with a low- to mid-tier graphics card if it were just cheap enough. I hope Intel’s Battlemage fixes many of their current bugs and problems with some games, and at the same time brings a sizable performance improvement. Right now I wouldn’t be comfortable buying a GPU that randomly fails to launch some games.

      Another huge upside is Intel’s great Linux support, which they have had since, I dunno, forever. Since I wish to eventually move to gaming exclusively on Linux, it’s a huge bonus as well.

    • brunchyvirus@beehaw.org · 0 points · 2 years ago

      Yeah, I think up until this year Nvidia consumer cards only allowed 3 simultaneous hardware encoding streams no matter how many cards you had. It sounds like they changed it to 5 in March this year; there is supposedly also a way to unlock it: https://en.m.wikipedia.org/wiki/Nvidia_NVENC

      What I like about Intel cards is they’ve always just worked for me out of the box. With Nvidia I’ve sometimes had to do some tweaking or had issues with drivers, though not as much within the last few years.

      • luckless@beehaw.org · 3 points · 2 years ago

        Yeah, it’s an artificial limit. Pretty sure you can remove it; I saw something about it on GitHub.

        And yeah, on Linux at least since kernel 6.1, Intel Arc is really plug and play.

  • mustyOrange@beehaw.org · 6 points · 2 years ago

    No shit. When 1080s from 6 years ago still work fine, there’s clearly some stagnation. They need to cut prices if they want people to actually buy their shit.

    Intel needs to come thru with Battlemage and fuck up team red and team green

    • patchymoose@beehaw.org · 0 points · 2 years ago

      Whatever happened with Intel’s discrete GPUs? I got whiplash trying to follow the news. At one point I thought the news was that they were discontinuing them altogether. But are they proceeding now?

      • mustyOrange@beehaw.org · 1 point · 2 years ago

        Honestly, pretty damn well. If they keep with it, I see good things for them.

        Imo, the A770 is a lower-mid-range hero. They’ve really improved their driver support, and I think Battlemage is going to be great.

    • Pigeon@beehaw.org · 0 points · 2 years ago

      I think it helps that AAA graphics got so realistic that improvements feel more incremental relative to older games. Indie games proved that much simpler, cheaper graphics are viable and often even preferred, and devs started going for stylized art over realism more often. It probably also helps that the Steam Deck is a thing now, and the Switch allows 3rd-party games, so that hardware can be a target to consider too.

      Anyway yeah. I’m still running a 1070, and at absolute worst I might have to reduce some graphics settings in the latest or most poorly optimized games, and we’re long past the days where moderate or even minimal graphics settings looked awful. Games are still beautiful on lower settings.

      A better GPU at this point would net me better FPS in some titles, but those games make up a relatively tiny proportion of what I play, and even then I still get a perfectly playable framerate as is.

      So, yeah, not paying those prices for a tiny upgrade, and not when I remember prices pre-covid and pre-crypto miners. I can afford to wait out their greed.

      • Katana314@lemmy.world · 0 points · 2 years ago

        I keep explaining to people how the world actually kind of benefits from the Graphical Plateau, but so many insist to me “You will want more pixels. Have you seen raytracing?”

        The Steam Deck mostly gives an upper bound for how much hardware a game should demand for the next few years, and it’s probably lower than some developers wanted it to be.

        The silliest thing about raytracing in particular is that it was planned to be a developer convenience. So in an RTX-only future, we were all going to upgrade to much more powerful GPUs, only to run games that look about as good as what we already have.

        • YuzuDrink@beehaw.org · 0 points · 2 years ago

          I absolutely love raytracing… and on my 3080 it just doesn’t look good enough yet to justify turning it on for most games. Maybe they just haven’t implemented it well yet, but the reduced framerate in most games just isn’t worth it, and I’ve hated effects like screen-space reflections more or less since they came out.

          I think by the time we have a 50X0 or a 60X0 that raytracing will finally be fast enough to have it look good AND perform well. But for now it’s mostly just a gimmick I turn on to appreciate, and then turn back off so I can actually play the game smoothly.

          • Pigeon@beehaw.org · 1 point · 2 years ago

            It might be that they’ll put more time and effort into getting it looking right once more people can run it at all, too. I’m not sure what percentage of PC gamers have sufficiently new/powerful GPUs to run it, but I’d suspect it’s still small, and I’d think there’s only so much time and effort devs will want to put into something that most people won’t see at all, when they could spend those resources on other aspects of the game (including other aspects of graphics) instead.

            The one thing I would really like now is better audio. Both stuff like better 3D positional audio (e.g. Deathloop if you turn that setting on, although the setting kept turning itself off for me, which was maddening) and more varied and complex sound effects and music. It can make a huge difference, even when people don’t consciously notice.

  • lemillionsocks@beehaw.org · 5 points · 2 years ago

    The obvious answer to this question is that the crypto bubble burst and AMD and Nvidia are charging high prices for incremental gains.

    I think the other piece to this puzzle, though, is that you don’t need a high-end GPU, or even an upper-midrange one, to play a game at settings that look OK today. I upgraded from a 5600XT to a 6800XT recently, but it was hard to pull the trigger and justify the expense because I was still playing new releases at decent-looking settings at 60fps. Old hardware is lasting longer, and you can still do quite a bit on midrange and lower-end hardware.

    In addition to that, even integrated GPUs are at a point where you can play games at a decent clip. It’s not the high-end, max-all-settings experience, but my wife is perfectly happy playing games on her laptop with a 5800U and Vega 10 GPU.

  • ArtVandelay@lemmy.world · 5 points · 2 years ago

    I bought a 4070 Ti for $1k and I deeply regret it. Not because I can’t afford it, but because I let my desire to game at 120 fps overpower my ethics and enabled a company to get away with these prices. It’s definitely a regret I have.

    • Reeek@beehaw.org · 1 point · 2 years ago

      I feel that way too. My 2080 is still good, so the itch isn’t as strong, but when I play something on my 4K TV and the fps dips below 60, the itch returns. I truly don’t want to buy anything from Nvidia or AMD for a good while, so here’s hoping Intel keeps at it and doesn’t get stupid expensive as well.

    • Behohippy@lemmy.world · 1 point · 2 years ago

      I paid $1100 for a 3070 during the pandemic with a newegg bundle deal (trash stuff they couldn’t sell). I already had a 2070 and it was a complete waste of money.

  • wagesof@links.wageoffsite.com · 5 points · 2 years ago

    Everyone has a synced upgrade cycle now because EVERYONE upgraded when we were all locked in due to COVID. Does the massive spike in 2021/22 average out to a normalized graph? Probably, yes.
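    The "does the spike average out" question can be sketched numerically: take a flat baseline of yearly sales, add a lockdown spike followed by a pulled-forward-demand slump, and check the multi-year mean. All figures below are invented purely to illustrate the shape of the argument, not real shipment data.

```python
# Toy model: a demand spike that "borrows" sales from later years should
# leave the multi-year average near the flat baseline.

baseline = 10.0  # hypothetical normal units per year (millions)
# Six hypothetical years: flat, flat, spike, spike, slump, slump.
yearly_sales = [10.0, 10.2, 16.0, 15.0, 5.5, 6.3]

average = sum(yearly_sales) / len(yearly_sales)
print(round(average, 2))  # 10.5 -- close to the flat baseline
```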

    • cstine@lemmy.uncomfortable.business · 2 points · 2 years ago

      It’s not even only that: crypto mining went from every card the miners could find to literally zero almost overnight. The spike was, honestly, more driven by crypto than gaming during the super-high sales in 2020/21 and then immediately vanished.

      Of course, Nvidia ALSO alienated the heck out of a lot of potential buyers who are sitting on the sidelines because they’re not paying the inflated prices caused by that spike, so the crypto guys are gone, and the gamers are waiting.

  • Triage8420@lemmy.ml · 4 points · 2 years ago

    Still rocking my 1070 Ti. I mostly play Overwatch 2 and Minecraft, so it works OK for me for now. Also I’m broke and can’t afford the upgrade.

    • sailsperson@beehaw.org · 1 point · 2 years ago

      1080 here. I’m really happy with the decision I made years back. Some games are terribly unoptimized, but that won’t make me cash out for a new piece of hardware.

      And anything that’s actually worth upgrading from my GPU is going to be even bigger and block the front panel pins on my new motherboard I was gifted last year. Yep.

    • Derrek@lemmy.ml · 1 point · 2 years ago

      I had a 1070ti since 2018 and it has run everything I have purchased just fine.

      I thought about checking out this ray tracing stuff the kids are into, but is there a card under $300 that anyone recommends? It would also need to be mini-ITX, as I have a tiny living-room gaming PC.

      • Poke@beehaw.org · 1 point · 2 years ago

        Sorry but I’m not sure you’re going to get any good ray tracing experience for less than $300.

        AMD probably has the best general use GPU in that price range.

        Intel probably has the best (with a big asterisk due to driver and directx issues) gaming GPU in that price range.

        It’s just hard to recommend buying a GPU right now imo.

  • UprisingVoltage@feddit.it · 4 points · 2 years ago

    Personally I’m starting to buy second-hand hardware, and I recommend it. Less pricey, more eco-friendly, and less money in the pockets of greedy corps

  • Weerdo@lemmy.world · 3 points · 2 years ago

    Just, no reason to upgrade yet. My current card plays even the newest games at middling levels which is tolerable. Based on previous experience, I’ll upgrade once every 5 years or so. I’m only two years into this secondhand card.

  • ghashul@feddit.dk · 3 points · 2 years ago

    I’m still running a 1060 6gb card. I’ll keep it for as long as I can, and then I’ll likely upgrade to something that isn’t the newest generation at the time.

    • Jediotty@beehaw.org · 1 point · 2 years ago

      I’ll probably use my 1070 till it dies, and after that if I’m able to fix it :)