We could be witnessing the death of the graphics card in real time

techradar.com

31 points by DuckConference a year ago · 48 comments

TillE a year ago

People have been wishcasting the end of dedicated GPUs for like 20 years now.

It's nice that they're slowly improving, that every Apple Silicon Mac has a half-decent integrated GPU, that ML upscaling is pretty good if you want "4K". But they really don't come close to an NVIDIA card at 200-400W.

  • readthenotes1 a year ago

    Basically the original article was saying "I was right 10 years ago, believe you me!" And the editor's note at top was saying "some advertisers aren't happy with what you just said. Walk it back."

    • bn-l a year ago

      Haha, you know they got a pissed-off call from the advertisers when they regurgitate Nvidia's AI marketing talking points.

  • buran77 a year ago

    Far, far more people are gaming on Nintendos, PlayStations and Xboxes with iGPUs and relatively weak total processing power than on 400W GPUs. You're making the mistake of thinking only the fastest can win, but in real life the good enough almost always wins. Streaming will do even more for the death of the dGPU than performance alone can compensate for. You already rely on someone else to rent your game software from; doing the same for the hardware is one small step.

    Mathematically GPUs can keep getting better at churning out more pixels faster but the biggest limitation is really entirely subjective. Past a certain point the returns are diminishing in terms of what your human eyes and brain can experience as an improvement.

    Laptops will never be as fast as workstations but they're fast enough to outsell any desktop. Any reason to believe GPUs are intrinsically immune to this "fast enough"?

    • NBJack a year ago

      Because the iGPUs aren't keeping pace, and many of the biggest/most popular titles continue to push the envelope of fidelity.

      Ray tracing, for example, is becoming much more prevalent, and (done correctly) adds a significant amount of both immersion and overhead to the experience. It is inherently an expensive operation for GPUs, so much so that most implementations rely on upscaling to make it viable.

      Your laptop analogy doesn't really work here; mobile chip implementations are indeed slower but can be measured as a modest percentage drop in performance relative to their desktop counterparts. Meanwhile, most iGPUs are orders of magnitude slower than many entry-level dedicated solutions, especially for sustained performance.

      • buran77 a year ago

        > push the envelope of fidelity

        Almost without exception the simple trick of turning the "vivid" dial to 11 makes people swear that's the "better" picture.

        > Your laptop analogy doesn't really work here; mobile chip implementations are indeed slower but can be measured as a modest percentage drop in performance relative to their desktop counterparts.

        That "modest" is doing a lot of heavy lifting here [0] even at the mid-range and below. Definitely doesn't fly at the higher end, the 200-400W GPUs mentioned above. The physics of powering and cooling such a system just doesn't agree well with the size constraints of a laptop or its power supply.

        Back when we had the same conversation about laptops, they were far less adequate than today. Tech evolves faster than our perception and at some point it's good enough. I'm typing this from an 8 year old laptop. Not too long ago a laptop would have been useless at that age.

        [0] At the top end the difference seems to be "a modest" 40-50% between the mobile and desktop versions of a GPU. https://jarrods.tech/wp-content/uploads/2023/07/rtx-4080-mob...

        • NBJack a year ago

          I'm focusing on the mainstream iGPUs that dominate the market, which often run in the tens of watts. Indeed, mobile versions of dedicated GPUs with desktop counterparts pull their own weight.

          Apple is a bit of an exception to this with the M series (if the marketing lived up to the hype), but globally, they only hold about 8.1% of the market.

      • klipklop a year ago

        > Because the iGPUs aren't keeping pace, and many of the biggest/most popular titles continue to push the envelope of fidelity.

        I feel like these days that is not really true. The high-end GPU market is so small that most devs don't even bother. Many games these days are optimized to run on a Steam Deck.

        Things are not at all like the late 90s, when you had to have a high-end PC to even launch a game. Almost everything works fine on a humble iGPU; 1080p at low to mid graphics settings is possible.

        Ray Tracing in most titles does not result in a better game. The hype for it has mostly died out because gamers prefer the higher frame rates.

        • earthling8118 a year ago

          A large number of my games are outright terrible to play on my Steam Deck. I have my games divided into categories based on whether they're worth playing on it or not. For more intense games where the framerate is abysmal and the audio stutters, I avoid using it. Same for games where I want the extra smoothness and visual fidelity. There are many, many games like this, including older ones.

        • ChoGGi a year ago

          > Ray Tracing in most titles does not result in a better game.

          I'm okay with precast shadows, and so on, but if the game has ray traced reflections I'll always turn those on.

          I don't usually play multiplayer games, so I'm fine with a slightly worse framerate to avoid screen space reflections.

        • hu3 a year ago

          Sorry, but do you have a source on this?

          > The high-end GPU market is so small that most devs don't even bother.

          I feel the opposite, with big AAA titles pushing realism and ever more amazing graphics, and selling millions.

          I love the Steam Deck, but good luck playing games like Cyberpunk, Black Myth: Wukong, the latest Metal Gear Solid, or Star Wars Outlaws on it with any reasonable framerate.

    • 6SixTy a year ago

      Saying anything but the Switch has an iGPU is a very literal interpretation of what an iGPU constitutes. I'd wager that fairly modest power consumption (10-30W) and (non-G) DDR memory constitute the soul of an iGPU. The PS5 and XSX do not fit those definitions, nor do the Xbox One (eSRAM) and PS4.

      People already have their own opinions about when graphics peaked, but depending on where they set that threshold, I honestly can't help but respond with "look around at the level design".

      And streaming has natural latency and quality issues that can't be resolved. Almost every game streaming service is incredibly niche, dead, or BYOG.

kkfx a year ago

Mh, so in the long run, instead of a mobo, you'd prize a desktop composed of one large integrated brick with at most an external PSU and screen? Or the other end, looking at my desktop where the Nvidia card isn't so much a card as a computer inside the computer, with its own GPU, memory, etc., and its dedicated nvtop listing processes and resource usage like [hb]top does for "the larger computer"?

Such "migrations" are a classic, we merge some components in a "package", we split a "package" in some discrete components and so on. Personally I'm more interested in the possibility of easy repair and custom component selections instead of crappy glued stuff with plastic clips an no standard to force drop an entire car just for a punctured tyre.

BTW most assembled systems today have a useless super-CPU, too little RAM, and bad storage choices, so they last 3-4 years instead of 8-10 in comfort. That's one of the biggest issues for us.

  • edelbitter a year ago

    Where is that 3-4 year number from? That sounds way lower than my personal experience (Samsung ran out of incremental 3-digit model numbers before any of my drives reached expected write endurance), or what those graphs regularly published by Backblaze suggest.

neuroelectron a year ago

Maybe they'll be replaced by GPGPUs, but I doubt they're going extinct. Upscaling is not a replacement for 4K rendering; it's a crutch for slowing advances.

mathgeek a year ago

> Sound cards and network adaptors were an integral part of custom PC builds for years, but those eventually got swept away as motherboards improved and started to integrate those features.

If audio and network technology needed to, and were able to, keep up with demand the way GPUs have, you'd see the same results. You need both the demand and the potential to be there at scale in order to drive that kind of advancement.

If we get to a world where high fidelity graphical demand is a niche, similar to audio, then I could see this argument having merit. I don’t expect that will happen in any way we could reliably predict in 2024.

  • moogly a year ago

    Well, anyone serious about computer audio today will be using an external DAC.

    eGPUs? We'll see... Maybe if they get hot enough.

    • eep_social a year ago

      Likewise, anyone serious about network performance uses a discrete NIC. These days the hardware on a high-end NIC can be used to offload anything from basic packet processing, encryption, or NAT on up to DMA, and of course it costs as much as a mid-range PC.

  • frou_dh a year ago

    Talking of HiFi, playing the oft-mentioned streaming games running in a datacentre just kinda disgusts me on principle because it's a lossy video signal. I don't want to play modern games with visual artefacts on top - that's just barbaric.

danjl a year ago

Okay, so, GPUs will go away to be replaced by AI, which... runs on GPUs? Even if the target application changes from games to AI, we still need widely parallel processors (aka GPUs) and serial processors (aka CPUs) if we want to keep growing performance.

thunderbird120 a year ago

The APU offerings from both AMD and Intel have been improving pretty rapidly recently but they're still pretty low end by dGPU standards. I can certainly see them causing the death of dGPUs in laptops but it's difficult to imagine a scenario where they're competitive with mid to high range dGPUs in desktops. I can't see either AMD or Intel trying to cram an iGPU that large onto one of their chips, it would be an extremely niche product and would be badly bandwidth starved.

effdee a year ago

Graphics cards are already dead for me (as a casual gamer). I recently bought a used notebook with a 12th Gen Intel Core processor, and it can run an impressive list of games.

  • coffeebeqn a year ago

    I’m very happy with the Steam Deck as my main gaming machine; it has integrated graphics and runs most new games great. If I were to play on a big screen I’d still go for something like a 3070 or better, with the new AI upscaling to use the 4K screen. Not looking to shell out the money for that kind of PC right now though.

Dalewyn a year ago

I've been calling it[1] for a while now; this is history rhyming with itself for anyone who's paying attention to computing history.

As unsettling as video cards going dodo might be, it definitely will be cheaper for the general consumer if integrated audio and NICs are any indication.

[1]: https://news.ycombinator.com/item?id=40236186

  • Meganet a year ago

    GPUs have been niche for a long time already.

    The big hit happened when Intel started doing this. It killed the whole category of desktop GPUs.

    Audio and NICs are very different though, and the Apple GPU integration has nothing to do with what happened to audio and NICs: Apple's customers demand GPU power for image and video editing and for the Retina display, and they pay a big price for these chips.

    Audio/NICs got optimized away into integration because compute got so much better. iGPUs and co. are not compensated for by CPU compute; they are integrated because putting them together makes it cheaper. The iGPU still has normal GPU components.

    An M* chip from Apple is HUGE and f* expensive. Unless you have deep pockets, it's a lot cheaper to build the hardware yourself with normal GPUs. Mac Studio? 6k vs. the same setup without Mac hardware: 4k or less, plus upgradable, etc.

    • Dalewyn a year ago

      >The iGPU still has normal GPU components.

      That's not a useful delimiter since integrated audio and NICs are still discrete components on the motherboard.

      • Meganet a year ago

        That's often enough not true. They are mostly just integrated.

        I know the distinction feels very thin, but come on, a GPU chip's complexity is far beyond a sound or NIC chip's, and I don't think that comparison is fair at all.

        You are not adding a network card to your desktop PC to get significantly better networking.

        And they haven't even made new sound cards since 2021.

        • Dalewyn a year ago

          What part of discrete components do you not understand?

          • Meganet a year ago

            Why so snarky?

            There are mainboards that integrate this type of stuff into their mainboard controller.

            • Dalewyn a year ago

              I would implore you to look closer at motherboards. Most if not all have discrete components for the audio, NIC, USB, et al. on them.

              The integration of those functions into the motherboard merely did away with the bulky physical connectors that take up space and complicate designs.

    • acdha a year ago

      > Apple's customers demand GPU power for image and video editing and for the Retina display, and they pay a big price for these chips.

      I know “too pricey” has been a telling point for Mac forum threads since the turn of the century but you really should check the numbers before saying things like this. The M series chips meant Apple had a multiyear period of being notably cheaper because an integrated chip saves money - the correct angle for criticism is limited customization options.

      Your pricing for the Mac Studio is high by 50% but also misses the point: that’s not competing with gaming rigs or home PCs (the $600 Mac Mini is that market), while the Mac Studio is aimed at people who need expansion options, like video editors - note how it has hardware acceleration for the ProRes codec they use, support for 8 displays, double or triple the Thunderbolt and USB ports, etc. You’re not buying that to play Call of Duty; you’re buying it to connect 8K cameras. The Mac Pro is even more of a specialist design, with its PCIe slots.

      https://www.apple.com/mac/compare/?modelList=Mac-mini-M2,Mac...

      • Meganet a year ago

        The M1 chip was a game changer. This is true, and for whatever reason a MacBook Air is at an exceptional price/value point.

        But not with a Mac Studio: you can build your 8K, all-bells-and-whistles setup for a lot less money than giving it to Apple. The difference is volume. A Mac Studio is probably 5-10x smaller.

        The point is still valid: you do pay a big price for these chips. Apple pushes you to a MacBook Pro due to RAM.

        It's not bad criticism, don't get me wrong. My company laptop is really good, but it costs 3k.

        The normal consumer market, outside of the Apple-ecosystem high-price bubble, actually starts a lot lower. You can get a normal laptop for 300 while the MacBook Air starts at 1000.

        But before the M chip, this was totally different. I would now try to convince people "if you can afford it, save up a little bit more and get yourself a MacBook Air"; I would not have said this a few years back.

        • acdha a year ago

          > The normal consumer market, outside of the Apple-ecosystem high-price bubble, actually starts a lot lower. You can get a normal laptop for 300 while the MacBook Air starts at 1000.

          That $300 “normal” laptop was worse in almost every way and had a significantly shorter service life - I still remember people making those comparisons claiming spinning metal drives were the same as SSDs. What you’re conflating is that there isn’t a single market segment but several, and Apple relies on used kit for the lower-end price points. When you compare equivalent hardware capabilities, things have been roughly even since the switch to Intel, although it got tricky toward the end of that era when Intel struggled to ship low-power parts and you really had to decide how much you valued battery life.

          • Meganet a year ago

            The problem is not that it's shit; the problem is that in our world there are a lot of people who can't afford a $1000 laptop, and Apple doesn't cater to these people at all.

            Smartphones helped here a lot, though they are not always an alternative, e.g. for a young person barely making enough to get through studying (a person who needs a keyboard).

            • acdha a year ago

              You were previously saying that you could get the same thing much cheaper, but now you’re talking about how you want something different. That’s a valid topic but conflating the two won’t help.

  • puzzlingcaptcha a year ago

    The difference being, at no point did audio and NICs require >200W of power budget to meet consumers' demands.

    Even if you scale down your expectations, iGPUs have been hampered for decades by the limited RAM bandwidth - there was no point in beefing up the GPU side if the memory couldn't keep up.

    My personal benchmark has been "can it play Witcher 3 (almost a 10 year old game by now) at 1080p/60fps on high settings" - and we are getting there only recently (https://www.notebookcheck.net/AMD-Radeon-890M-Benchmarks-and...).

    Hopefully the platforms with CUDIMM will finally deliver the bandwidth needed.

nottorp a year ago

"AI" upscaling. Are we entering a new era where all games look kinda the same because of the default "AI" settings?

Like when Unity was new and I guess everyone used the same example shaders to build on...

cthalupa a year ago

We have a whole subheader on how AI is supposedly going to kill dGPUs, but the paragraph explains nearly nothing as to why. It doesn't even begin to contemplate the idea that if AI does continue to grow in ubiquity, there's even more demand for compute that handles that sort of workload efficiently, and we know GPUs serve that purpose better than CPUs in the vast majority of cases. It even mentions that AI is being used for graphics purposes in gaming. It ignores all of this to suppose that the distant second in dGPU sales not releasing an already niche card on some arbitrary schedule is indicative of the market as a whole.

It mentions sound cards, but a lot of what killed sound cards is that the only people buying them after integrated audio became a thing were audio enthusiasts, and audio enthusiasts have moved nearly universally to external DACs via USB. They're just as specific a product, serve the same purpose, and exist across the whole price range that sound cards did and far beyond, for people who want to light enough money on fire. I'd wager the DAC market is far larger today than the sound card market was even before integrated audio became a thing. Things may have changed so it's no longer something you're slotting in as an expansion card, but it's still an additional purchase for a premium purpose. I don't see dGPUs going anywhere for the exact same reason, barring physics-defying technological advances.

jowdones a year ago

Well it seems to me that games from 10 years ago run on today's integrated GPUs just fine. But that's not the case with TODAY's games, which barely run on a dedicated GPU from a few years ago.

So until game developers completely cease to target the powerful dedicated GPUs of today and willingly limit themselves to technology 10 years in the past, we won't see the death of the dedicated GPU.

The only place where that could happen would be Soviet Russia, by decree of the central committee of the party. In a competition-based capitalist society it will happen... never. Someone will always use the latest and greatest GPU, and if you don't do it too your company falls behind. Hence: not a chance.

  • KronisLV a year ago

    > So until game developers completely cease to target the powerful dedicated GPUs of today and willingly limit themselves to technology 10 years in the past, we won't see the death of the dedicated GPU.

    I think that one of the problems is that a lot of game developers go for a baseline level of graphical quality and don't allow the users to customize the game/engine towards anything less than that, instead trying to use as much of the graphics budget as they can.

    For example, a game called Vigor recently came out (https://store.steampowered.com/app/2818260/Vigor/) and was initially optimized to run well on a Nintendo Switch, which also meant that by the time it finally hit PC it was better optimized than most recent releases! I can play it on my Intel Arc A580 with a stable framerate and essentially no stutters, which I cannot say for a lot of modern games, even if it got bad reviews for other reasons.

    Probably most current games could be played on integrated graphics if users had the ability to customize which shaders and effects are on and what resolution of textures and shadows is used, if all of the models had LOD levels that go lower with a customizable LOD bias, and if upscalers were supported in every title (DLSS, FSR, XeSS). I'm not saying that integrated graphics would always get a smooth 60 FPS, but 30 might be doable.

hdjjhhvvhga a year ago

> Sound cards and network adaptors were an integral part of custom PC builds for years, but those eventually got swept away as motherboards improved and started to integrate those features.

Yeah, for better or worse.

  • maximus-decimus a year ago

    I really liked it when my previous motherboard was randomly sending screeching sounds into my ears. I ended up buying a ~100 CAD FiiO E10K which I'm still using 6 years later.

    So really I still have a "sound card", it just became a USB dongle.

Kalanos a year ago

How will they handle heat?

fithisux a year ago

RISC-V has a proposal for this.
