Since at least 2010 we’ve had laptops with GPUs integrated into the chipset. These iGPUs have historically been very lacking (extremely so: a Tiger Lake i5-11400H, which is quite a powerful CPU, wouldn’t reach 60 fps in CS:GO at 1080p on its iGPU). AMD SoCs fared better in that respect, but are still very lacking, even in their most recent RDNA 3-based iterations, due to the poor bandwidth these laptops usually ship with (at best dual-channel DDR5, but mostly dual-channel DDR4). As such, dedicated GPUs with their own GDDR6 and big dies have been necessary for both laptops and desktops whenever performance is a requirement, and low-end dedicated GPUs have been the choice of manufacturers that want slim, performant laptops with decent battery life.

At the same time, there have been four important milestones for the APU market:

  1. In 2010 the Xbox 360 S shifted from a dedicated GPU + CPU combo to a single chip combining both on the same die. The PS3 kept the usual architecture of separate GPU and CPU.
  2. Both Sony and Microsoft released the PS4 and Xbox One (and their future successors) with an APU combining both. The Xbox One is fed with DDR3 RAM (on a 256-bit bus) plus a small, fast ESRAM, and it seems bandwidth was a huge problem for it and part of the reason it performed worse than the PS4.
  3. Apple released the Apple silicon MacBooks, shipping powerful GPUs inside the laptops on a single die. Powerful at the expense of being extremely big chips (see the M2 Max and Ultra), and maybe not as fast as a mobile RTX 3070 in most cases, but still quite powerful (and pricey, though I wonder whether that’s because of Apple or because APUs are, for the same performance level, more expensive; we’ll get to that).
  4. The Steam Deck is released, featuring a 4-core/8-thread Zen 2 CPU + RDNA 2 GPU fed by quad-channel (32-bit per channel) LPDDR5 at 5500 MT/s, totalling 88 GB/s.

Now, for price-sensitive products (such as the Steam Deck, or the other game consoles), APUs seem to be the way to go. You can even make powerful ones, as long as they have enough bandwidth. It seems clear to me that APUs provide much better bang for the buck for manufacturers and consumers, as long as they’re paired with a decent memory architecture. I understand desktop PCs care about upgradeability and modularity, but why aren’t gaming APUs a thing in laptops or cheap OEM gaming desktops? With 16 GB of quad-channel DDR5 or even GDDR6, those machines would compete really well against game consoles, while avoiding all the duplicated costs incurred when pairing a laptop with a dGPU. And in the end, laptops already have custom motherboards, so what’s the issue at all? What are the reasons why even cheap gaming laptops pick RTX 3050s instead of getting some love from AMD?

Bonus question: How come the Xbox One’s DDR3 RAM at 1066 MHz (DDR3-2133) reaches 68.3 GB/s, while the Steam Deck, with much newer 5500 MT/s RAM and quad-channel, provides just 88 GB/s?
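For what it’s worth, the arithmetic mostly comes down to bus width: the Xbox One’s DDR3 sits on a 256-bit bus, while the Steam Deck’s four LPDDR5 channels are 32 bits each, 128 bits total. A quick sketch (bus widths taken from the published specs; the helper name is mine):

```python
def bandwidth_gb_s(bus_width_bits: int, transfers_mt_s: float) -> float:
    """Peak bandwidth = (bus width in bytes) x (transfers per second)."""
    return bus_width_bits / 8 * transfers_mt_s * 1e6 / 1e9

# Xbox One: DDR3-2133 (1066 MHz, double data rate) on a 256-bit bus
print(bandwidth_gb_s(256, 2133))  # ~68.3 GB/s

# Steam Deck: LPDDR5-5500 on four 32-bit channels (128-bit bus)
print(bandwidth_gb_s(128, 5500))  # 88.0 GB/s
```

So the Deck’s per-transfer rate is about 2.6x higher, but its bus is half as wide, hence only about 1.3x the total bandwidth.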

  • opelit@alien.topB

    I mean, comparing an entry-level gaming laptop to a premium non-gaming laptop on price and performance is WTF?

    A cheap non-gaming laptop with a top-tier APU like the 5800H/6800HS is going to be $600–800. Sure, it will still be weaker than a laptop with a dGPU, but price is not the reason. AMD and Intel simply don’t make chips with a powerful iGPU because it doesn’t make sense: too little memory bandwidth, and monolithic dies cost way more the bigger they are.
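    To put rough numbers on that bandwidth gap (the RTX 3050 laptop figures follow its published 128-bit, 14 Gbps GDDR6 configuration; the iGPU figure assumes dual-channel DDR5-6400):

```python
# Peak bandwidth = bus width (bytes) x per-pin transfer rate (GT/s or Gbps)
igpu_ddr5 = 128 / 8 * 6.4   # dual-channel DDR5-6400: 102.4 GB/s, shared with the CPU
rtx3050 = 128 / 8 * 14.0    # 128-bit GDDR6 at 14 Gbps: 224.0 GB/s, all for the GPU
print(igpu_ddr5, rtx3050)
```

    And the iGPU has to share that 102.4 GB/s with the CPU cores, so the effective gap is even larger.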

    People might think: so why don’t they use GDDR6 like the consoles? Well, because GDDR6 provides a lot of bandwidth but has much worse latency, which makes the CPU part slower.

    The ultimate move would be to use 64-bit (i.e. single-channel) LP/DDR5 for the CPU and then also put in a GDDR6 memory controller for the GPU. No need to unify memory (we’re still not there; shared memory is a different thing).

    That would make the best of both worlds. Though it would only work on laptops, where everything is soldered down. On desktop PCs it would be hell: either the motherboard would include soldered-down GDDR6 even when the installed chip doesn’t use it, which doesn’t make sense, or there would be motherboard variants (with 4/8 GB of GDDR6, or with none), which is hell for customers shopping for motherboards.