Comments by "Siana Gearz" (@SianaGearz) on "How Nvidia Won Graphics Cards" video.
-
I don't see GPUs becoming obsolete in the near future.
What happened with soundcards is that they grew ever more complex: synthesizers, gaming 3D audio, all that stuff. Actual processing power. But then CPUs outran them, and it made more sense to just integrate a simple dumb sound chip on the mainboard. By the way, Intel killed the HD Audio interface (I2S multichannel + I2C command) that was used previously; now onboard audio is USB. I actually find that exciting, maybe we'll see Crystal (Cirrus Logic) or Burr-Brown return. Crystal is gearing up for sure.
Now, CPU progress is slowing down, but GPUs are still trending up in performance, growing with software (game) requirements into monsters which far outstrip the power budget of the CPU package. The memory interface of the CPU is unfortunate for them too. For sure I see a possible distant future where everything is an SoC, but not yet, not in the next 5 years. Anyone who says they can see further than that into the future is probably lying.
Now the main GPU uses besides games are crypto and AI, and those are of course just waiting for ASICs to catch up.
But indeed, as far as standard office computers and laptops go, there's no need for a dedicated GPU; though it's been that way for decades.
-
@simplemechanics246 Different video delivery method. Android devices only have H264 hardware, so YouTube ships only H264 video to Android and other mobile-like devices. But on PC, they know you have a bigger CPU, so you get VP9 and AV1. VP9 in particular is Google property, and GPU vendors have shown some resistance to adopting it; AV1 also has hardware compatibility issues. So you often get software-only decoding; CPUs are slow at this, and shovelling that much video data to the GPU is hard. Then there are differences in the surrounding software: a webpage with control overlays vs. an optimised app. Google controls, understands and optimises the whole Android ecosystem, but on PC things are "good enough" and left at that, so a lot of sloppy craftsmanship. As for other video services, there's again web overhead, and DRM overhead is much higher on an open platform like the PC; on Android they can just check that the device is locked down.
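The codec-selection behaviour described above can be sketched roughly as follows. This is an illustrative model of the logic, not YouTube's actual delivery code; the function name, codec list and capability sets are all hypothetical.

```python
# Codecs ordered from most to least bandwidth-efficient.
# (Names and ordering are illustrative assumptions.)
CODEC_PREFERENCE = ["av1", "vp9", "h264"]

def pick_codec(hw_decoders, allow_software=False):
    """Pick a codec for a client.

    hw_decoders: set of codecs the device can decode in hardware.
    allow_software: whether CPU (software) decoding is acceptable,
    e.g. on a PC-class machine with a big CPU.
    """
    if allow_software:
        # PC-class client: a big CPU can software-decode anything,
        # so serve the most bandwidth-efficient codec outright.
        return CODEC_PREFERENCE[0]
    # Mobile-like client: stick to what decodes in hardware.
    for codec in CODEC_PREFERENCE:
        if codec in hw_decoders:
            return codec
    return "h264"  # universal baseline

# A mobile device with only H.264 hardware gets H.264;
# a PC is served the newer codec even without hardware support for it.
print(pick_codec({"h264"}))                      # h264
print(pick_codec({"h264", "vp9"}))               # vp9
print(pick_codec({"h264"}, allow_software=True)) # av1
```

The key asymmetry the comment points at is in the `allow_software` branch: the server trades the PC's CPU time for bandwidth, which is exactly why PCs end up doing expensive software decodes that phones never see.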
Console APUs are scary at around 300W; you can't simply put that in a PC, you need to design the computer around something that powerful. Of course it's a matter of time till this performance fits into a 125W envelope, at which point it very much can be shipped in a PC, which is why nobody is in a hurry to design a PC platform that can carry a 300W APU. It would probably take more than 3 years, and it's not certain it'll happen at all.
-
@mariusvanc That was not true when NVidia was formed and rose to power. This is an NVidia story, not a GPU story.
Well, there was an NVidia GPU in the original Xbox, and it was a decent GPU stuck in a sub-par system, so it's difficult to say whether it or the GeForce 3 was actually better; on paper the Xbox one was better, but it was probably bus-choked. But of course the Ti 4200 came out for PC a few months later, and then it was no contest.
The second time an NVidia GPU was in a console was the PS3, and it was not good, just not good at all; the only saving grace is that with the Cell it didn't have to do quite as much work, as a lot of T&L and post-processing could be moved to the SPUs. ATI's chip that found its way into the Xbox 360, on the other hand, was truly next-gen stuff at the time of release, pretty exciting. But then the GeForce 8800 came out and wiped the floor with it.