Comments by "Luiz Devil" (@luizdevil6855) on "Moore’s Law is So Back." video.
-
That's because AI is highly parallel, and Amdahl's law doesn't seem to affect it. Just keep adding more and more GPUs: the algorithms used for systolic-array computing are highly parallel, almost 100% parallelizable.
Machine learning is just a perfect fit for systolic-array computing.
But I bet somewhere in the GPT architecture they're going to hit a serial bottleneck in information processing. Maybe that's what the famous scaling-law formula ("Scaling Laws for Neural Language Models") is really saying: it's just a law of parallelized distributed computing (dammit Sam Altman marketing, I hate marketing).
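For anyone who hasn't seen it, the Amdahl's law point above can be sketched in a few lines: with a parallelizable fraction p of the work and n processors, speedup is 1 / ((1 - p) + p/n), so even a tiny serial fraction eventually caps the payoff from more GPUs. A minimal illustrative sketch (the numbers are my own, not from the video):

```python
# Amdahl's law: speedup from n processors when a fraction p of the
# work is parallelizable. Near-100% parallel (p = 0.999) keeps scaling
# for a long time, but the serial 0.1% still caps it at 1000x.
def amdahl_speedup(p: float, n: int) -> float:
    """Speedup = 1 / ((1 - p) + p / n)."""
    return 1.0 / ((1.0 - p) + p / n)

for n in (1, 100, 10_000):
    print(f"p=0.999, n={n:>6}: speedup = {amdahl_speedup(0.999, n):.1f}x")
# No matter how many GPUs: speedup can never exceed 1 / (1 - p) = 1000x.
```

That hard 1/(1 - p) ceiling is exactly the "serial bottleneck" the comment is betting on.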
Also, all NP problems are actually easy.
N(1) -> 1 GPU
N(100) -> 100000 GPUs
N(1000) -> 100000000000 GPUs
N(10000) -> 10000000000000000000 GPUs
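The joke above is roughly how brute force on NP problems actually behaves: checking every subset of n items costs 2^n checks, so doubling your GPU fleet buys you exactly one more item. A minimal sketch using brute-force subset-sum as a stand-in NP problem (my own illustrative example, not from the video):

```python
# Brute-force subset-sum: try all 2**n subsets of nums. Each extra
# element doubles the work, so "just buy more GPUs" loses to the exponent.
from itertools import combinations

def subset_sum(nums: list[int], target: int) -> bool:
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return True
    return False

print(subset_sum([3, 34, 4, 12, 5, 2], 9))  # True (4 + 5)
print(2 ** 100)  # subsets to check at n = 100: no GPU budget covers this
```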
Just keep buying more GPUs with infinite VC money, because you slap "AI" on anything and you get free money for GPUs.
I'm going to buy a house: found an "oneself home-affordance AI company", get money to buy GPUs, use it to buy a house to put the GPUs inside. See, I solved homelessness (at least for N=1).
Computer science theory is a scam.