Comments by "Peter Lund" (@peterfireflylund) on the "george hotz archive" channel.
@Zoronoa01 he is absolutely delusional. He doesn't really know that much of the floating-point cost is in handling the mantissa/significand, he doesn't understand how IEEE fp rounding works, and he doesn't understand how the add in multiply-accumulate fits nicely into the gigantic addition of all the partial sums. Maybe he understands redundant representations (a simple version of which is used in two-input carry-save adders), but I don't think so. They are the trick to making multipliers go fast because they allow all the monster additions to happen without caring about carries until the very end. He is also wrong about VLIW and static scheduling, at least for most machine learning work. Nobody has made VLIW or static scheduling work well for general-purpose code, and they probably never will -- but they work fine for code with highly predictable memory access patterns, which is true of practically all current machine learning. The guy is pretty smart, though, so he'll probably figure it out soon if he continues with the Cherry project.
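A minimal sketch of the redundant-representation / carry-save trick mentioned above (my own toy Python illustration, not anything from the stream or from the Cherry project): the running total is kept as a (sum, carry) pair, each new operand is folded in with a 3:2 compressor that touches every bit position independently, and the only carry-propagating add happens once at the very end.

def csa(a, b, c):
    # 3:2 compressor: a + b + c == s + carry, computed with only
    # per-bit operations, so no carry ever ripples across the word.
    s = a ^ b ^ c
    carry = ((a & b) | (a & c) | (b & c)) << 1
    return s, carry

def carry_save_sum(values):
    # Keep the running total in redundant form as a (sum, carry) pair
    # and fold each new non-negative operand in with a 3:2 step.
    s, c = 0, 0
    for v in values:
        s, c = csa(s, c, v)
    # The only carry-propagating ("slow") addition happens once, here.
    return s + c

vals = [13, 7, 250, 1023, 42, 9999]
assert carry_save_sum(vals) == sum(vals)

A hardware multiplier does essentially the same thing with its partial products: a tree of 3:2 compressors in redundant form, then a single fast adder at the end.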
Verilog and VHDL are 100% programming! Yes, they're different... but so are Haskell and Prolog.
Start programming. Doesn't really matter what or how. Then seek feedback. Then do better. Repeat. Hotz has been doing exactly that since he was a kid... Arduino is fine. You can take a look at the machine code the compiler generates. It's Atmel AVR for the original Arduinos and ARM for the newer 32-bit boards.
It's not a new idea. It has been tried again and again for at least 30 years. It has gone nowhere. It turns out to be really hard to do proper analog stuff at scale. FPAAs exist (they are the analog version of FPGAs), for example, but it is almost always better to use an FPGA (or a microcontroller) with ADC/DAC converters. https://en.wikipedia.org/wiki/Physical_neural_network https://ieeexplore.ieee.org/document/18590 Yeah, analog "neurons" in the '60s. Chips in 1989 (or perhaps even earlier). So what's different now? I agree that it sounds like the right thing to do -- but is it actually? How do I implement ResNet or any other net with residual connections (aka skip connections)? How do I implement transformers? How do I initialize the neurons properly? That turns out to be very important, especially for deep nets. How do I implement normalization layers? How do I implement GANs?
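To make the residual-connection question concrete, here is what a skip connection computes; a toy numpy sketch of my own (purely illustrative, not tied to any particular analog proposal): the block's output is its input plus a learned transform of that input, so the hardware has to route x around the block and sum it back in.

import numpy as np

rng = np.random.default_rng(0)

def residual_block(x, w1, b1, w2, b2):
    # F(x) = dense -> ReLU -> dense; the block returns x + F(x).
    h = np.maximum(x @ w1 + b1, 0.0)
    return x + (h @ w2 + b2)          # the skip connection

d = 8
x  = rng.normal(size=(1, d))
w1 = rng.normal(size=(d, d)) * 0.1
b1 = np.zeros(d)
w2 = rng.normal(size=(d, d)) * 0.1
b2 = np.zeros(d)
print(residual_block(x, w1, b1, w2, b2).shape)   # (1, 8)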
No, but he has wised up since he began the endeavour a couple of streams ago. He now understands why in-order VLIW isn't such a bad idea for this problem. Sure, he likes to call them "LIW", but they are still VLIW. He'll probably come around to systolic arrays, too.
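For anyone unfamiliar with systolic arrays: the idea is a grid of multiply-accumulate cells where operands march one cell per cycle and each cell only ever talks to its neighbours. A toy cycle-by-cycle Python simulation of an output-stationary array (my own illustration; the function name and structure are made up, not taken from the stream or from Cherry):

import numpy as np

def systolic_matmul(A, B):
    # Output-stationary array: PE (i, j) owns C[i][j]. Row i of A enters
    # from the left skewed by i cycles, column j of B enters from the top
    # skewed by j cycles. Each cycle, every PE multiplies the two values
    # passing through it, accumulates, and forwards A right and B down.
    m, k = A.shape
    _, n = B.shape
    C = np.zeros((m, n))
    a_reg = np.zeros((m, n))   # A value currently held in each PE
    b_reg = np.zeros((m, n))   # B value currently held in each PE
    for t in range(k + m + n - 2):
        # Sweep bottom-right to top-left so neighbours still hold last
        # cycle's values when we read them.
        for i in reversed(range(m)):
            for j in reversed(range(n)):
                a_in = a_reg[i][j - 1] if j > 0 else (A[i][t - i] if 0 <= t - i < k else 0.0)
                b_in = b_reg[i - 1][j] if i > 0 else (B[t - j][j] if 0 <= t - j < k else 0.0)
                C[i][j] += a_in * b_in
                a_reg[i][j] = a_in
                b_reg[i][j] = b_in
    return C

A = np.arange(12.0).reshape(3, 4)
B = np.arange(8.0).reshape(4, 2)
assert np.allclose(systolic_matmul(A, B), A @ B)

The appeal for machine learning is exactly the predictability mentioned above: the data movement is fixed by the array's geometry, so there is nothing for a dynamic scheduler to do.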
No, it was just the song.