Comments by "John Smith" (@JohnSmith-op7ls) on the "Based Camp with Simone & Malcolm Collins" channel.
We’ve already strung together various specialized models, and “AGI”, whatever that even means beyond a buzzword, didn’t happen.
ANNs really do not work like BNNs. Even one of the two people who coined the term “neural network” regretted that name for the inapt comparison it draws to how the human mind actually works, which is far from a solved problem. ANNs are a highly crude attempt at emulating vaguely similar training and output capabilities. They aren’t simulating BNNs; we don’t even have systems close to having the capacity to do that, even at the level of our limited understanding of how BNNs actually work.
Just like you can’t strap together a bunch of steam engines and get a warp drive, a bunch of LLMs working together doesn’t magically become human-level intelligence.
You need a fundamentally different architecture, and leading researchers agree on this. LLMs have their uses, but they are near their limit. You can squeeze more accuracy and capability out of them, but that’s just adding more blades to a Swiss Army knife: more specialized, but ultimately still a bigger dumb tool without awareness, without an internal world model and the ability to make accurate predictions from it, without the ability to learn in real time, and so on.
We haven’t seen anything emerge from the growing complexity of these models that goes in that direction, despite the BS marketing hype of Sam Altman and some lunatic Google dev who made nonsense claims and got suspended for it.