Comments by "Christian Baune" (@programaths) on "Why is Math Hard? - A Meta-Mathematics Perspective | Stephen Wolfram and Lex Fridman" video.
-
Even with simple problems, a friend is puzzled by how I know to add something or assume something in order to make progress. When he looks at it, it seems like I just know.
The reality is that at some point, I stop and ask: "What would be convenient?" or "What am I missing here?"
Often, it means I need to add or drop a constraint, or add something I already know.
That part is the hidden path and most people can't see it.
It's a close cousin of decomposition, where you assume some part of the problem is already solved. This works very well in computing/programming.
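A tiny Python sketch of that habit (all the names here are invented purely for illustration): write the top level as if the hard subproblem were already solved, and stub that part until you actually get to it.

```python
# Sketch of "assume it's already solved": the top-level logic is written first,
# pretending the hard part (estimate_duration) already exists.

def schedule_jobs(jobs):
    """Top-level routine, written as if estimate_duration() were already done."""
    return sorted(jobs, key=estimate_duration)

def estimate_duration(job):
    """The 'assumed solved' part; a naive stand-in until the real logic exists."""
    return len(job.get("steps", []))

if __name__ == "__main__":
    jobs = [{"name": "build", "steps": [1, 2, 3]}, {"name": "lint", "steps": [1]}]
    print([j["name"] for j in schedule_jobs(jobs)])  # -> ['lint', 'build']
```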
It also means that you should gather your thoughts at the same abstraction level. Changing the level of abstraction has a cost!
As an example, I had to deal with real-time software and high concurrency. I used high-level primitives first, then implemented them.
This led to a rock-solid solution that was FAST.
The pre-existing solution was "optimized" (i.e. it avoided "unnecessary" abstractions), but it was slower and full of bugs (almost every corner case accounted for one).
On top of that, that first solution took 3 years; the second took only 8 days of work spread across 2 months.
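To give a flavour of the "high-level primitives first" approach, here is a rough Python sketch (not the actual code; the bounded queue is just an assumed illustrative primitive): the application code only ever sees put()/get(), and the locking details live in one place where they can be tuned or replaced later.

```python
import threading

class BoundedQueue:
    """A small blocking bounded queue built on one lock and two condition variables."""
    def __init__(self, capacity):
        self._items = []
        self._capacity = capacity
        self._lock = threading.Lock()
        self._not_full = threading.Condition(self._lock)
        self._not_empty = threading.Condition(self._lock)

    def put(self, item):
        with self._not_full:
            while len(self._items) >= self._capacity:
                self._not_full.wait()          # block while the queue is full
            self._items.append(item)
            self._not_empty.notify()           # wake a waiting consumer

    def get(self):
        with self._not_empty:
            while not self._items:
                self._not_empty.wait()         # block while the queue is empty
            item = self._items.pop(0)
            self._not_full.notify()            # wake a waiting producer
            return item

def producer(q, n):
    for i in range(n):
        q.put(i)
    q.put(None)                                # sentinel: no more work

def consumer(q, results):
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item * 2)

if __name__ == "__main__":
    q = BoundedQueue(capacity=4)
    results = []
    t1 = threading.Thread(target=producer, args=(q, 10))
    t2 = threading.Thread(target=consumer, args=(q, results))
    t1.start(); t2.start(); t1.join(); t2.join()
    print(results)
```

The point is that producer() and consumer() never touch a lock: they stay at one abstraction level, and all the concurrency reasoning is concentrated in the primitive.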
Mathematicians (and good software programmers) can go very far by using proper abstractions and refraining from unnecessarily switching to another level of abstraction.
It's like doing all your calculations in symbolic form before evaluating.
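In code terms, with SymPy (purely an illustration of the habit, nothing from the project above): keep the expression symbolic as long as possible and only substitute numbers at the very end.

```python
import sympy as sp

x = sp.symbols('x')

expr = (x**2 - 1) / (x - 1)      # naive numeric evaluation at x = 1 would blow up
simplified = sp.simplify(expr)   # symbolically this is just x + 1
print(simplified)                # -> x + 1
print(simplified.subs(x, 1))     # -> 2, no division by zero anywhere
```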
Lastly, there is some political correctness around this, but mathematics (and programming) require a brain; it's not everyone's forte, and that's fine.