Comments by "Vitaly L" (@vitalyl1327) on "ThePrimeTime" channel.

  1. @thebluriam  These days most systems are very complex and contain multiple parts, some software, some purely hardware, and there are very few tools available for simulating such systems. Try to find a decent mixed-signal simulator that will simultaneously let you debug software running on an MCU and debug how an analog circuit responds to that software's behaviour, all in properly simulated time.

     So, until we have such simulators, the only real way to debug such systems is to run them physically, in real time, and collect as much data as you can while they run: pass the trace data through available pins if you have any, even blink LEDs and record slow-motion video (I did it a few times, it was quite fun), use analog channels to log more data... What is not possible in such scenarios is to pause the system at any moment you like and inspect it with a debugger.

     And these are the systems this world runs on: dozens to hundreds of MCUs in any modern car, MCUs running the lift in your building, MCUs in the medical equipment in your hospital, etc. It means that if we want to sustain the very foundations of our civilisation, we should not train the programmers who might eventually end up supporting such systems with an emphasis on interactive debugging. Much better to teach everyone debugging the hard way first, and only then tell them that there is such a thing as a debugger, which can be handy if your system is not time-sensitive and all the usual debugging methods have failed. Not the other way around.

     So, my point is: the hard methods should always be the default, with interactive debugging only as a last resort. We'll have better developers this way.
     (6 likes)
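
The pin-and-LED tracing described in the comment can be made concrete with a short sketch. Below is a minimal, host-runnable Rust illustration (not from the original comment) of bit-banging a trace byte out of a single pin as timed pulses, framed by a start marker so a logic analyser, or a slow-motion video of an LED, can find byte boundaries. The `gpio_write` function is a hypothetical stand-in for a platform-specific register write; here it just prints, so the sketch runs anywhere.

```rust
use std::thread::sleep;
use std::time::Duration;

// Hypothetical stand-in for a platform-specific HAL call (e.g. writing a
// GPIO register on an MCU). Printing instead lets the sketch run on a host.
fn gpio_write(pin: u8, level: bool) {
    println!("pin {pin} -> {}", if level { "HIGH" } else { "LOW" });
}

/// Bit-bang one trace byte out of a single pin, MSB first.
/// A long start pulse frames the byte so a logic analyser (or a
/// slow-motion video of an LED) can find where each byte begins.
fn trace_byte(pin: u8, value: u8, bit_time: Duration) {
    // Start marker: one double-length high pulse, then a gap.
    gpio_write(pin, true);
    sleep(bit_time * 2);
    gpio_write(pin, false);
    sleep(bit_time);

    // Emit the eight data bits, most significant first.
    for i in (0..8).rev() {
        let bit = ((value >> i) & 1) == 1;
        gpio_write(pin, bit);
        sleep(bit_time);
    }
    gpio_write(pin, false); // idle low between bytes
    sleep(bit_time);
}

fn main() {
    // Emit a checkpoint ID; 50 ms per bit is slow enough to film an LED.
    trace_byte(13, 0xA5, Duration::from_millis(50));
}
```

The long start pulse is what makes the stream self-framing: even if the camera or analyser misses a byte, it can resynchronise on the next marker.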
  2. @YMAS11  Languages can be classified along many dimensions, and the choice of which dimensions matter is somewhat arbitrary.

     One dimension is the level of abstraction. It is the best-known classification, yet most people still get it wrong. On this axis, languages go from low level to high level, where low level means their operational semantics is close to some perceived view of the real hardware (talking about the real hardware makes no sense due to its massive complexity, so it is some abstract mental model, some RISC-y machine code). From this extreme, languages move to higher levels, with operational semantics more and more steps removed from the small-step semantics of machine code. C, Java and Python are all very close to the low-level end of this axis: they have very explicit control flow, mostly explicit memory handling, an explicit order of execution, and they all use the same structured programming for expressing this low-level control flow. The higher you go up the ladder of abstraction, the less obvious control flow becomes, and it can be entirely undefined for very high-level languages, which may have no tools for explicit control flow whatsoever; SQL and Datalog are common examples.

     Some languages let you cheat and place themselves anywhere on this abstraction axis. These are the meta-languages, with proper macro metaprogramming capabilities that let you add constructs with arbitrarily complex semantics to the language and turn the host language into any other language you can imagine. Rust belongs to this group, as it provides procedural macros that can turn simple low-level Rust into, say, a very high-level, optimised SQL.

     There are many other dimensions for classification, type systems being among the most common. All of the common low-level languages either use a very simple ad hoc type propagation with very loosely defined subtyping, or have entirely dynamic typing. More complex type systems - the Hindley-Milner typing of the ML family and of Miranda, Haskell and the like, System F typing, the dependent typing of Agda, Coq and similar - do not fit well into the low-level, explicit-control-flow, structured programming model of the common languages.

     Another dimension, which I decline to consider important, is the typical way a language is implemented: natively compiled; natively compiled but with a complex runtime and managed memory; JIT-compiled from some intermediate representation (such as CLR or JVM bytecode); bytecode-interpreted, like Python or Perl. All such details are immaterial, and it has been shown many times how easily languages can be implemented on top of any of these models regardless of the language's other qualities - see QuakeC, PyPy, the multiple Java AOT implementations, etc.

     As for algotrading - well, it exists, it makes money, it pays really well... What else can I say? I'm also grateful to it for driving higher-end FPGA prices down due to growing demand.
     (5 likes)
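
The claim about meta-languages is easiest to see with a small example. A faithful version would use a procedural macro in its own crate, as the comment describes; the self-contained sketch below (a toy of my own construction, not the commenter's code) uses a declarative `macro_rules!` macro instead, but shows the same principle: the host language grows a query-like construct whose surface syntax is rewritten into ordinary low-level iterator code at compile time.

```rust
// A toy query construct: the macro rewrites declarative, SQL-ish syntax
// into a plain filter/map pipeline at compile time. The host language
// gains a construct with its own, higher-level semantics.
macro_rules! select {
    (from $src:expr, where $pred:expr, take $proj:expr) => {
        $src.iter().filter($pred).map($proj).collect::<Vec<_>>()
    };
}

fn main() {
    let prices = vec![(1u32, 9.5f64), (2, 120.0), (3, 42.0)];

    // Reads like a query, compiles to ordinary iterator code.
    let expensive = select!(
        from prices,
        where |&&(_, p)| p > 40.0,
        take |&(id, p)| (id, p)
    );

    println!("{expensive:?}"); // [(2, 120.0), (3, 42.0)]
}
```

A real implementation of the kind the comment mentions would parse genuine SQL inside a procedural macro and emit optimised Rust; the toy differs only in scale, not in principle.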