Comments by "Traveller" (@traveller23e) on the "Fireship" channel.
@voidvector Ironic that you mention Java, since it still doesn't have generics working properly in a lot of its more basic classes (methods just return Object and the like, so you lose the type checking). Also, in Java and C#, things like lambda expressions keep the languages from being purely OOP. However, I would posit that this is not a bad thing; pure OOP is 1) a concept revered by programming philosophers without much intrinsic merit for the working programmer, and 2) a concept people claim to strive for yet never really seem able to pin down in concrete terms, sort of like Clean Code or Agile. (Note to clean coders: it's not that I dislike the concept, just that it offers no objective measurability and, when unchecked, can lead to voodoo programming.) One thing I'd find interesting would be a language with fully fledged support for multiple paradigms, but with compiler directives forcing you to explicitly label sections as hybrid OOP/procedural, functional, or whatever. It would be interesting to see how people react to having to make an explicit decision like that.
Actually, I'd propose adding this feature to PowerShell, as it could only make it better.
-
A note about "a value cannot be null, but you can make it nullable by adding a question mark":
The short answer is that this is inaccurate. The fuller explanation is that there are two kinds of data types: reference types (defined by classes and interfaces, e.g. string) and value types (defined by structs, e.g. int). Reference types can be null; value types cannot.
Therefore, if you declare a field of a value type (as in "SomeType fieldName;") it gets initialized to the type's default value, whereas a reference-type field gets initialized to null (local variables, by contrast, must be assigned before use). If you need a nullable value type, that's actually a fundamentally different type, written with a question mark at the end: int? can be null, but int cannot. If you need to compare the two, a cast is involved (if I recall correctly, the int gets implicitly converted to int? prior to comparison). There are also other important differences between value and reference types, so if you're new to the language, be sure to learn them at some point.
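A quick sketch of what that looks like (the names are mine, just for illustration):

    class NullabilityDemo
    {
        static void Main()
        {
            int plain = default;    // value type: defaults to 0, can never hold null
            int? maybe = null;      // Nullable<int>: a genuinely different type that can

            // Comparing the two works because 'plain' is implicitly lifted to int?.
            System.Console.WriteLine(plain == maybe);   // False, since 'maybe' is null

            // Getting the value back out of an int? requires a check
            // (or an explicit cast, which throws if it's null).
            if (maybe.HasValue)
            {
                System.Console.WriteLine(maybe.Value);
            }
        }
    }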
However, for some reason the designers of C# decided this was confusing to newcomers [citation needed], so in recent versions of the language there's a warning by default (hence the yellow squiggle, not red) whenever the compiler detects that a reference-type value not explicitly marked nullable (with the question mark) could become null at some point during execution. This has the advantage that you don't have to worry about null checking if a value isn't marked as nullable, assuming your dev team does a good job of paying attention to warnings and resolving them. The code base I'm currently working on has something like 4700 warnings, if memory serves (it's also on an older version of C#, from before all this non-nullable reference type stuff started). However, there are a few disadvantages to this behavior too.
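For the curious, the warnings look roughly like this (a minimal sketch; the diagnostic numbers are from memory):

    #nullable enable    // or <Nullable>enable</Nullable> in the .csproj

    string definitely = null;    // CS8600: converting null to a non-nullable type
    string? maybe = null;        // fine: explicitly marked as nullable

    int length = maybe.Length;   // CS8602: dereference of a possibly null reference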
One is that you have to explicitly mark the type if you ever plan on setting the value to null, a frequent thing to want to do. Additionally, if you're trying to work only with non-nullable variables, it can make some patterns more cumbersome, like the declare-now-assign-in-every-branch shape sketched below. One could also argue that it confuses people at a fundamental level about the differences between classes and structs, sort of like how in Java the line between abstract class and interface got muddied when default implementations in interfaces were introduced. And then you get people like me, who just hate it for no solid reason other than "this isn't how I learned it!" and try to justify their hatred with other arguments.
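Here's that cumbersome pattern fleshed out (hypothetical names, of course):

    class Example
    {
        static string DoWork(bool something)
        {
            // Can't write "string returnResult = null;" as a placeholder without
            // changing the type to string?, so every branch must assign it instead.
            string returnResult;
            if (something)
            {
                DoStuff();
                returnResult = "someValue";
            }
            else
            {
                returnResult = "someOtherValue";
            }
            System.Console.WriteLine($"Returning {returnResult}!");
            return returnResult;
        }

        static void DoStuff() { /* stand-in for real work */ }
    }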
-
I'm a fairly junior programmer, and the stuff I've gotten to work on has been so boring compared to my personal projects. It's not that I don't care about the programs; the requirements are interesting. But guys, not everything needs to be written exactly the same way. Seriously: everything seems to be done with classes, never structs (even where value-type semantics would be better); it's like people are scared of using a class directly rather than an interface for that class (the other day I was asked in peer review why I'd defined the return type of a method as List<T> rather than IEnumerable<T>, when the method created the new list internally 🙄); abstract classes seem to be a largely forgotten concept; not to mention the "clean coders" who will spend half an hour of review time debating something trivial but, when asked about the performance implications of writing a LINQ statement one way versus another, can't say anything on the subject. I'm trying to switch from C# to C for my next job, in the hope that the C crowd has a more performance-oriented perspective. Unfortunately, most of the openings are for C# and Java web services and the like.
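For context, the review nitpick was about something like this (hypothetical method and names):

    using System.Collections.Generic;

    class ReportBuilder
    {
        // The method allocates and owns the list, so returning List<T> costs
        // nothing in flexibility and lets callers use Count, indexing, Add, etc.
        static List<string> BuildNames(IEnumerable<int> ids)
        {
            var names = new List<string>();
            foreach (var id in ids)
            {
                names.Add($"user-{id}");
            }
            return names;
        }

        // The reviewer's preferred signature: same body, weaker return type.
        // Callers lose Count/indexing and often just call .ToList() again,
        // re-allocating a list that already existed.
        static IEnumerable<string> BuildNamesWeakened(IEnumerable<int> ids)
        {
            return BuildNames(ids);
        }
    }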