Comments by "Seegal Galguntijak" (@Seegalgalguntijak) on "Rob Braxman Tech"
channel.
-
Thanks for this comprehensible explanation! I've been using Linux on the desktop since 2006, and there were maybe two or three times when I had to run a compiler, and none past, say, 2015.

Here's my history with desktop Linux: I started with Debian Etch (and, in parallel, Kubuntu 6.06 for a short time - back in the KDE 3.5 days I was a KDE fan), then Lenny once Etch became stable and Testing was frozen feature-wise. Then at some point KDE 4 arrived, so I switched to Gnome 2 and with that to Ubuntu - I think it was 9.04 at first, which I then upgraded to 10.04 LTS. Starting with 12.04, however, they introduced their new Unity desktop, which I didn't like, so I looked around for something else and ended up with Mint because of the Cinnamon desktop environment, which could still be configured to look and act like a modernized Gnome 2 (MATE wasn't a thing yet back then, and when it appeared later, I found it looked somewhat dated). Mint always had some disadvantages compared to Ubuntu, like not including kernel updates by default (I never had a problem installing them anyway) and not offering dist-upgrades, though they have since moved toward better solutions (although the warnings about kernel updates are still unnecessary in my opinion, and a dist-upgrade is currently still as much work as reinstalling the system, or more).

At the same time, however, Ubuntu got worse and worse: the switch from Unity to Gnome 3, which threw every GUI concept I knew overboard, no official Cinnamon flavour, and most recently the introduction of the proprietary Snap package format. Sure, Snap packages and the package manager can still be open source, but the only server that will ever distribute Snap packages is run and owned by Canonical, and it is not open source software. They want the gatekeeper role in software distribution, and I cannot accept that, so nowadays I always advise against Ubuntu, even for people who think its GUI is good. It's just unacceptable. Plus, I once installed a calculator app via Snap on a PC (not mine), and it was somewhere around 25MB (I was like, WTF is wrong here, this should be a few kB or maybe a meg or two), and when I clicked the launcher it took something like 30 seconds to load a zucking calculator! So it's slow, meaning resource-inefficient, and therefore absolutely out of the race. Hence: no Ubuntu here. Mint fortunately uses Flatpak, which at least is a truly free package format, where everybody can set up their own Flatpak repository to distribute their own software...
I thought about trying Arch or Manjaro, but in reality, I'm lazy, and why try something new when the thing I've got works so well for me?
On my home server I run Debian; it has no GUI and just runs my Nextcloud, which serves as the backend for my phone. Although, I think I'm still on Buster there and should probably upgrade it to Bullseye... darn laziness! ;)
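As a minimal illustration of the point above that anyone can host their own Flatpak repository, here is a sketch assuming the flatpak CLI is installed; the Flathub URL and the calculator app ID are just well-known examples, and any self-hosted .flatpakrepo URL would work the same way:

```python
import subprocess

def flatpak(*args):
    """Run a flatpak command and fail loudly on error."""
    cmd = ["flatpak", *args]
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Add a remote for the current user - Flathub here, but it could be any
# repository that anyone hosts themselves.
flatpak("remote-add", "--user", "--if-not-exists", "flathub",
        "https://dl.flathub.org/repo/flathub.flatpakrepo")

# Install an app from that remote, e.g. the GNOME calculator.
flatpak("install", "--user", "-y", "flathub", "org.gnome.Calculator")
```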
-
Same here, since 2006 - so 15 years now! I started with Kubuntu 6.06, then KDE 4 came and I quickly went to Debian Etch and then Lenny, then back to Ubuntu (the standard Gnome 2 version at the time) until 12.04, which introduced Unity, so I was on the lookout for something else. I tried out several distros and stuck with Mint; I love the Cinnamon desktop, which can be configured with two panels on top and bottom just like Gnome 2, but with a much more modern look and functionality. Up until 3-4 years ago I'd have said: if there were an official Cinnamon flavour of Ubuntu, I wouldn't need or want Mint. But then Ubuntu introduced Snap, while Mint avoided it, so now I'm happily staying with Mint, which is a good distro anyway (the only reason Rob didn't like it was the icons... well, you can quickly change those to a different theme).
-
So, I'll have to rephrase my second comment: SNAP is NOT an advantage, but a DISADVANTAGE! It is SLOW, and it is UNFREE! Canonical are the only ones who run a SNAP server, since the server software is not open source, so they are the only ones who get to decide which software is available through SNAP and which isn't. Therefore, Mint doesn't use SNAP, but the competing FLATPAK instead! This is much better, because a) programs packaged with it aren't such resource hogs, and b) the server is free, so everyone can distribute their own software through it!!
-
Except for the IP address (no VPN), I use all these measures completely "naturally", as a result of my understanding of how the technology works. For example, I recently bought a new Xiaomi phone, which required some kind of "waiting period" before the bootloader could be unlocked, and it also required a Google account, so of course I created a new one that never gets used again. I didn't actually use the phone (except for reading ebooks, which I copied onto it locally and read with an app from F-Droid) until I could finally unlock it and degoogle it, so now it's usable. I only use this Google account, and I only use it in this PC browser (on different PCs, but with the profile data copied from one machine to the other), and I don't use anything else in this browser. Everything "not YouTube" I do in other browsers. I don't even go into this Gmail account all that often (and if I do, of course I use this "YouTube browser" for it). So the only thing I don't do is IP obfuscation through a VPN, basically because it's yet another level of complexity: having to decide when the VPN should be active and when it shouldn't, stopping email clients from downloading emails via POP3/IMAP while the VPN is active, and stuff like that...
-
But wait, isn't the European UMTS also a CDMA technology? Funnily enough, some European countries have switched off the old GSM standard from 1991 and keep UMTS active as a fallback until 5G is rolled out nationally, while other countries like Germany have switched off the CDMA-based UMTS networks in favor of keeping the 30-year-old GSM networks running. That's because there are so many industrial appliances still running on GSM, which either transmit only minute amounts of data (so GSM is sufficient) or only use voice calling, like elevator emergency call modules, and all this stuff would be too expensive to replace. So GSM keeps running while UMTS is gone by now. With the interesting side effect that on a phone which doesn't use VoLTE (which is still the case for many phones!), you can't make a phone call and use the internet at the same time unless you're connected to the internet via Wifi. I'm really looking forward to SailfishOS implementing VoLTE in early 2022.
-
My main phone runs SailfishOS, which is a non-Android Linux OS; however, in its licensed version (available for phones from the Sony Open Devices program) it does include an Android runtime called AlienDalvik, so it can run Android apps as well, but of course it doesn't come with any Google spyware.
I recently bought a new secondary phone, made by Xiaomi. Big disadvantage: I couldn't use the phone for a week or so, because I had to wait a certain time before I was allowed to unlock the bootloader (and it had to have a SIM card inserted and use mobile data - though certainly not the SIM I'm going to use in it afterwards). Also, the only available program for the unlock runs on Windows, so I had to use VirtualBox for that, since I only run Linux on my computers. But it eventually worked, and while the LineageOS fork I'm currently running on the phone, crDroid, seems a little too uncritical of (or unaware of) the Google spyware issue, it still gives me a degoogled phone, and with apps like AFWall+ and AdAware I can also block unwanted connections to hostnames from privacy invaders like Google or Facebook, since I'm not using any of their services (except YouTube, on this one PC in this one browser only, which doesn't get used for anything else).
-
"Your Windows, your Mac or your Android" - so is this channel not for me? I've been using Linux since 2006, but not at a high knowledge level like a programmer - more like amateur admin level. For example, my avahi-daemon always sucked 100% of one of my CPU-cores, and when I disabled it via systemctl, it came back anyways! Therefore now I have masked the socket (although I don't even know what this means, it were just the commands I entered) and am now hoping that it won't respawn. But is this avahi-daemon sucking 100% of one CPU core already a hint of an intruder or malware? Because, I have actually found bug reports of other users on the internet who have experienced the same thing - albeit in a large LAN, while all I have here is one NAS, one debian home server and 2 mobile phones (one SailfishOS/one AOSP without Google), so not really a big network? Really strange...
-
@DerekDavis213 I agree with your opinion about Macs, but let me give you an example for your initial question: Windows shows the "Desktop" as the uppermost level of its file hierarchy, while in truth it was first buried under "C:\DOKUME~1\...." etc. and is now buried under "C:\Users\....something" - it's just an utter lie, how they present even the most basic structure of their GUI. Android is guilty of the same kind of thing, by the way, like often making it overly difficult to find out the real path to a file: it's only shown inside an app, so you can view/open the file but can't actually use it (as in copying it or doing whatever with it). These things just aren't necessary and are only there to make it harder for the user - hiding the concepts of basic operation so that the user doesn't learn anything and then often doesn't even know how to help themselves with the simplest things.
And then there's stuff where Windows could just be better but it isn't used, like NTFS file permissions and such.
And no, I don't have any "bugs of day-to-day use" in Linux. Or at least none that I didn't choose (e.g. I connect to my Synology via sshfs instead of SMB, because why use a non-native protocol - and there my file manager does have a rare bug when reconnecting after the machine went into S3 and woke up again without the sshfs mount having been properly unmounted, but I know about it, it's easily worked around, and it doesn't even occur daily). Also, I have set up several PCs for friends, some of which run Windows (e.g. because they need special software that only runs on Windows), while others were fine with Linux. The funny thing is: most of the time, when one of these friends calls me because something about their computer doesn't work the way they want, it's the ones running Windows, while those on Linux usually don't have any problems, because their computer just works. OK, these are all people who really don't know anything about computers, so they don't change anything about their systems or install new software; they mostly just use a browser, LibreOffice or MS Office, and that's about it. Here I can clearly see that, if Linux is set up right once, it'll run for years without any problems, apart from clicking to install updates every couple of days. With Windows, it's usually the case that after a certain number of years it needs to be reinstalled or "cleaned", as in removing malware/adware or other crap that got added through their use of the internet, clicking on stuff without knowing what it does. So in a way, in order to use Windows effectively and not slowly break it along the way, you have to be much more knowledgeable than with Linux.
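As an aside, a minimal sketch of the sshfs setup mentioned above, with placeholder host, user and paths; mounting with the reconnect option and unmounting cleanly before suspend (e.g. from a pre-suspend hook) is one way to avoid the stale-mount situation described:

```python
import subprocess

REMOTE = "user@synology-nas:/volume1/share"  # placeholder remote path
MOUNTPOINT = "/home/user/nas"                # placeholder, must already exist

# Mount the NAS over sshfs; "reconnect" lets a dropped SSH connection recover.
subprocess.run(["sshfs", "-o", "reconnect,ServerAliveInterval=15",
                REMOTE, MOUNTPOINT], check=True)

# ... work with the files ...

# Unmount cleanly so nothing goes stale across a suspend/resume cycle.
subprocess.run(["fusermount", "-u", MOUNTPOINT], check=True)
```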
-
@DerekDavis213 I used Windows 3.0, 3.11, 95, 98SE, NT 4.0 and 2K. I didn't even switch to XP, because every new version hid what was truly going on in the PC a bit better from the user. In 2006 I stopped using Windows, and I'm happy that I now have an OS that gives me full access and doesn't use the GUI to obscure knowledge about how computers work. I think this is deliberately geared towards "dumbing down" users, because that way users can more easily be kept from doing what they want with their machines - I'll reference Cory Doctorow's 2011 talk "The Coming War on General Computation" (or something like that).

Granted, I have to choose which hardware I buy with regard to Linux compatibility, but that hasn't ever really limited me in what I can do. Plus, I just don't like the way of thinking you need to adopt in order to operate a Windows machine - starting with small things like drive letters, or mouse-wheel scrolling not happening where the mouse cursor is located but in the window that has focus. But also that you basically learn "click orders" to achieve certain (mostly administrative) things, instead of learning how the system really works. In contrast, I'm really happy with a system that is totally open to me as the user in regard to its intricate functionality, so it's all logically comprehensible, while with Windows it often isn't. Starting, again, with little things, like how Microsoft names their Linux subsystem for Windows the wrong way around: they call it the Windows Subsystem for Linux, when in fact it's a Linux subsystem for Windows. They've got their thinking all twisted around somehow, and it shows in many more places than just the examples I've listed here.

So basically, let's say: I don't like it, I don't like using it, I don't like having to download programs from some potentially shady website, I don't like how they all don't update through the system's update functionality, I don't like how there's no shared library system, I don't like how you need antivirus stuff, and how filename extensions aren't even shown by default, making inept users click on a malicious file "file.pdf.exe" with an Acrobat Reader icon - a file that automatically has the right to be executed.
-
@crimestoppers1877 I know about LFS, but I've never seen the necessity to go through with it - it would be an intense learning experience, I'm sure, but on the other hand, I'm also certain it wouldn't really serve me all that much. As for the rest, I already said that I'm happy with the distro I use, and while I could run virtual machines to try out others, I don't see much use in that either - the computer has become more and more of a tool for me, and as long as it works the way I want it to, I don't need any changes, so I'll stick with what I have and what serves me best. Thanks for the encouragement anyway.
-
Also, the naming conventions differ from country to country. In Europe we had GSM, which was 2G (later enhanced for data rates above 9600 baud, first with GPRS and then with EDGE). Before that, the "1G" networks were basically national solutions that weren't commercialized, since the telephone networks were state-owned until the 80s/90s. So in Germany we had three networks before "2G": the A-, B- and C-Netz. The first one (from the 60s) needed a manual operator to patch you through; the second one (from the 70s) already used normal dialing, but the caller had to know the prefix of the cell tower the car you wanted to reach was connected to. They were all car phones, taking up to half the trunk space of a full-size sedan like a Mercedes S-Class, and if the phone didn't switch off 30 minutes after the engine was off, the car battery would be drained by the next day. Anyway, the third network, from the 80s, was still analog, but it implemented cell handover for the first time, which was a huge advance. Then GSM started in 1991 (and would later be called "2G"), while the CDMA-based UMTS network (on totally different frequencies than the CDMA networks in the US) started in the early 2000s and was used until 4G, or LTE, came along sometime in the early 2010s. Now they're talking 5G, but they don't call it LTE any more, so only the tech-savvy know that it's basically the same technology, just with a few enhancements and different frequencies.