Comments by "Michael Wright" (@michaelwright2986) on "Brodie Robertson"
channel.
@JacksonNick-j6i There's a difference between gender and sex. There's quite a nice book by a Catholic priest and journalist*, called IIRC "The Year of Three Popes" (the year of the sudden death of John Paul I). He describes going into the working parts of the Vatican and being surprised that there are triple shithouses everywhere (not the word he uses, but it's nice to introduce a little coarseness into things). They are all identified with icons, and there's clearly a Man icon and a Woman icon, and a third icon representing a figure in a long coat-like garment. And then he realises it's a cassock, and the third loo is for priests. Three genders in the 1970s. BTW, mammals have two sexes (though not every individual is neatly sorted), but other species have more. I know trans (etc.) activists can be tedious to old men like us, but that's an activist thing, not a trans thing.
*Hebblethwaite is, I believe, his name.
I doubt if there is any OS that doesn't require troubleshooting (I troubleshoot Windows for my friends over the phone -- not without stress), but I have found Mint to be the easiest for me. There's a different version of Mint called LMDE (Linux Mint Debian Edition), which is based on Debian rather than Ubuntu, and which might get round your problems. Or if you have a very new computer, it's possible that you need the latest and greatest to work with your hardware, so you could try the Linux Mint Edge ISO, which ships a newer kernel. It might well be that another distro would be better for you, but I haven't found anything easier to get going, and keep going, than Mint.
It's also the case that, with normal luck, getting stuff going properly is a one-time thing. Try entering a search with the name of your hardware, your distro, and a description of the problem, and see what DuckDuckGo or Google brings you.
@dlsisson1970 You are quite right. One problem in discussions like this is that Linux has two main user groups. There are the traditional OG users, who are at the least computer hobbyists, going all the way up to developers and major server wizards in large organisations. These people want and need the command line, and they want and need to learn the system.
The newer group, and I'm one of them, just want an OS to run their computer so they can do everyday normal stuff, like watching cat videos and writing books and doing the accounts for their business. We're fed up with the Apple way, and have come to seriously mistrust Microsoft's desire to own our data and get in our face at every opportunity. And give or take a software issue or two (mostly spelt A D O B E), we've got that now, in the two or three obvious, big, desktop-environment-centred distros. We don't want, or need, to know a lot of stuff, any more than anyone needs to know how to run a Windows server in order to organise their book club or collaborate on a policy document. Sometimes real wizards give advice to normies with the best will in the world, but without realising there's a whole new audience out there. They run the internet, but we're the people who will bring about the year of Linux on the desktop (if it is permitted to speak apocalyptically).
From my experience in a similar area, I think it's not exactly incompetence but the talented's selective blindness to, and impatience with, what they don't understand. I used to work in a university, as a subject academic, and also as the sort of nerd who works out the rules and regulations. It's an acquired taste, and I could well understand my colleagues who weren't the least bit interested; but some would refuse to admit that this sort of stuff had anything to do with them at all. I'd say "I know what you want, and it's good, but the way you're doing it won't work in the system, and it's affecting your students. Let me design some regulations for you that will do what you want and that the computer won't barf over." "No, I did it this way in my last university and I'm not going to change for some bureaucrat." Well, you can kind of see that, but you need some kind of system: in the FOSS world, legal technicalities are what keep it FOSS, so you have to respect them, even if they bore the hind leg off a donkey. I never had the nuclear option, but sometimes I'd have used it if I had.
Off and on Linux flirter for 20 years. User for 10, all GUI, Mint/Cinnamon for preference: easy to do now I've retired, but I'm not sure how I'll go shortly when I'm co-operating again on a book, and might have to go back to MS Word, which means Windows. Also I need a Windows machine for iTunes to organise my music for my iPhone.
The new interest in Linux is because Windows 11 has become so predatory against its users. "Tell us everything about yourself. Entrust all your data to our cloud, where it will all be encrypted so that no one (apart from us) can access it, including you if you let your subscription lapse."
Why Linux won't work for some people:
1: Some people use one or two pieces of software for their professional work: typically Adobe. The decision tree here goes: I need to use (say) Photoshop; what systems (hardware + software) will give me a good experience? The operating system isn't the choice, it's a consequence of prior choices. This is the same as people who have got some very expensive piece of manufacturing hardware which is old but still functional and central to their business, so they need to nurse into life some antique PC with a Centronics port because that's the machine's interface.
2: People who have a lot of experience with Windows or Mac, and know how to do out-of-the-way things, and try to do the same with Linux, and can't. It's partly that a lot of this is what is sometimes called "implicit knowledge": stuff you know without knowing how you got to know it. The charitable reading of the Linus fiasco is that he wanted to set up an advanced gaming rig with his Windows knowledge (the uncharitable interpretation is that he thought a bad-faith video about how Linux is too complicated and broken would be good commercially). This accumulated knowledge can be a real change-stopper, depending on how old you are and what sort of appetite you have for learning new stuff.
For pretty much everybody, changing from Windows is a rational choice, but there are some people for whom Mac is a better alternative. (I used to like OS X in the days of the Big Cats, but too much of its functionality is hidden for me these days -- like menu items that only appear when you hold down the splat key while clicking on the menu.)
And then there's gaming, but isn't everything better on a console anyway? I don't game.
Hah, coastal boy, eh? In Australia, I lived in Armidale (NSW) and in the nation's capital, and in both places I saw -10 C. It was, admittedly, in the middle of the night in the dead of winter, but it was cold. Also dry, so if you came into contact with any synthetic fabric on a winter morning, there were sparks when you put your hand (or key) near metal. Lovely days, though, +15 and better, but cold as soon as the sun set.
I don't agree with that pronunciation you found: the second syllable has the ee sound, and typically the stress is on the second syllable, too. I do know Ancient Greek, btw, and I have thought a bit about asceticism, but I don't always get English pronunciation of Greek words right, so I checked Wiktionary (a good online solution for all your dictionary needs).
I have no problem at all with the from-the-ground-up, learn each tool as you need it, approach to Linux, or any other OS. For many people I'm sure it's the best way to learn an OS, and indeed, since I started on CP/M, I've had something of that trajectory, even though my switch to Linux was pretty much entirely convenience, as I wanted something that's less intrusive and less of a faff than Windows.
But I do have issues about calling that approach ascetic. As I understand it, asceticism is about two things: one is freeing yourself from the complications of unnecessary possessions and material concerns (a kind of wellness play); the other is to simplify your life to the utmost so as to concentrate on ultimate value, whether you think of that value as freedom from all illusions about the nature of "reality", or whether it's something for which the word you reach for is god, or the divine.
The minimalist approach to Linux (why not Gentoo?) would certainly not leave anyone with much time to think about anything else, but if you take the asceticism angle seriously, it would seem to imply that the ultimate concern is, indeed, Linux. Nothing wrong with that, as long as you don't frighten the noobs who are just looking for a better Windows than Windows (where have I heard that before?); but I'm not sure many historical ascetics would agree with it (though, how monastic are the Shaolin martial arts monks?).
Another practice that goes with asceticism is anchoritism, the practice of withdrawing into a secluded or solitary life as an anchorite or hermit (all anchorites are ascetics, but not all ascetics are anchorites). Hmmm. Anyone going to talk about basements?
The notion that someone who's become one with Linux, as described, would be happy with macOS is a bit laughable. Sure, it's BSD underneath, with a funny kernel, but Windows is supposed to be VMS underneath, and when ordinary people talk about those OSes they are thinking of the DE (there's not even a Mac server edition these days, is there?). I hate the modern Mac experience because it's as opinionated as Gnome, and far too eager to leap in and do what it thinks you want it to do.
Brodie, you say you're a Linux user so it doesn't affect you, and I thought the same, but then realised that if we have any transactions with Windows users (you know, that nice specialist on-line retailer), we're vulnerable to their vulnerabilities.
Given that DJT hates the tech companies, there's a possibility he could be persuaded to do something about this; hang onto that, as a silver lining in a very dark cumulonimbus that looms in November.
Oh, I see, as I should have known, I'm very late to this. So here's another thing to worry about: that the Group Policy for turning off Recall gets named "Disable Windows Snapshots" or some such. But on a computer, snapshots are good, aren't they? They enable you to restore your system easily, don't they? The tech-savvy admin who's concentrating will smell a rat, or will know about this; but in some large organisations it will slip through, because mistakes always happen.
I'm a non-technical user of Mint, but NOT a new user. Been using it for years. That's a distinction worth bearing in mind: it's not that I don't YET know stuff, it's that I've got other things I'd rather be deep into than Linux technicalities, and I rely on useful sources, such as yours (thank you) to help me sort out what I really need to know to use Linux felicitously.
It sounds presentational, really. So "Verified" is not a guarantee of absence of malware, but no such guarantee is possible, I think? Someone sufficiently motivated and resourced could presumably infiltrate malware into the Microsoft Store (probably starting from Petrograd).
So the question is, for a non-technical user, are they better off sticking to Verified flatpaks? (I actually want to know, and so far I have the impression that the answer is "Yes," to some degree.) And if so, how to present the information? Remembering that non-technical users get MEGO pretty quickly.
A question I'd like the answer to is, which source is least likely to serve up malware: distribution's repo, Verified Flathub, unverified Flathub, random binary, random flatpak? I've got a clue, but I'd like to know the detailed rankings. Or perhaps it's not possible to give more than a general answer, which would be good to know.
Last, I take the point about what happens if flatpaks are not available through the preferred source. The answer might seem to be to say, "VLC is great (for example); we think you should install it from our repository, rather than this unverified flatpak." Given that the Mint package manager now shows traditional packages and flatpaks on the same page, this seems like a reasonable idea? And a way of combatting the erosion of safety measures (some clown will always tear down the fence at the top of the cliff).
Oh, and post-lastly, are there any advantages for the user in installing flatpaks? Is the sandboxing of any security benefit for the user? Any benefits in app updates? I observe on the Mint package manager that typically flatpaks are a more recent version than what's in the distribution's repository, but I've come to conclude that that's not necessarily an advantage.
The Moral is maybe one of the things I learned in an early part of my experience with computers: don't be an early adopter. Wait for someone else to find the bugs (and now the scams). (And, BTW, never ever install version x.0, and with Microsoft wait for v. 3.1)
Only knowing about similar institutional contexts, I agree that the CoC committee was ill-advised to, in effect, apply retrospective legislation, and seems a bit heavy-handed. If a problematic arsehole says he's sorted it out with the insultee, and that seems to be the truth, surely it's a good idea to let things lie.
OTOH, without any moral judgement, the behaviour of this developer is clearly counterproductive, perhaps self-damaging. Let us assume he is passionately devoted to getting his file system into the kernel, yearning for it with a messianic zeal. It would seem sensible to discover how you get things adopted into the kernel, and it seems that there are agreed procedures, which have nothing to do with kowtowing to the corporations or going all woke, about what happens when. To an outsider, it sounds like the procedures are at least defensible, as a means of getting everything synchronised, all ducks lined up, and the pigs all ready for a formation take-off. Might not be the best way, but it's worked for a couple of decades. Also, it's a community, so it's probably a good idea not to alienate the community. Not for any moral reasons, but for a simple, transactional, analysis of how you get things done. Like, what matters more: the project, or the ego?
Fonts 12:00 ish. I fancy Windows does render text better than Linux. My Linux fonts are fine, but when I go to my wife's computer, the text on Windows looks crisper. I believe this is a genuine difference, because sometimes I forget about it, and am struck by it again.
Too much customisation 8:20. I don't think this is a problem, because the users concerned don't even realise there is customisation. I helped my doctor's receptionist (a smart woman) by showing her that in Windows you can change the size of the pointer. What is, perhaps, a problem is would-be Linux evangelists banging on about all the customisation you can do, rather than saying "Choose this flavour of Linux, and you can just get on with your work/life in half an hour."
Also, all this banging on about games is a turn-off for ordinary people.
Suggestions about redirecting the students' creativity, as at 9:25, are obviously made by people who have no experience of educational institutions, apart from having attended one with an intention to learn (which makes them a minority from the start).
If you consider what has been happening at this school--I mean, the lived reality of the context described in the use-case--then you realise that for the poor bastard asking this question, burning down the school (7:45) could be a beneficial side effect.
Alternative suggestions also do not take seriously the chronic underfunding of public schools in many parts of the world: in the USA, for instance, I gather that school teachers have to buy pens and paper for the kids out of their own, tax-paid, incomes. So the probable situation is that the school has the person who's asking the question, and one tech, to look after the kit. How long would it take to disconnect a speaker on a PC? Remembering that you have not only to do the deed, but also to travel from computer to computer, probably including travelling between buildings. And all this, BTW, would have to be done out of hours, so as not to disrupt the creative little angels while they're (ostensibly) learning. I guess, on average, about half an hour per machine. Might be twenty minutes, if the logistics are favourable. How many computers? Sounds like they've got a lot. Maybe 200? So that's 100 hours, or two and a half weeks' work (in a civilised country with a 40-hour week; even in the USA, management would have to budget 10 days to get the job done). So, by the time you get to the end of the process, the computers you did first have already had new, louder speakers put in by the kids. And you've made it a competition, so it's a matter of principle now. Who gets to turn the school PC into a theremin?
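The back-of-envelope labour estimate above can be sketched in a few lines of Python. Note that the machine count and the time per machine are the comment's own guesses, not real data:

```python
# Rough estimate of the labour involved in disconnecting every PC speaker.
# All figures are the guesses from the comment above, not measured values.
computers = 200            # guessed number of school PCs
hours_per_machine = 0.5    # guessed time to open a case, unplug the speaker, move on

total_hours = computers * hours_per_machine
weeks = total_hours / 40   # assuming a 40-hour working week

print(total_hours)  # 100.0
print(weeks)        # 2.5
```

Even halving the per-machine time only brings the job down to a week and a quarter, which supports the point that the fix would be obsolete before it was finished.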
I know, I'm cynical about kids--and I haven't even been a school teacher. And many students could doubtless be diverted to more useful aspects of computer systems programming, though of course you'd have to fit it in with the government-mandated syllabus. Or you could set up a computer club, so that's someone who volunteers to give up another evening a week for the privilege of looking after other people's children, and organising the use of facilities, and ensuring that they don't try a ransomware attack on the local hospital, all on an income that pays a poor hourly rate even if you just stick to the official part of the job. But even if all this worked, you wouldn't get everyone. A few little twats would just like causing trouble for the sake of causing trouble (or maybe shit posting is not really a thing?), so it would start again, and others would then join in, and you're back at square one, though maybe a more sophisticated lot of troublemakers because of all they've learned in Computer Club.
In the circumstances, the question, certainly asked after much thought and in desperation, seems entirely sensible. To people who have never been in a classroom, teaching seems easy and obvious. And bits of it are, indeed. So why don't you go and frigging do it? But in the real world, I think the questioner would have been justified in asking for a good implementation of EOU.
For those who weren't around, goatse was used kind of like rickrolling. So some things have got better. I will admit it gave me a bit of a startle the first time I saw it, so I guess it's in my memory, but no, the old Gnome Circle logo is nothing like goatse. But now I'm going to really foul things up. Given that an image of a tree can suggest a penis (Rod of Jesse, family tree, "got wood?"), I think the one-handed logo suggests masturbation, and I'm shocked, I tell you, shocked. Will no one think about the children? (on second thoughts, don't answer that.) Go on, cave in to this old white man. Stop corrupting the world, Guhnome.
I know Linus Torvalds has a reputation, but if you compare that "flamefest" with historical controversies in scholarship or, especially, theology, it is no more than direct, with a /small/ side dish of snide.
This also reminds us of a remark attributed to the great philosopher, Yogi Berra: "Prediction is hard, especially about the future." I was there at this time, even by an odd chance being on a committee that decided to buy the university a Silicon Graphics RISC/Unix machine (the engineers wanted to stay with VAX), and everybody /knew/ that RISC was the future. I guess the same was true about microkernels (BTW, isn't Mac OS, or whatever they call it now, a hybrid, at least?). It would be interesting to know how much Tanenbaum was wrong and knowably wrong, and how much it was sheer contingency. Intel at the time was making little chips for little machines -- consider the dude complaining about the elitism of writing for a 32-bit chip -- and it was not inevitable that they would win against MIPS or even Western Digital. But there was enough inertia behind the instruction set, and they were clever enough (and Moore's Law was still working) to change the way their chips worked.
Plain text is universal? Yeah, right. When I started with little computers, some of my colleagues were using an IBM word processor. Guess who got to be the expert on translating EBCDIC to ASCII?
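For anyone curious what that translation actually involved: EBCDIC and ASCII assign completely different byte values to the same letters, so the job is a byte-for-byte table lookup. A minimal sketch in Python, using its built-in cp037 codec (one common EBCDIC code page; IBM machines used several, so the right table depends on the machine):

```python
# "HELLO" in ASCII vs. EBCDIC (code page 037): same letters, different bytes.
text = "HELLO"

ascii_bytes = text.encode("ascii")    # 48 45 4c 4c 4f
ebcdic_bytes = text.encode("cp037")   # c8 c5 d3 d3 d6

# "Translating EBCDIC to ASCII" is just decoding with the right table:
assert ebcdic_bytes.decode("cp037") == text
```

The fiddly part in practice was never the letters but the punctuation and control characters, which varied between EBCDIC code pages.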
My wife and I were doing a book: the general editor had support at his university computer department, who worked with LaTeX. So guess who learned how to put in LaTeX codes using the simple, austere word processor we were using then. And then the editor lost the support, and I forget what format we ended up with, but I know one author in the bibliography had a Polish given name which was spelt with a z with a dot over it. Long before Unicode: so I bought a copy of WordPerfect, and learned it (in so far as anyone actually learned WordPerfect, rather than being quick at navigating the cheat-sheet template).
The problem, I think, is that a lot of the people who pontificate about Linux are developers and sysadmins (to whom, respect), for whom writing is producing documentation for other professionals. But a lot of writing IRL is for publication, either in dead-tree or e-book format, and what publishers want is Word-format files, and they want authors to do all the formatting for what used to be called camera-ready copy. (Maybe if you're a best-seller this doesn't apply, but it's the way it works in academic publishing.) For this purpose word processors don't do a fully professional job, but they will produce a passable result that's good enough for academic publishing. Though I observe that publishers still have difficulties getting footnotes done properly in ebooks. Publishers (outside the technical sphere, perhaps) do not want LaTeX any more than they want nroff; they want .DOC or .DOCX.
Commercial and advanced FOSS word processors can get incompatible (hell, MS Word can be incompatible with itself if there's enough of a gap in versions and platforms), but that only applies to pretty recondite sorts of usage. These days, for the sort of thing that markdown does, the compatibility is good. Especially if you use .RTF, which is proprietary, indeed, but MS is not making any money out of it, and .RTF will tell you if you're doing something too intricate for it.
Where word processors can be, and certainly used to be, evil is when there's a monopoly. Microsoft used to change the .DOC format with every upgrade. This would drive the massive sale of upgrades by a simple mechanism. It used to be a rule in large organisations that the person who had the very latest desktop PC was the CEO's PA. So, an EDICT would be issued from the desk of the Supreme Manager. It would be typed up (and probably corrected for grammar and spelling) by the CEO's PA (or, as it was in those days, Secretary), and she (as it was in those days) would promulgate it to the masses. Since the CEO's PA/Secretary was a very intelligent and capable person (probably smarter than the CEO), she was in complete command of the new version of Word, and would use its new features. So when the message came to the peons, and they opened it in their old versions, they could not access the guidance of the Dear Leader in all its fullness, and so each department paid for upgrades, and so was increased Bill Gates' fortune (ill-gotten, but now used well).
And if you want pure, undistracted, composition of a first draft, nothing beats paper and a 2B pencil.
14:20 Psychopath. Are you suggesting that Gnome should adopt Reiser Sans as the default?
All the font minutiae are real, but often in quite compartmentalised use cases. Actual dead-tree printing has had over 500 years of this, and for a book I'm not sure there's anything much better than the font Nicolas Jenson designed in Venice in the second half of the fifteenth century: but it looks pretty crap on a computer screen. Also, it's not obvious that different alphabets should belong to the same font family, though it is clear that one should pay some attention to how a particular Arabic or Hebrew or Georgian script looks alongside a particular Latin face. Each of them has its own tradition of calligraphy and type design, quite separate from the Latin tradition, so there's no a priori reason why they should belong together in the same act of design. Which means that wanting one font to rule them all is likely to introduce complications which could be avoided by accepting that a system could have a variety of fonts available, even for the default display font, depending on the default language. Possibly even a different font for languages using the Latin alphabet with a lot of diacritics.
One thing I find troubling in the discussion is that there is no mention of readability studies. There are the obvious abominations like I and l being indistinguishable (as in the font I see on YouTube now), and my pet hate of l being hard to distinguish from i in some quite fashionable fonts; and then there's telling the difference between rn and m, which is unnecessarily hard in some sans serif faces. But there have been more general studies, taking into account different levels of visual acuity and stuff.
BTW, making a Bold by just tweaking some parameters on a base font is, I think, regarded as devil's work by font designers. Even scaling by point size can be usefully tweaked, if you're aiming for the font beautiful.
A distro using a clone of Comic Sans? To go alongside Hannah Montana OS?!