YouTube hearted comments of Michael Wright (@michaelwright2986).
-
I doubt if there is any OS that doesn't require troubleshooting (I troubleshoot Windows for my friends over the phone -- not without stress), but I have found Mint to be the easiest for me. There's a different version of Mint, LMDE (Linux Mint Debian Edition), which is based on Debian rather than Ubuntu and might get round your problems. Or if you have a very new computer, it's possible that you need the latest and greatest to work with your hardware, so you could try Linux Mint Edge. It might well be that another distro would be better for you, but I haven't found anything that is easier to get going and keep going than Mint.
It's also the case that, with normal luck, getting stuff going properly is a one-time thing. Try entering a search with the name of your hardware, your distro, and a description of the problem, and see what DuckDuckGo or Google brings you.
-
Brilliant. Thanks to this video, I have got a long way with running a virtual machine, which I had previously regarded as an Arcane Mystery, not for the likes of me.
My version of VirtualBox is 7.0.4 and I'm running it on the ThinkPad T420 I use for Adventures in Computing. One difference I see with 7.x is in installing Guest Additions. Your version offers a choice of physical drive or ISO. 7.0.4 doesn't offer that, only having an option for CD, but there is another command ("Insert Guest Additions CD Image", under the Devices menu), which slightly counterintuitively means "tell the machine to look for the ISO in the D: drive." That doesn't work, but I looked in Explorer for something that looked like the right file, and clicky-wickied, and it worked.
I got a slight hiccup with USB. My venerable machine only has USB 2, so I clicked the radio button for the USB 2.0 controller, and that didn't work: it does work if I tell it to enable the USB 3.0 controller, which, as a matter of fact, I don't have. idk, but I can see a USB drive now.
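(An aside for anyone who'd rather script this than clicky-wicky: the sketch below is my own, driving VirtualBox's real command-line tool, VBoxManage, from Python. The VM name "Adventures" and the controller name "IDE" are made-up placeholders -- check yours with VBoxManage showvminfo before running.)

import subprocess

VM = "Adventures"  # hypothetical VM name: substitute your own

# Attach the bundled Guest Additions ISO to the VM's optical drive.
# The special medium keyword "additions" tells VirtualBox to use its own ISO.
subprocess.run(["VBoxManage", "storageattach", VM,
                "--storagectl", "IDE", "--port", "1", "--device", "0",
                "--type", "dvddrive", "--medium", "additions"], check=True)

# Enable the xHCI (USB 3.0) controller -- the setting that, oddly, made USB
# work on my USB 2-only machine. The VM must be powered off for modifyvm.
subprocess.run(["VBoxManage", "modifyvm", VM, "--usbxhci", "on"], check=True)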
My only problem left is getting iTunes to see the Wi-Fi network rather than the virtualised connection, so I can hook it up to AirPlay. This is a pretty niche requirement, and thanks to you I feel confident in trying to solve it.
I spent a career in tertiary-level teaching, some of it fluffy stuff, some of it with its own technicalities. I know absolutely first-rate exposition of difficult material when I see it, and you're a master. Just the right amount of PowerPoint and graphics--in this, I was actually grateful for having the main points all on one slide. Brilliant.
-
Interesting, and also a cinematic triumph. Many famous film auteurs have tried to get there, but have never quite achieved a fully motivated black screen.
I see quite a lot of comments saying that gen 3 and 4 Intel i7 chips will "easily" outperform these little modern CPUs, but without any details about the workloads on which they are so superior. My most modern chip is an i5-9500, and in your tests, which represent most of what I'd be doing, responsiveness looks pretty similar--and most of my machines have much earlier chips (the HP Mini that I just got for $10 with a single-core Atom is, alas, too sluggish for pleasant use). Is there likely to be any kind of work on which the golden oldies are going to be much better (excluding, of course, the use of external video cards)?
-
Thank you very much for another year of old-skool intelligent, informative content, clearly delivered. You will NOT be replaced by any conceivable AI.
On Windows 12 and the hardware upgrade treadmill: I'm trying to remember when I decided I didn't need a new computer for anything I wanted to do--probably about 2015. So different from the early days, when everything was obsolete after two years. I do not want local generative AI (except for very specific cases), and I've finally switched to DuckDuckGo for default search, since it still seems to act like a search engine.
Mini PCs are intriguing. Long ago most non-gamers/non-creators decided that a laptop or AIO was enough computing for their needs, in a convenient package: the teeny-weeny boxlet seems to be going back to discrete components. Makes sense, I suppose, since monitors last forever, and keyboards and mice can be cheap to replace (for non-gamers). But there is still the problem of cables, which can be an important consideration for what used to be called the Spouse Approval Factor. Will we go back to some kind of dedicated computer desk, for external cable management?
-
1. Websites where a business has had a site made for it, and it includes a "Contact Us" page, and no one monitors it, so you never get a reply (these people mostly only use the phone, because they can't write).
2. Commercial websites that have been made by Web Professionals, and include all sorts of self-gratifying Design Elements and Fluid Transitions, but No Sodding Information (sometimes, if you fossick around enough, you can find a link to really ugly tech spec text pages, but often not even that).
3. The Micro-USB connector, which is near enough to symmetrical that you can't tell the right way up. This is the worst ergonomics in the whole USB system, and that, as everyone knows, is a high bar to clear. USB B is good, and I find Type C a relief, but Type C hasn't replaced anything--it's just another standard added to the mix (the last time I bought an external DVD drive, it still had a Mini-B connector). Standards are wonderful--there are so many of them.
-
Two urban myths could have been dispelled better.
MS never said that Windows 10 would be the last version ever, certainly. But that official rebuttal, vetted by the legal department, is so full of obscurity that the tech journalists might have genuinely misunderstood it. And if at first you don't succeed, give up--that's not the attitude we expect from a predatory monopolist. If they'd just said "Look, we never said W10 was Windows' final form. <Legal CYA material follows.>" they would at least have tried.
Linux and the terminal. Shortly after Windows 11 appeared, and before I'd worked out how to turn off a lot of the obnoxious stuff, I was trying gently to propagate the virtues of Linux for some users, and someone fairly cluey about computing in a corporate environment said, "Oh, Linux, you have to do everything in the terminal." He was obviously thinking of the server people at work. But this myth persists because, if you go online looking for help, you'll see lots of how-tos using the CLI, even when GUI alternatives are available. I don't think this is flexing, mostly: if you spend all your working day in the CLI, it will be quicker to do simple jobs in the mode you're familiar with; and there are some things that you can only do in the terminal, and others that are easier there (to go back to my beginnings, PIP B:=A:*.DOC is quicker and easier than copying all the originals, but leaving the back-ups, in a GUI file manager). I don't know what to do about this, but at least the forums for desktop-oriented users might adopt a policy, or at least a guideline, or maybe a nudge, about using GUI methods unless the terminal is absolutely necessary (as some do).
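(For the curious, here is a modern equivalent of that PIP line, as a rough Python sketch of my own -- the folder paths are invented for illustration:)

import shutil
from pathlib import Path

# Copy every .DOC file from one folder to another, leaving the *.BAK
# back-ups behind -- the same selective copy PIP B:=A:*.DOC did on CP/M.
src = Path("/media/a")                  # hypothetical source folder
dst = Path("/media/b")                  # hypothetical destination folder
dst.mkdir(parents=True, exist_ok=True)

for doc in src.glob("*.DOC"):           # matches only the originals
    shutil.copy2(doc, dst / doc.name)   # copy2 preserves timestamps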
One last thing: do I believe in an urban myth? When RAID first appeared, I understood it to be an acronym for Redundant Array of Inexpensive Disks, as opposed to the expensive (Enterprise, we would say nowadays) disks used for high availability. But so many acronyms have acquired varying expansions, especially under the pressure of social change. Was it always Independent Disks?
-
My recent experiences with Windows make me think of Microsoft as a corporate predator, seeking to ambush its users to get more access to their information, or maybe old-fashioned money. It has great difficulty just shutting up and letting me get on with what I'm doing. It's the barefaced impudence with which it does it that is especially annoying.
The lack of customisability in the interface is, perhaps, more understandable. The big bucks are in mass orders from corporations, and there are good reasons for big organisations to want all their desktop computers to look and work the same, especially as hot-desking becomes another way of cutting people costs. Since there are also good reasons for an individual not to get too dependent on their own special way of setting up a computer, but just to get the most out of what the manufacturer gives them, probably the best we can hope for is something intelligently designed, and not changing just to give the designers a stunning innovation to put on their CV.
As you were talking about Microsoft losing its monopoly, I thought at first this was wishful thinking (macOS is annoying, too, and Linux won't get mass acceptance): but then I realised that ChromeOS will do most of what most people want to do on the desktop, and although Google are quite as nefarious as Microsoft, they tend to be a bit more subtle about the way they grab all the details of your on-line life.
Microsoft, Google, Zuckerberg, Musk--it is all a bit dystopian, isn't it? Your fury is understandable.
-
I greatly admire your ability to present content clearly, and I learned from this, but I'd suggest the first section does not start from the right place, and misses a chance to free people from some bugbears about sound quality.
I think there are two things that need making clear right at the start: the limitations of human hearing, and the fact that audio digitisation is a process of sine-wave reconstruction, not the smoothing out of little isolated steps (like the columns of a histogram).
Both of these apply as much to uncompressed formats as to compressed ones: when the consortium introduced the CD format, the specification was based on a huge amount of study of the limits of human hearing, done by record companies as well as tech companies, and the record people would not want anything that would make their stuff sound bad (DGG was big into this). This established the parameters for the CD, and these are the outer limits. So people who pay extra for "high definition" audio at high sampling rates and bit depths for listening (as opposed to editing) are not getting anything tangible for their money (though doubtless it makes them feel special, and that is always worth spending money on--I do mean that). People who were brought up in the analogue world, where there was always a small (though diminishing) gain to be had from getting a better cartridge or a more exactly controlled turntable, often have difficulty in accepting that good enough is truly good enough, and that though you can get something that measures better, it is a physiological impossibility for it to sound better--just as you could build a computer monitor that emitted in the deep infra-red, but it wouldn't look any different. Also, it seems that the most discriminating hearing is to be found in young females, who are not strongly represented among the people who think a lot about audio techniques and equipment.
The other thing that needs explaining -- and even a mathematical dunce like me can kind of grasp it -- is that the digitisation and reconstruction of a sound wave is a precise reconstruction of its sine-wave components up to a certain frequency, not a smoothing out of lumpiness (which implies approximation). A misapprehension here leads to a misunderstanding of what is an easy sound to compress and what is hard. The step-smoothing notion leads to the idea that an instrument whose sound is very close to a pure sine wave is hard -- so solo flute is difficult, people sometimes think. Doubtless they think of the sound of a flute as pure, and of digitisation as a pollution of analogue purity. Whereas in fact a flute is very easy to digitise, and hence to compress lossily; what is difficult is a Nordic death metal drummer at the end of a session, hitting everything as hard as possible as often as possible, with diminished precision.
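(For the sceptical, that claim is easy to check numerically. The toy sketch below is mine, not from the video: sample a 1 kHz tone at the CD rate, then rebuild the waveform at points between the stored samples with the Whittaker-Shannon sinc sum. The reconstruction matches the original tone -- no steps, no smoothing, just the sine wave back again.)

import numpy as np

fs = 44_100                              # CD sampling rate, Hz
f0 = 1_000                               # 1 kHz test tone, far below fs/2
n = np.arange(4096)                      # indices of the stored samples
x = np.sin(2 * np.pi * f0 * n / fs)      # the samples themselves

# Whittaker-Shannon interpolation: x(t) = sum over n of x[n] * sinc(fs*t - n).
# Evaluate halfway BETWEEN samples, near the middle to avoid edge effects.
t = (np.arange(2000, 2100) + 0.5) / fs
recon = np.array([np.sum(x * np.sinc(fs * ti - n)) for ti in t])
exact = np.sin(2 * np.pi * f0 * t)

print(np.max(np.abs(recon - exact)))     # tiny: set only by truncating the sum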
These two things are hard for lots of us to grasp at first, but getting them straight can mean a lot less disk space used (not that that matters now), a lot less anxiety about quality, more money to spend on speakers, where it makes a difference, and more mental energy to spend on listening to music, which is the main point of the exercise.
All this is from the point of view of the listener: for editing, of course, you want to stay well clear of the minima to make life easy. And maybe that's another clarification that could be made: there are significant differences between the requirements of the editor and those of the end listener, and your attention is perhaps more addressed to the needs of people who edit?
Love your work.
-
This format is very appropriate for desktop distros, thank you. I can understand (or imagine) that fine differences in performance make a real difference on servers, but it's all Linux, and on the desktop I wonder just how much difference there really is. MX Linux is often described as lightweight and performance-oriented--can you point me, please, to comparisons of performance on desktop tasks--like reformatting a long word-processor document, or processing a RAW image file--between the lean distros and the friendly ones?
Perhaps also the audience of this channel is not about to switch from Windows, but is likely to be advising friends/relatives on the switch. In which case, it is necessary but not easy to drop one's own preferences and think about a user who actually doesn't WANT to understand their computer, but just wants to do stuff with it (like most folk use a phone). Don't put your cousins and your aunts onto Arch, btw.