YouTube comments of Michael Wright (@michaelwright2986).
-
When you offered the answer that 7 means "to the max", like "cranking it up to 10," I can't have been the only person who thought of Spinal Tap. So it was delightful to see that possibly, given the sexagesimal number system, 7 is 11. Shows that finding answers is a bit of a crap shoot.
I also thought of "The magic number 7..." I don't think it's necessarily right to simply dismiss that idea as speculation, provided we stop trying to look for a single cause. Seven is the recurring, para-sacred, number, for whatever reason, and it means "completeness," let's say. If that's true, its use is reinforced by the apparent fact that, on average, a list of 7 items is pretty handy for humans to remember (and hence maybe??? the prevalence of lists of 7: the seven capital sins, the seven deadly virtues, the seven wonders of the world). Full explanations need not just "causes," but maybe causal feedback loops.
-
White, cis male, old--too old to be a boomer. One thing that, to someone of my age, seems to be missing is a discussion of the nature of gender roles and stereotypes.
When I was young, there was a great deal of talk, and some actual activity, around redefining gender roles. At the time, a lot of it was about such mundane topics as the fact that men actually could and should cook and do housework (Schwerpunkt: can your man clean the toilet properly?) and look after kids, and that women could chair meetings with men in them and could learn to fix their own cars: but there was also talk about men "getting in touch with their feminine side" and that sort of thing. And androgyny was a thing, and we all loved Marlene.
The swerve away from talk about sex, using only "assigned gender" and "experienced gender", seems unfortunate in that it tends to preserve the notion of fixed gender characteristics. It is an undoubtable fact that some people are born the wrong sex and need to be reassigned to have a decent chance of flourishing. But one wonders how many of the children who feel dysphoria, and especially girls who want to be boys, are actually reacting to something wrong with society's construction of gender, not some sub-optimal feature of their embodied personality. I think it is not just recently that girls have thought "I wish I was a boy," and given the patriarchal nature of society that's not a surprise.
I wonder if this is one of the areas in which science is of real but limited help. Gender has possibilities for nuance and ambiguity which don't produce good statistics. It is absolutely certain that the politicisation of the topic is doing no good to anybody except for the politicians and businesses that hope to profit from inflaming the socially conservative.
-
"Iron ships will sink". I remember, a very long time ago, meeting this meme as an example of the stupidity of the Establishment. But, of course, the Naval establishment was not stupid, and it was actually true. In the age of wooden ships, it was actually rather rare for a defeated ship to sink: hence, the large number of captured ships taken into the service of the enemy, like HMS Belleisle or HMS Sans Pareille. Loads of them. If the powder magazine caught fire, or the ship was caught in a storm, they might go down: otherwise the wooden structure meant they floated, even if no longer capable of being fought, and could be taken back to be refitted. But once iron ships came in, no more were prize ships taken in to their enemy's navy. It wasn't a good reason for not adopting iron ships, but (if this argument was ever actually made) it was NOT evidence that senior naval officers didn't understand displacement. In the 18th and 19th centuries, the Navy (in Britain) was probably the best technically educated arm of the services.
-
@incognitotorpedo42 Not necessarily racism. It's about business culture. I'm in NZ, and we've had a lot of people from India coming here recently. A bit of friction, occasionally, and an Indian community leader explained that some Indians are brought up in a very challenging environment, where dog eat dog is a rational survival strategy. That means that it can be pretty rough doing business with such people if you're not used to that. And you might contrast the Chinese business culture, which, when it's working properly, means that you haggle like hell, and then once a price is agreed, it's immediate payment, cash. There are of course crooks who are Chinese, and I have always found a lot of Indians with whom I feel an immediate rapport. But to make a remark about the prevalence of scamming is not racist: I'm English, and when I've gone back to England I've been rather appalled at the high level of petty deceit and low level scamming that goes on in trading.
-
Yuval Harari is more a Universal Historian than a medievalist, though doubtless his PhD was on some aspect of medieval history. His breakthrough book, _Sapiens_, was a great hit and is pretty impressive: don't scoff at the attempt at universal history, somebody has to put together all the pieces to give us ways of understanding how we got into this mess. He's smart, but he seems to be developing into everyone's favourite guru, which makes an old failed academic like me suspicious.
Where he's clearly wrong, in an academic's way, is in thinking that because people don't understand how machines make decisions, they will be unable to override them (as Dr Hossenfelder points out near the end). Many a time a rather ignorant politician has failed to accept the suggestions of experts. Often this is because of bloody-mindedness or more or less blatant corruption, but sometimes it's because politicians are good at telling what people will accept (or, anyway, what about 51% of people will accept).
-
I saw a Sperrin at Farnborough when I was still in short trousers. It was acting as a testbed for the Gyron: we pronounced the engine's name with an initial affricative, like "gyroscope." The Gyron was going to be the Really Really Big jet, but it never seemed to get anywhere--that would be interesting to hear about. But the Gyron Junior did get some use--Wikipedia tells me it was used in the initial (underpowered) version of the Buccaneer, so not much more successful than big brother.
That's a really interesting account. I didn't know that the Valiant was essentially carried on as a private venture. I'd always thought that the Ministry ordered two cutting-edge aircraft (which ended up being the most successful and long-lasting of the whole set), with the less adventurous Valiant as a safety development. And they ordered the Sperrin, just in case, and then there were four. But the Sperrin, initially intended as a safety net for the two advanced aircraft, looks almost rational.
Looking back, it looks like the British industry produced a profligate number of prototypes, all competing. I suppose the US produced a lot of different types, some of which failed; but they could afford it. France seemed to manage things with a bit more economy. Although British aviation enthusiasts have nothing but bad to say about the forced amalgamations, something like that was needed for a world where aircraft production was getting more and more capital intensive. When I was at secondary school, the Aviation Club (or whatever we were called) got taken on a Saturday to Hatfield where the Comet was being produced. What we were shown looked like a series of sheds, one with a Comet fuselage in a corner. Memory is highly fallible (I realise I can't from memory place this visit before or after the disasters--it must have been after, given my age), but the impression I carry with me is that it would all have looked a bit scruffy in the back garden of the bloke next door.
-
That's a great video. What was particularly heartening was seeing Veronica struggling with some of the set-up. I'm a very non-technical user, and I get frustrated at installation/set-up instructions which probably make sense if you know all about the software, but which are totally opaque to the newb. I realise that when I'm trying out Linux distros, the installer is pretty much make or break (like, I kind of know about partitioning and mount points, but I'd like the installer to offer sensible defaults for a first try). And the video made me interested in GrapheneOS.
Oh, small phones. You, me, and my wife too, Veronica. But it seems there ain't a market for rationally sized devices, because Apple has stopped the Mini, and I don't think there's anything reasonably powerful with a screen smaller than 5.8".
-
@matthewferrantino9521 I agree that "we" should know more about the Persian Empire, and the history of China, and the Indian sub-continent. But there is a reason why the Roman Empire (and the Greek world) are much more taught in "our" schools, which is that, for better or worse, "we" are in a direct continuity with it; might even speak a language that is the development of Latin. It depends who "we" are. On YouTube, "we" tend to be majority, or anyway plurality, English speaking, together with European. Same with the British Empire: in relatively recent times, it touched the lives of many people in many parts of the world, and the aftermath is still going on. I think the focus on these aspects of history is reasonably understandable in parts of the world to which they are especially relevant. You can only get so much into a syllabus, especially if it's to have any kind of substance to it.
What really does bug me is the way that I was taught the history of Greece and Rome as though they were entirely exceptional, rather than in their context and in their interactions with other ancient cultures and societies. But, at least in Anglophonia, it's a battle to get any kind of knowledge of the deep past into curricula.
-
I'm sure there are a number of reasons why MS keeps changing things. One is doubtless to support the industry of providing corporate training; I'm not quite sure what the payback to MS is, but I assume it's good to have a symbiotic industry around you. Second, of course, they don't sell to users but to corporate purchasing departments: I don't know, but I bet IT professionals don't have as much say in purchasing as the people who are swayed by bright and shiny. Third, if you can sneak in a new feature, or an incompatibility, you can Trojan horse an upgrade on a whole organisation. Just ensure that the PA to the CEO has a new computer, with a new version of MS software on it. Then corporate memos, descending from on high, will need the software upgrade so that the peons can open them.
Probably there's also the power of the part of MS that designs these things; if people were content with what works, they'd be out of a job. Internal politics can be powerful in these matters: remember the dumb skeuomorphism that afflicted Apple, when the Calendar app had an icon representing not just a desk calendar, but a heavy and pretentious leather-bound Executive calendar.
Favourite versions of Windows: Windows for Workgroups 3.11; Windows 2000; Windows 10. The rest is frippery and noise.
-
I've just been watching 5:00 to 10:00 and it looks like there's been a major change in the objective of battle; in the age of the steam iron-clad, the aim is to sink the enemy's ship. AFAIK, this wasn't the primary objective in the age of sail. Ships rarely sank, unless the magazines exploded, and in any case it was MUCH more profitable, both for the ship's crew and for the nation, to capture a vessel and bring it into your own fleet (often retaining its original name). For all the reasons you list; and this presumably explains the difference in gunnery tactics: the French aim high to disable the rigging, as this is the elegant way of disabling a ship with a view to capture.
All this is implicit in what you're saying, but maybe worth foregrounding since it might imply that the early iron-clad steamships, although they still formed lines of battle like the old days, actually introduced a revolution in naval tactics. Or not, I might have got it wrong.
-
@BrandonHensleyEMD I've taught undergraduate courses, though not in Classics, and you can generally assume that there is a lag between what is presented in u/g courses and the latest news from the knowledge front. This is partly inertia and time allocation (academics' careers are not dependent on their u/g courses, for the most part) and partly sensible: for a few years, the hot conversation will be "The Bronze Age Collapse: Yes or No? And if so, Why?" and it will be framed in terms of the consensus that existed before the conversation started, so you'll need to know a bit about that to understand it. I mean, if you're reading stuff that says "The Bronze Age Collapse is an illusion," it helps if you know what it was supposed to be. Also, the latest and greatest theories are not always right--I'm of an age when the most popular explanation for religion was magic mushrooms, and that didn't stand the test of time. All that said, it would also be good to have u/g courses of the form: "Current controversies in <discipline area>", if the structures are flexible enough to allow that (they weren't at the place I worked).
-
Good to see you. I was wondering, as I watched this video, whether in fact Hermetism is symptomatic of a prevailing way of seeing the world, and that Christianity, Islam, and indeed some kinds of Judaism, articulated their particular religious insights in the terms of the prevailing way of imagining the world. So, not exactly influence of Hermetism, or any specific philosophy, but development in a common seed-bed. The way science provides a background sense of the world today, which both shapes the way serious religious thinkers do their work (religion and evolution, creation and cosmogony), and provides a language for the purveyors of woo (quantum vibrations and cosmic energy--the Wellness Centre next door is erecting a huge crystal in its back yard, and I swear it's interfering with our WiFi).
-
@JacksonNick-j6i There's a difference between gender and sex. There's quite a nice book by a Catholic priest and journalist,* called IIRC "The Year Of Three Popes" (the year of the sudden death of John Paul 1). He describes going into the working parts of the Vatican and being surprised that there are everywhere triple shithouses (not the word he uses, but it's nice to introduce a little coarseness into things). They are all iconically identified, and there's clearly a Man icon and a Woman icon, and a third icon representing a figure in a long coat-like garment. And then he realises it's a cassock, and the third loo is for priests. Three genders in the 1970s. BTW, mammals have two sexes (though not every individual is neatly sorted), but other species have more sexes. I know trans (etc.) activists can be tedious to old men like us, but that's an activist thing, not a trans thing.
*Hebblethwaite is, I believe, his name.
-
I doubt if there is any OS that doesn't require troubleshooting (I troubleshoot Windows for my friends over the phone -- not without stress), but I have found Mint to be the easiest for me. There's a different version of Mint called LMDE (Linux Mint Debian Edition), which is not based on Ubuntu and might get round your problems. Or if you have a very new computer, it's possible that you need the latest and greatest to work with your hardware, so you could try the Linux Mint Edge ISO, which ships a newer kernel. It might well be that another distro would be better for you, but I haven't found anything that is easier to get going and keep going than Mint.
It's also the case that, with normal luck, getting stuff going properly is a one time thing. Try entering a search with the name of your hardware, your distro, and a description of the problem, and see what Duck Duck Go or Google brings you.
-
@xmathmanx In practical terms, that is just not true. I am not a climate scientist, nor a mathematician, but I am a citizen with a vote. I therefore have a responsibility to form as good a judgement as I can about climate change, and about the policies best able to mitigate the bad consequences. I therefore have to try to determine which people who claim special knowledge are most reliable: that is, I have to pick experts. I should, of course, pick a variety (but exclude anyone employed by a fossil fuel company) and not expect a single narrative, but I simply can't interrogate the facts for myself; and in a case like this, of course, it is not a question of looking at the facts, but trying to decide which modelling of the facts is likely to make the best predictions.
-
As usual, this is very clear. But I wonder if it's quite beginnery enough? If it is for someone who's thinking about trying out Linux, they're probably going to be using Windows, so a demo of making a bootable drive on that OS might be better for them. I wonder, too, if "ISO" needs explanation; and perhaps a few words about what making a bootable USB means. This would, of course, increase the size of the video a bit, and maybe these points are covered in other videos I've not seen.
I see one commenter is saying demoing Ventoy would be better. I'm inclined to agree with your approach. I use Ventoy myself and it's great, but it would be a bit more complex to explain; and I have occasionally found, with older hardware, that a Ventoy USB won't boot, and I have to revert to the old skool method.
Finally, a question. I use Mint, and the file mangler on that will make a bootable USB just from the context menu on the ISO (I think it calls dd, but I'm no expert). Is there an advantage, on Linux, of using a standalone program, rather than the built in facility?
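For anyone curious what that context-menu option, or dd, is actually doing: it is just a raw, block-by-block copy of the ISO onto the whole device. Below is a minimal Python sketch of the idea -- my own illustration, not anything Mint or any imaging tool actually runs; the ISO and device paths are hypothetical, it needs root, and pointing it at the wrong device will wipe that device.

```python
ISO_PATH = "/home/me/Downloads/linuxmint.iso"   # hypothetical ISO location
DEVICE = "/dev/sdX"                             # hypothetical USB device -- check with lsblk first!
CHUNK = 4 * 1024 * 1024                         # copy in 4 MiB blocks, much like dd bs=4M

def write_iso(iso_path: str, device: str) -> None:
    """Raw block-by-block copy of the ISO image onto the whole device (needs root)."""
    with open(iso_path, "rb") as src, open(device, "wb") as dst:
        while True:
            block = src.read(CHUNK)
            if not block:
                break
            dst.write(block)
        dst.flush()   # a real tool would also fsync and verify the written image

if __name__ == "__main__":
    write_iso(ISO_PATH, DEVICE)
```

The standalone imagers mostly add the niceties on top of this: progress reporting, syncing, and verification.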
-
Thank you. Although it's not defined, New Atheism is useful as a label, at least as useful as the labels applied to literary or other artistic movements. And the chief feature of New Atheist discourse is that the practitioners don't understand Christianity; or at least, think that all Christianity is like rather benighted forms of US White Conservative Evangelicalism (very marked in the anti-creationist rhetoric of Dawkins, who is English; he would be hard pressed to find a creationist in any of the long-established denominations in England).
Many campaigning atheists aren't clear whether they're opposed to the idea of God, or opposed to organised religion. That's a distinction I'm sure you'll be pursuing, but we all know of intensely believing theists who have been strongly opposed to at least some forms of the religious group they were brought up in or which surrounds them (defiers of steeple houses, shall we call them).
Meanwhile, as some people who don't believe in God form groups for mutual support and delight and to celebrate their world view, one watches with a certain wry amusement as the history of religion seems to begin to be played out in them. How long, I wonder, before a dispute about what it takes to be a true atheist?
This, of course, only applies to public atheists, and especially atheist campaigners, and has nothing to say about the many people who live their lives without belief in any god or godlike entity (sorry, hard to express that in the light of the Christian theological tradition that says it is incorrect to say that God exists). It would be interesting to know if those many people are happier or not, feel themselves more open to flourishing, than the typical believers/practitioners of a variety of faiths.
For the avoidance of doubt, I'm not in the USA. That poor country, amongst its many ills, is under the baneful influence of a mistaken form of Christian belief so deviant as to perhaps amount to heresy. In the USA, Satanism has a lot of work to do these days (as long as they don't really believe in Satan).
-
I love this channel, but debunking woo is not the same as dismissing everything that is not physics as woo. That is just arrogance. Take "quantum healing": obviously woo, and probably a fraud upon the public designed to extract money from the gullible; but although you can't cure cancer by thinking, you certainly can produce effects in your body by conscious effort. Square breathing, for instance, lowers heart rate, and you can increase peripheral blood flow by conscious attention. I paid good money to a perfectly sane psychologist to learn such techniques to control my rage and desire to murder several of my bosses and colleagues. It works, though it's hard to explain how, or even to describe the phenomenon, without using mentalist concepts.
Doesn't do any good to pour scorn on research on ESP. It was, for a time, a possibility taken seriously by serious people. Properly investigated, it was shown not to happen--rather like the way the Society for Psychical Research put a great deal of effort into debunking fraudulent mediums, so that now mediumship is just a form of stage entertainment and exploitative psychotherapy. It's the same as the way historians of science point out that although alchemy is now obviously wrong, it took a lot of serious scientific analysis to distinguish alchemy from chemistry, the woo from the true.
Shut up and calculate is a method that works, but to assume that it rules out attempts to "understand" what is going on is philosophically naive.
I am not a philosopher, but I know some, and they aren't actually fools. But of course scientists have to denigrate all humanities studies because they're competitors for funding in universities.
-
@dlsisson1970 You are quite right. One problem in discussions like this is that Linux has two main user groups. There are the traditional OG users, who at the least are computer hobbyists, and go all the way up to major server wizards in large organisations, and developers. These people want and need the command line, and they want and need to learn the system.
The newer group, and I'm one of them, just want an OS to run their computer so they can do everyday normal stuff, like watching cat videos and writing books and doing the accounts for their business. We're fed up with the Apple way, and have come to seriously mistrust Microsoft's desire to own our data and get in our face at every opportunity. And give or take a software issue or two (mostly spelt A D O B E) we've got that now, in the two or three obvious, big, desktop-environment-centred distros. We don't want, or need, to know a lot of stuff, any more than anyone needs to know how to run a server with Windows to organise their book club or engage in the collaborative production of a policy document. Sometimes real wizards give advice to normies with the best will in the world, but without realising there's a whole new audience out there. They run the internet, but we're the people who will bring about the year of Linux on the desktop (if it is permitted to speak apocalyptically).
-
Brilliant. Thanks to this video, I have got a long way with running a virtual machine, which I had previously regarded as an Arcane Mystery, not for the likes of me.
My version of VirtualBox is 7.0.4 and I'm running it on the Thinkpad T420 I use for Adventures in Computing. One difference I see with 7.x is in installing Guest Additions. Your version offers a choice of physical drive or ISO. 7.0.4 doesn't offer that, only having an option for CD, but there is another command (something like "Load Guest Additions CD") which slightly counterintuitively means "tell the machine to look for the ISO in the D: drive." That doesn't work, but I looked in Explorer for something that looked like the right file, and clicky-wickied, and it worked.
I got a slight hiccup with USB. My venerable machine only has USB 2, so I clicked the radio button for USB 2 controller, and that didn't work: it does work if I tell it to enable the USB 3.0 controller which, as a matter of fact, I don't have. idk, but I can see a USB drive now.
My only problem left is getting iTunes to see the WiFi network rather than the virtualised connection, so I can hook it up to Airplay. This is a pretty niche requirement, and thanks to you I feel confident in trying to solve it.
I spent a career in tertiary level teaching, some of it fluffy stuff, some of it with its own technicalities. I know absolutely first-rate exposition of difficult material when I see it, and you're a master. Just the right amount of PowerPoint and graphics--in this, I was actually grateful for having the main points all on one slide. Brilliant.
-
I think Simulated Intelligence might be a better term. I don't know what the Python code is like, and it is clear that Machine Learning can help humans greatly when very large data sets are relevant. But the summary of Macbeth is a sign that the value of these systems varies greatly depending on domain.
Basically, the Macbeth summary is vacuous crap. Anyone who teaches English will have seen that sort of thing, and it probably from time to time causes one to ask why one is doing this job. It is likely to be accepted, because management requires a certain minimum pass rate, but you can't imagine that the person who produced it gained any value from the exercise at all, and would have been better off doing something else.
But, even today, it might not be accepted because there is a howling error. It is not Macbeth who goes mad, but Lady Macbeth. To tell the truth, Macbeth is not a play I think all that highly of (within the context of plays by Shakespeare), but the ChatGPT error makes me realise that one of the interests in the play is the way that the rather stolid Macbeth keeps on to the end, whereas the more ruthless and imaginative Lady M loses it. But the point is you need a human, with a prejudice against the output of high powered chatbots, to detect the error.
All literary stuff is doubtless of little interest to the STEM crowd, but it should be, because this sort of activity is a surrogate for a great deal of human employment of intelligence. How does the latest statement from the Elbonian Foreign Office compare with earlier pronouncements? You might find a machine learning analysis useful, but God forbid anyone should rely on it, because of the capacity for error: and on the evidence so far, error is going to persist.
As for the two brief statements about AI, the ChatGPT one is so flabby as to be useless--it's the sort of unexceptionable blandeur that CEOs pump out on public occasions, because it will never come back and bite them. Whereas the human response, though necessarily general, points to specific areas of concern and opportunity.
The real danger is that, because it takes considerable active human intelligence to make good use of this machine learning output, too few people will be aware of its dangers, and what the computer says will be taken as an oracle. I look forward to the day when all medics can take advantage of automatic analysis of huge data sets, just as they now use sophisticated analysis and imaging; but I still want to be diagnosed by a doctor, not a robot.
Oh, yes, and all that analysis will be great in government: but you still want elected politicians to set goals. After WW 2, many governments set out on a path towards rough egalitarianism--not the total eradication of difference, but a levelling up, the assurance that no one would fall below a minimum level of material provision that was at quite a high standard. I was hugely a beneficiary of that consensus. In the 1980s the mood changed to the pursuit of economic efficiency, and now prosperous countries have poverty and deprivation on a level not seen since the Great Depression. Machine Learning will have nothing to say about which of those paths to pursue; for that choice to be made--or rather, for the path between extremes to be negotiated--we need human politicians, however prone to failure. And hope that the bad actors don't use AI to cook up all sorts of misinformation which is carefully designed to correlate with what people are already saying, and so believe, but extend it and weaponise it.
In some areas, AI is overhyped, in others it is very powerful. Unfortunately, humans do bad things with powerful tools, sometimes. Oh, and AI is not going to be part of our evolution until we can exchange genes with an Nvidia card (though perhaps research on that is already going on in basements across the globe).
-
Well, what is the job of Windows 11? To make money for Microsoft. At first, in the days when there was quite a lot of competition amongst apps in the Windows (and DOS) world, MS used its OS monopoly to give an advantage to its own apps. But now there's a near monopoly in the basic office apps, Windows 11 tries to ensnare its users into dependency on a Microsoft account, and hopefully, the use of Microsoft's cloud. I don't want to sound like Stallman, but it does look as though MS is trying to get to a place where all your data belong to us, a kind of legal ransomware. And this, of course, is exactly what a corporation ought to be doing, by the current doctrine that the role of a firm is to maximise the return to its shareholders.
-
I very much doubt that Cook used an internal mental clock. If you've got solid land, a telescope, and appropriate astronomical tables, you can find the local time and compare it with a remote reference (say, Greenwich time). Some people thought that that would be the way the problem of longitude would be solved, and possibly one reason that Harrison had such a hard time getting his payment was that just having a very reliable clock wasn't seen as the proper way to do it.
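The arithmetic behind that comparison is pleasantly simple: the Earth turns 360 degrees in 24 hours, so every hour of difference between local time and the Greenwich reference is 15 degrees of longitude. A toy sketch in Python, with invented numbers rather than anything from Cook's logs:

```python
def longitude_from_local_noon(local_noon_utc_hours: float) -> float:
    """Degrees of longitude west of Greenwich, from the Greenwich (UTC) time of local noon.

    The Earth turns 360 degrees in 24 hours, i.e. 15 degrees per hour,
    so each hour local noon lags Greenwich noon puts you 15 degrees further west.
    """
    hours_behind_greenwich = local_noon_utc_hours - 12.0
    return hours_behind_greenwich * 15.0

# Invented example: the sun crosses your meridian when the Greenwich-set
# reference reads 17:00, so local noon is 5 hours behind Greenwich noon
# and you are about 75 degrees west -- roughly the longitude of the
# eastern seaboard of North America.
print(longitude_from_local_noon(17.0))   # -> 75.0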
There's actually an interesting contrast in the way two seafaring cultures did their navigation. Cook was a great practitioner of the Western mode, with the knowledge embodied in instruments like sextants (and their predecessors) and clocks and maps. Traditional Polynesian navigators internalized a great deal of knowledge about prevailing currents and sea states and winds, and also internalized a moving picture of the position of certain stars in the sky at different times of year, and as they apparently moved during the night. Ultimately the Western way was easier to transmit and extend--it didn't take as long to learn to navigate in the Western fashion as in the Polynesian way, and you could extend your charts more or less indefinitely; but Cook took Tupaia, a traditional navigator from Tahiti, with him on his first voyage to New Zealand. Maori, because Tupaia's cultural style was much more familiar, sometimes assumed that he was the leader of the expedition.
-
From my experience in a similar area, I think it's not exactly incompetence but selective blindness, and impatience on the part of the talented with what they don't understand. I used to work in a university, as a subject academic, and also as the sort of nerd who works out the rules and regulations. It's an acquired taste, and I could well understand my colleagues who weren't the least bit interested; but some would refuse to admit that this sort of stuff was anything to do with them at all. I'd say "I know what you want, and it's good, but the way you're doing it won't work in the system, and it's affecting your students. Let me design some regulations for you that will do what you want, and that the computer won't barf over." "No, I did it this way in my last university and I'm not going to change for some bureaucrat." Well, you can kind of see that, but you need some kind of system: in the FOSS world, legal technicalities are what keep it FOSS, so you have to respect them, even if they bore the hind leg off a donkey. I never had the nuclear option, but sometimes I'd have used it if I had.
-
The Inca rafts are fascinating, but make me think that you might look at Pacific traditional craft. Like, catamarans, with shunting rather than tacking. The Auckland (NZ) Maritime Museum has a display of various traditional Polynesian (and Melanesian) vessels, and what is striking is the range between long distance cargo boats, and little outriggers that you use to go from one side of the lagoon to the other, the equivalent of the bicycle. If you don't have a lot of flat ground, the wheel is no big deal. And, indeed, water transport could beat road and rail up into the middle of the 19th c., and can still beat it for some freight.
In passing, about the Stone Age. That's a real thing, but there's a terrible tendency to conflate material culture with intellectual development. Until contact with Europeans, NZ Maori had no metal, and no writing. As soon as they met metal, they knew this was the thing, and the style of woodcarving changed. Maori were magnificent navigators, but they also quickly mastered the new European style of sailing boats, and in the early days of European settlement were dominant in the coastal trade. And by the middle of the 19th c., there were Maori language printing presses. Technology transfer, as we see in Japan, and then China, and now South East Asia, is a real thing, and can happen very fast.
-
Interesting, and also a cinematic triumph. Many famous film auteurs have tried to get there, but have never quite achieved a fully motivated black screen.
I see quite a lot of comments saying that gen 3 and 4 Intel i7 chips will "easily" outperform these little modern CPUs, but without any details about the workloads on which they are so superior. My most modern chip is an i5-9500, and in your tests, which represent most of what I'd be doing, responsiveness looks pretty similar--and most of my machines have much earlier chips (the HP Mini that I just got for $10 with a single core Atom is, alas, too sluggish for pleasant use). Is there likely to be any kind of work with which the golden oldies are going to be much better (excluding, of course, the use of external video cards)?
-
Thank you very much for another year of old skool intelligent, informative content, clearly delivered. You will NOT be replaced by any conceivable AI.
On Windows 12 and the hardware upgrade treadmill: I'm trying to remember when I decided I didn't need a new computer for anything I wanted to do--probably about 2015. So different from the early days, when everything was obsolete after two years. I do not want local generative AI (except for very specific cases), and I've finally switched to DuckDuckGo for default search, since it still seems to act like a search engine.
Mini PCs are intriguing. Long ago most non-gamers/non-creators decided that a laptop or AIO was enough computing for their needs, in a convenient package: the teeny-weeny boxlet seems to be going back to discrete components. Makes sense, I suppose, since monitors last forever, and keyboards and mice can be cheap to replace (for non-gamers). But there is still the problem of cables, which can be an important consideration for what used to be called the Spouse Approval Factor. Will we go back to some kind of dedicated computer desk, for external cable management?
-
Thank you. There is a strangely persistent belief in British pop history that radar was a British invention. This is probably due to the capacities for self-promotion of Watson-Watt, but it's odd that it has survived so long now that the 40-year history of radio-frequency detection has been studied. (BTW Watson-Watt genuinely was responsible for HF/DF, which located very short bursts of radio transmission: it used a technique he developed for detecting lightning strikes when he was a civilian scientist, and it mattered for submarine detection.)
There's a common belief that Chain Home was very resilient precisely because it was rather crude in construction: do you think that's right? I realise that we assume the radar should have been the first target, because in any modern invasion it is--but that's with modern precision weapons, especially radiation homing weapons.
My sense is that the Dowding System worked for a variety of reasons. First, as you say, it was the integration and distribution of information which was the technical triumph. Perhaps more important was Dowding's perception that the object was not to achieve victory but to avoid defeat. An invasion would have needed air superiority, if not command of the air, as a necessary, though not sufficient, condition, and his job was to prevent that, which he did by ensuring that relatively small numbers of aircraft were fed into what we'd now call 'target-rich environments'. He resisted the urgings to mount giant Sky Battles, and got kicked out as thanks; but it was important that he understood how much air war in the 1940s was a question of attrition.
As you say, radar at the level of Chain Home was no use for final interception (and it took a long while before British radar was good enough for interception at night). German radar could direct defending aircraft more precisely, but (was this 'because' ?) was more limited in the number of interceptions it could handle. I assume the chief value of Chain Home was to give advance warning of the direction of an attack: I can't assume that members of the Observer Corps were at a constant level of high alert, but I don't know of records of messages being sent out to wake them up and point them in the right direction--have you found any? The other great virtue was giving advance notice to the fighters to take off and climb to operational altitude in the right general area, particularly important with fighter escort high cover; and also giving assurance in not putting up other squadrons, so preserving mission capability and going with Dowding's parsimonious approach to the battle.
-
The notion of "animism" as an attribution of personality to the inanimate sounds very much like an artefact of the absolute (and mistaken) distinction between humans and non-human animals, regarding animals as machines. A bit closer to that idea of animism is the habit of many people who have a lot to do with machines of regarding them, or at least addressing them, as persons. Why is it necessary to swear at a car when fixing it? Why do people try to coax a machine into doing what they want it to do? Why do down to earth and oily-handed mechanics (sometimes) give their cars names? I think it's not actually a metaphysical proposition; more the adopting of a mode of interaction with the external world. I noticed a couple of years ago that, whereas when computers were new, one RTFM, for things like phones there are no manuals, and it's more like interacting with an animal that knows lots of tricks and can be taught new ones, if you get to learn its, well, personality.
-
Love tech evangelism: preach it, brother. When I was first offered detergent tablets by a demonstrator in the supermarket, I did some quick mental arithmetic and said "Expensive way to buy detergent," and she couldn't refrain from nodding. BUT, Bosch's detergent dispenser has forced me to use the damn things. After about 18 months of ownership, the dispenser sometimes sticks, and doesn't dispense. A quick internet search, and it's not just me. Message to Bosch NZ gets an admirably quick and polite response, but they tell me it's to do with the way I load the machine (NOT blocking the door mechanically, mind). That has to be corporate bull-shit, surely? So now I throw tablets onto the floor, and the dishes come up clean. But I should NOT have to do that. Thank you for trying to introduce me to the Full Potential of my dishwasher--I know it's there, but somehow, always just beyond my aspiring grasp.
-
1. Websites where a business has had a site made for it, and it includes a "Contact Us" page, and no one monitors it, so you never get a reply (these people mostly only use the phone, because they can't write).
2. Commercial websites that have been made by Web Professionals, and include all sorts of self-gratifying Design Elements and Fluid Transitions, but No Sodding Information (sometimes, if you fossick around enough, you can find a link to really ugly tech spec text pages, but often not even that).
3. Micro-USB connector, which is near enough to symmetrical that you can't tell the right way up. This is the worst ergonomics of the whole USB system, and that, as everyone knows, is a high bar to cross. USB B is good, and I find Type C a relief, but Type C hasn't replaced anything, just another standard to add to the mix (the last time I bought an external DVD drive, it still had a Mini-B connector). Standards are wonderful--there are so many of them.
-
Two urban myths could have been dispelled better.
MS never said that Windows 10 is the last version ever, certainly. But that official rebuttal is so full of obscurity vetted by the legal department that the tech journalists might have genuinely misunderstood it. And if at first you don't succeed, give up--that's not the attitude we expect from a predatory monopolist. If they'd just said "Look, we never said W10 was Windows' final form. <Legal cya material follows.>" they would at least have tried.
Linux and the terminal. Shortly after Windows 11 appeared, and before I'd worked out how to turn off a lot of the obnoxious stuff, I was trying gently to propagate the virtues of Linux for some users, and someone fairly cluey about computing in a corporate environment said, "Oh, Linux, you have to do everything in the terminal." He was obviously thinking of the server people at work. But this myth persists because, if you go online looking for help, you'll see lots of how-tos using the CLI, even when GUI alternatives are available. I don't think this is flexing, mostly; if you spend all your working day in the CLI, it will be quicker to do simple jobs in the mode you're familiar with, and there are some things that you can only do in the terminal, and others that are easier there (to go back to my beginnings, PIP B: = A:*.DOC is quicker and easier than copying all the originals, but leaving the back-ups, in a GUI file manager). I don't know what to do about this, but at least the forums for desktop-oriented users might adopt a policy, or at least a guideline, or maybe a nudge, about using GUI methods unless absolutely necessary (as some do).
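To be fair to both camps, the modern scripting equivalent of that old one-liner is still only a few lines. A throwaway Python sketch with made-up paths, just to illustrate the "copy what matches the pattern, leave the rest" point, not a recommendation over either the CLI or a file manager:

```python
import shutil
from pathlib import Path

SRC = Path("/mnt/a")   # hypothetical source, the old A:
DST = Path("/mnt/b")   # hypothetical destination, the old B:

# Copy every .doc file (and nothing else -- the back-ups stay behind).
DST.mkdir(parents=True, exist_ok=True)
for doc in SRC.glob("*.doc"):
    shutil.copy2(doc, DST / doc.name)   # copy2 keeps timestamps as well
```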
One last thing: do I believe in an urban myth? When RAID first appeared, I understood it was an acronym for Redundant Array of Inexpensive Disks, as opposed to the expensive (Enterprise, we would say nowadays) disks used for high availability. But so many acronyms have varying expansions, especially under the needs of social change. Was it always Independent disks?
-
I'm glad you posted this video. I lived through the changes in academia that brought things to the state you found, in which universities lost their special nature and became just another commercial organisation, managed to make money, and managed by the same sort of people who run commercial operations.
I was lucky to be able to get out by early retirement, and I was in the Humanities, where the commercial pressure was less intense because they're cheaper. But it is clear that by now the university is just a commercial player in the qualifications market.
What we need to carry on the human search for deeper and better knowledge is to find or invent new institutions, now that, after about 800 years, universities are no longer the place to do that. I had the time to set up voluntary groups to study ancient and medieval languages, and found like minded souls, and that costs practically nothing. You have gone to YouTube, where I and hundreds of thousands others are glad to find you; your work costs money, and you have an honest commercial relationship with society, one that does not distort your inquiry.
Long may you flourish.
-
Another thought: at about 1:10 you show a screen of the Windows file manager and talk about "default folders". Alas, they are not folders but libraries. Having been brought up on CP/M and DOS, I am used to Directories, and learned to call them Folders, but Libraries are so virtual that if you try to manage them manually you can get into a terrible mess (or at least, I did). I think of them as a fine example of an OS trying to do things for you, but assuming you'll fit in with its notions of the way things should be. So, although I have great problems working with mount points, I am sure that if I took the time I could understand them, but it seems as though Windows is trying to hide the reality from me. Or maybe I'm just ranting.
-
My recent experiences with Windows make me think of Microsoft as a corporate predator, seeking to ambush its users to get more access to their information, or maybe old-fashioned money. It has great difficulty just shutting up and letting me get on with what I'm doing. It's the barefaced impudence with which it does it that is especially annoying.
The lack of customisability in the interface is, perhaps, more understandable. The big bucks are in mass orders for corporations, and there are good reasons for big organisations to want all their desk-top computers to look and work the same, especially as hot-desking becomes another way of cutting people costs. Since there are also good reasons for an individual not to get too dependent on their own special way of setting up a computer, but just getting the most out of what the manufacturer gives you, probably the best we can hope for is something intelligently designed, and not changing just to give the designers a stunning innovation to put on their CV.
As you were talking about Microsoft losing its monopoly, I thought at first this was wishful thinking (MacOS is annoying too, and Linux won't get mass acceptance): but then I realised that ChromeOS will do most of what most people want to do on a desktop, and although Google are quite as nefarious as Microsoft, they tend to be a bit more subtle about the way they grab all the details of your on-line life.
Microsoft, Google, Zuckerberg, Musk--it is all a bit dystopian, isn't it? Your fury is understandable.
-
I greatly admire your ability to present content clearly, and I learned from this, but I'd suggest the first section does not start from the right place, and misses a chance to free people from some bugbears about sound quality.
I think there are two things that need making clear right at the start: the limitations of human hearing, and the fact that audio digitisation is a process of sine-wave reconstruction, not the smoothing out of little isolated steps (like the columns of a histogram).
Both of these apply as much to uncompressed as to compressed formats: when the consortium introduced the CD format, this was based on a huge amount of study of the limits of human hearing, done by record companies as well as tech companies, and the record people would not want anything that would make their stuff sound bad (DGG was big into this). This established the parameters for the CD, and these are the outer limits. So people who pay extra for "high definition" audio at high sampling rates and bit depths for listening (as opposed to editing) are not getting anything tangible for their money (though doubtless it makes them feel special, and that is always worth spending money on--I do mean that). People who were brought up in the analogue world, where there was always a small (though diminishing) gain to be had from getting a better cartridge or a more exactly controlled turntable, often have difficulty accepting that good enough is truly good enough, and that though you can get something that measures better, it is a physiological impossibility for it to sound better--just as you could build a computer monitor that emitted in the deep infra-red, but it wouldn't look any different. Also, it seems that the most discriminating hearing is to be found in young females, who are not strongly represented among the people who think a lot about audio techniques and equipment.
The other thing that needs explaining -- and even a mathematical dunce like me can kind of grasp it -- is that the digitisation and reconstruction of a sound wave is a precise reconstruction of sine waves up to a certain frequency, and not a smoothing out of lumpiness (which implies approximation). A misapprehension here leads to a misunderstanding of what is an easy sound to compress and what is hard. The step-smoothing notion leads to the idea that an instrument whose sound is very close to a pure sine wave is hard -- so solo flute is difficult, people sometimes think. Doubtless they think of the sound of a flute as pure, and of digitisation as a pollution of analogue purity. Whereas in fact a flute is very easy to digitise, and hence to compress lossily. What is difficult is a Nordic death metal drummer at the end of a session, hitting everything as hard as possible as often as possible, with diminished precision.
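For anyone who wants to see the reconstruction point rather than take it on trust, here is the standard textbook sketch in Python -- my own illustration, nothing from the video: samples taken above twice the highest frequency pin down the original sine wave via sinc interpolation; there is no staircase waiting to be smoothed.

```python
import numpy as np

FS = 8_000          # sampling rate in Hz, comfortably above twice the tone's frequency
F_TONE = 440.0      # a flute-like pure tone at 440 Hz

n = np.arange(64)                                   # 64 sample instants
samples = np.sin(2 * np.pi * F_TONE * n / FS)       # the sampled values

def reconstruct(t, samples, fs):
    """Whittaker-Shannon interpolation: value of the band-limited signal at time t (seconds)."""
    k = np.arange(len(samples))
    return np.sum(samples * np.sinc(fs * t - k))

t_mid = 10.5 / FS                          # a point halfway between samples 10 and 11
print(reconstruct(t_mid, samples, FS))     # reconstructed value between the samples
print(np.sin(2 * np.pi * F_TONE * t_mid))  # true value -- very close; exact in the idealised infinite case
```

And the same picture explains the compression point: a near-pure tone is one strong frequency component with almost nothing else to encode, while the drum session's spectrum is dense and changing fast.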
These two things are hard for lots of us to grasp, at first, but getting them straight can end up with a lot less disk space (not that that matters now) and a lot less anxiety about quality, more money to spend on speakers where it makes a difference, and more mental energy to spend on listening to music, which is the main point of the exercise.
All this from the point of view of the listener: for editing, of course, you want to stay well clear of the minima to make life easy: and maybe that's another clarification that could be made: there are significant differences in the requirements for the editor and the end listener, and your attention is perhaps more addressed to the needs of people who edit?
Love your work.
-
There are rational arguments either way. There's the judging of strength, which depends on whether or not you make a consistent cup of tea (NATO Standard, even). On the other hand, if you put in milk first, then the milk is brought up to temperature more gradually, and so is less likely to scald. Meself, I don't like milk in tea.
There is however a perfect way out of these etiquette things. If someone raises an eyebrow at an alleged faux pas, you simply remark, with total assurance, "Oh, in our family we always put the milk in first," as though they ought to take your family as the arbiter elegantiae.
BTW, I suspect the fetishisation of boiling water might have something to do with economising on tea in WW II, as exemplified in Orwell's essay on how to make a cup of tea. (You are too young to remember Jack Train, a WW II and post-war BBC DJ: his catchphrase was "Always remember, take the pot to the kettle, not the kettle to the pot.") With boiling water you get maximum extraction, which helped make the tea ration go further; though it does vary with the type of tea. With the black teas (oxidised; the Chinese call it red tea; often called Indian in the UK), something close to boiling is normally recommended; with green teas, something quite a bit off the boil is better (less bitter), and I've even once had a very refined Japanese tea where a temperature of 70 °C was recommended.
-
I come late, but it is time to denounce the hoax that this was an April Fool gag. The weapon has, in fact, a very long conceptual history. In the Iliad, Homer depicts his heroes at times of great stress picking up rocks and throwing them, often with fatal results. These were, to an extent, Improvised Concussive Devices, and had no handles, but on the other hand the impact was enhanced by the fact that Homer's heroes could wield weapons such that two men of Homer's day could not pick them up.
The other factor that points to the April Fool cover story as being a piece of French establishment disinformation is the passing over of the Galalith angle. Galalith has many virtues, especially in the construction of fountain pens and the handles of knives, and so its tactile qualities suited it well for the use. However, it is extremely poor in its resistance to moisture (I have a ruined pen to prove it), and so was obviously unsuitable for employment in a WW I trench. The placing of the contract, and still more its continuation after the first reports of combat experience, would have threatened the position of the government in power that month. Such information as has come to light suggests that the original order was influenced by the complexly reticulated relationships of a certain Mme Poisson d'Avril.
-
The problem is NOT climate scientists. The problem is the rich and powerful who call even moderate predictions "alarmist" or a conspiracy, and who manage to persuade a sufficient number of voters in democracies that those scientists are liberals or are only doing it for those huge research grants they get (yes, totally out of touch with reality). That is a popular message--no need to worry. If you want to blame someone, blame the politicians of the centre-left parties who have got totally out of touch with their historical constituencies, and so have no capacity to communicate unwelcome truths.
In countries that are not democracies, of course, just mine coal, drill for oil, and ignore anything beyond the limit of the Five Year Plan.
I'm 80, with no kids, so I have no dog in this fight. But even I might live to see the massive suffering when there are mass migrations from now uninhabitable regions. Weep. Just weep.
-
@ExplainingComputers Just to follow up on one of my failures. I wanted to run iTunes in Windows, using Airplay. To do that, iTunes has to see the WiFi adapter with its own beady little eyes, apparently. This is not possible with the host computer's internal WiFi adapter, because VirtualBox virtualizes it, which is not good enough for iTunes (picky). The answer seems to be to use an external USB dongle in the VM. I happened to have an ancient WiFi n adapter; after a certain amount of fluffing around with switches and getting the sequences of start-ups right, I got the external adapter visible in Windows, and iTunes found my Airport Express. So, finally, I could listen to Andras Schiff playing Bach over my monitors attached to an Airport Express, while browsing the web in Firefox in Linux Mint. Tomorrow, I see if iTunes can find my iPhone when plugged in. (All the choices which have brought me to this situation have been made rationally, but the complications can make you wonder.)
-
I found this very interesting, but I think you've oversimplified what manuscripts mean as evidence for historical fact. You are dead right in saying that the gap in years between the events and the first MS is of little to no importance. What really counts is the number of generations of copying, because it's pretty much an axiom that every act of copying introduces mistakes (Jewish scrolls of the Tanakh for use in the synagogue are perhaps an exception, but that's the product of extraordinary care, and I'd bet there'll be some errors in that body of texts). So, if you have a MS that was written a thousand years after the composition of the text, but was copied directly and carefully from the author's fair copy, it is vastly more authoritative than a MS that's only a hundred years after the date of composition, but is a copy of a copy of ..., and some of those copies made either carelessly, or (worse) by people trying to "correct" or "improve" what they had in front of them. This applies to all manuscript evidence of all texts from before the introduction of printing.
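To make the "generations, not years" point concrete, here is a toy back-of-the-envelope model -- entirely my own illustration, with an invented error rate rather than a scholarly estimate: if each act of copying corrupts some fixed fraction of readings, the expected survival of the original text depends only on how many copies stand between the autograph and the surviving manuscript, not on the calendar gap.

```python
# Toy model: each copying generation corrupts a fixed fraction of readings.
# The error rate is an invented illustrative number, not a scholarly estimate.
ERROR_RATE_PER_COPY = 0.01   # 1% of readings altered per act of copying

def intact_fraction(generations: int, error_rate: float = ERROR_RATE_PER_COPY) -> float:
    """Expected fraction of original readings surviving after n generations of copying."""
    return (1 - error_rate) ** generations

# A manuscript 1,000 years after composition, but only 2 copies removed from the original:
print(intact_fraction(2))    # ~0.98

# A manuscript only 100 years after composition, but 15 copies removed:
print(intact_fraction(15))   # ~0.86
```

Years appear nowhere in the calculation; and careless or "improving" copyists simply raise the per-generation rate, which compounds faster still.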
The second point, which you rather conflate with this, is the difference between Caesar's accounts of his wars, which are undoubtedly spun like a top and propaganda, but are at least trying to look like an accurate narrative of events, and the gospels which are intended to communicate faith, and interpret the life of Jesus. All history is full of interpretation, but the accounts of leaders who are meant to be inspiring are, as you say, a different genre: the gospels probably sit between the popular accounts of the life of George Washington and the official biography of Kim Jong-Il. I suppose Billy Graham was trying to counter the Jesus mythicists, but it's a very poor argument, as you say, even against that futile hypothesis.
1
-
1
-
1
-
At least you see that anti-intellectualism doesn't confine itself to the humanities. You should have added that there is particular hate directed in that direction because it's an area in which women are in the majority: there's probably a big overlap between STEM bros and incels. But the abolition of the Humanities was going fast in New Zealand before this thesis hit the headlines.
It's all playing into the populist mode. Headlines indeed: condemn a thesis by its title alone. Which fits well with the MAGA contempt for *all* credentialed experts, especially medical. Prevent a disease by injecting poison into your arms? Obviously wrong, if you just use common sense.
BTW, I don't know if the thesis was bullshit or not. There is bullshit in the Humanities, mostly generated by the dedicated followers of fashion. There is too much research in the Humanities, and in the field of EngLit, a lot of it looks very unlike what I understand as the study of literature, and more like half-baked philosophy, politics, and psychology. But some of it is good, and asks us to look at what we take for granted. I'd be interested in a thesis on smell. Think of the late 18th and early 19th centuries. When the Prince Regent (to be George IV of the UK) first encountered his bride-to-be, Princess Caroline, he needed a large brandy because she wasn't much into personal hygiene. OTOH, it is said that Lord Nelson, when expecting to return to Lady Hamilton, exhorted her not to wash until he arrived. And to think that our attitude to smell is unrelated to our attitude to other social groups would be naive. Might be worth having a look at that. Who knows if the thesis did that, or just regenerated a lot of theoretical boilerplate, but it might be worth more than a YAST* thesis. And if you need to moan about the poor suffering taxpayer, it wasn't your taxes that paid for it, and Humanities research is dirt cheap compared with anything else except, perhaps, Theology.
*Yet Another String Theory
1
-
1
-
1
-
1
-
1
-
This video is a great contribution to the good of the world. Clearly, ASD is real, and the neuro-typical must acknowledge that. Equally, the activists seem to be more concerned about being activists than helping anyone. Dr Hossenfelder's zero bullshit approach has helped me think about this more clearly.
I wonder if it is right to consider "masking" a problem? From another point of view, it is people on the spectrum learning a social mode in which they can function. For everyone, functioning in society involves learning how to adopt a social mode: in society, we are all, always, acting to a certain extent.
To support that point: a friend's daughter is on the spectrum. She has difficulty with modern job interviews, which are so very informal. If we neurotypicals went back to formal etiquette, and an insistence on the right way to do things, we might make a world easier to navigate for the neurodiverse.
So, Dr Hossenfelder, maybe being German is the best thing to do?
1
-
In a reasonably functional world politics is a mixture of values, and of means of implementing those values; ultimate values obviously affect the values part, but normally don't control the question of implementation, which is largely technical. So, for a Christian, it is undoubted that active concern for the poor, widows, orphans, and the stranger within the community is a fundamental requirement. We know that Jesus, as a Jew would, held this value as important. But that doesn't necessarily decide between left and right in terms of normal politics. We think of the left as being especially concerned for the poor, but a particular left party might be too ideological, or just not competent enough, to effectively help those with a special call on us; while there are right-of-centre parties, and especially factions within right parties, that have a genuine concern for those who are down on their luck, and have policies to help them.
The upshot is that, I think, religious values don't tell you how to vote, but they can tell you if there are parties that it is impossible to support in good conscience--such as an extreme neo-liberal party, or a left party that is more interested in ideology than people.
That's in a normally functioning society, with a healthy religious environment.
1
-
1
-
1
-
1
-
1
-
I think it's a mistake to completely write off conventional hybrids. They are at their most efficient in cities, with a lot of stop-go driving, which is where ICE vehicles are least efficient. The downside to that, of course, is that cities are where it's most likely that we'll shift to more use of public transport; but still, there's a bit of a win to be had.
I'd really like to know if synthetic hydrocarbon fuel, more or less net zero carbon, is a realistic possibility or not. If it is, the PHEV or EV with range extender would be very attractive possibilities, using the existing liquid fuel infrastructure, and meaning the batteries in vehicles could be considerably smaller.
1
-
1
-
1
-
1
-
I see, some people are confusing "literal" with "accurate". I don't think translators of non-Biblical texts use the terms "formal" and "dynamic" equivalence, but in the few little translations I've done, I have been very conscious of two main classes of readers: those who want an equivalent to reading the original, and those who want a translation as a help to reading the text in the original language.
Translation is impossible, if you set a high enough standard; if you really want to know a text, at the least you need to read many translations, but in the end it's probably just less work, and more satisfactory, to learn the original language.
And the best of luck with the language courses. Anglophones have a dreadful tendency to become monolingual: something King Alfred the Great recognized over a thousand years ago, and wanted to put right, "for the more languages we know, the greater wisdom will be in this land" (translation slightly modified to make my point: from https://www.departments.bucknell.edu/english/courses/engl440/pastoral/translation.shtml . It's an interesting statement in its own right).
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
Some of the effects of computational photography used to be achieved with (less precise) chemical methods: you could enhance the sharpness of an image by controlling the dilution and frequency of agitation of the developer as it acted on the negative (enhanced sharpness, technically acuity, might be gained at the expense of fineness of detail, and that was a big thing in the 1960s). Doing this, and much more, digitally gives greater precision and control. At the moment, the basic happy snap from a phone camera is getting to the point where it looks a little over-processed--hyper-reality rather than reality. I notice this when I compare results from my iPhone 13 Pro with the pix I get from "real" cameras with bigger sensors (and much less processing power). But I also like the phone look, most of the time. And you aren't doomed to have it. I haven't tried using the RAW images from my phone, because I'm not doing anything serious any more, but it's there on many makes of phone. What I would like is more ability to control the degree of processing, as you get on a stand-alone camera. Some control is there on phone cameras, but there could be more. But phone cameras are really the social equivalent of the box camera (or the Instamatic, like the little Rollei at the beginning of the video), and it's astonishing how technically capable they are. I would never have dreamed of taking a movie of a bumble bee enjoying itself in the flower of an artichoke, but it was a piece of cake with an iPhone SE (1st gen).
1
-
1
-
1
-
1
-
1
-
1
-
1
-
I love your work, but this especially was deeply moving--I speak as someone whose chief intellectual concern is being expunged from universities. In particular, though, two specific things struck me: the first was the Library as an institution of cultural imperialism (which does not invalidate the work done there--intellectuals work in the crevices and shadows of the exercise of power); the other was the point that Alexandria is humid. I'd never thought of that. But, with papyrus, it means the library was doomed to disappear without a constant process of renewal. And, of course, when it comes to destruction by fire, shit happens: cf Notre Dame in Paris, and the 1992 fire at Windsor Castle in the UK. Both could have been started for all sorts of political reasons, but in fact the calamities were just dumb bad luck.
On the numbers, two things occur to me. One is that it is easy to get confused about what counts as a book: for us, the Iliad is one book (as, for instance, in the translation by Emily Wilson plug plug--one ISBN); but it is divided into 24 "books," the division apparently done by Alexandrian scholars and so presumably corresponding to a scroll in the library. On the other hand, there were presumably many works of which multiple copies were held: those same Alexandrian scholars worked on establishing the text of Homer, and you can't do that without multiple copies--especially because, in the world of manuscripts (texts written by hand), every copy is an edition.
Thank you so much for this. I am off to the merch shop.
1
-
1
-
1
-
1
-
Jonathan's explanation is ingenious, but wrong. These are, in fact, pairs of Transylvanian duelling pistols. In a Transylvanian ceremonial duel, the seconds loaded each pistol mechanism, and then screwed them together on the common barrel. The principals then took one end each of the assembled apparatus, and at the drop of the umpire's handkerchief both fired, as near simultaneously as possible. The results were rarely fatal, but the participants were likely to be wounded by flying pieces during the self-disassembly that inevitably occurred. These wounds resulted in scars, which were worn as a badge of honour, like the scars of German university duels with edged weapons.
1
-
1
-
1
-
1
-
1
-
There's some very good evidence against a simple connection between critical thinking and atheism. First, it's easy to find theologians who are ferocious in their analytical thinking, and retain religious faith. I'm thinking mostly of medieval Christian academic theologians (like Gaunilo of Marmoutier, a Benedictine monk who analysed and rejected Anselm's ontological argument for the existence of God). And on the other hand New Atheists do not by any means employ critical thought all the time. I lost what faith I had shortly after, and fairly certainly in part because, I read Dawkins' _The Blind Watchmaker_; I came back to a kind of Christianity after, and because, I read the first third of _The God Delusion_ and chucked it out because it seemed almost entirely devoid of analysis, and its mis-analysis of Christianity made it clearer to me what I actually kind of believed, or at least was prepared to take a punt on. Reading a bit of thoughtful modern theology made me realise that a lot of magical thinking that was kind of a drag on my mind was NOT a necessary part of Christianity. And I've been similarly Enlightened by some Jewish thinkers whose works I've read.
Where there is a link between rationality and loss of faith may be where a person's religion is intricately involved with beliefs that are entirely incompatible with the normal use of human reason. I think of people with a belief in the inerrancy of the Bible, read literally. I note that Bart Ehrman, as a biblical scholar, has moved from belief to atheism as he has thought more about the nature of the biblical texts; but he seems to me to be a very Evangelical atheist.
My examples are all from Christianity--and most of the discussion is about this faith. I'd really like to know how it goes in "Hinduism".
1
-
I really must make a contribution to my distro of choice. Thank you for making that clear.
Recently I noticed some comments on a couple of channels expressing the notion that you have to be constantly fiddling with Linux to keep it going. IME, just doing the things a normy does with a computer, I find Linux less of a faff than Windows. But it does suggest that some Linux advocacy is counter-productive, in its desire to reveal further intricacies of the bowels of the system.
It really doesn't help as much as people think to talk about distros as "good beginners' distros." That implies that Linux is an end in itself, that the user is embarking on a "Linux journey" towards the kind of inwardness that lets them save an e-commerce cluster by the cunning use of cat, grep, and some utility nobody has heard of. The point for most folk is that Linux gives you a system that you can spend less time thinking about, on which you have to spend less effort fending off efforts to trick you into signing up for something you don't need, and you can just get on with doing normal human stuff.
One part of the FOSS campaign that I am beginning to take more seriously is "free as in libre," that is, not losing control of your data: the software as a service push is making that more and more relevant. I recently tried to back up from a copy of my OneDrive folder (it had everything local as well as in the cloud); without activating a OneDrive account, it was gibberish, and felt like a ransomware attack. Luckily I had two other backups of my disk, but I suddenly saw that Stallman had a point. So, whilst the F in FOSS has never meant "free as in beer," and I would really happily pay for a perpetual licence for Lightroom on Linux, I wouldn't want to subscribe to it. So, yes, FOSS is not Linux, but one of the virtues of the Linux world is that it is free of subscription services, and we need to watch that if commercial distributors do start making their stuff available on Linux.
And I really must chuck some money at the devs of my beginners' distro.
1
-
1
-
I couldn't tell (and I wouldn't know how to tell) whether there's an absolute decline in highly influential research, or whether HIR is a lower proportion of the total research output.
If the latter, it's kind of what you'd expect. In those heroic days when it suddenly appeared that Physics hadn't come to an end in the late 19th century, most of the people who became professional scientists were highly motivated, and disproportionately prone to disruptive research. As science becomes more of a career path, you get more people entering it as a profession, perfectly competent but not driven or devoted to thinking differently. Add to that modern management methods ("never mind the quality, feel the width") and the proportion of published research that makes a big difference declines, inevitably.
I'm not a scientist, but similar things have happened in the Humanities, compounded by the decline in prestige of disciplined ways of thinking about people as, you know, people.
1
-
I'm struck by the way there's a lag between the appearance of these simplified weapons and the disasters that created the need for them. Dunkirk was early summer of 1940, but the properly made No. 4 that Ian has was made a year later, and the Stenified models were from mid-1942, when the situation was beginning to get better for the British. But that is inevitable, I guess, since a proper simplification depends on finding what is cost effective, and then you need changes to tooling and procedures. And then, I guess, it's a real calculation whether it's better to interrupt production in order, eventually, to turn out a quicker-to-make product, or just carry on increasing production as much as you can.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
Off and on Linux flirter for 20 years. User for 10, all GUI, Mint/Cinnamon for preference: easy to do now I've retired, but I'm not sure how I'll go shortly when I'm co-operating again on a book, and might have to go back to MS Word, which means Windows. Also I need a Windows machine for iTunes to organise my music for my iPhone.
The new interest in Linux is because Windows 11 has become so predatory against its users. "Tell us everything about yourself. Entrust all your data to our cloud, where it will all be encrypted so that no one (apart from us) can access it, including you if you let your subscription lapse."
Why Linux won't work for some people:
1: Some people use one or two pieces of software for their professional work: typically Adobe. The decision tree here goes: I need to use (say) Photoshop; what systems (hardware + software) will give me a good experience? The operating system isn't the choice, it's a consequence of prior choices. This is the same as people who have got some very expensive piece of manufacturing hardware which is old but still functional and central to their business, so they need to nurse into life some antique PC with a Centronics port because that's the machine's interface.
2: People who have a lot of experience with Windows or Mac, and know how to do out-of-the-way things, and try to do the same with Linux, and can't. It's partly that a lot of this is what is sometimes called "implicit knowledge": stuff you know without knowing how you got to know it. The charitable reading of the Linus fiasco is that he wanted to set up an advanced gaming rig with his Windows knowledge (the uncharitable interpretation is that he thought a bad-faith video about how Linux is too complicated and broken would be good commercially). This accumulated knowledge can be a real change-stopper, depending on how old you are and what sort of appetite you have for learning new stuff.
For pretty much everybody, changing from Windows is a rational choice, but there are some people for whom Mac is a better alternative. (I used to like OS X in the days of the Big Cats, but too much of its functionality is hidden for me these days -- like menu items that only appear when you hold down the splat key while clicking on the menu.)
And then there's gaming, but isn't everything better on a console anyway? I don't game.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
This is excellent, but I'd want to add other terms to your discussion of world-views from about 13:00.
First, I don't think we have to choose between naturalism, dualism, and immaterialism (or idealism, as it used to be known). It seems to me that it is possible to hold that the basic stuff of the universe has both a mind-like nature and a physical (material) nature: like dualism but without an absolute split. This seems to be the pov of Thomas Nagel (an atheist) in Mind and Cosmos when he declares himself a "neutral monist." It seems to me evident that there are facts of existence that we can't explain without using mentalist language, but dualism seems very problematic for accounting for how much our mind is intertwined with the electro-chemical functioning of our brains. (Test: describe two 12 year olds playing a game of chess, using only statements about brain states and functions.)
Second, on epistemology: I am not happy with the only choices being "Science + Reason" and "Divine Revelation." There are things we know by intuition and by non-rational (which is not anti-rational) observation. I think of the way the arts can inform our understanding, but also things like human love, friendship, hostility, and hatred. There's also the way in games that if you're one-on-one with another player, you can sometimes just know whether they're going to come in hard on you. There's the way we can talk about a great poem being "inspired" without necessarily being divinely inspired. And on the other hand, I am very hesitant about what we might mean by calling the Bible "divinely inspired": certainly some meanings of that phrase mean things I couldn't believe.
I realise that these positions might be a bit nuanced for a mass survey in a PhD thesis, but they're important to at least some people.
For the record, I'm a church-going Christian (Anglican, which some people might regard as CINO), having been in and out of belief for most of my life.
BTW, there's a typo in one of your slides: POSTITIVE ATHEISM -> POSITIVE ATHEISM, if you ever feel like fixing it. It's always the block caps where the last typo lurks.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
A rather old fashioned philosophical move is to ask what is the cash value of an idea. Which is a way of dealing with the uselessness of the idea of solipsism in actual life (of course, the solipsist might be mistaken in believing that they exist, but Augustine of Hippo disposed of that one over a thousand years before Descartes, and in two fewer syllables of Latin).
But, in one sense, reality is created by our minds, because "Reality" is a mental concept: roughly, for ordinary usage it means how the world is, as opposed to how we think it is; but we don't have access to how the world is apart from how we think it is. All we have is the possibility of constantly improving the accuracy of how we think things are, by the methods of the Sciences, and the Humanities. The question becomes whether or not this obvious fact is trivial: how does it cash out for the purposes of living our lives?
1
-
1
-
1
-
When I heard "pill-lock pistol" I immediately thought of the companion self-defence weapon, the bollock knife (or ballock dagger). And yes, that is a bona fide, fair dinkum, technical term.
Any clue what was the composition of the priming compound? As there was some still in the flask, I do hope it got analysed.
And, as for the proportion of female gunmakers, my hobbyhorse is that, although the patriarchy is real, it is a mistake to assume that it was worse the further back you go. There's a fairly common thought that the status and opportunities of women declined in the Renaissance, and again in the 19th c. But I do hope Ann Patrick retained a significant role in the new merged company.
1
-
Hah, coastal boy, eh? In Australia, I lived in Armidale (NSW), and in the nation's capital; and in both places I saw -10 C. It was, admittedly, in the middle of the night in the dead of winter, but it was cold. Also dry, so if you came into contact with any synthetic fabric on a winter morning, there were sparks when you put your hand (or key) near metal. Lovely days, though, +15 and better, but cold as anything once the sun set.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
Yes, but you're confusing issues.
1. The people who gave you an EV without even the basic three-pin charger are kippering the whole thing for a start. And overnight charging is not, for most people, eight hours but more like 12, and not starting from zero charge each time. So for a lot of people, maybe not you, good enough.
2. Were you for real thinking that if you drove quickly, you'd make it up on the regen? Are the laws of thermodynamics a joke to you?
3. Buying a nice new EV to save money is not a good idea, and is not the point. An old Leaf might be an economical choice, in some cases.
4. And indeed the infrastructure is a problem, which is a bit of chicken and egg. Pedoguy Musk is a menace, but he gets some things right, and developing an infrastructure at the same time as selling his cars is one of them.
Much of which is why I have a series hybrid rather than an EV (in my case, no off-street parking); but for someone who's thinking about getting an EV, the pros and cons could be clearer.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
I know what you mean about Linux getting better, but I mostly do text stuff and I look at Warty and think "Hmm, looks like Windows 2000: I could use that." It's the automated set up of hardware that has made the difference to me. My first attempt to use Linux was with a Knoppix CD-ROM, and I want to say it was last century. I failed, because the laptop I was trying to put it on had a funny Yamaha sound chip that was not supported, it was beyond me to do a workaround, and I wanted sound. Tried a few more times, but always ran into some problem that defeated me, until I tried Mint about ten years ago, and for the first time it went the way it's described in the instructions. But I'm sure that if I could have got the system up and running earlier, it would have been easier to live with than Windows for Workgroups 3.11, which was what we had at work. And I must say that when I do a bit of distro sampling now, I still find that with some I get hung up on some detail or another, or an instruction I don't understand. But now I know it doesn't have to be that way, and I just say "Ain't nobody got time for that."
So nice to see Veronica's delight as she rediscovers her youth.
1
-
You left out a possibility, one you've taken: BOTH.
So, there are some niche apps I use every day for work and play that are only on iPhone.
I like a smart watch, and when I bought mine, Apple was clearly the best game in town.
But, I have a lot of locally stored music (I had a lot of CDs I ripped). It is a pig to manage that with iOS, especially now I'm using Linux on the desktop.
The ecosystem is great, but the walls keep users in as well as competitors out--like, not having a real file manager.
So I'm going out tomorrow to buy a cheap Android tablet, basically as a music player. But if I had the budget, I might be going to change to Samsung--except I'd need to have an iPad for my niche apps.
Choice is good, but it's not infallible.
1
-
1
-
I don't agree with that pronunciation you found: the second syllable has the ee sound, and typically the stress is on the second syllable, too. I do know Ancient Greek, btw, and I have thought a bit about asceticism, but I don't always get English pronunciation of Greek words right, so I checked Wiktionary (a good online solution for all your dictionary needs).
I have no problem at all with the from-the-ground-up, learn each tool as you need it, approach to Linux, or any other OS. For many people I'm sure it's the best way to learn an OS, and indeed, since I started on CP/M, I've had something of that trajectory, even though my switch to Linux was pretty much entirely convenience, as I wanted something that's less intrusive and less of a faff than Windows.
But I do have issues about calling that approach ascetic. As I understand it, asceticism is about two things: one is freeing yourself from the complications of unnecessary possessions and material concerns (a kind of wellness play); the other is to simplify your life to the utmost so as to concentrate on ultimate value, whether you think of that value as freedom from all illusions about the nature of "reality", or whether it's something for which the word you reach for is god, or the divine.
The minimalist approach to Linux (why not Gentoo?) would certainly not leave anyone with much time to think about anything else, but if you take the asceticism angle seriously, it would seem to imply that the ultimate concern is, indeed, Linux. Nothing wrong with that, as long as you don't frighten the noobs who are just looking for a better Windows than Windows (where have I heard that before?); but I'm not sure whether many historical ascetics would agree with it (though, how monastic are the Shaolin martial arts monks?).
Another practice that goes with asceticism is anchoritism, the practice of withdrawing into a secluded or solitary life as an anchorite or hermit (all anchorites are ascetics, but not all ascetics are anchorites). Hmmm. Anyone going to talk about basements?
The notion that someone who's become one with Linux, as described, would be happy with macOS is a bit laughable. Sure, it's BSD underneath, with a funny kernel, but Windows is supposed to be VMS underneath, and when ordinary people talk about those OSes they are thinking of the DE (there's not even a Mac server edition these days, is there), and I hate the modern Mac experience because it's as opinionated as Gnome, and far too eager to leap in and do what it thinks you want it to do.
1
-
1
-
1
-
1
-
1
-
Brodie, you say you're a Linux user so it doesn't affect you, and I thought the same, but then realised that if we have any transactions with Windows users (you know, that nice specialist on-line retailer), we're vulnerable to their vulnerabilities.
Given that DJT hates the tech companies, there's a possibility he could be persuaded to do something about this; hang onto that, as a silver lining in a very dark cumulo-nimbus that looms in November.
Oh, I see, as I should have known, I'm very late to this. So here's another thing to worry about: the Group Policy rename calls "Turn off Recall" something like "Disable Windows Snapshots". But on a computer, snapshots are good, aren't they? They enable you to restore your system easily, don't they? The tech-savvy admin who's concentrating will smell a rat, or will know about this: but in some large organisations, it will slip through, because mistakes always happen.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
Originally, j wasn't a separate letter, but just a fancy way of writing i. Particularly you see it at the beginning or end of words in medieval English spelling. So, when simply writing numbers in ordinary narrative, with no need for tamper-evident records, you still see 3 written as iij, and 6 as vj.
Fun fact: although differentiation of i and j as separate letters representing different sounds started as early as the 16th century, it took a long while for this to become fully established--especially since at first most people who could write could write Latin, in which j is a purely calligraphic variant. Washington DC has streets named by letters, so: A, B, ...G, H, I, K, L.... There is folk-lore that this is the result of a feud, and whoever was in charge of street naming had a deadly enmity against someone called Jay. Nice story, and tells us about how Americans think about the conduct of politics, but it was just that J still wasn't firmly enough established to count as a separate item in a sequence. Those who have had the privilege of flying in the Queen of the Skies can perhaps remember the seat lettering in Economy on a 747: ABC (aisle)DEFG(aisle)HJK. Same structure, but the once-mighty Boeing company chose the fancy member of the pair.
Indo-Arabic numerals start appearing in English manuscripts in the late Middle Ages--I remember a late 15th c. manuscript that uses Indo-Arabics for folio numbers (which might, admittedly, be a bit later than the original writing). At first, some of the numbers are written in a different rotation from the modern usage, so 4 (IIRC) is written as though, to our eyes, it's lying on its back. These were the orientations of Arabic writing--I don't know why they were later rotated.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
Are there any studies of the economics of massive projects like this? I don't mean cost benefit analyses, so I guess it's really finance rather than economics, but basically: Where does the money end up? A collider is a very large hole in the ground, full of things made of steel, copper, and more exotic materials. Do you get particle physicists to dig the hole? Hell no, you get one of the specialised companies, with their highly skilled and nomadic work force, who dig holes, whether for water pipes or underground trains or Elon's latest X-wormerator to extrude people through the earth. The highly skilled, and rightly highly paid, workforce comes to the new site, and spends money in the local economy, which generates taxes for the host nation. The tunnelling companies presumably pay taxes somewhere. And so on with all the equipment inside the hole. A large part of the operation can be seen as the transfer of money from science budgets in the funding nations to the companies that do most of the engineering and the nations where they pay taxes.
Whereas if you funded, let us say, theoretical research, the capital costs would be much lower; the theorists could even move into the space freed up by the abolition of the Humanities, though they will have to compete with the expanding Department of Social Media and Marketing. And you could fund small experiments, too, that know what they're looking for.
Instead of the mega-enterprise, a model might be the sort of labs where the foundations of modern physics were experimentally explored, that seem to have required a good supply of string and sealing wax, and a very good glass-blower.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
So these things will need a large array of antennas (only insects have antennae). How much electricity would you generate with a similar sized array of solar panels, at various locations on the earth? Suppose, whimsically, you covered Manhattan with a farm of solar panels (though it would be much better to do that to Scunthorpe): how much electricity would you get over a year, and what would be the cost of realistic short term storage to get the supply as even as with a space farm? I don't have a clue about the answers, but it would seem fairly important to guess as well as you can.
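Just to show the kind of scribble I mean, here's a back-of-an-envelope in Python; every number is a guess pulled out of the air, purely to get an order of magnitude, so don't quote it at anyone.
# Order-of-magnitude guess: a ground solar farm the size of Manhattan.
area_km2 = 59                        # Manhattan, roughly
usable_fraction = 0.5                # you can't tile every square metre with panels
panel_efficiency = 0.20              # decent commercial panels
insolation_kwh_per_m2_year = 1500    # annual solar energy per square metre, mid-latitude guess
panel_area_m2 = area_km2 * 1e6 * usable_fraction
annual_output_kwh = panel_area_m2 * insolation_kwh_per_m2_year * panel_efficiency
print(f"roughly {annual_output_kwh / 1e9:.0f} TWh per year")   # 1 TWh = 1e9 kWh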
I don't understand, and it is almost certainly my problem because I'm shape-challenged, but how does a geo-stationary array stay in the sunlight for 99% of the time? Do you have double-sided panels, or do you steer it?
BTW, the Brits have really got everybody conned about English weather. It really doesn't rain that much in South East England, it's just that the locals are nesh.
1
-
1
-
Flightshaming seems to me the worst sort of social activism: it's moralistic, and indiscriminate. Mind you, I live in New Zealand where the nearest country is three hours in a plane or three days in a ship away, so I'm biased. But in Europe, given the choice of train or flying, a good train would be my choice every time.
The problem in places where land transport is a realistic alternative is cost. Flying between two cities can be cheaper than taking the train, which is mad. Part of the answer has to be passing on the cost of less-damaging alternate fuels, which will shift consumer choice and cut down on the total amount of flying, both good things. That, of course, has to be by international regulation, which is, I fancy, our biggest challenge.
1
-
I'm a non-technical user of Mint, but NOT a new user. Been using it for years. That's a distinction worth bearing in mind: it's not that I don't YET know stuff, it's that I've got other things I'd rather be deep into than Linux technicalities, and I rely on useful sources, such as yours (thank you) to help me sort out what I really need to know to use Linux felicitously.
It sounds presentational, really. So "Verified" is not a guarantee of absence of malware, but no such guarantee is possible, I think? Someone sufficiently motivated and resourced could presumably infiltrate malware into the Microsoft Store (probably starting from Petrograd).
So the question is, for a non-technical user, are they better off sticking to Verified flatpaks? (I actually want to know, and so far I have the impression that the answer is "Yes," to some degree.) And if so, how to present the information? Remembering that non-technical users get MEGO pretty quickly.
A question I'd like the answer to is, which source is least likely to serve up malware: distribution's repo, Verified Flathub, unverified Flathub, random binary, random flatpak? I've got a clue, but I'd like to know the detailed rankings. Or perhaps it's not possible to give more than a general answer, which would be good to know.
Last, I take the point about what happens if flatpaks are not available through the preferred source. The answer might seem to be to say, "VLC is great (for example); we think you should install it from our repository, rather than this unverified flatpak." Given that the Mint package manager now shows traditional packages and flatpaks on the same page, this seems like a reasonable idea? And a way of combatting the erosion of safety measures (some clown will always tear down the fence at the top of the cliff).
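To make that concrete with VLC: these are just the standard commands and package names, nothing Mint-specific, so check your own package manager before trusting me.
apt show vlc                                    # what the distro repository offers
flatpak remote-info flathub org.videolan.VLC    # what Flathub offers
sudo apt install vlc                            # the repository route
flatpak install flathub org.videolan.VLC        # the Flathub route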
Oh, and post-lastly, are there any advantages for the user in installing flatpaks? Is the sandboxing of any security benefit, for the user? Any benefits in app updates? I observe on the Mint package manager that typically flatpaks are a more recent version than what's in the distribution's repository, but I've come to the conclusion that that's not necessarily an advantage.
The Moral is maybe one of the things I learned in an early part of my experience with computers: don't be an early adopter. Wait for someone else to find the bugs (and now the scams). (And, BTW, never ever install version x.0, and with Microsoft wait for v. 3.1)
1
-
1
-
1
-
1
-
So it was big in car audio, because that's where you want a bunch of music as background. But the CD is an album: it is a materially constrained format that shapes the artistic creation (consider the change from the 78 rpm single, collected in physical albums of records, to the 33 1/3 rpm "album"), so if you're listening seriously, you want to listen to a whole album. So there's no big advantage in having MP3 CDs for a stationary player. The stationary player, one CD at a time, fills the same place as the vinyl turntables which, surprisingly, are now so popular. I loved MP3 CD in the car, until memory chips got so cheap you could get more than 256MB on a little player.
Also, whilst Apple is an exceedingly annoying company, so certain that they know better than their customers, not everything they do is malicious. AAC, as I think you know, apart from being more efficient than MP3, also had the ability to do DRM. When Apple was setting up iTunes, and fighting with the anal retentive misanthropes who run the labels, they had to agree to implement DRM (it was not their wish: iTunes Plus shows that all along they didn't want DRM.) Same for ALAC, which is a lossless compression codec, but with DRM capability. I know you know all this, but I think we should distinguish between the things Apple does that are really obnoxious, and the things that are reasonable at the time.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
I can see why they picked Eliza as a control, but that was a very specific design. The idea, it is said, was to simultaneously satirise AI and Rogerian non-directive counselling, which was very big at the time. It's not too bad a simulation of the counselling, and it was programmed to throw in a random "You haven't mentioned your mother" if the human's mother had not, in fact, been mentioned. Worked on 8 bits.
I know nothing about statistics, but shouldn't the AI be scoring higher than random to be considered to have passed? As there are only two alternatives, then markedly above 50%? They fooled me more than I'm comfortable with, but.
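If anyone wants to do that check properly, the usual tool is a binomial test; here's a rough sketch in Python using scipy, with made-up numbers rather than anything from the actual study.
# Did the AI get judged "human" more often than a coin flip would?
from scipy.stats import binomtest
n_judgements = 500    # hypothetical number of conversations judged
n_fooled = 280        # hypothetical number of times the AI was judged to be human
result = binomtest(n_fooled, n_judgements, p=0.5, alternative="greater")
print(result.pvalue)  # a small p-value means better than chance, not just 50-50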
1
-
1
-
No Fox Talbot? I know you can't do everything, but he was ready to publish at almost exactly the same time as Daguerre (Darwin/Wallace, much?), and he invented the negative-positive process (which, as you know of course, is the mechanism behind modern negative film photography, and a quite different way of getting positives out of the camera than the daguerreotype). Which is the true line of the future, because you can make many prints of the same picture. Also, he lived and did his work at Lacock Abbey, offering many possibilities for misreadings in the blooper reel. BTW, with B+W negatives, if you hold them right, you can sometimes see a positive image by reflection off the silver. IIRC, works best with underexposed images.
1
-
1
-
1
-
I have a real, genuine question to ask. I see the point of Arch: life on the frontier, and a very high degree of configurability--not for me, but it's easy to understand why people want that. But I wonder what advantages come from using Arch as the basis of a stable system? Clearly Manjaro sees the Stable version as a major use case, since it's the default for the desktop installation. What would you get from it rather than Debian, or offspring of Debian, or grand-offspring of Debian, say? The installer certainly looks like one of the better ones around, so it could be of interest to me.
Also, about a year ago Manjaro got its name trawled through the internet a bit for sloppy administration. Assuming that was roughly true (big assumption), do you know if they've got all their stuff in one sock by now?
1
-
Of course I loved this; but maybe it's worth putting the point that radical anachronism is deep in the DNA of romance, or at least the only type of romance I know anything about. The medieval Arthurian romance is set in a pretty precise historical period, the 5th century CE. But the fashions in clothing, arms, and social manners are always bang up to date. Chrestien de Troyes, who seems to have invented Lancelot, at least as the figure who dominates the Arthuriad, could be read as part of a movement to bring men of the knightly class out of the state of brutish warriordom into modern, refined, 12th century psychological interiority. And so it goes, down to the guns in Malory. I have sadly little knowledge of women's clothing in the Middle Ages, but in Sir Gawain and the Green Knight there is a low-cut number which has nothing to do with the 5th century, or the 12th. Maybe there's a distinction between Romance, which is always about the present, and the Historical Novel, which is about the past (in so far as History is about the past)? With Historical Romance as a hybrid?
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
This format is very appropriate for desktop distros, thank you. I can understand (or imagine) that fine differences in performance make a real difference on servers, but it's all Linux, and at the desktop I wonder just how much difference in performance there is? MX Linux is often described as lightweight and performance oriented--can you point me, please, to comparisons in performance of desktop tasks--like reformatting a long word processor document, or processing a RAW image file--between lean distros and the friendly ones?
Perhaps also the audience of this channel is not about to switch from Windows, but is likely to be advising friends/relatives on the switch. In which case, it is necessary but not easy to drop one's own preferences and think about a user who actually doesn't WANT to understand their computer, but just wants to do stuff with it (like most folk use a phone). Don't put your cousins and your aunts onto Arch, btw.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
Only knowing about similar institutional contexts, I agree that the CoC committee was ill-advised to effectively apply retrospective legislation, and seems a bit heavy handed. If a problematic arsehole says he's sorted it out with the insultee, and that seems to be the truth, surely a good idea to let things lie.
OTOH, without any moral judgement, the behaviour of this developer is clearly counterproductive, perhaps self-damaging. Let us assume he is passionately devoted to getting his file system into the kernel, yearning for it with a messianic zeal. It would seem sensible to discover how you get things adopted into the kernel, and it seems that there are agreed procedures, which have nothing to do with kowtowing to the corporations or going all woke, about what happens when. To an outsider, it sounds like the procedures are at least defensible, as a means of getting everything synchronised, all ducks lined up, and the pigs all ready for a formation take-off. Might not be the best way, but it's worked for a couple of decades. Also, it's a community, so it's probably a good idea not to alienate the community. Not for any moral reasons, but for a simple, transactional, analysis of how you get things done. Like, what matters more: the project, or the ego?
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
I guess you know this, but the Brits were working on cannon armament before the War, but thought they needed a special type of 'cannon fighter'. They got a pretty good one, in the Westland Whirlwind, but it got orphaned when Rolls-Royce stopped development of its engine. The initial problems with cannon in the Spitfire were problems with installation and changing to a belt feed, rather than inherent problems in the gun.
The way Brits tell it, the UK contracted for US production of the Hispano before the US was drawn into the war. There were serious problems with light strikes (which was more of a problem in British aircraft than US ones, for reasons), and the UK decided to concentrate on domestic production. It is said by British sources that the US manufacturers were shown a fix, which involved a slightly shorter chamber, but refused to implement it. Given the saga of the Mk 14 torpedo, this does seem plausible.
The general comparability of six .50 cal. vs four 20mm is shown by the way that, in the immediate post-war period, the USAF persisted with the Browning, whereas the US Navy adopted four cannon. Maybe the difference was that if US Navy aircraft were going to do any strafing, it would be of boats and ships, which tend to have tougher skins than trucks.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
@adamplentl5588 Well, according to the source of all knowledge, the International Humanist and Ethical Union says, _inter alia_, that Humanism "stands for the building of a more humane society through an ethic based on human and other natural values in the spirit of reason and free inquiry through human capabilities."
Note the use of the word "humane", which implies some positive value to humanness. To try to build an ethics or morality or value system on a purely human basis implies that we are apt to bear the weight. And it genuinely seems to me that that is an open question. We are truly capable of great acts of good; and also of great acts of evil and destruction. See the history of most major religions for appalling examples of the human capacity for evil; and they are human acts, though done under the delusion that they were God's will.
It's possible to say that we should just work for the best considering only human values; but it seems extremely hard to agree on what human values might be. Competition or cooperation? Probably both, either in a blend, or in different contexts, but how do you get to agree?
It looks a bit like the attempt to gain the authority of a religion, without openly making a metaphysical commitment; assuming that values we can all agree on are "natural". I think there's a lot to be said for basing common life on a principle of naming what is obviously wrong, and trying to fix it, without any deep metaphysical rootedness. But whilst most of us would agree that having a lot of people homeless and sleeping on the streets is wrong, there will be some people who will question whether it's wrong (not, I hope, any people you or I know), and huge disagreement on how to fix it.
Sorry to take up your time when you obviously have much more important things to do, but I really don't think "humanism" is on a par with any of the other ethical bases you mention; not least because utilitarianism and consequentialism seem to me both compatible with whatever "humanism" might be, if it is not an identification of humanity as itself a source of positive value. "Deontology" in a secular form just seems to be plucking moral absolutes out of a hat.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
Thank you. There is a narrative that pervades British aviation history which is deeply politicised. It starts with the R100 and R101, goes through the Nene, and ends up with the cancelling of the TSR 2, and the whole moral is the failure, and perhaps treachery, of left wing politics. There is a bit of a sidenote of anti-Americanism, chiefly centred on the Miles M.52. A sub-theme is the thwarting of Whittle, which centres on A.A. Griffith finding fault with Whittle's original proposals. Griffith is presented as a civil servant with his own ideas to protect; ignoring the fact that he had made a major discovery about the aerodynamics of turbine blades; and suggesting that his favoring of axial flow designs was some sort of private enthusiasm, rather than a clear perception (shared with the German designers) of where the future was headed.
I think this false narrative gets its strength from two denials: the first, the denial that after the Second World War the UK was bankrupt (and you can blame the Americans as much as you like for the fact that it was bankrupt; there were still too many Americans stuck in 1776); and the second, the denial that the UK aircraft industry was fragmented and shed-based. Duncan Sandys gets a look-in, and it must be admitted that he was oversold on the hype of the futuristic; but it's not clear that the UK would, or could, have produced the aircraft that really would have succeeded.
1
-
Not a scientist, but I have always understood that peer review meant that the paper was presented in proper style with no gaps or infelicities that prevented understanding, and that there were no obvious errors visible from a careful reading.
Peer review has never been intended as a guarantee that the paper is true. For that, you need a whole series of attempts to replicate the findings--you know, science. The systemic problem is that there is not much incentive, either in funding or publication possibilities, to do replication studies, except when the finding is egregiously nuts, and important. Like horse wormer for Covid-19, or cold fusion.
The other systemic problem is that people misunderstand the nature of peer review. If it can take major detective work to detect fraudulent papers, why does anyone think that fraud could be picked up by a reviewer, particularly one who is looking for errors and omissions, not deliberate falsification?
1
-
1
-
There's a book by Pamela Eisenbaum that deals with this: _Paul Was Not A Christian_. She's a New Testament scholar, employed in a Christian theology school, and a practising orthodox Jew. I think it was actually Peter who had the vision you're thinking of (Acts 10:9-16), and from Acts and the authentic Pauline letters, it looks as though Peter wasn't settled about how far Gentiles had to follow the Law.
The guts of Eisenbaum's book, as I take it (and only on one reading) is that none of the first followers of Jesus was rejecting the Law, still less God as seen in the Hebrew bible (they were, after all, Jews, and didn't proclaim themselves as having stopped being Jews); the question was how far Gentiles had to follow Jewish practice to be accepted as followers of Jesus. In the end, it got to be decided that keeping kosher (for instance) was a part of Jewish identity and the specific covenant, but not necessary for following YHWH through Jesus.
1
-
1
-
1
-
It depends a bit on the depth of the records. In the case of WW2, I'd trust a recent historian more than a near contemporary one because of event-specific reasons like distance from the partisanship of the conflict and the release of once secret records over time; but that is a special modern event with a huge quantity of evidence, and one whose conflicts still affect us. In the case of these extra-biblical references, we might assume that Roman historians had access to some sort of primary source or secondary sources they trusted (and they wouldn't have trusted the traditions of Christians). Outside of Christianity, Jesus of Nazareth wasn't a big deal, just another religious enthusiast and trouble-maker who got crucified like thousands of others who are not mentioned in the records.
There's also oral history, which is a very large can of wriggly worms, especially as it is often important in identifying and, hopefully, compensating for some of the wrongs of colonisation. But I think most people would give some credence to oral history over a 50-100 year span, while expecting some details to get forgotten or mistaken, and some episodes to be either invented, or heavily re-imagined. Like, for instance, the Gospels. Which is why tracing extra-biblical references is important for sticking in, as it were, survey points to locate the whole narratives.
Matt will doubtless answer for himself, but I'm sticking my oar in, too, for what it's worth.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
@MrTGuru I'm retired now, and I don't do a lot. But I have done a real job, earning real money, using a computer most of the time, without taxing the technical resources of the machine much at all. A word processor, a presentation suite, email, and a browser because there's a lot of good stuff out there. There's a techie bias that equates "real work" with "advanced tech stuff" and most people who do the work that keeps the world going don't do that.
I thought the difference between the stuff that keeps the internet going and the stuff we have on our desk was mostly the desktop environment.
I really would like to see Linux, or some other FOSS OS, as a viable alternative to the monopolists for ordinary people doing ordinary things; that means accepting that these days, computers aren't special. In fact, that realisation dates from the 1990s, when it dawned on me that we didn't have to specify the details of an office computer, because they were now fungible.
1
-
@dr_jaymz Ah, I'm not a power user so I just let Mint and Cinnamon do their thing, and I don't have many problems. Occasionally Chrome or Firefox will freeze, and take the machine with them, but scanning documents and printing to a wireless printer is actually one of the things I do frequently, and that goes fine.
If you want an OS that really makes you search hard to find out how to do simple things, my nominee is macOS. I used to like it, in the days of the Big Cats, but I've used it a bit recently, and everything seems to need a Masonic secret handshake and a power chord of at least three keys. Tip to Apple: there are things called Menus, you may have heard of them. You can use them to help users find out how to do stuff.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
This is distressingly fundamentalist, in its belief that the Laws of Nature are coextensive with Physics, and that the mind is exhaustively explained by computation. There is no evidence for this, and it is hard to know what kind of experimental proof there could be for these propositions. At best they are axioms, and we have the freedom to accept axioms or not (I think, but presumably some do not). There is also an incoherence at the end, in which Dr Hossenfelder seems to be appealing to her audience to think differently. This can have no meaning under strict determinism. Free will is puzzling, and heavily constrained by all sorts of things, but to claim that it doesn't exist because Physics--that doesn't get us anywhere. STEM types have an especial hatred and contempt for philosophers (I am not a philosopher) but this is an example of how badly things go wrong when a good scientist and communicator strays outside their area of competence and tries to engage with a philosophical question.
Also, for a cheap shot: of all the questions we could choose to discuss, determinism seems one of the least profitable. I exhort you not to waste your time on it.
1
-
1
-
Fonts 12:00 ish. I fancy Windows does render text better than Linux. My Linux fonts are fine, but when I go to my wife's computer, the text on Windows looks crisper. I believe this is a genuine difference, because sometimes I forget about it, and am struck by it again.
Too much customisation 8:20. I don't think this is a problem, because the users concerned don't even realise there is customisation. I helped my doctor's receptionist (a smart woman) by showing her that in Windows you can change the size of the pointer. What is, perhaps, a problem is would-be Linux evangelists banging on about all the customisation you can do, rather than saying "Choose this flavour of Linux, and you can just get on with your work/life in half an hour."
Also, all this banging on about games is a turn-off for ordinary people.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
@PassportToPimlico I think it started with the RAF just wanting a fighter. They were talking with North American about bombers, and asked them if they could manufacture P-40s (which were good fighters, about equivalent to the Hurricane). NAA said they could do a better fighter with the same engine. RAF Mustangs with the Allison engine were used for Army Co-Operation (because of the notorious high altitude limitations of the Allison without a turbo-supercharger); Army Co-Operation at that time was NOT a high status role. So the origins were pretty unglamorous, but various people had the idea of putting in a Merlin, with its high performance mechanical supercharger, and with the addition of fuel tanks everywhere, and six .50" calibre guns (replacing a mixture of calibres), it became the best escort fighter of the war.
The P-38, BTW, was a purely US origin design: it did have the turbo-superchargers the Allison needed for high altitude, but the installation wasn't satisfactory in Europe. Yes, they did think of putting Merlins in the P-38, but various US interests successfully resisted putting nasty foreign engines in a US design.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
The problem with AI researchers is that they define "Intelligence" as, pretty much, the sort of thing a computer can do. Great, wonderful, it will help us. But without getting metaphysical about it, this ignores the part that emotions play in actual human thinking: roughly, by concentrating on the electrical aspects of brain function, they ignore the chemical (i.e. hormonal) aspects. They probably regard emotion as an unfortunate flaw, or at best a device for rough shortcuts, in human thinking. But he will undoubtedly make a lot of money, and surely from his point of view maximising the individual's utility is the point of it.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
After years of dabbling, I've made a partial shift to Linux (Mint). I find it easier to manage than Windows, no more finicky about hardware, and calmer.
What keeps Windows going is two things:
1. Inertia. The machine comes with Windows installed (the foundation of the Windows monopoly).
2. Applications. Some people HAVE to be able to use MS Office. Some people HAVE to be able to use Adobe media stuff. Yes, there are FOSS alternatives, and they're good, but if you're collaborating or even sending your files off to somebody else you HAVE to be able to do it in the "industry standard format", and if it doesn't work out, it'll be your problem.
So there's no point really in Linux users worrying about what to do to attract Windows users. Just concentrate on making the distros as good as possible, in different ways, and on unremitting attention to file format compatibility, to try to keep up with the monopolists. It'll only ever be a minority using Linux-on-the-desktop, for as long as the OS on the desktop persists; you can see that the commercial outfits are trying to get us to put everything into their clouds, so they can charge us rent. I mean, the number of people who actually need a version of Office later than about 2003 is pretty small, and where's the profit in that?
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
On AI. On the big claims for AI, I am deeply sceptical; sticking strictly to the material basis of consciousness, AI so far only works on the electrical connections of the brain, and leaves out the emotional/chemical part of the thing (Damasio, Descartes' Error, for what I mean). BUT I have become converted to the extreme usefulness of 'AI' for specific functions since I discovered how much SpeechNotes improved voice to text transcription running an LLM locally, on Linux, with modest hardware.
So I'd suggest you might attend, not to the big claims, but to quite specific uses of cognitive computing for practical tasks, running locally. For instance, it seems to me that it should be possible to get a scam detector system that runs entirely locally on the sort of adequate hardware we all have now, and which would be more accurate than traditional spam filters. Telegram is always a no-no, but some scams take a bit longer to detect. I have no clue how it would be done, and I might be wrong, but maybe it would be interesting to explain why I'm wrong. Similarly, as Windows becomes less and less tolerable, we're all trying to get away from Adobe. Are there ways in which the new computing could help with analysis of graphics projects, maybe even generating hints on how you could achieve specific results with non-Adobe software?
For example:
- AI: The sky looks a funny colour in this photo. Do you want to change it?
- Human: Yes, I should have used a polarising filter when I took the photo. On PhotoShop I'd do <procedure>. How could I do that in DarkTable?
- AI: Well, let me see .... What you could try is <procedure>.
Now that might be too blue sky for where we are, or could only be done by major collective effort, but it would help me to know what I might expect, and not expect, from new developments in computing -- stuff that would be helpful for me, not just help Predatory Commerce try to make more money out of me.
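On the scam-detector idea: just to make "entirely local" concrete, here is a minimal sketch of the old-fashioned, pre-LLM way of doing it in Python, assuming scikit-learn is installed; the training messages and labels are made up for illustration, and a real system would need a proper corpus (and probably something smarter on top).

# A toy, entirely local "scam detector": TF-IDF features feeding a
# Naive Bayes classifier. Cheap, fast, and nothing leaves the machine.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = [
    "You have won a prize, send a fee to claim it",          # made-up examples
    "Your parcel is held, click this link and pay customs",
    "Meeting moved to Thursday, agenda attached",
    "Thanks for the photos from the weekend",
]
labels = ["scam", "scam", "ok", "ok"]

model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(messages, labels)

# Classify a new message, entirely offline, on modest hardware.
print(model.predict(["Urgent: pay the release fee via this link"]))

Whether something like that, or a small local LLM, would actually beat the traditional spam filters is exactly the question I'd like to see answered.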
This, I think, would fit with what I see as three main streams in your channel. The first is the Maker thread (we used to call it Hobbyist, I think, but it is truly beyond that); software maker rather than hardware, but still. The second is your attention to where computing is going (both Big Computing and Small Computing). The third, and what really engages me personally, is your careful and clear explanation of what computing can do for us normies who quite like the hardware and software, but are only really involved because it helps us do stuff.
And maybe you could run a competition for the most entertaining AI hallucination of the month? As for a prize, I used to reward co-workers with an Internet Gift Voucher, good for one fantasy of your choice. Maybe now we could have AI-assisted fantasies?
I live a long way away from England these days, but it took me back when I saw that, to venture into the open air in the middle of summer, you needed a pretty serious-looking jacket.
Hope this helps, one way or another. Love the channel.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
@StarmenRock I think the Mint installation is about as simple and clear to follow as it gets. The reality is that a lot of people trying Linux for the first time are going to want to dual boot, and just accepting Mint's defaults will get you a good set-up. I've been installing and reinstalling Windows (10 and 11) on my old machines a bit recently, and you have to answer more questions -- and that's before you worry about whether or not you want to be engulfed in the Maw of Microsoft. I rather suspect the Windows user you're thinking of doesn't actually install Windows themselves, and has someone set up an account for them.
I think the people who really get troubled, including you, are not average Windows users, but people who know how to do things in Windows and are thrown by the fact that Mint (sometimes all Linux) doesn't do things the same way: like drive letters, which don't exist in Linux, or any *Nix. But with Mint you don't have to understand what a partition is, you just follow the defaults, and you certainly don't have to know about mount points, which is a boggler for sure.
The barrier for the typical Windows user, who doesn't want to know about computers, thank you very much, is that some of the productivity software on any Linux distro is different, and in some areas with clunkier user interfaces.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
There is no doubt that social media postings cause trauma to kids; some people use social media for wicked purposes. But traumatising and bullying are not new. As the English poet Philip Larkin put it:
They fuck you up, your mum and dad,
They may not mean to, but they do.
But also digital media are new, and there is a serious question about whether this has caused a change in mentality. Some people think that the introduction of print and the spread of literacy caused changes in mentality. But we'd know an answer was serious if it identified particular aspects of social media that are harmful; what we seem to have are apocalyptic horror stories that sell well.
But, niggle, could we say studies that are strongly based on *evidence*, not *science*. Unless, of course, sociology and psychology are to be regarded as sciences. Or maybe Dr Hossenfelder really meant "Wissenschaft", which doesn't quite mean the same as the English word "science" (whatever that word does mean).
1
-
1
-
Suggestions about redirecting the students' creativity, as at 9:25, are obviously made by people who have no experience of educational institutions, apart from having attended one with an intention to learn (which makes them a minority from the start).
If you consider what has been happening at this school--I mean, the lived reality of the context described in the use-case--then you realise that for the poor bastard asking this question, burning down the school (7:45) could be a beneficial side effect.
Alternative suggestions also do not take seriously the chronic underfunding of public schools in many parts of the world: in the USA, for instance, I gather that school teachers have to buy pens and paper for the kids out of their own, tax-paid, incomes. So the probable situation is that the school has the person who's asking the question, and one tech, to look after the kit. How long would it take to disconnect a speaker on a PC? Remembering that you have to not only do the deed, but also travel from computer to computer, probably including travelling between buildings. And all this, BTW, would have to be done out of hours, so as not to disrupt the creative little angels while they're (ostensibly) learning. I guess, on average, about half an hour per machine. Might be twenty minutes, if the logistics are favourable. How many computers? Sounds like they've got a lot. Maybe 200? So that's 100 hours, or two and a half weeks work (in a civilised country with a 40 hour week: even in the USA, management would have to budget 10 days to get the job done). So, by the time you get to the end of the process, the computers you did first have already had new, louder, speakers put in by the kids. And you've made it a competition, so it's a matter of principle now. Who gets to turn the school PC into a theremin?
I know, I'm cynical about kids--and I haven't even been a school teacher. And many students could doubtless be diverted to more useful aspects of computer systems programming, though of course you'd have to fit it in with the government-mandated syllabus. Or you could set up a computer club, so that's someone who volunteers to give up another evening a week for the privilege of looking after other people's children, and organising the use of facilities, and ensuring that they don't try a ransomware attack on the local hospital, all on an income that pays a poor hourly rate even if you just stick to the official part of the job. But even if all this worked, you wouldn't get everyone. A few little twats would just like causing trouble for the sake of causing trouble (or maybe shit posting is not really a thing?), so it would start again, and others would then join in, and you're back at square one, though maybe a more sophisticated lot of troublemakers because of all they've learned in Computer Club.
In the circumstances, the question, certainly asked after much thought and in desperation, seems entirely sensible. To people who have never been in a classroom, teaching seems easy and obvious. And bits are, indeed, good. So why don't you go and frigging do it? But in the real world, I think the questioner would have been justified in asking for a good implementation of EOU.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
For those who weren't around, goatse was used kind of like rickrolling. So some things have got better. I will admit it gave me a bit of a startle the first time I saw it, so I guess it's in my memory, but no, the old Gnome Circle logo is nothing like goatse. But now I'm going to really foul things up. Given that an image of a tree can suggest a penis (Rod of Jesse, family tree, "got wood?"), I think the one-handed logo suggests masturbation, and I'm shocked, I tell you, shocked. Will no one think about the children? (on second thoughts, don't answer that.) Go on, cave in to this old white man. Stop corrupting the world, Guhnome.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
I think the change in museums started before the internet---certainly before the Web--and in part it's a good thing. Museums used to be, in part, collections of weird and rare stuff, like books on history of languages used to be all about exceptions. Then the idea got around that they ought to be telling the story of the main streams of development, and that is good. But then museums had to justify their (inadequate) funding by crowds through turnstiles, and some did get dumbed down. York in England is a classic place for museology. One of its museums is a very early introduction of the Folkmuseum idea to the UK, and it's a great museum of everyday life--if you're old enough, you can join in the exclamations of "My grandma used to have a room just like that!" There are also a ton(ne) of churches in York that no longer have congregations. A lot were turned into museums, but there aren't all that many artefacts, so they used the space to tell relevant stories about craftwork and building and stuff with good signage, and using the building as a physical contextualisation. The star used to be Jorvik, which is a museum housing an archaeological exploration; at first it was a brilliant combination of display of current research and house of scholarship, and an innovative display which engaged the ordinary interested person and gave a real feel of the sights, sounds, and smells of the Viking era settlement. Alas, commercial pressures mean it's now a theme-park ride, with lots of gamified learning activities for the kiddies, and the seriously interesting and, beyond that, the scholarly important stuff is pushed to the background.
But what I'm sure museums have to stick to, and what they have over the internet, is the artefacts. The things themselves. It's great seeing all this stuff on Ian's channel, and InRange, and Othais and May, but there is no substitute for the things themselves, whether it's guns or paintings or medieval manuscripts. Must be a hard time to work in museums.
1
-
1
-
1
-
1
-
1
-
So I was just thinking that Flatpak got round the infrequent update issue, when you said it. I can see it's a further step away from the Unix philosophy, but we do have a lot more storage these days. And there's always *BSD.
Linux Mint finally let me free myself from nagging commercial systems, so I like it a lot. Partly it's good because it is rather cautious about changes, which is, I guess, why LMDE is still an "alternative." No one really thinks Ubuntu is going away, but it's not paranoid to think that Canonical might do stuff that makes Ubuntu less attractive as a base. Mint slowly prepares to change, just in case; and it's an alternative because, presumably, they're not yet confident it's as polished as the traditional form.
1
-
1
-
1
-
1
-
1
-
1
-
When I first came to NZ, the car industry was hugely protected, and second hand cars were ridiculously expensive. Then used imports from Japan were allowed, and the world changed. Since that change I have had (amongst other cars) a Honda Vigor (Accord with a 5-cylinder engine), Toyota Spacio (not simply a made-up name--it's a miniature people mover on a Corolla platform, lots of space), Mazda Cosmo RX-5 (big fast-back with a rotary engine), Nissan Skyline V35 (best car I've ever had), VW Polo (made l.h. drive for Japan, and imported from Japan), Nissan e-Note (series hybrid--wonderful little car to drive), and I've also now got a Toyota Porte, which is a derivative of the Fun Cargo, which is a vannish variant of the Yaris, and has one sliding door, and my one has a special passenger seat for people with restricted mobility. Never a dull moment, and great value. But you have to change the radios. The ones they import have only been used by a little old lady to go to the shrine, and have very low mileage, but on the other hand there is a possibility that they've only ever had one oil change.
1
-
1
-
1
-
It's an outdated convention, but it is the universal convention in photography, and you'd guess that anyone concerned with sensor size in a phone camera would know something about what cameras are available. "One inch sensor" is the name of a whole class of cameras, the only pocket camera class to survive the improvement of camera phones. It would be better to give the physical measurement of the sensor, in both cameras and phones, but you'd still need a snappy descriptor for the ad copy, and to know what it means, the buyer would still have to know about, you know, cameras.
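To make the oddity of the convention concrete, a little arithmetic (assuming the usual quoted Type 1 dimensions of 13.2 x 8.8 mm):

import math

# Usual quoted dimensions of a "one inch" (Type 1) sensor.
width_mm, height_mm = 13.2, 8.8
diagonal_mm = math.hypot(width_mm, height_mm)

# Nowhere near an actual inch (25.4 mm): the name is inherited from the
# outside diameter of the old vidicon camera tubes, not from the sensor.
print(round(diagonal_mm, 1), "mm diagonal, versus 25.4 mm in an inch")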
Also, size is not the reason for using plastic rather than glass in the lenses. No problem with small glass elements. It's probably because the elements in question have very complex surfaces, rather than simple spherical ones, and that's much easier to do with plastic moulding than grinding glass. The big advantage of glass is that it comes in a much wider range of refraction and dispersion values, which is why the best lenses are (mostly) glass.
1
-
1
-
The installer is really important for me: it offers you quite a lot of choices, with intelligent defaults and useful handholding to set custom values. Specifically, when setting up a dual-boot installation. I can't remember any other installer that makes it so easy to install Linux alongside another OS (mostly Windows, of course) and adjust the size of the two partitions. Other installers I've used have, at best, a single default, and if you want anything different you need to manually set mount points. I know I could look up the way to do this, and get it done right on the second or third attempt; and I know that manual control of this sort of stuff is really important to some users; but it's not important to me, and I'm really grateful for a distro that makes it easy for me. I like Cinnamon, but it's available elsewhere, and I'm happy enough with other desktops (and window managers). And the rest is Linux. But the installer is so unobtrusive it's a star, and I really don't know why other installers can't be like it.
1
-
1
-
1
-
1
-
1
-
1
-
The placebo problem is not just a witticism. I am mildly afflicted with depression, and during a very stressful time at work started taking an SSRI (Selective Serotonin Reuptake Inhibitor: the currently widely prescribed class of anti-depressants). My doctor told me it would take a time to work as the molecule titrated up to the effective level, but in a couple of days I felt relief. Not numbing: I still knew that my situation at work was shit, but I wasn't oppressed into hopelessness by the realisation of this. I wondered why it worked so quickly.
Then a series of studies were reported, showing that the effects of SSRIs were not distinguishable from placebo. I'd also retired, so I thought I would try gradually weaning myself off the drug. After about a week of gradual reduction of dose, disturbing forms of ideation re-appeared. Ten years on, I've tried the same reduction, with the same results.
My provisional conclusion is that SSRIs are indeed placebo, but such powerful placebos that they do indeed work on people who believe them to be placebo (like me). My doctor thinks that perhaps the brain has enough serotonin all the time, but the SSRIs might affect the way a particular brain processes it. I always take my doctor's hypotheses seriously, but I think that perhaps there are funny things going on in the mind, and for things that have a large mental component (yes, the mind is a thing, even though we don't understand it) it is EITHER hard to distinguish placebo from pharmacological activity that works sometimes OR there are super-placebos or meta-placebos that sometimes work on people who believe them to be placebo. Or you could call it magic.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
@lxn7404 Well, I can understand that. But my pov is just that of an end user of a computer, who knew in about 1970 that a text editor would make one of my pieces of work hugely more efficient. And now I discover that both Windows and Mac OS are spending most of their time trying to bond me to their systems and make me dependent on them, and I find an alternative in Linux. I don't want to go beyond Linux Mint Cinnamon and LibreOffice, and when I'm trying to help a friend who's on a Zoom call with me (reading Homer) I know I'm not going to ssh into her system; I'm just trying to remember how Windows works, and all the theory goes away. What I think I'm saying is that because Linux is FOSS, it actually makes a much better basis for ordinary end-user applications than the commercial alternatives, and that this is also a legitimate use case for Linux, and the app alternatives, like FlatPak, fit into that. As I said, I think, the new app packages may not be right for people who are deep in the entrails of the system, but they can be very good for those of us who are pretty ignorant, but who will be the people who produce the year of the Linux desktop.
1
-
1
-
I know Linus Torvalds has a reputation, but if you compare that "flamefest" with historical controversies in scholarship or, especially, theology, it is no more than direct, with a /small/ side dish of snide.
This also reminds us of a remark attributed to the great philosopher, Yogi Berra: "Prediction is hard, especially about the future." I was there at this time, even by an odd chance being on a committee that decided to buy the university a Silicon Graphics RISC/Unix machine (the engineers wanted to stay with VAX), and everybody /knew/ that RISC was the future. I guess the same was true about microkernels (BTW, isn't Mac OS, or whatever they call it now, a hybrid, at least?) It would be interesting to know how much Tanenbaum was wrong and knowably wrong, and how much was sheer contingency. Intel at the time was making little chips for little machines--consider the dude complaining about the elitism of writing for a 32-bit chip--and it was not inevitable that they would win against MIPS or even Western Digital. But there was enough inertia behind the instruction set, and they were clever enough (and Moore's Law was still working) to change the way their chips worked.
1
-
1
-
1
-
1
-
1
-
1
-
As I understand it, a lot of composers of Western Art Music compose on the page, without the need to hear the music. Recently I was looking at a biography of Benjamin Britten: as a student, he wrote a lot, most of which he couldn't get performed. I was astonished to find that, when he did get something played, he said, as though it were a realisation, that the sounding of the music was the real thing. I also went to a series of lectures by a musicologist: someone asked her about a piece, and she said "I haven't read that," and then, slightly embarrassed, changed it to "I haven't heard that." It doesn't lessen LvanB's achievement, but perhaps he was working in a way that wasn't, in principle, all that unusual, the ability to experience, in a satisfying way, a piece of music from the score. Quite beyond me, but so are lots of things. What Beethoven's deafness really destroyed was his capacity, first to play, and then to conduct his own music.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
@templarroystonofvasey When context menus first appeared, I disliked them. But I got used to them, and I now find it convenient to have a few frequent actions available that close to a file when I'm fiddling with it.
I don't think computer interfaces are intuitive, in any absolute sense, at all. You learn how to use them, and get used to the sorts of metaphors and structures that underlie them, and if they are well designed you get to the point where you're like: "I want to do X, but I don't know how. It's probably on the Tools menu/It's likely to be on the Context menu/Let's see what happens if we do Control-Meta-Tilde."
Once upon a time you studied the manual. Not now (when did you last see "RTFM" in the wild?) It seems, especially with phones but increasingly on the desktop, that it's like learning how an animal behaves, and how to get it to come in, or do its tricks. It's probably less explicit, and so all the implicit knowledge feels like intuition, but it's all learned behaviour, and you have different intuitions for different (well-designed) systems. Some people find GNOME intuitive.
1
-
@ExplainingComputers Just now I had cause to boot into Windows, and one of its cheery messages invited me to try New BING. Ask me anything, it said, so I did (I don't know how to post a screenshot, so I transcribe):
-----------------------------------------------------------------------------------------
Me: Why won't Microsoft Windows leave me alone to get on with my work? Why does it always interrupt to tell me to use different programs, and get me to sign up for things that make Microsoft money? Why doesn't it respect its users?
BING: I'm sorry but I prefer not to continue this conversation. I'm still learning so I appreciate your understanding and patience. <Namaste emoji>
---------------------------------------------------------------------------------------------------------------
I think that's masterly. Evade the question, whilst trying to make me feel as though I've made an improper suggestion or am bullying a child.
Evidently, there's something in that corporation that turns people evil. Bill Gates, once he retired, has turned his money to doing actual and significant good, though he still seems to have problems with people. But the corp goes slithering on, trying again the Browser-as-integral-part-of-the-OS stunt it used to try to kill off Netscape.
1
-
1
-
1
-
1
-
Plain text is universal? Yeah, right. When I started with little computers, some of my colleagues were using an IBM word processor. Guess who got to be the expert on translating EBCDIC to ASCII?
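For the young, a minimal Python sketch of what that translation amounts to nowadays (the interpreter ships with EBCDIC codecs; cp037 is one common variant, and the byte values below just spell HELLO, for illustration). Back then it was rather more work.

# EBCDIC (code page 037) bytes for the word "HELLO".
ebcdic_bytes = bytes([0xC8, 0xC5, 0xD3, 0xD3, 0xD6])
text = ebcdic_bytes.decode("cp037")     # EBCDIC in...
print(text, text.encode("ascii"))       # ...ASCII out: HELLO b'HELLO'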
My wife and I were doing a book: the general editor had support at his university computer department, who worked with LaTeX. So guess who learned how to put in LaTeX codes using the simple, austere word processor we were using then. And then the editor lost the support, and I forget what format we ended up with, but I know one author in the bibliography had a Polish given name which was spelt with a z with a dot over it. Long before UTF: so I bought a copy of WordPerfect, and learned it (in so far as anyone actually learned WordPerfect, rather than being quick at navigating the cheatsheet template).
The problem, I think, is that a lot of the people who pontificate about Linux are developers and sysadmins (to whom, respect) for whom writing is producing documentation for other professionals. But a lot of writing IRL is for publication, either in dead tree or e-book format, and what publishers want is Word format files, and they want authors to do all the formatting for what used to be called camera-ready copy. (Maybe if you're a best seller, this doesn't apply, but it's the way it works in academic publishing). For this purpose, word processors don't do a fully professional job, but they will produce a passable result that's good enough for academic publishing. Though I observe that publishers still have difficulties with getting footnotes done properly in ebooks. Publishers (outside the technical sphere, perhaps) do not want LaTeX any more than they want nroff, they want .DOC or .DOCX.
Commercial and advanced FOSS word processors can get incompatible (hell, MS Word can be incompatible with itself if there's enough of a gap in versions and platforms), but that only applies to pretty recondite sorts of usage. These days, for the sort of thing that markdown does, the compatibility is good. Especially if you use .RTF, which is proprietary, indeed, but MS is not making any money out of it, and .RTF will tell you if you're doing something too intricate for it.
Where word processors can be, and certainly used to be, evil is when there's a monopoly. Microsoft used to change the .DOC format with every upgrade. This would drive the massive sale of upgrades by a simple mechanism. It used to be a rule in large organisations that the person who had the very latest desktop PC was the CEO's PA. So, an EDICT would be issued from the desk of the Supreme Manager. It would be typed up (and probably corrected for grammar and spelling) by the CEO's PA (or, as it was in those days, Secretary) and she (as it was in those days) would promulgate it to the masses. Since the CEO's PA/Secretary was a very intelligent and capable person (probably smarter than the CEO), she was in complete command of the new version of Word, and would use its new features. So when the message came to the peons, and they opened it in their old versions, they could not access the guidance of the Dear Leader in all its fullness, and so each department paid for upgrades, and so was increased Bill Gates' fortune (ill-gotten, but now used well).
And if you want pure, undistracted, composition of a first draft, nothing beats paper and a 2B pencil.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
Slow, sure. So were computers in the 1980s (though not as slow as this). But something like this really beat a typewriter, whether a manual, a portable, or even a mighty IBM Selectric, if you had to produce really accurate text. Sure, the screen redrawing is glacial, but it beats having to re-type a whole page. They weren't a thing for long (I had a simpler self-correcting typewriter that could kind of store text), but for the few years before everyone could afford a Kaypro, they ruled.
BTW, daisy wheels were slow, but produced grown-up output, as compared to dot matrix printers (which were loud and nasty), and the wheels lasted pretty well. I once set my computer to type out a lecture on my Brother daisy wheel, and went out of my office. As I closed the door on the tap-tap-tap, a colleague looked at me all "WTF?!" I explained. "Oh, I thought you had a typist in there," as though suspecting me of great nefariousness.
Look at the keys on that machine. I bet the touch is superb.
Nostalge over.
1
-
1
-
1
-
14:20 Psychopath. Are you suggesting that Gnome should adopt Reiser Sans as the default?
All the font minutiae are real, but often in quite compartmentalised use cases. Actual dead-tree printing has had over 500 years of this, and for a book I'm not sure there's anything much better than the font Nicolas Jenson designed in Venice in the second half of the fifteenth century: but it looks pretty crap on a computer screen. Also, it's not obvious that different alphabets should belong to the same font family, though it is clear that one should pay some attention to how a particular Arabic or Hebrew or Georgian script looks alongside a particular Latin face. Each of them has their own tradition of calligraphy and type design, quite separate from the Latin tradition, so there's no a priori reason why they should belong together in the same act of design. Which means that wanting to have one font to rule them all is likely to introduce complications which could be avoided by accepting that a system could have a variety of fonts available, even for the default display font, depending on default language. Possibly even a different font for languages using the Latin alphabet with a lot of diacritics.
One thing I find troubling in the discussion is that there is no mention of readability studies. There are the obvious abominations like I and l being indistinguishable (as in the font I see on YouTube now), and my pet hate of l being hard to distinguish from i in some quite fashionable fonts; and then there's telling the difference between rn and m, which is unnecessarily hard in some sans serif faces. But there have been more general studies, taking into account different levels of visual acuity and stuff.
BTW, making a Bold by just tweaking some parameters on a base font is, I think, regarded as devil's work by font designers. Even scaling by point size can be usefully tweaked, if you're aiming for the font beautiful.
A distro using a clone of Comic Sans? To go alongside Hannah Montana OS ?!
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
Groupthink is real, and when it leads to opposition to, say, nuclear power, it's harmful. But you can't keep science and politics separate (that is a statement about what is possible, not what is desirable). We need to do something -- well, I'm nearly 80 and have no kids, so I'll say YOU LOT have to do something if you want to avoid really bad times -- and doing something is what politics is about. Al Jaber, also, was not making some innocent remark; the major producers of fossil fuels have been trying to stifle honest discussion by setting up climate science. Sure, it is important to remember what the main point is; and if carbon capture can be made to work, that would be wonderful. So in that sense fossil fuels are not the core problem. But, as far as I know, there is science to say that the major cause of excessively rapid climate change is the release of carbon dioxide and methane into the atmosphere; and that the chief source of these gases is the use of fossil fuels. So, by a chain of reasoning that even Tucker Carlson could follow, there is science behind concern about the use of fossil fuels. When does carbon capture reach a scale to allow us to use fossil fuels like we do at the moment? About the same time that nuclear fusion becomes a practical source of power, I'd bet.
The other reason why it is intellectual purity to the point of naivety to decry the mixing of politics and science is that a lot of the denialist rhetoric is an attack on science as an institution. It's all a big conspiracy to do down the salt of the earth God, guns, and family Americans. If these people were susceptible to reason, it would be good to ask how the big oil companies find reserves. They employ expert people to suggest where to look. Who are these people? Geologists. Scientists, who don't seem to be interested in overturning the master conspiratorial narrative about where hydrocarbons come from (and who probably don't believe in the literal truth of the Genesis story).
Politics is dirty. Science is cleaner, though in fact there's a lot of internal politics in science, as in every part of institutionalised intellectual activity (string theory? new particle accelerators?), but it's only really cleaner because the stakes are, for the most part, small. But if the climate deniers carry on with an unremitting, orchestrated narrative that has no relationship to the truth, it is an unfortunate necessity that people wanting to oppose them have to think about the political effectiveness of their statements, more than the nuances of precision. At least it's not as bad as my old subject, LitCrit, which is almost entirely politics these days.
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
1
-
So, just to simplify matters, there are THREE designations "Mk. IV" applying to Webley revolvers. Webley's Mk IV in the service revolver series; the gun is adopted, and happens to be Mk IV in the quite separate Army sequence. Then there is Webley's Mk IV pocket revolver (they having two different lines, which seem never to get mentioned in the naming of parts), which gets adopted with the manufacturer's name, not a separate military designation. That military naming is, by then, completely out of line with the standard designation for Army revolvers, but it would, I guess, have been far too easy to call it Revolver No. 3. Of course, they couldn't have called it Revolver No. 2 Mk 1, because that would have been to admit that the Enfield was a knock-off.
Glad to be able to clear that up for everyone.
1
-
1
-
1
-
1
-
1
-
I've got a couple of old machines that do or can run Windows. Since I only have them because I need Windows, the unofficial way of updating to W11 is attractive.
I hadn't properly thought about Linux as a vehicle for using web apps, but that's obviously a good idea if someone wants minimum change; and then the choice between Linux and ChromiumOS is a real one. My assumption would be that going to ChromiumOS would be less of a faff for people who just want to do their stuff, and are not interested in computers for their own sake. Is that assumption true? Would ChromiumOS be better suited (i.e. less demanding) for a really old computer?
It also looks to me as though OnlyOffice is an easier switch from MS Office (or whatever they call it now) than Libre Office. I recently wanted to do a search on the string of a tab character + a numeral. MS gives you a short list of such special characters, as does OnlyOffice: LibreOffice says "We do regular expressions." Yeah, that's what real computer users want, and it's more powerful, but I looked at it 25 years ago, thought regex looked interesting and I might have a use for it, but never did. And in other ways, OnlyOffice looks more addressed to people for whom the whole Windows 11 saga is a nuisance, rather than those of us who find it an outrage.
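For the record, the pattern itself is tiny once you accept that regex is the answer; a sketch in Python (and, if memory serves, the same pattern works in LibreOffice's Find & Replace once "Regular expressions" is ticked, though don't quote me on that):

import re

# A tab character followed by a single digit.
pattern = re.compile(r"\t[0-9]")

sample = "Chapter\t3 begins on the next page"
print(bool(pattern.search(sample)))    # True: tab + numeral found

Which rather proves the point: powerful, but not the first thing someone fleeing Windows 11 wants to have to learn.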
1