Where some ideas are stranger than others...

FOUND SUBJECTS at the Moonspeaker

Reducing the Commercial Web (2023-08-21)

A snapshot of the main page of the project gemini capsule in gemini space using the amfora browser.

Technically of course, as the introduction to the newish gemini protocol says right in the snapshot illustrating this thoughtpiece, geminispace is not expected to replace the commercial web at all. But there is a reducing element to using it, in the sense of getting away from an increasingly unusable web with its all too many normalized intrusion practices. It's bad enough that the web has been rejigged fully into a platform fundamentally designed for mass surveillance, heedless of the compelling evidence that mass surveillance is not an effective tool for maintaining security at all. Mass surveillance is simply evidence of growing paranoia among those who have managed to engross more of the necessities of life and actively debar others from accessing those necessities. Bad conscience works on people like that, although that unfortunately does not mean it works fast or effectively enough from a practical perspective for people to depend on it as a corrective mechanism. Besides the obsessive demands for us to load javascript code libraries so that websites can dump propaganda and spyware on us, most webmasters are heinously sloppy with such items as images, sounds and movies, as Maciej Cegłowski skewered so well in his discussions of the "web obesity crisis" before he got lost in politicking in the united states. I have finally figured out the wretched way svg files are often applied, such that loading a site that uses them first displays a massive, page-filling version, which then belatedly stops short as the page reloads to resize the thing into something that makes sense. At one time this rude behaviour seemed limited primarily to "social media" buttons, but since then it has proliferated like pink eye through a busy kindergarten classroom.

For those who have opted to begin maintaining a "capsule" on gemini, there is quite a range of applications that people are trying out. Since the protocol favours text above all else, geminispace has become host to a minor resurgence of ASCII art. More importantly, a number of people have turned to it as a super lightweight mode to blog in, finding it more congenial and less prone to becoming an inaccessible database. From what I have read so far, an important group of geminispace participants have found that git integrates very nicely with their new publishing process. There are already a number of unusual capsules that support comments, providing an intriguing opportunity to observe all over again how things work without the ability to vote on posts or attempt to manipulate perceptions with algorithms. A variety of tech-related blogs have begun maintaining their own capsules, and quite a few capsules include a re-ported rss feed. I am using the gemini browser amfora, feeling for the most part quite satisfied with its barebones qualities (still working out how to move more efficiently among links once there are more than 10 on a page). I have also heard a great deal about another browser called lagrange, which has a graphical user interface rather than running from the command line. The trick for the time being for many of us interested in potentially adding our own contribution to geminispace is sorting out hosting. Since gemini is not a 500-pound gorilla, and is deliberately stripped of such capacities as running scripts in order to preserve privacy and security while setting a serious barrier to spam of all kinds, self-hosting is quite feasible.
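
Just to give a sense of how small the protocol really is, here is a minimal sketch of a gemini request in Python, standard library only. The capsule address is a placeholder, and the sketch skips the "trust on first use" certificate checking that real clients like amfora and lagrange perform, so take it as a demonstration of the wire protocol rather than a usable browser.

    import socket
    import ssl

    def fetch_gemini(host, path="/", port=1965):
        # Gemini wraps every request in TLS. Real clients pin certificates
        # on first use; this bare sketch skips verification entirely.
        context = ssl.create_default_context()
        context.check_hostname = False
        context.verify_mode = ssl.CERT_NONE
        with socket.create_connection((host, port)) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                # A request is simply the full URL followed by CRLF.
                tls.sendall(f"gemini://{host}{path}\r\n".encode("utf-8"))
                response = b""
                while chunk := tls.recv(4096):
                    response += chunk
        # The response is one header line, e.g. "20 text/gemini", then the body.
        header, _, body = response.partition(b"\r\n")
        return header.decode("utf-8"), body.decode("utf-8", "replace")

    if __name__ == "__main__":
        header, body = fetch_gemini("example.capsule.invalid")
        print(header)
        print(body[:500])

The entire request is an address plus a line ending, and the entire response header is a status code plus a mime type. That near total simplicity is exactly why self-hosting is so feasible.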

Feasible, and perhaps unavoidably necessary, since the majority of universities and residential internet service providers have removed all support for any minimal web hosting, let alone newer lightweight protocols like gemini. The destruction of webhosting for students is particularly egregious at post secondary institutions in my view, because these are the same people who insist that they are desperate for the research done by their students and staff to be shared with the world, especially online. Yet they obviously have no serious commitment to this when they mostly expect people to either sign up to use a "free" wordpress account, or else demand that they apply for university server space, then fill out a form each year to confirm the site is still in use, all while munging the addresses into a difficult to type mess that reveals usernames. Today the latter point is a serious issue, and a better solution would be to connect the subsites to the department or faculty that the research subject matter ties to most closely, with a name that reflects the name of the research project. That would likely go a long way to improved security and a better effort at naming the subsites. The rub there of course is the utterly self-defeating common look and feel demands of most universities, which are as rigid as any government or corporation's and just as self-destructive. That isn't even getting into the fossilized evidence of rounds of "updates" to the sites in the form of multiple, unintegrated navigation systems.

Which leads me to think that actually, if research and academia opted to make an effort to share their results in gemini capsules, they could potentially hedge their bets sensibly. Admittedly it would go against the grain to not obsess over "branding" and imposing scripts to make it look like there are movies and fancy visual transitions, but the low footprint, high security, and opportunity to provide access to results in a focussed manner that respects the intelligence, time, and privacy of the visitor would go a long way to improving accessibility. Pursuing this thought further, it seems to me that the gemini protocol lends itself to porting for interpretation by screen readers and the like. Let alone that such a stripped down protocol could potentially render data usage on a cell phone genuinely affordable again – well, depending on how far the grift goes to set up so-called "5G" while deliberately destroying "3G" – but then again, public wifi is pretty common these days too. In other words, there is a great deal of potential in the gemini protocol, including the potential to ward off the parasitical corporatization that is killing the web and significant parts of the internet beyond it. (Top)

A Little English History (2023-08-14)

Engraving by Gregoire Huret of a woman representing rhetoric by her pose, clothing, and accessories. The original document is held at the metropolitan museum of art, which donated high quality scans of it to wikimedia commons under Creative Commons CC0 1.0 Universal Public Domain Dedication.

Due to a fun latin project I have been reading at for awhile, it finally happened that I could spend a bit of time looking into the origins of the miserable classes in school labelled something like "language arts," "language studies," "literature," or "english." Being a writer who does indeed enjoy reading a wide variety of non-fiction and fiction, I may surprise some readers in saying that I hated every class labelled under any of these sorts of names that I ever had to take, at any level. Hated them. It did not help that in the lower grade levels it was not remotely clear why we students were being put through the tedious and awful exercises we were, why we were being forced to read stuff we would never have chosen. Never chosen, not because of how hard or easy it was – at such an age we had no means of comparison – but because of the subject matter. To this day it is still all too difficult to escape the endless attempts to indoctrinate younger students into consumerist capitalism with a heavy dosage of sex role stereotyping and now much more subtle but still persistent racism. Kids get that they are having shit piled on their heads even when they are too young to know or use such terminology. We're sharp at that age, we have to be. Later in the higher grades the obsession was with teaching us to pass standardized exams, so we worked endless practice versions of the sorts of questions on the eventual official exams. It would have been unpleasant but bearable by analogy to sports, music or arts practice: dull practice is part of what you need to do, but with a serious leavening of new material and application. That of course is why good teachers do their best to design exercises that help students use the skills they need for the exam to produce something more interesting than one-off projects, like a school newspaper or a one-act play or whatever. In university, many of us are forced to effectively redo the entire course of english we just finished in high school regardless of our mark, in order to meet an "effective writing requirement." This is the heinous outcome of forcing teaching into narrower and narrower channels so that it is only "for the exam" in many schools. Those who only got the narrowest instruction do need the remedial help. But "for the exam" also means grade inflation, so that universities don't actually trust the results documented in high school transcripts. It's crazymaking, and frankly, insulting to students who arrived competent and prepared for undergraduate study.

On my idiosyncratic end of things, I have generally found university english literature courses pretentious past bearing, which is really too bad. However, it seems to me that this can't really be about the courses as such in many cases. Rather, I seem to have the wrong mindset to take at all seriously the obsessively inward-turning type of analysis I have read in many of those classes. Nevertheless I have managed to rather enjoy some of the literary studies that have fallen into my reading requirements over the years, such as Gilbert and Gubar's famous opus The Madwoman in the Attic. There are certainly studies with intriguing premises that I could not resist reading for fun, such as Seo-Young Chu's Do Metaphors Dream of Literal Sleep? I freely admit that the title alone won me over. Studies of the imagery in Shakespeare's plays or the ins and outs of Dickens' london can indeed be fun to read, and fun to take as examples of how to close read and contextualize writing of all kinds. But the politics and efforts to make academic literature studies "serious" aside, this left me wondering, what in the hell is this all supposed to be about? What are these courses and where did they come from, when nowadays students can come out of their years of mandatory schooling barely able to compose a five paragraph essay or even a coherent three or four sentence response to a question?

Well, in the end I opted to spend a little time putting the internet archive's rather odd and slow search engine through its paces to look up some of the older books used to teach basic literacy, as that was likely to give some good recent evidence. It isn't difficult to find brief accounts of the mediaeval version of the trivium, the basic subjects of grammar, rhetoric, and logic. But it is not easy to track down a similar account of how things changed from those three to the eventual rough modern equivalents under labels like those listed at the start of this thoughtpiece. Grammar has a specific technical meaning in linguistics, but more colloquially it refers to correct use of language, so it includes whatever training helps a person to both speak and write well-formed sentences and sequences of sentences. "Correct" covers many things, including standardized punctuation, spelling, and application of specific styles and forms to particular uses and topics. The reason spelling and punctuation are such bugbears is that they were not especially standardized for most languages until the full spread of the printing press, and they continue to change anyway. Which sort of writing is appropriate to a given topic also changes all the time. As recently as the mid nineteenth century, didactic poetry was still quite an ordinary thing to write, read, and study for any topic, including the subjects considered "hard sciences" today. I suspect this had to do with the widespread use of oral examinations. Rhetoric has a bad reputation because it is a term for the techniques of persuasive speech and writing, which can be cynically abused, yet the best defence is offence: learning how rhetoric works. A clever speaker or writer can't fool you if you know the techniques they are using too. Logic has been all but banished, yet it is meant to cover techniques of reasoning that should be self-consistent and possible to describe to others, with clear steps showing the truth or strong plausibility of the argument or claim being made.

The textbooks I found on the internet archive were quite interesting, referring consistently to rhetoric and composition, sometimes to spoken rhetoric, but primarily to written rhetoric. There were much more in-depth sections on material covered only lightly, and in a way that was utterly confusing to me, when I was a child, especially the exercises in identifying the major categories of words (good old nouns, verbs and adjectives) and dividing sentences into subjects and predicates. Spelling was covered very briefly, because there were specialized spelling textbooks with endless drills and lists for that. By the late nineteenth century the textbooks were more and more often written by women, and they carefully set students to learn grammar thoroughly before rhetoric. But logic, and what today is probably best referred to as reading comprehension, were not much in evidence. That said, the textbook examples I touched on had excellent sections explaining how to write different types of letters – which leads me to wonder if part of the reason letter writing has faded is not about email but about not teaching how to write them – reports, essays of different lengths, and different types of poetry. Explicit instruction in how to construct and parse subordinate clauses was more common in these books as well. No instruction of this type was given where I went to elementary and secondary school, but we did get very basic grammar and spelling, short answer writing, and the five paragraph essay until our faces just about turned blue.

From the look of it, english instructors then were under pressures similar to those of today, in that they were constantly being ordered to make students into people with enough basic literacy to be able to work in certain types of jobs. Their instruction was supposed to be "practical" and oriented to the marketplace, or perhaps to government jobs where competitive examinations were in use. They were supposed to somehow do this in less and less time, to more and more students per class, while discouraging independent thought and not instructing students in the ancient languages that were the status markers of the upper classes. This was the case even though it actually does make sense for students to learn at least the basics of latin and the greek roots to help them deal with unfamiliar vocabulary when they can't access a dictionary right away, and to learn those different categories of words and how subordinate clauses and the like work. Since most english grammar terminology is latin grammar terminology, learning the latin makes the english easier to learn – most students can quickly identify the parallel structures in english, since the two grammars correspond closely enough for the major categories to carry over. (Top)

Against Monarchy (2023-08-07)

Illustration from an 1857 edition of Goethe's Reineke Fuchs (Reynard the Fox) by Wilhelm von Kaulbach, courtesy of oldbookillustrations.com and the internet archive.

Certainly it isn't difficult to be against the existence of monarchies. Even if monarchs are officially without real power, the fact remains that they are generally the same dangerously inbred families whose fortunes were built up through organized crime laundered through warfare and some form of religious sanction. Any unpleasant family can have unusually sensible branches, or members whose own character and experience lead them to turn away from the established habits of ongoing greed and wastage typical of the culture of those who have managed to seize enough money not to care about what they do with it. Some of them have transitioned quite well to playing the celebrity game, which is of course the united states version of monarchy, because the richest people there are relatively nouveau riche and some of them managed to steal their pile via business chicanery that didn't depend on having a war going on. Well, I suggest as much in an effort not to be cynical past bearing. It seems to me there must be some outliers on that score, yet I can't deny that between the efforts at public relations and retroactive hiding of how the sausage gets made, the suggestion may in fact be wildly unrealistic.

I have heard people get excited about "royal visits," and certainly these seem to be used as a justification for an off cycle civic holiday in some places, as well as a judiciously coordinated tourist event. Yet I have never understood the appeal. Celebrity or monarch, these people live within a carefully curated bubble that helps render them incapable of dealing adequately with real life tough decisions beyond those that benefit their class. They are persistently trained to consider other people less than real and generally as quite expendable. The heinous handling of the ongoing COVID-19 pandemic, let alone the crazed response to the renewal and growth of warfare in europe, is all too much evidence for that. It's terrible, but when members of a particular class of people repeatedly demonstrate an inability to relate to or value the lives of people who are not in that same class, and they are completely open about it, it is hard to deny the facts. The clarifying behaviour has certainly led me to revisit my lack of any support for a monarchy of any kind, whether constitutional or carefully relabelled as celebrities or the very rich, who supposedly should be allowed to run everything because supposedly having lots of money means they are smart and forward looking. The trouble is that they are smart and forward looking only within their own very narrow areas of interest.

Perhaps a big part of the difficulty for me, even with the most seemingly harmless monarchs and celebrities, is that being that rich and waited upon should be embarrassing. For those who like to maintain a certain number of servants of a more or less permanent type, there seems to be a sort of regression to semi-childhood. The person behaves in a manner that suggests they have lost the capacity to clean up their own messes, arrange their own schedules and so on. At the very least, there seems to be a culture of deliberately seeking to offload all such "crude," day to day stuff to other people. I do realize that coming into adult responsibilities is not the nicest thing in the world, and there is a certain romantic haze around the idea of somehow being able to go back to carefree days of no responsibility and seeming to have everything magically done for you. Or dreaming of such an idealized time that never really existed and trying to make it happen. For myself and many others though, having won independence and the skills to be adults in the world, it seems ridiculous to throw away that hard-won self-respect and capacity, or even to seem to throw it away, when real life can catch us out at random. But that is not the most embarrassing aspect. To me, it seems that having great riches must indicate either a profound act of extraordinary theft, whether it be by direct violence or by taking part in modes of moneymaking that wreck societies like rentierism, or else an extraordinary moment of luck. In which case one would hope that being a crook would finally activate a person's conscience for real rather than their desire to whitewash their ill doings, or that the very fortunate would willingly share as much of their fortune as they can. We can do our best not to mess up constructive opportunities, but luck is random and won't necessarily stick with us all the time. When we stumble on bad times, by not being greedy we are able to build and maintain the relationships with our fellows that are our best insurance and help in hard times.

That's not me trying to do some kind of moral or ethical bloviation; it is something we can actually verify by checking the historical record, and the more recent incidents of people struggling on after major disasters. It seems to me that those who are allowed to pretend at being some sort of semi-divinity on Earth not only create conditions under which their efforts to get more mean that many more have nothing, their example encourages the destruction of social cohesion that is the real help when we have to deal with grim challenges like natural disasters and pandemics, wars and authoritarian implosions. This is the sort of loss that also ricochets nastily back on the people who thought for sure they were at the permanent top of the heap and could always arrange to maintain their rarefied lifestyles. It's not a good longterm strategy, but then admittedly, this simply demonstrates that in the view of people who are counted among the monarchs and the one to five per cent richest people generally, they don't need a longterm strategy, they just need one that lasts long enough for their immediate goals. (Top)

Thoughts on Computer Vulnerability Reports (2023-07-31)

Snippet from issue 732 of the Ubuntu Weekly Newsletter, april 2022. It is of course also possible to see the section illustrated here directly and follow the links.

There is no question that computer security is important. Nobody wants their computers rummaged through by strangers or their operation compromised by malware. It makes sense to keep software patched and up to date. However, in line with the general practice of click-trolling by just about every website purporting to provide any sort of tech news online, we aren't provided much by way of practical and reasonable security advice or recommendations. Instead there is plenty of scaremongering, and a perverse refusal to provide information that allows a person to assess their risk and pick up on possible warning signs. This is especially bizarre right now, when shills for supporters of authoritarianism regularly insist that "we" should all be taking individual responsibility for ourselves and never, ever doing anything like working together for mutual benefit and safety. Somehow that "we" never includes the ones insisting this the loudest, and the insistence goes awol as soon as the subject turns to computers and securing them. This is transparently ridiculous. On top of that, as I have noted in a previous thoughtpiece, more and more often security updates are pushed in combination with other software changes that break the software in other ways, so people resist applying them. This may sound counterintuitive, or foolhardy. But it isn't necessarily either.

I subscribe to several different rss feeds that chronicle security patches and warnings, sometimes picking up warnings before patches are available. This is critically important, because sometimes patches are not available for awhile, and crackers may already be exploiting the issue. Over the years I have learned some basic strategies for managing the risks, and while it is true that they are a bit more work than depending blindly on a corporation or project, they are amenable to being made into habits. And the nice thing about good habits is that once established, they are not work anymore.
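
For the feed-watching part of the habit, even a small script can do the first pass. Below is a minimal sketch in Python, assuming the third-party feedparser library; the feed addresses and the watchlist of program names are placeholders, to be swapped for whatever a given set of systems actually runs.

    # A small habit-support script: pull a few security advisory feeds and
    # print recent entries whose titles mention software I actually run.
    # Requires the third-party feedparser package.
    import feedparser

    # Placeholder addresses – substitute the advisory feeds for your own systems.
    FEEDS = [
        "https://example.com/security/advisories.rss",
        "https://example.org/project/security.atom",
    ]

    # Program names worth flagging, again placeholders.
    WATCHLIST = {"openssh", "firefox", "libxml2", "sudo"}

    def scan(feeds, watchlist):
        for url in feeds:
            parsed = feedparser.parse(url)
            for entry in parsed.entries:
                title = entry.get("title", "")
                if any(name in title.lower() for name in watchlist):
                    print(title)
                    print("  " + entry.get("link", "(no link)"))

    if __name__ == "__main__":
        scan(FEEDS, WATCHLIST)

Run from cron or by hand, something like this surfaces the handful of advisories worth reading closely, without any detours through scaremongering news sites.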

So, suppose that a security vulnerability is reported, and it impacts a system that I use or manage. Right away I want to have a quick look at the report to see if the vulnerability is accessible remotely, or if a person would have to be in a position to physically access the computer to take advantage of it. Those are two very different scenarios, obviously. Patching should be done in either case, but if physical access is required, that may lower the level of urgency to apply the patch. Or a bit more reading may reveal that the vulnerability is in some process or option that actually should not be available in the first place on that system, in which case I shouldn't be patching it, I should be disabling and/or removing it. So when it comes to vulnerability reports, the first three things I want to know are: which systems have the vulnerability, which program it resides in, and whether it can be exploited remotely. For general assessment, a vulnerability report that presumes the only people reading are those who will look up CVE numbers to get details, rather than providing those three high level details or at second best a link to a description, is a useless insult. It is more respectful and more useful to answer those questions straight up, in a way that is legible to people even if they don't do any coding or scripting at all.
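
Until report writers oblige, those of us who do script a little can at least partly automate the lookup they refuse to spare us. Here is a rough sketch of a triage helper in Python that queries NVD's public API for a given CVE number; it assumes the JSON layout that API returns as of this writing, so treat the parsing as illustrative rather than guaranteed.

    # Quick triage helper: given a CVE number, pull the plain language
    # description and the attack vector from the NVD public API.
    import json
    import urllib.request

    def triage(cve_id):
        url = f"https://services.nvd.nist.gov/rest/json/cves/2.0?cveId={cve_id}"
        with urllib.request.urlopen(url) as response:
            data = json.load(response)
        for item in data.get("vulnerabilities", []):
            cve = item["cve"]
            # The first english description usually names the affected program
            # and versions, answering two of the three questions.
            for desc in cve.get("descriptions", []):
                if desc["lang"] == "en":
                    print(desc["value"])
                    break
            # CVSS vector strings encode the third question: AV:N means
            # exploitable over the network, AV:L local access, AV:P physical.
            for metric in cve.get("metrics", {}).get("cvssMetricV31", []):
                vector = metric["cvssData"]["vectorString"]
                print("Remotely exploitable:", "AV:N" in vector)

    if __name__ == "__main__":
        triage("CVE-2021-44228")  # the well-known log4j example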

Besides being able to check these things, I keep offline back ups, as in non-remote accessible back ups, including offsite copies. Having been burned very badly by fancy back up archive formats, I maintain literal copied backups offline, and for the most critical material, I keep that on good old-fashioned cds and/or dvds. Those are immutable and can't be messed with by malware after the fact. When it comes to file scanners, I am not sure how useful those really are outside of email servers. The better defence against malicious email and websites is keeping abreast of the basic techniques used for phishing and dishonest redirects, plus not allowing javascript to run by default. For my part, I delete all emails without even opening them if they purport to be from a financial institution I deal with, which those institutions don't like very much. However, to my mind that's a better habit, and at least where I am it is still possible to call or even go to an actual branch office to follow up. And like most people, for daily use machines I use a modern email client that has a built in scanner for typical phishing patterns, mismatches between where a link goes versus what it says, and the like. Oh, and for the times that I have to deal with any file that may include macros, I invoke the program settings that autostrip them or else refuse them permission to run at all unless I explicitly enable them. Really ancient machines that can't be fully patched or upgraded but still work don't go online at all. That's not too common nowadays with so many excellent gnu/linux options available.
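
As a concrete illustration of the link mismatch pattern those client scanners watch for, here is a toy checker in Python, standard library only. It flags anchor text that looks like one address while the underlying href points at a different host; the addresses in the usage line at the bottom are made up.

    from html.parser import HTMLParser
    from urllib.parse import urlparse

    class LinkChecker(HTMLParser):
        # Collects the visible text of each link and compares it to the
        # actual destination given in the href attribute.
        def __init__(self):
            super().__init__()
            self.current_href = None
            self.text_parts = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.current_href = dict(attrs).get("href", "")
                self.text_parts = []

        def handle_data(self, data):
            if self.current_href is not None:
                self.text_parts.append(data)

        def handle_endtag(self, tag):
            if tag == "a" and self.current_href is not None:
                text = "".join(self.text_parts).strip()
                # Only compare when the visible text itself looks like an
                # address rather than ordinary words.
                if "." in text and " " not in text:
                    shown = urlparse(text if "//" in text else "//" + text).hostname
                    actual = urlparse(self.current_href).hostname
                    if shown and actual and shown != actual:
                        print(f"suspicious link: text shows {shown}, href goes to {actual}")
                self.current_href = None

    if __name__ == "__main__":
        checker = LinkChecker()
        checker.feed('<a href="http://bad.example.net/login">www.mybank.example.com</a>')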

As these various layers and strategies suggest, I am well aware that, being a human being, it is possible to get caught out by some unfamiliar nasty strategy or a software vulnerability that I couldn't have known about until somebody doing security testing found it and it was subsequently reported. It is a shame that such precautions are necessary, but they are, and they are quite feasible. There is no need to panic or just wait for trouble. Well, except these days if you are dependent on windows or so-called "cloud computing." In both those cases, so much is out of a person's hands, whether they are using a personal computer or serving as administrator over a cluster of computers at work, that I am not quite sure what the best answer is right now. The stories I hear from my colleagues striving to manage windows installations sound hair raising, especially if it is not feasible to take the machines off the internet and keep them off, which of course it generally isn't. If it isn't possible to switch to gnu/linux, or in the case of "the cloud" to revert to local hosting, then really your best saving grace is regular back ups that are not connected to the internet, plus read-only system restoration media – such as live cds and dvds. But that is still a real strategy, especially if combined with keeping an ear out for warnings of malware outbreaks, increasing the frequency of offline backups, and making sure that your restoration media work. (Top)

At Least One Serious Markdown Use Case (2023-07-24)

A teaser of Timothy Comeau's barebones and highly effective markdown reference, which he shares on his website, along with some interesting thoughts on this now widely used form of mark up.

To be sure there are many more uses for markdown than one. It's just that, having been by chance among the earlier numbers of people who built and published websites before the invention of blogging software, I had internalized the basics of html mark up to make webpages long before markdown ever came along. Having started way back when html was in something like version 2.0 and there were no stylesheets, there wasn't much to learn to get started. I developed templates and gradually learned extra things for the few fancier bits I wanted to keep using, and after a sort of baroque period settled back into a much cleaner webpage style overall. Today things are set up with stylesheets in the templates, and these have gotten a bit more elaborate with the years. Still, generally the styles fail gracefully when something goes awry or an older browser is in use, and I can run a bit of grep scripting to reduce it all back to even html 4.0 in an emergency. Thankfully, I have fully gotten over trying to make consistent use of the original frames implementation, although the newer one is actually brilliant for creating simple to update tables of contents without using blogging software. But with all that learned and applied, markdown seems mostly superfluous. There's no reason to bother if I don't need to run the material for my website through software to generate the final webpages. It just seems like too much work. I do get that markdown has a strong case on a busy blog with multiple contributors who may have little to no html-coding experience or desire to gain any. Still, I couldn't see any reason to do anything with it myself, and so ended up adding it to my list of cool things that are good to understand and recognize, even if they are not otherwise helpful. Then I finally ran into an intriguing and sensible use that looks to be quite common now, especially with so many operating systems offering document preview facilities in their file managers by default or by plug in.

These days among the lingering files that still make an appearance in many software packages, even though we are encouraged to read pdfs or turn to online documentation instead, is the venerable "README" file. These are generally just simple text files, openable in whatever basic text editor is available by default in the system. On early apple computers, that was simpletext; on windows, notepad; and on *nix boxes, all manner of wild possibilities. When I started out usually we'd just dump the text into a terminal window, but of course there was vi, the strange behemoth that is emacs, and in time the many, many variants that can be represented by gedit. Or then again their total antitheses, pico and nano. If the new software was giving you trouble or there were some recommended set up steps left to do, the "README" documented them, along with any lingering irresolvable bugs it had shipped with. Oddly enough, at some point, however these files were being generated in the course of the final software packaging, "README" files often arrived without extensions, perhaps because under that label they had a particularly macos-associated origin, and at one time macos handled program-filetype associations with a fancy thing known as a resource fork. It was a clever idea in its own way, but it was jettisoned when apple finally switched systems to a variant of bsd unix. Of course "README" files are still readable, but they are no longer amenable to such GUI shortcuts as double clicking or opening via keyboard shortcuts without an excursion to tell the system what to open them with. That's annoying. But the utility of such files remains. Enter markdown.

UPDATE 2024-09-20 - Another interesting discussion of when to use markdown and adjusting the development tools for a longlasting website is unixdigest's Come Full Circle - Back to HTML, dateline 2020-10-18. There are many other great, well-researched and thought out articles on the site, including multiple tutorials.

The old "README" files are now appearing as "readme.md" files, and whatever happens, the system copes with them quite nicely, and in systems with a quick preview option available, may even pop up with styles applied if the previewing software supports markdown. As it turns out, markdown wound up being the perfect solution for working up a more efficient file content previewing option for folders of documents I use in my research projects. Now I just write up a quick markdown file that I update when adding new files, and instead of going through files one by one or running a more elaborate search, I can just preview that file to track down the specific document I need. This is far less work than trying to rename the files to something more legible even nowadays when long file names are well-supported. The result is a lightly annotated "table of contents" for those research folders that solves a long time annoyance that a more general search or a citation manager couldn't. By rights citation managers should do the job handily, but I have found that sometimes they are the equivalent of a pile driver when what is needed is something much smaller. Let alone that citation managers in my experience do not sort by default based on folder location. It is possible to get them to do it, but it is non-trivial, and of course requires opening an extra piece of software. For a quick look up when I actually know the file is in a given folder, the markdown contents file does the trick. That works well in the case of reference files sorted into a consistent small number of labelled folders, a style of organization that doesn't work for everyone.

Now, all this could be put down to me being, relatively speaking, an old fogey on the internet. Yet it seems that there is something of a shift happening among people who maintain a regular writing space on the internet somewhere. Blogging software is so massive, and generally designed to make it difficult to get data out of again, that from the sound of it a great number of people have decided to revert to a much simpler way of writing and posting their web pages. The big proponents of markdown seem to be an early cohort leaning this way. More recently several web pages and blog posts were brought back to my attention on the point, from people switching to "static blog publishing" – or what us old internet fogies call "writing and posting webpages" (e.g. Bradley Taunt, John Ankarström, or Steren). These example posts are all by men, and they seem to be a bit defensive, like they have to justify getting off the feeping creaturitis treadmill. I guess they are either just discovering, if they are fairly young, or rediscovering if they are a bit older, that no really, it is possible to maintain a website with just a well-featured text editor and a terminal emulator to post files to the server. And the pages don't have to use any stylesheets at all, because html is still there and genuinely simple to learn and apply, partly because the old-style frames implementation is thoroughly deprecated. It is possible, and actually rather enjoyable. (Top)

"Europe" is Not a Continent (2023-07-17)

Excerpt from the political labels version of the excellent and free for everyone to download, print and use equal area projection map developed by Bojan Šavrič (esri), Tom Patterson (united states national park service), and Bernhard Jenny (monash university). There are non-labelled and topography-focussed versions, and a growing number of translations.

Seriously. I realize that many of us of a certain age were taught to recite by rote back in elementary school that there are seven continents, with "europe" named as one of them. This is the sort of thing that we are encouraged to take as common sense. However, even when I was much younger the declaration that europe was a continent confused me. We were instructed that continents were big and surrounded pretty much by ocean, usually with north america or africa taken as an example. North and south america were preferred in my experience, because they look quite impressive on their own in a mercator projection. The trouble being, europe isn't at all like that. As a child, I was baffled how asia and europe were supposed to be separated from each other, when that whole part of the world is quite significantly stuck together. This isn't like the americas with their now artificially divided isthmus, or the place where africa genuinely looks like it is pulling apart and making a break from asia, so that even a stubborn kid can accept the idea. Plus, I couldn't understand why this was at all important. Not that geography wasn't interesting in principle – it was, and it was one of the few times when having to colour in a bunch of outlines was fun. This just didn't seem like a terribly important problem, so I sorted out ways to remember labels for the parts of the world teachers seemed to pick at semi-random for us to know in quizzes, and left it at that. However, it eventually happened that due to a different writing project the question came up by the back door, so to speak, because I needed to make sure that my use of "continent" in its set of meanings referring to geography was correct. This turned out to be one of the more peculiar research rabbit holes I have stumbled on, so much so that after editing away my attempt to use the term because it turned out to be so ambiguous, I set it in my "look this up later" list.

As always, I started with my trusty copy of the OED, and it must be conceded that the editors tried very hard to make it all make sense. They tried, but the inconsistent definitions made a hash of their efforts. The main definition refers to "a mainland contrasted with islands," which really doesn't make sense to me. Where I'm from, we talk about "the mainland" and islands, not "the continent." As will not surprise any regular visitor to the Moonspeaker, this indicates that I am not from any of the countries in the shaky political conglomerate known as the united kingdom. A more technical definition deeper in the dictionary entry notes that in geology a continent is one of the Earth's major land masses, including continental shelves. It is quite likely that my present sense of what a continent is has far more to do with this definition, due to having some advanced training in geology. It still doesn't solve the weird europe-asia thing, but the implications of plate tectonic theory are far more interesting anyway on the geology side of things. The end result though is that this all begins to sound more and more like a convention outside of geology, a political convention that has more to do with who is speaking than with any consistent definition based on actual geology. That actually helps make some better sense of it, especially after checking that the latin roots of the word refer to holding together or inside. In that case things clear up wonderfully if we stop focussing on geology and turn our minds to people. If we go all the way back to the ancient greeks, they were not necessarily trying to describe geography so much as fairly cohesive groups of people who shared a language or perhaps lived under the same or similar types of political authority. Well, no wonder the word has such a wishy-washy quality in this context.

There is actually an entire book tracing the geographic meanings, or rather attempted meanings, of "continent," and it has been out in the world for quite awhile. Martin W. Lewis and Kären Wigen's book The Myth of Continents: A Critique of Metageography hit the shelves in 1997, and I suspect it may be an excellent companion to Mark Monmonier's monograph from a year earlier, How to Lie With Maps. The latter is by turns practical and funny, while Lewis and Wigen are a bit more scholarly, but very interesting indeed, based on the excerpt still available to the public at the new york times review of books. Unsurprisingly, the ambiguity and suspiciously political inflection of how the word "continent" is used in geographic parlance is not new, and has been confusing people of all ages for literally millennia. Not having had a chance to read Lewis and Wigen's book yet, I can't speak to their conclusions or arguments in it with any specifics, although even this snippet indicates they are going to go happily to town on ethnocentrism.

But why does it seem to me that europe is not a continent, and that this is actually important? Well, I have already answered the first part of the question: there is no real division between what is usually called "europe" and "asia" in geographic or even cultural terms. The "western" leaning toward exoticizing and racializing "the orient," which is a means to deny people from those regions full humanity and a right to their own cultures, is all too old, and at the moment going through an especially pernicious resurgence. The peoples typically labelled "slavs" based on their eastern european languages seem to especially annoy "western europeans" with their often literally more colourful and texturally rich material culture, on top of a remarkable level of survival of ancient traditions of dance, music, and poetic performance. There is a non-trivially puritanical and destructive streak expressed in the nonsense claim that "europe" is a continent, one that encourages warfare and attempts to force people to leave their own cultures behind – in other words, genocide. It doesn't take much research to confirm that this is no exaggeration. (Top)

Not a New Problem (2023-07-10)

Scan of the frontispiece of an early edition of Galileo's monograph Dialogue Concerning the Two Chief World Systems, from the Newberry Library collection via their materials shared with the internet archive. The original scan provided by the Newberry is of much higher quality.

Among the matters that the ongoing COVID-19 pandemic has raised is the issue of "science," what it means, and how to make judgements that are upheld by it. This should not be too difficult to sort out, except that in line with the tenor of the times, with the general abuse of language by those who hope to take advantage of the unwary, "science" is a term being conflated with "authority." When some person in a position of supposed authority presses upon us the "necessity" of trusting the "science," we should be put on our guard, and indeed most of us are. The person who says such a thing is not calling upon us to trust "science" but to obey their pretended authority, which at this time is wholly artificial, because the majority of them are not in their positions due to their actual knowledge or qualifications in science but due to their capacity to suck up successfully to the rich. I referred to such bait and switch terminology as part of the tenor of the times, and by this do not mean that it is unique to the present. It isn't at all; there are plenty of examples of the use and abuse of "science" in an effort to win unearned trust and authority, and people claiming to be champions of "science" have engaged in this more than once. When a once trusted authority shows itself to be the institutional equivalent of a pathological liar and cheat at best, there are plenty hoping to step into the breach by encouraging general panic and expecting the general population to rush to anyone pretending to be the biggest and meanest. Still, I should provide some specific evidence for my claim that this is an old phenomenon that "science" has been tangled up in before. A commonly presumed example is the case of Galileo and his clash with catholic church authorities, but that is a deceptive one. Deceptive not because it isn't an example, because it is, as those who have read Margaret Wertheim's book Pythagoras' Trousers will know in detail. Deceptive because it is held up so often as an example of "what can't happen anymore."

Let me quote then, from a paper by the famous plasma physicist Hannes Alfvén, "Cosmology: Myth or Science?" published in the Journal of Astrophysics and Astronomy in 1984 (the year is coincidental), excerpts from pages 87-88. Each excerpt is separated by some vertical space, and some text has been left out apart from that marked by ellipses in the second.

The victory of science over myth in the field of celestial mechanics spread slowly to other fields. It took more than two centuries before it seriously invaded biology. In our century the scientific approach has embraced other areas which earlier were alien to it, such as the origin of life and the functioning of the human brain.

However, this does not mean a complete and definite victory of common sense and science over myth. In reality we witness today an antiscientific attitude and a revival of myth. This tendency has at least two causes. The popular creationism in the South in the United States derives from religious fanaticism. But in a way, the most interesting and also most dangerous threat comes from science itself. In a true dialectic sense it is the triumph of science which has released the forces which now once again seem to make myths more powerful than science and causes a "scientific creationism" inside academia itself.

...It was claimed that "Einstein has discovered that space is four-dimensional," a statement which is incorrect. In fact, H.G. Wells (1894) has based his ingenious novel The Time Machine on the "generally accepted idea" that space was four-dimensional, with time as the fourth coordinate. This novel was published when Einstein was fifteen years old.

However, the fourth coordinate which Einstein introduced was not time, but time multiplied by √-1. From a mathematical point of view this is elegant, because it meant that the Lorentz transformation can be depicted as a turning of a coordinate system in four-dimensional space. However, from a physical point of view it does not give any new information.

Many people probably felt relieved by being told that the true nature of the physical world could not be understood except by Einstein and a few other geniuses who were able to think in four dimensions. They had tried hard to understand the science, but now it was evident that science was something to believe in, not something which should be understood. Soon the bestsellers among the popular science books became those that represented scientific results as insults to common sense. The more abstruse the better! The readers liked to be shocked, and science writers had no difficulty in presenting science in a mystical and incomprehensible way. Contrary to Bertrand Russell, science became increasingly presented as the negation of common sense. One of the consequences was that the limit between science and pseudoscience tended to be erased. To most people it was increasingly difficult to find any difference between science and science fiction, except that science fiction was more fun.

On the other hand, in the general theory of relativity the four-dimensional formulation is more important. The theory is also more dangerous, because it came into the hands of mathematicians and cosmologists, who had very little contact with empirical reality. Furthermore, they applied it to regions which are very distant, and counting dimensions far away is not very easy. Many of these scientists had never visited a laboratory or looked through a telescope, and even if they had, it was below their dignity to get their hands dirty. They accepted Plato's advice to "concentrate on the theoretical side of their subject and not spend endless trouble over physical measurements." They looked down on observers and experimental physicists whose only job was to confirm their high-brow conclusions. Those who were not able to confirm them were thought to be incompetent. Observing astronomers came under heavy pressure from prestigious theoreticians.
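
As an aside for readers who want to see the point about the fourth coordinate concretely, here is a worked sketch in LaTeX notation – my own addition, not part of the quotation. Writing the fourth coordinate as x_4 = ict turns a Lorentz boost along x_1 into a formal rotation:

    % Coordinates (x_1, x_2, x_3, x_4) with x_4 = ict, boost speed v = \beta c:
    \begin{aligned}
    x_1' &= x_1 \cos\varphi + x_4 \sin\varphi \\
    x_4' &= -x_1 \sin\varphi + x_4 \cos\varphi
    \end{aligned}
    \qquad \text{with} \quad
    \cos\varphi = \gamma = \frac{1}{\sqrt{1 - \beta^2}}, \quad
    \sin\varphi = i\beta\gamma .
    % Check: \cos^2\varphi + \sin^2\varphi = \gamma^2 - \beta^2\gamma^2 = 1,
    % so the transformation is formally a turning of the coordinate system.

The "angle" is imaginary, and substituting back in recovers the familiar x_1' = \gamma(x_1 - vt), which is exactly the sense in which Alfvén says the four-dimensional form is elegant but adds no new physical information.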

There is considerable food for thought here. Most prominently, there is the deadly shift away from an honest and effective depiction of science, a shift which did two things. First, it held up scientists such as Einstein, or others working in areas now conveniently deemed too difficult for the general public to understand, as authorities who were encouraged to pontificate not just on their proven area of expertise, but on everything and anything. The trouble is not in asking such people about things outside of their area of expertise, but in allowing them to act and speak as if they were experts on everything else. We don't make so foolish a mistake as to presume that an expert in any more materially-bound area of knowledge will be a total authority on everything else. After all, as Alfvén observes, such experts do not keep their hands clean by sticking to pure thinking, which makes them rather dirty, and therefore incapable of creative development or discovery according to the persistent snobbery so rampant in idealist philosophies. Second, it allowed the general population, who had not managed to become celebrity geniuses, to be depicted as stupid children and duly informed that really that was what they were, so they should stick to the stuff that was fun and not too taxing while their "betters" did the actual thinking.

I do appreciate the temptations of idealist philosophies. They are so wonderfully tidy, providing an excuse to refuse to actually engage with the real world. Ideas can be put together into aesthetically pleasing structures where everything is predictable and seems to make sense. If the structure is one we understand and become committed to, then we are bound to invest considerable trust in it. It becomes how we make sense of the world, especially if the set of ideas reassures us that whatever we are doing is right, including things that otherwise our consciences might trouble us over, or that material reality keeps stubbornly contradicting. In my early adulthood, when I was still making my way through the trailing edge of puberty and its stubborn penumbra of an urge to find simplified answers to make a tumultuous period of development easier to cope with, idealist philosophies held a deadly attraction. Having grown up and had some more life experience, for myself I now find such philosophies unsatisfying, precisely because they demand in the end a blind faith in authority. There is too much evidence in my own experience and the historical record showing how acutely dangerous and destructive blind faith is.

Science is not a faith position. Science is a method of observing and learning from experience, then applying that learning and experience, seeking always to improve our understanding. When we get an unexpected result, if we are applying what is usually better referred to as the scientific method, we don't seek to explain it away with a contrived rationalization. Instead, we seek to make new observations and otherwise gather new information in order to see where our original reasoning went wrong. Often our reasoning has not gone wrong so much as we have overextended an explanation that was originally framed within a specific context that we know very well. Einstein's theories of special and general relativity are excellent examples of the results of an overextension and subsequent correction. Newton wasn't wrong, but he was bringing together data within a physical context characterized by low velocities and relatively small masses compared to those where later physicists tried to use his earlier results. They didn't go wrong trying this either; they were exploring whether his work could be extended so far. Then the next step was figuring out why it couldn't, and how to characterize the new results in mathematical terms. In this, mathematics is both useful and dangerous, because we seem sadly prone to mistaking the descriptive powers of mathematics for creative powers. Mathematics is a carefully designed language, and we are prone to expecting it to be magical because it can be hard to follow, thanks to the mystifying way it is taught.

I am not suggesting that any of us can just pick up a complex document from any science, whether from the orbit of physics or the more difficult areas of the biological sciences, and understand every last bit of what a given paper or book says. It is undeniable that if we are going to make sense of any language, law, or mathematics we have not learned already, we have to do some learning first. But the general principles are quite approachable, as indeed the many popularizations of Einstein's theories of relativity show. Or the practical observations of COVID-19 superspreader events, which can clearly only be fully explained by aerosol spread. Attempts to scare people into obedience to an authority because supposedly they are too dumb to understand should be a blaring alarm to all of us. By rights, such a play should lead to the prompt removal of the person who tries it from any position of possible authority, because it reveals that they are untrustworthy. (Top)

System Updates (2023-07-03)

Free Software Foundation badge, circa april 2022.

The very first computers I worked with were unix terminals attached to a mainframe that did all the real work, which remained a typical set up for another ten years. But when I started, it was still necessary to go and work on those terminals to test and debug code, although at least between my second year of university and my third, this could be managed live on the terminal, not via a cycle of trying to run the code, then going to pick up the print out downstairs to see if it worked. At that point even the instructors were bad for not explaining that it genuinely could be worse: we could have been stuck using punch cards, which were still the basis of the input and output for the computers used to manage the library. But by the next year, that was all magically gone, and suddenly we were supposed to use execrable machines running a version of networked windows. This was so bad that even the people who had no particular dislike of microsoft competed for time on the macintoshes and lingering commodore 64s. That is saying something. By the time I could buy my own computer, having been so lucky, I opted for an early apple laptop because it would run neither windows nor the version of DOS I had suffered with when working on IBM machines. It wasn't ideal, but at the time apple had not locked its software down so completely that mere personalization was difficult, and despite it being an early laptop, the case was openable and all parts replaceable. Now I know that even the motherboard could have been swapped, although it would have been too expensive for me then.

As time went on, I heard about GNU/linux, and missed the style of unix box I had used to most pleasing effect at university, but that had run on SUN and SPARC boxes, and no way in hell could I afford those. Thanks to good fortune job hunting I still got to use them sometimes, trying to catch up with what was happening on BSD, but beyond that, it was tricky. The only thing I knew for sure was that no way in hell would I willingly spend money to make Bill Gates richer, because every encounter I had with computers running windows or other software from microsoft was a miserable experience. That is how I developed my now ingrained "hit the keyboard shortcut to save every three sentences" reflex, and an unwanted ability to dig into activity monitors and task managers (graphical or command line interface) and identify memory leaks by certain tell-tale forms of bad software behaviour. After a point I began to wonder if the whole idea on windows was to deliberately make things work badly, as relatives bought windows boxes like everybody else, and soon began to regale me with complaints about how their machines kept running slower and slower, and all the issues with viruses they were suffering. This was still the age when apple chose good parts and put together computers that could be upgraded and repaired. So it still made more sense for me to watch and catch a good deal on a new machine when the marketing cycle turned over due to the newest machines being released, putting a heavy discount on any straggling older macs in inventory. Oh, and this was before there were corporate apple stores. It wasn't a perfect situation, but I did notice that overall the number of machines I was buying was nowhere near what my relatives were buying, or even my employers on a per employee basis. There was a price for this of course, in that I couldn't take security for granted, and had to make sure I understood the major risk factors, how to manage them, and when they weren't manageable anymore for a machine that would spend regular time on the internet.

To begin with, I was not too sure about the whole of Richard Stallman's arguments, but I appreciated them. If nothing else, they were important and useful to think with, and I was on side when it came to the question of being able to repair hardware and software rather than just accepting spyware and being forced to throw a sensible computer in the trash for no other reason than pisspoor software. On the other hand, I was glad to recommend and help implement a switch over to linux for friends who had old and ailing machines that were originally running windows. It made sense to me to keep those out of the trash if software was the only real issue. Maybe they couldn't go back on the internet, but I began my typing life trying to make electric typewriters do the right thing. Word processing software on a computer plus a decent laser printer is a damned technological miracle compared to the heinous electric typewriters I encountered in their official last days of office use. Those electric beasts made the two analog laptop typewriters I had found through great luck at the second hand shop even more badly missed. The analog ones were far easier to use: no electricity, so quieter and cooler, even taking into account that the margins had to be set by hand. But finally I couldn't get typewriter ribbons anymore. (Today such typewriters are making a comeback and ribbons are accessible again, go figure.)

Then it turned out that by just dumb luck (seriously) I managed to buy one of the last of the macbook pro line of laptops that was repairable, designed with an easy-remove panel to facilitate battery replacement, hard drive replacement, and RAM upgrading. That machine is still kicking to this day for those very reasons, and while it may not look as sleek as the newer apple laptops and desktops, it has more than amortized its original cost over the years it has been running. To be sure, it was way too expensive when I bought it for anyone who expects to replace their machine every year. But it was part of the last years of well-chosen hardware combined with accessible and repairable designs. Thankfully things have developed well enough in the free software world that running an alternate BSD or linux is no problem at all, except for the damnable touchpad. (I will figure out how to sort out the driver settings so that the wired mouse can stay at home. Oh yes, I have a couple of wired mice still, and are they ever worth having when troubleshooting balky machines, whatever OS they are running.)

Today, in light of the development of ever more pervasive spying and the reversion of the internet into the primarily military and spy oriented tool it was originally planned to be, I find myself moving ever closer to Richard Stallman's position on avoiding proprietary software as much as possible. There is one program I miss, because the company is keeping their code secret even though honestly, the software is so good they could let it loose while still maintaining a lower powered free version and a fully unlocked version that needs licensing. They have great documentation and support, with a real respect for the people who make the software part of their daily workflow. Different companies will make their own calls on that question, and it is clear that most of them are still disinclined to use GPL licensing. But this raises the question of what a company that sells software and/or hardware is hiding, if it is unwilling to at least make its code openly available, even while using a more restricted license for reuse. (Top)

Ideas That Hold Us Back (2023-06-26)

1878 depiction of the wrecked ship childwall hall from the illustrated london news via wikimedia commons.

This is another of those paradoxical times, when a few people desperate to maintain their ill-gotten gains and undeserved influence are flogging bafflegab as hard as they can in hopes of keeping the rest of us too distracted to hold them to account and put them back in their places. From the way things look in the previous historical examples I am aware of so far, a major item in the toolbox of such characters is pushing for some version of extremist idealism, where everything in the world somehow hinges directly on the individual human mind. Starting from that premise, they can insist that if we are experiencing difficulty in our lives, it really is an individual, personal issue. Our heads just aren't on straight, we have not taken appropriate steps to imagine our personal world right. It doesn't take too much effort to go from there to trying to police people's thoughts, whether by absurd abuses of police authority or obsessive attempts to create a uniform wall of propaganda. Meanwhile, the rest of us are supposed to miss that actually, the people whose minds supposedly make the world in such conceptualizations are the same people who would like the rest of us not to notice what sociopaths they are, nor for us to be other than atomized and unable to work together to put things right in the actual, material world. That many of us do get swept up in variants of idealist nonsense is not a fault as such; it is often a hijacking of a habit of faith, and since habits are not fully conscious, it can be hard to stop the hijacking. On top of that, it seems at least a little plausible if we don't look too closely, because ideas we come to believe in become powerful guides to behaviour. This only holds up as long as we don't stub our toes or run out of some necessity of life with no obvious means to replenish it. If we can be persuaded that we are in a cage, or must live like we are in a cage, regardless of anything our senses may tell us, then we will live like we are in a cage. This is at once an insanely powerful and intensely frightening human capacity.

Observing the past several decades, not just the past several years, I can only conclude that some ideas do indeed hold us back, even if they state positive or pleasing things that may be true in whole or in part. It's a bit mind-boggling stated this way, but most people have a good chuckle when it is rendered into the form of the "best of all possible worlds" doctrine. Gottfried Wilhelm Leibniz came up with an early european male philosopher version of the idea, and Voltaire made righteous and hilarious fun of him and it in Candide. Voltaire's point being, if you can see or experience terrible injustices and the like without recognizing something is wrong and wanting to stop them, you have to be delusional or willfully obtuse. Leibniz insisted that no, there is a god, god made the world, and whatever god made must be best, because nothing could be improved without making something else worse. Both men, like all manner of people then and since, were trying to explain the presence of evil in the world, whether people have free will (watch out for the definition of who counts as "people"), and the like. To be fair, Leibniz tried very hard to come up with a formulation that did not lend itself to self-delusion or quietism in the face of evil. Philosophers still argue about whether Leibniz's doctrine is right. For the purposes of this thoughtpiece though, it doesn't strictly matter whether we agree with a position like Leibniz's or not as such. Instead it matters whether we are directing our behaviour, not necessarily individually but on a societal level, as if we believed something like that, taking it in the direction of resisting any substantive change. Alas, there are such fairy stories abroad, and alas that they should be so dangerous.

Let's start with a couple of examples embedded in the scanty national mythology of the colonial settler state of canada. Many of the ones I have in mind are especially recent, all post world war two, corresponding with a new era of mass media messaging and defensive posturing in an effort to maintain cultural distinctiveness in the anglophone population from anglophone counterparts in the united states. Never mind that in both countries, which are huge, the anglophone populations are spread out, have different compositions and local histories, and therefore are all different within themselves and from each other. Where they share specific heritage such as language and place of origin of their ancestors, they will share some cultural features, and yes the broad outlines of the english language. But the top story in the modern canadian colonial settler mythology, this one long preceding world war two, is "we are not americans." The much later ones layered on top of that are the stories of canada as a "middle power," a "peacekeeper," and a "top country in the world, only getting better." Sounds great mostly, except that the first one is a bit obnoxious at best when held up as something of a not so subtle claim to moral or cultural superiority. In my experience the first one is usually waved around more as a way to avoid saying something positive and specific about what being "canadian" is.

But the trouble with all these is not even necessarily how true or not they are, or how annoyingly formulated they may be. The key issue is that they encourage complacency and denial about the actual history of canada, including its role as anything but a "peacekeeper," and certainly not "getting better" by sitting around refusing to take practical and just steps to rectify the exploitation and cruelty that is grinding more and more people into poverty and homelessness. Let alone not dealing with how to stop striving for the genocide of Indigenous nations who were already here and anybody else who is different from the "white anglophone christian male" ideal at the nasty centre of the colonial project. So far as I can tell, the vast majority of people living in what is currently called canada, whether they consider themselves canadian or not, are horrified that such things could be happening here and want those things to stop and for there to be permanent positive change. Until they get reminded of those other places that are worse off, places that seem worse simply because clickbait news headlines and news stories manipulated into ersatz mini action movies hide away complexity and the role of multinational corporations, including not a few canadian ones, in making them worse off. But the invidious comparison tugs at that nasty little rationalisation of "Well, maybe it doesn't matter so much. It's not like it's as bad as it is over there." The trouble is, even without the factor of having a nasty subset of canadian citizens engaged in making things worse "over there," it is ridiculous to refuse to improve things that are awry in canada because they are apparently more awry "over there." Such paralyzing thought loops are the most dangerous sort of stories.

This is exactly why in most societies most of the time, people are taught and encouraged to check out not just their own household and immediate family, but to consider their neighbours and whether they are okay. It reminds me of a Métis elder who commented at a cultural workshop that during the great depression, if a family was forced to take relief in their community, it was a disgrace. Many non-Indigenous readers would take that as "a disgrace to the family that took the relief." But no, the disgrace was to the community for not pulling together to help the family. That didn't mean the situation had to stay that way, or that the community deliberately hung a family out to dry. As it doesn't take much research to learn, the great depression was an extremely disrupted time, and impoverished communities struggled to maintain cohesion as the difficult conditions made it hard for people to keep in touch and share resources. The people who had the toughest time were least aware of what was happening outside of their immediate circle, and least inclined to try to oppose injustices that could only come to a nasty result. They were also the ones most likely to think that really, things were fine, not as bad as "over there," since after all they "weren't like those other people." (Top)

A Clarifying Video (2023-06-19)

Quote of Téa Smith's website snapshot, circa april 2022.

Some time ago, in an all-mighty binge of background podcast-playing while re-assembling and repairing furniture after a major house-move, I found myself going through some of the older episodes in Téa Smith's oeuvre. She is among the few early internet participant-developers still fighting the good fight, and she has kept on through the experience of being very online and its impacts, which many younger people today are having a hard time avoiding or coping with if they have landed there. Smith has recently completed a website redesign and update, and she is still working on her book. In december 2020, she posted a remarkable video that not only covered what it said on the tin "'Wokism': Definition and Diagnosis (and a likely rant about the culture wars.)" but also finally made clear to me why Socrates was an irresponsible jerk and ended up getting himself killed – regardless of the fact that being an irresponsible jerk is obviously not sufficient reason to execute anyone. But irresponsible jerkdom can get a person into huge and potentially fatal trouble all the same, and that is something too often forgotten. It is unfortunately not likely that Smith's discussion has gotten near the views or listens it should, and to the extent possible in my obscure corner of the web I'd like to signal boost it and her in general. The best way I can think of to do that is to provide my own idiosyncratic high level notes from it, which have in turn been at my elbow while reconsidering and thinking through both ancient and all too present-day puzzles. And as a powerful complement to this video as an introduction to Smith's work, I would recommend her recent essay, We need to talk about the gentrification of the left.

00:05:56 – wokism, a phenomenon, largely online, not an ideology... with real life consequences, related with fandoms, online culture wars, a consequence of how online tribes have been built, created religious thinking, phenomenon of marketing
- see missing history of big tobacco, "we are in the doubt business"

00:09:00 – [the big corporations] are in the business of making sure people aren't sure about things, then sell them things in the cracks... attack critics, buy research

00:09:51 – wokism is not critical theory, critical theory is not postmodernism; wokism is a distinct phenomenon from the perfect storm, scholarship is part of it, not a conspiracy or red scare

00:10:50 – postmodernism is not even really defined or well defined... critical of, rubs against, and critiques modernism

00:11:39 – deconstructionism seems to pour rocket fuel on crazy, which even postmodernists agree it does

00:13:00 – hypothesis, military industrial complex saw that deconstructionism is a convenient way to legitimize the doubt business

00:13:55 – deconstructionism is still niche within postmodernism, a small, batshit part of people who think everything is a social construct; not all postmodernists think this
- most postmodernists think you should not learn this stuff until you have learned modernism, foundations, learn that reality exists first, then learn that anything is a social construct
- Derrida and other deconstructionists come out of media, books, movies, generally media

00:15:23 – leads me to ask if it's Judith Butler's fault that doctors are listening to literature professors; I don't think it is

00:16:20 – start from what we know is true, big pharma gives doctors kickbacks, start paying for studies that sow doubt, in the doubt business, sow the idea that everything is social construct
- how to infiltrate science and cast doubt... encourage papers that sow doubt

00:17:50 – get the kids in, 40 year project, independent of the scholarship, u.s. colonialism and culture

00:19:00 – [wokism] comes out of fandoms, the missing link is fandoms, communities where children and pedos gather around something like Marvel, Potter universe
- linked to marketing, good marketers build cults, evil forces have entered in on top; not just wokism also alt-right stuff

00:21:05 – a subsection of fandom is fan fiction and deconstruction, the fans incessantly deconstruct the materials they are fans of
- large group of people who dedicate their time to this, worship, discussion, from a tribe, philosophy

00:22:20 – like a microcosmic department without the foundations needed before you go into the deconstruction stuff

00:23:30 – at some point these [fandom] things began to go off the rails because the corporations saw an opportunity to market to them, the dark forces came in

00:24:08 – there's a religious component to being a fan, at the core of this
- wokism is part psychological phenomenon based on fans, part political (squash the left, international relations), digital religion, digital cult, cultural tribes doing deconstruction without the necessary foundation
- marketing hard to make money without a large following and ads; viral marketing, fame lottery internet, fandoms, religious component to being a fan, super online, unsupervised kids evangelized to tech too early, deconstruction a norm in fandoms, calling everything as a social construct, seeing 50 shades lady get rich, JKR's hero's journey, thinking if I create this, build worlds, I might get famous, win the fame lottery
- using tools they are not ready to safely use; 25 year www, whole generation of teens that grew up on the internet not knowing that not everything is a social construct

00:29:00 – as soon as you switch universities from foundational stuff, modernism, theory, they've gone from studying the basics and building on that... a culture where everyone needs an undergraduate degree, almost secondary to experience
- creates gender studies, critical theory, no foundational stuff, with no basis; deconstruction is a weapon, will teach kids that everything is a social construct

00:31:20 – basic attachment theory, kids need to feel secure to explore the world themselves; tell a 6 year old girl that girls can be boys... you're fucking them up
- corps, pharma see an opportunity, a generation of kids with poor mental health, lack attachment to their parents, believe everything is a social construct

00:32:30 – can't fight this with liberalism, can't fight this with united states neoliberalism, have to fight with Marx, material socialism, Marxist or Chomskyist analysis
- doubt business destabilizing a generation of kids who grew up in the marvel universe and potterverse online

00:34:10 – cost is $100 000 and you're selling an experience to a child, not selling an education, they're the customer, demand for degrees goes up, parents want their kids to pass and not be pushed

00:35:20 – kids literally believe everything is a social construct because that is what they have been told because big corporations are in the doubt business, buying ads online

00:36:00 – corporations in the doubt business, they need research, so what they do, whole departments that they fund and are dedicated to deconstructionism

00:37:55 – pour jet fuel on a feminist backlash, see that it's by design; real postmodernists do consider class and build upon that, they're not in the intersectionality cult
- wokism came largely out of sociology and media departments, not politics
- even Butler, she's a lit professor; she's going to say stupid shit, she's not responsible for what they do with her stuff

00:42:00 – the digital angle is like 70% of this whole fucking thing

00:42:09 – wokism is a digital phenomenon, a digital cult that is largely fuelled online from fandoms, has got religious thinking and religious components to it which has been exploited by corporations; it is used as a tool to punish dissenters; its relationship with postmodernism is tenuous; the deconstruction aspect had jet fuel poured on it by fans, corporations, fame lottery

00:45:00 – privatized education and health care also part of the problem
- kids go to school to get a bullshit degree that just confirms their worldview, degrees in cinema studies or whatever, these things crowd into electives
- those kids are now in the workforce, they have never been told they suck

00:53:50 – when I talk about fandoms, I'm talking about online communities with a religious component to it, an obsessive component to it...

00:55:00 – end result is that a lot of these kids don't know what's real

00:59:00 – you can't fight this with the same old liberal bullshit, it has to be fought with Marxist analysis I think and Chomsky, here's how the world works, here's what's happening, here's the interests that are working to fuck with your head, here's what they do, they make you doubt yourself, they make you doubt things that are generally true... this is now generally in schools

Overall, it was surreal to reread these notes with things so much further gone, with apparently even far too many adults letting go of their grip on the real world. There's a lot to be overwhelmed by right now, but at this point I find myself mistrusting the general doom machine tone in all of the mainstream media. According to that decrepit and now beyond sold out mess, we're all doomed, the Earth is doomed, there is nothing we can do except buy, buy, buy, before it's all gone. Whatever "it" is. And that sounds just a little too useful for the scammers and thieves, and too destructive to the rest of us. (Top)

Paper Keeps Going (2023-06-12)

Varanasi paper bag maker in india, circa 2005. Shared by photographer Jorge Royan on wikimedia commons under creative commons attribution-share alike 3.0 unported license.

While rummaging happily around in the archives of no tech magazine, I happened onto this 2006 paper by William Powers, which I had to drag out of the internet archive: Hamlet's Blackberry: Why Paper is Eternal. It is part of a series of discussion papers emanating from the Joan Shorenstein centre on the press at harvard. As is to be expected, it is an essay starting from the wails of established newspapers, whose owners were annoyed that they couldn't sell nothing for something online any more than they could do it on paper. This led Powers into a morass of nonsense about the supposed primarily nostalgic valuation of paper as a general medium, because he had to get through the newspapers' attempts to advertise to this supposed nostalgia as a means to maintain and boost sales. Once he thrashed his way through all of that, he actually looked at the ways in which paper was brought to europe, and how its usage developed there, having run hurriedly away from the deeper traditions further east and north that those europeans who could started out by imitating. He does take note of how the lower cost of paper made it a more accessible medium and tool for the broader population, and even spends some time exploring the technology behind Hamlet's tablets, so that he can get back to the united states, which is what he is apparently most comfortable speaking about. This is something of a shame, because one of the best sections he wrote dealt with what today are the lesser known intellectual connections between many of the people who wrote for and even ran their own printing presses in the united states and their contemporaries in england and france – and that is only a start. My perspective is certainly reflective of my interest in history, but also of my wish to better understand the united states. Marketing hype hides more than it reveals, and makes bizarre implications, like somehow only the united states had a vigorous newspaper reading culture and only newspaper and magazine publishers care what happens to publishing on paper. Well, no. Understandably magazine and newspaper publishers are very interested in trying to maximize their profits from the ways they publish and distribute their wares, but that makes them a sideshow to the real questions about the future of paper. And actually, there were multiple vigorous cultures of newspaper reading which the united states' own regional cultures could and did overlap with. I did some minimal digging on the question, and there are definitely many promising research rabbit holes, among them many that get away from suggesting the united states has always been some sort of completely unified hayseed ignoring the rest of the world.

When it comes to digital publishing, Powers touches briefly on that vexing and uncomfortable question of dishonest changes after the fact. I don't feel too much sympathy with the people who like to make a fuss over Stalin's minions airbrushing out people he decided he didn't like, because they don't call out such, yes, dishonest behaviour wherever it occurs. It is hardly new either; we can find variants of it in the context of every form of publishing over time, including ancient inscriptions. That it is now far easier to carry out such dishonest changes on digital versions is a serious question that nobody has really grappled with, although at least the people at the internet archive have written on their blog about how they maintain paper copies, in part to be able to check the digital against the much harder to change paper version. Unlike the digital one, they can account for access to each item if they need to, and of course, books can withstand the power going out and a surprising amount of sheer havoc and disaster. This is not something to be taken for granted, but it means a lot. Still, important as this topic is, I don't think it is really about paper so much as about the specific obnoxious and dangerous habits of people who think they should be able to control what others think and know, and about how to effectively oppose them. (Hint: consistency does help. So do independent back ups!)

To me the real questions about paper are the ones Powers does finally make his way to, the way that each of us from day to day uses paper, and not just paper with something already printed on it, but blank paper. It is not true that paper and writing were invented because "human memory is weak." Untrained and undisciplined human memory is certainly weak, and in fact we humans have a penchant for developing and applying ways of training and disciplining our memories to keep track of what we care about. At the moment this is often not done formally, although if you have encountered a rabid baseball fan who can regale you for hours with the line ups of the team or teams they follow, you are likely sharing in the fruits of their independently reinvented method of loci based on positions on the field and dugout. The effectiveness of memory systems when consciously developed and maintained is quite extraordinary, and those looking for a not too technical introduction may try reading Lynne Kelly's books The Memory Code and the more recent Memory Craft. Paper can help with encoding and reinforcing already encoded information we may have in memory, and there is reasonable evidence for that in the past. Most of us of at least a certain age are familiar with how well paper can serve as a means to get ideas from their first glimmer to their ultimate expression as the most diverse things, from yes, books and articles, to clothes and buildings.

Paper is remarkably useful for lots of other tasks besides, as the paper bag maker whose image is featured above reminds us. But there is still more to paper as a tool and a medium. Its increased availability made it into something that anyone who could manage to get some, along with some sensible writing tools, could use to record their own thoughts and ideas, whether or not they wanted to keep them forever. It remains true that using ordinary pen, pencil, or whatever and paper is something we can still do profoundly for ourselves, free of snooping in our current period of obsessive spying by corporations and governments alike. There is a reason that even the cheapest notepaper and pencils have begun proliferating again, and a reason that people generally do not consider it a positive that so many children at the moment lack the dexterity and hand strength to use a pencil in the early grades. To be able to write and read on your own, without an electronic gadget intervening, is a genuine individual superpower that should not be surrendered without a fight, and children shouldn't be angled into surrendering it when they have no way to make a choice. Let alone the issue that so many children are not playing with toys where they may build hand-eye coordination and fine-scale hand dexterity in other ways.

The tougher question about paper, it seems to me, is not whether it will continue to be used and useful into the future: clearly it will. The tougher question is whether the sociopathic subset of boosters, who want to take away any key means of expressing independent thought and planning that they cannot exploit and pretend to control, will be allowed to remove access to it in future. (Top)

The Foyson is Already Missing (2023-06-05)

Author Patricia Monaghan, publicity still from her posthumous and still lovely website at patricia-monaghan.com, quoted for non-profit purposes only, april 2022.

The increasingly parlous state of access to nourishing food for the majority of the population almost anywhere we care to look today is much harder to deny than it used to be. For all the wailing and gnashing of teeth about what reporters repeatedly describe as "an epidemic of obesity" and/or diabetes supposedly primarily related to lack of physical activity, I get reminded again that this is a distraction from the real cause. Those are real conditions afflicting people, no question. But the actual cause is not individual in the way references to exercise and the like suggest. No, it really hit me some time ago, in the context of the early COVID-19 pandemic, that what is going on is not a pre-existing "epidemic" of those inflammatory conditions with COVID-19 merely adding its special dangers on top. The actual cause is massive levels of population-wide malnutrition, created and exacerbated by the soaring misproduction of heavily adulterated and otherwise manipulated foods on top of food that began as force-grown and fed plants and animals themselves deprived of adequate nutrients. All this before the impacts of global warming on nutrient availability and uptake to the animals and plants we eat have affected matters. To get decent nutrient levels, people today need money and enough time to be able to access real food and real ingredients so that they can make their own. They also need to avoid practically all anti-foods, from the execrable and gross-tasting fructose syrups to the strange constructions made out of tofu to masquerade as meat. Old-fashioned tofu curd and tempeh are fine, although based on my study of how to prepare these properly, "westerners" are prone to eating them in excess compared to how they are meant to be used. They can be affordable for those who know where to get them and how to cook them. I will leave aside the issues of learned and imposed helplessness and how they impact this for another thoughtpiece.

Over the past four years, I have been especially struck by the fact that even junk food – which I used to understand to refer to many things that are "treats" and therefore not something eaten frequently or in large amounts, so any dessert, candy, the infamous "burgers and fries," breaded fish or chicken fingers and so on – doesn't taste good and provides no physical satisfaction at all. Due to health issues, I have been forced to revert to an old-fashioned approach to these sorts of foods, going back to the way I had to eat them as a child: infrequently, as treats. Otherwise far less than pleasant mayhem ensues. There are not many such things I can indulge in even then, which is annoying, to put it mildly. So it is all the more bitter to eat them and not even get the sort of satisfaction that should come from eating food, even for things purporting to be meals, not desserts. It's not just about flavour anymore, but a matter of feeling like it was necessary to go away and have a real meal afterwards. This was so disturbing to experience that it has quite put me off even take-out dining, except for what at the moment are perhaps two examples. (One has been inaccessible to me due to COVID-19 restrictions.)

I have been struggling to explain to friends what was awry, while doing my best to make sure they realized my issue is not some kind of moral one in the immediate sense. I don't think there is anything wrong per se with eating something that is a treat, and for those who can eat more without discomfort or health impacts, I salute their remarkable fortune. But how disturbing, indeed frightening, would it be for someone to be eating the heavily processed stuff marketed as "convenience foods," in a position where other options are unavailable regardless of whether or not they wanted them, and to find that eating such stuff left them feeling full, but not like they had eaten. For people in that position, restaurants are utterly beside the point, they are struggling to get enough nutrients to do a sensible day's work, whether that consists of physical labour, a desk job, being a student, or whatever. Our whole bodies need the good stuff. Too much of the bad stuff, and we are bound to begin feeling not just physically poorly, but emotionally and mentally uneasy as the effects begin to compound and our bodies begin to protest. Who else remembers the brief flurry of articles about getting "hangry," when people grow short-tempered because they have not had a sensible lunch or even a snack at work? Finally, finally, I remembered a single word that neatly sums up the problem with the anti-foods, and now with our poorly nourished plant and animal kin, who can't feed us well if they don't get to eat well either.

The word is foyson, and I am quoting the late, great Patricia Monaghan to outline its meaning. Her incredible prose-poetry should not be missed by anyone who loves beautiful language – I am remiss and need to catch up on her poetry. The quotes below are from her 2003 book, The Red-Headed Girl From the Bog: The Landscape of Celtic Myth. Monaghan is one of the first authors I ever wrote fan mail to, because it struck me as utterly necessary to let her know how much I appreciated the book, even if in the end my missive landed in files she didn't get to see in the course of her busy writing, farming, and teaching practice. Wonderfully and all unexpectedly, she wrote back to me herself in what was clearly not a standardized message, which I still cherish to this day. (The Feminism and Religion blog posted a lovely eulogy to her on 28 november 2012.) That is the kind of generous person she was, and no doubt still is, since she went home in 2012. Her thoughtful and sensitive explanation of foyson comes in the context of discussing, with care, one of the ways in which the Irish spirit beings still often called fairies may interact with this world. Due to these paragraphs being extracted from the book, I will add one more bit of detail regarding her use of the adjective "gross" below. It isn't meant to express disgust in the way the word is often misused today, but to mark a contrast between our solid world and that of the Fairies, whom we might now compare to neutrinos. They may interact with us and this world more than neutrinos do, but only if they want to, and if we are lucky, we are properly prepared to meet them.

Despite these hesitations [to speak of it], I will tell you how something precious went "away." The cause of such disappearances is said to be the "fairy blast," an invisible, inaudible tornado that tears things from their moorings in time and place and sets them down far away in either dimension: here one moment, gone the next. Sometimes these kidnapped items reappear, hours or days or even years after their disappearance. Sometimes they never reappear, or they come back so far from where they disappeared that no one recognizes their return. According to the old lore, butter is a favourite if somewhat inexplicable target, so much so that Lady Gregory devoted an entire chapter to its protection... Butter itself is not the fairies' goal; they want its foyson – an archaic English word for "inherent vitality" or "nourishment" that remained in use in Ireland until recently – which they drain away, leaving the physical ghost of the butter behind. The Gentry steal milk too, making us spill it, then growing angry if we cry (hence the still-common proverb) as they lap its foyson away. Fairy food may tempt the taste buds, but it never satisfies as does our own gross food. [55]

But why should we fear to meet the sidhe? Because they will steal us from earth, and beautiful as fairy land is, there is a shadow upon it. However much one eats of fairy food, one is never full; however much one drinks of fairy mead, one is never drunk. Something – some foyson, some essential vitality – is missing in that perfect, shadowless world. Our world of growth and death, death and growth, interlaced and intertwined, holds something that fairies need, something so compelling they resort to theft and kidnapping to taste it. [65]

I sat with this a long while, thinking it over. Foyson is exactly the right word, and everything has it, not just food. How intensely greedy is it, that all these corporations are busy manipulating food with all their might to make it empty of nourishment, empty of foyson. So empty that there is nothing left for people, nor a thing left to share with the Fairies, who shouldn't be begrudged a bit of spilled milk, butter, or some time spent with objects they take away with them for awhile. I am not making any claim that the people working on these anti-foods are trying to keep anything from the Fairies. At the moment my greater concern is that they are evidently absolutely determined to keep foyson away from all plants and all animals – including us – unless somehow we can pay them. Or perhaps not. Perhaps the sick, glimmering goal is the temptation to absolute power over others, and whatever we pay is just a scorekeeping device. A committed capitalist I knew told me that money isn't anything but a way to keep score. He has never gone without or needed to worry about money.

Regardless of my views on that perspective on money, it remains an uncanny realization that so many "convenience" foods and drinks, heavily marketed to us every day, have to be marketed. They have to be, because their foyson is already missing, and given any choice at all, no one wants to eat or drink that stuff all the time. In fact, nobody wants to eat or drink this stuff at all. But not only are many people struggling to access food and drink that still has foyson in it, they are struggling against a systematic marketing machine that tries to make us terrified lest we ever go hungry for even a moment, as if we were infants so newborn we can't sit up yet. (Top)

Is It Necessary? (2023-05-29)

A sample screen of a computer running the uxn software ecosystem developed by the 100 Rabbits art collective, under Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) license.

Among my technical perambulations over the past couple of years has been completing the switchover to gnu/linux as my primary operating system. This has been partly a matter of ethics, as over time I have had to agree that the best future path for computer hardware and software is indeed the model proposed by free software and hardware advocates. "Free" of course meaning with source code and plans made freely available, and with the tools and materials available so that we may build our own, or at least repair our own, for as long as we are willing and able to do so. Contrary to the claims of those who pretend that no one would ever buy anything under such conditions, we know that what people will buy if they can choose will vary. The point is to put a stop to what amounts to treating the world as a resource to be strip-mined for the profit of a few until there is nothing left, with the final stages of this today being about stealing our personal data and striving to control what we are able to read, learn, and know. On a more day to day level, I dislike the ongoing attempts to make us all stupid and helpless by attempting to convince us we couldn't possibly understand or act on our own behalf when dealing with anything, let alone computers. I am also not remotely wealthy enough to run out and buy the supposedly latest and greatest thing, which will somehow become "obsolete" the moment corporations decide to stop maintaining software and security updates. And obviously, I am also interested in computers and programming, so this is not an unpleasant chore for me to build up skills in these areas.

The COVID-19 pandemic doesn't really have any silver linings to it. But many, many people have made the best of it that they could given their means, and those with a bit more have done remarkable work getting conference recordings, texts, and of course classes online. This has been of particular utility for anyone who has been working on exiting the big tech exploitation systems to the extent possible. For instance, the 2021 libreplanet free software conference has almost all of the talks and speeches available online, with video and audio-only versions, and many speakers also shared their slides. It was courtesy of this conference that I learned about the hundred rabbits art collective, and was reminded to reacquaint myself with low tech magazine, which I had not visited much lately. I was especially intrigued by hundred rabbits, a duo who live on a sailboat and pursue a career of art and software making, including software to make art, not just games. Their talk, software doldrums, is an excellent examination and critique of the way people working under corporate auspices and motivations develop and apply mainstream software and hardware. Right now, we can count on the "most up to date" stuff to be energy inefficient, deliberately designed to be impossible to repair or update, and rendered dependent on constant internet access to work. Worse yet, even smaller players with better intentions too often produce cheap hardware that cannot be repaired and has little to no durability, even if the software is great. This is insane, and I suspect it could end up pushing many people back to using offline tools and even pen and paper more, because they can't afford the stress or monetary cost of basically having access to their work held to ransom. Obviously two people living on a sailboat who don't primarily hop between well-serviced docks have specific issues to manage when it comes to internet access and power usage that many of us will not necessarily share. Their trenchant critiques of multitools and similar items that may do many things, but often not well, and are also not easily repairable, are hard to beat.

The hundred rabbits website includes an excellent reference link to low tech magazine's solar powered version, specifically to the page describing the reasons for having it and the technology used to run and maintain it. I must admit to not having thought through the implications, in terms of energy, of having a website dependent upon on-the-fly generation and many code calls to the server. Like many webmasters, I had thought of it more in terms of how much work it would be to code and maintain a site using such tools, and whether and how I would extract data from the resulting database if the software stopped working. People who are not webmasters seem to have been widely convinced that "static" is another modern dirty word, and that a "static website" must be mad, bad, and dangerous to visit. It's rather ironic, because it is actually the websites highly dependent on scripting for on-the-fly generation of pages that have these qualities, since they require careful maintenance to ensure they are not breached by people up to no good, or left helping spread malware and other trouble online. To date, those hoping to keep the lucrative "cell phone data plan" and the constant proliferation of cell phone towers with new "nG" networks going find it too convenient to support those websites full of tracking scripts, advertisements and so on, let alone the mostly execrable cell phone applications. I was priced out by the combination of data costs and the constant cell phone planned obsolescence churn years ago, let alone appalled by the tracking. Tracking, I should add, by the cell phone service provider. I tested the new fancy reminder application provided in the apple ios, and discovered that it reported all my use of the application as part of my usage dashboard with the cell phone service provider I had at the time. This was before the application was set up to synch with a counterpart on a desktop computer, so it should not have needed any network data usage at all, nor should it have been chatting with the cell phone service provider or apple.

A few of my friends habitually refer to The Moonspeaker as a blog, no doubt because of the regular thoughtpiece feature if nothing else. Thoughtpiece writing is definitely congruent with the original writing form of blogging, before it was professionalized and advertisement-ridden to death in its first incarnation. Against all odds, the independent personal website does seem to be making a comeback, and many writers are resisting the temptation to use a centralized pseudo-free service. There are surprisingly many ways to make a personal website public that cost little or no money, even though fewer and fewer internet service providers include a modicum of free hosting space with a service plan, and even universities have killed off the small amounts of space they used to provide to current students. No doubt they argued that this was about "security" and not wanting to support blogging software. Yet they never had to. Nobody does. There are other means to deal with content management when there are multiple contributors. For the rest of us, there is that old-fashioned tool, the webpage template. A template provides practical limits, and if well-designed is a great tool for learning html and page design.
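
To illustrate, here is a minimal sketch of such a template. The placeholder text and the stylesheet name are my own inventions rather than anything prescriptive; the point is that the whole page is visible and editable in any text editor, with nothing generated on the fly:

    <!DOCTYPE html>
    <html lang="en">
      <head>
        <meta charset="utf-8">
        <title>REPLACE: page title</title>
        <!-- one shared stylesheet keeps every page looking consistent -->
        <link rel="stylesheet" href="site.css">
      </head>
      <body>
        <h1 id="top">REPLACE: page heading</h1>
        <p>REPLACE: first paragraph of content.</p>
        <!-- each page can close with the same navigation convention -->
        <p><a href="#top">Top</a></p>
      </body>
    </html>

Copy the file, swap out the marked placeholders, and every page on the site keeps the same structure without a content management system in sight. (Top)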

Some Thoughts on Verbal Literature and Orature (2023-05-22)

The front cover of the Open Books Publishers 2013 published Oral Literature in the Digital Age: Archiving Orality and Connecting With Communities edited by Mark Turin, Claire Wheeler, and Eleanor Wilkerson. The book as a whole is under the creative commons attribution 3.0 license.

Some encounters definitely stick in the mind, and among those I have experienced is an unfortunate presentation where things got hijacked by a prominent audience member's insistence that neither "verbal literature" nor "orature" nor indeed "oral literature" make sense or exist. I must confess to being unsure quite what to make of such convenient literal-mindedness in a person otherwise deeply engaged in consideration of the application of metaphor in language as a means to describe new things. That person, at least, walked away from the talk feeling they had wasted their time listening to an analysis of elements of a much larger oral tradition and the means by which it has been maintained over remarkable time, distance, and severe negative impacts from the usual suspects: war, colonialism, and disaster. To those who claim that at least the first and second terms are "contradictions," I ask why they insist on this when the now almost marketed to death term "literacy" has at best minimal relationship to reading or writing when modified by such adjectives as "digital," "media," or "visual." We can find a suitably snobby lineage for the extension beyond reading and writing that goes back at least to the ancient romans, whose word "litteratura" did indeed mean in its basic sense "something made of letters" but already had the extended sense of "scholarly knowledge or erudition." A quick perusal of the updated major latin dictionary, which many still refer to as the "Lewis and Short," and a jaunt to check the OED on the word "literature" will provide further "european" credibility if a person insists. In any case, the key points are that the knowledge described as some type of "literature" and the skills deemed a type of "literacy" involve a type of systematic learning and sharing between people.

The editors of Oral Literature in the Digital Age write that "The term 'oral literature' broadly includes ritual texts, curative chants, epic poems, folk tales, creation stories, songs, myths, spells, legends, proverbs, riddles, tongue-twisters, recitations and historical narratives." This of course gives away a part of why a person might want to blow off the term. It remains less than acceptable to allow that the types of material the editors reference could actually preserve any real knowledge. As if there could be real content in such tongue twisters as "Peter Piper picked a peck of pickled peppers" or the endless series of knock-knock jokes. To be sure, sometimes the knowledge encoded in certain types of oral literature may be rather indirect in nature. Tongue twisters tell us something about the sorts of verbal games english-speakers have developed, and knock-knock jokes are part of the whole sequence by which anglophones learn to make puns and eventually compose and apply similes and metaphors. It is too easy to derogate material that is part of the immersive cultural environment, where ethnocentrism is as much of an issue as a lack of respect for the needs and development of children.

Certainly in many "western" mainly anglophone countries at least, the colonial project to drive out oral tradition, including the complex rules for its transmission, which vary based on the type of information and who holds it, has managed an almost complete sweep. At this point even the practice of sharing and reading old "nursery rhymes" with small children as they learn to read is almost lost. I suspect many of my younger compatriots are more familiar with the carefully deracinated works by "Dr. Seuss" than with the materials swept together under the monicker "Mother Goose," which are often uninterpretable to children and astonishing to the adults they later become. The latter provided much inspiration for an earlier version of the internet search rabbit hole, one facilitated by those now almost vanished dictionaries, encyclopedias, and adults with time to answer children's questions.

I am not kidding about the rules. Some of them are outwardly simple, but permit astonishingly complex applications and outcomes. Take the "oral formulaic theory" now widely accepted as the likely means by which ancient greek bards remembered, then composed during performance, the songs making up the surviving ancient greek epics, for example. But there are much better examples, including the complex methods studied and documented by Lynne Kelly in The Memory Code and Knowledge and Power in Prehistoric Societies, among other books and many articles. She is also founder of The Orality Centre (this is a web archive link, as they have been having some server trouble), which supports continuing study and instruction in the many mnemonic techniques used around the world. We may all have heard or learned some version of the "memory palace," but there are many, many more. Among the principles these methods and technologies share is the creation of a regular sequence of, yes, places, or of a structured object. Each place, or part of the object, is systematically encoded with what the person or community wishes to remember, and the information may be systematically reinforced and recited by going through the places or parts in their due order. A person need not literally go through them all every time they want to recount one episode from a history, or one song in a sequence used in a particular ceremony, or to refresh their memory of part of a pattern or technique they are applying. Those of us still able and willing to visit libraries or other places in person, where we collect or select items to borrow or purchase, apply the innate spatial memory sense all humans have to navigate them. It is not uncommon at all for friends of mine to reassure me they can find a book again by its colour, shape, and where it is in the local main library branch as long as it is not out already, even if they can't remember the title or author, just the subject and why they believe I should read it. And it is no mere reassurance, they can and have. If you have ever been thoroughly annoyed by the way products are shuffled around in department stores in hopes of persuading people to buy something that isn't selling well, or by the "cap displays" that get in the way in grocery stores, part of your annoyance is how these mess with your memory map of the store.
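
For the programming inclined, the skeleton shared by these systems is almost embarrassingly simple to model. Here is a bare-bones sketch in javascript, with the journey and the encoded items entirely invented for illustration, showing both the full recitation in due order and the jump straight to a single episode:

    // A journey: an ordered sequence of familiar places, each encoded
    // with one piece of what is to be remembered. The fixed order is
    // what makes systematic reinforcement and recitation possible.
    const journey = [
      { place: "front gate",     item: "first verse of the song" },
      { place: "mailbox",        item: "second verse of the song" },
      { place: "old apple tree", item: "chorus" },
      { place: "back door",      item: "final verse" },
    ];

    // Reciting in due order: walk the whole sequence of places.
    for (const stop of journey) {
      console.log(`At the ${stop.place}, recall: ${stop.item}`);
    }

    // Or go straight to one episode without walking the rest,
    // because the places themselves index the material.
    const chorus = journey.find(stop => stop.place === "old apple tree");
    console.log(chorus.item);

Of course, the entire point of the real techniques is that the "data structure" lives in trained human memory and in the body moving through real or imagined places, not in a computer.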

When used to create, maintain, and share information as in oral literature, or orature, which I admit to liking better as a term, memory is a full body experience and people must practice. The way most of us can still vividly remember and sing or recite songs, poetry, and stories we learned as children is evidence of this. As children, we repeated these over and over again because we enjoyed them, or had to for class, probably at times to the utter distraction of our parents and other adults in our lives. Without the practice we are prone to getting "rusty," but especially if the material is associated with something that provides rhythm, we can bring the whole thing back to mind again. Repeated, systematic visits to a sequence of places are of course a much more externalized means of creating a bodily experienced rhythm. All of which leads me to wonder if part of the apparent and terrifying inability to remember independently of outside sources so common today is related not just to lack of practice and taking for granted the existence of books and computers, but also to using no more than perhaps the eyes or the ears to learn. This isn't just about being a bit embarrassed at getting lost because the latest "smart phone" has quit working or its map application is inaccurate. I am thinking of such examples as relatives who gravely assured me that a particular political party was directly responsible for the economic and social havoc where we live – a party which had literally never been in power or in a position to successfully advocate for legislation, nor the sort of party that has a violent wing. (Top)

Javascript Is a Big Problem (2023-05-15)

Random javascript logo, march 2022.

I remember the early days of javascript, from the days when it was conflated with java for marketing purposes, which caused a great deal of difficulty and led to the ECMAScript standard, though not to a renaming that ever took. It is in principle a clever idea to run bits of useful code in the website visitor's browser rather than adding to the remote server load. As originally flogged, it was about personalization, and in that case the tie made sense: the scripts were supposed to do things specific to a given visitor. The stuff for everyone who visited, no matter what, would still be handled by server-side scripts. Since so much personalization was done with the now infamous and much abused cookies set with javascript, it seemed like something of a win-win. Except that it wasn't, not from the start. The cover story was that javascript was great for personalization, but the underlying motivation was to move the processing load for doing fancy stuff on webpages from the server onto the client computer. Corporations especially wanted this, above all media corporations wanting to push as much of the expense of serving audio and video onto their visitors and customers as possible. The trap for coders was that javascript could be used to do clever and fancy things to make websites look tech savvy, as well as take over useful tasks that could sometimes be more difficult to code using server-side scripting. Server-side scripting of course also keeps in front of developers' faces the necessity of keeping the software patched and up to date, and the scripts ruthlessly debugged and tested. Demand for quick profits and data mining drove even greater interest in javascript. It is possible to write lots of "get the job done" but very buggy and very insecure javascript very fast. Or nowadays, a person can just pull in a pre-existing javascript library they may or may not know very much about.
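To make that original division of labour concrete, here is a minimal sketch of the sort of client-side "personalization" script that set the pattern. The cookie name, the prompt, and the "greeting" element are all invented for illustration, not drawn from any particular site.

    // A toy personalization script of the kind flogged in the early days.
    // Everything runs in the visitor's browser; the server does no work.
    // The cookie name and the "greeting" element id are hypothetical.
    function readCookie(name) {
      // document.cookie is one string of "name=value" pairs joined by "; "
      const pair = document.cookie
        .split("; ")
        .find((entry) => entry.startsWith(name + "="));
      return pair ? decodeURIComponent(pair.split("=")[1]) : null;
    }

    let visitorName = readCookie("visitor_name");
    if (visitorName === null) {
      visitorName = prompt("What should we call you?") || "friend";
      // Stored for a year on the visitor's machine, at their expense,
      // not the server's: the whole point of the exercise.
      document.cookie =
        "visitor_name=" + encodeURIComponent(visitorName) + "; max-age=31536000";
    }
    document.getElementById("greeting").textContent =
      "Welcome back, " + visitorName;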

Unfortunately, this is a recipe for a horrible mess on the web. Today the web is all but unnavigable and unusable with no limits on which javascripts run, and that isn't just because of how much javascript is used to insert the growing advertising blight. The scandal of webpages bloated by giant images that nobody could be bothered to optimize properly is well known, as is the menace of navigation that is useless without javascript enabled, and often even after it has been enabled. Too few people are familiar with how to view the source code to work around these issues when they can't just abandon the website. Yet there doesn't seem to be nearly as much heat and light on the issue of the hundreds and sometimes up to three thousand lines of code inserted into webpages on some sites in order to bring in all the javascripts meant to gather all sorts of data, plant trackers on the visitor's computer, and of course add in the wretched and ubiquitous advertisements and other propaganda bits. All those references and links mean the website is loading all that javascript into the client-side computer's memory via the web browser, unless of course the person using the computer is using an extension like noscript or has turned javascript off altogether. Or maybe they are using a text-only browser like lynx, which is still going strong.
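For anyone who wants to see the scale of the problem on a given page for themselves, a few lines pasted into the browser's developer console will tally the scripts it pulled in. This is just a quick diagnostic sketch; grouping by serving domain is my own suggestion for making the trackers and advertising networks stand out.

    // Tally the script elements a page has loaded. Run from the browser's
    // developer console on any page; purely a read-only diagnostic.
    const scripts = Array.from(document.querySelectorAll("script"));
    const external = scripts.filter((s) => s.src);
    console.log(
      scripts.length + " script elements, " +
      external.length + " loaded from elsewhere"
    );

    // Group the external scripts by the domain serving them, which makes
    // the advertising and tracking networks stand out from the site's own
    // code at a glance.
    const byDomain = {};
    for (const s of external) {
      const domain = new URL(s.src).hostname;
      byDomain[domain] = (byDomain[domain] || 0) + 1;
    }
    console.table(byDomain);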

Of course, there are plenty of people who will claim that with javascript turned off "the internet" is broken. Far from it. Most of the time I don't allow any javascript to run when browsing, and for the most part find the web a much faster and more pleasant experience as a result. There are certainly sites that by design do not work at all without javascript these days, "social media" being the most infamous example. Almost any site with any sort of sign in or forms depends on javascript to validate the data entered before sending it to the server, and won't work without javascript on. If a person is not inclined to use rss feeds for audio and video channels they are following, they will likely never find the data links that would allow them to view or listen without turning javascript on. With so many people apparently interacting with only the present-day equivalent of what we used to colloquially refer to as "aohell," and then with a focus on material that depends upon insertion of video and audio clips for interest, they are certainly channeled primarily onto the sort of sites and pages that are broken by design with javascript off. That is, they are being coolly shaken down for as much personal data as can be wrung from them with clever scripting and inappropriate access to the web browser innards via those clever scripts. And these days, we don't even get the distant consolation prize of applying userchrome adjustments, which mozilla, formerly the leader on this front, is choking away as the parasitizing effect of google money continues expanding.
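Worth emphasizing: depending on javascript for form validation is a design choice, not a necessity. Here is a sketch of the more respectful pattern, with the form id and field name invented for illustration. Because the script only intercepts a submission it can tell is bad, the form still posts to the server normally with javascript off, and the server remains the real gatekeeper either way.

    // Client-side validation as a courtesy, not a requirement. If this
    // script never loads, the form submits as ordinary HTML and the server
    // validates. The "signup" form id and "email" field are hypothetical.
    document.getElementById("signup").addEventListener("submit", (event) => {
      const email = event.target.elements.email.value.trim();
      // Deliberately loose pattern: just enough to catch obvious typos
      // early and save the visitor a round trip to the server.
      if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(email)) {
        event.preventDefault(); // block only known-bad input
        alert("Please enter a valid email address.");
      }
      // Otherwise do nothing, letting the browser perform the normal POST,
      // exactly as it would if this script had never loaded.
    });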

Merely dumping javascript would not solve the many problems with spying and abuse of client side computers via the web and other aspects of the internet. However, it may ultimately turn out to be necessary at least as a first emergency extreme measure to regain some level of sanity in these areas. That said, I doubt that the various corporations busy manipulating web standards will do anything to curb javascript abuse, since they are the worst offenders and supporters of smaller scale offenders. Instead, the rest of us are going to have to do things like stick to browsers allowing management of javascript permissions on a per-website and per-domain basis. There do not seem to be many options of this type at all for people who do most of their web browsing on phones or other handheld devices though, so they are either stuck or else need a means to block javascript and advertisements via their home router, a vpn, or something like a freedombox. (Top)

What Do You Mean "We"? (2023-05-08)

Low resolution scan of a panel most likely from the late 1940s Lone Ranger comic book, cited for illustration only. There is a far better political cartoon riff on this image that is stubbornly evading search engines at the moment.

Humour aside, at this point I am seriously wondering what it will take for populations at large in so-called "western democracies" to finally stand up and firmly reject the force teaming rampant in them. It is grotesque to me to hear and read people talking about how "we decided to bomb iraq" or "we put sanctions on china" or "we took austerity measures to save the economy." Leaving aside for the moment the other major bullshit claim in the last example, these are all ridiculous. "We" did no such thing in any case. At no time were any of these steps actually put up for a meaningful vote or canvass to gauge whether the majority in any "western democracy" was in favour of taking them, if such steps are among those "taken" by that country. There have been no referenda, no plebiscites, no well designed or honest opinion surveys, no serious respect given to serious protests and practical proposals for policies that would be more effective and less harmful to people and the land, let alone less expensive. Instead we have elections with growing legitimacy problems, from which the winning party or coalition goes on to carry out various legislative and policy actions as if it had been given a blank cheque. After all, they are quite certain that they are not accountable to the broader voting population at all, and rather than develop legislation themselves or engage with actual citizen proposals, they wait for their donors and lobbyists to bring them pre-written laws whenever they can. Failing that, they give directions to the public servants in the requisite department based on what donors and lobbyists tell them, primarily donors of course, because the reality is that money supersedes the vote. But this all hinges on enough people continuing to go along and give this mess a veneer of legitimacy.

Veneers are by nature thin and rather fragile, and they require constant intervention to keep them in place and plausible seeming. Force teaming is a clumsy tool used more and more often in more and more flagrant ways. Criminalizing dissent is the big one of course, whether by conveniently redefining "treason" or by creating "mandates" to force people to undergo a treatment or other change that they would not have chosen. Then there are the equally crude elements of propaganda, with its major themes of scapegoating, outright lies about how much influence individuals really have on public policy beyond the municipal level (using an optimistic measure), and the intelligence-blocking nonsense referred to as "patriotism." The last is probably the worst, because that is the way people are most often convinced to treat whatever heinous or inappropriate policy comes along as something "we" did rather than something a set of oligarchs actually imposed upon the rest of us. I actually do think that it is possible to be proud of a community or even a nation without the accoutrements of chauvinism, racism, and insistence that "our country can do no wrong." But at present there is basically no sign of this mode of civic or national pride, and much more of people doing bizarre things like trying to force strangers to go along with practices only introduced during world war ii, like singing national anthems before sports events. People may like and prefer to keep such practices, thinking they are harmless. I used to naively think that myself, until I learned more about their close connections to militarism and racist bullying. In real life, not theoretically.

The main reason I have been told for having the unaccountable, oligarchic systems of governance that we do in most places is "efficiency." Supposedly it would just take too long to get anything done if politicians had to actually canvass their constituents and follow through on what they learned or lose their seats, leading to a by-election or perhaps a general election depending on the level of government and who proposed the policy. This has always struck me as a very strange argument. After all, it is not even remotely news that the climate is changing, yet meaningful policy change from governments to minimize or properly mitigate the human elements of what is causing the change has stood still for decades. Or take the issue of people whose homes are in areas long known to be prone to disasters, where the policies and laws have been left unchanged, so people end up suffering these disasters again and again because they did not have proper information or options to avoid living there. The great depression led to mass poverty and misery, but somehow that went on for the better part of twenty years, and nothing much was done to improve matters until finally its impacts began to impinge on the profits and comforts of the oligarchs. Even on the relatively small scale of a municipality, if the wealthiest local folks decide they aren't interested in supporting or carrying out an important policy like providing and maintaining decent public housing, action grinds to a halt. Any pre-existing public housing is allowed to run down into slums, and whole campaigns go out claiming that low income people ruin property values by their mere existence.

Practically speaking, "we" know what the point of the various modes of force teaming combined with oligarchic and unaccountable governance is for. It's meant to convince us, if we are victims of bad policy that we are to blame, that if we are lucky and benefit by the policy that we are somehow part of the oligarchy no matter how absurd such a belief is, or if the policy passes us by still, somehow "we" chose it by merely living where we live. The last might cut more ice if people could move around as easily as today's version of capital in the form of digits in computer databases. If people could easily vote with their feet, then indeed, it could be claimed that they are giving at least passive acquiescence to what is happening where they live. The only people who can do that sort of footloose voting however, are the oligarchs who spend their lives with little awareness of the real world because they have looted enough money to pay servants to keep it away from them. (Top)

Undeserved Praise (2023-05-01)

Another fascinating image result for search terms related to the thoughtpiece title. This picture is a scan from the internet archive repository, and derives originally from a 1901 book on Shakespeare.

Probably it is too much to hope, but maybe one of the few good outcomes of the recent excess of fundamentalist non-think will be a broader understanding of how dangerous undeserved praise is. By danger I don't just mean the problem with telling another person a lie, which can have the personal impact of making it difficult for others to take the teller seriously. That is an issue, but since undeserved praise is thrown around so widely, the very notion that praise is meaningful in the first place takes a serious hit. There is also nothing good about praising a person or organization for merely doing enough not to be impossible to live with. Wildly praising a man as "stunning and brave" for parading around in public in his fetish gear is not doing him any favours, because besides egging on his mental illness, it encourages him to seek stronger thrills. Praising him gives him permission to see what else he can get away with; it doesn't encourage him to improve. Praising exxon-mobil for not having a recent tanker crash that blights the majority of the alaskan and northern british columbia coastline is not deserved either. That is the very least level of practical and social responsibility that should be required for such a company to be allowed to exist at all, leaving aside for the moment the issue of hydrocarbon fuel usage. Practically speaking, this indicates that praise has an ethical component to it. It's powerful stuff, and therefore potentially dangerous if given carelessly or deliberately abused.

Some examples of undeserved praise are easy to pick up on right away. Consider how an adult would receive being praised for brushing their teeth and combing their hair properly. From that example, which usually no one would refer to as praise, we see immediately why it isn't and what is wrong with it. It is generally insulting to suggest to the adult in question that they are unable to carry out basic personal hygiene practices. In circumstances where an adult requires assistance to complete these personal tasks, the vast majority of us get that praising them or their assistant would be just as unacceptable. Such supposed "praise" would be at best thoughtless, but more likely a manifestation of assholery. This is so obviously a socially and humanly unacceptable way of abusing praise that most people would never think to refer to it as praise at all, let alone as undeserved. Quite understandably they would go straight to declaring the statement no more and no less than an insult that should be apologized for at once and not repeated.

Other examples of undeserved praise are widely understood as such, but people don't like to admit to ever taking part in giving or receiving them. In this category are the many obsequious touches that feature in comedy sketches making fun of the way "upper class twits" imagine they should be perceived and treated. Unfortunately, in real life such treatment tends to be a response to outright bullying by those getting the undeserved praise. Most infamously, the real life "upper class twits" are able to pay for the praise, or otherwise force people to praise them or lose funds, a job, or whatever. This is the stick that big donors like to use when they decide it is time to pretty up their public image by supporting some selected charities. The way people who have acquired significant social and political power by engrossing more resources than others get praised is particularly dangerous, because the undeserved praise helps them to hide their wrongdoing or delay action to stop their depredations. I think there may well be a strong argument that in such conditions the "rich" person is often manifesting sociopathology, and that undeserved praise feeds their mental illness and encourages them to keep pushing their luck. This is unethical and dangerous to everyone concerned, even the rich sociopath or the rich not quite sociopathic but at least reality challenged. It is always better not to contribute to their difficulties with reality.

Most of the time though, we aren't dealing with such extremes at all. Instead, we are striving to find the right balance so that praise we give helps support and encourage healthy social relationships and development in the persons or organizations we praise. When a person is learning something new, we want to support them in learning. Praising them as they meet goals in the process of learning makes good sense, but if we praise excessively we can accidentally convince them they have no more to learn so they leave off. Not letting them know when they are ready to take on a new challenge is also not helpful. This can be extended to organizations with care: we can reward organizations for what they do that demonstrates an improvement that is not just for show, for example. Staff at an organization who figure out they can make a good public relations splash by making some noise and putting up some posters but without making real changes otherwise will keep doing exactly that. If praise can be won just for making a short term appearance without making a real change, both organizations and persons will do just that. It's a way to gain attention and sympathy without doing anything demanding actual commitment and mature action, and also to avoid any sort of critique or change.

This is the nub of what makes undeserved praise so dangerous, even in the forms apparently least insulting to the recipient or the giver. As a means of avoiding challenge and change, undeserved praise is all too effective. But that also makes it crippling, because the real world isn't one in which we will never have to face up to things going wrong, our own mistakes, or the need to develop and change to continue living in a healthy and sensible way. It is completely reasonable to strive to avoid insulting people or speaking and acting in a way that discourages them from developing and acting in the world in a healthy way. But the urge to avoid challenging them, however inspired, is not a healthy one. There is no harm and a great deal of help in seeking to praise real achievements, but picking out something as an "achievement" which is anything but is harmful. It teaches the person to be brittle and immature, whether that is expressed as bullying or whining. Both behaviours indicate something has gone wrong, and if they become entrenched, that serious trouble is ahead. (Top)

The Trouble With "Born This Way" Arguments (2023-04-24)

Illustration from the book Los mexicanos pintados por si mismos, by Andrés Campillo, published in 1854 via oldbookillustrations.com. There must be a story behind the upside down picture within the picture!

One of the saddest and frankly most irresponsible pretend arguments for not subjecting homosexuals to social ostracism, castration, or outright execution is the bleat, "we can't help it, we were born this way." It's sad because how pathetic can a person be, to claim they are utterly unable to control how they act, as opposed to making claims about their feelings, which yes, cannot be controlled unless we resort to psychoactive drugs. This is an extremely important distinction, because as it happens I do think it likely that people are born with a predisposition towards homosexuality or heterosexuality, but whether they act on it and how depends significantly on their social circumstances. This is what can be ascertained from the historical record. Now, what makes resorting to "born this way" arguments irresponsible is the way they are manipulated by people with sociopathic and psychopathic leanings. The same people who want to manipulate society into pitying them because they are rapists who will do absolutely anything, from trafficking to rape in front of a camera to make pornography. The same people who are absolutely desperate to conflate homosexuals in the public mind with rapists and pimps, because they thrive on taking advantage of the socially isolated, and enforcing such a conflation will socially isolate anyone merely perceived as homosexual. There is no point trying to soft pedal this. The reason such people are so interested in computer generated pornography is because they see it as the best gateway drug to pornography made abusing real people, and as a means to push up the price of abusing real people because of the legal risks involved. To see whether this is true for yourself, just spend a little time tracing the cross links and writings of the current woke darlings of the alphabet soup brigade that has hijacked and is striving to completely destroy the successful work of the women's, lesbian, and gay rights movements.

I am going to belabour this point because it seems to be missed by far too many people, including so many who should know better. It makes no damned difference whether you were "born this way" and so have certain feelings or not; what matters is what you do based on your feelings.

On the question of sexual orientation, the first step in sorting out whether your feelings can be harmlessly acted on is to consider questions of who and how. If your orientation is toward people of your own age, as in the number of years you have been alive, not this lying "identify as" claptrap, then you are already in a sensible ethical position. There are inherent power relations between adults and children, or between any person in a position of trust and those whose actions they may be able to control via the authority that position gives them. Sexual relationships between people in either of these types of relationships are wrong, period. No one can meaningfully "consent" when they are in a relationship, however brief it may be, where another person is able to force them to behave against their own wishes via a threat. When we hear about a person being held at gunpoint and told to hand over their wallet, we generally have no issue figuring out that the person handing over their wallet did not actually "consent." We can even figure this out when it is a religious authority using threats of hellfire and damnation to wring cash out of their followers. Sex is not an exception. That leads right into the "how" consideration. If you would not be willing to force a person to interact with you sexually, including when they are of similar age, then your ethical position is sensible and I dare say good. It is good, though I agree not necessarily easy, to act on a respect and care for others that leads you to refuse to knowingly use coercion in intimate relationships, or any relationships for that matter.

Frankly, if a person is unwilling to refuse to coerce others into sex, then they have no business running loose in society; they should be in prison. This is already what is done, inconsistently, to those who are genuinely unable to control themselves, as in the case of some types of severe mental illness or other cognitive impairment. I wholeheartedly agree that this is not "nice" or "pleasant" or "kind" as those terms are currently defined. However, if we think more carefully and don't shy away from the truth, we will appreciate that there is nothing nice, pleasant, or kind in encouraging people to lie to themselves when they can actually choose to be better people. There is nothing nice, or pleasant, or kind in trying to allow the dangerously mentally impaired to run loose in society. If we could somehow magic away their impairments, do we seriously imagine they would feel no remorse at their actions, or no anger at the rest of us who were unimpaired for doing nothing? Even if they felt no remorse because they were not in control of themselves – and interestingly enough, healthy humans are apparently quite bad at feeling no remorse, or religiously driven abuse would not work – that would not keep them from feeling anger at having been left to wreak havoc, even if only for the selfish reason that they don't like people flinching at the sight of them.

To return to the main point, strong arguments are based on being able to say and show that you are able to act as a responsible human being in the world, and that you are able to accurately assess whether you are about to behave in a way that would hurt someone while acting in a way that the majority do not. I think that the various people who try out the "born this way" weak argument are hyper-aware of this, because they are often the same people who demand particular performances of "normality." In the case of lesbians and gays, they want to see mimicry of heterosexuality, preferably by remaking themselves into ersatz "straights" by taking drugs and having surgery to create an appearance of opposite sex secondary characteristics. A lower key demand has been for "gay marriage" and making sure to conform to sex role stereotyped dress, behaviour, and jobs while in a permanent monogamous relationship. The easiest proof of this is to check the depiction of lesbians in mass media and note who is missing, unless shown as a joke or as a way to suggest that in real life they had better get on with dosing with testosterone and surgery. These are the sorts of demands that a weak and irresponsible argument leaves a person open to, because by hanging everything on "born this way" they have already declared themselves unable to function as active adults in the world without special indulgences. Indulgences only come with a price. A corroding and coercive price. The "born this way" argument for acting in a particular way is a trap. Don't fall in it. (Top)

Fed Up to the Back Teeth (2023-04-17)

View of the mouth of the thermodon river in turkey, taken in august 2021 by Gerhard Pöllauer, founder of the amazon research network website.

I originally planned an entirely different thoughtpiece for this week, but another of those times when real life intervenes has arrived. To wit, I have finally gotten more than fed up to the back teeth with the now widely used online practice of telling others what to think about what another person or organization says, does, or publishes. The people who engage in this type of explaining treat those others as if they must be stupid, since those others have been looking at, reading, or listening to a source the explainers do not approve of. I am at the point of wanting to throw things when these jackasses show up in a conversation. The likelihood that I am in the minority on this is small, and it is just one more damned thing wrecking the web as a means to take part in discussions. My perception of this is not based on "social media," which I never joined, at first by accident, later by choice for reasons that actually had little to do with what we have all learnt about its pitfalls. Rather, this passive aggressive way of supposedly being helpful on the internet has now done serious damage to the few version 2.0 bulletin boards I visit regularly. They are miserable places to even browse through now, exacerbated by the politicization of pandemic policy, and now the stupidity of warmongering. Both of these phenomena involve a great deal of dehumanization of a selected group, demands for performances showing allegiance to supposed "right think," and an expansion of the specific tactic that has finally overrun my patience altogether. Worst of all, this practice is now creeping steadily into firmspace as well.

In very simple terms, it involves referring to a title or tweet or to some publication that the original commenter assumes the audience agrees is in some sort of unacceptable category. "Right wing" or "anti-vax" are the accusations of choice, the latter pretty non-partisan. Accusing someone of being on the baddie of the hour's side if they don't jump up and bray with the pack screaming for war and collective punishment in response to what the baddie has done is never deemed something "only the other side" would do. So the most recent example online I saw was a person who decided it was a good idea to sneer at everyone on the message board that another post on the babylon bee shows what the writers there really think of "you." Another day it'll be a less than gracious article published by unherd, or perhaps quillette. So far nobody tries this with any publication or commenter with a "left wing" reputation remotely as often. These are the same people who will also complain about "cancel culture" and "the need to respect free speech" and how "terrible polarization is." The real life instance was in an otherwise anodyne office conversation, where etiquette generally means we don't discuss politics or religion (I am glad to follow that etiquette!), with someone sounding off about how "anti-vaxers" should be force-dosed or denied medical treatment. It is one thing to give an example with a comment such as, "Here is why I don't think it is worth supporting this writer/speaker/publication." It is something quite else to engage in trolling, or outright dehumanization. Even if no one responds – and in the case of dehumanization we should respond, versus trolling, when we shouldn't – the damage to the overall atmosphere is done. But in the end, it isn't even the more common trolling aspect and its attendant virtual version of shitting in the middle of the shared space that bugs me the most.

I know, hard to believe, right? But I am serious. What bugs me the most about it is that even the trolliest of the trolls who use this tactic are onto a real issue, and nobody seems to want to face up to it at all.

Practically speaking, it is not possible to cope with the real world by only engaging with ideas and arguments that we agree with. If we don't experience other people disagreeing with us, preferably on age-appropriate topics as we mature, then when we finally do, chances are our responses will be some version of a temper tantrum and flounce, or a complete personal crisis as we discover that the world does not bend to our wishes. Later in life our first real encounters with adult conflicts of this kind can be quite comical to revisit, so long as we had good training when we were younger of course. I am fond of a video blog by a woman who refers to these encounters as "baby adult" experiences, and this is a really good label. Our brains are still developing into our mid-twenties, and so until then we still sometimes respond with the expectation that everything is simple and black and white. That's when we are most prone to experiencing the shock of learning that no, following "the rules" does not guarantee success, and this was indeed something teachers and school guidance counsellors did not tell the truth about. When I was about a decade or so older than that age, there was a flurry of think pieces on the phenomenon of the "quarter life crisis," the earliest minimal acknowledgement that no matter what they took in school, most people between the ages of 20 and 25 were unemployed, many of them forced to return to live with their parents because they couldn't afford separate housing. Those young people were dealing with the impact of being taught utterly unrealistic "rules" on top of a long trend of capitalist profiteering that was only accelerating. And those who never looked at sources of information outside of their usual bubbles had no means to make sense of what was happening. According to their usual bubbles, there was no problem, housing was cheap and everyone who had a credential had a good job. Nowadays the cost of housing and poor employment prospects for young people are so persistent nobody tries to market quarter life crises for clicks anymore.

All that said, yes, there are sources of information out there that are inaccurate, and/or even extremist. And we should be able to express reservations about a source someone else is perusing, which we can do without insulting their intelligence. Since most publications on and off line are currently totally polarized, it is also common for even the most moderate to be less pleasant to read if we do not share their political slant, and prone to being like a broken clock from our perspective: right twice a day. It happens, and that doesn't mean we are at fault or have suddenly become prey to vicious recruiters and brainwashers with their virtual buckets of sudsy goop. That's ridiculous. The bigger issue is that too many people have been rewarded for not asking questions and for opting not to use their critical thinking capabilities. This is at issue both in firmspace and online. Asking for clarifications about some article or publication tells us precisely nothing about whether the questioner agrees with it in general or in specific. Open questions indicate honest inquiry, and one common type of open question is the clarification question. We also need to be able to ask open questions in order to develop and maintain our critical thinking skills. That's one part of the big issue. The other part is transparency related.

I am fed up with the way such terminology as "state run" broadcaster or publication is thrown around. State funded broadcasters are quite common, and include the vaunted british broadcasting corporation, the canadian broadcasting corporation, and the united states' national public radio. Even these have, for at least periods of time, been not just state funded but directly state run. I do agree that it should be common knowledge up front where any publisher or broadcaster is funded from and who is in charge of it. It should be as close as their "about" page or section, and it is good practice to check those details for any publisher or broadcaster. What annoys me about the way references to being "state funded" or "state directed" are used is that somehow "state funded" only counts as "state run" in a country we are being encouraged to declare a villain, without checking receipts and thinking the question through for ourselves. This is the sort of dangerous and sloppy thinking that is convenient for authoritarianism and bad for stopping actual evil in the world. If the publisher, broadcaster, person, or country in question is genuinely awful, checking receipts will certainly convince us of that. It is even possible to parse facts out of heavily slanted stories, because they need something to start from. For example, most news stories in the mainstream press in canada about Indigenous politics are biased to the point of being pretzels. Usually they reveal no more than four facts: an Indigenous community is engaged in a political conflict related to the impact of a specific section of the indian act. Everything beyond that is racist stereotypes or, if we are lucky, complete misunderstandings. Then again, most of us were never taught to expect any media to cater to our preconceptions and demands, but rather to expect to have to pay attention. (Top)

Of Unknown Function, Not No Function (2023-04-10)

Pair of cross-sections illustrating the structure of the appendix, quoted from the 1903 edition of D.J. Cunningham's Textbook of Anatomy via clipartetc. To read the text around this illustration, see the 1918 edition at the internet archive.

There are so many aspects of our own bodies that we don't understand. Leaving aside the inexcusably poor information available to women about their own bodies, because the unsupported assumption that women are just ill-formed, crippled could-have-been men still rules medicine, humans have organs and body parts in both sexes that current science cannot explain. The stretches of DNA once dangerously mislabelled "junk" because they couldn't be simplistically mapped to an easy to identify process or phenotypical expression in the body, for example. Well before people were able to puzzle over DNA though, our ancestors got to start puzzling over what are often referred to as vestigial organs or structures. A nice example of a vestigial structure is the divot most of us are born with in our upper lip: we come from a lineage of creatures whose upper lip usually fuses together before birth, although there are genetic conditions that may prevent that fusion from completing and that today are managed by surgery. The bit of webbing many of us have between the near ends of our fingers is another, a hint way back to our distant fishy ancestors. It's the internal organ examples that cause more difficulty, especially when they cannot be explained and have a tendency to get inflamed and cause problems, like the vermiform appendix.

And now for a little bit of Moonspeaker trivia: this is the 400th thoughtpiece!

UPDATE 2024-02-17 - Heather F. Smith at midwestern university is engaged in longterm research on the mammalian appendix, including a new paper published just this year in The Anatomical Record ("A review of the function and evolution of the cecal appendix," 306(5): 972-982), which can only be accessed by the general public through a library, although its web page includes the full abstract, which is quite readable. Smith also gave a transcribed interview to the u.s. media outlet npr, Your appendix is not, in fact, useless. This anatomy professor explains. Unfortunately the article is a bit light on scientific detail because the npr reporter decided it was necessary to spend most of the interview eliciting Smith's biography. Fortunately, it is possible to skip down to where the bold-texted questions begin, and that part focusses on the science.

Plenty of creatures out there don't have an appendix at all. There are two solid articles useful for the really determined who want to dig into the topic in detail, although only one of them has references. For just a general overview, with no need to hunt up further articles or anything, there is Loren G. Martin's 21 october 1999 article at scientific american. For those who would like to dig further into the question, there is one with references by Julie Pomerantz, Does the Appendix Serve a Purpose in Any Animal? As it turns out, the first thing that needs clarifying is that not just any animal has an appendix. Yes humans do, and it looks like perhaps some other primates do, plus some rodents. That suggests very diverse diets among the creatures that have an appendix, so even before reading the articles we should drop any expectation of simply connecting its presence or absence to diet. It is understandable but too bad that this little organ is referred to as "the appendix," with its connotations of being extra and tacked on at the end. The connotations have worked powerfully on the preconceptions scientists and doctors have had about this little tube, with its proneness to getting partially or totally blocked and then developing an infection that can wreak havoc if left untreated. Pomerantz provides some notable details that scientists found once they began to observe the development and structure of the appendix in more detail.

When researchers examined the appendix microscopically, they found that it contains a significant amount of lymphoid tissue. Similar aggregates of lymphoid tissue occur in other areas of the gastrointestinal tract and are known as gut-associated lymphoid tissues (GALT). The functions of GALT are poorly understood, but it is clear that they are involved in the body's ability to recognize foreign antigens in ingested material.

Thus, although scientists have long discounted the human appendix as a vestigial organ, there is a growing body of evidence indicating that the appendix does in fact have a significant function as a part of the body's immune system. The appendix may be particularly important early in life because it achieves its greatest development shortly after birth and then regresses with age, eventually coming to resemble such other regions of GALT as the Peyer's patches in the small intestine. The immune response mediated by the appendix may also relate to such inflammatory conditions as ulcerative colitis. In adults, the appendix is best known for its tendency to become inflamed, necessitating surgical removal.

In this case scientists had somewhat better reason to misjudge the possibility that the appendix actually plays a role in our immune system than they did in the case of the tonsils, once so frequently removed that when I was in grade school we read short stories depicting the operation as a near-inevitable childhood experience. Today doctors avoid surgically removing the tonsils whenever they can, because of their contributions to immune function.

Martin's article gives more information about the specific chemicals that the appendix releases early in life, when it is still larger relative to the overall body size and rich in lymphoid tissue. The anatomy illustration reproduced above shows a cross-section of a two-year-old child's appendix on the left, and one from a middle-aged male on the right. The description around this illustration notes that the appendix keeps growing until middle age, then begins to shrink. That adds a whole new dimension to the well-known decline in immune function people generally experience in old age. Here is a bit of Martin's article on these points.

...During the early years of development, however, the appendix has been shown to function as a lymphoid organ, assisting with the maturation of B lymphocytes (one variety of white blood cell) and in the production of the class of antibodies known as immunoglobulin A (IgA) antibodies. Researchers have also shown that the appendix is involved in the production of molecules that help to direct the movement of lymphocytes to various other locations in the body.

In this context, the function of the appendix appears to be to expose white blood cells to the wide variety of antigens, or foreign substances, present in the gastrointestinal tract. Thus, the appendix probably helps to suppress potentially destructive humoral (blood- and lymph-borne) antibody responses while promoting local immunity. The appendix – like the tiny structures called Peyer's patches in other areas of the gastrointestinal tract – takes up antigens from the contents of the intestines and reacts to these contents. This local immune system plays a vital role in the physiological immune response and in the control of food, drug, microbial or viral antigens. The connection between these local immune reactions and inflammatory bowel diseases, as well as autoimmune reactions in which the individual's own tissues are attacked by the immune system, is currently under investigation.

Martin goes on to comment that surgeons often treat the appendix as a sort of "back up" because they have been able to reuse the appendix to replace damaged or lost tissues in the body, especially the urinary tract. Fair enough, to be sure. From Martin's professional perspective this is ordinary stuff, but I must admit to being astonished at the idea that we have a local immune system in our gut. That is not only seriously cool, it is quite a remarkable example of how evolution has led to our organs generally having multiple functions, many of which we may not be aware of. There is another localized and highly important immune system that recently many of us have been learning quite a bit more about, the nasal mucosal system.

So if we are being cautious, particularly when it comes to our internal and intermediary organs, we'll hesitate to deem them useless, even if we are not quite sure what they do. Or more often, even if they may become inflamed and cause such health difficulties in some individuals as to force their removal. Our bodies are amazingly complicated, and we don't actually have a means to figure out every single possible bit of how they work over time. While some people might find that discouraging, to me at least it really is the best kind of amazing. We are all walking around with genuine wonders inside us! (Top)

Copyright © C. Osborne 2024
Last Modified: Sunday, September 08, 2024 13:53:00