FOUND SUBJECTS at the Moonspeaker
Speech Bubbling (2022-11-28)
Hopefully in the near future the pathologies of the present will become memories studied as examples of what happens when people in rule or decision making positions get caught up in groupthink. Groupthink is pernicious, in part because it is another one of those mental patterns that hijacks a usually otherwise useful habit. It hooks into our tendency to go along with our crowd of respected friends and acquaintances, but also manages to make connections to our weakness for taking a shortcut to an opinion or decision rather than actually thinking things through for ourselves. We are actually not generally weak in this way, but have the toughest time with it when for whatever reason we feel afraid, anxious, or rushed. When groupthink strikes people in decision or rule making positions though, we have all the pieces of a terrible disaster at hand if the dynamic cannot be broken through in time. In north america and many other places, people who end up in rule or decision making positions are encouraged to believe that their intelligence and wisdom exceed those of all others, or if they are not quite convinced of their individual capacity, that their group has the greatest intelligence and wisdom. That's bad enough. On top of that, that belief soon mutates into a rationalization for the further belief that they have the right to tell everyone else what to do, and even that the ends justify the means so long as everyone else is forced to obey. There is a terrible cycle of this going on right now, exacerbated by the careful creation of soundproof bubbles around the speech of anyone who raises questions or tries to bring in new information, even if that person actually agrees with the groupthinking decision or rule makers.
Some years ago now, Eli Pariser wrote a book called The Filter Bubble about the information bubbles that "social media" create around people who use them. The advertised purpose of the filter bubbles is to provide the user with what they want. Of course the real purpose of algorithms that push forward more of what a person has already seen, nowadays after pushing steadily to get them to look at things that make them angry or otherwise upset, is to make the "social media" companies money by trapping people in interaction loops that expose them to advertising campaigns. The various executives and engineers who write the pernicious software that does this job are quite sure that the vast majority of us are stupid, soft putty in their hands with no clue how to run an ad blocker, turn off autoplay, or even in a pinch, use a card to block annoying, wiggling ads intended to distract us from the main thing we are watching. Let alone the people who are opting instead for other sources of entertainment and news that don't involve letting the likes of google and facebook attempt to peek in their figurative underwear. But really, the reason "social media" gets pointed at so often, quite apart from how shitty they genuinely are, is that it distracts us from the problems of yellow journalism and corporate concentration in the older media. The people working in what still claim to be newspapers and magazines, let alone the companies bankrolling movies and shorter length programming, would much prefer we not hold them to account for their shoddy efforts and complete sell-out to the advertisers and propaganda millwork demanded by the military and various other interest groups. Those people are also generally of the view that most people are malleable idiots.
I've already written before about how if our supposed complete helplessness in the face of media and propaganda manipulation were real, there would be no need to keep desperately trying to find more and more pernicious means to shove propaganda in front of people.
In the end, the people who are in truly dangerous filter bubbles are the groupthinkers who are in positions where they can make decisions or rules that affect many more people than themselves. In fact, they make exceptions for themselves on the grounds of their presumed superiority. They are often surrounded by yes men and flunkies who encourage them to continue according to what they have decided on their own, heedless of evidence. Even evidence that suggests parallel and complementary action is not acceptable to people caught up in this type of filter bubble, because they see any challenge or contradiction as a challenge to their group and themselves personally. Whether they frame that as an unacceptable challenge to their authority or as people supposedly "refusing to accept the new" or whatever doesn't matter. Nothing else is audible or visible except what they have agreed to allow in their own groupthink bubbles.
UPDATE 2023-01-15 - I have recently learned about the scholarly label "elite panic" for the groupthink activities against society of "elites" who are sure they know best and are horrified lest the presumed stupid and undisciplined "herd" get out of control. It is hard to speak in a neutral way about people who have come to such megalomaniacal beliefs about themselves. In any case, here are two links providing solid introductions to the concept and study of "elite panic." Teams Human: Alarm is Appropriate, the Volcano is Erupting and James B. Meigs at Commentary: Alaska Earthquake Elite Panic vs. the Resilient Populace.
There are many reasons this is insanely dangerous. For one thing, it has recently wrought unholy havoc on properly facing up to a global pandemic. Time and again groupthink led to calls for people to do things that increased the spread of COVID-19 and willfully blocked reasonable additional tactics of defence in depth. From refusing to allow complementary repurposing of drugs found to be effective treatments and prophylactics to help maintain prime conditions for the use of prophylactic injections until sterilizing vaccines could be developed (if they can be for a coronavirus), to driving a massive campaign of scapegoating of anyone who had not been dosed with non-sterilizing vaccines for the ongoing pandemic, the result has been severe social disruption and rampant, completely unnecessary cruelty. (Yes, cruelty is by definition unnecessary.) People have openly called for anyone who has not received one of the earlier COVID-19 shots to be denied all medical treatment and left to die because "they deserve it." Protestors have interfered with the operation of both hospitals and vaccination centres, interfering with medical care in the former and with the right of others to make their own informed medical decisions in the latter. Governments have buggered up their responses to these protests; protestors should not be barred from the proximity of those places altogether, but they must be restricted from positions that prevent the due operation of either type of place.
Add to that the horrifying misfire of driving all attempts to talk about complementary and parallel prophylactic therapies based on known drugs from the mainstream media. Most people who dare to discuss these things anyway or indeed critically discuss any hot button issue of the moment are now driven entirely out of "respectable" outlets so that they must rebuild elsewhere. While this means considerable impetus is being given to the development and expansion of alternative, responsible outlets, this is in spite of the groupthink bubbles of the so-called "elites" who are doing the driving. Their behaviour has encouraged widespread mistrust of their motives, not least because it has given away their attitude of sheer contempt for the majority of people in the world, to the point that they have let slip their view that most of us are literally herd animals. (I do realize their expressed model of herd animals is also ridiculous.) This has helped make it easier for people who genuinely are trying to spread nonsense to be taken seriously because anybody who is not crazy but asking reasonable questions and making reasonable suggestions is being conflated with them by the "mainstream media." Now, again, I don't think people generally are being taken in by this tactic. The issue is that the groupthinkers in rule and decision making positions have convinced themselves of this equivalence and in their own self-created panic and terror are continuing to flirt with authoritarianism. This is not restricted to their garbled response to the pandemic by any means either; the pandemic simply provides tragically vivid examples. Authoritarianism is never a good idea. (Top)
Scapegoating is Not a Good Idea (2022-11-21)
I know, I know, it sounds simple minded and obvious. Of course scapegoating is a bad idea, everybody knows that. Everybody knows that the term comes from an ancient practice many of us have learned about indirectly from excerpted episodes from the christian version of the hebrew documents generally known as the old testament. On one hand, the older version sounds a touch silly, and like it should be sort of helpful. If a small community is struggling with infighting and generally being awful to one another over an issue or issues that are defying resolution by positive means, it is tempting to think that maybe having everybody agree to pick on a helpless goat and drive it out of town should be enough. Cruel to the goat, but if people generally hold themselves as more valuable than goats and genuinely hold to an agreement not to fight over the issue anymore because they sent the goat out of town, maybe that's enough. Well, of course, and you already knew this was coming, it isn't. The scholarly consensus is that the goat is an attempt to replace picking out a person for this grim usage in which the whole community was expected to take part. Shirley Jackson's short story The Lottery has been helping remind anglophone north americans of the glaring ethical issues of scapegoating since 1949. Still, nobody does that sort of thing anymore except in so-called "shit hole countries" (actual terminology used in public by a united states commenter in the mainstream press, alas). It would be wonderful if that were true.
Scapegoating and victim blaming are closely related, because they depend on some form of chance. The chance part is very important, because that's precisely how the members of the community at large can feel no guilt and in fact an exhilarated vindication in engaging in it. After all, if a deity or a carefully depersonalized fortune or luck can be considered the real cause of someone being marked for scapegoating, then that hooks into our propensity to want to feel better than others. Even more so if the marked person or persons can be widely depicted as the authors of their own misfortune. There is no real logic in this, as any examination of attitudes towards women versus men in the matter of unwanted pregnancies or sexually transmitted infections shows. Women are always held responsible for "getting pregnant" or "getting infected" as if the men they were involved with had nothing at all to do with it. The exception that proves the rule is gay and bisexual men, but especially gay men, who were widely declared to be justly dying young of AIDS after infection with HIV heedless of the fact that it was not even a known pathogen when it first ran rampant. That in fact is how it was able to run rampant. A number of contributing factors go into vulnerability in these cases, predominantly the ideological insistence that men are the only real human beings and their sexual pleasure is always a priority, which makes "safe sex" an annoyance that impinges on what are actually inappropriate privileges they conveniently like to believe are rights.
Today the commonest pretext for scapegoating in general is perceived or actual lack of conformity with whatever is being trumpeted as "what everyone should be doing" from a central authority. That central authority may be a religious one. Or a government one. More and more often, it is an economic one, the consensus of typically male capitalists who want a properly trained and cowed working class. Hence the perennial attacks on the homeless as supposedly just lazy panhandlers who spend all the money they are given on drugs and alcohol. In reality, people struggling along without permanent or dependable housing include families with working parents and children in school who have suffered an acute setback related to such challenges as severe medical conditions, loss of a key family member, unaffordable rents, or eviction by real estate profiteers. With the shutdown and closure of many mental health facilities as "too expensive," the people who needed them end up on the street, struggling to maintain proper access to the therapies necessary to manage their conditions. Then there are people who have criminal records and find themselves completely unable to get honest work on a consistent basis after serving their time, regardless of their subsequent record. But it is a lot easier in the short term to blame all these diverse people for their situation than to get to grips with the underlying structural issues. Rather than admit the structures are not working, we are encouraged to think that it is the presumed wilful failure to conform to the demands of those structures that has led to their difficult conditions. Part of that encouragement includes discouraging us from learning any specifics that might humanize any person struggling in such circumstances.
There are many, many reasons from the ethical and moral to the practical and feasible that scapegoating is wrong. And all too many ways we can be encouraged to do it anyway, by being distracted into calling it something else and dissociating what we are doing from the label. A key tactic is of course to keep going back to that absurd seeming biblical story. When Jackson's story was published, it raised no end of controversy, not least I suspect because she carefully designed it to make it clear that what was happening was a socially condoned choice based in blindly obeying authority. Worse yet, she pointed out, however indirectly, that such blind obedience leading to such travesties could and did happen not just in places like nazi germany in "abnormal" times, but potentially anyplace in "normal" times. The temptation to blind obedience is of course created not least by the way that in real life the risk of being scapegoated goes up with being merely perceived as not conforming in some way. That is a dangerous juggernaut enlarged and made faster by its tendency to push people into suspicious, paranoid frames of mind. (Top)
Difference is Not Inherently a Hierarchy (2022-11-14)
Quite some time ago now, I wrote a thoughtpiece called There Are No Oppression Olympics, in which among other things, I pointed out that people are not oppressed over their identity claims; rather, claims about their identities made by others are used as rationalizations for oppressing them. Now I would add that playing into the claims made about outside imposed identities does not game the system, it just makes the person who thinks they are playing a useful idiot for the actual oppressors. The supposed "oppression olympics" are a game in which people compete for the highest numbers of "oppression points," which we are told are an all-powerful passive aggression cannon that can be used to force through the boundaries of others and pretend to control their thoughts and speech at all times. Put this baldly it sounds ridiculous because it is. If merely having difficulties in life rendered the sufferers into superheroes, we'd be living in a comic book universe, except the superheroes would be mostly superzeroes obsessing over watching our every move, browbeating others into forcing us to praise them at every other moment, and bewailing where the meaning of their lives is now that everyone is serving them as the deities they are. (I admit to never being quite sure why so few people openly discuss that "superheroes" are transparently supposed to be deities on Earth.) In any case, none of this would work in the first place if there wasn't such determination to read "difference" as always demanding arrangement into a hierarchy, because a "difference" must mean somebody is socially advantaged or disadvantaged based on that difference alone.
Please note the "socially" here. A person who is taller than me has an advantage if we play basketball together, and that has no social meaning in itself, it just reflects the mechanics of the game. Women are capable of gestating, birthing, and breast feeding babies, and men can't. These neither advantage nor disadvantage women relative to men or men relative to women. The various disadvantages to women arising from their having those abilities are social in nature, not inherent. There are certainly physical things that a pregnant woman will not be able to do as well or as easily when she is pregnant as when she is not, apart from social impositions of course. There are obvious practical things, like bending forward, or walking for long distances in the later stages. Pregnant or not, women have a lower centre of gravity due to the underlying skeletal development necessary so that if a woman does experience pregnancy, she can still get around. Since most competitive sports are designed for men, whose centre of gravity is higher, women are often at a disadvantage in those sports. But again, that's basic physics, and depending on the sport women may develop a range of compensatory mechanisms, especially technique. When it's not a sport, well, women do sensible things like come up with a reasonable technical solution. There is no doubt in my mind that a sensible woman invented the atlatl, because it dealt with three key issues: it increased her throwing range, got around a lack of easily available wood straight and strong enough to make into a spear, and made the thing easier to store when not in use. This also clarifies that I am referring to "difference" that we develop from the basic physics of our bodies as we grow and age.
I do appreciate how desperately tempting it is to conflate differences, especially between people, with a rationalization for taking advantage of them. There is considerable evidence that humans have a strong sense of fairness, and that we are probably predisposed to this because it helps manage and minimize outright violence and maldistribution that would generally endanger our survival. It takes considerable hard work to suborn this sense, because it is an important part of our survival instinct equipment. If it were otherwise, and I have pointed this out for other examples of supposedly "natural" things that fantastic levels of social and economic pressure are brought to bear on us to force us to do, we would hardly need so many enforcement tools. No need to tell children that their fate is dictated by the fact that they happen to have a certain colour of skin, or that they are female or male, or they have a certain religion. There would be no need to "save the phenomenon" through such hideous practices as driving children struggling with puberty into the arms of anyone who pretends to be willing to fix their distress by making them "conform" to what is supposed to be "natural." There is no need for medicine or law to make what is natural happen. Conversely, we can't make something "natural" and therefore acceptable and presumably harmless by declaring it "lawful." This is the sort of lie that the various members of the "pomo theorist" crowd are so big on, and it is incredibly popular with the people who want some way, any way, to stop successful resistance to the practices and structures they use to exploit others and quell their consciences while they do it.
After all, accepting the false conflation of difference with social hierarchy has a nasty, tempting quality for those who wind up on the nastier levels of the socially created ladder. When all else fails, the person at the bottom of the sequence of levels for living people can always console themselves that they are still alive. But for most of us, we are encouraged to look down on some others who hopefully aren't so close that real life disabuses us of the illusion by revealing what we actually have in common with them. Great for dividing and conquering as far as the people doing the majority of the exploiting are concerned, to be sure. Based on the quality of commentary in the news these days, the sense of inherent superiority this can give to people is one terrifyingly intoxicating and addictive drug. But it only works as long as enough of us believe in it. (Top)
Seeking Followers is a Big Part of the Problem (2022-11-07)
"The problem" being the multifaceted nasty of the current malaise afflicting so many societies on Earth right now, and the special horribleness of the late, self-cannibalizing neoliberal and capitalist fundamentalist cultures of the so-called "west." Part of what makes that horribleness so awful, and so important to refer to by an ugly coinage, is that the people suffering the worst of it are those who had the very least to do with imposing and supporting it in the first place. The level of injustice in that is crazymaking. That is not the main topic per se in this thoughtpiece, but better not to leave the problem I have in mind undefined here before getting specific about "followers." The term itself is a bit of an odd one, because it is not strictly respectable to be anybody's follower, or at least it wasn't, not so long ago. Usually "follower" didn't show up unescorted in articles and books that I have seen either, at least not until recently. Usually it carried an adjective, say "religious" or "political," both implying blind faith and a willingness to act in order to avoid or actively suborn the truth in service to whatever or whoever the person so described was following. Its extension was possibly innocent, starting as it did in "social media," but that very starting point makes me very suspicious. Before "social media" if we wanted to regularly read a particular blog or newspaper or magazine or whatever, we would subscribe to them either by a literal paid subscription or by picking up their rss feed if they had one. Or we did it the old-fashioned way for websites, we made a bookmark in our browser and checked it by hand.
But social media "following" has some of the same vices attached as the connotations of old fashioned following when associated with religion or politics. Not an active engagement with whoever or whatever is being followed, but a passive acceptance of whatever the person or religion or politics dictates at a given moment. This makes it much easier to accept the "social media" company curating your personal newsfeed instead of you, supposedly for your convenience rather than theirs. By this I don't mean that automation is bad. In fact, I think it is a real lost opportunity that the bookmarks menu is incessantly manipulated in the most useless and infuriating ways by the few browser developers, without at least providing an option to set up your own folders locally such that the links you haven't visited in, say, x months move to that folder. Something more along the lines of the autodeletion rules that can be set up in rss readers. Considering the level of labour wasted on inserting impossible to delete folders with unhelpful names in the bookmarks menu on top of folders with automatic sorting behaviour that doesn't make sense (*cough* firefox, *cough-cough* firefox mobile), it seems unlikely that it would demand much in terms of coding, since they could politely reimport the requisite code from thunderbird.
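The sort of stale-bookmark rule I have in mind could be sketched in a few lines. To be clear, this is purely illustrative: it uses no real browser API, and the bookmark records and field names are invented for the example.

```python
from datetime import datetime, timedelta

# Hypothetical bookmark records. Real browsers keep last-visit times in
# their own internal databases; these fields are invented for the sketch.
bookmarks = [
    {"title": "daily news", "url": "https://example.org/news",
     "last_visited": datetime.now() - timedelta(days=10)},
    {"title": "old tutorial", "url": "https://example.org/tut",
     "last_visited": datetime.now() - timedelta(days=200)},
]

def sort_stale(bookmarks, months=3):
    """Split bookmarks into active ones and stale ones that would be
    moved into a local "not visited in x months" folder."""
    cutoff = datetime.now() - timedelta(days=30 * months)
    active = [b for b in bookmarks if b["last_visited"] >= cutoff]
    stale = [b for b in bookmarks if b["last_visited"] < cutoff]
    return active, stale

active, stale = sort_stale(bookmarks)
```

Something on this order running locally, with the months threshold under the user's own control, is all the automation being asked for here.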
The thing about online followers is that they bring the pathologies of celebrity culture to everyone who gets tangled into "social media." We are all involuntarily familiar with the horrors of "celebrity" in which celebrities are expected to constantly perform and reproduce whatever made them famous in the first place. Some thread the needle in more or less successful ways that don't seem to totally destroy their lives, such as William Shatner, who has made quite a successful career out of being belovedly infamous for his often awful acting in formula television and movie productions combined with much better work in other places. Others get caught in self-destructive spirals as they effectively immolate themselves in public as they strive to hold as many eyeballs as they can. The low budget version of this, for good or ill, is the latest variant of the internet troll, whose primary habitat is the grotty regions of social media currently raised above all others by a hopefully temporary successful power grab by the least mentally well and most antisocial elements of the techbro club.
The result is a constant drive to reproduce another and bigger spectacle of mean-spirited name-calling and piling on to build and hold followers of the most fickle and unpleasant kind. This is bad enough on its own. Worse yet, it is infectious and spreading among people who are busy trying to parlay their various "social media" footprints into ways to make money from not doing very much, or worst of all, from trying to do something real that genuinely does deserve support, like investigative journalism. Solid support for real work can't come from blind followers, people who are typically following a hormonal high, be it adrenalin or dopamine or both. Once the rush wears off and the groupthink bubble pops, and they always do, the crowd dissipates. Unfortunately the foul mess they leave behind does not. (Top)
No, Cryptocurrencies Should Not Be Saved (2022-10-31)
There has been no shortage of crazes and bubbles associated with clever pieces of software intended to magic money out of thin air. "With a computer" has probably been one of the greatest gifts ever devised to the crooks trying to persuade the unwary or just plain unlucky into pyramid schemes and other such money draining set ups. I have never been quite sure why so many people have been taken up with the hype around "cryptocurrencies" because apart from use as tools for money laundering, I couldn't see any purpose to them. The utility of cryptographic signatures and cryptography more generally for safety, security, and privacy was quite obvious to me, as was making sure those things take too much time and energy to be worth breaking. But cryptocurrencies just sounded ridiculous. Banking is already based predominantly on "virtual currency," in effect a whole complex net of promises to pay, and the origins of that practice go right back to at least ancient Sumer. So it seemed to me that what cryptocurrencies were was a clever speculative fad with a long fading time because they could be used for money laundering, and tech bros never give up on something done with a computer until they've been thoroughly drubbed by real life. All that skepticism expressed, I am also aware that when it comes to money matters, I am deeply risk averse. Other people with better tolerance for risk than me are all over the stock market, and that speculative exchange has been going for some time. So I could also see that perhaps there is something more like the stock market that cryptocurrencies could do.
Then I dug a bit more into how a person actually gets and spends cryptocurrency, and things got far more murky. At first glance they are at least analogous to an internal online community currency, where members may trade in-kind goods for the most part, and have an exchange of real world currency for a certain amount of within community goods to get started. But the key issue for the cryptocurrencies that have been taking off on the speculation front is the implied convertibility from cryptocurrency to currency spendable anywhere because acceptable anywhere. Understandably, the people most committed to cryptocurrencies really want to open up their wider convertibility. Then the wide-eyed excitement when bitcoin soars would make serious real life sense, because if there were a guaranteed way to convert it at the height of its value into a standard currency, the holder could become an instant million- or billionaire just by investing a few thousand dollars in computers and absurd electric bills. Or, a person could try using cryptocurrency in the way many people in countries where things are looking scary use gold, gems, or a currency that they expect will not lose value. That last use is probably the big one that many of the so-called cryptominers in "the west" are actually interested in, because the united states is staggering in a scary and punch-drunk manner. I certainly sympathize with the desire for insurance against poverty!
Unfortunately, the practical impact of "cryptomining" is an environmental shitshow. Jurisdictions from china to north america have had to take steps to curb it, because the level of electricity used to carry out "proof of work calculations" and other such nonsense competes invidiously with other practical day to day uses. Yes, it was reading about the "proof of work calculations" in the end that did in my respect for cryptocurrencies. They fundamentally depend on being able to prove that you have enough real money to waste on generating them, or on being able to figure out how to steal the electricity to make them from somebody else. Hence malware now has a whole genre of programs meant to generate some cryptocurrency or other that savvy computer users have to protect their machines from. As for the dream of magic money that bears no stubborn relation to the real world, cryptocurrencies are no realization of that dream. If anything, they are a great illustration of a highly obfuscated putative perpetual motion machine that is only convincing as long as we don't understand how it works.
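For readers who have not seen what a "proof of work calculation" actually is, here is a minimal sketch of the hash-guessing at the heart of bitcoin-style mining. It is a toy, not any real client, but it shows why the scheme burns compute by design: finding the nonce takes brute force, while checking it takes a single hash.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce so that sha256(block_data + nonce) begins
    with `difficulty` hex zeros. There is no shortcut: the repeated
    guessing IS the work, and hence the wasted electricity."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine("example block", 4)
# Verification is cheap: one hash confirms the expensive search was done.
assert hashlib.sha256(f"example block{nonce}".encode()).hexdigest().startswith("0000")
```

Each added zero of difficulty multiplies the expected number of guesses by sixteen, which is the knob real networks turn to keep blocks scarce no matter how much hardware is thrown at them.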
So no, contrary to some banner appeals I have seen on a few sites, cryptocurrencies should not be saved, and don't let the attempts of some of those appeals to conflate this specific use of cryptography with cryptography as a whole fool you. If there turns out to be a way to generate cryptocurrency without helping drive planetary global warming and the collapse of the ecologies that we humans depend on to live, I would have no argument with them. But that is not what cryptocurrencies are built from at this time. Nor can they solve the socio-political problem of preventing arbitrary seizure of our goods by despotic governments, because the possibility of such a seizure is a symptom of the disease, not a cause. (Top)
Posturing on Critical Race Theory (2022-10-24)
The moment that terminology has been thoroughly co-opted and rendered into nonsense for use in the mainstream media is when a pressure group either claims it is being taught to children and must be excised from instruction by the hand of the state, or claims that it must be added to instruction to children by the hand of the state. In either case, it does not matter whether the words refer to something that actually could reasonably be taught to children in terms of their capacity to understand it, or should be taught to children at all. It is fair to consider these two things each time new material is proposed for a standardized curriculum. But I think we can all be pretty clear that if a pressure group starts demanding state surveillance and major fines or even jail time for teaching or not teaching whatever the terminology supposedly refers to, then matters have spiralled away into political posturing and distraction intended to keep us from noticing the real things we should be concerned about. The current right wing freak out in the united states against what they call "critical race theory," which mashes together reasonable challenges to the stuff that people like Robin DiAngelo are selling with refusals to pretend that the united states has always been a perfect utopia where there has never been any social disunity, is sad, but predictable. Canada has been dealing with some similar nonsense as different provinces try to excise Indigenous history from the curriculum before it gets in. All that said though, it is not unreasonable to check out what all this shit is supposed to go back to, and what the terms actually mean.
For my part, I started by going back to the question of what "critical theory" is, because it seems to specially irritate those of seriously right wing leanings. That already suggests the whole notion is associated in a powerful way with Karl Marx and his writings. My OED doesn't mention him as such, in part because this is not the whole dictionary but a one volume desk dictionary. It defines critical theory as "a philosophical approach to culture, and especially literature, that seeks to confront the social, historical, and ideological forces that produce and constrain it. The term is applied particularly to the work of the Frankfurt School." Yes, another of those definitions that annoyingly has to be chased to another entry. The frankfurt school was "a school of philosophy of the 1920s whose adherents were involved in a reappraisal of Marxism, particularly in terms of the cultural and aesthetic dimension of modern industrial society." Names I see bandied around a lot elsewhere as members of this school are Adorno, Horkheimer, Althusser, and Marcuse. Now, the marxism connection aside, "critical theory" doesn't sound like a project exclusively engaged in by this school; in fact in the 1920s and later many people were worrying about the impact of art, mass media, and propaganda as a whole. Insofar as cultural and aesthetic works were contributing to social unrest, I don't know of anyone who hasn't raised concerns at one time or another. Persistent concerns about the content of school curricula usually do have a basis, however small, in a critical view of what that content is and whether it is rendering students into obedient supporters of the status quo or thoughtful participants who insist on change.
That's right, I don't think that nihilism is actually taught in school, whatever level of schooling we opt to pick on. Based on my own reading and observations, I must agree with Arendt, who in very broad strokes noted that social atomization and immiseration related to lack of meaningful work and persistent unemployment produces rootless purveyors of mob violence and nihilism. This is not to say that critical theory can't go awry. It most certainly can, if its practitioners get wrapped up in playing with ideas and lose contact with material reality. With no anchors in the real world, it doesn't take long before critical analysis of literature or art slips into such nonsense as claims that the way to fix structural issues is to redefine the people involved so that the people on the wrong end of a structural injustice don't exist anymore. The obvious purveyors of that right now are the gender identity extremists. Another group are busy in what they call "antiracism work" which mostly entails demanding that we claim that racialized people are doomed to suffer racism and non-racialized people are doomed to be racist, so we had best resign ourselves to our individual awfulness that is inborn and impossible to change. That this is actually rebranded racism has not been lost on the first people to get this thrown at them, african americans. Having a material grounding doesn't inoculate a person attempting to use critical theory from all nonsense, but it is a powerful start.
After all that, finally it gets to the point of critical race theory, which has complex origins unrelated to marxism or the frankfurt school, although it does share the critical theory technique of considering how received ideas may have unwanted outcomes and identifying what is awry in those ideas so they can be fixed. But the key difference with critical race theory is that it started from a legal context, which provided an initial materialist grounding. The widely recognized founding scholar in the field, Derrick Bell, was an african american lawyer who worked hard on civil rights related cases. Another famous name is Kimberlé Crenshaw, whose original conceptualization of intersectionality was intended to illustrate how legal definitions can produce perverse outcomes that put african american women in double jeopardy. Laws and systems founded under conditions where racism was considered acceptable are bound to have enough wiggle in them to allow racism to continue right along on its merry way if no questions are asked. By now I doubt many people are unaware of the phenomenon of tokenism and its role in providing cover for continuing to do the same racist things as always while having a racialized person at the ready to trot out for photo ops and the like.
A great introduction to "critical race theory" is provided by the combination of articles and podcasts produced at the black agenda report, including some good citations and search terms. The podcasts include: CRT Origins "Radical Liberal," Not Marxist, from 13 july 2021. The articles, including a three part series of long reads, are: Critical Race Theory Opponents Are Just Proving Our Point For Us by Gary Peller, 8 july 2021; The Conspicuous Absence of Derrick Bell – Rethinking the CRT Debate, Part 1, Realism, Idealism, and the Deradicalization of Critical Race Theory – Rethinking the CRT Debate, Part 2, and The Theory of Intersectionality Emerges out of Racist, Colonialist Ideology, Not Radical Politics—Rethinking the CRT Debate, Part 3, all by Patrick D. Anderson and published between july and october 2021. If nothing else, these should show a skeptical but fair minded person that critical race theory is not a project to run down the names and memories of people who think they are white in the past, nor to teach people who think they are white and their children that they are inherently horrible human beings who can't do anything about it. These articles and podcasts provide an overview of critical race theory's own history, and subject different approaches to it to criticism.
Obviously, I am not a citizen of the united states. But canada is also a settler state, and I have certainly observed my friends who think they are white, and racialized non-Indigenous friends too, struggle with deep discomfort with critical theory approaches. It's not comfortable for anyone to question what we have never questioned before. The propensity to take any challenge to accepted bromides as a personal attack is quite frustrating, but I suppose it is less surprising if we bear in mind that one of those bromides claims that never having disagreed with something done before you were born means you agreed to support it by default. Sharp eyed readers will recognize that bromide as a secularized version of "original sin," and there is no end to the angst people have been instilled with using this bromide, precisely because then it becomes something nobody can do anything about. But as I commented to my friends who have got caught up in this, while we can't do anything about the past, we can do something about not repeating what we don't like about the past into the present and future. Which of course we can't do if we don't look those bromides firmly in their figurative eyes. (Top)
What is This Supposed "Inflation Risk"? (2022-10-17)
The boogie-word of the past year or so has been "inflation," yet somehow it never seems to behave in quite the way most of us understand it to work. On one hand, I keep reading and hearing pundits declare that inflation is very low or practically zero and people should not worry about inflation in the context of the cost of living. Since I am far from the only person noticing inexorable and apparently irreversible rises in the price of groceries, housing, and clothing, and the changes are certainly not just one or two percent per year (which is already bad enough), this direction "not to worry" is simply a blunt insult to our intelligence. On the other hand, the second, maybe even the millisecond, that evidence appears of sufficient pressure to enforce an effective raise in the minimum wage and the wages of the majority of workers, the wailing starts up to a deafening roar. There will be inflation everywhere! In everything! All the prices will go up to compensate for the rising wages! How dare workers be so greedy as to figure they may as well starve without having their life and dignity constantly trampled underfoot, and hold out for better wages! Sarcasm aside, I do understand that inflation actually is an important tool to use in assessing how "the economy" is working. My issue with inflation as it is used in the media, where it is a meme intended to drive panic and push more modestly well off people into demonizing those in service jobs – today's servant class – is its dishonest use even outside of the media and general punditry.
UPDATE 2023-01-20 - Now more and more information is slipping out indicating that the "inflation" for the most part has been far from "real." That is, the corporate bought and paid for media has been shrieking "inflation" while profiteering corporations have been jacking up prices. Those price increases can no longer be explained away by such actual conditions as disease outbreaks affecting domesticated animals or plant monocrops, or problems unloading trashy "consumer goods" from container ships. This is especially notable in the case of grocery prices, where anyone who has been watching price per unit or unit amount will have noticed packaged goods shrinking in size and increasing in price over the past year. There is no justification for a simple tin of beans (no sauce) to suddenly cost 75-100% more than a year ago.
I see from the dictionary that in economics, "inflation" is just supposed to refer to price increases accompanied by loss of buying power for the same amount of money. This confirms that the colloquial definition of inflation is basically the same. Funny how at the moment (let alone in the previous two to three hundred years) there are apparently only two causes of inflation: so-called "acts of god" such as droughts or wars that interfere with distribution or production of whatever is being sold on the market, and wages. Again and again we have learned that for wages not to reach heights that employers cannot pay out of the profits they take in from their employees' labour without endangering the business, the government needs to pitch in with supports that keep the basic cost of living from becoming insane. The real world data on this looks completely consistent with what the modern monetary theory economists, today's heretics in the field, are saying. The challenge in a capitalist context, where a government that has the ability to issue its own currency would prefer to suck up to the very rich rather than anyone else, is apparently to somehow make "trickle down economics" work well enough without having the equivalent of a world war-style mobilization, which itself is still intended to do best by the very rich but has unwanted side effects. You know, all that social mixing and instruction in organization that can make it easier for the majority to hold governments, the rich, and corporations to account.
Yet there is still a key element missing here in the understanding of what affects the rate of inflation. Conveniently, the fact that most employers take a significant portion of the profits is left unconsidered the majority of the time. This is not a surprise, because their ability to do this is treated as sacrosanct, as supposedly the only reason they start businesses at all. The repeated response to workers holding out for better wages is every kind of coercion, plus complaints by the employers about how hard their lot is. Now, I actually agree that smaller employers are having a shit of a time, especially those in malformed industries like restaurant and catering businesses, where the cost of food keeps going up because of global warming, and profits were already marginal and drawn from hyper-exploiting employees. Those smaller employers are getting ground out of an unfair system that is trending towards a few huge monopolies in each "product segment." Most of them have been dutiful and loyal capitalists, and they are getting screwed in their turn, and they understandably feel they should not be screwed when they followed the rules, including turning the screws on their employees because "that's business." But practically speaking, the game is rigged to let the biggest and most psychopathic players maximize and hoard the amount of money they can get. So we have a crazed situation in which late stage capitalism is grinding to a halt because money is constantly being taken out of the system so that it can no longer serve as capital. Meanwhile, demand for the basics of life keeps going up, inflating the prices demanded by the monopolistic corporations controlling more and more of the supply, which pressures workers to hold out for higher wages and to try to keep the gap between the two ends from growing too wide by accessing credit at usurious rates of interest.
Is it any wonder that under these conditions, with the consumer credit system also slipping under water because of invidious debts, many people are taking seriously arguments that the very rich are actively scheming to slaughter as much of the population as possible? The reason for the slaughter is to curb the way these conditions add to workers' power to resist exploitation and ask hard questions about why rising productivity keeps correlating with worse social conditions in capitalist places. At this point this can hardly be seriously counted as a supposed "conspiracy theory" anymore, considering the extraordinary and apparently deliberate hamfistedness of the average government response to the coronavirus pandemic in capitalist, and especially capitalist fundamentalist, countries. The continuing destruction of mass transportation for the wider population is not a "conspiracy" either, in that the open pillaging of those systems by corporations has been quite public: the successful campaigns to privatize anything that was nationalized, the ending of subsidies to maintain needed service on routes unprofitable due to low population, and the refusal to hybridize freight and passenger transportation except at the cost of helping people get around. In many places, the only way to get between towns, let alone between towns and cities, is private cars combined with planes. But as we all know, regardless of our views of the environmental sustainability of frequent flying, it is passing ever further beyond the reach of more and more people for any reason they might want or need it. The persistent pressure on people to carry out all their web browsing and any online discussions via a few corporations and the propaganda forcing chambers of so-called "social media" is also quite open.
I remember vividly a period in my younger days, as I began to develop the capacity to take on adult responsibilities, when rather goofy books about aliens and the Illuminati and such sounded weirdly convincing. But it did strike me as weird, not via a conscious analysis at that time, but because they were so profoundly invested in appeals to authority and magic. It sounded too much like "Yeah, well, my Daddy says so, that's why!" to really win me over. Plus, my own direct experience had shown me that adults were prone to trying to manipulate their authority to take advantage of me. So at the time I read a few of those sorts of books, and put their claims into my "reserving opinion on this" file, so to speak, and then had other practical matters of finding a job and the like to focus on. When I went back to those books and similar stuff later, better equipped with more real information, of course then I realized they were goofy, even as I had to grudgingly respect their clever deployment of bad arguments. It seems like many people and communities accustomed to thinking that today's authorities are working in their favour are having a similar experience because for a whole range of reasons, both good and bad, they have had a chance to pause, take a breath, and look at the real information available to them versus what they are told to think about it by those authorities. And it turns out those "authorities" have their pants, both under and over pants, around their ankles while claiming that they can't be exposing themselves inappropriately because they are really some other type of person who doesn't have genitalia. It would be grotesquely funny if the real world impacts were not so destructive and unjust. (Top)
A Culture of Oversharing (2022-10-10)
Thinking it over, maybe everybody thinks they are living in strange times, because one person or community's strange is certainly another's normal. Maybe it is always difficult for us to accept the contradictions and discomforts of our present circumstances, especially if we don't happen to be members of the luckier and more empowered subsets of our society. That acknowledged, it doesn't change my response to the twilit present of the world most impacted by asocial amplifiers like twitter and the facebook collection of properties. Somewhere between 1995 and 2005, the general mainstream culture apparently fell into a massive state of oversharing that seems to be inadvertent for many people, and totally deliberate for others. The ones who never intended to be quite so "open" have run repeatedly afoul of retroactive and/or opaque rule changes, often combined with being online from too young an age to understand the potential consequences. Add to that the fact that few people understood that the web remembers what it shouldn't in multiple versions, and forgets what it shouldn't practically overnight. Then there are the people who think they have gamed the system, and are pursuing a pretension to celebrity. For many of them, perhaps it is all about narcissism or a very clever scam to get vulture capital and the like. I suspect that even more often, these are people who have observed that celebrities seem to have a lot of power and get whatever they want, and who counts as a celebrity is now pretty much anyone who is famous for any reason.
At this point, the culture of oversharing is such that corporations and other organizations more generally have been actively cultivating oversharing in the workplace. We have been getting intimations of this for a long time, in the form of "human resources" departments actively searching for all of a candidate's "social media" posts and other retrievable web activity to find disqualifying items. It is true that verifying claims made on a resumé is reasonable, and that a security check is appropriate in many, many cases. However, refusing to hire a person, or ending their probation, due to something stupid or vile posted when they were fourteen is clearly nonsensical. Such blatant cases are of course no longer so common, not now that we have a whole system of woke scolding and cry bullying in place due to the new crypto-religion of gender identity extremism. But these are all examples of organizations taking advantage of information shared outside the workplace. Pressure to overshare within the workplace is the more recent part, and it is growing at an appalling clip.
Not so long ago, I would have just shrugged this off, except that it became clear in my own workplace after several official announcements that any staff who had social media accounts and worked there were on notice. Their accounts were under scrutiny for any commentary on the organization, which the organization claimed the right to manage, whether or not the person in question had ever indicated in any way, such as in a biography, a photograph, or an avatar, that they worked at the organization, or had disclaimed their commentary by stating clearly that they did not speak for it. Then there was the new intranet, which had been redesigned to look more like facebook or twitter (or perhaps both) in order to "increase engagement." Suddenly everyone could have followers, like, subscribe to feeds of one type or another, and just by clicking on another person's name get not just their email and phone number, but a whole run down of the documents they own, the projects they are assigned to, and the like. Some of that sounds sort of useful, and a bit overdone but not necessarily alarming. Not so much the voting and following. Then the new "diversity, equity, and inclusion" policies were rolled out, in which suddenly all staff were being encouraged to do things like declare pronouns, disclose mental and medical issues as part of "awareness circles," and provide detailed updates not of work items but of life outside of work, I guess in order to officially "humanize" one another. This has slipped along into a steady drip of little notes prodding people to change their avatars to show they really care about the latest social justice issue deemed worthy of attention, followed by a hailstorm of officially voluntary seminars, most now including readings to be done off the clock.
Even with all of this I might have tried to shrug it off, except for the latest version of the ever-present microsoft windows, which comes set by default to present "news just for you!" from its task bar. The new outlook also strives mightily to force the person required to use it to retrain with the supposedly new and improved email search – hilariously, even just trying to force it to run the tutorial while I get a cup of tea doesn't work because of a bug. In any case, I had to research how to turn off the creepy, pre-populated "aren't these the things you want?" features that were grabbing items from online, constantly eating cpu cycles, and slowing the computer down, all to persuade me that I wasn't at work but engaged in some weird social media hell hole. Even my colleagues who like social media have been turning this stuff off. Other colleagues cheerfully use their work stations and phones for personal business beyond looking up the odd address or phone number, which I have never been comfortable doing, having started in the workforce when anything more than that was strictly forbidden. Back then it was about trying to minimize exposure to computer viruses and other nasties, but it also had the salutary effect of helping keep the boss out of our private lives.
But now, now "the workplace" is awfully interested in our personal behaviour, and managerial and "human resources" staff keep up a steady pressure on us to disclose more and more about ourselves – and to demonstrate our "right thinking" by adding pronouns and the right tag lines and assertions, dutifully updating our avatars, and using ever more granular features in software that attempt to enforce disclosure of the specific reasons we are booking time off or blocking off time during the work day to work but not for a meeting. More information specific to work might be reasonable to request under specific conditions, but not as a general rule. And to think that the level of surveillance we are expected to accept and participate in now was firmly denounced not so long ago when the secret police in the eastern bloc did it! (Top)
Eugenics Persists Like a Bad Rash (2022-10-03)
There is a dangerously seductive and cruel element in the ideas characteristic of eugenics. The desire to do as much as humanly possible to ensure that all children are wanted children, and that they are born in the best possible health with the best chances of staying healthy for a fulfilling and long life, is laudable and ethical in itself. The trouble is that it is hijacked all too easily by the idea that human perfection can be absolutely defined, and that the definition not only can but should be based on a narrow range of physical and mental abilities. This is how eugenicists soon ran themselves into terrifying claims that it was only right to prevent those they deemed "defectives" from having any children of their own, and furthermore to adjust them by whatever means necessary to make them "useful" and obedient. This usually meant rendering them into servants by imposing such "treatments" as drugging them into docility, or the infamous lobotomy, or other forms of deliberate damage to the brain, the ability to communicate, or the limbs. In the extremes, people deemed "defective" or perhaps "surplus" were marked to be prevented from living long, or at all. This is rightly horrifying. Then there were the eugenicists who pointed to people in vegetative states and other severely disabled conditions and declared it "only merciful" to somehow not allow them to happen, as if such people are natural disasters. This should be easy to pick out as a dishonest conflation of issues, and as the equally horrifying and inappropriate conceptualization of their lives that it is. And indeed, today it seems to be. Nowadays, if I were to canvass people about whether it is okay to sterilize or euthanize anyone against their will, the likelihood anybody would say yes is minuscule to none. (The question of assisted suicide is a separate and properly difficult question that I won't get into here.)
Yet for many people clarity and skepticism have utterly vanished in the wash of demands that we believe in and unreservedly support "gender identities," up to and including funding and supporting extreme body modifications and withholding support and treatments for disorders of thought and feeling because they are supposedly only caused by thwarted "gender identity expression."
Now, here are some quite strange coincidences in the approach to "gender identity" that a few very loud activists and their corporate backers present as the only just and moral response when a person declares that they have a "gender identity." The populations most likely to be swept up into the broad category of people with "gender identities," according to the statistics gathered to date, are:
According to the gender identity activists, all such people have gender identities, and their only route to a healthy and fulfilled life is the processes of drugs and surgery they advocate, otherwise these people will inevitably commit suicide. They make this incendiary claim based on a repeatedly debunked internet survey, and knowing that because many of the people listed above are children and adolescents, many people will respond with sheer panic and rush to follow the activists' recommendations. Now, to be sure, the principle that everyone should have an opportunity to live a healthy and fulfilling life appears to be shared by everyone. But the "treatments" the activists insist are necessary oddly enough have an extremely high likelihood of sterilizing the persons subjected to them, and on top of that of wrecking their health and reducing their lifespans. All that, while reinforcing compliance with sex role stereotypes. Oh, did I forget to mention that one of the major criteria for gauging mental perfection among the old time eugenicists was that each person strictly hewed to the sex role stereotypes they defined based on the sex and class of the subject individual? Well, I have now.
That is not the first uncomfortable coincidence. Let's consider again who will be preferentially subjected to sterilization by surgeries and chemicals intended to support their proper embodiment of their "gender identity" under this rubric. First off, there is a range of people with disorders of feeling or perception, a term which I learned from Celia Kitzinger and Rachel Perkins in Changing Our Minds: Lesbian Feminism and Psychology. I prefer it to "mental illness" because it recognizes that the people having those experiences are not having imaginary problems, and it is more specific about what they are struggling with. Such people were among the "mental defectives" that old time eugenicists were quite strident about not allowing to reproduce, if nothing else. Then we should note that homosexuality and bisexuality were both wrongly conflated with disorders of feeling and perception and so those old time eugenicists wanted them at least "corrected" and if incorrigible, then sterilized. The men among those old time eugenicists were often extremely committed to patriarchy and the total subjugation of women, and so to their mind enforcing sex role stereotypes by any means necessary was a social necessity. Those forced into the proper moulds at last should be exemplars to keep the rest in line, so it was okay if the result was ultimately a recipe for social isolation all the same.
The old time eugenicists wanted very badly for the targets of their desire to weed the human race into perfection to go along willingly and not make a fuss. The first sexologists thought they had this solved with their development of the inversion pseudo-theory of homosexuality, especially since that was consistent with their belief that homosexuality was an illness. But history shows that more people than not resisted being diagnosed as deranged when they were simply same sex attracted, and due to the extremely class-specific models of sex role stereotype the sexologists were trying to impose, heterosexual people resisted their claims in droves in their own right. On top of that, the extreme hypocrisy of victorian morality, which winked at the ruthless abuse of children and women in prostitution and of racialized people in slavery as long as it wasn't where they could see it, finally torpedoed that morality and with it the sexologists' original project. Then of course the nazis annihilated the credibility of eugenics by applying genocidal methods against europeans instead of restricting them to racialized peoples in africa and the americas.
With that sort of previous reputation, there was absolutely no way that eugenics was going to get back into public through the front door. Social sanctions against expressing eugenicist beliefs were too high, even if the barriers to imposing them on the vulnerable weren't. This has created another coincidence with those earlier times of one morality being the official version while another was being practised. So it was clever and dangerously persuasive to redress eugenics in claims that it was in fact a means of matching moral claims with moral action, and that its new formulation as "gender identity" wasn't eugenics at all. Yet it is transparently inconsistent in its logic: after all, if a person merely needs to be allowed to express a "gender identity" to cure any disorders of reasoning and perception they may be suffering and in general to live a fulfilling life, then it shouldn't be necessary to do anything different. They shouldn't need drugs, surgery, or a license to try to control how other people speak and think whether or not those other people are in their presence. But according to the activists, these things are absolutely necessary. And the carrot for those persuaded that they have "gender identities" is a cure for all that troubles them, and a new and intoxicating power to control the words and actions of others that the youngest of them have probably never had before. A more dangerous and potent means of persuading those who would have been labelled "defectives" as recently as fifty years ago that they should comply with the new eugenicists could hardly have been developed.
Eugenics persists like a bad rash, and like a bad rash, if not sensibly treated and guarded against it will rapidly develop into an awful, seething mess. Like any dangerous rationalization that empowers authoritarian inclined people, it can't be vanquished by calling out one designated super villain, or by scapegoating a specific group as the "cause" of how eugenics has made its way back to social acceptability. Only the very sensible and compassionate approach of many people involved in the care and healing of people suffering because puberty is difficult, patriarchy is vicious, and disorders of perception and thought are cruel, works against eugenics. That compassionate approach does not involve promptly reaching for the drugs with one hand and the scalpel with the other. (Top)
Ah, That's Where That is From (2022-09-26)
Having written a thoughtpiece discussing George Orwell's book 1984 fairly recently, a few details about it were still fresh in my mind, including one feature that has always puzzled me a little. Not puzzled in the sense of doubting its plausibility per se or anything like that, but puzzled about how he came up with it, because it is really a bit strange. The detail in question is the (in)famous televisions that watch people back and are required in every home. Why this seems odd to me is simply that in a total system, a few people not having one of these devices is no big deal. If you don't have such a set installed in your home for whatever reason, and the fictional government knows all about that, all they have to do is require that you go and watch the mandatory programming with someone who does have one, who will in their turn be required to provide access to their set. I could see that being a version that would please Orwell because of how much intrusiveness it would entail, and all the extra angles for someone to tattle on the person without a set. So what could have inspired that bit of detail, bearing in mind that I make no claim that it definitely did?
By now anyone who is british or very familiar with how things work with television service in the uk is probably rolling their eyes in annoyance or chuckling at my naïveté, because they know all about the damnable television license fee that funds the bbc. And yes, it is damnable.
A bit of search engine puttering soon revealed the basics of this fee: it was introduced in 1946, when the bbc was still the only television broadcaster around. Orwell would die four years later, so he got to see its introduction and become familiar with the details as he was working on his last novel. Not paying the license fee is a criminal offence that leads to additional fines if the person who hasn't paid has no exemption and is found on inspection to have any device by means of which they can view programming broadcast by the bbc. There are literally inspectors who check this, and a whole office that sends reminders that the license fee has to be paid. There is no opt out, and that includes by not having a television, since people can now watch broadcast programming on all sorts of devices. It doesn't take much imagination to realize this means a horrible number of impoverished people, the majority of the time women, have been jailed for not paying this fee. So in a creepy sort of way, britain has already had a television that watches people back for quite some time.
Now, to be sure, canada is not all squeaky clean on this point, because the cbc is also a state funded broadcaster, which means taxpayers fund it and we don't have a choice in the matter; the only way to evade providing a cut to the cbc is to evade paying taxes. But on the other hand, the cbc is provided a budget each year by federal legislation, not via an imposed separate fee with criminalizing penalties attached for not paying it. Plus, canadians with low incomes can in many cases be exempt from most if not all taxes, an important detail for those who are doing without as much as they possibly can to somehow get by. Criminalizing someone for being unable to pay a license fee for watching television is completely disproportionate, and suggests somebody was seriously and shamefully asleep at the wheel when the original british legislation on this point was proposed. It would make more sense to at least make it easy to be exempted from the fee by reason of low income as a stopgap while the larger question of whether the bbc should be funded that way at all is considered. How to fund the cbc is being debated here, and unlike the bbc it does carry commercials and does not have a lucrative overseas media and product tie-in sales division.
In any case, such is the bit of detail I learned that provided some new perspective on a detail from Orwell's last novel. It was so dead obvious to the original audience of the book that of course there was no footnote about television licenses in the edition I read. Indeed, I have not observed anyone else puzzling over the orwellian television sets among my compatriots here in canada; the sets just seemed to strike them as an obvious sort of thing for an authoritarian government to have, and this well before the current era of omnipresent cctv cameras and cell phones, I suppose – which is, in its own way, a disturbing thought. (Top)
On Bureaucracy (2022-09-19)
We live in unfortunately interesting times. A range of scholars have argued that part of what contributes to our present troubles is the expansion of technocracy, "the government or control of society or industry by an elite of technical experts," as the OED notes. I find it grimly fascinating how little critique technocracy gets, especially its embodiment in such blatant egomaniacs in the ranks of the tech billionaires as the infamous Bill Gates, Jeff Bezos, and Elon Musk. They each play their own part in undermining what little democracy there is, as they insist that their capacity to engross massive amounts of money by sharp dealing means they are among the people who best know how to run everything. To my knowledge none of these three specifically have tried to claim that they should rank among the guardians or philosopher-kings we can read about in the always anti-democratic imaginings of Plato, but they do share a faith in narrow expertise being a recipe for being best at tasks involving high social and physical complexity. There is technically an earlier version of technocracy, one that has long existed within colonial governments with figurehead leadership: bureaucracy, which refers to governing by officials rather than elected representatives. (Again, hat tip to the OED.) What supposedly qualifies the bureaucrats to govern is of course their technical expertise. And somehow it is mainstream commonsense that bureaucracy is bad.
To be clear, I actually agree that bureaucracy and its implied updated form, technocracy, are undemocratic and far from just or appropriate means of governing. Both are wrong on the same two criteria at minimum. First, rule by unaccountable oligarchies, however their membership is defined, generally goes exceedingly badly because by nature they are prone to prioritizing only what they value and what benefits them. There need to be checks and balances to keep them from destroying the communities they seek to rule, because depending on chance to provide an enlightened leader or oligarchy, which nowadays we could refer to as depending on heredity, has repeatedly been shown to be at best a fool's strategy. The second criterion has to do with the way that a small community of experts, no matter how genuine and well-meaning their expertise, is prone to losing contact with the material world and with the actual and potential impacts of imposing the things that work so well in their idealized models. All too few pursue properly designed field research and carefully designed longitudinal studies.
There is at least one more criterion these modes of governance fail on, and that is the demand that everyone outside of the little club of experts treat them as earthly equivalents to deities whose ideas and orders must be accepted on faith. In history we have multiple examples of faith-based obedience as a mode of governance failing appallingly. Too many people avoid facing up to the fact that the so-called "dark ages" were not so dark, both in terms of there being written records and in terms of evidence that in many places people were better off than under the former roman empire, where the term applies to that period. The extraordinary power of the christian churches just before the "dark ages" was a net evil for society at large, no matter how expert in theology and its branches the clergy and their allied secular rulers were. Like it or not, any belief system hijacked into an excuse for blind violence and authoritarian control is not a recipe for a longterm and stable mode of running a society. I have read many "western" scholars who decry what they call marxism, communism, or socialism as being secular faith systems that run right to authoritarianism. While I can quibble with the specific definitional details those scholars apply, their mistrust of blind faith and its connection to authoritarian rule is certainly borne out by historical examples, both recent and ancient. But faith-based extremism is not restricted to what is usually recognized as religion or to political and economic ideas challenging capitalism.
Whatever the political or religious inclinations of any single person, a greater source of our present malaise does look to be the extreme expansion of alienation: alienation of people from their own labour, and alienation from their right to take part in the governance and structuring of their own societies. Both labouring and political participation, the latter here in a sense closer to the ancient athenian notion of direct participation, may be done with input and advice from experts. After all, experts are supported in developing their expertise in one way or another by their larger society, and it stands to reason that they would contribute the products of their learning and experience. The question is how these experts should be guided to contribute those products. That contribution could be guided into much better channels if expertise were reframed again as what it actually is, a gift of luck and social support to be shared, not some kind of wholly bootstrapped phenomenon that gives a license to pretend to be some form of divinity on Earth. (Top)
The Inescapable Orwell (2022-09-12)
Way back in 2018, I wrote a thoughtpiece called The Abused Verb "Identify", eventually adding three updates with references to other relevant pieces and a quick explanation of why I found George Orwell so politically objectionable. It seems like references to some of his word coinages are constant these days, understandably. In many ways, Orwell is inescapable, especially his last novel, 1984. Alas that he is inescapable for the wrong reasons. For the wrong reasons, and often with little to no recognition of the things he got reasonably right, although why he got them right is rarely understood. I have sympathy for why this is rarely understood, by the way, because few people have managed to read 1984 all the way through, and there is more than a little bit that is absolutely contemptible about Animal Farm. I find it intriguing how rarely the latter is referenced to this day, and it is actually heartening to me that it isn't. In any case, in my brief explanation I referenced Isaac Asimov's review of 1984, noting that I appreciated that Asimov took issue with it as a novel and with the persistent efforts to force it into the category of science fiction. The review is relatively long, and very revealing about Asimov's own politics and attitudes, as such reviews must inevitably be. His politics don't impress me, and some of what he didn't credit Orwell for getting right was probably too difficult to see for a united states citizen in 1980. The post-world war ii halo of prosperity had not quite receded, but things were starting to look scary, the actual size and state of the military-industrial complex was still not quite visible, and most people seemed to forget that even Adam Smith denounced corporations. Like many mainstream writers, even those with immigrant backgrounds, Asimov mistook considerable technological change for evidence of how inherently inventive and advanced the united states in particular and "the west" in general were.
Now we have all bitterly learned that this was the result of a wracking brain drain instigated by two world wars and then driven by "the west."
Today the attempts to force 1984 into the science fiction genre have mostly abated, and not many people are inclined to argue, as Asimov did, that the novel was "engaging in a private feud with Stalinism, rather than attempting to forecast the future." A lot of the plausibility of this argument depends upon the belief that "the left" is incapable of doing much really constructive because it is so riven by infighting. Today it is hard to miss how much infighting there is on "the right," including in the united states, where anybody left of Franklin Roosevelt – who was not a leftist, but a pragmatist trying to head off a worker's revolt while it was still possible – has been almost completely driven out of politics. That said, I think that a big part of Asimov's distaste for Orwell's writing from the potential science fiction perspective overlaps mine in that there is no future in the novel. Asimov seems to think that the novel is permanently aimed at the past. For my part, I think the novel is actually a nasty reflection of the present at the time of its publication, and not of the imagined present in the soviet union invoked by Asimov, but in england itself. Much of what Orwell describes are aspects of what he was living and participating in, with the serial numbers barely rubbed off via exaggeration and some sarcastic tweaks. The sarcastic tweaks caught Asimov's eye in particular, from the "vaporization" of enemies to the "ink pencil."
UPDATE 2021-10-05 - I have recently been pointed to an intriguing two part series dealing with the notion of "the great reset" that is further evidence for my argument that Orwell was writing about his present, not a past or possible future. Overall I have some doubts about the strategic culture foundation, mostly because I cannot find a clear indication of where their funding comes from. Its contributors have a range of political views that at least appear to agree that authoritarianism and totalitarianism are evil, which is encouraging. Otherwise consistency is not wholly obvious, so as with wsws.org a reader's mileage will certainly vary from day to day. I also found that to get its pages into a readable state I had to turn off the site stylesheet – this can generally be done by switching to a reader view, or using the "page style" option in your browser's "view" or "window" menu. With all this information duly provided, the two articles are by Cynthia Chung, posted on 17 and 21 september: The Great Reset: How a 'Managerial Revolution' was Plotted 8 Years Ago by a Trotskyist-Turned-CIA Neocon and How the Great Reset was First Thought up by the Original Proselytizer of Totalitarianism and the Father of Neo-Conservatism. It is genuinely fascinating how long descriptive titles have become a hallmark of independent journalism.
UPDATE 2023-09-21 - Member of the naked capitalism commentariat Cristobal provided some further context and information related to George Orwell and his time writing propaganda during world war two recently that pertains to this point as well. "Recently read a book about the excellent Spanish journalist Manuel Chaves Nogales' time in London after he had to leave Spain, and then France just a step ahead of the Nazis (Los Años Perdidos). Chaves Nogales went to work at the British Ministry of Information cranking out propaganda for the Allies, to be sent to South America. As a refugee (and decidedly anti-Nazi) he had little choice. George Orwell also briefly worked for the Ministry. From the account of Chaves Nogales' time there, Orwell's 1984 was not about a future dystopia, but about an actual one: the Ministry of Truth as he had experienced it. The Brits have been in the propaganda business for a long time."
From today's vantage point, it is strange to read Asimov claiming that there were no wars anymore, and that war wasn't being used to prop up the united states economy. Alas, Smedley Butler's essay War is a Racket never has been as popular a read as it should be. Asimov modulates this claim a bit further on, but he is wrong on this point and Orwell was right. As a former british civil servant who had worked "in the colonies," including british india, Orwell saw that britain was at constant war in an effort to hold onto the infamous british empire after pursuing constant war to make it so big. Seeing britain hollowed out by this warmongering did not escape his notice at all; it is part of what aroused his disgust. There's an awful cross-cultural theme among british and british-descended anglophone cultures, of contempt for the loser, even if in the end the loser is the person's own country, crown, or whatever group or social structure is at hand. Orwell did identify correctly that imperial powers are always at war, and not just so-called "cold wars" that are really only cold for the metropolitan centres of power. He also got right the importance and danger of language abuse, although Asimov rightly upbraids him for his unconvincing examples. Unconvincing in a way that suggests he was deliberately refusing to face the genuinely dangerous forms of linguistic manipulation out there, heedless of the evidence readily available. Asimov thinks this is about Orwell's personal battle with Stalinism. But I would suggest Orwell's visceral contempt for "the proles" and women in general is a better explanation. These are the groups he saw as mere sheep to be directed by the overwhelming volume, if not intelligence, of the new authoritarians whom Orwell disliked almost as much. It's hard not to get the impression he disliked them for being the ostensible winners of the time.
The level of Orwell's contempt shows all too clearly in both Animal Farm and 1984, but in the latter in particular. It's a miserably written novel, a boring slog with startling glimmers that tend to be the highlights of movie renditions. Think of the infamous "two plus two make five" and "do it to her, do it to Julia" scenes, for example. This stands out in stark relief if a person happens to have read any of Orwell's essays, which in comparison are excellent compositions. His 1946 essay "Why I Write" is justly famous for its charm, and I suspect is an important element in his positive reputation. Mean-spirited as Animal Farm is, it is still well written satire. This is not meant as an indirect argument that Orwell had lost his touch as he spent his dying days writing and readying his swan song novel for the press. Not at all, he was unquestionably in control of his medium. Clearly he deemed it necessary for the experience of reading the novel to itself evoke the stasis of the quasi-fictional world it depicted. So, while I don't like it, that is actually an aspect of the novel I can grudgingly respect. Orwell was bound and determined that reading 1984 should not provide a sense of catharsis or hope, and he certainly achieved that.
But there is one more thing that Orwell got right that today few people engage with, for the same reason, I think, that Asimov upbraids him for not anticipating major technological change. Asimov deemed Orwell's apparent dislike of technological change and romanticizing of old ways of doing things at best wrong, and could barely hold back his own annoyance with Orwell's contention that science would be choked off in a society that valued conformity above all else. Winston's fountain pen and notebook never struck me as particularly romanticized; rather they struck me as pathetic – and inconsistent. In a society where the television is always on and always watches you, the likelihood that most people would be taught both to write and to read is pretty low. Very recently in history these were taught as completely separate skills, and it has taken a major social revolution to make them accessible to the broader population as a more general rule in society after society in asia and much of africa. There again, it seems to me that the more important thing Orwell observed correctly was the pathos and uselessness of purely individual resistance. Individual consumer resistance alone is indulgence and doesn't necessarily mark strength of character. But let's go back to the technological change point, because here Asimov took as given what many of us did, because this is what we were taught, myself definitely included: technological change is always for the better. Period. There may be outliers and little problems, but more technology will solve them. Technological change means everything is different and new and better now.
Let's pretend for a moment that we are not aware of global warming and the dangerous climate changes it is driving, or the crazed, anti-life responses to the COVID-19 pandemic, among other such appalling things. Instead, let's reread something published in 1972, something that at the time was no doubt sniffed at as polemic and exaggeration. Older readers will probably pick out who the authors are instantly within a few sentences.
One can begin almost anywhere in compiling a list of problems that, taken together and left unresolved, mean disaster for us and our children. For example, the number one health problem in the United States is mental illness: there are more Americans suffering from mental illness than from all other forms of illness combined. Of almost equal magnitude is the crime problem. It is advancing rapidly on many fronts, from delinquency among affluent adolescents to frauds perpetrated by some of our richest corporations. Another is the suicide problem. Are you aware that suicide is the second most common cause of death among adolescents? Or how about the problem of 'damaged' children? The most common cause of infant mortality in the United States is parental beating. Still another problem concerns misinformation – commonly referred to as 'the credibility gap' or 'news management.' The misinformation problem takes a variety of forms, such as lies, clichés and rumours, and implicates almost everybody, including the President of the United States.
Many of these problems are related to, or at least seriously affected by, the communications revolution, which, having taken us unawares, has ignited the civil-rights problem, unleashed the electronic-bugging problem, and made visible the sex problem, to say nothing of the drug problem. Then we have the problems stemming from the population explosion, which include the birth-control problem, the abortion problem, the housing problem, and the food and water-supply problem.
These are a couple of choice paragraphs from the introduction to the penguin uk edition of Teaching as a Subversive Activity by Neil Postman and Charles Weingartner. They are not just accepting this narrative at face value, although it undeniably encompasses more truth than anyone would have liked in the 1970s, and in its true aspects it is all but word for word applicable to our present moment. The broad point for my purposes being: more technology did not solve the old problems, although it may have papered them over. While I cannot claim that Orwell was purposefully arguing this, I think he did intuit it, and it bothered him. In fact, the best analysis of avoiding real change and its baleful impacts on society is a science fictional one, and it was written much later, by the inimitable Joanna Russ in her short story called simply, "Corruption."
These are the things that Orwell should actually be inescapable for: that societal corruption actually has as much to do with total individualism as with the valuation of conformity above all else; his perhaps unconscious recognition that technological change often merely re-dresses the old problems with new fancy labels that obfuscate the fact that a constructive and positive change has not occurred; his definitely accidental illustration of how totally destructive contempt for the majority of people's lives, intelligence, and will is, and how that contempt is a terrifyingly subtle poison. Yes, subtle, not least because we can be so easily persuaded that if "we" don't torture or imprison people like "them," we haven't "gone too far." Merely pointing up his recognition of cognitive dissonance as a means to enforce compliance and mental torture is in many ways a red herring. That is hardly new; the cultural destroyers better known as missionaries have been abusing this knowledge for centuries. (Top)
But is it Legal? (2022-09-05)
Some time ago I read a judge's comment on his own decision, that it was neither fair nor just, but that it was legal. Which raises the question: if a decision is neither right nor just, what is the point of it being legal? This is basically a prettied up version of, "It's not my fault boss, I was just following orders." That is a vile excuse, and it should not be accepted from upper level officers or judges and prosecutors any more than from the lowest NCO or articling barrister. Or anyone else. So then, what hellish bullshittery is available to allow a judge to convince himself that showing off the asshattery of the law by going along with it is more appropriate than seeking a right and just solution to the question at hand, apart from whatever motivations he may have had? It is worth looking into, and the good old OED comes in handy for this purpose.
Now, it is not coincidental that there are two terms that sound different but share the same shape of derivation and are related to "the law" as constructed in most european-derived cultures right now. They are legal and loyal, both from a latin term loosely defined as "belonging to the law." Both are familiar words, although maybe "loyal" as something to do with the law is less so. "Legal," as my OED reminds me, means "based on, or concerned with the law; recognized by common or statutory law; permitted by the law." The entry cross-references loyal, which is defined as "giving or showing firm and constant support or allegiance to a person or institution." That definition then cross-references loyal to "regal" and "royal," which have parallel shapes because of analogous linguistic trips from latin to english. The latin word behind these two is "regalis," and that of course refers to something to do with a king, that is, someone who gives direction – "reg-" is the latin root underlying the noun rex and all those derivatives in english referring to direction, correction, regulation, and so forth.
Well, this is all very telling. A "legal" decision simply has to demonstrate support or allegiance to a person or institution. So in the english common law case, that usually means "the crown," meaning, not so long ago in fact, the king. The particular case from which the paraphrase I started with comes could then be seen as reasonable, if the judge deemed himself duty bound to serve not what is right or just, but the crown. The irony that "right" is from the same root as "rectus," which means to move in a straight line and does seem to have a real connection to the "reg-" in "rex," not least because of how often surveyors were the slavish servants of kings, is one that should not escape us. After all, if a legal decision is determined merely by its consistency with the demands of an institution or person, then there is in fact no law at all but might makes right. This is actually a crazy way to try to run a society, but it is easy to paper over how crazy it is as long as the arbitrariness doesn't go too far or injure so many people that there is a rebellion or a regression to authoritarianism. (For those keeping track, the "reg" in "regress" is not a root; the root is "gradi-" and the "re-" part is a prefix.)
Most of us have been coping as best we can with a current slide toward arbitrary might makes right behaviour. As seems usual with such things, it picks up steam via a subculture of extremists claiming they have a right to what they want because they want it, and that if they can't get it they will do violence to get it. The more sophisticated and nasty, who don't like to get their hands dirty fighting, have been trying clever legal manipulations instead, because they appreciate better than anyone that most mainstream law is still preferentially built around a might makes right justification. They simply select a more urbane sort of expression of might, usually based on lots of money combined with manipulation of access to information. This is not at all new. Today this is achieved via computers; in ancient times it could be done by keeping most people illiterate, simply by pretending that reading and writing had to be limited because they were supposed to be either too sacred for mere mortals or too filthy for anybody but money grubbing merchants and bankers.
We do have practical means to short circuit such terrible nonsense, including correcting approaches to law that are neither fair nor just. Besides supporting access to information and free debate, the more important tool is building and applying a capacity for sympathy. A genuinely useful question to ask before imposing a punishment or any other condition on another person who is unwilling to suffer it is whether we ourselves would care to be subject to the same thing. By itself the answer to that question is not enough in many cases, but it is a powerful and important start that often makes it very hard to go for arbitrary, might makes right nonsense. After all, by nature "might makes right" has no end; it must always become more extreme, because the people who benefit by acting as bullies must inevitably fear getting their comeuppance, and so they must tighten the screws of control. Hence the way societies with this sort of "law" slip towards authoritarianism in an appalling hurry. (Top)
A Typical Accusation (2022-08-29)
Women one way or another wind up having to get used to being idly accused of ridiculous things simply for existing, let alone for not just going along with whatever someone else, usually a man, tells them. Many of the accusations are deployments of loaded terms that have been inappropriately reused in other contexts to carry at best loosely related meanings. My own observations, and the studies I have read by linguists ranging from Deborah Tannen to Dale Spender, demonstrate that men are taught how to abuse language in this way from when they are children. I mean this quite literally: the first time one of my then early teenaged male relatives accused me of being "paranoid" he was barely fourteen. Due to some more recent reading and research I finally realized that, truth be told, this term that is so widely used as an accusation against women and girls with political or environmental interests is one that in practice seems to mean "you are bad and crazy and make a fuss over nothing." Well, that's a colloquial meaning to be sure, but let's see whether the OED editors have found something similar. They start with a definition that probably comes from psychology and psychology oriented sources, rather than a "speaker on the street" one:
a mental condition characterized by delusions of persecution, unwarranted jealousy, or exaggerated self-importance, typically elaborated into an organized system. It may be an aspect of chronic personality disorder, of drug abuse, or of a serious condition such as schizophrenia in which the person loses touch with reality.
I highly doubt my young relative had any idea of this definition at all. Since then of course many people have seen the 2001 movie or read the book A Beautiful Mind about the mathematician John Nash and his struggles with paranoid schizophrenia. Or maybe they have seen the earlier 1996 movie Shine, about the australian pianist David Helfgott. It is quite likely that the term "crazy board" is a reference to the movie about Nash, because the scene where his own board is discovered is arranged for peak impact. But now let's look at the second definition the OED records – and I should acknowledge that perhaps these definitions are now re-ordered or elaborated differently, since the edition that lives on my desk dates to 2005.
suspicion and mistrust of people or their actions without evidence or justification.
Now this is much more like what my relative meant. Part of why the conversation in which he deployed this absurd accusation stayed with me is that, hilariously, I hadn't said anything that really had to do with suspicion or mistrust. In fact, the accusation came up in the context of acknowledging that a recent family gathering had been a bit uncomfortable for everyone, mostly because of a miscue with respect to the menu. Miscues are by definition accidental. So what made this memorable is that at the time I couldn't figure out what that had to do with the conversation at that moment. Of course now I realize the purpose on my relative's part was to force an end to the conversation. His attempt was crude because he was still learning this dishonest verbal bullying technique, and hopefully his sense of honesty was also undermining his ability to apply it.
Now, this use of idle accusations against women and girls alike is far from new, and it is used all too often as a pretext to treat us badly. More recently, men and boys have been getting caught out by these sorts of accusations because they are perceived as in some way effeminate. But whether perceived femininity is in play or not, such accusations usually focus on the person's supposed mental illness or stupidity. It has been grimly fascinating to watch this extended in the context of popular redefinitions of systemic issues such as sexism, racism, and so on as not systemic but manifestations of individual mental illness. That these are systemic issues, demonstrable with respectable techniques rather than the infamous "crazy board," and that these demonstrations have already been carried out has not stopped the change. Of course, rhetorical techniques are about treating conversation or debate as analogous to some sort of game in which there has to be a loser and a winner.
To be sure, it is understandable that a person faced with a difficult conversation, or one who assumes that somebody noticing discomfort or a miscue must be making an invidious critique, sometimes responds aggressively in order to shut the conversation down. That it is understandable does not make it the right approach, nor is it particularly effective for anything but shutting down a conversation or diverting it into an unproductive argument. (Top)
Arithmetical Puzzles (2022-08-22)
There are a few websites with solid and constructive comment sections out there, alas all too few. Still, the best of the survivors tend to be those where yes the moderators are committed and consistent, and the commenters themselves are still in more of a mindset to seek to add constructively to the conversation. Despite the ubiquity of means to vote comments up or down, I have noticed that the websites where I find the commentariat is at its best lack this means altogether, and people are discouraged from making statements like "+1000" or what have you. Obviously my sampling is small, unscientific, and not cross-culturally informed as such, because most of the sites I read that have comments are in english and based in primarily anglophone countries. It may well be that in different speech cultures, whether ones where english is commonly used but most participants are multilingual, or ones where no english is spoken or the primary language is not indo-european, comment voting works to the benefit of the conversation. To be sure there is no simple answer to whether comment voting feeds the type of competition that we might hope for, a competition that brings the best comments to the top, because the criteria for "best" are not necessarily agreed on or clearly defined. Hence most sites with standard voting algorithms would likely have buried the comment with the embedded link to a blog post debunking the assumption that people using roman numerals couldn't do basic arithmetic easily.
When I sat down to really think about this claim, it dawned on me that actually, it doesn't make any sense. The four cultures often held up as engineering masters of all time by mainstream histories are the ancient egyptians, greeks, mesopotamians, and romans. At minimum we can agree that impressive monumental engineering works were undertaken by communities recognized as part of each of these groups and their associated empires. They often produced these works with remarkable speed, and nowadays better archaeological work including experiments, excavation, and mapping has given the lie to claims that if they did not manage this by sorcery, they must have managed it by mass slavery. (Note this is not the same as denying the role or importance of slavery in those cultures altogether. That would obviously be wrong.) There is no way to get that much done that fast without having a reasonably quick means of doing the necessary arithmetic. But that of course raises the question of how people did arithmetic that looks very difficult to do without hindu-arabic numerals. Of course, it's the appearance that has caught us out.
UPDATE 2021-07-27 - I should also add that one of the commenters on the blog post that helped inspire this thoughtpiece was Eleanor Robson. Her book Ancient Knowledge Networks: A Social Geography of Cuneiform Scholarship in First-Millennium Assyria and Babylonia is available as a free download from the publisher, university college london press, diagrams, pictures and all, and there is a supplementary website as well.
The blog post that provides a clear rebuttal of the claim is at The Renaissance Mathematicus, titled "The widespread and persistent myth that it is easier to multiply and divide with Hindu-Arabic numerals than with Roman ones." I think that the blogger has done an excellent job of showing that it is a combination of unfamiliar appearance together with an unfamiliar calculation method that makes this myth seem plausible. Well, that and of course not stopping short in puzzlement at how people could have managed so much work that did actually require regular use of arithmetic. It is well worth reworking the examples with roman numerals to really get a sense of how quick and yes easy these other calculation methods are. Maybe part of what also catches us out is that today we are accustomed to having to show our work on paper, whereas paper even if available would have been too expensive for such temporary uses. Therefore people used counting boards and the like, which were regularly wiped clean; early steps often vanish as the next steps are taken with the counters in any case. Plus, from what I have been told, regular instruction on times tables and practice drills on arithmetic operations are not common practice in elementary school any longer. So that makes calculating without an electronic helper seem harder than it is.
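To make "quick and easy" a little more concrete, here is a minimal sketch of one ancient multiplication method that needs no positional numerals at all: repeated doubling, often called egyptian or duplation multiplication. This is an illustration of the general family of methods, not a reproduction of the specific examples worked through on The Renaissance Mathematicus, and the function name is mine.

```python
def duplation_multiply(a, b):
    """Multiply two positive integers using only doubling and addition,
    the kind of steps easy to carry out with counters on a board.
    No times tables or hindu-arabic place notation required."""
    # Build rows of (count, multiple) pairs: 1*a, 2*a, 4*a, ...
    rows = []
    count, multiple = 1, a
    while count <= b:
        rows.append((count, multiple))
        count, multiple = count * 2, multiple * 2
    # Pick the rows whose counts sum to b, adding up their multiples.
    total, remaining = 0, b
    for count, multiple in reversed(rows):
        if count <= remaining:
            total += multiple
            remaining -= count
    return total
```

Every step is either a doubling (adding a number to itself) or an addition, which is why such methods were fast on a counting board even though they look laborious written out in full.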
From that blog post I followed a related link to another on a renaissance-era multiplication method. This one would baffle most people even though it uses hindu-arabic numerals and there is an extensive write up of the steps. A write up, but in a format that at least I found rather difficult to parse without reading the post that inspired this one at JF Ptak Science Books. In some ways the method strikes me as an example of mystification of what is quite simple multiplication of two four digit numbers. Simple, but even done the way we are customarily taught to do it today, easy to make mistakes with, because number carrying and writing out the stack of sums lends itself to errors. In my elementary school years, when we were taught today's standard method, we were required to rule up the numbers' places in a little graph that helped us keep ones, tens, hundreds and so on together properly. Back then it was because we were learning about place notation and the like, but it is also quite necessary once you get to working with larger numbers by hand. We weren't to know that then, and everyone knew that calculators were allowed in grade five. Why I think this renaissance method is a mystification is that it amounts to a fiddly way of doing what we did, without being too clear to somebody watching how it was done. As The Renaissance Mathematicus noted, this was a method taught in university in its time.
All of this is of course a long way indeed from the puzzle of comment section culture and how to somehow automate comments into quality via crude toting up of votes for and against. And yet, there are some intriguing accidental parallels in the sense that the arithmetic is simple, but the final results can be mystified by not sharing the method used to weight them. After all, where comment voting is concerned, it seems highly unlikely that the results are not tuned somehow when those votes are part of a "social media" instance as opposed to a simple blog. Regardless of how that works, I would be interested in reading any research that has examined the impact or not of voting on comments, and whether my impression that it tends to punish comments with links is correct. Who knows, maybe on some sites following a link in a comment auto-increments its vote count, which would be interesting. (Top)
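For a sense of what a less crude toting up of votes can look like: one publicly documented weighting is the lower bound of the Wilson score confidence interval, which reddit has described using for its "best" comment sort. It keeps a comment with nine upvotes and one downvote from outranking one with ninety and ten, since the larger sample earns more confidence. A minimal sketch, assuming simple up/down counts:

```python
import math

def wilson_lower_bound(upvotes, downvotes, z=1.96):
    """Lower bound of the Wilson score interval for the true upvote
    fraction, at roughly 95% confidence when z=1.96. Rewards both a
    high upvote ratio and a large number of votes."""
    n = upvotes + downvotes
    if n == 0:
        return 0.0
    phat = upvotes / n  # observed upvote fraction
    return ((phat + z * z / (2 * n)
             - z * math.sqrt((phat * (1 - phat) + z * z / (4 * n)) / n))
            / (1 + z * z / n))
```

Of course, nothing obliges a site to disclose whether it uses this formula, a raw difference, or something tuned behind the scenes, which is exactly the mystification at issue.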
"No, Seriously, You're a Duck" (2022-08-15)
Absurd titles and cute penguins aside, this does roughly correspond with a fascinating phenomenon I have run into several times now, in quite diverse circumstances. Being a writer, I am quite familiar with the concept of a character or persona, and with the fact that while these are fictional constructs that romp through all manner of novels, plays, movies, and so on, they are too simple to correspond to real people. Done well they may suggest real people, and may create such a sensation of realism that audiences may talk about them and argue about them as if they were literally real. Of course, such characters are "real" insofar as they capture a solid sense of what a living, breathing person could be like in their particular circumstances. From such successful captures do whole edifices of fandom and debate arise, whether it's something a bit lighter such as Sherlock Holmes or the perpetually multiplying yet never very different legions of modern day comic books, or something a bit deeper such as Clarissa Dalloway or Jean Valjean dealing with the great moral and existential questions of their creators' times. Yet it seems that at this particular moment in anglophone societies at least, characters have gotten utterly loose from their usual moorings and are now busy getting in the way of actual people. That is, a surprising number of us are going about the world acting as if other people can all be categorized as a certain type of character, whose beliefs, plans, and general psychology we know everything about and can totally predict. The confidence in this knowledge and ability to predict is so high that based on minimal evidence, the future behaviour of these people is taken as a given, and they can be treated according to that given. This does not sound at all sensible, I know. I resisted this as an explanation for the recent experiences and observations that led to this thoughtpiece, but there seems no other plausible explanation.
It does match up with the more broadly recognized phenomenon of extreme partisanship that many people are expressing rightful concern about. Let's take the easier example first, easier in that it seems rather obvious.
Political polarization is an important topic of concern, and the usual pundits are complaining that it will lead to radicalization. In which case, if those pundits and "the media" are worried about this, truly worried, it is a great puzzle why they keep fuelling the polarization. After all, the way to help people hear and respect each other is not to caricature all "conservatives" as retrograde bible thumpers out to make a theocracy, nor "progressives" as a bunch of people who want total government control combined with complete hedonism. Not only do these sorts of caricatures make no actual sense, they create pseudo-paradoxes. There is actually nothing at all strange about being what is currently labelled "socially conservative" and "fiscally progressive." Arguably the people who are big on enforcing what they deem a socially conservative form of social organization based on the patriarchal nuclear family are wholly consistent in being in favour of social spending, especially if they see that type of spending as a means to maintain the patriarchal nuclear family. But the meaning of "social conservatism" is not broadly agreed on any more than that of "fiscally progressive" is. For all I know without talking to other people to learn what they actually think about these things, they would reject all those words and focus on more specific, concrete goals, like ensuring everyone who needs a decent job and housing has them. Then the intriguing follow up question is how we could work together to make that happen. There is no way we can live together in a good way if I or we as a society act based on lunatic labels developed by public relations wonks who want us to treat everything like a sports event with two teams where we can only cheer for one of them.
In that context then, we've all heard of the characters. Obviously I am not a resident or citizen of the united states, but even I have heard of the "deplorables" versus the "progressives." Or the "woke" versus anybody who can't keep up with the latest developments in what seems less and less like striving for "equality, diversity and inclusion" and more and more like a secularized version of the type of social violence rampant under extremist calvinism. There is a version of this in canada to be sure, and from the sound of it in australia, new zealand, and the united kingdom (such as it is). They are not exactly the same, but they do all share this construction of characters placed into "all good" and "all bad" categories, and they are taken as the default nature of everybody. How a given person gets categorized depends on whether they are able to pronounce the right shibboleths at the right time; nothing else about their behaviour counts. This is the sort of garbled thinking that easily sparks mobs. The subtler version of this inappropriate use of "characters" may have developed by a sort of diffusion from this more blunt and, for most people, more obviously alarming instance.
The subtle version is quite strange and disorienting to encounter, because it may take some time to realize that is what is causing wild communication issues with another person. For instance, in my own recent experience, an acquaintance of mine became intensely upset with me after it became clear to them that despite their certainty about me, they were incorrect about my beliefs and plans. This person was utterly certain, beyond a shadow of a doubt, that I did not have a real profession and therefore that I was busy lining up and going through interview after interview to get a "real" job, move to a completely different city, and so on. Now, as it happens, I do have a "real" job, and no, I have not been seeking to interview for the types of job this person had in mind, and at no time have I indicated serious interest in moving away from my current city of residence. I had attempted to correct this person's misunderstanding before, but eventually reality could no longer be fended off by what my acquaintance had started out so sure about. They were so invested in this idea of me that it was a shock to them to learn otherwise.
I actually ran into this earlier in quite a funny case. In that case, a fellow student who knew that I am rather fond of Star Trek decided to ask if I was excited about the latest spin off. At that time it was the new Patrick Stewart vehicle, Picard. I answered, perhaps unwisely, that this particular spin off didn't strike me as particularly interesting at all. To which my colleague replied, "Oh that's right, you're political." I remain somewhat baffled by what still strikes me as a non sequitur, although I have learned that this response was apparently based on the assumption that I thought the next series should have featured a woman to be "politically correct." Truth be told, I just figured then and still do now that after seven years of one series and a handful of movies, I'd rather see some fresh characters, not just the life squeezed out of the most recent crew of the fictional Enterprise until they are all too old to carry it off anymore. Bill Shatner is a great guy and has made a spectacular career out of his peculiar line delivery, but his last turn as James Kirk was sad, let alone the awkward script used to let poor Brent Spiner out of playing Data in the last "NG" movie. In any case, my colleague's response basically killed the conversation dead. He was already sure what I was going to say and what I must think, so there was nothing left to talk about. Off he went, and so missed that I think it would even be neat to have a proper Star Trek series set completely in the mirror universe.
Lighthearted examples aside, there is nothing respectful or healthy in presuming a person's entire experience of the world, past and potential, based on a stereotype of their presumed equivalence to a "character." How much must we inevitably miss if we start out on meeting someone by immediately categorizing them and deciding everything about them based on the presumption that they are "political," "religious," "refusing adulthood," "woke," "deplorable" or whatever. Of course, by now, it is obvious that what I have been calling "characters" are in this context our old enemy the stereotype. (Top)
A Little More of the RSS Saga (2022-08-08)
It looks suspiciously like while everyone else is looking at google pretending to deprecate "amp," the coding format it demanded news publishers follow to win top spots on google news searches – and most of them rushed to comply rather than simply stop inserting the anywhere from 300 to 1 500 lines of scripting bloating their pages – google is busy endeavouring to do more anticompetitive and anti-open web damage with its other virtual hand. Way back in 2007, google bought out a promising small company providing rss feed management services called feedburner. Blogs were still kind of a new thing then, and blogger interest in tools to help them manage their rss feeds to syndicate their posts and track their subscription numbers was high. The company also supported email lists for blogs, and in short order all these tools were being used by a newer blogtype, the podcast feed. Things looked quite rosy for feedburner, and the google brain trust apparently saw it as both a possible threat and a candidate for co-opting to insert advertising and thereby increase its access to data from both bloggers and their subscribers. Feedburner was duly purchased and shepherded into google's notorious maw, and no doubt many excited toasts were shared by the founders of the small company made good and the people who still saw google as likely to improve the software and extend the tools.
Then things started looking sinister in 2012 when google shut down the feedburner APIs and adsense access. As might be expected, in time google forced feedburner users to make google accounts and sign in through those so it could shut down feedburner's original independent sign in service. In my research on it, I found worried comments popping up about how there didn't seem to be any effort put into modernizing the feedburner interface; not that it didn't work, obviously it did, but the implications just weren't very nice, and google already has a reputation for leaving the remnants of absorbed small companies to wither without adequate technical and resource support prior to shutting them down. In 2019, the email lists were shut down. Then everything went quiet again, and it seemed that google had opted for a position of relatively benign neglect. Until this year, when google began talking about moving the feedburner software and accounts to newer infrastructure, enforcing a period of "maintenance mode" during the server move. This seems awfully weird for google to need to do. We're talking about google here, which owns so many servers and has so many staff that it can always run a new and old system in parallel to shake loose issues on go live prior to shutting the old one down. Google has reassured all feedburner users that their feeds will be unaffected, they just won't be able to access the service tools for awhile.
Now personally I don't have immediate skin in the feedburner game, not being a user of that suite of services. But I do have skin in the game of maintaining and improving the availability and utility of rss feeds. Google killed its own feed reader, which signalled a certain lack of interest in the format, and all manner of articles are now available online telling us how to use google news like an rss feed reader. Overall, it is fair to say that google's business strategy is based on trying to persuade everyone that the old aol was a good thing, as long as it is the new google and not, say, such interlopers as facebook. Either way, I suspect that google found that there was not enough new data to grab worth the effort once it decided to remove adsense from the product hooks added to feedburner. It may be that there is still a doughty team of coders and managers fighting for feedburner's life within google. But I suspect that no matter how slowly google goes about killing it, feedburner is doomed.
Into the growing breach, a range of new companies have sprung up trying to recolonize the niche feedburner used to dominate. There are now blog software plug ins that provide many of the tools that originally made feedburner unique and a must have for the busiest bloggers and podcasters managing their subscription lists and syndication schedules. Even though rss feeds are actually not difficult to make, many people either don't realize that they can work without at least the auto-feed generation tools or prefer not to, which is fair enough. The stubborn challenge is what to do about syndication, and dealing with the different demands of different distribution services, especially for podcasters. Anyone who has figured out how to get the rss feed for a podcast out of itunes or spotify (to pick just two widely used examples) knows how the lack of a standardized way of naming rss feeds is the least of a given podcaster or blogger's challenges. This is where the braintrust at google likely think they have a perfect set up, because to their mind the dominance of their search engine and advertising sales makes them the syndicator par excellence and anybody else can jump. After all, if your site doesn't get solid search ranking in google's constantly manipulated algorithms, well what can your rss feed do for you?
For getting the word about rss feeds out there, and indeed about websites in general let alone blogs and podcasts, there are options. If the website, blog, or podcast is not on an already heavily covered topic, then they are already far ahead. Support and recommend non-mega-corporation search engines to help visitors learn about them and try them out. The smaller companies like mojeek can't improve without people actually using them. And believe it or not, there are latter day versions of what yahoo and excite! used to be out there for rss feeds, in the form of rss directories. Better still, whether or not they have reason to maintain or syndicate an rss feed themselves, one of the best things people perusing the web can do is subscribe to the feeds of sites they keep going back to. Sometimes the rss feed subscription link is not obvious, but in a pinch, have a look at the page's source. This can usually be done by accessing a right-click menu or selecting the option from the "Tools" menu if there is one in your browser. Then just run a quick search for "feed" and usually the link will pop up right away for copying and pasting into your rss reader. It shouldn't be that fiddly, but for that to change, browsers need to restore the automatic "rss feed available" subscription button beside the address bar. (Top)
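The manual source-search described above can also be automated. Most sites that advertise a feed do so with a link tag in the page head of the form <link rel="alternate" type="application/rss+xml">, which is what a search for "feed" usually lands on. Here is a minimal sketch using only the python standard library; the class and function names are mine, for illustration only.

```python
from html.parser import HTMLParser

class FeedLinkFinder(HTMLParser):
    """Collect the href of any <link rel="alternate"> tag that
    advertises an rss or atom feed, the same links a manual search
    for "feed" in the page source usually turns up."""
    FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # attrs arrives as a list of (name, value) pairs
        if (tag == "link"
                and "alternate" in (a.get("rel") or "").lower()
                and (a.get("type") or "").lower() in self.FEED_TYPES):
            self.feeds.append(a.get("href"))

def find_feeds(html):
    """Return the feed links advertised in an HTML page's source."""
    finder = FeedLinkFinder()
    finder.feed(html)
    return finder.feeds
```

Fetch the page however you like, pass the source to find_feeds, and paste the result into your rss reader, which is exactly what the now-removed browser subscription buttons used to do automatically.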
Fine Then, What About Postmodernism? (2022-08-01)
In comparison to "modernism" the notion of "postmodernism" has been shaped and reshaped in ways so bluntly cynical that discussing it often triggers the sort of polarized screaming matches so characteristic of what passes for political engagement right now. Going back again to the OED, we can read of postmodernism that it is "a late-twentieth century style and concept in the arts, architecture and criticism that represents a departure from modernism and has at its heart a general distrust of grand theories and ideologies as well as a problematical relationship with any notion of 'art.'" This partly tautological definition is so full of problems that it is little wonder that the word it is meant to define is so good for seeding slanging matches. For one thing, I think it is quite clear that "modernists," in the terms explored in the previous thoughtpiece, were also none too enamoured with anybody's grand theory or ideology. The majority of them were profoundly alienated from any established religion, for example, and they can't all be neatly divided into dogmatic followers of specific political parties or theories either. Nevertheless, I think we could fairly characterize much of what can be taken as "early postmodernism" as a backlash against modernism, especially the widely shared "modernist" desire to find commonality and ways of living together in spite of diversity that could and did make this very difficult. But the more important core of postmodernism seems to be extremist antimaterialism, including its deadly correlates of womanhating and necrophilia.
This is in no way an extremist description. It is clear that those seriously committed to the "postmodern" project are hostile to any acknowledgement that there could be any certainty or consistency, with or without human intervention. That is just the sort of impact that an insistence that the world is just ideas is prone to having. In architecture this apparently leads not just to pastiches and refusals of simplified, easy to mass produce designs, but to buildings designed to look like they are warping or possibly falling down. I have seen interior designs of newer condos that actively hinder the use of their spaces, making it near impossible to easily use the kitchen or have any possessions apart from the absolute minimum, because we shouldn't be so gross as to have "stuff" when the real world is ephemeral "ideas." Some parts of the apparent revulsion at what they label "modernist" seem actually to be disgust with mass produced objects, which by nature resist modification to meet individual needs. Extreme idealism also goes with hyperindividualism, because everyone must have their own ideas alone. If they have the same idea, they each have their own fixed copy. Unlike messy biology and other materialist stuff, there is no time in the rarified zone of ideas of course, which actually may explain why so many postmodernist musicians are apparently hostile to actual music.
Since the postmodernists are very much in the ascendant right now, with their authoritarian demands to impose a single set of perfect once and for all ideas on the rest of us, it should not surprise us how much emphasis there is on appearance and performance. After all, what else is there if there are only ideas, and only ideas are important? It's logical, but quite bonkers. I am sympathetic to the leaning toward trying to find the right ideas, because much of our education is still based on materials developed in the previous period of extreme idealism, when the catholic church had a real pretence to ruling much of the world that europeans knew about. It's hard not to find this sort of thing convincing, precisely because those earlier sources shared the same root ideas, including a belief that if they could force everyone into sharing the same set of presumed correct ideas, then the world would reach perfection. Literal perfection, because time would end, and then for the christians at least, they figured that would bring the end of the world and their presumed removal to heaven. To make that happen, any amount of intervention, manipulation, and violence was acceptable, because after all only ideas, in the terms of that time "the spiritual," mattered, and the material was the grosser part made real due specifically to sin. Beating the material into submission was supposed to expunge or make up for sin, because sin was supposed to be what was preventing the end of the world. A secularized version of this is what the grand postmodern project is hawking. For a whole range of reasons, a significant number of people, evidently myself included, don't find these ideas convincing, no matter how much we share a desire for a better world. We disagree quite profoundly about how to get there.
Some time ago, I was trying to describe my understanding of how a community-held knowledge base was maintained and passed down. One of my interlocutors told me off for supposedly thinking this must be the most neoliberal possible mode of knowing and knowledge sharing, insisting that it had nothing at all to do with the author whose work I was engaging with as part of learning about this type of practice. The only substantive reason I can think of for this rather aggressive reaction is that my understanding that community-held knowledge is not static, that it does change over time as it must to respond to changes the community experiences, was lost in translation, so to speak. That, and the ways in which community knowledge is anchored in material reality, community relationships, and community-held principles. In the context of that conversation I was mulling over how this method was resistant to the sort of extreme centralized control so often sought in european-derived ways of building up bodies of knowledge. Not that they'd ever use such a materialist metaphor! My point at the time was to emphasize how everyone contributed in a necessary way. The knowing was a knowing together, not a knowing apart.
Postmodernists, among whom I think many neoliberals can indeed be counted, seem utterly convinced that we can't know together unless we are forced to know in the same way, all at the same time, as directed by their elite selves. Without uniformity, including avoidance of all questioning or appearance of questioning the faith, the "ideal," then in their view there can't be any certainty. Since so much of this perspective depends on trying to read and control the minds of others, the proliferation of emotions characterized in english as anxiety, depression, boredom, and fixation should not surprise us. How miserable it is for a person who is convinced of the primacy of ideas and the necessity to constantly demonstrate total faith in only the right ones, only to keep running into the problem that ideas, damnably enough, change! It's exhausting to be constantly shoring up a house made of cards, and inevitably the person doing that must always be afraid the card house will come down.
All of this is not to declare postmodernism simply evil and to be gotten rid of at the first opportunity, despite this terrible record. Ideas are important after all, and it is true that we need to be on guard against persuading others or ourselves into utter nonsense. It is right to try to figure out how best to sort out good ideas from bad ones, to understand how and why our ways of thinking and what we accept change even when it seems to us at first that they are not changing at all. Boredom is unhealthy, and we should be resistant to totalizing uniformity. But it is not true that having a sense of stability is a bad thing, because without feeling confident of the ground beneath our feet, we can't grow. Of course, growth means change and aging, and a horror of aging and its inevitable biological correlates is part of what leads to so much postmodern fascination with death and cheating it. (Top)
Literary Questions (2022-07-25)
Or at least, literary questions as a way into some bigger questions. In this case, the question of what the hell "modernism" is or was supposed to be, and what "postmodernism" was or is supposed to be. These are not simple questions to answer. So much so that even in a book like Diana Souhami's No Modernism Without Lesbians, not once does she venture to give any overt definition of modernism. Part of the difficulty is the dreaded "-ism" suffix, which so often seems to reduce people to arguing the equivalent of the mediaeval cleric's fuel for intellectual masturbation, "how many angels can dance on the head of a pin." Understandably most people, including even most clerics back in the day, respond to such questions with veiled contempt. It's one thing to acknowledge that a question does not have a simple answer, or sometimes any answer, a whole other thing to create a question primarily designed to divide the in-crowd from the out-crowd. It's a shame we don't have a sensible suffix in english to indicate that a term is meant to encompass consistent trends. Probably this is why so many people have tried using adjectival forms like "modernist" or "postmodernist" instead, which doesn't help much.
As usual, we can start by trying out the OED, and see what the editorial team came up with out of the mass of material they had to work with. For modernism the first definition is "a style or movement in the arts that aims to break with classical and traditional forms." Well, okay, yes. But then, the actual artworks covered by this term must change constantly. When romanticism was a current artistic trend, people participating in it strove to break with what they had learned as classical and traditional forms. This is not intended disingenuously, but rather to draw out the point that people are breaking with the past all the time, arguably especially those who are working artists. They won't get much work if they just reproduce what is already out there. But it doesn't take much exploration of the materials that are by consensus counted as "modernist" to appreciate that responding to a market was not the primary motivation for most "modernists." Established critics and older artists responded to these works with extraordinary violence, claiming that these new artworks would corrupt the young and destroy society. There are not wholly apocryphal tales like those of riots following the premieres of Stravinsky's works, or the nazi penchant for putting together exhibitions of supposedly "degenerate art" while eagerly stealing and keeping all the examples they could get their hands on. Censorship of fiction and non-fiction was rampant, including a particular patriarchal obsession with trying to somehow neutralize women enforcing respect of their rights by keeping them ignorant. Furthermore, "modernism" seems to be something that started after the first world war, and then stopped somehow shortly after world war two. What the hell was going on?
A helpful way to make some sense of all this, especially if you are not a literature major, is to read a recent controversial book, or at least its first two or three chapters. That book is The Vimy Trap, or How We Learned to Stop Worrying and Love the Great War, by Ian McKay and Jamie Swift (2016). Whether or not you have any inclination to agree with McKay and Swift's thesis for the book as a whole, their exploration of the disorientation that preceded the first world war and grew into a real cultural and moral crisis after makes it much easier to understand what was happening in "the arts" and indeed well beyond them. To be clear though, McKay and Swift are focussed on the making of meaning and imagery of world war one as expressed through treatment of one of the battles fought at vimy ridge in france during that conflict. They weren't trying to unpack or define "modernism." Yet they drew out key impacts of the primarily male experience of newly mechanized warfare that indicate why old artforms lost their savour for many men and some women from the end of the nineteenth to the early twentieth centuries. The experience of mechanized brutality came on such a wide scale that romanticized portrayals of warfare could no longer be taken seriously, and older artforms seemed designed to leave out too much. The rules of representation demanded the excision or elision of messy human biology, of expressions of madness and terror. They also demanded stasis, a heavy duty idealizing into what now we might refer to as a type of platonic form.
The experience of mass dehumanization in warfare was not new for women at all, so this really doesn't tell us why so many women would become powerful leaders in the development of artforms and styles now called "modernist." Nor was the experience of having to deal with artforms apparently designed to firmly elide or excise their existence and adventures. For them the powerful changes at work included a series of important successes in enforcing respect for women's rights and autonomy, and the impact of the new forms of mass media. The new more widely accessible books and periodicals, whether merely in cheaper forms or via affordable subscription libraries, plus photography, film, and audio, made it easier for women to interact with one another, share pointers, and compare and contrast their lives. On top of that, in this period, many women who could were living as expatriates from their countries in order to make a living or make fixed incomes go further due to advantages in exchange rates. It is likely that the number of women taking one of these options and able to win and keep independence from men as a result was higher than it had ever been before. That includes independence from male-dominated publication and distribution systems.
Therefore, for both women and men, day to day life not only challenged the old certainties; the old certainties were for the most part not available to them, whether or not they wanted to abide by them. Under the circumstances, it seems unsurprising that a significant portion of them would try to respond by making wholly new art to answer conditions that had apparently broken with "classical and traditional forms." Many of their opponents actually agreed with their assessment of a break with what had formerly been assumed constants. It's just that their response was to insist that the answer was to reassert and enforce the old forms and ban and destroy any possible alternatives. As we know, living later in time, the way things worked out was a considerable amount of co-existence of old and new forms, and even the revitalization of moribund or almost forgotten artforms and styles using the newly available technologies and social networks.
With this in mind, it became much easier, for myself at least, to understand the materials produced within the "modernist" movement or trend of the late nineteenth through much of the early twentieth century. Whether the results of the trend are convincing, likeable, or whatever is a separate question and very much one of personal taste. Contrary to accusations from the usual suspects, "modernists" were not simply being gross or prurient in their efforts to represent messier and less pleasant biological realities, although it is of course true that some of them could be in specific works or in general. (There's always that one guy, right?) They were trying to break taboos of the time, some of them considered quite nonsensical today precisely because, faced with a good faith challenge, some taboos did fall and fall rightfully. It is too easy for us to miss that at that time the sorts of requirements now imposed mainly on movies to win a PG rating (i.e. no sex, no nudity, no spraying blood, but murder and beating is okay) were imposed on practically all art for all ages and backgrounds. (Top)
Wire Recording (2022-07-18)
Due to the highways and byways of my various research projects, I have done a great deal of reading and research on the topic of oral tradition and oral transmission of epic poetry and other narrative forms. Anyone who digs into this much runs very soon into the early studies by Milman Parry, who proposed the oral formulaic theory of composition for the Homeric epics and went on to carry out field work in the now former yugoslavia to test his hypotheses. A significant portion of his archives, including sound recordings originally made on aluminum discs, has been digitized and may be read in transcription and in some cases listened to at the website of the Milman Parry collection of oral literature maintained out of harvard university. The purpose of the site is to make at least part of the archive more accessible, since as is typical of archives of this type, in person access is by appointment only. Parry's work was carried on after his death by Albert Lord, whose archives are accessible from the same site, including digitized copies of recordings he made, not on aluminum disc, but on wires. This recording format stood out, because while I realized I had heard of it before, the truth was I had no idea what it actually entailed.
My bafflement was not helped when I went back to other references that included pictures or video, so older television programs and some of the world war ii code breaking history materials I am also familiar with. The machines in these examples all looked like standard, older style reel-to-reel tape recorders. They are a regular feature of photographs of the Beatles taken when they were working in the studio, and their medium is basically a long plastic strip with a magnetic coating on one side. A much smaller version of this whole system is inside another obsolete recording format, the cassette tape. On one hand, evidently wire recorders must have some basic physics in common with the more familiar tape recorders. That is, they depended on magnetism for recording and eventual playback, and to achieve both tasks the recording medium was wound off of one reel and onto another, and passed over a recording or play head. On the other, I could also see, with some proper items in the picture for scale, that the wire recorder was much smaller than the studio reel-to-reel examples, and of comparable size to the table top ones.
After a bit of search engine running, I managed to find a number of especially helpful documents about wire recorders. The first was of course the website for the museum of magnetic sound recording, which explains that the recording medium was a thin steel or stainless steel wire. The article author then states that this wire was "drawn rapidly" across the recording head. This should give us pause, because anyone who has worked with regular tape knows that the speed can't actually be that fast. Metal wire may not be as prone to stretching and breaking in such a system as later tapes, but it still could be, and keeping it wound evenly would be a challenge all its own. Gretchen King, formerly of the school of music at the university of washington, wrote a brief technical paper in 1998 for students and staff tasked with transferring wire recordings to new media, Magnetic Wire Recordings: A Manual Including Historical Background, Approaches to Transfer and Storage, and Solutions to Common Problems. She notes that the speed of the wire recorders was 2.5 feet per second, or in metric units, 76 centimetres per second. That is actually so slow that further on, when considering the question of breaks and snarls, King reports that a broken wire could be tied back together to continue playback with little real impact on the data.
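King's figures are easy to check for ourselves. Here is a minimal sketch in python, assuming only her quoted speed of 2.5 feet per second; the variable names are mine, not hers, and the hour-long recording is a hypothetical example to give a sense of scale:

```python
# Sanity check on the quoted wire recorder speed of 2.5 feet per second.
FEET_TO_CM = 30.48  # exact definition of the international foot

speed_ft_per_s = 2.5
speed_cm_per_s = speed_ft_per_s * FEET_TO_CM
print(f"{speed_cm_per_s:.1f} cm/s")  # 76.2 cm/s, matching King's rounded 76

# At that speed, a single hour of recording consumes a surprising length of wire.
seconds_per_hour = 3600
wire_m_per_hour = speed_cm_per_s * seconds_per_hour / 100
print(f"{wire_m_per_hour:.0f} m of wire per hour")  # about 2743 m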
The actual recording wire used was quite fine, fine enough that it could plausibly be explained away as sewing thread, at least in the television program Hogan's Heroes. Accordingly, the whole wire is the medium recorded on, not just the surface, as might be half expected by a tempting analogy to later tape. From King's summary, it looks like the dreaded phenomenon of print through could be a grave bugbear as well, because the ferromagnetic wire could remagnetize itself in parts while in storage. "Print through" caused many headaches for those of us old enough to have made our own mix tapes during high school or purchased cheaper cassettes. On cheaper cassettes or re-recorded mixes, sound from the opposite side of the tape could become audible on the currently playing side, or bits of earlier recording left behind from erasing it might persist on re-recording. There is a famous example of this from the Beatles catalogue, where the final mix of "Hey Jude" unmistakably has faint echoes of other parts of the track. In that case it actually sounds rather neat because it is after all the same song. More often for amateur tape mixers and people working with archival tapes, quite different sound would break through in a disruptive and unpleasant way.
As we can well imagine, there is another challenge worse than print through with recordings on wire, and that is the wire coming loose and getting tangled. In that case the necessity of not pulling the subsequent mass tight anywhere, if at all possible, would be even more critical than with yarn or cotton thread. Worse yet, the risk of getting fine cuts when struggling with the wire would be nontrivial as well. All told, it is not wholly surprising that this particular type of recording did not survive the development of adequate tape formats, and of course those have since been overwhelmed by digital formats instead. Readers with some familiarity with one or more *NIX systems will have some experience of tarballs, the last survivals of the once standard data archive format, which was on tape. One of my tasks on a long ago job was loading and reading in data from this type of tape. According to at least a few people online, some companies are still using tape archives, although whether this is for newer data as opposed to maintaining older backups that can't be economically transferred yet is unclear. I suspect we can safely assume there are no wire recorders still in business use these days, though. (Top)
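Although the tape drives themselves are gone from most shops, the tar format named for them can still be exercised today without any tape at all, for instance from python's standard library. A minimal sketch; the file names here are hypothetical examples, not from any real archive:

```python
# Creating and reading back a small tar archive with the standard
# library tarfile module. No tape required, only ordinary files.
import tarfile
import tempfile
from pathlib import Path

workdir = Path(tempfile.mkdtemp())
sample = workdir / "notes.txt"
sample.write_text("wire recorders ran at 2.5 feet per second\n")

archive = workdir / "backup.tar"
# Write mode "w" produces an uncompressed tar, the closest cousin
# of what once went to tape; "w:gz" would gzip it as well.
with tarfile.open(archive, "w") as tar:
    tar.add(sample, arcname="notes.txt")

# Read the archive back and list its members.
with tarfile.open(archive, "r") as tar:
    names = tar.getnames()
    text = tar.extractfile("notes.txt").read().decode()

print(names)  # ['notes.txt']
print(text, end="")
```

The same member listing is what `tar -tf backup.tar` would show at a *NIX command line.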