Where some ideas are stranger than others...

FOUND SUBJECTS at the Moonspeaker

Security Updates Required (2016-08-11)

So, here's the gig, no updated valve for the gas unless you take this new unit. Steampunk art from the Dover Steampunk Sourcebook, 2010.

Some time ago I said flatly that security updates should never be bundled with other code providing the equivalent of – keeping to relatively polite terms – completely unnecessary fiddling. There is no reason whatsoever to insist on bundling supposed "feature updates" and cosmetic debacles with security updates. These are two different classes of changes, and they are not intended for the same purpose at all. To my mind, the insistence on bundling security updates with other things is not only abusive and disrespectful, it is ultimately counterproductive. If the only way to push out a bunch of other changes people may be reluctant to accept is to attach a security update they feel must be applied, the other changes are crap. But, fun as ranting is, I realize this may not make quite clear why you should seriously consider my claims here. So, let's try a little illustrative thought experiment.

Steampunk is still a thing, so let's imagine, thanking our stars it isn't true, that we are back in that woeful era when queen Victoria reigned, and we are rich. Therefore, we are bleeding edge technology types, because the cult of technology was just really taking off at this time. We have a gas-fired stove to go with the brand new gas lighting in our modern home. It's marvellous, the cook hasn't poisoned us over it yet. Then we get alarming news: our current stove has a gas valve associated with a number of recent house fires and explosions. We need to take steps to rectify this danger immediately. It is not possible for us to swap the valve out ourselves, even using our well-appointed home workshop where we dabble in inventing machines for dubious purposes. The suspect valve is a complex mechanism, using a number of newly identified but still poorly understood physical properties. This isn't a situation where you can just experiment and see what happens.

UPDATE 2016-08-23 - Just in case you thought I was exaggerating the potential harms, or the lengths some companies are now inclined to go to that make security patches problematic to install, it is well worth reading EFF's recent article on the topic, With Windows 10, Microsoft Blatantly Disregards User Choice and Privacy: A Deep Dive. More specifically, this passage, "And if this wasn't bad enough, Microsoft's questionable upgrade tactics of bundling Windows 10 into various levels of security updates have also managed to lower users' trust in the necessity of security updates. Sadly, this has led some people to forgo security updates entirely, meaning that there are users whose machines are at risk of being attacked." Some people at companies like microsoft are genuinely making crazy decisions that lead to this result.

The people who sold the stove to you insist they couldn't possibly update your present unit. Instead, they are sure you must throw that stove away, and buy the much bigger and fancier exemplar illustrated to the right. It includes many cutting edge features, including a built-in music box, an elaborate multi-faced clock for tracking the cooking times of different dishes, a knife sharpener, and a unit that can be connected to the house communication tube system. The salesman is sure there are no other options, and it doesn't matter if this new unit is safer: it won't fit in your house, and you don't need eight burners and the extra features. Under the circumstances, you can't leave the stove the way it is, but you didn't hold onto your inherited wealth by believing whatever salesmen tell you. Instead, you ruefully have your gas stove converted to burn coal.

Yes, computers and the programs they run are not stoves. But they parallel stoves nicely in terms of how we have integrated them into our lives. And to be fair, changes to improve stove safety or computer security do often come on their own, without extra baggage that negatively impacts their utility. The problem is when they don't, because that is when people are forced to decide whether to ignore the patch, switch programs, or try something requiring a bit more research and tech savvy. I think it is fair to say the first option is by far the most common one right now, even though there is wide agreement that security patches are critically important. But this means that anything that unnecessarily pushes people away from applying security patches is a bad idea.

There is an unfortunate level of paternalism rampant in many of the older players in the tech industry, whose feelings when "users" don't do what they want seem to range from baffled frustration to furious contempt. How dare we refuse their prettied-up interfaces and additional features we should be panting to use. The trouble is, even in software, one size does not fit all, and refusing to risk breaking software that suits our needs is not perverse, a refusal to "enter the future," or childish. In fact, I suspect the next generation of inventive and successful software companies will focus on providing security updates consistently and separately from other updates, the other updates being optional and modular so that it is easier for individual people to keep the program setup they prefer and adjust it incrementally, dialling back things that don't work without compromising security. This is a tougher technical challenge than locking people using computers into the virtual equivalent of a padded nursery, but this should be no deterrent. After all, not so long ago, computers were still giant piles of vacuum tubes that had to live in their own special rooms with only duly trained technicians touching them with white gloves. (Top)

What is "The West", Anyway? (2016-08-02)

Just a quick directional graphic, a sort of inverted arrow labelled west. C. Osborne, august 2016.

Perhaps this question seems to have an obvious answer, although I'm not so sure. David Graeber, in his essay There Never Was a West, seems to agree with my sense that when we hear references to "the West" and being "western," the reference isn't actually to a place or a direction at all. After all, for those of us with more than the average number of physics classes under our belts, as a location or direction we know "west" can be entirely relative. In fact, maybe it simply is entirely relative; after all, it's not as if we can impose an "absolute west" on all of outer space. At best, maybe we can insist there is a Solar System-specific west, at least for the planets. But admittedly, this isn't actually what I was trying to get a better handle on. Rather, I have been trying to pin down the elusive and gelatin-like concept of "the West" which is the basis for the immediate declaration that my culture is not "western" and that is persistently enshrined in medievalism-valourizing works like the Lord of the Rings – although, to give Tolkien credit, he did make "the West" a bit more literal in his work, something his many imitators have not necessarily done.

UPDATE 2017-06-16 - David Theo Goldberg in his book Racist Culture: Philosophy and the Politics of Meaning provides a more helpful definition of "the West." According to Goldberg, "'The West' is similarly a sliding sign. Initially designating countries west of the iron curtain, its scope came to include those countries and their inhabitants that are capitalist in their mode of production, politically free with democratic institutions, culturally modernized[sic], and largely white. Thus, the designation usually includes Australia, which is almost as far east as one can go without being west, but excludes Japan and does so implicitly on racialized grounds... Indeed, 'the West' has included South Africa insofar as that country has been considered white or non-African" (1993: 165).

UPDATE 2020-08-21 - Sean Manning has an excellent new blogpost up at Book and Sword pertaining to this topic, If You Find "The West" A Confusing Term There Are Good Reasons. He unpacks some of the uses and abuses of this term in a fair-handed way, including a firm warning that it is primarily used to slip political programs past people they might otherwise have serious reservations about.

Okay, this is already a contradictory idea, one my metaphor has acknowledged: who expects to pin down gelatin, even metaphorically? On one hand it can't be pinned, and yet I can't help but want to corral the goop a bit by process of elimination. For instance, on one hand, I am a person Indigenous to north america, which is further west than any europeans had ever gone not so long ago. Yet the europeans who first got here, and their descendants today, are quite sure I'm not western in culture. Leaving aside whether or not I agree with the characterization, this confirms we're not talking about a direction, and that it is supposed to somehow be tied to culture.

This is where Graeber would point out that "the West" is not a coherent concept, and that the "culture" it describes doesn't have logically consistent content. Yet "the West" does seem to ultimately be more about ideas, specifically ideas considered "the best" by a particular white, male, moneyed elite. Those folks always seem sure about "the West," even if they can't corral or pin its gelatinous mass any better than I can. Which makes it all seem like arbitrary crazy talk, and that is obviously not what invocations of "the West" are, or they could hardly inspire the emotional commitment they do far beyond the elite. So let's tip this upside down, and ask when "the West" is most invoked, and what the context of those invocations is.

Alas, this is where things get much more alarming. Based on my readings and viewings of both fictional and historical works (we can argue about whether those terms describe two discrete kinds or two extremes of a range another time), "the West" leaps firmly to the fore in the context of the run up to and actualization of war. There is constantly a "west" and a "rest," and these are in perpetual violent conflict. Whether literally with weapons or via "culture" or economics, always there is a clash. Somehow "the West" is always under threat, and it is always passive, it never does anything, it's just hated for existing. Various rationalizations are provided for this hatred in both fiction and non-fiction, but in the end the sense of threat created around the idea of "the West" soon becomes a rationalization for "the West" to lash out, especially "pre-emptively." Hmm. It seems the "gelatin" metaphor is not so well chosen, since it elides this alarming feature. Perhaps "slime" would be better, since that is something we all find disgusting and even frightening depending where it shows up and what colour it is, sticky, hard to manage, and occasionally smelly too. If the "us versus them" leading to some form of violence could somehow be detached from the amorphous concept of "the West" and its uses, it might become benignly useful. Then again, without a role in creating and rationalizing a bipolar conceptual world, "the West" would signify nothing at all, except the direction the Sun sets in. And that, of course, is a direction we can all find without fighting over it. (Top)

A Meditation on Software and Change (2016-08-02)

An older version of the custom macs logo from www.tonymacx86.com, a forum for sharing tips on how to run macosx on older apple and non-apple hardware which are otherwise unsupported. This is pretty much where the people who don't want to run freeBSD or openBSD hang out.

I have been on an unsystematic quest for some time now for a better word than "users" for people who use computers. After all, we don't refer to ourselves as "users" when applying our often idiosyncratic collections of tools for odd household jobs. No doubt seriously trying to call mechanics, carpenters, or seamstresses "users" instead of their job titles would go over extremely badly. "User" is a word that has picked up awful connotations like persistent barnacles, connotations that tend to involve bad puns on "loser" and encourage disrespect. "Operator" is sort of better, but clunky. Nobody believes a "factor" is a person anymore, even though it's an agent noun derived in the same way as "actor." But in the end, I think the best alternative may well be "maker" because it manages to forefront agency while more or less reminding us that computers are tools and they are supposed to be helpful in getting a range of tasks done. They're multi-purpose tools, so they can support a wild range of tasks and applications, and boosters of various types feel sure they won't be laughed at when trying to convince everyone computers should be as ubiquitous as sand. Thing is, as time goes on, many software companies, and a significant proportion of communities built up around free software, seem to dislike makers mightily.

On one hand, it is possible to come up with reasons for this that sound plausible. Any of us, even if the computer is little more than a glorified typewriter for our purposes, manage to do things, by accident or even on purpose, that were never imagined in the use cases and test runs. Then we get fed up with the bugs and weird behaviour our unimagined acts turned up. Some people are certainly up to no good, and they can wreak havoc that nobody who makes or uses software really likes being on the other end of. Creating a product that serves many people well is hard, not at all like determining the best design for a type of hammer. So much unpredictability, so much unasked for curiosity, so many people taking advertising claims about the wonderful capabilities of the software to heart. All that, and it was hard enough to get it out the door and distributed without finding it in a torrent five minutes later and seeded with something unwanted five minutes after that. Frustrating stuff. This is all before we get to what amount to own goals by the various purveyors of software, free or proprietary, own goals that all make it harder for makers to make.

One own goal is the one many software companies claim is anything but. It used to be called creeping featuritis, although it seems a bit more common now to call it feature bloat. This of course is the terrible phenomenon whereby a software update, nowadays with obligatory security patches (security patches are good, tag-along unasked-for features bad), adds new features that offer to do things like automatically colour your tables, or preinstalls hundreds of document styles you never asked for and will never use (and good luck getting rid of them). These new features often don't work properly, so heaven help you if you give in to the temptation to use the "autoupdate selected styles" option in the word processing program. So now the program is slower, or you have to root through far more options than you ever dreamed of, and nobody who selected and implemented the features seems to understand what the people who use the program actually do. I'm a writer, so word processing programs have long been a major part of my life, and one of the worst decisions ever made in the one I have to use for work was to include a drawing package within the program. The few times I have tried to use it, the result, regardless of the directions followed, was a graphic that often doubled the size of the document and slowed scrolling mercilessly as the screen rendering struggled to cope with it. The graphic in question was usually a set of small arrows or something like that.

Another own goal, arguably far worse, is obliterating the interface, or making it "responsive." This seems to correspond to a decision, on software companies' part especially, to overhaul the interface so the program will seem "new and improved." Perhaps underneath it all the program really is improved. However, for the people saddled with the upgrade, returning to an updated workhorse program and discovering the menu items have all been changed or relabelled, or that they have been set to change automatically based on what they use, the effect is "new and broken." All the muscle memory and procedures developed over use of the program, the very things that made them willing to stick with it because they knew it well, can be instantly rendered useless. I have observed people on support forums snarking that others should stop whining and learn the new version already. Except, how does this make sense? Such programs are tools, not games. Let's take a step back and consider the more ordinary tools in our lives. Think of a fork. Does it really make sense to render a fork nearly unrecognizable or difficult to use because of its shape, toss out the old forks people could use comfortably and easily, and then tell them to get over it and get used to using a spoon to eat salad? Tools need to be consistent, especially if there is a desire on the part of their manufacturers to discourage people from modifying them.

Apple used to be better (or worse, depending on your point of view) for taking note of how people modified or worked around aspects of the OS and apple software, and then incorporating the most widely applied examples. When people make such modifications, they are showing where the software doesn't quite work, and they often come up with excellent solutions even if not technically inclined. Now this is being treated with more and more contempt. I had to make a rare support call recently, in which for a short time the tech support person and I shared a screen so she could help me out with an issue. Her first steps were to try to get me to change all of my basic default settings in my web browser, which had nothing to do with the issue – and I do mean basic. When she saw my web browser's home page was something of my own, and that I had selected a different search engine, both options allowed and implemented by the program itself, no hacks required, she was utterly scandalized. I dared to have my own homepage, that had to be what was wrong! At that point in the conversation I had to remind her what the problem actually was, and if she had answered my original question we would have been done in ten minutes instead of an hour. The majority of the time amounted to her telling me how bad I was for having anything that didn't come in the default OS install, including its settings.

This is a telltale response actually, and consistent with the apparent growing impatience with people who don't do only the narrow range of things a given software or computer vendor thinks they should do. This isn't just about respect either, it reflects a growing attitude among many of the people working in those entities that makers don't own the computers or the software they bought, they have merely been given the privilege of using it. Matters are not quite so bad on the free software side of things, although in a strange way the issue is inverted: you can do whatever you want, but if it isn't included in the limited use cases, expect it to be difficult and don't dare expect help or a civil response to a feature request. Perhaps we would all handle this better if we began thinking of the computers as actually owned by the people who buy them, and the software as a co-creation between the people who use it and the people who write it. Then again, that would buck the current trend of forcing the consumer product model onto software more completely, so that it doesn't last long and has to be replaced regularly. (Top)

The Privacy Economy (2016-07-31)

The selling point for this phone was increased privacy, all the way back in 1912. Image courtesy of wikimedia commons, 2016.

Over the past couple of weeks, I have been struck by a small flurry of articles by technopundits related to devices like the Blackphone, advertised as devices intended to allow secure and private communications. Many of these devices aren't selling very well, and no doubt their high prices and low availability are not helping them, nor is their basis on the android phone OS, a google project. No matter how free the various flavours of android may be, considering how implicated google has been and still is in vacuuming up personal data for itself let alone state actors, it is possible that potential buyers who are not deep into programming don't find the security claims quite believable. Yet this is not what technopundits give much consideration to, perhaps because that would require researching what the more general public may define as their wants and needs vis à vis cellphones in particular and security and privacy in general. In fact, it seems to me that the actual people dealing with these issues in their daily lives are notably absent, especially when the technopundits throw up their hands to the sky and rhetorically wail to the heavens, "Oh, why won't people pay for privacy?!" As if this indicates people generally don't value privacy in the first place.

UPDATE 2020-12-25 - Circumstances have changed a bit since this thoughtpiece was originally posted, as we should expect. For one thing, there is a wide-ranging move to face up to the dangerous state of monopoly in the software, advertising, and computer hardware industries. For a well-researched overview, see Antitrust Spring, syndicated on naked capitalism 21 december 2020, on the pending anti-trust cases against facebook, google, apple, and amazon. For another, there is a sensibly priced non-android cellular phone that is in the first generation iphone phase of development, the pinephone by Pine64. The librem project, by purism, the same people who have successfully produced pureos and the associated line of computers, is further ahead, but much more expensive in terms of hardware, with very mixed reviews primarily related to the state of the software. Evidently porting computer software to a phone-sized device is a tricky endeavour, and that is how google persuaded so many to take the apparently easy path represented by porting android instead of making their own software.

UPDATE 2021-09-24 - For an excellent overview of the mass scam system that is online advertising, Charlie Warzel at galaxy brain has a new post up, The Internet's Original Sin. It includes good cross-reference links for additional detail. Do spend a few moments exploring the martech 5000 diagram in particular, and compare it to Maciej Ceglowski's diagram from his talk The Web Obesity Crisis.

UPDATE 2022-05-15 - Further to this, anyone spending time online and using web forms at all will be very interested in yet another means various companies are using to try to steal our data and track us without our consent, including grabbing passwords. The issue is covered via a research team's work published at Leaky Forms. They provide their results and look to be updating them regularly as some companies stop the practices that are too embarrassing to leave in place, or that they did not intend and do not wish to have there.

To be honest, I feel that this bit of rhetoric (don't let it get on your shoes) is that extraordinarily rare bird: a stupid question.

At the moment, we are living through a time of peak market fundamentalism, in which there are people who call themselves entrepreneurs and are busy rooting around for a way to make us pay for things that are fundamental human rights. They are very clever, because they adore the logical fallacy known as "the slippery slope argument" and they regularly use it to claim they should be allowed to do more and more of whatever makes them money because they have been permitted to do just a little, and just a little was okay. (I highly recommend An Illustrated Book of Bad Arguments with text by Ali Almossawi and pictures by Alejandro Giraldo on such logical honkers.) In fact, often "just a little" was for the other folks, cautiously trying something they hadn't before, perhaps against their better judgement, "We'll give this a small trial. If it doesn't work out, we just won't renew it." Just because somebody takes advantage of a good faith trial with bad faith follow up actions does not make slippery slope arguments true. And implicit slippery slope arguments like the dishonest "why won't people pay for privacy" question are no better. For a firm rebuttal of another bad argument by people opposed to respect for privacy, "if you haven't done anything wrong, you have nothing to hide" see Bruce Schneier, and Daniel J. Solove's paper in the San Diego Law Review.

From what I can see, people are refusing to pay for privacy for the sensible reason that they refuse on a gut level to pay for their human rights to be respected. They get that to give in to the claim that privacy should be an extra feature you pay for is to effectively redefine privacy as something only for the rich, and that such redefinition ought to be resisted. Unfortunately, the people who could seriously consider paying for privacy are often also the same people who can't seem to fully parse out the "human right" part of privacy, because they are awfully quick to demand that the poor should have no privacy. Which means the poor have a sharp analysis of the meaning of privacy and of the suggestion that it should be controlled by money. Oh, and by the way, this issue is not in fact about technology, nor is it new. Far from it.

Minimizing privacy is a great way to impose centralized social control, and that was as true when the people hoping for the social control were little more than a bunch of goons with clubs. Informers and secret police can be found in the written record of many of the earliest known cities. We can probably work out when this is going on even without written records, if we can learn enough about the types of homes people lived in and how those homes were arranged. Archaeologists don't have to start from scratch on this, they merely need to spend some time with the urban planners and look into the various projects built as social experiments. The social experiments were especially meant to put an end to the creativity and variability of cities, including their intense social mixing, making the cities into perfectly tidy, carefully arranged compartments of separated people and uses. Not only would this improve social control by central authorities, the traffic would be improved by designs geared towards the free movement of cars (note that only the movement of vehicles is recognized as "traffic"). An excellent overview of this can be found in Jane Jacobs' books, which are widely available in libraries and sensibly priced paperback editions, and written in plain language.

But people generally don't need books to remind them about the relationship between loss of privacy and imposition of control by others. We've all been children, and experienced the age when we began to wish to explore things on our own, and struggle loose from our parents' understandable but ultimately unhelpful overprotectiveness while everyone readjusted to our new capabilities and questions. Many invasions of privacy are described in terms of "being for our own good," "for our security and safety." But in fact, having privacy is a critical element of our safety and security as a society and as individuals. The circumstances where it is appropriate to limit privacy are strictly limited either in time (i.e. our childhood) or in space (i.e. the airport). Beyond those limits, refusal to respect privacy becomes the bedrock of oppressive behaviour. The attempt to put a price on privacy, in all its forms, whether in a device we put in our pocket or the administrative horrors experienced by anyone who needs social assistance, is always an attempt to impose inappropriate control, and an expression of mistrust and disrespect of other human beings. (Top)

On Experts (2016-07-05)

A rather obnoxious and ostensible expert. Steampunk art from the Dover Steampunk Sourcebook, 2010.

It is probably fair to say that nothing brings out controversy quite like elections and referenda, whatever purpose they may be intended to serve. They certainly bring into vivid relief the attitudes and beliefs current at the time among "the chattering classes," which today include not only the grossly over-concentrated print and audio-visual media, but also a stratum of bloggers on-line, most of whom would probably insist there are at least two strata and they are part of the upper one. Yet all of these people, across an extraordinary spectrum of people claiming to be from all manner of political positions, agree on one thing. The general voting population is dumb, and ought to do whatever experts like them say, because the experts know better than them. A decision taken by a majority of a voting population not to do as the finger-wagging or threatening "experts" say – or even the slightest suggestion they might consider not doing as they are told – is denounced absolutely. Such people are not merely ignorant and stupid, they are obviously vile racists who are just waiting eagerly to drum up a fascist march and lynch some immigrants. The contempt for democracy and "the masses" expressed in this vicious, fearmongering tripe has reached a kind of crescendo in north american media I have never seen before in my lifetime.

Please note that this combination of contempt and fearmongering has been applied by all parties in many recent elections and by both sides in the most recent (in)famous referendum held in britain. It is not specific to a position at all; in fact, the british referendum is an especially good illustration. Campaigners on both sides spent their time trying to drum up irrational fear with threats and dog whistle politics. The people urging britons to vote to leave the european economic union altogether rather than just maintaining a separate currency did so in a manner so vile it leads a person to wonder if their mandate was actually handed to them by the campaigners on the other side. An impression bound to be created when both sides share such a negative view of the electorate and democracy itself.

All of this wouldn't be necessary and the general public would be viewed positively, the punditry and many neo-liberally inclined government officials assure us, if only the rank and file would obey the experts and vote as directed. It seems to me that somewhere along the line, the definitions of "expert" being used by the general public and by this smaller "elite" have come to be at significant variance.

My trusty OED's definition of an expert is "a person who has comprehensive and authoritative knowledge of or skill in a particular area," but that counts for little in the current corrosive environment that is destroying public debate. There is a widely held view that experts are expected to provide advice. "Advice" as in a recommendation which a person or persons may use or not. An expert is not unlikely to have knowledge or skill too narrow for the question at hand, so it may happen surprisingly often that their advice must be considered but not directly applied. Yet even in such conditions an expert's advice and the knowledge they share can be helpful, so long as it is grounded in real experience. They do not give orders.

Unfortunately, a rather different definition of "expert" seems to hold among many of the people insisting that those of us who may have an opportunity to vote in an election or referendum ought to do as we are told by them. Their definition is apparently that an expert is someone who holds a faith-based position that cannot be moved by any evidence whatsoever, combined with strong rhetorical skills and privileged access to a large audience via some type of media outlet. They are to be obeyed because they are certain, because they can shout louder than anyone else, and because they are better verbal manipulators than anyone else. They provide no advice, they give orders. In other words, the sort of people better labelled "conmen," "hucksters," and "cult leaders." The inanimate parallel to such people is the advertisement, and both have the same problem: repetition and/or their inherent contempt for their audience always gives them away, breaking their persuasiveness and revealing their lack of credibility.

There may well be real experts out there with access to a big audience, evidence-based recommendations to share, and respect for their audience among the mainstream punditry and politicians. Yet it seems they have almost all been bowled over by the current mania for "branding" and the trendy non-language it is possible to make up buzz-word bingo cards for. This non-language is officially known as "key messages" and generally delivers as little real information as possible. Any real expert who takes up this stuff looks and sounds no different from the con artists, and has committed a terrible strategic error, at least if the intent was to maintain credibility.

Even if none of this was the case though, and there was not a huckster to be found anywhere, and public debate was civil, nuanced, and informative, one thing would still remain true. The general public, when faced with the need to make a decision, whether it be by voting or in any other way, has every right to make it. The decision may not be what the expert recommends. That's informed democracy for you. If the expectation of those claiming to be "leaders" and "experts" is that the "masses" will do as they're told, not only are they neither leaders nor experts, they don't believe they live in democracies but in some form of totalitarian state with themselves in charge. We should all find such megalomania disturbing. (Top)

Refusing the Land (2016-06-26)

Clipping from a Treaty 8 1900 era map, original source the queen's printer in canada, 2016.

Between all the bandying about of words like "healing" and "reconciliation" in the colonial state of canada for the past several months, I've been thinking a lot about treaties. Thinking a lot about the land too, and the strange ways of interacting with the land non-Indigenous people have invented. Or reinvented and reintroduced to the americas, as has certainly happened. No group of people is a uniform blob, and this is as true of non-Indigenous people as Indigenous people. We all have created and live within many nations. When we clash, it keeps going back to the treaties. Those stubborn treaties, which one minute colonial governments demand we forget because it's all old news and hardly matters now, and the next demand we remember and follow to the letter, usually when it will benefit them. This certainly isn't helping build the credibility of or respect for colonial representative governments.

So there are a lot of treaties in north america in particular, between various colonial governments and Indigenous nations. Over time, to the shame of those colonial governments, they have been ignoring those treaties by as many tricks and wiggles as they can think of. Retroactive legislation, changes to how treaties are ratified after the fact, writing down something completely different, document destruction, you name it. Or just plain bad faith in the first place. These governments, and the colonial societies they dubiously represent, are ashamed about this. I'm quite sure of it, because the terrible effort to forget the past, but somehow only the awkward bits, which never works, is easy to see in the newspapers, textbooks, and mass market publications of all kinds. I do mean all kinds, including movies and television shows. That, and the effort to destroy Indigenous nations is still going on. Indigenous children are still being stolen from their families by colonial states. Colonial states still maintain and impose policies that cripple and destroy Indigenous ways of life.

This is craziness, because if you talk to non-Indigenous folks about what's happening, including the enforcers of these horrible policies, they will tell you they feel awful. And as far as they can, they are telling the truth. But they sure can't tell you why they're still doing it. The terrible joke about the doctor and the guy who says, "Doctor, doctor, it hurts whenever I do this!" is hard to avoid in this situation.

The trouble is, so many non-Indigenous folks are uncomfortable with the land here in the americas. They seem determined to refuse any relationship with the land. Indigenous people are like the land and part of it, so this also means refusing relationships with them, including all those necessary treaties. We humans express relationship in many ways besides treaties. We share food, intermarry, fight together, make art of all kinds together. We acknowledge each other's existence and learn each other's proper names. We treat one another's children as the precious and irreplaceable spirits we know in our hearts they are. Actually, no, that's wrong. All of these practices, all of these expressions of relationship are treaties. Migwetch Ryan McMahon, for your podcast explaining this. When for any reason we fall away from those practices, the relationships, the treaties become very sick, and so do all of us. We are the stuff of those relationships, those treaties.

The very first non-Indigenous people who came to the americas never intended to stay. The point was to get very rich, buy a title and a huge house back in the old country, and enjoy their newfound status until they died. No need for a relationship, in fact, a relationship would be a bad thing, because it would interfere with the ruthless exploitation necessary for this plan to work. The fundamental nature of a get rich quick scheme is that only the ruthless succeed. But lots of folks were far from ruthless, especially the ones coming later on. They just wanted to get their impoverished families out of poverty and go the hell home; of course they didn't want to get attached. Who can imagine how it must have felt for the first ones who realized first, they couldn't go home, and second, they didn't want to be at home where they were. If they had wanted to be, they would have learned the names of the places they had come to live, and used them, and added new names of their own creation. Instead, they reused european names that didn't and couldn't fit. How jarring it must be, to say the name "london" or "berlin" or "paris," invoking utterly different places and histories, and actually being someplace else. Or they did make up new names, invoking some other place in their imagination, still ignoring where they were, feeling disrespected every day the land was different when they opened their eyes.

All of this is no excuse at all to continue doing the same horrible things, however politely and out of sight they may be done now that they are embedded in bureaucratic systems that grab people in ones and twos. But it does reiterate that there is a choice, and a way out. There is also no requirement to delete the history of newcomers to the americas at all. Notice I mentioned adding new names, not deleting old ones. There's plenty of room for creativity and no need to try to make a blank space, which is again, impossible anyway. Accepting and building a relationship to the land isn't something that is going to magic away pain and distress overnight. Non-Indigenous folks, if you are serious when you arrogantly proclaim how "we are all here to stay," then I'd like to see some action in place of that talk. Let's see some real work on those treaty relationships, and an acknowledgement that Indigenous people don't need your permission or your labels to exist. Let's see you stop rejecting a relationship with the land. (Top)

Artificial Archaeological Artifacts (2016-06-07)

Mycenaean stirrup jar held at the Louvre, number A019201. Photograph by Jastrow, released to the public domain and uploaded to wikimedia commons in 2006.

Archaeology and anthropology strike me as an example of conjoined twin, humanity-science border blurring ways of creating knowledge. "Conjoined twin" because depending which textbooks you read, or, if you have taken a formal degree in either one, which school you were trained in, they may be considered separate (european) or archaeology may be subsumed under anthropology (Boasian). The second idea is actually quite strange, because archaeology is far older than anthropology. For example, in china the second period of archaeological activity began in the seventeenth century. Yes, that's the second period. If we consider merely the systematic collection of objects for display in some sort of museum-like environment, then some scholars argue that the first such place in the written record is the museum of Ennigaldi-Nanna in Ur (now iraq), dated to approximately 530 BCE. This example is more controversial, but it does reiterate the point that human curiosity about vanished ancestors and attempts to understand and recreate how they lived go a long way back. It took much more time and newer circumstances for humans to invent anthropology, which tends to be oriented towards living peoples, not vanished ones. By this I don't mean that no one ever tried to study other living peoples who were foreign to them. Merchants and other travellers have always had an interest in at least picking up the practicalities, and some of them wrote extensively.

What truly drove the creation of anthropology though, was a shift in the scope and nature of colonialism. Colonialism became a huge industry with a more or less explicit mandate to exploit and destroy "lesser peoples" and make room for the expansion of "superior" europeans. A sort of proto-anthropology grew up after the year everyone in the americas has drummed into them from early childhood, 1492. The purpose of this proto-anthropology was to enable europeans to win footholds in the americas, and then spread back into africa along with the increase of the transatlantic slave trade. It had no name as such, and for a long time such study of "the savages" was left almost exclusively to traders and missionaries. This all changed when it began to seem to europeans that Natives everywhere were finally dying out, taking their unique languages and cultures with them. Scholars looked on with alarm, because here was irreplaceable knowledge escaping before their very eyes for lack of attention. Today we would be prone to reading this as "some knowledge or practice we could market is getting away!" but in the nineteenth century, when anthropology proper developed, including the use of that term, it was more about knowledge, and the sense of control that having knowledge gave, that was at issue.

Now famous figures like Franz Boas leap to the fore, busily pestering Elders for stories to fill up monographs and robbing graves and villages for artifacts to fill up museums. At the time, none of these scholars seemed terribly troubled about taking advantage of the desperation of peoples who had been decimated by infectious disease and actively prevented by racism and enclosure from providing for themselves except by selling whatever they could for cash they could use to buy food. Somehow for those scholars, a non-christian burial didn't count as a grave that should not be desecrated, and they happily filled crates with spiritually potent masks and regalia for shipping. Each object, each story, and far, far too many ancestors, were stripped from their home in a community of people. At first minimal provenience information was noted, basically because of the "salvage paradigm" which dictated grabbing as much as possible as quickly as possible. Perhaps the people involved believed that they could reconstruct the rest afterwards when they had more time. And why shouldn't they, when they had a successful model of how to do it from archaeology?

From this we get the weird phenomenon of artificial archaeological artifacts, objects and stories in various forms (both oral and written), disengaged from their social context. They might be whole in the sense of regalia being in one piece and potentially wearable, or an ancestor's entire skeleton might be crammed in a drawer or box, all associated with an ethnic name and location drawn from the words of a Native person. This is still more information than the majority of archaeological materials ever have, yet it is still a pale shadow. But this pale shadow is incredibly convenient when the people creating it are busy in a massive project of cultural appropriation. By cultural appropriation, I don't mean learning and adapting a practice or idea from another culture. I mean taking up artifacts, songs, stories, whatever culture element, separating it in some way from its source, and then claiming to be the true knower of those things and to have the only authentic versions. Furthermore, in cultural appropriation, the appropriator claims to be the only one who knows what is authentic and what authenticity consists of. "Authentic" is a loaded and slippery word, and is a close cousin to "authority" which is just what cultural appropriators claim over somebody else's culture. The connection between knowledge and authority is also not coincidental. "Knowledge is power" is a truism, however much of a cliché it is. Yet it is also a curious fact that in the application of this idea in early anthropology, a key aspect of the application was the refusal to know.

To make an artificial archaeological object, the anthropologist had to refuse to know things like that they were pilfering graves, or taking advantage of the starvation of a family to take away much loved regalia and art at rock bottom prices. They had to refuse to know their role in colonialism and that their behaviour, whatever their personal intentions, constituted a part of a massive war on Indigenous cultures. They had to refuse to know that Indigenous peoples were not backward or the equivalent of children in adult bodies. They had to refuse far more than this, and the pattern set by archaeology helped guide the refusal, probably unconsciously. It's little wonder that today anthropology is in a state of flux if not crisis, as anthropologists struggle to wrench it from its foundations in colonialism and destruction, and cope with the fact they can no longer refuse to know. (Top)

"I Know, Fridge the Planet!" (2016-05-30)

A last picture of neptune sent by the voyager 2 space probe and released by nasa in 1989, via wikimedia commons.

There is nothing wrong per se with remakes, which are now referred to as "reboots" in an attempt to file off the serial numbers, nor with remakes that change the visual style from whatever the nearest "original" was. After all, no one would reasonably expect a 2000s era Sherlock Holmes to be made with the same types of cuts and transitions as one made in the 1950s. Something serious has gone wrong however, when it seems that most of the "new" programs, shows, and comic books getting the most attention are remakes, or not very alternate universe retakes. Worse yet, there is at least one franchise where there is a strong argument that the "reboot" eviscerates the founding principles of its source material. It seems (to put it diplomatically) the point is not to try to tell a different or better story, but to "play it safe," and make money from a "sure thing." This trend is not so safe as its growing number of followers might expect, though, because when you basically repeat the same story, people recognize it and are prone to getting bored. To shock them out of boredom, writers begin grasping for gimmicks, but even the gimmicks are worn thin. The vile fridging trope is an excellent example.

"Fridging" is a term I learned relatively recently myself, even though it refers to a trope I know all too well, as indeed does anyone who watches or reads any pop culture materials at all. In it, the (always male) hero, is driven to new heroic heights by the brutal murder of his wife/girlfriend/daughter/family/beloved butler-father surrogate... you can easily fill in more, the pattern is clear. According to wikipedia – and for pop culture references of this type it is particularly good, and includes a working link to the original source of analysis of it, Women in Refrigerators founded by Gail Simone – the key problem with this trope is how it renders the existence of the murdered character, who is to this day most often a woman, into nothing more than a means to move along the male hero's story. This isn't just sexist and horrible (don't forget that Alfred is effectively feminized because he waits on Bruce Wayne hand and foot), it's bad writing. Reeaaally bad writing, and so overused even the usual defenders of such crap are bored with it. There's only one way to make it shocking again, find an unexpected victim. Preferably one with feminine resonances still attached somehow. And big, really big. The solution is obvious when the problem is set out this way. There's only one.

Fridge the planet.

That's right, fridge the planet. Alas, Doctor Who did it first in this age of remakes, then Star Trek when Vulcan got the chop. In both cases, the idea is to put the affected characters "out on their own" and "make them more emotional" to push their stories along, of course. Except, this makes no sense. Take the Doctor, a rather insane alien time traveller who likes to stop bad things happening because at root he's a good person. Well, that's the centre the character bobbles unevenly around depending on who is in charge of script editing. He also suffers considerable angst because he is fundamentally an exile from his own people, though so far no convincing explanation of why has been given. There is quite a lot to work with there, and would probably be even more if some basics about the Doctor's species were worked out properly so that writers could follow out the implications. Broad agreement holds on the effective immortality of at least a part of the Doctor's species including himself, and that has important biological implications with subsequent related psycho-emotional knock on effects. One of the few authors to follow this thread was John Peel in the original Gallifrey Chronicles, a 1991 tie-in reference book. Spock is by definition alienated (I couldn't resist the pun) from both humans and Vulcans in the Star Trek universe, where a persistent discomfort with "mixed-race" people in mainstream culture is mapped onto people who are literally "mixed-species." As many people commented on Leonard Nimoy's passing last year, Spock is one of the most fully developed characters in Star Trek, the writers and fans have found an enormous number of stories for him, and no doubt there are more. Nimoy's compelling portrayal of Spock reiterated how rich in potential the character was and is. Which leads to why writers have resorted to fridging the planet: doubt.

Somebody important in the creative chain no longer believes these male heroes' stories can move without some kind of fridging event. Apparently wrestling with the ethical implications of running around playing god isn't enough, nor the difficulties of being a perpetual outsider by choice and/or by necessity, especially the question of whether non-conformity is worth the pain, in the case of the Doctor. For Spock, it seems to me the doubt is about the capacity for any actor to follow in Leonard Nimoy's footsteps and play the character as well. Rather unfair to Zachary Quinto, if this is a correct supposition. More likely however, is the doubt that there is any other means in a Star Trek remake to push it from the original lines established by Gene Roddenberry into a more "action-oriented" mode with more explosions and deaths to shock the audience with. Which expresses if not doubt, then a deeply cynical view of the audience, who are apparently not expected to be able to follow a more complex or potentially unfamiliar story, or any other story without periodic, video-game like stimuli.

Pushing the fridging trope to this extreme doesn't solve anything because it is not a solution to the actual problem. The problem is incessantly repeating the same stories again and again, with the same hero, to the same end, with what amounts to a bit of garish decoration shrieking, "new and improved!" when improvement is impossible on something worn to less than threadbare. If what is ostensibly intended to be a creative work has a greater resemblance to a repeated ad than an entertaining story, something has gone gravely wrong. (Top)

Is Javascript the Issue? (2016-05-26)

Simplified spider web drawing quoted from clipartmag.com, 2016.
Example picture of orb spider communal webs in a large building. The linked source paper for this picture taken by Pedro Cardoso in 2010, 'An Immense Concentration of Orb-Weaving Spiders With Communal Webbing in a Man-Made Structural Habitat' is awesome and includes many more photographs at much better resolution.

Yes, you're right. It is no less than bratty to title a post with a question, because the proper answer to such questions is practically always "no." There are exceptions, as there are to every rough hewn rule worth its measure in life, although this is not one of them. What led me to this rather facetious question was a bit of random internet wandering courtesy of my current preferred search engine that has not yet devolved into a pure advertising company. Having just completed a major clean up of a key set of research notes, I wasn't up for anything too serious, so I decided to try looking up what the blogosphere had to say about the way javascript is used on websites these days. This seemed promising especially in light of the havoc wreaked on a widely used javascript library not too long ago by a developer's abrupt withdrawal. There was plenty of discussion of dependencies, insults hurled at anyone who disliked using javascript for any reason, and a smattering of threads wrestling with what "open source" means. Javascript security definitely had its time in the Sun. Yet one topic was barely evident, perhaps because most of the people writing are primarily coders of various types, unless they were decrying the excellent browser extension NoScript for "breaking the web." NoScript doesn't "break the web", but it does reveal the issue nobody seems to be talking about.

If I am surfing with javascript turned off, I fully expect that some, if not most, websites may be less pretty. Some of their fancier whizzbangs may not work, like horizontal-scrolling slide shows, or games, or certain types of small-scale data processing for web applications that I may or may not want to use. This makes perfect sense. What doesn't make sense is when a site is completely crippled or blank without javascript turned on. It is a source of wonder to me that there are people out there who thought it was a good idea to build websites with navigation systems that are useless without javascript turned on, let alone sites that are impossible to view at all without it. Mind you, the impossible-to-view ones are so-called "active sites" built up "on the fly" from data sucked in from a whole pile of other places mostly via syndication. Thankfully, there are numerous alternatives to sites that effectively do what used to be done by having a "best viewed with browser x" label: saying loudly and rudely, "fuck off, we're too cool for your stinkin' page views." Among the many things accused of breaking the web right now, this is one of the few real ones.

Note the issue here is not javascript as such. The problem is specific implementation choices that effectively drive away or otherwise punish people for not being willing or able to allow javascript-heavy sites to have their head. There are a lot of good reasons not to, from security to fending off data overages caused by pointless page bloat. The decision to build sites geared exclusively to people who have no concerns about data caps, RAM limitations, or security worries implies a particular audience. Said audience is apparently mainly affluent and english-speaking, with a preponderance of males of no more than middle age. This is not surprising, especially in light of the three pieces I wrote about what is wrong with the web, which I won't reprise here (1 | 2 | 3).
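
To make the implementation point concrete, here is a minimal sketch of the progressive enhancement approach, written in typescript for the browser. The idea is simply that the navigation ships as ordinary HTML links that work with javascript off, and a small script only adds the collapsible menu behaviour when it is allowed to run. The nav-main id, the is-collapsed class, and the button label are hypothetical names chosen for illustration, not anything quoted from a real site.

    // Progressive enhancement sketch: the page ships with a plain <nav id="nav-main">
    // full of ordinary <a href="..."> links, so navigation keeps working with javascript off.
    // If this script does run, it upgrades that same markup into a collapsible menu.
    function enhanceNavigation(): void {
      const nav = document.getElementById("nav-main"); // hypothetical id for the existing nav
      if (nav === null) {
        return; // markup not present; the plain links are untouched and still work
      }
      const toggle = document.createElement("button");
      toggle.type = "button";
      toggle.textContent = "Menu";
      toggle.setAttribute("aria-expanded", "true");
      toggle.addEventListener("click", () => {
        const collapsed = nav.classList.toggle("is-collapsed"); // hypothetical class styled in CSS
        toggle.setAttribute("aria-expanded", collapsed ? "false" : "true");
      });
      nav.insertAdjacentElement("beforebegin", toggle);
    }

    // Only enhance after the document is parsed; if the script never runs, nothing is lost.
    document.addEventListener("DOMContentLoaded", enhanceNavigation);

A site built this way degrades to something plainer instead of a blank page, which is all the complaint above actually asks for.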

What mystifies me is how little traction this issue has despite how germane it is to others relevant to the future of the web, who it will be open to, and how. The attitude of "it's broken if javascript is off" jibes very nicely with "it should be broken if DRM isn't on it." The difference is whose ox is figuratively gored, not the attitude behind it. Declaring the web "broken" without javascript is like declaring it or privacy dead: a species of wishful thinking, and far from the actual facts. When it comes to narrowing the accessibility, utility, and even basic enjoyability of websites, we already know how this story goes. As far bigger web pundits than me have said, the story we've already heard is television, cable or otherwise.

At the risk of being labelled "unrealistic" – which nowadays is actually a compliment in an airspace so polluted with false "there is no alternative" messaging – I stand with those stubborning for an open web. Whizzbangs and an open web are not mutually exclusive unless we make them that way, and we don't have to. Some folks may prefer a web something like the hyper-enclosed one pictured at the top of this thoughtpiece, yet there's something far more impressive in a creation more like the communal webs orb spiders can build up under the right circumstances. (Top)

Five United Pacific Tribes Vindicated After 20 Years (2016-05-19)

Clipping of photograph of the bust created by StudioEIS. Original photo by Brittany Tatchell of the Smithsonian, quoted here from GeekWire, 2016. Clipping of photograph of the bust created by StudioEIS. Original photo by Brittany Tatchell of the Smithsonian, quoted here from GeekWire, 2016.
Clipping of photograph of the bust created by StudioEIS. Original photo by Brittany Tatchell of the Smithsonian, quoted here from GeekWire, 2016.

Over twenty years ago, another in a long series of disrespectful actions towards the remains of a Native person was carried out in the united states. The original discovery of the remains of that person was not necessarily disrespectful. In fact, based on what I understand about the events all that time ago, from the account in my then-current archaeology textbook in a first-year university class, the discovery was an accident. As is usual when human remains are found where at least white people did not expect them to be, an anthropologist was called to help determine whether these were recent or ancient. The same anthropologist made at least one poorly chosen comment in the media, helping fuel another shameful episode of disrespect and double standards, where the remains of a Native person are treated as curios and retained "for study" against the wishes of the nearest kin present.

If you follow archaeological news at all, you already know this is a reference to the so-called "Kennewick Man," who has just been confirmed to be exactly who the Confederated Tribes of the Colville Nation said he was. Their relative. He will be reburied soon with appropriate rites in a secret site, if he hasn't been already, since the findings were announced last month. The clip here is from smithsonian photographer Brittany Tatchell's picture of an attempt to recreate the living appearance of this person. It is cropped mainly to encourage you to go see the original if you can, or at least the full photo at GeekWire.

To this day, there are many non-Native people who declare that they can't understand what all the fuss is over the treatment of human remains, especially those so old that "no one could be related to them anymore." Or at least, those are the folks whose words get reproduced in the media most often. In reality, both non-Native and Native responses are far more complex, not least because their sense of whether they have a relationship of some kind with people from the past where they live may vary. Their beliefs about how the ancient dead should be treated may also vary significantly, depending on whether they believe they have obligations to those past people. Archaeologists can be especially muddle-headed on these issues, because in professional origin they are thieves of both the artifacts and bodies of peoples deemed "conquered" or otherwise "lesser" by those defined as "white" in roughly the nineteenth century. (Those affected include practically every culture and people around the mediterranean.) So a sense of entitlement, and a belief that anything is okay as long as it is for science, was originally inculcated in anyone involved in such work, right into the 1960s. Add to that the still persistent teaching that any sort of epistemology outside of "western epistemology" doesn't even count as real thought, and the fact that any archaeologists are able to work constructively with anybody at all today is astonishing.

UPDATE 2016-10-23 - Courtesy of a heads up at unwrittenhistories.com, I can now add that legislation has been passed to return the Ancient One to his relatives. There is a short write up on this development on the canadian broadcasting corporation's news website. Be aware that it is illustrated with a picture of the Ancient One's skull and a recreation of what they may have looked like – well, sort of, because for some reason the forensic person gave them a massive and frankly ridiculous beard that hides most of their face anyway.

The sense of outrage and the subsequent media and legal mess centring on the Ancient One had a more immediate practical tie, however. The mistreatment and exhumation of Native bodies in order to strip them of any grave goods or jewelry they may have been buried with is not just something people did before the "west" was supposedly "won" in north america. It is an ongoing problem to this day. There are too many former community sites Native people still return to in order to take care of the cemeteries, worrying every day that the time will come when they will not be able to protect those places anymore. Some of those cemeteries are more recent, others are incredibly ancient, and people continue to care for them.

I have focussed on Native examples here, but this is not just a "Native" issue. Take some time to read the information at the National Burial Database of Enslaved Americans, a crowdsourced effort to document the many cemeteries enslaved people were buried in that are commonly unmarked. Not just unmarked, but like Native cemeteries, often ploughed under or built over, the names of the people buried there effaced. The majority of these enslaved people were of african heritage, and so it is predominantly the african american community engaged in this crowdsourced effort so far. Free african americans or african canadians didn't necessarily see their cemeteries get treated with respect either. For example, see the documentary 'Speakers for the Dead.'

By now, perhaps feeling a bit bludgeoned by how awful this all is, gentle reader, you may feel inclined to say, "but science!" Well, it is quite possible to do science without being assholes about it, to be blunt, and that includes studying our ancestors. There are projects all over north america where archaeologists have worked with Native communities to follow the correct cultural protocols for handling ancestors who have been uncovered during excavations for various projects, or who have been uncovered by erosion. It's not easy for the outsider scientists to do, because they have been so used to doing whatever they pleased without having responsibility to anyone but their institution. The resulting research may not be good for generating media circuses and employment for lawyers, but it is far better for actually understanding the past and building better relationships between Indigenous peoples and newer arrivals. So yes, the adrenaline rush of being in the media spotlight and imposing their will upon the recalcitrant Natives would not be part of the package for anthropologists or archaeologists involved in such projects. That shouldn't be a real loss though, since those things aren't science.

We'll never know how things might have turned out if at the very beginning of the "Kennewick affair" a decision had been taken to say to the Native people concerned, "You're right, this is your ancestor, and we respect that. But we'd also love to learn what we can about this ancestor before you rebury them. Can we figure something out?" The missed opportunity seems all the more striking to me, because the DNA analysis has reiterated that the Native people were right about this Ancient One in the first place. (Top)

Art Trouble (2016-05-08)

Stick person getting squished. C. Osborne, may 2016. Stick person getting squished. C. Osborne, may 2016.
Stick person getting squished. C. Osborne, may 2016.

There has been, alas, no shortage of examples of artists being treated poorly by a whole range of parties bent on suppressing or repressing their work. (By "artists" I mean writers and sculptors as much as those who create images or music.) The examples we hear the most about tend to be those of artists affected by government action, especially when the government in question is on the "disliked by western governments" list. In occasionally close second place are the artists struggling against misguided accusations of copyright infringement. It seems rather distant as a phenomenon when set out that way. However, the sense of distance decreased sharply for me on bumping into this snippet, paraphrased by Susan Dubalskis-Hunter in her thought-provoking book, Outsider Research. In the text, she is examining how even well-meaning non-Native researchers may negatively impact the Indigenous people they are so eager to study.

Wheeler states that historically, it has always been the artists of any people, rather than the politicians, who have made substantial changes within a society. Furthermore, he states that even though writers and artists have traditionally been persecuted and poor, it will be these people, rather than politicians, that will lead to change for Native people.

It is with wailing and gnashing of teeth that I discovered Dubalskis-Hunter's reference to Wheeler is incomplete, so I have not been able to determine which male Wheeler this is or find the original context of these comments. For the purposes of this thoughtpiece, though, the key point is about the "traditional" state of "writers and artists," a state not traditional in Indigenous societies as it happens. When reading about the history of western art, it doesn't take long to get the message that the main thing western artists experience is precariousness. The precariousness is most often economic, but may include health problems whether mental or physical (insofar as the two can really be separated). There are exceptions of course, for every Mozart a Haydn, for every Wollstonecraft a Behn. Then there are some other phenomena to consider, such as the fact that children and young adults are actively discouraged from pursuing an artistic calling at all, or any interest in art beyond the earliest years of schooling. The first classes to be cut under funding constraints are art classes of all kinds, with physical education close behind. There also seems to be a remarkable level of social hostility directed at artists, especially if they have not got themselves wedged inside a corporate machine of some kind. Artists are far from indolent, and we all know it from our experience with the one art none of us can avoid: writing. There is no way to produce something good without working at it, one way or the other.

So it seems there is a lot of effort from different directions going into repressing artists and therefore art by discouraging it from being made in the first place, to say nothing of the various obstacles in the way of sharing it. Suppression is very much a late-stage event. Altogether, the level of effort and general suffering this must all require is completely senseless, unless we take seriously the claim that it is artists who make major changes within a society. The truth, of course, is that we do. Otherwise we couldn't feel horrified at the idea of bookburning, or be disturbed by fiction. We know very well that the imaginary worlds shared by artists of all types are not so much unreal as gently obfuscated. Science fiction is as much about the present as about an envisioned future. Satire is as much about possible futures as it is commentary on current events.

Change is difficult, and we humans are predisposed to fear what we don't know. We certainly don't know the future, and are at times alarmingly uncertain about the past. Seo-Young Chu has argued that science-fictional writing is a way of making knowable what we can't yet make sense of. I am inclined to agree, and extend the idea to art in general, that is, artists help us make sense of events, people, possibilities and so on that we may not be able to understand yet. This is a powerful and risky job, since all too often we don't react well to the folks who are trying to help us cope with change we can't avoid. Perhaps the fear and hostility inspired by the unknown, especially in its form of impending change, is being directed against the artists in hopes of chasing off the unknown once and for all.

Thinking it through in this way, it seems art and science are indeed not so far apart after all. What else is science but an ongoing effort to make sense of change and transform the unknown into something familiar? Art and science are non-trivially different in how they go about these tasks to be sure, but it is critical to realize they are not antithetical to each other. Which means that when the artists are having problems, the scientists will soon have analogous problems of their own. For a recent example, have a look at the effects of the recently ended period of hyper-conservative government in canada on the arts and sciences. It's alarming food for thought. (Top)

Don't be Such a Neanderthal? (2016-04-07)

Neanderthal facial reconstruction based on the skull of the Gibraltar 2 Neanderthal specimen, image courtesy of the anthropological institute at the university of zürich, 2010. Neanderthal facial reconstruction based on the skull of the Gibraltar 2 Neanderthal specimen, image courtesy of the anthropological institute at the university of zürich, 2010.
Neanderthal facial reconstruction based on the skull of the Gibraltar 2 Neanderthal specimen, image courtesy of the anthropological institute at the university of zürich, 2010. This wonderful photograph is widely reproduced but too rarely with its origins noted, evidently because they were surprisingly tricky to find in 2016. Perhaps they were considered common knowledge, as even Heather Pringle did not mention them in a blogpost from the same year. I finally found the basics of the citation in the abc science article, Humans Interbred With Neanderthals: Analysis, 7 may 2010.

A startling number of years ago now, as part of my archaeology degree I had to complete two different courses that included components discussing the species now known as Homo sapiens neanderthalensis, among other hominins. This was long enough ago that "hominins" were still called "hominids," actually. Other creatures more or less like us are bound to be compulsively interesting, and one of my classmates wrote a huge paper about the "neanderthal question": whatever happened to them? It's not a bad question, and we had a lot of discussion over it then, and the discussion continues now. (It was a really cool paper!) I have been fascinated by the uncanny (yet unsurprising) parallels between the changing neanderthal narrative and another narrative I am personally familiar with. Here is the approximate neanderthal narrative of the time that I took those two classes.

Once upon a time, there was a population of early humans. They were isolated by migration and natural geological forces, especially glacial movements, from other human species. Left to their own devices, they developed specialized, rather primitive technology and never had very high numbers. Then the glaciers retreated, and early modern humans began to migrate out of africa, spreading everywhere their feet could take them, on the assumption that nobody had dabbled in any kind of raft or small boat building. (The plausibility of that assumption needn't concern us here.) They soon encountered the isolated human population, and it turned out the early moderns had much better technology, could speak, and were generally more intelligent than the isolated humans. Incessant violence ensued, with the bigger, cleverer, better-equipped early moderns soon wiping out the isolated humans, who were, of course, the neanderthals.

I never liked this narrative. For a long time, I couldn't quite put my finger on why. Then it dawned on me that it was because the "why" was too damn big for one finger. Consider this next narrative.

Once upon a time, there was an early population of humans in the americas. They were isolated by migration and natural geological forces, especially glacial movements, from other humans. Left to their own devices, they developed specialized, rather primitive technology and never had very high numbers. Then the glaciers retreated, and humans from europe eventually began making their way to the americas, because they were the only ones serious about boats. (Leave the polynesians and chinese and japanese out of this, they don't concern us here.) The europeans soon encountered the isolated human population, and it turned out they had much better technology, could speak better languages, and were generally more intelligent than the people already in the americas. Incessant violence and helpful epidemics ensued, with the cleverer, better equipped europeans soon wiping out the original population, who were of course, the Indigenous peoples of the americas.

The americas weren't so isolated before europeans came along, far from it. Furthermore, Indigenous people are still alive, kicking, and continuing to refuse to be wiped out by ongoing colonial oppression and violence. And for the record, we Indigenous folks were no more and often less violent than the europeans, and widescale warfare was nearly unknown. (I think this had as much if not more to do with hard-won practical sense as with trying to impose peace as an abstract value, but that's an essay for another day.) If Indigenous languages are so much lesser than european ones, I am at a loss as to why such violence, including residential schools and laws against speaking them, has been imposed. After all, a better language or practice hardly needs violence to make people take it up, does it?

Many of the earliest european newcomers to north america were either prisoners thrown off the boat into permanent exile as punishment, or various other sorts of marginalized individuals. The first chance they got, they hightailed it to Native communities and integrated, something so hated by colonial authorities that in early english colonies going to live with "the Indians" was punishable by capture and a gruesome death in front of the rest of the colony. "Integrated" of course included intermarriage, which means kids and descendants. All that continued going on even after the european invasion got bigger and far more violent. The story was not like this in central or south america, because militarized invasion was the starting point there.

UPDATE 2021-03-02 - Apparently certain modern humans are still arguing about whether neanderthals could not just talk, but hear in the way that modern humans do. Honestly, I don't know if the people who write these articles appreciate how they sound, nor if the scientists busy with these studies realize things sound a bit off. Then again, maybe for some projects it is a sure way to hang onto funding, which sometimes can lead to obsessive seeming research. On the other hand, I do appreciate that the ability to three-dimensionally model the skulls of specimens of different hominin species and then determine their real life frequency range sensitivities is cool, and arguably valuable to develop as a technique. So in that case part of the justification would be refining that technique. None of which detracts from the interest of the Science Alert article reporting on this, In a Momentous Discovery, Scientists Show Neanderthals Could Produce Human-Like Speech by Michelle Starr.

UPDATE 2022-08-11 - I spent a good part of an afternoon tracking down the origins of the photograph illustrating this thoughtpiece, including far too much time digging through search engine image results. Neanderthal reconstructions are very popular among some rather strange groups. Alas, this research also demonstrated that I was not wrong to draw parallels between the two tell-tale stories at all, considering how many of the reconstructions were clearly based on pictures of Indigenous people from the americas in their regalia.

Now the problems, and complete bullshittery, of the early neanderthal narrative should be obvious. Since its all too glaring racist political convenience didn't knock it down, other new evidence finally did. For example, archaeological evidence demonstrating neanderthal ritual behaviour, tailored clothing, sophisticated tools, a wider area of occupation that was not completely isolated, and long-term interactions with early modern humans. Or genetic studies demonstrating that many modern humans today have neanderthal DNA, and yet other evidence strongly indicating neanderthals could talk. Furthermore, neanderthals were probably not all white – which should surprise nobody at all. Just search for images of people who live in the arctic today, then check out pictures from earlier times, which is easy because circumpolar peoples, especially Inuit, are among the most photographed on the planet. Arctic peoples are mostly dark-skinned while having a range of skin tones, just as they did five hundred years and more ago. This is probably a general human feature, and pure populations of a given colour are as mythical as "races."

What we may have in the neanderthals and the denisovans are two human populations that genuinely interbred with people who match the criteria to be called "modern humans" and finally ceased to be separate. However, I can't see how this amounts to "extinction" for either one, because successful interbreeding means these three groups were not separate species. What has happened is that a particular set of phenotype and genotype combinations is no longer present, or at least prevalent, today. I say "prevalent" because I have an acquaintance who has most of the features we were taught to associate with neanderthals in my classes: a skull marked by an occipital bun, flattened overall head shape, large nose, smaller to retreating features of the lower face in comparison, but a large mouth; a body tending to be stockier and broader than modern humans. Yes, that's anecdotal, but as it happens, I met him before attending university, so that was probably the original source of my doubts about the early neanderthal story. Oh, and I should add: he was italian, second generation in canada.

In other words, life was even more interesting and complex in the past than we thought, and the vicious social darwinist caricature of evolution has received another good drubbing. Unfortunately that story is reminiscent of a bad fungal infection and has to be regularly beaten back. There are numerous great articles out there that bring together the latest findings about neanderthals, denisovans, and other human populations. A great place to start is Athena Andreadis' The House with Many Doors – never underestimate the insights of a molecular biologist who writes beautifully. Anthropology professor John Hawks' article Culture, Mathematical Models, and Neandertal Extinction is another great introduction. Somewhat less helpful but with excellent links to papers in Science and Nature is Two New Studies Undermine "Over-simplistic Models of Human Evolution" at Ars Technica. I'm not sure why the second half of the title is in scare quotes, since early models of anything complicated like evolution are bound to be oversimplified. It's hardly contentious to acknowledge that!

POSTSCRIPT: For anyone wondering about the lovely picture to the upper left, that is basically the only neanderthal facial reconstruction picture that I could find online that was not a bearded male with a strange aversion to combing his hair and all manner of repulsive skin conditions. (The problems with the assumed skin, hair, and eye colour are an issue for another day.) I've never understood the insistence on reconstructing ancient people as almost exclusively male and unaware of combs or comb-like devices. We humans are vain, have the typical primate grooming instinct, and the remains of combs have been recovered from at least the early neolithic. For the palaeolithic, the evidence for combs I know of is more indirect and includes early textiles and the "combed" surfaces of early pottery. (Top)

Comic 538 'On Password Security,' by Randall Munroe, reproduced as per the Creative Commons Attribution-NonCommercial 2.5 License from his site xkcd.com. Comic 538 'On Password Security,' by Randall Munroe, reproduced as per the Creative Commons Attribution-NonCommercial 2.5 License from his site xkcd.com.
Comic 538 'On Password Security,' by Randall Munroe, reproduced as per the Creative Commons Attribution-NonCommercial 2.5 License from his site xkcd.com.

"Consent" is an important concept, and has been a growing area of interest and reasoning over the past few years, not least because it turns out to not be so simple to sort out when consent is not being given freely, pace the xkcd comic at left. There aren't many places it doesn't show up in north american liberal discourse, since liberalism begins with the premise that all people are separate individuals whose personal freedom should be as great as possible within the confines of society. A free person is someone who can give consent, that is, permission for something to happen. With this in mind, it is no surprise that "consent" has become a key area of contention ever since white males were forced to grudgingly admit that they were not the only "people" in existence. If freedom can no longer be defined as merely having a peculiar lack of skin tone and a particular relationship to coercive power, and a person wants to elide or deny a commitment to the idea that "might makes right," then consent comes in quite handy. It wasn't supposed to be this way of course, as the invocation of "free, prior, and informed consent" in the United Nations Declaration on the Rights of Indigenous Peoples makes clear. The part that makes consent double-edged is not that it is wrong or impossible, but that it is invoked in contexts where structural factors may make it difficult indeed for anyone to actualize.

An excellent example of this came up in an unexpected place just this week via the Geek Feminism blog, Creatrix Tiara's essay on economic consent and willingness to pay or sell. Her discussion is rare coverage of this angle from someone other than Indigenous scholars. Many more people would be less skeptical about the claims by the fundamentalists of late corporate capitalism that it is the inevitable and best sort of economy if its boosters weren't so insistent on using violence and coercion to crush other possibilities. (After all, logically if capitalism is inevitable and the best, there would hardly be a need to enforce it.) Creatrix Tiara isn't considering the issue from quite this high up, rather she is dealing with the effects on a more individual basis. Her examples include people in precarious wage work and those working as artisans and artists. These are definitely well chosen, because they bring the potential and real existence of exploitation to the forefront.

How many people who invoke Adam Smith as the patron saint of unfettered capitalism have read what he actually wrote on the subject? I was quite surprised on first discovering that he was actually opposed to monopolies, that he defined the so-called "invisible hand" as a social force active only within very narrow circumstances, and that he wrote an entire book on his Theory of Moral Sentiments before plunging into his later books, which included the (in)famous The Wealth of Nations. A major reason Smith was concerned about "moral sentiments" was his desire to understand how to control a presumed preponderance of the desire to fulfill self-interest in society at large. Considering he was writing in the 18th century, a period of intensive colonialism abroad by england and enclosure within it, along with the economic exploitation attendant on both, his concern is not surprising. By this I do not mean he was very sympathetic to the people being exploited, and he wasn't much interested in consent in the modern sense.

Creatrix Tiara asks a series of thought provoking and far from rhetorical questions about consent and economics at the end of her essay, a couple of which I will quote here, to provoke thought experimenting:

How different would our thinking around economics be if we added consent into the equation? ... How can we craft situations so that people's Willingness to Pay or Sell arise from full, informed, active consent, and that they have the freedom to seek what they need at the price they can and truly want to bear?

(Top)

Science-type Thoughts (2016-02-29)

The frontispiece from Charles Lyell's 1835 book 'Principles of Geology' via wikimedia commons. The frontispiece from Charles Lyell's 1835 book 'Principles of Geology' via wikimedia commons.
The frontispiece from Charles Lyell's 1835 book 'Principles of Geology' via wikimedia commons.

According to my electronic OED, the primary meaning of the word "science" is this:

The intellectual and practical activity encompassing the systematic study of the structure and behaviour of the physical and natural world through observation and experiment.

This is a quite practical, even ecumenical definition in the sense that it allows recognition of more than just, say, mixing chemicals or using copious amounts of electricity in an attempt to keep breaking subatomic particles up into smaller pieces. The more controversial question, though, is often whether or not it respects knowledge with these characteristics when that knowledge is produced by people who are other than white, male, straight, and able-bodied. Which sounds crazy, I know. Why should those features matter anyway? They aren't even in the definition. They shouldn't matter. But the truth is, they are made to matter, not by the nature of the definition, but by the nature of all the things it must inevitably leave out.

The scan of the frontispiece of Charles Lyell's Principles of Geology is shown here advisedly. His book is (in)famous for being a key influence on Charles Darwin, who of course wrote two monolithic books laying out the concept and evidence for evolution by natural selection. The fact that there are still people spending incredible amounts of time, energy, and money trying to combat the spread and continuing use of the very idea of evolution by natural selection certainly reminds us that science is done in a social context, and indeed, so is every other intellectual pursuit. It's one of those things that really is a feature, even when at times it feels suspiciously like a bug.

The bug is actually how hard it is for us to see past what we assume to be just "the way the world is," which came up in last week's thoughtpiece as well. If the people busy doing science aren't the stereotypical sorts of people who do science, or if they aren't doing science in the usual way, or they aren't sharing their results in the way that "mainstream" society says science should be shared, a chorus of protests goes up, "That's not science!" Getting mixed up about what is primarily form as opposed to substance can lead to no end of trouble, and we're all dealing with an excellent example right now called "anthropogenic climate change."

What do I mean by form versus content? Well, let's take an example that is not "mainstream" but still considered a part of the western intellectual tradition. By remarkable chance, a long didactic poem in latin has survived from antiquity, written by Titus Lucretius Carus, a follower of Epicurus. Among the various teachings attributed to Epicurus is an early version of atomism, and Lucretius' poem De Rerum Natura is one of the earliest written accounts of it. There are still people who question whether the account of atomism in this poem counts as science, not least because it is part of a poem, and "poetic language" has been highly distrusted in the english language since the enlightenment. Well, the poem reports a theory and a series of observations that Lucretius considers evidence for it. That's science. Since Lucretius' time, new and better observations have been made, and the theory was first adjusted and then eventually replaced. That Lucretius was writing poetry and this early iteration of the idea wasn't perfectly accurate at the start doesn't make it not science. It's odd that the short definition above leaves out the role of iteration and correction, which makes it sound like scientists can just read off "the answers" by means of observation and experiment. (For a post-enlightenment example of form being used instead of substance to judge what science is and who can do it, read Simon Winchester's The Map That Changed the World. In my view, it is his best book.)

Okay, now for a less "western" example. The late Suzette Haden Elgin gave this summary of a hantavirus outbreak in the four corners area of the states in her book The Language Imperative (2000):

When the American Southwest was hit a few years ago by a mysterious and usually fatal illness (eventually determined to be a hantavirus), the Centres for Disease Control investigators paid little attention to the Navajo healers who explained the illness by telling a story about the dangers of interaction between humans and mice. The CDC's attitude about the Navajo myth changed when their investigations showed that the source of the illness was in fact mice, which were unusually abundant in the Four Corners area at that time due to equally unusual weather. The English and Navajo speakers were using different metaphors, but the recommended behaviour – avoiding mice and their droppings and debris – were exactly the same.

So here we have scientific knowledge that is encoded in what non-Navajos call a myth, so it was not taken seriously by the investigators struggling to understand the outbreak of illness and stop it. They didn't even do a science-type thing such as going, "Well, not sure if this is true, but can it hurt to test some mouse droppings and catch a few mice and check?" We all know that under such conditions everyone is concerned about wasting time and resources in an emergency, but this wasn't some random person coming in off the street imparting the information. It was Navajo healers speaking up from a corpus of knowledge they have built up independently of the western version of science, and encoded and "published" if you like, in a different format.

I am not presenting this to make the argument that it is important to respect the knowledge of the "non-mainstream" so that the "mainstream" can appropriate what it finds profitable. That is self-evidently an invidious argument that has nothing to do with respecting the hard work and knowledge of people who may be different from us, whether we are mainstream or not. What really blows my mind is just how much about the world, how many wonderful and amazing things about it and all the crazy beings living in it, we may never understand or even recognize because of cognitive filters we may not realize are there. Now consider this: all the folks challenging assumptions that reality is only what a few white males say it is are helping identify those cognitive filters, which makes it possible to work around them. Thanks to them, we have a fighting chance of understanding and appreciating things we could never otherwise have imagined. How cool is that? (Top)

Natural Semantic Metalanguage (2016-02-22)

Illustration of the hyoid bone from Gray and Carter's Anatomy, 1858 edition, via wikimedia commons. Illustration of the hyoid bone from Gray and Carter's Anatomy, 1858 edition, via wikimedia commons.
Illustration of the hyoid bone from Gray and Carter's Anatomy, 1858 edition, via wikimedia commons.

Being social creatures, we are impelled to find some way to communicate unless a grim combination of terrible mishap and no workarounds at all strikes us, so language is something we care about a lot. It also happens to be a topic that we seem to find almost universally accessible. We all know our mother tongue and have some opinions about it and what constitutes a good accent, or a reasonable gesture style in the case of folks who sign, a reasonable vocabulary, and so on, even if we have never sat down and thought these things through. Admittedly, many of these are received opinions, ideas we imbibed in grade school and at home as we grew up. This is probably where many people learn to be terrified of public speaking, for example, or, if they are english-as-a-first-language speakers, where they learn that english somehow directly reflects reality for everyone on the planet. I don't use this description facetiously – it's a weird fact that over time becomes clear to Indigenous people who may speak english as a first language, but have also learned a different worldview, which english doesn't fit.

UPDATE 2021-05-22 - Interested readers may also wish to read Kenneth Hale's papers and articles compiled by the linguistics faculty at mit in his honour in an ongoing collation project, A Tribute to Kenneth Hale. I suspect it is far from coincidental that like Wierzbicka, Hale undertook significant work on Indigenous languages spoken in australia, many of which (I can't say they all do based on my limited knowledge) have ergative case systems and cannot be simply assumed to follow the subject-object-indirect object sort of divisions most IE language speakers are now used to.

On a separate note, I should add a little more detail about the attempt to use latin as a universal language among at least european scholars, as it was not necessarily a tactic based in conscious linguistic chauvinism. For much of the middle ages until the early renaissance in europe, to be a scholar meant by definition that the person in question had learned to read, write, and often even speak latin in addition to their mother tongue. This was also a time when unilingualism was as unusual in what is still called the united kingdom (at least for the moment) as in the rest of europe, at least among the better off, which those who managed to access higher education generally were.

In the course of catching up with the sundry updates at Mark Rosenfelder's website, I stumbled on a reference to polish linguist Anna Wierzbicka and the natural semantic metalanguage (NSM) approach to linguistic study. Like any great idea, NSM has its share of linguists who act like it doesn't exist, those who rather vocally hate on it, the ones sympathetic but skeptical, and then the folks who work using it. Wierzbicka has been pursuing a major research program with Cliff Goddard, colleagues, and many students over thirty years or so. Its basic premise is that to correctly understand complex words, we need to be able to define them in terms of the simplest words possible, words similar to indivisible atoms in that they can't be defined. (You can try to define them, but will find that the attempt runs in a self-referential circle.) Those atoms are called semantic primes, and Wierzbicka and her colleagues further argue that they can be found in every human language, based on data from numerous languages of different types. For more details, ordinarily I would simply link to the page on it at griffith university. However, the individuals who run that website have placed all content behind a bullshit set of third-party javascripts that only allow you to see the content if you run them. That is called a security fail and attempting to break the internet. So instead, the wikipedia article is okay, and blessedly, Cliff Goddard has an introductory chapter posted on it that you can read as well. Not everybody agrees with the NSM approach, but if nothing else I think it is well worth reading about because of the interesting questions and ideas it raises. Part of why some people attack NSM, though, seems to be Wierzbicka's determined, constructive challenge to ethnocentrism in linguistics and more generally.
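
For readers who find a toy example easier to hold onto than a description, here is a purely illustrative sketch in typescript of that premise, assuming nothing about how NSM researchers actually organize their data: a handful of the proposed semantic primes kept as a set, plus a check that a candidate explication uses only words from that set. The list of primes is deliberately incomplete, and the explication of "sad" is my own loose paraphrase written in base forms, not anything quoted from Wierzbicka or Goddard.

    // A small subset of the proposed NSM semantic primes; the full inventory runs to
    // roughly sixty-five items, so this list is only for illustration.
    const SEMANTIC_PRIMES: Set<string> = new Set([
      "i", "you", "someone", "something", "people", "this",
      "good", "bad", "big", "small",
      "think", "know", "want", "feel", "see", "hear", "say", "do", "happen",
      "not", "can", "because", "if", "when", "now", "before", "after",
    ]);

    // Return any words in a candidate explication that are not primes; in NSM terms
    // those leftover words would themselves still need to be explicated.
    function nonPrimeWords(explication: string): string[] {
      const words = explication.toLowerCase().split(/[^a-z]+/).filter(w => w.length > 0);
      return words.filter(word => !SEMANTIC_PRIMES.has(word));
    }

    // A loose paraphrase in the style of an NSM explication, using base forms only so
    // that the naive word matching above stays simple.
    const sad = "this someone feel something bad because something bad happen before now";
    console.log(nonPrimeWords(sad)); // logs [] because every word is in the subset above

The self-referential circle Wierzbicka describes shows up the moment you try to write an entry for one of the primes themselves: anything you put into the definition either loops back to the prime or drags in words that are even harder to pin down.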

In the early pages of her book Imprisoned in English, Wierzbicka provides the following sample dialogue to illustrate just how powerful, and how pernicious, ethnocentrism grounded in english can be. No bad faith required:

A conversation with a linguist working on a language that has no word for "brother" (e.g., the Australian language Pitjantjatjara) may go like this:

- Does this language have a word for brother?

- Sure.

- What is it?

- Well, actually there are two: one for older brother (kuta) and one for younger brother (malanypa). And actually, this second word is for younger siblings in general, not just brothers.

- So there is no word for brother as such?

- Well, not as such, no, but I'm sure they have the concept of brother.

- Why are you so sure?

- Well, because the absence of a word doesn't mean the absence of a concept.

- So do you think that speakers of English have a concept of malanypa?

- What are you on about?

The linguist in the conversation is behaving unconsciously as if english perfectly reflects the real world, with every possible concept already represented in it. Yet as this quote illustrates, that's simply nonsense. If it were true, translation would be far easier than it is. The only reason it seems otherwise, apart from the natural ethnocentrism we all have to work our way through at times, is the fact that english is the language of the most recent bout of wholesale colonialism in the world. Before english, the language that supposedly had this universal vocabulary was latin, despite how much greek vocabulary it absorbed to deal with, oh, let's see: philosophy, medicine, non-roman religion, literature, mathematics...

Think of how powerful a realization Wierzbicka's point actually is. If an english-as-a-first-language speaker goes into dealings with people who speak other languages with the unconscious or conscious view that their language is in effect a perfect representation of the world, communication is going to become a serious problem sooner or later. It's just that sort of idea that led early anthropologists to conclude bizarre things about Indigenous people, like a failure to recognize siblings or having no concept of time. Or to the misunderstandings that modern english speakers can have reading Shakespeare's english, which includes many words that have shifted in meaning and expresses a worldview different from the present one. One of my favourite early modern english quotes giving an inkling of this is, "At whiche dealinge hee wepte and was nothing contente, but it booted not." from The History of King Richard the Third by Sir Thomas More. The modern applications are obvious, and all the funnier when it is translated into modern english: "At this change he wept and was not satisfied, but it provided no remedy."

The reason I think Wierzbicka's critique is constructive is that she is quite clear that it is possible to overcome the hurdle of taking english as the default. In fact, those of us who work in more than one language do it all the time already. The folks having the hardest time are english monolinguals, although any monolingual would suffer analogous trouble. Learning the raw basics of another language is of real help, including ancient greek and latin, or if you'd prefer something closer to now, say french or spanish. (Top)

Sprinkle a Little Technology On It (2016-02-15)

Vintage salt shaker that produces gears instead of salt. C. Osborne, february 2016. Vintage salt shaker that produces gears instead of salt. C. Osborne, february 2016.
Vintage salt shaker that produces gears instead of salt. C. Osborne, february 2016.

It isn't a new thing really for various pundits and hucksters to claim that every problem can be solved by more technology, including the problems caused by earlier technology in the first place. What does seem to be a new thing is that people are beginning to pick up earlier on when an attempt is being made to sell them more of the same bad bill of goods. There are a lot of booster articles about "artificial intelligence," "self-driving cars," "disruptive technologies" et cetera, et cetera. The thing is, people are getting quite fed up with unintended consequences, and with having their acquiescence to one application of technology manipulated and used as a supposed license to introduce something else nobody wants but a capitalist fundamentalist. Contrary to the way they have been depicted by their detractors, Luddites were not irrational machine breakers. They had a point, and we ignore it and their rationality at our peril.

I wound up discussing the problems of technology one day with one of my professors, and we went back and forth for a bit, working on the claim that "technology is neutral." Yet this seemed in defiance of common sense. A nuclear bomb is only neutral in the sense that it kills everybody dead, and this isn't what that statement is meant to entail. However, it seemed to me then, and still does now, that if we return to an older definition of technology, we'll be better able to think about it. Long before "technology" was used to refer to the "application of scientific knowledge for practical purposes" (this definition courtesy of my electronic OED), which conveniently allowed colonizers to claim only they had science and only they used reasoning to apply it, "technology" meant something very different. In early days it was defined in a manner closer to the meanings of its ancient greek and medieval latin parts: a discourse or study of some art, craft, or skill. The discourse or study part got left behind fairly early. This earlier definition, with or without the discourse or study, has some helpful properties that should not be given up lightly. By including the concept of art, it admits that art does include reasoning. Including craft or skill prevents us from being fooled into thinking that the knowledge and skill we carry between our two ears is not technology. In fact, being fooled into this perception is one of the cruellest tools in the set patriarchy and racism use to deny women and non-whites recognition of their knowledge. To return to this definition of technology, it also has the nice property of being neutral. There are no implications about how the art, craft, or skill is applied. The application is where the troubles come in.

UPDATE 2016-05-25 - Astonishingly, despite owning the resulting book in original and updated editions, it never occurred to me to reference Ursula Franklin's wonderful series of Massey Lectures, The Real World of Technology, here. Her discussion is not only brilliant, it is as relevant now as at the lecture series' original 1989 publication, if not more so. For an excellent overview, including a fine capsule biography of Ursula Franklin herself, see All Problems Can Be Illuminated; Not All Problems Can Be Solved by Meredith Meredith at the Berlin Biennale website.

UPDATE 2021-08-10 - There is a growing refusal of the caricature of the luddites so commonly thrown around by capitalist triumphalists. A recent capsule overview including good cross-references, I'm a Luddite. You Should Be One Too by Jathan Sadowski, is posted at the Conversation. For those who would prefer to listen to a more detailed discussion, there is also episode 13 of Justin Podur's civilizations podcast on the industrial revolution. The Luddite bits specifically are at 1:33:09 - 1:35:40, but the whole episode provides important context.

Guns are not neutral. There is nothing neutral about a device specifically designed to kill from a distance. They aren't internal combustion devices that have been altered on an ad hoc basis to fire projectiles, nor can they be used for anything else. They are killing machines. The folks who adamantly insist that "guns don't kill people" are correct only insofar as guns are not animate beings that run around firing themselves. However, that doesn't change the fact that they are lethal weapons only, and this is by design, not by mistake. That is a chosen application of the various principles that make guns work, such as enclosing and using combustion inside a small space, forging metal, combining chemicals to make gunpowder, and so on.

Let's take a tougher example. Robots and pretensions to artificial intelligence are back in the news again. (I have doubts about artificial intelligence because so far we humans haven't even figured out how the hell we think, let alone how or if it is possible to recreate that in computers.) Robots, as any reader of Isaac Asimov can tell you, are not inherently neutral with respect to human interest. This is not just true of fictional robots with artificial intelligence. This is just as true of the real-life ones that are being deployed in waves whenever they finally become cheaper than a hyper-exploited female workforce in a third world country. The key question is always, what is the robot for, and whose interest does it serve? Does the combination actually make sense? The factory owners who have continued a long process of deskilling workers precisely so that those workers can be replaced would say it does. So do the folks who argue virtual robots, aka software, can replace many service tasks, again after breaking them down into such small pieces that a worker assigned any one of them is deskilled. After all, a worker subjected to pressing a button all day is bored and miserable.

Except, the fewer people who are working in the wage economy, the fewer people have, you know, wages, money. That stuff required in capitalism to buy the stuff and services these physical and virtual robots are producing. This is a problem we don't have to have, but more physical technology isn't going to fix it. Nor is simply dumping all the robots going to fix it, because the issue goes back to how a particular technology housed in human brains is being applied. In other words, an ideology that combines fundamentalist capitalism with complete denigration of the human body and of the majority of human beings. You know, the ones who are not male, white, and rich. They only like sprinkling technology on it when they've got the shaker, and they're bound to prefer whatever serves their interests, which is all the more reason to refuse to leave the decisions wholly up to them, and insist on investigating for ourselves: cui bono? Good for who? (Top)

An Unsurprising Worm in the Apple (2016-02-10)

The old style apple logo designed by Rob Janoff and used until 1999, courtesy of wikimedia commons. The old style apple logo designed by Rob Janoff and used until 1999, courtesy of wikimedia commons.
The old style apple logo designed by Rob Janoff and used until 1999, courtesy of wikimedia commons.

In fact, the real surprise would have been if this worm were genuinely brand new, as opposed to now suddenly too big to ignore. As the infamous "error 53" issue strikes more and more iPhone 6 owners – who I guess are discovering they don't own their phones, since there is no protocol for them to call apple, verify their identity, and get their phones working again once they got the error – and more and more people find themselves revolted by the continued dumbing down of macOSX, the worm has definitely gotten big. To be clear though, this is emphatically not an "I told you so" rant, and not just because that sort of thing is simply being an asshole. I certainly never predicted the direction apple has gone. This is a fan of what is now clearly an "old apple" writing, one who slowly began to suspect that the grim logic and perverse incentives of the legal fiction called a "corporation" in the united states had finally gotten the upper hand. And then my suspicions were probably triggered quite late compared to the information available, and by something that even I laughed at myself about. This was my irritation at the seemingly monthly iTunes updates that not only made the program worse, but included longer and longer EULAs that made me wonder if I was unwittingly promising to give up a kidney or something. From there I finally learned enough to realize that the "corporation" aspect is a misfeature of the company. ("Misfeature" is a bit of hacker jargon that refers to something billed as a "feature" that causes far-reaching and even unanticipated problems instead of solving a problem or providing a useful service. There is good reason to foist it away from hackers and into the general english lexicon, I think, because it does not entail malice.)

UPDATE 2016-02-18 - On "Error 53": It appears that apple has had an iOS update out for a while that mitigates this issue in a mostly sensible way, in that the touch sensor on the afflicted phones and the programs it interacts with are merely disabled after the update is applied. According to the page providing instructions for applying the update and unbricking an afflicted phone, it was last modified on 21 december 2015. Something curious apparently went awry in the apple media machine. There also appears to be an out-of-warranty part replacement program being sorted out to allow the objectionable touch sensor to be dealt with. This is certainly better than killing the phone, and maybe there is a slim hope that the bright sparks at apple will remember that access to apple-authorized repair shops is not so easy in many places where apple products may be used.

UPDATE 2023-01-01 - For a different perspective on how apple has changed using specific hardware and software feature examples, see Brian Lunduke's recent post on his substack Apple has changed... and not in a good way.

UPDATE 2023-10-16 - I have happened on another article relevant to this item, one from october 2015 by Don Norman and Bruce Tognazzini, How Apple is Giving Design a Bad Name. If you have been dealing with the execrable state of GNOME 3 and/or iOS, this article sets out the source of the trouble. At this point I am so pissed off with so-called modern GNOME and supposedly better and modern Wayland that KDE and MATE are running neck and neck to replace it on my GNU/linux machines.

UPDATE 2024-02-24 - One more quick note here. In the course of going through some older user manuals for some of the more elaborate software I still use, I found that one provided an excellent summary table of file formats it supported, how much the support covered, and so on. If microsoft was willing to do this for its own versions, I would, not entirely graciously, give them credit for that, as it would make life so much easier for so many of my windows-committed friends and colleagues. But this real-life table had an additional footnote, which sadly noted that apple does not make its pages format specification widely available, so the programmers are not able to support it. Just in case anyone thought I was exaggerating when describing apple's deliberate strangulation of the original ClarisWorks and subsequent iWork, this should make it clear I am not. Apple can't play microsoft's game of slowing support and editing of shared files by refusing to explain what the format is. Unlike microsoft at this stage, apple has demonstrated a propensity to litigiousness, also making it too risky to reverse engineer the format to provide built-in support. Now, like microsoft, apple doesn't seem to like outside developers much.

I am among the folks who have enough comfort with computers to experiment with alternative operating systems, and originally that's all it was, experimenting, usually on my local university's computers. However, once the apple OS reached snow leopard, the push towards iOS-ification became clear. I did update to mavericks, because I knew it was possible to turn the most obnoxious stuff off in that version – and I am able to return to the previous system if I wish. Mavericks of course is also the version with which apple decided to stop charging money for upgrades, and that meant the next system was bound to mine personal data more actively by default. Sure enough, yosemite came along, and a Fix MacOSX website followed right after, along the same lines as Fix Ubuntu and for similar reasons. What that means for me is that at long last, when the security updates for mavericks stop coming, I will be switching over fully to gnu/linux. As of 2022 the gnu/linux flavours at work on my computers are trisquel (based on ubuntu) and pureos (based on debian), and I am looking into freeBSD or even the hello os as a better fit for the oldest apple hardware I have that is still running.

Quite apart from the real issues of privacy and security we are all facing right now, the new interface, with a look and feel reminiscent of something from a primary school reader, is bad enough. On top of that, the system is locked down in more and more ways that don't even make sense. For instance, by default your own library folder is invisible, and it is almost impossible to set a default save folder other than "Documents." Yet all of this could still have been fine. All there needed to be was an addition to the "General" pane under System Preferences: System Skin - Mavericks/El Capitan; Show Library Folder? with a checkbox. Well, that and a clear choice to "opt in" to data mining or not up front on first login. User choice was once an apple selling point. You could get into the more complicated stuff, mess with the interface, try writing your own software easily, even *gasp* mess with the system folder – or say the hell with it and stick with how things came out of the box. It was up to you.

UPDATE 2016-05-16 - I have been working my way through the archives of Athena Andreadis' excellent blog, and stumbled on her post Kalos Kagathos, a brief meditation on her appreciation of apple computers and the passing of Steve Jobs. It is well worth the read.

UPDATE 2021-08-06 - Wow, the rot has only gotten far worse since this thoughtpiece first went up. On 5 august 2021, Gareth Corfield's article at the register joined the chorus passing on the news that apple is about to start scanning iphone users' devices for banned content, warns professor. The official aim of this step is a good one, but the actual implications of it are terrible for the security of iphones and of the icloud service, which by the way is mostly hosted on amazon servers. For those who are less friendly to the register, naked capitalism has a solid write up with related links and excellent comments. The "solution" apple is planning to implement won't actually solve the problem it is supposed to respond to, alas.

Let's give apple the credit it's due for a remarkable marketing run combined with at least some genuine competition on quality. Software written under apple's aegis or by companion companies used to be among the best out there. I still tell the story of the rock solid ClarisWorks, which was so stable that when it crashed, I knew to take my computer to the shop to repair a hardware problem – and it was worth investing in the repair. Pages was never as good as ClarisWorks, but until apple broke scripting when it began giving away the iWork suite, it was actually quite decent. The folks who use Aperture and LogicPro know far better than I do what has happened to them, let alone Safari. Apple's shoestring marketing budget could never have worked so well for so long without some substance behind it. But now apple doesn't need substance; it can afford a way bigger bullshit machine. It has gotten so big, and so many people know it primarily through crippled computers serving as phones, that a constricted and constricting macOSX is just normal in their experience. And they have little to no opportunity to see if they'd like a bit more wiggle room.

For my part, I don't regret the money I have invested in apple laptops over the years, especially having made most of my purchases before there were apple stores, so that I learned helpful tricks like waiting for the newest machines to go on sale. Then I would pick up the now just "obsolete" machine at a much reduced price, and could usually buy peripherals similarly discounted, albeit slowly. Fact checking the claim that "apple computers are more expensive" certainly helped me get far better informed about software, hardware, cost amortization, and how much I was willing to pay for "cool," if I was willing at all. Perhaps counterintuitively, it was making apple hardware work, after being frustrated past bearing by hardware running windows, that made me refuse to be cozened into accepting computers as black boxes. (I should add, of the three apple laptops I have replaced, two are definitely still working and serving new owners to this day. The third had survived being dropped twice and had been in service for ten years already, so it likely has finally died.)

Of course, nowadays we have way better options to trounce false black box syndrome. Among the greats are the wonderful Raspberry Pi, FreedomBox (which you can buy outright or build from the instructions on the website – the Library Box project is now defunct), and the ever awesome Adafruit Industries. For projects to try out, from the basics to some of the wildest stuff you never thought of, check out Instructables or Make just for starters. You'll notice that all of these run some flavour of gnu/linux, and that's not because I have found "linux religion" (frankly, if apple had taken a different path, one I found acceptable, its OS is the one I'd have stuck with, and I am keeping an eye on PureDarwin, which is close to a real iso for running the open sourced version of macOSX). It's because I value being able to build things and learn more about how computers work and what they can be used to do – as well as about what I would rather they weren't used to do. That's the real future of computing. It's amazing and wonderful that there are real alternative OSes out there, something almost unthinkable back in 1999 when the apple rainbow logo was last in official use. That larger corporations, including apple with its formerly more counter-cultural reputation, are trying to shut this down gives away the reality that they can't actually compete with it. Otherwise, they'd have no reason to waste time, money, and effort making, and to the extent they can imposing, slobberware. (Top)

On What's Right With the Web, Part Three (2016-02-04)

OPTE Project graphical representation of the internet as of 2003, reproduced here under Creative Commons Attribution-NonCommercial 4.0 International License, accessed 2015.

A major aspect of what's right with the web is something that many web surfers these days never look at, and might even be appalled if someone suggested they should. Yet it underlies every single web page, almost every formatted email, and a surprising amount of help documentation, where it allows certain software companies to pretend that they keep the help documentation on your machine up to date. (It almost never is, which is part of why the help window takes so long to open that almost no one bothers with it anymore.) That omnipresent thing is HTML, HyperText Markup Language.

Did all that sound like hyperbole to you? It shouldn't. Tim Berners-Lee could have proposed something more complicated and obnoxious, something more like the bizarre gibberish most word processing documents are full of when you open them in a plain text editor.* Instead, he followed the SGML (Standard Generalized Markup Language) guidelines and defined a small number of tags, labels that are separated from the regular text by angle brackets. Even today, there are fewer than a hundred of them. They are not difficult to use, especially now that formatting has been moved into CSS (Cascading Style Sheets). Alas, CSS is not quite so easy. But suppose something happened and your browser couldn't cope with CSS for some reason. You would still be able to read most web pages. They wouldn't be as fancy or pretty, but you could still read them. The exceptions are the sort of pages that break the web and should be put out of their misery. Yet, this is almost the least important part of HTML's capabilities. Using HTML, you can mark up your text into a web page. But even more importantly, you can link that page to other pages. If Berners-Lee hadn't proposed this feature, at minimum the web would have been significantly delayed.
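
To make that concrete, here is a minimal sketch of a complete page using just a handful of tags. The wording and the example.com link are mine, purely for illustration, not anything Berners-Lee wrote. Even with no CSS at all, any browser will render it readably, hyperlink included.

    <!DOCTYPE html>
    <html>
      <head>
        <!-- The head holds information about the page rather than visible content. -->
        <title>A Tiny Page</title>
      </head>
      <body>
        <!-- The body is what a reader sees; each tag labels what the text is, not how it looks. -->
        <h1>A Tiny Page</h1>
        <p>This paragraph needs no style sheet to be readable.</p>
        <p>And this <a href="https://example.com/">hyperlink</a> is what ties the page into the web.</p>
      </body>
    </html>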

The web doesn't work because of advertising companies masquerading as search engines or cloud service providers. It works because of hyperlinks. If there were no hyperlinks, what would the search engines crawl and index? Without hyperlinks, there would be no way to "browse" the web, unless we had something like a telephone directory with a list of all the locations of all the pages available. And then we'd have to type in each one. Nowadays hyperlinks do all sorts of remarkable additional duty, like showing and hiding information without using javascript, and providing well-behaved cross-browser tool tips, to pick just a few of the applications I've seen. And well before that, internal page hyperlinks were developed to improve the navigability of lengthy pages. For those of you who wonder why that would be better than breaking a long page up into short ones, the fact that you wonder means you came to the web late or never had to use dial up. Yet you still get the benefits, because every time you are able to start reading a page before it finishes loading, that is thanks to sensibly applied HTML as much as to good hardware and software.
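
For the curious, here is a quick sketch of how those internal page links work; the section names and ids below are invented for the example. A link pointing at a fragment like #history jumps to whichever element on the same page carries that id, with no javascript and no extra page loads.

    <!-- A miniature table of contents; the targets live further down the same page. -->
    <p>Contents: <a href="#history">History</a> | <a href="#sources">Sources</a></p>

    <h2 id="history">History</h2>
    <p>A long section a reader on dial up could jump to without waiting for the rest of the page.</p>

    <h2 id="sources">Sources</h2>
    <p>Another section, reachable from the list above or linked from elsewhere as page.html#sources.</p>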

HTML isn't just good for the web, and it is not such a bad thing to learn. There was a time not so long ago when I built all of my presentations using HTML and ran them from the local disk using whatever web browser was available. No worrying about different versions of the same presentation software not understanding its own format, or crashing, or stalled transitions between slides. Plus far less trouble embedding content like movies and sound files, a task I see tormenting even my colleagues who are proud powerpoint power users. (I am not mocking them, I admire their fortitude.) Actually, I think it's time to return to that approach, at least for me. If you'd like to set up an ebook but don't want to deal with the ebook format or use a program other than your plain web browser, HTML is your ticket. In fact, the underlying format of ebook content is xhtml, which is basically HTML with a few additional rules and limitations. So without HTML, eReaders would be basically format-locked. Oh, and if you wish (I concede, not likely), or in an emergency (let's hope this is not so likely as to happen), you can write and edit HTML in nothing more than a text editor, even one as awful as notepad in windows.
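
In case it helps to picture that kind of presentation, here is a bare-bones sketch along the lines of what I describe, with the titles and text invented for the example. Saved as a single file and opened from the local disk in any browser, each "slide" is just a section reached through plain internal links.

    <!DOCTYPE html>
    <html>
      <head><title>Talk Notes</title></head>
      <body>
        <!-- Each "slide" is an ordinary section; the next and back links stand in
             for the transitions presentation software usually handles. -->
        <section id="slide1">
          <h1>Why Plain HTML?</h1>
          <p>No proprietary format, no stalled transitions, no version mismatches.</p>
          <p><a href="#slide2">next</a></p>
        </section>
        <section id="slide2">
          <h1>Embedding Media</h1>
          <p>Movies and sound files go in with the audio and video tags, no plugins required.</p>
          <p><a href="#slide1">back</a> | <a href="#slide3">next</a></p>
        </section>
        <section id="slide3">
          <h1>Questions</h1>
          <p><a href="#slide2">back</a></p>
        </section>
      </body>
    </html>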

UPDATE 2018-05-24 - Now everybody running a modern web browser has a program that can parse and display well-formed xml, one of the nicer developments over the past several years of change on the web. This has trimmed down development time on the Moonspeaker's rss feeds.

Furthermore, once you know HTML, there is all manner of useful markup languages out there that you can learn quickly and put to use, if you want. Xml was the trendy one not so long ago, although it isn't much fun if you don't have something to parse and display it. If you have an interest in typesetting and page design, or you're any sort of science student, then you will soon find yourself dealing with the LaTeX document preparation system, which is based wholly on free software available on all computer platforms. If you are interested in editing wikipedia or any other wiki, you will soon pick up one of the lightweight markup languages they use. Or you may end up coming to HTML from the other direction, so to speak, by learning a markup language for a wiki first. If nothing else, picking up one of those lighter markup languages will explain some of the bizarre and irritating behaviour of microsoft office products and their imitators, which have often silently implemented one of them (typically basic markdown).

A foundation in a simple, easy to implement markup language allows the web to be built in the first place, and even facilitates learning other markup languages that you can use to build even more of the web. That is emphatically something right with the web, especially when the source code of the resulting web page is not obfuscated. (Top)

* If you would like to actually try this, first use an archiving program to unzip the file. "Modern" word processing program files are now actually little archives made up of a folder containing several files, plus a small text file called a manifest. The main reason for this is to bundle the current state of the tool bars and/or palettes with the document. It is also the main reason such documents are prone to becoming corrupted and impossible to edit. Should this horrible fate befall you, it should still be possible to unzip the file, and in the resulting folder will be one file that includes your text. It won't be formatted, but you'll be able to get your precious work out.

On What's Right With the Web, Part Two (2016-02-02)

OPTE Project graphical representation of the internet as of 2003, reproduced here under Creative Commons Attribution-NonCommercial 4.0 International License, accessed 2015.

Okay, it's time for some more about what's right with the web. There is a lot, and much of it is inherent in the web from its beginnings, from when it first became available to more than a few tech nerds in various research and military installations. In fact, a great place to learn even more about what is right is to go to Maciej Cegłowski's website and read (or watch, when a video version is available, or have your computer read to you) every one of his talks, at minimum. I'm telling you this at the risk that you might get distracted and run off to his site before reading this thoughtpiece. I think the risk is worth it – that's how important what he has to say is – and I will reference him at least a time or two more.

In the last article on what's right with the web, I extolled it as a communication medium. That's important. What's even more important is an accidental emergent feature of the web. It's like Lego instead of some toy that tries to tyrannize your imagination like a branded doll or dinky car. In fact, the web is like the two best toys we all spent the most time with as children if we could get our hands on them – tragically all too many of us couldn't: the empty, plain cardboard box and Lego. When we want, the web is much like that nice, plain cardboard box. It provides infrastructure to have fun with, something like a blank canvas or a blank page. But somehow by the time we're adults, we have often been instilled with a horror of the blank page or canvas, as if we don't trust our imaginations anymore. There's something about the web that trumps that fear, probably its original ephemerality, although the current things wrong with the web are deeply endangering that. Also when we want, the web is like a Lego set, with a wonderful number of really useful pieces and not too many useless things like ersatz trees and those little hoses that look great but are often only good for decoration. There again, part of the joy is that you can try anything out; nothing is necessarily permanent, and if it doesn't work you can try again. So there is that respect for human frailty again. "Frailty" not just in the sense that we make mistakes, but also in the sense of the fear of looking foolish when trying something new.

Let's consider some actual examples of this. Remarkably, more examples than you can shake a stick at come from a still often maligned and poorly respected source: fandom. Please remember, fandom long preceded the web, and fans have taken fiercely to pretty much any new technology they could find a way to access in order to facilitate the creation and sharing of the art and criticism they create. The first Star Trek fans figured out how to make early photocopying machines produce something legible on a regular basis, and that's no small achievement. In fact, for one of the best discussions of a fandom making use of the web in unexpected ways and finding uses and applications for software the software writers never conceived of, read every word of Cegłowski's Fan is a Tool-Using Animal. Then read it again, because the points he is making, especially his takeaways, are well worth learning.

UPDATE 2016-07-07 - If you have read this article before, you may have noticed that I have deleted the reference to TheMarySue. This is because my understanding of its origins, history, and current ownership turned out to be inaccurate. It is not actually a good example of the sort of created and curated website I am discussing here.

But let's get even more specific, and not just lean on Cegłowski. In fact, let's consider a few items from the Random Site of the Week section, which yes, is not really weekly so much as "whenever I feel like it," but that name was too long. For the more literary side of fandom, we have The Library of Babel, an implementation of the idea from Jorge Luis Borges' original story of that title. Borges is a complicated author, very much affiliated with the right wing politicians of his time, with an uneven corpus of work. Yet this in no way takes away from the work of his that was amazing and inspiring to readers, and here is a great site that illustrates it. Or how about Mark Rosenfelder's corner of the web, zompist.com: complicated politics, seriously cool stuff, including the Language Construction Kit, which has been tried out as an experimental linguistics textbook at my university. Obviously I am a seriously literary sort of geek, so let's turn to something rather different and arguably more popular like The Leaky Cauldron, which Harry Potter fans built themselves instead of waiting for Pottermore. All that without even mentioning Fanlore and An Archive of Our Own, both of which are making expanded uses of the wiki to help preserve the many fanworks in danger of being lost forever because they were built on "free web hosting" services. Notice that all of these projects involve diverse people finding ways to work together and deal constructively with material they may deeply disagree with for any number of political or philosophical reasons.

For all the hate that sharing cat pictures and videos gets in some corners, and the wailing and gnashing of teeth at poor web design in others, it's easy to forget that people need to start somewhere. And if they want to start participating in the web and building their own corner of it, they need to start somehow, and sometimes the best and most comfortable starting point is the cardboard box. So they build a website that basically works, even if it is ugly. The starting point is "it works," not "it had better be pretty." Or they start from the Lego side of things, and begin by figuring out how to share the antics of their cats, and maybe they don't have a camera and they think a story won't cut it. So they figure out other ways to tell their story using the possibilities out there, from clip art to sound. That's pretty damned amazing, and yet more reason to resist the ongoing attempt to remake the web into a combination of slobberware and surveillance software that would destroy its genuine democratic and creative potential. (Top)

Who Thoughts (2016-01-27)

Classic Doctor Who diamond logo developed by bbc productions circa 1982. This digitized image was posted by sjvernon on deviantart, 2016.

The ongoing saga of Doctor Who in its new, hyper-marketed incarnation, one more directed at an american audience, has been a curious one to watch. Consistent with the original inconsistency of the classic series, some episodes are very good indeed, others are excruciatingly bad. Unfortunately, the inconsistency comes from the same cause: nobody with the reins of the show in hand seems to be able to get clear on who the hell the audience for the program is. Worse, nobody seems able to accept the wisdom of the best writers whose works are adored by children and enjoyed by adults: write what you enjoy, don't write down, and the result will work.

Returning to Doctor Who specifically, writing for that program is no easy remit, especially because there is clearly a background conflict going on about how to centre the show. Is the viewpoint from the Doctor, or the companion? There also seem to be challenges for male writers in particular, as they try to imagine arcs for female characters, let alone companions, that don't demand a stupid and unmotivated romance, or an existence predicated wholly on interacting with the Doctor. They sure are struggling. My thought is, whoa, writer guys, use the same cheat the rest of us writers do, and first write for a male character, then swap the "he" to "she." Stereotypes are nastily tenacious things, and this is one way to get around them. It works. Try it. In fact, try it for one of those two to five minute shorts that used to be such a thing during Steven Moffat's now closing tenure. May as well start with a smaller palette and get the hang of it.

UPDATE 2022-08-08 - The debacle of how Jodie Whittaker was brought on as the Doctor and then the miserable scripts she was saddled with – that level of missed opportunity would make me embarrassed for the writers and producers if it didn't infuriate me so much. There is one episode that is brilliant except that the Doctor's lines are all given to a character who literally gets jettisoned off the space ship they are all on. The episode makes no sense, because the Doctor is not behaving like the Doctor. I could have let the hamfisted pregnant man sideplot, there to get the excessive number of companions out of the way, go, but it was ridiculous. All this in a visually gorgeous episode with a not quite properly developed but genuinely interesting background story taking advantage of the Doctor Who trope of incongruous looking alien menaces. There were so many ways to get a female Doctor and reboot the show without pandering in a grovelling way to the present politically extremist zeitgeist, while still leaving room to engage with it and not tying future writers' hands. But then, that is true of every older television, movie, and comic book franchise right now. It would probably be kinder to stop iterating on them and take the risk of making something new instead of zombifying them like this.

Perhaps the toughest part though, is that the world building aspect of Doctor Who is a mess. Perhaps heretically by the lights of Doctor Who fans who started with the show when it was resurrected, I do not care for the Time War arc as constructed, and not just because I dislike the Daleks. (Sorry Terry Nation wherever you are, the pepper pots just don't impress me, despite the other genuinely intriguing story developments that have happened with them; Genesis of the Daleks is a remarkable serial.) The idea came up in part, I think, to respond to the feeling expressed by folks working on late stage classic Who that too much was known about the Doctor and Gallifrey was too easy to get to. However, the real trouble is the lack of world building, or even of a serious sit down to consider: what is an immortal race actually like? What, if anything, can make them compassionate or restore their compassion if it's lost? What would time mean to them practically, when it has no limit? If they do have a society that is weirdly like ours, why is that? The obvious implication then is that they started out as a mortal race, and something quite interesting or appalling happened. If they obsess over humans, why? Is it because they are trying to prevent those humans from having the same experience? Or to get themselves mortal again? Or to meddle in their own past, or something? "Corruption" actually isn't a motivation, so let's move on from that lazy trope.

What does this all have to do with the Doctor, and how would this make any difference when the show is about him? Well, it seems to me it would help give him better things to do than wander around as mainly an obnoxious tourist or a paternalistic meddler who can't seem to respect the decisions of female companions, though that has shifted a bit. At the moment, the Doctor is verging on unlikeable, and this was the horrible fate to which Colin Baker's Doctor was subjected. Those of us who have some familiarity with classic Who know how well that eventually turned out. (And how unfairly for Colin Baker and Nicola Bryant, who deserved far better scripts and respectful costumes.) It is totally fine to have the Doctor be complicated, to be a jerk sometimes, all of that. But once we can't like him, things have gone gravely awry. This might not be such a bad perspective to take on the villains too, especially the Time Lord villains. I look forward to an iteration of the Master where the actor in the part is not faced with a requirement to overact until they're almost chewing the figurative carpet. Much as I do enjoy an actor playing a villain who basically starts throwing everything at the wall in hopes of overcoming a rotten script, that is an enjoyment in spite of what is going wrong.

In the end though, this all goes back to the struggle the people running the show are having with audience, and who they perceive the audience to be. I'm not quite sure why they are having such a hard time. It's not difficult to check out which television shows are killing it in terms of popularity, and it isn't the ones that talk down to the audience, nor are they perfectly consistent in quality by anybody's lights. They did spend some time doing at least enough world building to not have to hand wave desperately for character motivation. That's the approach Doctor Who needs – it'll be interesting to see if it happens. Either way, they don't have the excuse of having no time and no money to make the show when things don't work out. (Top)

Copyright © C. Osborne 2024
Last Modified: Monday, January 01, 2024 01:26:23