Title graphic of the Moonspeaker website.

Where some ideas are stranger than others...

FOUND SUBJECTS at the Moonspeaker

Author, Author (2019-02-12)

Snoopy from Charles Schulz' comic strip Peanuts prepares to write his next opus.
Snippet dated october 2018, original circa 1972-1976

The latest item in my bag of procrastination tricks is the topic of "scribal publication," which so far I have read about only in the context of fifteenth to seventeenth century england. This sounds like it should be obscure, yet the usual depictions of scribal publication, with their preconceptions about hand-written manuscript reproduction versus machine printing and about what it means to be an author, are reproduced in almost all writing in english about writing and publishing that I know of. Exceptions exist and are growing in number, in part because of the effect of the web and the ways in which it has been developed as a new means for what could be labelled "social publication." "Social" not in the lying marketing sense as in the term "social media," but actually social, in that the writing is done and published as part of social relationships. And not just in the form of fanfic either. I think that comment sections at their best are part of the same phenomenon. Social publication troubles the waters of definition whenever we approach the water body labelled "author." Authorship, by which most pundits seem to really mean ownership, is a fraught concept now. Writers who publish with the expectation of being paid in money bewail that their works are pirated and that they are unable to make a living. Writers who expect different sorts of return for publishing their writing are often denied the designation "author" and sneered at as amateurs or dilettantes, as if the only way to express a serious interest in the art and craft of writing is to be paid money for it. "Author" has echoes of that terrible word "authenticity" in it, and since authenticity is one of the most fought over features in a capitalist venue, because it is a pretext for raising prices, there can be no neutrality about the word's use, even if more people wanted it.

If you have read David Graeber's most recent book, Bullshit Jobs, or work in a profession that involves care for others, which ranges from all forms of artist to the well-known and weirdly despised hard-working teachers and nurses, you already know the contradictory attitude at work here: "authors" are supposedly better because they are paid, yet they are despised even when they are as business-savvy and rich as a J.K. Rowling or Stephen King, because their work is about entertaining, teaching, and even consoling. All of these are types of care for others, whether reflected in what we consider self-interested or unself-interested care. And in any place where patriarchy is in play, let alone capitalism, the default assumption we are told the "real people" make every day is that if we do those caring things, then we should feel rewarded enough by the gratitude of the people who receive our care. How dare we be so venal as to expect to be paid! This is unshakeably attached to contempt of course, contempt for not doing something "practical" that "pays the bills." Never mind that without all that caring work, the world would stop.

Authors, whether published in the current marketing system or socially published on or offline, are sensibly opposed to this despicable attitude. Yet the market published authors are sometimes the most difficult to listen to when they begin talking about how unfair it is that they don't get paid, especially when they demand more of the currently broken copyright system. I have never understood this, because the system was founded by printers, not authors, and it meets the printers' needs, not those of the authors, except by dribs and drabs. A copyright system that better fit the needs of authors would look different. Of course, when an author manages to write a book that becomes a real and ongoing hit, then the current system seems to work just fine for them, because sheer economies of scale make their tiny royalties multiply so well, and they are able to take advantage of the phenomenon of tie-in products. Other authors are developing an approach to social publishing that effectively reinstates a subscription model to ensure they get a stable living and build a community of readers who are following more than the latest fad. Fads have their place, but not if you hope to make a steady living around making faddish products.

Copyright is part of the problem here, but a bigger one is the ways in which the ability to produce and disseminate what authors write is centralized, and in the case of social publication, where centralization is being imposed. Scribal publication on a wider scale held on so long in part because it was decentralized, and the author could exert more control over their own work, including managing how it was edited. The style of invasive editing typical of english and north american presses is, I understand from Alberto Manguel, unique to those parts of the world. Authors engaged in scribal publication did edit and accept edits from others; in fact their circle of trusted critical readers was expected to provide suggestions and comments as well as works of their own. And indeed we see a similar phenomenon in fandom via betareading, preliminary audience testing with mailing lists, and so on. Now that anyone can have a printing press in their living room, the new push by corporations that formerly could expect to control basically all publishing has been to clamp down on distribution via various means: egregious contracts, asinine copyright regime changes, and especially the longterm propaganda play that says there is no way to pay for art except by advertising or rich jerks patronizing artists. This simply isn't true unless we let it be made true. Before so-called "social media," word got around about new art, new writing and so on. Just as before the cell phone, we were able to cope with emergencies. It will take work to route around and away from the monopolistic tactics of corporations and authoritarian governments, and sometimes risky work. It will have to be done by all of us, including folks like me who are able to pay a modest amount each year to keep a website up and independent of the pseudo-free services, in part to help support the alternatives.

Way back, I was part of a mailing list – back in the internet stone age when "social media" was nobody's bright advertising idea – and the question of how to spread word about and distribute books and other writing outside of the current distribution systems came up. Never mind that lots of that had been happening for years, frequently along transmission nets developed by, yes, fans of various shows, books, and movies. Precarious at times, but fun, with a clear sense that some productions were by nature going to be short-lived, and that was okay. In the context of the conversation on the list, I suggested that maybe a sort of mail subscription system might be the way to go, thinking out loud, tossing an idea out there. One person in particular jumped on top of me and began pointedly informing me of how stupid the idea was, how expensive and hard to manage it would be, and what the hell was the matter with me, didn't I know there was no way to do this without the economies of scale of an actual company? Skepticism is fair enough, but the side order of personal attack was awfully gratuitous. I also think this sort of hyper-skeptical attitude is self-defeating, and reflects that many authors got sucked in by the claims of various publishers that only by being published by them could a writer truly be an author. Only then could they be spared accusations of being a hack or an egomaniac, only then could they be sure they were working as true professionals and meeting true professional standards.

I can't agree. What makes you an author is that you took the time, care and effort to write something, polish it, and share it with the world, however you choose to do that. If you want to produce a bestseller, certainly your chances will be higher if you manage to attach yourself to a large publisher that is willing to support your book with heavy marketing and by sticking mentions of it in the right places. But that has a lot more to do with gaming a system in which what the author is competing for is celebrity. To sell that, each publisher needs a clearly differentiated product that they can flog the hell out of, and so they will have a "house style" and specialties. If you want to compete for celebrity as an author, then yes, you need to be part of a marketing machine, and a marketing machine only accepts whatever its special fuel is, so you'd better produce the fuel. But that is far from the only way to share a book with the world, or make money from doing so. The other ways may work more slowly, or demand more ongoing work from the participating authors, which hardly makes them unrealistic or impossible. Or contemptible. (Top)

Interface Hell (2019-02-05)

Okay, can't scroll properly, can't see all the options, no obvious way to resize this thing -- what the hell?
Interface monstrosity from a program that shall not be named, C. Osborne, october 2018

If I needed to be convinced of my loss of "Apple-hipness," then the recent efflorescence of articles critiquing the corporation's decision to piss all over its own original user interface guidelines is excellent proof. Not that these critiques are new of course; my research for this thoughtpiece suggests they have been something of a minor staple for online magazines like fast company for at least four years. Nor is Apple's determination to dump colour, destroy font legibility, and crapify the operating system interface especially new. Long time mac enthusiast and author Fletcher Delancey has recently blogged about it, dealing especially with apple's decision to switch to greys in many interface elements starting with OSX Lion. I wasn't too impressed by Steve Jobs' insistence, on his return to the company, on the new "brushed metal" look, because it removed helpful interface elements. Little did I know that this was the start of the ongoing policy of derogating and removing any suggestion of three dimensionality, despite how useful those faux ridges and the like can be. The illustration for this thoughtpiece comes from a program that is using the basic macosx windowing system, and there is a bug in its calls for the scrollbar, making it impossible to scroll in a fine-grained way. This was making me frantic, because I have more than three calendars to manage, and this issue came up during an import and synch operation. What the hell was I supposed to do with this window that had no obvious way to resize it, resizing being the obvious workaround here? More in desperation than anything else, remembering where the old time grab corner for window resizing used to be, I tried hovering the mouse over the lower right corner, and lo and behold, at one little spot the pointer changed into the blessed resize indicator. There was nothing whatever to suggest this would work, apart from mousing around with the equivalent of dumb luck.

Grizzled mac veterans will also notice that the developers of this program have lost track of the default button interface guidelines that were originally such a key innovation. That doesn't need any colour either, just the old time thicker line around the default button and having a click of the return key activate it. And don't imagine that more experienced apple developers are immune to messing up like this, even when they are working on proprietary software. Try using the rightmost of the window button controls to auto-expand an iwork keynote presentation window. Nothing happens at all; even the original behaviour of jumping out to take up the whole screen without going into fullscreen mode is gone. The developers working on firefox still have this button and window behaviour doing the right thing.

I'm all for making the user interface look nice, and can readily agree that windows 10 is much prettier than windows 7, not least because the designers finally took advantage of the rendering options available to them to make the general operating system look like something from the early 2000s. (This is a compliment, not snark, although much else in the interface leaves something to be desired.) However, looking nice and being useful for tasks besides gaming are two different things that can come into conflict if the designers get too absorbed in some mythical quality like "simplicity" and take away user choice. This is pretty much what has happened to the apple design teams, probably because the people at apple now have been fooled by their own marketing hype and the mischaracterization of user-friendly interfaces as those which give users no choices, control, or respect. I don't know any other possible explanation for what amounts to the steady recreation of the deservedly hated Launcher approach to the interface, in which you were presented with a few big tiles corresponding to the few applications you were trusted to use and nothing else as a home screen.

I believe it was Alan Cooper who wrote that an undocumented interface that is difficult to learn is not a tool but a game, and when we want to work, what we want is a tool. Interfaces that are easy to learn are not necessarily simpler; in fact they usually aren't, because they have lots of carefully designed elements and interrelationships to help learning and memory along. At one time apple followed this approach assiduously, and included lots of preferences where you could tweak things to your liking as you got more familiar with the operating system and the software you used most. This worked especially well because all developers had to follow the guidelines to get into the apple ecosystem, so what you learned from the system interface was applicable everywhere, and supported learning the unique elements of specific programs. The result was an interface that anyone could learn quickly and that encouraged further learning. Some folks found apple's approach to this completely off-putting, and of course used other computers or software.

Blunt visual simplicity makes the basic program we use every moment we are using the computer into a frustrating and stupid game. Not long after that, it begins to reflect a real contempt for the person who operates the computer. In order to create that pseudo-simplicity, stuff has to go. Like those elaborate pseudo-three dimensional screen elements, but not in favour of modernized elements perhaps harking back to the best days of system 7.5, when such visual tricks were not possible. Not at all. And this really started taking off, alas, with the advent of the touchscreen ipods, when apple took away the ability to use ipods as mass storage devices without extra software or jailbreaks. In other words, there was a time when you could use ipods as the hyper fancy external hard drives they are. Viewers of the "making of" shorts for Peter Jackson's Lord of the Rings trilogy could have seen multiple early ipods being used for exactly that purpose. The loss of this in total, as opposed to just locking down the music folder, was an extremely bad sign. Sure, this made the ipod interface simpler, because now, no matter what, you could never use it with a file system viewer until its pseudo-reintroduction with a much later ios. Before, you could simply ignore that use option if it was more than you needed; now it was taken away unless you had considerable programming chops of your own.

When I reluctantly upgraded my mac to osx lion, which if memory serves also introduced so-called "natural scrolling," that was the first setting I dug into the preferences to kill with fire. Next was forcing my own library folder back into permanent visibility. The way an iphone scrolls its screen is not at all how my trackpad should scroll – not because there is anything wrong with it, but because muscle memory is powerfully against it. The logic behind "natural scrolling" is tied to new mac users of course, to people presumed to have come from the iphone to an apple computer. That, and the grail of fools in the form of total operating system convergence between desktops and "idevices." Certainly such a vision matches an authoritarian manager's ideas very well, because it is ever more authoritarian in nature, and it is horrifyingly enough beginning to produce the impression of "slobberware" that apple was once unjustly accused of. All of this is part of the changes I conceded two years ago, in an unsurprising worm in the apple, were happening because of the company's development into a massive multinational corporation. They still suck. (Top)

This is Not About Banks (2019-01-28)

Bob the Angry Flower -- if you haven't read it yet, you're in for a treat.
Bank Job, Bob the Angry Flower comic 857, by Stephen Notley, 2010.

It is tempting to rant about the banks, because that is something everyone is generally allowed to do, it being basically ineffectual even as a means of catharsis. Banks are a mighty symptom of the ills and evils of fundamentalist capitalism, the pursuit to its ultimate conclusion of money as the only valuable thing, a thing that in fact has no value but what we agree to give it in a sort of consensus illusion. Since they're a symptom, it makes more sense to try to wrestle with identifying and challenging the cause. But I am not going to spend much time as such on fundamentalist capitalism either, because it too, wildly enough, is a symptom. Both are symptoms of something so pernicious and dangerously persuasive in colonizing cultures that it manifests in multiple frightening branches from a snarl of interwoven trunks and unpleasantly widespread, though not deep, roots. It is tempting to label this something "the big lie," but that sort of labelling is part of its substance and how we get pulled back into the morass, because we can be fooled into thinking that we just have to counter a singular notion. Yet I think the key is to give up on metaphors for it based on nouns and to resort instead to something verb-based. Something like: ways of being that encourage sameness, even demand sameness, as a means of achieving an ever-receding, never achieved safeness that somehow is like the horizon, always ahead but never touchable, never still. I didn't originally have the awkward coinage "safeness" in there, but it was too good a typo to abandon.

The equation of an ongoing state of safety with some form of ever spreading uniformity in everyone else's behaviour is one of the most clever and dangerously self-deceiving equations ever invented, precisely because it is made only through action. Somehow the "we" or "I" in the middle always gets permission to bend the rules, because we can be convinced by ourselves or by others that what we are doing, enacting, is in fact the right way. The right way to drive towards the uniformity that will finally create the permanent safety that we crave, that promises once everybody is behaving according to the patterns that spread homogeneity, it won't be difficult. It won't take hard work by us or anyone else. It will just be, and it will be perfect. Never mind the anxiety that the attendant drive to perfect conformity instills in us at every moment, such that in some places an incredible percentage of the population is on some sort of anxiety-masking drug. Never mind that the evidence in the real world shows that this drive for a uniform existence in fact destroys real peace and health on individual, social, sea, and land levels. Never mind that there are still, against incredible odds, thousands of modes of Indigenous existence in action in the world right now, where people have eschewed the sick drive to sameness and survived well for thousands and thousands of years, and are even surviving the drive to destroy them, hard though it is.

I have acquaintances who insist that the fact that Indigenous nations have warfare, haven't completely solved poverty or social injustice absolutely and forever, or are showing the terrible effects of colonialism and ongoing attempts to genocide them, means that their ways of life are obviously defective. They don't usually resort to such crude rhetorical bludgeons as claiming those peoples are evil, or that they have failed to "modernize," which just means assimilate or die. In effect, their arguments are some variation of, "they haven't produced what we define as paradise on Earth or the closest thing to it, a place and time when everyone is behaving in the same way all the time and held to it by sheer force of social pressure." And social coercion, of course. I remember, when I was much younger, puzzling over how an argument so obviously wrong could have wrongness in it so difficult to make explicit, having been caught up in the tangle of presuppositions. Notice that the final assumption of the argument goes back to the assumption that living in as uniform a manner as possible is ideal. Yet that is simply impossible.

Among the most tell-tale speculative fiction themes that turn up all over north american examples especially is the recurrence of militarized societies. The military may be relabelled all sorts of other things, yet the key features remain in place: hierarchy and authoritarianism expressed in an absolutist chain of command, uniforms, and pervasive forms of coercion. No matter how often military life is decried in so much of this speculative fiction, it is somehow inescapable, and what is really decried is a particular version. Such as ones where female humans could conceivably end up in charge and not be treated as if they lacked all intelligence or will, or ones that look to be bad for capitalism. The original conceptualization of the Borg in Star Trek came very close to breaking through the illusion here, until they got revamped into yet another pseudo insect colony model with a sort of Eve figure to tempt the "next generation" Spock surrogate with the ways of all flesh. The failure of imagination and political cop outs involved were and are tragic, because the writers came within a whisker of identifying and challenging the obsessive attempt to live absolutely the same way everywhere in all time. For a short while I think fans and writers for the show alike came very close to seriously asking, "What if we are the Borg? Have we held up a mirror that we only hope is distorting what we see?"

One of the challenges of a thoughtpiece like this is that english has no inclusive we versus exclusive we, which would be incredibly useful here. On one hand, I think the pernicious temptations of trying to live in a way that enforces absolute unity and unchanging perfection are equal opportunity. They are what can fool anyone under stressful conditions into taking this way of being on board, even Indigenous peoples whose cultures are built on different principles. I think often of the importance of not slipping into the fallacy of equating shared principles with dogmas, which by definition cannot adjust to the reality of where and when they are, because they are not alive but enactments of profoundly bad ideas. Enactments of profoundly warped expectations. On the other hand, being from an Indigenous culture and committed to resistance to oppression, I do think there are a few, very few good ideas that work anyplace and anytime. But they demand that we refuse the easy way out, the false goal of perfect safety and perfect sameness and never dying. They're frustrating because they demand that we be responsible and accept and act on the awareness that others are responsible. We must respect them and refuse to apply such false justifications as "might makes right" in all their guises. It's hard to live this way at any time, because we are sometimes so impatient with others and especially ourselves. Add to that the wrongful feedback from those who insist that if only we all acted the same, that would fix everything and we couldn't make mistakes because mistakes would become impossible to make, and it is difficult indeed.

Yet, it isn't hard at all. I think often of the Haudenosaunee Confederacy on this point, and the tellings of the origins of the ways they found to live together again, which focussed not on creating sameness but on creating the conditions to help everyone favour living with a good mind. That is, in a way that favoured careful consideration of the present unique conditions and the possible consequences of actions before deciding, with the level of consideration going up when the potential scope of the consequences does. It was difficult for them to make this happen. It took a long time, and it couldn't be done by coercion or by demanding an absolute unity of way of life or action. It did demand mutual respect and refusing to act oppressively, including changing when, on further experience, a way of being everyone thought wasn't oppressive at first turned out to be after all. The Haudenosaunee succeeded, and still do, even as they must concurrently resist multiple settler states. (Top)

The Dishonest Passive Voice (2019-01-21)

Missing in a crowd, modified meme.
Missing in a crowd, modified meme, october 2018

Almost anyone who has suffered through instruction in writing english has been ritually told never to use the passive voice. Until quite recently it was rare indeed for teachers to give clear and actionable reasons why. I can certainly point to moments when teachers inveighed against how passive constructions made sentences flabby and generally slowed down the text. Yet I managed to get to a second university degree before finally learning that I actually had little idea what an english passive voice construction was. As in, before finally taking classes in ancient greek, I was confused enough to mix up a past construction like the one I just used in the previous sentence with an actual passive construction. Passive constructions depend on uses of the verb "to be" and are often singular and in the past tense, so the confusion is understandable. But I did finally learn that the key feature of the passive voice is that it makes it possible to delete the agent, (usually) the person who actually does whatever the sentence is talking about. And yes, practically speaking it can make a sentence longer than its active counterpart. For example, "the money was stolen by him" is slightly longer than "he stole the money," but more importantly, the passive also lets us drop the agent entirely: "the money was stolen" deletes who did the stealing. We don't have to delete the agent, of course. We could say "somebody stole the money," or "the money was stolen by someone." That second one is a better example of writing or speaking in a wordier way than necessary, although that is a stylistic judgement, not a necessary one.

With all this in mind, it was intriguing to read Cora Buhlert's reflections on american writing style in science fiction, especially the negative views of the passive voice. Those views usually focus on the text supposedly needing to be "punchy" and brief or else readers won't read it. Personally I don't know about the readers, but it certainly seems to be an arbitrary reason for publishers to use when they don't want to publish a given piece of writing. Buhlert notes that she is just fine with using the passive voice in its place, and perhaps that is the real devil in the details. The passive voice is not a large hammer that clobbers all before it, but a surprisingly sharp instrument that can easily go wrong. It is hard to use well, especially for less experienced writers and even for the experienced. It's a lot like the semi-colon that way.

Feminist linguists have long been critical of the ways that the passive voice has been abused to delete the agents of male violence, men. As study after study has shown, as soon as the topic is male violence, many writers, especially males, suddenly find the passive voice the most acceptable voice of all time. There is more to its acceptability in that case than deleting the agent, because the deletion still implies an agent. If we aren't careful, it can fool us into making the victim into the agent, because the victim is the only overtly mentioned person. Since that is a great way to reproduce oppressive structures and habits of thought, there is good reason indeed to look with an unfriendly eye on the passive voice when it is used in discussions of oppressive behaviour of any kind. It's subtle, and like the metaphorical pies of the previous thoughtpiece, can be difficult to resist once it gets past us. I think it is non-trivial that the first illustrations of improperly deleted or delayed agents that came to mind centre on property theft. Actually, as a stylistic recommendation, it might be better to point out that using the passive voice may unnecessarily delay revealing the agent. Such delays make the reader work a little, and the established arbiters of pop culture tend to either have a low view of their audience's willingness to work a little, or believe such effort could not possibly be pleasurable. This is probably why those arbiters can't make sense of the success of video games or the growth and independence of fandoms. Clearly, I digress.

UPDATE 2018-10-23 - There are many reasons never to use the passive voice in history writing, and historian Paige Raibmon discusses them briefly in her opinion piece at The Tyee, How to Talk about Relations between Indigenous Peoples and Europeans, 28 September 2018. She notes that the passive voice in the context of texts intended, at least officially, to help us learn in fact works against that goal. "Elimination of the passive voice forces us to specify the 'who,' 'what,' 'where' and 'when.' Many people misperceive these questions to be at the heart of historical enquiry. Instead, they are tools for casting our basic categories and assumptions into relief. (Who are 'people'? What is a 'permanent' settlement?) When we use active instead of passive voice we redirect our attention to the historical agency of who did what to whom."

Evidently, for the passive voice to have such potentially insidious effects, there must be more to its connotations than merely removing or at least delaying the agent. The ideas or feelings evoked are important. In her brilliant book Speaking Freely: Unlearning the Lies of the Fathers' Tongues, Julia Penelope discusses the connotations of the passive voice. In particular, she notes how it may be used to imply universality, whether or not this implied universality is real. For instance, she examines the implied claim that "love is hard to find." The point is not whether we should be more specific about the type of love, or whether we mean someone or everyone finds love hard to find. That is not the rhetorical purpose of the construction. It presents an idea that as readers we are expected to interpret as commonsense and in no way questionable. She also explains how the passive voice is built from uses of the infinitive, including gerunds. Gerunds are evil little critters in english (and latin) because they look just like participles, generally having the same shape, but participles act like adjectives while gerunds act like nouns. In any case, the ultimate problem is that the passive voice can leave gaps that we are encouraged to fill in, especially when we are likely to – or encouraged to – fill them in incorrectly.

There are times when getting us to fill in the gap inappropriately is just what a writer or speaker wants. Indeed, that can even be perfectly acceptable, such as in the context of fiction, where the reader is presented with a mystery that needs solving, or at least a puzzle that encourages them to keep reading to find the answer. When we know what is going on and it is all part of a game we have agreed to play, then there is no issue. As long as there is enough correct contextual information, the passive voice can't slip anything by us in non-fiction anyway. Yet as Penelope, and indeed science fiction author and linguist Suzette Haden Elgin, reiterated, issues multiply if we are silently asked to assume, and therefore accept, more and more. Over the past year, we have all read and heard a tricky variant on the rhetoric of agency that deletes the actual agent while demanding we assume that the victim has even greater agency and is therefore attacking someone who shouldn't be held to account for their past actions. This is of course a metaphorical extension of the passive voice. It can be countered by a simple expedient: just replace the past action that we are being asked to let slide with stealing money or other property. The result is quite revelatory. (Top)

The Problematic Pie (2019-01-14)

Example pie chart from the pgf-pie module for LaTeX written and maintained by Yuan Xu.

In the title of this thoughtpiece I am of course referring to the metaphorical pie used to describe pretty much anything that different people may be encouraged to compete for. It is called on repeatedly in reference to political and economic power, access to the necessities of life, and so on. It is also a cruelly destructive metaphor used in many dishonest ways, because of its presuppositions. The metaphorical pie presumes the pie already exists, nothing can be changed about it, it is limited in terms of its size and contents, and everyone is expected to fight tooth and nail for a piece of it. One of my favourite Feminist quotes points out that Feminists don't want a piece of this poisoned pie. At the minimum, they want a new pie. Based on the outcomes of trying to reframe the pie metaphor, I think more and more Feminists and other people working against forms of structural oppression are at the point of saying, "Fuck the pie, it's a distraction that is meant to get us to accept the very oppressions we want to end as somehow necessary for life." Yet the rhetorical problems with metaphorical pies go beyond this.

A fixture of second generation or so spreadsheet programs is the chart generation macro, and among these are a selection of pie chart generators. Pie charts seem pretty simple, and there are only so many ways to make them. I had never thought much about them in terms of how well they display the data used to generate them, or whether they are potentially deceptive. Despite having read a number of Edward Tufte's books, in which he minces no words about their poor performance, I had not fully absorbed the point until recently. Practically speaking, as soon as the pie needs to be split into more than say four pieces, the result is visually noisy and difficult to read. We've all encountered an exuberant pie chart festooned with extra labels and arrows in hopes of making it legible. Even the opportunity to use many colours in a pie chart actually works against legibility while it makes things look colourful and cheery. The unhappy parallels to glitzy, colourful advertising should probably have been a strong warning as well. If we want people to take the data we are presenting seriously, do we want to risk evoking the absurdities of the latest ad for a product that has changed materially only in terms of its packaging? Other visual design experts have emphasized how difficult it is for us to visually compare areas within a single pie chart, let alone several.

So both sorts of metaphorical pie share the feature of making it harder to make sense of whatever information they are intended to describe. This may be surprising, because pies are so friendly. Most of us first encountered them as learning devices in early childhood, when they were used to teach simple fractions and conversion between fractions and percentages. Everyone has at least seen a real life pie, and pies are great in that they are a circumscribed food item that can be cut into pieces. This can seem very apt given specific presuppositions. In the graphing case, the presuppositions are at first glance not egregious. The pie chart is being used to describe and display a limited data set in what is intended to be as simple a way as possible. Long familiarity with real pies also gives metaphorical ones a halo of inherited trustworthiness that they don't deserve.
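The legibility point can be made concrete with a small sketch. Everything below is my own toy illustration: the survey numbers and function names are invented, not drawn from Tufte or any spreadsheet program. The idea is simply that each pie slice must be judged as an angle of arc, while a bar's length can be compared directly, even in a single "colour."

```python
# Toy comparison of the same invented data set as pie-slice angles
# versus plain single-colour ASCII bars. Hypothetical example only.

def pie_slices(data):
    """Return each category's share as (label, percent, degrees of arc)."""
    total = sum(data.values())
    return [(k, 100 * v / total, 360 * v / total) for k, v in data.items()]

def ascii_bars(data, width=40):
    """Render the data as single-colour horizontal bars, one per line."""
    biggest = max(data.values())
    lines = []
    for label, value in data.items():
        bar = "#" * round(width * value / biggest)
        lines.append(f"{label:<10}{bar} {value}")
    return "\n".join(lines)

# Invented commuting survey, in percent of respondents.
survey = {"walk": 23, "cycle": 19, "transit": 21, "drive": 27, "other": 10}

for label, pct, deg in pie_slices(survey):
    print(f"{label:<10}{pct:5.1f}%  ({deg:6.1f} degrees of arc)")
print()
print(ascii_bars(survey))
```

Telling a 68.4 degree wedge from a 75.6 degree one by eye is hard; seeing that one bar is a couple of characters longer than another is easy, which is roughly why design writers keep steering people away from the pie.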

If metaphorical pies are so bad, that raises the question of what to use instead. In the graphing world, it turns out that a bar or line chart is often the better option, and that they are often better set out using just one element colour. I admit to being surprised by the second point, because it is easy to forget that just because we can add lots of colours doesn't mean we should, even sombre ones. In the case of rhetorical pies, well, it depends on the point you're trying to make, the presuppositions entailed, and what you want your audience to take away from your argument. There are times when it is right to insist that what we are considering is a limited resource that must somehow be divided equitably. I have doubts about how common those times actually are relative to the number of actual invocations of metaphorical pies. (Top)

Perverse Incentives (2019-01-07)

Punishment treadmill preserved at the former breakwater prison, south africa, photo by Lennon001 via wikimedia commons, may 2014

There are a few different definitions of "perverse incentive" at large right now, many of them encrusted with the desperate verbiage of economists who have eschewed testing their ideas against reality. For the purposes of this thoughtpiece, a perverse incentive is simply any reward, be it social or material, that encourages a person to choose actions that are destructive to themselves or others. This can and often does entail an ethical judgement, but with the caveat that we should start from a position of respecting others and assuming that they are making rational decisions based on the information and circumstances facing them. We may completely disagree or be completely appalled by what they decide, and they may be criticized for making ethically unacceptable decisions, but we can't take the dishonest or childish shortcut of calling them too stupid to know what is good for them. Paternalism is neither attractive nor effective – in fact, it is one of the best generators of perverse incentives out there.

"Perverse incentives" are of such interest to economists not least because they seem to solve certain apparently strange economic decisions, and provide what I suspect some hoped was plausible deniability for the fraudsters engaged in the so-called "FIRE sector." Perverse incentives in the form of bonuses and promotions for men engaged in encouraging people to make bad loans, then repackaging the bad loans into packages sold as guaranteed money makers, for example. The commonly unspoken additional perverse incentives affecting the people who took out the loans – not bad character – is that they are often struggling to finance what they need to make a living by legal means. Housing is so expensive due to speculation at the moment that many people working in jobs with lower rates of pay cannot live close enough to their workplaces to walk or effectively use public transit. But they can't put together enough cash to buy a car outright. But they need to work otherwise they'll eventually be evicted and further impoverished. They may not have time to work out better loan options than the car dealership presses on them. They are under stress. If they can just keep up with the payments by holding down the job, which the car should enable them to do, then they'll be able to keep their chins above water. All of these intersecting factors create perverse incentives to risk taking out a loan they may be unable to pay back.

Another great example of perverse incentivization is frankly capitalism itself. According to capitalism, the most important thing is of course capital. For serious capitalists, the most important capital of all is in the form of the means of production, because if they can engross that all to themselves, it doesn't matter if at first they have little or no control of the inputs. They become the only markets for those inputs, which points us to the third most important thing, the market, preferably one as slanted in the capitalist's favour as possible. The very best capital of all is the most liquid capital, money, because supposedly money can be used to buy anything. Which incentivizes capitalists to look for ways to make anything into money. Trees must be board feet, relationships must be advertising impressions. Caring work obviously has no real value, because capitalists are part of a patriarchal system that deems anything that is delegated to women valueless, even though society literally can't continue without it. Of course, as soon as the caring work is redefined as paid nursing or social work, then it has a little value, but not much, because "feminization" decreases value. This is a good reminder that there are indeed plenty of value judgements in supposedly completely abstract monetization and the "free" behaviour of the markets.

This reminds me of one of the stranger experiences of my earlier working days. I was struggling to get back into stable work and life conditions after a period of unemployment, watching my student loan debt being compounded daily and growing ever larger while frantically trying to get into steady, decently paying employment. This was twenty years ago, and already "decent paying" had been defined down to "enough money not to starve and to pay for a rented room." One of my brand new colleagues saw fit to tell me a story one day, especially on learning that I had opted to buy a heavily discounted computer rather than a car. "Everybody is leaving the Earth and there is one last seat on the space ship. You come up to the space ship and have a million bucks. What do you think is going to happen?" I never dignified this with a response, since there was evidently no point. The point here is that evidently my new colleague believed my choice of a computer over a car was driven by a perverse incentive, not a rational one. He assumed, based on my age relative to his, that my choice was based on self-indulgence or at least poor reasoning. Perhaps he even thought I was making some sort of environmental activism point by not having a car, a point that would of course be eviscerated by having a computer made overseas. I actually think he had been given a lot of perverse incentive to believe this about my behaviour – paternalistic beliefs and interpretations depend on such incentive. Since he was given every encouragement to believe that I was incapable of making sensible choices, he could avoid feeling any discomfort at lecturing at me and making inappropriate comments about me as a person. This is an anecdote of course, but not inconsistent with the wider evidence base of the use of "economics" with its supposed impartiality as a cover for anything but impartial moral and ethical judgements. It is worth remembering that the (in)famous Adam Smith was not an economist but what used to be called a "moral philosopher."

Of course the whole point behind discussing and categorizing incentives at all is that we would like to encourage behaviour that we find positive and discourage what we find negative. I am of the view that this is best done by minimizing the role of coercion, because where coercion is predominant we get oppressive structures that go very wrong, or structures with good potential that are warped into support for massive oppression. One of the most potent warping forces is indeed any expression of paternalism, which is so good at generating perverse incentives. This is a hard force to resist invoking, because part of its creepy charm is how it justifies paternalism for thee but not for me, encouraging a person to believe in their inborn superiority. So how to counter such nonsense and its horrifying consequences, from multiplying perverse incentives to supporting the reinforcement or creation of oppressive structures? It so happens that besides the useful starting principles of insisting that everyone has a right to a fair and free existence, everyone deserves respect, and everyone makes rational decisions even if at first we don't understand them and even if we are forced to conclude their decisions are repugnant, there are some more useful ideas to consider and implement.

Courtesy of Lisa Muggeridge, I have happened upon the pleasantly readable and thought-provoking article by Elizabeth S. Anderson in the journal Ethics, "What is the Point of Equality?" Anderson argues that "The proper negative aim of egalitarian justice is not to eliminate the impact of brute luck from human affairs, but to end oppression, which by definition is socially imposed. Its proper positive aim is not to ensure that everyone gets what they morally deserve, but to create a community in which people stand in relations of equality to others." Please read the whole paper, but in any case, to my mind this indicates a clear way to think through whether a particular incentive is perverse, and whether we want to keep or change the incentive. I concede, having thought through the issue in this way, that I have to revise my description of what I understand a perverse incentive to be in light of Anderson's point. Revising then, a perverse incentive is one that coerces people into acting in a way that supports and/or reinforces the oppression of themselves or others. It is not merely an incentive to unexpected behaviour or difficult to explain behaviour. It is possible to create a perverse incentive without realizing it or without bad intentions. That doesn't let us off the hook when we identify them. We are still responsible for rooting them out. (Top)

Counterculture and Its Discontents (2018-12-31)

Image of a geiger counter for detecting radiation, courtesy of Boffy b via wikimedia commons may 2005

I think it is fair to say that right now many "western" nations are undergoing a difficult phase of culture change. Culture change happens all the time of course, mostly in ways that people hardly notice, but since the development of capitalism, in which rentseekers and other profiteers have struggled mightily to find a way to monetize that very change as a source of permaprofit, many of us have been made hyperaware of it. The american marketing consensus insists that anything handed on from the past is awful and irrelevant and confining, unless of course they have found a way to sell it to us first. The marketers have cottoned onto the challenge of letting go of what no longer works because circumstances and what we know have changed, but pushed beyond good sense to deny any role for hanging onto older ways and knowledge altogether. This is unfortunate. Yet there is an interesting contradictory move against thinking through and actually adapting new ways of doing and being that will change the cultures we live in, since after all that threatens anyone who is on the beneficiary end of oppressive practices and social structures. As a general rule oppression has a great deal of short term profit in it, so the marketers and rentseekers are not so enamoured of substantive cultural change. And of course, anybody can be nervous and uncomfortable with deeper culture change, even if it would serve their interests and they agree that that is the case. Change is stressful, even when it is change that we wholeheartedly want.

Hence the many difficulties experienced by those who find themselves labelled as in some way "counter-cultural" who are stuck dealing with all that understandable discomfort that may come from better or worse underlying motives, on top of being sneered at and generally presented in an oversimplified manner intended to disarm the threat they may or may not actually pose. More than one pop music group has sung at least briefly about their confusion and disappointment about the apparent outcomes of the "hippy culture" of the 1960s. It seems that most hippies decided that they weren't all that interested in change after all, especially if we read and watch only the accounts presented by sources who were never inclined to take their ideas seriously in the first place. And of course, it neatly elides extraordinary and ongoing work by Feminists, anti-racism activists, anti-war activists and so on – groups of people with significant overlap between them – who never expected or argued that all the change could happen overnight and then everybody could just put their feet up. Admittedly, this can be a disheartening thing to learn, that change is a process, not an event.

What has struck me more recently is the level of venom aimed at Feminists especially, for working hard to end structural oppression, including devising alternatives to replace those structures. I have found a reference to so-called "cultural feminists" who, far from seriously engaging in this work, were apparently the hippies of Feminism, more busy with beads and candles than actual "real work." Never mind that I have never found anybody who claims this label for themselves; I have, however, found an important group of Feminist activists who have argued that without alternative stories and rituals that help us embody and enact non-oppressive structures and behaviours, then whenever the going gets tough, we will fall back on old patterns. Since these activists are often engaged in Feminist reform or reinvention of spirituality, such as Charlene Spretnak or Starhawk, that seems to contribute to the attempts to frame them as bubble-headed flower children. I admit to experiencing my own difficulties with some of the ideas presented as alternatives, but then again, it is hardly reasonable for me or anyone else to expect these new approaches to fit immediately into the old spots and satisfy everyone and everything they need to satisfy. And discomfort is part of change.

So overall, it is easy to be discontented with any person or idea labelled as "counter-cultural" if we are unwilling or unable to admit that real change takes time, experimentation, and real work. That includes change to spiritual practices, which we may be especially uncomfortable with reshaping or replacing altogether. (Top)

Updated Thoughts on Advertising (2018-12-24)

uBlock Origin logo, via wikimedia commons, april 2015

Rather belatedly I stumbled on Ramsi A. Woodcock's paper, The Obsolescence of Advertising in the Information Age, originally published in the Yale Law Journal 1 june 2018, via the free library. Part of what signal boosted this article and Woodcock's argument more generally was an efflorescence of snapshot posts at various outlets during august 2018, which were spurred on themselves by growing worries about the role of facebook, google, twitter, and their potential influence on politics in the united states. It seems that the facts that anybody can buy ads, that a capitalist company will sell ad space to anybody who can pay in an ostensibly free market, and that advertising is propaganda have all suddenly crashed into general american consciousness. I think this is actually a good thing, even if some of what is driving the alarm is actually scaremongering nonsense rather than realistic claims about different foreign influences. I'm not going to get into that aspect of things, but rather some clarifications and updates to my thinking in light of Woodcock's argument: if a purpose of advertising was to get information about products out to people, then nowadays that is unnecessary, because people can use search engines and online storefronts to learn about products instead. Then they can just plain go ahead and order them. From there, Woodcock goes on to argue for an advertising ban. To be honest, I am all for that personally, being sick to the back teeth of ads online, on television (they drove me to cut the cord), at the movie theatre, on the street – but I digress.

Instead, let me go back to the idea that advertising is obsolete. After all, there was a time before advertising, and then what did people do? Well, I think it is fair to say that people then lived in either smaller communities, or neighbourhoods where there were social bonds of varying strength that they could draw on. So if a person needed something that they couldn't make or borrow, and they preferred not to steal it, as is generally the case, then the next step was to ask around among friends and acquaintances, and/or look around at the next market or market visit. In other words, part of how this worked before is that information travelled through actual ongoing human relationships on a face to face basis, including via degrees of separation, as when cousin Bob has never bought whatever widget you need but he heard his buddy George bought one from Lisa down the road and it works perfectly. Word got around, and it depended on people doing the wording, so to speak.

So on one hand, that made it clear to me that advertising isn't necessary, which reiterates that it is propaganda. It was introduced in the first place not because people no longer lived in communities or could tell each other about things by word of mouth as such. It wasn't even brought in for the purposes of competition to begin with. The earliest mail order catalogues in many countries were produced and sent out by companies that had no real competition. Instead, propaganda was rejigged into a specialized form in order to convince people that they had needs they never had before, so that they would buy stuff to fill those new "needs." And of course, capitalist firms were ready to oblige with products galore, especially after the second world war when they were facing potential cuts to arms manufacturing. That this form of propaganda at least partially hijacked the information seeking loop a person would go on when looking for whatever widget for whatever reason was nastily clever. But the truth is that with or without the internet, people are quite capable of finding out about stuff they need or want to buy. Even where social ties and real life relationships can't necessarily provide the information, there is a whole ecosystem of catalogues and sources out there at the good old public library, although of course various capitalists have been attacking libraries lately in hopes of privatizing them. The extraordinary resistance that these capitalists and neoliberal governments have encountered to closing and privatizing public libraries has thoroughly surprised them.

If Woodcock's argument for an advertising ban gets real traction, the troubles will of course all be in the details, and we will have to deal with issues like the immediate temptation larger capitalist entities will have to try to pay people to shill to their networks of friends and family. At least, as long as there is capitalism anyway, since it seems to be a rule of this economic system that every effort will be made to sell knock offs or pretend that the latest packaging really means that the product is like magic fairy dust and will fix your life. Thinking over what an at least advertising-lite world might look like, I suppose it could look a lot like the movies when they were still mostly in black and white, in the sense of a lack of billboards, franchise outlets, and product placements. (Top)

Hard to Soft SciFi (2018-12-17)

Image from the bbc television programme cd soundtrack for 'Life on Mars,' february 2007

Science fiction is a curious genre, intended to imagine possible futures while really talking about the past and sometimes, by luck, the present. The feature that usually separates science fiction from other types of fiction is the role of science and machine technology in the plot. They provide the macguffin or object of what is typically a modified quest narrative. This leaves a great deal of leeway, and there is even more now that writers have insisted on bringing in other sciences than physics and chemistry, the great foundations of weaponry. This has of course led to the importation of the same pointless and nonsensical contrast between "hard" and "soft" sciences to create pseudo-categories of "hard" and "soft" scifi. I will resist indulging in the obvious and awful puns implied by this terminology, in view of the persistent assumption that scifi, especially in its movie and comic book form, has an exclusively teenage male market. And of course, some fiction does cause all manner of challenges to categories that are really intended to serve marketing, not readers or writers. Take the television programme that today's illustration is from, Life on Mars. The partisans could tie themselves in merry knots arguing over whether it should be considered science fiction or just a weird period drama.

I am by no means committed to an extremist definition of science fiction that disqualifies anything that doesn't feature some type of elaborate and powerful machinery. Since in my view technology really should be considered to include ways of thinking, techniques, and the many tools that seem simple but are in fact extraordinarily complex products of longterm experimentation, such a narrow definition of science fiction just doesn't make sense. On the other hand, I don't necessarily expect every science fiction story to eschew fancy rayguns and spaceships, or to necessarily have plenty of well-rounded characters. There are cases where a good writer can use flat types instead, although that is high risk and rarely works out if the idea is to write a story that can be reread or rewatched. In my younger days I read a wide range of science fiction, as do many teenagers, much of it in the telltale little trade paperbacks that show up in yellowing stacks in second hand book stores, rarely worth more than a dollar unless they are by a "big name." Despite my best efforts, even though I feel sure they were both fun to read at the time, I remember next to nothing of Asimov's Foundation series or Clarke's Space Odyssey series. What I do remember of the latter has been irretrievably contaminated by Kubrick's movie. Perhaps that's another sort of categorization worth playing with, how memorable a given science fiction book or series is over time.

An important cue to think about this at all came via Cora Buhlert's blog post, Cozy Space Opera, Cozy Mysteries, and the Domestic Sphere. It had never occurred to me so much as sideways to imagine a descriptor like "cozy space opera," so of course I had to read it, and it seems "cozy" and "opera" are terms that came in as derogatory markers of "femininity." Apparently pretty much any science fiction that delves into the complexities of interpersonal relationships or larger social questions, in other words the "social sciences," is still often derogated by such terms, or even refused the monicker of science fiction altogether by some purists. I suspect that Life on Mars would definitely not be considered science fiction by those purists, for example. This is a puzzler if we foolishly expect consistency. Among my eclectic teenage scifi readings is only one Heinlein, which is called Time Enough for Love. Not a novel I enjoyed especially, to be honest. The pathologically bored Lazarus Long is neither very sympathetic nor very interesting, and Freud is awfully obvious in the set up and denouement. Others have enjoyed this novel and its compatriots of course, and the key point here is that by the purists' standards we could claim that it isn't science fiction. The high technology is barely present, and the real centre is Lazarus Long's angst over his long life, his determination to perform toxic masculinity, and his attempt to find a few decent relationships (generally with other he-men) before he finds a way to get himself killed at last. Writers being writers, even the most hardboiled and cynical get fed up with what amounts to describing randomly moving objects perhaps blowing each other to pieces. So they turn their attention to people and their relationships, and indeed, Heinlein is something of an example. If nothing else, they can fall back on a quick and easy formula in the mode of Flash Gordon or Buck Rogers.

In the end, perhaps the best and fairest description of science fiction is the one that Seo-Young Chu developed in her book, Do Metaphors Dream of Literal Sleep? She defines science fictional writing, and therefore indirectly science fiction, as writing that uses metaphor to deal with "cognitively estranging referents," in other words, weird things we don't understand. (The underlying notion of cognitive estrangement comes from Darko Suvin.) This form of writing then is used to make sense of some technology or even social relationship so different from what we know that there are no obvious words for it, no categories for it to fit in apart from the box we could loosely label "what the hell is that?" Chu points out that this does not allow us to limit what she calls science fictional writing only to science fiction. Other forms of fiction may also serve this purpose, but it is science fiction in particular that seems to be geared predominantly to this task. Whether that is historically the task science fiction has had or is a more recent development, that's a different question. (Top)

Increasing Job Precarity and Hiding the Bots (2018-12-10)

Commercial advertising poster for a late nineteenth century typewriter, courtesy of the new york public library, august 2018

A few months ago, the new york times newspaper made some changes to the layout of its online edition. Most infamously, it removed reporter bylines while leaving opinion and photographer bylines in place. The official reason for this was that the editors believe that this will break the "filter bubble" of readers' decisions on what to read based on who wrote it. So, somehow the use of who wrote an article as one of the factors in gauging its credibility, and therefore whether it is worth reading, is equivalent to an advertising company manipulating search or news feed results. At least, that is the equivalence claimed by the new york times. The dishonest paternalism of this pseudo-explanation is obvious. Not being a reader of that newspaper, or at this point any mainstream newspaper myself, I usually wouldn't have much to say on the issue. It seems obvious that the whole point is to hide when the feature articles on the new york times home page were generated by bots, which no doubt is a growing phenomenon. I thought this and then wondered if I was being too cynical, until I found myself reading an excellent comment thread discussion of the implications of bots writing news articles.

There is a role for some news material being generated automatically: most of what we see in news tickers for example, the daily weather report, sports scores, stocks and bonds, basic headlines. Despite the effects of 24-hour cable channels claiming to be news stations though, the news is not equivalent to commercials. That is, it isn't just a bunch of diverse replayable snippets that can be put on rotation, even though in desperation, that is how these 24-hour networks fill the time. Since news networks seem to have given up on providing actual information, they are wrapped up in an appalling arms race for the most dramatic stories that will trigger as much emotion as possible and presumably attract the most eyeballs or ears. The result is a drive towards a way of working that seems more like shovelling a pile higher and higher than anything else. The growing resemblance to accepted definitions of yellow journalism is not coincidental either.

The pseudo-argument for the increasing role of bots in the news industry is of course labour costs. Reporters and real reporting are expensive. Labour is always too expensive in capitalism; the capitalist view seems to be that labour should cost nothing at all while profit always grows. It matters not at all that this is another description of an impossible perpetual motion machine. A great way to push down labour costs is of course to automate away as much labour as possible, and deskill or deny the skill of the few labourers left. This process has been ongoing in news reporting for years, starting with heavy duty consolidation and centralization – a new application of "the division of labour" idea.

I have no doubt that getting rid of supposedly expensive reporters is an important driver in the development and adaptation of bots to produce news items. Bots are also clearly at the leading edge of propaganda production and dissemination, and they will do as they are programmed tirelessly, with no pangs of conscience or investigatory fervour to be found. The new penchant facebook has developed for accusing people of being bots and "behaving inauthentically" seems at the least in line with implying that bots are better than they actually are, and with making any prolific commenter or writer de facto suspicious. Right now, accusations of bot-dom are apparently more often political in nature than factual. It is possible to tease apart human levels of production from machine repetition, although no doubt bot developers are busy trying to find ways around that.

So it seems that we now have a situation in which journalists are being driven into precarious and ill-paid work because the main news outlets deem it possible to do more and more with bots. (But this situation far precedes the advent of even the most primitive bots.) Meanwhile, anxiety about the possibility that bots are mostly designed to spread lies and are unaccountable is increasing by leaps and bounds, not without reason. But in that case, making journalism an even more precarious field and decreasing the visibility of journalists' names is not helping with the risk of propaganda masquerading as the news. Bylines are a key means of making it possible to gauge credibility and hold reporters accountable. Their readiness to report and investigate particular areas becomes part of their reputation, as does their propensity, if any, to report propaganda without question or minimize investigation time.

Having said all that, it sounds like in the new york times case specifically, there are workarounds if you read it online, at least for the time being. Apparently only the home page is valorizing opinions and photographs above reporting and investigation, so switching your initial reading link to a department page will route around the problematic new layout, at least as long as the new approach is not imposed on the rest of the site. It will be interesting to see if the new york times opts to extend the new approach further, which would be a more serious sign indeed. (Top)

Paper Ballots (2018-12-03)

Image of a sample canadian federal election ballot.
Image of a sample canadian federal election ballot, october 2015

Convenience, according to my OED, is "the state of being able to proceed with something with little effort or difficulty; the quality of being useful, easy, or suitable for someone; a thing that contributes to an easy and effortless way of life." This sounds like a generally good sort of quality, convenience, especially all abstracted like this. In real life, we gladly live with many inconveniences, and of course, we also live unhappily with many unwanted ones. Still, there are a surprising number of inconveniences that are helpful. For instance, it is inconvenient to have to wear a seatbelt in a car. Unless and until we have the bad luck to be involved in an accident, in which case the seatbelt is anything but an inconvenience. Many safety features are like this, imposing a small trade-off in extra effort or annoyance for a larger return. Sometimes the inconvenience is one we suffer because of the broader negative impacts on society at large if we don't, rather than any personal effect. Vaccinations are a good example here, because it is easy in these neoliberal days to forget that we are only generally free of epidemic diseases because we are majority vaccinated against them. Even if someone in the majority does get sick anyway, their illness should be less severe, and enough people won't get sick to prevent the nasty critter in question from spreading widely and causing an emergency. The unattractive term I have read for this is "the benefits of herd immunity." The connection of all this to voting is the claim that voting electronically is better because it is more convenient.

At first blush, this seems quite obvious. If you can vote from home for example, all you have to do is log in, do your thing, and log out again. This could take less than ten minutes. Even if you have to go to a polling station, that could still be faster, because all people have to do is line up, poke at a screen, and head on their way. This could overcome the problem of spoiled ballots once and for all! Better ballots by better user interface design. Then the machines could do all the counting, and the results could be out within a few minutes of the last polls closing. Sounds good, if the only qualities we are focussed on are time, with avoiding spoiled ballots being a distant second. Spoiled ballots are an important issue, but that doesn't mean that people should be prevented from spoiling their ballots in any way. The key is to avoid ballot designs that push people into spoiling their ballots by accident. Voting by secret ballot isn't about convenience anyway, and bare convenience should not be a consideration at all.

The whole point of voting by secret ballot in the type of societies many of us live in, societies rent by severe social inequality and structural oppression, is to minimize the possibility of retaliation based on who we choose to vote for and therefore ensure a fair and accurate election. Not so long ago, voting was done in real time by shouting or a show of hands in a public gathering after each of the candidates had given a speech. It wasn't unusual for candidates to have their own toughs to "help" people decide, or have a good number of supporters from other areas come down for their speech, or perhaps some useful inducements like free and copious alcohol. Nor was it unusual for voters of strong opinions, especially if they were in the majority, to bully their dissenting compatriots. Dissenters could always be taken note of for later retaliation. Proponents of this sort of voting could claim that it was fast and convenient, even if it didn't scale well. In the end this form of voting has been mostly abandoned, precisely because of the potential for abuse and bullying.

The elaborate, and yes time consuming process of producing, managing, properly collecting, and counting paper ballots also has nothing to do with convenience. All those scrutineers, careful folding practices, locked ballot boxes and all the rest, are about ensuring that the resulting vote process is as fair as possible. Meaning you have an opportunity to vote in secret, using a ballot that you can make sense of, that you can deposit safely and be assured it will be counted. No ballot stuffing or ballot losing. Unused, spoiled, and good ballots are all counted, to hedge against the stuffing. Their physical reality is a feature, not a bug. Paper ballots are hard to lose or destroy without a trace, and accordingly also hard to counterfeit. They have to be kept for a certain amount of time after the election so that they may be recounted when needed. It's not perfect, but voting by secret paper ballot is remarkably robust, and accounts for many human frailties. Under the circumstances, we tolerate a range of smaller inconveniences for a better society-wide result.

When it comes to actually getting to the point of being able to vote with a paper ballot, that does sensibly have a few convenience considerations. Polling stations should have reasonable hours and be within sensible walking or transit distance for as many people as possible. They should never close before everyone who has lined up to vote has voted. Locations and polling hours should be easy to find out, and if a person is provided misleading information, then there should be means to make a prompt complaint, have the malefactor punished, and if necessary, provide paid transportation to the correct polling station, which should stay open if necessary so that person can vote. If we're that serious about people voting, then this should be common practice, and penalties for wilfully providing inaccurate polling station information should be severe. Any efforts to falsely strike people from the voting rolls should be met with prompt disqualification of any candidates who would expect to benefit and criminal charges with severe penalties to those overseeing and implementing voter roll tampering.

There is no place for computers or other machines designed for voting anywhere in this. Machines that punch ballots are at least as bad as insecure computers, and all computers are insecure for voting purposes. The only reason to demand that people use voting machines of any kind is to make it easier to interfere with the results of the election. "Convenience" is at best a sloppy attempt at rationalization. Nobody who argues for machine-based voting, or even machine vote counting, is honest. (Top)

Division of Labour in Extensio (2018-11-27)

Riehle testing machine, image courtesy of oldbookillustrations.com.
Riehle testing machine, image courtesy of oldbookillustrations.com, august 2018

It is difficult for a person growing up in much of north america, and probably anywhere that the british or british-derived states are active, not to learn something about the "division of labour." Presentations of the division of labour generally declare it an ultimate good, and that it hardly matters that it reduces the production of whatever item to a set of small tasks that could be performed with no training and preferably no mind. The great benefit that is supposed to make this way of managing production good is that it can result in large stocks of whatever is being made in a short period of time, or else the completion of a large task in a short time. The division of labour is often equated with a very specific form of labour division too, that of interest to capitalists who want to maximize profit and minimize costs, of which the cost of labour is most intractable and therefore annoys the capitalists most. They hate profit-sharing beyond themselves. The truth is though, that division of labour is no more than the allocation of tasks to get some larger task done. I think it is fair to say that we are familiar with division of labour based on sex, and how absurd that is beyond what female or male biology alone can do. We are also generally pretty familiar with division of labour based on age, which can also be quite absurd at times, although in that case more often due to trying to shoehorn everyone into strict age brackets, and we humans are incessantly variable within overlapping brackets.

What led me to give the notion of the division of labour some more thought was getting started on Peter Kropotkin's book Fields, Factories, and Workshops. Setting aside his broader thesis in that book for the purposes of this thoughtpiece, I was particularly struck by a page or two in the first chapter, where Kropotkin noted that the notion of division of labour was being extended far beyond good sense, to countries. By this he meant the development of england for example into primarily a cotton cloth producer, canada into a wheat producer, and so on. The justification for this, as he also briefly discusses, is that a given country will produce one major product for export according to this ideal. The exported product will bring in income. The income will be used to purchase all the other products the country does not produce, including its food, which can all be imported. Kropotkin doesn't spend too long on this, I think because he felt he had shown in a few paragraphs how insane this idea actually is. What if the demand for this export collapses for whatever reason, or conditions in the country result in an inability to produce the export product? Either way, that country has lost the capacity to supply any of its other needs, if it has indeed opted to specialize completely in one or even a few exports, aka cash crops. Canada in particular should have this sorted out by now, having gone through a terrible and protracted economic period when the plan to become a primary wheat exporter and import everything else crashed, not only due to tariffs, but also due to competition and bad weather, let alone transportation costs. Regardless of what a person may think about free or fair markets, I think it would be hard to find anybody who seriously thinks that the whole economy should effectively be one big egg in one massive and fragile basket.

Pondering on this some more, it seems clear that the over extension of the division of labour idea to whole countries is itself part of an attempt to rationalize colonialism. Colonialism can provide considerable short term gain, at the cost of the society engaged in it remaining perpetually at war, with the attendant social pathologies of oppressive hierarchy, rampant violence and poverty, and general practical precarity. The only way such societies can survive is to keep finding other societies to take over and colonize, because they destroy so much that is necessary for life in the places they have taken over. Nobody really likes this grind of perpetual war, so every effort and mind trick possible is needed to somehow make it all seem worth it and hide the unpleasant consequences, especially the sufferings of the colonized. Warped idealism, what Minnie Bruce Pratt referred to as "right makes might," is a horribly effective and destructive part of this mental arsenal of self-deception.

Pratt explained "right makes might" as that sense a person may be taught that because they have and live by the one true answer, that empowers them to not just share that answer with others, but force those others to live by that same answer. By any means necessary. In canada for example, that meant total warfare, spiritual, physical, economic, on Indigenous peoples. These forms of warfare overlapped, but the sad fact is that missionaries played a key role in softening up Indigenous societies, sometimes accidentally because they brought disease with them, all too often quite deliberately via their desire to convert. A desire that led to church support for widescale kidnapping of Indigenous children to imprison them in residential schools. The missionaries were certain then, and in many cases still are now, that any means justify what they are sure are the ultimate ends. The secular missionaries of capitalism felt much the same, and still do. The costs, whatever they may be, especially since those imposing the losses don't suffer them, are always "worth it," because they are supposedly for the "right" reason. (Top)

Not Children, Not Even Metaphorically (2018-11-20)

Arthur Rackham illustration of Tom Thumb, courtesy of oldbookillustrations.com.
Arthur Rackham illustration of Tom Thumb, courtesy of oldbookillustrations.com, august 2018

Ah, the wonders of web search rabbit holes led me to Walter D. Mignolo's website a few days ago, leading to wonderful procrastination fodder which throws up just enough relevant material for my own work to leave me denial room if pressed. Mignolo is an important theorist and scholar of decolonialism and an important critic of so-called "globalization," the latest word used to put a pretty bow on colonialism and exploitation. Two of his most famous books, because they gore some important oxen via their titles just to start with, are The Darker Side of the Renaissance and The Darker Side of Western Modernity. I am looking forward to reading them as well as more of his lengthy posts, which include many relevant links and online references. Here though, I want to take up a striking reference he made, which led him to make use of a specific, and highly problematic, metaphor.

The essay at hand here is Ukraine 2014: A Decolonial Take, posted in March 2014. Having discussed several relevant issues and approaching the end of his remarks, he takes up the question of sanctions and their use against russia. Mignolo has been drawing out the punishment nexus fundamental to "western" and I would say colonial cultures over the course of the whole post, working over indirectly the notion that colonial powers treat colonies, former colonies, and other than first world countries as children. Referencing Leanne Betasamosake Simpson's point about christianizing education he writes:

"Canadian-Nishnaabeg writer, activist and singer, Leanna Simpson, makes a point in marking the distinction between Christina idea of education and First Nations (Indigenous people of Canada). Christianity education is based on prohibitions and punishments. Nishnaabeg in nurturing, for the simple reasons that prohibitions and punishment encourages in a child violation of the prohibitions and revenge to the punishment. US implementing sanctions on Russia follows the Christian way of Western education."

The trouble here is the de facto acceptance of the notion that "the colonized" are like children. I haven't bumped into the source of Simpson's point here, which may easily have been focussed on childrearing or part of an overall discussion of the Indigenous view of the role of punishment in altering human behaviour and what sorts of punishment are effective for encouraging meaningful, positive, and persistent change. I have no doubt that both her point and Mignolo's here ultimately come out to this: prohibitions and punishments encourage actions that break prohibitions and breach boundaries, with the hope if not serious expectation of getting away without punishment. But even getting caught still makes it all into a perverse show of power via rule breaking and enforcing attention from those doing the punishing. This is not specific to children, and in fact is a key structuring behaviour in patriarchal societies, to pick the obvious category example.

Ah, but here's the rub. How do you separate out the inappropriate metaphor that equates the colonized, the oppressed – anyone designated as lower in a hierarchy of oppression – with children, from the real and undeniable point Mignolo is making in this specific post: sanctions are a goad to continue the unwanted behaviour, not a deterrent? Well, practically he could have written a bit more text, and then wrestled with the implications and questions it raises. For example, whether it makes a difference whether the sanctions were defined and imposed by governments or civil society. There is a strong argument that sanctions against apartheid south africa played an important role in delegitimizing it and helping end its worst excesses and begin the process of rebuilding by the majority of south africans who opposed apartheid in the first place. Civil society-driven sanctions are causing so much concern for some supporters of israel's ongoing destruction of palestine that they are trying to get even discussing them defined as antisemitic and hate speech. But then again, there the sanctions are being proposed and imposed from a different constituency and involve considerable grassroots pressure to make them happen. In contrast, sanctions of the type Mignolo is discussing, and that seem to be proposed or threatened nearly every day by the current american leadership, are not necessarily a response to broad-based citizen pressure at all.

The totality of Mignolo's essay clarifies this though, because he is taking issue with especially hypocritical uses of "democracy" as a practice, including the way in which nations with considerable military power and a history of colonialism have a habit of declaring any otherwise recognized democratic practice undemocratic when another nation uses it as a basis for doing something they don't want. He considers the notion of "democratic disconnect" which I think has also appeared in the guise of "democratic deficit" in other political science oriented circles, and how yes, accusations of lack of democratic process may be pointing at real lacks while being cynically deployed. Cynically deployed to invoke a metaphor of the colonized or "non-first world" nations as equivalent to badly behaved children who should be disciplined with violence. Which returns my own reflections here back to the beginning. For those of us with a decolonial, resistant turn of mind, we need to reject this metaphor every time we run into it, even in our lengthiest blog posts. (Top)

Greedy Savers? (2018-11-13)

Publicity still of Alastair Sim as Ebenezer Scrooge in the 1951 version of Dickens' 'A Christmas Carol.'
Publicity still of Alastair Sim as Scrooge in the 1951 version of Dickens' 'A Christmas Carol,' august 2018

At the moment there is considerable angst abroad, both in firmspace and cyberspace, for a whole range of good reasons, so it is not surprising to see even more examples of unfortunate comments on the sites where people can leave them. It happens, and no doubt the moderators are doing all they can. A lengthy comment on a news site I have been reading recently caught my eye because it struck me as funny, in the sense of odd, rather than amusing. The person stated several times that if the real estate bubbles in various large canadian cities pop unpleasantly, all should be well because the "greedy savers" will lose their shirts, and the people underwater with untenable mortgages will be bailed out. Now, obviously there are several layers of odd, or at least wishful thinking here. For instance, regular homeowners don't get a nice pat on the head and forgiveness of their mortgages in the event of largescale crises in capitalism, especially late stage capitalism, which it seems we are quite obviously in these days. Then there is the notion that savers must be greedy, which of course is by no means impossible, but hardly a given. Now, maybe this person actually meant "miserly," which also is hardly a given personal quality of people who manage to save money.

Since I am a trainee historian, perhaps this simply reflects that I happen to have read some things that the other person hasn't about the massive impoverishment of people who lost their savings during the great depression via a whole range of unwanted options. There were those who had gotten in on the stock market, which crashed horribly. Those whose banks went under and the principals absconded with the little capital that was on deposit, the savings of the smaller account holders especially. After all, it would be effectively impossible for them to pursue the crooks because it was expensive to raise legal proceedings then, too. Then there were all the people who ran through their savings trying to keep themselves afloat while under or unemployed. It was in this period that several realities were forced into capitalist fundamentalist brains of the time: capitalism needs people to spend money, not everybody can be a bigtime robber capitalist or practically nobody has any money to make capitalism run, if people can't save money against emergencies they will curb their spending, and everybody else besides the elite living on credit means the system can only grind to a halt when faith in the system inevitably fails. That greed is an unappeasable sickness that is inherently prone to destroying any system dependent on it remains unacknowledged by capitalist fundamentalists, and these other realities have been strategically forgotten. Or else, the hope is that the party can somehow keep going just long enough for the "elites" to get off the planet or create the perfect bunker to wait out the coming collapse. And they are sure there will be a collapse, from what I can see.

On the other hand, the original commenter on presumed "greedy savers" was of course making a practical point. If most people have no money because they were bound up entirely in debt that they cannot repay, then the only people left to take money from are those who have saved money by dint of whatever hard work or good fortune has come their way. Many mortgages are in play right now where the ostensible purchasers of homes provided a minimal down payment, and stories of people perpetually one pay cheque, or one rental cheque from their tenants, away from losing the house or condo are easy to find. So the older co-occurrence of some level of savings with being able to finance and pay a mortgage is no longer a given, and encouraging home ownership via mortgage could be considered a means to discourage challenging the status quo.

Alas, the more important aspect of this commenter's unfortunate insistence on the presumed "greediness" of savers is the suggestion of jealousy or hostility because those people are perceived to be somehow more fortunate. The theme was sliding ever so disturbingly into tropes that are cloaked anti-semitic slurs, if they weren't there already. This scapegoating of individuals rather than questioning of a problematic socio-economic system that we need to change is not surprising, but it is definitely another warning sign. (Top)

The Master's Tools (2018-11-06)

Portion of a theban tomb painting from around 1350 - 1300 BCE.
Portion of a theban tomb painting from around 1350 - 1300 BCE. Does the guy on the right have dreadlocks? Image courtesy of wikimedia commons, may 2005.

In 1979 Audre Lorde gave one of her most brilliant and misunderstood speeches. She had been invited to give this speech, and asked to reflect on the role of difference within the lives of american women. As it turned out, Lorde probably made the conference organizers and many attendees far more uncomfortable than they expected to be, because she sharply criticized the mainstream Feminists of the time. I don't agree that she criticized "second wave Feminism" as a whole, as many people have claimed since. That is one part of the misunderstanding; a key element of her point was that second wave Feminism was already full of diverse women whose differences were powerful tools in helping them recognize and coordinate against oppression. She was particularly critical of the mainstream face of Feminism at the time, which was being reshaped, with very little resistance from the women being co-opted, into a reformist movement that posed very little threat because it was ineffective. Lorde was appalled and disappointed that these mainstream Feminists had turned away from grappling with differences, the very grappling that made them effective, and that they condescendingly assumed that "lesbian and Black women have nothing to say about existentialism, the erotic, women's culture and silence, developing feminist theory, or heterosexuality and power." (I am following her capitalization in the quote.)

In a way, this brief quote reveals a lot all on its own about the conference, which was in honour of Simone de Beauvoir. Brilliant philosopher though she was, de Beauvoir never really subscribed to Feminism herself for much of her life. Yet her life and political positioning were highlighted and in many ways mirrored by the conference. She was rebellious and outside "the norm," but only so far. "The norm" being the presumed ideal male and sex-role stereotypical behaviour expected of males to the extent that women could emulate them. It is not without reason that frustrated Feminists back in the 1980s could write books like Mary Evans' Simone de Beauvoir, A Feminist Mandarin, which is a strong critique of de Beauvoir's politics and the way Feminists of the time recast her as one of themselves.

UPDATE 2018-11-19 - Further to this, Joanna Russ made a parallel comment in her introduction to "Letter to Susan Koppelman," pages 171-176 of To Write Like a Woman: Essays in Feminism and Science Fiction (1995: Indiana University Press). "Anyone who seriously tries to make received ideas do feminist work will find that the received ideas end up making her feminism do their work, and anyone who really thinks that respectability will do us good in any field – well, I don't know what to say."

Very early in this speech, which many readers by now have already recognized as the one often reprinted with the title, "The Master's Tools Will Never Dismantle the Master's House," Lorde asked a powerful question that takes us to the crux of what the essay title means and the critique she was making. "What does it mean when the tools of a racist patriarchy are used to examine the fruits of that same patriarchy?" It is at least as important to bear in mind her answer to the question while reading the rest of the speech: "It means that only the most narrow perimeters of change are possible and allowable." This is not just an answer Lorde was giving, but a warning. She knew exactly where the mainstreaming of a few especially photogenic and not too politically challenging Feminists was leading. Just nine years later, Marilyn Waring would publish her brilliant book If Women Counted: A New Feminist Economics, in which she demonstrated how the usual "economic measures" deny or ignore women's work and contributions to human survival, and thereby serve policies that actively destroy women's economic independence. A great example of the master's tools doing just what they were designed to do, and in order to overcome them, Waring had to design new tools, in the form of time-based analyses, among other things. The key is what the tools were designed for, and who designed them. It matters what assumptions go into any tool or other type of technology, and I have already written several thoughtpieces about this.

What drove me to write this piece is that one more time I have seen and read a philosopher and speaker whose ideas I generally respect declare "with all due respect to Audre Lorde" that "the master's tools can so dismantle the master's house" referring to basic carpentry tools. I begin to think that "with all due respect" is actually a shocking marker of either lazy thinking or internalized racism, because misrepresenting Lorde's point in such an egregious way is not respectful. Lorde emphasized that design, motivation and outcome of use all mattered. It can be hard for us to unpack our own motivations let alone that of others, so it helps to have two more concrete aspects to consider. There is no way Lorde ever meant something like "the master's hammer can never break the chains of the slave even when the slave wields it." But if the slave is fooled into identifying with the master and therefore refusing to use the hammer to break their chains, the real tool in question is not the hammer but the terrible tools of psychological manipulation affecting the slave. The hammer or saw or whatever were not designed with the plan of rendering resistance to oppression and injustice difficult. Those aren't the tools we should be worrying about in that context.

The hammer and nails or whatever are not the master's tools. After all, the master expects the slave to use them, first to meet the master's needs, second to meet their own needs just enough to keep them useful as slaves. Let us not forget that if we insist on looking back to an imagined antebellum american south for images of what Lorde's phrase evokes, the idealized master never wields the tools of basic day to day labour. His tools are primarily psychological; as far as possible the sweaty and bloody stuff is to be delegated to somebody else, preferably a completely co-opted slave, or second-best, a person of dubious whiteness.

All of this raises the question, really, of whether the commonly repeated misunderstanding of Lorde's point and critique is itself an attempt to blunt its power. This needn't be sinister at all; Lorde was giving the original audience, and in turn the rest of us, some tough love with that speech. Our challenge is not to let ourselves sidestep that love, but to accept it and look hard at the tools we use, how they were designed, the motivations behind the design, and the outcomes that follow their use. That's how we can keep awake and keep effective in our opposition to and ending of oppression. (Top)

Persistent Neoteny (2018-10-30)

Diagram illustrating neoteny via changes in skull shape with age in chimpanzees and humans. Diagram illustrating neoteny via changes in skull shape with age in chimpanzees and humans.
Diagram illustrating neoteny via changes in skull shape with age in chimpanzees and humans. Image courtesy of Steven M. Carr's excellent evolutionary biology notes at memorial university, september 2010

A few days ago I reread Aliette de Bodard's wonderful rant, On the Prevalence of U.S. Tropes in Storytelling, which is just about seven years old but still all too germane. Part of what led me to read it again was reading something rather more esoteric, a translation of Mikhail Bakhtin's essays on speech genres. Based on a selection of russian and western european novels, he argues that heroes don't change in these books: the world changes around them, but they remain the same. Deborah Knight and George McKnight have written about genre characters like Neo in the Matrix movie franchise or comic book characters, who are "fundamentally characters" whose outside is their inside. They are not meant to make us feel that they are like real people at all, but to illustrate and perform specific cultural narratives. That is a key part of what makes their stories so apparently teleological: they are. There is only one direction for them to go. I'm not sure if Bakhtin had quite the same view – reading more of his work may reveal that he separated out "characters" of this type from another type of unchanging character, the type that I think de Bodard is arguing should have more opportunities to shine in storytelling. If, like me when I first read de Bodard on this point, you are feeling a bit resistant to the idea of a different type of unchanging character, there's more to be considered, because she has a real point.

For the purposes of this thoughtpiece I am going to try clarifying things by sorting out the terminology a bit. Let's take "character" to be what Knight and McKnight say, a fictional person whose inside is their outside, with simple motivations and a linear experience arc in the story they are part of. For representations of people in stories who are intended to be more complex, with layered and even contradictory motivations, and inner lives which we can always ask more questions about, let's use the old theatre-related word "persona." Admittedly someone could argue that this is an abuse of "persona" because that referred to a mask, but remember that in ancient greek drama, the mask was part of how the actor embodied and responded as a particular being in the context of a complex ritual performance. If you spend a few days reading ancient greek drama, you will notice that even early comedies have unexpectedly complex representations of individual people in them. With this in mind, I think based on what I have read so far that de Bodard is talking about personae, and Bakhtin could be too at least part of the time.

Phew. Okay, so de Bodard is not, as far as I can see, arguing for more flat characters. I think part of what she is understandably fed up with – admittedly I say this because I agree – is the persistent neoteny of characters in u.s. story tropes. They are all obsessively focussed on young people. Every story is wedged into basically three narratives. Young boy becomes a man, usually by killing other people and engaging in sexualized violence; these two activities overlap significantly. Young girl suffers sexualized violence, and if the storyteller hopes to seem more "modern" or shocking, kills other people, and finally comes of age by finding some guy to marry, by implication abandoning her entire independent existence. The usual boy seeks separation and individualized glory, girl desperately seeks marriage and complete loss of self bullshit sex stereotypes. The third narrative is closely related to the first, with the slight variant that any male may engage in it, although it is primarily a young man's genre: wronged male seeks revenge, slaughters lots of people, engages in sexualized violence, and dies in a huge firefight of some kind at the end. A minor subgenre has the guy apparently finish slaughtering all the baddies and then die quietly of old age in his bed, still living alone and too grumpy for society. Nobody gets old, everybody is passing through some form of arrested or accelerated puberty. Adolescence is important, appears cross-culturally to be a harrowing time of life, and everybody goes through it. But a great many of us get through it, and the lives we build are quite variable, often by necessity no matter how much we may wish with all our might and main to embody sex stereotypes. And once we have settled into adulthood, we're pretty stable personalities, though if we're lucky we hang onto our ability to learn and change our minds sensibly. That makes the persistent neoteny of these tropes all the stranger.

De Bodard argues in part for characters who don't change in the usual sense embodied in u.s. tropes because they are already adults, and so they have diverse reasons for doing the things they do and many more possible storylines to follow. They could still arguably be characters in the sense that their inside and outside are the same, they are transparent so to speak. It is possible to write such people without devolving to the cartoon villain equivalent with the silly black hat, mask, and twirly black moustache, fun as those can be in their place. This actually makes me think of a rare but striking character who turned up in an episode of Doctor Who during the Fifth Doctor's tenure, a hard-bitten female starship captain. She seems about five feet tall next to the other actors, yet projects unmistakable authority. She is running the ship, the ship is a cargo ship, and they have a delivery to make. Her motivations are quite clear and uncomplicated. There is nothing simple about her. She doesn't simply chuck the Doctor and his friends in the brig, though she considers the idea. And she makes the viewer wonder at least a little, I hope, about her life and how she came to be a starship captain. Maybe it was a second or even fifth career, a not so uncommon phenomenon nowadays.

I have written before that some of the best storytelling is the kind that makes you turn the page because you want to see how the writer managed to get out of the story corner they have painted themselves into. Other times, as avoiding persistent neoteny can allow, what makes storytelling great is all the unusual questions it leads you to ask, even if you never do get any answers. Those can be the best kind, as thousands of fiercely creative fan fiction writers reveal every day. There's lots of the other sort of storytelling that reaffirms what we already think and believe, or reaffirms that what we think and believe isn't what a few executives think will sell. Let's have some of the other stories, characters, and personae. (Top)

An Important Rhetorical Summary (2018-10-23)

Photograph by José Luiz Bernades Ribeiro of bust of Cicero at Palazzo Nuovo, Musei Capitolini in Rome. Photograph by José Luiz Bernades Ribeiro of bust of Cicero at Palazzo Nuovo, Musei Capitolini in Rome.
Photograph of bust of Cicero at Palazzo Nuovo, Musei Capitolini in Rome, courtesy of wikimedia commons. José Luiz Bernades Ribeiro, september 2016

Possibly we have all heard someone declare in an unimpressed tone that some statement is "just rhetoric" or "only rhetoric." Perhaps more of us have heard its synonym, "it's only words" or "they're just saying that." Or the ever obnoxious and unhelpful, "sticks and stones may break my bones but words will never hurt me." The neutral definition of rhetoric is simply "the art of effective or persuasive speaking or writing," although "persuasive" and "effective" are both weasel words that beg questions. Effective for whom or what? Persuasive to whom, about what? Is a persuasive element always required? My OED doesn't say, although it also notes a more overtly pejorative definition of rhetoric, as "language designed to have a persuasive or impressive effect on its audience, most often regarded as lacking in sincerity or meaningful content." The sayings noted already are most consistent with this second definition. But I think it is fair to say that whatever else the current holder of the title "president of the united states" may have done recently, he has shown all and sundry that rhetoric is never empty and words can indeed be harmful and brutally effective in the real world. He has made it very hard to avoid those facts by his desperate efforts to keep attention focussed on himself, as if he were afraid he would suffocate and shrink away without it. In any case, over a year ago now, Feminist Current featured an article that includes a neat summary of the shock tactics currently in heavy-handed use by "falsifiers and deniers in any debate," and I think it is well worth quoting here, courtesy of columnist Janice Williams. Ellipses reflect deletion of topical references to a movie that framed Williams' discussion.

  1. Claim to have been attacked and treated rudely when you have actually only been disagreed with and, in fact, you are the one attacking and being rude. Trade on the fact that your opponent will behave well (often under obligation to appear professional) when under attack and you will gain an attentional advantage.
  2. Shout loudly, be colourful and entertaining, maybe crack a few jokes. You know that on a grim issue like the Holocaust (or other nasty issues, like Female Genital Mutilation or prostitution), this will bring the many who cannot face brutal reality onto your side where you hope to keep them. Disparagement humour has been shown to allow haters to deny their own hatred by claiming that it is "just a joke." That your opponent does not engage in this way will work in your favour – they can be mocked as dour and humourless (especially if your opponent is female!)
  3. Whatever you do, don't distinguish between assertions and claims vs. evidenced facts and expert opinion. Act as if these are all equally valid and very much the same thing.
  4. At every opportunity set up an apparently simple challenge like, "Who can show me a single document that proves the Holocaust happened?" This is attention-grabbing and exciting (a welcome escape from the grim issue under discussion), but is virtually impossible to do, because real history is built up by evidence upon evidence, not (in modern times at least) by a single document.
  5. Above all, create the (totally false) impression that there are two equally valid sides in this "debate." This is how the media often tackles controversial issues: by setting up a short discussion, allowing each side make a point or two, and considering the job done. However, we know... that on big issues, one needs to look deeply at the evidence and engage in critical thinking in order to form a valid conclusion. Simply having an opposing viewpoint doesn't mean it's of equal value....

The running theme in these techniques is distraction. Suddenly the debate or discussion is no longer about the topic at hand; it is about being entertained by loudmouths and their ability to hijack the proceedings by behaving in a childish manner. The question we are supposed to be diverted onto is whether the loudmouths will be able to reduce the other participants to shouting back. None of this is good for anything practical or ethical. It's just good for a solid dose of distraction for the audience and a dose of endorphins and adrenalin for the loudmouths who bask in the attention.

Contemptible as these techniques are, they are all too effective on the unforewarned and therefore unarmed. Merely feeling scornful after the fact, or otherwise apart from actually experiencing them, is no help if you are in fact unable to resist their effects. The people who use them hope to keep us from listening and thinking about the actual questions at hand, and to fool us into believing we are a step ahead even as we fall for them. Luckily the study of rhetoric is making a comeback, even though it remains under heavy siege along with any other area of study or general interest falling under the label "humanities," surreal as that ongoing attack is. Not that rhetoric is making a comeback under its own name necessarily. Often it percolates back up in the genres of "media studies" and "critical reading," where people can learn how to parse the nonsense out of advertisements or, as is at least as common, learn that they already know how to do that and can apply those skills to many other areas of life. Or even that "critical" doesn't always or only mean "find something negative to say" and in fact ideally means "find something thoughtful to say," which helps spawn or continue a helpful conversation.

The rather doubtful fellow whose bust illustrates this thoughtpiece, Marcus Tullius Cicero, whose tumultuous life spanned 106-43 B.C.E. and a significant portion of the implosion of the roman republic, is still a renowned rhetorician. He made his reputation and fortune as a lawyer and politician. Many of his speeches have survived, along with philosophical writings and many letters, and the once standard text Ad Herennium was attributed to him for centuries. It would be well worth rereading some of Cicero's more famous orations with Williams' summary in mind, especially his speeches against Catiline. (Top)

Language Prescription Foibles (2018-10-16)

Sample chart illustrating regular astigmatism. Sample chart illustrating regular astigmatism.
Sample chart illustrating regular astigmatism. august 2018

There is nothing new about attempts to control what people say and how they say it. We can find implied instances in the context of the earliest deciphered writing, where a script designed for one language is being used to write another. The direct examples I know of all come from the romans, whose prescriptive grammarians fought a futile rearguard action against the romance languages, with of course no idea that this was what they were ultimately doing. It would not surprise me to learn that some of the early sanskrit grammarians were engaged in similar work. I appreciate the temptation to inveigh against changes in pronunciation and grammar in light of the strange career of the word "concerning." For whatever reason, the newer uses of this word grate on my inner and outer ears, though not in a way that leads me to think my compatriots who use it in its new adjectival form are doing "wrong." Fortunately I have learned the lesson of the futility of language prescription.

UPDATE 2018-08-26 - I hold the Radical Feminist position that refusing pseudo-generic pronouns and sex-marking where it doesn't matter are both important steps toward recognizing and rethinking sexism on the road to challenging and ending it, though this by itself will not end sexism. It is part of making oppressive sexist structures of action and thought conscious and therefore open to change, not that second level of more difficult change itself. As Susan Cox noted on Feminist Current back in 2016, we have to watch out for linguistic change being abused not to reveal oppressive structures and facilitate questioning them, but to conceal them and prevent resistance to and destruction of them. The still ongoing mania over "preferred pronouns" has far more to do with concealment, alas. As Cox notes, "But at this point in time, referring to every person on earth by gender-neutral pronouns will have no impact on the reality of sexist oppression – it merely stops us from speaking about it."

UPDATE 2018-09-07 - It turns out that my information on the age of singular "they" was quite conservative: among the earliest written uses are those by Chaucer, which makes this linguistic development in english easily over 600 years old, since writing lags behind in reflecting changes to speech. See the wonderful summary post at Motivated Grammar, Singular "They" and the Many Reasons Why This is Correct, from way back in september 2009.

A particularly wonderful example of unremitting resistance to language prescription comes from the fraught category of pronouns, specifically "gendered" pronouns. I use scare quotes here not to imply sarcasm, but to highlight that actually, more languages allow speakers to leave sex unmarked than require marking it, although speakers can still mark sex when referring to a person if wanted or needed. In the case of english and a number of related and cousin languages, in light of queer theory and other controversies, there has been a great deal of sturm und drang on the topic of pronouns of late, including a whole new round of verbal sparring over the use of "they" as an indefinite singular form. It amazes me how much this usage continues to infuriate a particular tiny number of commentariat members who get a remarkable amount of press. However, the controversy does not date from queer theorists rushing to renovate the pronoun systems of indo-european languages spoken mostly in europe and its former colonies. Far from it. It dates right back to the good old 19th century, that period of obsessive cataloguing and attempts by colonizing elites to exert and rationalize coercive control, among other things. Everybody was using "they" this way, no transgender lobby anywhere to be seen. And this is not news. Feminist linguists have been documenting this from at least the early to mid 1970s. One of the best summaries of the infamous career of "they" comes from Dale Spender in her 1980 book Man-Made Language. She writes:

Before the zealous practices of the 19th century prescriptive grammarians the common usage was to use they for sex-indeterminable references. It is still common usage, even though "grammatically incorrect"... Then – and now – when the sex of a person is unknown, speakers may use they rather than the supposedly correct he in their reference....

As Anne Bodine has noted, using they as a singular is still alive and well, "despite almost two centuries of ingenious attempts to analyze and regulate it out of existence" on the ostensible grounds that it is incorrect. And what agencies the dominant group has been able to mobilize to this task! Bodine goes on to say that the survival of they as a singular "is all the more remarkable considering the weight of virtually the entire educational and publishing establishment has been behind the attempt to eradicate it." One is led to ask, who is resisting this correctness?*

Well, practically speaking, apparently everybody is resisting the many ingenious efforts to force people out of using they as a singular, sex-indefinite pronoun. It seems english speakers at least are quite determined not to mark sex where they deem it irrelevant or unknowable, even under the most extraordinary social pressures. For a recent usage of they in this manner, see the example in the fine article on The Legendary Language of the Appalachian "Holler" by Chi Luu at the JStor Daily. There is a gentle lesson here, I think. (Top)

* For those who would like to see Anne Bodine's 1975 article for themselves, the full reference is "Androcentrism in prescriptive grammar: Singular they, sex indefinite he and he or she," Language in Society, 4(2): 129-156. Dale Spender's book Man-Made Language, published by Routledge and Kegan Paul of london is a regular inhabitant of both public and university libraries. Spender writes in a clear style that avoids jargon and is often riotously funny.

BBEdit and Website Managing (2018-10-09)

The 25th anniversary bbedit logo from barebones software's website. The 25th anniversary bbedit logo from barebones software's website.
The 25th anniversary bbedit logo from barebones software's website. barebones software, august 2018

Several earlier thoughtpieces have dealt with web browsers, especially the ongoing trainwreck that is firefox – currently following the neoliberal playbook to purge the built-in rss-reading capabilities better known as live bookmarks – and seamonkey, with a dash of waterfox thrown in. These are all built on the same codebase, and for better or worse constitute the least bad options for web browsing security and privacy in my considered and researched view. Not that writing and maintaining web browsing software is easy, to be sure, especially now that only the doughty crew working on seamonkey still sees a role for the browser in building the web in the first place. For all its faults, the late netscape browser's composer – its descendant still kicking within seamonkey's regular production code – did serve as a non-trivial tool in learning how to write and design web pages. "View page source" is helpful, although in the code family these three browsers come from, the absurd default is not to wrap long lines of code.

UPDATE 2018-08-13 - I did track down how to change the default behaviour on the wrapping of long lines of code in "View Page Source" mode. It was in the mozilla help fora, and apparently the present default was not introduced to firefox until version 41 or so. Strange stuff. In any case, for those who haven't bumped into this yet, the setting to look up under about:config is "view_source.wrap_long_lines" and the setting beside it can be clicked to alter it from "false" to "true."

There are lots of options other than web browser composers for writing web pages, from such massive heavy hitters as the latest successor to adobe golive (remember golive? remember its original company?) to the online databases full of point and click elements and an attempted "wysiwyg" interface like wordpress installations. The determined and tightly budgeted can get along in a basic text editor if they can persuade it out of rich text mode. At one point microsoft's operating system came with the only plain text editor left standing after the demise of apple's simpletext that could be used for web page editing. That program was notepad, which like simpletext has no code colouring, spell checking or anything else. Unfortunately, the latest versions of notepad have a terrible bug which does alarming things to web pages encoded as utf-8: after being opened in notepad they become a mess, losing all optional white space when viewed in other programs and acquiring "gremlins," invisible encoding-specific characters that wreak havoc. In one of my other jobs notepad was the webpage editor I had to use, and this bug has bumped me to editplus on the workstation for that job. Editplus is okay. But it reminded me all over again how unique and wonderful BBEdit is.

It is hard to believe that BBEdit has been around for twenty-five years, even though I have been using variants of it and BBEdit proper to build and maintain the Moonspeaker since 2004, which means going on fourteen years now. BBEdit's parent company barebones software provides a free version that is still full of a spectacular number of useful features with no time limits or nagging messages asking if you wouldn't like to maybe, kind of buy the full featured version. For my part, when administering my site finally grew complex enough to warrant an upgrade, I took the plunge and bought a full license, and can say honestly that the investment has been justified. Barebones software's tagline for BBEdit is "it doesn't suck," and that has to be some of the best understated advertising out there. Of course, emacs and vim partisans will flatly disagree with all of my praise here, and I respect their preference for their editors. As for me, I will stick to BBEdit so long as my editing platform remains a mac.

In terms of web editing then, BBEdit can do all the things you would expect: templates, projects, code syntax colouring, autoindenting, and built-in ability to work with versioning software like git and subversion if you have them installed. There's a bog-standard ftp client built in, spellchecking, clipping facilities, and support for almost every computer language you can think of, with ready means to add support for more. For me though, what really won me over for good was grep. Infuriating as learning to code regular expressions can be, once the arcane but stable syntax is sorted out, a whole range of basic updates suddenly take mere minutes where by hand they would take days and probably be full of mistakes. From there I found my way to BBEdit's includes, which take care of a whole other aspect of updates and changes, and can even be driven by perl or python scripting. After a couple of hours of development time, I had shifted a whole pile of editing tasks to scripts and could focus on writing rather than coding when working on the site. BBEdit includes an excellent html page previewer (just previews, the links don't work), or you can switch to previewing in your chosen web browser with an alternate keystroke. Meanwhile BBEdit's project structure allows me to take full advantage of it for more serious coding needs and for writing that needs to be done in markdown. About the only coding/writing I don't do in BBEdit is LaTeX, mainly because TeXShop is better for that purpose since it smoothly integrates editing, previewing, and generation of final electronic proofs.
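To give a concrete flavour of the kind of grep-powered batch edit I mean, here is a sketch in python rather than BBEdit's own multi-file grep search. The pattern and file layout are invented for illustration only: the example imagines bumping the year in a site-standard copyright line across every page.

```python
import re
from pathlib import Path

# Hypothetical example: a copyright line shared by every page on a site.
# In BBEdit this would be a one-line grep pattern in a multi-file search.
COPYRIGHT_RE = re.compile(r"(Copyright © C\. Osborne )\d{4}")

def update_copyright(html: str, year: int) -> str:
    """Replace the year in the site-standard copyright line,
    leaving everything else in the page untouched."""
    return COPYRIGHT_RE.sub(lambda m: m.group(1) + str(year), html)

def update_site(root: str, year: int) -> int:
    """Apply the substitution to every .html file under root,
    returning how many files actually changed."""
    changed = 0
    for page in Path(root).rglob("*.html"):
        text = page.read_text(encoding="utf-8")
        new_text = update_copyright(text, year)
        if new_text != text:
            page.write_text(new_text, encoding="utf-8")
            changed += 1
    return changed
```

Done by hand across a few hundred pages this would be an afternoon of tedium and typos; expressed as a pattern it is a few seconds of work, which is exactly the appeal of grep in an editor.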

All of those useful tools (among many others) come in a program that runs fast and light, and in my experience over the years has crashed only a handful of times. None of them, I should add, came from when I accidentally told BBEdit to open a huge pdf, which it dutifully did as a text document using the best defaults it had available. The results were surprisingly good, and no harm done to the original file. BBEdit has robust recovery and autosave facilities, so a crash is no more than a minor inconvenience now should it happen, since a minimal amount of work is lost. Would that the last word processing program standing that I use were as sturdy. There was a blessedly short period when a terrible bug disrupted the recovery and autosave features, right after the introduction of the document sidebar that allows many open documents to be accessed quickly and conveniently in less screen space. If there must be a terrible bug like that, then at least let it show up in the context of a significant background change. I see from my notes that this led me to edit the site for a year in emacs, and that was also one of the worst years in the Moonspeaker's entire history for lack of posts. So in reality, I didn't edit the site so much as kept it on life support while trying frantically to reestablish a steady workflow. Arguably emacs was not the best choice, but my experience with vim and vi was unhappy, and gedit was not terribly available yet. (Today it is possible to use gedit in macosx by using homebrew or macports to build and install it.)

I should add that in this day and age of web pages overwhelmed with scripts that won't let you read the text if they are not allowed to run, BBEdit is my go-to program for when I dip into the page source with my web browser, grab the text, and then dump it into a plain default html templated page. From there a quick key command allows me to read the text in BBEdit's preview mode. This should never be necessary of course, but in the current state of the web it is an important fallback option. Finally, this description of the wonders of BBEdit reflects version 11.6.8, not 12.1.5. It looks like the latest version still does not include built-in ftp site mirroring, which is the only web administration task that BBEdit can't take over directly, and perhaps a respectful nod to another longtime independent macosx software vendor still in play today, fetch softworks. Their product, fetch, does one thing and does it outstandingly well, and is also firmly worth its price. That may be the strategy that has served both barebones and fetch softworks so well: while their products can do quite a lot, they don't try to do everything we could possibly think of, so the things they do, they do well. (Top)
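For the curious, that page-rescue workflow of grabbing the source, discarding the scripts, and re-wrapping the text in a plain template can be sketched in a few lines of python. This is nothing built into BBEdit, just an illustration of the idea, and a regex is admittedly a blunt instrument for html compared to a real parser, but it serves for a quick read-only rescue.

```python
import re

def strip_scripts(html: str) -> str:
    """Remove <script> elements wholesale; crude, but serviceable
    when all you want is to read the text."""
    return re.sub(r"<script\b[^>]*>.*?</script>", "", html,
                  flags=re.IGNORECASE | re.DOTALL)

# A deliberately plain template, much like dumping text into a
# default html page in an editor.
TEMPLATE = ("<!DOCTYPE html>\n<html><head><meta charset=\"utf-8\">"
            "<title>rescued page</title></head>\n"
            "<body>{content}</body></html>")

def rescue(html: str) -> str:
    """Strip the scripts, keep whatever sits inside <body>, and
    re-wrap it in the plain template for distraction-free reading."""
    cleaned = strip_scripts(html)
    match = re.search(r"<body\b[^>]*>(.*?)</body>", cleaned,
                      flags=re.IGNORECASE | re.DOTALL)
    content = match.group(1) if match else cleaned
    return TEMPLATE.format(content=content)
```

Saving the result to a file and opening it in any browser or previewer gives you the readable page the original was hiding.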

The Map Is Not the Territory (2018-10-02)

Excerpt from the main map from the Turtle Island Map Project on the Moonspeaker. Excerpt from the main map from the Turtle Island Map Project on the Moonspeaker.
Excerpt from the main map from the Turtle Island Map Project on the Moonspeaker. C. Osborne, august 2018

If there is one thing we can count on, it is finding more and more maps. Maps of all kinds: "just in time maps" via "you do the work, we'll take the money" crowdsourced apps used in web browsers all over the world, and those amazing hangers-on, paper road maps, still available in myriad grocery stores and gas stations. The young and adventurous may even find their way to maps not just in their school textbooks, whether electronic or paper, but also in the fading genre of atlases, books of maps and/or charts named for the ancient greek mythical figure who once commonly featured on their covers. That is a lot of maps. But, as Mark Monmonier wrote in his still relevant and brilliant How to Lie With Maps, maps are not neutral depictions or simple reproductions. They are designed to tell stories, and by nature people who make maps must decide what to include and what to leave out, how to label and how much, all to facilitate telling the intended story. A major tool in colonisation and in rationalizing that genocidal practice was and is mapping. Alas, like all forms of storytelling, mapping can be abused to support doing terrible things. We can tell ourselves the most appalling stories, and the most wondrous ones.

We all make maps, even if we don't produce two-dimensional representations of them. The sense of how to get from home to the various places we need to go, especially the ones we go to on a fairly regular basis, is in fact an internalized map with our home placed in the centre and the routes to our most commonly visited places expanding out from it. This can be surprisingly close to what we actually draw if asked to do so, with adjustments to reflect key landmarks such as major streets and interesting buildings if we live in cities, or water bodies, mountains, and so on if we don't. This may seem obvious, but on an anecdotal basis at least, I can say that this is not an intuitive connection.

One of my earliest experiences with mapping, in the sense of making a map, was being given an assignment in elementary school. The instructions were to draw a map of the city and colour it in. I suspect that the directions did say to make sure there was a legend and a north arrow, though that detail is long gone from my memory. For the purposes of this assignment, each of us was given what, at least to me at the time, was the largest sheet of paper I had ever seen, tabloid sized, 11 inches by 17 inches, rather shorter and a little narrower than an ISO B3 sheet. We were supposed to fill that with our maps. It was a veritable sea of paper. Still, the assignment didn't seem too complicated. We weren't supposed to make up a city necessarily; we were supposed to try to represent the city as we knew it. You already know where this is going. I pretty much flunked that assignment. The teacher was thoroughly annoyed. He felt that I had ignored the instructions, and he demanded to know where the freeways were, and the roads, the real roads, and the major buildings like city hall. He complained that he didn't recognize the city I drew. As an adult I know that this was entirely to be expected. As a child, I remember being completely confused. If he didn't recognize the city I drew, I didn't recognize the city he described.

My city was far greener than his description, which emphasized concrete, buildings, and various types of roads and bridges. In it there was basically one kind of black road, which could be a bit narrower at times, but was otherwise uniform. Much of the road surface I saw was newly done in those years because my family was living in a very new suburb. Lacking regular access to a car and money for the bus, we walked everywhere. So my map showed lots of patches of woods and the intriguing shortcuts that pedestrians walking long distances learn, especially for self-defence from traffic and hot summer weather. I noted places where we could pick and eat the blackberries and where we couldn't because there was too much road traffic – we were always hungry, this mattered – and the street that one year was filled with streams of migrating and hungry tent caterpillars. I do mean filled. Those critters know not to head into wheeled traffic, but as for sidewalks and other paths, they'll take their chances. Both of these cities were present then, and at least to some degree are now. But not being poor and therefore not remanded to the travel opportunities afforded by his own two feet, this teacher couldn't believe that greener city existed, or at least, that it was important enough to map.

I have thought about this often, looking at maps that attempt to show Indigenous territories. Two-dimensional, snapshot type maps are profoundly bad at depicting them, and not just because the Earth is three-dimensional. Mainstream thought encourages the conflation of map with territory, of an extremely limited story with a fixed definition, hedged in by borders that in the physical world are complete absurdities. I have had the blessed experience of showing a young friend, who was a little younger than I was at the time of that mapping assignment, the one metre wide line designating the official border between the settler states of canada and the united states. She looked at me skeptically. "That," she said witheringly, "is the border?" I assured her it was, and that we actually weren't allowed to cross it yet as we hadn't cleared customs. She put her hands on her hips. "That's ridiculous." As indeed it was, and is. (Top)

Copyright © C. Osborne 2018
Last Modified: Monday, July 24, 2017 0:00:55