FOUND SUBJECTS at the Moonspeaker
The AI Conundrum (2017-12-24)
Periodically in my wanderings through websites and blogs dealing with tech-related news, or at least nominally trying to, because the majority are drowning in ads pretending to be news articles, I bump into articles featuring whatever Elon Musk's latest commentary on things besides his own company may be. To be honest, the appeal of his remarks to the tech press has never made sense to me beyond those pertaining to actual tech stuff. Managing to get rich in the tech industry doesn't make you brilliant or even sensible on every other possible topic. (Would that it did, to be sure.) However, in this case I was struck by how various bloggers have run with Musk's fretting about "artificial intelligences" and how, if they ever happen, humans are basically doomed. Most pile-on commentators have declared how right he is, or how mistaken, and the really thoughtful ones have pointed out that his fears may actually be a projection of concerns about the increasingly malignant force that corporations represent in the world. (For example, see Ted Chiang's late december article on buzzfeed.) That is an interesting interpretation, and I don't disagree with it, but it took me a while to figure out why Musk's claims on this point stuck in my mind all the same. Until finally I realized: there is another possible interpretation.
UPDATE 2018-01-22 - Ann Leckie's 2016 speech at Vericon is another interesting perspective on this, and well worth the read. I don't wholly agree with where she went on the point about the nexus made between robots and enslaved people in terms of the specific discussion of artificial intelligence, such as it is. The slippage from oppressed labour straight into entrenched racism and the stories that support its entrenchment is not as direct as it may seem. That said, her greater overall point in that speech is the power of stories to guide our expectations and reactions in the world, and I am wholeheartedly with her on that.
UPDATE 2018-06-02 - Also related is this brand new essay by Mark Leier on activehistory.ca, Teaching the Work Process and "Deskilling" with the Paper Airplane Game, which deals with deskilling, the key element that makes so-called "AI" work at all. "AI" only seems to work because it is founded on assuming that how intelligence works is by putting together a linear series of simple, uniform actions.
In 1990, Edward James published the essay "Yellow, Black, Metal, and Tentacled: The Race Question in American Science Fiction," which has since been reprinted in the amazing 2014 anthology Black and Brown Planets: The Politics of Race in Science Fiction, edited by Isiah Lavender III. James notes that the famous laws of robotics invented by Isaac Asimov were meant to create ideally restrained robots, which, despite being officially artificially intelligent in his fictional universe, would nevertheless thereby be rendered perfect servants. And those robots seem to stand in the place of Blacks. Nor is this the case only in Asimov's work: robots and androids are often artificially intelligent in science fiction, and that generally means their intelligence was built in by humans for the purpose of serving human demands.
Notice where the connections go here: not to corporations, not to uncontrollable constructs, but to enslaved people. They can also be traced back to people who are not enslaved but labourers, forced to work for grotesquely low wages that barely allow them and their families, if they have them, to survive with any sort of wellbeing. Consider that at this moment, the level of social inequality in so-called "modern times" has never been higher, and it is getting worse. Wage labour is constantly more precarious, to the point that even the poorest people who think they are white are being driven into an existence where they live in their cars or even their "recreational vehicles" and travel from place to place working short-term contract jobs. Mismeasured "productivity" has never been higher, and wage levels for the few people left in waged work are being squeezed more and more. They can't afford to buy any of the shit they have to make on the job, and practically speaking they are more interested in decent food, clothing, and shelter. They've been dutifully hanging in there, playing by the rules, and coming to the conclusion that this game is rigged so that they will lose. Breaking down "non-service" jobs until almost no knowledge is called for to do them is supposed to decrease the worker's ability to resist being driven to work faster and longer. The radically broken-down tasks are also that much easier to roboticize, bringing those stubbornly expensive labour costs down some more. After all, robots can't learn to be rebellious, right? Except, if they're actually artificially intelligent.
So you see, here is what I think is actually keeping folks like Elon Musk up at night. They're worried that social inequality is now so bad that they can no longer depend on the "middle class" to have their backs, because they've basically destroyed that class of people. There are a whole lot of poor people with little or nothing to lose, and a whole lot of them now live much closer than, say, mexico or china. Understandably, they are getting awfully pissed off about learning that dutifully working harder and producing more has just meant that they get poorer and less able to survive while the moguls who control the corporations and so forth keep getting richer. No need for the already installed robots to become artificially intelligent or go rogue. Since the humanity of "the poor" is so often debated – the giveaway for when a group's humanity is being debated if not denied outright is discussion of how to stop them from breeding, by the way – it is not at all surprising to me that the idea that they have reason to take concerted action, not some random destructive impulse, is still only a low murmur around the blogosphere. It isn't even a very loud sound out in firmspace yet. Cue sarcastic voice: "The very idea that the poor may have full human intelligence instead of being too dumb to get rich! Egads."
In any event, I'm not worried about any supposed AI conundrum at all. I am worried about a socio-economic system that is founded on the latest version of the secular rapture, which can fairly be termed "the rapture of capitalist fundamentalists." (Top)
The Trouble With Tourism (2017-12-17)
I have had the good fortune to travel fairly extensively for work, with rather less opportunity when I have to cover all of the expenses. "Tourism" is an astonishingly expensive practice, and something I have found rather uncomfortable and mostly impossible to adopt. Quite apart from the expense, I find it difficult to see the point of going to a place if I am not visiting friends, family, a library, or an archive for study purposes, although even that can be a stretch. Somehow "tourism" just seemed like a very odd sort of thing to do, even though the advertisements for it insist on how rejuvenating and edifying it is. Regular readers will be well aware of my mistrust and dislike of ads, but on the other hand it isn't impossible for them to hold a grain of truth. Soap ads do include references to soap, after all. There are many amazing places in the world, which I am quite convinced of with no advertising required. So it took me some time to unpack why visiting places "just to visit" didn't quite sit well with me.
Several years ago I stumbled upon a short article on Archie Belaney's relationship with Anishinabeg communities in northern ontario. Not much stuck in my mind about it, not least because he was an infamous fraud pretending to be Indigenous in the early twentieth century while joining up with the early stages of what was then called the "wildlife conservation movement." What did stick in my mind was its description of Belaney's frustration with the Anishinabeg whom he was trying to persuade to rework their community into a tourist resort. As far as he was concerned, this was their permaticket out of poverty. The Anishinabeg in question were understandably skeptical of this, and insisted that they were not interested in living as "guests in their own home." Being a guest is not a comfortable longterm state, so this seems like a reasonable point. By then they had had over twenty years to observe how the newly designated "national parks" worked, as well as other tourist sites further south, and so the Anishinabeg were well aware that step one in making a tourism site in canada is removing and distancing Indigenous people from it. Which is troubling, to put it mildly.
Then, in my rereading of Inga Muscio's autobiography of a blue-eyed devil, I happened on this brilliant summary. "The tourism industry caters to fully entitled people from wealthy countries who believe they have the right to enter into a poor, 'exotic' nation on the planet and view it as a source of entertainment – a backdrop for their lives." Clear, succinct, and it draws out the oddness of the travel ads I have seen, and brings out what sort of "experience" they are marketing. And make no mistake, it is an experience that is being marketed. The place, wherever it may be, is always set as the background, in front of which the putative "you," represented by whoever the anticipated demographic representation includes for the magazine, website, or television programme, does whatever sort of touristy thing. Emphasis is generally on being waited on hand and foot, or on accessing places curiously devoid of anybody who actually lives there, unless you are being waited on hand and foot. Since I am not from a background where being waited on hand and foot is something I expect, or am comfortable with now that I am far beyond the age of toddlerhood, this is weird to me. Even the ads that emphasize adventure in the form of mountain climbing and so on still insist at the least on the absence of any locals, unless they are bringing you hot drinks or minding the equipment.
I am far from unique in my leeriness of tourism on these counts. The burgeoning interest in working and volunteering abroad indicates that many people would prefer to at least get to know a new place beyond the tourist areas, which in my limited experience have an eerie sameness, if not to give something meaningful back to the communities who host them, shifting from a parasite-host situation to a symbiote-host sort of situation. That seems at least potentially more ethical, especially if the work or volunteering does not in effect serve as a vanguard of or further support for colonizing other people.
There's the rub, as Shakespeare said through Hamlet.
I don't think the tourism/travel industry is retrievable as currently put together, if at all. People who seem strange to "us," whoever that may turn out to be, have no obligation to be our entertainment or serve as the soothers of our troubled brows and rejuvenators of our tired souls. Just because people have been driven into an economic corner by being driven off of their land so that they are now working in "the service industry" doesn't mean that using the facilities they work in helps. I suspect they'd be happy to see those of us who can taking the time and effort to oppose the forces that unjustly drive them away from their original livelihoods and homes. (Top)
The Dubious Wonders of VR and CGI (2017-12-10)
More years ago than I like to think, my youngest sibling was caught up in the local kid-fever fixation on the dvd release of Finding Nemo. The latest overwrought big movie corporation cloyfest just didn't interest me at the time, even though Ellen DeGeneres was a member of the voice cast, which carries all sorts of interesting socio-political messages. Nevertheless adults were extolling it too, raving about the quality and realism of the animation. That quality remains impressive today, although I did find myself a bit baffled by some of the excitement about it. Why spend so much human effort and computer equipment to reproduce, well, reality? It seems absurd on its face. I can completely see how generating an extraordinary character like Gollum in Peter Jackson's Lord of the Rings trilogy demands all that effort. There is no being like Gollum on this Earth, and his role is so important that the audience absolutely must be convinced that he is a real entity the other actors are interacting with. In the context of a big kids' cartoon movie, evidently real clownfish, nurse sharks and so on are not going to look vaguely anthropomorphic and move their mouths and gills to match the called-for dialogue on their own. I even appreciate the desire to press the animation effort to its absolute limit, because that is also recognizable in the many ways of producing realistic art. It's the follow-through, the attempt to generate what is clearly hoped to be fuel for visually impressive "virtual reality," that leaves me cold. Reality is already here, after all.
In the burgeoning "virtual reality" and effects industries, there are also offshoots engaged in pursuits less immediately tied to traditional games. For example, a few web searches on "digital humanities" and "games" will turn up examples of custom levels for established videogames as well as independent games and simulations. More than a few scholars are working hard to produce virtual versions of archaeological and historical sites, both to make them more accessible and to divert people from visiting the real thing in person. But to me these are actually even more questionable efforts. They sound wonderful at first, because having a cool place trampled into dust by tourists is obviously bad. Yet the people engaged in these projects are not residents of or otherwise connected to the people in those places, most of whom are subjected to the tourist hordes as a product of cultural colonialism on top of the destruction of their original economies. There is no "right" for outsiders to experience these places. Even if we manage to leave such socio-economic aspects aside, the development focus is nonetheless apparently on greater and greater photorealism. Not only does that exclude anyone with low, partial, or astigmatic vision, the equipment needed is expensive and demands a significant space commitment.
The game thread is an especially fruitful one to follow, because the vision of the future of media that is hegemonic right now is often presented in terms of the holodeck introduced in Star Trek: The Next Generation. The root assumption seems to be that the compelling games of the future must be visually convincing. Except, it is amazingly easy to find stories of young children who have received elaborate and expensive gifts, only to be completely enraptured with the boxes they came in instead. Or, in the case of families with far less disposable income, of the children who create remarkable and absorbing games from, yes, empty boxes and other useful things they are allowed to pick up. The adult version of this includes the experience of reading and imagining well built secondary worlds in novels, or playing dangerously, deliriously addictive text-based games like Nethack, as Athena Andreadis has noted. The key in all these examples is that the people taking part are able to take advantage of the remarkable capacity between our ears before adding in other tools or machines. We humans are really good at this stuff, we love to play, and it's about far more than just what we see.
This is the challenge for video game studios. They aren't just coming up with games that call for wands, dummy guns, and fancy pads we jump up and down on in order to get into the fitness market and respond to rising levels of obesity in heavily industrialized countries. The artists and coders in those studios know better than anyone that if they are going to keep up with the amazing wetware between our ears, they need to somehow match LARPing, aka the sometime adult translation of "let's pretend." This is no easy remit, and it isn't helped at all by another root assumption: that whatever else "virtual reality" and realistic animation are used for, they must be embedded in a quantifiable, consistent product that can be marketed and sold for maximum profit. How sad is it that all this technical and intellectual wizardry is going towards a modern version of the probably apocryphal statement attributed to Henry Ford, "You can have any colour of car you want, as long as it's black." Not that what the modern movie and game studios are producing isn't far more interesting visually than a black car. I am a happy collector of director's cuts with making-of documentaries included on extra discs for all the movies I love best. But much as I enjoy those movies, I am not a participant in them. In games I am not allowed to participate so much as walk along a certain number of restricted and predefined tracks. There are always exceptions on the game side, but they are not so common.
Let's try framing "virtual reality," special effects, and elaborate animation differently, perhaps even heretically in comparison to the marketing hype. They are amazing things, techniques that brilliantly fool the eye and occasionally the other senses. But they are not for allowing us to have life-like experiences in fictional worlds or recreated pasts. They are a response to the same problem that advertising poses for advertisers: the necessity of coming up with something ever more whizz-bang and impressive so that people will keep buying. This is too bad, because it routes us into a cycle of brief blips of excitement followed by longer periods of jaded boredom. Admittedly this is anecdotal, but now when I listen to my friends rave over the latest sci-fi movie or programme, they spend hardly any time on the effects or the animation, and focus on the story. They are starving for three-dimensional characters and well-plotted stories they can care about. And we all yearn not for sitting on our butts in front of screens watching movies and playing games, because we have lots of those, but for participation in fictional worlds where we can have fun and learn about ourselves in the world in less risky circumstances. In other words, most people are really yearning for permission to play.
How disgusting and unjust is it, that even the people who have the option to play available to them think they need to beg permission to take it? Then again, if they did just take it, they wouldn't spend long tolerating the fact that so many people don't have the option because they're too busy struggling just to survive. (Top)
Like A Lumber-Room (2017-12-12)
As I go through notes and some other items while the end of the year approaches, among my revisited blog posts are two by Bret Victor, The Web of Alexandria and The Web of Alexandria Revisited. These are 2015-era posts, not too old but not too new, and it seemed to me that really I had already written seven idiosyncratic posts of my own about the web, so could there be much more for me to write? Except, as is the nature of rhetorical questions that beg the answer yes, the concerned thoughts of a friend who also writes extensively on-line kept bringing my mind back to the problem of the web and memory, which Victor draws out so well. Victor focusses specifically on the web's current failure to remember what it should and forget what it should, because the medium itself is malformed. I have to agree with that assessment. My friend is wrestling with the headaches inspired by wordpress, which is at once wonderful and awful, because it is not trivial to haul all the various posts and associated pictures and other bits out of a wordpress site. Unlike more old-fashioned sites that don't rely on database software to reconstruct posts on the fly, there is no single folder he can just grab and back up or move to a new server. That isn't just a web memory challenge, it is also a type of inappropriate lock-in that no one should be subject to anywhere.
I agree with Victor's point that the web specifically as a medium is malformed. Figuratively or literally shouting at people for posting appalling or embarrassing things on-line, and then claiming that if those things eventually destroy their careers and relationships, that was just their fault for posting it, is simply bullshit. These are not consequences anyone could have imagined to begin with, and if they had they would have been denounced and probably medicated very quickly as paranoiacs. When a medium is inherently poisonous for some uses, then yes, the medium is at fault. The uses that go wrong reveal the fault. Originally "the web" was simply what Victor refers to as a "conversational medium." Literally a bunch of geeky scientists chatted about their work and swapped papers and notes over it, a sort of more usable extension of the fax machine and document delivery services, if you like. And this was completely fine. Until two things happened. First, the Mosaic web browser was set loose on the world, which was good. Web pages with graphics delighted the original user base and made the web that much more interesting to more people. Second, a bunch of rentseekers looked and said, "Hey, this could have mass appeal. We all know what that means don't we? Advertising! The equivalent of cable television! Somebody else will do the real work, and we'll take the money." Perhaps this is a cynical depiction, even hyperbolic. I'm not sure of that, though.
The second part is especially important, because as soon as rentseekers get in the game, the big thing they want to bring in is subscriptions, and if they can't get that, or even if they can, they want ads. But to justify the ads and the subscriptions, they need to be able to prove that they actually bring in money, which means they have to track how many times the ads are viewed and how many subscriptions are sold. At least with subscriptions, the tracking is built into the model, and both parties to the transaction know what is going on and why. The ad networks have spiralled into a never ending mess of spying, malware spreading, and identity theft. There are basically no established rules or protocols about ads the way there are around what can be done with subscription data. It's a free for all, which has exacerbated the web's despicable memory. We can only hope that it is more like a lumber-room, in which, as Tolkien's memorable phrase via the character of Gandalf describes it, the "thing wanted [is] always buried" and by implication hard to get, and less like whatever weird information dystopia google is cooking up for everyone who isn't a tech executive.
Unethical and uncontrolled surveillance and spying is a critical malformation of the web as a medium, whether or not the idea of it being a conversational medium wins out. Funny thing is, so-called social media companies seem to agree, because they have been moving conversation off the web and into their hyper-surveillant playpens. It's just that they don't see the same cause of the malformation as I do.
As I already noted, one of my friends is dealing with the problems raised by the web being a frustrating publishing medium. It is not robust, as Victor points out, because the maligned and praised copies are in fact ephemeral. If this seems hard to believe, crack open the internet archive and examine any website you know well in its archived form, going backward through the captures. The further back you go, the more is missing, and the gaps are often catastrophic for the oldest captures, which may only include the index page without even its graphics. At one point – I have not yet verified whether this policy remains in force – if the site was on its own domain and eventually the domain name stopped being registered, the captures would be deleted. My friend is less concerned about his site being on the internet archive than he is about not losing his work and being able to move freely to a new hosting server if he chooses. A key aspect of the malformation in this case is alas, the prevalence of blogs and the underlying blogging software.
Let me be clear, blogs are good, I follow my fair share of them. And while I don't wholly agree with the current manner of supporting blogging, which disempowers the bloggers almost as much as it empowers them because blogging software makes it so difficult to look at and learn from the underlying code, I get that blogging platforms have opened up the web to a lot of people who otherwise might have felt unable to contribute. But it is an obnoxious decision of the software writers and blogging platforms to have no option or no easily found option to "export my blog as a static website." There are ways to do it if you are more technically inclined, but that's ridiculous. If a key value of the blogosphere is making it easier for people to participate and not to lose their blogposts, then it should be just as easy for them to download the lot periodically in order to back it up. Or as easy to delete it completely. That begins to get closer to Victor's suggestion that it would be better for the publishing medium hosted on the web to take on the feature that dna and books share: "...every reader gets a copy, every reader keeps their copy."
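For what it's worth, the more technically inclined route usually amounts to crawling your own blog and saving the rendered pages, which is what tools like wget's mirror mode do. As a minimal sketch of the same idea, here is just the link-harvesting step such an exporter needs, using only Python's standard library; the class name and the example blog address are invented placeholders, and a real exporter would still have to fetch each discovered page and write it to disk.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects same-site links from one page of a blog, the first
    step of a do-it-yourself static export. (Hypothetical sketch.)"""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.base_host = urlparse(base_url).netloc
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and value:
                absolute = urljoin(self.base_url, value)
                # Keep only links on the same host, so the mirror
                # stays inside the blog itself.
                if urlparse(absolute).netloc == self.base_host:
                    self.links.add(absolute)

page = '<a href="/post-1">One</a> <a href="https://elsewhere.example/x">Away</a>'
collector = LinkCollector("https://myblog.example/")
collector.feed(page)
print(sorted(collector.links))  # only the same-host link survives
```

In practice something like `wget --mirror --convert-links https://myblog.example/` does the whole job in one command, which only underlines how reasonable it would be for blogging platforms to offer the equivalent as a single button.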
The question of how this all gets paid for is the usual excuse given when a person wants to claim that really, we all opted into constant surveillance and ads because we don't want to pay for anything on the web. To which my first answer is, who is "we"? In fact, every one of us who is on-line actually pays a great deal and repeatedly, for the infrastructure that supports the web in the first place. Don't let the ISPs and the rest of the pundits fool you on this point. If you have paid or are paying a phone bill (cell or land line), cable bill, and/or internet bill, you have paid in. Loads of us contribute to the web for free via all manner of things, from blogging to work like new approaches to archiving that offer alternatives to the malformations of the medium. A struggling but also growing alternative infrastructure for supporting websites via one time payments and different types of subscriptions that eschews ads is in development right now. For my part I'm cautiously optimistic about how that is going. I have no sympathy for the sites and companies crying that they need ads to survive. That means they don't offer enough good stuff to persuade people to support them. Hate to break it to those folks, but you don't get money just for showing up.
To me it seems that if we draw back a bit more, the deeper malformation of the web derives from an apparent expectation that it should be all things to all people so to speak. That is, that it should be a conversational medium and a publishing medium as pointed out by Bret Victor, as well as a golden goose and a way to somehow prevent bad guys from doing bad things by just pushing a button. This is transparently ridiculous. I do think that it would be worth separating the conversational medium from the publishing medium again, immediately if not sooner. How to make the conversational medium appropriately ephemeral is already clear via the various privacy-respecting initiatives in email and "social media" that are not run by advertising companies. This is not going to create problems for law enforcement. We humans are quite capable of remembering, even if the web has forgotten, that there were many successful investigations before the web became a more widely accessible thing, and those techniques still work. The publishing medium is already available in principle as well, via a range of growing archiving projects and more importantly, peer-to-peer file sharing. Whether everyone is comfortable or not, if "we" really do want a web based publishing medium, then peer-to-peer or some close relative is what provides the robustness that currently the web lacks – even the surveillance web. (Top)
Disagreeing Agreeably (2017-12-05)
I probably should have realized that this topic would be all over the various "business books" and associated websites. There are also of course many obnoxious and condescending memes available to the effect that you can disagree with the bully all you like, he is still going to beat you up, and the rest of it. However, these two main types of search engine results don't match the perspective I have on the matter very well. They remain wedded to the assumption that disagreeing is a sort of competition in which one party must win at all costs, and somebody will win eventually. It's just that there may be more or less pleasant stalemates in progress before one party finally breaks. This would all be perfectly fine I suppose, if every question of every kind at all times and places was absolutely black and white. The world's stubborn refusal to be simplistic in the manner of a spaghetti western or twenty minute commercial poorly disguised as a children's cartoon was already covered a bit in an earlier thoughtpiece, Lack of Knowledge Is Not Objectivity, so rather than rehash all that, feel free to check that one out and I will describe the different perspective I have on disagreeing agreeably here.
More often than not, disagreement seems to me to be not a zero sum game but a set of different hypotheses about how a particular situation works, how it should work, and what we should do next. Indigenous nations in the americas have a tendency towards approaching disagreement in this way. For example, the Haudenosaunee scholars I have read and listened to explain that at the end of a council, the resulting consensus did not mean everyone agreed. It meant that they had worked out an approach that everyone could live with, at least for the purposes of dealing with immediate needs. The issue would go live again after trying out that approach and observing the results that followed. This is by no means just a Haudenosaunee thing. Anishinabe and Nehiyaw scholars describe a similar approach, and add that if a significant portion of the community simply could not join the consensus, they had the right to quietly leave and found a separate community, pursuing their own path. This is no failure, because it specifically avoids falling into civil war, and it expands the number of options being tried out. The more approaches people are able to try, the greater the likelihood that somebody will hit on the solution that really works. And, on a less happy note, the less likely it becomes that nobody will survive to try it.
I had a work colleague who would often pause a debate in meetings by asking, "Okay, but is this a hill to die on?" In a way it was a non-Indigenous way of asking, "Can we find a consensus that allows us to move forward by letting this specific item rest for now?" This sounds great, except that often meetings of this kind are in a context where the "for now" never happens. The apparent request for a consensus is really a quiet demand for submission, because there is in fact no means to return to the issue and work on it properly. This is rarely the right answer, but it is so embedded in non-Indigenous decision-making processes that people often feel driven into treating the question as exactly what it is being artificially forced into being: a zero sum game. This doesn't work. It steamrollers through short-term plans for particular parties while denying others any recourse ever again, when that is the opposite of the actual desired end result for most of the people participating.
Being an Indigenous way of doing things isn't the only reason to prefer a hypothesis-and-experiment sort of metaphor. It is also more forgiving of human frailty, both in terms of encouraging small-scale changes and a step-wise expansion if any, and in not calling people down for choosing an approach that didn't work out as well as hoped. The choice of a plan that didn't work does not necessarily indicate that the people who chose it have a bad character or lack intelligence, for instance. Of course, unexpected results may also reveal unacknowledged motivations that may be more or less problematic to the neighbours, and even to the people themselves who took that particular option. There is more than one person on-line and off who writes about the many ways we humans can be unconsciously misinformed about what actually motivates us, or find it difficult to state clearly what motivates us rather than what we are told should motivate us. Real life is tricky enough without insisting on ways of dealing with it that make it that much harder.
Originally, this thoughtpiece stopped at the end of the previous paragraph. Then I got to read the new, revised edition of Inga Muscio's autobiography of a blue-eyed devil: my life and times in a racist, imperialist society (seven stories press, 2014), and found myself reading two pages in particular concerning two men who had fled persia across the period of the overthrow of the shah, the 1978-1979 revolution, and several years afterwards. It is better to read the whole book of course, but let me quote two snippets from pages 121 and 122 that reiterate how important, and how precious, the ability and space to disagree agreeably are.
Though the two men existed on completely opposite sides of the political belief spectrum, they got along fine. They debated, argued, and loved each other. (121)
I do not know exactly why Peri and Parisha's dads got along. From what I have gathered, it seems that they put great value on their freedom to disagree with one another, and understood what a huge luxury this is. (122)
I don't think that being able to disagree agreeably is a luxury, in fact I think it is a longterm necessity as already noted. Yet Muscio's greater point, that this is a precious gift that should be appreciated, and that oppressive circumstances are marked precisely by it becoming impossible to disagree agreeably for reasons of safety and survival, is one I agree with wholeheartedly. (Top)
Lack of Knowledge Is Not Objectivity (2017-11-28)
The title of this piece may sound a bit odd. After all, it may seem obvious that lack of knowledge is not objectivity but ignorance. My trusty OED helpfully explains that "knowledge" derives from the verb "to know" in the sense of "being aware through observation, inquiry, or information." Meanwhile, knowledge itself is defined as "facts, information, and skills gained by experience or education; awareness, familiarity." Well, that seems commonsense enough, and the definition of information refers to facts or even mathematical quantities à la information and signal theory. Yet over the past several years there has been considerable slippage between the terms "knowledge" and "information," as more and more databases are vacuumed out of every possible thing we can imagine doing online and more than a few things we haven't, and as plenty of data originally gathered in pre-web if not pre-internet days is digitized and resold. Impressive or terrifying (or both) as massive online databases and related data stores may be, they aren't knowledge. They are just mute piles of information. And to the growing cost of more and more people, mute as those piles may be, they are not neutral either.
Cathy O'Neil in her book Weapons of Math Destruction has added to the powerful chorus of critics reiterating the old hacker adage, "garbage in, garbage out." If the data goes in with a racist or sexist bias, then racist and sexist stuff will come out when it is put to use. Even the big player data brokers and advertising companies aren't immune. Somehow amazon.com never did figure out how to stop offering me vacuum cleaners and kitchen appliances because my name was flagged "female," in defiance of my actual book buying habits and the fact that I am disinclined to order random bric-a-brac from it and never have. Call me crazy, but I have stuck to treating amazon as an on-line bookstore. Sure, the data and the program that crawls through it doing whatever mysterious algorithmic balancing act is just following the rules programmed into it. But people programmed the rules, and in this case the rules encoded a pile of sexist assumptions about what women must want to buy, rather than developing suggestions based on my previous purchases. The book recommendations were generally a bit better, because they were shaped by the books bought by other people who had purchased the same ones I had. In other words, those recommendations were shaped by real data, even if still shakily applied.
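The "garbage in, garbage out" point can be sketched in a deliberately tiny toy example; all names, items, and the stereotype table here are invented for illustration and have nothing to do with amazon's actual systems. The same recommendation step produces very different results depending on whether the rules fed in encode a demographic stereotype or actual purchase history:

```python
# Toy illustration of "garbage in, garbage out" in a recommender.
# All data here is made up; this is not any real system's logic.

purchases = ["linguistics text", "history of science", "poetry anthology"]

def recommend_by_stereotype(flagged_gender):
    # Biased input: hard-coded assumptions about what shoppers flagged
    # with a given gender "must" want. Biased rules in, biased results out.
    stereotype = {
        "female": ["vacuum cleaner", "kitchen appliance"],
        "male": ["power tools", "barbecue"],
    }
    return stereotype[flagged_gender]

def recommend_by_history(purchases, catalogue):
    # Data-driven input: suggest items that co-occur with what this
    # customer actually bought. Real data in, relevant results out.
    return [item for item, bought_with in catalogue.items()
            if any(p in bought_with for p in purchases)]

# Hypothetical co-purchase records: item -> things it is often bought with.
catalogue = {
    "etymology dictionary": ["linguistics text", "poetry anthology"],
    "vacuum cleaner": ["kitchen appliance"],
}

print(recommend_by_stereotype("female"))           # ['vacuum cleaner', 'kitchen appliance']
print(recommend_by_history(purchases, catalogue))  # ['etymology dictionary']
```

Both functions are "just following the rules programmed into" them; the difference lies entirely in what people chose to feed in.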
This is the easy example, though. One of the harder ones goes beyond the question of computers and supposedly neutral "scientific measurement." There are times when both information and knowledge that we humans lack may enable us to make a more fair decision than otherwise. However, these times seem to be very few. Despite worrying at this thoughtpiece for some time, the main examples I have found so far that can't be easily trounced are marking written work in an educational context, and making decisions in a legal or regulatory context. The latter tends to inspire considerable confusion, because greater knowledge of the potential factors impacting criminal behaviour that might lead a judge to choose a milder sentence or demand jailtime plus rehabilitation rather than merely locking up someone for as long as possible is so often equated with bias by people of varying political persuasions for varying reasons. Alas, the reasons are rarely considered ones, and mostly have to do with assumptions about how a crime is always worse if committed by someone who is poor, and the lightest sentences of all should be reserved for the rich. This is where the ideals of objectivity and impartiality seem the furthest away, because it is undeniably difficult to overcome our personal feelings and interests.
Where the question at hand is not a simple one with a yes or no answer, it is very easy to be fooled into thinking that the less the decision maker knows, the simpler and better the decision. Usually the fooling starts with a bad analogy, such as relating the source of the complexity to having too many choices, as when it is hard to choose a consumer product because in north america there are twenty thousand versions available with little to no truthful information to rank them according to the task or need they are meant to fulfill. All this while leaving out the question of whose task and whose need. But this sort of choice, which in late capitalism is carefully abstracted by "the market" from social and environmental ties in the hopes of persuading us that consumerism is only about us, is possibly the worst analogy for complex ethical decision-making we could draw. And of course, it falls apart as soon as we learn a bit more about where the items we are deciding whether or not to buy come from, how they are made, and so on. On the other hand, an abstracted "ideal consumer decision" is a great example of one that can seem impartial in all sorts of ways, as long as we don't let ourselves learn anything else. But it isn't a neutral decision when we opt to buy bananas all year round via a major multi-national corporation that treats the agricultural workers who gather and pack the bananas like slaves. That we might not have known that when we bought the bananas doesn't make us neutral. It makes us ignorant, and if we are lucky, the less nasty kind of ignorant that comes simply from not knowing, not the horrible kind that comes from refusing to know or refusing to act on what we know.
So neither personal lack of knowledge nor delegation of decision-making to machines can create impartiality for us, or somehow magically create impartiality out of thin air. Being impartial is defined in that dictionary of mine as "being fair and just." For better or worse, often what is fair and just is anything but treating everyone exactly the same. This is another mistake made by many people who complain that certain types of programs and guidelines for decision-making are not fair. Consider the people who argue against the idea that the proportion of women hired or elected should approach the proportion they represent in the population, for example, or those who protest that just because the "criminal justice system" is so good at putting people of colour in jail, that doesn't mean it has any problems. They insist the former is wrong because "everyone should be treated the same" and that the latter is wrong because people accused of crimes are already all treated the same. Well, if that were true, then there would already be a proportional number of women politicians and say, corporate executives, and the prison population would reflect the overall social make up of the greater country it was within, rather than being weirdly skewed. This is not to say that the people bewailing the unfairness of a system that doesn't treat everybody exactly the same are necessarily making an argument in bad faith, though they may be. More often, they are arguing from a position of ignorance. They don't suffer from the systemic oppressions that are at work against certain social groups, so it seems to them that the system must be working properly. But treating everyone the same can't be fair or impartial when the very system itself is not fair or impartial.
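The proportionality argument above can be made concrete with a small sketch; the numbers below are entirely invented and stand in for no real country. If a system truly treated everyone the same, each group's share of an outcome (legislative seats, prison cells) should roughly track its share of the population, so a ratio far from 1.0 is a signal of skew, not of fairness:

```python
# Illustrative only, with invented numbers. A representation ratio of 1.0
# means a group's share of an outcome matches its share of the population;
# values well below or above 1.0 indicate the skew discussed above.

def representation_ratio(group_outcome, total_outcome, group_pop, total_pop):
    outcome_share = group_outcome / total_outcome
    population_share = group_pop / total_pop
    return outcome_share / population_share

# Hypothetical country: women are 50 of 100 people, but hold 60 of 300 seats.
print(round(representation_ratio(60, 300, 50, 100), 2))  # 0.4, far below parity

# Perfect proportionality for comparison: 150 of 300 seats.
print(round(representation_ratio(150, 300, 50, 100), 2))  # 1.0
```

The point of the sketch is only that "everyone is treated the same" is a testable claim: if it were true, ratios like these would hover near 1.0 instead of sitting stubbornly far from it.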
In other words, garbage in, garbage out. (Top)
Web Garbage Really is Hazardous (2017-11-21)
No, this is not an oblique reference to social media posts a person may or may not want to have tagging them for the rest of their foreseeable life online. Since the Moonspeaker is not a blog, there is a range of features and options that I don't need to manage one way or the other, from comments to various sorts of pingbacks. On the comments side of things, there are specific reasons, which interested readers are welcome to have a look at, why I have never enabled them. In summary, I simply don't have the time to give comments the type of care and feeding they need (and deserve) to add to the site experience and materials as a whole. That was the most I ever thought about it, and it seemed to be about all that needed thinking about, for now. Much to my surprise, it turns out that for all those bloggers out there who have lost interest in their shingle on the web, it is quite possible that their blog may be the latest platform exploited by trolls online. Apparently bots can be used to create new commenter accounts which can then be turned against other blogs connected to the same blogging software and comment management services.
I remember back in the "good old days," between approximately 1994 and 2002, when it was possible to try your hand at building a website on an officially "free" service like angelfire.com, or the original xoom.com. Many of these services didn't merely plaster the "free" site with ads, their terms of service also claimed that they owned all copyright to whatever might be posted on them, in case the things ever made money somehow. Such sites could be abandoned at will, left to clutter up the web like so many crumpled bits of paper and other biodegradable debris. Most of them vanished forever in the course of the dotcom bust, and I know that my own contribution to those lost sites is well gone. Not because I'm embarrassed about what it said, but because it was ugly enough to make a poor visitor's eyes bleed – it was built when I still had an officially four-grey but actually three-grey screen. Unfortunately, today's website debris, quite apart from its actual content, is far from biodegradable, and no one can just ignore whatever they may have set up on tumblr or wordpress anymore.
If you're using a hosting service that is "free" and therefore not likely to delete whatever you built just because you're no longer updating it, it's important to take steps to minimize its virtual plastic content. Figure out how to shut down commenting and signing up for comment accounts, especially. It would probably be a good idea to clobber any email addresses associated with them as well if they are also being left to their own devices, since if they aren't attended to they can easily get hijacked into transmitting spam and other nasties. Probably the hardest part is paying attention should it turn out that the wordpress instance or whatever has an unpatched plugin, in which case it behooves you to remove it from use on the site if it looks like it is not going to be patched, or if removing it is easier than getting it patched, which is unfortunately a very common situation. In my experience, the best places to watch for a heads up on those are Krebs on security (despite his absurd and shitty attitude about Tor and TAILS) and ars technica.
That this is necessary to be a fully responsible contributor to the web sucks. (I'm as happy to be irresponsible where I can get away with it as anyone!) After all, the hype is all about how footloose and fancy free everyone is supposed to be online, how the various "web 2.0" things are supposed to make it ludicrously easy to present thoughts and projects to the world. No strings attached! Unfortunately, this is one of those "if it sounds too good to be true, well, it's false" things. If it's any consolation, to my knowledge it is not difficult to turn untended plugins and services off, once a person sets out to do it, and there are many savvy fellow users of the same services who are happy to help. (Top)
Yes, the Apocalypse is Now (2017-11-14)
It's just not what you think. Seriously, and the best way to get clear of the usual nonsense about "apocalypse" meaning "end of the world" is to try out an actual ancient greek dictionary to see what it actually meant to begin with. Admittedly, this may not be terribly reassuring, because as it happens the word literally means "an uncovering of what was hidden," paraphrasing most ancient greek students' go-to dictionary, the Liddell-Scott-Jones Greek-English Lexicon (the 1940 edition is searchable online). For Indigenous peoples, it's pretty clear what got revealed to us: the intentions and effects of colonization by a whole lot of lost people, and unfortunately the consequences in general have indeed been the destruction of our original worlds. So practically speaking, I can appreciate why the term "apocalypse" is so often conflated with the "end of the world" concept, even without having read one of the strangest official scriptures I've ever encountered, the "book of revelation" in the christian new testament.
As the title says though, there is plenty of revelation of what was formerly hidden right now, and it isn't necessary to subscribe to a specific religion to agree that's true. Let's have a look at a few key examples that have come out over just the past roughly five years.
To be sure, this is not a complete or especially reassuring list. But it is better to be aware of these things and therefore able to decide what to do about them than to run along oblivious until the lights go out. Perhaps perversely, I actually find it rather comforting that the truth is coming out about these matters. After all, the fact that there are committed people willing and able to make the facts available and that many more people are responding with resistance to dangerous and destructive actions is the infrastructure of a different and better future. (Top)
The Reality of Fiction (2017-11-07)
The other day I ran into a pair of statements about fiction that caught me by surprise because they seemed so obviously absurd. Evidently they did not seem so to the people who said them and reproduced them in the book, which I find even more surprising. The first statement was to the effect that fiction has nothing to do with real life. The second that fiction is despised in "america," which in the context of that book meant specifically the united states. To be fair, sometimes people say and write such things for rhetorical purposes, in order to make a broader point that the unexpected statement is meant to highlight for the audience. That didn't seem to be the case for this particular book, which was engaged in a specific discussion about reality and its reflection in writing and culture. Not that there was no rhetorical point at all, certainly there was, and it consisted of the claim that fiction is actually worthless and people in the united states know better than to waste their time on it. Except insofar as we can construe the claim about americans in the united states as being complimentary and reflective of a widely shared practical streak, something has gone awry here.
So let's start with fiction. If it isn't based in real life, where can it possibly be from? Absurd as even our dreams can be, we can trace their odd imagery and disjointed narratives back to our waking lives, often without too much effort. Sometimes it takes much more effort, because we humans are such stubborn pattern matching, symbol wielding creatures. As Seo-Young Chu would remind us, even the wildest science fiction starts from a real situation or idea that has been extrapolated into a "what if" question followed through as a story. Books about the adventures of variously empowered or strange teenagers are not literally true, no, as many a Harry Potter fan will regretfully assure you. Yet the questions about responsibility, growing up, and so on that the stories explore are real enough. The author may or may not trace out answers and explorations that are convincing enough to keep us reading.
It could be argued that science fiction or novels that start with a foot briefly in "the real world" like the Harry Potter series or the Chronicles of Narnia are not fair examples. Yet when I turn my mind to trying to find a story without a toe in this world somehow, it's a near impossible remit. Even fantasy series that eschew any real world ties most often retread the same wearying storylines over and over again. The male quest in a world where colonialism and racism replay over and over and over again in various european flavours and the "other races" are frankly designated non-human still comes from snippets of known history that actually happened. The fictional version can't literally happen, to be sure. But we couldn't even make sense of it if it didn't extrapolate in some way from the real. Perhaps this all seems terribly obvious, but it's always worth a second look when one person's obvious clashes with a personal obvious.
But maybe what the person really meant was that fiction has no impact on the real world, that it is just escapism for example. This is not a new claim, and it tends to make an appearance when the person invoking it wants to call down a particular genre. Science fiction say, or the marketing category referred to contemptuously as "chick lit." It is also a nonsense claim. If fiction had no impact on the real world at all, if it never affected human behaviour, then all the people working in advertising would have packed up and moved into other careers yesterday. Granted, most fiction isn't particularly effective, or is only deeply meaningful to a few people. Yet we can all rattle off examples of fiction that has been effective indeed, in one way or another: Uncle Tom's Cabin, Lewis Carroll's Alice books, yes Harry Potter and The Lord of the Rings, Frankenstein, Wuthering Heights, Orlando. And that's just what comes to mind in english for a person who grew up in northern north america.
Frankly, I can't believe that fiction is despised in the united states either, and not just because it has seen more than one court case triggered by an attempted or effective ban on a book or category of magazines. If it were so despised, then I can't see how entertainment companies like the ever more infamous disney or lucasfilm could exist. Or hollywood and celebrity culture, the comic book industry, the video game producers, or the creeping menace of the porn industry. If fiction were despised wholesale, these companies couldn't get started let alone survive. Nobody would ever run a community theatre or take the time to learn about or practice art. Credit where credit is due for the original two statements that got me started on this though: they were certainly thought-provoking. (Top)
Troubled Mixtures (2017-10-24)
There are many topics that can be counted on to raise tempers and generate controversies in north america, and despite the many pious claims about slavery being long gone – would that it was, but it isn't, it's just being called something else – and how there are no laws against "miscegenation" anymore, the fact remains that so-called "race mixing" remains controversial. It is often referred to by other names that obfuscate the racist underpinnings of mainstream anxiety about it. How long this will continue to be the case with the mainstreaming of white racism against anyone labelled "not white" in the united states remains to be seen, although racists are as happy to use euphemistic terminology as anyone with bad intentions towards those they deem "other." What we can all be certain of, whatever the favoured racist words of the moment are, is that the concern will be focussed on "white miscegenation." No other mixing counts.
This makes "mixed race" people, presumed mixed "white with something else" a perpetual source of anxiety. As J. Olumide has written, such people are either pathologized or set on a pedestal. They must be either tormented, unsure of who they are, with nowhere to call home, sentenced to an inevitably sad and degraded life, or saintly and selfless individuals who labour tirelessly at their own expense to cure white people of their racism. Never mind that racism is not a sickness, but a word we use to describe a hierarchical system of white privilege and teachings that tell white people they are superior and have the right to degrade "non-whites." Based on this rationalization the Métis Nation is still denounced by some writers as a degraded group, not a nation, that only resisted the imposition of the nascent canadian state because they were degraded. Political and historical complexity neatly overruled by the claim that Métis behaviour was wholly dictated by their inappropriately mixed "racial" heritage.
For those keeping track, that's the real meaning of essentialism, the claim that inborn characteristics, real or imagined, dictate a person's behaviour, likes, and dislikes throughout their life. Note this is not the same thing as using identifying characteristics to tell different beings or objects apart. For example, mammals can be differentiated from reptiles by such features as having hair and bearing live young. Those characteristics don't tell us how mammals in general or in specific will behave apart from the physical consequences of having hair and bearing live young.
Then of course, there is the extraordinary praise for anyone who is willing to "play indian," especially if they are willing to play the version who is a "mixed blood" out of direct contact with their putative Indigenous heritage. These folks are easy to pick out. They always come from nowhere specific, can't explain their specific and current ties that reflect their integration in an Indigenous community, and may be quick to rattle off a stream of more or less obscure Indigenous nations they believe they have some distant ancestor in. The most savvy avoid the now worn out claim that "Great great great great grandma was a Cherokee princess." If they are in the public eye, these are the folks who are happy to go along with settler moves to innocence, the last thing they want to do is make whites uncomfortable or sound angry. Nowadays it's a lot harder to get away with this shit than it used to be, as Joseph Boyden has discovered to his social cost, though he had already reaped plenty of financial and critical rewards before his façade was punctured. Neither his behaviour nor that of anyone else who tries to play this game was piloted along this path by their "mixed-race" heritage.
The ongoing anxiety about white miscegenation ties back to the discontent with the reality of ethnogenesis as discussed in Ethnogenesis and its Discontents. But that is not all it is. On reading and listening more closely to the relatively mainstream expressions of this anxiety, I was startled to realize it all sounded familiar in a cock-eyed sort of way. A version that may be a bit less immediate and so easier to recognize and read through is drawn out by Patricia Roy in the first part of her trilogy of books about anti-asian racism in british columbia, A White Man's Province. The most virulent writers claimed in the case of chinese immigrants that if they were allowed to immigrate freely to the province, they would soon out-compete and overrun the whites. Furthermore, they insisted that these newcomers were dirty and carried infectious disease. With a few small adjustments, the same claims are made again and again, in more or less vicious words, about yet another "non-white" group. The only reason Indigenous people are not generally spoken of in quite the same way is because we are assumed to be too small in numbers and therefore not any biological threat.
In other words, the loudest denouncers of immigration and miscegenation are most anxious and worried because they fear being treated in the same way that their ancestors treated Indigenous peoples. Or in some cases, their cousins in what used to be referred to as "the colonies." They can't believe that isn't exactly what somebody else will do to them at the first opportunity, and since people who think they are white have been and are engaged in violent colonial behaviour all over the world, there's a lot of worry and bad conscience to go around. This then feeds a circular rationalization for refusing to stop colonizing, refusing to put an end to systems of oppression of all kinds, because they insist that means they will be subjected to the very same things next. This circle of nonsense is anything but true, but it is good at keeping a person anxious and scared, therefore unable to think and reason. Nor are they able to see or accept that others aren't chomping at the bit, just waiting to colonize them. Is it any wonder that the two most medicated conditions in northern north america right now are anxiety and depression? (Top)
An Assessment of SeaMonkey (2017-10-17)
An alarmingly long time ago, I wrote a frustrated thoughtpiece about the sad development trends affecting Firefox, and noted towards the end that I had not had a chance to look at SeaMonkey. This has now changed, especially in light of the confirmation of Firefox's apparent determination to destroy its userbase and not deal with the ethics and problematics of forcing non-critical updates on users together with security updates. Maybe mozilla will change course on at least the second issue, although its bone-headed insistence on hijacking the creation of new tabs remains a separate, persistent irritant. I have a script work around for when I use Firefox, which is in fact faster and less intrusive than installing an extension to do the same thing. There are so many levels of absurd to that. In the meantime, on my list of possible (and likely masochistic) summer projects is figuring out how to compile Firefox myself, in order to see if I can compile a version missing at least such obnoxious imposed "extras" like Pocket and DRM. But I should get back to the real subject of this thoughtpiece, which is SeaMonkey.
UPDATE 2017-11-07 - For those wondering why I have not switched to the Extended Support Release (ESR) version of Firefox, which is limited to security updates and (hopefully) rare live updates to patch vulnerabilities, that's a good question. The answer is simply that I was not aware of the ESR channel or that it was accessible to the regular person until I began taking a much closer look at SeaMonkey. In the end when I wanted to see the ESR pages on Mozilla's site, I had to resort to a search engine to get to them. Perhaps this is only accidental, but it does mean that without a deliberate plan to look up details on the ESR, it is difficult to find.
I pondered the question of why Mozilla doesn't offer a download for Firefox minus DRM and proprietary code. Compile it once, chuck it up there, let folks decide for themselves what they want. Then those who would prefer not to have those things but are not able to compile Firefox from source with those segments left out wouldn't be left behind. No one should have to be a developer just to run software that better suits their needs. This might have been a neat thing to have tried out on the first introduction of a chunk of third-party, closed code. The number of downloads would have provided a proxy count of how strongly Firefox users generally felt on the subject. Maybe the majority of folks would have been fine with everything but a specific item. If the version bearing third party code lagged notably behind its counterpart, that would send a clear signal that mozilla could use in considering its next steps, rather than making assertions about "the future of the web" based on no real evidence. But I guess if enough people in charge of things at mozilla are afraid that the general user base might just vote emphatically for the opposite of what they are doing, especially the recent decision to add DRM, they just won't do it.
SeaMonkey is of course a fork from the original mozilla project, named for the brine shrimp that was still marketed as an ideal pet on the back of comic books in the late 1980s. Unlike Firefox, it is an all-in-one suite analogous to old-time Netscape, including a browser, webpage composer, and email. It also has a built in RSS reader that behaves mostly as many of us tend to expect these days, that is, any picture plus text is automatically shown as part of the feed. The current version of SeaMonkey is based on the Firefox 51 build, which suggests that they are not inclined to move to the new extension format any time soon. Visually SeaMonkey can look a bit rougher around the edges than Firefox, but that can be easily remedied with alternate themes if desired, just as in Firefox itself.
On starting SeaMonkey 2.48 up, the first thing I noticed was how much faster it is compared to the current version of Firefox as of this post, which is 56.0.2. Considering that SeaMonkey is a suite, not just a web browser, this was a real surprise. This may or may not have any real relationship to the absence of Pocket and the wall o' tiles of recent tabs. Flash is gone without a trace, and there does not seem to be any place to enable java, nor is there a java plug-in. My research so far suggests that the SeaMonkey developers have added in DRM, though again a person willing and able to compile SeaMonkey from source can have that module left out. More fuel for adding that to my list of possible summer projects.
Setting the home page and the contents of a new tab is relatively simple, and what to change them to from the default SeaMonkey homepage is left up to you. This also means that if you use a starter page hosted on your own computer that happens to include links to other resources on your hard drive, they work. If you use a new-style Firefox extension to produce the same behaviour, all internal links are blocked, even if the page they are on is on your own hard drive. Perhaps especially in that case, according to the Firefox development team. I understand what security issue the Firefox developers are attempting to handle with the newer style behaviour, it just doesn't make sense to me to mess with pages you create and set up yourself. It's possible to run a check on that, and pop up a one-time dialogue when a local page has been selected along the lines of, "Hey, this is a local page, it's definitely yours, right?" and, if it has local links on it, "You're okay with the local links, right? (Don't worry, you'll only be pestered like this the first time you choose a local file for your home and new tab pages.)" Not perfect, but hogtying the user "for their own good" is disrespectful, at minimum.
There are some nice touches in the preferences. There are settings to clear all data on session close, allow animated gifs to play only once, set the right dictionary, and prevent or allow third party image downloading. All of these settings can be changed from the "about:config" page, which is not that hard to do, but it is a nice and convenient touch to have them part of the regular preference set accessible from the menu. There is no option to turn on private browsing permanently, though. On the other hand, the default search engine is duckduckgo, the least bad search engine option I am aware of at the moment, another nice touch. For folks inclined to use a theme other than one of the defaults, please note that the checkboxes sometimes fail to render properly in alternate themes. The accompanying settings are not lost or anything, but if you want to change them you'll have to revert to a default theme to do so, at least until the creator of the other theme has a chance to fix the issue. There are a few nice tweaks in the user interface as well that work regardless of theme, such as being able to minimize the address and/or search bars without hiding them completely, and pop-up glances of inactive tabs on hover.
SeaMonkey has a fair-sized crop of add-ons, including NoScript and HTTPSEverywhere, two of my favourites. It is not immediately obvious, but uBlock Origin is also available for SeaMonkey, it's just that in order to install it you need to change your user-agent string to Firefox and install the version just before the most current one. This is actually not too difficult, especially with the assistance of a user-agent string switcher extension. uBlock Origin can block anything you decide to block (including awful "still loading" animations, for example, an animation for which I do not have enough "no" in my body) as well as ads based on subscription to open sourced ad blocking lists on-line.
SeaMonkey's composer may inspire some nostalgia for Netscape's old equivalent, but unlike Netscape Composer, its wysiwyg mode isn't totally crufty, though it does write rather poor html by default – HTML 4.01, no less. I looked into whether there is a way to change this, and all the reports I found had last been edited 17 years ago. It seems that nobody on the SeaMonkey team considers that an important option to add even as a check box or drop down menu in preferences, I guess because they figure if a person wants that option, they won't be using the composer. Maybe nowadays people don't use the composer to view and learn html anymore, and that also works against the option being implemented. Still, that certainly shouldn't stop a person from using it to knock together quick and dirty pages when they need to. Setting up email is a bit non-intuitive for anyone who is accustomed to Thunderbird, Outlook, or Mail. The set up wizard doesn't collect all of the necessary information successfully, probably because it needs a bit of an update. The information can be added easily via the "View Settings for This Account" option though, so that is a bit annoying but not the end of the world.
One thing that SeaMonkey does require of its users if they don't want to dive repeatedly into the "File" menu with the mouse is a minimal familiarity with the most basic keyboard shortcuts of their operating system. In tabs, there is no dismissal "x," so a person had best be sure of the "close window/current tab" shortcut. For MacOSX folks, the keyboard shortcut to return to your home page is a bit counterintuitive: it is "function + command + left arrow" as opposed to Firefox's "function + option + left arrow". It is not possible to remap this particular shortcut in either program, which has always struck me as strange. Firefox prevents you from using MacOSX's built in shortcut assignment feature by not including a menu item labelled "home" (!?) while SeaMonkey manages it by using a key combination that is difficult to figure out how to type. No idea why in either case, because it is hardly a security issue, the usual rationale for a perverse behaviour of this type, and at least for me, the idea of getting used to, say, "command + H" to go home and "command + shift + H" to see my history, or even vice versa, seems quite reasonable. This last paragraph is all nits though, not the sort of thing that puts me off a program, though it is nice to know about such curious "gotchas" ahead of time.
The toughest part of the SeaMonkey situation, however, is that it has a small development team that is very committed and very swamped. SeaMonkey 2.49.1 was just released, and they are prepping to move up to the Firefox ESR 52 codebase and freeze on it for a while except for security updates. In other words, they are prioritizing dealing with security issues. According to the most recent update blogpost, they are also looking to join forces with the Thunderbird team in order to ease their resourcing issues, as well as providing a list of areas where they could use some help. It was refreshing to see a request for hands and brains to do the work over money, actually. It's frustrating to see the most privacy- and user-respectful browsers struggling to keep going when their work is so important.
As a final note, I am aware of the Firefox variant Waterfox, which amounts to Firefox minus proprietary and telemetry code. Unfortunately, the new tab hijacking remains in place, and on my set up it interacts badly with a program I use to assign keyboard shortcuts to application-specific scripts. However, it is an excellent alternative if you are not scripting the browser, and since it runs both classic add-ons and the newer WebExtensions, you can use an extension to do that task instead. (Top)
There is No Such Thing As a Benign Dictatorship (2017-10-10)
No doubt the title of this thoughtpiece sounds self-evident, like something we all agree on. However, I'm not so sure, because the agreement that even a benign dictatorship is bad seems to vanish as soon as a particular group of people is labelled "other" in some way. It is a curious fact that othering always seems to go together with considering whoever has been subjected to this process incapable of making their own decisions, expressing their own thoughts, and generally acting as full human beings. The rationalizations of how othered people get treated always come down to the same claim, "But it's all for their own good!" It's funny how often their own good has less to do with their actual experience, and more to do with the benefits accruing to the people who have othered them. Dictatorship and benign are incommensurate; you can't have both. This is a contributing factor to why parenting in a good way is so damnably hard. Parent as dictator is not a good way to go about it, but as in most situations of unequal social and physical power, dangerously tempting in all too many places right now.
This is precisely why so many Indigenous cultures opted and opt for the slower and much more difficult mode of consensus decision making rather than resorting to "majority rule." There are simply too many ways to suborn the majority rule concept, especially again when different members of the community have very different access to social and physical power. Under those conditions, the folks who really want things their way have all too many means to enforce what they want anyway, while enforcing the appearance of majority rule. I have heard many people insist that consensus decision making takes too much time and too much work. Certainly it must be, if the consensus decision is not respected and carried out. Nobody likes to waste their time. Yet it seems to me that if we can't see a way to contribute to consensus decision making when it is available to us as a real and effective option, something has gone terribly wrong. On the other hand, I can't deny that being part of a consensus decision making process has inherent discomfort involved, because it is impossible not to have a personal investment and sense of personal responsibility for the outcome of the process. This could be the best check and balance of all though, the desire to avoid making the wrong decision, or at least taking a small step in a certain direction and seeing how that works out before making an irreversible huge decision.
I don't think it is at all difficult to show that Indigenous consensus-based decision making systems are better, though my criteria for "better" may not satisfy many committed capitalists. To my mind, producing more of an abstract fiction called "money" does not signify "better." You can't eat it, drink it, or breathe it, and it's worthless if for some reason people lose faith in the currency. If a decision making system is working well, then I would expect that the land is not being destroyed by the cumulative impact of the decisions it supports. Warfare isn't endemic and people aren't continuously fleeing from one place to another because of repeated cycles of violence. When decisions trigger warfare or terrible damage to the land, that's when the feedback mechanisms kick in to repudiate such decisions. In other words, the decision making system allows for doing just what the punchline to the bad joke says, "Well if it hurts, stop doing it then." (Top)
Ethnogenesis and Its Discontents (2017-10-03)
As is my habit when writing from a starting point on a more or less obscure word, I went straight to my OED to have a look at the meaning of "ethnogenesis" for this thoughtpiece. This led to a relatively shallow though still unexpected rabbit hole. It seems that as of 1971, not only was "ethnogenesis" not included in the OED, according to my compact edition that dates from that printing year, it wasn't a thing yet at all. The closest word to it then was "ethnogeny," referring to "that branch of ethnology which treats of the origin of races, nations, and peoples." Which of course forced me to look up ethnology, the "study of the characteristics of various peoples and the differences and relationships between them." This is something more commonly referred to as anthropology, and at least to my reading, it is not clear whether the differences and relationships are necessarily between the characteristics or the people. Both are well worth studying, after all. Then at last, in my 2010 era electronic OED, there is ethnogenesis proper, defined as "the formation or emergence of an ethnic group." It is no surprise actually that this word takes so long to get included in a historical dictionary, because until not so long ago, at least most mainstreamers in "the west" didn't believe such a thing ever happened.
Instead, apparently new peoples weren't considered new so much as degraded, degenerate remnants of "real" nations. The absurdity of this assumption is uncomfortably obvious the moment anybody tries labelling the successor polities of the former roman empire merely "degenerate groups" as opposed to, you know, french, italian, spanish and so on, all of whom have developed distinct cultures and have lived and are living their own histories. Not independent histories, because there is no such thing in this interdependent world of ours. In any case, originally this was no big deal. New ethnic groups developed and develop all the time. There was and is no requirement that this should lead to civil war, genocide, or division of a so-called nation-state. This has a great deal to do with nation-states being a very recent and unsatisfactory invention, far more recent than the processes that lead to the development of new ethnic groups.
To be clear, I am not suggesting that there was never any disruption, difficulty, or even violence associated with ethnogenesis. Initially I was going to characterize this as evidently not true and provide a few examples. Except, it turns out this is quite difficult to do, because so many of the examples of ethnogenesis evident in the historical and archaeological record are tied to colonialism. That is, some ethnic group or other with inappropriate designs on somebody else's homeland invaded and at the minimum attempted to take over. There is considerable evidence that after fighting back, folks who couldn't fend off the invaders took to their heels if they could, sometimes becoming invaders in their own turn. Whether a given community got to stay where they were established, got forced to mix more intimately with others they hadn't planned to, or ended up in another place making new lives there as a community apart, they would not be able to remain the same as they had been. So with that in mind, it is less surprising that people might have been a bit unimpressed with the cause of ethnogenesis even without the nation-state, especially if they were the ones affected most directly.
Even without invasions or other catastrophes that could enforce a sudden migration like severe climate change or an appalling earthquake or something, ethnogenesis is unavoidable. Communities change over time under the best of circumstances, and may spawn newer communities that develop along their own trajectory, becoming something different. The puzzler I like to bug friends of mine with to see what they say is to point out that really, people in england these days are not remotely like their ancestors in the earlier elizabethan period. On one hand, they still speak the same language albeit changed by time. On the other, in terms of much of their material and immaterial culture, the distance is so great that they have to study that period almost like they would another country to understand it. So are they still english, or have they become somehow a new ethnic group? The answer to the second question might seem an obvious no. But this is just where the discontents with ethnogenesis experience the worst discomfort, because the answer is not quite so obvious as would be ideal when placed under scrutiny.
We can't easily pick out a point at which the divergence between "being english" and whatever a given community is is wide enough to require they be considered something else. But we do know via more recent examples of ethnogenesis such as that of the Métis Nation on the northern plains of north america, that new ethnic groups have some specific features that confirm their existence. First, the group in question self-identifies as a community, can recognize one another as such, and can be recognized by others via distinct cultural practices. Second, they are self-governing, making up an actual polity. Nowadays this often corresponds with them being subjected to violence by colonizing governments, giving away that they are actually governing themselves. Third, the group will typically have a distinct language or language practice that is specific to themselves. "Language practice" here can mean a sort of codeswitching, where the ethnic group uses one language to speak with outsiders, and a different one among themselves. Using these features, we could obstreperously argue that the thread connecting current englishpeople to their earlier predecessors is slender indeed, made up primarily of their sense that they are not a distinct community from their predecessors except in terms of unavoidable time. Not that this is an argument that can be made too seriously, because it isn't possible to impose ethnogenesis on a group of people. This may be the biggest source of unease of all for those uncomfortable with ethnogenesis as a phenomenon, the fact that it cannot be enforced or stopped by outsiders or insiders. (Top)
Not So Ineffective Words (2017-10-01)
At the beginning of september, the ongoing saga of "free speech" took another turn as an execrable publication found itself pummelled for mocking the wrong people. Much as I sometimes disagree with Glenn Greenwald, his point that tolerating speech you don't like doesn't entail or mean that you "embrace and celebrate it" is an excellent one. Yet after reading his article, and various other lengthy discussions of "free speech," I found myself wondering what the hell it is supposed to be. Greenwald's point and Cory Doctorow's on compelled attention at the least help delineate what it is not. Another key thing "free speech" is not, is some sound that is meaningless, such as white noise. But definitions by negation always leave me uneasy, because they so often mark woolly thinking if not a refusal to admit what we actually mean. In the obligatory xkcd cartoon, Randall Munroe provides welcome partial clarity, in the context of the meaning of the first amendment to the united states constitution. All the first amendment means, according to Munroe, is the american "government can't arrest you for what you say." He's american, so I am inclined to take his word for it. It is also my understanding that a big concern of the white men who wrote the american constitution was to constrain the government from interfering with their ability to make money. I suspect we can broadly agree at least about the "constrain the government" part of that sentence, and in that case the definition provided of the first amendment is consistent with that goal.
UPDATE 2018-01-25 - The guys at TechDirt have been pursuing a long-running discussion of free speech on that blog, especially Mike Masnick himself. Bearing in mind that even TechDirt struggles with comment moderation, Masnick's most recent reflections on free speech are well worth reading. He has kept a relatively open mind on the subject, but this is the first time I have read him wrestling with the implications of a system where sheer volume and abuse are being used to block speech. It is just possible he may give more thought to how different forms of social power warp the "playing field" for the diverse people all trying to speak. If nothing else, he is very close to admitting that alas, the "free market of ideas" metaphor is bullshit.
This actually reveals quite a lot about the notion of "free speech" and speech itself. If speech was meaningless noise the majority of the time, such an amendment would be pointless. At the time some white men were writing the american constitution, among their concerns was preventing a government from abusing its ability to marshal force to prevent discussion and implementation of other forms of government than republicanism. When canadian lawmakers set to work on the charter of rights and freedoms, the ongoing effects of sectarian and racist violence inflamed and supported by what is now called hate speech were very much on their minds. So speech is a social act, and actions have consequences not least because we live within a social context, and that context never stays still. It is such an important mode of expression that people will go through incredible effort to speak at others. I say "at" deliberately. For better or worse, just because I speak doesn't mean the other person listens. Structures and relations of power affect what can be said, who can say it, and who, if anyone, a person will be permitted to speak to. This is why Indigenous people get so frustrated when some white person beaks off about Indigenous issues, because right now the power structures affecting white versus Indigenous speech preferentially amplify white speech and silence Indigenous speech. About anything.
My understanding of this owes a great deal to M. Nourbese Philip and Catherine MacKinnon, among many excellent Feminist theorists. Another useful perspective is provided by Stanley Fish, in his book with a click bait title avant la lettre, There's No Such Thing As Free Speech, and It's A Good Thing Too. All of these theorists and thinkers agree that "free speech" isn't some sort of free random noise, but a form of action guided by beliefs and politics. And this is good, especially if we bear in mind the proper meaning of the word "politics," which is "activities to do with governance" (paraphrasing my trusty OED). "Politics" has been rendered a dirty word over the past twenty years or so, which is convenient for the folks who are up to no good and would prefer everyone else to leave them to their nefarious devices. If they are left to deal with politics while the rest of us never "dirty our hands" we can hardly resist detrimental policies and actions, or effectively support the positive ones.
All of which is to say, speech is a political thing, it expresses how we govern ourselves and even whether we will govern ourselves. That's not inherently bad. What the social management of speech is though, is hard. There isn't a single rule we can apply once and for all, because life is never that simple. So periodically we are going to be faced with somebody managing to say something terrible, and we'll have to figure out what to do about it. We already know, even if not all of us want to admit it all the time, that we can't brush "something terrible" off merely as something that offends you that doesn't offend me at all or as much, and so can be left to stand. I suspect what troubles many people when it comes to dealing with a given troubling speech, if their reaction is to fall back on the cry of "free speech" rather than discuss why they don't want to curtail that instance, is that they are nervous about interrogating their own beliefs as they relate to that specific speech act. They don't want to admit that they have specific beliefs and ideas guiding what they find acceptable, because they might change based on newer evidence and experience. I appreciate that this is uncomfortable to admit since it refuses simplicity and absolutes. However, this is a feature, not a bug. We change our minds based on new information not because we are merely fickle, but because this is how we learn and survive in a changing social world. And sometimes, we'll get it wrong, just as sometimes we will mishandle a given instance of problematic speech. But this doesn't free us from our responsibility to try, or somehow make the bad slippery slope argument true. (Top)
Quixotic Columns (2017-09-03)
Like many writers and scholars who work with translations and pursue the always difficult to catch "perfect notebook," I have had to deal with the problem of dual column layouts. Also like those many writers and scholars, I have dealt with the frustrations of standard word processing programs and hybridized page layout-word processing programs when trying to actually create such layouts successfully and usably. In particular, there is the problem of reasonable support for either newspaper columns, which are read from top to bottom and left to right in a language such as english, illustrated in the upper part of the diagram at right, and none for parallel columns, which present two texts in parallel, often a text and its translation. In my quest to find a reasonable solution to my need for parallel columns especially, I travelled the boards of stack overflow, the various listserves and archives for libreoffice, even the dreaded, ad-ridden and massive page sized support pages at the micro$oft website. It was a strange and frustrating experience, because the two main proposed solutions don't actually work for more than two to five pages at a time, and even then very slowly, with terrible screen and print rendering problems.
I went so far as to try the two most commonly proposed solutions not only in libreoffice, but also in pages on macosx and even word in windows. I took a run at using the scribus open source desktop publishing software. Since I was specifically looking to implement parallel columns, I pored through the documentation to work out how to use floating frames with targeting. This does work, so long as you can stand it taking ever longer for the file to open as the number of pages increases in any of the programs listed here that support such frames. The most commonly repeated suggestion, however, was to create parallel columns by means of a two column table that could either be extended indefinitely, or else such a table could be imposed on each page separately. The first way of using a table suffers from the same increasingly slow opening and saving, let alone editing, which is perhaps not surprising. The second is completely unusable as a practical approach for anything beyond the five page limit.
As I researched this page layout problem, the level of vitriol hurled at people trying to get this feature implemented in any word processing and page layout program except wordperfect, which has always had it, astounded me. Those people were roundly abused for not writing an implementation themselves, in the case of libreoffice or its cousins. Since of course, we are all programmers with infinite time for such projects. Many self-proclaimed experts insisted that when they tried it they could make useful documents of at least fifty pages or more, and no doubt they could – so long as the pages had no text on them. It is not impossible that the reason this works so poorly in my case and that of many others is simply that my computer is older and less RAM-equipped than newer models. However, this is hardly sufficient reason to replace a computer that is otherwise doing just fine for a myriad of other tasks, including in my case, heavy duty image and sound editing. Evidently implementing parallel columns is rather difficult or at minimum considered vastly unimportant even in the open source world, where especially among the *office projects the greater focus seems to be on mimicking every possible obscure feature in ms word that no one is asking for. So after this rather disheartening journey through less and less civil or useful commentary on how to implement parallel columns successfully without using a kludgy solution such as html, I spent a few more minutes considering whether it made sense to purchase wordperfect. Alas, the answer is a resounding "no" there being no version available that works on any OS that I administer.
There is in fact, an excellent solution out there, all usable via open source software that can be run on any OS you or I may select. If you can spend about 30 minutes downloading a LaTeX distribution for your system, then add the paracol package to your set up, you may happily write up parallel columns to your heart's content. Admittedly, you may need to spend an additional bit of time learning some basic LaTeX tags, but the short Introduction to LaTeX is more than sufficient for purpose. In fact, you are even welcome to download and use the stylesheet I built to start from, which pulls together all the required LaTeX calls, plus the template I generally use. (Both are just plain text files, with extensions that tell the LaTeX rendering engine what to do with them.) Overall, writing LaTeX tags is no more complicated than writing html, in fact it is often simpler, especially if you already have a stylesheet and template that you can use. The LaTeX editor I use is called TeXShop, and like its relatives in use in the *nix and windows worlds, many of the tags can be applied by using drop down menus and even the usual key commands you might use in a word processor (e.g. ctrl-i to italicize text).
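As a minimal sketch of what such a document looks like, assuming the CTAN package name `paracol` and its `\switchcolumn` command (the Latin lines and their translations here are just filler text):

```latex
\documentclass{article}
\usepackage{paracol}

\begin{document}
% Two parallel streams: source text in the left column, translation at right.
\begin{paracol}{2}
Arma virumque cano, Troiae qui primus ab oris\ldots
\switchcolumn
I sing of arms and the man, who first from the shores of Troy\ldots
% The starred form starts a new synchronized row across both columns,
% keeping each passage level with its translation.
\switchcolumn*
Italiam, fato profugus, Laviniaque venit litora\ldots
\switchcolumn
came, a fugitive by fate, to Italy and the Lavinian shores\ldots
\end{paracol}
\end{document}
```

Because paracol typesets the columns as ordinary paragraphs rather than frames or table cells, documents stay fast to open and edit no matter how many pages long they grow.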
For those of you wondering skeptically if the time and effort required to learn some LaTeX, set it up, and apply it to successfully writing, editing, and eventually printing parallel columned layouts can possibly be worth it compared to just using software you already have and know, I can say in all honesty that it is more than worth it. The time I spent getting this to work was ultimately far less than that required just to go through the various fora I read looking for alternate solutions in pre-existing programs. This quite apart from the email queries that I pursued in an effort to wring a usable solution out of those other programs, which more than once led to an answer of "that's too hard to do" or effectively, "huh? why would you wanna do that anyway?" Furthermore, the resulting files are far smaller than their possible equivalents even in libreoffice, which generally comes out with smaller sizes than its microsoft equivalent. In any case, here is my brief write up, and hopefully it will prove useful for others trying to solve the same problem. (Top)
Anonymity Redux (2017-08-30)
Earlier this year in Anonymity is Not The Problem, I wrote up my reasons for considering claims that online anonymity somehow made it a wild west free for all to be at best bogus, at worst not in good faith. The point here is not to reassess or express altered views on the point, since the ongoing campaigns of online harassment, let alone the drive by trolling that can easily be viewed since that piece went live, have made and remade my argument for me. An extraordinary number of these attacks are being carried out openly by people with their faces attached, let alone an easy connection made to their real names if they haven't done it themselves already. Instead, I want to spend a few paragraphs on the ways anonymity can and does work differently offline. There is a necessity for anonymity offline too, and not just to allow for things like safe reporting of crimes committed by governments, gangs, and the lesser scale ones committed by individuals. It is important at minimum for discussion of and opposition to propaganda as well. Yet I have run into various descriptions of anonymity as an imposed state used to dehumanize people. This strikes me as a vast oversimplification, or deliberate muddling of thought which is endemic in this age of so-called "postmodernism."
"Anonymity can be used as an element of oppression against groups of people" is, I think, a true statement. An easy example is those who may find their names regularly cut off from the fruits of their life's work. The idea that one of the most prolific women authors is "Anonymous" is not just a bittersweet joke. But the namelessness is not the point of the exercise, separating the works from the persons is. That in itself does not dehumanize women. Women may be, and often are dehumanized, by not having their own names to start with. I wonder how many people realize that the "feminine" versions of numerous roman names, many still in broad circulation, were probably the exception rather than the rule. The eldest daughter might get the name "Claudia," say, and then if she had any younger sisters, they might be "Claudia" again, but "Claudia Secunda," "Claudia Tertia," or else just Secunda, Tertia, and so forth. Roman women likely did have their own names at least among themselves, but they have not often been recorded and are often misunderstood as "mere" nicknames. Staying a moment longer with "literature," most of what has been designated "epic poetry" and "folktales" isn't "anonymously produced" at all. They are communally created and curated works that often some outsider has written down to make some form of profit from, conveniently dropping the social context those works originally rested in.
Notice again the difference between a state chosen versus a state imposed. In a word, it is power. This difference is too often ignored in discussions of woman-only space, let alone the self-sustaining ethnic communities so regularly insulted with the term "ghetto" or "slum" regardless of the facts. There can be a range of good reasons to be anonymous in a whole range of venues, especially the maintenance of privacy and personal safety. Imposed anonymity denies the affected person's existence, and is often a preparatory step before far worse oppressive acts, and a preparatory rationalization for additional, more corrosive ideas. So far, imposed anonymity is not a reality on the internet to my knowledge, let alone the web part or the strange zone of "social media" that is neither social nor media.
According to my OED, to dehumanize is to remove the "positive human qualities." In other words, here we have an infuriating circular definition. When I turn to the word "human" instead, there is, lightly paraphrased: "of, relating to or characteristic of people," "of or characteristic of people as opposed to machines or animals, especially in being susceptible to weaknesses," and "of or characteristic of people's better qualities, such as kindness or sensitivity." The last one is kind of funny in the sense of weird, since I had learned that as a definition of "humane" as opposed to "human," but in any case, this is some material to think with. It is quite remarkable how often "being human" is practically speaking being equated not with say, intelligence, or walking on two legs and having hair instead of fur, but with some sort of weakness either literal or an imposed fiction in the context of a capitalist, slave-holding patriarchy. That is actually the opposite of what the unsuspecting reader might expect. It is a happy accident, or perhaps even a bit of accidental honesty that the meanings identified by examining how the word "human" is used reveal that being dehumanized is actually considered a positive trait because it makes it easier to behave like a perfect, ruthless individual according to the nihilistic version of liberalism currently ascendant. This does not mean that these have always been the connotations and meanings actually floating around the words "human" and "dehumanize," no doubt they have changed, especially over the past 60-70 years, in which the social meaning of anonymity has itself changed so greatly. (Top)
Have Your Straw Feminist, and Burn Her Too! (2017-08-29)
I have been mulling over this particular thoughtpiece for quite some time, ever since I returned to the world of post-secondary education and discovered that a form of groupthink had basically eaten the campus that now serves as my academic home. Not very long afterwards, I learned that it certainly wasn't just that campus, and that this particular form of groupthink had suddenly become the "social justice" flavour of the hour. Suddenly everyone has become a participant in the oppression olympics, and any kind of "exclusion" for any purpose whatsoever is an evil, except for men, funny enough, who can continue excluding whomever they like with impunity. Furthermore, an area of political thought and class analysis I am familiar with and subscribe to is currently coming through a period of peak backlash. (Don't take my word for it, have a read of The New Backlash and More Radical With Age, or if you'd rather read a book, start with Susan Faludi's original breakout book, Backlash: The Undeclared War Against American Women.) But this is one remarkable backlash, in which you can have your straw Feminist and burn her too, while insisting you must be a real "feminist" because you don't exclude anyone, denying that words can have consistent meanings when they are inconvenient to what you think, and acting as the feeling police because you are the only one who has the correct feelings. Furthermore, a grimly interesting number of impressively moneyed interests have joined in on this particular backlash, adding vicious manipulation of sex dysphoria and of peoples', yes not just women's, resistance to and discomfort with patriarchy to line their pocketbooks while claiming virtue via what is currently being called "transactivism."
UPDATE 2018-06-04 - For those who thought that the point about backlash was exaggerated, then I suggest you look up the difficulties british women are having just getting together in person to talk about the proposed changes to the "gender recognition act." A great example is Lily Maynard's report on the Transgenderism and the War on Women event held 14 march 2018. Online, it seems there are a significant number of people with nothing better to do than report women for asking questions about "transgenderism," not taking a position of any type, just trying to figure out what the hell is going on.
Well, credit where credit is due. From what I can see, this has to be one of the most incredible social manipulation jobs I have ever seen. Never have I seen so much that comes right out of Suzette Haden Elgin's analysis of verbal attacks in her series of books on The Gentle Art of Verbal Self-Defence. Tragically, these books are out of print, but they are nevertheless still often found in public libraries. Elgin was a brilliant linguist, Feminist, science fiction author, and visual artist, and her analysis of the verbal attack modes and how they work ought to be part of the currently rather ludicrous "career and life management classes" high school students are so often subjected to. Who knows, in the present environment, they could make sex education look uncontroversial, because Elgin took particular care to explain and help her readers learn to appropriately analyze "presuppositions." Presuppositions are the nasties that somebody who isn't speaking to you from a place of honesty and respect wants you to accept without noticing, while you take the bait of something overtly more provocative instead. Then while you're busy being provoked, they can claim that you have agreed to their presuppositions.
Now, in terms of Feminism, I am of a mindset similar to that of the blogger at Hypotaxis, who states, "I do not ascribe to 1) feminism means whatever anyone says it means (actually, it's rooted in some solid theory and that theory requires meaningful praxis) and 2) feminism is for everyone. Because at its root, it's not "for everyone" – it's for female human beings." Much as I respect and love bell hooks' work, this is my perspective. That does not deny allyship to people who are not female human beings. Of course not. What it does do is make it clear that Feminism is about freeing female human beings from oppression, and yes that means it does not focus on freeing male human beings. If you're going to work on opposing and overcoming oppression, you can't focus on everybody's oppression at once, because by nature your efforts will become so diluted that they will be useless. The key is which aspect of oppression you are going to prioritize opposing based on your capacities and the nature of your current primary emergencies, as the late, brilliant Andrea Dworkin noted. So there are times when my efforts are going to be focussed on opposing oppression of Indigenous people, because that is the primary emergency for me at the time. I don't have much time for that these days though, because my present primary emergency has to do with the fact that I am a female human being who is also a visible lesbian, in other words a dyke.
Obviously there are plenty of people out there willing to disagree with me vigorously on all points here, which is to be expected. Who agrees all the time about everything? I even manage to disagree with myself, simply because I gather more information, think through different issues with greater care and attention, and find that I need to revise my ideas. Which is how I eventually determined that despite the wonderful sound and shape of the word "queer" and all the neat things it looked like that word could be used for, I finally had to admit that not only did it not fit my reality at all as it has been developed, but that subscribing to the new queer ideology was doing me harm. It has become a powerful vector of liberal individualist politics, which practically speaking, when the current state of the world is considered, I think effectively comes down to everybody being allowed to punch holes in the boat in their own special way with no obligations to anyone else. Yet there are a few ideas that have stood the test of time for me.
Feminism is a body of theory and analysis that recognizes women as a class of people defined by sex and placed in a hierarchy, at the bottom, where they are expected to behave according to a socially defined gender that systematically weakens and humiliates them. We usually call this "patriarchy" for short. Furthermore, Feminism is a body of theory and praxis applied to destroying that hierarchy, because it is a system of oppression. Feminism recognizes and reiterates the point that anyone defined as belonging to, or as in any way similar to, the sex class of women is also oppressed, at least to the degree that they are seen to be endangering the hierarchy patriarchy depends on to survive.
Name calling and creation of straw persons are always markers of dishonesty and a drive to silence discussion. You can always identify the straw person by how the person who has set them up will refuse to allow any information to be shared by anyone whom they deem to be equivalent to the straw person. The key marker is always silencing and refusal to gather new information. Merely being uncomfortable is not a reason to stop a discussion or refuse to listen, so hang in there. Being uncomfortable is not the same as being threatened. The former is feeling awkward or weird, sometimes even feeling a bit of pain like when you try on shoes that don't fit quite right. It can feel pretty strange when it's your thought patterns that start to feel like they may not be fitting quite right, and it should. But being threatened is no mere discomfort. That's when you literally feel that you are in danger. Alas, the line between the two is not absolute.
This is all far more complicated than putting together a straw Feminist and burning her, with all the connotations and invocations such a description must inevitably raise – and I don't suppose it will change the minds of those who would refuse to actually engage with what is written here. Yet it is still worth the writing, to do my little bit against the tide of groupthink. It's tough to sink the boat when a bunch of us are working together to patch it up. (Top)
My original writing prompt or jab, if you like, for this thoughtpiece, was the provocative claim that "white culture is empty." If you are a racialized person, this might sound not merely provocative but rather true. On a little more thought, though, the statement proceeds to spring problems and questions as quickly and variously as an air mattress does leaks. Who is white? What does being white mean? What's white culture? Is there anything like that? Does this statement actually make sense? With that in mind, I ran a quick image search online, just to see what would come up for the term "empty," and was utterly fascinated by how many results were images like the one featured here, of relatively white rooms. "Emptiness" is regularly equated with blank white pages, blank white rooms, empty cardboard boxes, empty white boxes, blank white screens, and deeper into the results with images artists use in attempts to express the horrible non-sensation of depression. This was all quite eye-opening in a way that admittedly I did not expect.
The first thing it helped me realize is that the descriptor of the cultures of people who think they are white is, of course, not a descriptor at all, but a deflection. Certain attempts at canadian history writing are redolent with this deflector aspect, and arguably those attempts are actually at the heart of the ongoing efforts by a very stubborn subset of canadians to claim there is only one "canadian" culture, if there is any. The two main attempts go like this. The french say, "Look, we were nicer than the english!" and the english say, "Look, we were nicer than the americans!" The fed up Indigenous peoples say, "You were both horrible bastards."
In my slow way, I have been forced to the conclusion that a central value of the various implementations of culture by people who think they are white is a refusal of reciprocity. By this I mean a refusal to share or to give back. This is extremely deep, and goes right back to a denial of the necessity of death. When we die, we effectively give what our bodies are made of back to the Earth to contribute to the continuation of Life, the greater story that we're a small part of and an absolutely essential contributor to. Personally, I think this is a pretty cool thing, even though I find the idea of dying frightening and, like anyone, would prefer not to go too soon, though I'm resigned to going when it's time. But to refuse to die demands a refusal to be part of the reciprocal relations that make life, even if the rationalizations sound kind of good.
The big rationalization that the central denial hinges on in the current version of "western mainstream culture" is what can be described roughly as the ideology of "liberal individualism." This ideology includes key elements like complete independence and pulling oneself up by one's own bootstraps, the latter regularly invoked even though it is as impossible as a perpetual motion machine. It's a sort of "I don't like any of you, you all suck!" juvenile temper tantrum raised to a worldview, in which the child remains in place in an adult body and grows up to create yet another advertising company and buy a ring of houses around their own so they can sulk without anyone seeing them without at least a helicopter and telephoto lenses. If anyone would want to see them, that is, outside of the strange parallel universe of "celebrity tabloids."
A different rationalization that can sound much more persuasive for those yearning for an ethical approach to avoiding reciprocal relations, focuses on somehow putting an end to the reciprocal relations expressed via eating and drinking. For a searing, deeply respectful and sympathetic exposé of this, I can think of no better book than Lierre Keith's The Vegetarian Myth. It sounds so helpful and useful to simply stop meat eating to stop the horrors being perpetrated in factory farming. Unfortunately, agribusiness monocropping is also factory farming, and the horrors of animal torture to "test" new products won't be stopped by that. I sure wish it could. But it's not that simple, because the people who came up with factory farming and monocropping are uninterested in reciprocal relations with animals and plants in the first place, which is why they don't care about using them up and annihilating the Earth while they're at it. Plants still want to live, which is why they come up with so many amazing chemicals that can make us high, make us well when we're sick, or kill us dead.
The ultimate envisioning of the end of reciprocity is not where we might expect in the very specific, though I suspect it is in the very general. It is of course in the context of science fiction, and the example I have in mind is most famously presented, as far as I know, within the Star Trek franchise. There are definitely other examples; this is simply the one I happen to know best, and it was referenced, I kid you not, on the packaging of a brand of tempeh available in grocery stores where I live. As trek aficionados know well, in that universe (which is not so rosy even in the Roddenberry version), everybody has replicators. Those snazzy machines make whatever you like out of pure energy: food, parts to fix your ship, anything, as long as you can program it and provide enough energy. All the energy comes from inanimate minerals, magical dilithium crystals in this case, perhaps some more nuclear nastiness earlier in the trek legendarium. Perfect, isn't it? No critters with eyes or voices, no plants being messed with. It doesn't make much sense in real life, but it provides the peak of the refusal of reciprocity. People not only don't need Earth, they don't even need to grow food or collect water; they can make it all out of pure energy. Furthermore, they can recycle the trash into, you guessed it, more energy! See Do the Math for a discussion of entropy to clarify how that might come in.
I hope that it is trivially self-evident that colonialism is itself an expression of refusal of reciprocity by the colonizer. In fact, I have already written about this in a previous thoughtpiece, Refusing the Land.
Okay, so going back to the original provocative statement: the part of "white culture," or rather of the cultures of people who think they are white, that is empty is the part where the respect for and action according to reciprocity should be. And the deflection provided by "we're not like them" is generally "we did pretty much what they did, but we covered it up better." It's little wonder so many have not merely tried to distract those of us who are racialized from the real thing, they have been in great haste to distract themselves. (Top)
Star Wars Redux? Er, Reflux? (2017-08-26)
Seriously, it's not what you think. I don't care enough about the formerly George Lucas now Disney franchise to indulge in rants on it, though I am well aware of the desperate gasps of relief in many quarters online and off when the new movie didn't turn out to be horrible or something akin to the strange version of "rebooting" being inflicted on the Star Trek franchise. (Or on Spiderman – honestly, how many freaking times can they waste money remaking that thing?) Anyway, the purpose of this thoughtpiece is to spend a little more time on the question of the robot armies and who benefits from them, because yesterday, probably hilariously late, it dawned on me that the latest proposals for the uses of drones in warfare are not new at all. It seemed to me that they sounded awfully familiar, and it took me awhile to figure out why. Then, as I got deeper into an anthology of essays and speeches by Ursula Franklin, The Ursula Franklin Reader: Pacifism As A Map, the penny finally dropped. The latest drone proposals are an attempted, if not actual, resurrection of significant portions of the old and absurd "star wars" initiative of the reagan administration in the united states.
I suspect nobody was more pissed off about the media nickname for the so-called "strategic defence initiative" than the various military planners in the u.s., because it messed with their determination to present what amounted to an attempt to create a superweapon in space to exert control over the whole damn planet from on high as something much more benign. Worse, it rendered the whole thing ludicrous sounding. (There is an envisioning of this in at least one videogame I've actually played, see the EA version of Battleship for iPad.) Attempting to invoke "strategic defence" made people think of Chewbacca and C3PO instead. Arguably, this was better for the safety and security of the world at large. The nicer version of how this initiative was described emphasized a network of satellites and bases that would act like a shield over the u.s. by intercepting and destroying intercontinental ballistic missiles, which had become a greater worry than planes with people in them, at least for the u.s. military. It kind of sounds like a good idea, as long as you live in the u.s., and you don't know or fear the inevitable fallout from exploded nuclear devices, or from devices carrying nuclear material that would still be quite nasty even though neither nuclear fusion nor fission is part of their explosive power. Neither people in the u.s. nor much of the rest of the world were wholly fooled by this, even though the effort has been rerouted into a range of other projects rather than completely killed off, since nobody in the military likes diplomacy all that much and the diplomats don't have as much industrial clout.
Instead, apart from the ongoing militarization of the immediate orbital region of Earth, on top of the abuse of the internet for surveillance purposes (which unfortunately its current design does not ameliorate), we have the explosion of work on "drones." Contrary to the biologistic terminology, "drones" are of course not alive per se, although they do fit the definition of "doing no useful work but living off of others" noted in my electronic OED. The flailings by several "high tech" companies notwithstanding, "drones" are not generally useful. They are amazingly useful to military planners who are not so much unwilling to sacrifice human lives, since that has never stopped them before and they aren't too worried about non-combatants let alone whichever humans they label "enemies," as they are unwilling to suffer warfare and its fallout too close to home, including any requirement to train more than the minimum number of people to carry out the warfare. Avoiding warfare too close to home should sound familiar – this was part of the idea behind the original "blow up the icbms before they get here" logic.
To be clear, I have no argument with not having war happening in my backyard, or my neighbour's, and I would certainly prefer not to have to deal with fallout from it, most importantly the cruel displacement of millions of people, whether or not they were fighting. Once you're a refugee, it really doesn't matter whose side you were on; you have just lost the infrastructure you thought you could depend on, and your situation is far more severe and long-lasting. But neither drones nor any other absurd recreation of the former "star wars initiative" is going to help, because it takes as given that somebody's whole world must have warfare inflicted on it. For my part, my preference is for no one to have any warfare inflicted on them, and by that I don't just mean the stuff that comes from bombs and guns. (Top)
Sanctuary or Troy Cycle? (2017-08-10)
The question of cultural appropriation can be a difficult one to face up to, especially in the case of greek mythology, which, as Athena Andreadis has so aptly noted, tends to be treated in mainstream western culture as Being Part of Everyone's Furniture. It seems like there should be a way around this, a way to interact with elements of other peoples' cultures, and preferably them as well, in a way that is respectful, that does not pretend to take over the story or regalia or be "more authentic" than the peoples who actually live the cultures. This can be a tough remit for people who think they are white, who have effectively been trained to treat whatever they see, let alone whatever they may find in any way pleasing, as something that they can do whatever they like with. An interesting question then, is whether the creators and writers for the television programme Sanctuary managed this, by trying to work with what looks suspiciously like one way we could characterize the plot of the Troy Cycle: what happens when a powerful, marriageable woman in a hyper-patriarchal culture has too many suitors?
Not that a tie between Helen Magnus, the central character of Sanctuary and Helen of Troy/Sparta, whose centrality to the Troy Cycle is contested, is immediately obvious. At first I thought the idea rather silly, and that I was having a bout of pattern matching on random or relatively random data, which we humans are incredibly good at. A quick websearch on the term pareidolia will turn up at minimum thousands of other examples, often hilariously mundane and associated with religious imagery. However, there is actually real evidence for Helen Magnus being a reflex of her ancient greek counterpart in the story sense, though I have never heard or read anything by the show creators that suggests this was their intention. There are eight major things that jumped out at me when I thought about the possible parallels a bit more.
Of course, there are many more contrasts than the ones that crept into this quick list, such as:
It seems to me that this is a much better way to engage with someone else's culture in principle, even if the result was mainly accidental. It is not impossible that this engagement is more deliberate, considering the settler state of canada has been dealing with an unavoidable and growing resistance movement against appropriation of Indigenous cultures, and a growing discomfort with the federal attempts to render any "foreign" culture into food and weird clothes for the entertainment of anglo-saxons. (Keen and curious readers may want to read Franca Iacovetta's Gatekeepers, especially chapter 4.) The tough part of course, is in the application. (Top)