FOUND SUBJECTS at the Moonspeaker
They Obviously Don't Mean It (2020-08-24)

The title of this thoughtpiece is indelibly associated in my mind with a funny story recounted by a late friend, who was illustrating the particular clear-eyedness of children before they learn that adults often want them to close their eyes instead. We drive that clear-eyedness out at our desperate peril, because recognizing when another person's speech and/or actions are sincere is not optional. There is no hyperbole in acknowledging that this ability is literally a matter of life and death. I appreciate that as in all things, we must apply our critical faculties in moderation. It will do us no good to reduce ourselves to paranoid balls of mush, although that would certainly be helpful to anyone who is up to no good and not a paranoid ball of mush in their own right. Bear in mind too that a paranoid lump of crystal is not necessarily better, though it may be even more dangerous to those immediately around us, because then our sharp edges will likely be turned against those within reach, not those who are dangerous. Long ago I ran into a purported chinese proverb – it likely is a chinese proverb, but the trouble with orientalist sources is that no matter how accurate the sentiment, their attributions cannot be taken at face value – stating that to behave sincerely with the insincere is dangerous. Troubled attribution aside, this warning is all too true.

Regular readers will already be familiar with my skepticism of canadian claims to "reconciliation" with Indigenous nations on Turtle Island. I find it hard to understand how anyone can look at these claims with anything but rank disbelief, because the proof of the pudding is in the eating, and the pudding remains as rank as it ever was when it comes to colonial-Indigenous relations. The various governments in settler colonial states, including canada and the united states, have over the past several days, let alone months and years (and decades, and now several centuries), continued to piously declare how they want a new relationship, while calling out whatever form of force they find convenient to push through the latest exploitation project they have decided they want. The new relationship is always right there and ready to be had, so long as Indigenous nations, communities, and individuals agree forever to give up any right to say "no" and have that "no" respected. Otherwise, overwhelming force always comes next, along with the imposition of "governments" constructed as colonial remote offices, whose officials are accountable not to those they supposedly govern but to the colonial power, such as it is. The process of resolving issues always comes down to "heads I win, tails you lose," in which a particular right will be recognized in a colonial process on condition that recognition is tied to permanent denial of that right, from the moment of recognition into the future and into the past, whenever colonials find that convenient. Politicians of all stripes loudly denounce such supposed causes of economic and social distress as "illegal immigration" and "theft of intellectual property" while declaring that of course they and their cronies will fix everything. They insist that whatever cure they are peddling will do the trick, no matter how often the evidence from eating the pudding shows that "austerity" does not work, that scapegoating people forced to immigrate by wars perpetrated against their homelands does not work, and that brain draining as much of their homegrown talent as possible does not work.
The wild screams about "theft of intellectual property" strike me as especially funny in a backhanded way, because the fact that "europe" got most of its temporary advantages by stealing what europeans and their descendants now piously refer to as "intellectual property" is quietly ignored. The screamers are that confident that nobody knows or cares about history. But even if they did, they don't mean to solve the problems that those of us on the pointy end of economic and social distress have in mind. They mean to solve their problems of maintaining their own power and advantage, at any cost, including the cost of keeping an ever tighter bubble around themselves. I've rehearsed a lot of this stuff before, so that's more than enough of a recapitulation. And besides that, what I've said is hardly unique to me, nor is it just something said by "leftists" or "academics" or whatever indirect swear word a person may wish to choose in a hyper-regressive time. In my anecdotal direct experience, my friends, whose political leanings fall across much of the so-called spectrum, including those who agree with certain talking heads in the united states that russia and iran are at the bottom of every problem in the world, say much the same. I have seen and heard pundits and politicians in the mainstream media saying basically the same thing, at least about the other guy: "So and so is not sincere!" If we have such a remarkable consensus across social, economic, and political lines that there is so much lying and redirection going on, it raises the question of why so few are getting fed up and refusing to play the game. Then again, I saw the oscars for the first time in years this year, and got treated to the weird sight of actors and the rest fawning on Jeff Bezos because he has stolen so much money that he can get into basically any party on some pretext or other. There are two implications of this, at least. One, that a whole lot of people have given in to total cynicism on this. They have concluded that there is no other game, there is no future, and they may as well stick to doing what they can to hustle. It puts me in mind of the Thénardiers in Victor Hugo's sprawling novel Les Misérables. I am not unsympathetic to those characters or those having parallel experiences in real life, because the difficulties of resisting despair are not to be sniffed at. We have to keep refusing every day, and that is very hard when of course we would love to be able to refuse once and for all. Two, there are a whole lot of people who have indeed walked away from the invidiously rigged game. Our challenge is to not let ourselves be fooled that there is no one who has done this, just because the usual mainstream sources, and even many of the not so mainstream ones, generally don't acknowledge that these people have made the decision to stop feeding the beast. Listen out for accusations of apathy and childishness, because those are actually moments when their refusal is unavoidably recognized and desperately papered over.

Footnotes and Endnotes (2020-08-17)

On average, not too many people delve into the footnotes and endnotes they may encounter on a day to day basis. The number of potentially excellent reasons for this is high, from not having much time to doubting their real relevance in light of what the maintext is like. Yet it seems to me that we are also encouraged to look down on this type of textual assistance and annotation because the information is not central and therefore not part of the maintext.
This lends them to hilarious geeky in-jokes like a text totally overwhelmed by its footnotes or a scattering of footnotes full of puns and wisecracks. Endnotes are not quite as useful for playfulness of this type because they are too far from the ideas they are burlesquing. And we have to watch out lest we be fooled into thinking there is a joke where there isn't one in such things as translations, where the footnotes can outweigh the maintext with no hijinks intended at all. Of course, both types of note are the steady workers who take special care that citations are present and accounted for, cross-references right at hand, and areas the writer couldn't include that the reader may be surprised to see uncovered acknowledged. So it is easy to forget what this type of note actually is, alongside the prosaic bibliography or reference section. Anthony Grafton among others has already noted what we could refer to as the now almost secret identity of these paratexts: the original world wide web and research rabbit hole generator. I think it is not acknowledged nearly often enough that the underlying metaphor of the hyperlink is the footnote and/or endnote, and that bibliographies and reference lists, especially when annotated, are the underlying metaphor of the search engine result page. Without those familiar ways of marking up and navigating a text in codex form, we would be hard pressed to make sense of these parts of the world wide web, and it would be far more difficult to use. Only the affordances of screen-based reading allow these codex methods of annotation and navigation to work on a virtual recreation of scrolls. What the codex and the world wide web have in common that makes the metaphors work is the need to join together diverse texts that we can't look at together all at one time. Within a single book of course, we can't look at more than two pages at a time without taking it apart in some way. When browsing data on the internet in general and the world wide web in particular, the only way to see more pages at once is to have more screens or screen space, which is rarely practical. The challenge with notation paratext like footnotes and endnotes is sorting out whether or not to read them, and if so when. It seems to me that endnotes are especially suited to citations that aren't meant to be read so much as used to relocate the original passage in the text cited. There is no need to look at them unless we need or want to track down the passage in context or read the rest of the text it came from. Footnotes are best for when the additional text provides information that should be there and be read in its own right if needed or of interest, but that needs to be in the context of its cited place in the maintext. That is just where an author typically says things like, "Topic X is not directly relevant to the present discussion and so is not included here. Instead it is covered in Chapter Y, starting on page Z." This type of footnote is a distant cousin of the "frequently asked questions" page of many a large website, which today often starts from a set of questions the writers anticipate many people will ask. The tougher footnotes are the ones writers tuck additional commentary into. They still made it into the book, which suggests they are not irrelevant, just not possible to fit into the maintext as presently structured. Stopping to read them can disrupt our efforts to follow the author's throughline.
Sometimes it turns out that the footnote provides information we need to follow that throughline after all, because the wording or detail improves on the maintext or bridges a gap between our own knowledge of the subject and what the writer assumed that we knew. All of which is to say in roundabout fashion that writing successfully for a wide audience while keeping within a certain number of pages is hard, and the whole reason for paratexts like these is to hedge against the difficulty. If in the end our own text can't meet the needs of every reader in itself, it can at least assist the readers who need more information to find it. I concede that this is not likely to have been the original idea behind footnotes that added comments or even translations to a text. In their origins they started from efforts at personalization and were not intended for others at all. They quickly developed into a means of showing off how much the writer had read and/or perhaps the famous or erudite people that writer knew at the time of writing, in a textual form of name dropping. And yes, they were used and abused from early on to fortify a text against reader engagement by anyone who just happened to be interested. This last use did the most damage to their reputation among general readers as opposed to those expected to read in a "scholarly" way. I do wonder then, whether marginalia and other forms of annotation by the reader by hand in printed books began to seriously take off in response to the abuse of footnotes and endnotes to create at least apparent textual impenetrability. Writing back against the pretentious annotations has its own satisfactions, even if it will never lead to a conversation with the original author or a change in the book itself.

Since When is Listening "Passive"? (2020-08-10)

It appears that books focussed on the origins and history of oral storytelling are back in style, if Meghan Cox Gurdon's recent publication with harpercollins is anything to judge by, though like any trend it is a bit puzzling. The significant number of podcasts and the booming market in audiobooks leave me baffled by any claim that the "power of reading aloud" has ever been lost. Then again, maybe I am just being too literal. Maybe I am just picking unfairly on Cox Gurdon's greater point, that reading aloud to one another is a power that is not being used as steadily and widely in "western" cultures as it could be in these days of "social media" and loads of cheap off the shelf entertainment. I don't think that is a point I could fairly contest by any means. Parents struggle to find time to read to their children, and reading aloud in the classroom is almost unheard of except in the very earliest grades. The issues are time and perceptions of value rather than whether children and adults can cope with reading aloud or being read to. After all, listening while someone else reads or recites a story extemporaneously is probably the most usual and least nasty way to learn how to sit still quietly and focus our attention outside of ourselves. And make no mistake, it is a skill, not an innate capacity. What is bothering me about the snippets of Cox Gurdon's reflections on this is that she takes for granted that listening is somehow passive, and the snippet I linked to attempts to use a recitation of part of the Iliad as evidence of this. But I'm sorry, this is an asinine claim.
The skill of the person reading or reciting aloud is in persuading their audience to turn their attention and focus to what they are saying, to imagine themselves into the world that person is creating. If it took no effort on the part of the listener, then any session of reading or reciting could be arbitrarily long, barring biological necessities. But this is far from the case. That the audience is not reading does not mean they are not themselves acting in the time that they are listening to a story. If they are engaged with the story, they are certainly busy doing more than the linguistic tasks of parsing words and keeping enough track of what is going on to know when or if they should clap or get up and go home. Even reading itself demands some level of effort no matter what. The exception that proves the rule, when either listening to or reading a story, is what happens when the story in question is of a highly formulaic nature. If it is quite simple, then the person can tune it out, tuning in only just enough to note when a key marker is coming up, such as when the story is reaching the more exciting bits or is about to be finally over. An ancient epic like the Iliad is on one hand formulaic, because it was built up using broad formulae for major events, descriptors and such, but on the other it was brought together out of a deep performance tradition based on a set of rules that still allowed considerable play for creativity. For the earlier audiences familiar with the tradition, part of the experience would have been anticipating how the performer would spin the required elements. Might they centre a different protagonist, apply less used epithets, take advantage of formally similar events in the story to keep the audience wondering which specific event is being recited until the last moment? Of course, that comes from experiencing a work again and again, a work with enough play and depth within it to allow both the performer and the audience to draw different elements out on any given day. Both audience and performer are participants in what is actually an exchange in which yes, the performer leads, but the audience ideally never merely gets dragged along by the nose. It has taken centuries of attempts at audience discipline to try to create a passive audience. The discipline began with the creation of dedicated performance spaces with fixed seats. This forced the audience to sit down all facing in one direction, and while the later introduction of systematic rows and numbered individual seats had a considerable amount to do with ticketing and moneymaking, the disciplinary element is still key. After all, not just anybody got to take those seats. They had to have a ticket. And then particular behaviours were gradually discouraged and outright forbidden by such steps as barring "outside food" or any food or drink, and forbidding discussion or calling out to the performers at the front. As theatres were redesigned for greater crowd control and more money extraction, strategic use of lighting came to the fore, with lighting of the audience reduced in order to discourage movement and noise in the seats. In an odd sort of way, it reminds me of people throwing covers over bird cages to silence their occupants. The advent of cell phones and other portable devices has slightly challenged this way of treating audiences.
Part of what made the vcr and now streaming services such powerful challengers to recent mass media is the latter's demand for an at least apparently passive audience that can't even get a decent break for a visit to the bathroom. Maybe then I should revise my annoyed characterization of claims that listening is passive as asinine. Maybe they seem more reasonable if a person is not aware that the attempt to create a passive audience on a wide scale is almost as recent as the huge drop in opportunities for people to read or recite aloud to one another. After all, how many people learn that this is also how most apparently extemporaneous speechmakers start out, by reading aloud and observing how to interact with an audience?

On Handwriting (2020-08-03)

Handwriting is another of those at once banal and controversial topics about which almost all of us have some sort of opinion, even if that simply comes down to "what a dull topic, let's move on." Many of us, though far from all, learn handwriting to some degree, especially printing in small and block letters in the case of those whose languages are transcribed in some version of the greek alphabet or its grandparent the phoenician script. People whose writing system is part of the family of scripts spanning most of asia from china right through to japan and melanesia, let alone those who use the scripts characteristic of the indian subcontinent, are probably even greater in number proportionately. So there are many ways for people to make their abstracted mark, and that mark becomes ever more personalized if a person continues to write over their lifetime. It so happens that printing is less amenable to this on average, because the letters are not intended to be joined, and because of the preponderance of ballpoint pens and similar instruments. Still, a person's printing or cursive writing will remain individual, so long as they don't leave off applying their skill on learning how to type, or at least on getting regular access to letter keyboards. All that said, there is nevertheless a line of fierce debate about not just any handwriting but cursive in particular, because so many schools are not providing instruction in it. I actually sympathize considerably with the people sounding the alarm over this. Merely handwaving at concerns by saying nobody uses cursive anymore is both wrong and disempowering. Regardless of how much any one person may write using a cursive script, if they don't learn how, they will be rendered functionally illiterate as soon as the text they are faced with is in such a script. And that means an incredible amount of material, ranging from the most practical and meaningful to, yes, the banal and ephemeral. For a person who writes by hand a great deal, cursive is faster and more efficient, and indeed, that is why scribes developed it. It is difficult to develop the strength, dexterity and endurance students need when writing in print or cursive if they don't learn cursive, as paradoxical as that may sound. Yet it is also the case that more students today struggle to write their exams by hand because their hands get sore and tired when writing for more than twenty to thirty minutes at a time. Obviously any time a person needs to write an exam or fill in a job application, both already stressful events, having sore or tired hands makes that feel all the worse.
Whether or not a person ever needs to write for others, the dexterity and conditioning of their hands and fingers is helpful for many other tasks in life, and I am not sure wildly elaborate game controllers can cover all the same bases. Still, it is the artificial creation of functional illiteracy that bothers me more. Perhaps this is because I am basically a historian, and I mistrust changes that effectively cut off people's ability to access and learn their own pasts or those of the societies they are part of. If my youngest cousins can't read cursive, they can't read the early letters and documents left by our grandparents, or even the early juvenilia our parents may have managed to hang onto from their own school days. Official documentation of all kinds was handwritten in cursive right into the early to mid twentieth century, precisely because it remained the quickest and most versatile means. Laudable as efforts to make tablets, phones, and computers more amenable to text entry by stylus are, the fact remains they still can't beat a writing instrument and paper for sheer speed and effectiveness. Even a person who is not carefully reproducing a cursive script intended to be read easily, by at least others working in the same job, is able to take advantage of these form factors when they use their own or a standardized shorthand. I agree that there is a bit of a romantic element to many defences of handwriting, in that they emphasize the role of individuality and expression. After all, I have done that to some degree myself here, and I do think that is meaningful, if only as something for a person to try out and see if they care about the experience, rather than having it foreclosed for them. The cachet of autographs persists alongside that of unexpected little gems like the scan of a memo written by Marie Curie featured here, or scrap paper with Alan Turing's jottings retrieved from where they had been used to plug up a draft at Bletchley Park. Yet I don't think it is actually the individuality or expression parts so much as the sense that through even such little scraps we are contacting, ever so tenuously, another human being who is no longer in this world with us. And that is very special, even, perhaps especially, when the medium seems rather ordinary. The cave paintings at Lascaux for example are almost overwhelming in their beauty and the skill they demonstrate, and can seem out of reach. Yet a quick note is something any of us could make, even if in the end we just make it in the sand on a beach somewhere, and it does not have to depend on a corporation or other such dubious entity. There is far more than just romanticism or the equivalent of shouting "get off my lawn!" in insisting on maintaining access to a skill of that kind.

"In the Penal Colony" (2020-07-27)

Australia has not been having a very nice winter, to put it mildly. Too mildly, I know, but there are truly no words that properly express what is happening there. It makes it all the worse that extreme wildfires, grave problems with maintaining services and fresh water, plus soaring temperatures were all predicted. Indeed, these were already part of the mix for the english and dutch when they first began pestering Indigenous peoples there, who knew how to live with their lands and waters and found themselves stuck dealing with outsiders who seemed determined to kill themselves and any Indigenous people they met besides.
The english in particular looked at australia, apparently deemed the combination of heat, lack of water, exploitable coal, and what they deluded themselves into thinking was infinitely exploitable farmland easily kept isolated, and concluded they had hit on a perfect recipe for highly profitable convict labour. They could get the troublemakers out of town and force them to work for their captors' benefit just to survive at all. The meanness of the people building and supporting these systems is breathtaking. All of which is simply to acknowledge that australia is not an easy place to live with if for whatever reason a person is unable or unwilling to enter a relationship with it rather than attempt to twist its figurative arm. Places with more extreme climates and overall conditions react powerfully to larger scale changes because they are already carefully poised. The injustice of part of the settler population of australia insisting on practically unswerving support for global warming denying coal industry shills is bad enough. Any of us living in settler colonial states are all too aware of how small a portion of the population is needed to support such fools to end up with them holding a lock on the government, heedless of votes, protests, spending money on other things, and the various other perhaps visible but, for real change, definitely ineffective actions we are pressed to take. Or at least, those actions are ineffective if we simply view them as ends in themselves. As key radical Feminist activist Matilda Joslyn Gage noted over a century ago, the prime value of striving for women's suffrage was and is the way it drove women to learn how to organize and act together to make change, and then realize that mere suffrage alone could not make the most needed change. But they needed the experience of working on suffrage to take its measure and learn the skills and build the connections they needed to be able to identify and perform the most effective actions. The challenge then was not to despair that what they had reached was a local maximum and a place to rest briefly and check their bearings. The despair is hard to fend off at times, to be sure. So here we are, with thousands and thousands of settler australians now facing the fact that they have one of the most vicious and irresponsible but officially democratically elected governments on the planet. A government more concerned about mining and shipping coal than facing up to the fact that wildfires have been so large that people were sheltering on the beach and on the brink of having to flee into the water. People who are furious that "austerity" was used as a pretext to undermine their local capacity to manage and prevent fires in the first place, because centralization was supposed to be cheaper. They are outraged by the pisspoor treatment of volunteer firefighters, and the australian government's attempt to push the firefighters into collecting information on the people they meet at fire scenes in a sort of de facto police operation. They are fed up with ludicrous attempts to send in the army to handle fires and other emergencies, which armies are not trained to manage. Armies are trained to manage conditions in which they are facing enemy combatants. This is completely the wrong training to help people, and it doesn't take much research to learn about the raping and pillaging of "liberated" areas by the "liberators" in any war, once the "occupiers" have been sent packing after doing the same thing.
The uncanny slippage towards authoritarian pseudo-government and a return to australia as effectively a penal colony is a miserable counterpoint to the ongoing debacles of doubling down on extremism, encouraged and funded by the rich cliques currently at work in most capitalist countries. And yet... I can't help but feel that it is a good sign to see something like genuine good sense reasserting itself among the australians and many others in many of those countries. No one is interested in living in a penal colony. We are not so narrow or simple minded as to be incapable of realizing first that we've been had, and second that there are far more of us than there are of them, and that what we want in the majority is genuinely a better world in which we learn how to live in relationship with other beings on Earth in a good way. It may sound strange, but Kafka's uncanny story of the penal colony is in fact an optimistic one – not in the easy optimism sense of course, but that is all the better. We need stolid realism more than fairy tales.

Brilliant Notes on Oppression (2020-07-20)

I have been dipping periodically into the world socialist website, which is both wonderful and infuriating at the same time. Wonderful, in that the contributors are providing some of the best and at times only serious coverage of such key events as the onrushing effects of catastrophic global warming, the resurgence of nazism in germany, never rooted out by design, and the strange ongoing saga of the new york times' 1619 project. There are many interviews with intriguing participants, with little evidence of the pruning that so frequently and dangerously simplifies or even falsifies their equivalents in mainstream and less mainstream outlets alike. I don't agree with the editors refusing to let go of the nonsense belief that merely putting in place a socialist government and economy will fix every possible other source of oppression, and that therefore racialized people and all women should shut up and fight for the socialist revolution. And alas, the contributors flogging the more and more harebrained notion of a "Feminist conspiracy to silence all the popular media producers we like," whose work appears mainly in the lower portion of the centre column, are doing damage to the site's credibility. This does not prevent me from appreciating key areas where I tend to agree with them, such as their opposition to internet censorship, and their commitment to a much higher standard of journalism than is typical today, although they slip badly when it comes to entertainment journalism. (It is an interesting question as to why that is a common area of weakness in newspapers and analogous news sites online.) What is infuriating is the contributors' inability to stop repeating the editorial position even where they are providing news coverage, especially when that coverage is dealing with the authoritarian creation of and control over so-called "identity politics." I sympathize with their frustration about the development and abuses of "identity politics," though. Having set those reservations up front, I'd like to signal boost again Tom Mackaman's 15 january 2020 interview with historian Clayborne Carson on the subject of the previously mentioned 1619 project being dubiously put together by the new york times. This is one of several interviews with historians concerning the project, which has drawn special interest from the International Committee of the Fourth International, though I admit to not being clear yet on why.
If it is just about the question of historical accuracy, my own historian self is utterly delighted. But I suspect this is only part of what is at play in their concern with covering the project and challenging it by means of these interviews. In the course of this interview, among Carson's many powerful comments and reflections on oppression and resistance in his answers, three snippets really stood out to me under the present conditions.

"So, the question is how do you move someone from acquiescence in their own oppression to that audacious statement of, we have the right to determine our own future? We have the right to participate in the decisions that affect our lives? In a way, if they can do that, that simple declaration is a statement of freedom. Of course, backing it up with something other than a declaration is kind of necessary! To have the power to make someone listen."

"Among the most important changes leading to the articulation of rights was the spread of literacy. As people became more literate it became harder to dominate them. Literacy in the African American tradition, from Frederick Douglass learning to read to Malcolm X being in jail and learning to read the dictionary – literacy is itself a freedom. One of the commonalities of oppressed people is that they get all of their information through people who dominate them. For most of human history people never got more than 20 miles from where they were born. They knew very little of the world. There was no way of overcoming that until you could get to the point where you could read. That was often purposeful. Dominant people wanted to control information. You don't want people working for you to know that they can walk 20 miles and find better conditions. Of course, you can also mystify them by religion and say, there's this sacred book. You can't read it, but I can interpret it for you, and tell you what your role in society is. So, literacy is huge. So is mobility. Just think of the impact on the whole notion of labor when workers in Europe began to move from place to place building the cathedrals. They learned by moving to a new place that conditions could be better. That's why there was a huge movement to put up gates at the outskirts of Paris and other cities, because they knew that once you got in the city you could negotiate with whoever wants your labor. So, literacy, mobility. But the commonality of it is just being able to not be enslaved by ignorance."

"One of the things that strikes me is that so much of any oppressive world is built on mystification. Just think of how many people assume that corporations are something that was not invented. Haven't we always had them? So, you have this entire legal structure that maintains wealth, and not only just maintains it, but mystifies it. We can't know how this happened that – what is it now, 100 people? Or 500 people? – have $6 trillion worth of wealth? How is that even conceivable? That level of concentration didn't even exist in the Gilded Age."

It's worth spending some time sitting with these, and just pondering them quietly. There are so many more of us than there are of them. Why are we apparently acquiescing to our oppression?
I find myself considering the question of whether one of the root horrors in much of the world right now is the belief that everyone is supposed to "satisfice," because they either refuse to believe anything else is possible, or feel comfortable now and so don't want any change to happen so long as their lives will be pleasant until they die. Neither possibility is encouraging, I admit. Even less encouraging is that although an extraordinary number of people are in dire straits, they seem to believe that even with nothing left to lose but life, no rebellion on their part could succeed, even though their numbers grow every day. The second quote gave me much better insight into the belief so many of us have been taught to hold in the importance and power of education and travel. These are not luxuries as soon as they go beyond being able to handle the raw basics of day to day life. The ability to compare how different people live, to read and interpret for ourselves, is indeed great. These are our defences against confidence men and other types of thieves. How terrifying and tragic it is, that so many of us have lost sight of the fact that our ability to read and to meet and interact with others different from ourselves are such powerful freedoms, and honestly, otherwise latent superpowers, that can serve not only our own interests, but everyone's. Well, everyone's if we care to use our superpowers that way, and resist the pressure to look only to the short term and our own singular selves. No wonder education is always controversial. Carson's point about the role of mystification does not surprise me, since it lies at the heart of all kinds of propaganda. For propaganda to work, we must somehow be persuaded to forget the past, and to believe such stupidities as that the soap with the new packaging is somehow meaningfully better than the soap in the old packaging, or that if we just buy the right product we will become smart, gorgeous, irresistible to whoever we want to have sex with, and rich. Yet it's the forgetting of the past, and therefore the loss of any means to study cause and effect, that carries the most danger.

Something Is Not Right Here (2020-07-13)

Just about two years ago, I had the dubious task of reading one of Kevin Kelly's technology proselytization texts, the arrogantly and unwisely titled The Inevitable. Kevin Kelly is undeniably a non-trivially brilliant man with many non-trivially brilliant friends, and I enjoy much of his writing and review work. But due to my own background I find proselytization to be an evil, and unfortunately, like that of many futurists, Kelly's vision of the future comes down to an authoritarian surveillance dystopia in which supposedly computers will know what we want before we do and insist on preventing us from doing ourselves harm. More than twenty years ago one of Faith Popcorn's books struck me in much the same way. The genuine coolness of the gadgets foreseen and suggested by current gadgets winds up rather spoiled by the bits where they surveil us incessantly and constantly mine our lives for data. An important critical discussion of this is laid out in Shoshana Zuboff's The Age of Surveillance Capitalism. Maybe Kelly and Popcorn figure that the concerns Zuboff raises are just bugs that we can trust techbros and the market to fix for us, heedless of the recent evidence before our eyes.
Still, all that aside, Kelly is an excellent writer, so even if the reader can't agree with him or completely agree with him, the reader can enjoy his journalistic-style prose. I have never quite forgotten the following paragraph in particular, and have long wrestled with what bugs me about it. Something seems a bit off, though not with Kelly or the text as such. Here it is.

A few minutes later [Jaron] Lanier handed me one black glove, a dozen wires snaking from the fingers across the room to a standard desktop PC. I put it on. Lanier then placed a set of black goggles suspended by a veil of straps onto my head. A thick black cable ran down my back from the headgear to his computer. Once my eyes focussed inside the goggles, I was in. I was inside a place bathed in a diffuse light blue. I could see a cartoon version of my glove in the exact place my real hand felt it was. The virtual glove moved in synch with my hand. It was now "my" glove, and I felt – in my body, not just in my head – very strongly that I was not in an office. Lanier himself then climbed into his own creation. Using his own helmet and glove, he appeared in his own world as a girl avatar, since the beauty of his system was that you could design your avatar to look like anything[sic] you wanted. Two of us now inhabited this first shared dreamspace. In 1989. (213-214, 2016 penguin-randomhouse edition)

Just a standard description of an early VR experience, the start of a peek into some of Lanier's earliest work in the field. The goggles, the glove, the blue glowing space, all of that seems pretty anodyne. No big deal. Even the aspects that the Wachowskis would later invoke in their first and best version of The Matrix are there, and quite simple and sensible. Most of us have enough passing familiarity with this technology, which Lanier has been a major pioneer in, to appreciate the point of having the participant's eyes cut off from outside stimuli and their integration into the virtual space via some form of gear that tracks their physical position. It took me a while to realize that what is off here is not the technology as such at all. The problem is the avatar, the notion and actualization of "his own world" where he can be "anything" he wants, an odd choice of words where I would have expected "anyone or anything" or the inverse "anything or anyone." No doubt this would not have stood out so much if Lanier's chosen avatar had not been a little girl. I should add here as well that at this first reading, my familiarity with the leaders in VR development was basically nil, nor had I knowingly seen Lanier and connected him with his idiosyncratic appearance. No, what is chafing here is the idea that in a VR world, anyone can be anything or anyone. There are uncanny echoes of that odd and remarkable encapsulated expression of white male colonialism in science fiction television, Quantum Leap, in which the helplessly lost Sam Beckett is able to at least be anyone anywhere and in any time. There is a staggering arrogance in such a notion, the idea that "we" could somehow absolutely inhabit and experience the lives and sensations of someone else. Which is actually not to say that VR couldn't be used to help us better appreciate and empathize with others who have very different experiences.
It's just that in these fantasies and fantasy worlds, the guy busy being an avatar of whatever type somehow always turns out to be better at being "the other" than "the other" is, always more authentic, and able to succeed in life where "the other" fails, implying "the other" fails for individual reasons, especially failures of authenticity. This is not at all the same thing as playing the role of, say, a black woman coping with the barriers and effects of structural racism. Structural racism cannot be reduced to the shitty behaviour of a few badly behaved people; that is the whole reason it is referred to as "structural": it is embedded in a whole gamut of interrelated practices and engrained assumptions. Building a VR that helped make those facts recognizable and unforgettable could make for an amazing VR experience. Not that there is a lot of effort going into that sort of VR development, at least so far as I can tell in my far from vast sampling. The most vaunted applications seem to be focussed on games, simulator training for civilian and war planes, and virtual tourism (a very distant third). This seems quite a truncated imagining of what VR could be used for, though to be sure this is a great start. Well, except for the uncomfortable colonial leanings of virtual tourism so far. Then again, evidently I am not the ideal subject of the lure of VR, since it seems to me that the human imagination plus a few judicious costume pieces and some willing friends make for a far better game of pretend, and indeed when VR is at its most successful, people go together to participate in computer-generated VR environments. In the end that may indicate the thing that is a bit off in the VR hype: its repeated construction by the hypers as the ultimate individual experience, when we humans are fundamentally social creatures who by nature need a balance of individual and social experience, in a time when genuine social experience is so limited.

The Role of Student Lecture Notes (2020-07-06)

Pick a classroom, any classroom you like, elementary, secondary, post-secondary, any general space where people get together to learn from an instructor and take notes to support that learning. In its basics, this is one of the most powerful learning tools there is, though it certainly can't be expected to stand alone. There are many ways to learn, and we need them all. The potential of notetaking has popped up in other thoughtpieces, so I won't revisit too much of that here, except to say that an astonishing variety of projects began as somebody's lecture or reading notes. When it comes to the latter, I think especially of Maria Popova's BrainPickings website. Or the better examples of book reviews in the Times Literary Supplement, or the wonderful collections of Joanna Russ' non-fiction essays. More recently, I was reminded of specifically the role of class lecture notes, especially those by students setting down their complex interpretations of what their instructors said or wrote for them. Even more specifically, of the fact that what we have from Aristotle is in fact copies of some of his students' lecture notes, refracted again and again through scribal tradition until the latest printed versions of today. Now, I appreciate that the student notes that survived were overall the better ones, the ones that managed to capture a great deal of Aristotle's ideas with relative accuracy and mostly minor garbling.
Yet, it also strikes me how precarious this mode of transmission is in terms of trying to reflect what he said or taught. How do we account for the inevitable effect of the student's level of understanding and personal response to Aristotle? No doubt there are those who argue this doesn't matter because the teachings are identifiable and interpretable, and still others besides work in the area of reception theory to see what we can learn from what is recoverable about the inflections created by students' responses. Then I think back to the many papyri and clay tablets that have accidentally survived from scribal schools in western asia and egypt, the things that have accidentally preserved bits and pieces of the story of Gilgamesh and Enkidu and the mathematics in the Rhind papyrus. The endless copying exercises from which later scholars managed to work out how to read cuneiform and then the languages the cuneiform encoded, and then how the languages spoken by the writers changed, based on such things as spelling errors and evidence of mispronunciation. It may be hard to believe, but the lists of "this, not this" pairs compiled by despairing grammar school teachers as spoken and written latin spiralled apart from one another can be quite funny. The distance between what we write and what we say is not new at all, and indeed there is an old joke in Chaucer that depends on both the sound and the sight of the different middle english dialect words for "eggs" for full impact on the audience. Though, having typed this, I am wandering a bit away from student notes to touch accidentally on Chaucer's woes with the scribes who recopied his works for him. Those starting out in the job would often have been students looking to make some extra cash. Humble as the notes students make may be, they can make extraordinary things possible. Perhaps this is a bit of romanticism on my part, yet I can't help but find this encouraging, because the fact remains that in the end the sheer mass and repetition of student notes makes them more likely to survive than many of the fanciest and most complete manuscripts. After all, they are rarely considered important enough to lock away and so prevent anyone seeing or using them, and they are usually produced with the humblest materials, as opposed to elaborately decorated books or even memory-rich computers, both of which are tempting to thieves for different reasons. Oh yes, we had best not underestimate the value of student lecture notes!

Talking About Higher Education (2020-06-29)

It can be quite instructive to take a look at what is finding its way to the new book shelves over time, and this needn't be done via an online bookseller or even a new or used bookstore in firmspace. In fact, one of the best places to track new books is the public library in your town or city, especially its largest branch if there is more than one. The idiosyncratic blend of books generated via the reading population's requests and recommendations can be eclectic and thought provoking. The shelf I have been reading titles and blurbs from over the past several years is in a university town and near to a university, so it is perhaps overdetermined that the new book shelf will regularly feature new books on post-secondary education, especially its state and development. This is actually not the case, nor do these sorts of books turn up too much in the parallel branch of the public library.
Nevertheless, the mainstream media has a steady trickle of articles on "the crisis" of post-secondary education, and these articles often point to new books on the very same topic. One such reference caught my attention a while ago, and with the topic refreshed in my mind, I was fascinated to look into the somewhat more obscure section of my nearest new book seller where books on post-secondary education land, and to find that at least two others had been published this year; there they were, tempting the unwary browser. Not too fascinated, mind you. I don't seriously imagine such books are best sellers as a rule, and I don't browse this specific section in any bookstore often. Then it so happened that my office mates and I, in a state of either bravery or foolhardiness, decided to clean up our office space, thereby winning back shelf space from the ghosts of completed PhD dissertations past. We are not quite done this task, since the pile of defunct textbooks is quite a sight to behold. Besides those, the remarkable range of books, pamphlets, and lost class notes was impressive in its own way. We're not at all sure what to do with a lovely set of post cards evidently collected on someone's european tour and accidentally left behind a decade ago, since we'd rather not spoil them or break up the collection. All that aside, among the older items were two books on, you guessed it, "the crisis" of post-secondary education, one from the late 1960s, the other from the early 1990s. This strongly suggests that somehow or other, post-secondary education has been in non-stop or almost non-stop crisis for well over fifty years. Yet, closure of public post-secondary educational institutions is basically unheard of. Expansions, mergings, changes of status from college to university, sure, but not closures, which an ongoing crisis might be expected to cause. Maybe that will happen yet, but this got me to thinking. From the look of it, "post-secondary crisis" books are in fact a genre. On further perusal of book catalogues and the like, it turns out there are always a few new ones published every few years, and worries and arguments over "the crisis" are fairly constant in journals and magazines. People in and outside of post-secondary education argue about whether it should be purely vocational in nature, therefore publicly funding only professional and trade schools. They argue about the role of research, whether it should be publicly funded at all, whether it should be done in all universities or only some of them, or if it should be restricted to research-only institutions in the first place. They argue over whether anybody should bother teaching "the humanities" anymore, conveniently forgetting that "the humanities" used to include several of what are now called sciences, and that nobody can function in this world anymore without that basic humanity, rhetoric (that is, the study and practice of how to make, interpret, and share logical arguments). They argue about the newfangled acronym "STEM" and whether young people should be directed only into the areas of study enumerated in it. There are uncomfortable rumblings about business degrees, which actually seem like they ought to be earned in professional schools unconnected to universities and via private funding, for logical consistency if nothing else. But that of course is another argument.
Then there is the argument over whether there is any point to a "liberal education" anymore, which used to simply mean one that included both the arts and sciences with a view to enabling students to be active and effective citizens. Meanwhile, scholars like John Quiggin have begun pointing out all over again Why the Profit Motive Fails in Education. He doesn't put it quite this way, but the sad truth is that trying to make education run like a business makes for a proliferation of perverse incentives. As Quiggin notes, "Students, by definition, don't know enough to be informed consumers," and neither do their parents, it is worth adding. Neither student nor parent can know how good or bad the course was until it is over, nor how effective or not the overall program they pursued will be for helping the students successfully keep themselves in work and able to keep body and soul together until long after the tuition is paid. Hence private schools and public schools under pressure to "be more business-like" are pushed to do things like:
Regardless of all the arguing, research, and analysis, and all of the evidence of over fifty years and more besides, there are apparently no answers to the somehow constant crisis of post-secondary education, let alone to wider questions about education in general. There seems to be a rough consensus that the big question about any education is whether it should be purely to get students into the workforce, or to equip them to be citizens in whatever political system they live in. Yet even these supposedly basic questions seem to have no answers, not even provisional ones. This suggests to me that this is another instance of asking the wrong question. These "common sense" questions are distractions. The real question is, what sort of society do we want and expect the education system, from elementary to post-secondary schooling, to support and create? If we want education that encourages massive and persistent social inequality, then we already know what that consists of: privately funded schooling with absolutely no provision for public education whatsoever, except what might be provided by such organizations as churches or charities. If we want education that supports minimal social inequality, we already know what that consists of too: publicly funded, universal schooling with enough teachers and physical plant to keep class sizes reasonably small. The only "crisis" in education in general or post-secondary education in particular is the constant struggle between those who want it to support the creation and maintenance of authoritarian social structures and those seeking to create and maintain egalitarian social structures.

Against Utopia (2020-06-22)

The situation in the world right now is non-trivially difficult and scary. I think that is a fair assessment. And in such times, people reasonably start thinking and talking about how it is necessary to live differently, because the usual operating procedures are not working. Alas, this doesn't usually get to the nub of things; more often than not, such core assumptions as "european-style agriculture is a self-sufficient and long term way of making a living" are left untouched in the face of literally thousands of years and square kilometres of evidence to the contrary. Rather than spend time doing admittedly deeply uncomfortable work on those nubs, people get distracted by the appeal of utopias, whether they consider the term utopia to be from greek roots meaning "a good place" or greek roots meaning "no place." Not all discussions or visionings of utopias are distractions of course, especially the ones I can think of that feature in novels. More often than not the authors are exploring, critiquing, and at least demystifying utopian ideas and assumptions. When utopias are taking off in the public imagination, that is important work, especially because it works on the level of what we imagine, not just on the level of rhetoric. Based on my reading so far, my strong impression is that the general idea of a utopia in the sense of a perfect place that is not here and likely never could be, which neatly ties together both senses mentioned above, did not arise in contexts of trying to come up with a new way to run a society. Instead, it seems to have been mostly about satire and critique.
Even the model for an ideal society run by philosopher kings so revered and studied in Plato's Republic leaves me wondering how serious he really was, and whether what we have somehow gotten is an astonishing indirect critique of athenian upper class society. Aristocratic young men were brought up on an education centred on homeric epic, carefully signposted by such milestones as when they first started growing a beard and were allowed to watch the plays at the festival of Dionysus. Plato's guardians are supposed to be brought up on "noble lies," false stories meant to keep them committed to the ideal society. To prevent any contradictions or dangerous free thinking, poets and poetry are to be banned. Now of course, a big part of what the Republic is meant to do is indeed to critique Plato's own society. I don't know that anyone has tried to read it as at least in part a satire though, and it might be worth the experience. But back to the notion of utopian visions as a means to critique current conditions. That makes utopias excellent tools for critique, to be sure, and that is no distraction. Where they can become dangerous distractions is when they lose that connection to critique, and instead become either expressions of escapism, or else blueprints for an attempt to create the utopia in real life. In such circumstances, and such circumstances are endemic right now, utopias become extremely dangerous. There is really nothing so dangerous as a utopia that people seriously believe they not only can enact in the real world, but that they should and must. Such utopias are inevitable fuel for authoritarianism, because to begin with utopias are not plans for real societies in the first place. They are idealizations with lots of handwaving and quick applications of magic to get around impossibilities. More often than not magic gets a makeover into "advanced technology" these days, but it is magical thinking all the same. A remarkable number of utopias presented as possible real worlds are at root authoritarian to the highest degree. Proselytizing religions are particularly good examples here, including the supposedly secular religion of capitalism, and they are among the most destructive forces on the planet. They help rationalize unremitting violence of all kinds on the grounds that anything is justified to enable the second coming or finally produce perfectly free markets, or whatever lunatic goal is supposed to make the world work perfectly for everyone for ever. There are to be no experiments and no change, and dissent will be prevented by ferociously punishing it. Nobody pushing the actualization of a utopia believes that anyone will actually be persuaded to stick with their program, because in the end the programs start from the premise that people are inherently bad, and will only be good if they are constantly afraid of punishment. These sorts of obnoxious views are the sorts of childish declarations any of us are prone to if we allow ourselves to be persuaded that somehow change will stop happening and that we should never ever experience unhappiness or any barriers to our getting what we want. Even if all of us got on perfectly with one another starting tomorrow, we would still struggle with how to balance conflicting wants and needs between individuals, societies, and larger social units. And that's just between humans.
Not because we are inherently evil or perverse, but because we live in a world full of other complex and unpredictable beings, and the world itself has unpredictable phenomena that can throw a spanner in the works: the climate in the longer term, the weather in the shorter term, geological phenomena such as volcanoes and earthquakes. On top of that, our own actions can generate unpredictable effects on ourselves and other beings that we never intended and that can force us into terrible catch-22 situations. Several millennia ago, people in asia decided to start domesticating animals such as cattle, pigs, and goats, living with them and interfering constantly with their lives and reproduction. I think we can reasonably infer that they never anticipated such terrifying effects as herd diseases jumping species to infect humans, or that domesticated goats can be some of the most potent deforesters on the planet. Those people were probably trying to enact a perfect world, in which they would always be able to eat meat without having to hunt for it or maintain a respectful relationship with the animals. Hunting and respect involve effort and commitment, and always the possibility that no animal would be caught even if the hunter did everything right. It seems that people in asia began to forget that the animals also live in this complex and unpredictable world, and if they are not catchable, that might be for any number of reasons that neither they nor the people can control. From what I have learned about different Indigenous cultures in the americas, Indigenous peoples did not and do not fixate on utopian visions that they are obsessed with imposing on the world. This seems to be true even of imperialistic episodes among the Maya, Inca, and Aztecs. Instead, the starting principle more often than not seems to be that humans are actually the puniest and weakest of all the beings in the world. Which means we are fooling ourselves if we think we can reshape it to suit us. Instead, our challenge and our necessity is to learn how to live in and with the world, including how to live in a way that generates as few accidental awful things as possible, while maintaining the capacity to respond effectively when awful things happen. Different Indigenous peoples have expressed this principle in many ways, and tried many different ways of living, because the land is not uniform. And sometimes Indigenous peoples have made tragic and terrifying mistakes, and these are remembered in our histories and passed down in our stories to forewarn against repetitions. We do our best. We make mistakes not because we are evil, but because we are human, and have limited information in an unpredictable world. That is another key feature of utopian visions, the element of predictability. They often express an understandable wish to remake the world into a predictable place, leading very quickly to visions of making everyone live the same way in the same sort of environment at all times. That requires what in science fiction is fancy-termed "terraforming" to make the land uniform and predictable, but in the mundane world is called domestication. As for knocking down human unpredictability, ah, now that is when things devolve very quickly into authoritarianism. But don't worry. Organizing and acting to change social practices that produce bad results is not enacting a utopian vision, unless the organizing is in fact intended to impose a single solution on everybody for all time. (Top) What Is It About Us You Don't Like?
(2020-06-15) Over coffees one day a friend of mine burst out in frustration, "But look, what is with antisemitism anyway? How can people accept such nonsense?" In other words, she was struggling to understand what the appeal is to those who accept and propagate it. We can easily add any type of oppression rationalized by an aspect of a person that they are born with, can't change, and wouldn't be accepted as having truly changed even if they made every effort to do so. It isn't hard to find terrible and terrifying examples of the dangers of passing as non-Jewish or as white under conditions of rampant antisemitism or anti-Black racism. Those tend to be easiest to find. I have puzzled over this, because people only rationalize behaviour that they find embarrassing or unpleasant, especially when it is their own. I am well aware that I am not immune to this, and feel it would be at best intellectually lazy to try to claim that such rationalizations are simply the product of involuntary disordered thinking. No resorting to "they're phobic" or other pseudo-psychological explanations. Admittedly, explanations for the appeal of oppressing others can't be exhausted just by pointing to the economic and social benefits, because we can get praise and cool stuff without stealing from others. Theft and slavery are not inevitable; we humans are not inherently lazy. Even the obnoxious billionaires spend a great deal of time and energy frantically trying to get more money and paper over their ill-deeds with public relations. This is misapplied energy, but it is not laziness. After a while, I found myself turning back to Thomas King's 2003 Massey Lectures, collected in the book The Truth About Stories. Specifically, I looked again at a part of the fifth lecture, titled "What is it about us that you don't like?" in which he is discussing canadian legislation as it affects french canadians and Indigenous peoples in what is currently known as canada. Please bear in mind, my purpose is absolutely not to make some kind of false equivalence between this legislation and antisemitic practices or jim crow. There are however some uncomfortable analogies between the ways that canadian law has been used to disrupt the social reproduction of Indigenous nations, Black communities, and yes, Jewish communities as well as many other racialized peoples in canada. I think the reason the Indigenous example stands out for me, quite apart from being Métis myself, is that the usual rationalizations for this sort of interference don't seem to apply. Indigenous peoples are not on the verge of outnumbering anybody else, and our labour is not coveted. On the other hand, First Nations are often accused of somehow keeping settlers from succeeding by supposedly hoarding land, and all Indigenous people stand accused of not vanishing fast and romantically enough. And I am all too aware that even being utterly and obviously impoverished and steadfastly loyal participants in society has not been any protection against antisemitism for Jewish people, nor has simply following instructions from whites saved African North Americans either. Socially sanctioned terrorism has to be rationalized somehow. But if economics doesn't explain it, and fear doesn't explain it, what is the deadly and insidious appeal of these ways of thinking? What can catch us out? UPDATE 2024-05-20 - Indrajit Samarajiva has an excellent pair of posts that delve further into these issues, especially in their expression as false inclusion.
He is among the far too few writers at this time who have recognized that "DEI" is a mean-spirited move along the lines of what many women refer to as the glass cliff, the counterpart to the glass ceiling. Whereas the latter keeps women from moving up, the former allows women to move up only when the whole situation is about to implode, leaving them to wear the failure they had nothing to do with. See Samarajiva's The Subtle Propaganda of Hollywood and How Inclusion is Imperialism. But don't think he is not thoughtful and critical about the role of racialized people who opt to take part in supporting and running the empire: How Brown People Run the White Empire, The Diversity Dumpster Fire. Having read about the intense social disruptions that are the typical context of pogroms and sanctioned civil and state violence against scapegoated peoples, of course I get reminded that encouraging religious or race-based divisions is useful for social control and distracting people from resisting changes they don't want. But there again, I don't think pointing at nefarious elites is enough of an explanation, although it is an unavoidable part of the mix. Power plays and economics certainly play a role. But those are nastily logical, and rationalizations don't have to be logical, and in fact generally aren't. In the end, I can't help but wonder if what is at issue to begin with is actually a vicious sort of jealousy, even though that winds up being a psychological explanation, much to my annoyance. During those periods of intense social disruption, the world is coming apart for the people in what we are now encouraged to call "the mainstream." The majority of people have at least found things bearable, until in those disrupted periods the things going on finally impacted their ability to socially reproduce themselves, to maintain the social ties and ways of life that they most valued. And then they looked towards others who were different from them, who, whether in precarious circumstances by choice, by necessity, or even by law, were working and living together in cohesive communities and thereby weathering difficult conditions. And it didn't trouble the majority much that those others were helping each other to survive, until the majority felt themselves threatened by circumstances outside their control. No matter that those others were likely experiencing disruption themselves already, on top of other conditions they were dealing with. But they were different, in the minority, and therefore vulnerable, in the midst of communities who often shared a religious faith that equated acts of destruction and boundary crossing with expressions of power. Then the terrible explanation becomes: if people feel utterly powerless to oppose those who are oppressing them, and they deem themselves to be part of the majority who ought to be doing well in the world and aren't, and having power is defined as being able to ignore boundaries and wreak destruction at will, then they will take petty satisfaction in attacking those whom they are in a position to oppress. There is a horrible logic to this, and while it is a psychological explanation, the point is that it is not a relabelling of this behaviour as "crazy." Calling someone's behaviour "crazy" is very good for individualizing the behaviour, even when, as in this case, what we are wrestling with is a socially constructed rationalization for unremitting violence. Not individually constructed, not individually maintained.
(Top) Asking the Wrong Question (2020-06-08) Given the effort to render everything in life into something from which greedy thieves can steal from everyone else and call it "surplus value," or rather slap on such bullshit labels as "data exhaust" (a term properly eviscerated by Shoshana Zuboff in The Age of Surveillance Capitalism), it is no surprise that their efforts to destroy universities and remake them into corporations are ongoing. They thoroughly encourage all of us to see students as consumers who therefore should expect a pleasing product in the form of a stable job with hopes of keeping financially above water, while being denied any right or role in how the corporatized university should work, how it should treat those who work and learn in it, or anything else. This is supposed to somehow solve the problem of what universities should actually do, by claiming that they should solely provide vocational training, professional degrees, and so-called "STEM," closing down everything else. Somehow an awful lot of mainstream commentators are quite sure that the humanities and fine arts are really useless luxuries, which should only be produced, if at all, at the behest of the rich via patronage or as mass marketed gewgaws flogged by corporations. The result is a closed circle, because all the definitions are made to enforce just one answer. A very narrow definition of what is "useful" or "vocational" plus a claim that universities should only be about vocational training, and voila, there is no discussion to be had. The supposedly hardboiled, hypermasculine vocationalists will sneer at the supposedly unrealistic, effeminate advocates of a liberal education. All of which leads me to conclude that if we participate in this pseudo-discussion, we must be asking the wrong question. In fact, we aren't asking a question at all. We are starting from the conclusion, and then arguing over how to enforce that conclusion in the real world. We waste our time trying to categorize areas of study as "useful" or "luxuries" when this is a distraction that at best encourages us to get lost in desperate efforts to protect what we love from those who insist what we love is childish or soulless. So no, I don't think that the question to ask about post-secondary education in general let alone universities in particular is whether it should be vocational or bust. Of course vocational training is important, although alas there is no guarantee that even the most highly recommended vocational training of the moment will lead directly to longterm employment and the other things we associate with it. Really, the question makes me think of Ken Robinson's response to demands that educational standards be raised, to which he replied reasonably enough, "Why would we lower them?" We already know from documented experience and extensive evidence how to raise educational standards, and it doesn't involve creating a sort of mass produced curriculum to pour down the throats of a mass student who is somehow transmogrified into the perfect, obedient mass workforce. Universities and other forms of post-secondary education are critical in the reproduction of society. They are the ongoing product of many people working together, as Raewyn Connell describes so eloquently in her new book, The Good University. She means all the people too: the janitors, the groundskeepers, the secretaries, the food services staff, maintenance, everyone, not just the instructors and students, let alone administrators.
So the real question is, what sort of society do we want and expect universities and indeed other post-secondary education institutions to support and reproduce? Let's stick to universities for the moment. Do we want universities to help expand and reproduce the ideal world of capitalist fundamentalists, in which everyone is a completely atomized individual engaged in constant competition, except of course for the capitalist fundamentalists, who expect to work as a team and thereby run the show? Do we think that universities should support the development of a cohesive society that balances individual expression and competition with cooperation and community expression? Do we want universities to reproduce and reinforce the current class system and its attendant inequalities, which are the product not of merit but, quite bluntly, of theft? Obviously I have selected provocative example visions here, but there is every reason to stop pretending that class and inequality are not the elephants in the room when we discuss education in general. That is the basis of the concern about vocational training and ensuring people aren't driven into indentured serfdom by student loan debt because they have been charged both invidious tuition and usurious interest, having been told that without the credential they are seeking they will be unemployable. At times universities have served mainly as upper class twit credentialing mills in which the real point was the social connections the generally male students made among themselves, producing and reproducing the "old boys' club." At other times, an expansion of universities in size and numbers has been encouraged in hopes of quickly generating updated masses of workers, and perhaps curtailing effective social protest and change in the meantime. But if universities are a creation of people working together, then those people are likely to have diverging visions of what society they are supporting and encouraging to develop. That is not necessarily a bad thing, but it does reinforce how important it is that universities are not developed into the equivalent of a restaurant franchise. They need to respond to the specifics of their time, place, and communities. Here again, I am echoing some ideas that Raewyn Connell describes eloquently and wonderfully in her book, which is short and an excellent blend of practical and philosophical points. In our efforts to focus on the right question when we consider what universities are for and what sort of society they should support, it won't hurt us to bear in mind that capitalist fundamentalists and politicians alike share a pathological focus only on the short term. Unless you are an adherent of one of the various faith systems that insist that humans are doomed and the time before the end is indeed short, there is hardly a reason to accept such a mean-spirited and selfish perspective. We may not have universal agreement on what the public or social good is and whether universities should serve it, but these are quite reasonable and potentially productive discussions for us to have. Not pointless circling and tail-chasing about vocational training. (Top) Yes, the World is Burning (2020-06-01) And it's not just the climate. Over much of the world, a nasty intersection of factors has come together to drive an increasing number of not mere protests but outright revolts.
In october 2019, there were revolts and protests in twenty-two countries, most of them completely ignored in the media, including major protests in russia, which in conditions of the revised "red scare" could be expected to be splashed across the news everywhere as proof of how bad the russian government is. But I guess that what coverage of those protests would show instead is that russians are no more monolithic sheep following around the officially ensconced leader of their government than people are anywhere else, and that would make amping up the new "red scare" a lot harder among already skeptical and hardboiled audiences. Within the following month, four more countries began bubbling over with new protests and revolts. It is true that I am not differentiating between protests and revolts triggered by an externally supported coup, which is a key factor in many cases, and those in which people are protesting the fact that they are denied any role in governing their societies, economies, or other meaningful aspects of their lives, despite living in what are officially labelled "democracies." I have not differentiated them because in the end, whether a coup has been the precipitating event or not (maybe it is a new tax or price rise, wild fires, global warming, or...), fury, frustration, and growing desperation about the authoritarian or authoritarian-trending governments and corporate networks they live within is something they all hold in common. This is not an idiosyncratic perspective on my part either. It isn't hard to find other bloggers making the same point, from the folks at naked capitalism to brave new europe to Caitlin Johnstone. It's just the official mainstream media that isn't covering it. I wholeheartedly agree that this is major scary shit, by the way. Much as I dislike the ongoing effort to revive the corpse of world war one propaganda about how it was somehow noble when it was in fact a vicious pissing match between imperial powers, one that gave major windfall profits to industrial capitalism but inconveniently annihilated much of the potential workforce in the battlefields and trenches – and it is the latter that the veterans both civil and military wished us to remember so we wouldn't get fooled again and thereby would best honour their sacrifices – I can't argue with those who see parallels between pre-world war one conditions and the present. Here the world is again in a situation where a combination of ailing and imploding imperial powers, upstart wannabe imperial powers, and the various hangers on who hope to hold the victorious bully's coat are all jockeying for economic and military position. The hyper-rich have either conveniently forgotten that the controls on their greed were meant to keep their heads off pikes and blunt the critiques from radical philosophical and political opponents, or they think this time they have enough captured military, media, and control of food, water, and medicine to do whatever they please and get away with it. At the minimum, they have forgotten the bitter lesson that desperate people facing prospects of little more than early death for themselves and their families will fight like just what they are: people with nothing left to lose. The uncomfortable question percolating up more and more often in american independent media outlets, and anecdotally speaking among my american acquaintances, is whether or even when their current civil war is going to turn overtly hot.
By which I think they mean when it will turn overtly hot for them, as opposed to for racialized poor americans and immigrants and refugees in the united states. On top of all that, the tactic of encouraging a sense of mass confusion and disorder is backfiring on the supporters of neoliberalism. Anyone who has read Naomi Klein's 2007 book The Shock Doctrine or at least read a reasonable executive summary is familiar with the tactic of inducing disorder and confusion in order to allow corporations to rush in and begin pillaging and asset stripping the affected populations. A small-scale version of the same tactic is the sort of indoctrination applied to new recruits in military bootcamps, which can indeed produce extraordinary and frankly frightening change in the individuals who have undergone it. It only takes seeing a friend whose view of the world was a sort of gentle hippy hybridized with holding down a practical job while going to school and trying to fund his love of skydiving come back from bootcamp acting and sounding like GI Joe to be appalled. The trouble is, to keep the new faith installed and the conversion up to date, the shock has to be administered again. People involved in evangelical religions are all too aware of this, because they struggle and suffer terribly when the euphoria of their most recent religious hyper-experience wears off. It's not their fault as individuals, nor is it the fault of societies that shock-induced changes don't last. That's the nature of shock. It is also the nature of an originally shocking stimulus that keeps getting repeated to stop being shocking, to stop being effective at generating the weakened resistance that allows imposing otherwise unacceptable or at minimum unconsidered changes. For the neoliberal fundamentalists, worse yet is that it is getting harder and harder to predict and stifle dissent. The peak of their power may have been during Barack Obama's and Stephen Harper's use of the respective militaries and paramilitaries they temporarily controlled to violently suppress protests and related efforts to build and demonstrate the effectiveness of alternatives to their policies. I am not suggesting that the world is on the brink of another mass war, although that is nevertheless an unhappy possibility. Rather, my point is that the world is definitely burning. We have to decide whether the price of the apparent status quo is worth it, when the outcomes right now include an ever increasing authoritarianism now supposedly made acceptable because corporations are the ones pursuing it, and the ongoing debacle of global warming, which is getting much more attention now that venice is drowning instead of islands across micronesia. (Top) Fading Newspapers (2020-05-25) By now I doubt anyone subscribes to any claim that newspapers are healthy, thriving publications in north america. There is a broad consensus, at least in the remaining newspapers and other news outlets, that the primary cause of the troubled fortunes of newspapers is "the internet." Nothing at all to do with the phenomenon of asset-stripping that is currently destroying older department store companies and has been working its way through formerly successful newspapers for years, as summarized by naked capitalism's repost of Andrea Germanos' article 'Destroyer of Newspapers' Vulture Fund Buys Majority Stake at Tribune Publishing.
Nothing at all to do with the precipitous collapse in quality of newspaper content as old and new newspaper owners alike gutted their newsrooms because they decided actual reporters were unnecessary if they could make a pretence of journalism by using the top ten to twenty search results in google plus some carefully selected social media posts. Nothing at all to do with the reformulation of newspapers as little more than thin coatings around capsules of advertising, including advertising purporting to be news reporting, both via "sponsorship" and invidious formatting. Nothing at all to do with the growing evidence of how untrustworthy most newspapers are as sources of information on current events or ongoing complex sagas, from ongoing coup-fomentation by the united states and its allies to the growing effects of global warming. All of which is actually not to say that "the internet" is not an important factor, or rather, the specific role developed by advertising monstrosities like what google has become. For a while there, search engines in general behaved, when it came to news, something like the radio or the television news ticker, boosting headlines. That in itself was a great way to suggest news sites and subsequently newspapers for people to read regularly. This is not the way things have developed in the long term of course, and probably couldn't have been, because the incentives were and are perverse. Google makes money not from actual content but from shoving advertisements at people who are looking for stuff online. If how a business is funded depends on advertisements, you can anticipate that advertisements will grow to take up all the space they are permitted to have. Now, if the many organizations interested in having an indexed and searchable internet had happened on the solution of a subscription-supported search engine service earlier, the current state of internet search and surveillance capitalism would be quite different. But when it comes down to it, I don't think google or any other search engine is the primary cause of newspapers' difficulties so much as they are exemplars of how to outmanoeuvre older organizations playing the same advertising game. This raises the question then of why newspapers would insist on depending on advertising. Claims that the potential audience for newspapers does not value investigative reporting or accurate current events reporting are not upheld by the ongoing furor over intimations of manipulation by "social media" companies or the survival of alternative media outlets funded wholly or primarily by subscribers. Practically speaking, it is all about accountability. Subscribers can demand that any profits be ploughed back into investigative reporting, a more frequent publication cycle, a special weekend edition, and so on. That is not so comfortable for an organization rejigged so that it provides a regular dividend to shareholders, who won't see any reason to invest more in the actual reporting and publishing functions if the current formula makes money. And if the formula stops making money, or doesn't make as much, then rather than invest time and money changing the formula, they sell out, or as in the current conditions, asset strip. Notice that the core point of keeping the newspaper running is no longer the same.
Subscribers generally don't stick around to funnel money into absentee shareholder pockets while a formerly useful, if not loved, publication is reduced to a caricature of its former self, however problematic that self may have turned out to be, because a subscriber-based publication will reflect its subscriber base. It is all too easy to sneer at anyone younger than fifty years old, claiming that they believe everything online comes for free and that they would rather steal than pay a subscription fee. However, as I've just argued, what people of any age are unwilling to pay for is a bunch of crooks filling their pockets while they foist advertising and now incessant internet surveillance on us. That makes sense. Contrary to the propaganda line of the various boosters of surveillance capitalism online, people are not just going along with the proliferation of advertising, otherwise the desperate search for ways to prevent ad blockers from working and to identify bugs that can be manipulated to allow browser and user fingerprinting wouldn't be happening. "Internet of things" franchises wouldn't be shut down after apparently random numbers of years, despite having loyal, if not expanding, numbers of people buying and using the things. This tells us that what those things are generally treated as is one more means to gather lots of juicy data, which is currently being bid up to prices nearly as insane as real estate on the off chance that even the most mundane details can be somehow profitable. Plenty of people have already identified the shell game that is meant to persuade them that privacy and security are luxuries they must pay for or surrender in order to have access to "news." I suspect that newspapers are not fading so much as they are being actively strangled as far as possible by interested parties who see them not as businesses providing a service that obligates them to seek to report fairly and accurately and otherwise respect a community of subscribers. The alternative those parties are after is what they think will be a surefire short term profit driver: production and distribution of "news" and advertisements at scale, with minimal effort put into the creation of the first in hopes of making perfect manipulators out of the second. Bizarre stuff. (Top) The Colonization of Imagination (2020-05-18) A discussion of science fiction writing online led me to revisit a seven-page essay by Ursula K. Le Guin, one of her most famous, "Why Are Americans Afraid of Dragons?" Originally presented as a Clarion Workshop address in 1974, it is one of many of Le Guin's sharp and funny examinations of the peculiarly "american" way of treating fiction, by which she apparently means that of people in the united states, mainly those who are self-racialized and therefore think they are white. I have described the essay as sharp and funny, and it is. Yet, reading it in its 1979 version in the collection The Language of the Night: Essays on Fantasy and Science Fiction, it is also wistful and sad as Le Guin argues that all of us should trust children not to confuse reality with fantasy, and sketches the pinched state of adults who despise fiction unless they can tie it to their work. I was refreshing my memory of the essay when I was particularly struck by her words on the fourth page of the 1979 version. Now, I doubt that the imagination can be suppressed. If you truly eradicated it in a child, he would grow up to be an eggplant. Like all our evil propensities, the imagination will out.
But if it is rejected and despised, it will grow into wild and weedy shapes; it will be deformed. At its best, it will be mere ego-centered daydreaming; at its worst, it will be wishful thinking, which is a very dangerous occupation when it is taken seriously. Where literature is concerned, in the old, truly Puritan days, the only permitted reading was the Bible. Nowadays, with our secular Puritanism, the man who refuses to read novels because it's unmanly to do so, or because they aren't true, will most likely end up watching bloody detective thrillers on the television, or reading hack Westerns or sports stories, or going in for pornography, from Playboy on down. It is his starved imagination, craving nourishment, that forces him to do so. But he can rationalize such entertainment by saying that it is realistic – after all, sex exists, and there are criminals, and there are baseball players, and there used to be cowboys – and also by saying that it is virile, by which he means that it doesn't interest most women. Le Guin's reference to pornography gave me special pause, because her purpose in bringing it up is not to get into the subject of pornography's quality as literature or entertainment, or the grave ethical and legal issues entailed in its production. Not at all, her point is that it is a type of junk food for the imagination, which can be fed wholesomely or badly. Humans actually need to use their imaginations, and add images and ideas to them. Then for other reasons, I spent some time reading Muriel Rukeyser's book The Life of Poetry. On one hand, I simply can't agree with the claim that the common aversion to poetry among people in the united states (and also canada, but it was outside of her purview for that book) is simply evidence of psychological illness, though it may correlate with that at times. On the other, poetry is part of that vast genre of fiction, and it demands the use and freedom of the imagination. And many people in the united states have religious heritages – whether they think they are white or not – that include stern invocations against freethinking for fear that this can only lead to a loss of faith. If faith can only be as fragile as so many religious and crypto-religious creeds insist, it is not irrational or unhealthy to fear fiction and its invitation to freethinking at all. Far from it, it is totally rational, because if faith is that fragile, it is also unbelievably precious, the key item that many people are certain is their insurance against their worst possible imaginings. Still, I wholeheartedly agree with Rukeyser's broader point about the actual accessibility of poetry, and that many people have been discouraged away from it and from the riches poetry can hold. Alas, it seems to me that discouraging people away from using their imaginations is in the end meant to allow those imaginations to be colonized. After all, the equivalent of junk food for the imagination is great for an occasional treat, but highly dissatisfying when that is all there is. It is intended to be used and thrown away, this sort of junk food. It is designed to be unsatisfying, so that we always want more, more often, with more excitement. Hence pornography, action movies, and even such seemingly staid fare as police procedurals get ever more extreme, more graphic, more violent. Yet people actually want fiction of all sorts that allows for rereading, reviewing, and/or relistening.
Works that on revisits reveal new meanings, not necessarily because they are abstruse, but because between visits we have had new experiences and learned new things, and this changes how we experience them. Enough people act on the desire for more satisfying fiction by defying barriers set up to keep them fixated on the other stuff exclusively, and so they read poetry, build fandoms, and argue fiercely about what a "classic" is. Those who want to colonize our imaginations and keep them that way are not so fond of resistance. After all, in a fundamentalist capitalist environment, anything that holds our attention and pleases us without forcing us to pay for it again and again is not something that can be used to steal value from us. Atrophying our imaginations so that we are unable to recognize or accept the existence of other fiction than the junk food kind is far more profitable for and acceptable to the committed capitalist. Of course, the companion loss of tolerance for and interest in non-fiction is just a convenient side effect. (Top) Atomization and Its Discontents (2020-05-11) The ultimate state of a person, according to the latest and most extreme version of liberalism, is one of complete self-sufficiency. This state is characterized by the individual in question being totally responsible for their own food, clothing, shelter, health, entertainment, and anything else we might think of that doesn't fall into one of those categories. Based on my observations of the latest advertisements for cell phones and the internet of surveillance in the form of things, such a person achieves all this via very specific ways of interacting with the world. They interact with the world via a very narrow type of technology: cell phones and, in more affluent places, computers and various "internet of things" devices. Via these devices, the ultimate liberal individual orders whatever they want using the cell phone or a voice controlled gadget that monitors them at all times. They manage their appointments for work and the officially social events via their social media applications of choice, and perhaps interact with people much of the time primarily through computers, whether by email, video meetings, or chat while playing games. The most well off such individual can literally speak what they want, and their internet surveillance devices will promptly provide it, or else suggest and nudge what said person must really want. All interactions with other people can be achieved on a "just in time" basis, either using the rubric of the so-called gig economy or highly structured, short term appointments usually labelled "speed dating" and similar. It's all so excruciatingly efficient, and matches certain visionings of the future favoured by such characters as Kevin Kelly and Ray Kurzweil. Yet all is not right in this present with the future unequally distributed, to paraphrase William Gibson. After only a few minutes of effort, it is easy to find articles in both mainstream and alternative media reporting on the grievous state of people living in the countries currently deemed "most modern," especially the united states.
From blogs like naked capitalism to old guard propaganda outlets like the new york times, a steady stream of articles details dropping life expectancies, a raging opiate epidemic, and rising numbers of "deaths of despair." The people most likely to have something like this ideal liberal individual state in their own lives are not doing so well in or near those conditions. Yet we have been told and are being told over and over again that really, these people must be happy, they have everything they could possibly want. After all, they are totally free of any obligations to other people unless they choose to take them up. That may even sound like a convincing description, so long as we don't let ourselves notice that all of us, including those many millions living in the united states, live within varying levels of constraint on our choices. Many, many people live under the kind of constraints that force them to "choose" to take out crippling loans of such usurious quality that they will never be free of debt and effective indenture to work at any miserable job they can scrape together, including submitting themselves to prostitution and working in the other lower levels of organized crime, unless and until the day they die. And even then, maybe their heirs will be saddled with those debts, because they have been roped into accepting a particular sort of social obligation to their creditors. I have also observed a renewed interest in the research and theories of Émile Durkheim, especially his study and explication of the causes and results of "anomie," a word lightly anglicized from the greek and meaning simply a lack of social and ethical standards or customs. Watch out for those who try to define the word as "lawless," because that is at best misleading. Laws can be made to either reinforce or break down accepted social and ethical rules, or to maintain or change accepted customs. We have been encouraged to treat law and ethics as simple equivalents, but it isn't difficult to think of examples that would be best represented by a Venn diagram overlapping in only a small region. Organized crime is "lawful" in the sense that it follows specific, formalized rules, and it involves a system of ethics, however appalling many of us may find it. Furthermore, changes in the formalized rules may be agreed on and enforced to drive a change in those troubling ethics, leaving them no less troubling, to be sure, but responsive to new conditions. In any case, Durkheim's broad point is that under conditions of extreme division of labour and/or social change, people may be reduced to a state of anomie in which they lose any sense of or commitment to the social and ethical standards or customs of the communities they were formerly, or even never, part of. Humans don't develop customs, ethics, or even laws alone: they are the product of our interactions with one another, and the challenges that these may involve. This is not a new phenomenon, and there is arguably even a religious version, "acedia," which means simply "a state of not caring." Characterized in the literature I have checked as a mortal sin, acedia especially torments cloistered religious, who despite endeavouring to follow a calling based in focussed ritual prayer and work to support that ritual, apparently become listless and unable to do anything in the worst cases. This sounds suspiciously like depression induced by repetitious work that doesn't engage the mind or the senses, combined with acute social isolation.
Add to that the austerity of the sorts of buildings designed for monks and nuns to live in when pursuing a cloistered existence, and it hardly seems surprising that few would be able to withstand the pressure from these conditions overall and remain engaged with their tasks and in a full state of care and interest. Perhaps it shouldn't seem odd that many aspects of the state of perfected liberal individualism are so similar to a secularized version of religious cloistering, and that in both cases the destruction of social bonds should be so troubling and frankly dangerous to the affected person's physical and mental health. After all, "secularism" itself is still primarily a state of ambient christianity with the obvious labels rubbed off, with shaky maintenance of separation of church and state. (Separation of church and state is absolutely a good idea; we just shouldn't fool ourselves about whose religion is still being imposed on everyone anyway – the separation is less about religion than it is about who wields coercive social power and what that power is allowed to affect.) On the other hand, despite the intensive structuring that this sort of "computerized individualism" apparently entails, with every minute scheduled and a chance to reassert how productive and efficient you are if you follow the precepts this model of being demands, the result in how people experience the world is counterintuitive. So much measurement and restructuring of life to be measurable, and yet a few minutes of searching online, or better yet talking to other people in firmspace, reveals a growing sense of confusion. Too much is happening, everything is confusing. Most people are finding themselves in worse conditions than ever, despite their determined efforts to follow the path they were told leads to solid prospects including steady employment, a roof over their head, and so on. This is the underlying theme that attempts to encourage intergenerational fighting are meant to distract us from. Millennials, baby boomers, and generation x are all mostly finding that they have been saddled with someone else's bill of bad goods, including that someone else's bad debt. It is little wonder they are discontented, and the great hope for all of us is that the majority of us accurately pick out who that "someone else" is in order to put a stop to the imposition of complete social atomization on the majority of humans on Earth. (Top) Versioning and Document Encoding (2020-05-04) Besides the quest for the perfect notebook, which is probably endemic to writers the world over, as both a writer and a historian I have an abiding interest in document encoding. This is first simply about not losing data, a need shared with anyone alive, practically speaking. In the computer-specific case, this can still be a hair-raising issue to cope with, though far less so than even ten years ago. Today we have far better automatic back up software, autosave in programs that doesn't slow everything to a halt even with large files, and far better file recovery in the event of a program or even system crash. At the moment things seem to have gone a bit backwards in the macos corner of software, where apple has apparently deprecated the "file changed" marker in the leftmost window button in each program window.
This change is utterly mystifying to me, because having a "you haven't saved this yet even if the program has performed an autosave" marker is a wonderful bit of insurance, besides giving you fair warning that there is an outside chance you could lose your most recent typing if you haven't saved yet. That allows a quick assessment of the potential worst case scenario should file recovery have problems, and that can certainly happen with complex file formats that are actually folders of compressed files. True, it is an extra bit of code plus icons that may never be absolutely necessary, but this is hardly a contribution to software bloat. But let's return to the question of text encoding in particular. The digital humanities quarterly's 2019 edition includes a paper by Desmond Schmidt discussing encoding for the purposes of digitizing documents that have been annotated or reedited preparatory to producing a final edition, or edited after printing to correct errors. The usual means of doing this can be hard on several levels: hard to encode, hard to read, hard to search, hard to index, hard to protect against bitrot and software obsolescence. In A Model of Versions and Layers he suggests a model for how to manage and potentially remove all these sources of difficulty. Besides the practical interest of Schmidt's discussion, there is some deeper geek interest in that it is modelled in effect on how computer code is versioned to track changes, in a way that genuinely applies underlying principles rather than just shoving a computer at the question. As programmers know well, program code is customarily saved at intervals. That can be set to happen automatically so that a save is carried out every x number of minutes, just as anyone who has used most current software has also encountered, consciously or not. The quest for a version numbering style that perfectly reflects the amount of change between two versions, however close together or far apart they may be, haunts many programming project managers. In the *nix world, utilities like diff can be fed two versions of the same program, then report the changes between them. The nature of the changes is not encoded in either version; instead, diff identifies them by comparison. It's not as overtly fancy as the multiple colours and comment bubbles complete with usernames possible in large word processing packages, but again, the underlying principle is the same. Schmidt is arguing for what could be referred to as a forward modelling approach to versioning and encoding of changes. The starting point is a visibly edited hard copy document, and the sequence and interrelationship of the changes can only be worked out from internal evidence. The challenge is to work backwards to recreate a plausible series of versions that remove the changes until what is left is the original for that specific hard copy. Then, all the hardworking scholar has to do is encode the results in a robust fashion that doesn't lose all those hard won and tested deductions. Schmidt argues that the way to achieve this is to create a separate, unicode-based simple text document, following a minimal markup scheme. He may not be aware of markdown, which is already very close to the sort of markup scheme he has in mind, is widely supported, and quick to learn and apply. As he notes, the resulting versions could simply all be kept in the same folder, and the folder named for the original document, just as many of us have learned to expect in desktop computing environments.
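For readers who have never run diff themselves, the principle is easy to see in a few lines of code. Here is a minimal sketch using Python's standard difflib module; the two sample "versions" and the file labels are invented for illustration, not drawn from Schmidt's paper.

    import difflib

    # Two versions of the same short text. Nothing about the changes is
    # recorded in either one; the changes are recovered purely by
    # comparison, just as the *nix diff utility does.
    version_a = "The quick brown fox\njumps over the lazy dog.\n"
    version_b = "The quick red fox\njumps over the lazy dog,\nevery single time.\n"

    # unified_diff compares line by line, like diff's -u mode.
    for line in difflib.unified_diff(
            version_a.splitlines(keepends=True),
            version_b.splitlines(keepends=True),
            fromfile="version-a.txt",  # hypothetical labels for the report
            tofile="version-b.txt"):
        print(line, end="")

Run as-is, this prints the familiar diff report, with removed lines prefixed by "-" and added lines prefixed by "+," which is exactly the comparison-based recovery of changes described above, applied to prose instead of program code.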
Overall, Schmidt lays out an excellent example of how to think through the challenges of using and applying computers to digitize documents. From the beginning, we are reminded of how much intellectual labour goes into encoding in the first place. "Tagging" any document, whether born digital or starting from hard copy, is far from a rote exercise. It is time consuming and takes considerable labour, even before such challenges as tag revision or extension come into the picture. This is why google is so busy abusing captcha to get images tagged by people for free. It is easy to forget that this labour in itself has to be respected and accounted for, otherwise previously applied labour may be lost by accident, or by necessity because a change in file format imposes not a mere conversion but a complete redigitization. This helps him carry out a robust analysis of which markup styles, character sets, and file formats are most likely to be usable even if the software used to look at the digitized texts changes completely within months, let alone years. The model and practice Schmidt suggests also shares features with such programs as nisus writer pro and scrivener. Both encode data in a baseline rtf format in individual files. The program is focussed on doing the more elaborate display work, based on stylesheet calls in nisus, or via tags and organization in scrivener. It is possible to open the rtf files in another program if necessary, including TextEdit or a plaintext editor like BBEdit. Admittedly, it wouldn't necessarily be pretty in BBEdit, but the key question is always: if the original program was gone or the file so borked that it is unusable, can you get the data out? As a matter of respect for people using these programs to carry out their work from day to day, I suspect that this approach will develop into a longer term one that outlasts the current implementation of the commendable and absolutely necessary "open document" initiative. (Top) Fairness and Accuracy in Reporting is an Important Goal (2020-04-27) The challenges of providing fair and accurate coverage of newsworthy events have never been simple, and since they arise from human relationships over time, they never stay the same shape. The goal of providing such reporting, including making reasonable decisions about how to answer the questions of what is newsworthy, how to cover it, and for how long, is an absolute gold standard. Nobody is going to manage to perfectly achieve it, and there are times that it will be further away than others. Not all technically unreachable goals are positive when we take them on board to guide our behaviour and ambitions, but this is certainly one of them. The technical unreachability is no excuse for cynicism, and I understand that journalism majors manage to resist the temptations of cynicism more often than not in their field. Ah, and I should not miss this key detail: the fair and accurate coverage I am discussing here is not the supposed appearance of those qualities, but their actual existence. Therefore mendaciously setting up "two opposing sides" to present an issue when that in effect supports oppression is not fair or accurate. It is propaganda. So despite recent self-serving claims by corporate representatives currently controlling most media outlets, they are making moral and ethical judgements every time they decide what they cover, how they cover it, and for how long. That is inevitable. It's just that they are trying to pretend they aren't.
The division between the opinion pages and the reporting pages is difficult to maintain, especially since the editorial team will inevitably inflect the reporting. They will determine which article pitches and which general areas of interest their publication will cover. Or they will take their marching orders from their corporate overlords, as the case may be. Yet it has been clear from the earliest days of mass-printed news in the form of broadsheets and ballads, and eventually newspapers and all the mass media afterwards, that the importance of thorough, critical reporting can hardly be overstated. Perhaps that is the better description to use for reporting, "critical and accurate" reporting, in the sense of actually fact checking, including learning what people whom we may disagree with are saying and writing directly from them, and reporting that as clearly and directly for others as we can. It isn't necessary to lie about what a person says or argues if what they say or argue is actually abhorrent. Quotes that don't just repeat their most clickbait-ish aphorisms are especially important, because sometimes those bits are not representative of what a person actually says or believes at all. Of course, this also requires judgement. Hence the use of multi-paragraph statements of ethical and political positions rather than empty slogans that was once much more common in newspapers. It would be neat to see a serious resurgence of these, with a reflective update at the beginning of each year considering whether the statement still makes sense in light of experience, success in enacting it, and so on. As part of my research for this thoughtpiece, I have been following a number of different "news sites" by means of their rss feeds, as well as digging into their archives a bit where they are not fully paywalled. The active and ongoing destruction of investigative reporting, together with the deliberate suicide of refusing to provide solidly researched and wide ranging stories that would help maintain a subscriber-based model of newsroom funding, is alas no news to anyone at this point. There is a reason that an important part of the blogosphere is made up of independent investigative journalists, some funded via such providers as patreon or paypal. People are genuinely supporting such blogs, so contrary to the tech industry claims, it is indeed possible to compete with what is supposedly "free." It's just that the knowledge about content quality in news reporting and how it is achieved is not always handed down between generations. I am not a fan of the term "media literacy" – it is a marketing term, not a description of a skill any of us needs or can develop. "Critical thinking," that way of thinking in which we ask questions and check whether we should trust a source before adopting it as one, is a much better description. For myself, I have four things that I check when reading articles from a news site, and often any other sort of site. You may be surprised by what isn't there.
Solid and fair coverage of other perspectives, including women's and Feminist perspectives more specifically, is not precluded by having "right" or "left" political leanings. That may be stating the obvious, but in the current conditions of encouragement to supposed "ideological purity" as opposed to the actually far less labour-intensive practice of checking things out for ourselves, I feel it is important to state this up front. My definition of a good news site specifically is not predicated on it advocating for political or social positions I subscribe to, since in real life no one can expect what happens in the world to always conform with what they believe should happen. But I do expect a good website to report what happened, and not shut out coverage of certain events for "wrongpolitic." I have also been following another site via its rss feeds, the necessary but tragically uneven fairness and accuracy in reporting (FAIR). I don't mean "uneven" because the people working hard at that organization are focused on the united states. Far from it, that is clearly their chosen remit because they can't cover the world, and they state up front what their remit is. So we can agree or not with them about their decisions, but we know what they are. My difficulty is with some decisions they have clearly made that are not stated up front, and are wildly problematic. They are not the only ones to have the issue I am about to raise, which is all too common for reasons that I will explain in a moment. But they did not start as badly off as they are now, and no doubt they have varied and wavered as we must expect of fellow human beings. A key area in which news coverage is generally grievously bad and horrifyingly inaccurate is anything to do with Feminism, as opposed to supposedly woman-friendly forms of liberalism. I have watched and read coverage of Feminist protests where the placards in the photos and the sound in the video reproducing what the women are actually saying firmly contradict the copy written or voiced over around them. Such clumsiness, or perhaps even subtle resistance to ridiculous editorial policies, is not just part of independent news blogs with varying levels and types of funding support and editorial boards. I have seen examples from united states and canadian news networks, both privately funded, and in the case of canada, state funded. So it occurred to me to see what the folks at fairness and accuracy in reporting had found and written about with respect to coverage of Feminist protests, speeches and so on more recently. After all, their discussion of the way north american news outlets depict what is happening in venezuela is exemplary, based on comparison and contrast, honestly selected quotes (no cherry picking), and gives fact-based reasons to question the mainstream coverage. In other words, they are not just derogating news outlets for their politics, they are taking issue with their coverage, which is being detrimentally affected by their prioritization of their politics over all other considerations. Under its topic headings, FAIR has several under which women's issues more generally and perhaps Feminist action are most likely included: Domestic Violence, Homophobia and LGBT Issues, and Sexism. The first has apparently been a moribund topic area since 2017, despite soaring rates of domestic violence in north america, let alone the united states.
It has noticeable peaks and valleys, with several articles between 2016 and 2017, and a scattering between 1993 and 1997, including an examination of a specifically anti-Feminist publication. In the second, critique of depictions of lesbians and censorship of media in which lesbians appear has apparently vanished since 2005. By 2010, coverage in this area is almost exclusively about transgender issues, including startling repetition of "transactivist" messaging without apparent analysis. Since 2017, it appears that FAIR has left off any analysis or critique of news coverage that would fall under this header altogether, despite its ongoing controversial nature. For Sexism, critical analysis apparently ended in 2015, after fine coverage from 1989 on. I concede that FAIR's website may not provide a complete sense of its actual coverage, and/or that what they have depends in part on who is available to do the work based on their funding levels and competition for journalists. But this is also their select sample to represent their coverage, in areas of socio-political interest that are not settled for good anywhere, let alone in the united states. This is still the result for a period in which women's rights, Feminism, and transgender ideology have all been of high interest and concern, including an important peak in the past four to five years. This is a surprisingly bad result for an organization of such solid reputation, and it is no Johnny-come-lately, clearly stating on its website that it "...has been offering well-documented criticism of media bias and censorship since 1986." So it is a real surprise to be unable to find even coverage of the ongoing debacle of online censorship by facebook and twitter. That censorship has preferentially affected women, especially Feminists and lesbians attempting to discuss such matters as gender identity ideology, women's rights, or at this point free speech itself. Whether "social media" corporations like it or not, and whether the rest of us like it or not, they are part of "the media" and their censorship practices deserve scrutiny. (Top) An Interesting Coincidence (2020-04-20) Sometimes real life presents remarkable coincidences, the sort of thing that I suppose statistically is bound to happen eventually, especially for parallels and repetitions that are fairly superficial. Still, the coincidences around theatres are so intriguing that I think they are worth drawing out to provoke thought about them, and how they have changed over time. The concept of a theatre, a place where people literally go to look at something, is ancient. The ancient greek theatres built around natural stone formations and later built in purposeful imitation of them are justly famous, although it is unlikely they are the very first examples of theatres in general. The many predecessors composed primarily of people setting a blanket on the ground and sitting on the side of a hill are unlikely to leave any sign behind them. There is a strong argument that ancient greek theatre was a form of religious ritual at its root, and evidence for ritual gatherings goes back further into the human past than the ancient greeks as well. It looks like the key shifts the ancient greeks made that we can be sure about were scale and permanence. For the time being at least, it is not possible to make evidence-based arguments about how who was allowed to attend and who wasn't changed during this shift in ancient greek practice.
In athens the theatre was a male-only thing so far as we can tell, yet we can't even be sure all the greek polities debarred women from joining the theatre audience. We do know that by the hellenistic period, the theatre had become a more contained space, although whether women could attend is still debatable to some degree. By the time the romans enter the picture and take over everything, the theatre is even more contained, and perhaps stands as a sort of contrast to the stadia used for horse racing and gladiatorial games, where cheering, intra-audience violence and even interference with the competitors was more possible if not more likely. In any case, all these spaces were used primarily in daylight hours, and part of the point to taking up a seat was to be seen as much as to see. Theatre crowds were very much social crowds, albeit pursuing a different sociality perhaps than those in the stadium. This sociality could be a real challenge too, since an outdoors, daylit theatre in the mediterranean can easily be very hot or very cold to sit in for long periods on benches of various types. The equivalent of the "cheap seats" probably offered few amenities apart from perhaps the seats themselves, leaving it to the spectators to bring a cushion and a hat to fend off the Sun. Something interesting started to happen as time went on though, in that the theatres in europe became ever more enclosed, no doubt in part to enforce payment to see the show, especially once it was an entertainment more so than an overtly religious ritual. (Remember, it is the religious ritual part that made the christian church hierarchies so hostile to theatre.) But once the theatre had been enclosed from the regular out of doors, the proprietors had two problems: lighting and sound, so the plays could be seen and heard, and crowd management. Perhaps especially crowd management, as we can easily look up a range of accounts of renaissance theatre and later opera and symphony performances in which audience members did not hesitate at all to heckle, jeer, and even hurl things on stage. A dissatisfied crowd was a dangerous one, difficult to remove from the building safely, for its own sake and that of the cast and the building, and difficult to persuade not to take over the show via sheer force. This is how we got fixed seating including fixed cushions if any, bigger gaps between audience and stage, and lowered lighting over the crowd. Still, theatres right into the early twentieth century were large and elaborate, as the black and white photo accompanying this thoughtpiece illustrates. The element of seeing and being seen, plus the idea that a theatre expressed the social cachet of the location, even if only as a marketing ploy, did a lot to encourage visually luxurious interiors. This persisted in smaller theatres that converted to showing moving pictures, where the screen was blocked by the curtain instead of what were once the stage sets. Then it seems movie theatres began to split in aesthetic from theatres specializing in live performances, being reduced to the drab sardine cans with sticky floors many of us are familiar with today. Going to a movie is not necessarily a social event at all now, and we are under particular pressure to keep quiet in the dark while the movie plays and the sound blasts us from all sides, and those of us who are more obstreperous about avoiding ads try to sort out which start times are actually start times.
Adults at least are expected to keep quiet, keep eyes front, and watch, unless they are reviewers, in which case no doubt they may take notes. There is a great deal of fascinating superficial resemblance between the present and arguably most common form of theatre, the movie theatre, and an allegory developed by Plato. That allegory is of course the (in)famous allegory of the cave. Plato is not making any comment about theatre with this allegory, he is in fact using it to make particular points about human knowledge. He especially wants to argue that generally received human knowledge is flawed, and that those who struggle to achieve accurate knowledge will not be easily accepted by those who have not undergone that struggle themselves. He does this though via a curious thought experiment, in which many people living in a cave are chained to seats in that cave, from which they are shown shadows and presented sound effects in lieu of looking at and listening to real things. The only light these prisoners experience comes from behind them, a weak firelight so poor that such people would be utterly blinded by actual sunlight, as indeed Plato's exemplar escapee is at first. There are many odd things about this allegory, not least the uncomfortable question of who is holding these people prisoner and showing them nonsense. After all, the only people who can know differently have escaped and come back, apparently able to move about freely and attempt to disturb the chained audience's peace. At that point in the narrative, Plato's mouthpiece Socrates argues that the audience would declare the free person's senses ruined, and that if they were freed themselves they would refuse to follow his example, and would kill him if they could. For now let's leave the philosophical puzzles aside and consider how surreal it is that so much of the modern movie theatre experience corresponds to the allegorical cave. Not the chaining or people taking fiction for reality of course. Right? (Top) In That Case, You'd Be Seriously Lost (2020-04-13) A recent orienteering event, in which people compete to see who can navigate to various checkpoints in the least time using a map and compass, reminded me again of an essay by Jim Baggott, published last year on aeon. In itself, Baggott's essay is not about orienteering at all. In fact, as its title tells us, "Post-Empirical Science Is An Oxymoron, And It Is Dangerous," Baggott is defending real science, knowledge developed via systematically collected, repeatable observations. Empirical science is not the only way to learn, but so far it has proved to be the best system of learning that permits self-correction and retention of what is true to the best of the human ability to understand. Please note that "empirical science" as defined here is anything but limited to whatever people who think they are white call science. The basis is systematically collected, repeatable observations. That can be achieved with or without fancy laboratories, and passed on with or without the currently creaking research publication system, which is creaking because of efforts to make it more amenable to rentierism, not the nonsense parroted by so-called "post-modernists" and "post-structuralists." Nonsense post-structuralists and post-modernists may be parroting, but their words are far from harmless.
They are key weapons in the arsenal of those who want very much to reassert authoritarianism in all its possible guises, and there are two areas the proponents of authoritarianism always attack first: women's rights, and empirical science. So Jim Baggott is fighting the good fight in this essay: he is actively working against what is at best a very dangerous form of woolly thinking, at worst something far more sinister. He has so much good to say that his selection of bizarre, weak, and therefore catastrophically bad examples is mystifying. It comes close to completely self-defeating, as I observed in a comment thread about Baggott's essay that got completely derailed debating those distracting pseudo-examples instead of the key point he was supposed to be making, as given in the title. The first self-defeating example starts out well, in that he focuses on the notion that humans know more about how the physical world works than we did before, because of science. The hedge he has made here is a perfectly fair one. There is not even a broad consensus about what other worlds there are, if any, besides the physical world. Then he tries to argue that scientific progress is why "we" have "smartphones" and the ancient greek philosophers did not. This comes across as an almost complete non sequitur, which is unfortunate, because what Baggott is getting at – I think, based on the principle of reading charitably – is that more empirical knowledge about how the physical world works had to be built up between the time of the ancient greek philosophers and now to be able to conceive of and build smartphones. That is plainly true. But as set out, readers can easily get railroaded off into arguing about whether smartphones are really a scientific advance or something else. This is a hard knot to unravel, because for the most part Baggott does not attempt to unpack the various moral judgements that tend to tag along with the terms "scientific advance" and "scientific progress." It is also true after all, that being able and willing to make smartphones is not in itself inherently good. The other completely distracting example is that of using a GPS application on a smartphone to navigate with. Now, as it happens, there are neat bits tied up with this example, not least that it is one of the few potential daily life tasks that require the application of Einstein's theories of relativity. He is probably trying to be funny when he comments that without the application of Einstein's theories, "after a couple of days you'd have a hard time working out where on Earth you are." Except no, no you wouldn't. That's why we still have paper maps, road signage, and such wonderful social options as asking other people to help us navigate instead. I am not trying to strawman here, or make an updated version of the critique of writing that Plato attributed to Socrates in a dialogue set out in and preserved in writing, as it happens. The point is, the conditions under which Einstein's theories apply on Earth are very few, and the GPS example is one of the even smaller number of these that we may interact with ourselves. The "empirical science" bit that it could show, if Baggott had approached it this way, is how our own observations of errors in navigation from using a GPS application could allow us to accurately determine that not enough satellites were detectable by our GPS receiver for it to be useful, so we had best switch to another way of navigating.
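To put rough numbers on the relativity point, since they clarify why the correction matters at all: the standard textbook figures are that orbital speed makes the satellite clocks run slow by about seven microseconds per day, while the weaker gravity at orbital altitude makes them run fast by about forty-five, for a net gain of roughly thirty-eight microseconds per day. Light covers about three hundred metres per microsecond, so uncorrected clocks really would pile up kilometres of ranging error daily. A back of the envelope sketch in python, using approximate published constants rather than anything authoritative:

# Back-of-the-envelope check on the relativistic clock effects for
# GPS satellites, using approximate published constants throughout.
C = 2.998e8        # speed of light, m/s
GM = 3.986e14      # earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6  # mean earth radius, m
R_ORBIT = 2.657e7  # GPS orbital radius (~20,200 km altitude), m
V_ORBIT = (GM / R_ORBIT) ** 0.5  # circular orbital speed, ~3.87 km/s
DAY = 86400.0      # seconds per day

# Special relativity: a moving clock runs slow by roughly v^2 / (2 c^2).
sr_loss = (V_ORBIT ** 2 / (2 * C ** 2)) * DAY

# General relativity: a clock higher in the gravity well runs fast by
# roughly (GM / c^2) * (1/R_earth - 1/r_orbit).
gr_gain = (GM / C ** 2) * (1 / R_EARTH - 1 / R_ORBIT) * DAY

net = gr_gain - sr_loss
print(f"special relativity: clock slow by {sr_loss * 1e6:.1f} microseconds/day")
print(f"general relativity: clock fast by {gr_gain * 1e6:.1f} microseconds/day")
print(f"net drift: {net * 1e6:.1f} microseconds/day")
# If left uncorrected, that timing drift becomes a ranging error of
# roughly net * c, which works out to around 11 km per day.
print(f"uncorrected position error: ~{net * C / 1000:.0f} km/day")

Ironically, this arithmetic would have made a far better version of Baggott's own example: a correction derived from theory, measured against real clocks, and applied every day, which is empirical science doing exactly what he says it does.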
In other words, I think a much stronger argument for empirical science is its basis in an accessible and repeatable criterion for putting together cause and effect accurately. Once any of us is clear on the difference between deduction, reasoning from a general principle to make sense of observations, and induction, reasoning out a general principle from many observations, we're good to go. To deduce: all dropped blocks fall, this is a block, so it will fall when dropped. To induce: every block dropped so far has fallen, so dropped blocks fall. We all learn these things by necessity, even if we never learn to label what we're doing with such fancy words, because that's how we figure out the basics of our daily lives. When we watch toddlers chucking blocks repeatedly or pouring out juice or whatever, at least some of the time those kids are literally just seeing what will happen, because they don't know. They are performing simple experiments. Those are the sorts of observations that in time build up into our adult knowledge of the physical world, a knowledge that we can apply to work together with one another on all manner of things for all manner of purposes. Whatever else happens, that key technique of experimenting and paying attention to the outcome, and then trying to make sure that what we observed is actually causation not correlation, is an extraordinary mental multitool that happens to show in particular relief in the case of empirical science. And that mental multitool is a key element of our bullshit detection systems, and we need those for more than just science. With all that said then, what about the people who are honestly concerned that the notion of there being a single, absolute truth is nonsense, and dangerous nonsense that itself can be abused to support authoritarianism, among other ways of being most of us would like to discourage? Well, the funny thing about this claim is, empirical science is in agreement with it. In science, there is not a single, absolute truth forever. We are human beings whose experience and knowledge is inevitably limited by our specific place and time. That's why empirical science doesn't have a point at which all the scientists throw up their hands and go, "Well, that's it, there's nothing left to learn." Occasionally very influential scientists have suggested something similar, usually along the lines that all the major puzzles have been solved and now it is down to decimal places. That however, is not empirical science. It may be an expression of more or less understandable exhaustion or bafflement, but it is not empirical science. What empirical science does say, is that the longer a given explanation holds up under scrutiny and experiment, the more we can simply accept it as the truth, though it may not be the absolute truth. Here we can go back to Baggott's discussion of gravitational theories, where Isaac Newton's theory is all we need in our everyday lives for the most part, but it cannot be extended to conditions where Einstein's theories apply. That doesn't make Newton's theory false, that makes it limited, as Baggott lays out quite well. Some proponents of post-structuralism and post-modernism want to claim that if there is no absolute truth for ever and always, there is no truth at all. The ones who take that tack usually don't take too much longer to begin trying to persuade anyone who will listen to them that this means that any moral or ethical arguments are therefore not only merely contingent, but absolutely contingent. These are the people who try to argue that a person can consent to abuse, and therefore be rightfully abused.
But this is nonsense, because we can observe the actual impact on the person who is suffering abuse, and see that they are being harmed. We have evidence available to us that people suffering abuse can be persuaded, at least for a while, that they somehow deserve or chose what they are suffering, and that this persuasion cannot be equated with a person making a free choice at all. This goes back of course to my earlier point about bullshit detection, and that remarkable mental multitool we develop from our earliest days. This was of course what Baggott was getting at. Alas for his distracting examples! (Top)