FOUND SUBJECTS at the Moonspeaker
Military Projection (2024-05-27)

One of the stranger sagas of the turn of the twenty-first century is the rediscovery and explosion into a meme of the "Keep Calm and Carry On" poster, originally designed by the british government early in the second world war. Luckily this particular poster was never widely distributed, because it doesn't take much research into civilian records and the reports collated by mass observation to realize these posters would hardly have raised morale. They were more likely to have utterly infuriated people, among whom were many survivors of the first world war fed up with the way their government and the military authorities insisted "the masses" would collapse into disorder and rioting in the event of any bombing campaign. The first world war had already shown that the insulting projections, extrapolated from the reactions of male soldiers to bombardment and from sexist stereotypes of women, were so false that the government and military authorities should have been embarrassed, but clearly they were not and learned nothing.

This does not make any claim against the bravery of general infantry, by the way. Quite the opposite: in world war one, general infantry were poorly paid, miserably led, and it seems the model many of their commanding officers used for them was that of a pack of barely trained dogs. The general infantry were mostly cut off from their established support systems, with every reason to expect they would not have their needs for basic food, shelter, and medical care met in a timely manner. Worse yet, the early claims that the war would be easy and "over by christmas" meant they suffered more and more loss of morale as the war went on. It is a miracle they were able to develop as much cohesion as they did, and the general infantry did adjust to the horrifying conditions of the extremes of trench warfare as best they could. With these details in mind, it is easier to understand why recruits go through brief basic training, then longer unit training. Yes, it is to practice in something like battle conditions, but even more importantly it is meant to establish unit cohesion and trust in the military structures their lives will depend on when in service. Meanwhile, civilians have all these things in the form of their communities and familiar surroundings.

In one of those synergies so common in our day-to-day lives, before getting too far into writing this thoughtpiece, I happened upon two excellent articles bearing on the strange projections about "what the regular person will do in extremis." One is from the Teams Human substack, Alarm is Appropriate, the Volcano is Erupting, by Chloe Humbert, dated 6 july 2022. This post introduced me to the term "elite panic" and led me to another excellent link briefly outlining the meaning of this term and who coined it, James B. Meigs' 2020 article on the Alaska earthquake, Elite Panic vs. the Resilient Populace: The Lessons of a Forgotten American Disaster. He writes:

Disaster researchers call this phenomenon "elite panic." When authorities believe their own citizens will become dangerous, they begin to focus on controlling the public, rather than on addressing the disaster itself. They clamp down on information, restrict freedom of movement, and devote unnecessary energy to enforcing laws they assume are about to be broken. These strategies don't just waste resources, one study notes; they also "undermine the public's capacity for resilient behaviors."
In other words, nervous officials can actively impede the ordinary people trying to help themselves and their neighbors.

UPDATE 2023-01-26 - It seems the origins of the term "elite panic" lie with scholars Lee Clarke and Caron Chess in "Elites and Panic: More to Fear Than Fear Itself," in the journal Social Forces, 87(2), 2008: 993-1014.

UPDATE 2024-09-08 - If anyone had told me how horribly relevant this thoughtpiece would be, as israel continues to aerially bombard the gaza strip in occupied palestine, I would certainly not have believed them.

If you read both of these posts and follow the links and references to the various studies, you will soon learn that general populations are incredibly good at responding to and managing disasters. Or perhaps you have already read Rebecca Solnit's A Paradise Built in Hell, or (though I hope not) you have experience dealing with a disaster, so this is familiar knowledge.

Turning back to the grim question of aerial bombing, the whole reason I wound up digging into this topic was to answer a stubborn question. Besides being a spiteful act intended to do as much damage and cause as much death as possible, what was supposed to be the point of indiscriminate bombing? Was it actually militarily effective? I could see targeted bombing being effective, if accuracy could actually be achieved, because it could do such sensible things as destroy an opponent's air force, disrupt communications and transportation, and so on. But just randomly slaughtering and wrecking civilians and civilian buildings looks obviously counterproductive, since it is bound to generate resistance. My go-to book for starting references and pointers to more detailed records and studies is Sven Lindqvist's 2003 book, A History of Bombing. It is simply shameful this book seems to be completely out of print. Sure enough, Lindqvist not only includes direct quotes from primary sources revealing an early expression of what we now call "elite panic," the same attitude that would later inspire the quietly dropped "Keep Calm and Carry On" posters; he also includes references showing that civilian and military leadership were already well aware before world war two of the ineffectiveness of indiscriminate bombing for breaking morale or resistance in enemy civilian populations. Lindqvist also engages in a fascinating examination of the theme of "striking from on high" as a means for mortals to pretend to god-like powers and control over others. He didn't take this too far, as his book has enough material to cover already. Nevertheless, it is another source of clear evidence that "elites" are convinced they are somehow above and beyond "mere mortals" and supposedly have perfect knowledge of how "mere mortals" will react, to the point of striving to force the general population to behave according to their beliefs. It is now a common joke at the expense of economists to refer to them complaining how everything would work out perfectly if only the real world had the decency to behave according to their models. But the dangerous fact is there are many people who seriously think this way, and they are not only economists.

More Modernity (2024-05-20)

Among the various thoughtpieces here are two from mid-2019 on the subject of rejecting "the modern," a strangely amorphous concept excellent for obfuscating reality. There didn't seem to be anything notable to add, especially after I had managed to put together some relatively sensible thoughts on the topic of the art movement known as "modernism."
Then, quite unexpectedly, in reading through the second, 2009 edition of Samir Amin's book Eurocentrism, I read yet another definition of modernity (on page 57 of the monthly review press version). According to Amin, "Modernity is based on the principle that humans create their own history, individually and collectively, and, as a result, they have the right to innovate and disregard tradition." I must admit that to me the two main parts of this definition are non sequiturs.

Starting from the second part, what follows "as a result" is a truly strange thing to read. We are human beings living in a changing world and changing societies. Whether we mean to or want to, we will innovate in order to maintain the broad principles of our cultures as far as we possibly can. This is hardly a "right," it is a necessity. We must change and readjust to remain part of our cultures. If we don't, either we will literally die, or else we will assimilate into other cultures. We have no means to hold back this requirement; we don't get to choose. Yet I do concede to Amin the point that to the extent possible, it is the right of communities and even individuals to refuse to go along with changes they sincerely disagree with, or to propose and attempt to enact changes they deem necessary, so long as they don't infringe on the right of others to do the same.

For the first part about humans creating their own history, I do wonder if here the difficulties of translation have intruded, but Amin authorized this rendering of his original french. Therefore I think it must be taken as it reads. Considering humans must change and integrate those changes into their understanding of themselves, their societies, and the world, it makes sense that humans end up creating their own history. I understand history here to mean simply "the transmitted record of past events," by which I should own explicitly that the events are always defined by and about humans. Declaring that the sort of analysis and writing that goes into geology, biology, and the like is something other than history then seems uniquely and accidentally rather sensible. While the motivation to begin with seems to have been to continue on the wrong assumption that the whole world outside humans has nothing we humans should respect, it accidentally overlaps with the truth that we humans should not pretend to know and recite the histories of other than human beings. Well, not unless those other than human beings should choose to share their histories with us, but that points to a whole other range of topics and practices beyond the scope of this thoughtpiece.

A major element in Amin's book and his work more broadly is to refute the stubborn redefinition of "non-westerners" as somehow entrapped in the past while "westerners" live not merely in the present but continue on into the future. It reminds me of several authors whose speculative fiction works I enjoy and whose philosophical and political views I respect, but who also hold a total commitment to the notion that if humans don't "go to the stars" humans will somehow be degraded and denied motivation for positive change. Setting aside the question of why positive change is assumed unable to motivate human commitment in itself, I simply don't agree with this belief about the supposed meaning and "necessity" of space travel.
I agree it would be interesting, even incredibly interesting, if a means of travelling and studying things in space in person could be worked out safely and without ridiculous allocations of human labour power and the materials available to us on Earth. But I can't see the point. We have more than enough challenges to keep us busy on Earth, including the majority of its surface, which is covered by water and an almost absolute mystery to us now. Suppose we could figure out how to respectfully and non-destructively visit the depths of the ocean to learn more about the wonders there, in a way complementary to putting an end to the practices wrecking the type of environment we need to live well on land. That sounds profoundly mind-blowing and positive in terms of its potential as a human accomplishment, with all the demands on our ingenuity and cooperative capacities it would make. Perhaps we would be more effective if so much effort and obsession wasn't trapped in creating pseudo-phalli to send into space and bomb "non-westerners" from a distance.

Unfortunately, "modernity," the "modern," and so on are not useful for what I like to think we, as in humans in general, would on balance like them to be useful for. If they could be useful as respectful means to bring together diverse human beings to live together in all our societies in our complex shared world, we should certainly have seen it work by now. Instead the balance of evidence clearly shows these notions are utterly co-opted into the meanly vicious evil of proselytization and missionizing, which are at root forms of physical and psychological coercion and cannot be redeemed. I am certainly alive to the challenge of the examples of societies engaged in horrifying practices, and the way so many of us have been trained to think hearing about these practices is license to swoop in with all force possible to stop them. Yet, here again, the evidence before us trounces claims to taking up "responsibility to protect," which is just a mealy-mouthed update of "white man's burden." There is, in fact, no such thing, and the only reason "westerners" in influential places declaim about the supposed need to bring "modernity" to racialized peoples is because what they can no longer openly call "white man's burden" is not about fixing injustice but about taking advantage of others for their own benefit. The first challenge is to stop the vicious practices of missionizing and proselytization, including their iterations under the rubric of "modernizing," which spawn social violence and coercion rather than ending it.

Destroy the Business Schools (2024-05-13)

Originally I was tempted to use a soft-pedalling title for this thoughtpiece, because I do have friends and acquaintances who are business majors of one type or another and who, despite the socially destructive training business schools engage in, have remained decent people. They are critical of the programs they completed, and a bit disappointed with their content, even if they are pleased they were able to parlay the degree or certificate into a higher paying job. That acknowledged, I doubt many of them would agree with me about getting rid of the business schools altogether. At one time Kevin Kelly, before Cool Tools fell over into a full-bore advertising outfit some years ago, had a post on a modest program of self-education instead of spending thousands of dollars on an "MBA."
To be fair to Kelly and the latest version of the site, the 2008 post is still there, The Personal MBA, and the associated links to the program documents are updated and live. While my recommendation that business schools be destroyed is a strong one, I did not come to this conclusion without actually trying to see what the appeal was, and trying to take up what seemed actually useful. However, I found "business books," both popular and academic, astoundingly thin on content. Probably the most interesting, and not for its business content, which was all provided in the first ten pages, was Chris Guillebeau's book The 100 Dollar Startup. It is mostly silly beyond the admittedly sensible advice about hedging bets on a new business by not pouring all your money and energy into what is likely to fail nine times out of ten, but the anecdotes are often fun.

The trouble with business courses and degrees is not merely that they are expensive and low content. As philosopher Peg Tittle observed and sought to correct with her important textbook Ethical Issues in Business, typically there is no or minimal instruction in ethics required in business degrees, let alone rigorous study of it. Based on practical observation of the ills of the now almost completely enclosed, advertising- and porn-hijacked internet, growing involuntary surveillance imposed via perverted computers, and the environmental havoc wrought by capitalism, business students are generally not learning or applying positive ethics. I suspect very few of them read any Adam Smith, not even his Wealth of Nations, let alone his intended companion volume, The Theory of Moral Sentiments. I have also observed the impact of MBA training on management at my workplace, where many ambitious colleagues completed MBA programs in order to move into middle management positions. At minimum, the tragic result I have seen is an extraordinary level of penny-wise, pound-foolish behaviour and decisions. It is all about the conveniently defined "externalities." As long as the bad outcomes land on someone else, preferably all costs on the public purse and any destruction on somebody else, then the decision taken is deemed good if it pours money into the pockets of "investors." Time and again in the past fifty years, this has meant a switch from long-term employees who can use their knowledge and experience to carry out their work efficiently and well to short-term contractors, paid at the lowest level possible and otherwise treated like garbage, who do exactly as they are told because they have no time to learn the job. The MBA-equipped managers are certain their "efficiencies" are real, as they asset-strip the organization and leave it to go bankrupt after cleverly dropping the liabilities on others.

Business majors are thoroughly indoctrinated in the belief, unless they are incredibly resistant, that the only useful and relevant education is training for wage work. By which the meaning is, as we have all heard and read again and again, that all education should be reduced to vocational training. At the rate things are going, post-secondary education is reeling drunkenly between being a means of coercing young people into impossible-to-pay-off levels of student debt, and programs of actual indenture, such as that still embedded in the scheme of exchanging a number of years of military service for an education. Over the past several years, many people who listened to the demand to "learn to code" and join a cult-like "technology business" have been laid off in their thousands.
Those who worked for and became thoroughly committed to larger corporations like microsoft or google are incapable of applying their now hyperspecialized skills elsewhere. But forcing such commitment is a great business strategy for managers, who can use it to extract huge amounts of free labour power in the form of overtime and "special projects" before kicking employees whose seniority makes them "too expensive" out the door. From what I have observed, there is a total refusal to engage in any true systems thinking or proactive, constructive behaviour in this form of "management." The point is always to create a bubble within which the managers can exert as much direct control as possible, which in turn enables them to maximize the exploitation for profit. The most insane part of this is that the result is not only socially and environmentally destructive, it is actually destructive in economic and business terms too. The root of the larger problem is the current reward system for being endlessly greedy, with all the cruelty and viciousness unrestricted avarice entails. An important start on curing the ill is shutting down all business schools and ejecting business degrees and certificates from all universities, colleges, and other publicly funded educational institutions. If the businesspeople want such things, they should build their own independently of public institutions, and if the people leading public institutions show themselves too irresponsible and greedy to refuse to allow those institutions to be parasitized, they should be driven out of office. At this stage, it is probably necessary for faculty and graduate students in other programs to take over the institutions to run them instead, or more likely, to found alternate institutions that explicitly embed in their charters that they will not allow the inclusion of business degrees or support business schools.

The Tinkerbell Complex (2024-05-06)

Philosopher Susanne K. Langer's work is seeing a growing resurgence of interest, as is often the way of things in the field. Different arguments and analyses resonate with the challenges of a given time period. It may seem a bit strange at first to think of Langer in this context, even though her particular interest was in the connections between aesthetics, emotions, and thought. I am by no means well-versed in her work, but began to learn the rudiments of it via Emily Erwin Culpepper's theology dissertation, Philosophia in a Feminist Key: The Revolt of the Symbols. Culpepper provides a wonderful encapsulation of Langer's analysis and argument in Philosophy in a New Key: A Study in the Symbolism of Reason, Rite, and Art, which "establishes symbolizing as a basic human need and constructive process of mind." Some time ago I encountered Langer's brief recounting of an unforgettable performance of "Peter Pan" in which the children in the audience were called upon to clap if they believed in fairies, thus saving Tinkerbell's life. The coercive element of this stuck in Langer's mind, and since I had a similar experience at a different "children's performance" of another story, the basic outline of her anecdote stayed in my own memory. Yes, in both cases these are "just performances for children," but as the fierce conflicts over what is appropriate and reasonable to show and otherwise teach children demonstrate, such performances are not "just" anything. We can at least agree that there is no guarantee a given lecture or performance will or will not impact children in a meaningful way.
Langer's point though, it seems to me, and Culpepper's too, goes to the ways in which symbolizing is a basic human need, and how it can be abused for coercive purposes. With that in mind, perhaps online arguments about breaking the "fourth wall" are not so absurd as they can seem. At the moment in "western" countries at least, those particularly entrained with the mass corporate entertainment media, something very strange is going on. Thousands of people seem to be suffering from a sort of "Tinkerbell complex" such that if everyone around them is not constantly feeding them a specific ritual response, they insist they will literally die or their mental health will implode. Alas, a nontrivial portion of the people making these claims are already suffering mental health challenges, and their demands for behaviours that reinforce and entrench their disordered thinking won't help them. Still, the people entangled in the "Tinkerbell complex" are outliers in the extremes of their behaviour. Another large number of people are not so much entangled in it as somehow convinced they should invest energy in supporting and enlarging it.

There is always some level of social pressure to conform to broad behavioural trends, often for the general good. However, as we all know too well from present events let alone historical examples, trouble follows a shift into full-on coercion to conform to demands for outward behaviour at odds with reality and/or genuinely held personal belief. In the present we have seen multiple obvious examples, as in the case of indoctrination into various forms of religious and political extremism. While this form of coercion can lead to outcomes ranging from the loss of the most extremist members of a friend circle to loss of livelihood and even interpersonal violence, these are not the primary areas of interest in this thoughtpiece. Re-encountering Langer and Culpepper with their analyses of the human need to symbolize and the potential outcomes of it suggested a new perspective on what has made the new "Tinkerbell complex" so virulent. If symbolizing is a human need, and among the ways humans carry it out are such activities as taking part in performances as actors or audience, and in such basic choices as what to wear, how to speak and so on, then it will be dangerous to frustrate or pervert that need.

In the context of the highly corporatized and enclosed "cultural industries" currently characterizing "the west," symbolizing is a fraught business. Fewer and fewer people are encouraged to symbolize for themselves in the most basic ways, through those much maligned practices labelled "arts, crafts, and humanities." None of these need be done professionally; amateur or family-based practices and even performances are perfectly reasonable. Yet most "cultural works" people may easily access are artificially enclosed versions they are not permitted to reuse or remix in creative ways and thereby do their own symbolizing. The multiple infamous examples of disney corporation suing people who made home-made versions of disney brand art for extremely small-scale, non-commercial purposes come immediately to mind. But there are others, such as the near impossibility of finding non-disney versions of fairy tales among children's book selections today, at least in english. Or the ongoing monopolization of publishing, which narrows what can be published to the point of homogenizing authors and their works.
For example, the graduates of specific science fiction writing workshops produce uncannily recognizable and similar short stories and novels. So even the exemplars people could use to make their own "arts, crafts, and humanities" have narrowed in fact, even if there seem to be a great many of them. Fan fiction and fan works more generally began as remarkable counters to these conditions, but have since been co-opted by marketing and torn almost completely from their positive roles. The end result is a sad mirror of the state of "anglo food ways" in northern north america: almost completely under the control of a few megacorporations, riddled with nutritionally empty substitutes for formerly traditional fare, and much of it thoroughly adulterated with sweet-tasting and poisonous forms of sugar and monosodium glutamate. Due to the impact on rents of rampant real estate speculation, many people live in food deserts or are unable to take the time to find and cook meals rather than lean on prepackaged and degraded food products. To eat what looks and tastes rather like food but get little or no real nutrition from it can drive a person frantic as they struggle with feeling constantly hungry, but inevitably desperately unsatisfied. Similarly, blocked from making healthy contributions to their own culture through symbolizing with others, people become frantic for satisfying stimuli. And just as the persistent hunger induced by nutritional deficits can feel momentarily satisfied by a hit of junk food, so can the need to symbolize by a hit of "social media," watching the latest netflicks hit, or purchasing the most recent entertainment product release. But the hits can't last long, and clapping can't save Tinkerbell any more than a foolish demand to declare allegiance to a belief in fictional fairies can.

Narratives and Appearances (2024-04-29)

Some elements of our strange present are not terribly new, although the way "newness" and "individuality™" are hogtied into media stories about how today is the pinnacle of all time in anything and everything we can think of can deflect us from noticing it. For instance, the social conflict over the role of narratives and appearances is hardly a cutting edge argument. Ancient greek philosophical and artistic tradition found an important energy source in striving to determine the right balance between them. Their arguments for and against "noble lies" and on the question of whether what we now refer to as "fiction" in english can be harmful to individuals and subsequently to society at large are among the best known in the "western mainstream." They are hardly the only ones who added to a deep tradition of subtle thought and struggle over the meaning and ethics of valorizing appearance and narrative over practical considerations. Buddhists, Taoists, and other practitioners of explicitly meditative traditions have approached them from a different direction, defined in part by encounters with what our minds are capable of when we strive to embody the paradox of consciously not thinking or meddle with experiments in sensory deprivation. Then there are yet other peoples who have considered these matters through observing the effects of mind-altering substances. Everyone seems to find that sometimes narratives and "keeping up appearances" can be helpful, yet also dangerous. Helpful in that they can help a person or even a society develop in constructive ways.
But dangerous, in that pleasant dreams can be addictive in light of their capacity to stimulate our limbic systems and produce a form of temporary, negative escape. For my own part, what has drawn my attention back to these things is not yet another encounter with the angry anti-fantasy novel brigade, but today's latest flavour of authoritarianism, determined as always to strictly limit which narratives may be shared and force everyone to conform to certain appearances. Their tell-tale tactics of name calling, shaming, humiliation, and finger wagging alongside a complaint of "that's a bad look" are all too familiar. I don't mean the interpersonal versions of these behaviours, obnoxious as they are, because those versions carry little water without two things. First, a widespread identification with the authoritarians, and as a correlate, second, a feeling of personal license to act as petty enforcers of what those authoritarians claim. Without this nexus, authoritarianism can't take hold or endanger anyone. No, I mean the societal versions created by hijacking how people communicate and take part in rituals together.

The interference with and attempt to control all communication is long past the point where it can be hidden behind pretences of creating or maintaining any sort of respect or safety. Yet I think the sheer insane degree of interference in social rituals at all levels and scales seems least recognized, or perhaps least overtly discussed. No, I am not referring to pandemic related restrictions. Yes, they tangled temporarily with social rituals, and worse yet were deliberately undermined and mishandled to prevent them from having a real and positive effect, which would have been the reasonable trade-off for a brief interruption. I would suggest the disrespectful, actively counter-productive, and insincere application of restrictions on gatherings and social interactions to curb and stop the spread of COVID-19 made unavoidably evident the way authoritarian actions had already devastated social rituals. The justification of those far earlier and more extensive actions was maintaining a particular narrative and appearance of what "democracy and freedom" are supposed to be.

Hence, it is now almost impossible to have an impromptu street party or other informal gathering. Any such event is promptly broken up by police because it was "unauthorized" or "unpermitted," a nuisance to traffic, or whatever other pretence. Copyright law is so absurdly and pointlessly extended to satisfy rentier capitalists and other thieves as to drive the majority of people in industrialized countries out of openly making, sharing, and remixing their own cultures. Is it any wonder that in countries where this has gone furthest, people are commonly unable to identify what their culture consists of in a positive manner, as opposed to declaring, "well, we aren't like those people over there we don't like"? Just try to get together in a restaurant, pandemic or not, and have a shallow or deep visit or discussion with friends. You will soon discover this is no trivial thing to manage, when most restaurants are afflicted with loud music and/or constantly flashing large screen television sets. Furthermore, now if you and your friends and acquaintances make the mistake of stumbling onto a forbidden topic as identified by the authorities, you will likely all be thrown out. The capacity to have open discussions without raising the stakes immediately up to risk of humiliation and threats of arrest is gone in many places.
To insist on the importance of a single, unquestionable narrative and associated appearances above all else lends itself to disrupting the social rituals necessary for positive social cohesion. Positive social cohesion involves safeguarding the most vulnerable, while admitting there is not one answer for all time to the question of how best to do that, even as specific principles do indeed stand the test of time and experience. Such cohesion stands up just fine to questioning, to the existence of more than one story, and to the firm removal and prevention of authoritarian structures and groups. If we are fooled into focussing all our attention on "keeping up appearances" and preventing ourselves and others from ever encountering anything but what we have heard before from a few people we have decided are authorities who will think for us, we have indeed fallen prey to delusion.

Great Title, Troubled Execution (2024-04-22)

Nancy Leong is a law professor at the university of denver, and her book Identity Capitalists: The Powerful Insiders Who Exploit Diversity to Maintain Inequality reflects this, as well it should. She brings together personal experience, legal analysis, some sociology, and even some very limited political analysis that she becomes very defensive about towards the end of the book. Her coinage of the terms "identity capitalist" and "identity capitalism" is apt, and they are well defined. There are many places where Leong's writing is laugh out loud funny, and her analyses, both legal and otherwise, often scintillating and a true joy to read. This is true even where I am inclined to disagree with her conclusions, because she generally sets out her arguments clearly and in a manner to encourage an honest and engaged reader regardless of the reader's personal views. Given my druthers, I would love to tell everyone to read this book with no caveats whatsoever.

Alas, I cannot do that, and it is not because the book is so firmly centred in the particular economics and social pathologies of the united states. What happens in the united states affects the rest of us whether we like it or not, and when it comes to the phenomena Leong is discussing, we definitely need to understand what is going on. No, the issue is not that. Nor, I think, is it Leong herself as a writer, scholar, or lawyer. In fact, I think this book has been cruelly served by its publication in a difficult and regressive time, even though absolutely yes, in this time most of all it is needed. Leong is dealing with the crux of a core element of this period of regression: the deliberate manipulation of law and social practice with a view to creating an appearance of supposed "diversity, equity, and inclusion" in a manner that allows oppression to continue anyway. The identity capitalists may perform an identity for profit, or perform an image of tolerance for "minority identities" for profit, or even both depending on the time or place, all while in fact reinforcing the very forms of oppression they are supposedly piously opposing. When Leong's text is focussed on these aspects, it is brilliant and searing, without question. But these sections are strangely dispersed at intervals among shockingly bleary passages that read unmistakably like the product of a bitter battle with editors, in which she refused to allow her analysis and arguments to be gutted for fear of "offending" the "diversity, equity, and inclusion" crowd, who are, after all, peak identity capitalists.
Her very few references to Robin DiAngelo are cringeworthy precisely because they come across as wedged in, and DiAngelo is emblematic of the very issue that Leong is critiquing. This is all very frustrating to read, and yes, I don't know that this was the dynamic at hand, since I wasn't privy to any emails or meetings or anything of that sort. Nevertheless, as a writer with some modest publishing credits and experience trying to get potentially controversial material into print, I don't think my assessment is out of line. My purpose in sharing it is to set up an important acknowledgement of Leong's manifest determination in this text. I just wish that she could have had better support from some people with strengths in history, political science, and Feminist analysis, because without them it must have been that much harder to fend off the less useful content-related edits, and it also weakened her analysis.

For instance, it took nearly a hundred pages for Leong to use the word "stereotype." Her discussion of identity capitalists who "perform their identities" and those who benefit by having an "identity performance" that is deemed acceptable to people in positions of power is maddening precisely because she accepts the ridiculous framing that everything we are is something we "identify with," even as she also acknowledges there are aspects of ourselves that are involuntary and cannot change. She is a lawyer, and precise, careful language is a must in law. I can't believe this didn't trouble her too. How much easier it would have been if she could have explained that what "identity capitalists" perform or judge is not an "identity" as such at all, but a set of stereotypes, be they sex role stereotypes, racial stereotypes, or even professional stereotypes. Yes, these stereotypes are currently being labelled "identities," and that is precisely part of the strategy of using and abusing the stereotypes to profit, survive, or both. I kept wanting to say, "Your stereotypes are not my identity." This of course goes both ways: neither are my stereotypes Leong's or anyone else's identity.

On the flip side, Leong does manage to acknowledge and unpack some of the ways class plays a role in identity capitalism, no easy thing to do in a country where the mainstream determination to pretend class does not exist is still high. Yet, if Leong had been able to access some works already in print in the course of composing her book, the result would have been even more powerful. How incredible it would have been if she could have read and discussed Andrea Dworkin's analysis in Right Wing Women, which discusses in nuanced and sensitive terms why some women are so committed to policies that are counter to the interests of women as a sex class, and often counter to their own as individuals in the longer run. For the parallel case in the context of african americans, there are W.E.B. Du Bois and Adolph Reed on the problems of "uplift." Her legal coverage is complete, as we should expect; Kimberlé Crenshaw is well cited for her development of the notion of intersectionality in law. I found myself wishing deeply that Leong had an opportunity to read and analyze Betty McLellan's Unspeakable: A Feminist Ethic of Speech, in which McLellan works on ideas for overcoming the issue of unequal ability to be heard.
In fact, let me quote some of Leong's discussion of free speech, because this is one of several places where her ability to write beautiful prose shines through the more confused text typical of the chapters not built around a legal framing:

...Even Barack Obama has said, "The strongest weapon against hateful speech is not repression; it is more speech – the voices of tolerance that rally against bigotry." What this position ignores is that the ability not only to speak but also to make oneself heard is not distributed equally in American society. Perhaps everyone has the same First Amendment rights, but not everyone has the same ability to use them.... (154)

First Amendment doctrine is the product of centuries; it can't be revised in a few sentences. But a good first step would be to get the identity capitalism out of it. Let's drop the pretext that the First Amendment is an unqualified good for underrepresented minorities. Let's drop the pretext that hateful speech is the price we pay for free speech when the "we" paying the price is often the disempowered groups victimized by hate speech, and the people benefiting from free speech often pay little or no price at all. If free speech doctrine is to protect hate speech, then courts and commentators should engage in a more honest reckoning with the costs of so-called free speech. (155)

Rereading these quotes reminds me that the compiler and editor of Robert's Rules of Order for parliamentary procedure was a citizen of the united states trying to come up with agreed-on rules to help keep meetings where decisions are made from devolving into brawls.

In the end though, the section of the book that troubled me most was a few pages in the conclusion, 183 to 185. The section begins with Leong's account of a terrible interaction with an uber driver who tried to browbeat her into ignoring her own common sense by accusing her of racism when his license plate number did not match the one the uber app on her cell phone told her to expect. There are real, documented cases of women suffering rape and sexual assault by uber drivers. This is exactly the sort of situation where any woman would be well advised to send the driver on his way. The driver's response seemed strange to me, because wouldn't a driver who knew he was in the right place and trying to meet the right fare instead say, "I realize the number doesn't match, you must not have seen that uber sent you an update, the original car has a flat," or something similar? Then all Leong would have needed to do was check her phone, and there the update would be. It doesn't sound like there was an update though. If we want to find a problem in the interaction rooted in other than the driver or Leong, we can point straight back at uber, which is an infamous abuser of its drivers, who are badly paid and work terrible hours.

Not too many paragraphs after this, Leong reveals her anticipated audience for this book: wealthy white people. I am glad this was only in the last few pages, because that suggests it may have been an editing effect, but a helpful editor would have either excised this or insisted on a rewrite to say more. "Doctor Leong, you can't stop there! Your book has so much in it for a larger audience, yes, a more diverse one. More people than other lawyers or white people more generally are going to be interested in and value reading this book." I really believe that. Identity Capitalists is a good book in spite of how it was shaped in the publishing process.
It is worth the time spent reading it and discussing it with interested friends, bearing in mind that it has suffered some mean editing and Leong does not seem to have had an opportunity as yet to read and incorporate more relevant research and argument beyond lawyers and recent united states election commentary. I hope Leong will be able and willing to write a follow-up incorporating materials like those I mentioned, and will have better fortune with her editing team, because there is great potential in doing it. That is the flip side of a first book being rough around the edges: experience can be applied to overcome the earlier challenges. The topic and Leong's explanation and analysis of it certainly deserve it.

There is No Marketplace of Ideas (2024-04-15)

One of the most corrosive and frankly stupid analogies ever made is the one between a marketplace and a free flowing discussion. At first it must have seemed harmless to most people, because they were diverted from its actual content by references to the ancient athenian agora. Never mind that the agora was a large, clear space used for public gatherings, among which were the gatherings for trading and selling: markets. The idea was to get us thinking about romanticized visions of political debates and of Socrates sounding off at his final trial about his principles and why athens should have pensioned him for life instead of giving him the death penalty. For now let's set aside the puzzling phenomenon of crediting athens in that period as having the most freedom and democracy anywhere when it was a slave state and precarious empire and the majority of the population had no right to participate in athenian politics at all. The idealized vision is undeniably inspiring, and a real goal to strive for: having and maintaining public debates where people work together to come to a consensus on what to do about the issues affecting their lives. But all that has precisely nothing to do with the present notion of a "marketplace of ideas" where supposedly every idea gets a fair hearing.

It is not likely I would have cottoned on to this when I did, except the notion was evoked suddenly as part of a campaign of favour-currying to a group of capitalist fundamentalist potential clients by a colleague who is now employed elsewhere. (Not with the capitalist fundamentalists.) An important part of what makes the "idea marketplace" nonsense is that ideas are not passive products set out in sacks and on tables, nor are they even statements presented without comment and in the easiest way possible online or offline. On the rare occasions I get to haggle over a price, there is nothing going on like an exchange and exploration of ideas. There is no simple or final answer to sorting out which ideas are better or worse. Mere "majority rule" or "authoritarian enforcement" – both regularly imposed on actual markets – doesn't work with ideas or anything else with nuance involved. The pressure to not think and just conform, partly discussed in the previous thoughtpiece, is certainly not an example of giving ideas a fair hearing or of making a decision after due debate and consideration. Indeed, it is the opposite. But we live in a time when people in "business" who have genuine ethics are driven out, leaving the rest to act on their worst impulses of greed and dishonesty.
Hence the ongoing collapse of the lamentably daft "nft craze" and of cryptocurrencies, which were both bid up past any realistic assessment and used as means to vacuum real money out of the victims' pockets in exchange for a poorly encoded file and some electronic pretend money.

When people first began to discuss ideas in larger groups, they held meetings. In different times and places they added more or less formal structure to the meetings to help ensure everyone had a chance to speak, ideas were actually questioned and debated, and the outcomes were understood and agreed on. The actual diversity in how people went about this historically is hidden by typical treatments of "democracy" or "debate," because the authors speak almost exclusively about the ancient greeks with maybe a side eye at the ancient hebrews and phoenicians, then they skip to ancient rome, its republic, and the subsequent destruction of other governance and public discussion practices wherever the romans could achieve it. One thing once more commonly accepted and appreciated was that good decision making and preparation to put the decisions to work take time. I have noticed that most complaints about other people's governing structures, where they include a significant public debate and consensus-finding element, are that they can't be "democratic" because they are too slow and aren't responsive enough. The complainers like to point to situations such as war as times when such structures don't work, conveniently ignoring the evidence that a different system of debate and decision making is typically available for handling fast-moving disasters.

Today the two major systems of airing and testing ideas besides what are supposed to be democratic governance systems (labelling them is one thing; it is their actions and effects that need checking) are subject- or profession-specific conventions and publishing. Originally the conventions most scholars are so eager to attend were indeed primarily about sharing work and taking part in seminars where their new research and ideas could be considered and developed. Many of them have either already become, or teeter on the edge of becoming, mostly an attendee-subsidized means for universities and corporations to interview possible hires. The state of both academic and profession-specific publishing is horrifying. Most of it has been enclosed by such conglomerates as elsevier and sage, while "open publishing" is being eviscerated from the inside by crooked "pay to publish" newcomers and dubious "open journal" fronts put up by the older and larger publishers. Peer review is a mess, because it is devolving from a practice of honest assessment to one of checking that the proposed publication does not challenge already accepted ideas. These are the results of striving to reduce those systems to "marketplaces."

I am not sure why so many people have given up on the effective means of sharing ideas that these types of subject- and profession-specific meetings and publications once were. The original means of "publishing" was via letters, literally handwritten letters shared among a list of interested people. The people who wrote excellent letters and participated constructively in larger meetings with their peers drew others into the network of letter recipients, who were also expected to write letters in the circle. Weak or deranged material and its authors would lose their places on the mailing lists and not be paid much mind at meetings.
In any case, participants were expected to know their stuff and be ready to ask good questions and read the letters and eventually articles and papers with care. This applied to both what today are labelled the humanities and the sciences. Paradoxically, the more possible it has become for anyone to take part in this type of grass-roots generated and controlled publishing via more recent technologies, the more people seem determined to ignore these possibilities and suffer in the exploitative enclosures instead. The most recent justifications for this have been, "how else do we stop the conspiracy theorists, extremists, and misinformation?" The same ways we already know work. First, tell the truth about what is happening and focus on creating and maintaining solid public education that encourages and honours each person's ability to think and understand. Second, don't feed the trolls, including don't repeat the names and reproduce the images of mass killers, which encourages others to follow their example. Third, face up to the factors that feed extremism, which are ruthless sexism and class warfare, and take steps to end both. Fourth, shut down and destroy the propaganda mills, and root out and delete the algorithms used to pump anything that feeds anger, anxiety, and disgust over all other feelings. We know these ways work, and surprisingly fast. We also know that to reach and keep the positive results of these methods, we must commit together to carrying them out over the long term, without trying to delegate the work to a few people while putting in none ourselves. Over-delegation is a means for the dishonest and the vicious to subvert real democracy and open debate.

Refusing Thought (2024-04-08)

Certainly there are times, places, and contexts in which nobody should have to think. Websites are a minor example, and these days not as widely experienced as they once were because so many newcomers to the internet have been misinformed that web applications are websites. There is a quick way to differentiate the two: if the web address takes you to a site that absolutely does not work without javascript turned on, even if you use the option to turn off the stylesheet in the web browser, chances are it's a web application. Still, I am less concerned here with the times, minor or major, when a person should not be expected to think and has every right to insist no thinking be required. Instead, I am more concerned with a persistent and growing phenomenon in which many people I know, as well as strangers speaking in multiple venues, angrily refuse to think and attack anyone or anything that gives them pause enough to make them think. Many of the attacks are preceded by an angry invocation of an authority they insist has already told them what "the answer" is, and therefore no one has any right to think or question the authority. A moment before, or after they have calmed down, the same people will declaim piously on how the freedom to speak and to question is so important, and aren't we lucky we have that in canada? The only way to hold those two contradictory things in mind without collapsing in confused upset is indeed not to think about them. It's about the declaration and performance of the right words, and especially the way right performance licenses verbal abuse against others. Maybe it is a way to feel powerful at least for a moment under the difficult conditions of being in a country that is an eager colony of a collapsing empire; I don't know.
After all, I cannot know what these other people actually think or feel, apart from what their words and actions show me. But maybe these are somewhat extreme examples and perhaps tied to controversial issues. If that is the case, it still does not explain the now typical response to an expression of curiosity, which at best nowadays is a snarky "why don't you just google it like everyone else does?" Or on observing others reading something that is not a pop culture sort of thing, "Why are you reading that?" No answer except silence minimizes the offense taken by the person who asks. Let alone how often being seen reading a book instead of staring into a phone in public can raise hackles. Only so many people read in public from a paper book or magazine these days, so it is undeniably something noticeable. This makes me wonder if it goes back to the performativity aspect of many declarations about "our democracy" and so forth. Reading in public, expressing curiosity to another person instead of telling an advertising company about it to see what they will show you, may then be taken not as genuine actions but as performances.

This is hardly a new phenomenon of course; demands for social conformity are a critical element in any society, and to some degree we all perform according to the rules to get along with others. Certainly there are fights not worth having. Yet the quality I am striving to get at here, the intensive effort to control the thoughts of others, this is the matter that troubles me. Demanding detailed, constant conformity and looking askance at expressions of curiosity, questions, or evidence of reading other than "current market approved works" suggests a far more thorough demand for outward conformity and otherwise complete silence than has been at large in the past twenty years or so. We are being pressured to allow outside parties to control our thought, and not to think independently at all. This is all so strange and sad, not merely due to the political implications, but most of all due to the social implications. It raises the question of how a mixed group of people can casually spend time together and chat under our current conditions of constant, detailed behaviour policing, lest someone question a current party line or fail to be interested in the "current market approved work" of the moment.

There is an unfortunate aspect of the present "diversity, equity, and inclusion" drive that has slipped far away from the hopeful project of steadily removing structural biases, to an obsessive focus on controlling what people say and demanding they recite secular catechisms. It doesn't take much observation, let alone thought, to notice that the new "diverse, equal, and inclusive" imagery incorporated in many websites and publications these days has a striking propensity for featuring visible minority bearded men and bearded white men with long hair wound up in a bun. Or how the "diversity, equity, and inclusion" specialists spend a remarkable amount of effort finding or making up "awareness days" and creating micro-aggressions that everyone besides their supposedly enlightened selves engages in. In some workplaces, every day is an exercise in finding out what staff are presumed guilty of by the "diversity, equity, and inclusion" team. We are definitely not supposed to notice or think about such things these days. It's mighty hard not to, however.

"You Keep Using That Word. I Do Not Think it Means What You Think it Means." (2024-04-01)
There are no doubt many other quotes from the 1987 cult hit movie The Princess Bride than the one featured as the title of this thoughtpiece, but there are few so well known and so useful. The word, or strictly speaking words, that brought this quote to mind again were the unfortunate "computer literacy." It has all the qualities of the most fatuous marketing-speak: it tries to co-opt a real term to achieve a patina of authority, then uses the pseudo-authority to try to persuade people they should be anxious about something. From there of course the next step is to offer an expensive solution to the supposed problem conjured up to create the feeling of anxiety. Nevertheless, I am not unsympathetic to computer programmers who complained vociferously about computers and software aimed at a "consumer" market rather than exclusively at them. Not because I agree with such an attitude, but because many computer programmers despise mendacious market-speak, and in this they share a perspective with the majority of the world. This actually helps make some sense of the widely shared ire among many in the computer science crowd against apple, which by now is grudgingly recognized for its consistent advertising prowess. Yet this did not help explain the depth of feeling against not merely the corporation, and not just its wares, but also much of its software philosophy long before its mutations at the turn of the twenty-first century. There are certainly solid critiques of apple computer as a corporation, of its stance on matters of free/libre software, and of its now completely despicable record on repairability and upgradability. Indeed, I have touched on some critiques myself in other thoughtpieces, especially An Unsurprising Worm in the Apple. Still, I think there are some things about the early apple computer that deeply annoy older computer programmers because the path it took decentred their values and interests – it is not coincidental that the most annoyed of these programmers seem to be male. It is fascinatingly telltale that an early insult nickname for a macintosh computer (repurposed from its earlier reference to poorly programmed microprocessors) was "toaster."

I do wonder how many people, including many of those annoyed computer programmers, realize GNU/linux was barely available to the broader public until the mid-1990s. It was far from trivial to purchase any of the most readily available computers at the time, not least because they were all expensive. The most readily available were not so available for people who did not live in cities, and even less so for people who were not living in wealthier countries. Supposing you were willing and able to acquire a computer of some type by mail order, or by going into a city to pick one up, after that you were pretty much on your own. It bears noting as well that in the mid-1990s, dial-up internet access was the rule, and then almost exclusively through universities or corporate offices. In other words, you had to be a student in the one or an employee of either to have easy access to the internet without necessarily having any computer, or without having your own dial-up modem if you had a computer. Some developers and engineers could see potential for computer sales beyond corporations, universities, and the steady but small hobbyist niche. Among the problems in pursuing such possibilities, besides cost, was figuring out how to create a computer people would be willing to have in their homes.
They would certainly have to be able to do something with them. These were some of the challenges the developers of apple's human interface guidelines dealt with, and love the guidelines or hate them, they solved many significant problems in still unsurpassed ways. First, it was necessary to design an interface for people who are not programmers. Programmers need and use manuals, and for historical reasons it was not unusual to need to read circuit diagrams and similar to complete common tasks even into the mid-1990s. However, for people who were not already motivated to literally build and program computers, manuals are a critical barrier. Nobody wants to study in order to do what they want on a device presented to them as a useful tool. Therefore the interface has to facilitate exploratory learning, trying things out to see if they would work and how they worked. Professional programmers seem prone to interpreting this type of design as an insult to their intelligence, perhaps because so much of how they demonstrate that intelligence is by making computers work via their skilled use of circuit diagrams and manuals. Nevertheless, if people need to take a course or study a massive book to do much with a computer beyond plug it in and turn it on, the vast majority of people simply won't. A good analogy here is learning to play music. It is possible to figure out how to play any musical instrument by ear, and play happily with friends and for family gatherings without ever "going pro" or taking up more elaborated musical forms calling for more study. The majority of people take this route. A minority opt for studying music so that they can play the more elaborate works typical of classical music. The second necessity after the interface was to provide software and peripherals supporting tasks of interest to a broad range of people outside of the office. The decision to focus on word processing and simple artwork turned out to be a cleverer idea than the more dour, business-machine oriented crowd like to admit. There was already a long tradition of home and small-scale publishing of zines, posters, booklets, and so on. Acquiring or otherwise accessing early copy machines was difficult and expensive in its own right. Typewriters had come a long way and were fairly inexpensive, but correcting typos was probably the worst issue. The ability to buy a home computer and small printer to compose and print documents was a genuine revolution, although at the moment it is a somewhat stymied one. Regardless, at the time this meant people could see a practical day-to-day use for such a computer. These were the sorts of tasks a person generally needs to start from in getting familiar with an operating system on a computer, as well as the behaviour and quirks of the computer as a machine. The more recent attempts to claim that in retrospect the "techbro" dislike for apple was really because Steve Jobs was a nasty human being should be embarrassing the people who claim it. They are fine with all sorts of jackasses, from Eric S. Raymond to Elon Musk. They are all about "separating the artist from the work" as long as the work is something they like, but in this all humans have the same difficulties with consistency. In the end though, I think a deeper reason for the common programmer hostility against apple computers originally came from the company's contradictory approach to computers and software. 
On one hand, apple led a trend towards remaking computers into ordinary things, machines more akin in position to household appliances than mysterious ones "computer wizards" somehow made work. That certainly raised the ire of "techbros" who have sadly accepted the sex role stereotype of brainy men as "effeminate" and therefore always having to shore up or prove their masculinity. On the other, apple tried for a different sort of mystification by playing up the notion of the computer as a magic box, one whose eventual owners were more or less subtly discouraged from seeking to make and run their own software. There are undeniable "bake the cake and eat it" and condescending qualities to this strategy that would reasonably piss off anyone on those grounds. It's really too bad these issues were not better expressed and therefore discussed at the time, because it would probably have put rockets under support for free/libre software and helped forearm people generally against attempts to prevent them from repairing or upgrading their computers. (Top) Unsurprisingly, Taylorism is Nonsense (2024-03-25) To be honest, I am not convinced that there is such a thing as business philosophy, although I agree absolutely that there is plenty of philosophy relevant to business as much as to the rest of life. That acknowledged, that rare thing, a book dealing with "business philosophy" in a focussed and serious way is very much worth the read. I stumbled on Matthew Stewart's book while trying to find some more current secondary sources on the execrable Frederick Taylor, specifically some sources that talked about how he actually worked up his numbers. Somehow it did not surprise me at all to find critiques demonstrating that he faked his numbers, depending on the growing mystification of numbers and newfangled statistics in united states culture specifically. The blight of treating numbers as magic and beyond the reach of commonsense questions once arranged into a table or graph is a thoughtpiece for another day. For now, it is enough to say I traced a reference to Stewart because his treatment of Taylor's work and career meant he had gathered some very juicy references. His critiques of Taylor and broader business thought were bonuses. I heartily recommend the book, although it does struggle a bit in the second half. For an appetizer-size introduction to it, Stewart's own original article in the atlantic, The Management Myth from june 2006, is still online. Over my own years of trying to make some sense of the strange behaviour and demands of "management," if only in hopes of preserving my sanity, I had also run into many exaltations of Taylor as a sort of patron saint of mass production methods. It didn't hurt his posthumous reputation on this point that he completed many of his most famous studies and publications at the same time that Henry Ford was figuring out how to make a new sort of mass production line and take full advantage of interchangeable parts. It doesn't take long to determine that Taylor was especially interested in finding ways to force maximum labour out of labourers for minimum wages, or to notice his utter contempt for any person who works with their hands. The dangerously seductive idea many of the burgeoning capitalist extremists of his time were already chasing was that of making the labourers completely interchangeable, and preferably equipped with little to no education of any kind.
Every means should be taken to prevent the labourer from applying any judgement whatsoever. In real life, as we all should know too well but too many have forgotten, this goal can't be reached. Or rather, this goal can't be reached at the same time as maintaining a business into the longer term. We forget at our peril how in fundamentalist capitalism, there is no longterm, just taking every possible effort to maximize profit by every possible means. The actual meaning of "efficiency" can never be taken for granted. We always have to ask just how it is being measured and the final result the "efficiency drive" is aimed at. Overall, the current "business philosophy" is predicated on a refusal to trust labourers in any way, and to exploit them in every way possible. This is not a new situation by any means, and I hope by now the obnoxious whining from the managerial cadres about the presumed laziness and disloyalty of labourers of all kinds is recognized for what it is. Employers and managers can hardly expect to get more than a strictly contracted amount of work and no more loyalty than the contract term implies when they take as given that anyone who can be persuaded to work for them is too stupid to deserve respect or decent treatment. I am not too sure what to make of the people who complain incessantly about "young people who don't know how to work." There are not too many "young people" in my experience who are not working two or three jobs plus attending some type of post-secondary schooling because every job with fulltime hours demands it just to get an interview. The jobs older people are retiring out of are typically vanishing with their last pay cheque. Supposedly everyone is supposed to work in "service jobs," which is interesting, because the original capitalists made their first fortunes taking advantage of the stampede out of "service" which was so often poorly paying for extreme working hours under abusive conditions. The capitalist fundamentalists are always going on about how we can't turn back the clock, but they are certainly giving it a try. I have always found the ways in which philosophers are stereotyped and abused in the media thoroughly strange, particularly accusations of refusing to take part in society or not caring for what is happening in the real world, not being practical. This is especially strange considering that if anybody can be justly accused of such things, it is the most committed "businesspeople" with their pathological greed and insistence that society is not real. All the better for them to believe there is no society and therefore no social relations or obligations. Then such businesspeople can do as they please, serene in their rationalizations for whatever actions they take to maximize their profits. Or at least, they can until the real world intervenes in its inexorable fashion, with the collapse of the very scams they expected to make them permanently rich and beyond anyone else's rules. (Top) A Smiley History (2024-03-18) I suppose it was inevitable in an environment of constant coercion against non-pharmaceutical interventions like masking in the middle of a pandemic, a major theme of that coercion would be demands to "smile" and "show our smiles." 
It is no consolation that this demand for social submission and performance typically imposed on all women in many "western" societies is in a period of special expansion to dig at anyone seen as not getting with the current propaganda lines about how supposedly an immune system deranging illness is "mild." Among the knock-on effects of the manic demands for everyone to stop wearing any sort of effective mask as part of effective management of exposure to aerosol-borne illness is a renewed interest in the nature and meaning of the smile. Hence historian Colin Jones' 2014 study of the smile in eighteenth century paris, The Smile Revolution: In Eighteenth-Century Paris, has won renewed interest in our awkward pandemic present. He shared a précis of his arguments and a teaser of the book in an article posted at aeon.co in october 2020, The Smile - A History. I will not delve much into the history aspects per se, since I am the first to admit to having a sketchy sense of french history for france itself, although like many people who grew up in canada, the outline I have of france's colonial interactions with what is currently called canada is fairly complete. Instead, I am inclined to pick up on some of the potential and actual social meanings of "the smile" Jones' article discusses. Smiles don't have a fixed meaning, which is why people concerned to force others to respect them because they are not in fact very respectable tend to be quite obsessed with how others smile. Such people are acutely sensitive to the ways in which smiles can both hide and reveal far more than they would wish. This is a concern that probably the majority of us share at least to some degree, and there are many cases where it is quite reasonable to try to assure ourselves that our own smile or that of other people corresponds to the meaning we hope it does. A positive example is any time we hope to please others. The trouble there is how we are mis-trained to look for a quick and, to our eyes, genuine smile as a sort of immediate confirmation. This is not really how expression of genuine enjoyment works, especially when the other people include those whom we are not very close to, so they may be a bit shy to emote too much. We can't know what other people than ourselves are actually thinking or feeling for absolutely certain, indeed sometimes we can't even know that for ourselves. We can make reasonable suppositions though, and in the end, we have to make a leap of faith and trust what others tell us. A tough thing to do at times, and necessary. Quite the tangle of power relations smiles can be in. Or, more often than not, they are simply smiles. To return to Jones and his article, another striking element is his argument for the smile as a reflection of class. He discusses early conduct books and notions of who had their mouths open in public, whether to laugh or show their teeth, for example. Yet the particularly pointed aspect of the argument to me was his discussion of the origins of dentistry, or perhaps more accurately, what is now called "orthodontics" with its focus on straightening and whitening teeth. Inability to access the fancy new orthodontic possibilities revealed poverty and precarious work conditions back in eighteenth century france according to Jones, and I would say it definitely does now.
That gives away that I myself have been through difficult times forcing me to forgo even basic dental care because I could not afford it, and accept a mercury amalgam filling because it was the cheapest option and the affected tooth was better kept than pulled. Not so long ago, it was thoroughly unusual to see a person with what in my junior high days we called "chiclets," the early manifestation of what is now familiar to most of us as the united states-based "movie star smile." By unusual, I also mean unusual in the united states. For those not quite sure what I mean, it is worth peeking at another thoughtpiece for its illustration, Fourth Wallisms. The picture is a screen grab of DeForest Kelley in the role of Dr. McCoy in the original Star Trek, at a lighter-hearted moment. He did not have "movie star teeth," nor indeed did the majority of actors at the time. Based on my thoroughly unscientific survey, it seems to me that "movie star teeth" really took hold during the 1980s, not coincidentally also the decade when the tide began to turn against smoking. Of course, as Jones notes, there is more to a smile than moving our mouths and showing or not showing our teeth. A smile expressing positive feelings also involves our eyes, and if we are speaking, a smile is audible in the voice. Truth be told, I am not at all convinced by the sudden efflorescence of complaints about how difficult it is for people to recognize any smile at all let alone a pleased one if the other person's mouth is not visible. Severe difficulties in this area afflict a minority of the population, and some of them are able to improve their skill with assistance such as appropriate training. The majority of the rest of us are able and willing to help clarify matters when we encounter another person who has minor or severe difficulties figuring out our expression or response to our interaction with them. This is not new, and it was true before the COVID-19 pandemic. Therefore the complaints are whining at best, attempts at forcing others to behave differently at worst, and altogether obnoxious nonsense. (Top) Echolocation in an Echo Chamber (2024-03-11) Eli Pariser's book The Filter Bubble popularized the term, referring to the virtual echo chamber created by so-called social media and advertising companies masquerading as search engines. Pariser strove to warn the general public about the dangers of such bubbles, emphasizing political polarization. Yet far more dangerous is the way that the filter bubble retrains those unaware of it or actively committed to it to take active steps to reproduce the bubble in freespace. We humans are social animals, and we depend upon our network of friends, family, and acquaintances to locate ourselves in our ongoing present. Or better described, echolocate ourselves in our ongoing present. We send them a signal with a question, and observe their response, the echo back, usually putting the most trust in those closest to us, and a few trusted acquaintances. In a real sense, social media platforms strive to mimic acquaintances of the trusted kind. The key trouble being, echolocating inside a filter bubble is impossible. Once trapped inside such a bubble, or committed to staying in one, the inability to echolocate becomes naturalized, and in an effort to quell the cognitive dissonance caused by the bits of evidence that leak in from outside the bubble, the person in that position strives to increase the size of the bubble.
Having been persuaded that their sense of discomfort is the measure of what is right in the world and for others, it doesn't take much more for such a person to be persuaded that they have the right, no the duty, to force others to conform to the criteria and messages inside their bubbles. Fanaticism and witch hunting follow – which is of course not "witch hunting" at all but a process of figuring out who on the outside they can pick off with no penalties to themselves, and therefore an effective license to harass, hurt, and at the extremes, kill. We already know how this works, and have all too many easily verified examples of how it ends. There are genuine parallels between sensory deprivation and the sort of thought and reasoning deprivation that the bubble dwellers engage in. Extreme sensory deprivation causes insanity, as indeed does persistent sleep deprivation. According to the most recent explanations of how our senses work, our senses don't work as passive receptors, but as active seekers of information. Our senses, just like our minds and our overall ways of being in the world, constantly compare outside information to our previous observations, sensations, and expectations. When we are healthy, all of this works well, and we are able to adjust to changes of all kinds in our world, from our interactions beyond our own bodies to what happens within our minds and bodies. There is strong evidence that without adequate interaction with outside stimuli, our senses go haywire, and we begin to hallucinate as the lack of corrective feedback allows mistakes and distortions to expand and become entrenched. Mental thought loops have a similar quality, blocking outside information and leaving us vulnerable to glomming onto pseudo-patterns as our moorings slip away. Some of the most brilliant and astonishingly cross-culturally interpretable artworks illustrate such terrible mental whirlwinds and their grim outcomes if their victim cannot find and accept help in time. In the so-called "west," among the most famous examples of this are the stories and fates of the titular characters in Oedipus Tyrannos and Hamlet, Prince of Denmark. I have read people who leap from the realization that many people have lost most if not all of what might have been their solid mental and social moorings to claims that the solution is to impose some form of patriarchal religion. The problem, according to these people, is the lack of "authorities," lack of proper belief in an absolute master who will impose a morality and way of living. Alas, they too are in a bubble, the one from which they strive to avoid noticing or accepting that in fact they are advocating for the very sort of disordered thinking causing the problem in the first place. It is undeniably more difficult to be a responsible human being, to think and grapple with difficult questions and different viewpoints, than to delegate it all to an authority figure. The trouble is, for this to work, the person's faith in the authority figure must be absolute, or the person must be utterly unscrupulous and willing to do anything for the authority figure's approval. Either way, the authority figure's authority is brutally precarious. Somehow, some way, the person with pretensions to absolute authority must avoid all suspicion of being other than superhuman.
At this point the catholic church has been trying to solve this challenge for the popes for a very long time, and has made an ass of itself repeatedly, to the point that its practical authority now rests on inertia plus an extraordinary amount of stolen lucre in its coffers and a well-developed indoctrination system for those it can catch at the right time. To be sure, the catholic church is not the only organization of its kind, and the ills of it as an organization do not detract from the intelligence, ethics, and practical good deeds of many people who consider themselves catholics. Any such organization needs to find a way to balance hanging onto people who are not locked in the bubble associated with it against maintaining enough coercive force to extract money and goods. Bouts of violence and provocative propaganda are reserved for when the organization is striving to increase its coercive strength, but neither tactic is actually controllable. They are not just double-edged blades that can turn in the hand, but double-edged blades with no hilts. Filter bubbles are more like softening exercises leading us to ignore our instinctive caution when we are faced with the social and mental equivalent of such hilt-less, double-edged blades. It is no coincidence that the people most desperately vulnerable to filter bubbles and a catastrophically weakened belief in their capacity to think and reason are young people, typically below the age of twenty-five or so. In other words, they are still in the lengthy period of childhood and then adolescence, when in the earliest days it is more than enough in a day to manage the basics of playing with other children. There is too much to learn for a small child to take up adult reasoning or responsibilities. Those have to wait until they have learned a great deal more about the world and their place in it. Adolescents still have a great deal to learn about social life, their own emotions, and the trials and tribulations of politics, and so they are not able to take up full-on adult reasoning and responsibilities either. But they are capable of beginning to take them up by expanding their range and testing the new things they are learning, which is difficult, at times terrifying, and at other times mortifying. Unlike when they were small children, adolescents are beginning to understand consequences and recognize those that last more than a day or so. The worst and best capability we come into at adolescence is the capacity to imagine what could happen. It is completely understandable that at such a tricky time, any young person would be tempted to flee into a filter bubble and seize on an authority to replace the parents and parental figures whom they now understand are people, not superhumans. It is not kind or ethical to try to make those bubbles permanent prisons in which they remain trapped and afraid. (Top) Reflections on Firmspace (2024-03-04) Some notes towards something that might feed an essay later, which started from an offline discussion from some time ago. Thinking first of all, of freespace, also known as firmspace. Both of these terms are proposed as alternative synonyms for the pejoratively connoted "meatspace." Firmspace is properly analogous to "firmware" in that it isn't completely a hardware phenomenon, but is not dependent on being loaded and run on a computer, and it can be changed. For instance, we can (with care and effort) change a habit, or learn a new skill – or break a leg, on a really bad day.
The alternative form, "freespace," seeks to echo the kinds of freedom Richard Stallman identified as critical to free software. Stallman is (in)famous for his uncompromising definition of what he describes as the four essential software freedoms: the freedom to run the program as you wish, for any purpose; the freedom to study how the program works and change it so it does your computing as you wish; the freedom to redistribute copies so you can help others; and the freedom to distribute copies of your modified versions to others. Access to the source code is a precondition for the second and fourth of these.
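To make those freedoms concrete, here is a minimal sketch of my own, not anything from Stallman or the GNU project: a trivial C program distributed together with its source. Anyone who receives the file may run it, read it, change the greeting, and pass the changed copy along, provided they also have a compiler such as gcc to rebuild it after editing.

/* hello.c - a trivial program whose source travels with it.
 * freedom 0: run it, for any purpose
 * freedom 1: study it and change it (edit the greeting below)
 * freedom 2: redistribute copies of this file
 * freedom 3: redistribute your changed version
 * rebuilding after a change needs a compiler, for example:
 *   gcc -o hello hello.c && ./hello
 */
#include <stdio.h>

int main(void)
{
    /* change this string and recompile to exercise freedom 1 */
    printf("hello, firmspace\n");
    return 0;
}

Nothing in the sketch is special; the point is only that all four freedoms attach to the source file itself.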
Admittedly the freedom to study and change a program demands more than access to the code; it is also necessary to have a compiler for the code, and indeed one of the earliest and most important completed parts of the GNU project was and is the GNU compiler collection. It is also true that Stallman was focussing on freedom as a right to speak, act, or think without restrictions. The closest firmspace parallel to software freedom is probably "free speech" as argued about in the united states, and subsequently the growing controversy over the ongoing attempt to enclose culture with the false concept of "intellectual property." Explicitly drawing out the connection to speech, ideas, and culture is important, not least because it helps improve understanding of why the software case has become such a lightning rod for controversy. To my knowledge, no one who thinks seriously about free/libre software and advocates for it thinks it is an example of a license for people to use software to do harm. The expectation is that by ensuring that source code can be audited for bugs and backdoors, it becomes that much harder for them to remain uncorrected and therefore a source of risk. It is notable also that free/libre software advocates, including Stallman himself, question the insertion of computers and software into places and uses that put the personal freedom, security, and privacy of people at risk. By acknowledging the dangers of digital surveillance and of attempts to impose software where it opens up options for severe harm, as in the case of the so-called "internet of things," and by opposing those applications of software, such advocates also insist there are broader ethical considerations that can't be ignored. Clearly the software freedoms can no more stand alone than "freedom of speech." The parallels between the subset of speech that is software and "speech" more broadly remind us of those who deliberately abuse the notion of "free speech" to license abuse of selected groups of people. The great difficulty speech, culture, and so on present on a daily basis is that they are not simple, abstracted things built up from minimal, simple rules with only a few narrow allowed possibilities. They are all social relations, which drags in those difficult ethical considerations, all willy nilly. Hence cyberspace, where "software" is supposed to rule, is also a medium in which more than one scholar has argued that software is not just computer code, but a type of law code. Lawrence Lessig is the most famous literal lawyer working in related areas, but I am thinking more of someone like Alexander R. Galloway, who wrote the 2004 book Protocol: How Control Exists After Decentralization. A central aspect of his argument is that software programs and the protocols guiding their development and implementation impose a structure that is difficult to change or work around. Mere decentralization does not prevent such structures from affecting people using the software, whether those structures are implicit or explicit. Richard Stallman has sought, in part, to break down and replace the structures entrenched in the production and distribution of software via its origins in the military. We would all be unwise to lose sight of the fact that computers and their attendant software are rooted in a desire to control and replace people, people deemed a source of error and social disorder.
This is true from Charles Babbage trying to automate production of perfect mathematical tables to Herman Hollerith seeking to create a machine to swiftly and accurately tabulate census data so that central governments could try to manage populations. One of the most recent developments in "artificial intelligence" is programs taking massive databases of digitized visual artworks, and using those databases as a starting point for forward modelling of new "art." I think it is fair to use scare quotes here, because what people typically mean by art is some item, be it a visual, audio, or sculpture that expresses the thoughts and feelings of the person making it. These latest programs are impressive in how they both imitate art so well and so badly. They are fascinating in the weird juxtapositions they produce, and in the subtle but stubborn oddness of even the most impressive results. The fact these oddities persist reiterates how we humans continue to lack understanding of how exactly we feel or think, and how much we learn by trying to create software that mimics at least our thinking. I am fascinated by people who get annoyed with this sort of commentary. It seems like they want to believe that somehow the computer runs a program, thereby magically becoming able to think, not just imitate thinking, to create "real art" not just a simulacrum of it. As if the fact of software producing simulacra at best is some kind of derogation or denial of the achievement represented by getting a result of this sort at all. Yet I don't think it is either of those things, but instead a recognition of the understandable limitations of the software on one hand, and programmers' understanding of how thought and creativity work on the other. We have something verging on a proof by induction that neither randomness nor pattern matching can explain them. If nothing else, referring at least to firmspace rather than meatspace enables us to see both the achievements represented by these massive software projects, and the fact that there is nothing superior about computers and software compared to firmspace. There isn't a hierarchy between the two. (Top) Unexpected Trivia Question Answers (2024-02-26) I suspect that everyone of a certain age remembers very well the infamous costume Carrie Fisher was forced to wear in her role as Princess Leia in Return of the Jedi. Her experience is one of the earliest movie scripts I am aware of where a main female character starts the movie as a hero with a proactive, effective role in her own right, who is soon reduced to a sexual object wearing something atrociously inappropriate. It didn't help that Leia is supposed to be behaving like a hero because of Han Solo, whom she has known for two minutes. When I first found that incongruous, I was of an age when such potential adult relationships couldn't make sense to me regardless of what a movie implied or said. Growing up made it seem even stranger, but I must admit increased my appreciation of Leia strangling Jabba the Hutt. In any case, like many people my age, I missed few opportunities to at least look at the glossy "making of" books that George Lucas and his marketing team were busy licensing as part of their growing empire of lucrative tie-ins. Such books are generally enjoyable to read, although tragically the printing quality of even the glossy ones meant to sit conspicuously on the coffee table has dropped precipitously. 
The worst example I have ever seen is from 2009 and came from the marketing extravaganza of Peter Jackson's Lord of the Rings trilogy. I had never seen a glossy publication where the colour screens were misaligned make it to the bookstore shelf before, though certainly I had seen comic books that had suffered such printing indignities. The reason this type of glossy "making of" book is relevant is that, by accident, I have never seen one for Return of the Jedi. So it remained a mystery to me just where Leia's appalling second costume of the movie came from, even if I was sure it came from somewhere. For good or ill, George Lucas is notoriously derivative in his visual design selections, and has similar leanings when it comes to music, such that one of the movie soundtrack composers he has worked with most often is of course John Williams. There is a certain pleasure in identifying the sources, and the original Star Wars trilogy is great practice with easy examples. This is probably an important factor in how successful these movies were, with so many echoes that could please kids and either please or annoy adults. Since it was long before the current era of crass external product placements, it seemed fun enough as long as nobody paid too much mind to the Leni Riefenstahl references. But oddly enough, I never did run into an explanation of where the Leia slave bikini costume came from, not even as a trivia question or "nerd fact" when the original Star Wars trilogy was rereleased on DVD and there were big viewing parties to see them again. Hilariously, while digging around in the internet archive's unlocked recordings collection I stumbled on the 1963 london symphony orchestra recording of Rimsky-Korsakov's symphonic suite Sheherazade. There, as can be seen in the quoted album cover above, is the unmistakable original of the costume Carrie Fisher must have half frozen to death in during the filming of Return of the Jedi. I have no idea who is wearing the costume on this album cover, and there is no notation even of the photographer that I can see on the scan of the album cover. Perhaps that is just as well, because this version of the costume is on its tiptoes at the very edge of being outright pornography. Such thoughts aside, by all accounts the music is rather good. But now knowing a bit more of the costume's visual origins anyway, and the stories it is associated with, the selection of it and the scenes Carrie Fisher gets stuck wearing the derivative version in make even less sense. And never fear, I was and am under no illusions about the coherency of George Lucas' scripts. They aren't coherent; it's all about the visuals and a romp through a hackneyed storyline without being too boring about it. Then again, I suppose that by the late 1970s Lucas figured he could include more imagery oriented to post-pubertal males and they would be the least likely to ask puzzled questions such as, "Why would a giant slug-guy find non-slug people pleasing to look at or have chained to them with nearly nothing on?" It's even stranger to think about if you happen to have seen any of the comic book versions of the Star Wars trilogy, in which Jabba the Hutt is still shown as a two-legged, lizard type. Obviously there is nothing profound anywhere in this thoughtpiece, just a bit of silliness. Do have a look and a listen in the unlocked recordings section of the internet archive.
There are hundreds of gems set free by the united states music modernization act, including albums by Edith Piaf and Nana Mouskouri, Jimi Hendrix and Thelonious Monk. There is also an impressive number of language instruction recordings and sound effects collections. (Top) Information Architecture and Its Discontents (2024-02-19) Hard as it may be to believe, Rosenfeld and Morville first published their book on information architecture concepts as applied to websites over twenty-five years ago. I read a second or third printing of the first edition for a new job because one of my tasks was to design and build an internal website that had to organize and serve a wide range of documents and information types. It is worth skimming this original edition just to see how much the visual interfaces of web browsers have changed, and to see just what a blight online advertising has become. Later on I read the second edition of Information Architecture for the World Wide Web, which included a recommendation not to use the original edition at all. Oddly enough, while the second edition is much heftier and has many more examples and longer discussions, I did not find it especially useful in comparison to its predecessor. This has more to do with the second edition's orientation to commercial sites, and with the fact that the issue of trying to wedge everything into blogging software or something emulating sharepoint had not yet taken hold. For my professional situation, fending off the demand to tear apart a sane and maintainable information architecture to artificially force the pages into a database was the big task. I had not yet come to understand that employers and the vendors of those databases want "websites" to be built with their software to destroy their portability and the ability to maintain them without expensive licenses. Ah, those were more naïve days. Overall it seems like most professional web developers took on board only certain aspects of the best practice recommendations. It is fair to say that foisting firmspace metaphors onto websites is no longer a thing, although as always there are designers and artists who implement opaque interfaces for creative reasons. I can't remember the last time I ran into an "under construction" page, although placeholder pages and cybersquatter pages are commonplaces now. The "miscellaneous" and "stuff" categories have generally vanished in terms of having a navigation link, although there are probably still a few of them around. Since a good integrated development environment includes support for projects, it is usually simple to keep folders of working documents and "not sure where to put this yet" items where they won't be published but will be backed up with everything else. Yet the underlying principles and recommendations around building and tweaking websites to keep them navigable and reasonable to learn by exploration have mostly fallen by the wayside. After all, the general attitude today seems to be that a centralized search engine will provide navigation, and that too much transparency will facilitate "stealing the site design" or something along those lines. Meanwhile, the software used to manage very large websites that depend on massive databases is more and more often using obscure link names, which actually block such basic and useful practices as checking article titles or blogpost dates. Websites are difficult to manage in large organizations because, as Rosenfeld and Morville reiterate, they are charged with internal politics.
Hence the number of redesigns to such an organization's website can often be estimated with remarkable accuracy by counting the number of navigation systems it has. I was skeptical of this measure until one day, in a meeting discussing a planned redesign of just such a site, I pulled up the front page of the current site to better follow the proposed changes in the mock-up versus what was there. To my amazement, there were already seven different navigation systems implemented, and all of them performing so poorly that most users had to find the specific thing they wanted by using an external search engine. In the end the redesign made the site look fancier and incorporated slide shows on every main page, and added yet another navigation system. Generally the most efficient way to find things on the site remains using an external search engine. Alas that internal politics supports encrustations on a website to the point that it is like a sunken ship covered in barnacles. The latest proposed solution to that is to create a "webapp" and have people use that instead. All of this suggests that one of the worst things when it comes to designing organization websites is the insistence that it should be all things to all people and meet the internal political demands for a "consistent look and feel," for which read "control by marketing/human resources." This is a problem information architecture simply can't solve. Internal websites and public facing websites for the same organization are typically quite different from one another. That actually isn't about internal politics, it is about genuinely different needs, including differences in security requirements and so forth. For good or ill, internal websites often ignore web standards and best practices because the employer imposes software and hardware requirements, leaving staff stuck using a barely functional web browser with a site full of proprietary plug-ins and shoddy designs imitating one or another "social media" platform. But even external websites can't actually do everything for everyone who might have reason to visit them. A website focussed on marketing and sales calls for different affordances and underlying software for file management than one intended to support research or uploading files. Yet it is true that for some organizations there is a genuine need for all three types of site. Rather than actually build three closely related and connected sites, the general approach seems to be to shove everything under the aegis of the marketing site. This exacerbates and encourages navigation and maintenance issues. The surreal result is that in the quest to produce websites that are all about marketing and "branding," web developers are pressured to produce and realize designs that obfuscate information in every way possible. This is so even where the results are at complete odds with what a person might reasonably expect an organization's managers to want. In my most recent experience, this includes an online catalogue that demanded I provide a postal code to search for a product even though I had selected an online ordering option, then provided no means to actually narrow the search results to an outlet in my area. It also provided no way to sort new products from listings for second hand items, nor products sold by the company directly from those posted by affiliated resellers. These are nontrivial issues of such severity that in the end I gave up not just on the site but also on that company.
So this probably expensively designed and implemented website does the opposite of its ostensible purpose even as a way to market and encourage sales. (Top) Microsoft's Approval is Stigma, Not Reassurance (2024-02-12) Seriously, and despite my personal dislike of microsoft as a corporation and of the way it designs and licenses its software, I do wish it were otherwise. It would be better if organizations like microsoft were not too busy pursuing the worst sort of capitalist tactics at the expense of everything else to actually develop and support solid, secure software that individual persons can trust. Being a grown-up and having awareness of my surroundings, I get that, as "operating system" and companion software vendors go, microsoft is no better than apple, and the differences between them were originally happy accidents. Now they are fossils of a very different time, with vestigial parts, some of which have real positive potential but have been left to wear away. But "microsoft approved" or recommended is now a waving flag telling anyone who is serious about the security and dependability of their systems to run away. I can no more completely avoid microsoft software than anyone else, so I get a constant dribble of direct information via the experience of trying to use an employer-provided work station for day to day tasks that really should be no big deal and should not entail constantly calling a microsoft server somewhere to give up data and beg for a background pat on the head. I did not and do not expect windows to look pretty, and would gladly trade down to windows 7 graphics to get the genuine stability at least of that system for working at the office purposes. Employer-provided machines are not secure from an employee perspective and should always be treated as a security nightmare about to happen – sad but true. I have colleagues who manage personal matters on their company work stations, which makes my hair stand on end. No matter how innocent my personal matters are, to my mind there is absolutely no justification to manage them through a machine I don't own or administer. For at the office purposes, well, microsoft software keeps getting worse. For windows itself, after an update to the latest version, which always imposes an almost full day's loss of real work because all the software and network features need to be tested again and then debugged, there were additional problems. To shut down the ersatz facebook and advertisement collection jammed into the "new and improved" taskbar, I had to head straight to a search engine. Then I had to figure out how to turn off all manner of new and distracting alerts, then reset a couple of alert sounds that were needed but too loud and too long. I soon discovered that the settings to adjust keyboard backlighting have either evaporated from this particular laptop series, or perhaps they have been removed by the employer for obscure reasons best known to themselves. This is too bad, because the default settings are such that the keyboard lights up only exactly as long as I am hitting at least one key. After that all the lighting turns off instantly, so my hope was to just turn the keyboard backlighting off altogether because it turned out that this is a surprisingly distracting setting. Who knows what is going on with the mouse drivers, as the mouse I use with the work station came with it and plugs in through a usb port.
Yes, a wired mouse, and I was told bluntly by the IT staff at work, "Never lose that mouse." They understandably find support ticket requests to make more fashionable wireless mice work properly maddening. All of this just to document that when pressed, I do indeed seek a guarded modus operandi with microsoft products when I can't avoid them. There are issues with microsoft apart from its corporate policies or aesthetic concerns. In terms of its overall philosophy, it seems clear that the designers at microsoft in both the software and user interface sections don't really understand the needs of people who use their products, don't care, or in the least bad scenario are prevented from applying their expertise. Software that meets microsoft criteria is miserable to use and quick to wreck or lose data, which I suppose is considered a plus by the "cloud" team who want everybody to store data on microsoft servers so it won't be lost when microsoft's software crashes or dies outright. Based on the menus and font choices in windows, everything about onscreen typography and legibility has been thrown out to reinstate all caps labels and impose such stupidities as a "tell me what to do" option. The only reason the stupid "tell me what to do" buttons are not obviously mocked online is because somebody realized just in time that if they were animated in any way they would invoke the sort of vitriol inspired by the wretched "Clippy." Even my colleagues who think windows is wonderful hated "Clippy," nor are they fans of anything called a "wizard" although they are good with a sensibly arranged "set up assistant." I can't say from observation or direct experience whether microsoft has those even for windows, but suspect that such things would be all about trying to autofill data and applying default settings while making it very difficult to change or correct them. Add to that the horrors of the windows registry and libraries loaded in the background, which in my experience guarantees at least one program, usually a critical one, starts a memory leak, and things are not looking good. If this is what is okay in flagship microsoft products, the company is hardly going to signal boost other programs and projects that show them up. I don't quite know what to make of the ongoing saga of ubuntu apparently striving for some kind of co-arrangement with microsoft by prioritizing support for various microsoft protocols. It bodes ill though, and is one of the reasons I gave up on that gnu/linux distribution. That said, it does lead me to wonder if a potential pivot microsoft is considering, besides the blatant attempt to enforce Bill Gates' dream of reducing computers running windows to dumb terminals that won't work unless on the network and logged into redmond, is to replace the zombie windows system with a microsoft-branded gnu/linux. There is nothing technically stopping them from doing the latter, and it seems to me that windows has long passed the point that the original macos system reached in the early 2000s, which is another strike against using microsoft as a source of software recommendations. (Top) Yet Another Online Enclosure Attempt (2024-02-05) I appreciate that google's ongoing thrashing in the form of trying to enclose more and more "users" into corrals made up of web applications is not news. After all, back in october 2022 (e.g. 
here is a round up on the topic at techrights, Gmail Considered Harmful) google and microsoft began even more openly tag teaming in an effort to destroy the use of actual email clients, self-hosted email, and well-tested and supported related standards including imap and pop3. Part of what amazes me about the people who have been caught up in the steadily worsening products sold as "gmail" and "outlook 365" is that a significant portion of them are old enough to have rightly turned up their noses at AOHell and similar efforts to keep them dumb and helpless when surfing the web or otherwise using the internet. Younger people are often in the shittier position of being forced to use such pseudo-services and insecure software as microsoft, google, and yes apple products in order to complete and submit their school work or do wage jobs. More than one university has followed the MBA gospel of cutting costs and infrastructure until all "value" has been stripped and the organization folds. Hence over the past five to seven years now, universities have been throwing away their own mail servers to replace them with "outlook 365" and have long tried to force students and staff to use microsoft products by "offering" them prepaid subscriptions to microsoft office products. The pre-payment is via tuition for students, so it isn't like they can opt out. Meanwhile, I suspect that for faculty the pre-payment is taken out of their professional development fund. Governments have been busy jumping on the microsoft cloud bandwagon, so that in countries like canada everyone can look forward to the rogers telecom company having an equipment failure that shuts down most banking and many services, plus outages and security breaches through microsoft's collection of back doors. From what I have heard and observed, apple has become a junior partner as it has managed to capture market space formerly held by the blackberry. These enclosure attempts online and in terms of software and hardware keep happening, apparently with the idea that this will finally work just as the firmspace enclosure of land did, driving people into situations where they can only survive by working for wages for even the most vile of employers. The fight in that case isn't over yet, and there is plenty of real and ongoing successful resistance. Contrary to the pessimistic claims in some quarters that the online world is all done now, reduced to just a surveillance and propaganda system under the control of various militaries and their corporate frontmen, I do think that there are more options, even quite accessible ones. But if everyone can be convinced that there is nothing available except their poisoned offerings, surveillance, and generally miserable stuff, then it doesn't matter if there are any other options after all. The thing is, not everyone is so convinced at all, and in this I am not referring to the people who have decided the thing to do is to rush to an alternate version of the same thing that has collapsed or is in the process of collapsing, like "social media" and "smart phones." For one thing, online access is not universal or an easy thing for "everyone" to get and use as a matter of practicality. For example, many people can't be "online" regardless of whether they want to be, because it costs too much, or the electricity supply is irregular or such that the priority is to use it for necessities, so that if they do go online, they go online with a purpose, do what they have to do, and close the connection.
Furthermore, it is quite clear that capitalism can't continue, and people understand this on a direct day to day basis. The cost of living is key of course, and more people are standing together to enforce their right to repair and continue using older hardware of all kinds. Where that hardware includes computers, there is a growing understanding and teaming up by people to insist on their right to use free-libre software that can be properly security vetted and patched to their satisfaction. All of which leads me to a rather unexpected idea. What if there is a greater parallel between land enclosures and online enclosures than meets the eye? What if they in fact share being responses by exploiters who are fearful lest their marks simply walk away and ignore them? Capitalist systems cannot survive without constant predation. Or if a less violent analogy is preferable, then it is like a hot air balloon: it can't stay in the air without constant renewal of its hot air, which means it constantly needs fuel to heat the air and a pump to keep the balloon inflated. Sure, a self-destructive air balloon pilot could dump as much of the weight attached to the balloon as possible, but in the end it still comes down to somehow maintaining supplies and protection from the elements from the ground. So the various capitalist interests are hoping against hope to finally find the perfect, neverending means to maintain their power and profits by controlling who eats and how they eat, and, where that can't be done too overtly, as in "first world countries," by forcing everyone to be "online" to do necessary day to day tasks. Failing that, there is always spreading and fuelling addiction by inventing new ways to hijack the human dopamine system. Hence the ever-increasing number of legal, highly addictive opiates, the continuing policy of allowing drug dealers and their suppliers to run rampant while criminalizing their victims, the addicts. Then of course there are the somewhat newer addictions, like gambling and pornography, and newest of all social media. But all of this indicates desperation and barrel scraping, and as I observed in a much older thoughtpiece, What If There is Nowhere Else to Go?, once there is nowhere else to go, nothing else to find a way to profit from, and no means left that works to force people to continue in the same suicidal way, things get complicated. If running away or stealing from others are no longer feasible means to keep a system running, then like it or not, the general population are going to seek and try other options. I actually think one real option people may be looking at is shutting down what is currently "the internet" let alone "the web" as we know it. There are lots of respectful and useful ways to make use of the remarkable tools that computers are, and we humans are social animals who I suspect can't help but hear the siren call of enabling and maintaining hyper-long distance communication. But there are certainly more constructive and life-friendly (by which I mean all life, not just human) ways to do it, and people are onto them. (Top) Messaging Fail (2024-01-29) From what I have read and heard, this is supposed to be an era of total propaganda – oops, I mean controlled public messaging – no, public relations war – ah, the hell with it, an era of total propaganda.
By accident I wound up sans television set back in the late 1990s, and have never replaced it, though of course, like many people with enough good luck, yes I have an external computer monitor. Yet I have never purchased a cable subscription again, and this not for lack of sampling television fare. While travelling and staying in hotels, I do take a look at what is on the television these days, but have found that matters have gotten so bad that it takes less than ten minutes before between five and seven commercials run. Worse yet, unlike the olden days when cable channels were owned by several companies whose commercial breaks did not overlap, now they are all in synch. So much for that. Even the few radio stations a person can actually pick up using a regular radio are mostly overrun with advertising or transparently repeating propaganda bulletins on the hour in lieu of "news." At least on the latter point, radio has form: as something for the broader population it was supposed to tell everyone what to think. That is why today it often isn't possible to get a decent weather report from a radio station unless you tune into shortwave stations favoured by sailors. The cable-television-ization of the web is pretty much done, although I do think it is still possible to get around the search engine monopolies if webmasters who are not more concerned to shill propaganda than anything else are willing to work together. But for the moment the propaganda shysters are most prominent, and their mendacious speech ways and pseudo-arguments are taking up a great deal of space. Case in point is the stupidity of recent campaigns for certain things on the basis of "deserving" whatever the thing is. If a person or group "deserves" something, they are deemed by somebody else to have "earned" the thing by showing the right qualities. There is a lot of condescension in the term, which its latin root "serv-" gives away. The latin word for "slave" was "servus," which later mutated into the term "serf." Truth be told, I think the ancient greeks were far more honest in their terminology for people entrapped into labouring against their own interests for others. Their regular word was "doulos" and the root there refers to trapping or ensnaring someone. Either way, such people were not deemed social persons, and that became a useful rationalization for treating them to random torture or execution whenever their pretended owners thought such violence would wring more utility out of the slave. A really pleasing slave was so pleasing that they could deserve to be manumitted, released from slavery. But of course, the best slaves never wanted to be free. Even with no interest or knowledge of these details, there is something not quite on with the notion of "deserving" today, I think because it is now being deployed in a clumsy form of attempted moral suasion. It probably doesn't help that there is also a strong whiff of paternalism around it, and a sense of something akin to a child deserving a treat if they clean their room and don't cry when their hair is combed. This is all the worse when it comes to matters that are in fact arguments about rights. There is a clumsy series of over-wordy signs in a few yards in my town declaiming that "everyone deserves a family doctor."
On first seeing these, I found myself pondering why the signs bugged me, even though I support correcting the perverse incentives that are destroying the ranks of general practice physicians so that people are not reduced to having no access to medical advice and treatment outside of an emergency room basis. That strikes me as basic human necessity and decency. That isn't something earned; people have a right to assistance from a doctor when they need it, including preventative medicine, currently out of fashion because MBAs say it is unprofitable. I also agree with the older practice of sharing and maintaining sensible basic medical knowledge, the sort of thing inappropriately formalized into first aid classes. Unfortunately such formalizing mostly convinces people either that they don't need it, because why should they allow anyone else to teach them, or that it costs time and money they can't afford. In other words, these yard signs end up tugging on a thread of troubles ranging from deliberate policies undertaken to remove original basic medical care provided by non-professionals within communities and families, to ongoing neoliberalization that incentivizes people to become doctors "for the money," which drives them into price-gouging specialities and out of general practice. It doesn't matter whether a person training to be a doctor would prefer to be a general practitioner, if the cost of becoming a medical doctor is so high that the only way to pay off the expenses is to already be rich or to be a specialist in a fashionable area. A mealy-mouthed beg that "everyone deserves a family doctor," instead of facing up to the systemic factors that make it impossible in some areas for at least a quarter of the population to have any consistent access to a general practitioner, including via execrable walk-in clinics, makes for a massive messaging fail. Who is going to be rallied to what amounts to begging for crumbs at a table, which comes down to declaring incapacity to make any real change? But I suppose that is the benefit of that sort of message. It allows the people repeating it to show their bellies to their "superiors" while also showing their "inferiors" that really, they care. In fact, that is probably the biggest messaging fail of all in the "they deserve it" tack: it is an expression of cynicism. (Top) Uncomfortable Echoes (2024-01-22) As the best history books do, Ned and Constance Sublette's The American Slave Coast: A History of the Slave-Breeding Industry provides a feast of food for thought. It took me some time to pick up the book and sit down with it, because I was not at all certain how they would handle their sources. I was nervous about the almost pornographic treatment of information about slave-breeding in other books. Even if the treatment was not pornographic in the usual and more literal sense, I was a bit worried about the potential for presentation of the history in a manner that provides vicarious and puerile thrills in the form of hyper-simplified comic book-style hero-villain depictions. On one hand this hardly seemed remotely likely based on what I had learned about Ned and Constance Sublette as scholars, and the book itself is not a popular history. On the other, sometimes strange things happen between original drafts, rounds of edits, and final publication. 
Thankfully, my concerns on these points were completely overcome within the first ten pages, and the subsequent narrative is as properly uncomfortable and full of important information as could be wished. I should also note that, as a person who did not grow up in the united states and does not live there, I found the Sublettes provided just enough help to ensure readers like myself did not have to stop and look up dates and names to understand what was going on. For many reasons the early history of slavery in the united states continues to deeply impact it to this day. I do wonder if many readers have really taken note of the Sublettes' comment on the last page, 668, in the last two paragraphs. Antebellum slavery required a complex of social, legal, financial, and political institutions structured to maximize profit that flowed only to a small elite, while leaving the rest of the population poor. It wanted no legal oversight beyond the local, no public education, and no dissent. For laborers, it wanted no personhood, no wages, education, privacy, clothing, human rights, civic identity, civil rights, reproductive rights, or even the right to keep a stable family. It existed at the cost of everything else in society, including the most basic notions of humanity. The history of the slave-breeding industry demonstrates how far the unrestrained pursuit of profit can go. Anyone reading these two paragraphs could reasonably feel very uncomfortable right now. There are undeniably grave issues of deliberate destruction of the mechanisms by which the general population, not just the richest capitalists, could have true influence on the policies of what purport to be their democratic governments in much of the so-called "west." There is often almost no recourse against local government abuses except expensive litigation, decades of local activism, or both. The war against public education in the united states is infamous, and its imitators are just as determined to destroy it in countries like canada. Dissent isn't totally banned yet, but we are certainly not permitted to dissent on a wider and wider range of matters, for less and less reason other than that a corporation demands obedience to increase its profits. The multi-decade attempt to utterly destroy privacy for anyone but the rich is still ongoing in the form of attempts to pressure everyone into having a cell phone, then having a "smart" cell phone, then having a "smart" cell phone with multiple privacy-violating applications. People are indeed very aware of this, including the young "digital natives" who calmly tape over cameras on their phones and laptops and do the best they can to resist by swapping sim cards and the like. I could go on, but the point is clear. Of course, connecting the evils of slavery with the way they echo the perverse incentives of capitalism is hardly a popular step with the very rich people who exert more and more pressure in an effort to completely censor any and all media. Never mind that such efforts amount to trying to plug leaks in a disintegrating boat out at sea. It is both fascinating and horrifying to realize that enslaved people were redefined as a type of capital, and therefore yes, it is true and no exaggeration to say that the southern states of the united states that sought to secede were fighting to preserve their capital, which meant preserving slavery. There is a horrible logic to it, and the corrosive rationalizations for the brutality of the system always came down to, "slaves are not people like us." 
And this is not "people" in the only marginally less vicious sense of "people, but not of our class," but "not even human at all." Presumptions about supposed lack of intelligence or ability to feel pain in african americans remain embedded in medical and educational guidelines at least in northern north america. Those presumptions were extended to other enslaved populations, including kidnapped Indigenous peoples of the americas, and then extended some more to indentured labourers and finally just plain labourers. The all important difference of course, was that the non-slave labourers were encouraged to share the racist presumptions of their employers, and then show they did so by abusing enslaved and free peoples who were of other "races." Part of what brought the original slavery system down around the ears of its proponents in the united states was the determination of new-style capitalists who were able to achieve super-profits by hyper-exploiting free labour. Furthermore, they wanted free labourers who could spend their own money, as the new-style capitalists were developing and building a mass-consumption based profit model. Another big part was that more and more people found that the old rationalizations for slavery were not enough, and that they could openly dissent from going along. I think there was also a paradoxical result of the constant doubling down on racist rhetoric, because racist rhetoric sounds more and more absurd and harebrained, as well as dangerous and critically important to oppose, the more it is engaged in. Hence the wild proliferation of euphemisms and fine-grained pseudo-categories. We are seeing a similar dynamic today, predicated on classist, sexist, and racist prejudices. (Top) Exploiters' Tactics (2024-01-15) I have read many brief mentions of Frederick Douglass, who won his freedom and literacy to become one of the great african american speakers and writers of the nineteenth century. Reproduced here is a copy of a rarely featured picture from his 1855 book, My Bondage and My Freedom. It seems not wholly accidental that mainstream citations lean more towards showing him in his older age, as opposed to a picture like this one. Considering that the preponderance of citations of Douglass online and off that I have encountered are by united states-based non-african american writers, perhaps this should not be surprising. Even less surprising is how rarely I happen on lengthy excerpts from Douglass' keen analysis of politics and economics in his time, except at the wonderful site Black Agenda Report founded by the late Glenn Ford and now led by Margaret Kimberley. The following quote is one of many excellent citations of african americans in The American Slave Coast: A History of the Slave-Breeding Inustry by Ned and Constance Sublette. It is much easier to trace nineteenth citations now to the original texts through the internet archive. As the times continue to be the wrong sort of interesting and the appearance of stability of the parasitic capitalist system collapses, the need for us all to understand the tactics applied by oppressors to keep the rest of us down grows ever greater. We ignore the hard won knowledge of our predecessors at our peril. Just because officially chattel slavery is not supposed to exist anymore, or the analysis in a given book or other source is more than a hundred years old, that does not in itself make it irrelevant. 
After all, no such age cut off is applied to the documents and sources convenient to power, such as various works by ancient greeks and romans, or such italian writers as Machiavelli or Boccaccio, who lived in the fifteenth and fourteenth centuries, respectively. We should always turn right around and look up whoever we are told not to read at all, let alone carefully, especially if we have any reason to think the point is to prevent us from questioning the status quo. In that case we are rarely ill-served by reading whatever we are not "supposed" to. Merely reading something is not going to change our minds if it does not say anything relevant to what we are experiencing. If whatever the source is turns out to be truly irrelevant, it will strike us as such when we read it. But enough preamble, let us turn to Douglass and his incisive commentary on how slaveholders turned white labourers against slaves. The circumstance which led to [Master Hugh] taking me away, was a brutal outrage, committed upon me by the white apprentices of the ship-yard. The fight was a desperate one, and I came out of it most shockingly mangled. I was cut and bruised in sundry places, and my left eye was nearly knocked out of its socket. The facts, leading to this barbarous outrage upon me, illustrate a phase of slavery destined to become an important element in the overthrow of the slave system, and I may, therefore state them with some minuteness. That phase is this: the conflict of slavery with the interests of the white mechanics and laborers of the south. In the country, this conflict is not so apparent; but, in cities, such as Baltimore, Richmond, New Orleans, Mobile, &c., it is seen pretty clearly. The slaveholders, with a craftiness peculiar to themselves, by encouraging the enmity of the poor, laboring white man against the blacks, succeeds in making the said white man almost as much a slave as the black slave himself. The difference between the white slave, and the black slave, is this: the latter belongs to one slaveholder, and the former belongs to all the slaveholders, collectively. The white slave has taken from him, by indirection, what the black slave has taken from him, directly, and without ceremony. Both are plundered, and by the same plunderers. The slave is robbed, by his master, of all his earnings, above what is required for his bare physical necessities; and the white man is robbed by the slave system, of the just results of his labor, because he is flung into competition with a class of laborers who work without wages. The competition, and its injurious consequences, will, one day, array the non-slaveholding white people of the slave states, against the slave system, and make them the most effective workers against the great evil. At present, the slaveholders blind them to this competition, by keeping alive their prejudice against the slaves, as men – not against them as slaves. They appeal to their pride, often denouncing emancipation, as tending to place the white working man, on an equality with negroes, and, by this means, they succeed in drawing off the minds of the poor whites from the real fact, that, by the rich slave-master, they are already regarded as but a single remove from equality with the slave. The impression is cunningly made, that slavery is the only power that can prevent the laboring white man from falling to the level of the slave's poverty and degradation. 
To make this enmity deep and broad, between the slave and the poor white man, the latter is allowed to abuse and whip the former, without hinderance. But – as I have suggested – this state of facts prevails mostly in the country. In the city of Baltimore, there are not unfrequent murmurs, that educating the slaves to be mechanics may, in the end, give slave-masters power to dispense with the services of the poor white man altogether. But, with characteristic dread of offending the slaveholders, these poor, white mechanics in Mr. Gardiner's ship-yard – instead of applying the natural, honest remedy for the apprehended evil, and objecting at once to work there by the side of slaves – made a cowardly attack upon the free colored mechanics, saying they were eating the bread which should be eaten by American freemen, and swearing that they would not work with them. The feeling was, really, against having their labor brought into competition with that of the colored people at all; but it was too much to strike directly at the interest of the slaveholders; and, therefore – proving their servility and cowardice – they dealt their blows on the poor, colored freeman, and aimed to prevent him from serving himself, in the evening of life, with the trade with which he had served his master, during the more vigorous portion of his days. Had they succeeded in driving the black freemen out of the ship yard, they would have determined also upon the removal of the black slaves. The feeling was very bitter toward all colored people in Baltimore, about this time (1836), and they – free and slave – suffered all manner of insult and wrong. – From Frederick Douglass, My Bondage and My Freedom, pages 309-311 This tactic is still in use to this day, and we should all be on guard at the first hint of it among the all too many divide and conquer techniques. Its vicious effectiveness, as always, depends upon a highly tempting piece of bait, one that can be especially difficult to refuse when others who have taken it pressure those who resist to go along with them. The ones who took the bait are understandably embarrassed, and exploiters depend on this embarrassment to act as a further enforcement mechanism. That is, they depend on those who took the bait to blame the ones who didn't for their embarrassment, even though if anyone is responsible for it, it is truly the men exploiting them. But they have often also been convinced that they can't do anything to hold the exploiter to account without excessive risk to themselves, so they go after their resistant peers and thereby do the exploiters' work for them. Few tactics are crueller than this one. (Top)
Thought Pieces
Copyright © C. Osborne 2024
Last Modified: Sunday, September 08, 2024 13:53:00