Title graphic of the Moonspeaker website.



The Moonspeaker:
Where Some Ideas Are Stranger Than Others...


Cover of historian of technology Marie Hicks' 2017 book published with MIT press, *Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing.* The paperback edition is still in print.

Originally, questions about the role and use of computers stayed behind certain types of doors, especially military and academic ones. This was a logical outcome of their origins. Computers began as machines expected to serve limited purposes for a restricted number of highly placed men, men in charge of such tasks as national censuses, corporate and military records management, and perhaps most famously, encryption and decryption. Many timelines and overviews of "the history of computers" show the effects of the intermittent partial incorporation of new detail as military records age out of various forms of "restricted" status. Over time british engineers, academics, and military officers have all received belated recognition of their code breaking work at the now famous landmark victorian folly, england's bletchley park. Renewed research has also brought to light the irascible Charles Babbage and his early computing engine designs, the budding mathematician Ada Lovelace who could not reach her potential due to her social position and early death from uterine cancer, and Herman Hollerith, whose tabulating machines became the foundation product line of the ibm corporation. All were important contributors to the development of today's computers, even if their contributions were not always direct, yet the episodic treatment of this development in most popular histories tends to leave us ill-equipped for present-day debates. The emphasis is on the technical details of the machines, eliding almost all of their socio-political context. Admittedly, this provides a ready structure for telling a tale, commencing with the first machines able to count and carry out certain arithmetic operations rapidly and accurately, then moving on to subsequent elaborations on the earlier solutions. Shifting away from overview histories moves the reader promptly into other questions, especially the use and abuse of late computing machines and early computers in war and late capitalist economies.
There remains another angle not much explored yet: the contrast between Alan Turing's independent conceptualization of computing machines and the one characterizing the work of Babbage, Hollerith, and their military and corporate successors.

The best known and older line of thinking goes all the way back to the Jacquard loom, with its early implementation of a punched card control system to improve the speed, accuracy, and number of threads the machines could handle. In time the use of punched cards as a means of carrying out data processing found its way into the core of Charles Babbage's design for his analytical engine, as well as Herman Hollerith's for his tabulating machines. Babbage's efforts ran aground on the shoals of his own personality and the persistent contempt for manual labourers embedded in anglo societies, which led him to alienate the engineers he needed to actually build his designs. Hollerith had the benefit of a different background: son of german immigrants to the united states, he completed undergraduate and graduate work in engineering, and found himself at the right place and time to put his early machines to work in the united states census office. Both Hollerith and Babbage were struggling to solve the problem of how to handle data processing requiring thousands of simple calculations completed accurately and in a timely fashion. The original approach to handling masses of data had been to bring together groups of low paid "computers," originally young men, later preferentially women, to carry out one step in the calculation again and again, passing their work on to the next person. The resulting division of labour was expected to work because each human computer was explicitly not supposed to apply their own judgement, and all the better if they could complete the calculations as mindlessly as possible. However, this demand for mindlessness is at odds with maintaining accuracy, as is the tedium inherent to this type of repetitious task. The computing machine designs were meant to overcome the human failure to behave like machines.
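The drudgery Babbage set out to mechanize can be made concrete. Below is an illustrative Python sketch, mine rather than anything drawn from the sources discussed here, of the method of finite differences, the principle behind the difference engine: after a brief setup, every new table entry costs nothing but additions, exactly the kind of single repeated operation assigned to rooms of human computers.

```python
def tabulate(f, x0, steps, degree):
    """Tabulate a degree-n polynomial at x0, x0+1, ... using only additions
    after an initial seed, in the manner of the method of finite differences."""
    # Seed: evaluate f directly at degree+1 points, then form the forward
    # difference table; for a polynomial, the last difference is constant.
    seed = [f(x0 + i) for i in range(degree + 1)]
    d = []
    col = seed[:]
    while col:
        d.append(col[0])
        col = [b - a for a, b in zip(col, col[1:])]
    # From here on, every new table entry costs only `degree` additions.
    out = []
    for _ in range(steps):
        out.append(d[0])
        for i in range(degree):
            d[i] += d[i + 1]
    return out

# Babbage demonstrated his prototype with the prime-generating polynomial
# x^2 + x + 41; its first six values emerge from pure addition.
print(tabulate(lambda x: x * x + x + 41, 0, 6, 2))  # [41, 43, 47, 53, 61, 71]
```

Once the seed values are in place, no multiplication and no judgement are required, which is precisely why the procedure could be split among interchangeable human computers or cast into brass.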

Hollerith's designs developed into limited purpose and limited access machines. Because they were tuned to carry out a narrow range of tasks and algorithms, by the time Thomas J. Watson took over "international business machines," the path to a lease-based sales model was clear. For the purposes of marketing and corporate control, Watson emphasized uniformity of training and clothing in his technical staff, the men whose job was to set up, tune, and maintain the machines under the lease agreements. Even so, the most famous early calculating machines in english-speaking countries were more homely and analogue: slide rules. For a time these challenged the engineer's compass as the best known symbol of engineering as a discipline, before their abrupt eclipse by early electronic calculators. That acknowledged, it is important to take note of the role of corporate demands for conformity of action, thought, and even personal comportment in Watson's version of ibm. Its resemblance to military and religious regimens is not coincidental, and is also important to understanding aspects of later responses to computers and their use. Along this line of development, the point was to keep computing machines limited access, mysterious objects requiring a special cadre of men to maintain them and ensure their accuracy. The resulting mystique was a powerful impetus to sales, advertising, and staff recruitment.

Andrew Hodges described Alan Turing's insistence on working out mathematical problems alone, not always to his own benefit, as he spent time retracing earlier research. Yet on balance Turing's instinct to follow his own path produced dividends that to this day are not fully appreciated. The best known is his conceptualization of a general purpose calculating machine. A famous non-conformist, Turing was very much the antithesis of the ranks of ibm sales and leasing staff, let alone the various military officers and men he worked with and among during the second world war. He combined an extensive interest in theoretical mathematics with practical engineering during and after his war service, working on problems of memory storage, processing speed improvement, and assembly language programming. But what is often lost in discussions of Turing's work on computers and his thoughts on intelligence is his deep interest in biology. His later research indirectly reveals that Turing understood the incredible potential of computers for modelling more complex phenomena than trajectories and orbits. He realized the new machines made feasible otherwise impossible feats of mathematical modelling of complex systems, such as his studies of the relationship of chemical gradients to cell growth and the development of spots and stripes on skin and fur. Turing's point in proposing what is now known as the "Turing test" was to improve the study of how humans think, not to spur an obsession with writing programs capable of imitating human responses to prompts. As Diane Proudfoot stated in her proposed interpretation of the Turing test, "Turing observed that if an observer could work out the rules behind a machine's behaviour, that would determine that it was not intelligent." Furthermore, contrary to what we might expect from posthumous representations of Turing on the large and small screen, he gave serious thought to how people might respond to computers.
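Turing's morphogenesis work rewards a concrete illustration. The sketch below is not Turing's own system but the Gray-Scott model, a simpler later relative of the reaction-diffusion equations in his 1952 paper; the parameter values here are conventional illustrative choices, not anything from Turing's text. Two chemicals diffuse at different rates and react, and an almost uniform starting state can develop stable spatial structure, the mathematical seed of spots and stripes.

```python
def simulate(n=100, steps=2000, du=0.16, dv=0.08, feed=0.035, kill=0.065):
    """Evolve two chemical concentrations u, v on a ring of n cells using a
    Gray-Scott reaction-diffusion system and simple explicit time stepping."""
    u = [1.0] * n
    v = [0.0] * n
    # Seed a small disturbance in the middle; a perfectly uniform chemical
    # field would stay uniform forever, as Turing himself noted.
    for i in range(n // 2 - 3, n // 2 + 3):
        u[i], v[i] = 0.5, 0.25
    for _ in range(steps):
        nu, nv = u[:], v[:]
        for i in range(n):
            left, right = (i - 1) % n, (i + 1) % n
            lap_u = u[left] + u[right] - 2.0 * u[i]   # discrete diffusion
            lap_v = v[left] + v[right] - 2.0 * v[i]
            uvv = u[i] * v[i] * v[i]                   # reaction term
            nu[i] = u[i] + du * lap_u - uvv + feed * (1.0 - u[i])
            nv[i] = v[i] + dv * lap_v + uvv - (feed + kill) * v[i]
        u, v = nu, nv
    return u, v

u, v = simulate()
# Crude one-line rendering of where the second chemical has concentrated.
print("".join("#" if x > 0.1 else "." for x in v))
```

The point is not the particular pattern but that such behaviour emerges from nothing more than local diffusion and a simple reaction rule, the kind of modelling of complex systems Turing saw the new machines making possible.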
One of his most prescient comments on this point comes from his 1947 "Lecture on the Automatic Computing Engine," in which he stated:

The masters are liable to get replaced because as soon as any technique becomes at all stereotyped it becomes possible to devise a system of instruction tables which will enable the electronic computer to do it for itself. It may happen however that the masters will refuse to do this. They may be unwilling to let their jobs be stolen from them in this way. In that case they would surround the whole of their work with mystery and make excuses couched in well chosen gibberish, whenever any dangerous suggestions were made. I think that a reaction of this kind is a real danger.

As our present day computer-related dilemmas show, Turing's observation here is an understatement. It is an understatement not because Turing did not see the severity of the issue, but because of his wry sense of humour. He would spend much of the rest of his life struggling with the horrors of managerial mindsets and systems that impeded his research and ultimately killed him. The people with those same mindsets would focus on developing computers as a means to replace and surveil workers.


At this point, two major views of the computers we know today are in play. One comes from an authoritarian position that computers are fundamentally computing machines applied to management purposes by a restricted number of technocrats. The other derives from the realization that computing machines are in fact general purpose in nature, and can be programmed and used for practical purposes by anyone who is able to combine the interest, training, and machine access to do so. This realization knocked the original positioning of computers as elite items off its footing, and after successful miniaturization and improvements in the energy efficiency of computers and their peripherals, it was even harder to restore the mystique. By the late 1980s and early 1990s, computers had become something more akin to a "consumer" item, much to the disgust of the newly minted fraternity often referred to today as "techbros." But this was the first new generation, the ones formed by companies engaged in software and hardware development and striving to clear out the women who made up a considerable portion of the early ranks of computer programmers. JSTOR Daily writer Jess Romeo explains this in summarizing historian Nathan Ensmenger's research on the alteration of the original sex role stereotype of the programmer as a low-ranking, poorly paid young woman engaged in drudge work. The replacement sex role stereotype is clearly set out in such folk publications as the Jargon File and The Hacker's Dictionary. The new "real programmer" was a male pseudo-ascetic, indifferent to bodily needs and content with low wages so long as he can "code." It is hard to beat Romeo's pithy overview:

Companies selected candidates using aptitude tests that favored "antisocial, mathematically inclined, and male" candidates, Ensmenger finds. So, in classic snake-eats-tail fashion, workers who fit that type "became overrepresented in the programmer population, which in turn reinforced the original perception that programmers ought to be antisocial, mathematically inclined, and male."

Among the effects of this remodelling of who programmers supposedly must be, besides the eviction of most women from the field, were a new vogue for "extreme programming" and the behaviour patterns in young men chronicled and critiqued by commentators such as Joseph Weizenbaum and Sherry Turkle. The result had some accidental benefits of wider impact than the executives choosing to reshape their programming workforce likely intended. For one thing, it fed hundreds of scandalous news cycles, which made for lots of free advertising of the education now required to get these jobs, and of their revised sex stereotyping into "masculine and valued" instead of "feminine and degraded." Furthermore, false depictions of computers as requiring a strange, withdrawn, male priesthood to make them work began to spread throughout popular culture. Between this new imagery and depictions of adults as humiliatingly dependent on their own children just to set the clock on the new must-have technology of the 1980s, the video cassette recorder, the new computer executives shifted strategy. Having redefined their preferred workforce, they now had to redefine the products they were selling. It was time to make war on the general purpose computer and the unexpected, burgeoning success of the free software movement initiated by Richard Stallman in 1983. The executives now hoped to find ways to restrict general purpose computers after the fact. The managerial cadre of the new "technology companies" had access to massive amounts of money, an unfounded reputation for brilliance, and a growing understanding of how to manipulate the media and lobby governments. These were some of the fruits of the successful effort to reshape the training, credentials, and definition of who a "real programmer" was.

The next step was to begin breaking down the idea that ordinary people could own and manage computers without corporate supervision, let alone that they owned any data they could load or make on computers. The first propaganda effort to this end I am aware of featured, as such plays always seem to, Bill Gates and his inseparable project, microsoft. His special interest may relate to the origins and licensing of MS-DOS as much as Gates' openly acknowledged goal of having every computer be a microsoft dumb terminal that required a constant connection to microsoft servers and a paid up subscription to work. Calling potential and actual customers thieves is a recipe for failure, so this media line was allowed to fade away in favour of trumped up claims of mass computer-based "piracy" and then of a convenient epidemic of computer viruses. Former microsofter turned technology journalist Bryan Lunduke traces a broader and more successful strategy in his 12 january 2023 article, No Backup: The Demise of Physical Media. The title is interesting because it deletes the agent. Physical media did not merely die; they were actively attacked and murdered. I vividly remember scare campaigns during my early university career claiming disks that didn't come in shrink wrap with a corporate logo must contain some form of malicious software. Meanwhile, each newer model of computer came with fewer built-in options to read physical media. Computer and media corporations shared the task of lobbying and lawfare to extend copyright, impose digital restrictions, and create "software patents" and the absurdity known as "intellectual property." Today the bleeding edge is the fight to preserve the right to repair for everyone, which really means the right not to be extorted by businesses that sell computers or machines with computers embedded in them.

In the 1980s, it was still commonplace to build a home computer from a kit, including the earliest models sold by apple and the stalwart of early computer rooms, the commodore series. Putting these kits together demanded some basic training in soldering and reading circuit diagrams, but the parts and tools were generally available because of the overlap with equipment needed for radio and television repair. Hence the early market niche for computer sales was very much a hobbyist one. The growing availability of pre-assembled computers squeezed this niche, but didn't actually close it until the very nature of the components changed. By the 1990s computer customization demanded little more than a set of screw drivers after selecting and purchasing the parts. Today hands-on customization is more common among people interested in graphics and data processing intensive tasks, such as gaming, media production, and research modelling. People who take this approach to buying and maintaining their computers are a stubborn group: interested, willing and determined to research the details, and capable of rejecting poorly designed or actively hostile hardware and software. The strategy of choice against them is constant fear-mongering about security, combined with sustained efforts to introduce centrally controlled hardware modules. Once again, microsoft is at the infamous head of the pack in this area, with its push for hardware incorporating a "trusted platform module," which can be used to control which operating systems are allowed to load on the hardware. This is not as slick an approach as that taken by apple, with its fixation since around 2010 on rendering the hardware it sells impossible to repair or upgrade through a combination of excessive use of glue and non-standard components and fittings.
Understandably, critics investigating these hardware tactics are concerned about how they can prevent the person who purchased the computer from installing and running software of their own choosing, and from carrying out basic repairs and maintenance.

Free/libre software and hardware are critical alternatives to the proprietary and unauditable options sold by large technology corporations. Being "free/libre" in software means that the code is free to download, examine in human-readable form, change or improve, compile, and run. In hardware, it means the designs are similarly open to examination and free to share, eschewing such strategies as hidden extra operating systems and units. None of this entails that computer programmers and designers work without pay, although it may ultimately drive them to avoid working for large technology corporations. The free/libre software and hardware movement is one of the strongest sources of resistance to the older, authoritarian view of computers and computing. Among its strengths is grassroots support by volunteer contributions of labour power, code, designs, hardware, and funding. But this also makes the organizations developed to coordinate all this support vulnerable to co-optation by corporate funding. Technology corporations always label their money "donations" and "sponsorship," but it is clear that they expect to exert some form of control via influence on hiring and board membership in the various organizations. Computer scientist and technology journalist Roy Schestowitz and his colleagues have tracked the application of this strategy by multiple technology corporations at the TechRights website. They have documented the effects of this strategy not only on free/libre software and hardware organizations, but also on technology journalism more widely.

The war on general purpose computing has gone far over the past forty years, and it has had many casualties. Only now are individuals and communities realizing that attempts to break general purpose computing tie directly to efforts to legitimize constant violation of human rights, especially those to privacy, freedom of thought and speech, freedom from extortion, and freedom of association. What seems to have crystallized the issue for society at large is the growing spectre of "machine intelligence." Despite the sales success of books like Shoshana Zuboff's The Age of Surveillance Capitalism and Edward Snowden's Permanent Record, mass surveillance didn't bring this issue home. Problems with "internet of things" devices and acting on the right to repair didn't either. But the growing possibility that computers could be used to strip people of the ability to make a living didn't just make the relevance of the war on general purpose computing real to a far wider population; it has also led to a resurgence of interest in the english Luddites, their analysis, and their tactics.

The original Luddites were early nineteenth century english workers facing the first stirrings of what is better known today as the "industrial revolution" in britain. They were not simply opposed to the introduction of new machines because they were new. They objected to the machines because they were used as an excuse to throw skilled labourers out of work and slash the wages of all people who had to sell their labour power to live. To add insult to injury, the products of many of these early machines were inferior to what the workers made, an early warning of the way mass production can be manipulated to force people to buy inferior goods. Arguably the Luddites were open to machinery as a means to help them in their work, so that they could both produce what they were expected to by factory owners and have a shorter working day and better wages. This was not and is not an unreasonable vision. The workers in the factories who sew endless numbers of expensive sports shoes and cotton t-shirts continue a struggle with echoes of this simple demand to share the wealth, so that those whose labour power garners the capitalist so much profit may have the basic necessities of life. In many cases, this can mean adding as little as an additional $0.50 an hour to the workers' wages, a rounding error on the balance sheets of multinational corporations like nike and adidas.


Again and again, debates and concerns about computers come back to the concept of "intelligence." Prescient thinkers on the implications of programmable computers include, remarkably, Ada Lovelace, who observed in her famous translator's notes to L.F. Menabrea's article on Babbage's analytical engine:

It is desirable to guard against the possibility of exaggerated ideas that might arise as to the powers of the Analytical Engine. In considering any new subject, there is frequently a tendency, first, to overrate what we find to be already interesting or remarkable, and, secondly, by a sort of natural reaction, to undervalue the true state of the case, when we do discover that our notions have surpassed those that were really tenable.

The intermittent bouts of speculation and media frenzy over the purported abilities of the latest "artificial intelligence" or "machine learning" programs dependent on massive datasets and thousands of computer processors to produce output are wonderful recent examples of the very phenomenon Lovelace describes. She had no doubt of the real powers of processing and calculation potentially available if Babbage's analytical and difference engines were fully realized and put to work. Their output could reasonably be expected to be as dazzling as the most elaborate outputs of the best Jacquard looms, but it was important not to allow what we could imagine to outstrip what the machines could actually do. Over a hundred years later, in the 1960s, computer scientist Joseph Weizenbaum began raising important questions on the same themes, exploring and critiquing how "artificial intelligence" programs involved fooling people. Worse yet, people who had been fooled could and did become so committed to the belief in the illusion that they would refuse to believe the computer running the program was not "intelligent" in the usual human sense.
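Weizenbaum's warning is easier to appreciate once it is clear how little machinery the illusion requires. The following toy responder is written in the style of his ELIZA "DOCTOR" program; the patterns and canned replies are invented here for illustration and are not Weizenbaum's actual script.

```python
import random
import re

# A handful of keyword rules: match a phrase, echo a fragment of the
# speaker's own words back inside a canned template. These rules are
# illustrative inventions, not Weizenbaum's published patterns.
RULES = [
    (re.compile(r"\bI need (.*)", re.I),
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (re.compile(r"\bI am (.*)", re.I),
     ["How long have you been {0}?", "Why do you say you are {0}?"]),
    (re.compile(r"\bmy (\w+)", re.I),
     ["Tell me more about your {0}."]),
]
DEFAULTS = ["Please go on.", "I see.", "What does that suggest to you?"]

def respond(utterance):
    """Return a canned reflection from the first rule that matches."""
    for pattern, templates in RULES:
        match = pattern.search(utterance)
        if match:
            return random.choice(templates).format(*match.groups())
    return random.choice(DEFAULTS)

print(respond("I need a holiday"))  # prints one of the two "need" replies
```

There is no understanding anywhere in this loop, only pattern matching and substitution, yet people confronted with the original readily attributed intelligence and even sympathy to it, which was exactly the phenomenon that alarmed Weizenbaum.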

As I often do when exploring topics like this one, I spent some time delving into the Oxford English Dictionary to see what it had to say about "intelligence," including the secondary variety, "military intelligence." Besides accidentally generating the chuckle-worthy error message below instead of a cross-reference such as "see n. intelligence, sense 2," it reported the first sense of the word as "the ability to acquire and apply knowledge and skills." When put in military garb, "intelligence" becomes "the collection of information of military or political value; people employed in the collection of military or political information; military or political information." Now this is very interesting indeed, the persistent pairing of military and political information. Whose intelligence is recognized and respected, and who is considered to have the right to access intelligence and to decide whether it is collected and what the scope of any intelligence collection will be, is profoundly political. The people most invested in mass surveillance, propagandists and the military, would generally prefer to avoid serious scrutiny and democratic controls on their practices. Computers are wonderful scapegoats for and distractions from what they are doing. The surveillance proponents insisted the data collection "can't be helped" because it was part of how computers worked. When that excuse crumbled under scrutiny they tried to claim it was all good, because they mostly gathered metadata, and they could just "anonymize" it. These excuses aren't cutting any ice either. Now they hope the fearmongering excuse that they are acting to "save the children" will work well enough to achieve the carte blanche they so desire.

Accidentally funny error message generated by an electronic edition of the *OED* in response to a word search for "military intelligence."

But now it is worth taking a step back to ask, "why are so many people so excited by the idea of interacting with a seemingly intelligent computer program?" The excitement rarely fixates for long on such issues as the ability to reduce highly patterned tasks to programs run on computers, thereby replacing the humans who used to carry out such apparently rote work. People shy away from the bigger questions about an economy based on rapid mass production, the fundamental drive behind the desire to reduce making things of all sorts to rigid sequences of stereotyped tasks humans find impossibly boring to do well and safely. It is not so comfortable to admit the deeper motivation for mass production is in fact producing weapons and ammunition for warfare at home and abroad. Instead the emphasis is on "talking to the computer" as if it were another person, for many pundits and tech-centred people a deeply desired thing. Some seem to see it as a means to finally have a sort of "oracle," in the sense of being able to access all the knowledge in the world by saying the program's key word to wake it up and then asking it whatever question comes to mind. Others seem more interested in the potential integration of such programs into robots, thereby producing at long last the perfect slave: intelligent but without feelings, and always reprogrammable to make it more pleasing. A much sadder proportion of those eagerly seeking a sentient program are those who seem to hope for a speaking partner who does not need or expect the sort of socialized behaviours expected by other human beings. None of these three positions is healthy. I have also read suggestions that such computer programs, whether accessed via a terminal or somehow integrated into a robot, could be used to provide companionship to lonely elders and basic education to small children. These ideas are even worse, yet also even more revealing. The following series of questions completes the picture.

  • Who is the first apparently total source of information humans encounter?
  • Who is expected to carry out the work associated with servants?
  • Who do historians and anthropologists often postulate must have been the first human slaves?
  • Who is expected to socialize and provide basic education to children?
  • Who is delegated the conversational shitwork, including managing rude or clumsy participants?
  • Who most commonly carries the responsibility to care for elders?

Readers familiar with The Religion of Technology by David F. Noble will by now recognize where this is going. A significant proportion of excitement about "artificial intelligence" is indeed about replacing women, and not to spare women the work they do, whether it is considered "theirs" due to sex role stereotypes or not. No indeed, the purpose is to avoid dealing with actual women. There is plenty of cross-cultural and historical evidence that in patriarchal societies, a significant number of men seek to avoid women altogether and express persistent doubts about whether women are genuinely intelligent, independent beings.

Computers have no agency in this, but they are not neutral either. Thanks to the anonymous proprietors of the excellent blog LibrarianShipwreck I can quote Joseph Weizenbaum's observation, "The computer has long been a solution looking for problems – the ultimate technological fix which insulates us from having to look at problems." In a thousand and one ways, computers as machines and as concepts have become premier tools for obfuscation, the curtain hiding the carnival barker masquerading as the "Great Oz." The whole point of computers as deployed and elaborated through the twentieth century and beyond has been to enforce and shore up the current social, political, and economic status quo, whether that be by diverting attention, supporting surveillance and spreading propaganda, or running the deadly and destructive gambling games favoured by the one percent of the population that has stolen and hoarded the majority of the world's wealth. It is tempting to try to close this essay with the claim that computers needn't serve in this role, that they and their associated infrastructure can be redesigned to turn them away from it to more constructive and honourable uses. However, I am not certain this claim would be plausible. The earliest conceptualization of computers derived from a desire to process massive amounts of data faster and more accurately than is possible for humans. So computers crystallize impatience and a stubbornly anti-human ethos, regardless of the actual feelings and intentions of the many men and women who have contributed to building and programming them. And yet, that alternate vision of computers as general purpose machines able to help humans understand such matters as the profound mysteries of biology and the turbulent workings of the atmosphere is not anti-human at all.
This is the tantalizing vision for many people who find the surveillance and "artificial intelligence" applications of computers, if not a source of concern, then simply irrelevant.


Among Ursula K. Le Guin's many books of fiction and non-fiction, one of the most unusual and best beloved is her rendition of the Tao Te Ching. Still in print to this day, it pairs many of the chapters with Le Guin's own usually brief commentaries and chapter titles. Chapter 65 gave Le Guin particular pause, and its reflections on knowledge, power, rulers, and ruled refuse to settle into an obvious message or moral. This is characteristic of the entire ancient text. China is among the nations that developed a combination of highly elaborated machine technology, scientific research, hierarchical bureaucracy, and mass production methods, all long before the european middle ages. Chinese scholars and officials wrestled with familiar though not identical questions to those all of us face with respect to computers and their relation to power and knowledge today. In closing then, here is Le Guin's rendition of chapter 65 of the Tao Te Ching.

One Power

Once upon a time
those who ruled according to the Way
didn't use it to make people knowing
but to keep them unknowing.

People get hard to manage
when they know too much.
Whoever rules by intellect
is a curse upon the land.
Whoever rules by ignorance
is a blessing on it.
To understand these things
is to have a pattern and a model,
and to understand the pattern and the model
is mysterious power.

Mysterious power
goes deep.
It reaches far.
It follows things back,
clear back to the great oneness.

  1. The classic text on the use of ibm machines specifically by the nazis to support genocide and war activities is Edwin Black's IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America's Most Powerful Corporation, New York: Three Rivers Press, 2001.
  2. See Chapter 3 (pages 49-71) of Doron Swade, The Difference Engine: Charles Babbage and the Quest to Build the First Computer, New York: Viking, 2000.
  3. Black, IBM and the Holocaust, 24-31.
  4. For an overview of the shift from male to female computers during the late nineteenth century at harvard, which gives a solid summary of their tasks and relative salaries, see The Maid Who Mapped the Heavens, Andrea J. Buchanan, 30 july 2019. For a longer and wider-ranging treatment, see David Grier, When Computers Were Human, Princeton: Princeton University Press, 2005.
  5. On IBM corporate culture, see Black, 41-43. On leasing, maintenance, and adjustment of equipment to purpose, see Black 209-210. Alternatively, a more ibm-friendly source on its corporate culture is How IBM's Corporate Culture Evolved: Work Songs, Muppets, and AI, by David Cassel at The NewStack, 7 april 2019 and ibm's own account of IBM attire.
  6. In asia, the most well-known early analogue computer is the abacus. For more on the slide rule, see hackaday: Slide Rules Were The Original Personal Computers by Al Williams, 5 november 2015. On the abacus, Abacus: Mystery of the Bead - A Brief History.
  7. On Turing's lone method of working, read about his work on his King's mathematics dissertation around the sentence "Working in his self-contained way, he had not thought to find out first whether his objective had already been achieved," in Chapter 2 of Andrew Hodges, Alan Turing: The Enigma, Princeton: Princeton University Press, 1983. Also see B. Jack Copeland and Diane Proudfoot, Turing, Father of the Modern Computer.
  8. Turing, Alan. Morphogenesis, Amsterdam: Elsevier, 1992. His most famous paper on the subject, including diagrams, is available from the Royal Society of London website, The Chemical Basis of Morphogenesis, Philosophical Transactions of the Royal Society of London, 237 (14 august 1952): 37-72.
  9. Turing, Alan. "Computing Machinery and Intelligence," Mind 59(236), 1950: 433-460. Hodges, Andrew, Alan Turing and the Turing Test. Gonçalves, Bernardo "Can Machines Think? The Controversy that led to the Turing Test," PhilSci-Archive PrePrint, 2021.
  10. Turing, Alan, page 486 of "Can Digital Computers Think?", pages 482-486 in The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life, Plus the Secrets of Enigma, edited by B. Jack Copeland, Clarendon: Oxford University Press, 2004. "But I certainly hope and believe that no great efforts will be put into making machines with the most distinctively human, but non-intellectual characteristics such as the shape of the human body; it appears to me to be quite futile to make such attempts and their results would have something like the unpleasant quality of artificial flowers. Attempts to produce a thinking machine seem to me to be in a different category. The whole thinking process is still rather mysterious to us, but I believe that the attempt to make a thinking machine will help us greatly in finding out how we think ourselves."
  11. Proudfoot, Diane A New Interpretation of the Turing Test, The Rutherford Journal - The New Zealand Journal for the History and Philosophy of Science and Technology, Vol. 1, 2005-2006. Turing, Alan "Lecture on the Automatic Computing Engine (1947)," page 394 in The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life, Plus the Secrets of Enigma, edited by B. Jack Copeland, Clarendon: Oxford University Press, 2004.
  12. Turing, Alan "Lecture on the Automatic Computing Engine (1947)," page 392 in The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life, Plus the Secrets of Enigma, edited by B. Jack Copeland, Clarendon: Oxford University Press, 2004.
  13. Romeo, Jess JSTOR Daily: How Computer Science Became a Boys Club, 29 august 2021. For more details on Nathan Ensmenger's work, see "'Beards, Sandals, and Other Signs of Rugged Individualism': Masculine Culture within the Computing Professions," Osiris 30(1), 2015: 38-65. Or see the free extracts from Ensmenger's book The Computer Boys Take Over, cambridge (u.s.): MIT Press, 2010.
  14. For example, Joseph Weizenbaum, Computer Power and Human Reason: From Judgement to Calculation, New York: Penguin, 1984 and Sherry Turkle, "The Subjective Computer: A Study in the Psychology of Personal Computation," Social Studies of Science, 12: 173-205, 1982.
  15. Stallman, Richard gnu.org: Initial Announcement of the GNU Project, 27 september 1983.
  16. Microsoft employee Adam Barr summarizes the origins of MS-DOS and some of the controversy around it at Proudly Serving My Corporate Masters: Origins of MS-DOS, 2 march 2005.
  17. Lunduke, Brian, No Backup: The Demise of Physical Media, 12 january 2023. This source is Lunduke's substack, which he has titled "The Lunduke Journal of Technology."
  18. An example of a 1970s "kit computer" is the Cosmac Elf, described at The History of Personal Computing. Richard Moss at Museum Crush describes some 1980s examples in The Joys of 1980s Home Computing, 24 january 2020.
  19. The hardware and software elements of microsoft's recent moves against free/libre software and the general purpose computer are set out with references by eBuzz Central.
  20. A good place to begin exploring this coverage is the TechRights Main Page.
  21. Zuboff, Shoshana The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontiers of Power, New York: Public Affairs, 2019. Snowden, Edward Permanent Record, New York: Metropolitan Books, 2019.
  22. For example, see Jathan Sadowski at the Conversation, I'm a Luddite. You Should Be One Too, august 2021; episode 13 of Justin Podur's civilizations podcast on the industrial revolution, 1:33:09 - 1:35:40, july 2021; or the growing Luddite Library of references curated at LibrarianShipwreck.
  23. LibrarianShipwreck Why the Luddites Matter. David F. Noble, Progress Without People: New Technology, Unemployment, and the Message of Resistance, Toronto: between the lines press, 2000. David F. Noble was a fascinating and brilliant scholar who died shockingly young in 2010. Noam Chomsky observed that Noble was too radical for MIT. For a brief biography and one of Noble's last interviews, see Suzan Mazur's Peer Review as Censorship article from february 2010, published on counterpunch.
  24. See for example David Bacon's article The Maquiladora Workers of Juárez Find Their Voice, november 2015. Nike suffered a public relations disaster when its use of sweated labour reached the public eye in the 1990s. See the overview by Rhys McKay at who.com, The Truth Behind The Alleged Nike Sweatshops, 2018.
  25. Lovelace on Menabrea, page 722, note G in L.F. Menabrea "Sketch of the Analytical Engine Invented by Charles Babbage, Esq. With Notes by Ada Lovelace," Scientific Memoirs, 3 (1843): 666-731.
  26. LibrarianShipwreck, "Computers Enable fantasies" – On the Continued Relevance of Weizenbaum's Warnings, 26 january 2023.
  27. Angus Stevenson and Christine A. Lindberg (Editors) New Oxford American Dictionary, third edition. Oxford: Oxford University Press, 2010. "n. 1. intelligence."
  28. Angus Stevenson and Christine A. Lindberg (Editors) New Oxford American Dictionary, third edition. Oxford: Oxford University Press, 2010. "n. 2. intelligence."
  29. David F. Noble The Religion of Technology: The Divinity of Man and the Spirit of Invention, New York: Alfred A. Knopf, 1997.
  30. See L. Frank Baum The Wonderful Wizard of Oz, Chicago: G.M. Hill Co., 1900. (Slightly newer edition in black and white.)
  31. Ursula K. Le Guin Lao Tzu's Tao Te Ching: A Book About the Way and the Power of the Way, Boulder: Shambhala, 2009.
Copyright © C. Osborne 2023
Last Modified: Friday, January 27, 2023 21:31:50