The Moonspeaker:
Where Some Ideas Are Stranger Than Others...
From Zork to Applescript – Thoughts on Computer Languages (2024-09-30)
As far as I know, there are not too many younger programmers or gamers who spend much time with the text-only adventure games, now emulated in various ways, from back in the days just post-dinosaurs of computers. That is, the computers the games were originally played on were not of the sort anyone was likely to have at home, but players could log into a mainframe on a university campus to run them. Those mainframes were still technically limited access, locked up the way servers typically are today. There was a brief interval when it was possible to peer at the "big machine" with the most RAM, the one that programs with bigger datasets were sent to run on, but in those somewhat more innocent days such machines were perhaps not so interesting to look at. After all, they were headless and easier to access by remote login anyway, provided you had the credentials. Fortunately, games like Zork and its many clones did not need much memory at all, and could be a great deal of fun, provided the player was able to figure out the game's vocabulary, which was not necessarily so simple. The problem was not nouns, in my experience, but verbs, and depending on what was going on, prepositions. With further experience and some study of linguistics, I am now aware that these are the typical difficult parts of many human languages.
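To make the vocabulary problem concrete, here is a minimal, purely illustrative sketch of a Zork-style two-word parser, written in Python for convenience; the verbs, nouns, and responses are invented for this sketch and are not drawn from any actual Zork implementation. The point is simply that the player has to guess the exact words the program already knows, and it is usually the verbs rather than the nouns that are the sticking point.

```python
# A purely illustrative two-word parser in the spirit of early text adventures.
# The vocabulary below is invented for this sketch, not taken from Zork itself.

KNOWN_VERBS = {"take", "drop", "open", "look"}
KNOWN_NOUNS = {"lamp", "door", "sword"}

def parse(command: str) -> str:
    words = command.lower().split()
    if not words:
        return "Beg pardon?"
    verb = words[0]
    if verb not in KNOWN_VERBS:
        # The classic frustration: the right idea, the wrong word.
        return f"I don't know the word \"{verb}\"."
    if len(words) == 1:
        return f"{verb.capitalize()} what?"
    noun = words[-1]
    if noun not in KNOWN_NOUNS:
        return f"I don't know the word \"{noun}\"."
    return f"You {verb} the {noun}."

if __name__ == "__main__":
    print(parse("grab lamp"))   # rejected: "grab" is not in the verb list
    print(parse("take lamp"))   # accepted: "You take the lamp."
```

A human listener would treat "grab" and "take" as interchangeable without a second thought; the parser cannot, which is exactly the kind of struggle described above.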
This is hardly a revelation, of course. However, what led me back to this experience was a comment on a website I read regularly, to the effect that what we usually call "computer languages" are not languages. Because the conversation moved quickly and the comment was an aside to the main topic, I never had a chance to find out what the commenter meant, to confirm I had not misread and to learn their reasons for making this declaration. It had never occurred to me to question the conflation of the terms "computer programming language" and "computer language." Yet it is quite intriguing to do so, because one of the "computer languages" most vaunted as being "english-like" is applescript, and really, it isn't. It does, however, share the difficulties of text-only games like Zork when it comes to verbs and prepositions, with their echoes of the challenges of human language learning. The parallels come from the fact that humans develop and implement languages intended for use in programming computers, and in that sense they are means of communicating between people, at least indirectly. The various manuals, standards, and textbooks are all about saying to newcomers to a computer programming language, "here is how you tell the computer what to do."
Flipping open my usual dictionary, the OED, I find it duly informs me that a language is "a method of human communication... in a structured and conventional way." The implication is that the human communication of concern is reasonably direct: typically speaking, there is no equivalent of a compiler step needed to make the language understandable. Thinking over what I have learned about computer programming, I am aware of the critical role of abstraction. It is possible to program computers using machine code, but it is painful and error-prone (and undeniably a great learning experience – and I am not being sarcastic in saying so). A general purpose computer in its most basic form has only a limited repertoire of actions it can perform, and these can be built up into more complex actions. So in time the first level of abstraction was to define a computer programming language whose subroutines consistently ran the same chunk of machine code to do a particular task. The levels of abstraction have been getting higher ever since, with at least a few wonks claiming it should be possible to program just by dragging boxes around on a screen, having gotten a bit delirious after the successful implementation of a similar-looking approach to connecting objects when designing window managers with object-oriented programming. (Their exuberance is understandable; window management is far from trivial.)
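As a rough illustration of that first level of abstraction, the following sketch uses ordinary Python functions as stand-ins for machine instructions; the names and the toy "memory" are my own invention, meant only to show how a named subroutine bundles the same fixed sequence of lower-level steps so they need not be spelled out each time.

```python
# An illustrative sketch of abstraction by subroutine. The "low-level"
# operations here are invented stand-ins for machine instructions, not
# real opcodes.

memory = [0] * 8  # a toy "memory" of eight cells

def load(addr):            # stand-in for a load instruction
    return memory[addr]

def store(addr, value):    # stand-in for a store instruction
    memory[addr] = value

def add(a, b):             # stand-in for an add instruction
    return a + b

# First level of abstraction: a named subroutine that always runs the
# same chunk of lower-level steps to do one particular task.
def accumulate(src_addr, dest_addr):
    total = add(load(src_addr), load(dest_addr))
    store(dest_addr, total)

store(0, 5)
store(1, 7)
accumulate(0, 1)
print(load(1))  # 12
```

Each further level of abstraction repeats the same move: wrap an already-wrapped sequence in yet another name, until the programmer is far removed from the individual machine steps.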
But perhaps what the commenter was getting at, when they said that what we program computers with is not a language, is that humans don't communicate like this. We don't use higher and higher levels of abstract terminology in order to somehow invoke the same remembered event or behaviour every time. This is not the way we humans work. In fact, we can see the truth of this in the sort of struggle we can have with learning computer programming languages: their rough mimicry of the most challenging elements of learning a human language besides our mother tongue simply reflects our ways of thinking and learning. There is no communication because the computer has no intelligence to respond with, and it cannot achieve the sort of pattern-matching we can when necessary. Humans don't get stuck when in one language you get on the plane to fly somewhere while in another you go inside it. We can make sense of a great deal of surprising language via context, whereas computers can't handle what happens to fall outside of the map of the abstractions they were programmed with. Perhaps the key feature is that real-life human languages are constantly extensible and changing, while every effort is made to keep computer programming languages as static as possible and to minimize their capacity to produce unexpected results when run on computers. It may be possible to generate startling and novel errors while programming computers, but they trace back to only a few causes, such as running out of memory, garbled pointers, and so on. Once the bug is traced to where it came from, it is completely predictable.
I have been turning over in my mind an argument that mathematics is indeed a language, not in either the romantic or the insanely hubristic sense implied by claims that mathematics is "the language of nature," but in the sense that humans use it in a structured and conventional way to communicate with one another. We don't use it expecting mathematical arguments or formulae to somehow serve as irresistible directions to take certain actions outside of the narrow confines of mathematics itself, or at least, I expect no one sane does. An interesting aspect of mathematics is how a few properties defining a certain type of number or set can be logically applied beyond their original usage. One of the most famous examples is the parallel postulate, which turned out not to be derivable from the other axioms, and whose denial led to the formalized discovery of non-Euclidean geometries. Based on the study and application of computer programming approaches so far, it is not possible to do something similar in that context. There are styles of programming, such as functional or object-oriented, or ways of programming assuming different fundamental objects such as the list in lisp (the sketch after this paragraph shows the same small computation in two such styles). There doesn't seem to be any strong argument for one higher level programming approach over the others, and they don't reveal anything new, except insofar as they reveal how much we still don't understand about how we think. Mathematics does not necessarily help us understand generally how we think, but that has never been a pretence of the subject. I suppose if the declaration "computer programming languages are not actually languages" is true, it would be most reasonable to declare mathematics to be at minimum especially "language-like." Meanwhile, Jef Raskin's description of a game as an undocumented program can be extended to what we usually call "computer programming languages" as follows: an undocumented computer programming language is a game of the type represented by Zork. Nobody wants to deal with a poorly documented computer programming language when they are trying to do work. Trying to make a computer programming language "seem like english" does not help people adopt it for use and generally does it no favours on the software engineering side of things, pace applescript.
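For the curious, here is the small sketch referred to above: the same trivial computation written first imperatively and then functionally, in Python for convenience. The example is my own, not drawn from any particular textbook; the point is only that neither style can compute anything the other cannot, which is the sense in which choosing between them reveals nothing fundamentally new.

```python
# The same trivial task, summing the squares of some numbers, written in
# two styles. Both produce the same result; the difference is in how the
# steps are expressed, not in what can be computed.
from functools import reduce

numbers = [1, 2, 3, 4]

# Imperative style: spell out the loop and the mutation of an accumulator.
total = 0
for n in numbers:
    total += n * n

# Functional style: fold over the list, with no explicit mutation.
total_functional = reduce(lambda acc, n: acc + n * n, numbers, 0)

assert total == total_functional == 30
print(total)
```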