Computing and tradition are not usually spoken of in the same breath. Computing has always looked to the future and eagerly thrown away the “cruft” holding us back in the present. However, it is also a fundamental fact of life that the most powerful government agencies and the wealthiest corporations would grind to a halt if an obscure, decades-old mathematics library their code relies on stopped working.
Neal Stephenson’s In The Beginning, There Was The Command Line is a brief for the importance of tradition that tries to be a design critique. Stephenson warns us of the dangers of replacing the command line (and everything it symbolizes) with graphical user interfaces. Given that he wrote the book in 1999, that battle has long since been lost.
There have been some attempts to update and critique the book for the new century. The most risible point Stephenson makes is that the command line itself is the purest way to experience computing reality:
So an [operating system] is a stack of metaphors and abstractions that stands between you and the telegrams, and embodying various tricks the programmer used to convert the information you’re working with—be it images, e-mail messages, movies, or word processing documents—into the necklaces of bytes that are the only things computers know how to work with. When we used actual telegraph equipment (teletypes) or their higher-tech substitutes (“glass teletypes,” or the MS-DOS command line) to work with our computers, we were very close to the bottom of that stack. When we use most modern operating systems, though, our interaction with the machine is heavily mediated. Everything we do is interpreted and translated time and again as it works its way down through all of the metaphors and abstractions.
In 2004, Garret Birkel responded at length to these and many other claims the book makes:
Computer-like circuitry has burrowed its way into almost every kind of device, in every arena of human progress. The microchip embedded in a rice cooker, for example, could potentially respond to a command-line interface. We could wire a teletype to most of the appliances we use in a day. But when someone says, “Use the computer!”, the image that pops into our collective imagination is relatively well defined…Consider the garden-variety machine from Dell. We pry open the cardboard box, wrestle the styrofoam jaws apart, and dump the thing on a tabletop. “There!” we say. “A brand-new computer!” But by the historical definition, we’re actually looking at dozens of computers. A heap of computers. A computing collective.
- The CPU → The cache manager inside the CPU
- The on-board sound chip
- The USB controller
- The power-management system
- The network controller
- The graphics CPU
- The hard-drive controller
- The microcontroller inside the hard drive
- The settings-management CPU inside the display
- The key encoder inside the keyboard
Clearly a PC has become something more than a teletype. It has not just evolved from its ancestors, but actually contains multiple copies of its ancestors. If we wanted to treat a modern PC like a teletype, we would have to ignore all but one of these embedded computers, and throw the mouse, printer, disk drives, speakers, and game controllers in the trash can.
I will not try to go over the same ground Birkel covers. My intent here is to directly engage with the basic case Stephenson advances in the early and middle parts of the book. I submit that Stephenson has not really made an argument about an interface as much as the web of traditions surrounding it. Those traditions are what is so vital, even if the command line interface may be a superior way of engaging them.
The shell before time
This post was composed using a venerable old text editor, inside a terminal emulator running a shell of similar vintage. To fit the mood for writing about the book, I have enabled a black-and-green color scheme. A cursor blinks whenever I stop writing. I do not have any mirrored shades or — perhaps more appropriately — a neckbeard.
I am somewhere in between Stephenson’s “Eloi” — who live in blissful ignorance and luxury — and the “Morlocks” who labor underground to make their machinery work. 1 But in that respect, I am also like Stephenson himself. He freely admits throughout his book that he gets tired of the austerity and friction of the command line interface (CLI) and operating systems (Linux) that model a similar asceticism.
I am obviously not writing this post on a pure text interface. The closest I get to doing that is when I set up the i3 window manager and divide my screen into a grid of windows. Most — but not all — of these windows contain terminal emulator prompts. And my terminal emulator is a simulacrum of a teletype machine rather than a pure replica of it. Should that matter? You won’t really see a conclusive answer in the book, even if Stephenson does address the question.
Stephenson’s book isn’t, despite the title, really about the command line. Hymns and hosannas about the command line do take up a lot of literary real estate. But the book’s message is partly that some things can only be communicated textually:
Back in the days of the command-line interface, users were all Morlocks who had to convert their thoughts into alphanumeric symbols and type them in, a grindingly tedious process that stripped away all ambiguity, laid bare all hidden assumptions, and cruelly punished laziness and imprecision. Then the interface-makers went to work on their GUIs, and introduced a new semiotic layer between people and machines. People who use such systems have abdicated the responsibility, and surrendered the power, of sending bits directly to the chip that’s doing the arithmetic, and handed that responsibility and power over to the OS. This is tempting because giving clear instructions, to anyone or anything, is difficult. We cannot do it without thinking, and depending on the complexity of the situation, we may have to think hard about abstract things, and consider any number of ramifications, in order to do a good job of it. For most of us, this is hard work.
The graphical user interface (GUI) is, to Stephenson, not just a replacement for a far more “direct and explicit” mode of communication but also a generalized metaphor for reality. That metaphor is quickly colonizing the entire world. “A few lines of computer code can…be made to substitute for any imaginable mechanical interface.” The substitute is often slipshod.
Stephenson raises the prospect of people using GUI touchscreens to drive cars as a reductio of GUI colonization, but that is a mundane reality today. How time flies. The root of his gripe with the GUI is the seductiveness of unthinking:
The OS has…become a sort of intellectual labor-saving device that tries to translate humans’ vaguely expressed intentions into bits. In effect we are asking our computers to shoulder responsibilities that have always been considered the province of human beings—we want them to understand our desires, to anticipate our needs, to foresee consequences, to make connections, to handle routine chores without being asked, to remind us of what we ought to be reminded of while filtering out noise.
I do not think his arguments — which he is at best ambivalently committed to — are entirely correct. There is a good deal of truth to what he says, but time has not been kind to the case that words keep you grounded in reality. Both social media (at least before TikTok) and lately generative AI, the supposed culprits for many social ills, are mainly navigated by text interfaces. ChatGPT and other Large Language Models (LLMs) are quite literally manipulated through textual prompts! And yet prompting an LLM does not obviously entail the sort of direct and authentic communication Stephenson attributes to the CLI. Text, not the dreaded GUI, is the instrument of doom here.
So what, then, has actually gone wrong? Birkel argues that Stephenson tends to gloss over the way in which even the CLI itself was always a convenient abstraction. Even the Stone Age computing interfaces that Stephenson describes at the beginning of the book are light years away from the beginning of digital computing in the mid-20th century. If we’re searching for the real culprit, the GUI is a red herring. The perp we’re trying to apprehend is hiding in the 1940s, likely in the big room where they keep ENIAC.
Magical mystery machines
Given the way Stephenson rightly argues that spurious metaphors and analogies inherent to the GUI harm computing, it is ironic that one of the big issues with his case is actually a dubious analogy to another technology.
You cannot sell a complicated technological system to people without some sort of interface that enables them to use it. The internal combustion engine was a technological marvel in its day, but useless as a consumer good until a clutch, transmission, steering wheel and throttle were connected to it. That odd collection of gizmos, which survives to this day in every car on the road, made up what we would today call a user interface. But if cars had been invented after Macintoshes, carmakers would not have bothered to gin up all of these arcane devices. We would have a computer screen instead of a dashboard, and a mouse (or at best a joystick) instead of a steering wheel, and we’d shift gears by pulling down a menu…The steering wheel and gearshift lever were invented during an era when the most complicated technology in most homes was a butter churn. Those early carmakers were simply lucky, in that they could dream up whatever interface was best suited to the task of driving an automobile, and people would learn it. Likewise with the dial telephone and the AM radio.
Stephenson laments that “it is no longer acceptable for engineers to invent a wholly novel user interface for every new product.” But is the culprit the GUI, or the digital computer itself? The digital computer is, among many other things, a general-purpose system. Analog computers were, for all of their limitations, models of the environments and problems they were associated with. As Charles Care argued, analog computers were essentially modeling systems. The fit of model to reality is why analog fire control machines survived in the Navy long past the introduction of digital computing.
Digital computers are technologies for simulation, not modeling. The responsibility for making them match their outer environments rests solely with the software designer and system operator, not the system itself. Because the digital computer can be made to be anything we need it to be, it is actually the invasive and predatory generalized metaphor for reality Stephenson is bemoaning. The car (and other devices Stephenson names) has converged functions that have been concretized in a very particular milieu of operation. The digital computer is the true view from nowhere.
One implication of this is that while GUIs may rely on misleading metaphors, CLIs ask you to imagine something beyond your sensory capacities. I have often heard complaints from people skilled in more directly physical crafts like sculpting, sewing, embroidery, and painting that it is impossible to develop mental models of what computers are doing. They shy away from experimenting with code and CLIs because they cannot get the tactile feedback of looking at something and feeling it in their hands.
Stephenson often compares the command line to writing, and this is quite revelatory. Literary fiction is only possible due to the mind’s capacity for imagination. Writing may be more “direct” in some ways than multimedia, but it is still about building up, in the reader’s mind, something that does not exist. The reader has to keep that inner structure — which will differ from person to person — in their head as they traverse the novel. Literary fiction may require more intense attention, but it is open to question whether it really promises direct communication.
Likewise, the CLI is emblematic of the enormous role the imagination plays in even “bare bones” computing. You have only your eyes and fingers to manipulate and control the most complex machine humanity has ever invented. Only one of these senses gives you any useful — but still very sparse — perceptual feedback. The imagined mental models users need to control the system come from something else associated with the CLI — we will get to this later — not necessarily the CLI itself. Taking the computer-car comparison to its logical conclusion…well…drives this point home.
Driving a car with a graphical touchscreen is bad, sure. I think the Tesla user interfaces are rubbish. Driving a car with a text adventure game interface is also uniquely hazardous. “A pedestrian is crossing the street slowly. You can likely clear the intersection before he passes. Do you want to risk it? Yes/no.” Would it make things better if there were also some crude ASCII representations of the steering wheel and the street created by an international computer standards group?
Even with the command line, that is what we are trying to do with the modern computer: steer an enormously complex system through sparse text and imagination. It’s a tall order! Which is why I’m skeptical that the answer to our problems is adding more friction and taking away the user’s candy treats.
Friction’s false allure
As long as there are gamers, there will be gamer rants about why games today are too easy. There’s always some substance to these gripes. I sometimes don’t understand how I could possibly have beaten the games I played as a kid. They’re certainly much harder than the games I play today.
I like to think that those games taught me the value of persistence and discipline. I was so upset that I kept failing over and over again. But I kept on trying and eventually beat even the hardest levels. I don’t make any claims about the character-building value of hard games for anyone except myself. Other people may not have gotten the same value out of them.
Regardless, games — or at least the games you retroactively mythologize as appropriately challenging — are optimized case studies in friction. 2 Enemies fight just hard enough to lose gracefully to you. Weapons and movement mechanics give you options but still constrain you. When there are big penalties for screwing up, you look before you leap. It’s hard to imagine games without at least minimal friction. You can type in the God Mode cheat, but what’s the point?
There are many people who believe that the absence of friction is what makes computing bad today. Frictionless user interfaces and user experience designs are what make social media harmful. More friction, they argue, can make social media and smartphones better. So, much in the same way some gamers today argue that getting Mario and his relatives through those difficult timed jumps built character, technology critics think friction will make computing better.
Discussions of friction modestly parallel the old debate about whether graphical user interfaces cut too much friction from computing. Stephenson’s book was an obvious contribution to that debate. The problems I have with Stephenson’s arguments also extend to the case for more friction in social media (and other) applications.
When you have a complex technology with no guaranteed connection to anything in the external world, what is the appropriate level of friction? Social media allows you to link to an enormous aggregated hive mind. It’s inherently volatile! Henry Farrell recently developed a simulation suggesting that recommendation systems — the scapegoats of social media critics — actually may not make as much of a difference as popularly believed.
The model predicts a fairly straightforward outcome. If people are able to search for evidence and arguments that confirm their biases, and to easily publish such evidence too, they will tend to create large online communities that glom together around shared rationalizations, defend them, and produce more of them. In other words: hell hath no limits, nor is circumscribed. You don’t need modern social media algorithms to act as Virgils, conducting people into our current online inferno. There are entrances everywhere, and all you need to find them is open search, easy self-publishing and ordinary human psychology. If our model is right, we would likely be in much the same situation as we are at the moment, even if platform companies had never discovered machine learning. People would still be driven by their own wants to discover and create the kinds of shared rationalization that dominate online political debate today, and search and Web 2.0 type publishing would make discovery and sharing really easy.
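Farrell’s actual model is considerably more careful than anything I can reproduce here, but the mechanism the quote describes (open search, easy self-publishing, and ordinary confirmation bias, with no recommendation algorithm anywhere in sight) can be caricatured in a few lines of Python. Everything below, from the four rival “views” to the republish-on-match rule, is my own toy assumption rather than his model:

```python
import random
from collections import Counter

# Toy caricature, NOT Farrell's model: agents hold one of a few rival
# "rationalizations." Each round, every agent searches the pool of published
# posts for one that confirms its own view and, if it finds one, republishes
# that view. There is no recommendation algorithm anywhere; just open search,
# easy publishing, and biased readers.
random.seed(0)

VIEWS = ["A", "B", "C", "D"]
N_AGENTS, N_ROUNDS = 500, 50

agents = [random.choice(VIEWS) for _ in range(N_AGENTS)]
posts = [random.choice(VIEWS) for _ in range(20)]  # a small seed corpus

for _ in range(N_ROUNDS):
    for view in agents:
        if any(p == view for p in posts):  # "search" finds confirming content
            posts.append(view)             # "self-publish" more of the same

# Large, self-reinforcing piles of content emerge without any algorithm
# steering anyone toward them.
print(Counter(posts))
```

The point is not that this little loop proves anything; it is that the clustering Farrell describes requires nothing fancier than what the loop contains.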
Perhaps what really made social media so turbulent was its capacity to connect more people together than prior online communities could, and connection is obviously the motivation for the Internet as a whole. As Farrell reminds us, all of this has in some way happened before. It sometimes seems like there isn’t anything interesting or novel to be said about it. Or at least nothing that hasn’t already been said somewhere between the 1940s and the 1970s.
The sheer age of seemingly novel arguments about computing and society hints at what I will spend the rest of this post talking about: the power of tradition.
That old time UNIX religion
Throughout the book, Stephenson conflates ways of seeing with ways of believing. You can see this in some of the paragraphs I block-quote, with the claim that the CLI is a superior model of reality brushing up against a somewhat different claim that using it signifies the assumption of responsibility. He is, though, not really committed to his bits. He will lavish something with page after page of praise only to retreat to the cool pose a few pages later.
Stephenson is far more effective at arguing — though mostly indirectly — that the tradition behind the CLI is the real source of its appeal. The interface perpetuates a web of tradition. People do not necessarily find comfort, guidance, and purpose in tradition because it offers them the best model of reality. If there is something uniquely powerful and pleasurable about the CLI, it cannot be separated from the power and pleasure that humans find in tradition in general.
Stephenson repeatedly reminds us that the CLI is a preserved facsimile of telegraphic communication. You can trace a line from telegraphs to teletypes to physical terminals and eventually to the terminal emulator that most people use to access the CLI. That is close to two hundred years of tradition. The “telegraphy” you engage in with the CLI may be a Ship of Theseus, but it still binds you to something bigger than yourself.
I can personally attest to that.
For a long time, I struggled to see the power of the tradition that the blinking cursor and prompt symbolized. Thankfully, I do now. It is actually wonderful to realize that even if your computer is a space age artifact inconceivable to your parents back when they submitted jobs to mainframes, you are still using a crudely similar interface and model of communication. You are sharing, however superficially, in the same ritual.
What made — and still makes — learning about things like the terminal difficult is that you’re not just learning an alien interface. You also quickly discover that doing things with it requires you to learn about the ecology of UNIX, GNU, and Linux (in my case) software surrounding it. And a good deal of that software in turn depends on software written for platforms — like the PDP-11 — that died out before you were born. 3
As Stephenson tells us, UNIX — the ancestor of most non-Windows operating systems — is in and of itself a complex web of tradition that lives on through the systems it influenced. “Acculturation” to the ways of UNIX is appealing because it also binds the user to the past.
Windows 95 and MacOS are products, contrived by engineers in the service of specific companies. Unix, by contrast, is not so much a product as it is a painstakingly compiled oral history of the hacker subculture. It is our Gilgamesh epic. What made old epics like Gilgamesh so powerful and so long-lived was that they were living bodies of narrative that many people knew by heart, and told over and over again—making their own personal embellishments whenever it struck their fancy. The bad embellishments were shouted down, the good ones picked up by others, polished, improved, and, over time, incorporated into the story. Likewise, Unix is known, loved, and understood by so many hackers that it can be re-created from scratch whenever someone needs it. This is very difficult to understand for people who are accustomed to thinking of OSes as things that absolutely have to be bought.
Each new addition to the UNIX ideal has some chance of surviving long enough to become part of the larger “epic.” Therefore, “UNIX has slowly accreted…[and] acquired a kind of complexity and asymmetry about it that is organic, like the roots of a tree, or the branchings of a coronary artery.” The CLI is to the UNIX mythos what archaic languages, texts, and practices are to religions we see today. You do not necessarily need to know these things to believe, but it certainly heightens your experience if you do. The CLI is a means by which we commune with tradition, and that is what makes it so indispensable.
I recognize, though, that your propensity to buy that claim is predicated on how responsive you are to any appeal to tradition or some notional Great Canon of culture.
The “compleat” gentleman
I never learned to speak or read Latin or Greek. I’ve read some of the classics, but I wouldn’t say I got a classical humanistic education. I have still always been receptive to arguments that this kind of education gives you the perspective and grounding to better interpret the present. After all, it is hard to understand American political rhetoric up to the mid-20th century without some knowledge of Greco-Roman culture and the Bible. Learning about UNIX through the CLI is a kind of ersatz classical education. Wait, really?
Yes, really. I will say more about why later, but first I need to do some throat-clearing. I am stepping into a minefield as thick as the one separating North and South Korea.
Even if my rhetorical trick of comparing Ye Olde Computers to Ovid and Tacitus appeals to the technically minded, it might be an even harder sell for actual humanists. Traditional education is much-mythologized but nowhere near as prominent as it used to be. What did it use to be, you may ask? Stop reading if you hate Harold Bloom.
I am going to ask for grace in how sloppily I throw around terms like “traditional”, “classical”, and “humanistic education.” I recognize I am appealing to a far more dignified version of the 1950s advertisements Internet trolls cite when they ask us to “retvrn” to Leave It To Beaver. Let me at least temporarily have my “Plato to NATO” throwback to reference.
Back in those sepia-toned times, you learned the classics and the “dead” languages they were originally written in. You at least superficially read the Bible. And the “you” here was not just the stereotypical British boarding school student. As I’ve noted before, it is truly incredible how much American political rhetoric and culture in general presumed some familiarity with a common antiquated template.
The notion of a great “canon” of Western culture was later bolted onto the pre-existing template. The template itself is still a core component of elite education in some schools. Mark Zuckerberg is not the only prominent figure to have studied the classics at Phillips Exeter Academy and similar prep schools. If you aren’t in that kind of environment, what value do you get out of it?
What do I think? In hindsight, what sticks with me is the simple fact that classical culture not only persisted after the decline of the Roman Empire but became omnipresent across Europe, the Middle East, Eurasia, and Africa. It was not just laboriously preserved by monastic scribes. It flourished long after its progenitors were distant memories. If Europeans, Arabs, Africans, and others all judged it worth keeping around, perhaps we should too.
The “traditional” education may not guarantee creativity, self-actualization, or even sound judgment, but it is a sizable inheritance anyone willing to engage with it may claim. The value is intrinsic. You may not be rewarded for it, but you miss out on a rich source of insight and meaning if you won’t at least ingest some of it.
Furthermore, it’s not as if the questions that you find in the classics have been conclusively resolved in 2024 either. Maybe we’re all still in the cave looking at the shadows on the wall. So even if Stephenson’s jeremiads against the GUI aren’t as substantive as they appear, he still makes an extremely strong case for committing to the bits with the CLI.
Stephenson is virtually alone in arguing — even if he never argues wholeheartedly — that there is something culturally vital in stripping away the layers of mediation between us and the computer systems we use. As with the classical education, we may not derive external reward from it but we gain something equally important. That “something” is admittedly vague and hard to nail down. And it goes against a technical culture that looks forward rather than backward.
An elegant interface for a more civilized age
In the grand scheme of things, the CLI is not really that old compared to, say, the pen. But the experience of using it feels so odd to millennials and the generations after us because it is archaic. Things like man pages are sometimes baffling and cryptic because they are associated with a “way of seeing” we did not natively assimilate. To mix metaphors, it is a language, like Latin, kept alive despite its irrelevance to daily life.
Using the command line is becoming increasingly archaic even among computer science students. You can imagine how little it is used among the general population. So if it — or rather, the various things intertwined with it — is a language linked to an antiquated tradition, should you really learn to read and speak it? Especially if you don’t aspire to be the next Zuck?
Many of the affirmative answers are utilitarian. It’s essential for doing something economically valuable. It’s the only way to gain mastery over your computer. You aren’t a real [insert here] unless you learn how to use it. We all ought to have “digital literacy.” You’re driving blind without it. And so forth.
All of these justifications are more fragile than they appear, because all extrinsic motivations for learning something “impractical” eventually fall by the wayside. So what’s left? The value of the older way of seeing and the web of traditions associated with it is ultimately similar to the value provided by taking the classics seriously.
Comparing the CLI, UNIX, and the web of tradition both are embedded within to the classics isn’t just a throwaway reference. There are too many parallels to ignore.
Much like the classics are a common inheritance we all may access, the hidden and very much antiquarian world of computing is a place of great beauty and is also endowed with rich meaning. There is an argument to be made that in communing with the web of tradition that these increasingly primordial interfaces and programs belong to, we are also claiming an important common cultural inheritance.
In immersing ourselves in an older world we also better appreciate the physicality and historicity of computer systems. They are not magical objects or ethereal illusions. They are real things, situated in particular contexts, like any other object of material culture. Our world is built on top of their scaffolding, much as modern cities grew out of Greek and Roman ruins.
And much like we turn to classics and other great works of the past to better understand human nature, our computing machines are very much extensions of ourselves and reflections of how we see ourselves. In their past we see how others have used them to seek out and explore the mysteries of the universe and paint vivid pictures of human mentality.
The strange, alien, and beautiful space Stephenson elegizes is where we encounter problems and questions that are still relevant today. The Turing Test, after all, was supposed to be conducted using a teletype machine. I could go on further, but you may already grasp what I am trying to say.
So, again, we ought to consider the possibility that engaging with seemingly archaic interfaces and the traditions they are a part of is intrinsically rewarding in the same way the much-mythologized traditional humanities education is. I cannot tell you exactly how to do that, even if I offer up my own motivations for it at the end of the post.
I can say that it doesn’t mean treating them like novels, sculptures, and paintings. It does entail bringing them into culture — and learning to see them as more than just functional tools. Computing, though not totally human, is a human activity that ultimately transcends utilitarian use-value.
That may sound like a bizarre or even ridiculous argument. The “hacker’s delight” aside, computers are intimately associated with work and bureaucracy. Even thinking of them as tools of self-expression concedes they are still tools. However, our current woes with computers partly arise from how we regard them solely as things we use.
Beyond “aboutness”
Prioritizing what computers can do for us over all else makes us indifferent to what computers are to us. Stephenson never really makes a coherent argument about what computers are to him — or to us in general — but we can still learn something valuable from him. Not just about the value of tradition, but about the importance of choosing for ourselves what computers mean.
What are computers for? Or rather, what are computers about? Again, I concede that the question may seem silly. Aren’t there readily available answers? Numerical calculation, bureaucratic organization, data storage and retrieval, simulation and gaming, and so forth.
The framing of the question, admittedly, isn’t helpful.
Technical objects are products of human intent but maintain their own separate existence in the world. It’s reductive to think only about what they can do for us because so much of our lives hinges on what they do to us. If technical objects influence our psychology and modulate our social activity, we also connect them to the world and determine what they ingest and emit.
It’s a very complex — and convoluted — relationship. Technical objects are still fraught with meaning, but that meaning isn’t always easy to reach. Sadly, our heads are often clouded by externally imposed meaning that frequently proves to be incomplete, misleading, and/or ephemeral. “Aboutness” is a pernicious trap.
In the philosophy of engineering, even an object as simple as a doorbell can have two subtly different functions. In the first, you press a button to generate a noise. In another, you alert someone inside that you are there. One perspective invokes explicit design intent, another doesn’t. This isn’t sophistry.
Archaeologists get into fistfights over questions like this when they look at artifacts from past civilizations. Did this device do X or Y? Or was it symbolic of religious belief? The computer, in particular, has been many things since the first primitive analog machines emerged, from industrial workhorse to enabler of creative self-expression. There is a recurring cycle of fear and wonder about the computer, and we are very much in the “fear” stage.
We could talk about the various maladies people commonly catalog — generative AI slop, crypto scamcoins, dark patterns, family spyware, mobile gambling, public and private surveillance, gig economy taskrabbiting, and the perennially broken enterprise systems you might suffer through at work. But I don’t think that discussion would get us anywhere productive. Let’s return back to the question of what computers are for or about.
The biggest barrier to answering those kinds of questions is the unfortunate reality that we are continuously bombarded with messaging about why we don’t have a choice at all. The people who talk the loudest about computing unanimously agree that the future is inevitable. Either get with the program or get out of the way. If you disagree, it’s just delusional cope.
Somewhere along the line, we lose the notion that computing can be about something other than utilitarian adaptation to inescapable trends. And if there is no significant need to care about your computer except as a means to an end, then there is no reason to learn about anything other than what it can narrowly do to further your (externally defined) interests.
It isn’t surprising, then, that computers are associated far more with ignorance, superstition, and fear in the public imagination than knowledge and rationality. Even some of the most prominent people who make them have given up on liberal democracy altogether. The will to power is often the will to ignorance.
The tradition of freedom
Deciding for yourself what computers are about may seem to be in tension with honoring the Wisdom of the Elders (TM). Tradition always constrains agency at least somewhat. But the traditions behind UNIX, GNU, Linux, the CLI, and other systems are also deeply sincere ideals. And one of the most important components of those ideals is freedom and choice.
I do not believe people will embrace friction without idealism. Discussions of computing and friction often have an overly puritanical dimension. You have fallen prey to the sins of lust, sloth, envy, and greed — and will be thrown into hellfire by an angry God. Tech critics always concede the argument from the get-go when they frame friction as taking away your toys. There is never any real consideration of why people would willingly accept restrictions.
Why did Stephenson choose to embrace friction, as he does at one point in the book? I will quote his discussion of UNIX at length to illustrate the interplay of tradition and freedom.
[UNIX] is hard to learn. The process of learning it is one of multiple small epiphanies. Typically you are just on the verge of inventing some necessary tool or utility when you realize that someone else has already invented it, and built it in, and this explains some odd file or directory or command that you have noticed but never really understood before. …the file systems of Unix machines all have the same general structure. On your flimsy operating systems, you can create directories (folders) and give them names like Frodo or My Stuff and put them pretty much anywhere you like. But under Unix the highest level—the root—of the filesystem is always designated with the single character “/” and it always contains the same set of top-level directories
/usr /etc /var /bin /proc /boot /home /root /sbin /dev /lib /tmp
and each of these directories typically has its own distinct structure of subdirectories. Note the obsessive use of abbreviations and avoidance of capital letters; this is a system invented by people to whom repetitive stress disorder is what black lung is to miners. Long names get worn down to three-letter nubbins, like stones smoothed by a river.
This is not the place to try to explain why each of the above directories exists, and what is contained in it. At first it all seems obscure; worse, it seems deliberately obscure. When I started using Linux I was accustomed to being able to create directories wherever I wanted and to give them whatever names struck my fancy. Under Unix you are free to do that, of course (you are free to do anything) but as you gain experience with the system you come to understand that the directories listed above were created for the best of reasons and that your life will be much easier if you follow along (within /home, by the way, you have pretty much unlimited freedom)
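Stephenson declines to explain what each of those directories is actually for, so here is a rough gloss of the roles they conventionally play on a modern Linux system. It follows the Filesystem Hierarchy Standard as I understand it; the exact contents vary across UNIX descendants, and the one-line summaries are mine, not Stephenson’s:

```python
# A rough gloss of the conventional purposes of the top-level directories
# Stephenson lists, roughly following the Filesystem Hierarchy Standard
# used by most Linux distributions.
FHS_GLOSS = {
    "/bin":  "essential user commands (ls, cp, sh)",
    "/sbin": "essential system-administration commands",
    "/etc":  "host-specific configuration files",
    "/var":  "variable data: logs, spools, caches",
    "/usr":  "shareable, mostly read-only programs and libraries",
    "/lib":  "shared libraries needed by the programs in /bin and /sbin",
    "/dev":  "device files (disks, terminals, /dev/null)",
    "/proc": "virtual filesystem exposing kernel and process state",
    "/boot": "kernel images and bootloader files",
    "/home": "ordinary users' personal directories",
    "/root": "the superuser's home directory",
    "/tmp":  "scratch space, often cleared on reboot",
}

if __name__ == "__main__":
    for path, purpose in FHS_GLOSS.items():
        print(f"{path:6} {purpose}")
```

None of this needs to be memorized; the point, as Stephenson says, is that the structure was settled long before you arrived and rewards you for following along.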
“Acculturation” into the ways of the elders is, as Simba learns in the Lion King, the basis of one’s own freedom of action. Grounding in tradition is an enormous part of what makes it feasible to make responsible decisions about a complex and ambiguous artifact such as the general-purpose digital computer. And the tradition of UNIX is very much a tradition of freedom, however much the ideals behind techno-utopianism have gone awry.
Even if not all implementations and derivations of UNIX have empowered the masses, the cumulative culture associated with what UNIX eventually became is a powerful force for freedom and choice. I did not always believe this, but I certainly do now.
Stephenson’s recollections of the 1980s and 1990s computing worlds reminded me of my own personal journey through computing. 4 I grew up almost exclusively with MacOS, the very operating system Stephenson eventually switched to after writing the book under review. 5 To me, Mac was a shiny, Internet-capable typewriter with a big file drawer strapped to it. And every few years I’d get a new shiny, Internet-capable typewriter and think nothing of it.
Mac was certainly a portal to UNIX, but a heavily guarded one. And knowing nothing but Mac engendered habits I needed to work hard to erase. I was held back by an all-consuming sense that I couldn’t go outside the boundaries of typical computer use. Something bad would happen — I’d break my well-engineered and gold-plated consumer product beyond repair, I’d upset some vaguely defined authority figure, I’d just do wrong in general.
Computers were, in other words, delicate things I could look at but never touch. This is a harmful attitude, but also a pervasive one. It eventually made me very unhappy, and forced a decisive break that I will write about in detail some time in the future. I am often jealous of people who grew up playing with Commodore 64s, Apple IIs, and similar platforms from a very young age. I only developed that childlike enthusiasm as an adult. Better late than never, though.
Going back to the earlier tradition Stephenson describes in the quoted paragraphs through Linux helped me form a much healthier and happier relationship with computing. That relationship helps me tolerate the friction of learning what all of those odd-sounding programs mean and do from an interface lacking user-friendly graphics. I’m motivated to pursue a path forwards that guarantees more friction because I care about the ideals that tradition represents.
I want them to survive. I cannot personally do much to alter the larger negative trends surrounding computing. However, I can at least perpetuate the things I want to see by making myself a better embodiment of the ideals I value. What I’ve realized now is that standing for those beliefs is more than just choosing to run X or Y system. It also means pushing myself to be a better and more knowledgeable user of it, and of my electronic devices in general.
I do not presume committing to that quest will instrumentally benefit me, make me look cooler to others, or even give me the ability to explain to someone else why their fresh OS installation has gotten fucked up. All of those things would be nice, but they do not matter the most to me. I owe the great web of tradition behind UNIX and Linux a great deal. The least I can do to pay the people behind it back is to focus my energies on learning about what they created.
Tidying up
It is not uncommon for a conservative to mistake tradition for a superior model of reality. The good old ways are the best ways. Not because they are good and old, but rather because they help you see visions others cannot. This argument is not usually stated outright. It is often hidden behind homilies to common sense and injunctions to respect rituals even if one cannot understand their function.
But sometimes it is in fact explicitly argued at length, and often eloquently. Eloquence and elegance do not translate to logical coherence or persuasiveness. It is still, however, illuminating even when dubious. In The Beginning, There Was The Command Line is a powerful argument for the importance of tradition that thinks it is a case for tradition being a superior model of reality.
I frequently disagreed with Stephenson’s arguments. As this post was already long enough, I’ve spared the reader other gripes and nitpicks I have. However, I still immensely enjoyed reading it because it clarified a commitment I’ve been struggling to fully understand and take on. The most you can expect from any author is to give you something valuable you can take away for yourself. If you’ve read this far, I certainly hope I at least have done the same for you.
Footnotes
1. This is a reference to H.G. Wells’s The Time Machine.
2. I am not talking about Clausewitzian friction here, but it is still relevant.
3. That software is in and of itself part of a lineage stretching back to World War II and the period immediately preceding it.
4. This is a very personal topic for another time.
5. I do plan to write specifically about my experience with Mac in detail.