This is a (very) lengthy post (on technology, design, friction, tradition, and a host of other things) extracting meaning from the decades-old tar.xz of Neal Stephenson’s In The Beginning, There Was The Command Line. As a map for the reader, I argue the following:

  1. The general-purpose digital computer is a colonizing force. It erodes context, it has no inherent linkage to reality, and it is not clear how friction in the abstract will improve it.
  2. Ideals, practices, and traditions are what make interfaces work. The command line interface works because of the culture associated with it, not anything inherent to it.
  3. We should learn to appreciate technology as culture. Prioritizing what computers can do for us obscures what computers are to us.

Much like Stephenson’s book itself, my post is not really arranged in linear order and frequently interpolates autobiographical anecdotes. Like much of my other writing, it amalgamates many “leftovers” I’ve written over the years but never bothered to fully develop or publicly display.

Press X to doubt

The post you are reading was composed using a venerable old text editor, inside a terminal emulator running a shell of similar vintage. To fit the mood for writing about the book, I have enabled a black-and-green color scheme. A cursor blinks whenever I stop writing. I do not have any mirrored shades or — perhaps more appropriately — a neckbeard.

At one point I even tried listening to some 1990s electronic music to capture the feel of the era Stephenson wrote about but I couldn’t stop giggling. I had to flick off the music, turn the terminal back to a normal color, and play some video games before I could start writing and editing again.

I am somewhere in between Stephenson’s “Eloi” — who live blissfully ignorant and luxurious — and the “Morlocks” that labor underground to make their machinery work. 1 But in that respect, I am also like Stephenson himself. He freely admits throughout his book that he gets tired of the austerity and friction of the command line interface (CLI) and operating systems (Linux) that model a similar asceticism.

I still use Linux, though, and it seems he does not.

I chose to read a decades-old book the author himself (sort of) disavowed in part because of a debate currently raging about the future of computing. Perhaps the best way to introduce that debate is to talk about People Being Mad About Video Games.

As long as there are gamers, there will be gamer rants about why games today are too easy. There’s always some substance to these gripes. I sometimes don’t understand how I could have possibly beaten the games I played as a kid. They’re certainly much harder than the games I play today.

I like to think that those games taught me the value of persistence and discipline. I was so upset that I kept failing over and over again. But I kept on trying and eventually beat even the hardest levels. I don’t make any claims about the character-building value of hard games for anyone except myself. Other people may not have gotten the same value out of them.

Regardless, games — or at least the games you retroactively mythologize as appropriately challenging — are optimized case studies in friction. 2 Enemies fight just hard enough to lose gracefully to you. Weapons and movement mechanics give you options but still constrain you. When there are big penalties for screwing up, you look before you leap. It’s hard to imagine games without at least minimal friction. You can type in the God Mode cheat, but what’s the point?

Much in the same way some gamers today argue that getting Mario and his relatives through those difficult timed jumps built character, technology critics think friction will make computing better.

There are many people who believe that the absence of friction is what makes computing bad today. Frictionless user interfaces and user experience designs are what make social media harmful. More friction, the argument goes, can make social media and smartphones better.

My opinion? Press X to doubt.

Discussions of friction modestly parallel the old debate about whether graphical user interfaces cut too much friction from computing. Stephenson’s book was an obvious contribution to that debate. The problems I have with Stephenson’s book are the same problems I have with pro-friction takes on technology.

When you have a complex technology with no guaranteed connection to anything in the external world, what is the appropriate level of friction? Social media allows you to link to an enormous aggregated hive mind. It’s inherently volatile! It might be that connecting so many people together inherently causes problems!

Henry Farrell recently developed a simulation suggesting that recommendation systems — the scapegoats of social media critics — actually may not make as much of a difference as popularly believed.

The model predicts a fairly straightforward outcome. If people are able to search for evidence and arguments that confirm their biases, and to easily publish such evidence too, they will tend to create large online communities that glom together around shared rationalizations, defend them, and produce more of them. In other words: hell hath no limits, nor is circumscribed. You don’t need modern social media algorithms to act as Virgils, conducting people into our current online inferno. There are entrances everywhere, and all you need to find them is open search, easy self-publishing and ordinary human psychology. If our model is right, we would likely be in much the same situation as we are at the moment, even if platform companies had never discovered machine learning. People would still be driven by their own wants to discover and create the kinds of shared rationalization that dominate online political debate today, and search and Web 2.0 type publishing would make discovery and sharing really easy.

As Farrell reminds us, all of this has in some way happened before. It sometimes seems like there isn’t anything interesting or novel to be said about it. Or at least nothing that wasn’t already said somewhere between the 1940s and 1970s. Or maybe, in Stephenson’s case, 1999?

The shell before time

It is not uncommon for a conservative to mistake tradition for a superior model of reality. The good old ways are the best ways. Not because they are good and old, but rather because they help you see visions others cannot. This argument is not usually stated outright. It is often hidden behind homilies to common sense and injunctions to respect rituals even if one cannot understand their function.

But sometimes it is in fact explicitly argued at length, and often eloquently. Eloquence does not translate to logical coherence or persuasiveness. It is still, however, illuminating even when dubious. In The Beginning, There Was The Command Line is a powerful argument for the importance of tradition that thinks it is a case for tradition being a superior model of reality. It fails as a design critique, but succeeds at something else more subtle and constructive.

Throughout the book, Stephenson conflates ways of seeing with ways of believing. In one breath, he argues that the CLI is a superior model of reality and then makes a quite different claim that using it signifies the assumption of moral responsibility. He is, though, not really committed to his bits. He will lavish something with page after page of praise only to retreat back to the cool-cat pose of the cyberpunk auteur a few pages (or a chapter) later.

Stephenson’s book isn’t, despite the title, really about the command line. Hymns and hosannas about the command line do take up a lot of literary real estate. But the book’s message is partly that some things can only be communicated textually:

Back in the days of the command-line interface, users were all Morlocks who had to convert their thoughts into alphanumeric symbols and type them in, a grindingly tedious process that stripped away all ambiguity, laid bare all hidden assumptions, and cruelly punished laziness and imprecision. Then the interface-makers went to work on their GUIs, and introduced a new semiotic layer between people and machines. People who use such systems have abdicated the responsibility, and surrendered the power, of sending bits directly to the chip that’s doing the arithmetic, and handed that responsibility and power over to the OS. This is tempting because giving clear instructions, to anyone or anything, is difficult. We cannot do it without thinking, and depending on the complexity of the situation, we may have to think hard about abstract things, and consider any number of ramifications, in order to do a good job of it. For most of us, this is hard work.

The graphical user interface (GUI) is, to Stephenson, not just a replacement for a far more “direct and explicit” mode of communication but also a generalized metaphor for reality. That metaphor is quickly colonizing the entire world. “A few lines of computer code can…be made to substitute for any imaginable mechanical interface.” The substitute is often slapdash.

Stephenson raises the prospect of people using GUI touchscreens to drive cars as a reductio of GUI colonization, but that’s now a mundane reality today. How time flies. The root of his gripe with the GUI is the seductiveness of unthinking:

The OS has…become a sort of intellectual labor-saving device that tries to translate humans’ vaguely expressed intentions into bits. In effect we are asking our computers to shoulder responsibilities that have always been considered the province of human beings—we want them to understand our desires, to anticipate our needs, to foresee consequences, to make connections, to handle routine chores without being asked, to remind us of what we ought to be reminded of while filtering out noise.

Much of the book contains jeremiads about the imperialism and sinfulness of graphical user interfaces. Given that he wrote the book in 1999, that battle has long since been lost.

There have been some attempts to update and critique the book for the new century. The most risible point Stephenson makes is that the command line itself is the most pure way to experience computing reality:

So an [operating system] is a stack of metaphors and abstractions that stands between you and the telegrams, and embodying various tricks the programmer used to convert the information you’re working with—be it images, e-mail messages, movies, or word processing documents—into the necklaces of bytes that are the only things computers know how to work with. When we used actual telegraph equipment (teletypes) or their higher-tech substitutes (“glass teletypes,” or the MS-DOS command line) to work with our computers, we were very close to the bottom of that stack. When we use most modern operating systems, though, our interaction with the machine is heavily mediated. Everything we do is interpreted and translated time and again as it works its way down through all of the metaphors and abstractions.
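Stephenson’s “necklaces of bytes” are easy to see for yourself. A minimal sketch using standard POSIX tools; the hex dump strips away every metaphor above the byte:

```shell
# Dump the raw bytes of a five-character string as hexadecimal.
# od ("octal dump") is one of the old UNIX tools from the era Stephenson
# describes; -An suppresses the address column, -tx1 prints one-byte hex values.
printf 'hello' | od -An -tx1
```

The five bytes 68 65 6c 6c 6f are the ASCII codes for h-e-l-l-o; everything above them, from the text editor to the GUI, is interpretation layered on top.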

In 2004, Garret Birkel responded at length to these and many other claims the book makes:

Computer-like circuitry has burrowed its way into almost every kind of device, in every arena of human progress. The microchip embedded in a rice cooker, for example, could potentially respond to a command-line interface. We could wire a teletype to most of the appliances we use in a day. But when someone says, “Use the computer!”, the image that pops into our collective imagination is relatively well defined…Consider the garden-variety machine from Dell. We pry open the cardboard box, wrestle the styrofoam jaws apart, and dump the thing on a tabletop. “There!” we say. “A brand-new computer!” But by the historical definition, we’re actually looking at dozens of computers. A heap of computers. A computing collective.

  • The CPU
  • The cache manager inside the CPU
  • The on-board sound chip
  • The USB controller
  • The power-management system
  • The network controller
  • The graphics CPU
  • The hard-drive controller
  • The microcontroller inside the hard drive
  • The settings-management CPU inside the display
  • The key encoder inside the keyboard

Clearly a PC has become something more than a teletype. It has not just evolved from its ancestors, but actually contains multiple copies of its ancestors. If we wanted to treat a modern PC like a teletype, we would have to ignore all but one of these embedded computers, and throw the mouse, printer, disk drives, speakers, and game controllers in the trash can.

I will not try to go over the same ground Birkel covers. My intent here is to directly engage with the basic case Stephenson advances in the early and middle parts of the book.

I do not think Stephenson’s arguments — which he is at best ambivalently committed to — are entirely correct. There is a good deal of truth to what he says, but time has not been kind to the case that words keep you grounded in reality. Both social media (at least before TikTok) and lately generative AI, the supposed culprits for many social ills, are mainly navigated by text interfaces. ChatGPT and other Large Language Models (LLMs) are quite literally manipulated through textual prompts! And yet the latter do not obviously entail the sort of direct and authentic communication Stephenson attributes to the CLI. Text, not the dreaded GUI, is the instrument of doom here.

The great deceiver

If the GUI is not to blame, what has actually gone wrong? Birkel argues that Stephenson tends to gloss over the way in which even the CLI itself was always a convenient abstraction. Even the Stone Age computing interfaces that Stephenson describes at the beginning of the book are light years away from the beginning of digital computing in the mid-20th century. If we’re searching for the real culprit, the GUI is a red herring. The perp we’re trying to apprehend is hiding in the 1940s, likely in the big room where they keep ENIAC.

Given the way Stephenson rightly argues that spurious metaphors and analogies inherent to the GUI harm computing, it is ironic that one of the big issues with his case is actually a dubious analogy to another technology.

You cannot sell a complicated technological system to people without some sort of interface that enables them to use it. The internal combustion engine was a technological marvel in its day, but useless as a consumer good until a clutch, transmission, steering wheel and throttle were connected to it. That odd collection of gizmos, which survives to this day in every car on the road, made up what we would today call a user interface. But if cars had been invented after Macintoshes, carmakers would not have bothered to gin up all of these arcane devices. We would have a computer screen instead of a dashboard, and a mouse (or at best a joystick) instead of a steering wheel, and we’d shift gears by pulling down a menu…The steering wheel and gearshift lever were invented during an era when the most complicated technology in most homes was a butter churn. Those early carmakers were simply lucky, in that they could dream up whatever interface was best suited to the task of driving an automobile, and people would learn it. Likewise with the dial telephone and the AM radio.

Stephenson laments that “it is no longer acceptable for engineers to invent a wholly novel user interface for every new product.” But is the culprit the GUI, or the digital computer itself? The digital computer is, among many other things, a general-purpose system. Analog computers were, for all of their limitations, models of the environments and problems they were associated with. As Charles Care argued, analog computers were essentially modeling systems. The fit of model to reality is why analog fire control machines survived in the Navy long past the introduction of digital computing.

Digital computers are technologies for simulation, not modeling. The responsibility for making them match their outer environments rests solely with the software designer and system operator, not the system itself. Because the digital computer can be made to be anything we need it to be, it is actually the invasive and predatory generalized metaphor for reality Stephenson is bemoaning. The car (and other devices Stephenson names) has converged functions that have been concretized in a very particular milieu of operation. The digital computer is the true view from nowhere.

One implication of this is that while GUIs may rely on misleading metaphors, CLIs ask you to imagine something beyond your sensory capacities. I have often heard complaints from people skilled in more directly physical crafts like sculpting, sewing, embroidery, and painting that it is impossible to develop mental models of what computers are doing. They shy away from experimenting with code and CLIs because they cannot get the tactile feedback of looking at something and feeling it in their hands.

Stephenson often compares the command line to writing, and this is quite revelatory. Literary fiction is only possible due to the mind’s capacity for imagination. Writing may be more “direct” in some ways than multimedia, but it is still about building up something that doesn’t exist in the reader’s mind. The reader has to keep that inner structure — which will differ from person to person — in their head as they traverse the novel. Literary fiction may require more intense attention, but it is open to question whether it really promises direct communication.

Likewise, the CLI is emblematic of the enormous role the imagination plays in even “bare bones” computing. You have only your eyes and fingers to manipulate and control the most complex machine humanity has ever invented. Only one of these senses gives you any useful — but still very sparse — perceptual feedback. The imagined mental models users need to control the system come from something else associated with the CLI — we will get to this later — not necessarily the CLI itself. Taking the computer-car comparison to its logical conclusion…well…drives this point home.

Driving a car with a graphical touchscreen is bad, sure. I think the Tesla user interfaces are rubbish. Driving a car with a text adventure game interface is also uniquely hazardous. “A pedestrian is crossing the street slowly. You can likely clear the intersection before he passes. Do you want to risk it? Yes/no.” Would it make things better if there were also some crude ASCII representations of the steering wheel and the street created by an international computer standards group?

*NIX in a box

My own computing journey, which I’ll write more about over time, made some of the jibes Stephenson throws at Apple and MacOS resonate with me. However, it appears those criticisms no longer resonate with Stephenson himself.

Stephenson, who praises Linux and BeOS (RIP) in the latter parts of the book, eventually became a MacOS man.

When you wrote “In the Beginning was the Command Line,” you were very much in love with BeOS. As nice as BeOS was, it is now mostly gone. Do you still use BeOS 5, or have you acquired Zeta from YellowTab? Or, instead have you embraced the new UNIX based MacOS X as the OS you want to use when you “Just want to go to Disneyland”?

Neal:

You guessed right: I embraced OS X as soon as it was available and have never looked back. So a lot of “In the beginning was the command line” is now obsolete. I keep meaning to update it, but if I’m honest with myself, I have to say this is unlikely.

Today people with little knowledge of the past backdate the current “MacOS” to the entirety of Apple’s history, despite the more than syntactic, even semantic, differences between the classic “Mac OS” and what came after OS 9, when Mac OS became MacOSX and its successors.

Given how long this post is, I’ve probably tripped up at least once or twice in delineating the differences too.

I have far fewer positive things to say about MacOS. Don’t get me wrong, this isn’t a personal criticism if you like it. I’m not trying to get you to use some obscure Linux distro or even an abstract art performance version of Plan 9.

I suppose that, in making prescriptive judgments about computers and software in general, I am stepping over a minefield as thick as the one separating North and South Korea from each other. But so be it.

I couldn’t care less what you do with your computing devices as long as they help you live your life. But do they really? How would you know? There’s a lot of money to be made from the assumption that you won’t ask yourself that.

I will always remember the day that I dropped Apple like a rock.

The weather was terrible, and everything around was either closed or opening late. I woke up, got my coffee, and booted up the Mac Mini. Or, at least, that was the plan. It wasn’t working. I fruitlessly tried to fix it, wondering if the thunderstorm had fried something critical.

Giving up, I tried to remotely reinstall the entire operating system. That failed too. I waited for what felt like an eternity for the download and installation process to complete. More inscrutable errors. For the next few hours, I tried again and again to follow instructions and MacOS failed each time.

I stared at the malfunctioning little box and called it every single name in the Captain Haddock index. 3 I was doing everything “right” and my computer was still bricked. The more I thought about the situation, the more I hated everything about Mac.

I hated that I needed an Apple account. I hated that it was so tedious to install programs outside Apple’s walled garden. I hated that I felt so powerless. I don’t remember what was actually wrong today. It really didn’t matter. I needed a change.

Truth be told, I wasn’t entirely fair to Cupertino. I was in a horrible mood in general. The non-technical parts of my life weren’t going so well either and I was under a lot of stress. The crash was just the last thing I really needed. It felt like being kicked while I was down.

I needed to do something. And I was ready to throw away a computer brand I had carried with me since I literally learned to type. As you might imagine, that was when I became a Linux user.

In hindsight, Apple actually did me a favor. It taught me a valuable lesson. An important consequence of computers being technologies from nowhere is that X or Y interface design decision is less important than decisions we make about our orientation towards the computer in general.

I grew up almost exclusively with Mac. To me, Mac was a shiny, Internet-capable typewriter with a big file drawer strapped to it. And every few years I’d get a new shiny, Internet-capable typewriter and think nothing of it. Knowing nothing but Mac engendered habits I needed to work hard to erase.

I was held back by an all-consuming sense that I couldn’t go outside the boundaries of typical computer use. Something bad would happen — I’d break my well-engineered and gold-plated consumer product beyond repair, I’d upset some vaguely defined authority figure, I’d just do wrong in general. Computers were, in other words, delicate things I could look at but never touch.

The fact that MacOSX and beyond had a command line and was a partial derivative of UNIX did not make much of a difference. I had no cultural context for what it was. The most I knew was that it was a “break in case of emergency” device walled off from the regular user. The antidote was immersion in a much different cultural system than the one Cupertino bestowed upon me.

That good old time UNIX religion

As I said before, the CLI on its own offers very little. It is quite like driving a car using a 1970s-80s text adventure game system, albeit one far more cryptic and confusing.

In the grand scheme of things, the CLI is not really that old compared to, say, the pen. But the experience of using it feels so odd to millennials and the generations after us because it is archaic. Things like man pages are sometimes baffling and cryptic because they are associated with a “way of seeing” we did not natively assimilate. To mix metaphors, it is a language, like Latin, kept alive despite its irrelevance to daily life.

Using the command line is becoming increasingly archaic even among computer science students. You can imagine how little it is used among the general population. So if it — or rather, the various things intertwined with it — is a language linked to an antiquated tradition, should you really learn to read and speak it? Especially if you don’t aspire to be the next startup pro?

Many of the affirmative answers are utilitarian. It’s essential for doing something economically valuable. It’s the only way to gain mastery over your computer. You aren’t a real [insert here] unless you learn how to use it. We all ought to have “digital literacy.” You’re driving blind without it. And so forth.

All of these justifications are more fragile than they appear, because all extrinsic motivations for learning something “impractical” eventually fall by the wayside. So what’s left?

Part of what makes Stephenson’s book so frustrating is that he often talks about the cultural, class, and aesthetic signifiers associated with computing brands and interfaces but only devotes a meager (if tasty) morsel of the book to tradition. When he does, he notices something quite significant:

[UNIX] is hard to learn. The process of learning it is one of multiple small epiphanies. Typically you are just on the verge of inventing some necessary tool or utility when you realize that someone else has already invented it, and built it in, and this explains some odd file or directory or command that you have noticed but never really understood before. …the file systems of Unix machines all have the same general structure. On your flimsy operating systems, you can create directories (folders) and give them names like Frodo or My Stuff and put them pretty much anywhere you like. But under Unix the highest level—the root—of the filesystem is always designated with the single character “/” and it always contains the same set of top-level directories

/usr /etc /var /bin /proc /boot /home /root /sbin /dev /lib /tmp

and each of these directories typically has its own distinct structure of subdirectories. Note the obsessive use of abbreviations and avoidance of capital letters; this is a system invented by people to whom repetitive stress disorder is what black lung is to miners. Long names get worn down to three-letter nubbins, like stones smoothed by a river.

This is not the place to try to explain why each of the above directories exists, and what is contained in it. At first it all seems obscure; worse, it seems deliberately obscure. When I started using Linux I was accustomed to being able to create directories wherever I wanted and to give them whatever names struck my fancy. Under Unix you are free to do that, of course (you are free to do anything) but as you gain experience with the system you come to understand that the directories listed above were created for the best of reasons and that your life will be much easier if you follow along (within /home, by the way, you have pretty much unlimited freedom).
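You can verify the layout Stephenson describes on nearly any modern Linux or other UNIX-like system. A quick sketch, assuming a more or less FHS-shaped filesystem; the exact set of directories varies by distribution:

```shell
# List which of the canonical top-level directories exist on this machine.
# Missing entries are simply skipped rather than treated as errors.
for d in /usr /etc /var /bin /home /tmp /dev /boot; do
  if [ -d "$d" ]; then
    echo "$d: present"
  fi
done
```

On most systems the output will name nearly all of them, which is exactly Stephenson’s point: the structure precedes you, and you inherit it.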

What made — and still makes — learning about things like the terminal difficult is that you’re not just learning an alien interface. You also quickly discover that doing things with it requires you to learn about the ecology of UNIX, GNU, and Linux (in my case) software surrounding it. And a good deal of that software in turn depends on software written for platforms — like the PDP-11 — that died out before you were born.

Stephenson repeatedly reminds us that the CLI is a preserved facsimile of telegraphic communication. You can trace a line from telegraphs to teletypes to physical terminals and eventually to the terminal emulator that most people use to access the CLI. That is close to two hundred years of tradition. The “telegraphy” you engage in with the CLI may be a Ship of Theseus, but it still binds you to something bigger than yourself.
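That lineage is not just a metaphor; it survives in the operating system’s own vocabulary, where terminal devices are still named after the teletype. A small sketch, using only standard tools:

```shell
# "tty" is short for teletype. The kernel still exposes terminal devices
# under names that preserve the old telegraphic vocabulary.
ls /dev | grep '^tty' | head -n 3   # e.g. tty, tty0, tty1 on a typical Linux box
tty || true                         # names your terminal, or prints "not a tty" in a script
```

Every time a terminal emulator opens, it is allocated one of these pseudo-teletypes, a software stand-in for the electromechanical hardware Stephenson traces the CLI back to.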

As Stephenson tells us, UNIX — the ancestor of most non-Windows operating systems — is in and of itself a complex web of tradition that lives on through the systems it influenced. “Acculturation” to the ways of UNIX is appealing because it also binds the user to the past.

Windows 95 and MacOS are products, contrived by engineers in the service of specific companies. Unix, by contrast, is not so much a product as it is a painstakingly compiled oral history of the hacker subculture. It is our Gilgamesh epic. What made old epics like Gilgamesh so powerful and so long-lived was that they were living bodies of narrative that many people knew by heart, and told over and over again—making their own personal embellishments whenever it struck their fancy. The bad embellishments were shouted down, the good ones picked up by others, polished, improved, and, over time, incorporated into the story. Likewise, Unix is known, loved, and understood by so many hackers that it can be re-created from scratch whenever someone needs it. This is very difficult to understand for people who are accustomed to thinking of OSes as things that absolutely have to be bought.

Each new addition to the UNIX ideal has some chance of surviving long enough to become part of the larger “epic.” Therefore, “UNIX has slowly accreted…[and] acquired a kind of complexity and asymmetry about it that is organic, like the roots of a tree, or the branchings of a coronary artery.” The CLI is to the UNIX mythos what archaic languages, texts, and practices are to religions we see today. You do not necessarily need to know these things to believe, but it certainly heightens your experience if you do. The CLI is a means by which we commune with tradition, and that is what makes it so indispensable.

MacOS is, as Stephenson notes, a product. This has not changed despite MacOSX+ — which replaced the classic Mac Stephenson wrote about — being a portal to UNIX. It is a heavily guarded portal. It is, figuratively, ashamed of its own heritage. It is no wonder I would have to work so hard to immerse myself in the tradition hidden beneath the surface.

An elegant interface for a more civilized age

Computing and tradition are not usually spoken of in the same breath. Computing has always looked to the future and eagerly thrown away the accumulated “cruft” holding us back in the present. However, it is also a fundamental fact of life that the most powerful government agencies and the wealthiest corporations would grind to a halt if an obscure, decades-old mathematics library their code relies on stopped working.

Stephenson is far more effective at arguing — though mostly indirectly — that the tradition behind the CLI is the real source of its appeal. The interface perpetuates a web of tradition. People do not necessarily find comfort, guidance, and purpose in tradition because it offers them the best model of reality. If there is something uniquely powerful and pleasurable about the CLI, it cannot be separated from the power and pleasure that humans find in tradition in general.

I can personally attest to that.

I struggled to see the power of tradition the blinking cursor and prompt symbolized for a long time. Thankfully, I do now. It is actually wonderful to realize that even if your computer is a space age artifact inconceivable to your parents back when they submitted jobs to mainframes, you are still using a crudely similar interface and model of communication. You are sharing, however superficially, in the same ritual.

I never learned to speak or read Latin or Greek. I’ve read some of the classics, but I wouldn’t say I got a classical humanistic education. I have still always been receptive to arguments that this kind of education gives you the perspective and grounding to better interpret the present. After all, it is hard to understand American political rhetoric up to the mid-20th century without some knowledge of Greco-Roman culture and the Bible. Learning about UNIX through the CLI is a kind of ersatz classical education. Wait, really?

There is a stereotypical model of the way a refined gentleman was educated in the past. I know it is not as simple as what I lay out, but that model does have some load-bearing properties for the argument I’m going to make. I am going to ask for grace in how sloppily I throw around terms like “traditional”, “classical”, and “humanistic education.” But let me at least temporarily have my “Plato to NATO” throwback reference.

Back in those sepia-toned times, you learned the classics and the “dead” languages they were originally written in. You at least superficially read the Bible. And the “you” here was not just the stereotypical British boarding school student. As I’ve noted before, it is truly incredible how much American political rhetoric and culture in general presumed some familiarity with a common antiquated template.

The notion of a great “canon” of Western culture was later bolted onto the pre-existing template. The template itself is still a core component of elite education in some schools. Mark Zuckerberg is not the only prominent figure to have studied the classics at Phillips Exeter Academy and similar prep schools. If you aren’t in that kind of environment, what value do you get out of it?

What do I think? In hindsight, what sticks with me is the simple fact that classical culture not only persisted after the decline of the Roman Empire but became omnipresent across Europe, the Middle East, Eurasia, and Africa. It was not just laboriously preserved by monastic scribes. It flourished long after its progenitors were distant memories. If Europeans, Arabs, Africans, and others all judged it worth keeping around, perhaps we should too.

The “traditional” education may not guarantee creativity, self-actualization, or even sound judgment, but it is a sizable inheritance anyone willing to engage with it may claim. The value is intrinsic. You may not be rewarded for it, but you miss out on a rich source of insight and meaning if you won’t at least ingest some of it.

Furthermore, it’s not as if the questions that you find in the classics have been conclusively resolved in 2024 either. Maybe we’re all still in the cave looking at the shadows on the wall. So even if Stephenson’s jeremiads against the GUI aren’t as substantive as they appear, he still makes an extremely strong case for committing to the bits with the CLI.

Stephenson is virtually alone in arguing — even if he never argues wholeheartedly — that there is something culturally vital in stripping away the layers of mediation between us and the computer systems we use. As with the classical education, we may not derive external reward from it but we gain something equally important. That “something” is admittedly vague and hard to nail down. And it goes against a technical culture that looks forward rather than backward.

The “compleat” gentleman

The value of the older way of seeing and the web of traditions associated with it is ultimately similar to the value provided by taking the classics seriously.

Comparing the CLI, UNIX (as well as the things surrounding it), and the web of tradition both are embedded within to the classics isn’t just a throwaway reference. There are too many parallels to ignore.

Much like the classics are a common inheritance we all may access, the hidden and very much antiquarian world of computing is a place of great beauty and is also endowed with rich meaning. There is an argument to be made that in communing with the web of tradition that these increasingly primordial interfaces and programs belong to, we are also claiming an important common cultural inheritance.

In immersing ourselves in an older world we also better appreciate the physicality and historicity of computer systems. They are not magical objects or ethereal illusions. They are real things, situated in particular contexts, like any other object of material culture. Our world is built on top of their scaffolding, much as modern cities grew out of Greek and Roman ruins.

And much like we turn to classics and other great works of the past to better understand human nature, our computing machines are very much extensions of ourselves and reflections of how we see ourselves. In their past we see how others have used them to seek out and explore the mysteries of the universe and paint vivid pictures of human mentality.

The strange, alien, and beautiful space Stephenson elegizes is where we encounter problems and questions that are still relevant today. The Turing Test, after all, was supposed to be conducted using a teletype machine. I could go on further, but you may already grasp what I am trying to say.

So, again, we ought to consider the possibility that engaging with seemingly archaic interfaces and the traditions they are a part of is intrinsically rewarding in the same way the much-mythologized traditional humanities education is. I cannot tell you exactly how to do that, even if I offer up my own motivations for it at the end of the post.

I can say that it doesn’t mean treating them like novels, sculptures, and paintings. It does entail bringing them to culture — and learning to see them as more than just functional tools. Computing, though not totally human, is a human activity that ultimately transcends utilitarian use-value.

That may sound like a bizarre or even ridiculous argument. The “hacker’s delight” aside, computers are intimately associated with work and bureaucracy. Even thinking of them as tools of self-expression concedes they are still tools. However, our current woes with computers partly arise from how we regard them solely as things we use.

Beyond “aboutness”

Prioritizing what computers can do for us over all else makes us indifferent to what computers are to us. Stephenson never really makes a coherent argument about what computers are to him — or us — in general, but we can still learn something valuable from him. Not just about the value of tradition, but about the importance of choosing for ourselves what computers mean.

What are computers for? Or rather, what are computers about? Again, I concede that the question may seem silly. Aren’t there readily available answers? Numerical calculation, bureaucratic organization, data storage and retrieval, simulation and gaming, and so forth.

The framing of the question, admittedly, isn’t helpful.

Technical objects are products of human intent but maintain their own separate existence in the world. It’s reductive to think only about what they can do for us because so much of our lives hinges on what they do to us. If technical objects influence our psychology and modulate our social activity, we also connect them to the world and determine what they ingest and emit.

It’s a very complex — and convoluted — relationship. Technical objects are still fraught with meaning, but that meaning isn’t always easy to reach. Sadly, our heads are often clouded by externally imposed meaning that frequently proves to be incomplete, misleading, and/or ephemeral. “Aboutness” is a pernicious trap.

In the philosophy of engineering, even an object as simple as a doorbell can have two subtly different functions. In the first, you press a button to generate a noise. In another, you alert someone inside that you are there. One perspective invokes explicit design intent, another doesn’t. This isn’t sophistry.

Archaeologists get into fistfights over questions like this when they look at artifacts from past civilizations. Did this device do X or Y? Or was it symbolic of religious belief? The computer, in particular, has been many things since the first primitive analog machines emerged, from industrial workhorse to enabler of creative self-expression. There is a recurring cycle of fear and wonder about the computer, and we are very much in the “fear” stage.

We could talk about the various maladies people commonly catalog — generative AI slop, crypto scamcoins, dark patterns, family spyware, mobile gambling, public and private surveillance, gig economy taskrabbiting, and the perennially broken enterprise systems you might suffer through at work. But I don’t think that discussion would get us anywhere productive. Let’s return to the question of what computers are for or about.

The biggest barrier to answering those kinds of questions is the unfortunate reality that we are continuously bombarded with messaging about why we don’t have a choice at all. The people who talk the loudest about computing unanimously agree that the future is inevitable. Either get with the program or get out of the way. If you disagree, it’s just delusional cope.

Somewhere along the line, we lose the notion that computing can be about something other than utilitarian adaptation to inescapable trends. And if there is no significant need to care about your computer except as a means to an end, then there isn’t reason to learn about anything other than what it can narrowly do to further your (externally defined) interests.

It isn’t surprising, then, that computers are associated far more with ignorance, superstition, and fear in the public imagination than knowledge and rationality. Even some of the most prominent people who make them have given up on liberal democracy altogether. The will to power is often the will to ignorance.

Computer enthusiasts have an ambivalent relationship with the ideals behind the software and hardware they use. Too much zeal about using unpopular computing solutions and rejecting mainstream alternatives can make you the butt of mockery. 1980s-1990s cyber-utopian ideals are not terribly popular today and are even viewed with suspicion.

But Stephenson’s fear of really embracing idealism harms his book and holds it back from reaching its true potential. I personally find the ideal of being able to make your operating system what you want it to be deeply underrated. You may not get the most efficient setup, but you can have something that genuinely feels like an extension of your identity and even a phantom limb. That feeling is powerful beyond words, and it changed my life for the better.

Overcoming friction contributes to that feeling — you can’t have frictionless freedom — but it is not the most integral element. I owe that silly little Linux penguin a profound debt of gratitude. I deeply needed to feel like I had something I chose for myself when I ditched Mac for Linux, and Linux provided it. It’s worth the extra trouble to get more in touch with those feelings.

Breaking up to make up

Deciding for yourself what computers are about may seem to be in tension with honoring the Wisdom of the Elders (TM). Tradition always constrains agency at least somewhat. But the traditions behind UNIX, GNU, Linux, the CLI, and other systems are also deeply sincere ideals. And one of the most important components of those ideals is freedom and choice.

“Acculturation” into the ways of the elders is, as Simba learns in The Lion King, the basis of one’s own freedom of action. Grounding in tradition is an enormous part of what makes it feasible to make responsible decisions about a complex and ambiguous artifact such as the general-purpose digital computer. The cumulative culture I speak of is an extraordinarily valuable inheritance we take for granted, however far astray techno-utopianism has gone.

I did not always believe this, but I certainly do now.

Everyone in my family used Mac and I had always used it without question. I never thought about doing anything else. I never cared about what was inside that hermetically sealed box. And there might as well have been nothing inside, because everything that mattered to me was on the surface. Computers were solely utilitarian objects to me. Means to an end, but nothing more.

Later on, I developed new professional and intellectual interests that forced me to treat my computer as more than just a shiny (black) box. It was only then that I realized how ignorant and helpless I really was. Beneath the aesthetically pleasing exterior was a strange inner world filled with mysterious, enigmatic, and often cryptically named objects and mechanisms. It was daunting but also exciting.

I was growing into a different kind of person, but Mac didn’t want to grow with me. I found myself using Linux more and more simply to get things done, first in a virtual machine and then on commodity hardware. But Linux was for work. Mac was where I kept everything really important to me. Maybe I’d switch one day, but I kept putting it off. I had bigger things to worry about.

Life often pushes you off the fence, and that one last crash finally forced the decision I was avoiding. After spending so much time Hamlet-monologue-ing between Mac and Linux, I chose to join the cult of the penguin.

The other issues in my life made the decision much easier. If I had wanted to, I could have just taken everything off my dead Mac’s hard drive and ported it to my new main computer. But there was nothing of value to me left on the Mac.

I could just let it go. I wanted a fresh start.

I understood in that moment that my Mac was really just an elaborate “dumb” terminal. All of the data that actually mattered to me was either backed up elsewhere or in applications I accessed with Internet browsers. Little real computation was actually being done on my desktop.

Over the years, I’ve installed and toyed with an eclectic mix of operating systems and Linux distributions. I overloaded one of the computers I built with SSDs and HDDs, each containing a different OS or distro. Old Mac laptops and refurbished discount computers were cannibalized. Sometimes the firmware disagreed. No matter. I’d find some weird way to dump it, edit it, and overwrite it.

There was something refreshingly transgressive about installing progressively weirder and more esoteric operating systems on what used to be my Mac Mini, taking apart dirt cheap commodity hardware and peering into their innards, and cobbling together FrankenMachines from odds and ends. I could say the same about similar activities like trying out hacks in old emulated games.

All of this cumulatively amounted to a necessary step to purge myself of the belief that computers were delicate things I could look at but not touch. My closet has a graveyard of “deconstructed” computer parts and electronics I’ve taken apart, mostly to silence that authoritarian superego telling me “don’t do anything out of the ordinary.”

I don’t say this to brag. Rather, I want to illustrate that leaving Mac was not just about becoming a Linux obsessive. The best — if imperfect — analogy/metaphor/whatever is to leaving your hometown and traveling nationally and internationally to see the world. I was a parochial country bumpkin seeing Paris, Cairo, New Delhi, and Tokyo for the first time.

It’s a cliche in memoirs that travel sometimes dramatically changes people’s lives, and that was the case for me as well.

When I say “freedom and choice,” I do not mean that I subscribe to any particular doctrine or set of principles. I do not hold them in contempt, but my ideals are more generalized and flexible. Most critically, they begin with people rather than machines.

People grow and change over time. They try on new identities and roles, adopting some and discarding others. For productive change to occur, the person needs an environment that supports their growth and gives them freedom to define for themselves what they desire. Accordingly, the machine should be half patient companion, half encouraging tutor.

I am thankful to Linux because it held my hand when I needed to feel more in control of my life, politely tolerated my experimental periods of trying on and discarding different versions of myself, and always encouraged me to be curious about what I didn’t know and poke around to see how things worked. While its complexity may be intimidating to others, having so many options and choices allowed my technology to grow and change with me.

Keeping faith with the past

I am often jealous of people who grew up playing with Commodore 64s, Apple IIs, and similar platforms from a very young age. I only developed that childlike enthusiasm as an adult. Better late than never, though.

Going back to the earlier UNIX (and other assorted things) culture Stephenson recounts through Linux helped me form a much healthier and happier relationship with computing. That relationship helps me tolerate the friction of learning what all of those odd-sounding programs mean and do from an interface lacking user-friendly graphics. I’m motivated to pursue a path forwards that guarantees more friction because I care about the ideals that tradition represents.

Caring about ideals matters. Computers may not be obviously “about” anything, but the willingness to make them about something is ultimately what can make even a fraying, puny, “potato” laptop feel like an NSA supercomputer.

Many of us don’t want to be idealistic about computers because we’re afraid of being hurt. We’ve seen in our own lifetimes how idealism about the power of PCs, the Internet, social media, and all manner of other “phenomena surrounding computers” curdles into disappointment and cynicism. We don’t want to invest hope and be let down again.

I do not believe people will embrace friction without idealism. Discussions of computing and friction often have an overly puritanical dimension. You have fallen prey to the sins of lust, sloth, envy, and greed — and will be thrown into hellfire by an angry God. Tech critics always concede the argument from the get-go when they frame friction as taking away your toys. There is never any real consideration of why people would willingly accept restrictions.

Moreover, as I said earlier, the argument that interfaces with more friction are inherently more bound to reality runs into the alienating nature of the general-purpose digital computer itself. Computers aren’t totally without meaning because technical objects as a whole will never be entirely free-floating things. But any interface — especially one as spartan and austere as the CLI — is not going to connect you to much of anything without some kind of grounding in culture and tradition.

I want the traditions I have come to appreciate so much to survive. What I’ve realized now is that standing for those beliefs is more than just choosing to run X or Y system.

It also means pushing myself to be a better and more knowledgeable user of it, and of my electronic devices in general. Adopting Linux and assimilating the web of values and traditions I mention helped make me aware of the choices I make in the devices I use and the way I organize my living environment. It also gave me hope of one day altering them to better reflect my needs — and learning more about the things around me to see new possibilities beyond what I need and want today.

I do not presume committing to the quest I allude to will instrumentally benefit me, make me look cooler to others, or even give me the ability to explain to someone else why their fresh OS installation has gotten fucked up. All of those things would be nice, but they do not matter the most to me. I owe the great web of tradition behind UNIX, GNU, and Linux a great deal. The least I can do to pay the people behind it back is to focus my energies on learning about what they created.

I frequently disagreed with Stephenson’s arguments. As this post was already long enough, I’ve spared the reader other gripes and nitpicks I have. However, I still immensely enjoyed reading it because it clarified a commitment I’ve been struggling to fully understand and take on. The most you can expect from any author is to give you something valuable you can take away for yourself. If you’ve read this far, I certainly hope I at least have done the same for you.

I will write some other time about the specific nature of the commitments I want to make, but for now I’m logging out of the session.

Footnotes

  1. This is a reference to H.G. Wells’s The Time Machine.

  2. I am not talking about Clausewitzian friction here, but it is still relevant.

  3. Captain Haddock is one of history’s greatest “verbal abusers.”