What we pay attention to changes the shape of our brain. Like water running through a landscape day after day after day, the neural pathways used in our work and routines cut familiar channels that give rise to the ecosystems of our mind. But paying attention is naturally selective. When we choose what to focus on, we are also choosing what to ignore.
I've been programming on and off since the 20th century. I see marked differences in myself, my behavior, perception, etc. when actively coding vs. not. It's finally time to investigate. What am I really paying attention to when programming? What are the implications?
And for the reader, what is your experience? Similar? Different? Or if you're not a programmer, how does your discipline shape your perception?
When programming, I am writing the future. Every line of code is an instruction to be carried out predictably, if all goes well, a billion times over. Bits of simple logic—compounded across programs stacked on giant stacks of programs—create complex systems of logical inevitabilities.
To accomplish this, every command I write exercises a tiny mental loop:
Does it work? I test my code and expect an immediate response from the compiler. If yes, I move on to the next bit of logic in my day's lineup. If no, I re-examine my underlying assumptions, breaking my code down into smaller and smaller bits. Every step of the way, my attention walks, as efficiently as possible, through a garden of forking yes/no paths. It's as if it were me, and not just an electric current, running through the logic gates of my computer's circuit board.
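That inner loop can be sketched in code. This is a toy illustration, not anyone's real workflow; the function names and the lineup of checks are invented for the example:

```python
# A toy version of the inner loop: run each bit of logic through its
# test, and when one says "no," that's where to break things down further.
def works(step):
    """Stand-in for testing one piece of logic."""
    return step() is True

def walk_the_paths(steps):
    """Walk a lineup of yes/no checks, stopping at the first failure."""
    for i, step in enumerate(steps):
        if not works(step):
            return i  # this gate said "no" — dig deeper here
    return None  # every gate said "yes"

# Usage: three bits of logic, the second one broken.
lineup = [lambda: 1 + 1 == 2, lambda: 2 * 2 == 5, lambda: True]
print(walk_the_paths(lineup))  # → 1
```

The point of the sketch is the shape of the attention, not the code itself: a strict sequence of binary questions, each answer routing you to the next.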
When our brain focuses its attention, neuromodulators amplify its activated neural circuits. Like a spotlight on stage, the "speaking" actor is lit up. And likewise, the rest of the stage is intentionally dimmed. We actively suppress competing areas of the brain
"with slow waves in the alpha frequency band (between eight and twelve hertz), which inhibit a circuit by preventing it from developing coherent neural activity."
What part of my brain am I suppressing when I spend all day coding? The easy and obvious answer would be the opposite of logic—something to do with emotion. There's certainly not much place for emotion in programming. That may be part of why some of us gravitate toward front-end development—to stay not so far from people. Or vice versa. But does programming really suppress emotion? Let's dig deeper.
I learned to code on LSD. Twenty-some years later, there are a few things I love about software development. But none more than the way it encourages and rewards mental plasticity through constant learning and unlearning.
We break systems apart in order to understand the imperative logic that glues them together. In the game of programming, the rules are entirely knowable… with enough digging. Somewhere, at some point in time, someone wrote them (and hopefully documented them). Debugging feels like an archeology of quirks, excavating programs layered atop older programs. There's a real joy in discovery when we unlock a path forward. Over time, this clarity develops into confidence, and we approach systems knowing that yes, if needed, we could do the work to understand any part of this. Fortunately, that's far from what's usually required; we rarely go deeper than the syntax and grammar of our immediate environment. But our environment is always changing, and so we must always be learning.
The second joy of programming: I can build something out of nothing—or more specifically, out of logic and electricity. I've built with stone, wood, ink, plastic, etc. The freedom afforded by code is unique and specific—a kind of scalable mental scaffolding. Logically valid patterns can repeat ad infinitum into some functional space, limited for the most part only by their utility.
For me, programming enhances this way of seeing, which carries over into other areas of my life. My chess game improves drastically when I'm actively coding. I see the risks and consequences of a position more clearly; things seem more obvious, inevitable. I can break an argument down into its key dependencies, find flaws, and put it all back together, efficiently optimized toward some solution. My mind begins to resemble the programs I work on.
The ability to build applications is empowering; but I lose power when I become just another program in the stack, one that people feed instructions and expect outputs from. I've seen it throughout my career. No one wants to be a code monkey. And yet, it's a natural consequence of specialization. Developers are inevitably busy writing code all day, not talking to people. Yes, we're a strange bunch to begin with, and, to follow the stereotype, likely more comfortable with computers than people anyway. But it seems our work reinforces our role as awkward, estranged mental laborers.
I'd thought I could hack life with the same ease I could hack a program. But life is not an abstract model of logic where outcomes are inevitable and predictable. From where I stand as a microscopic organism inside an unknown number of infinite universes, math can only solve for so much. Certainly, many of society's rules can be gamed. My heightened logic serves me well in certain domains. But, I struggle too—there's a price I pay for spending days programming.
It's difficult to be exact when talking about this, but I feel it strongly in three areas of my life.
This last point is a bit of a trope, so I want to expand. When my attention is zeroed in on programming, it's not emotion that my mind suppresses but, rather, my ability to listen, to really listen. I notice it when talking with my wife (with whom I have 10+ years of observed patterns). After a day of programming, I hear her “inputs” and can “output” the appropriate answer. But I lose touch with her human experience. What's not explicitly defined ceases to register. Unless I'm intentional about it, programming often degrades my capacity for empathy.
Empathy requires a different relationship to the unknown. It enables us to “hold space” for the valid, yet truly unknowable experience of another person. It saves us from projecting our ideas of ourselves, filled with expectations and judgments, into that space. That's not to say we don't try to understand or pattern-match. Or that we don't share our own experience in response. It's just that there's no equation that can solve for two people's lives, histories, genetics, etc. We try, of course, with derivatives, approximations, and metaphor. But this is where logic ends and art begins.
So what? Programming supports my way of life, my family. These side-effects are collateral damage. But I know that by seeking to understand them, I can at least begin to address negative impacts.
Perhaps this is just one man's experience and not something I should generalize. But my curiosity beckons: if what I'm writing about is an experience at all common among programmers, what are the implications?
This obviously applies to computer work in general; the physical constraints and proposed solutions are well-documented—good ergonomics, exercise, etc. But on the mental side, how do we ground ourselves during a day spent riding the proverbial lightning? Stretched across the course of a career, how does this contribute to programming being a “young person's game,” where individual contributors burn out as they get older and switch to manager roles?
Tech has a lossy view of the human experience. We are degraded to a few million data points. Emotion is gamified for attention, dollars, and worse. Human activity online can be easily modeled by a bot. It's common to marvel at how advanced bots and algorithms have become. But on the flip side, it reveals how shallow our models of the human experience online actually are, that a few automated clicks and some NLP can fool us.
As they say, we only improve what we measure. But we don't measure what we don't know. We don't talk about empathy because it's not a variable or a metric in any of our systems. So how can we make space for empathy in our programs as they continue to eat the world?
We're entering the terrain of science-fiction here, but what would it mean to program for empathy, to encode it in our systems? Here's one possible requirement: can our programs ever account for unknowable truths?
Perhaps we're not as far away as that sounds. Statistics is the discipline of math devoted to “equations with unknowns,” and machine learning is, in essence, programming with statistics. Rather than coding commands, it's coding with questions and showing the program how to answer with probable outcomes. One problem ML programmers struggle with: there's often no discernible logic to its black-box solutions. It can pass our tests without our understanding.
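A minimal sketch of what "coding with questions" means, assuming nothing beyond the standard library (the data and function names here are invented for illustration; real ML systems are vastly more complex). Note that the rule itself never appears in the code—only labeled examples and a similarity measure:

```python
import math

def nearest_label(examples, point):
    """Answer with the label of the most similar known example,
    rather than with a hand-coded rule."""
    closest = min(examples, key=lambda ex: math.dist(ex[0], point))
    return closest[1]

# "Showing" the program examples instead of commanding it:
# a few labeled points, no explicit rule for what makes an "a" or a "b".
examples = [((1, 3), "a"), ((2, 5), "a"), ((4, 1), "b"), ((6, 2), "b")]
print(nearest_label(examples, (5, 1)))  # → "b"
```

Even in this toy, the program answers a question it was never explicitly told how to answer—a faint echo of why larger models can pass our tests while their internal "reasoning" stays opaque to us.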
What if that is a feature, not a bug? On the path to artificial intelligence, perhaps we must leave space in our programs for something valid, yet unknowable. Of course, that space is not empathy itself. Not even close. Our algorithms still reflect and institutionalize our biases. We are more than pure logic machines. And now, there's a space in our computers that appears to be more than human logic. What if, in a (sad) twist of fate, we learn to respect the unknown in our programs, that respect scales up into our platforms, and, on a global scale, we finally learn to respect the unknown in each other?
A hacker can dream. In the meantime, I'll try my darnedest to program with empathy—i.e., to think about people not just in terms of my software, and as more than just "users." When I pay attention to what I pay attention to, I notice I start to change the shape of my reality.
Thanks for reading! I'm a product engineer, writing about how to be a human in a computer world. This site is my digital garden. Explore, enjoy. My mailbox is always open.