Sympathy for the Machine

The web might have started with C-based HTTP servers and the occasional Perl cgi-bin script, but its explosive growth was enabled only by software that made it easier to build things people want: PHP, Python, Ruby, and JavaScript. All of these are definitively bad languages if you are a mathematician or a programming language theorist or a computer, but the companies they enabled are the intro credits to Silicon Valley. As the web got more complicated, tools like jQuery, Bootstrap, and Sass smoothed out the grunt work so developers could build the Web 2.0 apps that everybody wanted. Time and time again, technology evolved in increasingly complex ways and humans developed abstractions to fit that complexity inside the human context window.

This pattern — machine complexity abstracted for human convenience — has been the defining story of software for decades. It was never the only story.

Ten years ago I started a company with some of the smartest people I’ve ever worked with. They introduced me to this notion of mechanical sympathy: what happens when you write software that’s aware of the underlying hardware it executes on? An easy example is thinking about how slabs are allocated in Redis: cache data misaligned to memory boundaries can lead to dramatic slowdowns that aren’t obvious when you’re writing code in JavaScript. When people talk about the ironic gap between how fast computers are today and how slow they sometimes feel, most of the time they’re talking about how an endless stack of abstractions and virtual machines has fully divorced the user experience from the underlying hardware. Mechanical sympathy was the counterargument: that the most powerful software comes not from abstracting the machine away, but from respecting how it actually works.
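To make the alignment point concrete, here’s a minimal Rust sketch. It assumes a 64-byte cache line (common on x86-64, but not guaranteed everywhere), and the struct names are illustrative, not from any real codebase; it shows how you can ask the compiler for cache-line alignment instead of leaving layout to chance:

```rust
use std::mem::{align_of, size_of};

// Default layout: the compiler chooses padding and may reorder fields.
#[allow(dead_code)]
struct Loose {
    flag: bool,
    count: u64,
}

// Pinned to an assumed 64-byte cache line, so this hot counter never
// shares a line with unrelated data (avoiding false sharing between cores).
#[repr(align(64))]
struct CacheAligned {
    count: u64,
}

fn main() {
    println!("Loose: size={} align={}", size_of::<Loose>(), align_of::<Loose>());
    println!(
        "CacheAligned: size={} align={}",
        size_of::<CacheAligned>(),
        align_of::<CacheAligned>()
    );
    // The aligned struct is padded out to a full 64 bytes.
    assert_eq!(align_of::<CacheAligned>(), 64);
    assert_eq!(size_of::<CacheAligned>(), 64);
}
```

The trade-off is exactly the one mechanical sympathy names: you spend memory (padding a u64 to 64 bytes) to buy predictable cache behavior, a cost-benefit decision that is invisible from inside a dynamic runtime.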

I thought, at the time, that this was a side-effect of over-education. I was wrong.

The world is being turned upside down as LLMs reshape everything we thought we knew about building software. The shift that matters most isn’t the one everyone’s talking about — it’s not “AI writes your code now.” It’s that developer convenience, the principle that drove thirty years of language and framework evolution, isn’t the bottleneck worth solving. The new bottleneck is knowability: can the behavior of a system be understood from its source code alone, without running it?

Dynamic languages and layers of abstractions compound into systems that have to be observed to be understood. The convenience of overriding method_missing in Ruby to do black magic metaprogramming made sense in 2010 when a human was the one holding all the context. An LLM reasoning about your codebase doesn’t benefit from your elegant runtime tricks — it benefits from strong types, explicit contracts, and compile-time guarantees that make the system’s behavior statically legible. Rust’s rigid approach to memory and its limited extensibility aren’t constraints anymore. They’re a signal the machine can actually use.
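The contrast can be sketched in a few lines of Rust. The trait and its methods here are hypothetical, invented purely for illustration; the point is that the “contract” is spelled out in the source, so a reader (human or LLM) knows every implementation’s shape without running anything:

```rust
// An explicit contract: every payment backend must provide these methods,
// and the compiler verifies it before the program ever runs.
trait PaymentBackend {
    fn charge(&self, cents: u64) -> Result<String, String>;
    fn refund(&self, receipt: &str) -> Result<(), String>;
}

struct TestBackend;

impl PaymentBackend for TestBackend {
    fn charge(&self, cents: u64) -> Result<String, String> {
        Ok(format!("receipt-{cents}"))
    }
    fn refund(&self, _receipt: &str) -> Result<(), String> {
        Ok(())
    }
    // Deleting either method above is a compile-time error -- the "missing
    // method" case that a Ruby method_missing hook would only surface at runtime.
}

fn main() {
    let backend = TestBackend;
    let receipt = backend.charge(500).unwrap();
    assert_eq!(receipt, "receipt-500");
    assert!(backend.refund(&receipt).is_ok());
}
```

A method_missing hook can answer calls the source never mentions, which is exactly what makes the system illegible without execution; the trait version moves that knowledge from runtime behavior into static text.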

This is mechanical sympathy for a new kind of machine. The original idea was about respecting the hardware your code runs on. The update is that there’s a new machine in the loop — one that reads, reasons about, and modifies your code — and it rewards a fundamentally different set of choices than the ones we’ve been making for human convenience. Strong compile-time guarantees are worth more than the convenience of one-liner abstractions when your most prolific collaborator can’t set a breakpoint.

I don’t think last-mile languages like TypeScript will ever really go away. I do suspect that Node as a server platform and other dynamic runtimes will wane in favor of strongly typed, memory-safe approaches like Rust, Go, and perhaps even more esoteric languages that trade developer convenience for machine legibility. In the browser, I wouldn’t be surprised to see the crushing complexity that React brings fade away like jQuery did a generation ago — not because a better framework replaces it, but because the abstraction layer it provides is solving for the wrong audience.

The relatively short history of programming as a career has always depended on a combination of arcane knowledge and abstractions to build things. Assemblers and compilers, punch cards and operating systems, network layers and protocols and web browsers — all of it designed to help humans understand what the machine is doing. For the first time, that’s not the only audience that matters. The machine is reading too, and it knows itself better than we do. The developers who internalize this — who build systems with sympathy not just for the hardware underneath but for the intelligence above — are the ones who will move the fastest.