Natural Language Might be the Last Programming Language
From a productivity standpoint, AI has been exceptional for me. But if I'm being honest, productivity isn't even the biggest impact AI has had on my work. The single biggest impact has been learning.
I'm learning more right now than at any other point in my career, and I've been at this for over 20 years.
That might sound like a throwaway line, but I mean it. I've always had technical gaps in my knowledge. The kind of stuff where you know something exists, you work around it every day, but you've never really sat down and understood its foundations. How things actually work under the hood. I've spent my career on the product side of the house, and while I'm an engineer at heart, there are layers of computer science I never properly explored.
AI has been exceptional at filling those gaps. Not just answering my questions, but actually teaching. Walking through concepts at my own pace. Connecting dots I didn't even know were disconnected.
The Entire History of Programming Is a Story of Abstraction
Recently I've been digging into the foundations of computer science. How computing actually works from the ground up. Binary, logic gates, instruction sets, memory architecture. All the stuff I sort of hand-waved through earlier in my career.
The thing that struck me most is how the entire history of programming is really a story of abstraction. Layer after layer, each one designed to move us further from the machine and closer to human intent.
We started with machine code. Literal 1s and 0s. Direct instructions to hardware. If you wanted a computer to do something, you had to speak its language, down to individual bits.
Then we created Assembly language. A thin layer on top of machine code that made it slightly more human-readable. Instead of raw binary, you could use short mnemonics. Still extremely close to the hardware, but a meaningful step toward something a person could reason about.
Then came compiled languages like C. You could write in something closer to English, and a compiler would translate it down to Assembly for you. You didn't need to know what registers to use or how the CPU handled branching. The compiler figured that out.
Then came scripting languages like Python and JavaScript, where an interpreter handles the translation at runtime. You don't think about Assembly at all anymore. You don't think about compilation. It just happens behind the scenes.
Each layer moved us further from the machine and closer to expressing what we actually want to happen.
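The progression is easier to see with the same tiny computation pictured at each layer. A minimal sketch, with the caveat that the machine-code and Assembly lines are illustrative stand-ins rather than a real instruction encoding:

```python
# The same "add two numbers" at four layers of abstraction.
# (The machine-code and Assembly lines below are illustrative
# stand-ins, not a real x86 encoding.)

# Machine code: raw bits the CPU decodes directly, e.g.
#   01001000 00000001 11011000

# Assembly: the same instruction as a mnemonic a human can read:
#   add rax, rbx

# C: the compiler picks the registers and emits the Assembly for you:
#   long add(long a, long b) { return a + b; }

# Python: the interpreter handles everything below this line at runtime.
def add(a, b):
    return a + b

print(add(2, 3))  # → 5
```

Each line of that descent expresses the same intent; the only thing that changes is how much of the machine you're forced to think about.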
Every Leap Made the Layer Below Invisible
Here's the part that really clicked for me. Every previous leap in this progression didn't just make the layer below easier to write. It made the layer below invisible.
Assembly didn't make machine code easier. It made you stop thinking about machine code entirely. You didn't write better binary. You stopped writing binary.
Compiled languages didn't make Assembly easier. They made Assembly something the computer handled for you. You could go your entire career writing C and never once look at the Assembly output.
Scripting languages didn't make compiled code easier. They made compilation something that just happens. Many JavaScript developers have never compiled anything by hand. They don't need to.
The pattern is consistent and unbroken. Each new layer doesn't improve the old one. It buries it.
Now We Have LLMs
Now look at what's happening with large language models. What are many of us using them for? Writing code.
We're using natural language to generate both compiled and scripted code. The same languages that were themselves abstractions over lower layers. We're talking to an AI in plain English, and it produces Python, JavaScript, React components, SQL queries, infrastructure configs. Whatever we need.
Like many people out there without a deep programming background, I've been doing what the internet has unfortunately decided to call "vibe coding" (the term still makes me a little queasy). I've been using natural language to generate code that I then review, test, debug, and deploy.
It works. Remarkably well. I've shipped things I never could have built on my own, in a fraction of the time a traditional development process would have taken.
But here's the question that won't leave me alone.
Why Isn't Natural Language the Programming Layer?
If the pattern holds, and it has held for the entire history of computing, then natural language shouldn't just be a convenient way to generate code. Natural language should be the programming layer itself. Everything beneath it should be handled automatically, the same way interpreters handle everything beneath scripting languages today.
Think about it. When you write Python, you don't look at the bytecode the interpreter generates. You don't review the Assembly. You don't check the machine code. You trust the layers beneath you to handle it.
So why are we using natural language to have AI write code that we then review, debug, and deploy? Why is the code still something we see and manage? If this is truly the next layer of abstraction, shouldn't the code become invisible too?
We're in this awkward middle phase right now. Natural language is the input, but code is still the artifact we interact with. We generate it, read it, tweak it, commit it. It's like if Assembly programmers had to review the machine code their assembler produced before running it. That's not a new abstraction layer. That's just a faster way to write the old one.
Code Becomes an Intermediate Artifact
A lot of people are talking about how humans won't write code in the not-too-distant future. I actually think it's more substantial than that.
I think most of us won't even be aware of the code.
The idea is that "code" as we know it becomes an intermediate artifact. Something generated and executed behind the scenes, not something humans need to see or manage. Like Assembly today, it's still there, it still runs, the machine still needs it. But it's not something most people ever interact with.
Maybe the code is generated in real time and disposable. You describe what you want, the system generates whatever instructions are needed, executes them, and the code evaporates. No repository. No version control. No deployment pipeline. Just intent in, outcome out.
That sounds radical, but is it really? That's exactly what happens when you run a Python script today. An interpreter generates bytecode, executes it, and you never see any of it. You don't version-control the bytecode. You don't deploy it. It's an intermediate artifact that exists only to serve the layer above it.
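You can peek behind that curtain in standard Python with the `dis` module. The bytecode is real, but nothing in a normal workflow ever surfaces it; and the same machinery that compiles your files can compile and run generated source that never touches disk. A small sketch (the `greet`, `source`, and `namespace` names are just illustrative):

```python
import dis

def greet(name):
    return f"Hello, {name}"

# The interpreter compiled greet() to bytecode the moment it was
# defined. Calling dis.dis() is the only reason you'd ever look at it:
dis.dis(greet)

# The same machinery works on generated source: compile a string to a
# code object, execute it, and discard both. No file, no repo, no
# deployment pipeline. Intent in, outcome out.
source = "result = 2 + 3"
namespace = {}
exec(compile(source, "<generated>", "exec"), namespace)
print(namespace["result"])  # → 5
```

That second half is, in miniature, the "disposable code" idea: the code object exists just long enough to produce the outcome, then evaporates.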
If natural language becomes the layer above code, then code takes on that same role. It becomes something the system needs internally, but not something humans need to manage.
What This Means for Product People
I think about this a lot from a product development perspective. If we're heading toward a world where the gap between intent and execution collapses, the skills that matter shift dramatically.
The ability to clearly articulate what you want becomes the core technical skill. Not syntax. Not frameworks. Not knowing which package manager to use. The ability to describe a desired outcome with enough precision and context that a system can execute it correctly.
That's product thinking. That's what product people have been doing their entire careers. Translating user needs into clear requirements. Defining what "done" looks like. Thinking through edge cases and tradeoffs. The difference is that today we hand those requirements to a team of engineers who translate them into code. Tomorrow, we might hand them directly to a system that handles the translation itself.
I'm not saying engineers go away. Far from it. Someone still needs to build and maintain the systems that make this abstraction possible. Just like someone still maintains compilers and interpreters today. But the number of people who need to understand code in order to build software products could shrink dramatically.
Are We Closer Than We Think?
I go back and forth on this. Some days I think we're five years away. Other days I think pieces of it are already here and we just haven't recognized the pattern yet.
The progression from machine code to Assembly took decades. Assembly to compiled languages, maybe another decade or two. Compiled to scripting, a similar timeframe. Each transition happened faster than the one before it.
The transition from scripting to natural language is happening right now, faster than any previous shift. The tools are improving month over month, not decade over decade. I'm doing things today that weren't possible six months ago. The trajectory is steep.
So here's the question I keep coming back to: is this the inevitable conclusion of a pattern that's been consistent since the dawn of computing? Or is this just an uninformed pipe dream from a product guy who's been spending too much time talking to AI?
I think it's the former. I think it's coming faster than most of us expect.