1️⃣ Software 3.0: English as Code. Karpathy reframes software’s evolution in three eras:
Software 1.0: Hand-coded logic.
Software 2.0: Trained models; neural net weights are the program.
Software 3.0: You program in English. Prompts are the code.
Everyone who can write a clear sentence is, in theory, a coder now.
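To make the contrast concrete, here’s a minimal sketch of the same toy task written in each era’s style. The sentiment example and the call_llm helper are my own illustrative assumptions, not code from the talk.

```python
# Toy sketch of Software 1.0 / 2.0 / 3.0 on one task (sentiment).
# call_llm() is a hypothetical placeholder for whatever hosted model you use.

def call_llm(prompt: str) -> str:
    """Placeholder for a hosted model call (OpenAI, Anthropic, etc.)."""
    raise NotImplementedError

# Software 1.0: hand-coded logic. The programmer writes the rules.
def sentiment_1_0(review: str) -> str:
    positive = {"great", "love", "excellent"}
    return "positive" if positive & set(review.lower().split()) else "negative"

# Software 2.0: a trained model. The learned weights are the program;
# the "source" is the dataset and training loop (elided here).
# sentiment_2_0 = train_classifier(labeled_reviews)

# Software 3.0: English is the program. The prompt is the code.
def sentiment_3_0(review: str) -> str:
    prompt = f"Classify this review as positive or negative, one word only:\n{review}"
    return call_llm(prompt)
```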
2️⃣ LLMs aren’t Utilities - they’re Operating Systems. Karpathy’s most powerful framework: we’re in the ✨ mainframe era of AI ✨
The 1960s OS world had:
▪️Expensive, centralized compute. Few owned mainframes, many shared them.
▪️Time-sharing. Jobs batched, users were thin clients.
▪️Command-line interfaces. No GUI, just terminals.
▪️Remote access. The computer lived in a data center, users dialed in.
In LLMs today? Same story.
▪️Massive, costly, cloud-native. Nobody runs GPT-4 locally.
▪️Thin clients. We pipe requests via browser or API.
▪️No AI GUI yet. We’re typing into terminals (ChatGPT).
We’re pre-personal computer. Someone still has to build the AI equivalent of the desktop, the mouse, the spreadsheet.
3️⃣ Partial Autonomy + The Autonomy Slider. Karpathy’s Tesla experience taught him what happens between flashy demos and reliable autonomy: a decade of boring, hard work. In 2013, he rode in a Waymo car that handled 30 minutes of Palo Alto driving perfectly. The demo worked. It’s 2025. We’re still debugging self-driving at scale.
The same is true for AI agents. The opportunity is augmenting people with AI “Iron Man suits,” not replacing them with Iron Man robots. Cursor and Perplexity are early examples of where this is going.
▪️They package context, orchestrate multiple LLM calls, and give users GUIs to audit AI output.
▪️They offer an autonomy slider - letting humans choose how much control to give up (a rough sketch of this follows below).
The future is co-pilot software - where humans steer, AI assists, and the feedback loop is fast.
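As a rough illustration of what an autonomy slider could mean in code, here is a minimal sketch; the three levels and the approval callback are hypothetical, not Cursor’s or Perplexity’s actual design.

```python
from enum import Enum
from typing import Callable

class Autonomy(Enum):
    SUGGEST = 1      # AI proposes; the human applies every change by hand
    REVIEW_DIFF = 2  # AI drafts the change; the human audits a diff, then applies
    AUTO_APPLY = 3   # AI applies directly; the human only spot-checks afterwards

def apply_proposal(proposal: str,
                   level: Autonomy,
                   human_approves: Callable[[str], bool]) -> bool:
    """Gate an AI-proposed change behind the chosen autonomy level.

    Returns True if the change should be applied now.
    """
    if level is Autonomy.AUTO_APPLY:
        return True                      # full autonomy: no human in the loop
    if level is Autonomy.REVIEW_DIFF:
        return human_approves(proposal)  # human audits before anything lands
    return False                         # SUGGEST: never auto-apply, only show it

# Example: dial autonomy up or down per task.
if __name__ == "__main__":
    ok = apply_proposal("rename foo() to bar()", Autonomy.REVIEW_DIFF,
                        human_approves=lambda p: input(f"Apply '{p}'? [y/N] ") == "y")
    print("applied" if ok else "left as a suggestion")
```

The point is not these specific levels but that the human decides, per task, where on the slider to sit, and can audit what the AI did before it counts.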
4️⃣ Docs and infra need to meet AI halfway. Today’s software is built for humans and APIs. Tomorrow’s needs to be legible to agents:
▪️Ditch “click here” instructions; give agents a curl command they can run instead.
▪️Replace PDFs with agent-friendly Markdown.
▪️Build tooling that packages context so LLMs don’t fumble their way through HTML and menus (see the sketch below).
We need to design for a new consumer: not just people, not just code, but people-like machines.
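Here is a minimal sketch of what “packaging context” might look like, assuming the site publishes a Markdown mirror of its docs; the /docs.md path and the call_llm helper are illustrative assumptions, not a real API.

```python
import urllib.request

def call_llm(prompt: str) -> str:
    """Placeholder for whatever hosted model you call."""
    raise NotImplementedError

def load_agent_docs(base_url: str) -> str:
    """Fetch the agent-friendly Markdown mirror instead of scraping the HTML site."""
    with urllib.request.urlopen(f"{base_url}/docs.md") as resp:
        return resp.read().decode("utf-8")

def answer_from_docs(question: str, base_url: str) -> str:
    """Hand the model clean context so it never has to fumble through menus."""
    docs = load_agent_docs(base_url)
    return call_llm(f"Using only these docs:\n\n{docs}\n\nAnswer: {question}")
```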
We’re in AI’s mainframe era. The personal computing revolution will come. The job now is to build what comes between. And in the meantime, I guess we’ll keep typing into our terminals and hoping the prompt does what we meant.
This post was originally shared on LinkedIn.