The MLnotes Newsletter

The Hottest New Programming Language is - English

Insights from Andrej Karpathy's recent keynote

Jun 25, 2025

"Software is changing again, and I think it's changing quite fundamentally. I think roughly speaking, software has not changed much on such a fundamental level for 70 years, and then it's changed I think about twice quite rapidly in the last few years." - Andrej Karpathy

These words from Andrej Karpathy, former director of AI at Tesla, set the stage for a profound transformation in the world of software development. His recent keynote talk at the Startup School in San Francisco is super inspiring.

As AI engineers, data scientists, and tech professionals, we find ourselves at the forefront of this revolution. But with great power comes great responsibility, and the specter of AI risk looms large over our exciting new frontier.

The Evolution of Software: From 1.0 to 3.0

Software 1.0: The Traditional Paradigm We All Know

For decades, software development meant writing explicit instructions for computers in languages like C++, Python, or Java. Karpathy refers to this as "Software 1.0" – the foundation of our digital world.

Software 2.0: The Rise of Neural Networks

The advent of deep learning ushered in the era of "Software 2.0." Instead of writing explicit code, we began training neural networks, effectively programming through data and optimization algorithms. This shift marked a significant departure from traditional software development practices.

Source: Andrej Karpathy

Software 3.0: Programming in Natural Language

Now, we stand at the precipice of "Software 3.0" – a paradigm where we program Large Language Models (LLMs) using natural language prompts. This revolutionary approach democratizes programming, but it also introduces new challenges and potential AI risks that we must carefully navigate.
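
To make the shift concrete, here is a toy sketch, written for this newsletter rather than taken from the talk, of the same task (classifying a review as positive or negative) in all three paradigms. The keyword lists, training data, and prompt are illustrative assumptions, and a tiny learned model stands in for a full neural network.

```python
# One task, three software paradigms: sentiment classification.

# --- Software 1.0: explicit rules written by a human --------------------
def sentiment_1_0(text: str) -> str:
    positives = {"great", "love", "excellent"}
    negatives = {"terrible", "hate", "awful"}
    words = set(text.lower().split())
    score = len(words & positives) - len(words & negatives)
    return "positive" if score >= 0 else "negative"

# --- Software 2.0: the "program" is learned from data -------------------
# (weights found by an optimizer rather than written by hand)
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

train_texts = ["great movie", "I love it", "terrible plot", "I hate this"]
train_labels = ["positive", "positive", "negative", "negative"]

vectorizer = CountVectorizer()
clf = LogisticRegression().fit(vectorizer.fit_transform(train_texts), train_labels)

def sentiment_2_0(text: str) -> str:
    return clf.predict(vectorizer.transform([text]))[0]

# --- Software 3.0: the "program" is an English prompt -------------------
# (sent to an LLM at runtime; the prompt below is the entire program)
SENTIMENT_PROMPT = """You are a sentiment classifier.
Answer with exactly one word, 'positive' or 'negative'.
Review: {review}"""

print(sentiment_1_0("I love this great movie"))
print(sentiment_2_0("terrible plot"))
print(SENTIMENT_PROMPT.format(review="I love this great movie"))
```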

Source: Andrej Karpathy

LLMs: The New Operating Systems

Why LLMs Are More Than Just Another Tool

Karpathy draws a compelling analogy between LLMs and operating systems:

"LLMs don't only have properties of utilities. I think it's also fair to say that they have some properties of fabs1, and the reason for this is that the capex2 required for building LLM is actually quite large."

This perspective shifts our understanding of LLMs from mere tools to fundamental infrastructure. Just as operating systems provide a platform for applications, LLMs are becoming the foundation for a new generation of AI-powered software.

The Utility-Like Nature of LLM Providers

AI is the new electricity. - Andrew Ng

LLM providers like OpenAI, Google (with Gemini), and Anthropic are emerging as utility-like entities. They invest heavily in infrastructure (akin to power plants) and offer metered access to their intelligence via APIs. This utility model introduces new considerations for AI safety and regulation, as we become increasingly dependent on these "intelligence grids."
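
What "metered access to intelligence" looks like in code: a minimal sketch assuming the OpenAI Python client, with a placeholder model name and made-up per-token prices (Anthropic and Google expose similar token-metered APIs).

```python
# Minimal sketch of utility-style, token-metered access to an LLM.
# Assumes `pip install openai` and OPENAI_API_KEY in the environment;
# the model name and per-token prices below are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize Software 3.0 in one sentence."}],
)

usage = response.usage  # the "meter": tokens in, tokens out
PRICE_IN, PRICE_OUT = 0.15 / 1e6, 0.60 / 1e6  # $/token, made-up numbers

cost = usage.prompt_tokens * PRICE_IN + usage.completion_tokens * PRICE_OUT
print(response.choices[0].message.content)
print(f"prompt={usage.prompt_tokens} completion={usage.completion_tokens} "
      f"~${cost:.6f} billed for this call")
```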

The Psychology of LLMs: Understanding Our New Digital Colleagues

LLMs as "People Spirits"

Source: Andrej Karpathy

Karpathy introduces a fascinating concept: LLMs as "people spirits" – stochastic simulations of human-like intelligence. This anthropomorphic view helps us understand both the strengths and limitations of these systems.

Cognitive Quirks and Limitations

While LLMs possess superhuman capabilities in certain areas, they also exhibit cognitive deficits:

  • Hallucinations and confabulation

  • Jagged intelligence (excelling in some areas while failing at simple tasks)

  • Lack of persistent memory

Understanding these limitations is crucial for mitigating AI risk and designing effective human-AI collaboration systems.
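
The lack of persistent memory, in particular, is easy to demonstrate: chat APIs are stateless, so the model only "remembers" whatever the application resends in the context window. A minimal sketch, again assuming the OpenAI Python client and a placeholder model name:

```python
# Each API call is stateless: the model only sees the messages you send.
# To give it "memory", the application must resend the conversation history.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # placeholder model name

history = [{"role": "user", "content": "My name is Priya."}]
first = client.chat.completions.create(model=MODEL, messages=history)
history.append({"role": "assistant", "content": first.choices[0].message.content})

# Without the history, the model has no idea who is asking:
forgetful = client.chat.completions.create(
    model=MODEL, messages=[{"role": "user", "content": "What is my name?"}]
)

# With the history replayed, the "memory" comes back:
history.append({"role": "user", "content": "What is my name?"})
remembering = client.chat.completions.create(model=MODEL, messages=history)

print("stateless call:", forgetful.choices[0].message.content)
print("with history:  ", remembering.choices[0].message.content)
```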

Designing LLM Apps with Partial Autonomy

The Autonomy Slider: Finding the Right Balance

Karpathy introduces the concept of an "autonomy slider" in LLM applications. This allows users to control the level of AI involvement, from simple autocomplete to full-fledged autonomous agents. Striking the right balance is key to maximizing productivity while maintaining human oversight.
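
Here is a hypothetical sketch of what such a slider can look like inside an application; the level names and gating rules are illustrative, loosely inspired by the autocomplete-to-agent range Karpathy describes in tools like Cursor, not any tool's actual API.

```python
# Hypothetical "autonomy slider" for an LLM coding assistant.
from enum import IntEnum

class Autonomy(IntEnum):
    AUTOCOMPLETE = 1    # suggest the next few tokens only
    EDIT_SELECTION = 2  # rewrite the code the user highlighted
    EDIT_FILE = 3       # rewrite a whole file
    AGENT = 4           # plan and edit across the repository

def allowed_scope(level: Autonomy) -> str:
    """Map the slider position to the largest change the AI may propose."""
    return {
        Autonomy.AUTOCOMPLETE: "a few tokens at the cursor",
        Autonomy.EDIT_SELECTION: "the selected region",
        Autonomy.EDIT_FILE: "one file",
        Autonomy.AGENT: "any file in the repo (with human review)",
    }[level]

def propose_change(level: Autonomy, request: str) -> str:
    # In a real app this would call an LLM; here we just report the leash length.
    return f"AI may touch {allowed_scope(level)} to satisfy: {request!r}"

if __name__ == "__main__":
    for level in Autonomy:
        print(level.name, "->", propose_change(level, "rename this function"))
```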

Cursor as a Model for AI-Assisted Coding

The Cursor app exemplifies effective LLM integration in software development:

  1. Context management

  2. Orchestration of multiple LLM calls

  3. Application-specific GUI for easy human auditing

  4. Flexible autonomy levels

By studying successful applications like Cursor, we can derive best practices for designing LLM-powered tools that enhance productivity without compromising on AI safety.
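
As a rough, entirely hypothetical skeleton (not Cursor's actual architecture), here is how the first three ingredients can fit together in code; the autonomy slider from the earlier sketch would then gate how large a change the orchestration step may propose.

```python
# Hypothetical skeleton of a Cursor-style LLM app: context management,
# orchestration of multiple model calls, and a diff the human can audit quickly.
import difflib

def call_llm(prompt: str) -> str:
    """Stand-in for a real chat-completion API call; returns a canned reply."""
    return f"# edited by LLM\n{prompt.splitlines()[-1]}"

def gather_context(files: dict[str, str]) -> str:
    # 1. Context management: real apps use embeddings/retrieval to pick
    #    relevant snippets; this toy version just concatenates everything.
    return "\n\n".join(f"# {path}\n{src}" for path, src in files.items())

def orchestrate(files: dict[str, str], request: str) -> tuple[str, str]:
    # 2. Orchestration: chain several specialized model calls (plan, then edit).
    context = gather_context(files)
    plan = call_llm(f"Plan an edit for: {request}\n{context}")
    path, old_src = next(iter(files.items()))
    new_src = call_llm(f"Apply the plan to {path}.\n{plan}\n{old_src}")
    return path, new_src

def render_diff(path: str, old: str, new: str) -> str:
    # 3. Auditable "GUI": here just a unified diff; Cursor renders this visually.
    return "\n".join(difflib.unified_diff(
        old.splitlines(), new.splitlines(), fromfile=path, tofile=path, lineterm=""))

if __name__ == "__main__":
    files = {"app.py": "def greet():\n    print('hi')"}
    path, new_src = orchestrate(files, "make greet() take a name argument")
    print(render_diff(path, files[path], new_src))
```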

The Human-AI Collaboration Loop Is Super Important

Speeding Up the Verification Process

To truly leverage the power of LLMs, we need to optimize the human-AI collaboration loop. Karpathy emphasizes two key aspects (see the sketch after this list):

  1. Speeding up verification through effective GUIs

  2. Keeping AI "on a leash" to prevent overwhelming the human collaborator
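
A minimal, hypothetical sketch of both points together: the AI proposes one small, reviewable diff at a time (the leash), and the human verifies each one through a fast accept/reject step (the "GUI" here is just a terminal prompt). The function names are illustrative.

```python
# Hypothetical human-in-the-loop verification cycle: the AI proposes one small,
# reviewable change at a time, and nothing is applied without human approval.
import difflib

def propose_patch(source: str, instruction: str) -> str:
    """Stand-in for an LLM call that returns an edited version of `source`."""
    return source.replace("TODO", f"DONE ({instruction})")

def review_loop(source: str, instructions: list[str]) -> str:
    for instruction in instructions:
        candidate = propose_patch(source, instruction)
        diff = "\n".join(difflib.unified_diff(
            source.splitlines(), candidate.splitlines(), lineterm=""))
        print(f"\nProposed change for: {instruction}\n{diff or '(no change)'}")
        # Fast verification step: the human stays in the loop on every change.
        if input("Apply this change? [y/N] ").strip().lower() == "y":
            source = candidate  # one approved step at a time keeps the AI on a leash
    return source

if __name__ == "__main__":
    code = "def handler():\n    # TODO: validate input\n    pass"
    print(review_loop(code, ["add input validation"]))
```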

Lessons from Tesla's Autopilot Development

Karpathy shares insights from his experience with Tesla's Autopilot:

"We did more and more autonomous tasks for the user, and maybe the story that I wanted to tell very briefly is actually the first time I drove a self-driving vehicle was in 2013... This drive was perfect. There were zero interventions, and this was 2013, which is now 12 years ago."

Source: Andrej Karpathy

This anecdote illustrates the long journey from impressive demos to real-world deployment, highlighting the need for patience and rigorous testing in AI development.

Democratizing Programming in the AI Era

Everyone Is Now a Programmer

The advent of natural language programming through LLMs has democratized software development. Karpathy coins the term "vibe coding" to describe this phenomenon:

"Suddenly, everyone is a programmer because everyone speaks natural language like English. This is extremely bullish and very interesting to me and also completely unprecedented."

Building a Custom iOS App Without Swift Knowledge

Karpathy shares his experience of creating an iOS app using vibe coding:

"I built this iOS app, and I don't actually know how to program in Swift, but I was really shocked that I was able to build like a super basic app... This was just like a day of work, and this was running on my phone like later that day."

This example demonstrates the transformative potential of LLM-assisted programming, but it also raises questions about job security for traditional developers and the need for new skills in the AI era.

Building for Agents: Creating Future-Ready Digital Infrastructure

Making Our Digital World LLM-Friendly

As LLMs become more prevalent, we need to adapt our digital infrastructure to be more "LLM-friendly." Karpathy suggests several approaches:

Keep reading with a 7-day free trial
