Relying on AI without losing yourself

Andrej Karpathy tossed out the phrase “intelligence brownout” to describe what happens when AI systems go down. It’s one of those terms that sounds slightly ridiculous and yet weirdly spot-on. Because I’ve felt it. Not a total blackout — I don’t suddenly forget how to code or write — but definitely a flicker. Productivity stutters, confidence wobbles, and I find myself staring at my screen like someone just unplugged a part of my brain.

The most recent reminder came on June 10, 2025, when ChatGPT went down for over 10 hours. Google searches for “OpenAI status” became the second most popular query worldwide, with over 500,000 searches. OpenAI later revealed the cause: a network connectivity issue on GPU nodes, triggered by a routine operating system update. A surprisingly mundane technical failure for something that felt like losing a piece of cognitive infrastructure.

The question is: is this dependency actually bad? Or is it just the normal price of progress? Humans have always leaned on tools. Fire, calculators, compilers, Google — each one a crutch that let us offload something annoying or difficult. Maybe AI is just the next step. Or maybe it’s different. Maybe, for the first time, we’re not just outsourcing work, but thinking itself.

That’s why brownouts are so unsettling. They don’t just reveal fragile infrastructure; they expose fragile humans.

The lazy dev’s dream (and nightmare)

Before AI coding assistants, debugging was this messy ritual: Google the error message, skim Stack Overflow threads, open twelve tabs of docs, copy-paste code until something half-worked. Ugly, yes. But it forced you to actually understand what was happening. You had to learn, whether you liked it or not.

Enter AI. Now I can just… explain the issue in plain English. The model suggests fixes, writes the code, handles syntax, edge cases, boilerplate. I don’t need ten tabs open. I don’t even need to remember the exact method signature. It feels like having a senior dev permanently on call — patient, tireless, and weirdly confident.

But here’s the rub: sometimes that “senior dev” is drunk. When I was learning frontend, I leaned hard on AI. I’d ask for a feature and get back massive 300-line code dumps. I didn’t know enough to question it, so I shipped it. Months later, when I finally had enough frontend fluency, I realized my codebase was a swamp of redundant dependencies and sloppy patterns. Cleaning it up was like wading through molasses. AI hadn’t saved me time; it had deferred the pain.

That’s the nightmare scenario: you get speed today at the cost of long-term debt you don’t even see piling up.

And yet, AI has also been my teacher. I had no clue Redis caching or background tasks were even a thing until the model casually introduced them. That single nudge opened an entirely new corner of backend development for me. It was like having a mentor who occasionally drops knowledge bombs you didn’t know you needed.
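The caching idea the model nudged me toward is the classic cache-aside pattern: check the cache first, fall back to the slow source, and store the result for next time. Here's a minimal sketch of that pattern; note that a plain dict stands in for Redis so it runs without a server, and `get_user_profile` and its fake database lookup are purely illustrative names, not anything from a real codebase.

```python
import time

# A dict stands in for Redis here so the sketch runs standalone;
# real code would swap in a redis.Redis client (get/setex) instead.
_cache: dict[str, tuple[float, str]] = {}

def cache_get(key: str, ttl_seconds: int = 60):
    """Return the cached value if present and not expired, else None."""
    entry = _cache.get(key)
    if entry is None:
        return None
    stored_at, value = entry
    if time.time() - stored_at > ttl_seconds:
        del _cache[key]  # expired: evict and treat as a miss
        return None
    return value

def cache_set(key: str, value: str) -> None:
    """Store a value alongside the time it was cached."""
    _cache[key] = (time.time(), value)

def get_user_profile(user_id: int) -> str:
    """Cache-aside: try the cache first, fall back to the slow path."""
    key = f"user:{user_id}"
    cached = cache_get(key)
    if cached is not None:
        return cached  # cache hit: skip the expensive lookup
    value = f"profile-for-{user_id}"  # stand-in for a slow DB query
    cache_set(key, value)
    return value
```

The point isn't the ten lines of code; it's that once the model named the pattern, I could dig into eviction, TTLs, and invalidation on my own, which is the mentor mode of the tool rather than the autopilot mode.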

So which is it — crutch or mentor? Honestly, both. AI can either make you smarter or lazier. It depends on whether you treat it as autopilot or as a co-pilot. If you just accept whatever it spits out, you’ll atrophy. If you let it point the way and then dig deeper yourself, you’ll level up faster than you thought possible.

Fragility is not just slower typing

The brownout idea really lands when you think about fragility. Outages only matter if they hurt. And they do. If Copilot vanished tomorrow, I’d still write code — but slower, crankier, with a lot more caffeine. If ChatGPT disappeared, I could still figure things out — but hours would replace minutes.

But the real risk isn’t just slower typing. Fragility shows up when you no longer trust your own judgment without the tool. If every architecture decision, every bug fix, every naming choice needs AI to validate it, then the outage isn’t just inconvenient — it’s destabilizing. You’re left second-guessing yourself, because you haven’t been exercising those muscles.

Developers don’t lose productivity first; they lose confidence. And confidence is what lets you tackle unfamiliar problems, make trade-offs, and actually own your work. Brownouts sting not because we type slower, but because we realize we’ve quietly been renting judgment instead of building it.

So, should we deliberately practice without AI? Like pilots doing manual drills when autopilot fails? Maybe. But resilience doesn’t come from artificial drills alone. It comes from building enough independence that, when the tool flickers, you can still think clearly. If without AI you’re completely paralyzed, you’re fragile. If you can still grind through — slower, clumsier, but still moving — you’re resilient enough.

Not just another tool

A lot of people wave off these concerns by saying AI is just another step in the tool chain: calculators, compilers, Google. But that comparison is lazy. Calculators took over arithmetic. Compilers took over translation. Google took over retrieval. None of those took over reasoning.

AI is different. It doesn’t just fetch or compute; it suggests, reasons, weighs trade-offs. It doesn’t just give you an answer; it frames the question itself. That’s why a brownout feels so personal. It’s not just that a tool went down. It’s that part of your externalized thinking went dark.

And here’s the deeper question: what does that do to our identity as developers? A craft is partly defined by the parts you can’t outsource. If debugging intuition, naming, problem decomposition — the actual thinking parts — are consistently handed to AI, then what exactly is “yours” in the work? When the lights flicker, you notice which parts of your craft you still own, and which parts you’ve quietly outsourced.

The danger isn’t that AI literally steals our intelligence. The danger is that we confuse its convenience with our capability. We stop flexing our own judgment because the machine is so damn quick at flexing for us. And then, when the machine stutters, we realize we’ve let parts of our identity atrophy.

The cultural divide

Not everyone even notices these brownouts. I know plenty of people who barely touch AI tools. For them, outages are irrelevant. At most, they might complain when a service they use hiccups because some hidden AI dependency failed.

Meanwhile, AI-native users — devs glued to Copilot, students writing essays with ChatGPT, marketers drafting campaigns — feel like their entire workflow just flatlined. That gap is striking. For one group, brownouts are a mild inconvenience. For the other, they’re existential.

Picture an old-school sysadmin who still writes Bash scripts from muscle memory. For them, AI downtime is a non-event. Contrast that with a junior dev who has never written a function without AI assistance in their life. For them, the brownout feels like a panic attack. The same outage reveals a massive cultural divide in how people experience their own skills.

That divide won’t last. As AI seeps deeper into everything, even people who claim they “don’t use AI” will feel it. Their apps, their tools, their workplaces — all of it will quietly depend on AI in ways they don’t see. Today, brownouts mostly hit early adopters. In a few years, they’ll hit everyone.

Living with the flicker

So where does that leave us? I don’t think AI dependency is bad. It’s inevitable. Humans have always extended themselves with tools. What’s different now is that we’re extending not just our muscles or memory, but our reasoning. That makes outages feel personal — like someone dimmed the lights in your own head.

But dependency only turns toxic when we stop thinking for ourselves. The goal isn’t to shun AI or larp as some heroic “manual-only” purist. The goal is to draw the line: let AI accelerate the grunt work, but don’t outsource the critical thinking that defines your craft. Brownouts are reminders of where we’ve blurred that line. They flicker the lights just long enough to show us what we’ve offloaded.

So here’s the uncomfortable question to leave hanging: if you can’t code without AI, are you really coding? Or are you just typing prompts? The answer to that question is what separates resilience from fragility. And the next time the lights flicker, you’ll find out which side you’re on.