Preparing children for a post-scarcity world

Practical skills for a future where supply is infinite

Last week I wrote about the question we should stop asking our kids. The response surprised me. Thousands of parents felt the same unease and asked the same follow-up:

Okay, but what do we do instead?

I don't have a complete answer. Nobody does. But I've been experimenting with my own kids, and I've landed on a few things that feel right.


The skills that actually matter now

Every parenting article about AI says the same thing: teach creativity, empathy, critical thinking. It's not wrong. It's just vague.

Here's what I think it actually means in practice:

Taste

When anyone can produce anything, the ability to judge quality becomes rare.

Kids are already using AI for school projects. The first drafts are always... fine. Competent. Generic. The skill isn't getting the AI to produce something. It's knowing when the output is good enough and when it isn't.

The question worth asking isn't "is this correct?" That's easy to verify. It's "is this good?" That's harder. That's taste.

Practical version: When you watch something together, ask what worked and what didn't. When they make something, ask what they'd change. Make evaluation a habit.

Wanting

This sounds strange, but most people have never learned to want well.

We're trained to ask "what can I get?" or "what's realistic?" Those are supply questions. The demand question is different: "What do I actually want to exist in the world?"

A kid says they want to "help animals." The old response: "Great, maybe you could be a vet someday."

Better question: "What specifically about animals? What problem would you solve if you could solve any?"

Watch what happens. They start talking about factory farming, or ocean pollution, or shelter dogs nobody adopts. That's not a career path. It's a vision.

Practical version: Don't ask what they want to be. Ask what they want to build, fix, or change. Then take it seriously.

Orchestration

Knowing how to do things matters less than knowing what to do.

I've seen kids who can't code build working apps in a weekend. They describe what they want, iterate on the results, catch bugs by testing, and direct the fixes.

That's orchestration. They're conducting, not playing every instrument.

Practical version: Give them a project, give them AI tools, and step back. Let them figure out how to direct it. The goal isn't the output. It's the skill of directing.


What to stop pushing

Credentials over proof

Degrees and certificates made sense when proving competence was hard. Now a portfolio proves more than a diploma.

I've started de-emphasizing grades. Not because they don't matter for college (they still do, unfortunately). But because showing what you can do will matter more than proving where you've been.

Single-path thinking

"Pick one thing and get really good at it" was solid advice when skills were stable. Now they shift faster than a four-year degree takes to complete.

I'm encouraging exploration over specialization. Try lots of things. Go broad before going deep. The ability to learn new domains quickly beats mastery of a domain that might not exist in ten years.

Fear of failure

If building things becomes cheap (and it is), then experimenting becomes cheap too.

I'm trying to normalize failure as information. Not "it's okay to fail" as a platitude, but genuinely treating failed projects as useful data, not wasted time.

A kid's first app will crash constantly. That's the point. Every crash is a lesson in debugging, in persistence, in treating failure as information instead of judgment.


How to talk about AI without scaring them

I've made mistakes here. Early on I talked about AI "taking jobs" and watched the mood in the room shift. That framing was wrong.

Now I frame it as: "AI is an incredibly powerful tool. The people who learn to use it well will have a huge advantage. You're learning early."

Not threat. Opportunity.

Some things that have worked:

  • Let them use it. Supervised, but hands-on. Abstract warnings mean nothing. Experience builds intuition.
  • Be honest about uncertainty. "I don't know exactly what jobs will exist when you're my age. Nobody does. But I know the people who can direct AI will be valuable."
  • Focus on agency. You decide what to build. AI helps you build it. You're the director, not the replaced.

The part I'm still figuring out

I don't have this solved. My kids will enter the workforce in a few years, into a world I can't fully imagine.

What I keep coming back to is this: the old playbook optimized for scarcity. Get a rare skill, protect it, exchange it for money.

The new playbook optimizes for abundance. Have a clear vision, direct powerful tools toward it, iterate fast.

I'm trying to teach the new playbook while I'm still learning it myself.

Some days that feels inadequate. But I think showing them that adults are adapting too, that learning never stops, might be the most useful lesson of all.


Practical takeaways

Daily:

  • Ask "what would you build/fix/change?" instead of "what do you want to be?"
  • When they use AI, ask "is this good?" Build the taste muscle

Weekly:

  • One project where they direct AI tools toward a goal
  • Review something together: what works, what doesn't?

Mindset shifts to model:

  • Exploration over specialization (early)
  • Proof over credentials
  • Failure as data, not disaster

None of this is certain. I might be wrong about all of it.

But I'd rather prepare my kids for a world that might come than a world that's already fading.

At least I'm trying to ask better questions now.


Raising kids in the AI age

A series about preparing children for a future we can't fully predict.
