🤤“I, Student”: Why Outsourcing Your Homework to AI Is Like Hiring a Robot to Do Your Push-Ups

Act I: The Great Temptation

It starts innocently enough.

You're staring at a blinking cursor on a blank Google Doc. The assignment is due tomorrow. Your brain feels like mashed potatoes. And right there, like a shiny vending machine filled with perfect sentences, sits ChatGPT, whispering sweet nothings like:
"Just let me take care of it. It'll be fast. No one will know."

And you think, “Hey, what’s the harm in a little help?”

But according to Simon Sinek and a growing chorus of educators, psychologists, and grumpy-but-wise professors, you’re not just outsourcing an essay. You’re outsourcing your own brain.

Because here’s the dirty little secret about learning: the stuff that feels the worst is the most important.

Act II: The Messy Middle Is the Whole Point

Sinek tells a story about sending an AI-generated apology to someone after a fight. The message was beautifully worded, thoughtful, sincere—and fake. The person on the receiving end felt it immediately. Why? Because humans are weird little emotional barometers. We can sense authenticity like dogs can smell fear and bacon.

This is the key point: people respond to effort, not perfection. They respond to you, not your outsourced, algorithmic proxy.

When students use AI to do the work for them, they lose something bigger than a letter grade. They lose the moment where learning actually happens—the moment where thinking feels uncomfortable, where ideas bump awkwardly into each other like introverts at a middle school dance. That moment is gold.

It’s not the polish that matters. It’s the process.

Act III: The Push-Up Problem

Learning is like exercise. You get stronger not when it’s easy, but when your muscles are crying and your brain is muttering “I hate you” under its breath. That’s when growth sneaks in.

Now imagine this:

You hire a jacked personal trainer to do your push-ups for you. They’re doing 100 flawless reps. You’re watching with admiration. But your own arms? Still twigs.

That’s what happens when AI writes your essay. The work gets done, but you didn’t get any smarter.

The actual neural rewiring—that magical soup of frustration, creativity, and mild panic—is the part that teaches you how to think. How to structure an argument. How to tell the difference between a good idea and a flashy but shallow one. AI can't gift-wrap that for you. It's non-transferable.

Act IV: But Isn’t Using AI Smart?

Sure. It can be. When used with your brain—not instead of it.

Let’s say you use AI to brainstorm. Or you have it explain quantum tunneling like you're five. That’s fine. You’re still in charge. You’re the chef, and AI is your sous-chef handing you peeled carrots and pre-sliced onions.

But if AI cooks the meal, plates it, and presents it to the judges with a flourish, then who’s really learning to cook?

Sinek puts it bluntly: we’re teaching students that it’s okay to be performative instead of authentic. We’re encouraging a shortcut culture. And worst of all, we’re making people afraid of being imperfect—when that’s the whole point of being human.

Act V: The Hollow Echo

There’s also a more subtle danger: the feedback loop.

If AI writes your essay, you get a good grade. That good grade tells you: “Yes, excellent. You are smart.” But in truth, it was the AI that was smart. You just happened to own it.

This creates a weird confidence inflation. You start believing you understand things you don’t. And then, when you’re called to actually discuss the topic—or apply it in a real-world situation—you’re like a parrot that memorized Shakespeare but doesn’t know what a metaphor is.

That’s not education. That’s cosplay.

Act VI: What the Research Says (Spoiler: It’s Not Great)

Recent studies back this up. A 2024 report from the Stanford Center for Learning and AI Ethics found that students who relied on AI for written assignments performed significantly worse on critical thinking and retention tests weeks later. They also showed less ability to articulate their ideas verbally.

Translation: AI might save you time today, but it costs you clarity tomorrow.

And in an era where clarity is power, where being able to express your thoughts cleanly can change your job prospects, your relationships, even your civic life—that’s a steep price.

Act VII: The Emotional Cost

We also need to talk about confidence.

It might feel like AI makes you more confident because you’re turning in better work. But it’s a false boost. Because deep down, you know you didn’t earn it. That seed of doubt sticks.

Over time, this can hollow out your self-trust. You start second-guessing whether you could actually do the work yourself. Ironically, the more you rely on AI, the less confident you become in your own thinking.

Confidence isn't built by avoiding the hard stuff. It's built by wrestling with it and coming out the other side.

Act VIII: The Fix Isn’t Fear, It’s Framing

The goal here isn’t to shame students into quitting ChatGPT cold turkey. AI is here, and it’s not going away. The real challenge is mindset.

Here’s the new frame: your assignments aren’t paperwork, they’re gym sessions for your mind.

Don’t aim to finish them. Aim to grow through them.

Yes, use AI as a tool. Have it check your grammar. Help you brainstorm angles. Maybe even ask it for feedback on a rough draft. But never let it be the writer. That’s your job. That’s where the brain sweat lives.

Act IX: The Bigger Picture

Simon Sinek says it best: we’re teaching students not just how to write essays or pass tests—we’re teaching them how to be human.

Being human means fumbling. Struggling. Feeling overwhelmed and showing up anyway. When you skip that part, you’re not just cheating on a paper. You’re short-changing your own development.

And that matters, because life is basically a never-ending group project where the prompt is vague, the stakes are high, and nobody brought snacks.

In that chaotic, beautiful mess, your ability to think, communicate, and reflect deeply is the best tool you’ve got.

Finale: Real Work, Real You

So the next time you stare down an assignment and feel the pull of the AI shortcut, pause.

Ask yourself: What am I really trying to get out of this?

If the answer is just “a grade,” then sure, AI can help you fake it. But if the answer is “a smarter version of myself,” then there’s no shortcut. You’ve got to do the push-ups.

Bad metaphors and all.

If you want to explore this topic further, this video was a big part of the inspiration for this post.
