
The Cognitive Trade
Are We Offloading Too Much to AI? A Group Chat Recap
Editor’s Note: In a recent ON_Discourse group chat, four members (founders, strategists, and technologists) gathered to explore a deceptively simple question: are we taking it too far?
We’ve been offloading cognitive tasks for centuries, from cave paintings to calendars to Google Maps. But as AI steps into more sophisticated roles in writing, decision-making, and even ideation, we’re entering new territory. Are we scaffolding ourselves toward greater capability, or slowly anesthetizing the very muscle that makes us human?
Our session touched on four key themes:
1. Offloading is not bad, and it is not new.
Let’s start here: not all offloading is created equal.
Calculators help us push our logic. Notebooks help us process our thoughts. AI is a little different. It is not inherently bad, but we have to know how to use it.
“AI is like a calculator; without practice, we lose skills, but scaffolded, it boosts learning.”
That’s the tension. Tools can be empowering. But when they become invisible defaults, they can also hollow out our cognition, a condition several studies now call cognitive debt. Unlike “digital amnesia,” where we forget what we can Google, this goes deeper: offloading thinking means we may stop knowing how to think at all.
2. AI offloading is the exception.
One of our members, an executive who launched one of the most successful products in digital history, had some thoughts about the stickiness of the interface. It is no accident that people are starting to use these tools more.
“They are engineering cognitive fatigue… They are engineering memory retention issues. They know that. And they're looking past it.”
We pressed this point because it sounded insidious and also familiar.
“Are you comparing this to cigarette manufacturers who knew nicotine was addictive?”
The answer was simple and clear: “Absolutely.”
3. AI is not smart, people are just lazy.
Generative AI is slick and fast, and the promise of quick deliverables is almost too tempting to ignore. More than that, most people will succumb to the temptation and deliver empty, vapid work in record time. They will have no memory of the outcomes, and they will eventually be replaced.
One of our members acknowledged the surface-level benefit of these tools but circled back to the real work.
“While things can sound good, there’s a lack of clarity at the end… I spend more time chiseling away, arguing with whatever the output is, to get to my point.”
Chiseling is a common trope in these conversations. We use it because you can't chisel rock without sincere effort. It is not easy.
4. You can’t generate perspective, you can only get it from other people.
The conversation culminated in a deeper inquiry: if AI is not actually smart, where do we find the wisdom and answers we really need? The answer was not technical.
Wisdom grows through questioning, failure, and refinement together.
“I know that for me, wisdom… I do not find it on the screen. Wisdom comes from talking with people.”
The key takeaway from this session is that our species is hardwired for finding shortcuts. But the long path, the one that leads to discernment, clarity, and maturity, requires friction.
The good news is that our species is also hardwired for disappointment and taste. This striving for perfection is the one quality that will connect generative creators with others who want to make the work better.
We run events every week. If you want to participate, inquire about membership here. If you want to keep up with the perspectives that we hear, you can subscribe to our weekly newsletter.