The Cost of Letting AI Read for Us
I caught myself doing it again last week.
I was reading a long article I really did want to understand. Before I even got through the third paragraph, I opened a new tab and asked AI to summarize it for me.
About 40 seconds later, I had the main points in a tidy little list. I closed the tab, felt sort of informed, and moved on.
Even as I did it, I knew I hadn’t really read anything.
That seems to be the trade we’re making right now. We have more access to knowledge than any group of people in history, but we’re doing less of the hard work of thinking for ourselves. Not because people are lazy or incurious. Mostly because we’ve built tools that make it very easy to skip the effort and still feel like we learned something.
First it was doom-scrolling, which made us feel informed. Now it’s AI summaries, which make us feel like we understand. Both are comforting. Both are misleading. And of course our brains go for the easier option.
The “just summarize it” habit
One detail sticks with me. In older research from the early 2000s, the average human attention span was measured at around two and a half minutes. In more recent research from the University of California, it dropped to 47 seconds.
Forty-seven seconds.
That’s not just distraction. That starts to look like a real change in how we pay attention.
And our tools fit that change perfectly. Summarize this paper. Turn these 10 documents into a report. Make my reading list into a podcast so I can “consume” it on the way to work. “Too long, didn’t read” used to be a joke. Now it’s a way of life.
Researchers even have a name for where this leads: the TL;DR mindset. Basically, it’s the habit of relying on quick, shallow versions of information so often that it starts to weaken your ability to stay with something difficult.
The appeal is obvious. People are overloaded. Information never stops coming. Most workers say they feel buried by it. When your inbox is full and everything feels urgent, reading a long essay all the way through can feel almost irresponsible. So we skim. We outsource. We ask AI to digest it and hand us the conclusion.
The problem is that this kind of shortcut doesn’t just save time. It changes what we get from the material.
People who lean too hard on AI summaries often remember less. They absorb fewer details. Their thinking can also become flatter and more similar to everyone else’s. You get the outline of an idea, but not the texture. You get the answer, but not the reasoning behind it. And later, when you want to explain it, challenge it, or use it in a real situation, there’s not much there.
That’s because the struggle is part of the learning. Summarizing in your own words, working through a hard paragraph, stopping to think, going back to reread: all of that friction matters. It’s not a side effect. It’s the process. When AI does that part for us, we’re not just saving effort. We’re skipping the part that helps us actually understand.
What researchers are calling “cognitive surrender”
A 2026 paper from researchers at Wharton used a phrase that’s hard to forget: cognitive surrender.
The basic idea is simple. When AI gives us an answer, we start accepting it with less and less scrutiny. We stop checking. We stop reasoning it through. We let the machine do the thinking and we go along with it.
In the study, more than 1,300 people were given reasoning tasks and the option to use AI. They chose to use it more than half the time. When the AI was right, their performance improved. No surprise there.
But when the AI was wrong, people still followed it most of the time, even when doing so made them perform worse than they would have without any AI at all.
That’s the part that matters.
It wasn’t just that people used AI. It’s that they trusted it too quickly and too completely. The researchers found that people often became more confident even when they were being led to the wrong answer. So the feeling of knowing went up while actual accuracy went down.
That’s a dangerous combination.
The people most likely to surrender their judgment were also the ones less inclined to enjoy effortful thinking in the first place. That matters because the habit feeds itself. The less you practice thinking carefully, the easier it becomes to hand that work off. And the more you hand it off, the weaker that habit gets.
What deep reading does that summaries can’t
This is where deep reading matters.
Not as some romantic old-fashioned ritual. As protection.
Researchers like Maryanne Wolf have spent years studying what happens in the brain when we read deeply. Her argument is that deep reading does more than help us absorb information. It helps us slow down, consider other perspectives, connect ideas, reflect, and build the kind of understanding that sticks.
Skimming does something else. It trains a different mode. Faster, thinner, more reactive. And if that becomes your default, eventually it can feel hard to read any other way.
What’s striking is that Wolf has said she noticed this happening in her own mind after years of heavy screen use. She had to work to get her deep reading ability back. If someone who studies reading for a living can lose that capacity, the rest of us definitely can too.
She makes a useful distinction between shallow knowing and real knowledge. Shallow knowing is when you can say what something is about. Real knowledge is when the idea has had time to sink in, connect to other things, and become part of how you think.
A lot of us spend most of our day in the first mode now. Quick scans. Short summaries. Fragments. Constant switching. And after a while, it’s easy to forget there’s another way to engage with a text.
That matters because deep reading helps build what researchers sometimes call a need for cognition. In plain English, that means a willingness to think hard. An appetite for mental effort. That trait isn’t fixed. It can grow, and it can shrink. Deep reading strengthens it. Constant offloading weakens it.
Other studies are pointing in the same direction. More frequent AI use has been linked with weaker critical thinking, especially when people use it to offload mental work instead of support it. Some writing studies have also found lower memory retention and less ownership over ideas when people rely entirely on AI to generate the work.
The pattern is hard to ignore. The more we hand off the work of thinking, the easier it becomes to lose the habit.
The part AI can’t do for you
This isn’t an argument against AI.
I use it all the time. It’s useful. In many cases, it’s genuinely excellent.
But there’s a big difference between using AI to move faster and using it to skip the parts of life that actually build your mind.
When you read a book slowly, with full attention, when you stop at a line that hits you, reread a paragraph, argue with the author in your head, or sit with something confusing until it starts to make sense, your brain is doing work no summary can do for you.
It’s making connections. Testing ideas. Forming judgments. Building understanding.
That is thinking.
A generated report can hand you conclusions. It cannot give you the inner process of arriving at them. A podcast version of your reading list can fill your ears. It cannot replace the act of wrestling with a sentence on the page. A bullet-point summary late at night cannot do the deeper work of reading something carefully enough for it to change how you think.
I think of it the same way I think about exercise. Nobody believes they can get stronger by watching someone else work out. The benefit comes from what happens in your own body.
Reading works the same way. The confusion, effort, attention, frustration, and eventual clarity: that’s the workout. If you outsource it, you shouldn’t be surprised when your mind feels less sharp.
And the stakes are bigger than personal growth or productivity.
A healthy society depends on people who can judge ideas for themselves, notice weak arguments, resist manipulation, and understand perspectives beyond their own. Reading helps build those abilities. If we lose the habits that support them, something important goes with them.
That’s not dramatic. It’s where the research on attention, reasoning, and misinformation keeps pointing.
What this looks like in practice
It doesn’t require some huge life reset.
It can start with 20 minutes.
Read something without skimming. Without asking AI to boil it down first. Without turning it into a task to optimize. Just read. When your attention drifts, bring it back. When something is hard, stay with it a little longer. Let the sentence land before moving on.
It will feel slower than what you’re used to.
That’s exactly the point.
The slowness is where the thinking happens.
AI will still be there afterward. That part isn’t going away. But your ability to think independently is not something you can keep neglecting forever and expect to have on demand later.
Reading isn’t a productivity trick. It isn’t just another form of content consumption. It’s one of the few habits that still asks us to slow down, pay attention, and think for ourselves.
Right now, that’s worth protecting.