If you’ve ever tried reading an old classic written before the printing press was invented in the 15th century, say, Homer or Dante, you may have noticed just how dense it is. Both form and content are far from “light reading.”
There’s a reason: cost.
Today, a book costs $20. Before the printing press, that same book cost $30,000–$50,000 in today’s money: a year’s salary for the average tradesman.
When something costs that much, it better be worth reading a hundred times. This meant no beach reads or diet books, just the strong stuff that left people wanting to read it repeatedly.
Eventually the printing press came along, reducing the price of books over time by 99.9%, and our relationship with the act of reading changed with it.
It’s not all that different from the pattern we see with calculators eroding mental arithmetic, laptops killing handwriting, and Google making information abundant but less valuable.
Now, it seems to be happening again. A recent joint study between Microsoft and Carnegie Mellon University found that AI can leave human cognition “atrophied and unprepared.”
“As humans increasingly rely on generative AI in their work, they use less critical thinking, which can ‘result in the deterioration of cognitive faculties that ought to be preserved.’”
When we outsource our thinking to automation, we stop noticing what the AI gets wrong. Just as the pocket calculator turned off our arithmetic switch, AI turns off our critical thinking switch.
The advantage, in theory, is that it frees up the deeper part of our thought process to focus on the “essential” stuff.
But in reality, it can easily dim our ability to spot nuance – which is where the best insights often hide.
That’s why so much AI-generated work feels boring and second-rate. It lacks heart and soul because it has neither.
The AI algorithm always optimizes for the center, for mediocrity. But it’s at the edges where the action is.
Everything is optimized. But when was life ever optimized? Never.