Critical Ignoring: A Conversation With Christopher Mims on the AI Tech Bubble
Is your BS detector regarding the latest AI trends sharp enough for 2026?
Christopher Mims, Technology Columnist for The Wall Street Journal, believes that in an increasingly polluted information ecosystem, the most important skill you can develop is “critical ignoring.”
With over a decade of experience and 500+ columns under his belt, Chris has learned that the easiest person to fool is often yourself, especially when you’re excited.
The veteran journalist helped us explore the transition from hype-driven decisions to the logical frameworks of scenario planning.
Chris talked with us about:
- Why “critical ignoring” is the most important skill for 2026
- Treating AI as the world’s “least neurotypical intern”
- Why humanoid robotics is currently the biggest bubble in tech history
The Science of the BS Detector
Chris observed that many leaders get swept up in the latest technology because they fail to look inward before making a judgment.
Drawing on his background in neuroscience, he argued that we are most vulnerable to being fooled when we are emotional or surrounded by others shouting about the next great thing.
“I’ve learned to be careful when I’m excited,” he explained. “Step one is don’t get sucked in by the hype. The important asterisk on that assertion is all of us are going to get sucked in by somebody’s hype at some point.”
To avoid this trap, he advocated for a concept called “critical ignoring,” the practice of filtering out information in advance to prevent cognitive overload (Check out Chris’s article on the topic here).
Once the noise is filtered, he suggested using “lateral reading” to verify assertions by checking independent sources rather than just following a single AI or news thread.
“Funny enough, AI can help with this,” Chris suggested. “You can use it to test your assumptions… I’ll ask AI, ‘What’s wrong with this hypothesis?’”
Chris warned that without these formal tools, even the most intelligent people treat AI as an infallible “oracle” rather than what it actually is: “fuzzier” software that is just as fallible as its human creators.
The "Least Neurotypical Intern"
One of the most critical elements of modern workflow is understanding exactly where AI fits in the hierarchy.
Chris argued that rather than being an all-knowing entity, AI functions like a brilliant but inconsistent assistant.
“You have to think of it as like the world’s least neurotypical intern,” Chris told our host, George Jagodzinski. “It can’t do more than what a person does. It’s not superhuman in that way. It’s only superhumanly fast.”
This “spiky intelligence,” as Chris put it, means that AI can solve complex research problems one minute and go completely off the rails the next.
To bridge this gap, Chris suggested that the goal shouldn’t be to replace humans, but to use AI to augment senior developers and “best entry-level people.”
“No matter how amazing and new and useful a new technology is, adoption is rate-limited by humans’ ability to learn it, to integrate it into what they do, to change all the systems around it.”
Punctuated Equilibrium: The Disruption Myth
While the tech world is obsessed with disruption, Chris argued that true, industry-shaking change happens far less often than we think.
“Most developments are incremental and big disruption is pretty infrequent,” he suggested.
Borrowing from evolutionary biology, he pointed to “punctuated equilibrium,” long periods of relative stasis interrupted by brief, intense bursts of adaptation.
“People will say, ‘Oh, my God! Claude’s new agentic framework [is] the biggest thing since ChatGPT,’ [but] is it?” Chris posited. “Or are we still iterating on what was the true disruption?”
Furthermore, he identified humanoid robotics as a prime example of modern tech potentially flying too close to the sun.
While the goals may be noble, Chris believes the massive capital requirements and current limitations of physical intelligence make it the “biggest bubble in tech right now.”
“Are we going to get the kind of AI that’s required to make humanoid robots accomplish [complicated] tasks and be truly relevant and cost-effective in factories anytime soon?” Chris asked. “Absolutely not.”
Ultimately, Chris believes the key to surviving the AI age is a healthy dose of accountability and practical service.
The core philosophy of Chris’s approach to technology and career can be summed up by the advice of his college mentor:
“What you should do in life is whatever’s at the intersection of what you’re good at and what you enjoy.”
Craving more? You can find this interview and many more by subscribing to Evolving Industry on Apple Podcasts, on Spotify, or here.