Parrots Are All You Need
Why generative AI will introduce greater change, faster, than the Internet itself—and what that means for knowledge workers.
This essay has been revised to reflect how my thinking has developed since its original publication. Cross-references to later work represent the current state of an evolving framework. The version here is authoritative; the original captured where I was in January 2023. Note also that this was written before “[blank] is all you need” became a very tired cliche. Yes, I would do it differently now.
There are three types of CIOs: those who are convinced that AI is an urgent priority, those who will soon realize they need to make AI an urgent priority, and those who will be out of work in two years.
For those in the latter two categories, the world changed in June 2017 with the publication of a paper titled “Attention Is All You Need,” which revolutionized how natural language processing models are trained. Then, on November 30, 2022, OpenAI released ChatGPT, which promptly became the fastest-growing platform in the history of the Internet. Its success was due not to an enormous technological leap but to the fact that the latest natural language processing tools had become something anyone could use.
There are thousands of articles describing ChatGPT, so I’ll focus on impact rather than technology. By “impact,” I mean this: generative AI will introduce greater change, faster, than did the Internet itself.
A Word About Knowledge
Knowledge can be broadly categorized as explicit (documented), implicit (stored in someone’s head), or tacit (“we know more than we can tell”). ChatGPT is trained on the public Internet, making it a repository of essentially all publicly available explicit knowledge.
Think about what that actually means. We have, for the first time, a tool that can retrieve, synthesize, and articulate the sum of publicly documented human knowledge in response to natural language questions. That’s not a marginal improvement. That’s a category shift.
What I’ve come to understand since writing this piece is that the traditional categories need rethinking. Tacitness is agent-relative—knowledge that was tacit for humans alone may become explicit when AI enters the picture. And the very definition of knowledge shifts when we recognize it as capability to produce outcomes rather than justified true belief. The parrot’s “regurgitation” is, operationally, knowledge—if it reliably produces the outcome you need.
The Stochastic Parrot
ChatGPT has been criticized as a dangerous “stochastic parrot” that simply regurgitates without insight. The term comes from a 2021 paper warning about the risks of large language models—systems that generate plausible-sounding text by predicting likely word sequences without genuine understanding.
The criticism is valid. These models don’t truly comprehend what they’re saying. They pattern-match and recombine. They hallucinate confidently. They lack the contextual judgment that humans bring to complex decisions.
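To make the “predicting likely word sequences” point concrete, here is a minimal toy sketch of the idea. This is emphatically not how GPT-class models are built—they use transformer networks over subword tokens trained on vast corpora—but the generation loop is the same in spirit: learn which tokens tend to follow which, then sample one token at a time. The tiny corpus and word-level bigram counts below are illustrative assumptions only.

```python
import random
from collections import defaultdict, Counter

# A toy corpus; real models train on trillions of tokens, not one sentence.
corpus = (
    "the model predicts the next word the model samples a likely word "
    "the parrot repeats likely word sequences without understanding"
).split()

# Count how often each word follows each other word (bigram statistics).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def generate(start: str, length: int = 10) -> str:
    """Sample a word sequence by repeatedly picking a statistically likely next word."""
    words = [start]
    for _ in range(length):
        counts = following.get(words[-1])
        if not counts:
            break  # no observed continuation for this word
        candidates, weights = zip(*counts.items())
        # A stochastic choice weighted by observed frequency -- no meaning involved.
        words.append(random.choices(candidates, weights=weights)[0])
    return " ".join(words)

print(generate("the"))
```

The output is plausible-looking word salad: statistically grounded, occasionally coherent, never understood. Scale that basic move up by many orders of magnitude and you have the parrot in question.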
Yet the impact remains disruptively potent due to one simple fact: most business activity is deliberately not novel.
The Uncomfortable Truth About Knowledge Work
Most of us don’t want to admit it: the bulk of what every knowledge worker does involves regurgitating someone else’s prior art. Very little of knowledge work is truly creative. We deliberately avoid creative effort because “creative” solutions carry higher risk than proven, established ones.
Think about your own work for a moment. How much of your day involves truly novel thinking? How much involves applying established frameworks, following documented procedures, synthesizing existing research, or reformulating ideas that others have already developed?
For most of us, the majority of knowledge work is intended to produce the least risky result—which means using the most explicit solution. We write reports that follow standard formats. We create presentations using proven structures. We draft communications using established templates. We solve problems by adapting solutions that worked elsewhere.
I’m not criticizing this. It’s good practice. Organizations value consistency and predictability. Clients expect proven approaches. The preference for established solutions over novel ones is rational and appropriate for most business contexts.
But it also means that explicit solutions are accessible to the pre-trained generative “stochastic parrot.”
The Real Competitive Landscape
There’s a phrase circulating in discussions about AI and employment:
“AI will not replace you. A person using AI will.”
This captures something essential about where we’re headed. The real question isn’t whether AI can do your job. It’s whether someone using AI can do your job faster, cheaper, or better than you can do it without AI.
The parrot doesn’t need to be creative. It doesn’t need real understanding. It just needs to be fast, cheap, and good enough. For tasks that rely primarily on explicit knowledge—documented procedures, established practices, publicly available information—good enough is increasingly sufficient.
What This Means
Think about how much of your work is creative effort. Think about how much is sui generis, unique, developed by you alone. Work best supported by documentation and established practice is most susceptible to being accelerated and ultimately automated by AI.
The more your work depends on synthesizing explicit knowledge, the more vulnerable it is to AI augmentation or replacement. The more it requires tacit knowledge, contextual judgment, real creativity, or deep relationship management, the more insulated it remains.
I’m not saying knowledge workers should panic. I’m saying they should adapt. Learn to use these tools effectively. Understand their capabilities and limitations. Focus on developing skills that complement rather than compete with AI capabilities.
The stochastic parrot has arrived, and for the vast majority of knowledge work that deliberately relies on established, documented, proven approaches, it might be much of what you need. Those who recognize this will learn to work with it. Those who don’t will find themselves competing against it—and losing.