AI was supposed to make work smarter. Instead, it’s making a lot of it dumber.
A new study from BetterUp Labs and Stanford’s Social Media Lab has uncovered a growing divide in the workplace: “pilots” who use AI as a powerful co-pilot to elevate their work, and “passengers” who hand off tasks to AI and flood teams with digital junk now known as “workslop.”
According to the research, 15% of workplace content is already AI-generated, and the average employee now encounters low-quality AI output nearly every day. Over half say they feel annoyed, confused, or distrustful after seeing it, and that distrust adds up fast. The researchers estimate that this “invisible tax” costs $186 per employee per month, or roughly $9 million a year for a 10,000-person company.
So, if you’re leading a team that’s eagerly “embracing AI,” the hard truth is this: you might not be getting smarter. You might just be getting noisier.
The Real Divide: Pilots vs. Passengers
AI doesn’t make everyone better all the time. It just makes the difference between the skilled and the lazy impossible to ignore.
Pilots use AI like a creative partner. They prompt with precision, give feedback, refine outputs, and use the tool to expand their thinking. These are the optimists and problem-solvers, the people who see AI as a collaborator, not a crutch.
Passengers, on the other hand, treat AI as an autopilot. They copy-paste, skip context, and churn out work that looks polished but says nothing. It’s the kind of “AI writing” that sounds impressive at first glance but collapses under scrutiny.
The outcome is easy to spot: memos without a point, reports that recycle old ideas, and presentations that are all polish and no depth. When that becomes normal, trust and talent quietly fade.
The Leadership Wake-Up Call: Don’t Reward Volume, Reward Value
If you are a business leader, the challenge is not whether to use AI. It is whether your company teaches people to think with it or to let it do their thinking for them.
Here is what pilot-mode leadership looks like in practice:
- Set a “no slop” standard.
Make quality control non-negotiable. For example, if your marketing team uses AI to draft blog posts, make it a rule that every AI-generated piece must be fact-checked, edited by a human, and reviewed by a manager before publishing. Encourage transparency by asking employees to mark which parts were assisted by AI so that reviews focus on the thinking, not just the text.
- Train for prompting, not parroting.
Do not stop at “how to use ChatGPT.” Host short workshops where teams learn how to write better prompts, ask critical questions, and validate information. For instance, your customer support team could practice using AI to summarize conversations while ensuring tone accuracy and brand consistency.
- Redefine productivity.
Stop rewarding people for how many reports, slides, or campaigns they can churn out. Instead, measure the value created. A practical approach is to add a “clarity and originality” metric to your performance reviews. Reward employees who use AI to uncover insights or simplify complexity, not those who just produce more noise faster.
- Build feedback loops.
Make it a norm for teams to critique AI-generated work openly. Create an internal “AI review” channel where employees can post examples of AI use, good or bad, and discuss what made it effective or sloppy. Encourage people to ask, “Does this add value or is it just workslop?” without fear of judgment.
- Lead by example.
If you are using AI, show it. For example, a CEO can share how they used AI to draft a memo outline but then refined it personally to ensure accuracy and tone. When leaders model transparency, employees learn that AI is a tool to enhance work, not to replace thought.
Hiring for Pilots, Not Passengers
Building a pilot culture starts with who you hire. If you bring in people who see AI as a shortcut, you will get shortcuts. But if you hire those who see it as a collaborator, you will build a smarter, more resilient team.
Here is how to spot and hire pilots in practice:
- Ask how they use AI in their daily work.
Look for candidates who use AI to improve thinking, not to replace it. A pilot might say, “I use AI to test my ideas or draft options before I refine them.” A passenger will say, “I use AI to write my emails or reports for me.”
- Give an AI-based task during interviews.
Ask them to use a generative AI tool to complete a short task, then explain their process. Pilots will show curiosity, reasoning, and iteration. Passengers will rely on copy-paste outputs.
- Test for curiosity and ownership.
Pilots are learners. Ask about the last new tool or skill they explored and how they applied it. Look for evidence of self-learning and reflection, not just familiarity.
- Evaluate communication clarity.
AI magnifies poor communication. Ask candidates to explain a complex idea simply, or to critique an AI-generated paragraph. Pilots will spot nuance and improve it. Passengers will accept it as “good enough.”
- Check how they give and receive feedback.
Pilots see feedback as part of collaboration. Ask how they handle being corrected or challenged. Their answers will reveal whether they refine their process, or defend poor work.
When hiring globally, this distinction matters even more. Remote and offshore teams that understand how to use AI responsibly will multiply value, not duplicate effort.
How Filta Helps You Hire the Right Pilots
We help companies identify and attract the kind of global talent who approach AI with curiosity, critical thinking, and accountability. Our recruitment process looks beyond technical skills to assess problem-solving, adaptability, and ownership, the exact qualities that define a “pilot.”
Through rigorous skills-based assessments, culture-fit interviews, and data-driven insights, Filta helps you build teams that can collaborate intelligently with AI instead of competing against it. Whether you are growing your marketing, operations, finance, or creative teams, Filta connects you with global professionals who elevate the way work gets done, not just automate it.
Final Insight
AI is not replacing people. It is revealing them. The difference between a “pilot” and a “passenger” is not about who knows more tech; it is about who takes more ownership.
If your team uses AI as a shortcut, expect shallow work and low trust. If they use it as a strategic partner, you will see sharper ideas, stronger collaboration, and measurable results.
The future of work will not be defined by who uses AI, but by who uses it well.
For more resources and insights, visit filtaglobal.com