The green border around the screen share flickers, a digital halo signaling that the performance has begun. My manager, a man who recently spent 32 minutes explaining why we need to ‘pivot toward velocity,’ is currently hovering his cursor over a text box. He is showing us his new AI workflow. He types a prompt, something vague about ‘optimizing stakeholder synergy,’ and we all sit in a heavy, collective silence, watching a gray cursor blink. It is 3:02 PM on a Tuesday, and I am acutely aware that I just spent my lunch break trying to open a door that clearly said ‘pull’ by leaning my entire body weight into a ‘push’ maneuver. That is the current state of my cognitive processing, yet here I am, expected to marvel at a machine that is about to generate 12 paragraphs of absolute nothingness.
Performance Metric: 102% Fluff
The AI finishes. The text is beautiful. It uses words like ‘robust,’ ‘scalable,’ and ‘holistic’ with a frequency that would make a thesaurus blush. My manager beams. ‘Look at the time we saved,’ he says. He hasn’t realized that the summary is 102% fluff. It contains zero actionable data, zero budget allocations, and fails to mention that the project in question was actually canceled 2 days ago. But the box was filled. The AI was used. The ‘strategy’ was executed. We are not working anymore; we are practicing a very expensive form of digital puppetry.
This is the rise of Productivity Theater 2.0. In the old days, you just had to keep your Outlook status green and occasionally rattle some papers when the boss walked by. Now, we have shifted into a more sophisticated era of deception. We aren’t just faking work; we are using cutting-edge neural networks to automate the appearance of being a visionary. My boss doesn’t actually want a solution to our supply chain bottleneck. He wants an ‘AI Strategy’ he can present to the board, a shiny deck filled with 22 slides of generated diagrams that look like a spider had a stroke while staring at a McKinsey report. It is the optimization of the void.
The Meteorologist and the Edges
I think often about Carlos F.T., a man I met while hiding from a buffet line on a transatlantic crossing. Carlos is a cruise ship meteorologist, a job that sounds romantic until you realize it involves staring at 2 radar screens for 12 hours a day, trying to decide if a swell is going to make 3002 passengers vomit simultaneously. Carlos doesn’t care about ‘synergy.’ He cares about the specific gravity of the air and the precise moment a cold front decides to ruin a gala.
“The problem with modern systems is that they give you the answer before you’ve felt the problem. If I use an automated model to predict wave height without looking at the horizon, I miss the rogue wave that the model thinks is a statistical impossibility. The model is built for the average. Life happens at the edges.”
– Carlos F.T., Cruise Ship Meteorologist
We are currently living in the ‘average’ generated by millions of tokens. When my boss asks for an AI-generated project plan, he is asking for the most average possible version of that plan. It is safe. It is recognizable. It is also entirely useless because it doesn’t account for the fact that Sarah in accounting is on maternity leave or that our primary server has a 42% chance of overheating if the weather stays this humid. The AI doesn’t know about the humidity. It only knows what a project plan looks like.
The Cost of Average: Performance vs. Reality
Performance: ignores known constraints (humidity, Sarah)
Reality: accounts for real-world variables
The Aesthetic of Output
We have reached a point where the aesthetic of productivity has become more valuable than the output itself. We are being buried under a mountain of ‘smart’ summaries, ‘automated’ catch-ups, and ‘intelligent’ insights that require 52 minutes of human correction for every 2 minutes of machine generation. I find myself spending more time ‘curating’ the AI’s hallucinations than I ever spent actually writing the reports from scratch. But I can’t stop. To stop would be to admit that the ‘strategy’ is a hollow shell, and in the modern corporate ecosystem, admitting the emperor is naked is a quick way to get your ‘velocity’ reduced to zero.
The Erosion of Critical Thinking
Hesitation to Delete Perfect Nonsense
There is a specific kind of exhaustion that comes from being a ghost in the machine. You start to doubt your own instincts. You see a sentence that you know is wrong, but it’s formatted so perfectly in a Notion document that you hesitate to delete it. Maybe the AI knows something I don’t? This erosion of critical thinking is the hidden cost of the theater. We are outsourcing our skepticism to the same tools that are feeding us the nonsense.
“The tragedy of the modern office is that we have mistaken a louder megaphone for a better argument.”
– Anonymous Observer
I remember a meeting last month where we spent 72 minutes debating which AI tool to use for ‘email sentiment analysis.’ The irony was that we only receive about 12 emails a day from actual clients. The rest are internal automated notifications. We were essentially looking for a tool to tell us if our other tools were being polite. It is a hall of mirrors where the reflection is more important than the person standing in front of it. We are building a bureaucracy of bots, and we are the middle managers tasked with making sure they get along.
Distinguishing Performance from Value
Reporting Friction: making it easier to show work was done.
Optimized Void: spinning wheels in high definition.
When you look at the landscape of tools being pushed on us, very few are designed to solve the friction of the actual task. They are designed to solve the friction of the report. They make it easier to show that you’ve done something, rather than making it easier to do the thing. This is where companies like AIRyzing become so vital to the conversation, as they represent the push to actually separate the performative noise from the tools that provide tangible, structural value. Without that distinction, we are just spinning our wheels in high-definition.
Scheduling Failure: 22 Suggestions
I recently tried to use an AI to help me schedule a meeting between 2 departments. It suggested 22 different times, all of which were on a Sunday or during the company-wide holiday. When I pointed this out to the ‘Productivity Lead,’ he told me I just needed to ‘refine my prompting technique.’ This is the ultimate gaslight of the AI era. If the tool doesn’t work, it’s your fault for not speaking its language.
Carlos F.T. would laugh at this. He once told me about a captain who insisted on using a new automated docking system that didn’t account for the specific tug of a local current. The ship ended up 12 feet away from the pier, dangling in the harbor while the computer insisted it had arrived. ‘The captain was so proud of the screen,’ Carlos said. ‘He forgot to look out the window.’ We are all that captain right now. We are so proud of our dashboards and our ‘AI-enabled workflows’ that we have forgotten to check if the ship is actually at the dock.
The Toll of Flow Loss
There is a psychological toll to this. When your day is composed of 82% performative tasks, you lose the ‘flow’ that makes work meaningful. You become a prompt engineer for your own life, constantly adjusting the inputs to make sure the output looks ‘professional.’ I find myself checking my own emails to make sure they don’t sound too human, because human sounds messy. Human sounds like someone who might push a door that says pull. And in a world of optimized AI strategies, there is no room for the ‘push-pull’ of real, messy, human creativity.
The Closed Loop Equation
Why are we generating a 12-page summary of a 2-page document? Why are we using AI to write emails that will be read by an AI that will then summarize them back into 2 bullet points for the recipient? We are creating a closed loop of data where humans are just the biological batteries keeping the servers warm.
I miss the friction of real work. I miss the moments where a problem was so difficult it required 2 hours of staring at a blank whiteboard, not 2 seconds of clicking a ‘generate’ button. The whiteboard didn’t give me an answer, but it forced me to think. The ‘generate’ button gives me an answer, but it forces me to stop thinking. It is a trade-off that feels more like a surrender every day.
AI Strategy Document Completion
The finished strategy document had 42 charts, none of which had labeled axes. It was praised as a ‘landmark achievement.’ The theater was a success.