Photo by Caique Araujo.
Over the weekend, our fifth grader was working on an assignment. A good one. The class spent a bunch of time learning about poetry and as a final project, the kids had to write 10 poems. Each demonstrating a different style. Any topic. Our kid chugged along for the first nine, but at the tenth, there was a loud groan.
I don’t get it. Free verse makes no sense. How can I write a poem where there are no rules or structure to the style?
We do the thing parents do. We ask what was covered in class (I don’t know, we were out for Passover that day. I missed it, the kid says accusingly). OK, we ask if there are any samples or prompts, anything at all to get rolling. There is a sample poem. Blessings upon blessings. We start there. The poem on offer is incredibly famous. It’s the kind of thing every school kid learns. But if you don’t know what free verse is and you missed that day of class, it mostly just sounds like a fine poem.
So we do the next thing parents do. We open Wikipedia and quickly run into another wall. There is an entire entry on free verse, including several famous examples. And while the description is technically accurate and the examples are, in fact, examples, the words on the page are flat.
At this point, the kid is pretty frustrated. So we try again.
Hands off keyboard. Close your eyes. Don’t open them until the audio stops.
We head back in time with ancient audio clips that have been digitized for YouTube. We pull poems and jazz and play everything loud. The meter. The rhythm. The structure, the lack of structure. Poems that are about one thing and also something else entirely. Poems delivered in a calm whisper, seething with rage.
We stop the audio.
Do you get it?
The kid blinks.
I don’t think I get it in that I don’t understand all of what the poems are about. But I can hear it. I can feel it.
Thankfully, not all learning requires this type of ear-tuning. There are plenty of places where reading a book will give you most of what you need. Or where the Wikipedia entry will suffice. But sometimes, it really is a mix of what you know and how you feel.
Two roads converged
Businesses are a funny place for the intersection of what you know and how you feel. Because on the one hand, execs will say “fuck your feelings.” And on the other hand, those same execs want you to know your gut sense is important data. So when, unprompted, several middle managers want to talk to us about this specific intersection, our ears perk up.
The thing they know. Their orgs need them to do more with less. The economy is bumpy but the work hasn’t stopped. They still have customers. Those customers still need things. The managers have been through several layoffs and watched their orgs shrink. But no one, not a single person, has stopped by to say “Hey, here’s what we’d like you to put down.” If anything, it’s been the opposite. Some version of “AI mumble hand-wave just keep on truckin.” So they keep going. They do more with less. They put in requests for resources (budget, headcount, etc.) but they expect the requests will be denied. And for the most part, they’re right.
The thing they feel. The promise that AI will fill in the gaps is still a ways off. If it’s coming at all. Increasingly, our managers say much of their time is spent keeping their teams on track. That’s not unusual for middle managers. But specifically, the org-wide mandates for AI experimentation at every level and every function provide wide air-cover for losing the fucking plot. They find staff wandering in the wilderness. Taking days on tasks that used to take hours. And when they dig in on why, there’s that sinking feeling again.
Shaving yaks
To be fair, like, from an employee point of view, shit has gotten very weird in the last few years. Execs and board members have — some faster than others, some louder than others, some more terminally LinkedIn than others — seen the coming of AI as a generational event. Either an existential threat or a transformative opportunity. And they appear to universally believe that their employees are too change-averse to adopt these tools organically like any other normal technology. So they’ve issued mandatory “screw around with AI” policies to force people over that hurdle.
And the result is that in 2026, actual earth humans go to their jobs and have performance conversations with their managers that have nothing to do with their impact on the business. They are literally measured, in whole or in part, on whether they’re playing with the AI tools enough, and whether they’ve spent sufficient compute. While the CEOs of the companies building those AI tools go on press tours crowing about how many jobs they’ll eliminate.
That’s weird. That’s just a very weird way to run a planet.
The thing these execs are right about is that incentives do shift behaviour. If you fire 30% of the company and tell the survivors that their continued employment is contingent on AI use, they will use some AI. They will find the AI-shaped parts of their job and focus their efforts there. There may not be very many parts of their job that are intrinsically AI-shaped, and the ones that are might be the lowest-value stuff, but they’ll make it happen. A whole generation of non-coders are learning how to install homebrew packages so they can have an agent use command-line tools to convert a Microsoft Word doc to a PDF. Yak hair everywhere.
And as for what the execs have gotten wrong. Well. Robert Hart over at The Verge, writing about a Gallup survey of Gen Z workers, notes,
“Only 18 percent said they were hopeful about the technology and 22 percent said they were excited, down from 27 percent and 36 percent, respectively. At the same time, anger is growing: 31 percent of respondents said they feel angry about AI, up from 22 percent last year. Anxiety about AI has remained relatively steady at around 40 percent.”
At least! The anxiety levels? Have remained steady? At 40% of those surveyed? A free management tip from us: you generally don’t want a third of your workforce angry about the work you have them doing.
And as for the impact on the firms using it, last month’s NBER research with nearly 6,000 senior business leaders tucks in this banger,
“Third, executives report little own-firm impact of AI over the last 3 years, with nine-in-ten reporting no impact on employment or productivity.”
Like, what are we doing, here?
What matters is what matters
There’s an old line from Keynes about how “the market can remain irrational longer than you can remain solvent” and, fair enough. The folks betting the hardest on AI have a lot of capital to burn. You only need to glance (sidelong, with an eyewash station nearby) at X.com to see how willing these same people are to obliterate business value, and social good, and any shred of personal fucking decency, in an attempt to buy themselves cool. So the planet might remain weird for a while yet.
But can your org afford to? Can you actually keep doing this scattered-in-all-directions version of work? Where your employees are disengaged and angry while your managers struggle to find a through-line on any of it? In pursuit of transformative gains that 90% of leaders aren’t seeing, fully 3 years in?
This is not an “AI is useless” newsletter.
This is a “what matters is what matters” newsletter.
It matters that work be about something. That there be some actual impact out in the world that results from your efforts. It matters operationally for your business that people know what’s important and how to trade off priorities. It matters emotionally for your employees to feel like their struggles have a point and that there is value in their contributions. It matters for the customers or communities you serve that there’s any consistency or care in the way you show up for them. You have a stake in that. We all have a stake in that. AI doesn’t.
You can’t ask Claude what matters to you. Like, you can. It will give you an answer, but it will be a hollow amalgam of what other people have said. That might work for some topics, but it doesn’t work for meaning. Not because AI tools, or their builders, are opposed to you finding meaning at work. Just because, empirically, AI tools seem to be distracting a lot of people to the point that they lose it.
You’ve gotta come back into your body. Wiggle your toes. Hands off of keyboard, eyes closed and don’t open them until you actually feel something about which work matters and how you can get that work done well with the people around you.
AI tools or no. Yaks or no. We will bet on the teams that feel the meaning of their work, over the ones too busy to think about it, all day every day.
— Melissa & Johnathan