The Witness Class
As AI masters production, humans are building a meaning economy based on presence, ritual, and intentional inefficiency
At Milan Fashion Week in September, Tod’s staged needlework as theater. Artisans in white frocks hand-stitched leather while models walked past, the labor presented as the luxury good itself. The CEO declared machines would never replace their artisans. It sounds like posturing until you check the numbers: handmade product sales rose 25% this year while the events industry just crossed $1.35 trillion.
The Hype Correction Nobody Expected
MIT Technology Review declared “The Great AI Hype Correction.” The headline stat: 95% of companies that tried to implement custom AI systems failed to scale beyond the pilot stage within six months. Ninety-five percent. Not “struggled.” Not “saw mixed results.” Failed.
The details are worse. Researchers found 90% of surveyed companies run an ‘AI shadow economy’: workers hiding their personal chatbot use because the official enterprise tools are broken. The automation designed to replace human judgment now depends on humans hiding their judgment to keep it running.
The machines aren’t making us more productive. They’re making us believe we’re more productive while we’re actually slower. We automated confidence, not competence.
Scale AI tested AI agents on 240 real-world freelance projects. The best agent achieved a 2.5% success rate. Humans earned $143,991. The top AI earned $1,720.
But the finding that should have made everyone sit down came from METR’s developer productivity study. Experienced software developers given AI coding tools took 19% longer to complete tasks. The twist: they believed they’d finished 20% faster. Machine intelligence hadn’t made them more productive. It had made them feel more productive while actually slowing them down. The algorithms drowned developers in options, removing the friction that forced thinking. What looked like acceleration was productivity theater. The only audience was the developer.
These aren’t failure stories. They’re revelations: what we were actually doing when we thought we were just doing tasks.
After all this investment, the machines still can’t replace task execution. So: what were humans actually providing?
What AI Can’t Fake
Johns Hopkins tested 350 AI models on a simple task: watch three-second video clips of people interacting and assess social dynamics. Which people are communicating? Who’s paying attention to whom? The kind of thing any human can evaluate in a glance.
None of the models could do it.
The models excel at recognizing objects, faces, text (anything that holds still long enough to be categorized). But the moment the task requires reading a room, the algorithms go blind.
Machine intelligence can summarize documents but not judge what matters.
Automated systems can generate empathetic responses without feeling empathy. Researchers call this ‘compassion illusion’: the algorithm identifies sadness but cannot inhabit sorrow. It generates comfort without caring. It is a performance with one participant. The machine produces content. It cannot witness meaning.
That is the gap.
And where there’s a gap, there’s a market. The meaning economy is already emerging, not as resistance to automation, but as direct response to what automation revealed we’d lost.
The Meaning Economy Emerges
The health and wellness coaching industry will grow from $18.8 billion to $30.6 billion by 2032. Not because coaches provide information unavailable elsewhere (any AI can access more health data than any human could memorize). The value is in the witnessing.
The client isn’t purchasing expertise. The client is purchasing someone to notice they’re alive. Monthly subscriptions start at $200.
Mentorship demand has exploded. According to Fortune, 83% of Gen Z workers believe having a workplace mentor is important. Only 52% have one. That gap (31 percentage points of unmet human validation) is a market. Harvard research found that mentorship closes the socioeconomic gap by two-thirds, not because mentors provide better career advice than ChatGPT could generate, but because they provide something else entirely: the confirmation that someone’s experience is part of a continuum, not an isolated event in an indifferent universe.
Body doubling captures the dynamic most purely. The premise: have another human present while you work. Not coaching. Not supervising. Just existing nearby. Platforms like Deepwrk and Flown are productizing the discovery that humans work better when someone else is simply there. You pay $40/month for a ‘verified human to watch you work.’ The unwritten selling point: the algorithm can see you, but it cannot witness you. That’s the difference you’re compensating for.
Studies show that accountability check-ins can increase goal achievement from 25% to 95%. The productivity doesn’t come from advice. It comes from being seen. The Meaning Economy isn’t filling a new need. It’s commodifying what was lost when optimization eliminated the friction that bred human connection.
The No AI Movement launched a third-party certification system for products and services created without AI involvement. Think of it as organic labeling for human attention. The certification doesn’t claim AI-free products are better. It claims they’re scarcer.
Why are we comfortable with ‘compassion illusion’ from machines but desperate for ‘witnessed existence’ from humans? Why did a generation raised on “authentic self-expression” conclude that authenticity requires external certification?
The New Professions: The Rise of the Proxy Human
The first to emerge will be the Narrative Anchor. As AI tools flood organizations with generated content, the bottleneck shifts from production to integration. How do a thousand disparate AI outputs become a coherent strategy? How does a company develop a soul when its content is algorithmically generated? The answer: hire someone whose only job is to provide the through-line.
The Narrative Anchor sits at the intersection of strategy and anthropology. The algorithms write the emails and draft the presentations. The Anchor ensures that the story being told is actually lived by the organization. They look at a “perfect” AI-generated quarterly report and say, “This is accurate, but it’s a lie, because it ignores that our sales team is burning out.” They function as the organization’s conscience, compensated specifically to say the quiet part out loud (the part the algorithms are trained to smooth over). Expect to see job postings by Q2 2026: ‘Chief Narrative Officer. Salary: $180K-$240K. Responsibilities: Make our AI-generated output feel like it came from humans who give a shit.’
Close behind will come the Proxy Grandparent. As the mentorship gap widens, a market for ‘rented wisdom’ will solidify. Not consultants (who come to solve problems) but elders who come to provide context. The Proxy Grandparent doesn’t tell you how to restructure your career. They listen to your existential dread and say, “I saw this in 1999. It passed. You will too.” The value isn’t the insight. It’s the proof that your experience is part of a continuum, not a data point in a vacuum. MentorCruise already connects 51,000 mentees with 4,000 mentors across 171 countries, delivering 12,800 hours of mentorship monthly. The next iteration won’t be career acceleration. It will be existence validation.
Meanwhile, Experience Curators are commanding six-figure salaries for making business processes deliberately less efficient. Distinct from event planners, Experience Curators won’t focus on logistics but on constraints. Where event planners remove friction, Experience Curators introduce it, designing meetings with mandatory silence periods, collaborative projects that reject AI assistance, decision-making processes that require human deliberation time. The job description will read like satire but pay like consulting: ‘Slow down your team’s output by 40%. Increase their sense of meaning by 200%. Results guaranteed or your inefficiency refunded.’
What these professions reveal: we’re not compensating for skills. We’re compensating people to perform humanity at us until we remember what it feels like.
But professions are only half the story. The meaning economy isn’t just creating new jobs. It’s creating new status hierarchies that invert everything the attention economy taught us to value.
Status Markers: The New Luxury of Latency
In the attention economy, speed was status. In the Meaning Economy, latency becomes the ultimate flex.
By 2026, being known for taking weeks to reply to an email will signal not disorganization but depth. The implication: this person is engaged in high-meaning, low-output work that requires total isolation from the notification stream. The old status symbol was instant access. The new one is being unreachable.
Luxury hotels and co-working spaces will begin marketing ‘AI-Sanitized Architecture.’ Not just “no wifi” spots but environments structurally designed to prevent algorithmic interference. Walls with signal-dampening materials. Faraday cages as standard amenities. The dark comedy writes itself: the wealthiest humans paying $10,000 per night to sleep in Faraday cages while their assistants monitor their disconnection via AI dashboards. Meanwhile, the working class gets squeezed into hyper-optimized, AI-surveilled gig workflows where every microsecond is monetized. We’re building a two-tier system where the rich pay fortunes to live like the 19th-century poor (disconnected, manual, slow) while everyone else disappears into algorithmic acceleration they can’t escape.
Watch for ‘Hand-Finished’ labels to appear on things that have no business being handmade: software code, legal contracts, architectural plans. “This legal document was drafted by GPT-7 but reviewed and hand-signed by three exhausted humans” becomes a seal of quality, proving someone took the risk of actually understanding it. The certification won’t claim the human review made it better. Just scarcer. Just witnessed. Just consequential to someone who could be held accountable.
Business Models Built on Witnessing
The subscription model is evolving. We are moving from paying for access to content to paying for access to presence.
The 2026 version of body doubling is a B2C service where clients pay a monthly fee to have someone “present” with them digitally while they work, create, or just think. Not a coach. Not a boss. A witness. Clients share their screen or a video feed. A verified human sits on the other end, nodding occasionally, observing their process. The AI agents can do the work for them, but the agents cannot witness the struggle. The profound psychological relief of being seen struggling without judgment (what the coaching industry is quietly selling) will be productized at scale. We’re not reclaiming community. We’re renting back what we optimized away, purchasing from strangers the social infrastructure we demolished in pursuit of efficiency. The product description will be remarkably honest: ‘We don’t make you better. We make you seen. $79/month.’
METR's study tells us developers with AI tools take 19% longer to complete tasks despite believing they're faster. Why? Because they're drowning in options and losing the friction of thinking. The machines removed the struggle that used to force clarity. Premium services will emerge where clients subscribe to have someone curate their failures. The service listing will read: ‘I will intentionally break your AI flow, forcing you to engage with the problem on a human level again.’ Clients are paying for someone to introduce inefficiency back into their lives because they’ve accidentally optimized themselves out of the experience of living.
The testimonials read like performance art:
‘I hired Sarah to delete my GPT access for eight hours a day. My output dropped 60%. My thinking improved 200%. I’ve never been more inefficient or more alive. Five stars.’ (Marcus T., Senior Developer)
‘The Failure Subscription saved my creative practice. My consultant disables my autocomplete tools, hides my reference materials, and forces me to work from memory. I produce less. I make more mistakes. The work finally feels like mine again. Worth every penny.’ (Jennifer K., Design Director)
‘I pay $300/month for someone to randomly disable features in my productivity stack. I hate it. I need it. I can’t explain it to my spouse.’ (David R., Product Manager)
The subscription tiers will be darkly comic:
Basic ($99/mo): One forced failure per week
Premium ($299/mo): Daily inefficiency injection
Enterprise ($999/mo): Complete dismantlement of your optimization infrastructure
The service won’t market itself as self-help. It will market itself as archeological recovery: ‘Rediscover the thinking you lost to convenience.’
The Failure Subscription isn’t a bug in the Meaning Economy. It’s the purest expression of what the economy is actually selling: the experience of being human in a world that optimized humanity into a bottleneck. We’re so lost we’re paying to be sabotaged to feel human.
Corporate adoption will follow consumer behavior, as it always does, though the translation will be predictably absurd.
The Theatre of Authenticity: When Corporate Incentives Meet Meaning Economics
When corporate HR departments discover the Meaning Economy, the results are exactly as absurd as you’d expect.
A Fortune 500 HR department mandates a corporate retreat. The theme is ‘Deep Connection Through Analog Presence.’ To prove they care about the ‘human element,’ executives confiscate all smartphones and lock employees in a mountain cabin for 72 hours.
Panic ensues within four hours. Employees attempt to build a router from toaster parts and aluminum foil. One team tries to hack the cabin’s smart thermostat to send messages. Another group discovers they can’t remember anyone’s contact information because it’s all stored in confiscated devices.
The CEO (who hasn’t sent his own email in a decade) stands at the front forcing participation in ‘Radical Vulnerability Circles.’ People are instructed to “share their truth” without digital mediation. The AI sentiment bot secretly monitoring the session via hidden microphones flags a ‘Toxic Positivity Event,’ but the humans are just crying because they forgot what it felt like to be bored together.
By hour 48, something shifts. Without devices, people start having actual conversations. The forced inefficiency creates accidental intimacy. Someone admits they hate their job. Someone else admits they have no idea what the company actually does. A third person starts laughing and can’t stop.
The retreat ends. Everyone gets their phones back. Within 20 minutes, the vulnerability evaporates. Someone creates a Slack channel called “Remember That Weird Cabin Thing?” It gets 47 emoji reactions and is never mentioned again. The HR dashboard marks the initiative as ‘Engagement Success: 87% Positive Sentiment.’
Nothing changed. Everyone knows nothing changed. The theater was the point.
Or consider the ‘Analog-Only’ dating app launching in 2026 to massive fanfare. The pitch: ‘We find your soulmate. The old-fashioned way. You write a letter. We physically mail it to them. You wait.’
The app uses advanced AI logistics to route the letters and optimize matching algorithms, but the interaction is agonizingly slow. Users pay $50 monthly for the privilege of waiting three weeks for a response that could have been a text, convinced that the delay proves ‘true love’ because suffering equals authenticity.
The twist? The matchmaking algorithm is GPT-4. The handwriting recognition is AI. The entire infrastructure is automated except for one element: the mandatory waiting period. The only human part is the suffering. That’s the feature. That’s what people pay for.
The most ridiculous status symbol of 2026? Certified Imperfection. High-end furniture makers will sell tables with ‘hand-carved inconsistencies’ where apprentices deliberately introduce flaws into otherwise perfect pieces. In the digital realm, ‘glitch artists’ will intentionally introduce human errors into AI-generated text or video, selling the mistakes as proof of authenticity.
The mistake becomes the signature. The flaw becomes the proof of humanity. Collectors will pay six figures for a paragraph of text where the grammar is slightly off, where a metaphor doesn’t quite land, where the rhythm breaks in a way no algorithm would allow. Not because it’s better. Because it’s scarcer.
Watch for gallery exhibitions in late 2026: ‘Imperfect: A Retrospective of Human Error in the Age of Algorithmic Perfection.’ The centerpiece will be a novel written entirely without autocomplete, marketed as ‘unassisted human prose.’ Critics will call it ‘bracingly mediocre.’ It will sell for $40,000. The buyer will never read it. The ownership is the point.
What to Watch For
If the Meaning Economy is real and accelerating, specific signals will emerge in the next six months.
Watch for the first high-profile legal battle over what constitutes ‘handmade’ or ‘human-authored’ in a copyright or commercial case. The case will likely involve either a luxury goods manufacturer accused of using AI in ‘handmade’ products, or a publishing dispute over whether AI-assisted writing can be marketed as ‘authored.’ Once courts recognize that human creation has premium value regardless of quality metrics, the Meaning Economy gets legal infrastructure.
Look for the first major bank, airline, or healthcare provider to launch a ‘VIP Tier’ that guarantees human-only support, explicitly promising slower wait times but ‘real understanding.’ If they sell it for a premium (and they will), the shift is undeniable. The value proposition will be inverted: pay more, wait longer, get less efficiency, receive more presence.
Watch for platforms like MentorCruise or ADPList to shift from ‘career advice’ to ‘existence validation.’ The marketing language will shift from ‘Accelerate your career’ to ‘Make sense of your trajectory’ to ‘We witness your experience.’ The ultra-premium tier will offer ‘long-term relational witnessing’ (someone who knows your story well enough to tell you what it means).
A major third-party certifier will emerge (similar to Fair Trade or USDA Organic) but for ‘Low-AI Content’ or ‘Human-Intensive Creation.’ Products bearing the label will command 15-40% price premiums not because they’re superior but because they’re rarer. The market will treat human attention the way it currently treats organic farming: a premium attribute serving a demographic willing to subscribe to perceived authenticity.
Watch for the first viral manifesto from a group of ‘Analog Coders’ or ‘Unassisted Engineers’ who refuse to use autocomplete, copilots, or AI coding assistants, framing their deliberate slowness as a craft movement. The manifesto will use language borrowed from artisanal movements: ‘hand-coded,’ ‘small-batch software,’ ‘heritage programming languages.’ The tell: when the first job posting appears requesting ‘AI-free development practices’ and offering salary premiums for engineers who work without assistance.
These signals won’t emerge because people are rejecting AI. They’ll emerge because people are discovering what AI revealed we’d already lost.
The Reservation for Human Being
What the past year has revealed is a distinction we forgot to make: “doing” isn’t the same as “being.”
What’s remarkable isn’t that we’re discovering this distinction now. It’s that we forgot it existed. Somewhere between “time is money” and “move fast and break things,” we convinced ourselves that existence could be optimized. That presence was reducible to productivity. That witnessing was a luxury rather than a human need.
The Meaning Economy isn’t emerging because AI failed at something. It’s emerging because AI succeeded at revealing what we’d already sacrificed.
Machine intelligence can generate the content. It can’t witness the creation. Automated systems can provide the information. They can’t validate that the information matters to someone in particular. The algorithms can optimize the process. They can’t be present for the struggle. AI can solve the problem. It can’t suffer alongside someone while they figure out what the problem actually is.
The Meaning Economy is an attempt to build a reservation for human being inside a world optimized for machine doing. It’s messy, inefficient, and expensive. It’s also, possibly, the first genuine answer to the question everyone’s been asking: what will humans do when machines can do everything?
Turns out the answer was hiding in plain sight. Humans will do the things that require being there. Not to produce. Not to optimize. Not to scale. Just to be present, witnessed, imperfect, and irreplaceably human.
The irony of automation isn’t that it replaces human labor. It’s that it reveals which aspects of human experience were never about labor at all.
We’re not building a Meaning Economy to resist the machines. We’re building it because the machines showed us what we’d already lost (and made us willing to buy it back). The subscription fees, the certifications, the premium tiers for human attention: these aren’t resistance. They’re capitulation with a price tag.
We’re not reclaiming what was taken. We’re renting it back from ourselves, wondering why it feels different than it used to, unable to articulate that what we’re purchasing isn’t presence but the memory of what presence felt like before we turned it into a market.
The Meaning Economy isn’t the solution. It’s scar tissue forming over the amputation.