Algorithmic Folk Beliefs
The rituals we've developed to appease systems we don't understand
Rydra posts at 7:23 PM with religious devotion. Three months ago, a video at that exact time went viral. Correlation, causation—who can say? The algorithm might have favored the content, the sound, the hashtags, or perhaps cosmic alignment. Now Rydra tracks every variable in a spreadsheet that would make a quantified-self evangelist weep: hashtag count, caption length, emoji deployment, Mercury’s retrograde status. When daylight saving time hit, she triggered three Discord servers into existential crisis. The question: does the platform respond to absolute time or audience activity? She has developed a liturgy.
She is not being irrational.
In 1948, B.F. Skinner watched pigeons invent religions. When food dropped at random intervals, the birds repeated whatever motion they’d been performing at that moment. One spun counterclockwise until dizzy. Another bobbed its head like a metronome. They had discovered causality where none existed, creating rituals from randomness.
Case rewrites her resume for the seventeenth time. Reddit whispers that applicant tracking systems (ATS) reject PDFs. Forum oracles swear two-column layouts break the parsing. A career coach charges $200 to reveal the keywords that will unlock employment. Case strips her professional identity to ASCII art, sacrificing personality for machine readability. The irony isn’t lost on her: she’s optimizing her humanity for robot comprehension. She clicks save anyway.
The trajectory is visible now. “ATS Whisperer” as certification program. High school electives in Resume Formatting Divination. Guidance counselors explaining that competence matters less than correctly appeasing the parsing spirits. Students will memorize incantations: “Calibri 11pt summons interviews, Times New Roman 12pt courts rejection.” Nobody will question this. Everyone will have a story about the qualified candidate rejected for mysterious reasons, the underqualified applicant inexplicably advanced, the perfect resume that vanished into void. We’re teaching our children to pray to machines while telling them prayer is irrational.
Research suggests most ATS folk beliefs are wrong. Systems accept PDFs, parse two-column layouts fine. The widely circulated claim that “75% of resumes are rejected by robots” appears in countless LinkedIn posts and career advice articles, but tracking it to a credible source proves difficult. Yet the myths persist, practices multiply, because the platforms remain black boxes. Case experiences rejection. The platform won’t explain why. So she develops theories, tests interventions, shares discoveries. This is what black box gaslighting produces: rational people driven to irrational-seeming behavior because the machinery denies what they experience while refusing to reveal how it actually works.
Anthropologists have a name for this: superstition under uncertainty. A century ago, Malinowski watched islanders perform elaborate fishing rituals before dangerous deep-sea voyages while skipping the magic entirely for safe lagoon fishing. The pattern holds: where stakes are high and mechanisms are hidden, rituals emerge.
We have built digital systems that reward and punish according to rules we cannot verify. Then we express puzzlement when people respond exactly as humans always have.
Miller cancels rides through the app rather than letting passengers initiate the divorce, because forum wisdom claims this protects his rating. He takes the scenic route home after shifts, hoping the system will reward his territorial devotion with better fares tomorrow. He logs out and back in at precise intervals: a digital genuflection that may or may not summon profitable rides.
Researchers call these behaviors “work games,” but the phenomenology predates platforms by millennia. Miller practices hopeful causal reasoning as old as human cognition. He connects actions to outcomes that may be random, may be determined by invisible mechanisms, or may actually be influenced by his behavior in ways he misunderstands entirely.
Evolutionary biologists explain this with simple math. False positives cost less than missing a real connection. When discovering a genuine pattern offers more benefit than performing useless rituals costs, those rituals survive natural selection. We’re descended from organisms that erred on the side of imagining agency where none existed.
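The asymmetry can be put as a toy expected-value calculation. This is a sketch only; every number below is an illustrative assumption, not a measurement of anything real.

```python
# Toy error-management model: a ritual survives selection when its
# expected payoff is positive, even if the pattern is probably fake.
# All numbers are illustrative assumptions, not measurements.

def expected_value_of_ritual(p_real: float, benefit: float, cost: float) -> float:
    """Expected payoff of always performing the ritual: with probability
    p_real the pattern is genuine and pays `benefit`; performing the
    ritual costs `cost` either way."""
    return p_real * benefit - cost

# A cheap ceremony with a small chance of tapping a real mechanism:
print(expected_value_of_ritual(p_real=0.125, benefit=16.0, cost=1.0))  # 1.0
# Positive expectation, so the ritual persists; skipping it forfeits
# the occasional real connection while saving almost nothing.
```

So long as rituals are cheap and genuine patterns occasionally pay off, performing them is the winning strategy on average, which is all selection cares about.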
Miller isn’t wrong to perform rituals. He’s wrong about which rituals matter.
The platform does respond to his behavior. Just not the behaviors he thinks. Every logout, every cancelled ride, every route choice feeds the system data about his compliance, his availability, his willingness to absorb uncertainty without demanding explanation. The algorithm doesn’t reward his ceremonies. It rewards his participation in a game where only one side knows the rules, and that asymmetry is the point.
Already, “Algorithm Whisperer” is emerging as a job function, even if not yet as a job title. The trajectory toward formal credentialing is visible for anyone who looks. State universities offering majors in Superstition Optimization. LinkedIn profiles listing Multi-platform Folk Belief Integration. The field exists whether anyone admits it or not. There are already consultants. Already courses. Already optimization guides circulating through Reddit threads like holy texts passed from congregation to congregation, each variation claiming the true revelation about what the platforms want.
Dating profiles will list these systems like astrology signs: “ENTP, Virgo rising, believes LinkedIn performs best on Tuesday mornings, won’t post TikTok after 9 PM, compatible with morning posters only.” The algorithms have become our new fate, our new horoscope, our new gods of chance.
Instagram’s leadership has repeatedly stated that shadowbanning doesn’t exist as users understand it. Meanwhile, investigative reporting has documented patterns of content demotion, hashtag suppression, and reach limitation without notification. Users experience the effects. The platform denies the framing.
This dynamic deserves a name: black box gaslighting. It’s epistemic violence, being told your perception isn’t real while the machinery remains too hidden to prove otherwise.
Rydra experiences it when her posts inexplicably reach 1% of her followers. Case experiences it when identical resumes produce wildly different outcomes. Miller experiences it when his rating drops for reasons the app won’t specify. The platforms tell them nothing is wrong, they’re imagining patterns, the systems work fairly. Meanwhile, engagement drops, applications vanish, income fluctuates, and the machinery that determines these outcomes remains determinedly opaque.
This is not conspiracy but economics.
Transparency would require platforms to admit uncomfortable truths. How much is arbitrary. How much is A/B testing users as experimental subjects. How much optimization serves extraction rather than experience. Opacity provides plausible deniability. “The algorithm” becomes an unquestionable authority, simultaneously blameless and unaccountable.
Occasional transparency gestures reveal just enough to claim openness while maintaining the opacity that matters: how the system actually responds to individual behavior in real time. Algorithm overviews describe categories, not consequences. Recommendation explanations name factors without weighting them. The performance of transparency preserves the reality of opacity.
What emerges in response is folklore passed user to user like oral traditions around a digital campfire. Stories about content moderation and recommendation engines that explain the inexplicable drops in engagement, that provide frameworks for navigating platforms that refuse to explain themselves. The folklore may be wrong in its specifics. But it is not irrational. It is exactly what humans do when consequences are real and causes are hidden.
Bester adds a line of code that serves no purpose, because removing it once crashed the program and no one knows why. Gibson would recognize this immediately: a fragile pattern that works precisely because no one understands why. Steve wrote the function three years ago before vanishing into startup purgatory. The team has an unspoken agreement: Steve’s magic function stays. Fridays, Bester refuses deployments. The last Friday deployment summoned digital demons, cascading failures that kept the CEO Slacking him at 11 PM. Now Friday is sacred, a sabbath for servers.
This practice has a name: cargo cult programming, after the Pacific Islander communities who built replica airstrips hoping to summon the god-like planes that had brought cargo during World War II. The parallel is uncomfortably precise. Even those who build the infrastructure develop ritualistic behaviors around their own opacity.
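The pattern looks something like this in miniature. Everything here is invented for illustration, including the no-op call and the “Steve” comment; it is not anyone’s real code.

```python
import time

def process_batch(items):
    """Uppercase a batch of strings."""
    results = [item.upper() for item in items]
    # Cargo cult artifact: this no-op delay does nothing anyone can
    # explain, but a crash once followed its removal, so it stays.
    time.sleep(0)  # Steve added this in 2021. DO NOT REMOVE.
    return results

print(process_batch(["hello", "world"]))  # ['HELLO', 'WORLD']
```

The `sleep(0)` is load-bearing only in the team’s folklore, yet the cost of keeping it is so low, and the remembered cost of removing it so high, that the ritual is rational by the same error-management logic that governs everything else in this essay.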
The gig worker worrying about invisible ratings responds rationally to irrational circumstances. Her income depends on satisfying a mechanism that won’t reveal its satisfaction conditions. The influencer developing posting ceremonies engages in hopeful pattern-matching as old as human cognition.
Psychology calls this “illusion of control.” People throw dice harder when wanting high numbers, prefer self-chosen lottery tickets despite identical odds. We’re pattern-seekers who feel agency where none exists.
But “illusion” misdiagnoses the phenomenon. Platforms do respond to behavior, just opaquely. Recommendation engines change based on actions. ATS systems process resumes according to logic. These aren’t beliefs in magic but attempts to divine rules of machines that refuse disclosure. Calling it “illusion” frames the problem as cognitive bias rather than designed information asymmetry. It pathologizes the user rather than interrogating the system.
Consider the market emerging to fill the gap. “Senior Algorithmic Appeasement Strategist. 7 years experience in TikTok ritual design, Instagram shadowban recovery, and YouTube thumbnail divination. 34% engagement improvement through evidence-based superstition frameworks.” This reads like parody. Check LinkedIn in six months. The only thing that will change is the job titles getting less self-aware about what they’re selling. By 2027, McKinsey will publish “The Algorithmic Ritual Revolution: $5 Trillion Opportunity in Digital Superstition Optimization.” Nobody will bat an eye.
“The Identity Strainer Theory” circulates on TikTok: platforms actively filter content from marginalized identities. Platforms deny it. Users experience it. In the gap between denial and experience, folk theories proliferate like mushrooms after rain.
These theories prioritize functionality over accuracy. They offer models for action under uncertainty. They create communities where strategies exchange and lore passes down. They convert passive confusion into active negotiation. Even wrong theories manage anxiety when stakes are high. Malinowski observed this a century ago.
Gig economy researcher Lindsey Cameron documented worries about invisible judgment among platform workers. The feeling of being watched and judged by mechanisms that can’t be understood or appeased. Drivers report stress, anxiety, and what Cameron calls “anxious freedom”: simultaneously autonomous and controlled. They’re independent contractors satisfying invisible managers, free to work when they choose as long as they choose what the platform prefers.
The platforms, meanwhile, call worker tactics “gaming” while using psychological techniques derived directly from behavioral research to maximize extraction. The gamification is intentional. The resistance tactics are predictable responses. The whole arrangement has the structure of a game neither side will admit they are playing.
Platforms understand what Malinowski knew a century ago: uncertainty breeds ceremony, ceremony breeds engagement. Transparent mechanisms would be less profitable than hidden ones. Users who understood the logic would optimize and move on. Uncertainty keeps them engaged, experimenting, checking metrics seventeen times daily.
Superstitions aren’t bugs.
They’re engagement models.
We’ve constructed a technological environment where consequential decisions are made by mechanisms that won’t explain themselves. We’ve privatized and automated ancient forces humans once propitiated through ceremony. The forces that determined hunting success, crop growth, favor granted or withdrawn.
The platform logic that determines content visibility, application consideration, ride offers, and partner suggestions operates according to rules as inscrutable as divine will. Humans do what they’ve always done with inscrutable power: develop ceremonies.
Ceremonies may not work. That’s beside the point.
They provide structure for engaging uncertainty. They create agency where minimal exists. They convert the terrifying arbitrariness of hidden platforms into something navigable, discussable, refinable.
University of Chicago behavioral scientist Jane Risen showed that humans can recognize beliefs don’t make sense yet act on them anyway. She calls this “acquiescence.” We aren’t fooled by superstitions but find them more useful than their absence.
Nobody knows if platform folk beliefs are real. The platforms won’t say. But the stakes stay high and the mechanisms stay hidden, so the ceremonies continue. Most people reading this recognize the pattern. The subject line length that opens emails. The LinkedIn posting time that maximizes engagement. The dating app bio structure that generates matches. Checking metrics. Adjusting variables. Performing ceremonies that might be superstition or might be survival tactics. It’s genuinely hard to tell.
Rydra will keep posting at 7:23 PM. Case will keep stripping her resume of personality. Miller will keep logging out and back in. They do what humans do when confronted with inscrutable power: seek patterns, develop practices, hope performance matters.
Platforms benefit from confusion. Whether they admit it is irrelevant.
The mechanism is self-reinforcing. Opacity generates uncertainty. Uncertainty generates ritual. Ritual generates engagement data. Engagement data refines the opacity. Users who understood the logic would optimize once and move on. That’s precisely what can’t be allowed to happen.
What’s already happening will accelerate. The platforms won’t become more transparent because transparency conflicts with profit. The folk beliefs won’t disappear because humans don’t stop being human when confronted with digital inscrutability. The consultants selling ceremony optimization will multiply because the market rewards those who provide structure for navigating uncertainty, even false structure.
The pattern accelerates. Resumes that list optimization credentials instead of skills. Dating profiles specifying posting times rather than interests. Children learning platform appeasement before algebra because invisible systems that determine opportunity matter more than visible ones that teach nothing the machines value.
Unless we choose transparency over extraction.
Rydra posts at 7:23 PM. It might not matter. It might matter more than she knows in ways she’ll never understand. The platform won’t tell her. That’s not a bug. That’s what makes her check back tomorrow to see if today’s ceremony worked, to adjust variables for the next attempt, to remain engaged with a system that profits from her uncertainty.
She has developed a liturgy. So have you. That’s what we do now: perform rituals for systems that won’t explain themselves, hoping our ceremonies matter more than the silence suggests.
Research Notes: Algorithmic Folk Beliefs
Started with something I kept noticing in casual conversations: people talking about Instagram like it’s a moody deity that needs appeasing. Post at 3:47pm for maximum engagement. Use exactly seven hashtags. Don’t edit captions or the algorithm gets angry. These weren’t teenagers being silly. These were professionals, people who understood technology, t…