The Friction Artists
How engineers learned to feel good about what they build
About a dozen Google employees resigned over Project Maven in 2018. Not thousands. Not hundreds. Twelve. The Pentagon’s machine-learning contract that sparked 4,000 petition signatures and months of internal controversy ended with roughly twelve people walking out publicly. Google declined to renew the contract. Something shifted.
This math shouldn’t work. A company with over 100,000 employees doesn’t typically alter course because twelve people quit. Yet the drone-vision project died, and those resignations were later cited as the catalyst. One engineer who left put it this way: “If I can’t recommend people join here, why am I still here?”
That question ricochets. Not just for the engineer asking it, but for the invisible thousands who never made the news. It ricochets into midnight calculations about mortgages and equity vesting. Into dinner party explanations that sound reasonable until spoken aloud. Into the quiet accommodation we’ve perfected—staying but shifting sideways, contributing but not to that, building but drawing private lines no one else can see.
The tech industry has the highest turnover of any sector: between 13 and 21 percent of employees leave each year. Average tenure at Google: 1.1 years. At Uber: 1.8 years. These are not factory workers trapped in company towns. These are the talent. The builders who can’t stay anywhere longer than it takes to vest their first equity tranche.
The exit interview is always the same ritual: “seeking new challenges,” “exciting opportunity,” “better alignment with personal goals.” Nobody says “I couldn’t look at myself in the mirror anymore.” The system has pre-approved excuses that make everyone comfortable.
But buried in the data sits a gap that should trouble anyone paying attention. When researchers survey departing employees, 62 percent cite “belief in the company mission” as relevant to their decision. When they survey employers, only 15 percent think mission mattered. The employers aren’t asking the right questions. Perhaps they don’t want the answers.
Somewhere in that gap live the friction artists. The ML engineer who requested a transfer rather than train the content moderation system on data that felt exploitative. The security researcher who let the surveillance tool optimization slip behind schedule until someone else picked it up. The infrastructure lead who didn’t refuse the assignment (that would create a record) but whose team somehow never got around to the integration that would have made the facial recognition system work at scale.
These decisions rarely make news. They rarely make anything. No petition. No walkout. No letter to the CEO. Just small acts of friction that accumulate in ways no one tracks. Strategic incompetence as ethics. Delay as dissent. The professional version of “I’d love to help but I’m slammed right now.”
The ACM and IEEE maintain a Software Engineering Code of Ethics. It dates to 1999, back when “software engineer” still felt like an aspirational job title rather than the profession that would restructure every other industry. The code instructs engineers to “promote no interest adverse to their employer or client, unless a higher ethical concern is being compromised.”
There’s the loophole, right in the founding document. When ethics conflict with employer interests, ethics are supposed to win.
In theory.
Unlike doctors or lawyers, software engineers have a code of ethics with all the enforcement power of a YouTube comments policy. Violate it, ignore it, disregard it entirely. No licensing boards. No bar exam. No continuing education. No enforcement body. The profession articulated principles in 1999 and then spent the next quarter-century not enforcing them, which is either admirably libertarian or catastrophically negligent depending on whether the engineer is building recommendation algorithms or just scheduling meetings.
The result is an ethical vacuum in which each engineer becomes a one-person supreme court, weighing algorithms against morality in the privacy of their own skull. The code provides moral cover (point to it if challenged) but no practical protection. Being seen as “difficult” can still end a career. The whisper network doesn’t distinguish between principled objection and personality conflict. Someone who refuses an assignment on ethical grounds looks, from the outside, exactly like someone who’s territorial about their code or bad at collaboration.
This explains why the friction artists stay quiet. Making yourself visible as an ethics problem ends careers. Making yourself invisible as a business problem might actually slow something down.
But sometimes visibility happens anyway. 2018 was the year tech workers realized they were workers, as journalist Nitasha Tiku put it. Not founders-in-waiting. Not temporarily embarrassed millionaires. Not partners in the mission but labor resources being optimized like any other input. The revelation was less political awakening and more cognitive dissonance finally achieving critical mass.
After Project Maven came Project Dragonfly: Google’s secret effort to build a censored search engine for China, one that would link users’ phone numbers to their search queries and suppress results about democracy, human rights, and religion. Fourteen hundred employees signed a letter. One senior research scientist resigned publicly. By December, the project had “effectively ended” after what internal sources described as a clash with members of the privacy team. By July 2019, Google confirmed it was dead.
Microsoft employees sent an open letter when the company won a $480 million contract to supply HoloLens headsets to the Army. “We did not sign up to develop weapons,” they wrote, “and we demand a say in how our work is used.” Over 250 employees signed. The contract survived.
At Amazon, employees and shareholders delivered 150,000 petition signatures demanding the company stop selling its Rekognition facial recognition system to police. In June 2020 (after years of pressure, and in the charged aftermath of George Floyd’s murder) Amazon announced a one-year moratorium on police use. Later extended indefinitely.
Seven hundred Google employees formed a group called “Maven Conscientious Objectors” to discuss strategy and share concerns. The term deliberately evoked the draft-era language of refusing to participate in war. The parallel was not subtle.
These public victories became case studies in collective action, evidence that organized workers can shape what their companies build. And they did shape it: specific projects, cancelled or delayed, because employees refused to be complicit.
But the academic literature has grown increasingly skeptical about what any of it actually changed.
Here’s what changed: some engineers got to feel better about themselves. Here’s what didn’t change: the structural incentives that make building surveillance tools more profitable than not building them.
A 2024 paper in Science and Engineering Ethics states it plainly: “Tech ethics is vague and toothless, has a myopic focus on individual engineers and technology design, and is subsumed into corporate logics and incentives.” The concern isn’t that individual ethical action doesn’t matter to the individual. It’s that individual ethical action primarily serves to manage the engineer’s own conscience rather than change anything structural.
Feel good about not building the surveillance tool. Meanwhile, someone else builds it.
This is conscience management dressed up as ethics. And once seen, it can’t be unseen. The entire apparatus of tech worker resistance might be less about preventing harm and more about allowing well-compensated professionals to maintain their self-image as good people while participating in systems that don’t care whether they’re good people or not.
Consider the selection effect. Twelve public resignations killed Project Maven. But what about all the projects that didn’t die?
Over time, engineers with ethical objections transfer quietly away, making room for engineers without those objections. The facial recognition contracts succeed because the teams self-select for people comfortable building them. The content moderation systems launch because everyone who found the training data exploitative already moved to infrastructure. Eventually the most ethically sensitive engineers concentrate in the least ethically troubling projects. The troubling projects get staffed by those less inclined to object.
Working on certain projects becomes a signal. Everyone knows what it means when someone stays on the surveillance team. What it means when someone volunteers for the law enforcement contracts. The system doesn’t need to enforce compliance. It just needs to let people sort themselves.
The corporate immune system doesn’t attack ethical resistance; it incorporates it, turning antibodies into nutrients. Routes around it. Optimizes. The friction artists exit to save their consciences. Someone else takes their place. The project continues.
Optimize for moral comfort. The system adapts around it. Everyone gets what they need.
The retaliation data suggests this pessimism isn’t paranoid. Meredith Whittaker, who helped organize the Google walkout, later said she was told to “abandon her work on AI ethics.” She left. Timnit Gebru was fired after a dispute about an ethics paper. Margaret Mitchell was fired while investigating the claims Gebru had raised.
The pattern isn’t subtle: companies praise ethical concerns in the abstract and punish them in the particular. They fund ethics research. They establish ethics boards. They issue statements about responsible AI. And then they fire the ethicists when they find ethics. It’s not hypocrisy if there’s clarity about the purpose. Ethics teams exist to manage perception, not prevent harm. When they accidentally try to prevent harm, that becomes a performance review issue.
And yet.
The projects did die. Maven, Dragonfly, the police contracts for Rekognition. Real outcomes, not just symbolic gestures.
But every engineer with ethical qualms who transferred away from surveillance tools made room for an engineer without those qualms. The projects that died got replaced by projects that learned to staff differently. The academic critique is right that individual action can’t fix systemic problems. Individual action primarily serves to manage the engineer’s own conscience. The system routes around resistance. Both things are true. That’s the trap.
What if the friction artists matter precisely in the ways we can’t measure? Every transfer request that delays a timeline. Every integration that never quite gets prioritized. Every engineer who declines an interview at a company whose products they can’t stomach defending. The cumulative effect isn’t tracked because it isn’t trackable. It manifests as unexplained friction, staffing difficulties, projects that take longer than projected.
The executives don’t log “failed to recruit because of ethical reputation.” They log “talent shortage.” Which might be the most effective form of resistance available: making yourself legible as a business problem rather than an ethics problem. Business problems get budget. Ethics problems get performance reviews.
Thirteen former Palantir employees published an open letter in 2024, condemning their former company’s work with immigration enforcement. The letter was unusual not because the critique was new (critics have targeted Palantir for years) but because it came from inside. Former employees breaking non-disparagement agreements, speaking publicly about what they built and what they now regret.
It’s a strange form of witness. Build the thing. Cash the checks. Vest the equity. Now warn others, like an arsonist who’s found religion and wants to talk about fire safety. The moral high ground is slippery when you’re standing on the ashes of a fire you lit yourself. But that’s also what makes the testimony harder to dismiss. These aren’t outside activists who don’t understand the constraints. They know how the sausage gets made because they held the knife.
Maybe that’s the role the friction artists play. Not preventing harm (that may be beyond individual power) but accumulating evidence of what the harm costs. Every departure adds to a record that only becomes visible in retrospect. Someday, when the histories get written, someone will tally the engineers who left rather than build the thing. The number will be larger than anyone expected.
The question is whether the record matters if it comes too late to change anything it documents.
There’s a version of this story that ends with solutions. Unionization, professional licensing, legal protections for conscientious objection. These reforms might help. Some are already underway.
But the reforms don’t alter the fundamental asymmetry. The power to build belongs to institutions. The power to refuse belongs to individuals. When those powers collide, institutions usually win, not through force, but through absorption. The objector leaves. Someone else takes their place. The project continues. The institution remains.
What the friction artists demonstrate isn’t that individual ethics can stop this process. It’s that individual ethics make it marginally harder. Slightly more expensive. Fractionally slower. Every small refusal is sand in the gears of a machine designed to build whatever it’s told to build.
The sand doesn’t stop the machine. It doesn’t even slow it down meaningfully. It just ensures the engineers can tell themselves they tried while the machine keeps building. That’s conscience management all the way down.
This may not be ethics in any substantive sense. Just the friction generated by people trying to live with themselves while participating in systems that don’t care whether they live with themselves or not.
But it’s what we have. The friction artists, making their quiet calculations, deciding day by day what they will and won’t build. The choices stay invisible. The cumulative effect stays unmeasured. The institutions keep building.
Somewhere right now, an engineer is deciding whether to take a meeting about a project they can’t defend. Weighing the mortgage against the conscience, the career against the cost of being difficult.
They’ll take the job. They’ll tell themselves it’s complicated, that someone else would do it if they didn’t, that at least they’ll do it thoughtfully. They’ll draw their private lines and feel good about where they’ve drawn them.
What they’re building doesn’t care about the lines. It never asked where they’d draw them. It never needed their comfort to ship.
And it’s shipping tomorrow.
Research Notes: The Friction Artists
Started with a question: twelve engineers quit Google over Project Maven in 2018, and the project died. How does that math work? A company with over 100,000 employees doesn’t typically alter course because twelve people resigned. Yet that’s what happened. Something about that arithmetic felt important.