Reboot. Repeat. Regret.


“They were careless people… they smashed up things and creatures and then retreated back into their money.”
— F. Scott Fitzgerald, The Great Gatsby
In the 1920s, Fitzgerald wrote of reckless elites who broke the world and floated off in silk shirts and champagne flutes. A hundred years later, Sarah Wynn-Williams gave us a new version in her memoir Careless People: A Story of Where I Used to Work, chronicling her time at Facebook (now Meta). It’s the same pattern, but now the excess isn’t parties. It’s platforms. The debris isn’t emotional; it’s societal.
We don’t just live in the wake of tech’s carelessness. We scroll through it.
Wynn-Williams isn’t an outsider. She was there, inside the glass walls and growth charts. Her account is both a memoir and a warning: a whistleblower’s documentation of what happens when consequence is treated as a PR issue, not a moral one.
At Facebook, she saw the cost of growth up close. Wynn-Williams later went public, accusing leadership, including Mark Zuckerberg and Sheryl Sandberg, of enabling misinformation, brushing off dissent, and putting expansion above ethics. From the platform’s delayed response to the Rohingya genocide in Myanmar to its quiet accommodation of Chinese censorship demands, she paints a picture of a company that didn’t just move fast; it muffled what didn’t fit the story.
“This isn’t the age of disruption. It’s the age of consequence.”
What she describes isn’t evil in the cartoon-villain sense; it’s quieter. It’s ambient. A culture so committed to acceleration that consequence becomes a rounding error. Or someone else’s job.
And it’s the same kind of recklessness that Telle Whitney is calling out in her new book Rebooting Tech Culture. Only this time, it’s not just about Facebook. It’s the whole damn operating system.
Move Fast, Break People
Let’s be honest: tech never fixed what Facebook broke. And now, AI is being built in the same mould.
The old playbook is back: ship fast, scale faster, deal with consequences when the lawsuits roll in. It worked (financially) for social media. So why not try it on reality itself?
Whitney’s warning is clear. As someone who’s spent decades inside the industry, she sees the same exclusionary culture playing out all over again: in AI labs, in billion-dollar startups, in investor portfolios chasing “the next OpenAI.”
These tools aren’t being designed for everyone. And often, not by everyone either.
There’s a growing divide between those who build and those who are built upon. AI is increasingly shaped by a narrow set of voices — a concentration of elite engineers, funders, and platforms — while the rest of the world becomes its test set. The results affect how we search, how we learn, how we work, and how we’re seen.
Shoshana Zuboff called this out years ago in The Age of Surveillance Capitalism:
Tech's true innovation wasn’t the algorithm — it was the ability to turn human experience into data, and then into profit.
It’s not paranoia. It’s business.
And as The Social Dilemma made painfully clear, when the product is free, you are the product. But with AI, we’re not just the product. We’re the pre-training data. The training set. The discarded scaffolding.
What’s missing isn’t just representation. It’s resistance. Teams move too fast for ethics to catch up, and feedback loops are flattened under quarterly metrics. A misfire in social media meant polarisation. A misfire in AI could mean systemic bias at the speed of scale.
The question isn’t whether these tools will reshape the world.
It’s whether anyone will be accountable for how they do it.
“We fed the machine our stories. It spat out a world that forgot us.”
When Memory Becomes Monetised
Tech companies used to profit from our attention, keeping us glued to screens so they could sell our gaze to advertisers. Then came the data era: every click, scroll, and message became raw material to feed algorithmic predictions.
Now, they’re profiting from something more permanent: our labour, captured not in real time but in memory.
Large language models like ChatGPT, Claude, and Gemini were trained on the open internet. That includes blog posts, Stack Overflow answers, GitHub issues, fan fiction, academic papers, podcast transcripts, showreels, digital art — anything public enough to be scraped. If it was online, it was fair game.
They didn’t ask. They didn’t pay.
But now it’s productive, polished, and resold.
They trained the model on us. Now they sell it back to us as if we never existed.
Every beautifully phrased response, every auto-generated image, every AI-written line of code is built on top of a million anonymous contributions stitched together, stripped of origin, and served back like it was born in a lab.
“The real looting was not of banks or shops, but of language, memory, and meaning.”
— Rebecca Solnit
First, they commodified our time. Then our behaviour. Now, they’ve commodified our creativity — the final layer of what it means to be human.
They call it artificial intelligence. But what powers it is very real: Teachers. Designers. Journalists. Coders. Poets. Musicians. The dev who answered your urgent bug question at 2 a.m., unpaid and uncredited. The writer whose copy was good enough to train a headline generator. The animator whose style became aesthetic fodder for infinite prompts.
Online, that labour was invisible. Inside the model, it’s erased. This isn’t just mimicry. It’s replacement. A library turned vending machine. Convenience with no citation. Scale with no soul. We used to worry about plagiarism.
Now we’re watching industrialised forgetting, disguised as progress and monetised as product.
The platforms are betting you won’t notice. Or worse: that you’ll love the output too much to care.
But not everyone is letting it slide. Legal challenges are emerging: from Getty Images v. Stability AI, in which Stability AI is accused of copying 12 million copyrighted photos, to Authors Guild v. OpenAI, in which writers claim their books were ingested without consent to train LLMs (Ars Technica, 2023). The question isn’t just ethical anymore; it’s legal. And precedent is still being written.
Designed to Exclude
Whitney traces the roots of modern tech culture to what she calls the “PayPal Mafia”, a small group of men who created the companies that still shape our world: Facebook, Tesla, Palantir, LinkedIn. These founders didn’t just launch products. They created a cultural template: hyper-competitive, male-dominated, obsessed with the myth of the lone genius.
Emily Chang’s Brotopia calls this out for what it is — a system built by and for a specific kind of founder, one that actively sidelines women and anyone who doesn’t match the brogrammer archetype. Meritocracy, she argues, became myth. A cover story.
And that myth got intellectualized.
In Zero to One, PayPal co-founder Peter Thiel describes the ideal founder as “a man with a plan” — singularly focused, contrarian, and immune to social consensus. It’s an ethos that rewards disruption, not dialogue. Vision, not feedback. Strength, not inclusion.
The result? A system where “culture fit” became a weapon. Where VC firms chased pattern-matched genius while filtering out dissent. Where innovation meant building for yourself — and assuming everyone else would follow.
It’s not a bug.
It was the blueprint.
What Diversity Really Does
Whitney isn’t just calling out the problem, she’s showing what better looks like.
She points to companies like AMD, where CEO Lisa Su and CTO Mark Papermaster led an inclusive design process that produced a more modular chip, one that helped AMD dethrone Intel. Inclusion wasn’t charity. It was strategy.
And that strategy is measurable.
A 2020 report from McKinsey & Company found that companies in the top quartile for ethnic and gender diversity were significantly more likely to outperform their peers financially, a finding that’s only strengthened over time (McKinsey & Company, 2020).
The takeaway? Diversity isn’t decoration. It’s infrastructure. And the moment you treat it like a checkbox rather than a creative engine, you lose what it’s actually for.
Wynn-Williams saw this, too. Her Facebook memoir reveals how brilliant people slowly slip away from companies that don’t hear them. They don’t leave in protest. They leave in silence, like a tab quietly closing in a sea of open ones.
Culture Change Starts Small
Whitney offers a new model, one built on what she calls the six Cs:
Creativity. Courage. Confidence. Curiosity. Communication. Community.
Not as slogans. As systems. Values that reward listening, make space for different kinds of thinking, and actually support the people inside the product pipelines.
But these aren’t soft skills; they’re the structural foundation of innovative, resilient teams.
Creativity means allowing ideas to come from anywhere, not just the most senior or loudest person in the room.
Courage is the ability to challenge legacy thinking, suggest unpopular ideas, and experiment without fear of retribution.
Confidence builds when people know their voice matters, not just when it echoes leadership's opinion.
Curiosity fuels better questions, better product decisions, and better ethical foresight.
Communication means active listening, psychological safety, and transparency beyond performative updates.
Community reminds us that innovation is a team sport, one that only thrives when everyone belongs.
You don’t have to be a CEO to start. Team leads can build this now. So can mid-career engineers. So can new grads. But the first step is seeing that our current tech culture isn’t neutral; it’s inherited.
And maybe it’s time we stop building with hand-me-down ethics.
The New Tech Heroes Don’t Look Like the Old Ones
We know the names: Zuck. Musk. Jobs. Bezos.
They were cast as visionaries. Mavericks. World-changers. Their biographies became Bibles. Their aphorisms, scripture.
And so, the industry measured success by the size of your exit, the edge in your voice, the chaos you could command. Leadership became performance. Genius became exclusion.
But maybe that story is outdated.
Maybe it was always a bit of a lie.
Because the real work — the kind that lasts, the kind that includes, the kind that heals what tech has broken — is being done elsewhere. Quietly. Sustainably. Without a TED Talk.
Whitney names her heroes: Lisa Su, who rebuilt AMD through inclusive design. Jayshree Ullal, who leads with substance, not spectacle. And the thousands more women, people of colour, neurodivergent thinkers, community-first builders — who are architecting futures that don’t require someone else’s ruin to function.
These are the technologists shaping systems with empathy, not ego.
They’re not tweeting through a crisis. They’re preventing it.
And maybe it’s time we stopped chasing disruption and started honouring care.
Maybe the future doesn’t need more geniuses.
Maybe it needs more listeners.
Let’s Stop Being Careless
In Careless People, Wynn-Williams doesn’t write a takedown. She writes a warning.
In Rebooting Tech Culture, Whitney gives us the manual for doing it differently.
Together, they tell a story that’s uncomfortably familiar: the most dangerous thing in tech isn’t the code. It’s the culture. The willingness to “figure it out later.” The assumption that harm is just the price of progress.
We’ve seen what happens when you build without responsibility. Facebook taught us that.
AI doesn’t have to be the sequel.
But if we don’t reboot the culture, not just the models, it will be.
And next time, the stakes will be much higher.
References
Books
Zuboff, S. (2019) The age of surveillance capitalism: the fight for a human future at the new frontier of power. New York: PublicAffairs.
Thiel, P. (2014) Zero to one: notes on startups, or how to build the future. London: Virgin Books.
Chang, E. (2018) Brotopia: breaking up the boys’ club of Silicon Valley. New York: Portfolio.
Wynn-Williams, S. (2025) Careless people: a story of where I used to work. New York: Random House.
Whitney, T. (2025) Rebooting tech culture: a new playbook for building inclusive innovation. Cambridge, MA: MIT Press.
Documentaries
Orlowski, J. (2020) The Social Dilemma. [Film] Distributed by Netflix.
Podcasts
Ibarra, H. and Whitney, T. (2025) Does the tech industry need a reboot? [Podcast] Harvard Business Review, 14 May. Available at: https://hbr.org/podcast/2025/05/does-the-tech-industry-need-a-reboot (Accessed: 4 June 2025).
Reports
McKinsey & Company (2020) Diversity wins: how inclusion matters. Available at: https://www.mckinsey.com/capabilities/people-and-organizational-performance/our-insights/diversity-wins-how-inclusion-matters (Accessed: 4 June 2025).
News & Legal Articles
Ars Technica (2023) Getty Images sues Stability AI for copying 12 million photos, violating copyright. Available at: https://arstechnica.com/tech-policy/2023/02/getty-images-sues-stability-ai-for-copying-12-million-photos-violating-copyright/ (Accessed: 4 June 2025).
Business Today (2025) From political interference to “share a bed” claims: who is Sarah Wynn-Williams? Available at: https://www.businesstoday.in/latest/world/story/from-political-interference-to-share-a-bed-claims-who-is-sarah-wynn-williams-ex-meta-executive-behind-careless-people-memoir-mark-zuckerberg-sheryl-sandberg-joel-kaplan-468127-2025-03-17 (Accessed: 4 June 2025).
Hindustan Times (2025) Facebook whistleblower Sarah Wynn-Williams’ explosive memoir shakes Silicon Valley. Available at: https://www.hindustantimes.com/world-news/us-news/facebook-whistleblower-sarah-wynn-williams-explosive-memoir-shakes-silicon-valley-101742093810596.html (Accessed: 4 June 2025).
Wikipedia (n.d.) Careless People. Available at: https://en.wikipedia.org/wiki/Careless_People (Accessed: 4 June 2025).
Written by Daniel Philip Johnson