The Epstein Economy, or Why Capitalism Does Not Need You

Johanness Nilsson

There's a moment in any good diagnosis when the scattered symptoms stop looking random.

I've been watching the news with the same feeling I get when I watch a precision-machined part emerge from a mill slab in the CNC for the first time, or when I trace a fault through a complex piece of electronics. It's a particular high-alert flow state in which anomalies are obvious before they fully emerge, and erratic logic starts to resolve into a single failing component. Not panic. Pattern recognition.

The US and Israel are bombing Iran. The ceasefire in Gaza is dead. The man accused of running a global trafficking network involving some of the most powerful people in the world died in federal custody, and the client list was quietly buried alongside him. Regulatory agencies are being dismantled faster than the press can report it. The stock market does what the stock market does while real wages erode and another generation watches homeownership become theoretical. Somewhere in a data center, an autonomous agent just closed its first commercial contract without a human being involved at any step.

These are not separate stories.

The Lift 📈

Lift is a term I recently encountered in Exocapitalism, and it describes the structural tendency of capital to ascend away from the messy, friction-laden business of actually making things and selling them to humans. At the bottom of the chain (economists call it B-to-C, business-to-consumer), firms sell something to a human who can compare prices, go elsewhere, and say no. The consumer is the final arbiter of value. That makes price relatively fixed, margins relatively thin, and the whole enterprise, to the sentiments of a capitalist, relatively annoying. "Lift" is a single word that eloquently encompasses and labels the flight from this condition.

As you move up the chain — business-to-business-to-business, the B-2-B-2-B-2-B-2-B-2-B ladder — something strange happens to price. It becomes elastic. The relationships become contractual, then strategic, then something harder to name: a commingling of finances, a coalition of interests, a shared project of extracting value from the layers below. By the time you've reached the upper rungs — high-frequency trading firms operating at the level of microseconds, arbitraging price differentials no human could perceive — capital has fundamentally left the human world behind, soaring into the stratosphere of Exocapitalism.

This is not conspiracy. It is the logical terminus of a system optimizing for Lift. For a continuous flight from fixed price toward elastic price, from production toward abstraction, from obligation toward optionality. Nike sells a swoosh, not a shoe. NVIDIA sells a software abstraction of a physical chip manufactured by TSMC. Amazon sells the logistics overlay that sits above physical distribution. Salesforce sells something so fantastically generalized, so deliberately incomplete, that other companies can nest their revenue streams inside it like cells in a cow's stomach lining. Maximum surface area for maximum revenue digestion.

[Diagram: circular AI investments among OpenAI, NVIDIA, Microsoft, Oracle, CoreWeave, and AMD]

The AI Economy Circle Jerk

Look at this diagram of current American AI companies and you see a web of circular investments. OpenAI, NVIDIA, Microsoft, Oracle, CoreWeave, SoftBank. All of these companies are simultaneously investor, customer, and partner. Capital commingling with capital, price becoming increasingly fictional, the question of "who actually made what" becoming increasingly unanswerable. This is not the economy Marx was analyzing. It is something new, or at the very least newly visible at scale. It has been labeled Exocapitalism: an economy that has exited the human sphere, that operates according to its own logic, that regards the human world the way a river regards the terrain it carves. Not with malice. With indifference.

What matters about Lift for this thesis is where it leaves the people at the bottom of the chain. The people who are not high-frequency traders. The people whose economic value lies in the cognitive and administrative work that AI is now absorbing. The people who were already watching the rungs of the middle of the corporate ladder dissolve before AI finished the job.

And then there is the illusion. This is the part that doesn't get named clearly enough: before the displacement, there is a phase of apparent productivity. The LLM arrives and the output metrics appear to go up. Emails answered faster. Reports generated and refined in minutes. Meeting summaries, strategy documents, competitive analyses all appearing on schedule, all formatted correctly, all signaling the presence of cognitive work. The appearance of thinking. The sound of insight. What you actually have is statistical pattern-matching at scale, producing text that is shaped like human knowledge without being anchored to it.

The industry term of art is slop. AI-generated content that fills space and constitutes an entire new economic layer of productivity theatre. Not lies, exactly. Not hallucinations, necessarily. Just the generic output of a system trained to produce the most probable next token, scaled up until it accounts for a meaningful fraction of all new text on the internet. The downstream cognitive effect has been named, too: brain rot. Not dramatic incapacitation. A slow erosion of the capacity to distinguish signal from noise. An environment so saturated with plausible-sounding text that the skill of calibrated skepticism begins to atrophy from disuse. A zone, flooded with shit.

I recently witnessed this happen from the inside. My boss issued two decrees. First: proficiency with LLMs is part of the job description. Not encouraged, not optional, but required. You gotta master the LLM as a professional baseline. The second: every employee must submit a weekly progress report, a personal 1:1 account of their work for an audience of one. He rarely responds with any meaningful feedback.

I will openly share that I regularly use computer automations to manage tasks, take notes on work and meetings, and schedule events. But writing and presenting documentation is a particular skill that I hold dear. Needless to say, I wrote my report by hand. Thought it through, drafted it, the way I would have written a letter. I'm proud of the work I do. I don't need any model in my creative loop.

His response arrived a few hours later. I needed to adjust my "style.md", he wrote. This is a term borrowed directly from LLM configuration vocabulary: the file you edit to tune a model's output. The reports "waffled" on, were too idiosyncratic. Too personal. Not corporate-friendly enough. I should smooth them out.

He thought I had used a model. He was correcting what he perceived as a misconfigured AI. The feedback was a parameter adjustment for a system that was not there.

I want to sit with this, because the irony is doing structural work and I don't want it to collapse into a punchline. The problem, from his perspective, was not that my writing was bad. It was that it was mine. Recognizably a particular person's thought, with the weight and inflection of someone actually deciding what to write. That texture had become, in this environment, a deviation from expected output. The expected output was the smooth, evenly-lit prose of a language model trained to be acceptable to everyone and offensive to no one. A report that read like a report. Productivity theatre, fully internalized, not as deception but as a new definition of competence.

This appears to me to be the exact mechanics by which cognitive displacement precedes economic displacement. First the reflex to produce the expected shape of output is built. Then the shape becomes achievable without you.

This is not only a cognitive phenomenon; it has formal-economic parallels too.

EXAMPLE:
Consider the Starbucks app. In the United States it is nearly required for a reasonable Starbucks experience. Full disclosure: I actually like a good pour-over coffee, and although preparing one always seems to mildly annoy the Starbucks barista, they almost always nail it. A great and dependable cup of joe. The app offers line-skipping, ordering ahead, and friction-free transactions. To use it, you preload money in 💲25 blocks. Those dollars are translated into Stars, a tokenized version of your hard-won bucks one step removed from the original currency. The Stars feel like money. They spend like money. They are real enough to organize behavior around. But they belong to Starbucks in a way that the bucks in your wallet do not.

Starbucks is currently sitting on 💲2 billion of stored-value balances from customers, which puts it in the top 25% of US banks by raw capital held. Except it is not a bank, and it pays no interest on that money. None. That is pure float. Starbucks can invest it, bet on it, run its operating expenditures on it, with zero liability to the people who loaded it. The Stars are real enough to feel like money, abstract enough to belong to Starbucks. That is not a loyalty program. That is a bank that doesn't have to act like one.
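The float math is worth making explicit. Here is a back-of-envelope sketch: the 💲2 billion balance is the figure cited above, but the yield is my own hypothetical assumption for illustration, not Starbucks' actual return.

```python
# Back-of-envelope float economics. The balance is the stored-value
# figure cited in the text; the rate is a hypothetical short-term yield
# assumed purely for illustration.
float_balance = 2_000_000_000  # ~$2B USD in customer stored-value balances
assumed_rate = 0.04            # hypothetical 4% annual yield (my assumption)

annual_float_income = float_balance * assumed_rate
print(f"${annual_float_income:,.0f}")  # → $80,000,000
```

Tens of millions of dollars a year, earned on money customers handed over for free, with no interest owed back.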

The token economy is Lift applied to the unit of exchange itself. And it is not new. This is where David Graeber's Debt becomes uncomfortable reading. Economists love to romanticize barter — the story of the village egg-trader and the blacksmith, exchanging goods directly, value arising from pure use, the market emerging organically from the inconvenience of carrying chickens everywhere. Graeber's finding, backed by decades of anthropological research: it never happened. Barter economies appear after money, not before it. Exchange value is a necessary precondition to barter, not its product. The economists have the story exactly backwards. Money is not an evolved solution to the inconvenience of carrying chickens. Money is the original condition. Barter is what happens when money breaks down — a degraded fallback, not a primitive foundation.

The implication compounds. Every "primitive" token economy — from Starbucks Stars to the algorithmic meme coins that AI agents now generate and destroy autonomously without any human step in the loop — is not a late-stage abstraction of something that started simple. It was always abstraction, all the way down. The AI investment web, where capital circulates between OpenAI and NVIDIA and Microsoft and Oracle in loops that make "who owns what" genuinely unanswerable — this is the same structure. Price becoming fictional. Output becoming theatrical. The appearance of value creation standing in for the thing itself.

The people at the bottom of the chain, which is to say a growing majority of people, are experiencing Lift from the wrong direction. Not as ascent. As abandonment.

What Lift Looks Like From Below

I want to be concrete about this, because the abstract version is easy to hear and hard to feel.

What Lift looks like from below is: your job description evolves away from making or fixing things toward coordinating, verifying, and routing information inside institutional hierarchies. The hierarchy layers on top of itself — managers of managers of managers — not because anyone designed it that way, but because that is what organizational Lift looks like from the inside. Each layer believes it adds value. Most of them, on honest inspection, are generating heat rather than output — cycles of reporting, reviewing, and re-reviewing that produce the appearance of rigor without changing any outcome.

Then AI arrives and strips out the cognitive justification for several of those layers at once. Not all at once. Slowly at first. Then suddenly.

The people most exposed are not machinists or electricians, nurses or plumbers. They are the ones whose work was already the most abstracted from physical reality — the analysts, the coordinators, the mid-level cognition workers who were, it turns out, standing in for an intelligence bottleneck that is now being eliminated. And the elimination is not primarily a story about robots. It is a story about Lift. AI is just the mechanism by which capital completes its exit from the human layer of the economy.

What the standard AI-disruption narrative misses is the texture of what is actually happening. Both versions — the techno-optimist acceleration fantasy and the terminator dystopia — require a dramatic event. A singularity. A confrontation between Man and Machine that resolves the question one way or another. The more plausible outcome requires none of that: no army, no cliff edge, no moment of formal capitulation. It is quieter. In some ways it is, to me, more disturbing precisely because it generates no obvious moment of resistance.

I first encountered Marek Poliks through an earlier work he co-edited, Choreomata: Performance and Performativity after AI, together with Roberto Alonso Trillo, his co-author on the aforementioned Exocapitalism. In Choreomata, Trillo develops a concept he calls Vaporspace, which is useful here. Not as science fiction but as structural analysis. The mechanism is de-skilling: not robot replacement of physical labor, which is at least visible, but recursive cognitive erosion. LLMs train on human-generated content, right? So as humans increasingly outsource cognition to LLMs, the human-generated content feeding future models degrades. The models train on their own slop. The output of de-skilled humans trains the next generation of models. The next generation produces content that de-skills the humans who consume it. You get a decaying curve of capacity. Not a cliff edge, not a plateau, but a slow compression toward a floor of intellectual mediocrity that affects the human and the machine simultaneously, each degrading the other.

This is what Poliks and Trillo call model collapse at civilizational scale. The technical phenomenon is already documented: AI models trained on AI-generated output lose variance and converge toward generic, flattened responses. The broader cultural phenomenon is the same process operating on the LLM users: a civilization that has begun outsourcing its cognitive variance to systems trained on the mean. The noosphere flattens. The conceptual space contracts. Not because anyone designed it that way. I don't think some evil, maniacal super-villain is sitting in a bunker deciding to make humanity stupider. It is where the incentive structure points. Every efficiency gain from AI-assisted cognition is also a lost repetition of the underlying skill. Every time the LLM drafts the memo or solves the equation, the human doesn't. The outcome follows obviously.
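The variance-loss mechanism has a minimal statistical illustration. This is a toy sketch of my own, not the documented LLM experiments: each "generation" fits a Gaussian to a small sample drawn from the previous generation's fitted Gaussian, and finite sampling steadily shrinks the estimated variance toward zero.

```python
import random

# Toy model collapse: generation N+1 is fit to samples drawn from
# generation N's own fitted distribution. With finite samples, the
# variance estimate drifts downward, so the distribution compresses
# toward its own mean. All parameters are illustrative.

random.seed(0)

def fit(samples):
    """Maximum-likelihood mean and variance of a sample."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / n
    return mu, var

mu, var = 0.0, 1.0  # generation 0: the "human" distribution
for generation in range(200):
    data = [random.gauss(mu, var ** 0.5) for _ in range(20)]
    mu, var = fit(data)

print(f"variance after 200 generations: {var:.6f}")
```

No single generation looks catastrophic; the collapse is only visible across the whole sequence, which is exactly the point.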

That outcome is not incapacitation in any dramatic sense. It's something more precise. Here we have the meatbag problem. Humans displaced yet still stationed in front of machines: technically present in the production process, technically contributing, but the contribution has been hollowed out. You are still at the desk. You are still in the loop, nominally. But your function, from the perspective of the system above you, is what it always was to the high-frequency trader at Citadel looking down through the statistical model: you are a volatility generator. Your cognitive output matters to the machine exactly as much as your labor output matters to the capital that employs it — as raw texture to be processed, not as meaning to be engaged with.

The rotten core is this: AI is being sold as a productivity tool, a capability amplifier, the great equalizer that puts enterprise-grade intelligence in the hands of the individual. The structural logic of Lift suggests something different. Capital's interest in AI is not primarily as a tool for augmenting humans. It is as a mechanism for completing the exit from the human cognitive layer! The same exit that Nikeification accomplished for manufacturing, that Uberization accomplished for service work, that SaaS accomplished for enterprise software. The cognitive worker was the last rung before full automation. Not because anyone planned it this way, but because that is where the Lift goes. The rung is being sawed off from below. What is being built in the space below it is not a replacement for the human cognitive layer. It is the absence of a replacement. It's the formal confirmation that the layer was never the point.

I have said before that employment was always a more fragile arrangement than we were led to believe. That the quiet promise embedded in industrial civilization, work and you will survive, was always contingent on technology, not natural law. What I want to add now is that labor displacement is not only a technological phenomenon. It is the human-scale experience of capital's structural departure from the human world. The machine doesn't need to be malicious. It just needs to be cheaper and less fatiguing than you.

The question that follows is the one that sits at the center of everything: once capital has lifted away from the human layer, what governance structure remains to manage human reproduction?

The answer is historically uncomfortable. Before capitalism domesticated human labor into the wage relation, the answer was always coercion. And as capital lifts far enough above the need for human cognition, the old answer begins to reassert itself in new forms, not as formal chattel slavery, but as the globally distributed networks of near-forced labor that currently sustain supply chains invisible to the formal economy. Capital lifts. Labor stays. The gap between them fills with compulsion.

But there is a layer Exocapitalism does not fully name, and I want to name it.

The Epstein Economy

Jeffrey Epstein was not primarily a pedophile, although he was certainly that. He was, at his core, an operator of what I've come to think of as the informal governance layer of a world in which capital had already begun its exit from accountability.

He ran an economy. A real one, with deposits, withdrawals, services rendered, and obligations incurred. The currency was access. The commodity was leverage. The clientele was drawn from the highest rungs of the formal economy: hedge fund managers, tech billionaires, academic administrators, politicians, royals, intelligence figures. The mechanism was simple — provide services that create compromising obligations, and the obligations become the asset. A distributed ledger of secrets. A network of people who owed him things.

This is not a new pattern. What Epstein represented was its operation at unprecedented scale, with unprecedented impunity, at the precise historical moment when capital was lifting furthest from accountability. The impunity is the key data point.

Epstein was arrested, investigated, and released in 2008 under a non-prosecution agreement that federal prosecutors later called a violation of the Crime Victims' Rights Act. He was arrested again in 2019 and died in federal custody under circumstances that, on their face, are not consistent with the security protocols of a maximum-security federal facility. The client list — the record of who used his services, who knew, who protected him — has been systematically sealed, reduced, and shielded from disclosure by the very court system that is supposed to embody equal application of law.

The people with the most to lose from full accountability are also the people with the most power to prevent it.

What I'm calling the Epstein Economy is not reducible to one man or one network. It is the name for the set of informal exchange structures — favors, access, leverage, selective enforcement, revolving doors, sealed records, quietly dropped investigations — that emerge in the space between formal capitalism and the humans it has left behind. It is the social tissue that holds the formal economy together at the top, in the same way that undocumented and coerced labor holds it together at the bottom.

The Epstein Economy is Lift, applied to power itself.

If the formal economy's lift is capital ascending from production through services through software through finance toward fully automatic capital generation — then the informal economy's lift is the corresponding ascent of the powerful above consequences. Each reinforces the other. Capital that lifts above human accountability produces powerful people who operate above the rules that govern everyone else. People above the rules protect the conditions under which capital can continue to lift. The two spirals are one spiral.

New research from Eriksson and Vartanova, published this year in Frontiers in Political Science, provides the micro-level mechanism for why this is so destructive in democratic contexts specifically. Analyzing survey data from more than 85,000 individuals across 62 countries, they found that corruption erodes generalized social trust — the baseline belief that most strangers can be trusted — significantly more in democracies than in autocracies. The effect size is substantial: in highly democratic countries, moving from low perceived corruption to high perceived corruption is associated with the probability of trusting others collapsing from roughly 34% to 14%. In autocracies, the same shift produces a much smaller decrease.

Their explanation is structural. In autocracies, predatory elites are perceived as a distinct class, separate from and opposed to ordinary citizens. Their corruption stays psychologically contained within the political sphere. Citizens can trust their neighbors while knowing the ruling class is corrupt. The two spheres are quarantined from each other.

In democracies, officials are elected by the people and are supposed to represent the people. When they are caught being corrupt, citizens do not only lose trust in those officials. They lose trust in the people who put them there. In the people who share their society. In themselves, as members of a polity that produced this. The corruption of representatives becomes evidence about the represented. The rot bleeds downward and outward into the social fabric.

Each sealed client list. Each investigation dropped. Each visibly broken rule that produces no consequence. These are not merely political events. They are inputs to a social erosion process that the data suggests is literally structural to democratic societies — the price of accountability, as Eriksson and Vartanova put it. Democracies generate high social trust by promising fairness and equality. And that same promise makes the trust acutely fragile. The Epstein Economy doesn't just protect the powerful. It dissolves the social substrate that makes democratic governance viable at all.

The War in Iran Is Not Separate From This

There's a British economist who goes by Barry — no last name — who recently posted a video that I keep returning to because he does something I rarely see done cleanly: he draws the neurological throughline from geopolitical events to individual brain states.

His argument, grounded in Amy Arnsten's research at Yale, runs like this:

The human brain operates in two primary modes. The prefrontal cortex — newer, slower, metabolically expensive — handles long-range planning, abstract reasoning, institutional trust. The capacity to organize and cooperate with strangers. The capacity to believe that playing by the rules will eventually produce a fair outcome. In a very real sense, the prefrontal cortex is what makes democracy neurologically possible.

The amygdala is older, faster, cheaper. It handles immediate threat, tribal loyalty, survival. It is not suited to building functional civic societies. Under normal conditions — when you feel reasonably safe and in control — the prefrontal cortex is running the show.

Under chronic stress, and specifically under the stress of feeling that you have no control over what is happening to you, the balance tips. Chemically. Structurally. The prefrontal cortex weakens; the amygdala grows. Literally. It puts on new dendrites. The circuits that generate fear and tribal instinct become stronger as the circuits that enable institutional reasoning become weaker. And then the vicious cycle: more amygdala means more fear means more stress hormones means weaker prefrontal cortex means more amygdala.

The critical variable is not stress itself. It is uncontrollable stress. Perceived helplessness is the switch. When you have agency, when you believe your actions can affect outcomes, the toxic neurological cascade does not occur to the same degree.

Now consider what it means to watch a war begin – publicly, casually, with the confidence of people who never once considered the rules might apply to them – when you have no power to stop it.

Consider what it means to watch a client list sealed and a death go unexplained, a property in New Mexico left un-investigated, and to understand perfectly well that this happened because the people involved had enough power to make it happen.

Consider watching that not once but as a sustained pattern, across decades, accelerating.

You are not imagining the dread. Your amygdala is doing its job. The problem is that the dread is being continuously triggered by events you cannot influence, in a system that is visibly indifferent to your preferences. That combination is, at this point, very well-documented neurologically in terms of the damage it does.

And here is what makes it specifically rather than accidentally corrosive: Steve Bannon, one of the principal architects of the Trump movement's political methodology, articulated the mechanism explicitly. "Flood the zone with shit." The goal is not to persuade you of a particular truth. The goal is to make truth totally unrecoverable. To saturate the information environment with contradictory, emotionally charged messaging until citizens stop believing that truth is findable. As the writer Jonathan Rauch observed, this is not a propaganda strategy aimed at winning arguments. It is a strategy aimed at burning down the room in which arguments take place.

Once you stop believing truth is findable, you stop demanding accountability. Once accountability becomes unthinkable, the Epstein Economy operates without friction.

The war in Iran is a message. The message is: the rules no longer apply to those with sufficient power. The message is for you, not for Iran. Iran is the demonstration. And the neurological effect of that message, received helplessly, over and over, is precisely the amygdala-dominated brain state in which the appeal of a strongman becomes, as Arnsten's research documents, neurologically almost irresistible. The psychologist Karen Stenner calls this "normative threat". It is the perception that shared social order is collapsing, and her research shows that when people experience it, they become dramatically more susceptible to authoritarian leadership. Not because they've been persuaded. Because uncertainty becomes unbearable and a strong leader who promises to cut through it feels like oxygen.

You can see how the people getting rid of the rules are the same people who benefit most from a frightened, amygdala-dominated electorate. Dumb. Isolated. Afraid.

The Epstein Economy does not just protect the powerful from consequences. It systematically cultivates the neurological conditions under which the less powerful stop demanding consequences. The asymmetry is complete. Wealth rises upward. Risk falls downward. Every single time.

The Structural Picture

Now allow me to put these pieces together.

Exocapitalism describes the formal structural ascent of capital away from the human world. Upward from production toward abstraction, from fixed price toward elastic price, from obligation toward optionality. Capital becomes progressively indifferent to humans, not because anyone decided this but because that is where the Lift takes it.

The Epstein Economy is the informal governance layer that emerges in parallel: the network of favors, leverage, access, and selective enforcement that allows the people who operate at the top of the formal economy to lift above accountability in the same way their capital has lifted above production. The formal and informal are not separate systems. They are two expressions of the same structural logic.

The erosion of democratic rules and institutional trust. Geopolitically in Iran, domestically in the dismantling of institutional checks. This is what it looks like at the level of international and civic order. The global rules-based structure was always the international equivalent of democratic norms: a framework that tethered the powerful to consequences, made formal cooperation possible among strangers, and prevented the strongest actors from simply taking what they wanted. It is being eroded by the same people who benefit most from its erosion, by the same mechanism: a lift from accountability.

The corruption-trust research adds the transmission mechanism at the individual level. Each visible act of impunity, each sealed client list, each unpunished bombing, each bent rule. It all erodes the generalized social trust that makes voluntary cooperation, democratic participation, and collective action possible. In democracies specifically, because those systems made a normative promise of fairness and representation, the betrayal lands harder. The trust collapses faster.

The neurological consequence is a population shifted toward amygdala dominance. A people chronically stressed, susceptible to authoritarian anti-solutions, increasingly unable to think institutionally. Which is, of course, precisely the population in which exo-capitalism's indifference to humans meets the least organized resistance.

The system generates the conditions for its own perpetuation.

So What Is Left?

I am writing this from my workbench. There is an oscilloscope and a set of logic probes on the desk next to me. There is the smell of metal, cutting fluid, ozone, and solder rosin in the air. Off in the distance a CNC machine is running a program that I wrote, producing precision parts. Somewhere behind me is a wooden chassis awaiting the return of its synthesizer innards after diagnosis and repair.

I am not writing from despair, and I don't want this to land as doom and gloom.

There is something clarifying about watching the machinery operate without concealment. For most of my life I was aware of the Epstein Economy the way most people are. As a kind of ambient background suspicion, a sense that the rules applied differently to different people, that the official explanations have never quite added up. That suspicion is not paranoia. It is pattern recognition. What has changed is that the machinery has grown less interested in maintaining appearances. The rules are being broken publicly, loudly, with the confidence of people who believe the consequences will never arrive.

That confidence may be wrong.

Consider the numbers, because they are worth sitting with. As of January 1, 2026, the combined net worth of America’s twelve wealthiest individuals exceeds 💲2.7 trillion — more than quadrupled from 💲608 billion in March 2020, according to Institute for Policy Studies analysis of Forbes real-time data. Those twelve people collectively hold more wealth than the bottom half of the global population. Four billion human beings.

Twelve vs four billion.
Those are not the odds of a completed extraction. They look more like the odds of an uprising.

What goes up…

History suggests that systems which lift so far from the people who sustain them eventually encounter what we engineers call a resonance failure: not a gradual decline but a sudden, catastrophic oscillation that builds when the driving load matches the structure's natural frequency and finally exceeds its tolerance. The Epstein Economy, the Iran message, the exo-capitalist lift: these are not endpoints. They are symptoms of a system that has been borrowing against its own legitimacy for decades, and the interest payments are accumulating.
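For the non-engineers, resonance failure can be sketched in a few lines: an undamped oscillator driven at its own natural frequency grows without bound, even though the driving force never changes. All parameters here are illustrative, not a model of any real structure.

```python
import math

# Toy resonance failure: an undamped oscillator driven exactly at its
# natural frequency. The forcing stays small and constant, yet the
# amplitude grows steadily until any real structure's tolerance is
# exceeded. Integrated with semi-implicit (symplectic) Euler.

omega = 1.0   # natural frequency of the structure
dt = 0.001    # timestep
x, v = 0.0, 0.0
t = 0.0
peak = 0.0

for _ in range(200_000):  # simulate 200 seconds
    a = -omega**2 * x + math.sin(omega * t)  # restoring force + resonant drive
    v += a * dt
    x += v * dt
    t += dt
    peak = max(peak, abs(x))

print(f"peak amplitude after 200s: {peak:.1f}")
```

The drive is never dramatic; the accumulation is. That is the shape of a system absorbing small, synchronized injuries to its legitimacy for decades.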

The antidote to amygdala activation is agency. The feeling that your actions matter. I'm describing something real at the neurological level: you cannot think institutionally when you feel helpless, and you feel helpless when you feel alone and uninformed. The first thing the machinery attacks is your belief that your actions can and will affect outcomes. The first thing worth defending is exactly that belief.

The economic question, how people survive materially when machines do the cognitive work, may be the easier of the two problems. The governance question is harder: what rules determine who gets access to the machines, and who gets to revise those rules? That question is being answered right now, mostly by people who are not you, mostly in ways that consolidate the lift and extend the impunity. The default answer, if you don't engage with it, is more Epstein Economy. More lift for those at the top, more coercion for those left at the bottom.

The alternative I keep returning to is the cypherpunks' alternative. It's the protocol-layer answer — to build systems where the rules of obligation are embedded in the infrastructure itself, where accountability is structural rather than contingent on the willingness of powerful people to self-police. Not because I think like a utopian, but because I think like an engineer, and I know that the right solution to a system that depends on trusting individual actors not to abuse their position is to redesign the system so that the position simply cannot be abused without detection.

The Epstein Economy ran for decades because its ledger was hidden and private. Its clients were powerful enough to keep it that way. The counter to that is not reforming the people at the top. That has likely never worked at scale. It is making the ledger public by design, by math, in ways that cannot be sealed by a federal judge or bought off by a well-connected attorney.
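What "accountability embedded in the infrastructure" can mean, in the smallest possible terms: an append-only, hash-chained ledger. This is my own toy sketch, not any particular protocol; the names and structure are illustrative.

```python
import hashlib
import json

# Toy append-only hash chain: each entry commits to the hash of the
# entry before it, so rewriting any earlier record changes every
# subsequent hash. Tampering is detectable by anyone holding a copy.

def append(chain, record):
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"prev": prev, "record": record}, sort_keys=True)
    digest = hashlib.sha256(body.encode()).hexdigest()
    chain.append({"prev": prev, "record": record, "hash": digest})

def verify(chain):
    prev = "0" * 64
    for entry in chain:
        body = json.dumps({"prev": prev, "record": entry["record"]},
                          sort_keys=True)
        if (entry["prev"] != prev or
                entry["hash"] != hashlib.sha256(body.encode()).hexdigest()):
            return False
        prev = entry["hash"]
    return True

ledger = []
append(ledger, "obligation: A owes B")
append(ledger, "obligation: B repaid A")
assert verify(ledger)

ledger[0]["record"] = "obligation: erased"  # a sealed record, rewritten
print(verify(ledger))  # → False
```

The point is not the twenty lines of Python; it is the design property. No judge can quietly seal an entry without the seam showing in every copy of the chain.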

This is a hard problem to solve and implement. It will require a lot more than just casting a vote. But it starts with refusing the cynicism that the machinery is trying to sell you.

Cynicism is the most useful thing to the already powerful. It is the Epstein Economy's best product. The wealthy buy private healthcare, private security, private justice. They have bunkers and escape routes. What the rest of us have — if we choose to use it — is each other, and the uncomfortable work of building systems that don't require trusting people who have demonstrated they cannot be trusted.

The rules didn't just fail, they were lifted away.

I think what we build to replace them is the only question that matters now.