Fault Lines

  Ed had never thought of himself as a decision-maker.

  He had been trained to follow systems, not challenge them. Even before he learned about the chip, even before the words "control architecture" entered his vocabulary, his life had been shaped by structures he did not question. Work orders arrived. He executed them. Problems were defined for him. Solutions were implied. The world had felt narrow, but stable.

  Now it felt infinite and brittle.

  Ed had been taught, gently and relentlessly, that systems were not moral actors. Systems were neutral. They produced outcomes, and outcomes were judged only insofar as they maintained stability. The word "breaking" itself had been framed as childish, destructive, a failure of imagination. You repaired systems. You optimized them. You worked within them. To break one was to admit that you did not understand it. And yet the longer Ed examined the structure that governed his life, the more that lesson curdled.

  The system did not merely constrain behavior; it authored it. It shaped what people could know, what they could want, and how much of themselves they were allowed to feel. If morality required consent, then nothing inside the system qualified as moral. Happiness achieved through sedation was not virtue. Order achieved through enforced ignorance was not peace. If he pulled the thread and the structure collapsed, would that be violence—or would it simply be the first honest act the system had ever endured?

  What frightened him was not the scale of the damage but the possibility that the system was right about one thing: that most people did not want to be free. Ed understood the cost of knowledge now. He felt it every time he remembered how easy it had been to accept his place before the chip failed. The system had not lied to him about happiness; it had simply defined happiness as the absence of conflict.

  If he shattered that definition, he would be giving people something heavier in its place—choice, responsibility, grief, anger, guilt. He would be forcing them to wake up. And Ed could not escape the possibility that in doing so, he would be condemning many of them to lives they were not equipped to live. The system was monstrous, yes. But it was also merciful in a way only a prison could be. The question that would not let him go was whether mercy without truth was anything more than cruelty with better public relations.

  He sat at the long table in the smaller meeting room of the settlement, hands wrapped around a mug that had gone cold, listening to Rocky talk through possibilities. The room smelled faintly of oil, dust, and old circuitry. Someone had tried to clean recently, but the effort had been symbolic at best.

  Across from him, Rocky leaned forward, elbows on the table, eyes bright in a way that made Ed uneasy. Rocky was in his element. That alone should have been reassuring. Instead, it made Ed more aware of how little ground he was standing on.

  Pearl Jammer sat to Ed’s right, quiet, hands folded, posture relaxed. Too relaxed.

  Pearl looked like someone waiting for a conversation to end so he could begin a more important one somewhere else.

  “So,” Rocky was saying, tapping the table with one finger, “the question isn’t whether the drones can be freed. It’s whether the system will survive the attempt.”

  Ed nodded slowly. “Define survive.”

  Rocky smiled thinly. “Good question.”

  Pearl inclined his head slightly, as if appreciating the phrasing. “Let’s be precise,” he said. “We’re not discussing morality yet. We’re discussing feasibility.”

  Ed glanced at him. Pearl’s voice was calm, almost soothing, but there was an edge beneath it. Not urgency. Calculation.

  Rocky continued. “The chips aren’t just trackers. They’re not just behavioral nudges. They’re integrated control nodes. They regulate feedback loops, dopamine response, task satisfaction, emotional variance. Pull them out physically, like Rex did? You get instability. Confusion. Sometimes collapse.”

  “But you can pull them out,” Ed said. “Rex proved that.”

  “Yes,” Rocky agreed. “On an individual level. Carefully. With medical oversight. And even then, survival rates might not be great.”

  Pearl’s eyes flickered briefly at the word "survival."

  “What about remote intervention?” Ed asked. “Software?”

  Rocky leaned back. “That’s where it gets interesting.”

  Ed felt his pulse quicken. He had expected resistance, caution. Not interest.

  Ed understood that software was never just software. It moved faster than bullets and reached farther than orders, and once released it could not be recalled by apology or regret. Changing the chips would not be an act of liberation alone; it would be an act of authorship. He would be deciding, for millions of people who had never met him, that their lives would fracture at a specific moment into before and after. Some would wake confused, stripped of certainty and comfort. Some would fail. Some would die. The system had hidden those costs behind efficiency and silence, but breaking it would surface them all at once, raw and undeniable.

  Ed did not pretend otherwise. Responsibility did not end at intention; it began there. If he acted, he would own the consequences not as an abstract moral victory but as a ledger written in disrupted lives. Yet leaving the code untouched meant accepting a quieter violence that never asked permission. Between imposed peace and chosen chaos, Ed realized there was no innocent option, only one he was willing to stand behind when the noise finally stopped.

  “If the chips were just hardware,” Rocky continued, “this conversation would be over. But they’re not. They’re part of a distributed system. Firmware updates. Behavioral patches. Compliance reinforcement. It’s all centralized.”

  “Where?” Ed asked.

  “The stations,” Rocky said. “Primarily. Secondary hubs on Earth. And redundancy nodes scattered across the system.”

  Pearl interjected gently, “Which means any attempt to override them would be immediately visible.”

  Rocky nodded. “Yes. Unless—”

  He stopped.

  Ed leaned forward. “Unless?”

  Rocky smiled again, this time with something like satisfaction. “Unless the system believes the change is part of itself.”

  Silence settled over the room.

  Pearl’s expression didn’t change, but Ed caught the micro-shift: a slight tightening around the eyes, a recalibration.

  “Go on,” Pearl said.

  Rocky pulled a tablet toward him and began sketching with his finger. “The chips don’t operate independently. They’re synchronized through a moral arbitration layer.”

  Ed frowned. “A what?”

  “A model,” Rocky clarified. “An ethical framework. The system doesn’t just tell drones what to do. It tells them why it’s good. Why compliance equals safety. Why deviation equals harm.”

  Ed felt something cold move through him.

  “They don’t just remove choice,” he said slowly. “They replace it.”

  Rocky pointed at him. “Exactly.”

  Pearl folded his hands tighter. “You’re suggesting that to free them, we don’t break the chips. We rewrite the justification.”

  “Yes,” Rocky said. “We give the system a new moral truth.”

  Ed stared at the table. The idea felt enormous. Terrifying. Familiar.

  “Truth,” Ed repeated. “Or just… another lie.”

  Rocky hesitated. For the first time since the conversation began, he looked uncertain.

  “That’s the line,” he said. “I don’t know where it is.”

  Pearl smiled faintly. “Most revolutions don’t.”

  Ed closed his eyes briefly. Images surfaced unbidden: the bus. The guards. The word "recycler." The ease with which his life had almost ended because a system had deemed it efficient.

  “What happens,” Ed asked, “if we do nothing?”

  Rocky didn’t answer.

  Pearl did. “Then the system stabilizes. The instabilities burn out. The drones remain content. Humanity continues as designed.”

  “And the people who die?” Ed pressed.

  Pearl met his gaze evenly. “They will be mourned by those permitted to mourn them.”

  Something hardened inside Ed.

  “That’s not an answer,” he said.

  “It’s the only one that lasts,” Pearl replied.

  Ed stood abruptly, pushing his chair back. The sound echoed too loudly in the small room.

  “No,” he said. “It doesn’t last. It just delays.”

  Rocky looked up at him, startled. Pearl remained seated.

  Ed paced once, then stopped. “You asked earlier if the system would survive the attempt. I think that’s the wrong question.”

  Rocky tilted his head. “What’s the right one?”

  “Whether we deserve to,” Ed said.

  The words surprised him as much as anyone else. They felt right anyway.

  Pearl watched him carefully now. “Careful,” he said. “You’re drifting into abstraction.”

  “No,” Ed replied. “I’m drifting into responsibility.”

  Rocky let out a breath. “I can do it,” he said quietly.

  Ed turned. “Do what?”

  “I can design the rewrite,” Rocky said. “Not a shutdown. Not a wipe. A release. The system keeps running but the moral arbitration flips. Autonomy becomes the highest good. Consent replaces compliance.”

  Pearl’s voice was smooth. “And the consequences?”

  Rocky’s jaw tightened. “Mass confusion. Identity crises. Probably deaths.”

  “How many?” Pearl asked.

  Rocky shook his head. “I can’t model it. Too many variables.”

  Ed felt the weight settle fully now. Not fear. Not doubt.

  Choice.

  “When?” Ed asked.

  Rocky looked at him sharply. “You’re serious.”

  “Yes.”

  Pearl stood. Slowly. “This is premature,” he said. “You need more data. More modeling. More time.”

  “No,” Ed said. “Time is the system’s weapon.”

  Pearl studied him for a long moment, then nodded once. “Very well. Continue discussing technical parameters. I’ll… coordinate resources.”

  He turned toward the door.

  Before he reached it, it opened.

  Punny stood there, breathing hard.

  Behind him, flanked by two settlement guards, was Brain 1.

  The room froze.

  Punny’s voice was flat. “We have a problem.”

  Ed felt his stomach drop.

  “What happened?” Rocky asked.

  Punny didn’t look at him. He looked at Pearl.

  “Brain 1 murdered Brain 2,” Punny said.

  Silence detonated.

  Brain 1 smiled faintly.

  Pearl did not.