Accelerate Like Hell: The Thermodynamics of Military AI Collapse
When you remove cooling mechanisms (verification, dissent, accountability), you do not get innovation. You get catastrophic failure.
This video is a mess; you have to skip forward to 1:00:19 for the start of Elon’s introduction of Hegseth. But I watched so you don’t have to … full transcript at the very bottom of this page.
How Elon Musk and Pete Hegseth Are Converting Military Governance into Thermodynamic Collapse
There are two kinds of people who say “trust me” for a living.
The first kind is your aunt at Thanksgiving, holding a casserole whose recipe has clearly survived multiple administrations.
The second kind is the billionaire defense contractor whisperer and the Secretary of War, standing in front of a rocket factory-city they built on a public highway, promising to “accelerate like hell” while they wire frontier AI into the largest violence machine on Earth.
Guess which kind we got at Starbase, Texas.
Elon Musk opens with the usual carnival barker patter. Starbase is “legally a city,” built “out of nothing,” sitting on a public road like a theme park attraction where the ticket price is democratic oversight. Then he says the quiet part out loud: SpaceX exists to make Star Trek real.
Cute.
But here is the problem with the Star Trek imagineering. In Star Trek, the Federation had a Prime Directive and a functioning ethical spine. In this version, the directive is “speed wins,” the spine is private capital, and the ethics are whatever fits on a slide deck labeled “Objectively Truthful AI.”
Then Pete Hegseth takes the mic and does the full ritual: awe at the hardware, reverence for the visionary, and a sermon about America’s destiny. He frames the Pentagon’s AI surge as “freedom,” because the U.S. political class has only one rhetorical move left: rename power as virtue, then dare anyone to object.
What is actually happening here is not innovation. It is a hostile takeover of governance.
And the people driving it are spectacularly untrustworthy.
Not “I once forgot to return a library book” untrustworthy. Not “I have an ex who thinks I am emotionally unavailable” untrustworthy. I mean, “I want command authority over your national security substrate, and also I have a documented relationship with reckless communication, impulsive ideology, and no meaningful accountability” UNTRUSTWORTHY.
That is the whole point.
Because this is not just an AI rollout. This is a trust coup, a systematic conversion of the Pentagon from a governed institution into a platform operating under the logic of the Anti-Trust Envelope.
The sales pitch: “Arsenal of Freedom.”
Hegseth calls it the “arsenal of freedom tour.” He says the U.S. must win “technological supremacy,” lists the standard buzzword bingo (AI, autonomous systems, quantum, hypersonics, long-range drones), and argues that the old procurement pipeline is too slow.
Fine. Procurement is slow. The defense industrial base is consolidated. Prime contractors are expensive and late. These are not controversial statements.
The trick is what he smuggles in under that broadly true complaint.
He is not proposing a better process. He is proposing an exemption from the process.
He says the Pentagon will stop “running a peacetime science fair” and start behaving like it’s already in a wartime arms race. He lays out “new rules,” and every one of them translates to: less friction, fewer constraints, more compute, more data, more speed, and fewer people allowed to say no.
“Speed wins,” he says, and then comes the line that should make every sober adult sit bolt upright in their fucking chair: “The risks of moving too slowly outweigh the impacts of imperfect alignment.”
That is not an operational tweak. That is a philosophical surrender.
It means: if the model is wrong, if it behaves badly, if it launders bias, if it misclassifies a target, if it hallucinates a justification, if it produces “plausible” nonsense that gets someone killed, the institution will treat that as an acceptable tradeoff.
Because speed.
Because “dominance.”
Because the U.S. military has always been very comfortable using human beings as error bars.
The Product: Grok for the Kill Chain
Hegseth announces that the Pentagon is adding xAI’s Grok to GenAI.mil, alongside Google’s Gemini, which he says has already been rolled out to roughly three million users.
Pause here.
This is not “some staffers using a chatbot to write memos.” This is the normalization of frontier model access across unclassified and classified networks. It is mass adoption by the largest bureaucracy in the country, inside an institution where language becomes policy, policy becomes operations, and operations become bodies.
And Grok is not arriving as a neutral tool.
It is arriving amid public controversy over its behavior, including reports of antisemitic outputs and sexually explicit deepfake content, enough that multiple governments and regulators have reacted.
So the Pentagon’s answer to “this model has produced ugly and harmful outputs in the wild” is not “slow down,” not “tighten guardrails,” not “prove reliability,” not “publish audit results.”
It is “ship it.”
Because speed wins.
If you ever needed a living example of why “move fast and break things” becomes “move fast and break people,” congratulations, Starbase just hosted your TED Talk.
The Ideology: “Responsible AI” Redefined as Anti-oversight
Here is the core of the con.
Hegseth says “responsible AI” will no longer mean “equitable AI” or “DEI” or “social justice infusions.” Instead, it will mean “objectively truthful AI” used securely and legally. He caps it with the chest-thump: Pentagon AI “will not be woke.”
This is a rhetorical jailbreak.
Because “objectively truthful” is not a technical specification. It is a political flag.
Who defines “truth” inside an authoritarian-leaning security state? The same people who define “threat.” The same people who decide which communities get labeled “extremist.” The same people who call protesters “disruptors” while federal agents and contractors play dress-up in the machinery of coercion.
This is how you launder ideology through systems engineering.
You do not say “we want the model to align with our politics.” You say, “We want it to be objective.” Then you remove the evaluators you do not like. Then you declare dissent to be “ideological constraints.” Then you call the resulting pipeline “responsible.”
That is not responsibility. That is control.
The Mechanism: The Five Anti-stabilizers in Real Time
What Hegseth is describing is not a policy shift. It is the systematic installation of every element of the Anti-Trust Envelope—the five anti-stabilizers that convert functioning institutions into collapse engines.
Let me show you how each one appears in his speech, because once you see the pattern, you cannot unsee it.
1. Coercion (the inverse of Dignity)
Dignity requires that people who raise concerns are treated with respect and see timely, fair follow-through. It requires that harms be tracked and remediated with the same rigor as financial defects.
Hegseth announces a “barrier removal SWAT team” empowered to waive non-statutory requirements and escalate anything that slows AI down. He describes “blockers” as operational risks. He makes it sound like a heroic war story about bureaucrats finally being forced to stop writing memos.
In plain English: internal dissent becomes a security problem.
That is coercion. Not the jackboot kind—yet—but the structural kind. The kind that says: if you slow this down, you are the threat. If you ask for verification, you are the blocker. If you insist on safety checks, you are sabotaging national security.
He also threatens consequences for “data hoarding,” including personnel reassignment and withholding of funding within statutory limits. Translation: cooperate or be punished. Share your data or lose your position. The point is clear: refusals must be justified to a higher authority, and the people who slow the pipeline will be dealt with.
This is not governance. This is intimidation dressed in the language of efficiency.
And when intimidation becomes policy, you do not get better performance. You get silence. You get people who know something is wrong but have learned that speaking up is more dangerous than staying quiet. You get the conditions for catastrophic failure because the human early-warning system has been terrorized into compliance.
2. Extraction (the inverse of Agency)
Agency means operators have clear authority to act when time matters. It means overrides are available when procedures do not fit the context. It means the people closest to operations control the decision space.
Hegseth centralizes technical authority under a single “CTO” figure and reorganizes units like DIU to operate at “commercial tempo.” He describes the entire Pentagon adopting what he calls Musk’s “algorithm” for deleting requirements and accelerating decisions.
This is extraction masquerading as efficiency.
Because what actually happens when you “delete dumb requirements” is you delete the distributed intelligence that made those requirements in the first place. Someone, somewhere, learned something the hard way and wrote it down. That knowledge represents purchased wisdom—often purchased in blood.
When you centralize authority and strip local operators of decision rights, you are not streamlining. You are extracting agency upward. You are converting a complex adaptive system into a command hierarchy in which all intelligence flows to the top, and all orders flow down.
And hierarchies are brittle. They fail when conditions exceed the bandwidth of the decision-maker at the apex. Which, in this case, is a person who thinks “move fast and break things” is a governing philosophy rather than a recipe for industrial manslaughter.
The speech explicitly describes removing operational autonomy from the people who actually understand the systems they work on and handing it to executives who think disruption is a virtue. That is not empowerment. That is theft.
3. Impunity (the inverse of Accountability)
Accountability means decision owners are visible and bear the cost of their own errors. It means corrective actions close on time and sanctions are consistent across seniority levels.
Hegseth’s core operating principle, “the risks of moving too slowly outweigh the impacts of imperfect alignment,” is a direct statement of impunity.
It means: we will ship models that are not ready. We will deploy systems that have known failure modes. We will wire AI into classified networks at scale before we understand how it behaves. And when something goes wrong—when it misclassifies a target, when it produces discriminatory outputs, when it launders bias into operational decisions, when it gets someone fucking killed—we will treat that as an acceptable cost of doing business.
Who bears that cost? Not Musk. Not Hegseth. Not the executives who made the decision to accelerate.
The people who bear that cost are the ones at the other end of the decision pipeline. The service members who trust the system. The civilians in the target zone. The analysts whose professional judgment gets overridden by a model they were never allowed to audit.
Impunity is what you get when the people making decisions are insulated from consequences. And that is precisely what this architecture creates: a system in which harm flows downward and accountability evaporates upward.
Hegseth even says it explicitly: internal friction (legal review, safety checks, ethics boards, data governance) will be “removed” or “waived.” Which means the mechanisms that create accountability—the processes that force decision-makers to explain themselves, to show their work, to justify the risk—are being systematically dismantled.
That is not innovation. That is the construction of a consequence-free zone for people in power.
4. Forced Compliance (the inverse of Cooperation)
Cooperation means cross-functional work moves with a single rhythm rather than silo handoffs. It means interfaces between teams are designed and maintained as first-class assets. It means people work together because the structure enables it, not because they are compelled.
Hegseth describes “commercial tempo,” “barrier removal,” and mandatory data sharing enforced through personnel consequences.
That is not cooperation. That is forced compliance.
Because when you threaten people with reassignment for “data hoarding,” you are not creating teamwork. You are creating fear-based coordination. And fear-based coordination has a specific failure mode: it works until it doesn’t, and when it fails, it fails catastrophically because nobody wants to be the one to say, “This is broken.”
Real cooperation requires slack—time to negotiate, space to object, freedom to say “no, this does not work yet.” It requires interfaces that people actually agree on, not interfaces dictated from above.
What Hegseth is describing is the command-economy version of collaboration: centralized mandates disguised as alignment. Data must flow. Refusals must be justified. Speed is non-negotiable.
And when speed becomes non-negotiable, cooperation becomes impossible. Because cooperation is slow. Cooperation is negotiation. Cooperation is the boring adult work of making sure everyone understands what they are supposed to do and why.
Forced compliance produces the appearance of coordination without any of the resilience. It is Potemkin teamwork: it looks functional in the demo, but collapses under stress.
5. Frantic Iteration (the inverse of Adaptability)
Adaptability means you routinely expand option sets before committing. It means you can reconfigure plans quickly when context changes. It means the system learns from mistakes and incorporates that learning into its structure.
Hegseth is selling frantic iteration.
Frantic iteration is what happens when you confuse speed with learning. It is the “move fast and break things” model applied to institutions that carry guns.
He describes “accelerating like hell,” removing “blockers,” adopting “commercial tempo,” and shipping models at scale before they are ready. Every one of those is a signal of frantic iteration: high-speed change without the time to verify that it was an improvement.
Real adaptability requires what we might call thermal slack—time for the system to cool down between changes, time to measure whether the last iteration actually worked, time to incorporate feedback before the next push.
Frantic iteration strips that time away. It creates a system that is constantly moving but never learning, because learning requires stillness. You have to stop long enough to ask: Did that work? What broke? What do we do differently next time?
When you declare that “speed wins” and that delay is unacceptable, you are building a system that cannot learn. You are building a system that can only accumulate errors faster and faster until something critical fails.
And in the context of military AI, critical failure means bodies.
The Thermodynamics of Trust Collapse
Here is what these five anti-stabilizers do when you install them together: they convert a governed institution into a heat engine running without cooling.
In thermodynamic terms, trust is a low-entropy state. It requires structure, verification, accountability—all the things that take time and energy to maintain. The Trust Envelope creates that structure through dignity (protecting people), agency (distributing authority), accountability (closing feedback loops), cooperation (enabling coordination), and adaptability (learning from error).
The Anti-Trust Envelope does the opposite. It strips structure away and calls it efficiency. It removes cooling mechanisms and calls it acceleration. It creates conditions in which entropy increases faster than the system can handle.
Coercion generates heat by creating fear and suppressing dissent. Extraction generates heat by removing autonomy and forcing central planning. Impunity generates heat by severing feedback loops; decisions no longer have consequences, so errors accumulate unnoticed. Forced compliance generates heat by replacing negotiation with mandates. Frantic iteration generates heat by removing the temporal slack required for verification.
What happens when you remove all the cooling mechanisms from a complex system and demand maximum output?
It overheats.
And when institutions overheat, they do not fail gracefully. They fail catastrophically because all the stabilizing mechanisms that would have caught small problems early have been systematically destroyed.
Hegseth is describing the construction of a thermodynamic collapse engine at the heart of the Pentagon. He is removing every structural element that would allow the institution to notice when it is making a mistake, to correct course before disaster strikes, and to learn from failure instead of accelerating into it.
This is not a recipe for security. This is a recipe for a spectacular, preventable catastrophe.
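The overheating argument is easier to see with numbers. Here is a deliberately toy sketch (the variables, rates, and the very idea of quantifying “heat” as an error backlog are my own illustrative assumptions, not anything from the speech): activity generates errors each cycle, and verification dissipates a fraction of the outstanding backlog. Keep the cooling term and the backlog settles at a finite level; remove it and the backlog never stabilizes.

```python
# Toy model of institutional "heat": each cycle of activity generates
# errors, and verification (the cooling mechanism) catches a fixed
# fraction of the outstanding backlog. All numbers are illustrative.

def simulate(cycles, error_rate, cooling_rate):
    """Return the error backlog at the end of each cycle."""
    backlog = 0.0
    history = []
    for _ in range(cycles):
        backlog += error_rate               # heat in: new errors from activity
        backlog -= cooling_rate * backlog   # heat out: errors caught and fixed
        history.append(backlog)
    return history

# With cooling intact, the backlog converges to a finite steady state:
# error_rate * (1 - cooling_rate) / cooling_rate = 4.0 here.
governed = simulate(cycles=200, error_rate=1.0, cooling_rate=0.2)

# With cooling removed ("remove blockers, waive requirements"), the
# backlog grows without bound: 200 cycles, 200 uncorrected errors.
accelerated = simulate(cycles=200, error_rate=1.0, cooling_rate=0.0)

print(round(governed[-1], 1))     # → 4.0
print(round(accelerated[-1], 1))  # → 200.0
```

The point of the sketch is not the numbers, which are made up, but the shape: the governed system equilibrates, and the “accelerated” system accumulates error at exactly the rate it produces it.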
The Power Shift: The Pentagon Becomes a Platform, and Musk Becomes a Sovereign
The most important detail here is not Grok. It is governance.
Hegseth centralizes technical authority, reorganizes around “commercial tempo,” and explicitly praises Musk’s “algorithm” as the model the Pentagon should adopt.
Let me translate: the Pentagon wants to adopt Musk’s cultural operating system.
A culture in which accountability is optional, harm is “collateral,” and everyone below the owner is a replaceable component.
But startups are not democracies. Startups are monarchies with cap tables.
And Musk is not just “a vendor.” He is positioning himself as a strategic substrate provider: rockets, satellites, communications, now AI. The same man who can shape speech on a major social platform, tilt information flows, and pick winners and losers in attention markets is now being welcomed deeper into defense infrastructure.
This is the billionaire as post-state sovereign in real time. The state is not regulating him. The state is contracting him into the nervous system of power.
And Hegseth basically says so, out loud, while praising Musk’s process for deleting oversight and accelerating decisions.
The entire speech is one long argument that the military should behave more like a startup, meaning it should operate as a system in which one person controls everything, and everyone else executes orders.
That is the opposite of trust.
Trust is Not Speed. Trust is Restraint.
Here is the dirty secret the “AI dominance” people cannot say on camera: trust is what you build when you have the capacity to do harm and choose not to.
Trust is delay. Trust is review. Trust is refusal.
Trust is the boring adult in the room who says, “No, we are not wiring an un-audited persuasion-and-classification machine into classified networks at scale because the vibes are patriotic.”
Trust is a procurement process that slows down weapons enough for democratic institutions to notice what is happening.
Trust is dignity—protecting the people who raise concerns. Trust is agency—letting operators control their decision space. Trust is accountability—making sure decision-makers bear the cost of their choices. Trust is cooperation—building real coordination instead of forced compliance. Trust is adaptability—learning from mistakes rather than rushing past them.
Trust is data minimization, not data extraction. Trust is verification, not speed. Trust is a system that can explain itself to the public it claims to defend.
And if you are allergic to those things, you are not building trust. You are building a compliance engine.
That is what this Starbase performance is: a declaration that the Pentagon is done pretending governance matters. The age of “responsible AI” as a restraint is being replaced by “responsible AI” as a slogan.
The slogan is: AI will not be woke. It will work for us.
That is how you know it will not work for you.
The Trajectory: From Trust Envelope to Collapse
Organizations do not fail randomly. They fail predictably, along thermodynamic gradients.
The Trust Envelope Model shows how institutions thrive: they maintain the five stabilizers, keep feedback loops open, distribute authority appropriately, and create slack for learning. They operate in what we might call a steady state of low-entropy governance—structured, verified, accountable.
The Anti-Trust Envelope shows how institutions collapse: they install the five anti-stabilizers, close feedback loops, centralize control, and strip temporal slack. They enter a high-entropy state where heat accumulates faster than it can be dissipated, errors compound faster than they can be corrected, and catastrophic failure becomes not a possibility but a certainty.
And the trajectory is directional. Once you start down this path, once you declare that speed matters more than verification, that compliance matters more than cooperation, that acceleration matters more than learning, you create a self-reinforcing cycle.
Because removing oversight makes the system faster, which in turn makes oversight seem more burdensome, creating more pressure to remove it. Centralizing authority speeds decision-making, which makes distributed authority seem inefficient, creating more pressure to centralize. Punishing dissent speeds compliance, which makes dissent seem disloyal, which creates more pressure to punish it.
This is the downward spiral. This is how you convert a functioning institution into a collapse engine in real time.
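The spiral can be sketched as a toy feedback loop (the coefficients and variables are my illustrative assumptions, not measurements of any real institution): each round, pressure removes a slice of oversight, and each successful cut raises the pressure for the next one, so the decay compounds instead of staying constant.

```python
# Toy feedback loop for the self-reinforcing spiral: removing oversight
# increases the pressure to remove more oversight. All coefficients are
# illustrative assumptions.

def spiral(rounds, oversight=1.0, pressure=0.1, gain=0.5):
    """Return remaining oversight after each round of the cycle."""
    levels = []
    for _ in range(rounds):
        oversight -= pressure * oversight           # cut a slice of oversight
        pressure = min(1.0, pressure * (1 + gain))  # each cut breeds pressure
        levels.append(oversight)
    return levels

levels = spiral(rounds=12)
# Oversight falls by roughly 10%, then 15%, then 22.5% per round:
# each cut is proportionally larger than the last, until nothing is left.
```

Again, the shape is the argument: the loss is not linear, because the system that removes oversight is rewarded each time it does so.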
And Hegseth just announced it as policy.
The inevitable conclusion
When two of the least trustworthy men in modern public life stand in front of a rocket factory and announce “accelerate,” “remove blockers,” “expand compute,” “force data sharing,” and “ship frontier models into classified networks,” they are not protecting America.
They are protecting their access to power.
They are converting the U.S. military into a platform, and platforms always drift toward surveillance, coercion, and monopoly. Especially when they are built by people who see accountability as an insult.
Star Trek promised us a future where technology expands human possibilities.
Starbase just promised us a future where technology expands state violence faster than law can keep up.
And the mechanism is visible, measurable, predictable: install the five anti-stabilizers, remove the cooling mechanisms, declare speed non-negotiable, punish resistance, and wait for the system to overheat.
The Pentagon is being systematically pushed out of the Trust Envelope and into the Anti-Trust Envelope.
That is not innovation. That is thermodynamic collapse by design.
And when it fails—and it will fail—the people who built it will be nowhere near the blast radius.
They never are.
Transcript:
Segment B: SpaceX Starbase, Brownsville area, Texas (remarks begin around 1:00:36)
Elon Musk: Hello everybody. All right, welcome to Starbase, Texas. This is a city, legally a city, that thanks to the hard work of the SpaceX team we built out of nothing. It’s now a gigantic rocket manufacturing system.
For people curious to see it, we’re actually on a public highway, so you can come visit and drive down the road and see the hardware. I think this is the first time a rocket development program has actually been on a public highway.
We’re honored today to have the Secretary of War, Pete Hegseth, and senior Pentagon leadership here. It’s an honor to have them visit. We just did a tour of the factory, and I think it helps illustrate how manufacturing at scale is critical to the strength of America.
I’ll tell you a little about the purpose of SpaceX. We want to make Star Trek real. We want to make Starfleet Academy real, so it’s not always science fiction, but one day science fiction turns into science fact.
We have big spaceships with people going to other planets, going to the moon, and ultimately beyond our star system to other star systems where we may meet aliens or discover long-dead alien civilizations. I don’t know. But we want to go. We want to see what’s happening. We want epic futuristic spaceships with lots of people traveling to places we’ve never been before. That’s the goal.
On that note, I’d like to introduce the Secretary of War, Pete Hegseth.
Pete Hegseth: Thank you. Appreciate it. How about this: Star Trek real. Star Trek original.
What a tour. What an opportunity to be here at Starbase, Texas, with Elon and the SpaceX team. There’s nothing like this in America. There’s nothing like this in the world. What you have built, and what you will build, is a testament to American ingenuity and invention. Thank you to all of you out here for hosting us today.
Elon, thank you for what you’ve built, and for your vision for this company, for our country, for American innovation. I could not think of a more fitting venue to continue our “arsenal of freedom” tour and outline today the future of technological innovation at the War Department.
As World War II was ending, the Secretary of War and Secretary of the Navy wrote to the National Academy of Sciences and declared scientific research essential to national security. They warned the “competitive time element” in developing weapons and tactics may be decisive in future conflicts. They recognized the importance of innovation and readiness, and what was at stake: the freedoms we hold dear.
Across the United States today, extraordinary innovation is unlocking new possibilities for freedom, prosperity, and security. The question is whether the most powerful technologies of this century will reinforce free societies, or be shaped by malign regimes for control and coercion.
Over the past several months, I’ve talked about transforming the War Department to meet 21st century needs. Today is about supercharging innovation.
Since the end of the Cold War, the defense industrial base consolidated. This makes it difficult, if not impossible, for new technical innovators to win business at the department. The result is a risk-averse culture that prevents warfighters from getting the best resources America offers.
That ends today.
The United States must win the strategic competition for technological supremacy: artificial intelligence, autonomous systems, quantum, hypersonics, long-range drones, space capabilities, directed energy, biotechnology. Our legacy approach assumes technology moves in a predictable, linear path from lab to program of record, and can only be provided by a handful of consolidated companies. That system is archaic.
We can no longer afford to wait a decade for prime contractors to deliver “the next perfect system,” only to get it years late and vastly over budget. Winning requires a new playbook. Question every requirement, delete the dumb ones, and accelerate.
I’m making clear that the Under Secretary of War for Research and Engineering, Emil Michael, is the department’s single Chief Technology Officer. One CTO for the entire enterprise. As the CTO, he will set technical direction, lead an innovation ecosystem welcoming progress from anywhere, and tell me directly whether we are gaining or losing the technology race. He will have decision authority and drive measurable outcomes.
We will be done running a peacetime science fair while our adversaries run a wartime arms race.
On AI: President Trump’s AI executive order sets our approach: sustain and enhance America’s global AI dominance in defense of human flourishing, economic competitiveness, and national security. We must ensure military AI dominance so no adversary can exploit the same technology against us.
Last month we announced the rollout of GenAI with Google and their Gemini app to roughly three million users in the War Department. Today, we announce the next frontier AI model company joining GenAI.mil: Grok from xAI, going live later this month. Soon we will have leading AI models on unclassified and classified networks across the department.
We’re executing an AI acceleration strategy with seven pace-setting projects across warfighting, intelligence, and enterprise missions. Each has a single accountable leader, aggressive timelines, and measurable outcomes. One owner reports monthly. These are not science projects or governance boards.
New rules:
Speed wins. The risks of moving too slowly outweigh the impacts of imperfect alignment.
Remove blockers. People and policies that block progress will be treated as operational risks. Establishing a barrier-removal SWAT team to waive non-statutory requirements and escalate issues to the Deputy Secretary.
Compute. Invest in AI compute from data centers to the tactical edge, including data centers on military land. Partner with industry.
Talent. Use hiring and pay authorities to bring in top technical talent.
Responsible AI (redefined). Responsible AI means objectively truthful AI employed securely and within the law. “Gone are the days” of “equitable AI” and DEI constraints. The standard is factual, mission-relevant, and lawful.
Data. Data hoarding is a national security risk. Services must submit catalogs of data assets within 30 days. Denials must be justified and reported quickly. Persistent barriers will be escalated with potential personnel or funding consequences within statutory limits.
Beyond AI: we will break down barriers to rapid technology adoption. Legacy primes must prioritize national security over earnings calls: less buybacks, more investment in factory floors and infrastructure.
We are ending the “alphabet soup” of innovation councils that meet and brief but do not decide. They are abolished. The CTO will convene a CTO action group to make decisions and deliver technologies quickly.
We are realigning parts of the innovation ecosystem: DIU will be designated a War Department field activity operating at commercial tempo, and SCO is also aligned under the CTO to eliminate duplication and focus on delivery.
We will create clearer channels for industry: one to communicate problems we’re trying to solve, and DIU to help program offices adopt what industry has built. Faster yeses, faster nos.
The services must also transform. Within 90 days, service secretaries will brief the CTO on plans to streamline labs and units around three outcomes. Beginning FY2028 budgets, every portfolio acquisition executive will fund an innovation insertion increment for last-mile integration and rapid insertion into fielded systems.
President Trump has proposed a $1.5 trillion FY27 budget for the War Department, a historic investment. We will not squander the opportunity. We will deliver a more agile, more lethal, more ready force.
We will not stop. We will forge a new arsenal of freedom with partners in industry. If not America, if not the West, then who? If not now, it will be too late. This is not reform for reform’s sake. It is whether our warriors fight with yesterday’s tools or tomorrow’s technologies. Thank you. God bless you, God bless this company, and may God bless our republic.
(Ends where the provided transcript ends.)