This piece stopped me for a different reason than most. Not because it was dense or dramatic, but because it described a pattern I’ve watched unfold before, usually right before institutions begin making very confident mistakes.
What Rachel is writing about doesn’t feel, to me, like a story about technology. It feels like a story about attention. About what happens when organizations under pressure begin mistaking access for understanding, speed for judgment, and volume for insight. I’ve seen that substitution play out in other domains, long before anyone attached it to artificial intelligence, and the outcomes are rarely subtle.
I’m not approaching this as a technical expert. I’m reading it as someone who’s spent years near large systems and learned how they behave when they’re anxious. When timelines compress. When performance becomes a proxy for competence. When saying yes is rewarded and saying “slow down” begins to sound like disloyalty.
What struck me most wasn’t the risk of error itself, but the kind of error being invited. Rachel isn’t warning about occasional mistakes. She’s pointing to distortion. To outputs shaped by environments that reward outrage, simplicity, and emotional charge. That’s a different category of danger. Errors scatter. Distortion points somewhere. And in contested environments, someone is always ready to aim it.
My thoughts keep returning to the people downstream. The analyst expected to trust what appears on the screen. The planner whose hesitation is read as friction. The young service member told the system is authoritative and delay is failure. When things go wrong, accountability rarely moves upward through procurement chains or press conferences. It settles on the shoulders of the person closest to the action.
That pattern is familiar. Institutions almost never admit design failure. They talk instead about training, implementation, misuse. The structure remains intact. The individual absorbs the cost.
What’s being described here feels like another instance of drift. Not one reckless decision, but a series of choices that all lean in the same direction. Faster. Louder. More confident. Less careful. Over time, that posture becomes normal. Doubt fades. Verification thins. Judgment gets replaced by throughput.
This isn’t limited to the military. It shows up wherever organizations forget what rigor feels like. Where branding outruns discipline. Where certainty is prized more than accuracy. Where caution is treated as obstruction rather than care.
I don’t read this as a call to panic. I read it as a warning about forgetting. Forgetting that information isn’t the same thing as intelligence. Forgetting that knowing takes time. Forgetting that human judgment isn’t a bottleneck to be removed, but a responsibility to be carried.
When institutions lose that memory, harm doesn’t spread evenly. It concentrates. It finds the people with the least ability to refuse and the fewest ways to push back.
That’s the part worth holding onto. Not the technology. Not the personalities. The pattern.
Because once a system starts confusing performance with understanding, it can move very quickly while becoming very bad at noticing where it’s going.
And by the time that becomes obvious, someone else is already being asked to pay for it.
Thank you for this. You’re 100% right: the core risk is attention under pressure.
THIS is exactly why friction matters. Friction is where judgment lives. It’s where meaning gets tested, where doubt has room to surface, where someone can say “wait” before momentum hardens into doctrine. When institutions start treating friction as disloyalty or inefficiency, they don’t get faster understanding; they get faster distortion.
And you’re right: distortion is the danger. Errors scatter. Distortion aims. In contested environments, that aim never stays neutral for long.
The pattern you’re naming is the warning. Technology just happens to be today’s accelerant.
Excellent take on this.
Set aside the fact that AI doesn’t remotely do what’s promised, and the fact that Musk’s goal seems to be to steal government data; those are already two huge problems. But even ignoring them, the whole rush to AI has been predicated on beating China to AI superiority. Yet the government approved the sale of advanced AI chips to China, which completely undercuts the entire argument for the chaos and negligence the rush to AI is causing. We’re doing something that doesn’t work, doing it badly, doing it rushed, all to beat China to it, and at the same time we’re selling China our one advantage. It’s lunacy. I would expect nothing else from Trump and Hegseth and Bessent and Miller and Vought.
AI will kill us even sooner than we feared.
Yes. AI is going to kill us through the environmental damage caused by datacenters that don't actually do anything.
Truly monstrous. The sorcerer's apprentice is being instructed to kill us.
This obfuscating data lingo is designed to keep us clueless.
SKYNET is coming.
We’re doing nothing to stop it.
This Narrative Warfare is destroying America!
Engineered Chaos leads to democracy failure!
Just posted a deep‑dive teaser on how foreign powers weaponize America’s internal chaos:
https://substack.com/@geopoliticsinplainsight/note/c-198193128
There will be no films of higher-ups gaslighting and tricking the Pentagon. They will individually ask for meals and be force-fed Twitter. This post should be elevated to whistleblower status.
Sounds like we are in deep doo doo. What to do?
I think it starts by making the public aware of it. But also, make sure military personnel understand their own risk. Ultimately, shit rolls downhill; lower-level intelligence staff will be the only ones left holding the bag when it hits the fan.
This week is just sucking more and more.
https://thistleandmoss.com/p/the-gathering-this-week-still-sucks-really-bad-sigh