This Is an Every-Service Problem: Space Power and the Risk of Fundamental Surprise

February 23, 2026

By Catherine R. Cline

The United States is unlikely to be surprised in space in the traditional sense. It tracks launches globally, monitors orbital behavior in near–real time, and maintains an unmatched catalog of space objects. If surprise is defined as a failure to see something coming, then U.S. space power appears resilient.

While early warning is indispensable, it alone cannot prevent the most dangerous forms of surprise. The real danger emerges when surprise is not a failure to detect activity, but a failure to recognize when fundamental assumptions have become obsolete. In that case, U.S. space power may be exposed in ways that even superior warning capabilities cannot address. In space, the primary risk is not what goes unnoticed, but what strategic thinking remains unquestioned.

Modern space competition increasingly unfolds through ambiguity, reversibility, and cumulative effects. Here, adversaries do not need to hide their actions to achieve surprise; they only need to act in ways the existing frameworks cannot interpret. When facts do not fit, responses feel mismatched, and familiar tools fail to provide advantage, the problem is not a lack of warning but a deeper failure of understanding.

To understand why this form of surprise is so dangerous, and why space power is particularly vulnerable to it, it is useful to distinguish between two fundamentally different kinds of surprise. 

Situational vs. Fundamental Surprise

Zvi Lanir, an Israeli strategist and theorist of surprise, illustrates his core insight with an anecdote about Noah Webster, the famed lexicographer. One day, Webster returned home unexpectedly and found his wife in an intimate moment with another man. His wife exclaimed, “You surprised me.” Webster replied, “No—you astonished me.” The difference matters.

Webster’s wife was caught off guard by an unexpected event. Had she known her husband would return early, she could have avoided the situation. This was a situational surprise, a failure of warning within an otherwise intact understanding of the world.

Webster’s experience was different. What shocked him was not merely the event itself, but what it revealed: his assumptions about his marriage, his household, and his own understanding of reality were suddenly and decisively wrong. No additional piece of information—no earlier warning—would have prevented that realization. This was a fundamental surprise.

Lanir’s paradox is this: the more sophisticated a system becomes at preventing situational surprise, the more vulnerable it may become to fundamental surprise. Advanced systems excel at detecting anomalies within known frameworks, but struggle to recognize when the framework itself is no longer valid. Modern organizations, especially highly professional, technologically advanced ones, are very good at preventing surprises of the first kind. They are far less capable of recognizing when they are approaching the second. Space power sits squarely in this danger zone.

Modern space operations excel at situational awareness. The United States is exceptionally good at detecting discrete events: launches, maneuvers, interference, debris creation, and system degradation. These observations are processed through intelligence architectures designed to separate signal from noise and deliver timely warning to decision-makers. This model assumes that surprise results from insufficient or misunderstood information. Improve collection, reduce noise, and surprise becomes less likely. This logic has dominated intelligence and military thinking for decades.

Lanir argues that this logic is incomplete. In many historical cases—Pearl Harbor, Sputnik, Vietnam, and the Yom Kippur War—the problem was not a lack of information but an inability to interpret information that contradicted the prevailing paradigm. Analysts correctly observed the facts but misjudged their meaning because they were interpreted through obsolete conceptual frameworks. In Lanir’s terms, situational surprises are failures of detection. Fundamental surprises are failures of self-understanding. Space competition increasingly favors the latter.

Organizational Blunders and the Limits of Reductionism

Lanir draws on W. Ross Ashby’s Law of Requisite Variety, which holds that a system can regulate only as much environmental complexity as its internal structure can absorb. Organizations optimized for efficiency and control often reduce internal variety through specialization, hierarchy, and standardization. This produces operational excellence under stable conditions, but brittleness when conditions change fundamentally. The U.S. space enterprise reflects this tradeoff. Decades of optimization have produced extraordinary technical capability alongside functional segmentation, risk-averse processes, and tightly coupled bureaucratic control. These features deliver precision and reliability, but they also narrow the range of change the system can comfortably recognize as meaningful.

This structural constraint is reinforced by reductionism. Intelligence organizations are predisposed to study events rather than processes, components rather than systems, and capabilities rather than meaning. Space magnifies this weakness. Actions in space rarely produce isolated effects. A satellite maneuver is simultaneously a technical act, a strategic signal, a legal precedent, a commercial disruption, and a test of escalation norms. Effects propagate across military, economic, informational, and alliance systems. A system built to ask “What happened?” will struggle to answer “What has changed?” This is the core risk of fundamental surprise in space: the United States may correctly observe adversary actions while misunderstanding the strategic transformation those actions represent.

What Space Fundamental Surprise Looks Like

Fundamental surprise does not announce itself as a catastrophe. It appears first as normalcy and becomes visible only in retrospect, when leaders realize that long-standing assumptions about power, constraint, or identity were false. In space, a fundamental surprise is unlikely to take the form of a sudden, overt attack. More plausibly, it will form as the gradual erosion of deterrence credibility, the loss of strategic leverage despite intact capabilities, or an inability to respond proportionately because existing responses no longer map to the problem. By the time this realization occurs, the window for adaptation may already be closing.

Two contemporary patterns, one technological, one behavioral, illustrate how this form of surprise is emerging.

China and Ambiguous Technology

China’s Shijian-21 satellite is publicly described as a debris-removal platform. From a situational perspective, it poses a controllable challenge. Its orbit can be tracked, its maneuvers modeled, and its capabilities assessed. Any overtly hostile action would be detected in near-real time. The fundamental surprise does not lie in what the satellite does, but in what its existence represents. A robotic arm capable of grappling debris is, by definition, capable of grappling an active satellite. By deploying a dual-use system, China has eroded the assumption that weapons and non-weapons in space are clearly distinguishable.

This is not a failure of warning. It is a failure of interpretation. The strategic meaning of proximity operations has changed, even though no threshold event has occurred.

Russia and Ambiguous Behavior

If China weaponizes technological ambiguity, then Russia weaponizes behavioral ambiguity. Its pattern of co-orbital “nesting doll” satellites, where one satellite releases a smaller sub-satellite to conduct so-called inspections, illustrates this approach. None of these actions are surprising individually. Each is detected, cataloged, and explained. The fundamental surprise is not that Russia might launch an attack, but that the pattern of behavior itself constitutes the attack. Through repeated, non-destructive acts of intimidation, Russia reshapes expectations about acceptable behavior in orbit.

This cumulative strategy creates a strategic environment defined by persistent uncertainty. Each action is tolerable in isolation; together, they alter the meaning of space security without triggering traditional warning thresholds.

The Enduring Value of the Center of Gravity and Early Warning

A reasonable critique of this argument is that it risks overstating the limits of existing analytic frameworks while undervaluing tools that remain essential to military planning. Center of gravity analysis provides discipline, prioritization, and focus. Early warning enables deterrence, escalation control, and crisis management. Expanding analytic frameworks risks ambiguity inflation, analytic paralysis, and loss of operational decisiveness.

This critique is not wrong. Center of gravity analysis remains effective when adversary power is centralized, when effects are primarily kinetic, and when decisive action against a critical node can reasonably be expected to produce strategic outcomes. Likewise, early warning remains indispensable for detecting launches, attacks, and discrete hostile acts. None of these capabilities have lost their value. The problem is not that these tools fail—but that they are increasingly applied beyond the conditions in which they perform best.

In a multi-domain environment characterized by redundancy, reversibility, and cumulative effects, the search for a single center of gravity often leads to artificial convergence. Analysts may identify a “critical” node not because it governs system behavior, but because doctrine demands one. This can generate false confidence and bias decision-makers toward decisive solutions where none exist.

Similarly, early warning systems are optimized to detect events, not shifts in meaning. They excel at answering whether something has happened but struggle to assess whether the strategic framework itself has changed. When adversaries deliberately operate below thresholds, exploit dual-use ambiguity, or pursue long-term normalization rather than immediate advantage, warning indicators may function perfectly while still failing to prompt adaptation.

This is not an argument for abandoning center of gravity analysis or early warning. It is an argument for recognizing their limits. In contemporary competition, these tools should be treated as inputs rather than arbiters, complemented by approaches that surface distributed leverage, cumulative effects, and changes in strategic meaning. The risk is not analytic pluralism; it is analytic monopoly. Over-reliance on tools optimized for decisiveness obscures forms of competition designed precisely to avoid decisive confrontation.

Why More Information Will Not Solve This

A common response to perceived vulnerability is to demand more data, better integration, and faster decision cycles. These improvements are valuable, but insufficient. No amount of situational learning can substitute for fundamental learning. This vulnerability is especially acute for operational concepts built around decision advantage. An adversary whose objective is to undermine the integrity of decision-making itself does not need to outpace an OODA loop (Observe, Orient, Decide, Act: the cycle through which actors perceive information, interpret it through mental models, choose a course of action, and execute it under uncertainty); it needs only to make observation and orientation unreliable. When leaders cannot trust what they see, speed becomes irrelevant.

Organizations resist fundamental insights because they destabilize roles, missions, and resource allocations. It is easier—and safer—to treat anomalies as situational problems to be fixed than as signals that the system itself may require redefinition.

This Is an Every-Service Problem

The weaponization of uncertainty in space is not a niche concern. It strikes at the foundational technologies and assumptions underpinning the American way of war. For the Army and Marine Corps, degraded confidence in space-enabled navigation and timing undermines maneuver and fires. For the Navy, corrupted space-derived data threatens cooperative engagement and fleet defense. For the Air Force, aircraft built around data fusion risk operating in an information environment that resembles a hall of mirrors rather than a source of advantage.

This challenge cuts to the heart of Joint All-Domain Command and Control. Much of the current effort focuses on building the technical architecture of a combat cloud. Yet the reliability of that cloud, and the trustworthiness of the data it carries, have become the primary targets of adversary strategy. Adversaries are not trying to win a fight within the network. They are working to prove that the network itself cannot be trusted.

The Risk of “Almost Right”

The United States has built a space enterprise designed to prevent situational surprise, and that achievement should not be discounted. But the greater danger lies elsewhere. If adversaries are not trying to surprise the United States tactically, but to outgrow its assumptions strategically, then warning alone will not be enough.

Fundamental surprise announces itself through discomfort—when facts fit poorly, responses feel mismatched, and success no longer produces advantage. Recognizing these moments necessitates a shift from prediction to comprehension, from object-based tracking to effect-based analysis, and from technical certainty to cognitive agility.

If the United States fails to make this shift, it risks becoming something far more dangerous than blind: an exquisitely informed victim of a surprise it never saw coming.

Catherine R. Cline is a U.S. Army major and strategist with operational experience in space, cyber, and special operations, and holds a master’s degree in cybersecurity. She currently serves in a joint assignment at U.S. Space Command.
