Summer reading
My summer reading has been a little unusual.
I re-read James Salter’s The Hunters, a novel about US fighter pilots in the Korean War, and then Robert Coram’s biography of USAF Colonel John Boyd.
Boyd is a fascinating character: massively confrontational in his interpersonal style, but a genuine, creative strategic thinker, as shown by his influential Energy-Maneuverability and OODA loop theories (see link in comments). The books make great companion reading, and both are relevant to AI.
In The Hunters, success in aerial combat comes down to one thing: “who could see the farthest.” The main character, Captain Cleve Connell, discovers that technical skill isn’t enough. Survival depends on the ability to read patterns, assess character under pressure, and make split-second judgments based on hard-won experience. The same critical discernment that Salter’s Korean War pilots needed—and that Boyd later codified in his OODA loop—has become a valuable skill in the age of AI.
Boyd’s OODA loop (Observe, Orient, Decide, Act) reveals why human judgment remains irreplaceable. While AI excels at the “Observe” phase (processing vast amounts of data), it can break down at “Orient,” the crucial step where raw information becomes actionable insight.
Recent analysis shows AI lacks what researchers call a “world model.” Language models can suggest putting glue on pizza or classify a turtle as a rifle because they operate through statistical pattern matching, not understanding. Unlike humans, AI has no lived experience against which to check its outputs. Just as Boyd’s pilots needed to “orient” their observations within tactical understanding, we need to approach AI with what might be called a “Think-Prompt-Think” cycle. Before querying an AI system, we must think about what we’re really asking and why. After receiving output, we must think critically about whether it makes sense given what we know about the world.
We need to develop what Salter called “the flyer’s eye”—the ability to see not just what’s there, but what’s missing or misleading. As AI automates routine analysis, the premium shifts to those who can ask the right questions and interpret ambiguous results. Salter’s Korean War pilots operated at a unique inflection point, flying the last jets to engage in pure air-to-air combat without computers. They succeeded through a combination of (at the time) “advanced” technology and ancient human skills: judgment, pattern recognition, and the ability to read character under pressure.
AI provides unprecedented analytical power, but getting the most from it requires people who can recognize when statistical patterns mislead, who understand how organisations and individuals behave under stress, and who can maintain perspective.
-
John Boyd - The fighter pilot who changed the art of war
Generative AI’s crippling and widespread failure to induce robust models of the world