Strategy, Evolution, and War: From Apes to Artificial Intelligence by Kenneth Payne. Georgetown University Press, 2018, 269 pp.
Strategy, Evolution, and War is an ambitious work that outlines a broad history of strategic warfare and how it has changed throughout human history, then uses that history to predict how artificial intelligence (AI) will change it further in the near future. Dr. Kenneth Payne, whose past work links evolutionary psychology with modern warfighting, claims that AI’s potential to make decisions based on a distinctly nonhuman psychology could change warfare more radically than anything since the development of the social human brain. He leverages the work of a strong cadre of scholars in history, behavioral economics, psychology, and international relations to provide the theoretical bases for his arguments. Payne then illustrates the advantages and dangers of AI and its effects on warfare, acknowledging its dramatic potential without succumbing to science-fiction-style exaggeration.
Payne’s central thesis is that, despite the frequent invocation of “revolution” in discussions of military strategy, only two developments in history have truly revolutionized strategic decision making. The first occurred roughly 100,000 years ago, when the human brain fully developed its capacity for social interaction, theory of mind, elaborate deception, and cooperation. The second is occurring today and will be fully realized when AI is charged with making strategic decisions or autonomously carrying out strategy.
The author begins by setting necessary boundaries for his work. He first limits his discussion to the strategy of warfare. A discussion of AI’s potential impact on strategy in other realms would be interesting, and the author occasionally references the other instruments of power, but such a limit is necessary to keep this already ambitious work focused on its intended topic. He also discusses the definition of autonomy in both human and AI decision making. He questions whether an AI could ever be truly, completely autonomous, and further asks whether human beings, with their unconscious heuristics and chemically driven mental processes, are fully autonomous themselves. Additionally, he sets a high bar for what counts as a revolutionary development: something that changes the very foundation of warfare psychology. Finally, he discusses the psychological underpinnings of human strategy and how they developed from an evolutionary standpoint. This section is essentially a literature review, citing scholars in evolutionary psychology and behavioral economics as well as work on the development of cultures, political systems, and wartime strategy. The competing viewpoints Payne references result in a brief yet complete overview that underpins the rest of his arguments.
From here, the work moves quickly through several major developments in warfare, from hoplite tactics in Greece, to Clausewitz’s theories of war, to airpower, and eventually, nuclear weapons and the Cold War. Payne states that all of these changed how wars were discussed, planned, and executed, but argues that none changed the foundations of strategy that flow from our human decision-making processes. He argues the creation of writing systems came close by externalizing and recording strategic thinking for future examination, but even this did not change our evolved psychology. Similarly, nuclear weapons, the most dramatic change in weapons technology in history, did not change the way we make decisions. Their destructive power simply accentuated certain heuristics and biases, like loss aversion, that were already present in our psyche.
This segment may leave a student of history wanting more. Payne admits skipping large portions of human history, including many dramatic changes in tactics and technologies. However, this does not detract from the discussion but allows Payne to demonstrate his point without getting bogged down in a deeper examination of the history of human conflict. Additionally, Payne’s extensive references provide curious readers plenty of material to examine further if they desire.
In the third and final section of the book, Payne examines AI. He accurately characterizes today’s AI as more of a decision-making aid than a decision maker and posits potential futures for AI development. Most notably, he emphasizes that AI decision making at the tactical, operational, and strategic levels will be driven by distinctly nonhuman decision-making processes, unconstrained by the heuristics and biases of human psychology. It is this, and not a sci-fi-inspired rogue AI, that would cause strategic warfare to deviate sharply from our plans and expectations, and from the strategic decision making that has guided us in conflict for all of human history.
Further, Payne notes that goals can change during conflict, and nations often shift their objectives. The use of a strategic AI would severely limit this flexibility. If two AI-supported nations were in conflict, the speed at which the AIs could operate means the nation that takes time to adjust its AI’s goals would be at a distinct disadvantage. During that adjustment, the opposing AI could cycle through its observe, orient, decide, and act (OODA) loop thousands or millions of times, rapidly compounding its decision advantage.
This book offers an outstanding synopsis of the evolution of strategy in war and a great jumping-off point for discussions on the future of AI. It is impressively complete for its brevity, but readers with established knowledge of history or AI will likely want more depth from the discussion. Indeed, a longer treatment of how AI relates to and deviates from our evolutionary psychology would be a valuable addition to the body of thought that feeds the AI discussion. Nevertheless, the thorough citations offer plenty of opportunity to explore those topics more deeply, and the book’s brevity allows readers of all knowledge levels to develop a baseline or to stimulate thoughts and conversations on AI’s implications for strategy. I highly recommend it to anyone interested in the history and future of strategy and how AI fits into that future.
Capt Brian Hill, USAF
Fort Bragg, North Carolina