Violent conflict has been a constant in human history and is likely to persist. While conflicts often occur between organized political entities—what we call states—other organizing principles, such as tribes, dynasties, or ethnic groups, have also shaped many historical struggles. The Norman invasion of England in 1066 was at its core a personal campaign by William the Conqueror to succeed Edward the Confessor, who had died without a natural successor; there was no declaration of war between states in the modern sense. The Rwandan Civil War (1990–1994), which led to the terrible genocide, was largely an ethnic conflict and similarly proceeded without a formal declaration. Nonetheless, statements approximating declarations of war survive from as long ago as the ancient world, such as the Stele of the Vultures from Sumer, dating to around 2600 B.C.E., just as there is a long history of formal peace treaties since the Treaty of Kadesh of 1269 B.C.E. between the Egyptian Pharaoh Ramses II and the Hittite King Hattusili III. Declarations of war and treaties of peace can thus be thought of as diplomatic instruments, distinct genres of international relations, with long histories but not always employed.
Declarations of war and treaties of peace are by no means exclusively modern phenomena; they are as old as war and peace. However, the widespread expectation that states frame military action in formalized political statements that include a justification for the use of organized violence and an explanation of the motivating grievances is a result of the gradual formalization of statehood and international relations in the course of modernity. The Peace of Westphalia of 1648, which codified the modern terms of sovereignty, was a turning point in this process, which involved at its core the modernization of political authority, i.e., the move away from dynastic rule or other forms of premodern domination toward legalized legitimacy, the rule of law. Henceforth states had to explain the grounds for their actions, especially those that impinged on other sovereignties.
While not explicitly a declaration of war, the American Declaration of Independence exemplifies this imperative of providing justifications for steps that will lead to armed conflict. This obligation underpins the famous opening sentence of the Declaration: “When in the Course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.” The final clause is vital. Prevailing political culture evidently “requires” that the “causes” be enumerated in a declaration in order to justify the pending violence: violence without justification is disallowed. The Continental Congress, which issued the Declaration, was of course not the body of an already existing state but rather a representation of the thirteen separate colonies acting in concert to declare their rejection of rule by Great Britain. Since there was not yet a unified American state, the document cannot be viewed as a full-fledged declaration of war in the modern sense, but rather as an announcement of hostilities by a still nascent “America” against Britain. Yet the text did provide a justification for acts of rebellion that would lead to war, and it therefore approximates a declaration of war in its act of explaining the necessity of violence as the appropriate means to correct a long list of accumulated grievances.
Modern expectations concerning diplomacy were developed further at the Congress of Vienna in 1815, which established a system of post-Napoleonic states as well as the network of international emissaries, embassies, and frameworks for negotiation. The 1907 Hague Convention relative to the opening of hostilities firmly established the expectation that armed conflict between states must be preceded by clear statements of intent. Its first article stipulates “that hostilities between [the contracting parties] must not commence without a previous and explicit warning, in the form either of a declaration of war, giving reasons, or of an ultimatum with a conditional declaration of war.” This requirement of a statement effectively outlawed surprise attacks; there should be no acts of belligerence without prior warning. The word must precede the deed. However, outlawing certain practices does not necessarily prevent them.
When Japan attacked Pearl Harbor on December 7, 1941, it had not issued a prior declaration of war (although there is some indication that it had intended to do so). This circumstance amplified the anger in the United States: the lack of a prior declaration made the attack appear not only brutal but also treacherous, a strike without warning. President Roosevelt responded on December 8 with his “date which will live in infamy” speech, which concludes with a request to Congress: “I ask that the Congress declare that since the unprovoked and dastardly attack by Japan on Sunday, December 7, 1941, a state of war has existed between the United States and the Japanese Empire.” It is worth noting that Roosevelt did not ask that Congress issue a declaration of war in order to initiate conflict; rather, he asked that Congress “declare” in the sense of recognizing the reality that war already existed in the wake of the Japanese attack. What really matters, then, are the deeds, not the words. The secondary status of words—declarations—when measured against the decisiveness of the deeds of war is as old as Pericles’ Funeral Oration as reported by Thucydides, but it is even more familiar from Abraham Lincoln’s phrasing in the Gettysburg Address: “The world will little note, nor long remember what we say here, but it can never forget what they did here.” The actions of the warriors decide the outcome, not the texts of the declarations. As important as the demand of the Hague Convention may be in trying to deter surprise attacks, in the end it is force that decides the victor.
Congress promptly acceded to Roosevelt’s request, legally establishing that the U.S. was indeed at war with Japan, even though the war had already been in existence at least since the attack the day before. The same gap between word and deed marks the end of the war as well. The formal surrender took place when the Japanese foreign minister Mamoru Shigemitsu and General Yoshijirō Umezu signed the surrender documents on the USS Missouri in Tokyo Bay, marking the end of the war on September 2, 1945, which became “V-J Day.” In fact, the tides of war had turned against Japan as early as the Battle of Midway, June 4–7, 1942. Atomic bombs were dropped on the Japanese cities of Hiroshima on August 6, 1945 and Nagasaki on August 9, soon followed by Emperor Hirohito’s radio announcement of surrender on August 15. American occupation troops began to land in Japan on August 28, leading to the formalization of the surrender in the September 2 signing. The definitive peace treaty was the Treaty of San Francisco, signed on September 8, 1951, which came into effect on April 28, 1952. This sequence of distinct events and dates demonstrates that the war came to an end with Japan’s defeat only through a series of steps; the formal peace treaty was merely the culmination of a long process.
What we learn from this history is that wars have begun and ended in different ways, but the expectation of a formal declaration of war is only a twentieth-century phenomenon. Yet even in the wake of the Hague Convention, wars are not always declared in advance of attacks. Moreover, the initiation of hostilities, like their conclusion, is in practice separate from opening announcements or concluding documents. This distance between deeds and words, between fighting and declarations, holds even in the case of the arguably classic modern war, World War II, with the highly dramatic moments of Roosevelt’s speech to Congress and the signing of the surrender on the Missouri. That discrepancy has only grown greater in the subsequent decades. What happens on the ground is more important than what takes place on paper.
Since World War II, there has been no lack of armed conflicts, but formalized declarations of war—in particular declarations issued prior to the onset of hostilities—are rare indeed, as are definitive peace treaties that bring conflicts to complete conclusions. When member states of the Arab League invaded Israel in May 1948, the League did provide a formal statement, albeit one addressed to the Secretary General of the U.N. and not to its adversary, Israel. The U.S. did not issue a declaration initiating the Korean War, since the conflict formally belonged to the United Nations. There was no declaration for the Vietnam War, although Congress did adopt the Gulf of Tonkin Resolution in 1964. Argentina did not declare war when it invaded the Falkland Islands in 1982, nor has Russia declared war in Ukraine, designating its aggression instead as a “special military operation,” which is apparently something different from a real “war.” The U.S. invasion of Iraq (2003) was not preceded by a declaration of war but only by an “Authorization for Use of Military Force,” justified by the claim that Iraq possessed weapons of mass destruction. Turkey’s operations in Syria since 2016 were not preceded by a declaration of war, as its target was primarily Kurdish forces rather than the Syrian state. India and Pakistan have declared war at stages in their extended belligerence, but not, for example, in the 2019 Balakot airstrike, presumably in order to avoid a formal “war” between nuclear states.
The history of peace treaties is similarly mixed. Since World War II, there have been several formal peace treaties, such as those between Israel and Egypt (1979), Israel and Jordan (1994), and Eritrea and Ethiopia (2018), all between states. Agreements less binding than treaties have taken the form of armistices, cease-fires, or other arrangements, especially where non-state actors are involved. The 1973 Paris Peace Accords ended the American war in Vietnam, although fighting continued until the fall of Saigon in 1975. The Good Friday Agreement (1998) ended the “Troubles” in Northern Ireland. The conflict between the government of Colombia and FARC (Revolutionary Armed Forces of Colombia) was ended with a peace agreement—but not a treaty—in 2016.
In general, it is fair to say that the traditional diplomacy of declarations of war and treaties of peace continues to hold some sway, even if such instruments are no longer omnipresent in the resolution of conflicts. Hostilities currently begin and end in much less formalized ways, indeed so much so that the expectation of formal statements appears anachronistic if not fully obsolete. This shift in the practice of international affairs reflects underlying changes in the nature of the international system and warfare, as well as in domestic political cultures. Three aspects of this transformation have particular importance.
- The presence of nuclear weapons and the recognition of their potential for enormous destruction tend to make direct conflict between nuclear adversaries unlikely. As a result, forms of asymmetric conflict ensue. In Vietnam, the U.S. did not do battle with the Soviet Union or China directly, but rather with Vietnamese forces that could be seen (to some extent at least) as proxies for the nuclear powers. In the Ukraine War, Russia has not faced western powers directly (at least not yet), but instead an opponent that Moscow views (at times) as a western proxy. Given the incommensurability between the respective sides, direct diplomacy—of which declarations of war or peace would be components—becomes less likely. Furthermore, asymmetric power arrangements tend not to lead to definitive victories or defeats, meaning that the grounds for conflict may persist even after a notional cessation of hostilities framed loosely as a cease-fire rather than definitively as a treaty.
- Beginning with the Hague Conventions and in the wake of the world wars, a thick network of international law has developed that increasingly subjects any military action to scrutiny and potential litigation in national and international courts. This process of legalization exposes any belligerent party to a supplementary front that has come to be known as “lawfare,” the strategy of tying up the party engaged in violence, or its political and military leaders, in extensive court cases. In this legalized context, eschewing a formal declaration of war can at least delay the initiation of lawfare, to the extent that the claim can be made that the conflict is not “war,” as with Russia’s “special military operation.” This extension of the laws of war, as part of the broader legalization of international affairs in the “rules-based order,” has gone hand in hand with a broad cultural stigmatization of war. The potential heroization of military accomplishment that was still part of the culture of World War II appears to belong to the past; hence the inclination to avoid declaring “war,” pushing armed conflict away from the public eye and into the murkiness of “special operations.” In a related vein, avoiding a formal declaration leaves the belligerent government with greater latitude, since it is not restricted by the terms of any such declaration and the associated international law.
- Article I, Section 8 of the U.S. Constitution grants Congress the power to declare war. Article II, Section 2 designates the president as commander-in-chief and gives him the power to make treaties and appoint ambassadors. While the Senate has the prerogative to “advise and consent” on appointments and treaties, and while both houses together control the budget, the normal course of foreign policy is in the hands of the executive branch. As Congress has become less amenable to compromise, a de facto power shift has taken place, away from the legislature and toward the president. While the president cannot declare war explicitly, he nonetheless retains the capacity to authorize engagement in armed conflict at lower levels. That formal declarations of war have become unlikely results in part from congressional dysfunction, which leaves more power, in many fields including military matters, with the executive. Meanwhile, within the executive, the natural home of diplomacy, the Department of State, suffers from deep-seated structural problems that inhibit the normal conduct of foreign policy. Key functions, including the use of military force in ways short of formal war, become the purview of the National Security Council and the Pentagon. The marginalization of the State Department is cut from the same cultural cloth as the reduced role of Congress, as power shifts toward more instrumental sectors of government.
As we saw above, the brute facts of war and peace are not the same as declarations and treaties. There is an irreducible difference between bullets and documents, between the worlds of soldiers and the words of lawyerly diplomats. To be sure, in the harsh reality of conflict both dimensions are vital: the use of force and the will to negotiate. However, the existential priority of force, the realism of violence, can at best be limited but never eliminated by diplomacy. In the world as it has developed in recent years, we are seeing a further reduction in the capacity for diplomacy as well as the diminished significance of international organizations. This is the fraying of the “rules-based order,” in Ukraine and the Middle East, in the Sahel and in Venezuela, and step by step in the western Pacific. If we lose the will to enforce the law, internationally just as much as domestically, the rule of law will not endure. “Declarations of war” may go out of fashion; war will not.