How the Joint Staff Calculated a Defense Program’s Return on Investment
As our office computed the utility of its wargaming grants and library, we learned some things that others in DoD may want to copy.
When our office was handed the Defense Department's Wargaming Repository and a $10-million annual Wargaming Incentive Fund last year, senior leaders made clear that good stewardship of these valuable resources would include calculating their return on investment. Of course, calculating ROI for a defense initiative is more difficult than doing it for a company, where one can compare investment with cash flows.
So we — that is, the Joint Staff’s Studies, Analysis, and Gaming Division — adapted an approach described in a Harvard Business Review article. Because the purpose of defense programs is to support a public good — national security — and not to generate profits, we removed the second step (“Forecast the cash flow”) and doubled our analytical efforts on the fourth (“Evaluate the investment”).
Essentially, we sought to answer two sets of questions:
- Are DoD organizations using the Repository to disseminate the lessons they learn through wargames? Are other organizations reading their writeups?
- Are WIF-funded games aligned with senior-leader priorities? Do the games inform senior decision-making?
To answer the first set of questions, we looked at readership statistics for the Repository, which at the time contained roughly 750 entries on completed and future wargames. These offer such details as purpose; scenario; methodology; number of participants; points of contact; and whatever lessons, insights, and best practices may have emerged. Among other uses, they feed a monthly wargaming report sent to 500 officials from the Office of the Secretary of Defense, the Joint Staff, service branches, and combatant commands.
We found that these entries had been viewed more than 14,000 times by more than 800 people from 20 DoD organizations (including the OSD, the Joint Staff, and all of the service branches and combatant commands). Nearly three-quarters of the views came from organizations that had not participated in the wargame in question; in other words, people were reading about others' wargames. This exceeded our expectations; we had judged that the minimum required return should be entries that are viewed at least half of the time by unrelated organizations.
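For other offices that want to run a similar check, the arithmetic is simple enough to sketch in a few lines of Python. The view records and organization names below are invented for illustration; they are not the Repository's actual data model.

```python
# Hypothetical view records: (reading_organization, sponsoring_organization).
# These names and numbers are illustrative, not actual Repository data.
views = [
    ("EUCOM", "Joint Staff"),   # EUCOM reading a Joint Staff game
    ("Army", "Army"),           # Army reading its own game
    ("OSD", "INDOPACOM"),
    ("Navy", "Navy"),
]

# Count views where the reader did not run the game in question.
unrelated = sum(1 for reader, sponsor in views if reader != sponsor)
share = unrelated / len(views)

threshold = 0.5  # our minimum required return: half of all views
print(f"Unrelated-organization share of views: {share:.0%}")
print("Meets threshold" if share >= threshold else "Below threshold")
```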
Turning to the second set of questions, we focused on 54 WIF-funded games run between May 2016 and April 2018. (That’s roughly one-fifth of the total entered into the Repository during that timeframe; these games were run by 14 DoD organizations, including the Office of the Secretary of Defense, the Joint Staff, all service branches, and most of the combatant commands.)
To understand whether these WIF-funded games were aligned with senior leaders’ priorities, we checked them against the 2018 National Defense Strategy’s principal priorities (China and Russia) and secondary ones (North Korea, Iran, and counter-terrorism). We found that 68 percent of the war-gamed scenarios focused on the principal priorities; 24 percent looked at the secondary priorities; and 8 percent focused on topics outside of these priorities but relevant to national security.
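A rough sketch of how such an alignment tally can be computed, using invented scenario tags rather than the actual game data, which is classified and far more detailed:

```python
from collections import Counter

# Priority tiers follow the 2018 National Defense Strategy; the game list
# below is made up for illustration.
PRINCIPAL = {"China", "Russia"}
SECONDARY = {"North Korea", "Iran", "counter-terrorism"}

games = ["China", "Russia", "North Korea", "China", "other", "Iran", "Russia"]

def tier(topic: str) -> str:
    """Map a scenario topic to its NDS priority tier."""
    if topic in PRINCIPAL:
        return "principal"
    if topic in SECONDARY:
        return "secondary"
    return "other"

counts = Counter(tier(topic) for topic in games)
total = len(games)
for name in ("principal", "secondary", "other"):
    print(f"{name}: {counts[name] / total:.0%}")
```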
Determining how much WIF-funded games inform senior decision-making is difficult. Senior leaders occasionally vouch publicly for their utility; in April, the leader of U.S. Transportation Command testified to Congress that his wargame revealed critical security vulnerabilities and “drove changes in how we plan for attrition, cyber, mobilization, authorities, access, and command and control.” But this is relatively rare; for one thing, the results of most games are classified. So we looked at other metrics. We found, for example, that 32 percent of such wargames had direct participation of at least one general officer or member of the Senior Executive Service. We also found that senior leaders received lessons-learned briefs after 96 percent of WIF-funded wargames. In an effort to improve our understanding, we have begun requiring awardees to complete a post-wargame questionnaire.
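Because most game results are classified, proxy indicators like these reduce to simple per-game flags that can be rolled up into rates. A sketch with invented records and field names of our own choosing:

```python
# Hypothetical per-game records; the field names and values are ours,
# not the actual WIF tracking data.
games = [
    {"go_ses_participated": True,  "leaders_briefed": True},
    {"go_ses_participated": False, "leaders_briefed": True},
    {"go_ses_participated": True,  "leaders_briefed": False},
    {"go_ses_participated": False, "leaders_briefed": True},
]

def rate(records, key):
    """Share of games where the given flag is true."""
    return sum(r[key] for r in records) / len(records)

print(f"Games with GO/SES participation: {rate(games, 'go_ses_participated'):.0%}")
print(f"Games followed by a senior-leader brief: {rate(games, 'leaders_briefed'):.0%}")
```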
Our efforts to calculate ROI had some unexpected benefits. Combing through the Repository's back-end usage data helped us improve its interface and make it a better information-sharing platform. We also discovered that some organizations that accepted WIF grants had not fulfilled all of the related obligations, such as the requirement to brief senior leaders on lessons learned. So we barred WIF recipients from receiving additional grants until they met their outstanding obligations, and began sharing compliance updates at our biweekly meetings. Over the year starting in July 2017, the proportion of delinquent WIF awardees declined from 38 percent to less than 9 percent.
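Tracking that compliance trend required nothing more elaborate than periodic snapshots of how many awardees still had unmet obligations. A sketch with illustrative numbers, not our actual compliance records:

```python
# Hypothetical snapshots: month -> (awardees with unmet obligations, total awardees).
snapshots = {
    "2017-07": (11, 29),
    "2018-01": (6, 31),
    "2018-07": (3, 34),
}

for month, (delinquent, total) in snapshots.items():
    print(f"{month}: {delinquent / total:.0%} of WIF awardees delinquent")
```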
The current National Defense Strategy aims to "ensure effective stewardship of taxpayer resources," and other DoD organizations may be asked to calculate the ROI of programs they manage. This may be a challenge, both because of a lack of requisite knowledge or experience and because ROI is a business concept that typically measures the profitability rather than the public usefulness of programs. Nevertheless, based on our experience, we believe DoD managers can carry out sensible ROI analysis that is useful to their own organizations and to leaders. Such analysis must ask a focused set of questions, seek the relevant hard data, and stay centered on the original intent of the program and its current utility to senior defense leaders. That was the essence of the approach we took, and we believe others could adopt it to their benefit.
The views expressed in this article are the authors’ own and do not represent the views of the Department of Defense or the U.S. government.