GAO reports mixed results for the military's major automated information systems
The Government Accountability Office (GAO) has completed its first assessment of the military's major automated information systems (MAIS), and reports mixed results, particularly for the Army's Global Combat Support System-Army (GCSS-A) and the Air Force's Defense Enterprise Accounting and Management System (DEAMS), though the Navy's CANES program received better marks.
Of the 14 selected Department of Defense MAIS programs, the GAO reports that 9 stayed within their planned cost estimates, while 5 did not (with cost increases ranging from 3 to 578 percent); 5 programs remained on schedule, while 9 experienced delays (ranging from 6 months to 10 years); and 8 programs met their system performance targets, while 5 did not fully meet their targets. One program did not have system performance data available.
Looking at these areas collectively, 3 programs stayed within their planned cost and schedule estimates and met their system performance targets, and 2 programs experienced shortcomings in all three areas: cost, schedule, and performance.
“The three selected programs demonstrated mixed results in effectively defining and managing risks of various levels,” according to the GAO report issued March 28, titled Selected Defense Programs Need to Implement Key Acquisition Practices.
“Specifically, Navy's Consolidated Afloat Networks and Enterprise Services (CANES) had implemented key practices for risk management, including identifying risks that could negatively affect work efforts. In contrast, the Air Force's DEAMS’ risk reports were out of date and not regularly updated to include the current status of mitigation actions. To its credit, the program had recently taken steps to improve its risk management process, such as establishing a risk and issues working group. These recent steps should help the program effectively identify and manage program risks going forward.
“Finally, Global Combat Support System-Army had developed program risks and mitigation plans, but the program was using multiple risk management systems that contained inconsistent data. Until the program establishes a risk management system that includes a comprehensive and up-to-date log of all current threats to the program, it will lack assurance that it is appropriately mitigating all identified risks.”
The three selected programs demonstrated mixed progress in implementing key requirements management and project monitoring and control best practices. Specifically, the Navy and Army programs had implemented key requirements management best practices.
However, while the Air Force program had also implemented selected practices, it had not traced all of its lower-level requirements to its desired higher-level system capabilities, which is inconsistent with requirements management best practices, the GAO reports. Program officials stated that they expect this mapping to be completed by the fourth quarter of fiscal year 2013, according to the report.
Regarding project monitoring and control practices, the Navy program had implemented key best practices, while the Air Force and Army programs lacked certain practices. For example, while the Air Force program regularly communicated with its stakeholders, it had not ensured stable leadership, having had four program managers in the past 4 years.
“DOD commented that it supports tenure agreements, with the first two program managers each completing 3-year terms,” stated the GAO report. “While the third and fourth program managers did not complete 3-year tenures, DOD stated that it expects the current program manager to do so.”
The report went on to say that the Army program also met with stakeholders, but it did not effectively use its independent verification and validation (IV&V) function to monitor its program. Until the Army program specifies the roles and responsibilities of the IV&V agent to ensure that it maintains its independence from the risk management processes that it reviews, it jeopardizes its ability to fully monitor and control the program.
“GAO recommends that DOD direct the Army program to address weaknesses in its risk management and IV&V practices. DOD concurred with these two recommendations and provided additional information that removed the need for a third potential recommendation regarding leadership on the Air Force program.”