Slide from DCGS January 2016 evaluation. CREDIT: Director, Operational Test and Evaluation, Pentagon.
“Everyone focuses on this little piece of fiberglass flying around called an unmanned aerial vehicle,” Lt. Gen. David Deptula, a key architect of the aerial surveillance system, told Air Force Magazine in September 2015. “But it’s just a host for sensors that provide data to this vast analytic enterprise we call the Distributed Common Ground System, which turns the data into information and hopefully knowledge.”184
Housed on dozens of networked military computers scattered around the world, DCGS is built and maintained by over 70 contractors including Booz Allen Hamilton of Virginia, L-3 Communications, Lockheed Martin, and Raytheon.185
DCGS allows users to access some 700 disparate sources of intelligence information186 including live video feeds, thermal imagery, radar, and mobile phone tracking data; take advantage of social network analysis tools including Palantir and Modus Operandi to analyze data; and not least, to set up and execute targeting missions via the NCCT.
A 2015 fact sheet produced by the LeMay Center at Alabama’s Maxwell Air Force Base best explains the system’s capabilities: “An example of cross-cueing would be a DCGS signals operator employing sensors aboard a U-2 on the other side of the planet to geolocate a target signal and then cue a geospatial analyst working in the same room to coordinate with a Predator unit thousands of miles away to steer its video sensor to observe the source of the signal, and immediately report their findings directly to a supported unit in the area.”187
In practice, this means a phone signal tracked by a U-2 pilot flying 60,000 feet over Syria could be observed in close to real time by a DCGS analyst in Virginia who could ask a drone pilot in Nevada to zoom a camera on a Predator at 10,000 feet so that an imagery analyst in Florida could take a closer look before calling in a jet to drop a bomb. The Pentagon calls this “reachback” because it allows troops in the field to get immediate support from military personnel at bases located in the U.S.
The first DCGS was set up in 1994 and dispatched to Guantanamo Bay, Cuba, to support U.S. military operations in Haiti.188 Since then it has evolved into a global system with five main Air Force DGS hubs—in California, Germany, Hawaii, South Korea, and Virginia—and dozens of smaller sites scattered around the U.S.189 All military services now have their own versions of DCGS (see box) that can receive and redistribute data from aircraft ranging from Predators to U-2s.190
The main DCGS sites are vast, windowless warehouses where analysts work in small groups monitoring multi-screen systems on darkened operation floors.191 Their main task is processing, exploitation, and dissemination (PED). Every day the DCGS commanders create a PED tasking order (PTO) to instruct drone operators on the data they would like to collect.192 As the data flows back to be archived, it is then tagged and analyzed by software as well as by human operators. DCGS also employs linguists who interpret data and intercepted conversations to ensure accurate tagging and analysis and occasionally to support soldiers in the field in near real time.193
“We do data conditioning to help analysis. We help with the mundane tasks to make data ready to be analyzed, make it easier to discover and get on the screen,” Patrick Biltgen, a senior mission engineer for BAE Systems Intelligence and Security who worked on DCGS, told Tactical ISR Technology.194
Yet to this day, DCGS is a multi-billion dollar boondoggle. A staggering 54 out of 64 Air Force DCGS users surveyed by the Institute for Defense Analyses in 2015 rated its usability below the minimum acceptable score.195 A 2016 Pentagon evaluation unearthed by CorpWatch suggested that the Air Force system was unavailable 67 percent of the time.196 And a 2012 Pentagon evaluation reported that DCGS-Army had to be rebooted every eight hours.197
To understand why the system is such a shambles, it is important to view DCGS in the context of how the military has historically collected data. “The root cause of many of these difficulties is adherence to a centralized Cold War collection management doctrine focused on production [of large quantities of intelligence] rather than goals and objectives,” wrote Col. Jason Brown in Joint Force Quarterly in 2014. At the time, Brown worked at the DCGS base in Ramstein, Germany. Today he has overall responsibility for DCGS as commander of the Air Force’s 480th ISR Wing. Its motto is Non Potestis Latere, Latin for “You Can’t Hide.”198
For example, ever since the late 1950s, the Pentagon has relied on high-flying spy planes like the U-2 together with satellites to provide information on distant enemies. Intelligence was often derived from a series of high-resolution photographs taken from space. The film was delivered to ground analysts who pored over them for details and especially for changes over time to spot troop or weapons movements. A data request could easily take eight days to fulfill.
Even after the advent of video-equipped drones in the Yugoslav war in the 1990s, the analysts continued to follow the old system by converting video feeds into still images that they printed out on paper and examined.199
After the 2003 invasion of Iraq, such methods quickly became moot. Soldiers on the ground were no longer trying to detect the advance of slow-moving, bulky tanks. Instead they were up against shadowy networks of quick-moving urban fighters who could plant a roadside bomb at night and then blow it up by remote control hours or even minutes later.
It didn’t help that the analysts had no real understanding of what data to request. “For example, analysts would submit GMTI [requests] over cities failing to recognize the … platform’s inability to distinguish moving targets in the clutter of an urban environment,” Brown adds. “Many leaders and analysts eventually realized that it was not viable to submit formal intelligence requirements and then hope all the pieces would arrive at the right time.”200
Meanwhile, as new data gathering technologies proliferated, the Pentagon simply tasked new recruits to collect it all, regardless of the ultimate goal. “Current hierarchical collection management processes separate the tasks of collectors, exploiters, and analysts into ever-smaller discrete tasks, but in practice their reassembly downstream rarely works as elegantly as doctrine suggests,” Brown wrote in another magazine. “This Industrial Age mentality assumes the end goal is ‘finished intelligence’ produced in centralized factories assembling components created in isolation from one another.”201
The biggest such intelligence “factory” is the one that Brown now manages at the headquarters of the 480th Wing at Joint Base Langley-Eustis.202

Hundreds of analysts work side by side at that Hampton, Virginia, base, in semi-circular pods of six.203 Around the world, Brown has a total of 6,000 analysts working for him at the 27 Air Force DCGS sites. Every single day these analysts manage an estimated 20 terabytes of data, which they categorize into searchable databases like UNICORN (the unified collections operations reporting network).204
At the same Virginia Air Force base, this data is then converted into strike targets by the 363rd ISR Wing205 under the command of Col. Michael “MiG” Stevenson.206 “Our analysts go through and sort through all that and do long-term studies and determine trends. So using their data from a year ago to the present day, we come up and determine: here is what the enemy’s doing,” Stevenson told local reporters on a March 2015 publicity tour of the base. “It’s a very detailed, methodical process to get targets advanced to the point where they are actually struck by aircraft.”207
But more candid interviews suggest that the analysts, most of whom joined the military straight out of high school and are rarely older than 25, are simply following a rule book blindly. “Many assume that every crew was aware of what not only they were doing and why, but also what the other assets assigned were doing. The reality is that is not the case,” Lt. Commander Peter Salvaggio, who was in charge of a piloted Lockheed EP-3 reconnaissance aircraft, told a military researcher back in 2011. “And the sad part is that you typically find out months later at a conference over a cup of coffee during a BS session. Only then do you find out what was really being requested.”208
Now, the Air Force wants to solve this problem by entrusting more to algorithms. “We have invested in more airmen analysts, but the growth in our force cannot keep up with the growth of raw data,” Maj. Gen. Robert Otto, commander of the Air Force ISR Agency, told the journal Tactical ISR Technology. “To deal with this we need to develop more advanced, more automated search and analysis tools.”209
Yet this approach has already failed and is not likely to improve without more investment in the human element. Automated search tools in the hands of young soldiers with no knowledge of cultures halfway around the world are likely to increase errors rather than reduce them.
In addition to the overwhelming quantity and often useless quality of the data gathered—or perhaps precisely because of it—Pentagon evaluators have consistently given DCGS a failing grade since as far back as March 2010.210 This pattern began soon after the contractors were asked to upgrade the software to allow intelligence analysts scattered around the world to collaborate via the internet.211
Several years later, all indications suggest that Air Force DCGS is as clunky as ever. “Major system shortfalls included system instability, slow system response times, and an inability to simultaneously receive and exploit full motion video, Global Hawk imagery, and U-2 imagery,” J. Michael Gilmore, director of the Pentagon’s Operational Test & Evaluation Directorate (DOT&E), wrote in a 2014 memo to Bill LaPlante, assistant secretary of the Air Force for acquisition.212 Gilmore noted that the Air Force itself had concluded internally that the system was not “operationally effective or operationally suitable.”
DCGS is not unique to the Air Force. The Army, Marines, Navy and the Special Forces each has its own version of the globally networked computer system, with the Navy even fielding versions on ships. The Army version, built by a number of major military contractors led by Northrop Grumman, has been heavily criticized by soldiers who say that the system barely works, noting that it was even unable to provide routine weather forecasts because of coding errors.216
The soldiers have an advocate—Duncan Hunter, a member of Congress from San Diego who has been waging war against the incumbent contractors for years.
“It’s supposed to be like this big cloud portal, so that anybody can access it. But nobody does—because it doesn’t work! It’s like opening PowerPoint or whatever and clicking on everything and nothing works,” Hunter told the New Republic magazine in 2013. “For all of Afghanistan, it’s got a total of sixty-six persons of interest. You would think thousands. It’s a complete scam.”217
Hunter’s argument was not welcomed by military top brass, who went as far as to instruct senior officers to push back by calling their own members of Congress.218 Unfortunately, Hunter undercut his own argument by vehemently supporting a rival system manufactured by Palantir, which has been lobbying to take over the contract.219
He revived his criticism in October 2015 when an Air Force AC-130 gunship mistakenly bombed a Doctors Without Borders hospital in the Afghan city of Kunduz, killing at least 22 people and injuring over 30. “My office has learned from multiple service members and officers that … the primary components of the Pentagon’s flagship Intelligence system, the Distributed Common Ground System, were not operational in Afghanistan,” Hunter wrote in a letter to Ashton Carter, then head of the Pentagon.220
Documents from the DOT&E back up Hunter’s criticisms. “Battalion commanders and staff indicated they did not consider [DCGS] to be very helpful for the fight on the ground. As a workaround, some battalion analysts resorted to tracking the battle using pencil and paper,” J. Michael Gilmore, the DOT&E director, wrote in a January 2016 evaluation of DCGS-Army.221
To get around these failing grades, the Air Force had rigged up a scaled-down version with only the elements that had been approved. But Gilmore concluded that the modified version did “not provide a joint, net-enabled capability for controlling [intelligence] platforms and sharing the data they collect.”
To convince evaluators that the system was viable, the Air Force then split the oversight of DCGS into four smaller programs in the hope of getting at least some parts approved. This scheme prompted an angry memo from Gilmore. “The reduced level of oversight and priority is increasing the opportunity for continued problems, lack of resources and priority, and provides a false impression of reduced risk associated with the program,” he wrote.213
But, instead of consolidating the system, the Air Force then split the oversight of DCGS into eight parts in 2015, which resulted in yet another angry memo from Gilmore. “Such a balkanized test program does not permit an accurate assessment of the overall AF DCGS operational capability,” he wrote in a 2016 memo obtained by CorpWatch under the Freedom of Information Act.214
In the same memo, Gilmore noted that AF-DCGS was up and working only 33 percent of the time, despite a contractual requirement of at least 98 percent availability. “Evaluation results revealed serious problems with the Air Force’s ability to collect, reduce, and report signal intelligence.”
Meanwhile, the Pentagon’s evaluators also reported that the contractors were making little progress with integrating new sensors. By 2016 they had only been able to add one new working element: the synthetic aperture radar system, which is not even widely used.215