Update, 8/30, 2:50 EST: the United States Army's Deputy of Cybersecurity confirmed to BuzzFeed the existence of a computer security flaw that enables users without proper security clearance to gain unauthorized access. He says the best fix is to make soldiers aware of proper conduct, instead of fixing the technology itself.
The U.S. Army has been aware for years of a major security flaw in the system soldiers use to access computers — and has done nothing to fix it, two sources, including an officer who alerted superiors to the risk, told BuzzFeed.
Update: Roy Lundgren, the Army's Deputy of Cybersecurity, confirmed to BuzzFeed that the security failure exists and has the potential to give users unauthorized access. [Full statement below.]
Today countless computers, and the soldiers who use them, remain vulnerable to a simple hack, which can be executed by someone with little or no security expertise.
The officer, who reported the flaw two years ago, was told to keep quiet, despite evidence of its widespread exploitation. Another soldier, who went to his superiors and even Congress, got no results.
The hack allows users with access to shared Army computers to assume the identities of other personnel, gaining their security clearances in the process, by exploiting issues with the computers' long and buggy log-out process, according to the sources familiar with the flaw.
The officer, an Army lieutenant, spoke on the condition he not be named; he is referred to here as "Mark." He discovered the flaw in October 2011, when he was playing around on his military computer during one of his 18-hour shifts. Being "of the hacker mind-set and being really, really bored," as he puts it, he wanted to see if there were any holes in it.
That's when he discovered the major, and obvious, computer security flaw.
"Oh shit," Mark said to himself when he figured it out. "This isn't good."
He described to BuzzFeed calling in his superiors — two middle-ranking officers, one in military intelligence and the other in computer communications.
As Mark described it, their eyes grew wide.
But, according to Mark, they told him there was nothing they could do. It would cost too much to fix it, they told him. It would require redoing too many contracts. "The term they used is that it would be 'impractical' to try and fix it," he says.
Instead, they made him sign the Army's version of a nondisclosure agreement. If he told anyone else about what he found, he could face prison time, he said.
"I'm showing you this so you can fix this," Mark recounts telling the officers. "This is obviously a huge problem. I'm probably not the only asshole who figured out how to do this."
Update: "If an issue is reported to our cybersecurity directorate, we would normally contact the system owner and ask them for an assessment," the Army told BuzzFeed, not commenting on the response to this specific report. "Often the risk is known and mitigating factors are already being applied and/or the organization has developed a plan of action to correct the issue."
At least one other soldier besides Mark has tried to formally report the security flaw, going to his military superiors as well as Congress and the Pentagon. This soldier's efforts, too, were met with inaction and silence.
Mark made a second attempt to report the security flaw when a new officer replaced one of his superiors. But again, nothing came of it.
"At that point I could try to talk with one of the division-level guys, but I know from personal experience that he is one of the people who plays the game," he said. "I wondered if it would raise a red flag about me if I tried to keep addressing the flaw."
Update: When asked about Mark's non-disclosure form, the Army did not comment. The proper procedure, it said, is for soldiers to report flaws within their chain of command and to alert the supporting professional IT/cybersecurity staff.
Big private tech companies like Google, Facebook, and Microsoft routinely seek out and sometimes pay people like Mark who expose security flaws. Some have set up bounty systems giving any member of the public who finds and reports a bug up to $20,000.
The military has no such system. If reporting to a superior goes nowhere, then in reality, there is little recourse for soldiers who discover computer security problems. They could report a bug to the Department of Defense inspector general, which handles complaints about fraud, waste, and abuse. But that's not an obvious avenue for computer issues. Moreover, if their superiors found out, they could face retaliation.
One refrain in the wake of the National Security Agency leaks is that Edward Snowden should have reported his concerns up the chain of command rather than leaking documents to the press. But the internal reporting system is seriously broken in the military. All too often when a soldier reports misconduct or illegal activity, it is swept under the rug.
Perhaps the most egregious recent example of such a mind-set is the tragically late response to reports of widespread sexual assault in the service. Women's reports weren't just ignored — the victims were subject to retaliation including but not limited to being barred from medical treatment, having their information made public, and being discharged from the military. Recent pressure on the issue led to an updated version of the Military Whistleblower Protection Act, first created in 1988. The fact that it had to be updated to specifically include people reporting sexual assault speaks to its inadequacy.
Retaliation against internal whistle-blowers is a fact of military life. Between October 2012 and April 2013, the Department of Defense's inspector general's office received 695 complaints about "whistleblower reprisal, restriction of service members from contacting an IG or member of Congress, procedurally improper mental health referrals and senior official misconduct." Those are only the cases which were reported.
Mark's case suggests serious issues with the military's security reporting infrastructure too, even when the issue at hand is ideologically neutral.
Now, almost two years later, the security flaw still exists, as the Army itself has confirmed.
"It is still happening," says Mark. "People know about it and no one is addressing it." Knowledge of it has even spread to low-level soldiers who don't work in technology, as more than one source confirmed to BuzzFeed.
Update: The Army contends that instead of fixing the security flaw itself, individual soldiers should make sure they are properly logged off. "The government and industry must manage numerous risks each day," says Lundgren. "Often software and/or hardware solutions are not available, supportable, or necessary. In the case of many risks, they are managed via other mitigations such as modifying policy, procedures, or training."
In response to the problem, the Army is planning an "Information Assurance/Cybersecurity Awareness week" in October as a follow-up to its new handbook, released last February, which stresses the importance of individual responsibility for protecting information.
To fully understand the significance of the security flaw, you need to understand the Army computer security system. In order to log into a shared Army computer — say, in a computer lab on a base — you need to insert your personal Common Access Card (CAC) military ID. Each card contains a chip that has the individual soldier's permissions and security details, and which helps the military track your activity. Once you remove the card, you are fully logged out.
But Mark found that it was possible to access the system as the last user, even after his or her military ID had been removed.
When a computer stalls during the shut-down process — if, for example, a program locks up and requires a force quit, or if Outlook delays the process with a large file upload — the computer can remain temporarily logged in without the presence of the key card. If the next user jumps on at that moment, the shut-down process can be canceled and the log-in remains active, credentials and security clearance included. All subsequent activity will be recorded as the previous user's.
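The mechanism the sources describe is, in effect, a race condition between card removal and session teardown. The toy model below sketches that logic in Python; every name in it is invented for illustration, and it models only the concept reported here, not the Army's actual software.

```python
import threading
import time

class Workstation:
    """Toy model of a flawed logoff flow: removing the card starts a
    slow cleanup, and a cancellation during that window keeps the
    previous user's session (and clearance) alive."""

    def __init__(self):
        self.active_user = None
        self._logoff_cancelled = False

    def insert_card(self, user):
        self.active_user = user
        self._logoff_cancelled = False

    def remove_card(self, cleanup_seconds):
        # Pulling the CAC starts logoff, but cleanup work (force-quit
        # scripts, a large Outlook upload) keeps the session open first.
        def logoff():
            time.sleep(cleanup_seconds)       # the vulnerable window
            if not self._logoff_cancelled:
                self.active_user = None       # only now fully logged out
        threading.Thread(target=logoff).start()

    def cancel_logoff(self):
        # A second user acting inside the window cancels the shutdown.
        self._logoff_cancelled = True

ws = Workstation()
ws.insert_card("soldier_a")
ws.remove_card(cleanup_seconds=0.2)  # soldier A pulls the card and walks away
ws.cancel_logoff()                   # soldier B cancels the logoff in time
time.sleep(0.3)
print(ws.active_user)                # still "soldier_a"
```

Everything the second user now does is attributed to soldier A's credentials, which is exactly the identity-swap the sources describe.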
Update: "There are instances where the logoff process does not immediately complete upon removal of the CAC. This occurs when the system is running logoff scripts and shutting down applications," Lundgren told BuzzFeed. "The period of time that a system can be accessed following CAC removal before system logoff completes is normally not sufficient to gain unauthorized access."
This is almost certainly the result of a system design mistake, not malice, according to Daniel Cohen, an RSA cyber-security expert. "Personally I haven't heard of this exploit or weakness in the system, but it sounds very severe," he says.
According to Mark, the hack is simple to accomplish on both secure and non-secure computers. Mark has even tested the exploit to see if it would allow a user to gain access to SIPRNet, the classified DoD network from which Chelsea Manning acquired some of the files she then leaked to the press. It could.
Since many military computers have stuffed, cluttered hard drives as the result of long-term use by large numbers of soldiers, they often hang while shutting down. When soldiers sharing computers are in a rush, this identity swap can easily happen by accident.
For a hacker or leaker to manipulate this exploit would be easy. It would simply involve "a little bit of social engineering," as Mark says. "But that is easy since most people just pull their card and walk away, often without looking at the screen. 'Hey, buddy, can you print X out before you go? Wait, you can't find X? Let me pull it up. Can you grab it off the printer? Thanks, man, here's your card; see you in 12 hours.'"
Recently, Mark saw a number of soldiers watching an Entourage DVD on an operation center computer. "Hey, you don't have rights on that computer," Mark recounts saying to one of the soldiers. "I look at him and he says, 'Well, sure,' and he pulls out his card and waves it at me and the computer still plays."
It's not just the log-in problem. Security in general is fairly lax in the computer rooms overseas. After the Manning leak, one of the fixes advised was to have soldiers rename various files in the SIPRNet database, as if that would add a level of security. Soldiers also routinely bring USB sticks, DVDs, and CDs into the tactical operation center computer rooms. The sign on the door prohibiting it doesn't deter them.
"It is a boring job. You are just sitting there for 18 hours waiting for chaos to happen," says Mark. "So multiple TVs are on showing drone feeds, but you have one that is playing a Game of Thrones DVD or a movie that was burned from BitTorrent."
He has gone to his superiors with recommendations for numerous best practices to improve security, ranging from setting up routing security to an ID card system with tiered levels of access and systems to prevent DDoS attacks, but no one was interested.
The Army's Deputy of Cybersecurity Roy Lundgren's full response to BuzzFeed:
The DoD requires that all computer workstations be configured to utilize CAC authentication. There are instances where the logoff process does not immediately complete upon removal of the CAC. This occurs when the system is running logoff scripts and shutting down applications.
The period of time that a system can be accessed following CAC removal before system logoff completes is normally not sufficient to gain unauthorized access.
If there is a vulnerability as described by buzzfeed.com, the Army would need to examine the actual system to assess the vulnerability. If a flaw exists, it could be mitigated with a configuration change or a change in training that highlights the need to stay at a work station until the logoff process is completed.
The government and industry must manage numerous risks each day. We look at each situation and decide if it is a low risk or high risk situation. Then the decision must be made how the risk will be managed. Often software and/or hardware solutions are not available, supportable, or necessary. In the case of many risks, they are managed via other mitigations such as modifying policy, procedures, or training.
If an issue is reported to our cybersecurity directorate, we would normally contact the system owner and ask them for an assessment. Often the risk is known and mitigating factors are already being applied and/or the organization has developed a plan of action to correct the issue.
Soldiers can report a security flaw within their chain of command, but should also alert the supporting professional IT/cybersecurity staff.
The use of thumb drives, and playing games and movies on an operational network, is, in general, prohibited. It is the job of commanders and leaders at all levels to take appropriate action to preclude unauthorized actions.
The entire Army will be observing an Information Assurance/Cybersecurity Awareness week October 14-18 in conjunction with the national-level October Cybersecurity Awareness Month. This new Army initiative launched in February 2013 augments current policy, training, and inspection processes and aims to raise awareness and change culture.
Commanders and other leaders are re-emphasizing the importance of protecting our information and systems, and key processes to ensure this. The Army is also emphasizing that cybersecurity is the business of all leaders and that we cannot ignore information assurance/cybersecurity requirements due to a lack of knowledge and/or convenience.
Contact Justine Sharrock at firstname.lastname@example.org.