Did Washington Block Discussion of a Security Patch? Should It Have?
Reports of cyber-attacks and security hacks have been filling the Net lately. Sony’s PlayStation Network has suffered a very public series of crippling hacks that may have compromised the personal information of the network’s 100 million users – and cost the electronics giant over ¥14 billion ($170 million). South Korean officials announced they were stepping up Internet security barriers in the wake of what they say are accelerating attacks from the North. And at the other end of the globe, Ireland has been struggling to fend off computer attacks intended to infect otherwise clean local servers with malware.
But the story drawing the most attention involves a newly discovered set of flaws, first reported in Wired, in the “SCADA” systems of various Siemens control devices – many of which can be found in very high-level industrial, processing and generating facilities around the world. SCADA stands for ‘supervisory control and data acquisition’ – systems that allow operators to both monitor and control a wide variety of industrial processes. Not surprisingly for a company valued at $80 billion, Siemens products can be found everywhere: nuclear plants, natural gas pipelines, wastewater treatment, chemical production. With a big enough security hole, all these and many other facilities are potentially at risk from hackers seeking to take control of the plant.
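That monitor-and-control pattern can be sketched in a few lines. The toy Python below is purely illustrative – the class, register names, and the missing-authentication detail are assumptions for the sake of the example, not Siemens’ actual products or protocols:

```python
# Hypothetical sketch of the monitor/control loop at the heart of a
# SCADA setup. All names here are illustrative inventions -- real
# controllers speak their own (often proprietary) network protocols.

class ToyController:
    """Simulates a PLC exposing a small register map over a network link."""

    def __init__(self):
        self.registers = {"tank_level": 42.0, "pump_setpoint": 50.0}

    def read(self, name):
        # "Data acquisition": the supervisory side polls process values.
        return self.registers[name]

    def write(self, name, value):
        # "Supervisory control": the operator pushes a new setpoint.
        # Note there is no authentication check here -- exactly the kind
        # of gap a vulnerability researcher would flag.
        self.registers[name] = value

plc = ToyController()
level = plc.read("tank_level")        # monitor a process value
if level > 40.0:
    plc.write("pump_setpoint", 75.0)  # adjust the process in response
```

The unauthenticated `write` is the crux: anyone who can reach a controller that accepts commands this freely can change how the physical process runs, which is why holes in these systems alarm researchers and regulators alike.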
The holes were to be publicly announced last week by NSS Labs security analyst Dillon Beresford, who was slated to present his findings and fixes at this year’s “Take Down Con” security convention. But before he spoke, he was contacted by officials with Siemens and the U.S. Department of Homeland Security who requested Beresford not reveal technical data about the flaws. He agreed, telling Kim Zetter in Wired: “Based on my own understanding of the seriousness behind this, I decided to refrain from disclosing any information due to safety concerns for the consumers that are affected by the vulnerabilities.”
It was, in fact, a Siemens SCADA system that was targeted in last year’s Stuxnet attack, which hobbled Iranian nuclear facilities. The fear for DHS: a larger security hole could cause orders-of-magnitude greater damage and put untold numbers of lives at risk if someone managed to exploit a SCADA system in an operating nuclear reactor. Beresford gave DHS and Siemens the data he had collected, canceled his presentation, and expected the company would take it from there. And that’s when things got sticky.
Siemens responded with a public statement:

“While NSS Labs has demonstrated a high level of professional integrity by providing Siemens access to its data, these vulnerabilities were discovered while working under special laboratory conditions with unlimited access to protocols and controllers.”
Siemens spokesperson Wieland Simon went on to say that Beresford and NSS Labs didn’t investigate how Siemens SCADA systems are employed in real-world applications, and that the holes weren’t really such a big problem after all.
“Operating under laboratory conditions and without any IT security measures in place, security experts have revealed some irregularities in the products’ communication functions. The irregularities found under such conditions are of no significance.”
It was not what Dillon Beresford was hoping to hear.
“To say, I sense a bit of evasiveness by Siemens, relating to important questions by the public and the press at large, would be an understatement,” he wrote in the SCADASEC public board. “The issues I reported are vulnerabilities, not irregularities in the products’ communications functions. Again, this is another egregious example of a vendor trying to minimize the impact of multiple security vulnerabilities in their products and being somewhat evasive about the truth.”
Beresford’s boss at NSS Labs, Rick Moy, underscored this theme when he told Greg Keizer at Computerworld:
“Siemens chose to use language that’s vague and misleading. They tried to downplay the impact to their customers. That’s what was concerning to us.”
Cyber-security writer Bruce Schneier, writing in his blog “Schneier on Security,” took it a step further:
“Before full disclosure was the norm, researchers would discover vulnerabilities in software and send details to the software companies — who would ignore them, trusting in the security of secrecy. Some would go so far as to threaten the researchers with legal action if they disclosed the vulnerabilities.
“Later on, researchers announced that particular vulnerabilities existed, but did not publish details. Software companies would then call the vulnerabilities “theoretical” and deny that they actually existed. Of course, they would still ignore the problems, and occasionally threaten the researcher with legal action. Then, of course, some hacker would create an exploit using the vulnerability — and the company would release a really quick patch, apologize profusely, and then go on to explain that the whole thing was entirely the fault of the evil, vile hackers.”
All of this is amplifying a long-simmering debate in security circles: should researchers who discover serious problems go public with their findings, forcing a response from the software firm, or should they allow corporate and governmental authorities to respond in a more private manner – possibly letting a major hole go unplugged? In the end, who’s responsible for plugging the cyber-holes in the dam? Who is the Internet’s Hans Brinker?
Soon after – or perhaps even before – the Siemens SCADA holes are plugged, it’s almost guaranteed there will be some new story about some new security hole, and about the efforts of a few security researchers to offer their fingers to plug the dike. And the debate will continue.