RSA and SCADA: two ends of the disclosure spectrum

In the last week we have seen what I feel are the two ends of the security disclosure spectrum. On one end, the RSA SecurID incident, announced with a vague high-level statement (called "corporate spin" by Schneier); on the other, 34 vulnerabilities and proof-of-concept code for SCADA systems, disclosed by a researcher without following responsible disclosure principles. Both are far from ideal and badly in need of improvement.

The RSA statement seemed like exactly the wrong way to disclose a security incident. It teased with morsels like "information is specifically related to RSA's SecurID two-factor authentication products", played buzzword bingo with "the attack is in the category of an Advanced Persistent Threat (APT)", and simply confused with "customers should increase security for social media applications". Their advice to worried customers read like a Jedi mind trick: use strong passwords, do not open unknown email attachments, we were never hacked. For the sake of Coviello's credibility I hope it was some marketing or PR blunder that left this statement so devoid of useful facts.

The problem with this type of minimal disclosure is that all the pundits could do was speculate. There was some good analysis of the possible impacts, but without real data on what was stolen, which vulnerabilities were exploited and whether the threat has now been fully neutralized, it was all just blind guessing. So many questions remain outstanding. Is SecurID now broken, or can we rely on the fact that you need a PIN, a serial number and the other factor; you know, those pesky passwords we don't value so much anymore because we have a SecurID token... oh wait. In addition, with the obvious focus on SecurID, since it was mentioned in the press release, hardly anyone was considering everything else that RSA holds. Emails, as HBGary found out, can occasionally contain some valuable information; the RSA portfolio also includes DLP, SIEM, GRC and Identity Management software, plus a Fraud center provided as SaaS, and no doubt many banks use RSA datacenters to store some interesting data. Oh, and apparently enVision is really good at mitigating APTs [PDF]; if only we had such a tool...

The extent and nature of the information stolen, and how the breach occurred, have very different implications for different companies and individuals. No doubt RSA is working with large companies, especially large banks that have rolled out hard tokens to millions of customers around the world, but what about everyone else? A small company or individual has no way of performing a risk assessment, and RSA has not provided anything of use beyond general security good practice. Schneier in 2007: "Secrecy prevents people from accurately assessing their own risk. Secrecy precludes public debate about security, and inhibits security education that leads to improvements. Secrecy doesn't improve security; it stifles it".

On the other hand, what Luigi Auriemma did with the SCADA vulnerabilities is also irresponsible. These systems are typically not connected to the Internet, and most of the world only learnt of SCADA through Stuxnet, yet SCADA systems control some of the most critical national infrastructure around the world. Disclosing security vulnerabilities without giving the manufacturers any chance to release patches, and organizations any chance to install them, is disappointing from a skilled and professional researcher.

In the impact analysis that followed, I really do not understand statements like these, both from a Computerworld article: "systems deployed are not directly connected to the Internet, Løppenthien said. Those that are connected are usually protected by a firewall, which the hacker would have to bypass first", and "In my opinion there is absolutely no risk because these systems are not made to be reached via the internet". I consider a system connected to the Internet if it can be reached via an Internet-facing firewall; how long have we been preaching about application-layer threats and the ubiquitous ports 80 and 443? I really hope Løppenthien meant that these systems sit behind multiple internal firewalls or, better, on air-gapped networks with no systems that connect to the outside world (a nigh-on impossible task these days). And "absolutely no risk"? Stuxnet highlighted that an attack does not need to come via the Internet; USB devices carried in by trusted employees with access work just fine.

Also consider the nature of the findings: remotely exploitable vulnerabilities that allow arbitrary file transfer, memory corruption and buffer overflows. These are potentially high impact. SCADA is not Windows, but I would not be surprised to see a Critical or Important rating if Microsoft were assigning one, and a CVSS score of 8 or higher for these types of vulnerabilities. There are some good arguments both for immediate full disclosure and for responsible disclosure on the Bugtraq thread; ultimately, though, this debate has already occurred and responsible disclosure won. That does not mean avoiding full disclosure forever, but providing a period for the vendor to release patches and for risk-aware customers to apply them. Schneier: "the threat of publishing the vulnerability is almost as good as actually publishing it".
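To sanity-check that "8 or higher" intuition, here is a sketch of the CVSS v2 base-score formula, with the metric constants taken from the CVSS v2 specification. The scoring inputs below are my own illustrative choices, not values from Auriemma's advisories.

```python
# CVSS v2 base-score calculation (formula and metric weights from the
# CVSS v2 specification). Inputs below are illustrative, not official.

AV = {"network": 1.0, "adjacent": 0.646, "local": 0.395}   # Access Vector
AC = {"low": 0.71, "medium": 0.61, "high": 0.35}           # Access Complexity
AU = {"none": 0.704, "single": 0.56, "multiple": 0.45}     # Authentication
CIA = {"none": 0.0, "partial": 0.275, "complete": 0.66}    # C/I/A impact


def cvss2_base(av, ac, au, c, i, a):
    impact = 10.41 * (1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a]))
    exploitability = 20 * AV[av] * AC[ac] * AU[au]
    f = 0.0 if impact == 0 else 1.176
    return round(((0.6 * impact) + (0.4 * exploitability) - 1.5) * f, 1)


# A remotely exploitable, no-authentication flaw giving full control
# of the host scores the maximum:
print(cvss2_base("network", "low", "none", "complete", "complete", "complete"))  # 10.0
# Even with only partial C/I/A impact it still clears 7:
print(cvss2_base("network", "low", "none", "partial", "partial", "partial"))     # 7.5
```

Remote, unauthenticated buffer overflows in control-system software plausibly land in the first category, which is what makes pre-patch public disclosure so hard to defend.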

Both these incidents highlighted to me that a middle ground is required. Vulnerability researchers have an obligation to follow responsible disclosure if they want a collaborative and profitable partnership with companies, and companies need to provide full and timely disclosure if they want our trust; a security company more than most, as it is human nature to respect those that walk the talk. Do not just release some marketing about how enVision mitigates APTs; prove it. I hope one day there is an international regulation or binding industry standard for the disclosure of security vulnerabilities and incidents. It has worked well for the financial services industry, which has excellent reporting and central databases for things such as credit card fraud. While services such as the VERIS framework and, ironically, the RSA eFraudNetwork have made a start, we really need regulation to fix this externality and to set clear standards for which security vulnerabilities and incidents must be reported, the level of detail required to protect the public, and the timeframes that give reasonable notice without the ostrich effect. Data loss reporting has made a start on this in many countries, but broader security disclosure clearly needs a G20-level agreement and mandate. Schneier: "Vulnerabilities [and incidents] are largely an externality". Governments exist mainly to correct externalities; it is past time they got on with it.

Photo credit: Krsoci, Flickr

Like this post? Get updates via RSS or follow me on Twitter @rakkhis
