Chapter 72: Cyberterrorism – Stuxnet

Vulnerability is a cyber-security term that refers to a flaw in a system that can leave it open to attack.  The term may also refer to any weakness in a computer system itself, in a set of procedures, or in anything that leaves information security exposed to a threat.

Nuclear power plants may be vulnerable to cyber-attacks, which might – in extreme cases – lead to substantial releases of radioactive material with consequent loss of lives, radiation sickness and psycho-trauma, extensive property destruction and economic upheaval.


1.      COMPUTER APPLICATIONS FOR NUCLEAR POWER PLANTS:

Cyber-attacks are made on computer systems operated for a wide spectrum of purposes.  Until now, no cyber-attack on a nuclear power plant has resulted in a release of radioactive material, but the trends are disquieting.  The objective of a cyber-attack may not be to cause death and destruction; it may be to disrupt the operation of a nuclear facility, to inflict economic damage, to embarrass government or utility officials, to blackmail companies, to settle a score, or simply to test one's skills and see what happens.  There is even a risk of cyber-attacks aimed at other targets migrating into nuclear facilities and causing unpredictable damage.  The unintentionally wide spread of Stuxnet has demonstrated this possibility.  Given the potential for great harm, any successful cyber-attack on a nuclear facility would, at the least, undermine confidence in the ability of the State to be a responsible host and of the owner and operator to run the facility in a safe and secure manner.

Computers are used extensively in nuclear power plants around the world.  The various applications include:

  • Take data from sensors in the field and display trends and ongoing system data, e.g. on a simplified system flow path diagram;
  • Provide alarms when sensors indicate abnormal conditions, e.g. low water or oil pressures in supporting systems;
  • Take data from field sensors and calculate true nuclear power on a continual basis (referred to as a calorimetric). This takes into account factors such as heat lost by discharging water from the steam generator blow-down system, and variation in steam pressure and its effect on steam flow rates;
  • Accumulate data and printout automatically if the reactor shuts down (trips) in order to aid the operators in quickly determining the cause of the shutdown (called a post-trip report);
  • Track all work status (testing and maintenance) at the plant;
  • Track the status of all equipment isolated for maintenance or other reasons;
  • Provide equipment technical information for ~150,000 components;
  • Provide control of non-safety systems, and some limited safety-related applications at the plant. Such control is similar to that used extensively in fossil power plants;
  • Provide calculations of input from various reactor protection parameters as input to control systems, e.g. control rods;
  • Allow prediction of off-site effects of radiological releases during emergency conditions, if these occur. Input includes local meteorological tower data;
  • Simulate the performance of plant systems on full-scope models of the control room for training and re-qualifying operations, engineering, and management personnel; and
  • Perform calculations used in nuclear safety analyses, e.g. probabilistic safety assessment and transient and accident analysis.
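The calorimetric calculation mentioned in the list above is essentially an energy balance on the secondary side of the plant.  The following Python sketch is purely illustrative; the function name, input values, and the blow-down correction term are assumptions for the example, not taken from any actual plant software.

```python
def calorimetric_power(feed_flow_kg_s, steam_enthalpy_kj_kg,
                       feed_enthalpy_kj_kg, blowdown_loss_mw=0.0):
    """Estimate true thermal power (MWt) from secondary-side sensor data.

    Simplified energy balance: the energy carried away by steam minus the
    enthalpy brought in by feedwater, corrected for blow-down losses.
    All names and the correction term are illustrative assumptions.
    """
    gross_mw = feed_flow_kg_s * (steam_enthalpy_kj_kg - feed_enthalpy_kj_kg) / 1000.0
    return gross_mw + blowdown_loss_mw

# Example: 1500 kg/s feedwater with notional enthalpy values
print(round(calorimetric_power(1500, 2770, 990, blowdown_loss_mw=15.0), 1))  # → 2685.0
```

A real calorimetric would draw each input from redundant field sensors and apply plant-specific correction factors; the point here is only the structure of the calculation.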

Computers are not usually used in reactor protection applications to shut down the reactor, except as an alternative to other, usually analog, systems.  Because so many lines of program code are involved, the primary concern of regulators in each State is to ensure that such applications can never fail.  A limited number of applications of this type have been authorized in US plants.

2.     STUXNET:

Cyber-attacks may be intended to have local and limited effects, but radioactive material ejected from a failed reactor pays no heed to national boundaries.

Stuxnet provides a recent example of such an attack.

Stuxnet is a computer worm discovered in June 2010 that is believed to have been created by American and Israeli agencies to attack Iran's nuclear facilities.  Stuxnet initially spreads via Microsoft Windows and targets Siemens industrial control systems.  While it is not the first time that hackers have targeted industrial systems, it is the first discovered malware that spies on and subverts industrial systems, and the first to include a Programmable Logic Controller (PLC) rootkit.

Different variants of Stuxnet targeted five Iranian organizations, with the probable target widely suspected to be uranium enrichment infrastructure in Iran; Symantec noted in August 2010 that 60 percent of the infected computers worldwide were in Iran.  Siemens stated that the worm had not caused any damage to its customers, but the Iranian nuclear program, which uses embargoed Siemens equipment procured secretly, has been damaged by Stuxnet.  Kaspersky Lab concluded that the sophisticated attack could only have been conducted “with nation-state support”.  This was further supported by F-Secure's chief researcher Mikko Hyppönen, who commented in a Stuxnet Frequently Asked Questions (FAQ) document, “That's what it would look like, yes”.  It has been speculated that Israel and the United States may have been involved.

Stuxnet is a threat that was primarily written to target an industrial control system or set of similar systems. Industrial control systems are used in gas pipelines and power plants. Its final goal is to reprogram Industrial Control Systems (ICS) by modifying code on Programmable Logic Controllers (PLCs) to make them work in a manner the attacker intended and to hide those changes from the operator of the equipment. In order to achieve this goal the creators amassed a vast array of components to increase their chances of success. This includes zero-day exploits, a Windows rootkit, the first ever PLC rootkit, antivirus evasion techniques, complex process injection and hooking code, network infection routines, peer-to-peer updates, and a command and control interface.

Stuxnet came to broad attention in July 2010, but is confirmed to have existed at least one year prior and likely even earlier. The majority of infections were found in Iran. Stuxnet contains many features, including the following:

  • Self-replicates through removable drives exploiting a vulnerability allowing auto-execution;
  • Spreads in a Local Area Network (LAN) through a vulnerability in the Windows Print Spooler;
  • Spreads through SMB by exploiting the Microsoft Windows Server Service RPC Handling Remote Code Execu­tion Vulnerability;
  • Copies and executes itself on remote computers through network shares;
  • Copies and executes itself on remote computers running a “WinCC” database server;
  • Copies itself into Step 7 projects in such a way that it automatically executes when the Step 7 project is loaded;
  • Updates itself through a peer-to-peer mechanism within a LAN;
  • Exploits a total of four unpatched Microsoft vulnerabilities, two of which are the previously mentioned vulnerabilities used for self-replication, while the other two are escalation-of-privilege vulnerabilities that had yet to be disclosed;
  • Contacts a command and control server that allows the attacker to download and execute code, including updated versions;
  • Contains a Windows rootkit that hides its binaries;
  • Attempts to bypass security products;
  • Fingerprints a specific industrial control system and modifies code on the Siemens PLCs to potentially sabotage the system; and
  • Hides modified code on PLCs, essentially a rootkit for PLCs.

Stuxnet has a complex architecture that is worth outlining.   The heart of Stuxnet consists of a large .dll file that contains many different exports and resources. In addition to the large .dll file, Stuxnet also contains two encrypted configuration blocks.

The dropper component of Stuxnet is a wrapper program that contains all of the above components stored inside itself in a section named “stub”. This stub section is integral to the working of Stuxnet. When the threat is executed, the wrapper extracts the .dll file from the stub section, maps it into memory as a module, and calls one of its exports.

A pointer to the original stub section is passed to this export as a parameter.  This export in turn will extract the .dll file from the stub section, which was passed as a parameter, map it into memory and call another different export from inside the mapped .dll file.  The pointer to the original stub section is again passed as a parameter. This occurs continuously throughout the execution of the threat, so the original stub section is continuously passed around between different processes and functions as a parameter to the main payload. In this way every layer of the threat always has access to the main .dll and the configuration blocks.

In addition to loading the .dll file into memory and calling an export directly, Stuxnet also uses another technique to call exports from the main .dll file. This technique is to read an executable template from its own resources, populate the template with appropriate data, such as which .dll file to load and which export to call, and then to inject this newly populated executable into another process and execute it. The newly populated executable template will load the original .dll file and call whatever export the template was populated with.

Although the threat uses these two different techniques to call exports in the main .dll file, it should be clear that all the functionality of the threat can be ascertained by analyzing all of the exports from the main .dll file.
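The loading pattern described above can be modeled abstractly: every layer receives a reference to the original stub section, so the main payload and its configuration blocks remain reachable from any stage.  The following Python sketch only illustrates that control flow; no real loader mechanics are reproduced, and all names and values are hypothetical.

```python
# Toy model of the layered export-calling pattern: the "stub" is passed
# along through every call, so the deepest layer can still read the
# original configuration.  Pure illustration, not real loader code.

def extract_payload(stub):
    # In the real threat this maps a .dll from the stub into memory;
    # here we simply look it up in a dict.
    return stub["payload"]

def export_a(stub):
    payload = extract_payload(stub)
    # Call the next layer, passing the original stub along as a parameter.
    return payload["export_b"](stub)

def make_stub():
    def export_b(stub):
        # Deepest layer: still has access to the original config blocks.
        return stub["config"]["target"]
    return {"payload": {"export_b": export_b},
            "config": {"target": "PLC-model-X"}}  # hypothetical value

stub = make_stub()
print(export_a(stub))  # → PLC-model-X
```

The design choice being illustrated is that, because the stub travels with every call, no layer needs a global pointer to the payload; each function is self-sufficient given only its parameter.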

3.     POSSIBLE ATTACK SCENARIO:

The following is a possible attack scenario.  It is only speculation driven by the technical features of Stuxnet.

ICSs are operated by specialized assembly-like code on PLCs. The PLCs are often programmed from Windows computers that are not connected to the Internet or even to the internal network. In addition, the industrial control systems themselves are also unlikely to be connected to the Internet.

First, the attackers needed to conduct reconnaissance.  As each PLC is configured in a unique manner, the attackers would first need the ICS's schematics. These design documents may have been stolen by an insider or even retrieved by an early version of Stuxnet or another malicious binary.  Once the attackers had the design documents and potential knowledge of the computing environment in the facility, they would develop the latest version of Stuxnet. Each feature of Stuxnet was implemented for a specific reason and for the final goal of potentially sabotaging the ICS.

Attackers would need to set up a mirrored environment that included the necessary ICS hardware, such as PLCs, modules, and peripherals, in order to test their code.  The full development cycle may have taken six months and five to ten core developers, not counting numerous other individuals such as quality assurance and management.

In addition, their malicious binaries contained driver files that needed to be digitally signed to avoid suspicion. The attackers compromised two digital certificates to achieve this task. They may have obtained the certificates from someone who physically entered the premises of the two companies, which are in close physical proximity, and stole them.

To infect their target, Stuxnet would need to be introduced into the target environment. This may have occurred by infecting a willing or unknowing third party, such as a contractor who perhaps had access to the facility, or an insider. The original infection may have been introduced by removable drive.

Once Stuxnet had infected a computer within the organization, it began to spread in search of Field PGs, which are typical Windows computers used to program PLCs.  Since most of these computers are non-networked, Stuxnet would first try to spread to other computers on the LAN through a zero-day vulnerability, a two-year-old vulnerability, infected Step 7 projects, and removable drives. Propagation through a LAN likely served as the first step, and propagation through removable drives as a means to cover the last and final hop to a Field PG that is never connected to an untrusted network.

While attackers could control Stuxnet with a command and control server, the key computer, as mentioned previously, was unlikely to have outbound Internet access. Thus, all the functionality required to sabotage a system was embedded directly in the Stuxnet executable. Updates to this executable would be propagated throughout the facility through a peer-to-peer method established by Stuxnet.
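The peer-to-peer update idea can be illustrated with a minimal sketch: each infected host carries a version number, and when two peers communicate, the older one adopts the newer payload, so updates spread without any server.  The class and field names below are hypothetical, and the exchange is reduced to an in-process method call rather than any real network protocol.

```python
# Minimal sketch of version-based peer-to-peer updating (illustrative only).

class Peer:
    def __init__(self, name, version, payload):
        self.name = name
        self.version = version
        self.payload = payload

    def exchange(self, other):
        """Compare versions and copy the newer payload onto the older peer."""
        if self.version < other.version:
            self.version, self.payload = other.version, other.payload
        elif other.version < self.version:
            other.version, other.payload = self.version, self.payload

a = Peer("field-pg", 1, b"old-executable")
b = Peer("eng-station", 2, b"new-executable")
a.exchange(b)
print(a.version, b.version)  # both peers now hold version 2
```

Repeated pairwise exchanges like this eventually bring every reachable peer up to the highest version present on the LAN, which is why only one machine ever needed fresh code from outside.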

When Stuxnet finally found a suitable computer, one that ran Step 7, it would then modify the code on the PLC. These modifications likely sabotaged the system, which was likely considered a high value target due to the large resources invested in the creation of Stuxnet.

Victims attempting to verify the issue would not see any rogue PLC code as Stuxnet hides its modifications.

While the attackers' choice of self-replication methods may have been necessary to ensure they would find a suitable Field PG, it also caused noticeable collateral damage by infecting machines outside the target organization. The attackers may have considered the collateral damage a necessity in order to effectively reach the intended target. Also, the attackers had likely completed their initial attack by the time they were discovered.

4.     INFECTION STATISTICS:

On July 20, 2010, Symantec set up a system to monitor traffic to the Stuxnet command and control (C&C) servers, which allowed it to observe rates of infection and to identify the locations of infected computers, ultimately working with CERT and other organizations to help inform infected parties.  The system only identified command and control traffic from computers that were able to connect to the C&C servers.  The data sent back to the C&C servers is encrypted and includes the internal and external IP address, computer name, OS version, and whether the machine is running the Siemens SIMATIC Step 7 industrial control software.

As of September 29, 2010, the data showed approximately 100,000 infected hosts, observed across over 40,000 unique external IP addresses from over 155 countries.  Looking at the percentage of infected hosts by country shows that approximately 60 percent of infected hosts were in Iran.  Stuxnet aims to identify those hosts which have the Siemens Step 7 software installed.  [The accompanying graphs, omitted here, showed unique infected hosts by country, infected organizations by country based on WAN IP addresses, and the percentage of infected hosts with the Siemens software installed by country.]

Looking at newly infected IP addresses per day, it was observed that on August 22 Iran was no longer reporting new infections.  This was most likely due to Iran blocking outward connections to the command and control servers, rather than a drop-off in infections.

Of the approximately 12,000 infections for which samples were recovered, the March 2010 variant accounts for 69 percent of all infections and may therefore have been seeded more successfully.  Note that the single organization targeted in March 2010 was also targeted in June 2009 and in April 2010, and neither of those other seeding attempts resulted in as many infections as in March.  While smaller infection rates for the June 2009 variant would be expected, since it had fewer replication methods, the April 2010 variant is almost identical to the March 2010 variant.  Thus, either different seeds within the same organization resulted in significantly different rates of spread (e.g., seeding a computer in a department with fewer computer-security restrictions), or the data is skewed due to the small percentage of samples recovered.

The concentration of infections in Iran likely indicates that this was the initial target and where infections were initially seeded.  While Stuxnet is a targeted threat, the use of a variety of propagation techniques has meant that Stuxnet has spread beyond the initial target.  These additional infections are likely “collateral damage”: unintentional side-effects of the promiscuous initial propagation methodology utilized by Stuxnet.  While infection rates will likely drop as users patch their computers against the vulnerabilities used for propagation, worms of this nature typically continue to be able to propagate via unsecured and unpatched computers.
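The kind of aggregation behind these statistics can be sketched in a few lines: deduplicate host reports, count unique infected hosts per country, and report percentage shares.  The records below are invented for illustration; they are not Symantec's telemetry, and the field names are assumptions.

```python
from collections import Counter

# Hypothetical C&C telemetry records; the same host may report repeatedly.
records = [
    {"host": "h1", "country": "IR"},
    {"host": "h2", "country": "IR"},
    {"host": "h3", "country": "ID"},
    {"host": "h1", "country": "IR"},  # duplicate report from h1
    {"host": "h4", "country": "IN"},
]

# Deduplicate reports, then count unique infected hosts per country.
unique = {(r["host"], r["country"]) for r in records}
by_country = Counter(country for _, country in unique)
total = sum(by_country.values())

# Percentage share of infections per country.
shares = {c: round(100 * n / total) for c, n in by_country.items()}
print(shares)  # → {'IR': 50, 'ID': 25, 'IN': 25}
```

The real analysis additionally had to distinguish hosts behind shared WAN IP addresses (hence the separate per-organization counts), but the dedup-then-count structure is the same.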

5.      SOME OBSERVATIONS:

According to Ralph Langner, even three years after being discovered, Stuxnet continues to baffle military strategists, computer security experts, political decision makers, and the general public.  The malware marks a clear turning point in the history of cyber security and in military history as well.  Its future impact will most likely be substantial.

The degree of the damage to the Iranian nuclear power plants could very well be related to Iran’s low-tech approach to uranium enrichment.  The backbone of Iran’s uranium enrichment effort is the IR-1 centrifuge which goes back to a European design of the late Sixties / early Seventies that was stolen by Pakistani nuclear trafficker A. Q. Khan.  It is an obsolete design that Iran never managed to operate reliably.  Reliability problems may well have started as early as 1987, when Iran began experimenting with a set of decommissioned P-1 centrifuges acquired from the Khan network.  Problems with getting the centrifuge rotors to spin flawlessly will also likely have resulted in the poor efficiency that can be observed when analyzing the International Atomic Energy Agency (IAEA) reports, suggesting that the IR-1 performs only half as well – best case – as it could theoretically. A likely reason for such poor performance is that Iran reduced the operating pressure of the centrifuges in order to lower rotor wall pressure.  But less pressure means less throughput – and thus less efficiency.

As unreliable and inefficient as the IR-1 is, it offered a significant benefit: Iran managed to produce the antiquated design at industrial scale.  It must have seemed attractive to compensate for poor reliability and efficiency with volume, accepting a constant breakup of centrifuges during operation because they could be manufactured faster than they crashed.  Supply was not a problem.  But how does one use thousands of fragile centrifuges in a sensitive industrial process that doesn't tolerate even minor equipment hiccups?  To achieve that, Iran uses a Cascade Protection System, which is quite unique in that it is designed to cope with ongoing centrifuge trouble by implementing a crude version of fault tolerance.  The protection system is a critical component of Iran's nuclear program; without it, Iran would not be capable of sustained uranium enrichment.

Everything has its roots, and the roots of Stuxnet are not in the Information Technology (IT) domain but in nuclear counter-proliferation.  The Iranian nuclear program had been sabotaged before by supplying Iran with manipulated mechanical and electrical equipment; Stuxnet transformed that approach from analog to digital.  Not drawing from the same brain pool that threw sand in Iran's nuclear gear in the past would have been a foolish waste of resources, as even the digital attacks required in-depth knowledge of the plant design and operation, knowledge that could not be obtained by simply analyzing network traffic and computer configurations in Iran.  It is not even difficult to identify potential suspects for such an operation: nuclear counter-proliferation is the responsibility of the US Department of Energy and, since 1994, also of the Central Intelligence Agency, even though neither organization lists sabotage among its official duties.

At the operational level, Stuxnet highlighted the royal road to infiltration of hard targets.  Rather than trying to infiltrate directly by crawling through fifteen firewalls, three data diodes, and an intrusion detection system, the attackers played it indirectly by infecting soft targets with legitimate access to Ground Zero: Contractors.

Whatever the cyber security posture of contractors may have been, it certainly was not on par with that of the Natanz Fuel Enrichment Plant (FEP) in Iran.  Getting the malware onto their mobile devices and USB sticks proved good enough, as sooner or later they would physically carry those on site and connect them to the FEP's most critical systems, unchallenged by any guards.

Any follow-up attacker will explore this infiltration method when thinking about hitting hard targets.  The sober reality is that, at a global scale, pretty much every industrial or military facility that uses industrial control systems at some scale depends on a network of contractors, many of which are very good at narrowly-defined engineering tasks but lousy at cyber security.  While the industrial control system security community had discussed the insider threat for many years, insiders who unwittingly help to deploy a cyber-weapon had been completely off the radar.  Obviously, they play a much more important role than the very small subset of insiders who may theoretically develop malicious intentions.

The question is – Are Nation-State Resources Required to Pull off Similar Attacks against the US or Their Allies?

It has often been stated that similar attacks against US (or other friendly) targets would require nation-state resources.  From a technical perspective, this is not true.  The development of Stuxnet did require nation-state resources – especially for intelligence gathering, infiltration, and most of all for testing.  The technical analysis clearly indicates that:

  • The cyber weapon was way too complex to warrant any hope for successful operation without thorough testing, and
  • Such testing must have involved a fully-functional mock-up IR-1 cascade operating with real uranium hexafluoride because both overpressure and rotor speed manipulations have completely different effects if executed on empty centrifuges.

Obviously, a fully-functional uranium enrichment test bed that replicates a top secret plant is beyond the reach of organized crime and terrorists.  But there are more copycat scenarios than the (quite silly) idea that adversaries could impact the operation of a uranium enrichment facility in the US and disguise such an attack as random equipment failure.

It is quite unreasonable to expect a sophisticated cyber-attack against a similar singular high-value US target, at least not in time of peace.  That does not guarantee safety.  Attack technology can and should be separated from attack scenarios with their specific objectives and constraints.  Assuming that adversaries will try to maximize the cost/benefit ratio, they will most likely focus on targets that are plentiful, accessible, and much easier to attack using lessons learned from Stuxnet, such as critical infrastructure installations.  Civilian critical infrastructure is a more promising target for adversaries not only because of better accessibility but also because of standardization.  Not even A. Q. Khan sold turnkey uranium enrichment plants deployed in hundreds of locations in different countries; for power plants, electrical substations, chemical plants and the like, it is a different story.  All modern plants operate with standard industrial control system architectures and products from just a handful of vendors per industry, using similar or even identical configurations.  This has implications that are much more important than the increasing network connectivity that is often identified as the biggest ICS security problem:

  • Intelligence gathering isn't particularly difficult.  A good control system engineer who thoroughly understands the architecture and functionality of control system X for power plant A will be able to use most of that knowledge in power plant B or C, as long as they use the same product and version, which one can easily tell just by looking at recruitment ads.  Control system engineers face comparatively low salaries and unpleasant shift work, making them a source of relevant skills that can be drained easily; this approach is much more promising than training hackers in industrial control systems and plant operations; and
  • Once attack tactics are identified and implemented, they can be used to hit not just one specific target but multiple targets.  A simultaneous low-key attack against multiple targets can have as much of an effect as a much more costly and sophisticated attack against a singular high-value target.  Attack sophistication and reliability can be traded for scalability.  An attacker gets more bang for the buck if exploit code is used not exclusively against one specific target (such as an electrical substation or water plant) but against multiple targets of the same breed, thereby achieving emergent destructive capability.  As an example, a cyber-attack against one power station (or electrical substation) is pretty much pointless, as it has little to zero impact on grid reliability.  A simultaneous attack against multiple stations can, however, result in a cascading grid failure.  Adversaries beyond the script-kiddie level will have figured that out already.

One of the toughest challenges is the fact that exploit code can be packaged into software tools.  The genius mastermind is needed only for identifying vulnerabilities and designing exploits.  Any software shop, whether government-driven, privately held, or in the criminal underground, would not implement such exploits as custom spaghetti code carefully adjusted to a single piece of malware, but would use an object-oriented, modular approach.  At some level of software maturity, such exploit components can be made available in user-friendly point-and-click software applications, just as is now the case for boilerplate malware development.  The skill set required of those who assemble and deploy a specific sample of cyber-physical attack code will then drop dramatically.

Other factors that made the development of Stuxnet particularly costly and should not be expected in copycat attacks were the self-imposed constraints of the attackers.  Stuxnet’s developers decided damage should be disguised as reliability problems.  It is estimated that well over 50 percent of Stuxnet’s development cost went into efforts to hide the attack.  Stuxnet-inspired attackers will not necessarily place the same emphasis on disguise; they may want the victim to know that they are under cyber-attack, and perhaps even publicly claim credit for it.

Such thinking would certainly not limit itself to the use of low-yield cyber weapons.  It appears a stretch to assume that adversaries would be as concerned about collateral damage as US cyber forces, or would go so far as to involve lawyers in their teams for advice on how not to violate international law.  In the industrial control system space, an open attack does not even preclude follow-up attacks, as attempts to protect the targets and similar potential targets may take well over a year, allowing the attackers to strike again, perhaps with fine-tuned exploits.

In order to estimate the resources required for substantial Stuxnet-inspired cyber-physical attacks, one should first get credible scenarios straight.  Credible scenarios involve simultaneous or staged cyber-attacks against targets in critical infrastructure and manufacturing.  Such targets can be hit by a Stuxnet-inspired copycat attack without requiring nation-state capabilities.  Why America's adversaries have not already tried to achieve this is as difficult to answer as the question, before 9/11, of whether terrorists would fly passenger airplanes into buildings.  The fact of the matter is that the capabilities of potential cyber attackers are on the rise, while at the same time the vulnerabilities of potential targets for cyber-physical attacks are increasing due to a rush to more connectivity and convenience.  This is not a promising development.

Resources:

  1. Techopedia – Vulnerability;
  2. US Department of State – Cyber Security for Nuclear Power Plants;
  3. Wikipedia – Stuxnet;
  4. Legal Experts – Stuxnet Attack on Iran was illegal;
  5. The Vault – Building a Cyber Secure Plant;
  6. Computerworld – Siemens – Stuxnet worm Hit Industrial Systems;
  7. BBC News Technology – Stuxnet Virus Targets and Spread Revealed;
  8. Symantec Security Response – W32. Stuxnet Dossier; and
  9. Langner – To Kill a Centrifuge – A Technical Analysis.
 
