On Feb. 18, the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) issued an alert about a ransomware attack on a natural gas compression facility that resulted in the business shutting down for two days.
The fact that critical infrastructure was targeted wasn’t newsworthy; we’ve had a steady stream of warnings about such targeting by Iran, Russia, and North Korea. Nor was the fact that the gas facility fell victim to a ransomware attack, given recent attacks on hospitals in Alabama, a U.S. maritime base, and cities across the country, including Baltimore and Atlanta.
No, the newsworthy piece of the story is that the unnamed company hadn’t taken steps to implement basic cybersecurity controls, as laid out in common cybersecurity frameworks such as ISO 27001 and the NIST Cybersecurity Framework.
First and foremost, “[t]he victim failed to implement robust segmentation between the IT and OT networks,” allowing the attacker to move between systems and “disable assets on both networks.” In layman’s terms, the company’s public-facing network (IT) was compromised, and from there the attackers moved to the company’s operational technology (OT) network, used “for managing critical factory equipment and other factory operations.”
At least one report suggested that the IT and OT systems should have been “air-gapped,” meaning physically disconnected from each other (ideally, the OT system would’ve also had no internet connectivity). But air-gapping isn’t an effective cybersecurity control. For example, in August 2019 Ukrainian authorities discovered employees at a nuclear power plant connecting cryptocurrency mining equipment to the network, and then connecting the network to the Internet — a network presumed to be air-gapped. There’ve been similar incidents in Russia, Australia, and Romania. In fact, almost 17 years ago, the Illinois State Police dispatch system was taken offline for several hours by the Blaster worm, despite being air-gapped from the Internet. In short, air-gapping is not effective because it’s too easily circumvented.
Rather, this unnamed gas facility should’ve “[d]efine[d] a demilitarized zone (DMZ) that eliminates unregulated communication between the IT and OT network,” and implemented network segmentation, “limit[ing] the ability of adversaries to pivot to the OT network even if the IT network is compromised.” In other words, much like the DMZ that separates North and South Korea, a network DMZ, built using routers, switches, software, and the like, effectively segregates the Internet-facing network from the operational network.
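To make the DMZ concept concrete, here is a minimal, purely illustrative sketch, written in Python rather than on actual network gear, of the default-deny posture CISA describes: direct IT-to-OT traffic is blocked on every port, and the only permitted path into OT runs through a broker in the DMZ. The zone names, ports, and protocols below are hypothetical examples, not details taken from the facility or the alert.

```python
# Conceptual sketch only (not the facility's actual configuration): a tiny rule
# model showing how a DMZ eliminates unregulated IT-to-OT communication.
# Zone names, ports, and protocols are hypothetical examples.

from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    src_zone: str   # where the traffic originates
    dst_zone: str   # where the traffic is headed
    port: int       # destination TCP port (0 used here as a wildcard)
    allow: bool     # permit or deny

# Default-deny posture: only the explicitly listed flows are permitted.
RULES = [
    Rule("IT",  "DMZ", 443,  True),   # IT users reach a broker/historian in the DMZ over HTTPS
    Rule("DMZ", "OT",  4840, True),   # only the DMZ broker talks to OT (e.g., OPC UA)
    Rule("IT",  "OT",  0,    False),  # no direct IT-to-OT traffic on any port
]

def is_allowed(src_zone: str, dst_zone: str, port: int) -> bool:
    """Return True only if an explicit allow rule matches; deny by default."""
    for rule in RULES:
        if rule.src_zone == src_zone and rule.dst_zone == dst_zone:
            if rule.port in (0, port):
                return rule.allow
    return False  # anything not explicitly allowed is dropped

# A compromised IT workstation trying to reach an OT controller directly is blocked...
assert not is_allowed("IT", "OT", 502)     # e.g., Modbus/TCP
# ...while the regulated path through the DMZ broker is still permitted.
assert is_allowed("DMZ", "OT", 4840)
```

In a real deployment these rules would live in firewalls and switch ACLs at the IT/OT boundary; the point of the sketch is simply that anything not explicitly permitted between zones is dropped, which is what keeps a compromised IT network from becoming a compromised OT network.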
Importantly, “[a]t no time did the threat actor obtain the ability to control or manipulate operations” at the gas facility. But this doesn’t mean the attacker couldn’t have done so, just that the attack was stopped before it got that far. In other words, they got lucky.
But luck is only a strategy in Vegas, not cybersecurity.
Admittedly, luck has played a role in other critical infrastructure attacks as well, such as when an Iranian-affiliated hacker obtained remote access to the systems of New York’s Bowman Avenue Dam but was unable to release water because, luckily, the sluice gate control was disconnected at the time.
Interestingly, in this latest ransomware attack, there was “[a] separate and geographically distinct central control office,” although it was not set up to control operations, resulting in an “operational shutdown of the entire pipeline asset lasting approximately two days.” Yet the whole point of geographically separated facilities is to ensure redundancy and enable seamless operational failover, something completely lacking here.
Especially telling: CISA noted that “[t]he victim’s existing emergency response plan focused on threats to physical safety and not cyber incidents,” and that its “emergency response exercises also failed to provide employees with decision-making experience in dealing with cyberattacks.”
After years of pushing to fortify our critical infrastructure, and despite near-universal acknowledgment of the importance of a comprehensive, up-to-date incident response plan, critical infrastructure businesses still haven’t heeded these warnings.
CISA noted that this gas company needs to “[i]mplement regular Data Backup . . . regularly tested and isolated from network connections.” After all, ransomware is designed to lock data and force the data owner to pay money to regain access. While some companies will inevitably choose to pay the ransom, there’s still no guarantee they’ll get a working decryption key and secure the return of their data. Thus, the best way to mitigate a ransomware attack is to have another copy of the data, segregated, stored off-site, and disconnected from the network so that, if need be, you can restore from that backup.
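As a similarly hedged sketch in Python, and again not the facility’s actual tooling, “regularly tested” backups might be exercised with a small verification job like the one below, which recomputes checksums on an offline backup copy and compares them against a manifest recorded at backup time. The paths and manifest format are hypothetical, and a full restore drill would go further by actually rebuilding systems from the copy.

```python
# Minimal sketch (hypothetical paths and manifest format): periodically verify that
# an offline/offsite backup copy still matches the checksums recorded at backup time,
# so you know a restore will work before you ever need it.

import hashlib
import json
from pathlib import Path

BACKUP_ROOT = Path("/mnt/offline_backup/latest")   # mounted only for the duration of the test
MANIFEST = BACKUP_ROOT / "manifest.json"            # e.g., {"relative/path": "sha256hex", ...}

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large backups don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup() -> bool:
    """Return True only if every file in the manifest exists and matches its checksum."""
    expected = json.loads(MANIFEST.read_text())
    ok = True
    for rel_path, recorded_hash in expected.items():
        target = BACKUP_ROOT / rel_path
        if not target.exists() or sha256_of(target) != recorded_hash:
            print(f"FAILED: {rel_path}")   # flag for the restore drill / incident response plan
            ok = False
    return ok

if __name__ == "__main__":
    print("backup verified" if verify_backup() else "backup verification FAILED")
```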
While the CISA alert also went into detail about other cybersecurity issues at this unnamed gas facility, the bottom line is that the facility failed to implement basic cybersecurity hygiene, demonstrating that the only way to truly ensure uniform implementation of a cybersecurity framework is to mandate it — and to audit and verify compliance, much like we’ve seen in the insurance and financial services sectors.
Perhaps most disconcerting is that the week before this CISA alert was issued marked the 50th anniversary of the Defense Department’s Task Force on Computer Security’s “Ware Report” (named for Task Force Chair Willis Ware). Although the Ware Report was declassified in 1975, many of today’s companies still have yet to implement the cybersecurity controls it identified, including robust user access controls, system debugging, testing and certification, encryption, and audits. Of special note to our unnamed gas facility, the Ware Report even discussed network segmentation (“segregated operational modes”), system failover planning (“continuity of service to a remote user in spite of communication circuit failure”), and more.
Presciently, Ware noted in his transmittal memo that “providing security controls in computer systems will transcend the Department of Defense . . . the computing industry will eventually have to supply computers and systems with appropriate safeguards.”
Fifty years later, critical infrastructure companies still haven’t addressed cybersecurity vulnerabilities identified back in 1970.
We’ve yet to see computing companies uniformly pushing out technology with “appropriate safeguards,” such as secure software, admin accounts disabled by default, mandatory multi-factor authentication, and the like. Sadly, 50 years from now the Ware Report may be just as relevant as it is today, a legacy Ware would surely frown upon.
Joel Schwarz is a Managing Partner at The Schwarz Group, LLC, where he works as a consultant and attorney, and an adjunct professor at Albany Law School, teaching courses on cybercrime, cybersecurity and privacy. He previously served as the Civil Liberties and Privacy Officer (CLPO) for the National Counterterrorism Center and was a cybercrime prosecutor for the Justice Dept. and N.Y. State Attorney General’s Office. He was also counsel on e-commerce and privacy for MetLife.