Cybersecurity and Cyberwar by P. W. Singer & Allan Friedman

 

 

Memorable quotes

This gap has wide implications. One US general described to us how “understanding cyber is now a command responsibility,” as it affects almost every part of modern war. And yet, as another general put it pointedly, “There is a real dearth of doctrine and policy in the world of cyberspace.” His concern, as we explore later, was not just that the military side needed to do a better job at “cyber calculus,” but that the civilian side was not providing any coordination or guidance. Some liken today to the time before World War I, when the militaries of Europe planned to utilize new technologies like railroads. The problem was that they, and the civilian leaders and publics behind them, didn’t understand the technologies or their implications, and so made uninformed decisions that inadvertently drove their nations into war. Others draw parallels to the early years of the Cold War. Nuclear weapons and the political dynamics they drove weren’t well understood and, even worse, were largely left to specialists. The result was that notions we now laugh off as Dr. Strangelovian were actually taken seriously, nearly leaving the planet a radioactive hulk.

Moreover, the Internet is no longer just about sending mail or compiling information: it now also handles everything from linking electrical plants to tracking purchases of Barbie dolls. 

The general lack of understanding on this topic is becoming a democracy problem as well. As we write, there are some fifty cybersecurity bills under consideration in the US Congress, yet the issue is perceived as too complex to matter to voters and, as a result, to the elected representatives who will decide the issues on their behalf. This is one of the reasons that, despite all these bills, no substantive cybersecurity legislation was passed between 2002 and the writing of this book over a decade later. 

Thus, while cyberspace was once just a realm of communication and then e-commerce (reaching over $10 trillion a year in sales), it has expanded to include what we call “critical infrastructure.” These are the underlying sectors that run our modern-day civilization, ranging from agriculture and food distribution to banking, healthcare, transportation, water, and power. Each of these sectors once stood apart, but all are now bound together and linked into cyberspace via information technology, often through what are known as “supervisory control and data acquisition,” or SCADA, systems. These are the computer systems that monitor, adjust switching, and control other processes of critical infrastructure. Notably, the private sector controls roughly 90 percent of US critical infrastructure, and the firms behind it use cyberspace to, among other things, balance the levels of chlorination in your city’s water, control the flow of gas that heats your home, and execute the financial transactions that keep currency prices stable. 

But just as in life, not everyone plays nice. The Internet that we’ve all grown to love and now need is increasingly becoming a place of risk and danger. 

In sum, understanding the Internet’s basic decentralized architecture provides two insights for cybersecurity. It offers an appreciation of how the Internet functions without top-down coordination. But it also shows the importance of the Internet’s users and gatekeepers behaving properly, and how certain built-in choke points can create great vulnerabilities if they don’t. 

Security isn’t just the notion of being free from danger, as it is commonly conceived, but is associated with the presence of an adversary. In that way, it’s a lot like war or sex; you need at least two sides to make it real. Things may break and mistakes may be made, but a cyber problem only becomes a cybersecurity issue if an adversary seeks to gain something from the activity, whether to obtain private information, undermine the system, or prevent its legitimate use. 

The canonical goals of security in an information environment result from this notion of a threat. Traditionally, there are three goals: Confidentiality, Integrity, and Availability, sometimes called the “CIA triad.” Integrity’s subtlety is what makes it a frequent target for the most sophisticated attackers. They will often first subvert the mechanisms that try to detect attacks, in the same way that complex diseases like HIV/AIDS go after the human body’s natural defenses. For instance, the Stuxnet attack (which we explore later in Part II) was so jarring because the compromised computers were telling their Iranian operators that they were functioning normally, even as the Stuxnet virus was sabotaging them. How can we know whether a system is functioning normally if we depend on that system to tell us about its current function? 
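The integrity goal is the easiest of the three to demonstrate concretely. A minimal sketch (not from the book; the key and messages are hypothetical) using Python’s standard-library hmac module shows how a shared secret lets a receiver detect whether a message was altered in transit:

```python
import hmac
import hashlib

SECRET_KEY = b"shared-secret"  # hypothetical pre-shared key

def tag(message: bytes) -> bytes:
    """Compute an HMAC tag so the receiver can verify integrity."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, received_tag: bytes) -> bool:
    """compare_digest does a constant-time comparison, resisting timing attacks."""
    return hmac.compare_digest(tag(message), received_tag)

original = b"open valve 3 at 14:00"
t = tag(original)
assert verify(original, t)                       # untampered message passes
assert not verify(b"open valve 9 at 14:00", t)   # any modification is detected
```

This is exactly the property Stuxnet’s victims lacked: without an independent check like the tag, the system being monitored is also the system doing the reporting.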

When discussing cyber incidents or fears of potential incidents, it is important to separate the idea of vulnerability from threat. An unlocked door is a vulnerability but not a threat if no one wants to enter. Conversely, one vulnerability can lead to many threats: that unlocked door could lead to terrorists sneaking in a bomb. 

The good news is that there are only three things you can do to a computer: steal its data, misuse credentials, and hijack resources. Unfortunately, our dependence on information systems means that a skilled actor could wreak a lot of damage by doing any one of those. Stolen data can reveal the strategic plans of a country or undermine the competitiveness of an entire industry. Stolen credentials can give the ability to change or destroy code and data, changing payrolls or opening up dams, as well as the ability to cover tracks. Hijacking resources can prevent a company from reaching customers or deny an army the ability to communicate.

As the incident shows, our dependence on digital systems means that increasingly we face the question of how we can trust them. For cybersecurity, the users must trust the systems, and the systems must know how to trust the users. Not every machine is going to have an unwanted Pac-Man on the screen to tell us something is wrong. How do we know that the computer is behaving as we expect it to, or that an e-mail from our colleague is actually from that colleague? And, just as importantly, how do computers know that we are who we claim to be and are behaving the way we’re supposed to?

 

But what if we have never met each other? How will we exchange these secret keys securely? “Asymmetric cryptography” solves this problem. The idea is to separate a secret key into a public key, which is shared with everyone, and a private key that remains secret. The two keys are generated such that something that is encrypted with a public key is decrypted with the corresponding private key, and vice versa. Figure 1.2 illustrates how public key cryptography works to protect both the confidentiality and the integrity of a message. Suppose Alice and Bob—the classic alphabetical protagonists of cryptographic examples—want to communicate. They each have a pair of keys, and can access each other’s public keys. If Alice wants to send Bob a message, she encrypts the message with Bob’s public key. Then the only person who can decrypt it must have access to Bob’s private key.
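The key relationship described above can be sketched with textbook RSA. This is a deliberately insecure toy (tiny primes, no padding—real systems use 2048-bit-plus keys and schemes like OAEP), meant only to show that what one key encrypts, the other decrypts:

```python
# Toy RSA with tiny primes -- for illustration only, never for real use.
p, q = 61, 53
n = p * q                 # modulus, part of both keys
phi = (p - 1) * (q - 1)
e = 17                    # public exponent: Bob's public key is (e, n)
d = pow(e, -1, phi)       # private exponent: Bob's private key

message = 42              # a message encoded as a number smaller than n

# Alice encrypts with Bob's PUBLIC key...
ciphertext = pow(message, e, n)
# ...and only Bob's PRIVATE key recovers it.
assert pow(ciphertext, d, n) == message

# The relationship also runs in reverse: "encrypting" with the private
# key produces a signature that anyone can verify with the public key.
signature = pow(message, d, n)
assert pow(signature, e, n) == message
```

The reverse direction is what gives integrity: if the public key successfully verifies the signature, only the holder of the private key could have produced it.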

The last line of defense is akin to the strategy that nuns use to police Catholic school dances. The nuns often stuff balloons between teenagers dancing too closely, creating an “air gap” to ensure nothing sneaky happens. In cybersecurity terms, an air gap is a physical separation between the network and critical systems. Such practice is common with critical infrastructure, such as with power companies, and was even attempted by the Iranians to protect their nuclear research from cyberattack. 

The problem with air gaps, much like the abstinence that the nuns try to enforce, is that they often don’t work in practice. Giving up control of operational infrastructure involves sacrifices in efficiency and effectiveness. Power companies that don’t link up, for instance, may be less vulnerable, but they can’t run “smart” power grids that save both money and the environment. Similarly, maintaining an air gap is often unrealistic, as the Iranians discovered when their supposedly air-gapped systems still got infected by the Stuxnet virus. At some point, old data needs to come out, and new instructions need to go in. 

But the hackback business has two major problems. The first is that the question of who has the “right” to carry out cyberattacks is unclear, which means that “cyber Blackwater” firms are “skating on thin ice” legally, says Nate Fick, CEO of the cybersecurity firm Endgame. The second is that it’s not yet clear that hackback is even that effective over the long term. Alex Harvey, a security strategist for Fortinet, explains that “Breaking in and shutting them down isn’t hard, but a new one will just pop up. You’ll get a couple of minutes of peace and quiet.” The bottom line in cyber defense is that it is a hard task, with various options that are far from perfect. But the only other option is to close the zoo and let the malware animals run free.

We will go more into this in Part III, but the goal is to recognize the key part that human behavior plays in enabling threats, and then build constant awareness, reinforcing it with new training. If users fail to learn the lessons of proper caution, then their access privileges should be revoked. Indeed, some companies like Lockheed Martin even have “red team” programs that every so often try to trick their own employees. If the employee opens a link in a suspicious e-mail, for example, it links the offender to a refresher course on cybersecurity. Better we learn our lesson this way than download real malware, Trojan Horses, or any other Greek-borne cybergifts.

In cyberspace, an attack can literally move at the speed of light, unlimited by geography and political boundaries. Being delinked from physics also means it can be in multiple places at the same time, meaning the same attack can hit multiple targets at once.

By 2013, an average firm of 1,000 employees or more was spending roughly $9 million a year on cybersecurity, whether it was a bank or a paint maker. One can think of these costs as a collective tax we all pay, resulting from the infrastructure that supports criminal enterprises. 

The ultimate risk is that the ever-growing scale of cybercrime will undermine the broader system. If banks decide that the fraud rate from stolen banking credentials is greater than the cost savings and customer service benefits of online banking, they may just turn it off. 

“I am convinced that every company in every conceivable industry with significant size and valuable intellectual property and trade secrets has been compromised (or will be shortly), with the great majority of the victims rarely discovering the intrusion or its impact”

It worked to jump-start the Chinese economic boom, but it is not the most attractive approach in the long term; the Chinese factory that made early model iPhones, for example, earned only about $15 per phone for assembling a $630 iPhone.

As it tries to become the world’s largest economy, experts argue that the Chinese government is increasingly turning to cyber espionage to maintain its expansion. “They’ve identified innovation as crucial to future economic growth—but they’re not sure they can do it,” says Jim Lewis, an expert at the Center for Strategic and International Studies. “The easiest way to innovate is to plagiarize.”

While many hold to the theory of free markets, this new practice privileges not those who innovate new business ideas but those who steal them. This, then, further exacerbates tensions that normally arise when democracies and authoritarian systems interact. Cyber theft has been described in the New York Times as “the No. 1 problem” that the United States has with China’s rise. In turn, those in China describe these accusations as evidence that the United States is still “locked in a Cold War mentality.”

Dmitri Alperovitch, for example, is careful not to call what goes on mere theft, but a “historically unprecedented transfer of wealth.” As business plans, trade secrets, product designs, and so on move from one country to another, one side is strengthened and the other weakened. The target loses future potential economic growth derived from that secret in addition to forfeited development investment. Many worry that this “transfer” can ultimately have a hollowing-out effect on an entire economy. Each loss from cyber espionage is too small to be fatal on its own, but their accumulation might prove crippling.

“…the worries over vulnerabilities in critical infrastructure to cyberattack have real validity. From 2011 to 2013, probes and intrusions into the computer networks of critical infrastructure in the United States went up by 1700 percent. And the worries of cyberterrorists harming this infrastructure are certainly a real concern. For instance, in 2011 a water provider in California hired a team of computer hackers to probe the vulnerabilities of its computer networks, and the simulated attackers got into the system in less than a week. Policymakers must be aware that real versions of such terror attacks could expand beyond single targets and have a wider ripple effect, knocking out the national power grid or shutting down a city or even region’s water supply” 

This creates a whole new type of combat, where the goal may not be merely to destroy the enemy’s tanks but to hack into his computer networks and make his tanks drive around in circles or even attack each other.

Regardless, CYBERCOM is growing rapidly in both size and perceived importance inside the US military. Indeed, the Pentagon’s 2013 budget plan mentioned “cyber” 53 times. Just a year later, the 2014 budget plan discussed “cyber” 147 times, with spending on CYBERCOM’s headquarters alone set to effectively double (all the more notable as the rest of the US military budget was being cut). 

Herbert Lin is Chief Scientist for computer science at the National Academies and one of the leading thinkers in the field of cybersecurity. As he has explained, to do a proper threat assessment, one essentially evaluates three basic factors: “The feasibility of adversaries being able to identify and exploit your vulnerabilities, the effect that would happen if they were able to take advantage of these vulnerabilities, and, finally, the likelihood that they will, in fact, be willing to do so.”

Son of Stuxnet is a misnomer. What’s really worrying are the concepts that Stuxnet gives hackers. The big problem we have right now is that Stuxnet has enabled hundreds of wannabe attackers to do essentially the same thing. Before, a Stuxnet-type attack could have been created by maybe five people. Now it’s more like 500 who could do this. The skill set that’s out there right now, and the level required to make this kind of thing, has dropped considerably simply because you can copy so much from Stuxnet. 

The most important takeaway, then, is that we must avoid letting our fears get the better of us, or even worse, let others stoke our fears and thus drive us into making bad decisions. How we respond to this world of growing cyberthreats will shape everything from our personal privacy and the future of the Internet to the likelihood of regional crises and even global wars. So we better try to get it right. 

Planning for multiple modes of failure is important to resilience: systems and organizations should not fail critically from a single attack but should have enough distributed control to continue operations. Another key aspect is that failure must be evident. If the system allows “silent failures,” its operators can’t adapt in a timely fashion. 

Framing cybersecurity as a public health problem may not only be more effective but may also have huge policy and political implications. Importantly, while the rethinking still allows for the problem of deliberate attacks (public health must defend against biological weapons attacks, for instance), it shifts the focus away from a meme of just cyberattack-counterattack and toward the needed goal of cooperation among individuals, companies, states, and nations.

The way this change came about provides an instructive parallel to explore for cybersecurity today. Much like the sea, cyberspace can be thought of as an ecosystem of actors with specific interests and capacities. Responsibility and accountability are not natural market outcomes, but incentives and frameworks can be created either to enable bad behavior or to support the greater public order. Clamping down on piracy and privateering at sea took a two-pronged approach that went beyond just shoring up defenses or threatening massive attack (which are too often talked about in cybersecurity as the only options, again making false comparisons to the worst thinking of the Cold War). The first strategy was to go after the underlying havens, markets, and structures that put the profits into the practice and greased the wheels of bad behavior. 

Today, there are modern cyber equivalents to these pirate havens and markets. And much like the pirate-friendly harbors of old, a substantial portion of those companies and states that give cybercrime a legal free pass are known. These range from known malware and other cyber black marketplaces to the fifty Internet service providers that account for around half of all infected machines worldwide. 

In turn, privateers, who had been viewed as useful tools, turned into the bureaucratic rivals of the formal navies being built up in these states (here again, akin to how patriotic hackers lose their shine when states build out more of their own formal cyber military units). As Janice Thompson recounts in her seminal study of why the pirate trade ended, Mercenaries, Pirates, and Sovereigns, maritime hijackers (and their state-approved counterparts) became marginalized as nations’ values changed and they saw the need to assert greater power and control. 

The lesson here is that the world is a better place with commerce and communication made safe, and freewheeling pirates and privateers brought under control.

Absent a uniform strategy, the dominant approach has been for each regulatory agency to look after its own industry. But the result, as the CEO of one cybersecurity firm told us, is that “The ‘most critical’ of the critical infrastructure are the biggest laggers in cybersecurity.” While much attention has been paid to securing areas like finance, where the incentives are more in alignment for regulation and investment, other areas of even more core importance and danger, like water control, the chemical industry, or the ports, have almost none. In 2013, for instance, a study we helped guide of six major American ports found only one had any proper level of cybersecurity, because the Coast Guard and Department of Transportation officials who are in charge of regulating and protecting the ports had literally no power or expertise in the area. 

With no new laws, in 2013, the Obama White House directed its executive agencies to “use their existing authorities to provide better cybersecurity for the Nation.” But what this meant in execution remains opaque. This returns us to the question of coordination and broad strategy.

Use of electronics is ubiquitous, so that, as one industry observer noted, “a $100 microchip might keep a $100 million helicopter on the ground.” Not only do we have scant protection against this attack, but it’s currently difficult for any vendor to know who was involved in the upstream production to certify its security.

Cybersecurity may seem a story of technology, but understanding and shaping human incentives matters the most in any effective defense. 

Responses must be considered at every level, from national security strategy to enterprise risk management, down to the technical level, where engineers must make fast decisions about network incursions. It is not just about protection; the wrong response could be worse than the attack itself. This is where the value of exercises and simulations comes in. They don’t just test defenses at the pointy end of the cyber spear but also help all better understand the effects of their plans and procedures. 

At the technical level, controlled environments offer a semiscientific environment to study both attacks and defenses. “Test beds” are extensible simulations of systems, networks, and operational environments that can be attacked over and over again. This repetition allows researchers to simulate failures, test the interoperability of equipment and standards, and understand how attacks and defenses interact. And, of course, you can carry out actions in a test bed that you would never want to in the real world. One test bed created by the National Institute of Standards and Technology allows researchers to repeatedly crash a simulated version of the electrical power grid to observe its failure modes and resiliency—this would obviously be problematic with the actual power grid. 

Controlled environments can be used to study the offensive side of cyber as well. A particular tactic used by security researchers is the “honeypot,” an isolated machine that is intentionally exposed to attacks. By observing how different types of malware attack these machines, we can identify new types of attacks and devise defenses. Entire test “honeynets” simulate complete networks or even regions of the whole Internet. During these tests, there is a cat-and-mouse chase. 
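The core honeypot idea can be sketched in a few lines: a listener on an otherwise-unused port, where any connection is by definition suspicious and gets recorded. This is a hypothetical minimal illustration (real honeypots emulate full services and carefully isolate the attacker); here the “attacker” is simulated by a local probe:

```python
import socket
import threading
import time

connection_log = []  # each attempted connection is recorded for later analysis

def honeypot(host="127.0.0.1", port=0):
    """Listen on an unused port (0 = let the OS pick) and log whoever connects.

    Legitimate traffic has no reason to touch this port, so every
    connection is treated as suspicious.
    """
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((host, port))
    server.listen()
    actual_port = server.getsockname()[1]

    def serve():
        conn, addr = server.accept()
        data = conn.recv(1024)              # capture the attacker's first bytes
        connection_log.append((addr[0], data))
        conn.close()
        server.close()

    threading.Thread(target=serve, daemon=True).start()
    return actual_port

# Simulate an "attacker" probing the honeypot.
port = honeypot()
probe = socket.create_connection(("127.0.0.1", port))
probe.sendall(b"GET /admin HTTP/1.0\r\n")
probe.close()

time.sleep(0.2)  # give the server thread a moment to log the probe
assert connection_log and connection_log[0][1].startswith(b"GET /admin")
```

The value is in the log: the captured bytes show what attackers try first, which is exactly the observational data that the cat-and-mouse research described above depends on.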

On the defensive side, vulnerability tests and practice exercises are quite valuable for the actors in cyberspace that range from militaries to private companies. This can be as simple as penetration testing, or having a “red team” of outside security experts look for vulnerabilities to exploit. These experts understand how to attack live networks in a controlled fashion, and lay the foundation for what might be a more damaging attack without putting the actual operation at risk. More sophisticated exercises can be completely simulated like a traditional war game.

Exercises also help key leaders grasp what matters before a real crisis. Senior management, which too often dismisses cybersecurity concerns as either too technical or too unlikely, can get a hands-on understanding of the importance of planning. This exposure can prevent future panic and open the manager up to committing more resources toward defense, resiliency, and response. 

As one Estonian defense official explained, leaders have many priorities and interests, and so a health minister “who will be yawning through cybersecurity talk” might pay attention if the attack in an exercise involves something relevant to his department, such as a pension database. 

This exercise was criticized by some as focusing more on appearances than on substance, especially when fake news coverage of the game was later broadcast on CNN under the title “We were warned.” Given the cost of these larger, more complex simulations, the designers must have a clear vision of the goals of the exercise and design the game appropriately. For example, finding vulnerabilities is a different task from discovering better modes for coordination, just as testing strategy is different from raising public awareness.

 

Exercises can also create useful opportunities to strengthen personal networks of cooperation between different agencies and even different governments. For instance, the European Network and Information Security Agency’s “Cyber Europe” war game is based on a fairly simple scenario but really has a goal of inducing key officials from different European countries to interact more on cyber issues. The whole idea is that you don’t want these people to talk for the first time in the midst of an actual cyber crisis. 

Former DHS official Stewart Baker highlighted this tension in scenario building:“If it’s so one-sided the attackers win all the time . . . then the exercise is not actually teaching people anything.” 

It isn’t by any means a silver bullet solution, but examples like the aforementioned Top 20 Critical Security Controls (put together by a joint team of US government and private security experts, so both the public and private sectors were represented) can establish a set of baseline best practices. Just like safe building or fire codes, the Top 20 Controls lay out minimal requirements that any government agency or entity operating in an important business area should follow. They range from conducting an inventory of all authorized and unauthorized devices to controlling the use of administrative privileges that act as keys to the kingdom for hackers. 

Given all these different mechanisms for information sharing, is there enough? Many believe that there is not. The high-tech trade association TechAmerica has argued that “the inability to share information is one of the greatest challenges to collective efforts toward improving our cybersecurity.” Security consultant Erik Bataller insists that “the public and private sectors need to share more information—more parties must be included and new platforms used,” ideally to the point where we can have “real-time identification and response as threats occur.” 

This vision, shared by many, is that instead of the current series of loose coalitions built around types of information or industry sectors, with varied levels of energy and activity, there would emerge a more universal kind of sensor network that could work across cyberspace to detect and respond to threats.

Many executives fear the cost of exposing a vulnerability to competition or consumers and believe the risks associated with sharing information are greater than the risks associated with a network security attack.

Greater attention from senior leadership is especially needed, as a 2012 study of corporate board members and senior managers illustrates. Despite the growing importance of the issue and the scale of risks, it found little evidence of widespread awareness or concern among these leaders about cybersecurity. Almost all boards studied had actively addressed traditional risk management issues (fires, natural disaster plans, etc.), but only a third indicated that information security had the attention of the board. The majority of boards did not receive regular reports on cybersecurity risks or breaches, potentially troubling in light of the recently enacted SEC guidelines.

 

One of his biggest concerns in cybersecurity, however, is not merely the advancing threats in cyberspace but how we are going to find the people to respond to them.

 

No one is exactly sure, but in a report entitled A Human Capital Crisis in Cybersecurity, the Center for Strategic and International Studies argued that the US government had only 3 to 10 percent of the cybersecurity professionals it actually needs.

 

A survey of IT hiring managers at government agencies found that only 40 percent were satisfied with the quality of applicants for cybersecurity jobs.

The movement of the more experienced talent to the private sector also means that many of the “cool jobs” inside government agencies (like the “incident response” teams, which are like SWAT teams for cyber emergencies) go to outside contractors, further driving internal talent to exit.

 

After suffering from cyberattacks in the mid-2000s, in 2010 Estonia pioneered a new model of defense known as “Küberkaitseliit,” the Cyber Defense League. In essence, it was a cyber militia, allowing private citizens to volunteer to aid public efforts. The group includes everything from IT specialists to lawyers, and its members have been used in roles that range from providing backup in cyber emergencies to helping as a “red team” in important national efforts, such as electronic voting.

 

Notably, though, the group is not like a national guard, in that there are no physical standards, nor are the participants putting themselves under military law, liable to be deployed to Iraq or whatnot, or asking for military pay and benefits. Instead, joining the league is voluntary, and then applicants are vetted for expertise and trustworthiness, which builds cachet. Members of the league both enjoy the work and think it beneficial beyond the cool factor, building knowledge and connections useful to their day jobs. Nor is the group like a patriotic hacker community, in that it is formalized and transparent to the world, as there is no desire to keep it secret. In fact, it’s just the opposite; the nation is trying to show that its defense extends beyond its official resources.

 

One proposal is an Information Technology Exchange Program, which would allow industry and government to swap cyber professionals for short stints, akin to a student exchange or fellowship program. Another worthy approach is to link broader efforts to reverse these trends to specific needs and opportunities in cybersecurity. One concept is to play NICE, short for the National Initiative for Cybersecurity Education. Designed to take a national-level approach to increasing the cyber talent pool, its ideas include a fellowship program that targets “Emerging Leaders in Cybersecurity,” to help steer them into cybersecurity degree programs, and the DHS Secretary’s Honors Program for Cybersecurity Professionals, which recruits college students into ten-week internships at one of its cybersecurity programs and then offers them a full-time job after graduation.

We are looking for talent in all the wrong places. And the organizations and companies that most need this type of talent will be the least likely to attract it.

 

So, much like the NSA, they also now recruit in nontraditional places to find new cyber talent. One of their best new hires was a young man who didn’t have a high school diploma. Before he joined the firm, he’d been working at a pharmaceutical plant by day, stuffing pills into bottles.

Companies can also sponsor contests, such as the CyberPatriot high school competition, using prizes and prestige to draw out top talent

As one executive at Northrop Grumman argues, one day the field will be viewed as a key stepping stone to success. “With the breadth of issues they have to address—not only technology, but economics and psychology—a career in cybersecurity can provide the broad base necessary to reach the top.”

 

A good rule is that if you can’t bear to lose it, then prepare to lose it.

 

Trends are guides, nothing more and nothing less. But these macro guides are important to identify. As the futurist John Naisbitt once said: “Trends, like horses, are easier to ride in the direction they are going.”

 

The cloud empowers individuals by providing practically limitless computational resources and by sharing powerful computing resources over the network with many users. A new startup, for example, no longer needs to worry about buying and running its own web servers, HR or sales records, or even data storage—these can be rented from a cloud provider, saving as much as 40 to 80 percent, depending on the situation.

 

Individual machines become less important, and instead the companies that control the data and access to it play an increasingly essential role. This can actually solve some security issues: the average individual consumer or even IT security worker is probably not as good as the security engineers at large firms like Amazon or Google, who specialize in the cloud and can bring scale to the problem.

 

For example, Netflix started as a company that rented movie and TV show DVDs that were shipped primarily via the postal system. With the rise of the Internet and online media, it shifted to a streaming digital model. But as Netflix sent out its online movies and TV shows, it gathered vast data on the preferences of the individual consumer and the wider collection of viewers they were part of. The collection and analysis of this new scale of data allowed Netflix to approach the market in a whole new way, even using it to produce and shape the very content and cast of its own hit series House of Cards.

 

The Internet, and its structures and norms, grew out of the worldviews of that early mix of mostly American computer scientists who first built it. Their approach was shaped by the mix of academic freedom and hippie mentality that characterized the era, with a strong emphasis on the power and importance of connection, sharing, and openness. However, that worldview may not be the new norm of the evolved Internet.