At the end of 2019 the Office of the Under Secretary of Defense for Intelligence (OUSDI), in cooperation with WAR ROOM, announced an essay contest to generate new ideas and elevate thinking about insider threats and how we respond to and counter them. There was a fantastic response, and we were thrilled to see what everyone had to say on the topic. Ultimately, after two rounds of competitive judging, two essays rose to the top. Last week we presented the runner-up’s submission. Now we are pleased to present the winning submission.

 

Ὦ ξεῖν’, ἀγγέλλειν Λακεδαιμονίοις ὅτι τῇδε

κείμεθα, τοῖς κείνων ῥήμασι πειθόμενοι.

Tell the Spartans, passerby, that here,

obedient to their words, we lie.

For all the recent press, the most famous instance of insider threat has nothing to do with leaks, websites, intelligence documents, or hacking tools. You may not recognize the name Ephialtes, but you know him and his story all the same. Some 2,500 years ago, at a narrow mountain pass in Greece called the Hot Gates, he betrayed his country by showing the invading Persian army a hidden track to flank the small defending force of 300 Spartans and their Greek allies. Ephialtes’ name means “nightmare” in Greek, and for the next two and a half millennia that is what the insider threat could be to virtually every organization facing a crisis or adversary.

The more than 2,000-year history of the insider threat shows that it is a problem that does not stem from any particular conflict, technology, or tool. Rather, it is a fundamentally human phenomenon. Only humans can bestow trust, so only humans can violate that trust. Yet for all that history, recent research indicates that the insider threat is growing in the modern era. How can leaders square those two ideas – that the insider threat has always existed, and that it is more critical today than ever before? The answer is that both are true. While insider threat is fundamentally a human problem, it is also one exacerbated by the physical-digital convergence of the current Fourth Industrial Revolution, in which troves of information can be sent anywhere, anonymously, and almost instantly.

Therefore, the solutions to the modern insider threat should recognize the fundamental humanity of organizations, but also function in the digital environment in which so many of us live and work. Only when we recognize those two fundamental natures of the modern insider threat can we begin to make meaningful and lasting progress against it.

People are the problem

For much of history, defense has been viewed as a problem of fortification: build bigger, higher, stronger walls and everything will be all right. However, a quick study of the history of castles shows that more than a few sieges ended not with a fiery breach of the walls, but with an insider quietly opening a gate. So leaders interested in defending anything, from medieval castles to modern classified information, should start by understanding the insider threat as well.

The epigram at the start of this essay comes from the ancient Greek poet Simonides, and beyond being good verse, it offers some insight into the nature of the insider threat. For what is the opposite of an insider threat but those who follow the rules even when it goes against their own self-interest? So why do some follow the rules and others not? Simonides tells us. Those who followed the rules did so because they were “πειθόμενοι,” meaning “obedient” but also “convinced.” In other words, they followed the rules – even to the point of dying – because they believed in the shared values of the group. To begin to understand the insider threat and help prevent it, we need to understand what leads individuals to believe in the shared values of a group and what can lead them to abandon those values.

Our research suggests that individuals do not even need to willingly abandon an organization’s shared values. The insider threat can as often stem from carelessness or complacency as from outright malice. Employees trying to maximize one value of an organization can often cut corners on another – for example, sharing passwords so that a client can always get what they need no matter who is in the office. While that opens a wide array of indicators to watch for, it also shows that these acts are not impulsive or random. They have discernible patterns – what Dr. Eric Shaw describes as the critical path from idea to action – that can be detected and interrupted before an individual takes a harmful action.

People are the solution

There is a silver lining to the wide array of motives for insider behavior: it also introduces a wide range of solutions that can help. If the loss of shared values is the problem, reinforcing shared values can be a solution, and for that, the entire organization – its people, its tools, its physical spaces – can be brought to bear.

A leader’s first instinct may be to “double down” on controls – to set new rules and policies in place – but if the problem is people not following the rules to begin with, why should additional rules fare any better than the existing ones? Rather, what is needed is to interrupt the pathway from idea to action with signals that reinforce the shared values of the organization. These can range from external controls that make undesirable behaviors more difficult (such as disabling USB ports to limit file downloads) to internal signals that seek to influence an individual’s beliefs and assumptions (such as reminders when sending email attachments outside the organization).

But none of these concepts are new. So why does it seem like insider threat is increasing?

Digital complications

We are not all crazy, and it is not just a function of news coverage; insider threat is increasing. According to corporate surveys, there has been a 60% increase in insider threat events since 2016. But this raises the question: Why? If insider threat is all about human behavior, and humans have not changed, why is it increasing?

The answer is that the world has shifted and become more complicated. The physical-digital integration of the current Fourth Industrial Revolution means there is simply more sensitive information than ever before. That increase in the volume of data, and the shift in its format, has made information easier to steal and harder to protect. It took several weeks to photocopy the Pentagon Papers by hand; today, a few mouse clicks in the cloud can compromise the personal information of millions of people.

Even more importantly, physical-digital convergence is also changing the basics of people’s motivations and actions. All of this data movement occurs in virtual space, where individuals often feel anonymous, and, rightly or wrongly, those feelings of anonymity in cyberspace change how we act. While in some cases that anonymity may reinforce group values, in others it can lead to risky behavior. The challenge is that our understanding of how this new physical-digital environment affects us has not kept pace. One survey found that while 80 percent of companies view the loss of confidential information as a significant threat, only 43 percent had a system to monitor or control outgoing emails. Leadership has not changed, but leaders need help applying those time-honored principles in a digital age.

A way forward

A key to moving forward on insider threat is remembering that people are the solution, but to interrupt the critical path in a physical-digital era requires physical-digital tools.

Prevent – It is perhaps no surprise that the first step in avoiding an insider threat event is to prevent it from happening in the first place. Controls can be important, but they need to be balanced against their impact on ease of business. Ultimately, if people’s motivations are the root cause of insider threat, then working hard to keep those motivations positive is the best prevention. High workforce engagement has been shown to be a key force in mitigating disgruntlement and increasing understanding of mission and expectations.

Mitigate – However, even the best, happiest organizations need to be prepared. Preparation begins with understanding what needs to be protected. A data inventory can help define what the critical assets are, who the likely threat actors are, and what the organization’s appetite for risk with those assets is. With answers to those three questions, an organization can begin to implement programs that protect critical assets while maintaining ease of business.

Detect – Finally, since no person is fully predictable and no control is always effective, organizations should continuously monitor key activities for indicators of adverse behavior. This monitoring should also extend to the prevent and mitigate programs themselves, to continuously evaluate their effectiveness and identify areas for improvement.

Organizations are just groups of people using common tools to do a job. So any solution that aims to address why some people break the trust of that group should take into account both the nature of people and the nature of those common tools. If not, we could be fated to live in Ephialtes’ “nightmare” for another 2,500 years.

Dr. Michael G. Gelles is a managing director with Deloitte Consulting LLP, advising a wide variety of clients in the public and private sectors. He is a noted insider threat specialist focused on cyber and physical security risks, asset loss, exploitation, terrorism, workplace violence, and sabotage. Joe Mariani leads research into defense, national security, and justice for Deloitte’s Center for Government Insights. Joe’s research focuses on innovation and technology adoption by both commercial businesses and national security organizations. The views expressed in this article are those of the authors and do not necessarily reflect those of the U.S. Army War College, the U.S. Army, or the Department of Defense.

Photo Description: Two People Standing

Photo Credit: Photo by Naveen Annam from Pexels

  • Richard R. Allen

    In The Honest Truth About Dishonesty: How We Lie to Everyone – Especially Ourselves, Dan Ariely explains his various research experiments into honesty. What is interesting is that often the best practices recommended by the Big Four CPA firms are not especially effective. Other measures would be effective but are not implemented because they haven’t been blessed by a Big Four CPA firm. It would be an interesting series of research projects to test various controls for effectiveness in various settings in our Armed Forces. When looking at the common problems in security, one would like to see where the clusters of problems are and what the attitudes are. In The Failure of Risk Management, the author identified four methodologies to validate a risk framework. Previously, I asked Ron Ross whether the NIST risk framework was validated using one of the four methodologies. He said it is best practices.

    From my personal experience in cybersecurity in a civilian federal agency some of the issues are:
    1. We have a schedule to meet and it is too hard to code in security,
    2. A senior manager over a major system only cared about the implementation and didn’t care to execute normal tests of security before implementation of a system,
    3. Personnel can remain ignorant of the actual risks involved in their operations,
    4. To show their independence, some subsets of an organization will implement their own security standards,
    5. As the implementation guidelines for a software product evolve, no one bothers to review past implementations of user provisioning to bring them up to effective security,
    6. You have the obvious General Patton or General MacArthur who view rules as not applicable to them,
    7. To protect diversity hires, senior management will suspend rules which would normally result in termination,
    8. We use too many products which have confusing or incompatible security requirements so that standardized security can’t be achieved,
    9. We relied on vendors to install and secure products which they didn’t do, and
    10. We had a system in to test and we forgot to secure it.

    These are in addition to the normal criminals, spies, traitors, and malcontents. Some malcontents act out of unprovoked malice.
