Data Security or the Ethics of “Inappropriate Curiosity”


Ethical challenges in data usage can arise from many sources, including “inappropriate curiosity.” Every organization should foster and enforce ethical data management behavior.

Introduction

Early reports have been disheartening. Several contractors for the U.S. Department of State have been disciplined or dismissed for “inappropriate curiosity” about data held in their charge. What were they thinking? Any possible political motivation behind whom and what they were snooping on (or even the lack of it) is moot: the public trust has been betrayed. By this writing, further public reports of the incident have already been relegated to a footnote on the six-o’clock news. For data management professionals and IT managers, however, clear lessons can be drawn. In fact, there are many points to be made, and no one person, role, or organizational entity bears the entire burden for learning from this lapse in data security and correcting the mistakes.

Let us first consider the basics of the case. Two or more federal contract workers decided, on their own, to go on an unauthorized “fishing expedition” through the privileged, sensitive, and confidential files of U.S. citizens. They were caught (be thankful for some controls) and appropriately sanctioned. Why it took almost two months for the breach to become known is beyond the scope of this essay; appropriate investigations are underway. Instead, consider the enabling factors.

Security professionals explain that the most egregious security breaches are the result of malicious, persistent, and sophisticated attacks. None of that seems true here; in fact, the contractors’ actions appear to have been quite casual. Clearly, then, even the most basic controls were not in place. Still, if sanctioning the perpetrators and locking the barn door had closed the issue, there would be no reason for this article.
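
For concreteness, consider what even a minimal control might look like. The sketch below is a hypothetical Python fragment, not a description of any actual State Department system; every name in it (SENSITIVE_RECORDS, fetch_record, the case-number convention) is invented for illustration. The idea is simply that each lookup leaves an audit trail, and that a lookup of a sensitive record with no documented business reason gets flagged for human review.

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")

# Hypothetical set of record IDs flagged as sensitive (e.g., high-profile subjects).
SENSITIVE_RECORDS = {"record-1001", "record-2002"}

def fetch_record(user_id: str, record_id: str, case_number: str | None = None) -> dict:
    """Return a record, writing an audit entry and flagging suspicious access.

    A lookup of a sensitive record with no associated case number is exactly
    the kind of casual "fishing expedition" a basic control should surface.
    """
    entry = {
        "when": datetime.now(timezone.utc).isoformat(),
        "user": user_id,
        "record": record_id,
        "case": case_number,
    }
    audit_log.info("access: %s", entry)

    if record_id in SENSITIVE_RECORDS and case_number is None:
        # Flag rather than block: route the event to a human reviewer.
        audit_log.warning("unjustified sensitive access: %s", entry)

    return {"id": record_id}  # stand-in for the real data-store lookup

# This access would be both logged and flagged for review.
fetch_record("contractor-42", "record-1001")
```

Controls of roughly this shape are presumably how the contractors were caught at all; the point is how little machinery it takes to surface casual snooping.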

When systems fail big, a single cause is usually not the culprit. Most often, major failures are the result of a cascade of errors, each nearly insignificant by itself. These seemingly minor errors come together in unexpected ways and combine into an enormous, sometimes catastrophic, result. Even in “big” failures, a human factor is usually in play; in the State Department incident, every error was a human-factor failure.

Human-factor failures are always preventable. More importantly, the errors in this State Department incident had little to do with technology and everything to do with what can generically be called social systems. Given that technology is merely a tool, and that complex social systems are increasingly reliant on IT, social and technological systems clearly have a high potential for combining in these aberrant and unexpected ways.

Basically, an IT or data management professional’s responsibility does not end where the technology ends. Each of us has an ethical duty and an obligation to society not only to see that our information systems are reliable and secure, but also to take affirmative action to see that they are used only in the ways intended. We will never be able to stop every malicious, persistent, and sophisticated attack. Moreover, our systems and policies must show integrity even when “attacks” carry no malicious intent and are merely the result of misguided or ill-advised action.

Everyone knows no system is bullet-proof; we can only make it harder and harder to breach. Building a higher wall around the compromised State Department data is probably a simple matter, but that treats only the symptoms, not the cause. More importantly, the treatment is not solely a technological problem; it is a social one, too. Arguing that the State Department’s system designers “should” have been more diligent in security design is like asking a legislature to anticipate every possible way a set of actions could be deemed illegal. What is legal is not necessarily ethical, and vice versa. While we are indeed a nation of laws, law alone cannot possibly guide us through the maze we call the human experience.

Society places greater trust in data technology every day, whether it realizes it or not. And while security methods are well known and largely a matter of time and money, lapses in human judgment will be with us forever. Therefore, as stewards of data technology, our responsibility for protecting the public against misuse of its data lies not only in the technological but also in the social.

Here, then, are specific social actions each of us is responsible for:


Figure 1: Responsibilities for Ethical Behaviors Toward Data

  • As data management professionals, we often have more than casual contact with sensitive personal data. Every person has an expectation of privacy, especially for data that can be misused to cause economic harm or loss of reputation or standing in the community. We all have a near-absolute obligation to hold that data confidential, regardless of how it was obtained. (We would be released from this responsibility only if holding the data confidential would cause a greater harm to others, and even then the judgment is not ours to make.)
  • As supervisors, we have an affirmative responsibility to ensure that our delegates explicitly understand the nature of their responsibilities in matters of confidentiality. We do that not only by training them well as stewards, but also by giving them a genuine stake in the outcomes.
  • As technologists, we must make our information systems “appropriately” secure. This, of course, is a loaded term. For background, look to the legal profession: our base assumption about people is that they are “reasonable,” yet the first question they teach in civil law is “What happens when reasonable people disagree?” Because we cannot make our systems bullet-proof, we are always making “appropriate” trade-offs against likely scenarios (one minimal illustration follows this list). In addition to basic education on security methods, follow up on recent high-profile breaches and draw your own conclusions. Bottom line: compliance with best practice and regulation is probably not enough.
  • As technology managers, we are obligated to create and enforce effective policy and procedure. These governance tools must serve our end-customers, executives, shareholders, staff, and every other economic or social stakeholder. We have a further responsibility to continuously refine and improve our policy and procedure when we detect its weaknesses and as technology and organizational structure change.
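
As promised above, here is one minimal illustration of an “appropriate” trade-off expressed as code: a deny-by-default, least-privilege authorization check. It is a sketch only; the roles, permissions, and names (ROLE_PERMISSIONS, AccessRequest, is_authorized) are assumptions invented for the example, and real policy would live in a governed store rather than in source code.

```python
from dataclasses import dataclass

# Hypothetical role-to-permission mapping; a real system would keep this
# in a governed policy store, not hard-coded in source.
ROLE_PERMISSIONS = {
    "caseworker": {"read_own_cases"},
    "supervisor": {"read_own_cases", "read_team_cases"},
    "auditor": {"read_any_case"},
}

@dataclass
class AccessRequest:
    user_role: str
    permission_needed: str

def is_authorized(request: AccessRequest) -> bool:
    """Deny by default: a request passes only if the role explicitly grants it.

    Least privilege means broad browsing rights simply never exist to abuse,
    which is a policy choice as much as a technical one.
    """
    granted = ROLE_PERMISSIONS.get(request.user_role, set())
    return request.permission_needed in granted

# A caseworker asking for team-wide browsing is refused outright.
assert not is_authorized(AccessRequest("caseworker", "read_team_cases"))
assert is_authorized(AccessRequest("auditor", "read_any_case"))
```

The design choice worth noting is deny-by-default: anything not explicitly granted is refused, so the trade-off discussion happens when a grant is requested, not after a breach.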

Conclusion

One way to look at ethics is as a system of behavior comprising obligations and entitlements. In this context, people are entitled to privacy and protection from harm, and we, as IT and data management professionals, are specifically obligated to ensure that the systems and data in our control or stewardship cannot be subverted. The staggering amount of data collected on people, everywhere from banks to grocery stores, makes those data management obligations loom large. Exercising proper judgment in ethical matters small and large thus becomes a clear imperative. Although no one wants to face the quagmire of an ethical dilemma, strive at least to get the easy decisions right.


James McQuade

James McQuade is an Enterprise Data Architect and Project Manager for forward-leaning, data-intensive, delivery-oriented organizations. Prior to specializing in Data Management, Mr. McQuade honed expert skills in Applications Development, delivering a consistent track record of success in Project and Account Management and Software Engineering. He is a graduate of Carnegie Mellon University (BS) and the University of Pittsburgh (MS).
