November 14, 2024
Information has never been more available than it is in this modern age of the Internet, 24-hour news, streaming services and an endless array of data collection devices and methods. But as the old saying goes, "figures always lie and liars always figure." Now, no one is calling anybody a liar here, but information that has been collected, analyzed, used, manipulated, or just presented always carries the potential for bias and even deception, intentional or not. WAR ROOM welcomes back Jeff Baker and Bob Bradford as they present a two-part article that tackles the difficult task of making sense of data in decision making. Part one addresses the sources of data and how the social sciences can arm leaders with better questions and improved judgment.

Armed with an improved awareness of the strengths and weaknesses of their information, leaders can make better informed decisions at the strategic level.

Strategic leaders are inundated with information that might inform their decisions. Their staffs and advisors accumulate and make sense of large amounts of data, analyze it, and then synthesize it—all to help their leaders make better decisions. However, the types of problems, and the data necessary to analyze those problems, are fundamentally different at the strategic level. Moreover, sources do not come with an equivalent of the Good Housekeeping seal of approval. Leaders must use their judgment to decide which information to trust. Baselining and improving leaders’ comfort and skill in using data and conducting analyses can better arm them for these tasks at the strategic level.

In an effort to improve the analytical skills of U.S. Army War College graduates (who frequently operate in this world), data literacy is at the forefront of many discussions. To contribute to this discussion and skill-building effort, we present two articles that describe a few things strategic leaders need to know about social science and policy research as well as the data on which these fields are based. In this first article, we address the challenges that distinct worldviews can create, explain how to assess sources of data, and describe some of the constructs used to represent the world in the social sciences. A good understanding of these concepts will allow leaders to ask better questions, more easily interpret information and improve their judgment. Armed with an improved awareness of the strengths and weaknesses of their information, leaders can make better informed decisions at the strategic level.

Different Worldviews

Differences in worldview affect which sources of data leaders are inclined to use. Data consumers need to be aware of two aspects of their sources’ points of view when they are reading and collecting information. We do not expect to permanently add the words ontology and epistemology to the lexicon of strategic leaders, but we hope leaders will remember that people hold fundamentally different assumptions about the nature of the world and about what counts as knowledge. This understanding will better arm leaders to analyze data and sources.

Ontology is the philosophical study of the nature of being, becoming, existence or reality. It explores questions about what exists or can be said to exist and how entities can be grouped according to similarities and differences. Views of the world are either objective or subjective. The objectivist view of ontology proposes that truth exists independently of human experience. There is an absolute truth. A subjectivist view holds that truth exists only as part of human awareness and only through human experience. Truth is relative to the experience of the observer.

Epistemology is the study of the “criteria by which we can know what does and does not constitute warranted or scientific knowledge.” As consumers of data and information, it is important to realize that people can hold different assumptions about what makes up valid scientific knowledge. These assumptions are frequently unstated, and divergent assumptions can lead to significant misunderstanding.

Some researchers believe that it is possible to study the external world objectively, without the researcher’s presence altering the outcomes or findings. Other researchers think the opposite—that knowledge is filtered through the lenses of language, gender, race, social class, etc. They do not necessarily reject the view that an objective world exists, but they do view knowledge as value-laden through the researcher’s experiences and beliefs.

As strategic leaders continue learning and growing their networks, they will be exposed to articles, book excerpts and news reports, and they will meet a variety of people who hold unstated assumptions about the nature of existence and knowledge. Very rarely will these sources of new information state their ontological or epistemological stance; leaders will often need to infer it. Researchers’ philosophical outlooks will also dictate the types of research questions that they ask and the methods that they use to study those questions. As leaders interact with members of their networks, diverse worldviews and assumptions can cause friction and frustration. How people view the world influences what they deem right or correct. For example, if you read an academic study written by a positivist, that author has an objective view of reality and believes that one true reality exists independent of other people’s beliefs. An interpretivist, however, has a subjective view of reality: reality is what we perceive it to be, and there can be multiple realities. The interpretivist viewpoint could frustrate the positivist and cause them to discount anything the interpretivist has to say, since a positivist believes there is only one reality and cannot fathom how anyone could see the world differently. Understanding opposing viewpoints is essential to communicating effectively.

Trusting Data and Analysis

Many sources of data provide analysis that can support decisions at the strategic level, but leaders need help knowing which sources to trust, which to accept with caveats and which to reject outright.

Strategic leaders must develop good judgment to determine the credibility of information before using it to support decisions. For example, making force structure decisions based on a study produced by a pro-contractor think tank, without evaluating the underlying assumptions of its analysis, could lead to biased decisions. Leaders and advisors can start by checking the bona fides of the team conducting the analysis. Does the team have the skill and credentials to do the analysis? Have they conducted analyses before, and were those analyses rigorous? Are they using good scientific practices in their study? Has the work been peer reviewed by a larger community of practice, or were all reviews internal to an organization? If the study relies on analytic tools and methods that you do not understand, find someone on your staff who can help you make an assessment. At the strategic level, the staff should include skilled policy or operations research analysts who can help you “check under the hood” of models and analysis. Take advantage of these people to help you build your judgment about sources.

Leaders should also try to identify the study authors’ potential biases and influences. Look at the analytic team’s track record: what else have they done, and what did people say about its rigor and conclusions? When leaders review a study from a think tank, they should check the “about” page on the organization’s website to get a feel for its ideological bias and economic incentives. Who is paying for the study, and what would they prefer the answer to be? A study is not inherently flawed just because its conclusions confirm the assumptions of the people paying for it, but such alignment should make the leader wary.

Construct clarity is important because sometimes researchers from different disciplines—not unlike service members from different services—use similar constructs and words differently.

Construct Clarity

Constructs are abstract concepts that are specifically chosen or created to explain a given phenomenon. Constructs may be simple concepts such as marksmanship—if soldiers can hit targets on the range, they can do it in battle—or combinations of related constructs that bring together several underlying concepts. For example, a construct for communication skills might consist of other constructs for vocabulary, grammar, tone, volume, etc. Simply put, constructs represent what you are trying to measure. Construct clarity is important because sometimes researchers from different disciplines—not unlike service members from different services—use similar constructs and words differently. We need to clearly understand the constructs used in an analysis before we can judge the conclusions based on them. In an effort to provide clarity, scholars have articulated four elements of construct clarity.
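Before turning to those four elements, the idea of a composite construct can be made concrete with a minimal sketch. The sub-construct names and the 1-to-5 scale below are illustrative assumptions, not a validated instrument:

```python
from dataclasses import dataclass

@dataclass
class CommunicationSkills:
    """A hypothetical composite construct. Each field is itself a
    sub-construct, scored here on an assumed 1-to-5 scale."""
    vocabulary: float
    grammar: float
    tone: float
    volume: float

    def score(self) -> float:
        # Equal weighting is itself a modeling choice the researcher
        # would have to state and justify, not a given.
        parts = (self.vocabulary, self.grammar, self.tone, self.volume)
        return sum(parts) / len(parts)

print(CommunicationSkills(vocabulary=4, grammar=5, tone=3, volume=4).score())  # 4.0
```

Even this toy example surfaces the questions a careful consumer should ask: why these four sub-constructs, why equal weights, and why this scale?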

The first element is a clear definition. Sometimes authors simply fail to define their construct and assume universal understanding or universal agreement on the concept that they are studying or testing. Good definitions should do three things. First, they should effectively capture the essential properties and characteristics of the concept being studied. Second, researchers should avoid tautology; they should not use elements of the construct in their definition of the construct (e.g., a transformational leader defined as a leader who transforms organizations). Third, good definitions should be concise and focus as narrowly as possible on the essential characteristics of a phenomenon.

The second element is the scope of the construct. Very few social science constructs have universal application, so it is important for researchers to state the conditions under which their construct is valid. Is their notion of performance valid only in aviation units, or in all types of organizations? Is there a time aspect to the construct? What are the boundary conditions? Failing to state these conditions explicitly can lead to confusion and to seemingly contradictory results across studies.

The third element is the relationship to other constructs. Since very few constructs are brand new, it is the researcher’s job to outline how the construct came about. Also, if it is related to other constructs, how do those constructs interact? How are they similar and how are they different?

Finally, the last element is coherence. Simply put, do the definitions, scope and relationship to other constructs make sense? Do they work together in a logically feasible manner that enables other scholars to understand and potentially build upon their work?

Two other important characteristics of constructs are validity and reliability. For example, a researcher may want to measure the level of empathy in different people. Empathy is an abstract concept that is not simple to measure. To have a good measure of empathy, researchers need to make sure that the measure is both valid and reliable. Validity means we are measuring the construct we actually want to measure. Reliability means we can measure the construct consistently and precisely, over and over, in different contexts and with different people. A measure can be reliable but not valid: if it produces consistent results but measures the wrong phenomenon, what good is it? Conversely, we can measure the right construct but not get consistent results; then we have a valid but unreliable measure.
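The distinction shows up clearly in a minimal simulation. The numbers here are invented for illustration: assume a “true” empathy score of 70 on a hypothetical 0-to-100 scale, one instrument that reads consistently but actually captures some other trait, and another that is centered on empathy but noisy:

```python
import random

random.seed(1)
TRUE_EMPATHY = 70  # assumed "true" value on a hypothetical 0-100 scale

def reliable_but_invalid():
    # Tight, repeatable readings, but systematically off target:
    # the instrument is really picking up some other trait.
    return 50 + random.gauss(0, 1)

def valid_but_unreliable():
    # Centered on the right construct, but so noisy that any
    # single reading is hard to trust.
    return TRUE_EMPATHY + random.gauss(0, 15)

for name, measure in [("reliable, not valid", reliable_but_invalid),
                      ("valid, not reliable", valid_but_unreliable)]:
    readings = [measure() for _ in range(5)]
    print(f"{name}: " + ", ".join(f"{r:.1f}" for r in readings))
```

The first instrument would pass a test-retest reliability check yet still mislead a decision maker; the second points at the right thing but requires many observations before any single reading can be trusted.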

Measuring unobservable constructs in the social sciences can be challenging. Leaders should ask questions that build a common language for communicating effectively with others: Is this a valid measure? Is it reliable? How did the author or presenter of the information address these two issues? Was this a new way to measure a concept? Have the scales changed over time?

Conclusion

We have provided a few things leaders should know and consider when looking at data and analysis. Leaders should strive to understand other viewpoints on the world and knowledge, assess sources for trustworthiness and assess whether the constructs used enable effective communication. In the next article of this series, we will identify potential challenges with data and explain the types and impact of errors in analysis. This list is a good start for leaders building better judgment about analysis and data, but it is by no means exhaustive. Strategic leaders should always work to improve their comfort and literacy with research and analysis. An educated consumer of analysis can use it effectively to make better decisions. Remember what Carly Fiorina, former CEO of Hewlett-Packard, said: “The goal is to turn data into information and information into insight.” Understanding the few things listed here can help along that path.

COL Jeffrey E. Baker is an Army Officer and instructor in the Department of Command, Leadership, and Management at the U.S. Army War College.

Bob Bradford is a retired U.S. Army colonel and the Professor of Defense and Joint Processes in the U.S. Army War College’s Department of Command, Leadership, and Management.

The views expressed in this article are those of the authors and do not necessarily reflect those of the U.S. Army War College, the U.S. Army, or the Department of Defense.

Photo Credit: Infographic vector created by rawpixel.com – www.freepik.com
