



In CHI '99 Conference Proceedings (Pittsburgh, PA). ACM, New York.


Computer-Supported Inferential Analysis

Under Data Overload

Emily S. Patterson

Institute for Ergonomics


210 Baker Systems, 1971 Neil Ave.
Columbus, OH 43210 USA
+1 614 292-6287
Patterson.150@osu.edu



ABSTRACT


A simulation study of inferential analysis under data overload was conducted with professional analysts. Using a process tracing methodology, vulnerabilities in the analysis process were identified that point to design criteria for useful support aids.

Keywords


Cognitive task analysis, data overload, inferential analysis, information visualization, process tracing

INFERENTIAL ANALYSIS UNDER DATA OVERLOAD

This research is driven by a formidable problem in many work domains: analysts tasked with generating a coherent description of a situation from an avalanche of electronic data. Data overload is a fundamental, ubiquitous problem. Exacerbating it in many organizations are two widespread trends. The first is an organizational trend of reducing operational staffing and expertise during nominal situations. As a result, analysts are increasingly required to analyze situations that are outside their immediate base of expertise on short deadlines. The second trend is an increase in the amount of available electronic data. Ironically, this explosion in data availability, although good in principle, complicates the task of effectively sampling the information and adds new cognitive burdens to the analysis process [1].

Inferential analysis in the modern electronic environment involves constructing an explanatory story [2] from information that is retrieved through keyword search. Events occurring in the world are represented mainly as textual descriptions in reports. Reports on the same events do not necessarily corroborate each other; rather, the information is often discrepant along various dimensions [3].


SIMULATION STUDY OF INFERENTIAL ANALYSIS

A high-fidelity simulation study was conducted to better understand how experts conduct inferential analysis under data overload. Ten professional analysts with an average of thirteen years of experience were asked to analyze the Ariane 501 rocket launch failure that occurred on June 4, 1996, in order to answer a question about when the failure occurred, why it occurred, and what its impacts were. The analysts were provided with a database of approximately 2000 reports, which could be sampled by keyword searching and by browsing by date and title. No single report, including the official Inquiry Board Report, had all of the information necessary to answer the question. The majority of the database was “on topic” in that the reports contained information relevant to the simulated task, but only nine documents in the database were identified as “high-profit” documents in that they were detailed, accurate descriptions of the event from reputable sources.
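
To make this retrieval environment concrete, the sketch below shows one way a report collection of this kind could be sampled by keyword search or browsed sequentially by date and title. It is a minimal illustration in Python; the class and function names are assumptions made for exposition, not the software the participants actually used.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Report:
        report_id: str
        title: str
        published: date
        body: str

    def keyword_search(reports: list[Report], query: str) -> list[Report]:
        """Return reports whose title or body contains every query term."""
        terms = query.lower().split()
        return [r for r in reports
                if all(t in (r.title + " " + r.body).lower() for t in terms)]

    def browse_by_date(reports: list[Report]) -> list[tuple[date, str]]:
        """List (date, title) pairs in chronological order for sequential browsing."""
        return sorted((r.published, r.title) for r in reports)

Refining a query then amounts to calling keyword_search repeatedly with narrower term lists and scanning the returned titles, roughly the sampling behavior described in the findings below.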

Two researchers directly observed this process, which was also audiotaped and videotaped. The researchers noted which queries were used, how many documents each query retrieved, and which articles were opened. The electronic and handwritten notes generated by the participants were also collected. The participants were asked to think aloud during the process and to give a verbal response to the question when they were ready.

A process tracing methodology [4] was used to construct protocols for each study participant. These protocols incorporated the queries that were used, the information in the articles that were read, what the participant verbalized while reading each document, and the participant's physical behavior (e.g., cutting and pasting information from documents).
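
As a rough illustration of what such a protocol can capture, one entry might be represented as a record like the one below. The fields mirror the data sources just listed (queries, documents read, verbalizations, physical actions), but the structure itself is an assumption for exposition, not the coding scheme used in the study.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class ProtocolEntry:
        timestamp: str                         # elapsed time into the session, e.g. "00:14:32"
        query: Optional[str] = None            # keyword query issued at this point, if any
        document_opened: Optional[str] = None  # title of the document opened, if any
        verbalization: str = ""                # transcribed think-aloud content
        actions: list[str] = field(default_factory=list)  # e.g. ["pasted paragraph into notes"]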

ARIANE 501 LAUNCH FAILURE

The Ariane 501 incident differed from the typical rocket launcher incident for several reasons. First, the explosion was due to a design problem in the software rather than a classic mechanical failure: a numerical overflow occurred in an unprotected horizontal velocity variable in embedded software that was re-used from the Ariane 4, a slower rocket. Additionally, it was the first launch of a new rocket design, which raised concerns about the viability of that design. Overall, however, launch failures are relatively common in the industry, and first launches in particular are prone to failure, so the reputation of the Ariane program was not greatly damaged.
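
The flight software itself was written in Ada; the Python fragment below, which uses hypothetical values, only illustrates the class of error involved: an unguarded conversion of a large floating-point quantity into a 16-bit signed integer silently wraps, whereas a guarded conversion fails loudly.

    INT16_MIN, INT16_MAX = -32768, 32767

    def to_int16_unprotected(value: float) -> int:
        """Simulate a conversion that ignores the 16-bit range and silently wraps."""
        return ((int(value) - INT16_MIN) % 65536) + INT16_MIN

    def to_int16_protected(value: float) -> int:
        """Guarded conversion: reject values outside the representable range."""
        if not INT16_MIN <= value <= INT16_MAX:
            raise OverflowError(f"{value} does not fit in a signed 16-bit integer")
        return int(value)

    horizontal_velocity = 40_000.0  # hypothetical value beyond what Ariane 4 trajectories produced
    print(to_int16_unprotected(horizontal_velocity))  # wraps to a meaningless -25536
    try:
        to_int16_protected(horizontal_velocity)
    except OverflowError as exc:
        print("guarded conversion rejected the value:", exc)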

Some of the information available to the participants on the Ariane 501 incident was inaccurate due to the naturally occurring sources of inaccuracy in event-driven reporting. All reports immediately following the launch failure had some incorrect or misleading information (e.g., it was reported that ground controllers blew up the rocket when it had actually self-destructed). Other reports had inaccuracies due to translation from a foreign language, secondhand reporting, or a lack of technical expertise (e.g., misstating the cause of the numeric overflow). Additionally, predictions often differed from the actual events (e.g., the original predicted delay to the next launch was a couple of months, whereas the actual delay was about a year).


FINDINGS FROM THE STUDY

The inferential process employed by all of the study participants1 can be broken down into three interrelated stages: information selection, conflict resolution, and story generation. Information was selected from the database through the refinement of keyword queries and by sequentially browsing the returned reports by dates and titles. Some of the sampled reports were used as the main basis for the analysis, which we refer to as “key” documents. The key documents were used to generate the skeleton of the analysis product. Supporting documents were then used to corroborate the important information and fill in details. Conflicts in the data were flagged and judgments about which data to include were revisited as new information on the topic was identified. When the analysts felt ready, they organized their notes and generated a coherent story to respond to the question.
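
The bookkeeping implied by these stages can be sketched as a small data structure: key documents anchor the skeleton of the story, supporting documents corroborate it, and conflicts are flagged so that judgments can be revisited as new information arrives. This is a speculative illustration of what a support aid might track, not a tool the participants had.

    from dataclasses import dataclass, field

    @dataclass
    class Conflict:
        topic: str             # e.g. "cause of the explosion"
        statements: list[str]  # the discrepant claims, each with its source
        resolved: bool = False

    @dataclass
    class AnalysisState:
        key_documents: list[str] = field(default_factory=list)
        supporting_documents: list[str] = field(default_factory=list)
        conflicts: list[Conflict] = field(default_factory=list)

        def flag_conflict(self, topic: str, statements: list[str]) -> None:
            """Record discrepant statements so the judgment can be revisited later."""
            self.conflicts.append(Conflict(topic, statements))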

The findings of this study highlight potential vulnerabilities in the analysis process when analysts are asked to analyze something outside their base of expertise, are tasked with tight deadlines, and have a large dataset that can only be sampled and viewed sequentially through the computer “keyhole” [5]. First, there is the potential for premature closure during the analysis process, leaving analytic products uncorroborated, incomplete, or even inaccurate. The three analysts who made inaccurate statements in their verbal briefings took less time and read fewer documents than the five analysts who made no inaccurate statements.

Second, analysts are vulnerable to missing critical information. None of the three analysts who made inaccurate statements in their briefings used any of the nine high-profit documents as key documents, whereas three of the five analysts who made no inaccurate statements did. This vulnerability is particularly acute with regard to information that is updated over time. It is very difficult to know that a statement that was accurate at one point in time, such as the Cluster satellite program being discontinued as a result of the loss of the Cluster satellites, has been overturned by later updates, such as when the program was subsequently reinstated.

Finally, analysts are vulnerable to a number of breakdowns in the process of corroborating data, partly because this process is not well supported by design aids. A variety of breakdowns, not all of which led to inaccurate statements, were observed in identifying conflicts, tracking them over time, and revising judgments about the data in the face of new information.


The strategy of this line of research is to understand why data overload is fundamentally hard. The findings of this study increase our understanding of the cognitive tasks and potential vulnerabilities in the inferential analysis process under data overload. These vulnerabilities are interesting because they are so difficult to address: they are not amenable to simple, straightforward adjustments or feature additions to current tools. Meeting the design criteria that follow from the study findings will require innovative design concepts.

ACKNOWLEDGMENTS


I would like to thank my advisor, David Woods, for exceptional support and encouragement. This material is partly based upon work supported under a National Science Foundation Graduate Fellowship.

REFERENCES


  1. Woods, D.D., Patterson, E.S., and Roth, E.M. Aiding the Intelligence Analyst in Situations of Data Overload: A Diagnosis of Data Overload. Institute for Ergonomics/Cognitive Systems Engineering Laboratory Report, ERGO-CSEL 98-TR-03, The Ohio State University, Columbus OH, 1998.

  2. Josephson, J., and Josephson, S. Abductive Inference. Cambridge University Press, New York, NY, 1994.

  3. Schum, D.A. Evidence and Inference for the Intelligence Analyst, Volume I. University Press of America, Lanham, Maryland, 1987.
  4. Woods, D.D. Process tracing methods for the study of cognition outside of the experimental psychology laboratory. In G. Klein, J. Orasanu, and R. Calderwood (Eds.), Decision Making in Action: Models and Methods. Ablex Publishing Corporation, Norwood, NJ, 1993.


  5. Woods, D.D. and Watts, J.C. How not to have to navigate through too many displays. In M. Helander (Ed.), Handbook of Human-Computer Interaction, 2nd edition. Elsevier Science Publishers, B.V., North Holland, 1997.




1 Two study participants' data were not used in the analysis. One analyzed a different satellite failure. The other did not complete the task because the printer was not working during his session.


