Date of Award

Winter 2003

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Engineering Management & Systems Engineering

Committee Director

Resit Unal

Committee Member

Charles B. Keating

Committee Member

Andres Sousa-Poza

Committee Member

Bruce Conway

Abstract

The growing complexity of technical systems has emphasized the need to gather as much information as possible about a system of interest in order to make robust, sound decisions regarding its design and deployment. Such data collection draws on empirical statistics, historical information, and expert opinion. In much of the aerospace conceptual design environment, however, the lack of historical information and the infeasibility of gathering empirical data leave expert opinion as the primary source.

The conceptual design of a space vehicle requires input from several disciplines (weights and sizing, operations, trajectory, etc.). In this multidisciplinary environment, the design variables are often not easily quantified and carry a high degree of uncertainty. Decision-makers must rely on expert assessments of that uncertainty to evaluate the risk level of a conceptual design. Because multiple experts are often queried for their evaluation of uncertainty, a means of combining, or aggregating, their assessments must be developed. Providing decision-makers with a single assessment that captures the consensus of the multiple experts would greatly enhance their ability to evaluate the risk associated with a conceptual design.

The objective of this research has been to develop an aggregation methodology that efficiently combines the uncertainty assessments of multiple experts across the multiple disciplines involved in aerospace conceptual design. Bayesian probability, augmented by uncertainty modeling and expert calibration, was employed in constructing the methodology. Appropriate questionnaire techniques were used to acquire expert opinion; the responses served as input distributions to the aggregation algorithm. The derived techniques were applied as part of a larger expert-assessment elicitation and calibration study.
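To illustrate the general idea of aggregating expert-supplied input distributions with calibration weights, the following minimal Python sketch uses a weighted logarithmic opinion pool, one common Bayesian-flavored pooling rule; it is not the dissertation's specific algorithm, and the design variable, expert distributions, and weights are purely illustrative assumptions.

import numpy as np

# Common evaluation grid for a single design variable (illustrative units).
x_grid = np.linspace(900.0, 1500.0, 601)

def normal_pdf(x, mean, std):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2.0 * np.pi))

def log_opinion_pool(densities, weights, x):
    """Combine expert densities p_i(x) on a shared grid.

    The aggregate is proportional to prod_i p_i(x)**w_i (a weighted
    geometric mean), renormalized so it integrates to one.
    """
    log_pool = np.sum([w * np.log(p + 1e-300) for p, w in zip(densities, weights)], axis=0)
    pool = np.exp(log_pool - log_pool.max())   # guard against underflow/overflow
    return pool / np.trapz(pool, x)            # renormalize to a proper density

# Three experts express their uncertainty as normal distributions (assumed values).
expert_pdfs = [
    normal_pdf(x_grid, 1150.0, 60.0),
    normal_pdf(x_grid, 1200.0, 90.0),
    normal_pdf(x_grid, 1100.0, 40.0),
]

# Calibration weights reflecting each expert's assessed credibility (assumed values).
weights = np.array([0.5, 0.2, 0.3])

aggregate = log_opinion_pool(expert_pdfs, weights, x_grid)
print("Mean of aggregated distribution:", np.trapz(x_grid * aggregate, x_grid))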

Results of this research demonstrate that uncertainty assessments can be aggregated even in environments where likelihood functions and empirically assessed expert credibility factors are lacking. Validation of the methodology provides evidence that decision-makers find the aggregated responses useful in formulating decision strategies.

DOI

10.25777/sxzb-0f58
