Date of Award
Summer 1997
Document Type
Dissertation
Degree Name
Doctor of Philosophy (PhD)
Department
Engineering Management & Systems Engineering
Committee Director
Resit Unal
Committee Member
Laurence D. Richards
Committee Member
Han P. Bao
Committee Member
Abel A. Fernandez
Committee Member
James Schwing
Abstract
This dissertation describes the development, refinement, and demonstration of an expert judgment elicitation methodology. The methodology has been developed by synthesizing the literature across several social science and scientific fields. The foremost consideration in developing the methodology has been to incorporate elements based on reasonable expectations of the human capabilities of its user, in this case the expert.
Many methodologies exist for eliciting assessments of uncertain events, frequently in the form of probabilities. This methodology differs by incorporating a qualitative element as the beginning step of the elicitation process. A qualitative assessment is a more reasonable way to begin the task than a direct subjective probability judgment. The procedure then progresses to a quantitative evaluation of the qualitative uncertainty statement. In combination, the qualitative and quantitative assessments constitute the information elicited from the expert, which is used in a subsequent step to develop a data set. The resulting data can be specified as probability distributions for use in a Monte Carlo simulation.
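To make the flow from elicited assessment to Monte Carlo input concrete, the following is a minimal illustrative sketch, not the dissertation's actual procedure. It assumes, purely for illustration, that a qualitative uncertainty statement is paired with expert-supplied low/most-likely/high weight bounds, which are then treated as a triangular distribution and sampled; the component name, values, and helper function are hypothetical.

```python
# Illustrative sketch only; assumes a triangular distribution built from
# hypothetical expert-supplied low/mode/high bounds for one component.
import random

# Hypothetical elicitation result: qualitative statement plus quantitative bounds (kg).
elicited = {
    "component": "propellant tank",
    "qualitative": "moderately uncertain, likely near nominal",
    "low": 950.0,     # optimistic weight
    "mode": 1000.0,   # most likely weight
    "high": 1120.0,   # pessimistic weight
}

def sample_weight(assessment, n=10_000, seed=1):
    """Draw Monte Carlo samples from a triangular distribution
    defined by the expert's low/mode/high assessment."""
    rng = random.Random(seed)
    return [
        rng.triangular(assessment["low"], assessment["high"], assessment["mode"])
        for _ in range(n)
    ]

samples = sample_weight(elicited)
mean = sum(samples) / len(samples)
p90 = sorted(samples)[int(0.9 * len(samples))]
print(f"{elicited['component']}: mean = {mean:.1f} kg, 90th percentile = {p90:.1f} kg")
```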
A conceptual design weight estimation problem for a simplified launch vehicle model is used as an initial test case. Additional refinements to the methodology are made as a result of this test case and of ongoing feedback from the expert. The refined methodology is then demonstrated on a more complex, full-size launch vehicle model.
The results for the full-size launch vehicle model suggest that the methodology is a practical and useful approach for addressing uncertainty in decision analysis. As presented here, the methodology is well suited to a decision domain that encompasses the conceptual design of a complex system. The generic nature of the methodology makes it readily adaptable to other decision domains.
A follow-up evaluation conducted with multiple experts serves as a validation of the methodology. The results of this evaluation suggest that the methodology is useful and that its definitions and features exhibit consistency and external validity.
DOI
10.25777/cy4v-g241
ISBN
9780591603903
Recommended Citation
Monroe, Richard W.
"A Synthesized Methodology for Eliciting Expert Judgment for Addressing Uncertainty in Decision Analysis"
(1997). Doctor of Philosophy (PhD), Dissertation, Engineering Management & Systems Engineering, Old Dominion University, DOI: 10.25777/cy4v-g241
https://digitalcommons.odu.edu/emse_etds/99
Included in
Artificial Intelligence and Robotics Commons, Operational Research Commons, Risk Analysis Commons