The problem of handling dependent evidence is an important practical issue for applications of reasoning under uncertainty in artificial intelligence. Existing solutions to the problem are unsatisfactory because of their ad hoc nature, their complexity, or their limitations. In this dissertation, we develop a general framework that can be used...
Three families of exact methods are used for probabilistic inference in
belief networks. It is necessary to compare them, to analyze the advantages
and disadvantages of each algorithm, and to know the time cost of making
inferences in a given belief network. This paper discusses the factors that
influence...
This paper investigates the feasibility of compiling the functionality of a decision-theoretic
problem-solving engine into a set of rules or a functionally similar construct.
The decision-theoretic engine runs in exponential time, while the rule set runs in
linear time at worst. The main question that will determine the...
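The compilation idea in the abstract above can be sketched as precomputing, offline, the best action for every possible evidence combination, so that runtime decision making reduces to a table lookup. The following is a minimal illustrative sketch; the decision problem, utilities, and all names (`compile_policy`, `decide`, etc.) are assumptions for exposition, not from the paper itself:

```python
from itertools import product

# Illustrative decision problem: two boolean evidence variables, two actions.
# UTILITY[action][(e1, e2)] = expected utility of `action` given that evidence.
UTILITY = {
    "treat": {(True, True): 10, (True, False): 6, (False, True): 4, (False, False): -2},
    "wait":  {(True, True): -5, (True, False): 1, (False, True): 3, (False, False): 8},
}

def compile_policy(utility):
    """Offline 'compilation': enumerate every evidence combination
    (exponential in the number of evidence variables) and record the
    utility-maximizing action for each, yielding a rule table."""
    rules = {}
    for evidence in product([True, False], repeat=2):
        rules[evidence] = max(utility, key=lambda a: utility[a][evidence])
    return rules

RULES = compile_policy(UTILITY)

def decide(e1, e2):
    """Runtime: a constant-time table lookup replaces utility maximization."""
    return RULES[(e1, e2)]
```

The exponential cost is paid once at compile time; the compiled rule table then answers each query in constant time, which mirrors the exponential-versus-linear trade-off the abstract describes.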
Reasoning about any realistic domain always involves a degree of uncertainty.
Probabilistic inference in belief networks is one effective way of reasoning under
uncertainty. Efficiency is critical in applying this technique, and many researchers
have been working on it. This thesis reports our research in this...
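As context for the efficiency concerns these abstracts raise, exact inference in a belief network can be sketched by enumeration over a toy two-node network. The network, its probabilities, and the function names below are illustrative assumptions, not drawn from any of the works summarized here:

```python
# Toy belief network: Rain -> WetGrass (illustrative numbers only).
# P(Rain=True) = 0.2
# P(WetGrass=True | Rain=True) = 0.9, P(WetGrass=True | Rain=False) = 0.1
P_RAIN = {True: 0.2, False: 0.8}
P_WET_GIVEN_RAIN = {True: {True: 0.9, False: 0.1},
                    False: {True: 0.1, False: 0.9}}

def joint(rain, wet):
    """Joint probability P(Rain=rain, WetGrass=wet) via the chain rule."""
    return P_RAIN[rain] * P_WET_GIVEN_RAIN[rain][wet]

def query_rain_given_wet(wet_obs):
    """Exact inference by enumeration: P(Rain | WetGrass=wet_obs).

    Enumerating all assignments of the unobserved variables is
    exponential in the number of variables, which is exactly the cost
    that efficient inference algorithms try to avoid."""
    unnormalized = {r: joint(r, wet_obs) for r in (True, False)}
    z = sum(unnormalized.values())
    return {r: p / z for r, p in unnormalized.items()}

posterior = query_rain_given_wet(True)
# P(Rain=True | Wet=True) = 0.2*0.9 / (0.2*0.9 + 0.8*0.1) = 0.18/0.26
```

On two variables this is trivial, but the same enumeration over n variables touches 2^n assignments; the exact methods compared in these works (e.g. clique-tree propagation and its relatives) exploit network structure to avoid that blow-up.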
Knowledge-Based Model Construction (KBMC) has attracted considerable attention
as a technique for generating probabilistic and decision-theoretic models, one that
has vastly increased the range of applicability of such models in AI. However, no
one has tried to analyze the essential issues in KBMC, to determine if there exists...