
CMSC 25000 - Artificial Intelligence
Winter 2004
Homework #6: Due February 24, 2004


Goals

Through this assignment you will:

Problem 1

Here we explore the issues surrounding reasoning under uncertainty. States of the world are described by assignments of values {x_i1, x_i2, ..., x_ik_i} to random variables {X_1, X_2, ..., X_n}. Over n binary random variables there are 2^n possible states, so with no knowledge of dependence or independence among the variables, specifying the full joint distribution requires 2^n - 1 independent probabilities.
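For concreteness, the state space over n binary variables can be enumerated directly. A minimal Python sketch (the class code itself is in Scheme; this is only to illustrate the counting argument):

```python
from itertools import product

def joint_states(n):
    """Enumerate all assignments to n binary random variables."""
    return list(product([False, True], repeat=n))

states = joint_states(3)
# 2^3 = 8 states; a full joint distribution over them needs
# 2^3 - 1 = 7 independent probabilities (they must sum to 1).
print(len(states))  # → 8
```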

Bayes nets represent dependence between random variables by directed arcs, where the child depends on its parents. This structure reduces the specification of a child's conditional distribution to (k-1)k^p entries, for a child with k possible values and p parents each taking k values. Even so, exact inference on a general Bayes net remains computationally very expensive.

In order to make inference more tractable, a simplifying assumption is introduced. Specifically, the 'noisy-or' model treats the probability of some event as the probability that at least one of its parents (or a leak) caused it, with each cause acting independently. This reduces the number of parameters for a binary child with ten parents from 2^10 = 1024 table entries to 11 (one causal probability per parent plus the leak).
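The noisy-or combination can be sketched as follows (Python for illustration, since the course code is Scheme; the causal probabilities and leak value here are made-up numbers, not ones from the Holmes scenario):

```python
def noisy_or(causal_probs, active, leak=0.0):
    """P(child = true) under a noisy-or: the child fires unless every
    active cause (and the leak) independently fails to trigger it."""
    p_all_fail = 1.0 - leak
    for p, on in zip(causal_probs, active):
        if on:
            p_all_fail *= 1.0 - p
    return 1.0 - p_all_fail

# Ten binary parents need only 10 causal probabilities + 1 leak = 11
# parameters, versus 2^10 = 1024 entries in a full conditional table.
print(noisy_or([0.9, 0.7], active=[True, True], leak=0.1))
# 1 - (1-0.1)(1-0.9)(1-0.7) = 1 - 0.027 ≈ 0.973
```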

The code for this problem can be found in the hw/hw6-code subdirectory of the class web site. Running (load "load-hw6.scm") should load all the relevant code. examples.scm demonstrates how to use "infer" and how to define Bayes nets to run inference on.

The code includes some simplifying assumptions: if you happen to assign two values to the same variable, it will simply take the later assignment rather than indicating a contradiction.

Part A: Probability Warmup

A new AIDS test has been developed that is 99.5% accurate. You are told that the current AIDS rate is 1 in 100000. What is the probability that someone has AIDS given that the test comes back positive?

Part B

Recall that the conditional probability of A given B is P(A|B) = P(A,B)/P(B). Implement a function conditional-probability that takes a Bayes net (e.g. mcbn1) and two lists of lists similar to those used by infer, as demonstrated in examples.scm. Demonstrate your function's operation on P(e=#t|a=#f).
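Your function should be written in Scheme against the class code, but the underlying idea, conditioning as a ratio of summed joint probabilities, can be sketched language-neutrally. A Python illustration with a hypothetical two-variable joint table standing in for what infer would compute:

```python
# Hypothetical joint distribution over two binary variables (A, B),
# standing in for the joint distribution a Bayes net defines.
joint = {
    (True, True): 0.08, (True, False): 0.02,
    (False, True): 0.32, (False, False): 0.58,
}

def prob(event):
    """Sum the joint over all states satisfying `event` (a predicate)."""
    return sum(p for state, p in joint.items() if event(state))

def conditional_probability(event_a, event_b):
    """P(A | B) = P(A, B) / P(B)."""
    return prob(lambda s: event_a(s) and event_b(s)) / prob(event_b)

# P(A = true | B = true) = 0.08 / (0.08 + 0.32) = 0.2
print(conditional_probability(lambda s: s[0], lambda s: s[1]))
```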

Part C: Extending Holmes

Here we extend Pearl's original Holmes scenario to provide an alternative cause for the burglar alarm to sound. Again based on Pearl's scenario (the appropriate Bayes net formulation for the initial state of the Holmes scenario can be found in the Feb. 19th class notes), we add the following information:

Holmes, always the anxious type, listens to KNX, "News Radio", while at work; KNX will report the occurrence of a sufficiently large quake 97% of the time. Assume that KNX never makes a false positive report of a quake. Holmes thinks that a burglary and an earthquake can each set off his burglar alarm in independent ways, making a noisy-or a good model. From his original belief about the probabilistic connection between burglary and alarm, Holmes can compute a causal probability from burglary to alarm and a leak probability for the alarm sounding in the absence of a burglary. He estimates the probability of a quake causing an alarm at 70%. The incidence of quakes in Los Angeles is equal to the incidence of burglaries.

Draw the updated Bayes Net corresponding to above set of beliefs. Use the original Holmes net from class as your starting point.

Note: The probabilities and features are not identical to the example in the book.

Part D

Define the network you just drew as a Bayes Net according to the bnet definition in examples.scm.

Part E

Using your new network, the conditional probability function you defined above, and your knowledge of Bayes' Rule, compute the probability that Holmes' house was actually burgled given that he receives calls from both Gibbons and Watson, but ALSO hears a report of a quake on the radio.