The simplest case occurs when a learning agent is given the structure of the model and all the variables have been observed. The agent must learn the conditional probabilities, $P(X_i \mid \mathrm{parents}(X_i))$, for each variable $X_i$. Learning the conditional probabilities is an instance of supervised learning, where $X_i$ is the target feature, and the parents of $X_i$ are the input features.
For cases with few parents, each conditional probability can be learned separately using the training examples and prior knowledge, such as pseudocounts.
[Figure 10.7: Model + Data ➪ Probabilities]
Figure 10.7 shows a typical example. We are given the model and the data, and we must infer the probabilities.
For example, one of the elements of $P(E \mid A, B)$ is

$$P(E{=}t \mid A{=}t \wedge B{=}f) = \frac{n_1 + c_1}{n_0 + c_0 + n_1 + c_1}$$

where $n_1$ is the number of cases where $E{=}t \wedge A{=}t \wedge B{=}f$, and $c_1 \ge 0$ is the corresponding pseudocount that is provided before any data is observed. Similarly, $n_0$ is the number of cases where $E{=}f \wedge A{=}t \wedge B{=}f$, and $c_0$ is the corresponding pseudocount.
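As a concrete illustration (a minimal sketch, not the book's code), the following Python function computes such an estimate from a dataset. The representation is an assumption: each example is a dict mapping variable names to values, and the name `estimate_prob` is hypothetical.

```python
def estimate_prob(data, target, target_val, parent_vals, pseudocounts):
    """Estimate P(target = target_val | parents = parent_vals).

    data: list of dicts mapping variable names to observed values
    parent_vals: dict fixing each parent to a value, e.g. {"A": True, "B": False}
    pseudocounts: dict mapping each value of target to its pseudocount
    """
    # Keep only the cases that match the parent assignment
    matching = [ex for ex in data
                if all(ex[p] == v for p, v in parent_vals.items())]
    # n_v + c_v for the value of interest
    numerator = sum(1 for ex in matching
                    if ex[target] == target_val) + pseudocounts[target_val]
    # Sum of n_v + c_v over all values of the target
    denominator = len(matching) + sum(pseudocounts.values())
    return numerator / denominator
```

For instance, with three cases where $A{=}t \wedge B{=}f$, two of which have $E{=}t$, and pseudocounts $c_0 = c_1 = 1$:

```python
data = [{"A": True, "B": False, "E": True},
        {"A": True, "B": False, "E": False},
        {"A": True, "B": False, "E": True}]
p = estimate_prob(data, "E", True, {"A": True, "B": False},
                  {True: 1, False: 1})
print(p)  # (2 + 1) / (3 + 2) = 0.6
```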
If a variable has many parents, using counts and pseudocounts can suffer from overfitting. Overfitting is most severe when there are few examples for some combinations of values of the parent variables. In that case, the supervised learning techniques of Chapter 7 can be used: decision trees handle arbitrary discrete variables, and logistic regression and neural networks can represent the conditional probability of a binary variable given its parents. For non-binary discrete variables, indicator variables may be used.
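As a sketch of the logistic-regression option (assuming scikit-learn is available; the data here is made up for illustration), the parents' values are encoded as a 0/1 feature vector, so the number of learned parameters grows with the number of parents rather than with the number of combinations of their values:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: each row holds the observed (binary) values
# of the three parents of X; x_values holds the observed value of X.
parents = np.array([[1, 0, 1],
                    [1, 1, 0],
                    [0, 0, 1],
                    [1, 0, 0],
                    [0, 1, 1]])
x_values = np.array([1, 1, 0, 1, 0])

model = LogisticRegression()
model.fit(parents, x_values)

# Estimate P(X = true | parents = (1, 0, 1))
print(model.predict_proba(np.array([[1, 0, 1]]))[0, 1])
```

Because the logistic model shares parameters across all parent contexts, it can produce a probability for combinations of parent values that never occur in the training data, which is exactly where the count-based estimates overfit.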