foundations of computational agents
Introductions to probability theory from an AI perspective, and to belief (Bayesian) networks, are given by Pearl, Jensen, Castillo et al., Koller and Friedman, and Darwiche. Halpern overviews the foundations of probability.
Brémaud describes the theory and applications of Markov chains. HMMs are described by Rabiner. Dynamic Bayesian networks were introduced by Dean and Kanazawa. Markov localization and other aspects of the relationship between probability and robotics are described by Thrun et al. The use of particle filtering for localization is due to Dellaert et al.
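The particle filtering approach to localization mentioned above can be illustrated with a small sketch. The following toy model (positions on a ring of 10 cells, with made-up motion and sensor noise values, none of which come from the text) shows the three steps of a bootstrap particle filter: predict through a noisy motion model, weight by the observation likelihood, and resample in proportion to the weights.

```python
import random

def particle_filter_step(particles, move, observe, rng):
    """One step of a bootstrap particle filter for a toy 1-D localization
    HMM on ring positions 0..9 (a hypothetical model for illustration).
    `move` is the intended displacement; `observe` is a noisy position reading."""
    # Predict: propagate each particle through a noisy motion model.
    particles = [(p + move + rng.choice([-1, 0, 0, 1])) % 10 for p in particles]
    # Weight: likelihood of the observation given each particle's position
    # (sharp peak at the observed cell, some mass on its ring neighbors).
    weights = [0.8 if p == observe
               else 0.1 if abs(p - observe) % 10 in (1, 9)
               else 0.01
               for p in particles]
    # Resample: draw a new population in proportion to the weights.
    return rng.choices(particles, weights=weights, k=len(particles))

rng = random.Random(1)
particles = [rng.randrange(10) for _ in range(500)]  # uniform prior over positions
for move, obs in [(1, 3), (1, 4), (1, 5)]:
    particles = particle_filter_step(particles, move, obs, rng)
# After three consistent observations, the particles concentrate near position 5.
```

Resampling is what distinguishes particle filtering from plain importance sampling over trajectories: it discards low-weight particles each step, so the population tracks the posterior rather than degenerating.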
Manning and Schütze, and Jurafsky and Martin, present probabilistic and statistical methods for natural language. The topic model of Example 8.37 is based on Google's Rephil, described by Murphy.
For introductions to stochastic simulation, see Rubinstein and Andrieu et al. Likelihood weighting in belief networks is based on Henrion. Importance sampling in belief networks is based on Cheng and Druzdzel, who also consider how to learn the proposal distribution. There is a collection of articles on particle filtering in Doucet et al.
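The likelihood weighting method referred to above can be sketched on a tiny belief network. The network and all numbers below are hypothetical (a standard sprinkler-style example, not taken from the text): each non-evidence variable is sampled from its conditional distribution, and the sample is weighted by the probability of the evidence given its parents.

```python
import random

# Hypothetical toy network: Rain -> Wet <- Sprinkler.
P_RAIN = 0.2
P_SPRINKLER = 0.3
# P(Wet = true | Sprinkler, Rain), keyed by (sprinkler, rain).
P_WET = {(True, True): 0.99, (True, False): 0.9,
         (False, True): 0.8, (False, False): 0.01}

def likelihood_weighting(n_samples, seed=0):
    """Estimate P(Rain = true | Wet = true) by likelihood weighting:
    sample the non-evidence variables (Rain, Sprinkler) from their
    distributions, and weight each sample by the probability of the
    evidence (Wet = true) given the sampled values of its parents."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        rain = rng.random() < P_RAIN
        sprinkler = rng.random() < P_SPRINKLER
        w = P_WET[(sprinkler, rain)]  # weight = P(evidence | parents)
        den += w
        if rain:
            num += w
    return num / den

estimate = likelihood_weighting(50000)
```

For this network the exact posterior is P(Rain | Wet) = 0.1854 / (0.1854 + 0.1976) ≈ 0.484, so the estimate should converge to that value. Importance sampling generalizes this by drawing from an arbitrary proposal distribution rather than the network's own conditionals.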
The annual Conference on Uncertainty in Artificial Intelligence (UAI), and the general AI conferences, provide up-to-date research results.