foundations of computational agents
Jannach et al. [2021] provide an overview of recommender systems. The Netflix Prize and the best-performing algorithms are described by Bell and Koren [2007] and Jahrer et al. [2010]. The collaborative filtering algorithm is based on Koren et al. [2009]; see also Koren and Bell [2011]. The MovieLens datasets are described by Harper and Konstan [2015] and available from http://grouplens.org/datasets/movielens/. Jannach and Bauer [2020] explain why good recommendations require more than optimizing an easy-to-optimize criterion.
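The matrix-factorization approach of Koren et al. [2009] predicts a rating as a global mean plus user and item biases plus a dot product of learned latent vectors, trained by stochastic gradient descent. A minimal sketch, with invented toy ratings and untuned hyperparameters:

```python
import random

# Hypothetical ratings for illustration: (user, item, rating) triples.
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0),
           (1, 2, 1.0), (2, 1, 2.0), (2, 2, 5.0)]
n_users, n_items, k = 3, 3, 2   # k latent features per user and item
lr, reg = 0.05, 0.02            # learning rate and regularization

random.seed(0)
mu = sum(r for _, _, r in ratings) / len(ratings)   # global mean rating
bu = [0.0] * n_users                                # user biases
bi = [0.0] * n_items                                # item biases
p = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_users)]
q = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_items)]

def predict(u, i):
    return mu + bu[u] + bi[i] + sum(p[u][f] * q[i][f] for f in range(k))

for _ in range(200):            # SGD over the observed ratings
    for u, i, r in ratings:
        e = r - predict(u, i)   # prediction error for this rating
        bu[u] += lr * (e - reg * bu[u])
        bi[i] += lr * (e - reg * bi[i])
        for f in range(k):
            pu, qi = p[u][f], q[i][f]
            p[u][f] += lr * (e * qi - reg * pu)
            q[i][f] += lr * (e * pu - reg * qi)

rmse = (sum((r - predict(u, i)) ** 2
            for u, i, r in ratings) / len(ratings)) ** 0.5
```

After training, predictions for unobserved (user, item) pairs come from the same `predict` function; the regularization term keeps the latent vectors from overfitting the few observed ratings.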
The polyadic decomposition by Hitchcock [1927] was used in knowledge graph completion by Trouillon et al. [2016]. Its use with inverses for knowledge graph prediction was independently proposed by Kazemi and Poole [2018] and Lacroix et al. [2018]. Fatemi et al. [2020] extend this method for relations with multiple arguments.
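The polyadic (canonical) decomposition scores a triple (h, r, t) as a trilinear product of embedding vectors, and the extension with inverses averages that score with the score of the inverse triple (t, r⁻¹, h), as in Kazemi and Poole's SimplE. A sketch with random, untrained embeddings; the entity names and dimension are purely illustrative:

```python
import random

random.seed(1)
k = 4  # embedding dimension (illustrative)

entities = ["alice", "bob", "chess"]
relations = ["likes"]

# Each entity gets separate head and tail embeddings; each relation gets
# a forward and an inverse embedding (the SimplE parameterization).
head = {e: [random.gauss(0, 1) for _ in range(k)] for e in entities}
tail = {e: [random.gauss(0, 1) for _ in range(k)] for e in entities}
rel = {r: [random.gauss(0, 1) for _ in range(k)] for r in relations}
rel_inv = {r: [random.gauss(0, 1) for _ in range(k)] for r in relations}

def cp_score(h_vec, r_vec, t_vec):
    # Polyadic (CP) decomposition: the score is a trilinear product.
    return sum(h * r * t for h, r, t in zip(h_vec, r_vec, t_vec))

def score(h, r, t):
    # Average the forward triple with the inverse triple (t, r^-1, h).
    fwd = cp_score(head[h], rel[r], tail[t])
    inv = cp_score(head[t], rel_inv[r], tail[h])
    return (fwd + inv) / 2

s = score("alice", "likes", "chess")
```

In practice the embeddings are learned by gradient descent so that observed triples score higher than corrupted ones; only the scoring function is sketched here.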
Statistical relational AI is overviewed by De Raedt et al. [2016]. Plate models are due to Buntine [1994], who used them to characterize learning. Latent Dirichlet allocation and plate models of language are due to Blei et al. [2003].
Li et al. [2016] discuss truth discovery from crowdsourcing, which can be more general than the Boolean case presented here. Van den Broeck et al. [2021] provide an introduction to lifted inference, which allows inference without grounding.
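A common truth-discovery scheme for the Boolean case alternates between estimating each item's truth by a reliability-weighted vote and re-estimating each worker's reliability from agreement with those estimates. The sketch below uses invented crowd votes and is a generic illustration, not the specific algorithms surveyed by Li et al. [2016]:

```python
# votes[worker][item] = reported Boolean answer. Hypothetical crowd data.
votes = {
    "w1": {"q1": True,  "q2": True,  "q3": False},
    "w2": {"q1": True,  "q2": False, "q3": False},
    "w3": {"q1": False, "q2": True,  "q3": True},
}
items = ["q1", "q2", "q3"]
reliability = {w: 0.8 for w in votes}  # initial trust in each worker

for _ in range(10):
    # Estimate each item's truth by a reliability-weighted vote.
    truth = {}
    for it in items:
        w_true = sum(reliability[w] for w in votes if votes[w][it])
        w_false = sum(reliability[w] for w in votes if not votes[w][it])
        truth[it] = w_true >= w_false
    # A worker's reliability is its agreement with the current estimates.
    for w in votes:
        agree = sum(votes[w][it] == truth[it] for it in items)
        reliability[w] = agree / len(items)
```

Here w1 agrees with the weighted majority on every item, so its reliability rises to 1 and its votes dominate subsequent rounds; workers that often disagree are down-weighted.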
Probabilistic logic programming was proposed by Poole [1993] and implemented in ProbLog [De Raedt et al., 2007]. De Raedt et al. [2008] and Getoor and Taskar [2007] provide collections of papers that overview probabilistic relational models and how they can be learned. Domingos and Lowd [2009] discuss Markov logic networks and how (undirected) relational models can provide a common target representation for AI. Pujara et al. [2015] discuss how statistical relational AI techniques and ontological constraints are used for making predictions on knowledge graphs. Probabilistic soft logic is described by Bach et al. [2017]. Relational dependency networks are due to Neville and Jensen [2007].
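Under the distribution semantics shared by Poole's probabilistic logic programs and ProbLog, a query's probability is the total probability of the possible worlds (truth assignments to the probabilistic facts) in which the query holds. A brute-force sketch with an invented two-fact program; real systems avoid this exponential enumeration:

```python
from itertools import product

# ProbLog-style probabilistic facts "p::fact" (hypothetical example).
prob_facts = {"burglary": 0.1, "earthquake": 0.2}

def alarm(world):
    # Deterministic rules: alarm :- burglary.  alarm :- earthquake.
    return world["burglary"] or world["earthquake"]

def query_prob(query):
    # Sum the probabilities of the possible worlds where the query holds.
    total = 0.0
    facts = list(prob_facts)
    for values in product([True, False], repeat=len(facts)):
        world = dict(zip(facts, values))
        w_prob = 1.0
        for f in facts:
            w_prob *= prob_facts[f] if world[f] else 1 - prob_facts[f]
        if query(world):
            total += w_prob
    return total

p = query_prob(alarm)  # 1 - (1 - 0.1) * (1 - 0.2) = 0.28
```

Lifted inference, as introduced by Van den Broeck et al. [2021], aims to compute such probabilities without enumerating (or even grounding) the exponentially many worlds.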