Third edition of Artificial Intelligence: foundations of computational agents, Cambridge University Press, 2023 is now available (including the full text).

9.1.1 Factored Utility

Utility, as defined, is a function of outcomes or states. Often too many states exist to represent this function directly in terms of states, and it is easier to specify it in terms of features.

Suppose each outcome can be described in terms of features X1,...,Xn. An additive utility is one that can be decomposed into a sum of factors:

u(X1,...,Xn) = f1(X1) + ... + fn(Xn).

Such a decomposition is making the assumption of additive independence.

When this can be done, it greatly simplifies preference elicitation, the problem of acquiring preferences from the user. Note that this decomposition is not unique, because adding a constant to one of the factors and subtracting it from another factor gives the same utility. To put this decomposition into canonical form, we can have, for each feature Xi, a local utility function ui(Xi) that has a value of 0 for the value of Xi in the worst outcome and 1 for the value of Xi in the best outcome, and a set of weights, wi, that are non-negative numbers summing to 1, such that

u(X1,...,Xn) = w1×u1(X1) + ... + wn×un(Xn).

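As a minimal sketch of this canonical form, the following computes u as a weighted sum of local utilities. The feature names, local utility values, and weights below are illustrative assumptions, not taken from the text:

```python
def additive_utility(outcome, local_utils, weights):
    """Compute u(X1,...,Xn) = w1*u1(X1) + ... + wn*un(Xn).

    outcome: dict mapping feature name -> value
    local_utils: dict mapping feature name -> function from value to [0, 1]
    weights: dict mapping feature name -> non-negative weight (weights sum to 1)
    """
    return sum(weights[f] * local_utils[f](outcome[f]) for f in weights)

# Hypothetical travel example with two features, "hotel" and "flight".
local_utils = {
    "hotel": lambda v: {"hostel": 0.0, "3-star": 0.6, "5-star": 1.0}[v],
    "flight": lambda v: {"red-eye": 0.0, "daytime": 1.0}[v],
}
weights = {"hotel": 0.4, "flight": 0.6}

u = additive_utility({"hotel": "3-star", "flight": "daytime"}, local_utils, weights)
# u = 0.4*0.6 + 0.6*1.0 = 0.84
```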
To elicit such a utility function requires eliciting each local utility function and assessing the weights. Each feature, if it is relevant, must have a best value and a worst value. The local functions and weights can be assessed as follows. Consider just X1; the other features can then be treated analogously. For values x1 and x1' of X1, and fixed values x2,...,xn for X2,...,Xn:

u(x1,x2,...,xn) - u(x1',x2,...,xn) = w1 × (u1(x1) - u1(x1')).     (9.1.1)

The weight w1 can be derived by choosing x1 to be the best value for X1 and x1' the worst value (because then u1(x1) - u1(x1') = 1). The values of u1 for the other values in the domain of X1 can then be computed using Equation (9.1.1), taking x1' to be the worst value (as then u1(x1') = 0).

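The elicitation procedure above can be sketched as follows, assuming additive independence and access to a utility oracle `u`; the feature domains and the oracle itself are illustrative assumptions:

```python
def elicit_local(u, domain, best, worst, rest):
    """Derive the weight and local utility function for one feature.

    u: utility oracle taking (x, *rest)
    domain: the values of the feature
    best, worst: the feature's best and worst values
    rest: fixed values for the remaining features
    """
    # w1 = u(best, rest) - u(worst, rest), since u1(best) - u1(worst) = 1.
    w = u(best, *rest) - u(worst, *rest)
    # u1(x) = (u(x, rest) - u(worst, rest)) / w, since u1(worst) = 0.
    local = {x: (u(x, *rest) - u(worst, *rest)) / w for x in domain}
    return w, local

# Hypothetical additive oracle over two features, with w1 = 0.3.
def u(x1, x2):
    u1 = {0: 0.0, 1: 0.5, 2: 1.0}
    u2 = {0: 0.0, 1: 1.0}
    return 0.3 * u1[x1] + 0.7 * u2[x2]

w1, u1 = elicit_local(u, domain=[0, 1, 2], best=2, worst=0, rest=(1,))
# recovers w1 ≈ 0.3 and u1(1) ≈ 0.5
```

Note that the recovered values do not depend on the choice of `rest` exactly because the oracle is additive; with interacting features, different choices of `rest` would give different answers.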
Additive independence is a strong assumption. In particular, Equation (9.1.1) requires the difference in utilities to be the same for all values x2,...,xn of X2,...,Xn.

Additive independence is often not a good assumption. Two values of two binary features are complements if having both is better than the sum of having each one alone. Suppose the features are X and Y, with domains {x0,x1} and {y0,y1}. Values x1 and y1 are complements if getting one when the agent has the other is more valuable than getting one when the agent does not have the other:

u(x1,y0)-u(x0,y0) < u(x1,y1) - u(x0,y1).

Note that this implies y1 and x1 are also complements.

Two values for binary features are substitutes if having both is not worth as much as the sum of having each one. If values x1 and y1 are substitutes, it means that getting one when the agent has the other is less valuable than getting one when the agent does not have the other:

u(x1,y0)-u(x0,y0) > u(x1,y1) - u(x0,y1).

This implies y1 and x1 are also substitutes.

Example 9.3: For a purchasing agent in the travel domain, having a plane booking for a particular day and a hotel booking for the same day are complements: one without the other does not give a good outcome.

Two different outings on the same day would be substitutes, assuming the person taking the holiday would enjoy one outing, but not two, on the same day. However, if the two outings are in close proximity to each other and require a long traveling time, they may be complements (the traveling time may be worth it if the person gets two outings).

Additive utility assumes there are no substitutes or complements. When there is interaction, a more sophisticated model is required, such as a generalized additive independence model, which represents utility as a sum of factors, where each factor is a function of a subset of the features. This is similar to the optimization models of Section 4.10; however, here we want to use these models to compute expected utility. Eliciting a generalized additive independence model is much more involved than eliciting an additive model, because a feature can appear in many factors.
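A generalized additive independence model can be sketched as a sum of table-based factors, each over a subset of the features, with a feature allowed to appear in several factors. The features and factor tables below are illustrative assumptions:

```python
def gai_utility(outcome, factors):
    """Compute utility as a sum of factors over subsets of features.

    outcome: dict mapping feature name -> value
    factors: list of (scope, table) pairs, where scope is a tuple of
    feature names and table maps a tuple of their values to a number.
    """
    return sum(table[tuple(outcome[f] for f in scope)]
               for scope, table in factors)

# Plane booking (X), hotel booking (Y), outing (Z); X appears in two
# factors, capturing its interaction with both Y and Z.
factors = [
    (("X", "Y"), {(0, 0): 0.0, (1, 0): 0.1, (0, 1): 0.1, (1, 1): 0.6}),
    (("X", "Z"), {(0, 0): 0.0, (1, 0): 0.0, (0, 1): 0.1, (1, 1): 0.4}),
]
u = gai_utility({"X": 1, "Y": 1, "Z": 1}, factors)
# u = 0.6 + 0.4
```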