AI is the study of intelligent agents that receive percepts and perform actions. The book covers logic, probability, perception, reasoning, learning, and action. Its main theme is the intelligent agent, which maps percept sequences to actions. Requires sophomore-level computer science knowledge
Bayesian network represents variables and their dependencies via a directed acyclic graph. Nodes represent variables; edges show conditional dependencies. Nodes that are not connected represent variables that are conditionally independent of each other. Each node has a probability function conditioned on its parent variables
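The factorization a Bayesian network encodes can be sketched in a few lines. This is a minimal illustration using the classic rain/sprinkler/grass-wet structure; the network shape and all probability values are illustrative assumptions, not taken from the text.

```python
# Hypothetical three-node network: Rain -> Sprinkler, and (Rain, Sprinkler) -> GrassWet.
# Each node stores P(node=True | parents), keyed by its parents' truth values.
P_rain = 0.2
P_sprinkler = {True: 0.01, False: 0.4}            # P(Sprinkler=T | Rain)
P_wet = {(True, True): 0.99, (True, False): 0.9,  # P(Wet=T | Sprinkler, Rain)
         (False, True): 0.8, (False, False): 0.0}

def joint(rain, sprinkler, wet):
    """Chain rule over the DAG: P(R,S,W) = P(R) * P(S|R) * P(W|S,R)."""
    p = P_rain if rain else 1 - P_rain
    p *= P_sprinkler[rain] if sprinkler else 1 - P_sprinkler[rain]
    pw = P_wet[(sprinkler, rain)]
    p *= pw if wet else 1 - pw
    return p

# Sanity check: the eight joint probabilities sum to 1.
total = sum(joint(r, s, w) for r in (True, False)
                           for s in (True, False)
                           for w in (True, False))
print(round(total, 10))  # 1.0
```

The point of the structure is that each node needs only a table over its parents, rather than one entry per full joint assignment.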
Joint distribution gives the probability of each possible pair of outputs of two random variables. Can be expressed as a multivariate CDF or a joint probability density/mass function. Marginal distributions encode each variable's outputs individually
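Marginalizing a discrete joint distribution is just summing out the other variable. A small sketch with a made-up joint probability mass function over two binary variables:

```python
# Hypothetical joint pmf P(X, Y) over X, Y in {0, 1}; values are illustrative.
joint_pmf = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Marginals: P(X=x) = sum over y of P(X=x, Y=y), and symmetrically for Y.
marginal_x = {}
marginal_y = {}
for (x, y), p in joint_pmf.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p
    marginal_y[y] = marginal_y.get(y, 0.0) + p

print(marginal_x, marginal_y)
```

Note that the marginals alone do not determine the joint: many joint distributions share the same marginals.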
Bayes' theorem inverts conditional probabilities: from the probability of an effect given a cause, it yields the probability of the cause given the effect. Named after Thomas Bayes, whose work was published posthumously in 1763 by Richard Price, who edited it and contributed significantly. Pierre-Simon Laplace independently developed the Bayesian interpretation in 1774
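The inversion can be shown with a worked example. This sketch applies P(A|B) = P(B|A) P(A) / P(B) to a hypothetical diagnostic test; the prevalence, sensitivity, and false-positive rate are invented for illustration.

```python
# Hypothetical diagnostic test: 1% prevalence, 99% sensitivity, 5% false positives.
p_disease = 0.01
p_pos_given_disease = 0.99   # P(positive | disease)
p_pos_given_healthy = 0.05   # P(positive | no disease)

# Law of total probability: P(positive).
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(disease | positive) = P(positive | disease) P(disease) / P(positive).
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))  # 0.1667
```

Despite the accurate test, a positive result implies only about a 1-in-6 chance of disease, because the low prior dominates.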
Range is the size of the smallest interval containing all data values. Calculated as the difference between the largest and smallest values. Provides an indication of statistical dispersion. Most useful for small data sets
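The calculation above is a one-liner; a minimal sketch on made-up data:

```python
# Range of a small data set: difference between the largest and smallest values.
data = [4, 7, 1, 9, 3]
data_range = max(data) - min(data)
print(data_range)  # 9 - 1 = 8
```

Because it depends only on the two extreme values, the range is highly sensitive to outliers, which is why it is mainly useful for small data sets.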
Unbiased estimator's precision (reciprocal of variance) is at most the Fisher information; this is the Cramér-Rao bound. Fisher information is the variance of the score, the derivative of the natural logarithm of the likelihood function. For twice-differentiable likelihoods, Fisher information can also be written as the negative expected second derivative of the log-likelihood
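The definition via the score can be checked on a concrete model. A sketch for a single Bernoulli(p) observation, where the expected squared score should match the known closed form 1 / (p(1-p)); the choice of model and of p is illustrative.

```python
# Fisher information of Bernoulli(p), computed as E[(d/dp log f(X; p))^2].
p = 0.3

def score(x, p):
    # Derivative of log f(x; p) = x*log(p) + (1-x)*log(1-p) with respect to p.
    return x / p - (1 - x) / (1 - p)

# Expectation over the two outcomes X in {0, 1}.
fisher_info = p * score(1, p) ** 2 + (1 - p) * score(0, p) ** 2
closed_form = 1 / (p * (1 - p))
print(round(fisher_info, 6), round(closed_form, 6))
```

The Cramér-Rao bound then says any unbiased estimator of p from n samples has variance at least p(1-p)/n, i.e. precision at most n times this Fisher information.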