Probably Approximately Correct

Leslie Valiant

Professor of Computer Science and Applied Mathematics, Harvard University. Recipient of the Turing Award.

Notes (numbers are page references):

  • 1 "Unlike most algorithms, [ecorithms] can be run in environments unknown to the designer and they learn by interacting with the environment how to act effectively in it. After sufficient interaction, they will have expertise not provided by the designer, but extracted from the environment."
  • 3 "Learning is done by concrete mechanisms that can be understood by methods of computer science."
  • 43 "Our powers to specify what we wish to compute are greater than the expressive power of computation itself"
  • 49 "Learning is achieved in many steps that are plausible but innocuous when viewed one by one in isolation"
  • 52 "Ecorithms are also algorithms but they have an important additional nature. They use generic learning techniques to acquire knowledge from their environment so that they can perform effectively in the environment from which they have learned."
  • 52 "One now needs to analyze not only the algorithm itself but also the algorithm's relationship with its environment."
  • 53 "Aristotle said there are two forms of argument: syllogistic and inductive. Here I interpret these words to mean that if one has a certain belief the belief was arrived at either by logical deduction (syllogism) from things already believed, or by induction (generalization) from particular experiences."
  • 61 "The first assumption [for PAC learning, and induction generally] is the Invariance Assumption: the context in which the generalization is to be applied cannot be fundamentally different from that in which it was made."
  • 61 "This assumption [Invariance Assumption] requires that the functional relationships and the probability distribution that characterizes how frequently different situations arise remain somehow constant over time - it requires that there are some regularities that remain true."
  • 61 "The second assumption is the Learnable Regularity Assumption. It is essential to require that any useful criterion or regularity be detectable: whether the criterion applies to an instance should be resolvable by a feasible computation."
  • 61 "To explain induction it is also necessary to explain how an individual can acquire the detection algorithm for the regularity in the first place."
  • 70 "The Learnable Regularity Assumption substantively constrains what learning algorithms can do."
  • 77 "I believe the primary stumbling block that prevents humans from being able to learn more complex concepts at a time than they can, is the computational difficulty of extracting regularities from moderate amounts of data, rather than the need for inordinate amounts of data."
  • 85 "The number of examples needed for PAC learning is proportional only to the small number of critical features...and depends only much more weakly on the possibly numerous irrelevant, distracting ones."
  • 95 "The question of whether a specific function...has a higher performance than another function is determined by complex factors relating to both the current state of the species as well as the current environment."
  • 149 "I believe the attempt to make a thinking machine will help us greatly in finding out how we think ourselves." - Alan Turing
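
The notes from pages 61–85 concern how a learner can extract a detectable regularity from feasibly many examples. A minimal sketch of this is Valiant's elimination algorithm for learning a monotone conjunction in the PAC setting; the function names and toy data below are illustrative, not taken from the book:

```python
import random

def learn_monotone_conjunction(examples):
    """Elimination algorithm: begin by assuming every feature is
    required by the target conjunction, then drop any feature that
    is off in some positively labeled example.
    `examples` is a list of (feature_vector, label) pairs."""
    n = len(examples[0][0])
    required = set(range(n))
    for x, label in examples:
        if label:
            required = {i for i in required if x[i]}
    return required

def predict(required, x):
    """The learned hypothesis: positive iff every surviving feature is on."""
    return all(x[i] for i in required)

# Toy demonstration: the target concept depends on only 2 of 20 features.
random.seed(0)
n, target = 20, {0, 2}
sample = []
for _ in range(200):
    x = [random.random() < 0.5 for _ in range(n)]
    sample.append((x, all(x[i] for i in target)))

hypothesis = learn_monotone_conjunction(sample)
```

A truly relevant feature is on in every positive example, so it can never be eliminated, while each positive example eliminates (in expectation) half of the surviving irrelevant features; this is one concrete way in which a modest number of examples can pin down the few critical features among many distractors, as the p. 85 note observes.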