No Free Lunch
Let's start by thinking about search algorithms.
Notation:
- $X$ is a countable search space (e.g. a set of parameter settings)
- objective function $f : X \to Y$ that evaluates a particular choice $x \in X$
- dataset $d_m = \{d_m^X, d_m^Y\}$ is the set of $m$ pairs $(x, f(x))$ evaluated so far (i.e. what you've tried so far)
- search algorithm $A$ simply appends to this dataset (i.e. continues the search, ideally finding $x$'s that lead to smaller $f(x)$)
- arbitrary performance measure $\Phi : d_m^Y \to \mathbb{R}$ (i.e. evaluates how good a search algorithm is) — see the code sketch after this list
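To make the notation concrete, here is a minimal Python sketch of these objects. Everything in it is illustrative rather than taken from any library: `run_search` plays the role of the search loop, `propose` is the algorithm $A$, and `performance` is one possible choice of $\Phi$.

```python
# Minimal sketch of the objects above; every name here is illustrative and not
# tied to any particular library. x is taken to be a float for simplicity,
# standing in for an element of the search space X.
import random
from typing import Callable, List, Tuple

Dataset = List[Tuple[float, float]]        # d_m: the (x, f(x)) pairs tried so far


def run_search(
    f: Callable[[float], float],           # objective f : X -> Y
    propose: Callable[[Dataset], float],   # search algorithm A: picks the next x given d_m
    m: int,                                # number of search steps
) -> Dataset:
    """Run a search algorithm for m steps, appending one (x, f(x)) pair per step."""
    d: Dataset = []
    for _ in range(m):
        x = propose(d)                     # A only sees what has been tried so far
        d.append((x, f(x)))                # the dataset grows by one evaluation
    return d


def performance(d: Dataset) -> float:
    """One possible performance measure Phi: the smallest value observed in d_m^Y."""
    return min(y for _, y in d)


def random_search(d: Dataset) -> float:
    """The simplest search algorithm: ignore the dataset and guess uniformly."""
    return random.uniform(-10.0, 10.0)
```

In this picture a search algorithm is nothing more than a choice of `propose`; `random_search` is the degenerate case that ignores the dataset entirely.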
Suppose we want to find the minimum of $f$. Gradient descent is then exactly this kind of search algorithm: a rule that proposes the next step in the search, given what has been evaluated so far.
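As a sketch of that view, gradient descent can be written as a proposal rule of the same shape; `gradient_descent_proposer`, `grad_f`, `x0`, and `step` are illustrative names and parameters, not anything fixed by the text.

```python
# Gradient descent cast as a "propose the next point" rule: given the pairs
# tried so far, step downhill from the most recent point.
from typing import Callable, List, Tuple


def gradient_descent_proposer(
    grad_f: Callable[[float], float],      # derivative of the objective f
    x0: float,                             # where the search starts
    step: float = 0.1,                     # learning rate
) -> Callable[[List[Tuple[float, float]]], float]:
    """Return a search algorithm A: a function from the dataset so far to the next x."""
    def propose(d: List[Tuple[float, float]]) -> float:
        if not d:
            return x0                      # nothing tried yet: query the starting point
        x_last, _ = d[-1]
        return x_last - step * grad_f(x_last)   # next proposal: one downhill step
    return propose


# Plugged into the generic loop sketched earlier, e.g.:
#   propose = gradient_descent_proposer(grad_f=lambda x: 2 * (x - 3.0), x0=0.0)
#   d = run_search(lambda x: (x - 3.0) ** 2, propose, m=20)   # walks toward x = 3
#   performance(d)                                            # approaches 0
```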
Key Points:
- Averaged uniformly over all possible objective functions $f$, every search algorithm $A$ produces the same distribution over the observed values $d_m^Y$, and hence the same expected performance $\Phi$: no search algorithm outperforms another without assumptions about $f$.
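Stated formally, this is the standard Wolpert–Macready form of the theorem for search (reconstructed here from the notation above rather than quoted):

```latex
% No Free Lunch for search (Wolpert & Macready, 1997): for any two search
% algorithms A_1 and A_2 and any number of steps m, summing over all possible
% objective functions f gives identical distributions over the observed values.
\sum_{f} P\left(d_m^Y \mid f, m, A_1\right) = \sum_{f} P\left(d_m^Y \mid f, m, A_2\right)
```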