The blessing and the curse of the multiplicative updates - discusses connections between in vitro selection and the multiplicative updates of online learning
Presenter
March 27, 2012
Keywords:
- Biology
MSC:
- 92C15
Abstract
Multiplicative updates multiply the parameters by
nonnegative factors. These updates are motivated by
a Maximum Entropy Principle, and they are prevalent in evolutionary
processes where the parameters are, for example,
concentrations of species and the factors are survival rates.
The simplest such update is Bayes rule, and we give
an in vitro selection algorithm for RNA strands that
implements this rule in the test tube, where
each RNA strand represents a different model. In one liter of the RNA soup there are approximately 10^20 different strands,
and therefore this is a rather high-dimensional implementation of Bayes rule.
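(For concreteness, a minimal sketch in notation of our own choosing, not taken from the talk: a multiplicative update rescales a weight vector $w_1,\dots,w_n$ by nonnegative factors $f_1,\dots,f_n$ and renormalizes,
\[
  w_i \;\leftarrow\; \frac{w_i \, f_i}{\sum_{j=1}^{n} w_j \, f_j},
\]
and Bayes rule is the special case where $w_i$ is the prior of model $M_i$, the factor $f_i = P(x \mid M_i)$ is its likelihood on the data $x$, and the updated weight is the posterior $P(M_i \mid x)$.)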
We investigate multiplicative updates for the purpose
of learning online while processing a stream of examples.
The ``blessing'' of these updates is that they learn very fast
because the good parameters grow exponentially.
However, their ``curse'' is that they learn too fast and wipe out parameters too quickly. We describe a number of
methods developed in the realm of online learning
that ameliorate the curse of these updates.
The methods make the algorithm robust against data
that changes over time and prevent the currently good
parameters from taking over.
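(As an illustration of the flavor of such a method, and not necessarily one of the techniques covered in the talk: a well-known remedy, in the spirit of the Fixed-Share algorithm of Herbster and Warmuth, mixes a small fraction $\alpha \in (0,1)$ of the uniform distribution back into the weights after every multiplicative update,
\[
  \tilde{w}_i \;=\; \frac{w_i \, f_i}{\sum_j w_j \, f_j},
  \qquad
  w_i \;\leftarrow\; (1-\alpha)\,\tilde{w}_i + \frac{\alpha}{n}.
\]
Because every weight then stays at least $\alpha/n$, no parameter is ever wiped out completely, and a parameter that becomes good again after the data changes can recover quickly.)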
We also discuss how the curse is circumvented by nature.
Some of nature's methods parallel the ones
developed in Machine Learning, but nature also has some additional tricks.
This will be a high-level talk. No background in online
learning or evolutionary biology will be required.