More work on coevolving recurrent neurons:
F. Gomez and J. Schmidhuber.
Co-evolving recurrent neurons learn deep memory POMDPs.
In Proc. of the 2005 conference on genetic and
evolutionary computation (GECCO), Washington, D. C.,
pp. 1795-1802, ACM Press, New York, NY, USA, 2005.
Nominated for Best Paper in Coevolution.
PDF.
Simultaneously evolves networks at two levels of granularity:
full networks and neurons. Applied to
POMDP learning tasks that require creating
short-term memories spanning up to thousands of time steps, the method is
faster and simpler than the previous best conventional
reinforcement learning systems.
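To give the flavor of the approach, here is a minimal Python sketch of neuron-level cooperative coevolution (an illustrative simplification, not the paper's actual implementation; the sizes and the toy fitness function are assumptions):

```python
import numpy as np

N_NEURONS, POP_SIZE, GENE_LEN = 4, 20, 10   # illustrative sizes
rng = np.random.default_rng(0)
# one subpopulation of candidate weight vectors per hidden neuron
subpops = rng.normal(size=(N_NEURONS, POP_SIZE, GENE_LEN))

def evaluate(network):
    # placeholder fitness; the real task would run a POMDP episode
    return -float(np.sum(network ** 2))

total = np.zeros((N_NEURONS, POP_SIZE))
count = np.zeros((N_NEURONS, POP_SIZE))

for _ in range(200):
    # assemble a network by drawing one member from each subpopulation
    idx = rng.integers(POP_SIZE, size=N_NEURONS)
    net = subpops[np.arange(N_NEURONS), idx]
    f = evaluate(net)
    # credit assignment: a neuron inherits the fitness of its teams
    total[np.arange(N_NEURONS), idx] += f
    count[np.arange(N_NEURONS), idx] += 1

neuron_fitness = total / np.maximum(count, 1)
```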
Related work on fast weights:
J. Schmidhuber. Learning to
control fast-weight memories: An alternative to recurrent nets.
Neural Computation, 4(1):131-139, 1992.
PDF.
HTML.
Compare pictures (German).
A slowly changing, gradient-based feedforward neural net learns to quickly
manipulate short-term memory in fast synapses of another net.
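A minimal sketch of the fast-weight idea, assuming a plain additive write rule (variable names and dimensions are illustrative, not the paper's exact update):

```python
import numpy as np

rng = np.random.default_rng(1)
IN, HID, FAST_IN, FAST_OUT = 5, 16, 4, 3    # illustrative sizes

# slow net parameters (would be trained slowly by gradient descent)
W1 = rng.normal(scale=0.1, size=(HID, IN))
W2 = rng.normal(scale=0.1, size=(FAST_OUT * FAST_IN, HID))

fast_W = np.zeros((FAST_OUT, FAST_IN))      # short-term memory

def step(x_slow, x_fast):
    global fast_W
    h = np.tanh(W1 @ x_slow)
    # the slow net emits an update for every fast synapse
    delta = (W2 @ h).reshape(FAST_OUT, FAST_IN)
    fast_W = fast_W + delta                 # write to short-term memory
    return np.tanh(fast_W @ x_fast)         # the fast net reads it

y = step(rng.normal(size=IN), rng.normal(size=FAST_IN))
```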
More fast weights:
J. Schmidhuber.
Reducing the ratio between learning complexity and number of
time-varying variables in fully recurrent nets.
In Proc. ICANN'93, Amsterdam, pages 460-463. Springer, 1993.
PDF.
HTML.
Short-term memory in fast synapses can, in a certain sense, be more efficient
than short-term memory in recurrent connections: a fully recurrent net with n
units has on the order of n^2 learnable weights but only n time-varying
activations, whereas fast weights make the n^2 synapses themselves
time-varying, reducing the ratio between learning complexity and the number
of time-varying variables.
A related co-evolution method called COSYNE:
F. Gomez, J. Schmidhuber, R. Miikkulainen.
Accelerated Neural Evolution through
Cooperatively Coevolved Synapses.
Journal of Machine Learning Research (JMLR),
9:937-965, 2008.
PDF.
F. Gomez, J. Schmidhuber, R. Miikkulainen.
Efficient Non-Linear Control through Neuroevolution.
In Proc. of the European Conference
on Machine Learning (ECML), Berlin, 2006.
PDF.
A new, general method that outperforms many others
on difficult control tasks.
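A rough sketch of synapse-level cooperative coevolution in the spirit of CoSyNE (a simplification of the JMLR 2008 algorithm; the sizes, selection scheme, and toy fitness are illustrative assumptions):

```python
import numpy as np

N_WEIGHTS, POP = 30, 40                     # illustrative sizes
rng = np.random.default_rng(2)
P = rng.normal(size=(POP, N_WEIGHTS))       # row i = candidate network i

def evaluate(w):
    # placeholder; the real task would run a control episode
    return -float(np.sum((w - 0.5) ** 2))

for gen in range(100):
    fit = np.array([evaluate(row) for row in P])
    P = P[np.argsort(-fit)]                 # best networks first
    k = POP // 4                            # replace the worst quarter
    P[-k:] = P[:k] + rng.normal(scale=0.1, size=(k, N_WEIGHTS))
    # key CoSyNE-style step: shuffle each weight column independently,
    # so every synapse joins a different network in the next generation
    # (the original permutes probabilistically, protecting the best)
    for j in range(N_WEIGHTS):
        P[:, j] = rng.permutation(P[:, j])
```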
More recent work of 2013:
Compressed Network Search Finds Complex Neural Controllers with a Million Weights,
which learn to drive without a teacher from raw high-dimensional video input.
Related work on evolution for supervised sequence learning:
Evolino (2005),
a class of learning algorithms for
supervised RNNs that outperformed
previous methods.
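A minimal sketch of the Evolino principle, assuming a plain tanh RNN in place of the evolved LSTM cells of the original: only the recurrent and input weights are evolved, while the linear output weights are computed analytically by least squares, so fitness is the residual error after an optimal linear readout. (The hill climber below stands in for a real evolutionary algorithm.)

```python
import numpy as np

rng = np.random.default_rng(3)
N_IN, N_HID, T = 1, 10, 200                 # illustrative sizes
t = np.linspace(0, 8 * np.pi, T)
xs = np.sin(t)[:, None]                     # toy input sequence
ys = np.sin(t + 0.5)[:, None]               # toy target sequence

def fitness(genome):
    W_in = genome[:N_HID * N_IN].reshape(N_HID, N_IN)
    W_rec = genome[N_HID * N_IN:].reshape(N_HID, N_HID)
    h = np.zeros(N_HID)
    H = np.zeros((T, N_HID))
    for step in range(T):
        h = np.tanh(W_in @ xs[step] + W_rec @ h)
        H[step] = h
    # optimal linear readout by least squares, not by evolution
    W_out, *_ = np.linalg.lstsq(H, ys, rcond=None)
    return -float(np.mean((H @ W_out - ys) ** 2))

# trivial (1+1)-style hill climber, standing in for the evolution
g = rng.normal(scale=0.3, size=N_HID * (N_IN + N_HID))
best = fitness(g)
for _ in range(100):
    cand = g + rng.normal(scale=0.05, size=g.shape)
    f = fitness(cand)
    if f > best:
        g, best = cand, f
```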
Related work on Compressed Network Evolution (1995-):
Many practical algorithms can evolve hundreds of
adaptive parameters, but not millions. Ours can, by evolving compact, compressed
descriptions (programs) of huge networks; see the sketch after the references below.
J. Koutnik, G. Cuccu, J. Schmidhuber, F. Gomez.
Evolving Large-Scale Neural Networks for Vision-Based Reinforcement Learning.
In Proceedings of the Genetic and Evolutionary
Computation Conference (GECCO), Amsterdam, 2013.
PDF.
J. Koutnik, F. Gomez, J. Schmidhuber.
Searching for Minimal Neural Networks in Fourier Space.
The 3rd Conference on Artificial General Intelligence (AGI-10), 2010.
PDF.
J. Schmidhuber.
Discovering solutions with low Kolmogorov complexity
and high generalization capability.
In A. Prieditis and S. Russell, editors, Machine Learning:
Proceedings of the Twelfth International Conference (ICML 1995),
pages 488-496. Morgan
Kaufmann Publishers, San Francisco, CA, 1995.
PDF.
HTML.
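A minimal sketch of the compressed encoding, assuming a simple two-dimensional inverse DCT decoder (illustrative only; the papers above use more elaborate schemes): a handful of evolved low-frequency coefficients expand into a full weight matrix, so a million weights can be described by a few dozen search parameters.

```python
import numpy as np

def idct_1d(c, n):
    # inverse DCT-II: expand len(c) low-frequency coefficients to n values
    k = np.arange(len(c))
    t = np.arange(n)
    basis = np.cos(np.pi * (t[:, None] + 0.5) * k[None, :] / n)
    return basis @ c

def decode(coeffs, rows, cols, k):
    # map k*k evolved coefficients to a rows x cols weight matrix
    C = coeffs.reshape(k, k)
    tmp = np.stack([idct_1d(C[i], cols) for i in range(k)])        # (k, cols)
    return np.stack([idct_1d(tmp[:, j], rows)
                     for j in range(cols)], axis=1)                # (rows, cols)

rng = np.random.default_rng(4)
K = 4                                   # 16 evolved numbers ...
coeffs = rng.normal(size=K * K)
W = decode(coeffs, 1000, 1000, K)       # ... describe a million weights
print(W.shape)                          # (1000, 1000)
```

An evolutionary algorithm would then search the small coefficient space, decoding each candidate into a full network for fitness evaluation.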