Sunday, November 27, 2016

Evolving Spiking Neural Networks: Growth of Learning Machines

J. David Schaffer gave an excellent talk on evolving Spiking Neural Networks at the Center for Collective Dynamics of Complex Systems (CoCo) Seminar Series.

Many of today's neural network applications are based on multi-layer implementations of the perceptron. A perceptron implements a neuron model that sums weighted inputs and applies a non-linear activation function to compute its output. Although this model deviates considerably from how biological neural networks operate, the approach works well and is still in use today. Spiking Neural Networks (SNNs), in contrast, are a type of neural network that adds realism to the simulation by introducing a time dimension into the model. Unlike perceptron networks, which compute their output at every propagation cycle, an SNN neuron fires only when its membrane potential reaches a threshold, so information is encoded in the timing of the spikes.
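
The difference is easy to see in code. Below is a minimal sketch (my own illustration, not code from the talk): a step-activation perceptron next to a leaky integrate-and-fire neuron, one of the simplest spiking models. All names and parameter values are chosen purely for illustration.

```python
import numpy as np

def perceptron(inputs, weights):
    """Weighted sum followed by a non-linear activation (here: a step function)."""
    return 1.0 if np.dot(inputs, weights) > 0.0 else 0.0

def lif_spike_times(input_current, threshold=1.0, leak=0.1, dt=1.0):
    """Leaky integrate-and-fire neuron (illustrative parameters): integrate the
    input over time, emit a spike when the membrane potential reaches the
    threshold, then reset. Information is carried by *when* spikes occur."""
    v, spikes = 0.0, []
    for t, i_t in enumerate(input_current):
        v += dt * (i_t - leak * v)   # leaky integration of the input current
        if v >= threshold:
            spikes.append(t)         # the spike time carries the information
            v = 0.0                  # reset the membrane potential after firing
    return spikes

print(perceptron(np.array([0.5, 0.2]), np.array([1.0, -0.4])))  # one output per propagation cycle
print(lif_spike_times([0.3] * 20))  # spike times depend on the input's strength and timing
```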

J. David Schaffer shows how a genetic algorithm can be applied to generate an SNN for a given problem. To this end, the SNN is encoded as a chromosome, i.e. mapped onto a binary string, which is then evolved through mutation and recombination.
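
A bare-bones version of such an evolutionary loop might look like the sketch below. The genome length, selection scheme, and fitness function are placeholders standing in for the problem-specific SNN decoding and evaluation described in the talk, not Schaffer's actual setup.

```python
import random

GENOME_LEN, POP_SIZE, GENERATIONS = 64, 30, 50

def decode_and_evaluate(genome):
    """Placeholder fitness: in practice, the bit string would be decoded into
    an SNN (connectivity, weights, delays) and scored on the target problem.
    Here we simply reward a trivial bit pattern so the sketch runs."""
    return sum(genome)

def mutate(genome, rate=0.01):
    """Flip each bit with a small probability."""
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    """One-point recombination of two parent bit strings."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    scored = sorted(population, key=decode_and_evaluate, reverse=True)
    parents = scored[:POP_SIZE // 2]                      # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]  # recombination + mutation
    population = parents + children

best = max(population, key=decode_and_evaluate)
print("best fitness:", decode_and_evaluate(best))
```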

Evolving Spiking Neural Networks: Growth of Learning Machines from Complex Systems on Vimeo

