Questions arose as to whether a von Neumann Self-Referential Machine could evolve from a simple self-copying machine in a different set-up. The von Neumann model has the advantage of enabling new and more varied ways to change the system upon mutation - but what could be the evolutionary pressure for a von Neumann architecture to evolve in the first place?
Current experiments did not include sexual reproduction - could that facilitate the evolution of more complex architectures?
More from the group can be found at http://evosym.rince.ie/
Modern software systems suffer from increasing complexity. Large software systems are composed of many interlinked components, and the internal states of these systems contain a huge amount of information. The main obstacles lie in the lack of reliability and robustness, which leads to poor performance. For complex, intractable problems, a plain random search (Monte-Carlo method) does not perform well; new and advanced approaches are necessary to deal with this complexity.
Milan Tuba proposed a guided Monte-Carlo search method based on a hybrid of reinforcement learning and a genetic algorithm. As a proof of concept, the approach is applied to the problem of information retrieval on the internet.
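The talk did not spell out the exact hybrid, but the general idea of guiding a Monte-Carlo search with reward feedback can be sketched in a few lines. The illustration below uses a PBIL-style scheme (a well-known genetic-algorithm/reinforcement-learning hybrid) on the toy OneMax problem; the parameters and fitness function are assumptions for the sake of the example, not the authors' actual method:

```python
import random

def guided_search(fitness, n_bits=20, pop_size=20, generations=50,
                  lr=0.1, seed=1):
    """PBIL-style guided Monte-Carlo search: instead of sampling
    uniformly at random, a per-bit probability vector is shifted
    toward the best sample of each generation (reward feedback)."""
    rng = random.Random(seed)
    probs = [0.5] * n_bits          # start as an unbiased random search
    best, best_fit = None, float("-inf")
    for _ in range(generations):
        # Monte-Carlo sampling, biased by the learned probability vector
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)
        leader = pop[0]
        if fitness(leader) > best_fit:
            best, best_fit = leader, fitness(leader)
        # reinforcement update: move the sampling distribution toward the leader
        probs = [(1 - lr) * p + lr * b for p, b in zip(probs, leader)]
    return best, best_fit

# Example: maximize the number of ones in a bit string (OneMax)
best, fit = guided_search(sum)
```

The contrast with an unguided Monte-Carlo search is the probability update: without it, every generation samples from the same uniform distribution and no knowledge from earlier samples is reused.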
The discussion centered on how the algorithm performs in comparison to commercial search providers like Google.
An entity works as an independent agent following its given policies. An important feature could be the ability to predict failures in a given subsystem.
According to Peter Csermely's EMCSR'12 keynote lecture, we have to watch for signs like slower recovery, increased self-similarity, and increased variance of fluctuation patterns in order to predict a system's transition into a state where it cannot handle its environment with its current policies.
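Such early-warning indicators can be computed from a time series of system observations. The sketch below (my own illustration, not taken from the lecture) measures two of the mentioned signs over a sliding window: increased variance of fluctuations, and slower recovery approximated by lag-1 autocorrelation:

```python
def warning_signs(series, window=50):
    """Compute two early-warning indicators over the most recent window:
    variance (increased fluctuations) and lag-1 autocorrelation
    (a proxy for slower recovery, a.k.a. critical slowing down)."""
    w = series[-window:]
    n = len(w)
    mean = sum(w) / n
    var = sum((x - mean) ** 2 for x in w) / n
    # high lag-1 autocorrelation means consecutive observations stay
    # similar, i.e. the system recovers slowly from perturbations
    num = sum((w[i] - mean) * (w[i + 1] - mean) for i in range(n - 1))
    ac1 = num / (n * var) if var > 0 else 0.0
    return var, ac1
```

A rising trend in either indicator over successive windows would be the kind of sign to watch for, rather than any single absolute value.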
In the crisis management approach proposed by van Splunter and van Veelen, a subsystem is supposed to issue a warning when local adaptation fails to handle the problem, e.g. when a small team of firefighters cannot confine a fire to their assigned area.
Carlos Gershenson told us about "Living in living cities". One of the challenges of the 21st century is preventing problems in the ultra-fast-growing cities all over the world.
These are non-stationary (changing) problems, for which traditional algorithms do not work well. Such challenges include urban mobility, logistics, telecommunications, governance, safety, sustainability, and society and culture. A solution is to exploit the properties of living systems - adaptive, learning, evolving, robust, autonomous, self-repairing, and self-reproducing - and to understand cities metaphorically as organisms.
Engineering methods cannot find a single, final solution to such changing problems. Instead, it is necessary to constantly adapt the solution - that is, to have a self-organizing solution to a complex problem. Will cities become the "killer app" of cybernetics and systems research?
Discussion arose around the following issues:
But how do you get the officers and officials of a city to cooperate? A strong motivation is needed to overcome the inertia of the system.
Could cities instead be built from scratch? No, because of legacy issues - it is not possible to just tear down a large city and build it anew every few decades.
Can we prove that the system is robust against malicious behavior? Difficult since such a complex system cannot be easily predicted for all sets of possible inputs.
Are explicit measures necessary, or would people themselves take care of the necessary adaptations? Relying on the latter alone would not increase the living standard for the people.
The approach presented by Sobe, Elmenreich, and del Fabro allows people to share their self-generated content, like photos or short videos, instantly at social events. Existing platforms like Flickr or YouTube do not provide this liveness, since most content is uploaded with a delay of a few days. The proposed approach organizes the content using an artificial hormone system. The hormone distribution is sensitive to the quality of a network connection and therefore reflects the quality of service of a network path. The system is based solely on local decisions for forwarding, replicating, and moving content. Over time, the content distribution in the network is optimized to support short response times for requesters. Simulations show that the system competes well with other epidemic information dissemination methods such as Gossip.
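As a rough illustration of such purely local, hormone-guided decisions, here is a minimal sketch (a drastic simplification of my own, not the authors' actual system): requests deposit hormone for a content item, hormone diffuses to neighbors and evaporates over time, and each node moves its content toward the strongest local demand signal:

```python
class HormoneNode:
    """One network node in a toy artificial hormone system."""

    def __init__(self, name):
        self.name = name
        self.neighbors = []
        self.hormone = {}      # content id -> hormone level at this node
        self.content = set()

    def request(self, item, amount=1.0):
        # a local request deposits hormone, signaling demand here
        self.hormone[item] = self.hormone.get(item, 0.0) + amount

    def diffuse(self, rate=0.25):
        # spread a fraction of the hormone to direct neighbors,
        # so demand becomes visible a few hops away
        for item, level in list(self.hormone.items()):
            share = level * rate / max(len(self.neighbors), 1)
            for n in self.neighbors:
                n.hormone[item] = n.hormone.get(item, 0.0) + share
            self.hormone[item] = level * (1 - rate)

    def evaporate(self, rate=0.1):
        # hormone decays, so stale demand loses influence
        for item in self.hormone:
            self.hormone[item] *= (1 - rate)

    def step(self):
        # purely local decision: move each held item toward demand
        for item in list(self.content):
            best = max(self.neighbors + [self],
                       key=lambda n: n.hormone.get(item, 0.0))
            if best is not self:
                self.content.discard(item)
                best.content.add(item)
```

In a small line topology a-b-c with content initially at a and requests arriving at c, repeated request/diffuse/evaporate/step rounds migrate the content to c without any node ever having a global view of the network.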
The follow-up discussion brought up interesting questions:
How is overhead reflected in the simulation? Currently, overhead is implicitly modeled in the transmission cost, which is valid for a constant per-packet handling overhead.
Furthermore, the relation of the hormone-based approach to ant colony optimization (ACO) was discussed. We identified a major difference: in ACO, typically either the network or the content is assumed to be static. However, ACO could be extended to handle the described scenario, which might inspire future work.
What is the effect if tags are (more) complex? The system starts with a predefined tag hierarchy, which can be extended to a more complex one. However, with more complex tags there is no guarantee of finding content.
Wilfried Elmenreich gave a talk on evolving a distributed control algorithm for UAVs addressing a coverage problem. The problem of having multiple mobile agents cover (or, as we say in robotics, "sweep") an area is relevant for many applications like lawn mowing, snow removal, floor cleaning, environmental monitoring, communication assistance, and several military and security applications.
The work by Istvan Fehervari, Wilfried Elmenreich, and Evsen Yanmaz described a simple grid-based abstraction of the problem, which was used to test two evolved and one handcrafted control algorithm against reference algorithms like random walk and random direction.
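For illustration, the random-walk reference on such a grid abstraction can be sketched as follows (grid size, agent count, and step budget are assumptions for the example, not the paper's parameters):

```python
import random

def random_walk_coverage(width=10, height=10, n_agents=3, steps=300, seed=0):
    """Random-walk coverage baseline on a grid: each agent moves to a
    uniformly random neighboring cell every step; coverage is the
    fraction of cells visited at least once."""
    rng = random.Random(seed)
    agents = [(rng.randrange(width), rng.randrange(height))
              for _ in range(n_agents)]
    visited = set(agents)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(steps):
        for i, (x, y) in enumerate(agents):
            dx, dy = rng.choice(moves)
            # clamp to the grid; a blocked move leaves the agent in place
            nx = min(max(x + dx, 0), width - 1)
            ny = min(max(y + dy, 0), height - 1)
            agents[i] = (nx, ny)
            visited.add((nx, ny))
    return len(visited) / (width * height)
```

An evolved or handcrafted controller would replace the `rng.choice(moves)` line with a policy that maps local observations to a movement decision; everything else in such a benchmark can stay the same, which is what makes the grid abstraction convenient for comparisons.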
A short summary of Wilfried's talk and the slides are available here.
The talk triggered an interesting discussion involving the comparison with the "belief-based" algorithm. A further question, suggesting future work, concerns the influence of the layout and number of sensors. What happens if the environment changes? Since the algorithm keeps no memory of a map, a changing environment does not affect the result.
- B. McMullin and T. Hasegawa. Von Neumann Redux: Revisiting the Self-referential Logic of Machine Reproduction Using the Avida World. In R. M. Bichler, S. Blachfellner, and W. Hofkirchner, editors, European Meeting on Cybernetics and Systems Research Book of Abstracts, Vienna, Austria, April 2012.
- V. Sesum-Cavic, M. Tuba, and S. Rankow. The Influence of Self-Organization on Reducing Complexity in Information Retrieval. In R. M. Bichler, S. Blachfellner, and W. Hofkirchner, editors, European Meeting on Cybernetics and Systems Research Book of Abstracts, Vienna, Austria, April 2012.
- S. van Splunter and B. van Veelen. Coordination and Self-Organisation in Crisis Management. In R. M. Bichler, S. Blachfellner, and W. Hofkirchner, editors, European Meeting on Cybernetics and Systems Research Book of Abstracts, Vienna, Austria, April 2012.
- C. Gershenson. Living in Living Cities. In R. M. Bichler, S. Blachfellner, and W. Hofkirchner, editors, European Meeting on Cybernetics and Systems Research Book of Abstracts, Vienna, Austria, April 2012.
- A. Sobe, W. Elmenreich, and M. del Fabro. Self-organizing content sharing at social events. In R. M. Bichler, S. Blachfellner, and W. Hofkirchner, editors, European Meeting on Cybernetics and Systems Research Book of Abstracts, Vienna, Austria, April 2012.
- I. Fehérvári, W. Elmenreich, and E. Yanmaz. Evolving a team of self-organizing UAVs to address spatial coverage problems. In R. M. Bichler, S. Blachfellner, and W. Hofkirchner, editors, European Meeting on Cybernetics and Systems Research Book of Abstracts, Vienna, Austria, April 2012.