Design, Utilization, Self-adaptation and Control


Strasbourg Complex Systems Roadmap


The emergent behaviour of complex systems is highly desirable, because emergence is characterized by the fact that the whole is more than the sum of its parts.

In Computer Science, for instance, getting n computers to work together on a single problem usually yields a speedup < n, because the n computers must communicate and synchronize to exchange their partial results.

If the n computers implement a complex system, one can hope not only for a speedup of n, but possibly for a supralinear speedup, > n. This behaviour is regularly observed when evolutionary algorithms are implemented on n computers.
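A common way to distribute an evolutionary algorithm over n machines is the island model: each machine evolves its own population and periodically exchanges its best individuals with its neighbours. The sketch below is a minimal, sequential illustration of this structure (all names, parameters, and the OneMax toy fitness are illustrative assumptions, not part of the EASEA platform); in a real deployment each island would run on its own machine or GPGPU.

```python
import random

def one_max(bits):
    """Toy fitness: number of 1-bits (maximum = len(bits))."""
    return sum(bits)

def evolve_island(pop, n_gens, mut_rate=0.05):
    """Evolve one island's population with tournament selection
    and per-bit mutation."""
    for _ in range(n_gens):
        new_pop = []
        for _ in range(len(pop)):
            a, b = random.sample(pop, 2)          # tournament of size 2
            parent = max(a, b, key=one_max)
            child = [bit ^ (random.random() < mut_rate) for bit in parent]
            new_pop.append(child)
        pop = new_pop
    return pop

def island_model(n_islands=4, pop_size=20, genome_len=32,
                 epochs=10, gens_per_epoch=5, migrants=2, seed=0):
    """Island-model GA: islands evolve independently, then exchange
    their best individuals along a ring topology."""
    random.seed(seed)
    islands = [[[random.randint(0, 1) for _ in range(genome_len)]
                for _ in range(pop_size)] for _ in range(n_islands)]
    for _ in range(epochs):
        islands = [evolve_island(pop, gens_per_epoch) for pop in islands]
        # Ring migration: copies of each island's best replace the
        # worst individuals of the next island.
        for i, pop in enumerate(islands):
            best = sorted(pop, key=one_max, reverse=True)[:migrants]
            dest = islands[(i + 1) % n_islands]
            dest.sort(key=one_max)
            dest[:migrants] = [b[:] for b in best]
    return max(one_max(ind) for pop in islands for ind in pop)
```

Because migration keeps islands loosely coupled, communication cost grows slowly with n, which is one reason distributed evolutionary algorithms can exhibit the favourable speedups mentioned above.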

In catalysis, the control of the nature and strength of the interactions between the catalytically active sites allows the tuning of the properties of the system, for example the reaction selectivity, the emergence of dynamic behaviour, or the formation of self-organized spatial patterns.
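The emergence of dynamic behaviour from coupled reaction rates can be illustrated with the Brusselator, a classic abstract model of an oscillating reaction (used here as an illustrative stand-in, not as a model of any specific catalytic system discussed on this page). A minimal explicit-Euler integration, with assumed parameter values, shows how concentrations settle onto sustained oscillations:

```python
def brusselator(a=1.0, b=3.0, x0=1.0, y0=1.0, dt=0.01, steps=5000):
    """Integrate the Brusselator rate equations with explicit Euler:
        dx/dt = a - (b + 1) x + x^2 y
        dy/dt = b x - x^2 y
    For b > 1 + a^2 the fixed point (a, b/a) is unstable and the
    concentration x settles onto a limit cycle (sustained oscillation).
    Returns the time series of x."""
    x, y = x0, y0
    xs = []
    for _ in range(steps):
        dx = a - (b + 1.0) * x + x * x * y
        dy = b * x - x * x * y
        x, y = x + dt * dx, y + dt * dy
        xs.append(x)
    return xs
```

With a = 1 and b = 3 the stability condition b > 1 + a² holds, so after a transient the concentration x oscillates indefinitely rather than converging to a steady state.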

The same holds in Chemistry, Biology, Physics, and beyond. This page is therefore about designing complex systems, with the objective of using the designed system and benefiting from its complex behaviour. Ideally, synthesized complex systems should show self-adaptation, and one should be able to control them: to keep them within desired bounds and, possibly, to guide them towards the behaviour we would like them to exhibit.



Artificial Complex Systems, Ant Colony Optimization, Evolutionary Optimization, Particle Swarm Optimization, Artificial Immune Systems, Artificial Evolution, Artificial Chemistries, Reaction-Diffusion, Genetic Algorithms, Genetic Programming, Evolution Strategies, Surface Chemistry, Bistable and Oscillating Reactions, Electrochemistry


Complex functional materials
Image analysis
Emergent algorithms
Sharing knowledge and skills

Objects on which emergent algorithms can be used

Optimization problems
Design problems
Emerging Complexity in Supramolecular Systems
Protein Networks


Adaptation and control


Massively Parallel Evolutionary Optimisation platform
  • The EASEA platform is currently developed in Strasbourg. It runs on a cluster of GPGPU machines totalling 15,360 GPGPU cores, on which supralinear speedups have been observed, and is used for research on the development of complex systems on massively parallel computing architectures.

