Self-Organizing Systems (SOS) FAQ

Frequently Asked Questions Version 2.1 November 1998

For USENET Newsgroup comp.theory.self-org-sys

Index

  1. Introduction
  2. Systems
  3. Edge of Chaos
  4. Selection
  5. Interconnections
  6. Structure
  7. Research
  8. Resources
  9. Miscellaneous

1. Introduction

1.1 Science of Self-Organizing Systems

1.2 Definition of Self-Organization

2. Systems

2.1 What is a system ?

2.2 What is a system property ?

2.3 What is emergence ?

2.4 What is organization ?

2.5 What is state or phase space ?

2.6 What is self-organization ?

2.7 Can things self-organize ?

2.8 What is an attractor ?

2.9 What is a pre-image ?

2.10 How do attractors and self-organization relate ?

3. Edge of Chaos

3.1 What is criticality ?

3.2 What is Self-Organized Criticality (SOC) ?

3.3 What is the 'Edge of Chaos' (EOC) ?

3.4 What is a phase change ?

3.5 How does percolation relate to SOC ?

3.6 What is a power law ?

4. Selection

4.1 Isn't this just the same as selection ?

4.2 How does natural selection fit in ?

4.3 What is a mutant neighbour ?

4.4 What is an adaptive walk ?

4.5 What is a fitness landscape ?

5. Interconnections

5.1 How many parts are necessary for self-organization ?

5.2 What is feedback ?

5.3 What interconnections are necessary ?

5.4 What is a Boolean Network or NK model ?

5.5 What are canalysing functions and forcing structures ?

5.6 How does connectivity affect landscape shape ?

5.7 What is an NKC Network ?

5.8 What is an NKCS Network ?

5.9 What is an autocatalytic set ?

6. Structure

6.1 What are levels of organization ?

6.2 How is energy related to these concepts ?

6.3 How does it relate to chaos ?

6.4 What are dissipative systems ?

6.5 What is bifurcation ?

6.6 What are autopoiesis, extropy and suchlike ?

7. Research

7.1 How can self-organization be studied ?

7.2 What results are there so far ?

Some of these results are tentative, and subject to change as more research is undertaken and these systems become better understood. Many of these results are expanded and justified by Stuart Kauffman in his recent lecture notes, see: The Nature of Autonomous Agents.

  1. The attractors of a system are uniquely determined by the state transition properties of the nodes (their logic) and the actual system interconnections (a small enumeration sketch appears after this list).
  2. Attractors result in the merging of historical positions, so irreversibility is inherent in the concept. Many scenarios can result in the same outcome; it is therefore impossible, even in theory, to deduce uniquely that a state arose from one particular predecessor (backward causality). Merging of world lines in this way invalidates, in general, determination of the specific pre-image of any state.
  3. The ratio of the basin of attraction size to attractor size (called here SOF) varies from the size of the whole state space (totally ordered, point attractor) down to 1 (totally disordered, ergodic attractor).
  4. Single connectivity mutations can considerably alter the attractor structure of networks, allowing attractors to merge, split or change sequences. Basins of attraction are also altered and initial points may then flow to different attractors.
  5. Single state mutations can move a system from one of its attractors to another. The resultant behaviour can change between fixed, chaotic, periodic and complex in any combination of the available attractors, and the effect can be predicted if the system details are fully known.
  6. The mutation space of a system with 2 alleles at each node is a Boolean hypercube of dimension N, so each state has N one-mutant neighbours. The number of adaptive peaks for random systems is 2 ** N / (N+1), exponentially high.
  7. The chance of reaching a random higher peak halves with each step; after 30 steps it is about 1 in a billion. The time required scales in the same way. The mean length of an adaptive walk to a nearby peak is ln N. Branching walks are common initially, but most end on local optima (dead ends). This makes finding a single 'maximum fitness' peak an NP-hard problem. Correlated landscapes are necessary for adaptive improvement (see the adaptive-walk sketch after this list).
  8. Correlation falls exponentially with mutant difference (Hamming distance), becoming fully uncorrelated for K=N-1 landscapes. Searches beyond the correlation length (1/e) sample random landscapes. Hence the number of recombination 'tries' needed to find a higher peak doubles with each success.
  9. For such systems with high connectivity, the median number of attractors is N/e (linear), and the median number of states within an attractor is about 0.5 * root(2 ** N) (exponentially large). These systems are highly sensitive to disturbance, and swap amongst the attractors easily.
  10. For K=0, there is a smooth landscape with one peak (the global optimum). The length of an adaptive walk is N/2, with the number of uphill directions decreasing by one at each step.
  11. For K=1, median attractor numbers are exponential in N, and attractor lengths increase only as root N, but again these systems are sensitive to disturbance and easily swap between attractors.
  12. For K=2 we have a phase transition: the median number of attractors drops to root N, and the average attractor length is also root N. The system is stable to disturbance and has few paths between the attractors. Most perturbations return to the same attractor.
  13. Systems that are able to change their number of connections (by mutation) are found to move spontaneously from the chaotic (K high) or static (K low) regions to that of the phase transition and stability - a self-organized criticality. Fitness is found to peak at this point.
  14. Natural genetic systems with high connectivity K>2 have a higher proportion of canalysing functions than would be the case if randomly assigned. This suggests a selective bias towards functions that can support self-organization to the Edge of Chaos.
  15. To create a relatively smooth landscape requires redundancy, i.e. non-optimal systems. Maximal compression (efficiency) gives a rugged landscape and stagnation on a local peak, preventing improvement. The above suggests that systems alter their redundancy to maximise adaptability.
  16. The 'No Free Lunch' Theorem states that, averaged over all possible landscapes, no search technique is better than random. This suggests, if the theory of evolution is valid, that the landscape is correlated with the search technique. In other words, the organisms create their own smooth landscape - the landscape is 'designed' by the agents...
  17. If we measure the distance between two close points in phase space, and plot that with time, then for chaotic systems the distance will diverge, while for static systems it will converge onto an attractor. The slope gives a measure of the system stability (+ve is chaotic) and a zero value corresponds to the edge of chaos. This goes by the name of the Lyapunov exponent (one for each dimension); a logistic-map sketch estimating it follows this list. Other similar measures are also used (e.g. the Derrida plot for discrete systems).
  18. A network tends to contain an uneven distribution of attractors. Some are large and drain large basins of attraction, while others are small with few states in their corresponding basins.
  19. The basins of attraction of higher fitness peaks tend to be larger than those of lower optima at the critical point. Correlated landscapes occur, containing few peaks which are clustered together.
  20. As K increases, the height of the accessible peaks falls; this is the 'Complexity Catastrophe', which drives achievable fitness towards the landscape mean in the limit.
  21. Mutation pressure grows with system size. Beyond a critical point (dependent upon rate, size and selection pressure) it is no longer possible to achieve adaptive improvement. A 'Selection or Error Catastrophe' sets in and the system inevitably moves down off the fitness peak to a stable lower point, a sub-optimal shell. Limit = 2 * mutation rate * N ** 2 / MOD(selection pressure)
  22. For co-evolutionary networks, tuning K (local interactions) to match or exceed C (species interactions) brings the system to the optimum fitness, another SOC. This tuning helps optimise both species (symbiotic effects). Reducing the number S of interacting species (breaking dependencies - e.g. new niches) also improves overall fitness. K should be minimised but needs to increase for large S and C to obtain rapid convergence.
  23. In the phase transition region the system is generally divided into various areas of variable behaviour separated by fixed barriers of static components. Pathways or tendrils between the dynamic regions allow controlled propagation of information across the system. The number of islands is low (less than root N) and together they comprise about a fifth of the nodes.
  24. At the critical point, any size of perturbation can potentially cause any size of effect - it is impossible to predict the size of the effect from the size of the perturbation (for large, analytically intractable systems). A power law distribution is found over time, but the timing and size of any particular perturbation is indeterminate.
  25. Plotting the input entropy of a system gives a high value for chaotic systems, a low value for ordered systems and an intermediate value for complex systems. Variance of the input entropy is high for complex systems but low for both ordered and chaotic ones. This can be used to identify EOC behaviour.
  26. For a network of N nodes and E possible edges, as N grows the number of edge combinations will increase faster than the number of nodes. Given some probability of meaningful interactions, there will inevitably be a critical size at which the system will go from subcritical to supracritical behaviour, a SOC or autocatalysis. The relevant size is N = root( 1 / ( 2 * probability ) ) (a short numeric check follows this list).
  27. Since a metabolism is such an autocatalytic set, this implies that life will emerge as a phase transition in any sufficiently complex reaction system - regardless of chemical or other form.
  28. Given a supracritical set of existing products M, and potential products M' (M' > M), equilibrium constant constraints predict that the probability of producing members of the difference set M' - M is non-zero. Therefore there will be a gradient towards more diversity, in other words 'creativity', in any such system.
  29. Evaluating the above for the diversity we find on this planet shows that we have so far explored only an insignificant fraction of state space during the time the universe has existed. Thus the Universe is not yet in an equilibrium state and the standard assumptions of equilibrium statistical mechanics do not apply (e.g. the ergodic hypothesis).
  30. Protein diversity in the biosphere proves to be widely supracritical, yet the stability of cells requires partitioning to a subcritical but autocatalytic state. This balance suggests a limit to cell biochemical diversity and a self-organizing maintenance below that limit. This is related to the Error Catastrophe: too high a rate of innovation is not controllable by selection and leads to information loss, chaos and breakdown of the system.
  31. Two or more interacting autocatalytic sets that increase reproduction rates above that of either in isolation will grow preferentially. This is a form of trade or mutual assistance, an ecosystem in miniature.
  32. Such interacting sets can generate components that are not in either set, giving a higher level of joint operation, emergent novelty.
  33. If such innovation involves a cost, then the rate of innovation will be constrained by payback period. This is seen in economic analogues, where risk/profit forms a balance, as well as in ecological systems. Interactions must be net positive sum to be sustainable.
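
The attractor and basin structure in results 1-3 and 18 can be explored directly for very small networks. The following sketch (not part of the original FAQ: the use of Python, the node count, connectivity and random seed are all illustrative assumptions) builds a random Boolean network of N nodes with K inputs each, iterates every possible start state, and reports each attractor's cycle length and basin size. Rewiring a single connection or flipping one logic-table entry and re-running shows the kind of attractor and basin changes described in results 4 and 5.

    # Illustrative only: a tiny random Boolean (NK) network, small enough
    # to enumerate all 2**N states and map every basin of attraction.
    import random

    N, K = 8, 2
    random.seed(1)
    inputs = [random.sample(range(N), K) for _ in range(N)]           # wiring
    tables = [[random.randint(0, 1) for _ in range(2 ** K)]           # logic
              for _ in range(N)]

    def step(state):
        # each node looks up its next value from the states of its K inputs
        return tuple(tables[i][sum(state[j] << b
                                   for b, j in enumerate(inputs[i]))]
                     for i in range(N))

    basin = {}        # state -> index of the attractor it eventually reaches
    attractors = []   # list of attractor cycles
    for s0 in range(2 ** N):
        s = tuple((s0 >> i) & 1 for i in range(N))
        seen = []
        while s not in basin and s not in seen:
            seen.append(s)
            s = step(s)
        if s in basin:
            label = basin[s]
        else:                         # closed a new cycle on this trajectory
            attractors.append(seen[seen.index(s):])
            label = len(attractors) - 1
        for t in seen:
            basin[t] = label

    for i, cycle in enumerate(attractors):
        size = sum(1 for v in basin.values() if v == i)
        print("attractor %d: cycle length %d, basin %d states"
              % (i, len(cycle), size))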
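
Results 7, 8, 10 and 20 concern adaptive walks on NK fitness landscapes. The sketch below is again an illustrative assumption rather than anything taken from the FAQ: each site's fitness contribution depends on its own state and K others, and the walk moves through fitter one-mutant neighbours until it reaches a local optimum. Raising K shortens the walks and lowers the peaks reached, in line with results 7 and 20.

    # Illustrative adaptive walk on a random NK fitness landscape.
    import random

    N, K = 16, 4
    random.seed(2)
    neigh = [random.sample([j for j in range(N) if j != i], K)
             for i in range(N)]
    contrib = [dict() for _ in range(N)]   # lazily drawn random site fitnesses

    def site_fitness(i, genome):
        key = (genome[i],) + tuple(genome[j] for j in neigh[i])
        if key not in contrib[i]:
            contrib[i][key] = random.random()
        return contrib[i][key]

    def fitness(genome):
        return sum(site_fitness(i, genome) for i in range(N)) / N

    genome = [random.randint(0, 1) for _ in range(N)]
    steps = 0
    while True:
        f = fitness(genome)
        better = None
        for i in range(N):                 # examine the N one-mutant neighbours
            trial = genome[:]
            trial[i] ^= 1
            if fitness(trial) > f:
                better = trial
                break
        if better is None:                 # no fitter neighbour: local optimum
            break
        genome, steps = better, steps + 1

    print("local optimum after %d uphill steps, fitness %.3f"
          % (steps, fitness(genome)))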
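
Result 17 can be demonstrated on any simple dynamical system. A minimal sketch, using the logistic map purely as a stand-in (the map, parameter values and step count are assumptions, not taken from the FAQ), estimates the Lyapunov exponent from the average growth of the separation between two initially close points:

    # Rough Lyapunov exponent estimate for the logistic map x -> r*x*(1-x).
    import math

    def lyapunov(r, x0=0.4, dx=1e-9, steps=2000):
        x, y = x0, x0 + dx
        total = 0.0
        for _ in range(steps):
            x, y = r * x * (1 - x), r * y * (1 - y)
            d = abs(y - x) or 1e-15               # guard against exact overlap
            total += math.log(d / dx)
            y = x + dx * (1 if y >= x else -1)    # renormalise the separation
        return total / steps                      # +ve chaotic, ~0 edge of chaos

    for r in (2.9, 3.5, 3.57, 4.0):
        print("r = %.2f  lambda = %+.3f" % (r, lyapunov(r)))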
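
The critical size in result 26 is easy to check numerically; the interaction probability used below is an arbitrary example value:

    # Result 26: with N nodes there are roughly N**2 / 2 possible pairwise
    # interactions, so for interaction probability p the expected number of
    # meaningful interactions passes 1 near N = root(1 / (2 * p)).
    p = 1e-5                                   # assumed example probability
    N_crit = (1 / (2 * p)) ** 0.5
    print("critical size for p = %g is roughly %.0f nodes" % (p, N_crit))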

7.3 How applicable is self-organization ?

8. Resources

8.1 Is any software available to study self-organization ?

8.2 Where can I find online information ?

8.3 What books can I read on this subject ?

  1. Adami, Christoph. Introduction to Artificial Life (1998 Telos/Springer-Verlag). A good introduction with included Avida software, covering the main concepts and maths - see http://www.telospub.com/catalog/PHYSICS/ALife.html
  2. Ashby, Ross. An Introduction to Cybernetics (1964 Methuen)
  3. Ashby, Ross. Design for a Brain - The Origin of Adaptive Behaviour (1960 Chapman & Hall).
  4. Badii and Politi. Complexity: Hierarchical structures and scaling in physics (1997 Cambridge University Press). Technical and detailed review of the scope and limitations of current knowledge - see http://www1.psi.ch/~badii/book.html
  5. Bak, Per. How Nature Works - The Science of Self-Organized Criticality (1996 Copernicus). Power Laws and widespread applications, approachable.
  6. Blitz, David. Emergent Evolution: Qualitative Novelty and the Levels of Reality (1992 Kluwer Academic Publishers)
  7. Boden, Margaret (ed). The Philosophy of Artificial Life (1996 OUP). Essays on the concepts within the field, good background reading.
  8. Casti, John. Complexification: explaining a paradoxical world through the science of surprise (1994 HarperCollins). Takes a mathematical viewpoint, but not over technical.
  9. Cameron and Yovits (Eds.). Self-Organizing Systems (1960 Pergamon Press)
  10. Chaitin, Gregory. Algorithmic Information Theory (? Cambridge University Press) - see http://www.cs.auckland.ac.nz/CDMTCS/chaitin
  11. Cohen and Stewart. The Collapse of Chaos - Discovering Simplicity in a Complex World (1994 Viking). Excellent and approachable analysis.
  12. Coveney and Highfield. Frontiers of Complexity (1995 Fawcett Columbine)
  13. Deboeck and Kohonen. Visual Explorations in Finance with Self Organizing Maps (1998 Springer-Verlag)
  14. Eigen, Manfred. The Self Organization of Matter (?)
  15. Eigen and Schuster. The Hypercycle: A principle of natural self-organization (1979 Springer)
  16. Eigen and Winkler-Oswatitsch. Steps Toward Life: a perspective on evolution (1992 Oxford University Press)
  17. Emmeche, Claus. The Garden in the Machine: The Emerging Science of Artificial Life (1994 Princeton). A philosophical look at life and the new fields, approachable - see http://alf.nbi.dk/~emmeche/publ.html
  18. Formby, John. An Introduction to the Mathematical Formulation of Self-organizing Systems (1965 ?)
  19. Forrest, Stephanie (ed). Emergent Computation: Self-organising, Collective and Cooperative Phenomena in Natural & Artificial Computing Networks (1991 MIT)
  20. Gell-Mann, Murray. The Quark and the Jaguar - Adventures in the Simple and the Complex (1994 Little, Brown & Company). From a quantum viewpoint, popular.
  21. Gleick, James. Chaos - Making a New Science (1987 Cardinal). The most popular science book related to the subject, simple but a good start.
  22. Goldstein, Jacobi & Yovits (Eds.). Self-Organizing Systems (1962 Spartan)
  23. Goodwin, Brian. How the Leopard Changed Its Spots: The Evolution of Complexity (1994 Weidenfeld & Nicolson London). Self-organization in the development of biological form (morphogenesis), an excellent overview.
  24. Goodwin & Sanders (Eds.). Theoretical Biology: Epigenetic and Evolutionary Order from Complex Systems (1992 Johns Hopkins University Press)
  25. Holland, John. Adaptation in Natural and Artificial Systems: An Introductory Analysis with applications to Biology, Control & AI (1992 MIT Press)
  26. Holland, John. Emergence - From Chaos to Order (1998 Helix Books). Excellent look at emergence and rule-based generating procedures.
  27. Holland, John. Hidden Order - How adaptation builds complexity (1995 Addison Wesley). Complex Adaptive Systems and Genetic Algorithms, approachable.
  28. Jantsch, Erich. The Self-Organizing Universe: Scientific and Human Implications of the Emerging Paradigm of Evolution (1979 Oxford)
  29. Kampis, George. Self-modifying systems in biology and cognitive science: A new framework for dynamics, information, and complexity (1991 Pergamon)
  30. Kauffman, Stuart. At Home in the Universe - The Search for the Laws of Self-Organization and Complexity (1995 OUP). An approachable summary - see http://www.santafe.edu/sfi/People/kauffman/
  31. Kauffman, Stuart. The Origins of Order - Self-Organization and Selection in Evolution (1993 OUP). Technical masterpiece - see http://www.santafe.edu/sfi/People/kauffman/
  32. Kelly, Kevin. Out of Control - The New Biology of Machines (1994 Addison Wesley). General popular overview of the future implications of adaptation - see http://www.absolutvodka.com/5-0.html
  33. Kelso, Scott. Dynamic Patterns: The Self-Organisation of Brain and Behaviour (? MIT Press) - see http://bambi.ccs.fau.edu/kelso/
  34. Kelso, Mandell, Shlesinger (eds.). Dynamic Patterns in Complex Systems (1988 World Scientific)
  35. George Klir. Facets of Systems Science (1991 Plenum Press)
  36. Kohonen, Teuvo. Self-Organization and Associative Memory (1984 Springer-Verlag)
  37. Kohonen, Teuvo. Self-Organizing Maps: Springer Series in Information Sciences, Vol. 30 (1995 Springer) - see http://nucleus.hut.fi/nnrc/new_book.html
  38. Langton, Christopher (ed.). Artificial Life - Proceedings of the first ALife conference at Santa Fe (1989 Addison Wesley). Technical (several later volumes are available but this is the best introduction).
  39. Levy, Steven. Artificial Life - The Quest for a New Creation (1992 Jonathan Cape). Excellent popular introduction.
  40. Lewin, Roger. Complexity - Life at the Edge of Chaos (1993 Macmillan). An excellent introduction to the general field.
  41. Mandelbrot, Benoit. The Fractal Geometry of Nature (1983 Freeman). A classic covering percolation and self-similarity in many areas.
  42. Nicolis and Prigogine. Self-Organization in Non-Equilibrium Systems (1977 Wiley)
  43. Nicolis and Prigogine. Exploring Complexity (1989 Freeman). Within physio-chemical systems, technical.
  44. Pines, D. (ed). Emerging Syntheses in Science, (1985 Addison-Wesley)
  45. K.H. Pribram (ed). Origins: Brain and Self-organization (1994 Lawrence Erlbaum)
  46. Prigogine & Stengers. Order out of Chaos (1985 Flamingo). Non-equilibrium & dissipative systems, a popular early classic.
  47. Salthe, Stan. Evolving Hierarchical Systems (1985 New York)
  48. Schroeder, Manfred. Fractals, Chaos, Power Laws - Minutes from an Infinite Paradise (1991 Freeman & Co.). Self-similarity in all things, technical.
  49. Schweitzer, Frank (ed.). Self-Organisation of Complex Structures: From Individual to Collective Dynamics (1997 Gordon and Breach) - see http://www.gbhap.com/abi/phy/schweitz.htm
  50. Sprott, Clint. Strange Attractors: Creating Patterns in Chaos (? M&T Books). Exploring types of attractor with generating programs - see http://sprott.physics.wisc.edu/sa.htm
  51. Stanley, H.E. Introduction to Phase Transitions and Critical Phenomena (1971 OUP)
  52. Turchin, Valentin F. The Phenomenon of Science: A Cybernetic Approach to Human Evolution (1977 Columbia University Press). An online book covering similar concepts from an earlier viewpoint - see http://pespmc1.vub.ac.be/PoS/
  53. von Foerster and Zopf (Eds.). Principles of Self-Organization (1962 Pergamon)
  54. von Neumann, John. Theory of Self Reproducing Automata (1966 Univ.Illinois)
  55. Waldrop, Mitchell. Complexity - The Emerging Science at the Edge of Order and Chaos (1992 Viking). Popular scientific introduction.
  56. Wolfram, Stephen. Cellular Automata and Complexity: Collected Papers, (1994 Addison-Wesley). Deep look at mostly 1D CAs and order/complexity/chaos classes - see http://www.wolfram.com/s.wolfram/books/ca-reprint/
  57. Yates, F.Eugene (ed). Self-Organizing Systems: The Emergence of Order (1987 Plenum Press)

9. Miscellaneous

9.1 How does self-organization relate to other areas of complex systems ?

9.2 Which Newsgroups are relevant ?

9.3 Updates to this FAQ

9.4 Acknowledgements

9.5 Disclaimers

