
## Table of Contents

1. ANALYTICAL EXPOSITION

2. CRITICAL CONTEXT

3. INTEGRATIVE CONCLUSION

REFERENCES

## ABSTRACT

The essay or review below describes and analyses the content, style and merit of the developments in Advanced Complexity Theory. Complex, self-organising, adaptive systems possess a kind of dynamism that makes them qualitatively different from static objects such as computer chips. Complex systems are more spontaneous, more disorderly, more alive than that. In the past three decades, chaos theory has shaken science to its foundations with the realisation that very simple dynamical systems can give rise to extraordinarily intricate behaviour. Chaos theory is the qualitative study of unstable, aperiodic behaviour in deterministic, non-linear, dynamical systems. Chaos / complexity scientists have identified a number of defining features of complex nonlinear systems. The main features of complex nonlinear systems are known to be “dynamic, nonlinear, chaotic, unpredictable, sensitive to initial conditions, open, self-organizing, feedback sensitive and adaptive”. Two central aspects of chaos theory are the mathematical study of abstract dynamical systems and the application of these dynamical models to complex behaviour in actual experimental systems. Three aspects of chaos relate to fractal patterns, bounded infinity, and unpredictability. Since the notions of nonlinearity incorporate both chaos theory in mathematics and complexity theory in science, an understanding of complexity theory might assist in the elucidation of chaotic concepts.

## 1. ANALYTICAL EXPOSITION

The essay or review below describes and analyses the content, style and merit of the developments in Advanced Complexity Theory. Complex, self-organising, adaptive systems possess a kind of dynamism that makes them qualitatively different from static objects such as computer chips. Complex systems are more spontaneous, more disorderly, more alive than that. In the past three decades, chaos theory has shaken science to its foundations with the realisation that very simple dynamical systems can give rise to extraordinarily intricate behaviour. The edge of chaos is the constantly shifting battle zone between stagnation and anarchy, the one place where a complex system can be spontaneous, adaptive, and alive (Waldrop, M.M., 1992). A provocative transition in dynamical systems is:

Order → “Complexity” → Chaos

Chaos theory is the qualitative study of unstable, aperiodic behaviour in deterministic, non-linear, dynamical systems (Kabanda, G., 2013). It is a specialised application of dynamical systems theory. Chaotic systems require impossible accuracy for useful prediction tasks. Chaos theory often seeks to understand the behaviour of a complex system by reconstructing its attractor, and knowing this attractor gives us qualitative understanding. Chaos theory includes theoretical hypotheses that assert relationships of qualitative (or topological) similarity between its abstract models and the actual systems it studies. Dynamics is used more as a source of qualitative insight than for making quantitative predictions. Its great value is its adaptability for constructing models of natural systems, which models can then be varied and analysed comparatively easily (Kabanda, G., 2013). Chaos theory is also the quantitative study of dynamic non-linear systems. Non-linear systems change with time and can demonstrate complex relationships between inputs and outputs due to reiterative feedback loops within the system. These systems are deterministic, but their behaviour is exquisitely sensitive to their starting point. Chaos is a sub-discipline of complexity. Complexity theory is the qualitative aspect drawing upon insights and metaphors that are derived from chaos theory.

Chaos / complexity scientists have identified a number of defining features of complex nonlinear systems. The main features of complex nonlinear systems are known to be “dynamic, nonlinear, chaotic, unpredictable, sensitive to initial conditions, open, self-organizing, feedback sensitive and adaptive” (Larsen-Freeman, 1997, p. 142). A numerical simulation of the behaviour of the system may reveal a period-doubling cascade. One should then obtain the scaling relations by examining the sequence of bifurcation points, identify the relevant symmetries of the situation, and test the generality of these features by exploring other similar maps. This theory gives us understanding by showing us the mechanism responsible for unpredictable behaviour, but these are not causal processes, and it is not always appropriate to see understanding as providing knowledge of underlying causes. Chaos theory thus relies heavily on digital computers. Two central aspects of chaos theory are the mathematical study of abstract dynamical systems and the application of these dynamical models to complex behaviour in actual experimental systems. Specific manifestations of chaotic solutions had to wait for the arrival of powerful computers with which to calculate the long-time histories necessary to observe and measure chaotic behaviour. Chaos / complexity theory is concerned with the behavior of dynamic systems, i.e., the systems that change in time. The study of chaos (the randomness generated by complex systems) is a study of process and becoming, rather than state and being. Dynamic systems move through space / time, following a path called an attractor, i.e., the state or pattern that a dynamic system is attracted to (Larsen-Freeman, 2002). Chaos / complexity theory focuses on complex systems. To Larsen-Freeman (1997), systems are complex for two reasons. First, they often include a large number of components, and second, the behavior of complex systems is more than a product of the behavior of its individual components. The outcome of a complex system emerges from the interactions of its components; it is not built into any one component. As such, the interactions (connectivities) among the components in the system are the essential building blocks of the unpredictable structures that may emerge in the future.

Logistic functions are generally expressed as the function x_{k+1} = λx_k(1 – x_k) and are used to display patterns associated with population growth and decline, where the minimum value is zero and the maximum possible size is one (1). The constant λ is used as a parameter to account for different species. Values of x less than zero (0) or greater than 1, when iterated, move toward (negative) infinity, so the display scale should focus mainly on x between 0 and 1. The line y = x is used as a reference point for each iteration, since the resulting y becomes the new x. The process of iteration displays a relationship between the initial seed and its ensuing values for the function (Smitherman, S., 2014, p.5). There are three specific ideas that relate to the classroom and which draw on conceptual understandings associated with nonlinear dynamics: fractal patterns, bounded infinity, and unpredictability. These ideas incorporate various perspectives of chaos theory that exhibit patterns of nonlinearity (Smitherman, S., 2014, p.5).
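As a minimal sketch of the iteration described above (in Python, with hypothetical function names), the logistic map can be iterated from a seed, each resulting value becoming the new seed; for a parameter below the chaotic regime, the orbit settles onto the fixed point 1 − 1/λ:

```python
def logistic_orbit(x0, lam, n):
    """Return the first n iterates of the logistic map x_{k+1} = lam*x_k*(1 - x_k)."""
    orbit = [x0]
    for _ in range(n):
        x0 = lam * x0 * (1 - x0)  # the resulting y becomes the new x
        orbit.append(x0)
    return orbit

# At lam = 2.5 every seed in (0, 1) is drawn to the fixed point 1 - 1/lam = 0.6.
orbit = logistic_orbit(0.2, 2.5, 100)
print(orbit[-1])  # converges to 0.6
```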

There are three aspects of Chaos that relate to fractal patterns, bounded infinity, and unpredictability (Smitherman, S., 2014, p.6):

i. Fractal Patterns

Fractals are patterns of self-similarity that are generated using iterated functions. The word fractal describes geometric patterns that do not become more simplified (reduced) as one zooms in or out. According to Smitherman (2014, p.6), patterns of behavior in a classroom can relate to these fractal patterns. Some of these patterns are chaotic, like the behaviors students perform each day in the classroom. These fractal patterns display dynamic relations that occur within a classroom among teachers, students, subject material and the classroom environment. By relating conversations in the classroom to fractal patterns, teachers can embrace a rich metaphor as a picture of what is occurring. The initial seed will have an impact on what conversation will ensue, the format of the discussion will affect the type of interaction, and the patterns of the resulting conversation may in fact display differing “orbits.”

ii. Bounded Infinity

Consider all the numbers that exist on a number line between the integers of zero and one. That is one example of a bounded set with an infinite number of members. Teachers can connect to this notion of bounded infinity in their classrooms. A teacher may be restricted (bounded) by national initiatives, state mandates, district criteria, school instructions, and curricular concerns, but within these boundaries are infinite possibilities. The potential relationships between teacher and students, among students, and how a teacher chooses to implement the subject material are boundless. This grants freedom to the teacher to not feel constricted by the limits that are imposed by outside sources but rather to be creative within them (Smitherman, S., 2014, p.9).

iii. Unpredictability

Chaos theory incorporates the notion that sensitive dependence on initial conditions is an important component needed to generate chaotic behaviors. Small variations in conditions may lead to large differences in nonlinear dynamical systems. Non-linear, open systems are divergent and generative, not closed and limited. An immediate consequence of sensitive dependence in any system is the impossibility of making perfect predictions, or even mediocre predictions sufficiently far into the future. Predicting becomes problematic beyond certain ranges of time (Smitherman, S., 2014, p.10).
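A small numerical illustration of this sensitive dependence (a Python sketch; the seed values and iteration count are arbitrary assumptions): two logistic-map trajectories whose seeds differ by one part in a billion separate to order one within a few dozen iterations, which is why prediction fails beyond a certain time horizon:

```python
def logistic(x, lam=4.0):
    """One step of the logistic map in its fully chaotic regime (lam = 4)."""
    return lam * x * (1 - x)

# Two seeds differing by one part in a billion.
a, b = 0.3, 0.3 + 1e-9
max_sep = 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    max_sep = max(max_sep, abs(a - b))

print(max_sep)  # order-one separation despite a 1e-9 initial difference
```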

Since the notions of nonlinearity incorporate both chaos theory in mathematics and complexity theory in science, an understanding of complexity theory might assist in the elucidation of chaotic concepts. The two areas are not mutually exclusive and should not be interpreted as such (Smitherman, S., 2014, p.13). While chaos theory is located within mathematics, complexity theory situates itself in science. The field of complex systems cuts across all traditional disciplines of science, as well as engineering, management, and medicine. It focuses on certain questions about parts, wholes and relationships. Complexity theory is an emerging field in which scientists seek patterns and relationships within systems. Rather than looking to cause-and-effect relations, complexity theorists seek to explicate how systems rely upon feedback loops (reiteration, recursion, reciprocity) to (re)frame themselves and thus continue to develop, progress, and emerge (Smitherman, S., 2014, p.16).

Complex non-linear dynamic systems, as illustrated by Figure 1 below, can be:

- dynamic: the behaviour of the system (e.g. classroom) as a whole arises from the interaction of its components.

- complex: any learning situation is influenced by many factors (teaching/learning characteristics, interaction patterns, methods, materials, time, etc.).

- nonlinear: learners learn in “jumps”.

**Figure 1: Chaotic behavior**

[Figure not included in this excerpt]

The initial condition x gives after time t a point f^t x. If x is replaced by x + δx, then f^t x is replaced by f^t x + δf^t x, where δf^t x = (∂f^t x/∂x) δx. If δf^t x grows exponentially with time t, we say that we have sensitive dependence on initial conditions. More precisely, we have sensitive dependence on initial conditions if the matrix of partial derivatives ∂f^t x/∂x has a norm growing exponentially with t.
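For a one-dimensional map such as the logistic map, the matrix of partial derivatives reduces, by the chain rule, to a product of derivatives f′(x_k) along the orbit, and its exponential growth rate can be estimated numerically (a sketch under assumed parameter values; the seed and iteration count are arbitrary choices):

```python
import math

LAM = 4.0  # fully chaotic logistic map

def f(x):
    return LAM * x * (1 - x)

def f_prime(x):
    return LAM * (1 - 2 * x)

# log |d(f^t)/dx| is the sum of log |f'(x_k)| along the orbit; for a chaotic
# map it grows linearly in t, i.e. the norm grows exponentially.
x = 0.3
log_norm = 0.0
T = 1000
for _ in range(T):
    log_norm += math.log(abs(f_prime(x)))
    x = f(x)

lyapunov = log_norm / T  # average exponential growth rate
print(lyapunov)  # positive, close to ln 2 ≈ 0.693 for lam = 4
```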

An attractor is the set on which the point P, representing the system of interest, is moving at large times (i.e. after so-called transients have died out) (Ruelle, D., 1991). For this definition to make sense it is important that the external forces acting on the system be time independent (otherwise we could get the point P to move in any way we like). It is also important that we consider dissipative systems (viscous fluids dissipate energy by self-friction). Dissipation is the reason transients die out. Strange attractors look strange in that they are not smooth curves or surfaces but have "non-integer dimension", i.e. they are fractal objects. The motion on a strange attractor has sensitive dependence on initial condition. What we now call **chaos** is a time evolution with sensitive dependence on initial condition. The motion on a strange attractor is thus chaotic (Kabanda, G., 2013).

The more general definition of an attractor is a set of points or states in state space to which trajectories within some volume of state space converge asymptotically over time (Kauffman, S.A., 1993). Many but not all dynamical systems have attractors. Among those which do not are the classical Hamiltonian systems of physics, exemplified by the frictionless pendulum. If released at any defined position and initial velocity, the pendulum swings on a periodic, closed orbit in its state space without loss of energy. If displaced to a slightly larger or smaller orbit by a perturbation, the pendulum follows a different closed, periodic orbit in its state space, with a slightly different energy. Each orbit is neutrally stable, for the system will remain in any orbit once placed there. No orbit drains a basin of attraction. The existence of attractors requires some form of driving and friction which prevents conservation of energy within the system itself. Thus, in addition to simple steady states, continuous dynamical systems may admit of more complex attractors.
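The contrast drawn above can be sketched numerically (a Python sketch with assumed parameter values): adding friction to a pendulum turns the neutrally stable family of closed orbits into a single point attractor at rest, onto which different releases converge:

```python
import math

def settle(theta, omega, damping=0.5, dt=0.01, steps=5000):
    """Forward-Euler integration of theta'' = -sin(theta) - damping * theta'."""
    for _ in range(steps):
        theta, omega = (theta + omega * dt,
                        omega + (-math.sin(theta) - damping * omega) * dt)
    return theta, omega

# Two different releases converge onto the same point attractor (0, 0);
# with damping = 0 each release would instead keep its own closed orbit.
a = settle(2.0, 0.0)
b = settle(1.0, 1.0)
print(a, b)  # both close to (0.0, 0.0)
```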

The limit cycle is one type of attractor, in which the system flows around a loop repeatedly. Strange or chaotic attractors also exist (Lorenz, E.N., 1963; Ruelle, D., 1979; Grassberger, P. and Procaccia, I., 1983; Mayer-Kress, G., 1986). In such a dynamical system, which might be ten-dimensional, the flow might, for example, bring all trajectories onto a two-dimensional attractor a bit like a Moebius strip with a pleat or some other folded form. The interesting property of such attractors is that, if the system is released from two points on the attractor which are arbitrarily close to each other, the subsequent trajectories remain on the attractor surface but diverge away from each other. The dimensionality of a strange attractor is often not an integer. Rather it is natural to define a fractal dimension (Mandelbrot, B., 1977) for the attractor, which might be 2.3 for an attractor which occupies more than two but fewer than three dimensions in the 100-dimensional space (Packard et al., 1980; Mayer-Kress, G., 1986). Such fractal attractors are already being found in biological systems - for example, in cardiac and neural electrical activity patterns (Holden, A.V., 1986; Mackey, M.C., and Glass, L., 1988).
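The idea of a non-integer dimension can be illustrated with a box-counting estimate for the middle-thirds Cantor set (a self-contained Python sketch; the recursion depth is an arbitrary choice), whose fractal dimension is ln 2 / ln 3 ≈ 0.63, between that of a point and that of a line:

```python
import math

def cantor_left_endpoints(level):
    """Left endpoints of the 2**level intervals left after `level` removals."""
    points = [0.0]
    for k in range(level):
        scale = 3.0 ** -(k + 1)  # length of the sub-intervals at this stage
        # Each interval [q, q + 3*scale] keeps its left and right thirds.
        points = [p for q in points for p in (q, q + 2 * scale)]
    return points

level = 8
pts = cantor_left_endpoints(level)
eps = 3.0 ** -level                  # box size matched to the construction
boxes = {int(p / eps) for p in pts}  # which boxes of width eps are occupied
dim = math.log(len(boxes)) / math.log(1 / eps)
print(dim)  # ln 2 / ln 3 ≈ 0.6309
```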

Deng, Z., and Hioe, F.T. (1985) presented a result showing the transitions from chaos to order and again to chaos as the coupling parameter between two nonlinearly coupled oscillators of a Hamiltonian system is varied continuously from −∞ to +∞. They showed that there is no general correspondence between the classical chaotic motion and the Gaussian-orthogonal-ensemble distributions of the energy-level fluctuations of the corresponding quantum system.

Current models of neural nets often seek to model pattern-recognition capacities and associative memories in terms of attractors of parallel-processing networks (Hopfield, J.J., 1982a, 1982b; Toulouse et al., 1986). These models may use a sigmoidal output response from the neuron to the input activity level rather than an all-or-none output from the neuron (Hinton et al., 1984; Hinton, G.E., and Sejnowski, T.J., 1986; Hopfield et al., 1986a, 1986b; Grossberg, 1987). For example, the attractors might be thought of either as memories held by the neural network or as concepts. Such networks are then naturally content-addressable, i.e. if released in the basin of attraction of a specific memory or concept, the system will flow under the dynamics of the network to that attractor.
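A minimal, hypothetical sketch of such a content-addressable network in Python (Hebbian weights and all-or-none units in the style of Hopfield, 1982, with asynchronous updates; the pattern and network size are arbitrary assumptions): a stored pattern acts as an attractor, and a corrupted cue released in its basin flows back to it:

```python
def train(patterns):
    """Hebbian outer-product weights for bipolar (+1/-1) patterns."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, cue, sweeps=5):
    """Asynchronous threshold updates flow the state toward an attractor."""
    state = list(cue)
    n = len(state)
    for _ in range(sweeps):
        for i in range(n):
            h = sum(w[i][j] * state[j] for j in range(n))
            state[i] = 1 if h >= 0 else -1
    return state

pattern = [1, 1, 1, 1, -1, -1, -1, -1]
w = train([pattern])
cue = list(pattern)
cue[0] = -cue[0]  # corrupt one unit: the cue is still in the basin
print(recall(w, cue) == pattern)  # True: the memory is recovered
```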

**Decidability**

Many decision problems associated with the class of recursive functions are undecidable. Showing that a decision problem is undecidable can be achieved by relating it in an appropriate way to another problem that is already undecidable, a process called reducing one decision problem to another. The idea of *computability* is closely related to that of *decidability*.

Informally, the problem of determining whether a designated property holds for arbitrarily chosen objects is said to be decidable if there is an effective procedure for making the required decision. Formally, such a decision problem is defined to be recursively decidable if the number-theoretic relation corresponding to the property in question is recursive. Undecidability is most commonly established by the method of reduction, which consists in showing that a decision procedure for the problem at hand would provide a decision procedure for some problem already known to be undecidable. Rice's theorem uses this approach to show that many simple problems concerning recursive functions are undecidable.
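The method of reduction can be sketched in Python (an informal illustration with hypothetical names, not a formal proof): if some decider could tell whether a program returns 42, the construction below would let it decide halting, so no such decider can exist:

```python
def reduce_halting_to_returns_42(p, x):
    """Build q so that q() returns 42 exactly when p halts on input x."""
    def q():
        p(x)        # runs forever iff p does not halt on x
        return 42   # reached only when p(x) halts
    return q

# For a program that does halt, the constructed q behaves as claimed;
# a decider for "returns 42" applied to q would therefore decide halting.
halting_program = lambda n: n + 1
q = reduce_halting_to_returns_42(halting_program, 5)
print(q())  # 42
```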

**[...]**

Gabriel Kabanda (Author), 2019, Developments in Advanced Complexity Theory, Munich, GRIN Verlag, https://www.grin.com/document/491439