A Short Introduction to the Jargon of Iteration Theory
══════════════════════════════════════════════════════
Iteration theory (like every other complicated field) has developed its own
jargon. This list includes some of the more common terms. It may help you
understand some of the other documentation better, and it may help you
understand iteration better as well.
And if all else fails, you can use these spiffy mathematical terms to impress
your friends with your vast stores of chaotic knowledge.
Dynamical System
────────────────
A dynamical system is simply a function together with the domain the function
is defined on. The domain can be anything--a line, a line segment, the plane,
3-space, 6-dimensional space, or any of the other weird "spaces" mathematicians
are always coming up with. (In Iterate!, the domain of the function is always
the plane.)
The only restriction is that the domain and the range of the function must be
the same. Symbolically, we would write:
f: D → D
This means that 'f' is a function with domain and range D. This requirement
makes sense if you think about it. When you iterate a function, you keep
feeding points from the range back into the domain. So if the range and the
domain aren't the same, you're going to be in trouble.
The reason this is called a "dynamical system" is that "dynamics" means
"movement". What we are studying when we look at a dynamical system is how
the points move around under the influence of the function.
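In code, a dynamical system is nothing more than a function whose input and
output types match, paired with the set of points it acts on. A minimal sketch
(the particular map below is a made-up example, not one of the Iterate!
functions):

```python
# A dynamical system on the plane: a function from R^2 to R^2.
# The map itself is an arbitrary illustrative example.
def f(point):
    x, y = point
    return (y, 0.5 * x)

p = (1.0, 2.0)
print(f(p))   # the point after one application of f
```

Because the output is again a point of the plane, it can be fed straight back
into f -- which is exactly the requirement discussed above.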
Iteration
─────────
What we do when we study a Dynamical System is "iterate" the points. This means
you start with a point x. Then figure out f(x). Then f(f(x)), f(f(f(x))),
f(f(f(f(x)))) and so on.
Writing all this f(f(f(f(f(x))))) stuff gets pretty tiresome, so
mathematicians abbreviate by writing fⁿ(x). This means that you apply
function 'f' to point 'x' 'n' times. So f²(x)=f(f(x)) and so on.
(It would be easy to get confused and think that f²(x) means "f(x) squared".
To distinguish between the two, mathematicians write (f(x))² if they mean
"f(x) squared." It would also be easy to get confused and think that f²(x)
means "the 2nd derivative of the function f." But if you're smart enough to
take the second derivative of the function f, then you should be smart enough
to tell the difference between f²(x) meaning "the second iteration of f
applied to x" and f²(x) meaning "the second derivative of the function f.")
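The fⁿ notation translates directly into a loop. A sketch, using the
(illustrative) doubling function f(x) = 2x to show that f²(x) and
"(f(x)) squared" really are different things:

```python
def iterate(f, x, n):
    """Apply f to x a total of n times: f^n(x) in the notation above."""
    for _ in range(n):
        x = f(x)
    return x

def f(x):
    return 2 * x

# f^2(x) means f(f(x)), which is not the same as (f(x)) squared:
print(iterate(f, 3, 2))   # f(f(3)) = f(6) = 12
print(f(3) ** 2)          # (f(3))^2 = 6^2 = 36
```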
Orbits
──────
What you are interested in looking at in a dynamical system is the path the
points take when they are iterated. This path is called the "orbit".
Another way of saying the same thing: The orbit of point x consists of these
points
x, f(x), f²(x), . . . , fⁿ(x), . . .
The orbit of a point is what you see in Iterate! when you press <Space>.
Fixed points
────────────
Fixed points are points that don't go anywhere when they're iterated, that is,
x=f(x)=f²(x) etc.
Another way of saying the same thing: The orbit of a fixed point consists only
of the point itself.
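A fixed point is just a solution of f(x) = x. For the (illustrative) map
f(x) = x², the fixed points are 0 and 1, which a quick check confirms:

```python
def f(x):
    return x * x

# 0 and 1 satisfy f(x) = x (fixed points); 2 does not.
for x in (0.0, 1.0, 2.0):
    print(x, f(x), f(x) == x)
```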
Periodic Points
───────────────
Periodic points are points that come back to the original point after a
certain number of iterations. For instance, a period 2 point comes back to
the original point after two iterations:
x (starting point)
f(x) (a different point)
f²(x)=x (back to the starting point)
Periodic points of every different period are possible.
Once a periodic point returns to the starting point, it just repeats the same
cycle of points over and over.
For instance, here is a possible orbit for a period 5 point:
0, ½, 1, 1½, 2, 0, ½, 1, 1½, 2, 0, ½, 1, 1½, 2, 0, ½, 1, 1½, 2, . . .
As you can see, it just keeps repeating the same 5 points over and over.
So the orbit of a period 'n' point consists of just 'n' points.
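Finding the period of a point amounts to iterating until the orbit returns to
where it started. A sketch using the (illustrative) map f(x) = 1 - x, under
which every point other than ½ has period 2 and ½ itself is fixed:

```python
def period(f, x, max_iters=100):
    """Return the period of x under f, or None if the orbit
    does not return to x within max_iters iterations."""
    y = f(x)
    for n in range(1, max_iters + 1):
        if y == x:
            return n
        y = f(y)
    return None

f = lambda x: 1 - x
print(period(f, 0.25))   # 2: the orbit is 0.25, 0.75, 0.25, 0.75, ...
print(period(f, 0.5))    # 1: 0.5 is a fixed point
```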
Attracting Orbits
─────────────────
Attracting orbits suck nearby orbits closer and closer to them. For instance,
an attracting fixed point sucks all nearby points into itself. A period 3
attracting point sucks all points near its orbit closer and closer to the
orbit (the orbit consists of three points, of course).
Repelling Orbits
────────────────
A repelling orbit drives nearby orbits away from it.
Other Types of Orbits
─────────────────────
Many other types of orbits are possible. For instance, there are fixed points
that are attracting in one direction and repelling in the other.
By using techniques from elementary calculus, it is relatively easy to tell
which orbits will be attracting, repelling, or something else. Check the
literature for more details on this.
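The calculus test alluded to here is the derivative test: a fixed point p is
attracting if |f′(p)| < 1 and repelling if |f′(p)| > 1. A sketch for the
(illustrative) map f(x) = x², whose fixed points are 0 (f′(0) = 0, attracting)
and 1 (f′(1) = 2, repelling):

```python
def f(x):
    return x * x

def f_prime(x):
    return 2 * x   # derivative of x squared

# Iterating a point near each fixed point shows the predicted behavior.
x = 0.1                       # near the fixed point 0, |f'(0)| = 0 < 1
for _ in range(5):
    x = f(x)
print(x)                      # extremely close to 0: attracted

y = 1.1                       # near the fixed point 1, |f'(1)| = 2 > 1
for _ in range(5):
    y = f(y)
print(y)                      # far from 1: repelled
```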
Using Iterate!, you can easily find examples of all of these different types
of orbits (fixed points, periodic points, repelling orbits, attracting
orbits, etc.). You may have to try several different functions with
different parameters, and try iterating several different points in different
areas of the plane for each of them, but eventually you will see all these
different types of orbits.
Strange Attractors
──────────────────
A strange attractor is similar to an attracting orbit. The difference is that
in an attracting orbit, everything is attracted into an orbit which consists
of a finite number of points. We would say that it is a "finite attractor". A
strange attractor, however, is an "infinite attractor". That is, there is an
infinite set of points that everything else is attracted to.
Where the attracting orbit consisted of only a few attracting points, you can
think of a strange attractor as being a whole shape that is attracting.
Usually this shape is a very, very weird shape; that is why it is called a
strange attractor.
As a rule, the strange attractor is a fractal, with fractal dimension less
than the dimension of the dynamical system. For instance, in Iterate!, we are
iterating functions on the plane, which has dimension 2. So any strange
attractors we find in Iterate! will have dimension less than 2--say 1.7, 1.2,
or 0.5.
Usually, the dynamical system is chaotic on the strange attractor. It isn't
chaotic on the rest of the dynamical system, though, since the rest of the
system is just sucked up into the strange attractor. (See below for the
definition of chaos.)
To see a good example of a strange attractor, select the Horseshoe Map
(Function L) with default window and parameters. The "Horseshoe" shape that
you see when you iterate a point (which actually consists of horseshoes
within horseshoes within horseshoes) is a strange attractor. You will notice
that all points are drawn into this horseshoe shape--it is an attractor. You
will notice that once a point gets close to the horseshoe shape, it seems to
just jump around randomly on it--it moves chaotically on the strange
attractor. The horseshoe shape appears to have a fractal dimension between 1
and 2--probably about 1.4 or 1.5.
Another example of a strange attractor is Function F (the inverse Julia Set
function). Again, the strange attractor is a fractal with fractal dimension
between 1 and 2.
Although strange attractors _are_ strange (hence the name), a dynamical system
with a strange attractor is often easier to understand and analyze than one
without a strange attractor.
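As an outside illustration (the Hénon map is not one of the Iterate! functions;
it is a standard textbook example of a plane map with a strange attractor of
fractal dimension roughly 1.26), iterating any typical starting point traces
the attractor out:

```python
def henon(point, a=1.4, b=0.3):
    """One step of the Henon map at the standard parameter values."""
    x, y = point
    return (1 - a * x * x + y, b * x)

p = (0.0, 0.0)
orbit = []
for _ in range(1000):
    p = henon(p)
    orbit.append(p)

# After a brief transient, every point of the orbit lies on the banded,
# horseshoe-like attractor shape; plotting `orbit` would show it.
print(orbit[-1])
```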
Forward and Reverse Orbits
──────────────────────────
To make the reverse orbit of a point, think of running the function backwards.
In other words, instead of applying the function to the point repeatedly, you
apply the inverse of the function to the point repeatedly. All the points you
get
by doing this are the "reverse orbit".
Another way of saying the same thing: The reverse orbit of a point 'x' is all
the points that are mapped to 'x' under iteration. In other words, if
fⁿ(y)=x, then y is in the reverse orbit of x.
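One step of the reverse orbit is one application of the inverse function. A
sketch using the quadratic map z → z² + c (the standard Julia-set map; whether
this matches Function E exactly is an assumption, and the value of c below is
arbitrary). Its inverse z → ±√(z − c) has two branches, so the reverse orbit
branches at every step:

```python
import cmath

c = -0.75   # arbitrary illustrative parameter

def forward(z):
    return z * z + c

def backward(z):
    """The two pre-images of z: both square roots of z - c."""
    root = cmath.sqrt(z - c)
    return (root, -root)

z = 1 + 0j
w = forward(z)
print(backward(w))   # the two pre-images of w; one of them is z itself
```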
If mathematicians are talking about "reverse orbits", they will often refer to
the normal orbit as the "forward orbit" just to be clear. If they are talking
about "forward" and "reverse" orbits, then usually just plain "orbit" means
the forward and the reverse orbits together. (Hey now, let's not hear any
complaints about this--you don't expect clarity and consistency from a bunch
of mere mathematicians, do you?)
In Iterate!, Function F is the inverse of Function E. So if you iterate a
point under Function E, you get the forward orbit of the point. If you
iterate the same point under Function F, you get the reverse orbit of the
point.
Chaos
─────
Mathematically, chaos is defined as a dynamical system with certain (chaotic)
properties. In your own personal life, you are welcome to define chaos any
way you want (most of us don't need to define it actually--we just live it).
But you might want to know the "official" definition of chaos as well. So
here it is:
A chaotic dynamical system must satisfy three properties:
1. Sensitive dependence on initial conditions. This means that points that
start out arbitrarily close to each other can end up far away from each
other after a few iterations. This condition ensures that the points are
thoroughly scrambled up.
2. Topological Transitivity. This is a more technical requirement, so I
won't try to explain it. Basically, it ensures that every area of the
dynamical system is scrambled--there aren't some small pockets somewhere
that don't become scrambled. (See "An Introduction to Chaotic Dynamical
Systems" if you want more info on this.)
3. Periodic points are everywhere dense. No, this doesn't mean that all
periodic points are stupid. It just means that any region in the
dynamical system--no matter how small--contains a periodic point.
You can think of a chaotic dynamical system as one that is thoroughly mixed,
and scrambled; the points move as though at random; the movement appears to be
unpredictable.
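Sensitive dependence is easy to see numerically. A sketch with the logistic
map f(x) = 4x(1 − x), a standard chaotic map on [0, 1] used here as an outside
example: two starting points that agree to ten decimal places end up far apart
after a few dozen iterations.

```python
def f(x):
    return 4 * x * (1 - x)   # logistic map at r = 4: chaotic on [0, 1]

a, b = 0.2, 0.2 + 1e-10      # two points that start essentially together
for _ in range(50):
    a, b = f(a), f(b)

print(abs(a - b))            # the two orbits are now far apart
```

The tiny initial gap roughly doubles each iteration, so after 50 steps the two
orbits bear no resemblance to each other.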
If you like homey analogies, you can think of a dynamical system as being like
mixing bread dough. A chaotic dynamical system is like thoroughly mixed bread
dough; a non-chaotic dynamical system is like dough that isn't well mixed.
If Properties 1, 2, and 3 happen in the mixing of the bread, then we can be
sure that it is well mixed:
Property 1 ensures that things that started out close together end up far
apart. For instance, the flour that we put in all together at the start
isn't still clumped up all together--it's spread far and wide.
Property 2 ensures that everything is mixed throughout the _entire_ dough.
For instance, the oil we put in isn't just mixed around in one little
corner of the loaf, but is evenly mixed throughout ALL of the dough.
Property 3 assures us that although the mixing process seems to be
"chaotic", disorderly, and generally difficult to understand, behind
this chaos is a very strong order, dependability, and even simplicity
(remember that the periodic points are about the simplest kind of
motion we can have, and Property 3 assures that they are scattered
throughout our bread dough). (*see note)
In the case of bread-making, this order, dependability, and simplicity
is best understood as a result of the kneading process.
Kneading is very simple--a couple of simple motions are
repeated over and over in a sort of "iteration" of motion. And although
it is "chaotic", it is dependable and reproducible, too--every time we
knead bread dough, we end up with the same basic result.
*Note: Although everywhere dense periodic points are an important feature
of the mathematical formulation of chaos, there is a valid question about
whether they would actually appear in a physical representation of a
dynamical system, i.e., in bread dough. A mathematician would instantly
answer, "Yes, of course they do! Or at least something so close to periodic
points that you couldn't tell the difference." A physicist might say, "Due
to the fact that space and time are ultimately discrete (in the 32nd
dimension--but let's not get into that), and after all, there are only a
finite number of elementary particles in the universe, let alone in a blob of
bread dough, ALL the points in the dough are ipso facto periodic and there's
NO SUCH THING as chaos in bread dough or real life." (The physicist could
easily be disproven by a brief tour of my apartment.) A really sane person
might come up with yet another answer. In any case, the question is a good
one, and not easy to answer.
Map
───
"Map" is simply another word for "function". The two words mean exactly the
same thing. For some reason, iteration theorists often use the word "map"
instead of "function".
What Good Is It?
────────────────
Usually when mathematicians are asked this question about their specialty,
they answer, "It expands the realm of human knowledge," "It challenges our
intellect," "In about 12,000 years it might be able to be applied to some
obscure scientific area," and stuff like this.
With Iteration Theory, though, we don't have to get into these flimsy types of
justifications (as though someone ought to be paid just for thinking...
hmmph, the gall of those mathematicians). Iteration Theory has a ton of
concrete physical applications.
One obvious application is modelling population growth. Biologists typically
think of population change on a yearly basis. The trick is to find an
equation that will tell you next year's population if you know this year's.
(If you read "Function.txt" you will see that several of the functions that
are programmed into Iterate! were made with this kind of biological idea in
mind.)
So if we have such a function, and we know this year's population, we just
apply the function and Presto! we have next year's population. Apply the
function again and we have the population in two years. Apply it again, and
we have the population in three years, and so on.
And what is this? A Dynamical System, of course.
In fact, I have a book on my desk right now called "Chaos and Insect Ecology."
The authors talk about such things as whether the conclusions of Chaos Theory
can be applied to insect population dynamics; they apply chaos theory to
things as diverse as the spread of measles in New York City and the population
of martens in Canada.
Most anything that moves or changes (and that includes pretty much everything)
can be thought of as a Dynamical System and studied using Iteration Theory.
Weather prediction, for instance, has been extensively studied from this
angle. The most profound result of this study is the conclusion that the
equations governing the weather are chaotic. This makes long term weather
prediction impossible.
The studies have shown that a change in the weather conditions as small as a
butterfly's wing flapping can change the entire global weather pattern three
months later. So unless you can account for every butterfly's wings, each
person walking down the street, and other such changes that minutely affect
the weather, you can't predict the weather more than three months down the
road.
(This "Butterfly Effect" can be observed in any of the chaotic functions in
Iterate!. For instance, select Function I with default windows and
parameters. Iterate a point with <Space>. Use <Shift Right-Arrow> to move to
the very next point on the screen. You will see that the orbits of the two
points aren't close to each other at all. You can use the <P> command at the
command screen to enter points that are even closer to each other; then use
<I> to iterate them and <L> to examine their endpoints. You will find that
the endpoints aren't anywhere close to each other. This is the Butterfly
Effect: a small change in the initial conditions leads to a large change in
the end result. This is also the basic idea behind "Sensitive Dependence on
Initial Conditions" mentioned above.)
The whole area of chaos theory / iteration theory / dynamical systems /
fractals and
so on is really a brand new field. Most of the major discoveries in Iteration
Theory have been made in the 1980s. Although it is new, its impact has
already been major. These new methods promise to transform the way we think
about science and mathematics.
With Iterate!, you can see for yourself many of these exciting discoveries,
and maybe along the way you'll make a few of your own!
(Ver. 3.11, 12/93)