- Comments: Gated by NETNEWS@AUVM.AMERICAN.EDU
- Path: sparky!uunet!darwin.sura.net!paladin.american.edu!auvm!VAXF.COLORADO.EDU!POWERS_W
- X-Envelope-to: CSG-L@vmd.cso.uiuc.edu
- X-VMS-To: @CSG
- MIME-version: 1.0
- Content-transfer-encoding: 7BIT
- Message-ID: <01GMQX8HSXSI00008B@VAXF.COLORADO.EDU>
- Newsgroups: bit.listserv.csg-l
- Date: Fri, 24 Jul 1992 06:44:57 -0600
- Sender: "Control Systems Group Network (CSGnet)" <CSG-L@UIUCVMD.BITNET>
- From: "William T. Powers" <POWERS_W%FLC@VAXF.COLORADO.EDU>
- Subject: Energy, Entropy, Info
- X-To: CSG-L@vmd.cso.uiuc.edu
- Lines: 253
-
- [From Bill Powers (920723.2130)]
-
- Allan Randall (920723.2100) --
-
- ><Information does not necessarily travel in the same direction as
- ><physical energy.
-
- >I think you are confusing the concept of energy and that of entropy.
- >They are related, but not the same. It is the latter, not the former,
- >that information theorists associate with information content. There is
- >no need in traditional information theory for the kind of "net flow" of
- >energy in the direction from source to destination that you talk about
- >in your examples, both of which I think are pretty easy to refute.
-
- It's been a long time since I studied anything having to do with entropy,
- so remarks like yours create instant insecurity. I went back to my old
- books, and found in the Handbook of Physics:
-
- The increase in the entropy of a body during an infinitesimal state
- of a reversible process is equal to the infinitesimal amount of
- heat absorbed divided by the absolute temperature of the body. Thus
- for a reversible process
-
- dS = Q/T
-
-
- ... where Q is the infinitesimal quantity of heat.
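- As a quick numerical check of the Handbook definition (the joule and
- kelvin values below are mine, chosen only for illustration):

```python
def entropy_change(dQ: float, T: float) -> float:
    """dS = dQ/T: signed heat absorbed over absolute temperature."""
    return dQ / T

# A body at 300 K absorbing 3 J of heat gains 0.01 J/K of entropy;
# the same body giving up 3 J loses exactly that amount.
print(entropy_change(3.0, 300.0))   # 0.01
print(entropy_change(-3.0, 300.0))  # -0.01
```

- The sign of the entropy change simply follows the sign of the heat
- transfer, which is the point taken up below.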
-
- I have seen a somewhat different and more general-seeming definition,
-
- dS = k(dQ/Q),
-
- where dQ is simply a signed amount of energy absorbed and Q is the amount
- already present (of the same form). Schroedinger uses this form of
- definition in "Order, disorder, and entropy" in _What Is Life?_
-
- It seems clear that the change in entropy is a signed quantity, and that
- the sign (for the receiver of the energy) is the same as the sign of the
- direction of energy transfer dQ. (But see correction below -- I have this
- backward).
-
- You say,
-
- >The entropic formulation of information theory, along with the related
- >algorithmic formulation, has a pretty firm mathematical basis ...
-
- A firm mathematical basis does not mean the same thing as a firm physical
- basis, and a firm physical basis is not the same thing as a firm
- experiential or semantic basis. What do we gain by calling Q "order" and
- 1/Q "disorder?" Schroedinger proudly declares that only a physicist can
- understand his definition of negative entropy and its relation to order and
- disorder. If that is so, then physicists have created a systematic delusion
- which can be shared only with people who have been painstakingly trained to
- believe in it. We should not assume that everything we ordinarily call
- order and disorder is what a physicist would mean by such a term -- for
- example, the difference between THEDOG and TDHOEG. What a physicist means
- by order is not what other people mean by it, or even what the physicist
- means by it when, attempting to make physics explain life, he or she
- substitutes the ordinary meaning for the special meaning as if there were
- no difference. There is a certain arrogance in this proprietary attitude
- toward understanding that has always put me off physics -- even when I was
- a physics student.
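- The THEDOG/TDHOEG example can be made concrete. A first-order Shannon
- entropy (symbol frequencies only -- a standard textbook measure, not
- anything from this exchange) assigns both strings exactly the same
- information content, since they contain the same six letters:

```python
from collections import Counter
from math import log2

def first_order_entropy(msg: str) -> float:
    """Shannon entropy in bits/symbol, from single-symbol frequencies."""
    n = len(msg)
    return -sum((c / n) * log2(c / n) for c in Counter(msg).values())

# Six distinct letters, one occurrence each, in both strings:
# the frequency-based measure cannot distinguish the two orderings.
print(first_order_entropy("THEDOG"))  # log2(6), about 2.585 bits/symbol
print(first_order_entropy("TDHOEG"))  # the same value
```

- Whatever one takes "order" to mean here, this particular formal measure
- is blind to the difference between the two strings.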
-
- Browsing through my old Buckley, I find Raymond's article on
- "Communication, entropy, and life." Here he defines "The rate of increase
- of thermodynamic entropy during communication" as
-
- dS/dt = W/T,
-
- where "W is the average power expended in the communication device...", a
- neat way of avoiding saying which way this power (energy per unit time) is
- traveling. The assumption, of course, is that it is traveling into the
- receiver. The idea that you can affect a receiver by draining energy from
- it never occurred to him, or as far as I can tell, any other information
- theorist.
-
- RE: the telegraph example.
-
- >The mistake you are making here is using the energy output of the
- >battery as the transmitting energy flow. This is incorrect. You are
- >treating the battery as the information transmitter.
-
- No, I am only assuming that the battery is the ENERGY source. Information
- is transmitted by draining the battery, which is located at the destination
- end of the circuit in the first form of my example. So if we include the
- battery as part of the black-box receiver in Chicago, it's clear that
- during transmission of a message, the wires at the Dodge City end are
- warmed when the key is closed (dQ), which increases their entropy by an
- amount depending on their initial temperature-energy, Q. The entropy has to
- "flow" in the same direction as the energy flow. However, the message
- "flows" in the opposite direction.
-
- [Here I discovered my error]
-
- Actually, now that you pin me down, I realize that I've made a mistake, but
- not the one you mention. If a constant current flowing in a wire heats it,
- the temperature (Q) rises, and as it does so, with dQ constant, dQ/Q must
- be falling. So in fact, entropy flows OPPOSITE to the direction of flow of
- energy. I told you it's been a while. All this does is change my examples
- so that entropy flows in the reverse direction -- it still doesn't
- necessarily flow in the same direction as information.
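- The argument just made can be put in numbers (the energy values are
- arbitrary, chosen only to show the trend):

```python
# A fixed increment of energy dQ is absorbed at each step while the energy
# already present, Q, grows. In the dS = k(dQ/Q) form quoted above (taking
# k = 1), the entropy increment per step therefore shrinks as Q rises.
dQ = 10.0    # fixed energy increment per step (arbitrary units)
Q = 100.0    # energy already present
increments = []
for step in range(4):
    dS = dQ / Q            # entropy increment this step, with k = 1
    increments.append(dS)
    print(f"Q = {Q:5.1f}   dS = dQ/Q = {dS:.4f}")
    Q += dQ

# Each increment is smaller than the last: 0.1000, 0.0909, 0.0833, 0.0769.
```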
-
- >But the battery is NOT the originator of the message. It matters not a
- >whit whether we consider the battery to be part of the sender or the
- >receiver. The battery is thus more justifiably considered as part of
- >the medium of transmission. The message actually comes from the human
- >being who is putting out the dots and dashes. This *is* a flow of
- >energy from the human, and *does* decrease the entropy of the receiver
- >and increase the entropy of the source (and the universe).
-
- This "deduction" depends on insisting that energy DOES flow in the
- direction of the message -- you're begging the question. If energy or
- negative entropy flow is NOT the same thing as information flow, your
- argument is false. You can't (legally) assume your conclusion and then use
- it to prove that your conclusion is true. The battery is NOT, as you say,
- the originator of the message. But it IS the originator of the energy flow,
- and entropy flows in the opposite direction to energy.
-
- (By the way, if the telegraph operator is using a bug, there is no longer a
- single movement for each dot or dash, because the operator can simply hold
- the bug paddle sideways until the correct number of consecutive dots or
- dashes is perceived. And there's no energetic, or entropic, difference for
- the operator between making a dot and making a dash.)
-
- So in this case the entropy and the information are travelling in the same
- direction. By moving the battery to the sending end, you can make the
- entropy and information flows go the opposite way (as normally assumed in
- transmitting messages by wire, sound, light, or radio waves by sending
- energy through a medium from a transmitter to a receiver).
-
- >Compare what happens to the case of a transmitter that outputs dots and
- >dashes due to chaotic or random forces in the world around it. These
- >messages are less ordered, and thus higher entropy, than the messages
- >put out by the human.
-
- Why are they less ordered, when they are telling us in detail about some
- very complex processes that present an endlessly new pattern? Does a
- message contain less information when it is about a more complex process?
- This concept confuses the atomic type of random-seeming disorder with
- macroscopic disorder, a completely different proposition. I consider a
- phase plot of a chaotic system to contain information.
-
- The relation of order to entropy at any level but the atomic is an analogy,
- not an equivalence. "Order" is an experiential term based on our capacity
- to perceive pattern and sequence; physicists have attempted to appropriate
- it to mean only the reciprocal of statistical disorder, and then have
- turned around to say that this restricted meaning is the ONLY meaning, thus
- invalidating the ability to perceive pattern and sequence. Physicists, like
- behaviorists and other psychologists, thus have blamed our ignorance on
- nature. The moment they did that, physics ceased to progress and started to
- disintegrate (expensively) into particles.
-
- For a clearer example, just think of transmitting a dot-dash message by
- touching an ice-cube to someone's skin. The body loses heat to the ice-
- cube, decreasing the ice-cube's entropy and increasing that of the body and
- its "cold receptors" -- I hope I still have my signs right. Information
- being defined as the negative of the entropy change, the formal definitions
- of information theory would say that we are taking information out of the
- body and putting it into the ice-cube. If instead we use a warm soldering-
- iron, at a temperature well above the skin temperature, then the entropy of
- the body is decreased by each brief touch and that of the soldering iron is
- increased. So in that case formally defined information is flowing from the
- soldering iron into the body. In both cases, information (semantic) is
- being transmitted into the body, for sensory nerves respond in either case.
-
- If you want yet another example, consider sending a message from ground
- level to someone two stories up by opening and closing a valve that lets
- water out of a hose. There's no way that energy can be transmitted up the
- hose, or entropy down it, using the valve.
-
- I don't think that the originators of information theory were thinking very
- much in terms of nervous systems. I don't think that they were looking for
- counterexamples, either. Physicists pay little attention to the properties
- of human perception. Especially at the higher levels, they simply project
- them into an objective universe. When HPCT gets into physics, physics, too,
- will undergo a(nother) revolution.
-
- >But what Shannon and Weaver showed was that there is a key aspect of
- >communication, which is now usually called information, that is
- >independent of this "meaning" or semantic content and has nothing to do
- >with perception. I think they succeeded.
-
- They succeeded in analyzing the physical situation under the assumption
- that the source of energy would always be at the source of the
- transmission, and that the energy would then travel to and have an effect
- on the receiver. They made a blunder in assuming that you can only affect
- the receiver by putting energy INTO it, but that doesn't make much
- difference under the circumstances they were trying to analyze. They didn't
- even have to worry about PNP vs NPN transistors -- just vacuum tubes.
-
- They didn't have to use the word "information" at all, except that they
- hoped to draw a parallel between the physical interactions and the
- psychological or semantic world. They never considered any of the details
- of sensory perception or neural transmission, so it never occurred to them
- that energy entering the nervous system didn't simply proceed into neural
- channels and make its way to higher centers, like electron flow in a wire.
-
- >Your rubber band experiment merely shows one example of a case where
- >control is necessary for information to be transmitted. It says nothing
- >about whether such control is necessary for information transmission in
- >general. I don't think it is. Give me reason to believe otherwise.
-
- If you consider information transmission to consist only of objective
- signals traveling through a physical channel independently of human
- knowledge, you're talking about physical "information" -- simple lineal
- cause and effect. But that kind of information transmission (whichever way
- the energy and entropy go) does not explain communication among human
- beings, which is a closed-loop process. All it does is set the limits of
- accuracy in transmitting the level-zero message, as in Martin Taylor's
- Layered Protocol scheme. As I said in my talk, the _meaning_ of a
- communication must be supplied by the receiver, and it is not likely to be
- identical to the meaning intended by the transmitter. The difference is not
- due to channel noise, but to the different experiences of the human sender
- and the human receiver.
-
- In fact, symbolic communication is an iffy way of getting meaning from
- source to destination. Experience is always far more detailed than our
- communications about it. When a mover struggles into the living room
- carrying a chair, the owner may say "Just put it down anywhere." But that
- is impossible: the behaving system has to put it down EXACTLY SOMEWHERE, to
- the limit of perceptual resolution. Our actual control processes are
- quantitative to the limit set by system noise; our symbolic communications
- are vague and fuzzy in comparison, admitting of many variations in the fine
- details of meaning that would still fit the message. So in interpreting
- communications, we always add enormously more detail by way of meaning than
- the message can possibly carry. This is why we misunderstand each other so
- easily despite all the acks and naks and multiple-bit error-detection and
- correction that goes on between keyboard at one end and screen at the
- other. Even despite the dictionaries we keep at our elbows. We do not mis-
- receive or misread the letters; we translate them into the wrong meanings.
-
- That is why control is required: we must not just emit our messages blindly
- and assume that the intended meaning shows up at the other end. We must not
- just assume that what we read into messages we receive was intended to be
- launched. We must get information back -- first from our own fingers as
- they blunder about over the keys, then from our own screen that shows what
- code was actually produced by our own flakey keyboard (displayed in a form
- we easily recognize), and then from the recipient of the message, to see,
- if we can, what meaning the recipient assigned to the strings of symbols we
- stuffed into our end of the wire. Many rounds of this closed loop must be
- traversed before a wise transmitter will admit that the intended meaning
- may just possibly have been noticed at the receiving end. Isn't that what's
- going on here?
-
- >PS: I'm actually a lot more favourable to PCT than I appear in
- >my posts.
-
- I knew that. Once you understand PCT, you can't un-understand it again.
- It's a trapdoor.
-
- Best,
-
- Bill P.
-