THE NATURE OF THE PROOF Part I

     The proof belongs to Information Theory: it concerns data transfer,
learning (as a verb, the act of learning), and in particular mechanical
learning.

     Learning as a verb is a causal PROCESS by which both THEORETICAL
and CERTAIN knowledge are gained.

     There is the LEARNER, the LEARNED, and the LEARNED ABOUT.

     The learner is the one who learns.

     The learned is what is learned, the 'learning' in the noun sense.

     The "learned about" is the object being learned about, about which
the learning is true.

     Learner and learned about are TWO DIFFERENT OBJECTS separated by
space and time.

     The learned or learning is a CHANGE IN STATE in the learner, caused
by the learned about.

     If there is no change in state in the learner, nothing is learned.

     The change in state in the learner, including any other changes in
state resulting from the first change in state, IS the learning.

     Theoretical knowledge is born of generalizing from direct or
indirect observations or instances.

     Certain knowledge is a description of the direct observations
themselves.

     A direct observation is simply the process of looking at the thing
itself.

     An indirect observation is the process of learning about an object
by looking at another object causally related to, or the effect of, the
first object.  This produces only theoretical knowledge, never certain
knowledge.

     Since there is no, and cannot be any, direct observation in the
physical or mechanical universe, all observations made of or in the
physical universe are indirect in nature, and thus produce only
theoretical conclusions.

     Direct observations produce perfect certainties.

     Indirect observations produce theories made of evidence and models.

     Very quickly, if we are trying to learn about A by looking at B,
then B is the evidence, the model is the postulated existence of A's
causal imprint in B, and the theory is the postulated causal relation
between A and the changes in state observed in B.

     The above presents us with the need to define the following terms.

     Causation, learning, machine, and certainty.

     Causation means that changes in state in one object, A, NECESSARILY
result in changes in state of a second different object, B, a moment
later.

     The 'moment later' results from the fact that the speed of
causation is finite, bounded by the speed of light.

     Light itself is a form of causal messenger wave.

     The causing object is called the referent, and the affected object
is called the symbol.

     Throw a light switch and the light bulb turns on.

     The switch is the referent, and the light bulb is the symbol.

     We can theoretically judge (learn about) the state of the switch
(A) by looking at the state of the light bulb (B).

     We say the state of the symbol TRACKS the state of the referent.

     Notice, the process of learning about the state of the switch from
the state of the light bulb DEPENDS absolutely on there being a valid
causal pathway between switch and light bulb.
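     The tracking relation just described can be sketched in code.  The
names below (Switch, causal_pathway) are hypothetical stand-ins for the
switch-and-bulb example, not anything defined in the text:

```python
# The referent (switch) causes changes of state in the symbol (bulb).
# Learning about the switch from the bulb depends entirely on the
# causal pathway between them being intact.

class Switch:
    """The referent: the causing object."""
    def __init__(self):
        self.on = False

def causal_pathway(switch, bulb, intact=True):
    """Propagate the referent's state to the symbol, if the pathway is valid."""
    if intact:
        bulb["lit"] = switch.on

switch = Switch()
bulb = {"lit": False}  # the symbol: its state TRACKS the referent

switch.on = True
causal_pathway(switch, bulb)  # valid pathway: the symbol tracks
print(bulb["lit"])            # prints True: we can infer the switch's state

switch.on = False
causal_pathway(switch, bulb, intact=False)  # broken pathway: no learning
print(bulb["lit"])            # still True: the symbol no longer tracks
```

     With the pathway broken, the bulb's state tells us nothing true
about the switch; that is the sense in which learning DEPENDS on a
valid causal pathway.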

     Without causation, meaning in the absence of valid causal pathways
between referent and symbol, there can be no learning.

     Notice that using MORE causal pathways to verify the first causal
pathway between switch and light leaves open the question of whether
the second set of causal pathways is itself valid.

     Thus we can state that:

     Causal pathways cannot be used to validate other causal pathways
with certainty.

     For the grammatically minded:

     MORE CAUSAL PATHWAYS DO NOT A MORE CERTAIN CAUSAL PATHWAY MAKE.

     Learning is any change in state in the LEARNER that is causally
related to and thus symbolizes the nature of the LEARNED ABOUT, where
the learner and the learned about are two different objects.

     The change in state in the learner is a SYMBOL OF FINAL AUTHORITY
for the learned about which is the ORIGINAL REFERENT.

     A machine is defined as any system of objects interacting via cause
and effect across a space-time distance.

     Machines learn by being an effect, by BEING the second object, the
learner, which is changing state as a causal result of the learned
about, namely the external physical universe impinging upon the machine.

     In the mechanical world, all learning is symbolic in nature,
because the learner is a different object than the learned about.

     For example, a learning machine can take a video picture of a cow
out in the physical universe.

     The picture of the cow is not a cow, but contains high DATA CONTENT
about the cow.

     Further, the data in the picture also looks like a COW!

     We call this high geometric similarity between the cow and its
picture high GEOMETRICITY.

     Notice that a picture of a cow may look exactly like a cow, but
it's still a symbol for the real thing.

     Thus as a symbol, the picture of the cow has both high data content
and high geometricity relative to the original referent.

     One could, however, scan that picture into an encrypted data stream
that did not look like a cow at all, yet retained recoverable data about
the cow.  Or one could write a book about the cow or its picture,
describing it in words.

     Both the encrypted data stream and the book are symbols for the
cow.

     Both of these symbols still have high data content but very low
geometricity.
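     The contrast between data content and geometricity can be sketched
with a toy XOR "encryption" standing in for the encrypted data stream;
the picture bytes and key below are hypothetical stand-ins:

```python
# A toy XOR cipher: the encrypted stream no longer resembles the
# original (low geometricity), but the original data is fully
# recoverable from it (high data content).

picture = b"COW COW COW"  # stands in for the picture's bytes
key = bytes((i * 37 + 11) % 256 for i in range(len(picture)))  # toy key

encrypted = bytes(p ^ k for p, k in zip(picture, key))
print(encrypted != picture)  # prints True: the stream does not look like a cow

decrypted = bytes(c ^ k for c, k in zip(encrypted, key))
print(decrypted == picture)  # prints True: the data content is conserved
```

     XOR with the same key is its own inverse, so the data survives the
hop even though every trace of the picture's shape is gone.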

     As data flows from the original referent to a symbol of final
authority through a causal pathway of many hops, the geometricity may
change from high to low and back again many times, but the data content
is hopefully conserved as it travels its path.

     Since each causal hop adds its own component of change into each
symbol along the way, sometimes the data from the original referent can
be covered in so many other changes that it becomes unrecoverable.

     Have you ever received a fax of a perfectly good picture or text
that was nonetheless badly marred or unreadable in sections because of
added effects from the sending fax machine?

     Digitization, and protocols for transmission and retransmission of
data, help greatly with this problem of data decay.

     But in the natural physical universe, most data pathways are
analogue in nature, and thus original data can get covered by so many
other effects added later in the chain, that the original data falls
below the noise floor of the transmission and becomes unrecoverable.
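     The difference between analogue decay and digital regeneration can
be sketched with a small simulation.  The noise model (Gaussian noise
added at each hop) and all the names below are assumptions chosen for
illustration:

```python
# Each hop adds its own component of change (noise).  Analogue hops let
# the noise accumulate; digital hops re-threshold (regenerate) the
# symbol, discarding the accumulated noise at every step.

import random

random.seed(0)  # reproducible run

def analog_chain(value, hops, noise=0.05):
    """Pass a value through `hops` analogue stages; errors accumulate."""
    for _ in range(hops):
        value += random.gauss(0, noise)
    return value

def digital_chain(bit, hops, noise=0.05):
    """Pass a bit through `hops` digital stages; each stage regenerates it."""
    value = float(bit)
    for _ in range(hops):
        value += random.gauss(0, noise)
        value = 1.0 if value >= 0.5 else 0.0  # re-threshold each hop
    return int(value)

print(analog_chain(1.0, 100))  # drifts away from 1.0 as noise accumulates
print(digital_chain(1, 100))   # stays 1: each hop discards its noise
```

     With per-hop noise this small, flipping the digital bit would need
a ten-sigma noise sample, so the bit survives every hop; the analogue
value, by contrast, walks randomly away from the original and can
eventually sink below the noise floor.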

      Homer

------------------------------------------------------------------------
Homer Wilson Smith     The Paths of Lovers    Art Matrix - Lightlink
(607) 277-0959 KC2ITF        Cross            Internet Access, Ithaca NY
homer@lightlink.com    In the Line of Duty    http://www.lightlink.com
Sun May 15 15:26:30 EDT 2011