Information is Protophysical


Douglas J. Matzke
Texas Instruments
Dallas, Texas
matzke@ti.com
Presented at PhysComp96, Boston Univ, Nov 1996

Abstract:

Thought experiments led Einstein to discover that gravity and
acceleration are equivalent, which in turn led to his famous general
theory of relativity. Thought experiments can also be used in critical
thinking about the relationship between physics and computation. This
paper will discuss several computation-oriented thought experiments
related to spacetime (or gravity) that surfaced during the planning of
the PhysComp 92 and PhysComp 94 conferences. Thought-provoking questions
will be developed from these thought experiments, as well as from other
questions people asked during this period. These thought experiments and
questions will be discussed in light of actual research bearing on the
issues, and the result will be the view that information laws are
topological constraints that precede physical laws and are therefore
protophysical.

1.0 Introduction to information and spacetime:

When Dr. Landauer argued that information is physical [1], he turned the
concept of information from a mathematical exercise into a physical
reality. In principle, therefore, information is just another kind of
energy or matter (or vice versa). This notion of information does not
exactly square with the information theory view, in which information is
a mathematical measure for modeling communication systems, or with the
computer engineering and computer architecture view, in which
computation is defined by physical spatial (memory in Mbytes) and
temporal (CPU in MIPS) resources, with the actual energy cost a
technology-dependent variable.

Computer science has traditionally been concerned with the abstract
costs of computation, whereas engineers have been concerned with the
physical mechanisms and physical costs of computing. As computer
technology continues to scale, there will be less of a clean separation
between the abstract and physical computational layers. Also, as
computer scientists continue to demand exponentially more computing
resources for tough problems, we are faced with the reality of computing
resource technology limits looming in the future.

For these reasons, many scientists are looking at quantum computing as a
way to provide more computing power than semiconductor scaling alone
will allow. In a similar vein, perhaps computational leverage could also
be obtained by looking at relativistic notions of space, time, and
observer frames.

This approach of looking at relativity theory may not seem like the most
obvious one, but much work has been done combining relativity and
information theory with quantum mechanics. For example, the black hole
work of Schiffer and Bekenstein [2] led to the understanding that black
holes are gigantic "bit buckets," with a bit corresponding to the
intrinsic quantum of surface-area increase. Likewise, the generalization
by Unruh [3] showed that gravitational fields impact information
transfer rates due to gravitationally induced thermal noise. Empty space
itself also represents an intrinsic zero-point energy potential
supporting quantum fluctuations, and must therefore represent an
intrinsic computational potential, since information and energy are
related. Even the big bang must have started in a very special entropy
state to allow the universe to keep running ever since toward a
thermodynamic oblivion of uniform heat death or big crunch.

Connections between gravity and computation are expected if both deal
intrinsically with spacetime and energy. Is it possible to take this
kind of thinking to the next step and find some connection between
computation and gravity, where computation is viewed as information
dynamics? Ideally this understanding would also generalize to include
the non-standard spacetime metrics obtained through quantum computing
leverage.

Most of the unified field theories propose multiple dimensions of space
to solve supersymmetry constraints [4]. Interestingly, many of the most
successful computer science concepts also deal with supporting
higher-dimensional semantics and mechanisms. For example, virtual memory
pointers can be thought of as spatial microcode that allows
representation of high-dimensional topologies using the zero-dimensional
topology of virtual memory space. Likewise, coding theory,
content-addressable memories, neural networks, and object-oriented
programming all deal with efficiently supporting mechanisms for
high-dimensional semantics. Again, it is no surprise that topology is
the essence of gravity as well as of information and computation
theories.
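
As a concrete illustration of this pointer idea (a minimal sketch, not
from the original text), the neighbors of a vertex in an n-dimensional
hypercube can be reached within a flat address space purely by index
arithmetic; the arithmetic itself acts as the "spatial microcode" that
recovers the high-dimensional topology:

    # Sketch: an n-dimensional hypercube topology stored in a flat
    # address space. Vertex addresses are the integers 0 .. 2^n - 1;
    # flipping bit i of an address moves one step along dimension i.
    def hypercube_neighbors(address, n):
        """Return the flat addresses adjacent to `address` in an n-cube."""
        return [address ^ (1 << i) for i in range(n)]

    # Example: in a 4-cube, vertex 5 (binary 0101) has 4 neighbors.
    print(hypercube_neighbors(5, 4))   # [4, 7, 1, 13]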

The major difference between gravitational and computational spacetime
is the difference between actual physical spaces and simulated
spaces. The most obvious difference is that simulated spaces do not
carry real energy, or a simulation of a nuclear reactor would destroy
the computer and the building. Less obvious is that physical spaces have
real "time" metrics and isotropy. Isotropy is the ability of a system to
behave consistently regardless of the axis of observation or
movement. High-dimensional simulated spaces will demonstrate anisotropic
metrics when measured against physical time (as opposed to virtual time
metrics).
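
A minimal sketch of this anisotropy, assuming nothing beyond standard
NumPy: summing a simulated 3-space along different axes performs the
same abstract work, but the physical (wall-clock) time typically
differs, because only one axis is contiguous in the underlying
one-dimensional memory:

    # Sketch: a simulated 3-space laid out in linear memory. Traversals
    # along different axes do identical abstract work, yet usually take
    # different physical time -- the simulated space is anisotropic
    # when measured against physical rather than virtual time.
    import time
    import numpy as np

    space = np.zeros((256, 256, 256))
    for axis in range(3):
        start = time.perf_counter()
        for _ in range(10):
            space.sum(axis=axis)
        print(f"axis {axis}: {time.perf_counter() - start:.3f} s")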

With this introduction, the rest of the paper discusses several thought
experiments and many thought-provoking questions to elicit more
understanding about the relationship between computation and spacetime.

2.0 Spacetime Thought Experiments

Conventional thinking about exponential algorithms leads to the
classical understanding of NP-complete problems known today. If this
thinking is put into the context of a physics thought experiment, the
following description applies. NP-complete problems are those whose
known solutions scale exponentially with the number of elements in the
solution. For very large problems, when the amount of spatial resources
(defined as the number of computing elements) is limited, running the
algorithm takes more time than the predicted life of the universe. This
outcome remains a fact independent of technology scaling.
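
The arithmetic behind this claim is easy to check (a sketch with
illustrative numbers, not figures from the original text):

    # Sketch: with spatial resources fixed at one machine running 10^18
    # operations per second (a generous assumed figure), find the
    # problem size n at which a 2^n-step algorithm outlasts the age of
    # the universe (~4.3 * 10^17 seconds).
    import math

    ops_per_second = 1e18
    age_of_universe = 4.3e17              # seconds
    total_ops = ops_per_second * age_of_universe
    n_limit = math.log2(total_ops)
    print(f"2^n exceeds the universe's lifetime near n = {n_limit:.0f}")
    # -> roughly n = 118; faster technology barely moves this, since
    #    each doubling of speed buys only one more element.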

Another, more interesting variant of this thought experiment was shared
by a person contacted for PhysComp 92. His thought experiment turned
this solution on its side and suggested a time-bound solution to the
problem. Imagine an answer to a very hard computational problem is
desired within a time T. Based on the speed of light, this time limit
places an upper bound on the size of the computer: the distance light
can travel during that time. The time bound also determines the number
of computing elements that must be used to produce the answer, based on
the exponential number of subsolutions. Assuming that each computing
element has a mass, it is possible to define the total mass of this
supercomputer. Fitting this mass inside the previous size constraint
would exceed the mass density limit for a black hole event
horizon. No one will fund this effort.
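
This back-of-the-envelope argument can be made quantitative (a sketch;
the one-gram element mass and the one-year deadline are assumed values):

    # Sketch: demand an answer within time T. Light speed bounds the
    # computer's radius at R = c*T; packing 2^n one-gram elements
    # inside gives mass M = 2^n * m. The machine collapses into a black
    # hole once its Schwarzschild radius 2GM/c^2 reaches R.
    import math

    c = 3.0e8            # m/s
    G = 6.67e-11         # m^3 / (kg s^2)
    T = 3.15e7           # one year, in seconds (assumed deadline)
    m = 1e-3             # kg per computing element (assumed)

    R = c * T                        # maximum radius of the computer
    M_limit = R * c**2 / (2 * G)     # mass at which 2GM/c^2 = R
    n_limit = math.log2(M_limit / m)
    print(f"event horizon forms near n = {n_limit:.0f} elements")
    # -> roughly n = 152: even modest exponential problems cannot be
    #    packed into spacetime.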

Spatially bounded solutions to exponential problems appear to have a
mathematical interpretation, but temporally bounded solutions suggest a
very strong physical interpretation. Both of these thought experiments
actually have the same physical interpretation, namely that
exponentially hard problem solutions do not fit into our physical
universe. The leverage obtained by quantum computing is so interesting
because it is based on superposition principles, which effectively
exist outside the spacetime of conventional deterministic computers.
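
A small sketch makes the superposition point concrete: describing n
qubits classically already requires 2^n amplitudes, a resource that has
no natural home in the 3-space of a deterministic machine:

    # Sketch: the state of n qubits, each in an equal superposition, is
    # a vector of 2^n complex amplitudes. A classical simulator must
    # store every amplitude explicitly; the quantum system holds them
    # in superposition instead.
    import numpy as np

    n = 10
    plus = np.array([1.0, 1.0]) / np.sqrt(2)   # single qubit (|0>+|1>)/sqrt(2)
    state = np.array([1.0])
    for _ in range(n):
        state = np.kron(state, plus)
    print(state.size)        # 1024 amplitudes for only 10 qubits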

The second thought experiment, called the twin paradox, is an early
relativity thought experiment regarding time dilation. It involves a
pair of twins in the space program. The first twin boarded an advanced
spaceship, accelerated toward a distant star to a large percentage of
the speed of light, then decelerated, turned around, and returned to
earth in a similar fashion. The second twin stayed on earth. When the
first twin returned from his trip, he had not aged very much compared to
his now much older twin brother who stayed on earth. The inflight twin
not only got to see the universe, but effectively aged more slowly as a
result.

The alternative version of this thought experiment uses twin
computers. The stay-at-home computer works on a problem that will take
the length of the trip. The other computer, bound for space, works on
the same problem. When the inflight computer returns, it is only
partially done, while the stay-at-home twin has already solved the
problem. Staying younger seems like an advantage for humans, but for
computers remaining young is a disadvantage.
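
Under special relativity the bookkeeping is simple (a sketch; the 0.9c
cruise speed is an assumed figure, and the acceleration phases are
ignored):

    # Sketch: proper time available to the inflight computer, assuming
    # a constant cruise speed. tau = t * sqrt(1 - (v/c)^2): at 0.9c, a
    # ten-year (Earth frame) round trip gives the traveling computer
    # only ~4.4 years of internal clock ticks to compute with.
    import math

    def proper_time(earth_time, v_over_c):
        return earth_time * math.sqrt(1.0 - v_over_c**2)

    print(proper_time(10.0, 0.9))   # ~4.36 years of computing done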

The analysis of the twin computer paradox suggests that the optimal
manner of computing an answer is to avoid any large accelerations. This
leads to the question,

        "Is there any kind of motion that can 
        accelerate the computation rate of a computer?"

The answer to this question is to accelerate the rest of the universe
away from the stationary computer. If this were done simultaneously for
all three dimensional axes, then the computer would be more efficient
than another computer inside the universe. Of course, to accomplish
this, the universe would have to move along a higher-dimensional axis
orthogonal to our normal three dimensions of space. If this motion is
completely relative, then intuitively we can believe that moving
relative to the known three axes of space may give computational
leverage. In other words, higher-dimensional computers may be more
efficient than lower-dimensional computers, especially for mapping
higher-dimensional applications. This is certainly true in the limit of
computer scaling [5].
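
One way to see why, grounded in standard parallel-machine analysis
rather than taken from [5]: in a d-dimensional mesh of N processors the
worst-case communication distance shrinks like N^(1/d), so
higher-dimensional layouts pay less for signal propagation:

    # Sketch: worst-case hop count (diameter) of a d-dimensional mesh
    # of N processors is roughly d * (N^(1/d) - 1); higher-dimensional
    # layouts shorten the longest communication path.
    N = 2**20                       # about one million processors
    for d in (1, 2, 3, 6):
        side = round(N ** (1.0 / d))
        print(f"d={d}: diameter ~ {d * (side - 1)}")
    # d=1: ~1,048,575 hops; d=2: ~2,046; d=3: ~303; d=6: ~54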

This thought experiment raises a question regarding higher-dimensional
spaces: what are the time properties of higher-dimensional spaces (most
likely without mass, since mass is a property of 4space)? As suggested
by time dilation for photons themselves and by Bell's theorem,
higher-dimensional systems (outside 4space) appear to be comparatively
atemporal. This is expected, since our time emerges as a result of
consistency constraints from relativity and quantum mechanics.

Just as high-dimensional semantics are important to computation, higher
dimensions go one step beyond simulated spaces by reintroducing isotropy
and by redefining temporality and locality (since these are strongly
linked notions). Compared to the energy metrics of 4space, the atemporal
and nonlocal notions of higher-dimensional spaces appear to be more like
an information metric, where every inertial frame and quantum
interaction can be viewed as a large system of information constraints.

3.0 Thought Provoking Questions

One of the earliest questions asked by a PhysComp colleague was, "How
heavy is a bit?" As a result of Bekenstein's work we can now answer that
question: roughly a Planck area's worth of energy/matter. The next
version of that question is, if a bit is heavy, how many bits does it
take to define each of the primitive particles of physics?
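
A rough version of that answer can be computed directly (a sketch
consistent with the Bekenstein-Hawking entropy formula; the solar-mass
example is illustrative):

    # Sketch: black hole entropy is S = A / (4 * l_p^2) in natural
    # units, i.e. about one bit per 4*ln(2) Planck areas of horizon.
    # Count the bits on the horizon of a solar-mass black hole.
    import math

    hbar = 1.05e-34      # J s
    G = 6.67e-11         # m^3 / (kg s^2)
    c = 3.0e8            # m/s
    M_sun = 2.0e30       # kg

    l_p2 = hbar * G / c**3                 # Planck area, m^2
    r_s = 2 * G * M_sun / c**2             # Schwarzschild radius, m
    A = 4 * math.pi * r_s**2               # horizon area, m^2
    bits = A / (4 * math.log(2) * l_p2)
    print(f"~10^{math.log10(bits):.0f} bits")   # ~10^77 bits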

This question can be approached from a computer science perspective, as
if each particle were a token of a message and the particles defined the
symbol set of nature. Unfortunately, this abstract solution does not
comprehend the complete picture, since these particles do not occur with
equal probability in the message. (Physics experiments can be thought of
as information-dynamics simulations.) Additionally, since information is
physical, this approach must comprehend the energy dynamics due to
fields and also include Bekenstein's work. Lastly, quantum constraints
and zero-point energy constraints (the energy potential of free space)
must also be comprehended as information.
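
The unequal-probability correction is exactly Shannon's: the average
information per token falls below log2 of the alphabet size once the
symbols are biased (a sketch with made-up particle frequencies, not
measured abundances):

    # Sketch: average bits per "particle token" under Shannon's measure
    # H = -sum(p * log2(p)). The probabilities below are illustrative.
    import math

    abundances = {"photon": 0.70, "neutrino": 0.25,
                  "electron": 0.04, "proton": 0.01}
    H = -sum(p * math.log2(p) for p in abundances.values())
    print(f"{H:.2f} bits/token vs {math.log2(len(abundances)):.0f} "
          "bits for equally likely symbols")   # 1.11 vs 2 bits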

This question is supported by the work of Noyes and Kauffman [6], which
starts from the assumption that physical laws are derived from discrete
information dynamics. Supersymmetry solutions are also topologically
oriented and related to knot theory, which has a strong basis in
topological constraints. All of these solutions are higher-dimensional
solutions. John Wheeler understood this relationship between information
and gravity [7] and also proposed a pregeometric way of thinking
[8]. More work must be done to answer this question.

The quantum probability solutions and higher-dimensional solutions raise
other questions besides atemporality. Can we have information encoded as
topologies without using matter/energy structures? Another way to pose
the same question is, what is the information encoding mechanism of
quantum probability distributions (and other fields)? Or, can
information be encoded directly as topological spacetime without
thinking of it as a particle (i.e., the graviton)?

These questions are supported by the notion that spin is a property
independent of energy properties, yet effectively represents an
information-oriented property that must be preserved even in the face of
black holes. Likewise, quantum coherency in EPR is like an acausal
constraint, where either part can be influenced and affect the coherent
whole. Simply labeling new concepts such as qubits and ebits is useful,
but what do they mean in terms of the information encoding mechanism
behind physics constraints? All conservation laws must be built on top
of some more primitive consistency mechanisms that must have topological
properties and information dynamics.

Another provoking question concerns a subtle aspect of simulating real
spaces. In order to simulate the dynamics of relativity, a model of an
inertial frame must be represented in the computer. In order to generate
the relativistic view of other elements and fields, this data
representation must contain information regarding velocity, direction,
position, etc., to be used in the computation task. More important, the
inertial frame represents information state associated with the
observer.
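
The point is visible in any relativistic simulation code (a minimal
sketch, one spatial dimension, units where c = 1): the frame is
literally a bundle of stored state against which every observation is
computed:

    # Sketch: an inertial frame as explicit observer state in a
    # relativity simulation. Transforming an event into the frame uses
    # the frame's stored velocity -- the "observer" is a concrete data
    # structure, not a disembodied abstraction. Units with c = 1.
    from dataclasses import dataclass
    import math

    @dataclass
    class InertialFrame:
        v: float                      # velocity relative to lab frame

        def observe(self, t, x):
            """Lorentz-boost a lab-frame event (t, x) into this frame."""
            gamma = 1.0 / math.sqrt(1.0 - self.v**2)
            return gamma * (t - self.v * x), gamma * (x - self.v * t)

    ship = InertialFrame(v=0.8)
    print(ship.observe(t=1.0, x=0.0))   # (1.666..., -1.333...)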

Therefore, if information is physical, then inertial frames must also be
physical. Unfortunately, physicists treat inertial frames as pure
mathematical abstractions with no physical reality. This is simply shown
by the fact that inertial frames cannot be acted upon. Another
observation is that an inertial frame cannot actually be used as a rest
frame for photons, as if the frame had some mass associated with it.

This line of thinking is very fundamental, because inertial frames
formally define the observer for relativity, just as quantum collapse is
the observation mechanism for quantum mechanics. Processes (running on
CPUs) can be thought of as the observer frame for computation, causing
the computation to unfold at decision points. Can all three notions of
an observer be combined into a comprehensive "observer" abstraction that
is consistent with information, quantum, and relativity theories?

It is clear that giving equal weight to information mechanics alongside
energy mechanics within physics raises very interesting questions. This
is very similar to the kind of process that happened with particle/wave
duality at the beginning of this century. Information/energy duality
will most likely be the dominant paradox that must be resolved
next. Unified field theories that do not explicitly include
computational and informational perspectives will most likely be
incomplete.

4.0 Conclusions

This paper does not give many answers, but the questions raised about
the intrinsic informational properties associated with spacetime, and
about physical "conservation" constraints, support the view that the
universe is a large constraint system. Physical laws must be supported
by some information mechanism with topological properties that give rise
to isotropy, 4space energy metrics, coherency, nonlocality,
atemporality, and acausal characteristics. The exact mathematical
solution is unclear at this time, but these thought experiments and
questions suggest that unified theories using higher-dimensional
solutions must have an information metric orientation, and not just more
of the 4space energy metric thinking. Therefore, information within
4space is physical, but information is protophysical for mechanisms
outside our 4space-bounded energy metric.


5.0 References

[1] Landauer, Rolf. 1992. "Information is Physical." Proceedings of the
Workshop on Physics and Computation. IEEE Computer Society Press.

[2] Schiffer, M. 1993. "The Interplay between Gravitation and
Information Theory." Proceedings of the Workshop on Physics and
Computation, PhysComp 92. IEEE Computer Society Press.

[3] Unruh, W. G. 1976. "Notes on Black Hole Evaporation." Phys. Rev. D
14, 870.

[4] Kaku, Michio. 1994. Hyperspace: A Scientific Odyssey Through
Parallel Universes, Time Warps, and the Tenth Dimension. Oxford
University Press.

[5] Preparata, Franco. May 1993. "Horizons of Parallel Computation."
Brown University, Technical Report No. CS-93-20.

[6] Kauffman, L., and H. P. Noyes. 1996. "Discrete Physics and the
Derivation of Electromagnetism from the Formalism of Quantum
Mechanics." Proc. of the Royal Society A 452, pp. 81-95.

[7] Wheeler, John. 1989. "It from Bit." Proceedings of the 3rd
International Symposium on Foundations of Quantum Mechanics, Tokyo.

[8] Misner, C., K. Thorne, and J. Wheeler. 1973. Gravitation,
pp. 1203-1212. W. H. Freeman and Company, New York.