The ability of neural networks to compute solutions
to optimization problems has been widely appreciated since Hopfield and
Tank's work on the travelling salesman
problem [Hopfield:85b]. Chapter 11 reviews the general work
in CP on optimization and physical and neural approaches. We have
examined whether neural network optimization can be usefully applied to
compiler optimizations. The problem is nontrivial
because compiler optimizations usually involve intricate logical
reasoning, but we were able to find an elegant formalism for turning a
set of logical constraints into a neural network [Fox:89l].
However, we concluded that the method will be viable only if and
when large, hierarchically structured neural networks can be built.
The neural approach to compiler optimization is worth pursuing, because
such a compiler would not be limited to a finite set of code
transformations and could handle unusual code routinely. Also, if the
neural network were implemented in hardware, a processor could perform
the optimizations at run time on small windows of code.
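The formalism mentioned above, turning logical constraints into a neural network, can be illustrated with a minimal sketch. The standard device (used in Hopfield-style optimization networks, and assumed here; the penalty weight `A` and grid shape are illustrative choices, not taken from the original) is to write each constraint as a quadratic penalty and read off the resulting pairwise weights and biases. The example below encodes a one-hot constraint, "each row of a state grid contains exactly one firing neuron," and lets asynchronous threshold updates settle into a state that satisfies it:

```python
# Sketch: encoding a logical constraint as a Hopfield-style network.
# Constraint (illustrative): each row has exactly one firing neuron,
# e.g. "each storage location holds exactly one quantity".
# Penalty energy per row: E = A * (sum_j x_j - 1)^2.  Using x^2 = x for
# binary units, this expands to pairwise inhibition w_ij = -2A (i != j)
# and a bias b_i = +A on every unit.

A = 1.0                  # penalty weight (assumed value)
N_ROWS, N_COLS = 4, 5    # illustrative grid size

def update_row(row):
    """One asynchronous sweep of threshold updates over a single row."""
    for i in range(len(row)):
        others = sum(row) - row[i]       # activity of the other units
        field = -2.0 * A * others + A    # local field: sum_j w_ij x_j + b_i
        row[i] = 1 if field > 0 else 0   # binary threshold unit
    return row

def settle(grid, sweeps=5):
    """Repeat update sweeps; each sweep lowers the penalty energy."""
    for _ in range(sweeps):
        for row in grid:
            update_row(row)
    return grid

# Start from an arbitrary, constraint-violating firing pattern.
state = [[1, 1, 0, 1, 0],
         [0, 0, 0, 0, 0],
         [1, 0, 1, 0, 0],
         [0, 1, 0, 0, 1]]
settle(state)
assert all(sum(row) == 1 for row in state)   # every row is now one-hot
```

Real compiler constraints ("a value must be loaded before it is used," "computations happen only in the register") would each contribute further penalty terms in the same way, which is why the network grows so quickly with problem size.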
Figure 13.11 shows how a simple computation is
scheduled by a neural network. The machine state is represented at five
consecutive cycles by five sets of 20 neurons. The relevant portion of the
machine comprises the three storage locations a, b, c and a register
r, and the machine state is defined by showing which of the five quantities,
A, B, C, and the two intermediate results, occupies which location. A firing
neuron (i.e., shaded block) in a quantity's row and column r indicates that
the quantity is in the register. The neural network is set up to ``know'' that
computations can only be done in the register, and it produces a firing
pattern representing a correct computation.
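The figure's state representation can be sketched directly: each cycle is a grid with one row per quantity and one column per location, and a firing neuron marks which quantity occupies which location. In this hedged sketch, `T1` and `T2` stand in for the two intermediate results, whose actual names are not given in the text:

```python
# Hypothetical decoding of one cycle of Figure 13.11's representation:
# a 5x4 binary grid (quantity row x location column), 20 neurons per cycle.
LOCATIONS = ["a", "b", "c", "r"]          # three storage locations + register
QUANTITIES = ["A", "B", "C", "T1", "T2"]  # T1, T2: assumed intermediate results

def decode_cycle(firing):
    """Map a 5x4 binary firing grid to a {location: quantity} machine state."""
    state = {}
    for quantity, row in zip(QUANTITIES, firing):
        for location, fired in zip(LOCATIONS, row):
            if fired:
                state[location] = quantity
    return state

# Initial cycle: A, B, C sit in storage; the register is empty.
cycle0 = [[1, 0, 0, 0],
          [0, 1, 0, 0],
          [0, 0, 1, 0],
          [0, 0, 0, 0],
          [0, 0, 0, 0]]
print(decode_cycle(cycle0))   # {'a': 'A', 'b': 'B', 'c': 'C'}
```

A correct schedule is then a sequence of five such grids in which the register column changes only in ways the machine's constraints allow.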
Figure 13.11: A Neural Network Can Represent Machine States (top) and
Generate Correct Machine Code for Simple Computations (bottom).
The neural compiler was conceived by Geoffrey Fox and investigated by Jeff Koller [Koller:88c], [Fox:89l], [Fox:90nn].