
Neural networks and the Ising model

Traditional inquiry favours static representations of linguistic structure, despite the obvious fact that language processing unfolds in time. Researchers in artificial intelligence (AI) are well aware of the dynamical and temporal dependencies involved in producing natural language models (referred to as performance models). But the grammatical theory that underlies the competence model, and that provides the standard of comparison for the performance model, does not take dynamical features into consideration. The introduction of statistical approaches to compensate for the shortcomings of static rules has produced positive results; however, such approaches still evaluate context by extracting features defined by a conventional grammar. They remain inadequate because they do not acknowledge the dynamically generated features of natural language use, features that co-opt pre-adapted forms for new uses and thus place them beyond the reach of any fixed grammar [52].

The first model to combine non-trivial dynamics with predictable, useful behaviour in neural networks (NNs) was introduced by J. J. Hopfield. He presented a model that was both biologically plausible and tractable to formal analysis. A neural network is a set of constituents, usually referred to as nodes, coupled according to some rule. These nodes are neuron-like units that are typically arranged as sets of input and output nodes. More commonly, a layer of hidden nodes interconnects the input and output nodes; the weights attached to these hidden nodes are adjusted so that the network performs a specific computation. In all cases the elements are interconnected in such a way that the input of each node is determined by the states of some or all of the other nodes, and this interconnectivity forms a whole that can perform useful computations. Such networks are often arranged in layers, each feeding its result forward to the next layer so that more complex computations can be carried out; this kind of NN does not feed any of its results back in order to complete its computation. The XOR boolean function originally posed an insurmountable problem for single-layer networks. Years later, layered feedforward networks finally resolved it, so that an NN could compute XOR. Many feedforward neural networks are used to model natural language production, although with limited success. Hopfield's approach is significantly different.
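As an illustration of how a hidden layer overcomes the XOR limitation, the following sketch (not part of the original discussion) hard-codes a small two-layer feedforward network; the particular weights and the step activation are choices made for the example, not taken from any model cited here.

import numpy as np

def step(x):
    # Threshold activation: 1 if the weighted input is positive, else 0.
    return (x > 0).astype(int)

def xor_network(x1, x2):
    x = np.array([x1, x2])
    # Hidden layer: one unit acts as OR, the other as AND.
    W_hidden = np.array([[1.0, 1.0],    # OR-like unit
                         [1.0, 1.0]])   # AND-like unit
    b_hidden = np.array([-0.5, -1.5])
    h = step(W_hidden @ x + b_hidden)
    # Output layer: OR AND NOT(AND) = XOR.
    w_out = np.array([1.0, -1.0])
    b_out = -0.5
    return step(w_out @ h + b_out)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_network(a, b))
# Prints 0, 1, 1, 0: the hidden layer makes XOR linearly separable.

A single-layer network cannot draw one straight decision boundary that separates the XOR cases; the hidden layer re-represents the inputs so that the output unit can.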

The Hopfield model interconnects nodes with feedback, that is, each node serves as both input and output. The links between nodes are weighted, and each node can be in only one of two states. Mathematical and simulation analysis demonstrated that this kind of system evolves to stable fixed-point attractors. Furthermore, by manipulating the link strengths, it is possible to encode any chosen set of node states as attractor states. This flexibility allows the system to encode a variety of patterns.
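In the standard formulation (notation supplied here for concreteness, not quoted from the source), each node $s_i \in \{-1,+1\}$ is updated according to the sign of its weighted input, and the symmetric weights guarantee that an energy function decreases along the dynamics, which is why the network settles into a fixed-point attractor:
\[
s_i \leftarrow \operatorname{sgn}\!\Big(\sum_{j} w_{ij}\, s_j - \theta_i\Big),
\qquad
E = -\tfrac{1}{2} \sum_{i \ne j} w_{ij}\, s_i s_j + \sum_i \theta_i s_i ,
\]
with $w_{ij} = w_{ji}$ and $w_{ii} = 0$; each asynchronous update can only lower or preserve $E$, so the system cannot cycle and must converge to a stable state.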

The model was suggested as an explanation for the mechanism of associative memory. When the system is presented with an input pattern that is sufficiently close to a stored pattern, the input will invariably evolve towards that stored pattern. The input pattern may be incomplete or noisy; nonetheless, it will evolve dynamically, travelling from its initial state to the bottom of the corresponding basin of attraction.
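A minimal sketch of this recall behaviour, assuming the usual Hebbian storage rule and asynchronous updates (details chosen here for illustration, not taken from the text):

import numpy as np

rng = np.random.default_rng(0)

def store(patterns):
    # Hebbian rule: w_ij proportional to the sum over patterns of s_i * s_j,
    # with a zero diagonal.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=500):
    # Asynchronous updates: flip one randomly chosen node at a time
    # towards the sign of its weighted input.
    s = state.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Store one +/-1 pattern and recall it from a noisy copy.
pattern = rng.choice([-1, 1], size=50)
W = store(pattern[None, :].astype(float))

noisy = pattern.copy()
flip = rng.choice(50, size=10, replace=False)   # corrupt 10 of 50 bits
noisy[flip] *= -1

recovered = recall(W, noisy)
print("fraction of bits matching the stored pattern:", np.mean(recovered == pattern))

Starting from the corrupted input, the asynchronous dynamics slide down the energy landscape into the basin of the stored pattern, which is the associative-memory behaviour described above.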

This approach has also been applied to word disambiguation, that is, to the automated process of distinguishing the different senses of a vocable across contexts. The process is statistical rather than semantic and uses a network of Hopfield models [57].

Since the formal description of the Hopfield model is identical to that of an Ising spin glass (Section 5.1), the field of neural networks attracted many physicists from statistical mechanics, who studied the impact of phase transitions on the stability of neural networks.
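To make the correspondence explicit (standard notation, supplied here rather than quoted from the source), the Hopfield energy has exactly the form of an Ising spin-glass Hamiltonian, with node states playing the role of spins, connection weights the role of exchange couplings, and thresholds the role of external fields:
\[
E = -\tfrac{1}{2}\sum_{i \ne j} w_{ij}\, s_i s_j + \sum_i \theta_i s_i
\quad\longleftrightarrow\quad
H = -\sum_{i<j} J_{ij}\, \sigma_i \sigma_j - \sum_i h_i \sigma_i ,
\]
with $s_i \leftrightarrow \sigma_i \in \{-1,+1\}$, $w_{ij} \leftrightarrow J_{ij}$, and $\theta_i \leftrightarrow -h_i$.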

