Patent No. 5842190
System and method for high speed computing and feature recognition capturing aspects of neocortical computation (Sutton et al., Nov. 24, 1998)
Abstract
A network of networks system and method includes a neuron having processing capabilities; a first level module including a network of a plurality of interconnected neurons, the first level module also having processing capabilities; and a second level module including a network of interconnected networks of interconnected neurons, the second level module also having processing capabilities; wherein the first and second level modules are interconnected through neuron to neuron connections such that simultaneous processing can be carried out by the neuron and by the first and second level modules. The system and method also includes means for forming a boundary between a first module having a first memory state and a second module having a second memory state such that the module comprising the boundary attains a third memory state distinct from the first and second memory states.
FEDERALLY SPONSORED RESEARCH
This invention was made with government support under Contract No. N00014-19-J-4032
awarded by the Office of Naval Research. The United States government has certain
rights in the invention.
FIELD OF THE INVENTION
The present invention generally relates to a neural network method and system
and, more particularly, to a "Network of Networks" method and system. The system
and method of the invention importantly are able, without intervention from
any external source, to map features and combinations of features of input stimuli
in a theoretically infinite progression, such that they are capable of responding
in distinctive ways to the features or combinations of features whenever presented.
The invention is particularly well-suited to solve complex problems and to model
the structure and function of aspects of the human brain for basic scientific
and clinical purposes.
BACKGROUND OF THE INVENTION
"Neural networks" are so called because they include a plurality of interconnected
neuron-like elements, which respond in certain ways to input stimuli, as do
analogous structures of the human brain. They are designed to achieve varying
degrees of accuracy based on a number of simplifying assumptions. Neural networks
commonly provide an architecture for modeling or constructing systems that "learn"
through iterative processing during which the strength of connections between
elements of the network is altered, enabling the realization of associations
between input and output patterns of potentially great complexity. Neural networks
can differentiate among input patterns so that, for example, the input pattern
"A" gives rise (maps) to an output pattern or signal "a," input pattern "B"
maps to "b," and so on. If constructed in a particular manner, a neural network
system can perform standard mathematical or logical operations or calculations
such as addition, integration, logical "exclusive or", etc. All such operations
will hereinafter be referred to as "computations" for ease of expression.
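The pattern mapping just described ("A" gives rise to "a", "B" to "b") can be sketched with a minimal associative network. The Hebbian storage rule and the specific bipolar patterns below are illustrative assumptions, not taken from the patent:

```python
# Minimal associative mapping: two bipolar input patterns are stored with a
# Hebbian outer-product rule, then recalled by a thresholded weighted sum.

def train(pairs):
    """Build a weight matrix from (input, output) bipolar pattern pairs."""
    n_in, n_out = len(pairs[0][0]), len(pairs[0][1])
    W = [[0.0] * n_in for _ in range(n_out)]
    for x, y in pairs:
        for i in range(n_out):
            for j in range(n_in):
                W[i][j] += y[i] * x[j]  # Hebbian: strengthen co-active pairs
    return W

def recall(W, x):
    """Map an input pattern through W and threshold to a bipolar output."""
    return [1 if sum(w * xj for w, xj in zip(row, x)) >= 0 else -1
            for row in W]

# Input pattern "A" maps to output "a"; "B" maps to "b".
A, B = [1, 1, -1, -1], [1, -1, 1, -1]
a, b = [1, -1], [-1, 1]
W = train([(A, a), (B, b)])
print(recall(W, A) == a, recall(W, B) == b)  # True True
```

Because A and B are orthogonal here, the two stored associations do not interfere; differentiating richer pattern sets requires more neurons or more elaborate learning rules.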
Neural networks commonly operate with varying degrees of accuracy and may not
exhibit the strictly deterministic characteristics of traditional computing
means; that is, neural networks may not consistently and accurately, within
a precisely known range of accuracy, reproduce a particular output when presented
with a particular input. This lack of predictability in some, but not all, neural
networks is related to the ability of such networks to continue to evolve so
that they exhibit improved performance when presented with new stimuli. Other
useful characteristics of neural networks include the ability to continue to
function, or to degrade gracefully, in the event that some components of the
network become inoperative and, after an initial learning period, to perform
computations with respect to large numbers of input variables with much greater
speed than is currently practicable with traditional computing means. As the
number of input variables grows very large, neural networks may be the only
practicable means for computation.
Common applications of neural networks include analysis of data for purposes
similar to those for which traditional means of statistical analysis are employed;
pattern recognition to permit identification of associations or patterns often
in the presence of noise; pattern or signal enhancement, such as image enhancement
in the presence of noise; and prediction of future system behavior on the basis
of past behavior, often without any a priori knowledge of the architecture of
the system.
Some neural network architectures have incorporated or featured a combination
of networks that could be described as constituting a "Network of Networks (NoN)".
A NoN may be described as consisting of three basic levels, the most elemental
being the representation of the smallest computing unit analogous to a neuron,
the second being a single network analogous to a group of interconnected neurons,
and the third being a network consisting of a group of interconnected networks
of neurons. The impetus for creating a NoN may be to increase the range of input
patterns that may accurately be differentiated, or, alternatively, to increase
the "memory" capacity of the system; to reduce the learning time during which
the system achieves the capability for such differentiation; or to decrease
the time necessary for performing calculations. These effects are achieved by
establishing connections between or among single networks that have the characteristics
of preserving the stability of "memory" in each of the networks while correlating
the responses of the networks in a useful way.
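The three basic levels described above can be sketched structurally as follows. The class names and the particular wiring rules (all-to-all within a module, successor-only between modules) are illustrative assumptions, not terms of the patent's claims:

```python
# Structural sketch of the three NoN levels: a neuron, a first-level module
# (network of neurons), and a second-level module (network of networks).

class Neuron:
    """Most elemental level: the smallest computing unit."""
    def __init__(self):
        self.state = 0

class Module:
    """First-level module: a group of interconnected neurons."""
    def __init__(self, n_neurons):
        self.neurons = [Neuron() for _ in range(n_neurons)]
        # intra-module neuron-to-neuron connections (all-to-all here)
        self.links = [(i, j) for i in range(n_neurons)
                      for j in range(n_neurons) if i != j]

class NetworkOfNetworks:
    """Second-level module: a group of interconnected first-level modules."""
    def __init__(self, n_modules, n_neurons):
        self.modules = [Module(n_neurons) for _ in range(n_modules)]
        # inter-module connections are still realized neuron to neuron;
        # here each module links only to its successor
        self.links = [(k, k + 1) for k in range(n_modules - 1)]

non = NetworkOfNetworks(n_modules=4, n_neurons=5)
print(len(non.modules), len(non.links))  # 4 3
```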
Prior art NoN architectures suffer from at least four drawbacks. First, the
number and complexity of connections among networks grows rapidly, whether exponentially
or in some other non-linear fashion, as the number of networks grows, thus presenting
a variety of practical constraints in implementation. Such practical constraints
include, for example, limitations in hardware implementations with respect
to heat dissipation and available routes of connectivity, and, in software implementations,
with respect to rapidly escalating computing overhead. Second, while the increase
in the connections among networks increases the scale of computation, it does
not result in a change in the manner of computation; that is, a growing NoN
increases computational capacity but does not introduce novel computational
abilities.
Third, simple scaling of the architecture from one level (i.e., network of neurons)
to another (i.e., network of networks of neurons) may not consistently be preserved;
that is, the operation and structure of the NoN may not be amenable to description
by the same equations or other specifications applicable to each constituent
network or to the basic computational unit (analogous to a neuron). This lack
of self-similarity or modularity among levels substantially increases the design
complexity of the system so that, in the case of a hardware implementation,
the same or similar integrated circuits, for example, cannot be used to implement
each and every level or, in the case of software implementation, the same or
similar subroutine or software module, for example, cannot be used to implement
each and every level. Fourth, the manner of connections and dynamics among networks
does not correspond to what is known about analogous connections and dynamics
in the human brain and the central nervous systems of other species. Thus, prior
art NoNs depart from fundamental aspects of a proven design from which many
useful analogies can be made and insights obtained.
Accordingly, it is a general object of the present invention to provide a NoN
system and method that overcomes the drawbacks of prior art NoN systems and
methods.
SUMMARY OF THE INVENTION
In one embodiment of the invention, a NoN includes a neuron having processing
capabilities; a first level module including a network of a plurality of interconnected
neurons, the first level module also having processing capabilities; and a second
level module including a network of networks of neurons, the second level module
also having processing capabilities. The first and second level modules are
interconnected through neuron to neuron connections such that simultaneous processing
can be carried out in the neuron and in the first and second level modules.
In one embodiment, the invention further includes means for increasing the levels
of interconnected modules while increasing connections no more than linearly.
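A quick way to see the contrast between no-more-than-linear growth and the non-linear growth criticized in the background is to count connections under two assumed wiring rules (the patent does not specify these particular rules here):

```python
# Connection counts for n interconnected modules: wiring every pair grows
# quadratically with n, while wiring each module only to its neighbor
# grows linearly.

def all_to_all(n):
    return n * (n - 1) // 2   # every pair of modules connected

def nearest_neighbor(n):
    return n - 1              # each module wired only to the next one

for n in (4, 8, 16):
    print(n, all_to_all(n), nearest_neighbor(n))
```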
In an embodiment of the invention, it further includes means for forming non-predefined
functional components across modules based on input stimuli during computing
by the NoN.
In an embodiment of the invention, the NoN further includes means for forming
a boundary between a first module having a first memory state and a second module
having a second memory state, both modules being in the same level, such that
the module comprising the boundary attains a third memory state distinct from
the first and second memory states.
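A minimal sketch of this boundary behavior follows. The rule for deriving the third state (tagging the conflicting pair) is an assumed encoding, since the patent does not specify one at this point:

```python
# A module between two settled neighbors adopts either their shared state
# or, when the neighbors disagree, a distinct third memory state.

def boundary_state(left_state, right_state):
    """Memory state attained by the module lying between two neighbors."""
    if left_state == right_state:
        return left_state  # no conflict: adopt the common state
    return ("boundary", left_state, right_state)  # distinct third state

print(boundary_state(1, 1))  # 1
print(boundary_state(1, 2))  # ('boundary', 1, 2)
```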
In one embodiment of the invention, the NoN further includes means for forming
spatially related functional components across modules based on temporal processing
sequences and means for forming temporally related functional components through
module processing based on spatially related modules.
In an embodiment of the invention, the connections are between both spatially
local and distant modules.
In an embodiment of the invention, modules at different levels are self-similar
to one another.
In another embodiment of the invention, a method is provided for forming a NoN
comprising the steps of:
forming a neuron having processing capabilities; forming a first level module
including a network of a plurality of interconnected neurons, the first level
module also having processing capabilities; forming a second level module including
a network of interconnected networks of interconnected neurons, the second level
module also having processing capabilities;
specifying input and output pathways to and from neurons so that an external
input may excite the system to provide an output;
processing first level and second level module evolution based on neural network
dynamics;
further processing first level module evolution to a stage at which the achievement
of an attractor state in at least one first level module influences the activity
of at least one other first level module toward or away from an attractor state;
and
forming a boundary layer of first level modules between other first level modules
that influence the activity of at least one other first level module.
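The evolution and influence steps above can be sketched with toy modules governed by Hopfield-style attractor dynamics. This update rule, the coupling strength, and the pattern sizes are assumptions for illustration; the patent does not mandate this particular dynamics:

```python
# Each first-level module evolves toward a stored attractor state, and a
# settled module's state biases a neighboring module's update through
# neuron-to-neuron coupling.

def sign(v):
    return 1 if v >= 0 else -1

class AttractorModule:
    def __init__(self, pattern):
        n = len(pattern)
        # Hebbian weights storing `pattern` as an attractor state
        self.W = [[pattern[i] * pattern[j] if i != j else 0.0
                   for j in range(n)] for i in range(n)]
        self.state = pattern[:]
        self.state[0] = -self.state[0]  # start near, not at, the attractor

    def step(self, bias=None, coupling=0.5):
        """One synchronous update; a neighbor's state acts as a weak bias."""
        fields = [sum(w * s for w, s in zip(row, self.state))
                  for row in self.W]
        if bias is not None:  # inter-module influence toward an attractor
            fields = [f + coupling * b for f, b in zip(fields, bias)]
        self.state = [sign(f) for f in fields]

p = [1, -1, 1, -1]
m1, m2 = AttractorModule(p), AttractorModule(p)
m1.step()                # m1 settles into its attractor state
m2.step(bias=m1.state)   # the settled m1 influences m2's evolution
print(m1.state == p, m2.state == p)  # True True
```

A positive coupling pulls the neighbor toward the same attractor; a negative coupling would push it away, and modules caught between conflicting influences would form the boundary layer named in the last step.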
In an even further embodiment of the invention, an NoN includes means for forming
a neuron having processing capabilities; means for forming a first level module
including a network of a plurality of interconnected neurons, the first level
module also having processing capabilities; means for forming a second level
module including a network of interconnected networks of interconnected neurons,
the second level module also having processing capabilities; means for specifying
input and output pathways to and from neurons so that an external input may
excite the system to provide an output; means for processing first level and
second level module evolution based on neural network dynamics; means for further
processing first level module evolution to a stage at which the achievement
of an attractor state in at least one first level module influences the activity
of at least one other first level module toward or away from an attractor state;
and means for forming a boundary layer of first level modules between other
first level modules that influence the activity of at least one other first
level module.
Having now described the preferred embodiment of the invention, it should be apparent to those skilled in the art that the foregoing is illustrative only and not limiting, having been presented by way of example only. Numerous other embodiments and modifications thereof are contemplated as falling within the scope of the present invention as defined by the appended claims and equivalents thereto.