for the European Union's Human Brain Project
Monograph by Dr. rer. nat. Andreas Heinrich Malczan
Artificial neural networks consist of nodes and connections. In
biological neural networks, these correspond to neurons and axons; dendrites
are often subsumed into the neuron model to avoid unnecessary complexity. To
precisely formulate the mechanisms of intelligence later on, we first
analyze the capability spectrum of biological neural networks and introduce
key terms whose exact definitions prevent future misunderstandings.
Vertebrates interact with their environment by receiving,
transforming, and evaluating signals. Receptors form the basis for
constructing a rich representation of both external and internal
environments. They translate physical or chemical stimuli into firing rates,
thereby activating neurons that transmit these signals via their axons.
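The translation of a stimulus into a firing rate can be sketched as a saturating transfer function. The particular curve and its constants below are illustrative assumptions, not data from the text:

```python
def firing_rate(stimulus, max_rate=100.0, k=1.0):
    """Map a non-negative stimulus intensity to a firing rate (Hz).

    A saturating curve is one common idealization: weak stimuli produce
    nearly proportional rates, while strong stimuli level off toward
    max_rate. The constants are illustrative, not physiological data.
    """
    s = max(0.0, stimulus)
    return max_rate * s / (k + s)

print(firing_rate(1.0))    # mid-range stimulus: half the maximum rate
print(firing_rate(100.0))  # strong stimulus: close to max_rate
```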
At this level, various signal processing mechanisms emerge through
convergence, divergence, averaging, inversion,
latency delays, loop rotation of signals, and
distance-dependent damping. These mechanisms, together with the
intelligence-creating properties of neural networks described later, form
the foundation for intelligent responses, the construction of an internal
environmental model, and the accumulation of knowledge.
Convergence and divergence are fundamental operations of neural
networks, both biological and artificial. Their structural carriers are
called nodes and connections in artificial networks; biologically, they
correspond to neurons and axons. We use whichever terminology best suits
the context.
While artificial networks often ignore latency delays and
distance-dependent damping, biological axons possess these properties. They
significantly expand the capability spectrum and are considered where they
are functionally relevant.
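A minimal sketch of three of these mechanisms follows; the weights, distances, damping factor, and delay length are purely illustrative assumptions:

```python
from collections import deque

# Convergence: several input firing rates combine into one output rate.
def converge(rates, weights):
    return sum(r * w for r, w in zip(rates, weights))

# Divergence with distance-dependent damping: one rate fans out to
# several targets, attenuated exponentially with distance.
def diverge(rate, distances, damping=0.5):
    return [rate * damping ** d for d in distances]

# Latency delay: an axon modeled as a FIFO line of `delay` time steps.
def delayed_line(delay):
    line = deque([0.0] * delay)
    def step(rate):
        line.append(rate)
        return line.popleft()
    return step
```

For example, `delayed_line(2)` returns the value fed in two steps earlier, padding with zeros until the line is filled.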
Vertebrates utilize homeostatic systems based on the evaluation of
numerous receptor signals. These systems form regional or global averages,
such as for blood pressure, oxygen saturation, or blood sugar levels. Over
evolutionary time, such averages have been incorporated into neural systems,
taking on important signal-theoretic functions.
Two forms of averaging have emerged: regional averaging over
neighboring receptor signals and global averaging over the entire
population.
This creates a functional transition from physiological monitoring
to signal-theoretic operations, which later become crucial for divergence
modules and on-off pairs.
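Regional and global averaging can be sketched as follows; the grouping of receptors into contiguous blocks of equal size is an illustrative assumption:

```python
def global_average(rates):
    """Global average: a single value over all receptor signals."""
    return sum(rates) / len(rates)

def regional_averages(rates, region_size):
    """Regional averages: one value per contiguous block of receptors.

    The block structure is an illustrative stand-in for whatever
    neighborhood relation the biological system actually uses.
    """
    return [sum(rates[i:i + region_size]) / region_size
            for i in range(0, len(rates), region_size)]

rates = [1.0, 2.0, 3.0, 4.0]
print(global_average(rates))        # one global value
print(regional_averages(rates, 2))  # one value per region
```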
Many receptors have evolved into two complementary groups: On
receptors, whose firing rate increases with the underlying variable, and
Off receptors, whose firing rate decreases as it increases. Examples
include the On and Off ganglion cells of the retina and the warm and cold
receptors of the skin.
The division into On and Off signals enhances the nervous system's
ability to accurately assess the strength of the original variable. This
capability is later systematically expanded in divergence modules.
On and Off signals can be transformed into each other. Biological
signal inversion occurs in two steps:
1. Switching the signal to an inhibitory transmitter (often GABA).
2. Inhibiting an asymptotically limited average signal whose firing rate
is approximately double the neuronal mean value.
The resulting residual signal is the inverted signal. The sum of
the original and inverted signals is a constant, roughly equal to twice the
statistical mean of the firing rates.
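This two-step inversion can be sketched numerically; the mean firing rate used here is an assumed constant, and the clamp at zero reflects that firing rates cannot be negative:

```python
def invert(rate, mean_rate):
    """Invert a firing rate (sketch of the two-step scheme):

    The signal, made inhibitory, suppresses an average signal firing
    at roughly twice the neuronal mean. The residual is the inverted
    signal; rates are clamped at zero, since they cannot go negative.
    """
    return max(0.0, 2.0 * mean_rate - rate)

m = 50.0           # assumed neuronal mean firing rate
s = 30.0           # original (On) signal
s_inv = invert(s, m)
# Original plus inverted is constant: roughly twice the mean.
print(s + s_inv)   # 100.0
```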
The simultaneous availability of both components of an On-Off pair
allows for precise determination of the absolute magnitude of the underlying
variable. This is a central building block for divergence modules and
subsequent intelligence-creating mechanisms.
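With both components of the pair at hand, the underlying magnitude can be read off directly. The encoding below (On as mean plus the deviation, Off as mean minus it) is an illustrative model, not a claim from the text:

```python
def encode(x, mean_rate=50.0):
    """Encode a signed deviation x as an On-Off pair (illustrative model).

    Valid for |x| <= mean_rate; outside that range the clamp at zero
    breaks the constant-sum property.
    """
    on = max(0.0, mean_rate + x)
    off = max(0.0, mean_rate - x)
    return on, off

def decode(on, off):
    """Recover the deviation as half the difference of the pair."""
    return (on - off) / 2.0

on, off = encode(20.0)
print(on, off)          # 70.0 30.0  (sum is twice the mean)
print(decode(on, off))  # 20.0
```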
For analyzing neural circuits, a set-theoretic perspective is
helpful. Since signals are time-dependent, we refer to a signal as its
state at a specific time, or within a short interval during which its
strength is constant.
We distinguish elementary signals from complex signals.
In artificial networks, nodes represent elementary or complex
signals; in biological networks, input neurons and output neurons assume
these roles.
This allows neural networks to be interpreted as convergence
circuits, where elementary signals converge into complex signals. In
multilayer networks, divergence also occurs frequently, where complex
signals are decomposed back into elementary signals.
To fully characterize a neural network, the graph-theoretic
structure alone is insufficient: each connection also carries a synaptic
strength, expressed as a real number.
In network theory, these are called weights. In artificial
networks, they are determined by learning algorithms designed to mimic or
replace biological learning processes.
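A tiny convergence layer makes the role of real-valued weights concrete; the input rates and weight values below are illustrative assumptions:

```python
# Elementary input signals (firing rates) converge into complex output
# signals via real-valued weights. All numbers are illustrative.
inputs = [1.0, 0.5, 0.0]

weights = [            # weights[j][i]: contribution of input i to output j
    [0.8, -0.3, 0.1],  # negative weights model inhibitory connections
    [0.0,  0.6, 0.9],
]

outputs = [sum(w * x for w, x in zip(row, inputs)) for row in weights]
print(outputs)  # two complex signals formed by convergence
```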
This foundation prepares us to analyze the
learning mechanisms in the next chapter, which modify these weights and form
the basis for intelligence.