The Spectrum of Capabilities of Biological Neural Networks

Artificial neural networks consist of nodes and connections. In biological neural networks, these correspond to neurons and axons; dendrites are usually folded into the neuron model rather than represented separately, to avoid unnecessary complexity. To formulate the mechanisms of intelligence precisely later on, we first analyze the capability spectrum of biological neural networks and introduce key terms whose exact definitions prevent misunderstandings later.

1.1  Receptors and Signals

Vertebrates interact with their environment by receiving, transforming, and evaluating signals. Receptors form the basis for constructing a rich representation of both external and internal environments. They translate physical or chemical stimuli into firing rates, thereby activating neurons that transmit these signals via their axons.

At this level, various signal-processing mechanisms emerge: convergence, divergence, averaging, inversion, latency delays, the circulation of signals in loops, and distance-dependent damping. Together with the intelligence-creating properties of neural networks described later, these mechanisms form the foundation for intelligent responses, for the construction of an internal model of the environment, and for the accumulation of knowledge.

1.2 Signal Convergence and Divergence

Convergence and divergence are fundamental operations of both biological and artificial neural networks. They are implemented by nodes and connections in artificial networks and by neurons and axons in biological ones; we use whichever terminology suits the context.

  • Divergence: A neuron excites many other neurons via interneurons.
  • Convergence: Many neurons project onto a common target neuron.

While artificial networks often ignore latency delays and distance-dependent damping, biological axons possess both properties. They significantly expand the capability spectrum and are taken into account here wherever they are functionally relevant.
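As a minimal Python sketch (not taken from the monograph; the exponential damping law and all names are illustrative assumptions, and latency delays are omitted for brevity), divergence and convergence can be modeled as fan-out with distance-dependent attenuation and as fan-in onto a common target:

    import math

    def diverge(rate, distances, damping=0.1):
        # Divergence: one source neuron excites many targets; each
        # branch is attenuated with axonal distance (assumed
        # exponential damping law).
        return [rate * math.exp(-damping * d) for d in distances]

    def converge(rates):
        # Convergence: many presynaptic rates project onto one
        # common target; here summed and clipped at zero, since
        # firing rates cannot be negative.
        return max(0.0, sum(rates))

    # One neuron firing at 50 Hz diverges to three targets whose
    # outputs then converge onto a single neuron.
    print(converge(diverge(50.0, distances=[1.0, 2.0, 5.0])))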

1.3  Averaging with and without Asymptotic Limitation

Vertebrates utilize homeostatic systems based on the evaluation of numerous receptor signals. These systems form regional or global averages, such as for blood pressure, oxygen saturation, or blood sugar levels. Over evolutionary time, such averages have been incorporated into neural systems, taking on important signal-theoretic functions.

Two forms of averaging have emerged:

  • Averages without asymptotic limitation: They have an approximately linear characteristic and primarily serve to monitor physiological parameters.
  • Averages with asymptotic limitation: They generate signals whose firing rate approaches a constant maximum value. This form is essential for signal inversion and significantly extends the possibilities of signal processing.

This creates a functional transition from physiological monitoring to signal-theoretic operations, which later become crucial for divergence modules and on-off pairs.
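To make the distinction concrete, here is a minimal Python sketch (the saturation law and the constant f_max are modeling assumptions chosen for illustration, not the author's formulas):

    def linear_average(rates):
        # Average without asymptotic limitation: approximately
        # linear characteristic, suited to monitoring a
        # physiological parameter.
        return sum(rates) / len(rates)

    def limited_average(rates, f_max=100.0):
        # Average with asymptotic limitation: the output approaches
        # the constant maximum f_max as the inputs grow (assumed
        # saturating form).
        mean = sum(rates) / len(rates)
        return f_max * mean / (f_max + mean)

    print(linear_average([40.0, 60.0]))    # 50.0
    print(limited_average([40.0, 60.0]))   # ~33.3, bounded by 100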

1.4  On and Off Signals

Many receptors have evolved into two complementary groups:

  • On-Receptors: Firing rate increases with the magnitude of the original variable (monotonically increasing).
  • Off-Receptors: Firing rate decreases with the magnitude of the original variable (monotonically decreasing).

Examples include:

  • Motor control: Tendon organs (On) vs. signals from motor antagonists (Off).
  • Brightness perception: Light-On and Light-Off receptors.
  • Color vision: Red-On/Green-Off ganglion cells.

The division into On and Off signals enhances the nervous system's ability to accurately assess the strength of the original variable. This capability is later systematically expanded in divergence modules.
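A hedged sketch of such a complementary pair (the saturating transfer function and its parameters are assumptions chosen for illustration):

    def on_receptor(x, f_max=100.0, k=10.0):
        # On receptor: firing rate increases monotonically with the
        # magnitude x of the original variable (assumed saturating
        # characteristic).
        return f_max * x / (k + x)

    def off_receptor(x, f_max=100.0, k=10.0):
        # Off receptor: the complementary, monotonically decreasing
        # response; by construction On + Off = f_max here, which
        # anticipates the constant-sum property of Section 1.5.
        return f_max - on_receptor(x, f_max, k)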

1.5  Signal Strength and Signal Inversion

On and Off signals can be transformed into each other. Biological signal inversion occurs in two steps:

1. Switching the signal to an inhibitory transmitter (often GABA).

2. Inhibiting an asymptotically limited average signal, whose firing rate is approximately twice the neuronal mean value.

The resulting residual signal is the inverted signal. The sum of the original and inverted signals is a constant, roughly equal to twice the statistical mean of the firing rates.
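In symbols (notation introduced here for illustration, not the monograph's): with f the firing rate of the original signal and f-bar the statistical mean rate, the two steps amount to

    \[
    f_{\text{inv}} = 2\bar{f} - f,
    \qquad
    f + f_{\text{inv}} = 2\bar{f} = \text{const.}
    \]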

The simultaneous availability of both components of an On-Off pair allows for precise determination of the absolute magnitude of the underlying variable. This is a central building block for divergence modules and subsequent intelligence-creating mechanisms.

1.6  Elementary and Complex Signals

For analyzing neural circuits, a set-theoretic perspective is helpful. Since signals are time-dependent, we refer to a signal as its state at a specific time, or within a short interval during which its strength is constant.

We distinguish:

  • Elementary signals: individual signals as just defined.
  • Complex signals: sets of elementary signals that can be represented as vectors through indexing.

In artificial networks, nodes represent elementary or complex signals; in biological networks, input neurons and output neurons assume these roles.

This allows neural networks to be interpreted as convergence circuits, where elementary signals converge into complex signals. In multilayer networks, divergence also occurs frequently, where complex signals are decomposed back into elementary signals.
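As a minimal sketch (the vector representation via indexing follows the text; all names and firing rates are illustrative), a complex signal can be held as an indexed vector of elementary signals:

    # A complex signal as a set of elementary signals made into a
    # vector by indexing (toy firing rates in Hz; assumed values).
    elementary = {"input_0": 12.0, "input_1": 0.0, "input_2": 45.5}

    # Indexing turns the set into a vector: the elementary signals
    # converge into one complex signal.
    index = sorted(elementary)
    complex_signal = [elementary[name] for name in index]

    # Divergence: the same indexing decomposes the complex signal
    # back into its elementary components.
    for name, rate in zip(index, complex_signal):
        print(name, rate)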

1.7  Synaptic Strength and Weights

To fully characterize a neural network, the graph-theoretic structure alone is insufficient. Each connection has a synaptic strength, expressed as a real number:

  • Positive values → excitatory
  • Negative values → inhibitory

In network theory, these are called weights. In artificial networks, they are determined by learning algorithms designed to mimic or replace biological learning processes.
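A minimal sketch of this convention (function name and values are illustrative, not from the monograph):

    def postsynaptic_drive(rates, weights):
        # Each incoming firing rate is scaled by its synaptic
        # weight: positive weights act excitatory, negative weights
        # inhibitory. The signed sum is the net drive on the target.
        return sum(r * w for r, w in zip(rates, weights))

    # Two excitatory inputs and one inhibitory input.
    print(postsynaptic_drive([30.0, 20.0, 40.0], [0.5, 0.8, -0.6]))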

This foundation prepares us to analyze the learning mechanisms in the next chapter, which modify these weights and form the basis for intelligence.

Monograph by Dr. rer. nat. Andreas Heinrich Malczan