Guest Lecture: Daniela Danciu

June 30, 2016

Neural networks: qualitative analysis & applications for modeling and simulating Distributed Parameter Systems

Abstract
All neural networks, both natural and artificial, are characterized by two kinds of
dynamics: the “learning dynamics” and the “intrinsic dynamics”. The learning dynamics is
the sequential (discrete-time) dynamics of the choice of the synaptic weights. The intrinsic
dynamics is the dynamics of the neural network (or of a technological system displaying a
neural network structure) viewed as a dynamical system after the weights have been
established by learning.
Let us mention that the emergent computational capabilities of a
neural network can be achieved provided it has many equilibria; moreover, the network task is
accomplished provided the network approaches these equilibria. This assertion is valid, e.g., for
classifiers, content-addressable memory networks, k-winners-take-all (KWTA) networks, and
cellular neural networks. The neural network structures best suited for such tasks are the
so-called recurrent neural networks (RNNs), i.e., neural networks with feedback
interconnections, which, on the other hand, may induce instabilities. This shows the
importance of the qualitative analysis of the
intrinsic dynamics of neural networks viewed as dynamical systems. Since such systems have
multiple equilibria, their qualitative analysis has to cover the local properties of each
equilibrium (stability in the sense of Lyapunov, asymptotic stability, absolute robust
stability) as well as the global behavior of the entire system (dichotomy, global asymptotics,
and gradient-like behavior). The key point is that RNNs are dynamical systems whose structure
is induced by the learning process, which determines the synaptic weights. However, it is not
guaranteed that this a posteriori induced dynamics has the properties required for the system
to fulfill its design functionalities (tasks). These qualitative properties therefore have to
be checked separately, after the first stage of the design process.
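The interplay of the two dynamics can be sketched in a minimal example (an illustrative assumption, not material from the talk): a discrete Hopfield-type network whose weights are fixed by one Hebbian learning step, and whose intrinsic dynamics then behaves gradient-like with respect to an energy (Lyapunov) function, settling at an equilibrium — here recovering a stored pattern, as in a content-addressable memory.

```python
# Hedged sketch: discrete Hopfield network (illustrative, not the talk's model).
# "Learning dynamics" = one Hebbian step fixing the weights;
# "intrinsic dynamics" = asynchronous updates, gradient-like w.r.t. an energy.

def hebbian_weights(patterns):
    """Learning dynamics: Hebbian rule w[i][j] = sum_p p_i p_j, zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def energy(w, s):
    """Hopfield energy E = -1/2 * sum_ij w_ij s_i s_j (a Lyapunov function)."""
    n = len(s)
    return -0.5 * sum(w[i][j] * s[i] * s[j] for i in range(n) for j in range(n))

def run(w, s, sweeps=10):
    """Intrinsic dynamics: asynchronous sign updates; energy never increases."""
    s = list(s)
    for _ in range(sweeps):
        changed = False
        for i in range(len(s)):
            h = sum(w[i][j] * s[j] for j in range(len(s)))
            new = 1 if h >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:          # equilibrium reached
            break
    return s

pattern = [1, -1, 1, -1, 1, -1]
w = hebbian_weights([pattern])
noisy = [1, 1, 1, -1, 1, -1]     # stored pattern with one flipped bit
recovered = run(w, noisy)        # converges back to the stored pattern
```

The energy function certifies the gradient-like behavior mentioned above: each asynchronous update can only decrease (or keep) the energy, so trajectories must approach equilibria rather than oscillate.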

The importance and study of distributed parameter systems (DPS) have increased
considerably, mainly due to their applications in both engineering and science. The study of
DPS encompasses several areas, such as model development, numerical approximation,
control design, and experimental implementation. Considering the requirements a
computational procedure must meet in order to produce a "good" approximation of the original
continuous-time DPS, we have recently proposed a hybrid analytical-computational procedure
for the numerical modeling of DPS described by (nonlinear) hyperbolic partial differential
equations (hPDEs) with non-standard boundary conditions (BCs). Our systematic procedure
relies on two powerful tools: 1) a rigorously proven convergent Method of Lines, which
transforms the initial boundary value problem for the hPDEs into an initial value problem for
a large-scale system of ordinary differential equations (ODEs), and 2) the Cellular Neural
Networks paradigm, used to implement the computing structure. The procedure has been
successfully applied to various mathematical models arising from science and engineering
applications: lossy as well as lossless propagation phenomena, hyperbolic systems of
conservation laws, and hPDEs with bilinear terms.
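The Method of Lines step can be sketched for the simplest lossless propagation model (an illustrative assumption, not the authors' scheme): the 1-D linear transport equation u_t + c*u_x = 0 on [0, 1] with a boundary condition at x = 0. Upwind differencing in space converts the initial boundary value problem into an initial value problem for a system of ODEs, one per grid line, which is then integrated in time.

```python
# Hedged sketch: Method of Lines for u_t + c*u_x = 0, u(0, t) = g(t).
# Space is semidiscretized with upwind differences, giving the ODE system
# du_k/dt = -c*(u_k - u_{k-1})/dx, integrated here with forward Euler.
import math

def method_of_lines(n=100, c=1.0, t_end=0.5, g=lambda t: 0.0):
    dx = 1.0 / n
    dt = 0.5 * dx / c                                 # CFL-stable time step
    x = [k * dx for k in range(n + 1)]
    u = [math.exp(-100 * (xk - 0.3) ** 2) for xk in x]  # initial profile
    t = 0.0
    while t < t_end - 1e-12:
        # right-hand side of the large-scale ODE system (one ODE per node)
        rhs = [0.0] * (n + 1)
        for k in range(1, n + 1):
            rhs[k] = -c * (u[k] - u[k - 1]) / dx
        for k in range(1, n + 1):
            u[k] += dt * rhs[k]
        t += dt
        u[0] = g(t)                                   # boundary condition at x = 0
    return x, u

x, u = method_of_lines()
# The bump initially centered at x = 0.3 is transported to roughly
# x = 0.3 + c*t_end = 0.8, slightly smeared by upwind numerical diffusion.
peak_x = x[max(range(len(u)), key=lambda k: u[k])]
```

In the procedure described above, the resulting ODE system is not integrated by a generic solver but mapped onto a Cellular Neural Networks computing structure; the sketch only illustrates the semidiscretization that makes such a mapping possible.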