Hello!
Hm. You can incorporate changes over time with a standard feedforward
ANN by concatenating your time-ordered vectors over a given time window
into a single input vector and increasing the number of network inputs
accordingly. But of course this introduces latency and other problems
(e.g. it can massively increase the number of training examples
required).
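To make that concrete, here is a minimal sketch in Python/NumPy of the
windowing idea (the function name window_inputs and the shapes are my
own illustration, not code from any Pd external):

import numpy as np

def window_inputs(frames, width):
    # frames: (T, d) array, one feature vector per time step.
    # Returns (T - width + 1, width * d): row t is frames t .. t+width-1
    # flattened into a single network input vector.
    T, d = frames.shape
    return np.stack([frames[t:t + width].ravel()
                     for t in range(T - width + 1)])

# e.g. 100 time steps of 8-dim features, window of 5 frames
# -> 96 input vectors of dimension 40
X = window_inputs(np.random.randn(100, 8), width=5)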
Pd has the ann_td external, which provides a 'time delay' neural
network; I believe it incorporates time using a method similar to the
one described above.
Yes, of course; another possibility is to use time-delay neural
networks ...
For example, a hidden Markov model or an echo state network (a special
kind of recurrent neural network) should work.
I'm intrigued! Presumably these approaches avoid the latency problem by
maintaining the network's state? Are there other advantages -- easier to
train?
Hm ... I did not think about latency ... but if you do not process the
data in blocks, there should not be significant latency (for the
time-delay NN either)?
However, the advantage of the echo state network is that training is
linear, and you cannot get stuck in a suboptimal solution as you can
with feedforward neural networks (where the error surface has multiple
local minima) - see the link above for a short introduction.
And it is recurrent - so in general more powerful ... from the link above:
"On a number of benchmark tasks, ESNs have starkly outperformed all
other methods of nonlinear dynamical modelling"
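To illustrate why the training is linear, here is a minimal sketch of a
standard ESN in Python/NumPy (this is just an illustration, not the
code of my external; names and parameter choices like the spectral
radius and washout length are my own assumptions): the input and
reservoir weights stay fixed, and only the readout W_out is learned by
solving one ridge-regression problem, which has a unique solution.

import numpy as np

rng = np.random.default_rng(0)

def train_esn(u, y, n_res=200, rho=0.9, washout=50, ridge=1e-6):
    # u: inputs, shape (T, d_in); y: targets, shape (T, d_out).
    d_in = u.shape[1]
    W_in = rng.uniform(-0.5, 0.5, (n_res, d_in))   # fixed input weights
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # fixed reservoir weights
    W *= rho / np.abs(np.linalg.eigvals(W)).max()  # scale spectral radius

    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        x = np.tanh(W_in @ u[t] + W @ x)           # reservoir update
        states.append(x)
    X = np.array(states)[washout:]                 # drop initial transient
    Y = y[washout:]
    # linear readout by ridge regression:
    # W_out = Y^T X (X^T X + lambda I)^(-1) -- no local minima here
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y).T
    return W_in, W, W_out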
If you are interested: I have implemented ESNs (with various
extensions); a Pd external will hopefully follow in summer ...
Best regards,
Georg