Resource title

Recurrent neural networks with iterated function systems dynamics

Resource image

Resource description

We propose a recurrent neural network (RNN) model whose recurrent part corresponds to the iterated function systems (IFSs) introduced by Barnsley [1] as a fractal image compression mechanism. The key ideas are that 1) the model avoids learning the RNN state part by using non-trainable connections between the context and recurrent layers, which makes training less problematic and faster, and 2) the RNN state part codes the information-processing states of the symbolic input stream in a well-organized and intuitively appealing way. We show that there is a direct correspondence between the Rényi entropy spectra characterizing the input stream and the spectra of Rényi generalized dimensions of activations in the RNN state space. We test both the new RNN model with IFS dynamics and its conventional counterpart with a trainable recurrent part on two chaotic symbolic sequences. In our experiments, RNNs with IFS dynamics outperform the conventional RNNs with respect to information-theoretic measures computed on the training and model-generated sequences. (author's abstract) ; Series: Report Series SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
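The non-trainable IFS state dynamics described in the abstract can be illustrated with a minimal sketch. The function names, the contraction coefficient `k`, and the random codebook below are illustrative assumptions, not the authors' exact construction: each input symbol applies a fixed contractive affine map that pulls the state toward a symbol-specific codebook vector, so recent input history determines the neighborhood of the state.

```python
import numpy as np

def ifs_states(symbols, alphabet, dim=2, k=0.5, seed=0):
    """Drive the (non-trainable) RNN state part with an IFS:
    each symbol s applies the contractive affine map
        x_{t+1} = k * x_t + (1 - k) * c_s,
    where c_s is a fixed codebook vector for symbol s.
    """
    rng = np.random.default_rng(seed)
    # Hypothetical codebook: one fixed target point per symbol in [0, 1)^dim.
    codebook = {s: rng.random(dim) for s in alphabet}
    x = np.full(dim, 0.5)  # start in the middle of the state space
    states = []
    for s in symbols:
        x = k * x + (1 - k) * codebook[s]  # contract toward c_s
        states.append(x.copy())
    return np.array(states)

traj = ifs_states("abbaab", alphabet="ab")
```

Because each map is a contraction, sequences sharing a long common suffix are mapped to nearby states, which is one way to read the abstract's claim that processing states are organized "in a well-organized and intuitively appealing way" in the state space.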

Resource author

Peter Tino, Georg Dorffner

Resource publisher

Resource publish date

Resource language


Resource content type


Resource URL

Resource license

Adapt according to the license agreement. Always reference the original source and author.