D. R. Snyder, A. Goudarzi, C. Teuscher
We study the relationship between dynamics and computational capability in Random Boolean Networks (RBNs) for Reservoir Computing. Reservoir Computing (RC) is a computational paradigm in which a trained readout layer interprets the dynamics of an excitable component called the reservoir, which is perturbed by external input. The reservoir is often implemented as a homogeneous recurrent neural network, but there has been little investigation into the properties of reservoirs that are discrete and heterogeneous. RBNs are a generic and heterogeneous dynamical system, and here we use them as the reservoir. An RBN is typically a closed system consisting of a network of N nodes with an average in-degree K, which we extend with an input layer that perturbs L nodes. We measure an extended separation property and the fading memory of externally perturbed RBNs and show that the optimal balance of these measures is achieved at critical dynamics. The computational capability of the network is an interplay between L, K, and the length of the input stream T. We explore the values of L that adequately distribute input signals into the RBN and find that they depend on K. Finally, we show that under most circumstances near-critical connectivity Kc is desirable for reservoirs, but circumstances exist in which ordered and chaotic networks are viable. These results are relevant to the construction of devices that exploit the intrinsic dynamics of complex, heterogeneous systems, such as biomolecular networks. Our findings underscore the supposition that intrinsic computational capabilities are maximal in substrates "at the edge of chaos."
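To make the setup concrete, here is a minimal sketch (not the authors' code) of an RBN driven as a reservoir: N nodes with random Boolean update rules of in-degree K, an input stream of length T that perturbs L nodes, and a recorded state trajectory on which a readout would be trained. The parameter values, the choice to overwrite the state of the L input nodes, and the use of unbiased Boolean functions (for which K = 2 is near-critical) are all illustrative assumptions, not details taken from the paper.

```python
# Minimal RBN-as-reservoir sketch; parameters and perturbation scheme are assumptions.
import numpy as np

rng = np.random.default_rng(0)

N, K, L, T = 100, 2, 10, 50   # nodes, in-degree, input-perturbed nodes, input-stream length

# Each node reads K randomly chosen nodes and applies a random Boolean function,
# stored as a lookup table over the 2**K possible input combinations.
wires = rng.integers(0, N, size=(N, K))
tables = rng.integers(0, 2, size=(N, 2 ** K))

input_nodes = rng.choice(N, size=L, replace=False)   # nodes driven by the input layer
state = rng.integers(0, 2, size=N)                   # random initial reservoir state
u = rng.integers(0, 2, size=T)                       # external Boolean input stream

trajectory = []
for t in range(T):
    # Perturb the L input nodes with the current input bit (one possible coupling).
    state[input_nodes] = u[t]
    # Synchronous update: each node indexes its lookup table with its K inputs.
    idx = (state[wires] * (2 ** np.arange(K))).sum(axis=1)
    state = tables[np.arange(N), idx]
    trajectory.append(state.copy())

# A trained linear readout would be fit on this state matrix to produce the output.
X = np.array(trajectory)   # shape (T, N)
```

Varying K in this sketch moves the network between ordered, critical, and chaotic regimes, which is the axis along which the paper evaluates separation and fading memory.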
View original:
http://arxiv.org/abs/1212.1744