The University Of Pennsylvania Researchers Introduced An Alternative AI Approach To Design And Program RNN-Based Reservoir Computers

The human brain is one of the most complex systems nature has ever created. Its neurons interact with each other by forming recurrent connections and transmitting information through impulses. Because of the brain's remarkable capacity for logical reasoning and numerical analysis, researchers try to carry these biological principles over into artificial neural systems. Neural computation methods range from RNN models of dynamical systems to neural replicas of conventional computer architectures in machine learning.

The research group asserts that advances in current neural network technology could enable fully distributed neural execution of software virtualization and logic circuits. This would be achieved without any example data or sampling of the state space, which are typically required for training and refining these neural networks. Essentially, this points to more efficient and robust applications of artificial intelligence in areas like virtualization and digital circuit design.

Broader access to neural computation is currently limited by the need to understand the relationship between neural computers and modern silicon computers. This calls for a neural network with a simple set of governing equations that nonetheless supports many computer-like capabilities. Thanks to such simple governing equations, networks like the reservoir computer (RC), a type of recurrent neural network (RNN), are well understood theoretically: driven by inputs, an RC evolves a set of internal states, and its output is a weighted sum of those states.
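For readers unfamiliar with this architecture, the snippet below is a minimal sketch of the standard echo state network form of an RC. The update rule, reservoir size, spectral radius, and ridge regularizer are illustrative assumptions, not the paper's exact equations: the recurrent and input weights are fixed and random, and only the linear readout is fit.

```python
import numpy as np

# Minimal echo state network (reservoir computer) sketch.
# The update x_{t+1} = tanh(A x_t + B u_t) is the standard RC form;
# all parameter choices here are illustrative, not the paper's.

rng = np.random.default_rng(0)
N, d_in = 200, 1                            # reservoir size, input dimension

A = rng.normal(0, 1, (N, N))                # recurrent weights (fixed, random)
A *= 0.9 / max(abs(np.linalg.eigvals(A)))   # scale spectral radius below 1
B = rng.normal(0, 1, (N, d_in))             # input weights (fixed, random)

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(A @ x + B @ np.atleast_1d(u_t))
        states.append(x)
    return np.array(states)

# The output is a weighted sum of states: y_t = W_out . x_t.
# Conventionally W_out is fit by ridge regression against a target series.
u = np.sin(np.linspace(0, 8 * np.pi, 500))
X = run_reservoir(u)
y_target = np.roll(u, -1)                   # e.g., one-step-ahead prediction
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y_target)
y_pred = X @ W_out
```

Note the division of labor: the reservoir itself is never trained, which is why its governing equations stay simple enough to reason about theoretically.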

The research team from the University of Pennsylvania developed two such frameworks, named state neural programming (SNP) and dynamic neural programming (DNP). SNP is used to program RCs to solve analytic equations and perform operations; DNP is used to program RCs to store chaotic dynamical systems as random-access memories and to implement the neural logic gates AND, NAND, OR, NOR, XOR, and XNOR.
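The paper's point is that these frameworks derive the required weights analytically, with no example data. For contrast, the sketch below realizes one of the listed gates (XOR) in the conventional, sample-based way, fitting a readout from the four truth-table cases; all parameters are illustrative assumptions.

```python
import numpy as np

# Illustrative only: an XOR gate via a reservoir readout. Here W_out is
# fit from truth-table samples; the paper's DNP framework instead derives
# such weights analytically, without any example data.

rng = np.random.default_rng(1)
N = 100
A = rng.normal(0, 1, (N, N))
A *= 0.5 / max(abs(np.linalg.eigvals(A)))   # contractive, so states settle
B = rng.normal(0, 1, (N, 2))

def settle(u, steps=50):
    """Run the reservoir to a steady state under a constant 2-bit input."""
    x = np.zeros(N)
    for _ in range(steps):
        x = np.tanh(A @ x + B @ u)
    return x

truth_table = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR
X = np.array([settle(np.array(u, float)) for u, _ in truth_table])
y = np.array([t for _, t in truth_table], float)
W_out = np.linalg.lstsq(X, y, rcond=None)[0]

for u, t in truth_table:
    out = settle(np.array(u, float)) @ W_out
    print(u, "->", round(float(out)), "(target", t, ")")
```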

Through the open-loop architecture with SNP, the researchers obtained a programming matrix of polynomial powers of time-lagged inputs, which can be used to realize operations such as a high-pass filter. To execute algorithms, the closed-loop architecture with SNP is used: an RNN is programmed to store a substantial time history of a stochastic, non-differentiable time series, on which a short-time Fourier transform is then performed.
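To make the open-loop versus closed-loop distinction concrete, the illustrative sketch below first fits a readout open-loop on a simple signal, then closes the loop by feeding the output back in place of the external input so the network reproduces the signal autonomously. This is again a conventional sample-fit stand-in, not the SNP programming matrix itself.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 300
A = rng.normal(0, 1, (N, N))
A *= 0.95 / max(abs(np.linalg.eigvals(A)))
B = 0.5 * rng.normal(0, 1, (N, 1))

u = np.sin(np.linspace(0, 20 * np.pi, 2000))   # training signal

# Open-loop pass: drive the reservoir with the external input u_t.
x, states = np.zeros(N), []
for u_t in u[:-1]:
    x = np.tanh(A @ x + B[:, 0] * u_t)
    states.append(x)
X = np.array(states)

# Fit the readout to predict the next input value (ridge regression).
W_out = np.linalg.solve(X.T @ X + 1e-8 * np.eye(N), X.T @ u[1:])

# Closed-loop pass: the readout replaces the external input.
ys = []
for _ in range(500):
    y = x @ W_out                       # read out y_t = W_out . x_t
    x = np.tanh(A @ x + B[:, 0] * y)    # feed y_t back as the next input
    ys.append(y)
```

Once the loop is closed, the network runs with no external drive, which is what allows it to act as a memory for a stored time history.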

Simulation and virtualization require programming the time history of a continuous-time RNN, so the closed-loop RNN with the DNP method is used. The researchers emulated the feedback dynamics of a 15-state guest RNN virtualized inside a 2,000-state host RNN, and found that the host simulates a chaotic Lorenz attractor (defined in the sketch below) without any samples.
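For reference, the Lorenz system mentioned above is defined and integrated below with a standard fixed-step RK4 scheme. In the paper, the programmed host RNN reproduces such a trajectory without ever being shown samples like these; this snippet only generates the target dynamics themselves.

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """The Lorenz system with its classic chaotic parameter values."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z), x * y - beta * z])

def integrate(f, s0, dt=0.01, steps=5000):
    """Simple fixed-step RK4 integrator."""
    s, traj = np.array(s0, float), []
    for _ in range(steps):
        k1 = f(s)
        k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2)
        k4 = f(s + dt * k3)
        s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        traj.append(s)
    return np.array(traj)

traj = integrate(lorenz, [1.0, 1.0, 1.0])   # chaotic Lorenz trajectory
```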

From this, the researchers conclude that an alternative computing framework can be fully programmable, challenging current approaches that merely mimic silicon hardware. Instead, they propose developing programming frameworks tailored to each unique system, so as to exploit its full computational abilities.

Check out the Paper and Blog.


