#+title: rnn-turing
#+date: <2018-09-18>
Just a non-rigorous guess / thought: feedforward networks are like
combinatorial logic, and recurrent networks are like sequential logic
(e.g. a D flip-flop is like the feedback connection in an RNN). Since
NAND gates + combinatorial logic + sequential logic give you a von
Neumann machine, which is an approximation of a Turing machine, it is
not surprising that RNNs (built from feedforward networks) are Turing
complete (assuming that neural networks can learn the NAND gate).
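
The analogy can be made concrete with fixed weights. A minimal sketch
(the weights below are my own illustrative choice, not learned): a
step-activation neuron computes NAND, and a single recurrent neuron
with a feedback weight holds a bit like a latch.

#+begin_src python
def step(x):
    # Heaviside step activation.
    return 1 if x > 0 else 0

def nand(a, b):
    # Weights -2, -2 and bias 3 make a step neuron compute NAND.
    return step(3 - 2 * a - 2 * b)

def latch(h, s, r):
    # One recurrent neuron: feedback weight 1 on its own state h,
    # set weight 2, reset weight -2, bias -0.5. The feedback lets
    # it hold a bit, like a flip-flop.
    return step(h + 2 * s - 2 * r - 0.5)

# NAND truth table: only (1, 1) gives 0.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, nand(a, b))

# Latch: set, hold, reset, hold.
h = 0
h = latch(h, 1, 0)  # set   -> 1
h = latch(h, 0, 0)  # hold  -> 1
h = latch(h, 0, 1)  # reset -> 0
h = latch(h, 0, 0)  # hold  -> 0
#+end_src

The latch is the sequential-logic half of the analogy: with no input,
the neuron's next state equals its current state, which is exactly the
memory provided by an RNN's feedback connection.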