author    Yuchen Pei <me@ypei.me>  2021-06-18 12:58:44 +1000
committer Yuchen Pei <me@ypei.me>  2021-06-18 12:58:44 +1000
commit    147a19e84a743f1379f05bf2f444143b4afd7bd6 (patch)
tree      3127395250cb958f06a98b86f73e77658150b43c /microposts/rnn-turing.org
parent    4fa26fec8b7e978955e5630d3f820ba9c53be72c (diff)
Updated.
Diffstat (limited to 'microposts/rnn-turing.org')
 microposts/rnn-turing.org (new, -rw-r--r--) | 11 +
 1 file changed, 11 insertions(+), 0 deletions(-)
diff --git a/microposts/rnn-turing.org b/microposts/rnn-turing.org
new file mode 100644
index 0000000..8636a5a
--- /dev/null
+++ b/microposts/rnn-turing.org
@@ -0,0 +1,11 @@
+#+title: rnn-turing
+
+#+date: <2018-09-18>
+
+Just some non-rigorous guess / thought: Feedforward networks are like
+combinatorial logic, and recurrent networks are like sequential logic
+(e.g. data flip-flop is like the feedback connection in RNN). Since NAND
++ combinatorial logic + sequential logic = von Neumann machine which is
+an approximation of the Turing machine, it is not surprising that RNN
+(with feedforward networks) is Turing complete (assuming that neural
+networks can learn the NAND gate).
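
The patch above leans on two claims: that a neural unit can realize a NAND gate, and that a feedback connection can act like a data (D) flip-flop. A minimal sketch of both, using hand-picked weights for a single threshold unit (the function names and the specific weight values are illustrative choices, not from the patch):

```python
def step(x):
    """Heaviside step activation: fires iff the weighted sum is non-negative."""
    return 1 if x >= 0 else 0

def nand(a, b):
    # One choice of weights (-2, -2) and bias 3 makes a single
    # threshold unit compute NAND: it only fails to fire when a = b = 1.
    return step(-2 * a - 2 * b + 3)

def d_flip_flop(state, d, clock):
    # Sketch of the flip-flop analogy: when clock = 1 the unit latches
    # the input d; when clock = 0 it re-reads its own previous output
    # (state), i.e. the feedback connection of an RNN.
    return step(2 * clock * d + 2 * (1 - clock) * state - 1)
```

Since NAND is functionally complete, any combinational circuit can be built from `nand` alone, and the recurrent latch supplies the sequential part of the analogy.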