author    Yuchen Pei <me@ypei.me>  2021-07-01 15:16:19 +1000
committer Yuchen Pei <me@ypei.me>  2021-07-01 15:16:19 +1000
commit    bd3b4e7d8a436685f8b676da8f6ffe9498ab2e3f (patch)
tree      1d29d70ddb35b18407c69792cd76e66d2a2280b6 /microposts/rnn-turing.md
parent    661762ba8fd5fd685bfbe99473d7286efa85b381 (diff)
Added copyright notices and license headers to website content.
Also removed more unused files.
Diffstat (limited to 'microposts/rnn-turing.md')
-rw-r--r--  microposts/rnn-turing.md | 5 -----
1 file changed, 0 insertions(+), 5 deletions(-)
diff --git a/microposts/rnn-turing.md b/microposts/rnn-turing.md
deleted file mode 100644
index 5c7605c..0000000
--- a/microposts/rnn-turing.md
+++ /dev/null
@@ -1,5 +0,0 @@
----
-date: 2018-09-18
----
-
-Just a non-rigorous guess / thought: feedforward networks are like combinational logic, and recurrent networks are like sequential logic (e.g. the D (data) flip-flop is like the feedback connection in an RNN). Since NAND gates + combinational logic + sequential logic give a von Neumann machine, which is a finite approximation of a Turing machine, it is not surprising that RNNs (with feedforward subnetworks) are Turing complete (assuming that neural networks can learn the NAND gate).
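
A minimal sketch of the analogy in the deleted post (not part of the original): a single perceptron with hand-picked weights computes NAND, and one recurrent unit whose feedback connection holds a bit behaves like an SR latch. The weights and thresholds here are illustrative assumptions, not learned parameters.

```python
def step(x):
    # Heaviside step activation: 1 if x > 0, else 0
    return int(x > 0)

def nand(a, b):
    # Feedforward piece: a single perceptron computing NAND.
    # Weights (-1, -1) and bias 1.5 are hand-picked, not learned.
    return step(1.5 - a - b)  # fires unless both inputs are 1

def latch_step(set_, reset, h):
    # Recurrent piece: one unit whose feedback weight on its own
    # previous state h lets it hold a bit, like an SR latch.
    return step(2 * set_ - 2 * reset + h - 0.5)

# NAND truth table: (0,0)->1, (0,1)->1, (1,0)->1, (1,1)->0
for a in (0, 1):
    for b in (0, 1):
        print(f"NAND({a},{b}) = {nand(a, b)}")

# The latch remembers its last set/reset across "hold" steps
# (set=0, reset=0), analogous to state carried by RNN feedback.
h = 0
for set_, reset in [(1, 0), (0, 0), (0, 0), (0, 1), (0, 0)]:
    h = latch_step(set_, reset, h)
    print(f"set={set_} reset={reset} -> state {h}")
```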