path: root/microposts/rnn-turing.org
Diffstat (limited to 'microposts/rnn-turing.org')
-rw-r--r-- microposts/rnn-turing.org | 11
1 file changed, 11 insertions(+), 0 deletions(-)
diff --git a/microposts/rnn-turing.org b/microposts/rnn-turing.org
new file mode 100644
index 0000000..8636a5a
--- /dev/null
+++ b/microposts/rnn-turing.org
@@ -0,0 +1,11 @@
+#+title: rnn-turing
+
+#+date: <2018-09-18>
+
+Just a non-rigorous guess / thought: feedforward networks are like
+combinational logic, and recurrent networks are like sequential logic
+(e.g. the D flip-flop is like the feedback connection in an RNN). Since
+NAND + combinational logic + sequential logic = von Neumann machine,
+which is an approximation of the Turing machine, it is not surprising
+that RNNs (with feedforward networks) are Turing complete (assuming
+that neural networks can learn the NAND gate).
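
The analogy above can be made concrete with a minimal sketch (not part of the original post): hand-picked weights show that a single threshold neuron computes NAND, the universal gate the argument assumes networks can learn, and a one-unit recurrent neuron with a feedback weight behaves like a latch, the memory element of sequential logic.

```python
def step(x):
    # Heaviside threshold activation.
    return 1 if x >= 0 else 0

def nand(a, b):
    # A single neuron with weights (-1, -1) and bias 1.5 computes NAND:
    # it fires unless both inputs are 1.
    return step(1.5 - a - b)

def latch_step(state, set_bit, reset_bit):
    # A one-neuron "RNN": the previous state feeds back with weight 1,
    # so a set bit is remembered until a reset bit overrides it,
    # analogous to a flip-flop holding a value between clock ticks.
    return step(set_bit + state - 2 * reset_bit - 0.5)

# NAND truth table.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, nand(a, b))

# The latch remembers: set once, then hold with zero inputs.
s = latch_step(0, 1, 0)   # set -> 1
s = latch_step(s, 0, 0)   # hold -> still 1
s = latch_step(s, 0, 1)   # reset -> 0
print("final state:", s)
```

Since NAND is universal for combinational logic and the feedback loop supplies state, composing many such units recovers the von Neumann picture the post alludes to.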