author    Yuchen Pei <me@ypei.me>  2018-09-18 11:28:08 +0200
committer Yuchen Pei <me@ypei.me>  2018-09-18 11:28:08 +0200
commit    07f9b711a36bb89e1f99738ee1d531fa48a2fb31 (patch)
tree      9ff7a4183fa49cedc76ba7f10476e3995df45bd1
parent    be2200dacb98541afb0a438273c881a0f5858e33 (diff)
added an mpost
-rw-r--r--  microposts/rnn-turing.md  33
1 file changed, 33 insertions, 0 deletions
diff --git a/microposts/rnn-turing.md b/microposts/rnn-turing.md
new file mode 100644
index 0000000..40777c1
--- /dev/null
+++ b/microposts/rnn-turing.md
@@ -0,0 +1,33 @@
+---
+date: 2018-09-18
+---
+
+Just some **non-rigorous** rambling: feedforward networks are like combinatorial logic, and recurrent networks are like sequential logic (e.g. the D (data) flip-flop plays the role of the feedback connection in an RNN). Since NAND + combinatorial logic + sequential logic = a von Neumann machine, which (given unbounded memory) approximates a Turing machine, it is not surprising that RNNs (built on feedforward networks) are Turing complete, assuming neural networks can learn the NAND gate.
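+
+A minimal sketch of the analogy in Python, with hand-picked rather than learned weights (so the "networks can learn NAND" step is stipulated, not demonstrated); `nand_neuron`, `xor` and `sr_latch` are hypothetical names for illustration:
+
+```python
+# A single threshold neuron with hand-picked weights computes NAND:
+# it fires unless both inputs are 1.
+def nand_neuron(a, b):
+    return 1 if (3 - 2 * a - 2 * b) > 0 else 0
+
+# Combinatorial logic from NAND alone: XOR built purely out of NAND neurons.
+def xor(a, b):
+    n = nand_neuron(a, b)
+    return nand_neuron(nand_neuron(a, n), nand_neuron(b, n))
+
+# Sequential logic: an SR latch from two cross-coupled NAND neurons,
+# i.e. a feedback connection that holds one bit of state, as in an RNN.
+def sr_latch(s, r, q, q_bar):
+    for _ in range(2):  # iterate the feedback loop to a fixed point
+        q, q_bar = nand_neuron(s, q_bar), nand_neuron(r, q)
+    return q, q_bar
+
+assert [nand_neuron(a, b) for a in (0, 1) for b in (0, 1)] == [1, 1, 1, 0]
+assert [xor(a, b) for a in (0, 1) for b in (0, 1)] == [0, 1, 1, 0]
+q, q_bar = sr_latch(0, 1, 0, 1)      # set (inputs are active-low): q -> 1
+assert (q, q_bar) == (1, 0)
+q, q_bar = sr_latch(1, 1, q, q_bar)  # hold: the stored bit is retained
+assert (q, q_bar) == (1, 0)
+```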