author	Yuchen Pei <me@ypei.me>	2018-09-18 11:31:31 +0200
committer	Yuchen Pei <me@ypei.me>	2018-09-18 11:31:31 +0200
commit	7eb146a4735b21a4b8ccc8b00a6c677216206d9e (patch)
tree	f0ef1da92958ff248d2bbc41b3df32d28cc370c9
parent	07f9b711a36bb89e1f99738ee1d531fa48a2fb31 (diff)
rephrasing
-rw-r--r--  microposts/rnn-turing.md  |  2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/microposts/rnn-turing.md b/microposts/rnn-turing.md
index 40777c1..5c7605c 100644
--- a/microposts/rnn-turing.md
+++ b/microposts/rnn-turing.md
@@ -2,4 +2,4 @@
date: 2018-09-18
---
-Just some **non-rigorous** rambling: Feedforward networks are like combinatorial logic, and recurrent networks are like sequential logic (e.g. data flip-flop is like the feedback connection in RNN). Since NAND + combinatorial logic + sequential logic = von Neumann machine which is an approximation of the Turing machine, it is not surprising that RNN (with feedforward networks) is Turing complete (assuming that neural networks can learn the NAND gate).
+Just some non-rigorous guesswork: feedforward networks are like combinatorial logic, and recurrent networks are like sequential logic (e.g. a data flip-flop is like the feedback connection in an RNN). Since NAND + combinatorial logic + sequential logic = a von Neumann machine, which is an approximation of a Turing machine, it is not surprising that an RNN (together with feedforward networks) is Turing complete (assuming that neural networks can learn the NAND gate).
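The post's closing assumption is that a neural network can realize a NAND gate. As a minimal sketch (my illustration, not part of the commit or the post), a single sigmoid neuron with hand-picked parameters already computes NAND; the specific weights and bias below are assumed values chosen to saturate the sigmoid, and gradient descent on the four-row truth table would find similar ones.

```python
import math

def nand_neuron(a: int, b: int) -> float:
    # Assumed hand-picked parameters: large negative weights plus a
    # positive bias saturate the sigmoid near 1 unless both inputs are 1.
    w1, w2, bias = -20.0, -20.0, 30.0
    z = w1 * a + w2 * b + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

# Truth table check: NAND outputs 1 except when both inputs are 1.
for a in (0, 1):
    for b in (0, 1):
        print(f"NAND({a}, {b}) ~ {nand_neuron(a, b):.3f}")
```

Since NAND is functionally complete, stacking such neurons yields the combinatorial-logic half of the analogy; feeding outputs back as inputs, as an RNN does, supplies the sequential-logic half.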