path: root/microposts
author    Yuchen Pei <me@ypei.me>    2018-05-11 15:48:59 +0200
committer Yuchen Pei <me@ypei.me>    2018-05-11 15:48:59 +0200
commit a24d26ee19831f3fb10b7d78d76a7e0dcc5a510b (patch)
tree   20652d597c4aeb6da42f8cbfb7669710834a24cb /microposts
parent 67a012fdfbd62c20b25ef86da387c436bc659649 (diff)
minor edit
- added link to permalink of a previous micropost
Diffstat (limited to 'microposts')
-rw-r--r-- microposts/rnn-fsm.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/microposts/rnn-fsm.md b/microposts/rnn-fsm.md
index 567063d..a1e1315 100644
--- a/microposts/rnn-fsm.md
+++ b/microposts/rnn-fsm.md
@@ -3,7 +3,7 @@ date: 2018-05-11
---
### Some notes on RNN, FSM / FA, TM and UTM
-Related to a previous micropost.
+Related to [a previous micropost](#neural-turing-machine).
[The slides from Toronto](http://www.cs.toronto.edu/~rgrosse/csc321/lec9.pdf) is a nice introduction to RNN (recurrent neural network) from a computational point of view. It states that RNN can simulate any FSM (finite state machine, a.k.a. finite automata abbr. FA) with a toy example computing the parity of a binary string.
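The parity example mentioned in the micropost can be sketched concretely. Below is a minimal illustration (not from the post or the slides): a hand-crafted RNN with Heaviside threshold units whose single hidden state tracks the parity of the bits seen so far, showing how an RNN can simulate the two-state parity FSM. The function names are hypothetical.

```python
def step(z):
    # Heaviside threshold activation
    return 1 if z > 0 else 0

def rnn_parity(bits):
    # Hidden state s holds the parity of the bits consumed so far.
    s = 0
    for x in bits:
        # State update s <- XOR(s, x), built from threshold units
        # as OR(s, x) AND NOT AND(s, x):
        u_or = step(s + x - 0.5)   # fires iff s + x >= 1
        u_and = step(s + x - 1.5)  # fires iff s + x == 2
        s = step(u_or - u_and - 0.5)
    return s

print(rnn_parity([1, 0, 1, 1]))  # three 1s -> odd parity -> 1
```

The update rule is exactly the transition function of the parity FSM, with the FSM's two states encoded as the values 0 and 1 of the hidden unit.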