authorYuchen Pei <me@ypei.me>2019-01-27 19:26:37 +0100
committerYuchen Pei <me@ypei.me>2019-01-27 19:26:37 +0100
commit6cf0d12eafcb7432db80656dfb60dc009bb2b00d (patch)
tree233062e5007ccd4ddba83e7447c39c1e9033371f
parentef45326e0abcd5172262170bf7363ef97424ffdc (diff)
minor
-rw-r--r--  microposts/learning-undecidable.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/microposts/learning-undecidable.md b/microposts/learning-undecidable.md
index 60be24c..4507b0d 100644
--- a/microposts/learning-undecidable.md
+++ b/microposts/learning-undecidable.md
@@ -8,7 +8,7 @@ Fantastic article, very clearly written.
So it reduces a kind of learnability called estimating the maximum (EMX) to the cardinality of the real numbers, which is undecidable.
-When it comes to the relation between EMX and the rest of the machine learning framework, the article mentions that EMX belongs to "extensions of PAC learnability include Vapnik’s statistical learning setting and the equivalent general learning setting by Shalev-Shwartz and colleagues" (I have no idea what these two things are), but it does not say whether EMX is representative of, or reduces to, common learning tasks. So it is not clear whether its undecidability applies to ML at large. What do you think?
+When it comes to the relation between EMX and the rest of the machine learning framework, the article mentions that EMX belongs to "extensions of PAC learnability include Vapnik’s statistical learning setting and the equivalent general learning setting by Shalev-Shwartz and colleagues" (I have no idea what these two things are), but it does not say whether EMX is representative of, or reduces to, common learning tasks. So it is not clear whether its undecidability applies to ML at large.
Another condition of the main theorem is the union-bounded closure assumption. It seems a reasonable property for a family of sets to have, but then again I wonder how it translates to learning.