<!doctype html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Yuchen's Blog</title>
    <link rel="stylesheet" href="../assets/css/default.css" />
    <script src="https://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML" type="text/javascript"></script>
    <script src="../assets/js/analytics.js" type="text/javascript"></script>
  </head>
  <body>
    <header>
      <span class="logo">
        <a href="blog.html">Yuchen's Blog</a>
      </span>
      <nav>
        <a href="postlist.html">All posts</a><a href="index.html">About</a><a href="blog-feed.xml">Feed</a>
      </nav>
    </header>
    <div class="main">
      <div class="bodyitem">
        <a href="posts/2019-03-14-great-but-manageable-expectations.html"><h2> Great but Manageable Expectations </h2></a>
        <p>Posted on 2019-03-14</p>
        <p>This is Part 2 of a two-part blog post on differential privacy. Continuing from <a href="/posts/2019-03-13-a-tail-of-two-densities.html">Part 1</a>, I discuss Rényi differential privacy, which corresponds to the Rényi divergence, and study the moment generating functions of the divergence between probability measures to derive tail bounds.</p>
        <a href="posts/2019-03-14-great-but-manageable-expectations.html">Continue reading</a>
      </div>
      <div class="bodyitem">
        <a href="posts/2019-03-13-a-tail-of-two-densities.html"><h2> A Tail of Two Densities </h2></a>
        <p>Posted on 2019-03-13</p>
        <p>This is Part 1 of a two-part post where I give an introduction to differential privacy, which is a study of tail bounds of the divergence between probability measures, with the end goal of applying it to stochastic gradient descent.</p>
        <a href="posts/2019-03-13-a-tail-of-two-densities.html">Continue reading</a>
      </div>
      <div class="bodyitem">
        <a href="posts/2019-02-14-raise-your-elbo.html"><h2> Raise your ELBO </h2></a>
        <p>Posted on 2019-02-14</p>
        <p>In this post I give an introduction to variational inference, which is about maximising the evidence lower bound (ELBO).</p>
        <a href="posts/2019-02-14-raise-your-elbo.html">Continue reading</a>
      </div>
      <div class="bodyitem">
        <a href="posts/2019-01-03-discriminant-analysis.html"><h2> Discriminant analysis </h2></a>
        <p>Posted on 2019-01-03</p>
        <p>In this post I talk about the theory and implementation of linear and quadratic discriminant analysis, classical methods in statistical learning.</p>
        <a href="posts/2019-01-03-discriminant-analysis.html">Continue reading</a>
      </div>
      <div class="bodyitem">
        <a href="posts/2018-12-02-lime-shapley.html"><h2> Shapley, LIME and SHAP </h2></a>
        <p>Posted on 2018-12-02</p>
        <p>In this post I explain LIME (Ribeiro et al., 2016), the Shapley values (Shapley, 1953) and the SHAP values (Strumbelj-Kononenko, 2014; Lundberg-Lee, 2017).</p>
        <a href="posts/2018-12-02-lime-shapley.html">Continue reading</a>
      </div>
      <div class="bodyitem">
        <p><a href="postlist.html">older posts</a></p>
      </div>
    </div>
  </body>
</html>