Conditional Probability Estimation
An online journal club on the topic of conditional probability estimation. We cover all sorts of probabilistic approaches, such as VAEs, normalizing flows, graph neural networks, and probabilistic time series forecasting.
Introduction: Conditional Probability Estimation
12 Energy-based Models 3
References:
- PyTorch Deep Learning Lectures
- PyTorch Deep Learning Slides
- A high-bias, low-variance introduction to Machine Learning for physicists
Summary: We will discuss energy-based learning in this session.
11 Energy-based Models 2
References:
- PyTorch Deep Learning Lectures
- PyTorch Deep Learning Slides
- A high-bias, low-variance introduction to Machine Learning for physicists
Summary: We will discuss energy-based learning in this session.
10 Energy-based Models
References:
- PyTorch Deep Learning Lectures
- PyTorch Deep Learning Slides
- A high-bias, low-variance introduction to Machine Learning for physicists
Summary: We will discuss energy-based learning in this session.
9 Summary of Generative Models
References:
- Deep Generative Modelling: A Comparative Review of VAEs, GANs, Normalizing Flows, Energy-Based and Autoregressive Models
8 MAF: how is MADE being used
Summary: We discussed MAF (arXiv:1705.07057v4) last time. The paper does not explain exactly how MADE is used to update the shift and log-scale.
We will use the TensorFlow implementation of MAF to probe this question; see the sketch below. Here is the link to the relevant documentation: https://www.tensorflow.org/probability/api_docs/python/tfp/bijectors/MaskedAutoregressiveFlow
Topics: refer to references.
Notes: 1310.8499_notes.pdf
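The sketch below, based on the TFP documentation linked above, shows how a MADE network plugs into MaskedAutoregressiveFlow as its shift_and_log_scale_fn; the hidden_units and event_shape values here are illustrative assumptions, not settings from the session.

```python
import tensorflow_probability as tfp

tfb = tfp.bijectors
tfd = tfp.distributions

# A MADE network that emits two parameters per dimension: (shift, log_scale).
made = tfb.AutoregressiveNetwork(params=2, hidden_units=[32, 32], event_shape=[2])

# MaskedAutoregressiveFlow evaluates `made` on y and splits its output into
# shift and log_scale, applying, dimension by dimension,
#   y[i] = x[i] * exp(log_scale(y[:i])) + shift(y[:i]).
maf = tfb.MaskedAutoregressiveFlow(shift_and_log_scale_fn=made)

# A trainable density: a standard-normal base pushed through the flow.
dist = tfd.TransformedDistribution(
    distribution=tfd.Sample(tfd.Normal(0.0, 1.0), sample_shape=[2]),
    bijector=maf,
)

x = dist.sample(4)           # sampling is sequential: one pass per dimension
log_prob = dist.log_prob(x)  # density evaluation needs a single MADE pass
```

The masking inside AutoregressiveNetwork is what guarantees that the shift and log_scale for dimension i depend only on y[:i], which is the autoregressive property the MAF paper leaves implicit.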
7 MADE: Masked Autoencoder for Distribution Estimation
References:
- MADE: Masked Autoencoder for Distribution Estimation
Summary: refer to references.
Notes: 1310.8499_notes.pdf
6 Deep AutoRegressive Networks
References:
- Gregor, K., Danihelka, I., Mnih, A., Blundell, C., & Wierstra, D. (2014). Deep autoregressive networks. 31st International Conference on Machine Learning, ICML 2014, 4, 2991–3000.
- Autoregressive models
Summary: refer to references.
Notes: 1310.8499_notes.pdf
5 Review of Normalizing Flow
Summary: Topics: normalizing flow, applications of normalizing flow, methods of normalizing flow, and problems of normalizing flow.
4 Variational Inference and Normalizing Flow
References:
- Christopher M. Bishop. (2006). Pattern Recognition and Machine Learning. Springer.
- Rezende, D. J., & Mohamed, S. (2015). Variational Inference with Normalizing Flows.
Summary: Topics: variational inference, normalizing flow, and variational inference with normalizing flows (see the sketch below).
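As a companion to the Rezende & Mohamed reference, here is a minimal NumPy sketch of one planar-flow step from that paper; the function name and the parameter values in the usage example are illustrative assumptions.

```python
import numpy as np

def planar_flow_step(z, u, w, b):
    """One planar flow f(z) = z + u * tanh(w.z + b) with its log-det-Jacobian."""
    a = z @ w + b                        # (N,) pre-activations per sample
    f = z + np.outer(np.tanh(a), u)      # (N, D) transformed samples
    # det(df/dz) = 1 + u . psi(z) with psi(z) = (1 - tanh(a)^2) * w, hence:
    log_det = np.log(np.abs(1.0 + (u @ w) * (1.0 - np.tanh(a) ** 2)))
    return f, log_det

# Usage: transform base samples and track the change-of-variables correction,
# log q(f(z)) = log q0(z) - log|det df/dz|.
rng = np.random.default_rng(0)
z = rng.normal(size=(5, 2))
f, log_det = planar_flow_step(z, u=np.array([0.5, -0.3]), w=np.array([1.0, 2.0]), b=0.1)
```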
3 EM Methods
References:
- Trevor Hastie, Robert Tibshirani, & Jerome Friedman. The Elements of Statistical Learning. Springer Science & Business Media.
- Christopher M. Bishop. (2006). Pattern Recognition and Machine Learning. Springer.
Summary: The EM method, the expectation-maximization algorithm, is an inspiring iterative method for maximizing the log-likelihood by introducing intermediate variables such as responsibilities; see the sketch below.
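A minimal NumPy sketch of the E-step/M-step loop for a two-component 1-D Gaussian mixture, where the responsibilities r[n, k] are exactly the intermediate variables mentioned above; the initialization scheme and the two-component setting are illustrative assumptions.

```python
import numpy as np

def em_gmm_1d(x, n_iter=100):
    """EM for a two-component 1-D Gaussian mixture (illustrative sketch)."""
    # Crude initialization at the data extremes (an assumption, not a recipe).
    mu = np.array([x.min(), x.max()])
    sigma = np.array([x.std(), x.std()])
    weights = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility r[n, k] = p(component k | x_n).
        dens = np.stack(
            [weights[k]
             * np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2)
             / (np.sqrt(2.0 * np.pi) * sigma[k])
             for k in range(2)],
            axis=1,
        )
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibility-weighted data.
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        weights = nk / len(x)
    return weights, mu, sigma
```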
2 Least Squares, Bootstrap, Maximum Likelihood, and Bayesian
Summary: Relations and differences between different regression methods.
1 Conditional Probability and Bayes
References:
- Trevor Hastie, Robert Tibshirani, & Jerome Friedman. The Elements of Statistical Learning. Springer Science & Business Media.
- Association Rules
- Bayes' theorem @ Wikipedia
- Ross, S. M. (2014). Introduction to Probability and Statistics for Engineers and Scientists. Elsevier.
- Naive Bayes
Summary: Skeleton notes for conditional probability and Bayes' theorem; see the worked example below.
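To make the skeleton concrete, here is a worked Bayes'-theorem computation on the standard diagnostic-test example; all of the numbers are illustrative.

```python
# Bayes' theorem: P(disease | positive)
#   = P(positive | disease) * P(disease) / P(positive)
p_disease = 0.01              # prior prevalence
p_pos_given_disease = 0.95    # test sensitivity
p_pos_given_healthy = 0.05    # false-positive rate

# Law of total probability for the evidence term P(positive).
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1.0 - p_disease))
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)    # about 0.16: a positive test alone is weak evidence
```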