Graph Neural Networks: Basics

This will be the beginning of a new topic: Graph Neural Networks. In this new series, we will use the textbook by Hamilton1. For the first episode, we will discuss some basics about graphs to make …

Understand models to estimate conditional probabilities


The discussion is held online through Lark/WeChat.

- Lark is our primary communication channel.
- WeChat is mostly a backup.

If you would like to join the party, please create a post on GitHub Discussions.

The discussions are mostly in Chinese.

This is a bi-weekly meetup. To follow all the upcoming events, please add this ICS to your calendar.

- Calendar ICS URL: add this ICS to your calendar app to follow the upcoming events.

If you would like to add individual events by yourself, use the “**Add to Calendar**” button on each event page.

Here is a calendar web page for the upcoming events (Calendar Page).

- Everyone will get their chance to lead the discussion.
- The first principle is to understand the content. Interrupt and ask any questions to make sure we all understand the content well.

Conditional probability estimation is one of the most fundamental problems in statistics.

- Conditional probability estimation is frequently used in both real-life and academic problems. One is likely to encounter it at some point in their life.
- If you are making inferences, you are probably using conditional probabilities, explicitly or not. It is a useful perspective.
- There are many models and methods to estimate conditional probabilities. We can learn about, and from, these models and methods.
- A universal model for this task would save us a lot of time and energy, so it is worth looking for one.
- Many machine learning methods are based on conditional probabilities.
- Many classifiers
- Bayesian networks
- …
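
As a minimal illustration of the core idea (the toy data here is invented), a conditional distribution P(Y | X = x) can be estimated directly from empirical frequencies:

```python
from collections import Counter

# Toy (weather, activity) observations, invented for illustration.
data = [
    ("sunny", "walk"), ("sunny", "walk"), ("sunny", "read"),
    ("rainy", "read"), ("rainy", "read"), ("rainy", "walk"),
]

def conditional(data, x):
    """Estimate P(Y | X = x) by empirical frequencies."""
    counts = Counter(y for cond, y in data if cond == x)
    total = sum(counts.values())
    return {y: c / total for y, c in counts.items()}

print(conditional(data, "sunny"))  # {'walk': 2/3, 'read': 1/3}
```

Most of the models on this list can be read as smarter versions of this counting estimate, designed to work when x is continuous or high-dimensional and counting fails.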

- Read and discuss
- Apply the methods to toy problems

We will update this list as we move forward. Here is a partial list of references.

As a start, here is an outline of what should be covered.

- What is the conditional probability?
- Sampling theory
- Bayes
- Representation of a conditional probability
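
For the Bayes item above, a small worked example of inverting a conditional probability (the numbers are hypothetical):

```python
# Bayes: P(A | B) = P(B | A) P(A) / P(B), where
# P(B) = P(B | A) P(A) + P(B | not A) P(not A).
p_disease = 0.01              # hypothetical prior
p_pos_given_disease = 0.95    # test sensitivity
p_pos_given_healthy = 0.05    # false positive rate

p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # about 0.16, despite the 95% sensitivity
```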

- Statistical methods to estimate the conditional probability
- The list is enormous. We will only concentrate on the basics.

- Tree-based
- Tree as a “clustering” method
- Application on the bike-sharing problem
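
A sketch of the “tree as clustering” view (not code from any reference): a one-split regression tree partitions the inputs into two cells and predicts the conditional mean within each. The toy temperature/rental numbers below are invented:

```python
def best_split(xs, ys):
    """One-level regression tree (stump): pick the threshold on x that
    minimizes the total squared error of piecewise-constant predictions."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        err = sse(left) + sse(right)
        if best is None or err < best[1]:
            best = (t, err)
    return best[0]

# Hypothetical hourly temperatures (°C) vs. bike rentals.
temps = [5, 8, 10, 20, 24, 28]
rentals = [30, 35, 40, 200, 220, 210]
print(best_split(temps, rentals))  # splits cold hours from warm hours
```

Each leaf of a deeper tree is a “cluster” of inputs, and the leaf mean is an estimate of E[y | x in that cluster].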

- NN-based
- NN as feature transformations
- Application on the bike-sharing problem
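
A minimal sketch of the “NN as feature transformation” view, with hypothetical fixed weights: the hidden layer maps x to features phi(x), and a linear readout on phi(x) estimates E[y | x]:

```python
import math

def mlp_features(x, W1, b1):
    """Hidden layer as a feature transformation: phi(x) = tanh(W1 x + b1)."""
    return [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
            for row, b in zip(W1, b1)]

def predict(x, W1, b1, w2, b2):
    """Linear readout on the learned features: E[y | x] ~ w2 . phi(x) + b2."""
    phi = mlp_features(x, W1, b1)
    return sum(w * f for w, f in zip(w2, phi)) + b2

# Hypothetical weights, just to show the two-stage structure;
# in practice both stages are learned jointly by gradient descent.
W1 = [[1.0, -1.0], [0.5, 0.5]]
b1 = [0.0, 0.1]
w2 = [2.0, -1.0]
b2 = 0.5
print(predict([1.0, 2.0], W1, b1, w2, b2))
```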

- EM Methods
- Variational Methods
- Normalizing Flow
- To be added as we learn more about it
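
For the EM item above, a minimal sketch for a two-component 1-D Gaussian mixture with unit variances (toy data and initialization invented here):

```python
import math

def em_gmm_1d(xs, iters=50):
    """EM for a 1-D mixture of two unit-variance Gaussians.
    E-step: responsibilities; M-step: weighted means and mixing weight."""
    mu = [min(xs), max(xs)]   # crude initialization
    pi = 0.5
    for _ in range(iters):
        # E-step: posterior probability each point came from component 0.
        r = []
        for x in xs:
            p0 = pi * math.exp(-0.5 * (x - mu[0]) ** 2)
            p1 = (1 - pi) * math.exp(-0.5 * (x - mu[1]) ** 2)
            r.append(p0 / (p0 + p1))
        # M-step: re-estimate parameters from the responsibilities.
        n0 = sum(r)
        mu[0] = sum(ri * x for ri, x in zip(r, xs)) / n0
        mu[1] = sum((1 - ri) * x for ri, x in zip(r, xs)) / (len(xs) - n0)
        pi = n0 / len(xs)
    return mu, pi

xs = [-2.1, -1.9, -2.0, 3.0, 3.1, 2.9]
mu, pi = em_gmm_1d(xs)
print(mu, pi)  # means near -2 and 3, mixing weight near 0.5
```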

We have prepared a dataset that can be used for both classification and regression problems.
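
As a sketch of how one dataset can serve both tasks (the toy data here is invented, not the club's actual dataset): keep the continuous target for regression, and threshold it for classification:

```python
import math
import random

random.seed(0)
# Hypothetical toy dataset: a noisy sine curve used two ways.
xs = [random.uniform(0, 6) for _ in range(200)]
ys = [math.sin(x) + random.gauss(0, 0.2) for x in xs]  # regression target
labels = [1 if y > 0 else 0 for y in ys]               # classification target
```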

- Timezone conversions: World Clock

Conditional Probability Estimation

- Conditional Probability and Bayes
- Least Squares, Bootstrap, Maximum Likelihood, and Bayesian
- EM Methods
- Variational Inference, Normalizing Flow
- Review of Normalizing Flow
- Deep AutoRegressive Networks
- MADE: Masked Autoencoder for Distribution Estimation
- MAF: how is MADE being used
- Summary of Generative Models
- Energy-based Models
- Energy-based Models 2
- Energy-based Models 3
- Energy-based Models 4
- Energy-based Models 5
- Predictive Coding Approximates Backprop along Arbitrary Computation Graphs
- LTD/LTP
- Self-supervised Learning: Generative or Contrastive
- Self-supervised Learning: GAN
- Self-supervised Learning: Theories (Part 1)
- Self-supervised Learning: Theories (Part 2)
- Graph Neural Networks: Basics
- References for Probability Estimation Club