Conditional Probability Estimation

An online journal club on the topic of Conditional Probability Estimation. We cover all sorts of probabilistic approaches, such as VAEs, normalizing flows, graph neural networks, and probabilistic time series forecasting.

Introduction: Conditional Probability Estimation

23 Graph Neural Networks

Summary: Chapter 5 of Hamilton (2020). Reference: Hamilton WL. Graph Representation Learning. Synthesis Lectures on Artificial Intelligence and Machine Learning. 2020;14: 1–159. doi:10.2200/s01045ed1v01y202009aim046
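Chapter 5 develops the neural message-passing model, in which each node combines its own embedding with an aggregate of its neighbors' embeddings. As a rough companion to the reading, here is a minimal sketch of the basic update $\mathbf{h}_v \leftarrow \sigma(\mathbf{W}_{\text{self}}\mathbf{h}_v + \mathbf{W}_{\text{neigh}}\sum_{u \in \mathcal{N}(v)} \mathbf{h}_u)$; the function name, shapes, and toy graph are our own illustration, not code from the book.

```python
import numpy as np

def gnn_layer(H, A, W_self, W_neigh):
    """One basic GNN message-passing update (illustrative):
    h_v <- relu(W_self h_v + W_neigh * sum_{u in N(v)} h_u).
    H: (n, d) node features; A: (n, n) adjacency matrix."""
    neighbor_sum = A @ H                      # sum of each node's neighbor embeddings
    Z = H @ W_self.T + neighbor_sum @ W_neigh.T
    return np.maximum(Z, 0.0)                 # ReLU nonlinearity

# Toy example: a 3-node path graph with 2-dimensional node features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = rng.normal(size=(3, 2))
W_self = rng.normal(size=(2, 2))
W_neigh = rng.normal(size=(2, 2))
print(gnn_layer(H, A, W_self, W_neigh))
```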

22 Graph Neural Networks: Basics (2)

Summary: We will continue the discussion on Graph Neural Networks, covering problems of using graphs and the basics of graph neural networks. Textbook: Hamilton WL. Graph Representation Learning. Synthesis Lectures on Artificial Intelligence and Machine Learning. 2020;14: 1–159. doi:10.2200/s01045ed1v01y202009aim046

21 Graph Neural Networks: Basics

Summary: This will be the beginning of a new topic: Graph Neural Networks. In this new series, we will use the textbook by Hamilton (2020). For the first episode, we will discuss some basics about graphs to make sure we are all on the same page. @Steven will lead the discussion. Reference: Hamilton WL. Graph Representation Learning. Synthesis Lectures on Artificial Intelligence and Machine Learning. 2020;14: 1–159. doi:10.2200/s01045ed1v01y202009aim046
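For reference alongside the basics discussion, here is a minimal sketch of the two standard ways to represent a graph, an adjacency matrix and an adjacency list; the small example graph is made up for illustration.

```python
import numpy as np

# An undirected graph on 4 nodes with edges {0-1, 1-2, 2-3, 3-0}.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4

# Adjacency matrix: A[i, j] = 1 iff there is an edge between i and j.
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = A[j, i] = 1     # symmetric because the graph is undirected

# Adjacency list: each node maps to the set of its neighbors.
adj = {v: set() for v in range(n)}
for i, j in edges:
    adj[i].add(j)
    adj[j].add(i)

print(A)                      # node degrees are the row sums of A
print(adj)
```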

15 Predictive Coding Approximates Backprop along Arbitrary Computation Graphs

Summary: In this meetup, we will discuss this paper: https://arxiv.org/abs/2006.04182 Why? Feedforward networks trained with backprop usually have a loss function that involves all the parameters: backprop requires a single global loss $\mathcal{L}(\{w_{ij}\})$. However, it is hard to imagine such a global loss calculation happening in the brain. One of the alternatives is predictive coding, which only uses local connection information. In this paper (2006.04182), the authors prove the equivalence of backprop and predictive coding on arbitrary computation graphs.
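As a rough companion to the paper's idea, here is a minimal sketch of predictive-coding-style learning on a two-layer chain, assuming the common formulation where each layer keeps a value node $x_l$ and a local prediction error $\epsilon_l = x_l - f(W_l x_{l-1})$, values are relaxed by gradient descent on the energy $E = \tfrac{1}{2}\sum_l \|\epsilon_l\|^2$, and weights are then updated from purely local quantities. This is an illustration of the general scheme, not the paper's exact algorithm, and all names and sizes are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(z):                        # layer nonlinearity
    return np.tanh(z)

def df(z):                       # its derivative
    return 1.0 - np.tanh(z) ** 2

# Two-layer chain: x0 (input) -> x1 (hidden) -> x2 (clamped to target).
W1 = rng.normal(scale=0.5, size=(4, 3))
W2 = rng.normal(scale=0.5, size=(2, 4))
x0 = rng.normal(size=3)          # input, clamped
target = rng.normal(size=2)      # output, clamped during learning

x1 = f(W1 @ x0)                  # initialize hidden values feedforward
x2 = target                      # clamp the output layer to the target

lr_x, lr_w = 0.1, 0.01
for _ in range(50):              # inference phase: relax hidden values
    eps1 = x1 - f(W1 @ x0)       # local prediction error at layer 1
    eps2 = x2 - f(W2 @ x1)       # local prediction error at layer 2
    # dE/dx1 uses only locally available errors and weights.
    x1 -= lr_x * (eps1 - W2.T @ (eps2 * df(W2 @ x1)))

# Weight updates use only pre-synaptic values and local errors.
eps1 = x1 - f(W1 @ x0)
eps2 = x2 - f(W2 @ x1)
W1 += lr_w * np.outer(eps1 * df(W1 @ x0), x0)
W2 += lr_w * np.outer(eps2 * df(W2 @ x1), x1)
```

After the relaxation converges, the local errors $\epsilon_l$ play the role of backprop's gradients; this locality is exactly what the paper exploits to show the two procedures agree on arbitrary computation graphs.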