Conditional Probability Estimation

An online discussion group on the topic Conditional Probability Estimation


21 Graph Neural Networks: Basics

Summary: This will be the beginning of a new topic: Graph Neural Networks. In this new series, we will follow the textbook by Hamilton [Hamilton2020]. For the first episode, we will discuss some basics about graphs to make sure we are all on the same page. @Steven will lead the discussion.

[Hamilton2020] Hamilton WL. Graph Representation Learning. Synthesis Lectures on Artificial Intelligence and Machine Learning. 2020;14:1–159. doi:10.2200/s01045ed1v01y202009aim046

15 Predictive Coding Approximates Backprop along Arbitrary Computation Graphs

Summary: In this meetup, we will discuss this paper: https://arxiv.org/abs/2006.04182 Why? In a feedforward network trained with backprop, the loss function involves all the parameters: backprop requires a single global loss $\mathcal{L}(\{w_{ij}\})$. However, it is hard to imagine such a global loss calculation happening in the brain. One alternative is predictive coding, which only uses local connection information. In this paper (2006.04182), the authors prove the equivalence of backprop and predictive coding on arbitrary computation graphs.