vinsis/math-and-ml-notes
Books, papers and links to latest research in ML/AI
| field | value |
| --- | --- |
| repo name | vinsis/math-and-ml-notes |
| repo link | https://github.com/vinsis/math-and-ml-notes |
| homepage | |
| language | Jupyter Notebook |
| size (curr.) | 2224 kB |
| stars (curr.) | 66 |
| created | 2019-09-11 |
| license | MIT License |
Links to some important research papers and other resources. I plan to add notes as I go through each topic one by one.
✔ 1. Information-theory-based (unsupervised) learning
- Invariant Information Clustering
- Mutual Information Neural Estimation
- Deep Infomax
- Learning Representations by Maximizing Mutual Information Across Views
- How Google decoupled MI maximization and representation learning: On Mutual Information Maximization for Representation Learning
New from NeurIPS 2019
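As a quick note on the MINE paper above: it maximizes the Donsker-Varadhan lower bound on mutual information with a small critic network. Below is a minimal sketch of that bound, assuming PyTorch; the `StatisticsNetwork` name and layer sizes are illustrative choices, not taken from the paper.

```python
import math
import torch
import torch.nn as nn

class StatisticsNetwork(nn.Module):
    """Small critic T(x, y) whose outputs are used to lower-bound I(X; Y)."""
    def __init__(self, dim_x, dim_y, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_x + dim_y, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=-1)).squeeze(-1)

def mine_lower_bound(T, x, y):
    """Donsker-Varadhan bound: E_p(x,y)[T] - log E_p(x)p(y)[exp(T)].

    Samples from the product of marginals are approximated by
    shuffling y within the batch.
    """
    joint_term = T(x, y).mean()
    y_shuffled = y[torch.randperm(y.size(0))]
    marginal_term = torch.logsumexp(T(x, y_shuffled), dim=0) - math.log(y.size(0))
    return joint_term - marginal_term  # maximize this w.r.t. T's parameters
```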
2. Disentangled representations
- Quick overview by Google
- β-VAE, pdf
- Understanding disentangling in β-VAE
- Disentangling Disentanglement in Variational Autoencoders
- Isolating Sources of Disentanglement in Variational Autoencoders
- InfoGAN-CR: Disentangling Generative Adversarial Networks with Contrastive Regularizers
- Disentangling by Factorising, pdf
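As a pointer for the β-VAE papers above, the core object is the β-weighted ELBO. A minimal sketch, assuming a Gaussian encoder and a Bernoulli decoder likelihood (both illustrative assumptions):

```python
import torch
import torch.nn.functional as F

def beta_vae_loss(x, x_recon, mu, logvar, beta=4.0):
    """beta-VAE objective: reconstruction + beta * KL(q(z|x) || N(0, I)).

    With beta = 1 this is the standard (negated) VAE ELBO; beta > 1
    puts extra pressure on the KL term, which is what the beta-VAE
    papers argue encourages disentangled latents.
    """
    recon = F.binary_cross_entropy(x_recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + beta * kl
```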
3. Contrastive Coding
- Representation Learning with Contrastive Predictive Coding
- Data-Efficient Image Recognition with Contrastive Predictive Coding
- Contrastive Multiview Coding
- Momentum Contrast for Unsupervised Visual Representation Learning
- Google dispelling a lot of misconceptions about disentangled representations: Challenging Common Assumptions in the Unsupervised Learning of Disentangled Representations
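The contrastive papers above (CPC, CMC, MoCo) all build on an InfoNCE-style loss. A minimal sketch, assuming two batches of already-computed embeddings; the temperature value is only illustrative:

```python
import torch
import torch.nn.functional as F

def info_nce(queries, keys, temperature=0.07):
    """InfoNCE loss: the positive key for each query sits at the same
    batch index; all other keys in the batch act as negatives.

    queries, keys: (N, D) embedding tensors.
    """
    q = F.normalize(queries, dim=-1)
    k = F.normalize(keys, dim=-1)
    logits = q @ k.t() / temperature                      # (N, N) similarities
    labels = torch.arange(q.size(0), device=logits.device)  # positives on the diagonal
    return F.cross_entropy(logits, labels)
```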
✔ 4. Automatic differentiation
- Automatic differentiation in machine learning: a survey
- Automatic Reverse-Mode Differentiation: Lecture Notes
- Reverse mode automatic differentiation
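Since this section is about how reverse-mode differentiation works, here is a tiny self-contained toy sketch of it; this is my own illustration, not code from the survey or lecture notes above.

```python
class Var:
    """Minimal scalar value supporting reverse-mode automatic differentiation."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self.parents = parents  # list of (parent Var, local derivative)

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Accumulate d(output)/d(self) into every ancestor via the chain rule.
        # (Path-by-path recursion: correct but inefficient for large graphs.)
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x, y = Var(2.0), Var(3.0)
z = x * y + x          # z = x*y + x
z.backward()
print(x.grad, y.grad)  # dz/dx = y + 1 = 4.0, dz/dy = x = 2.0
```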
5. NNs and ODEs
- Neural Ordinary Differential Equations
- Augmented Neural ODEs
- Invertible ResNets
- Universal Differential Equations for Scientific Machine Learning
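The core idea behind the Neural ODE papers above is treating a network as the dynamics dh/dt = f(h, t) and integrating it. A minimal sketch with a fixed-step Euler solver; the real papers use adaptive solvers and the adjoint method, and the module below is only an illustrative assumption:

```python
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    """dh/dt = f(h, t), parameterized by a small MLP."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, h, t):
        t_col = torch.full_like(h[:, :1], t)  # append time as an extra feature
        return self.net(torch.cat([h, t_col], dim=-1))

def odeint_euler(func, h0, t0=0.0, t1=1.0, steps=20):
    """Fixed-step Euler integration of dh/dt = func(h, t) from t0 to t1."""
    h, dt = h0, (t1 - t0) / steps
    for i in range(steps):
        h = h + dt * func(h, t0 + i * dt)
    return h
```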
6. Probabilistic Programming
- Probabilistic models of cognition
- The Design and Implementation of Probabilistic Programming Languages
- Composition in Probabilistic Language Understanding
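The books above are about writing generative models as programs and conditioning them on observations. A toy rejection-sampling version of that idea in plain Python; the coin-bias model is a made-up example, not one from the books:

```python
import random

def model():
    """A generative program: pick a coin bias, then flip the coin twice."""
    bias = random.choice([0.3, 0.5, 0.9])
    flips = [random.random() < bias for _ in range(2)]
    return bias, flips

def posterior_via_rejection(observed_flips, num_samples=100_000):
    """Condition the program on the observed flips by rejection sampling."""
    accepted = []
    for _ in range(num_samples):
        bias, flips = model()
        if flips == observed_flips:
            accepted.append(bias)
    # Empirical posterior over the latent bias given the observation.
    return {b: accepted.count(b) / len(accepted) for b in set(accepted)}

print(posterior_via_rejection([True, True]))
```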
7. Miscellaneous
Memorization in neural networks
Online Learning
Graph Neural Networks
Normalizing Flows
- Detailed hands-on introduction
- Normalizing Flows for Probabilistic Modeling and Inference
- PyTorch implementations of density estimation algorithms
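The mechanical step shared by all the normalizing-flow links above is the change-of-variables formula. A minimal sketch for a single invertible affine transform (a one-layer toy; names and shapes are chosen only for illustration):

```python
import torch

def affine_flow_log_prob(x, scale, shift):
    """log p_X(x) for x = exp(scale) * z + shift with z ~ N(0, I).

    Change of variables: log p_X(x) = log p_Z(z) - sum(scale),
    where z = (x - shift) * exp(-scale) and sum(scale) is the
    log|det Jacobian| of the forward transform.
    """
    z = (x - shift) * torch.exp(-scale)
    base_log_prob = torch.distributions.Normal(0.0, 1.0).log_prob(z).sum(-1)
    log_det = scale.sum(-1)
    return base_log_prob - log_det
```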
Transformers
Others
- Zero-shot knowledge transfer
- SpecNet
- Deep Learning & Symbolic Mathematics
- Learning to Predict Layout-to-image Conditional Convolutions for Semantic Image Synthesis
- Deep Equilibrium Models
8. Theory of neural networks
Lottery tickets
- Lottery ticket hypothesis
- Deconstructing Lottery Tickets: Zeros, Signs, and the Supermask
- Rigging the Lottery: Making All Tickets Winners
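The lottery-ticket papers above revolve around pruning by weight magnitude and rewinding the surviving weights to their initialization. A minimal sketch of one layer-wise pruning round in PyTorch; the prune fraction and the rewind-to-init detail are simplified assumptions (the papers prune iteratively over many train/prune rounds):

```python
import torch
import torch.nn as nn

def magnitude_prune_masks(model, fraction=0.2):
    """One round of layer-wise magnitude pruning: mask out the smallest
    `fraction` of weights (by absolute value) in every Linear layer."""
    masks = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.Linear):
            w = module.weight.detach().abs()
            threshold = torch.quantile(w.flatten(), fraction)
            masks[name] = (w > threshold).float()
    return masks

def apply_masks(model, masks, initial_state):
    """Rewind surviving weights to their initial values (the 'winning ticket')
    and zero out the pruned ones."""
    for name, module in model.named_modules():
        if name in masks:
            init_w = initial_state[name + ".weight"]
            module.weight.data = init_w * masks[name]
```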
Others
- What’s Hidden in a Randomly Weighted Neural Network?
- Topological properties of the set of functions generated by neural networks of fixed size
- Your Classifier is Secretly an Energy Based Model and You Should Treat It Like One
- Neural Persistence: A Complexity Measure for Deep Neural Networks Using Algebraic Topology
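The energy-based-model paper in this list reinterprets an ordinary classifier's logits as defining an unnormalized density over inputs. A one-function sketch of that reinterpretation, assuming an arbitrary classifier that returns logits:

```python
import torch

def energy_from_logits(logits):
    """Reinterpret classifier logits f(x) as an energy model over x:
    E(x) = -logsumexp_y f(x)[y], so p(x) is proportional to exp(-E(x)),
    while p(y|x) = softmax(f(x)) stays the usual classifier."""
    return -torch.logsumexp(logits, dim=-1)
```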
9. Advanced Variational Inference
- Amortized Population Gibbs Samplers with Neural Sufficient Statistics
- Evaluating Combinatorial Generalization in Variational Autoencoders