anhquan0412/basic_model_scratch
Implementation of some classic machine learning models from scratch, benchmarked against a popular ML library
| | |
| --- | --- |
| repo name | anhquan0412/basic_model_scratch |
| repo link | https://github.com/anhquan0412/basic_model_scratch |
| homepage | |
| language | Jupyter Notebook |
| size (curr.) | 34769 kB |
| stars (curr.) | 608 |
| created | 2018-03-14 |
| license | |
Machine Learning from scratch!
Update: The code implementations have been moved to Python modules. The notebooks now only show results and model comparisons.
To refresh my knowledge, I will attempt to implement some basic machine learning algorithms from scratch, using only Python and a limited set of NumPy/pandas functions. My implementations are compared against the equivalent models from a popular ML library (sklearn).
- Linear Regression with weight decay (L2 regularization); see the first sketch after this list
- Logistic Regression with weight decay
- Random Forest with permutation feature importances
- K-Nearest Neighbors: supervised and unsupervised
- Neural network for classification, with:
    - Stochastic gradient descent
    - Multiple hidden layers
    - A variety of activation functions and their gradients (sigmoid, softmax, ReLU, ...), configurable per hidden layer
    - L2 regularization
    - Dropout
    - Dynamic learning-rate optimizers (momentum, RMSProp, and Adam); see the second sketch after this list
    - TODO: batchnorm
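
For illustration only, here is a minimal sketch (my own code, not the repository's) of the kind of comparison the notebooks make: linear regression with L2 weight decay fitted by batch gradient descent, checked against scikit-learn's `Ridge`. The data, hyperparameters, and function name are assumptions.

```python
# A minimal sketch (not the repo's implementation): linear regression with
# L2 weight decay, trained by batch gradient descent, compared to sklearn's Ridge.
import numpy as np
from sklearn.linear_model import Ridge

def fit_linreg_l2(X, y, lr=0.1, wd=1.0, n_iter=5000):
    """Gradient descent on mean squared error plus an L2 penalty on the weights."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_iter):
        err = X @ w + b - y                   # residuals, shape (n,)
        grad_w = X.T @ err / n + wd * w / n   # weight decay term (bias not penalized)
        grad_b = err.mean()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.3 + rng.normal(scale=0.1, size=200)

w, b = fit_linreg_l2(X, y, wd=1.0)
ridge = Ridge(alpha=1.0).fit(X, y)            # same objective, solved in closed form
print(np.round(w, 3), round(float(b), 3))
print(np.round(ridge.coef_, 3), round(float(ridge.intercept_), 3))
```

With the same penalty strength (`wd` here matching Ridge's `alpha`), the two sets of coefficients should agree to a few decimal places.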
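
As a reference for the dynamic learning-rate optimizers, this is a hedged NumPy sketch of the standard Adam update rule; the variable names are mine and may not match the module's.

```python
# A sketch of the standard Adam update (names are mine, not the module's).
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; returns new weights and updated moment estimates."""
    m = beta1 * m + (1 - beta1) * grad         # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad**2      # second moment (RMSProp-style)
    m_hat = m / (1 - beta1**t)                 # bias correction, t starts at 1
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([5.0, -3.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.05)
print(np.round(w, 4))                          # should be close to [0, 0]
```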
The following notebooks use the PyTorch library, so they are not implemented from scratch. However, I try not to use any high-level PyTorch functions.
- PyTorch neural network, with:
    - Custom data loader
    - Data augmentation on 1-channel images: torchvision vs. fastai
    - Shallow NN with batchnorm and dropout
    - Learning rate finder; see the sketch after this list
- Autoencoder
- Collaborative Filtering
- Char RNN in Vietnamese (Fast.ai)
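
For the learning rate finder, the idea is to ramp the learning rate up exponentially over one pass of mini-batches while recording the loss, then pick a rate from the region where the loss drops fastest, before it blows up. Below is a hedged sketch of that procedure; the model, data, and LR bounds are placeholder assumptions, not the notebook's code.

```python
# A rough sketch of a learning-rate finder: ramp the LR exponentially per
# mini-batch and record the loss (placeholder model/data, not the notebook's code).
import torch
import torch.nn.functional as F

def lr_find(model, batches, lr_min=1e-6, lr_max=1.0):
    opt = torch.optim.SGD(model.parameters(), lr=lr_min)
    gamma = (lr_max / lr_min) ** (1 / max(len(batches) - 1, 1))  # exponential ramp
    lrs, losses = [], []
    for step, (xb, yb) in enumerate(batches):
        lr = lr_min * gamma ** step
        for group in opt.param_groups:
            group["lr"] = lr
        loss = F.cross_entropy(model(xb), yb)
        opt.zero_grad()
        loss.backward()
        opt.step()
        lrs.append(lr)
        losses.append(loss.item())
        if losses[-1] > 4 * min(losses):       # stop once the loss diverges
            break
    return lrs, losses                         # plot loss vs. lr, pick the steepest descent

# Toy usage with random data and a linear classifier.
model = torch.nn.Linear(20, 3)
batches = [(torch.randn(32, 20), torch.randint(0, 3, (32,))) for _ in range(100)]
lrs, losses = lr_find(model, batches)
```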