September 25, 2019

336 words 2 mins read

dh7/ML-Tutorial-Notebooks

This repository contains tutorials for real beginners who want to understand machine learning by reading some code.

repo name dh7/ML-Tutorial-Notebooks
repo link https://github.com/dh7/ML-Tutorial-Notebooks
homepage
language Jupyter Notebook
size (curr.) 2573 kB
stars (curr.) 39
created 2016-05-06
license BSD 2-Clause “Simplified” License

ML-Tutorial-Notebooks

Machine learning tutorial from scratch.

This repository contains some tutorials for real beginners who want to understand machine learning by reading some code.

Minimal character-level Vanilla RNN model, explained in a notebook

RNN stands for “Recurrent Neural Network”.
To understand why RNNs are so hot, you must read this!

This notebook explains the Minimal character-level Vanilla RNN model written by Andrej Karpathy.
The code creates an RNN that generates text, character after character, by learning character after character from a text file.

I love this character-level Vanilla RNN code because it doesn’t use any library except numpy. All the NN magic is in 112 lines of code, with no dependencies to understand. Everything is there! I’ll try to explain every line of it in detail. Disclaimer: I still need to use some external links for reference.
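To give a feel for what those 112 lines do, here is a sketch of a single forward step of a character-level vanilla RNN in plain numpy. This is illustrative only, with made-up sizes and my own variable names; it is not Karpathy's exact code, though the weight-matrix naming (`Wxh`, `Whh`, `Why`) follows the same convention.

```python
import numpy as np

# Hypothetical sizes for illustration (not the original script's values).
vocab_size, hidden_size = 4, 8
Wxh = np.random.randn(hidden_size, vocab_size) * 0.01   # input  -> hidden
Whh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden -> hidden
Why = np.random.randn(vocab_size, hidden_size) * 0.01   # hidden -> output
bh = np.zeros((hidden_size, 1))                         # hidden bias
by = np.zeros((vocab_size, 1))                          # output bias

def step(char_index, h_prev):
    """Consume one character, return the next hidden state and
    a probability distribution over the next character."""
    x = np.zeros((vocab_size, 1))
    x[char_index] = 1                         # one-hot encode the input char
    h = np.tanh(Wxh @ x + Whh @ h_prev + bh)  # new hidden state
    y = Why @ h + by                          # unnormalized log-probabilities
    p = np.exp(y) / np.sum(np.exp(y))         # softmax -> probabilities
    return h, p

h = np.zeros((hidden_size, 1))                # initial hidden state
h, p = step(0, h)                             # feed one character in
```

Generation works by sampling the next character from `p`, feeding it back in as the next input, and repeating.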

Minimal character-level TensorFlow RNN model

112 lines of code to implement a character-level RNN in TensorFlow.
Here is the code! This is an adaptation of the Minimal character-level Vanilla RNN model written by Andrej Karpathy.

Character-level TensorFlow RNN model.

If you want to go deeper into TensorFlow for RNNs, this notebook tries to explain the original code from Sherjil Ozair.

Linear regression with TensorFlow.

This notebook is a starting point for TensorFlow, using a simple example from Aymeric Damien.
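The idea behind that example can be sketched without TensorFlow at all: fit `y = W*x + b` by gradient descent on the mean squared error. The snippet below is a minimal plain-numpy version under assumed true parameters (W=3, b=2), not the notebook's actual code.

```python
import numpy as np

# Synthetic data: y = 3x + 2 plus a little noise (assumed example values).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)
y = 3.0 * X + 2.0 + rng.normal(0, 0.1, size=100)

W, b = 0.0, 0.0   # parameters to learn
lr = 0.01         # learning rate

for _ in range(2000):
    err = (W * X + b) - y            # prediction error
    W -= lr * 2 * np.mean(err * X)   # gradient of MSE w.r.t. W
    b -= lr * 2 * np.mean(err)       # gradient of MSE w.r.t. b
```

After training, `W` and `b` land close to the true values 3 and 2. The TensorFlow version expresses the same loop with placeholders, a loss op, and an optimizer.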

Fizz Buzz with TensorFlow.

This notebook explains the code from the Fizz Buzz in TensorFlow blog post written by Joel Grus.
You should read his post first!

His code tries to play the Fizz Buzz game using machine learning.
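The core trick in that style of model is the encoding: each number goes in as its binary digits, and the network predicts one of four classes (the number itself, "fizz", "buzz", or "fizzbuzz"). A sketch of those two encodings, with my own function names:

```python
def binary_encode(i, num_digits=10):
    """Encode an integer as a list of its binary digits, low bit first."""
    return [(i >> d) & 1 for d in range(num_digits)]

def fizz_buzz_label(i):
    """Class index: 0 = the number itself, 1 = 'fizz', 2 = 'buzz', 3 = 'fizzbuzz'."""
    if i % 15 == 0:
        return 3
    if i % 5 == 0:
        return 2
    if i % 3 == 0:
        return 1
    return 0
```

The network then learns a mapping from the binary digits to the class, which is what makes the exercise funny: it has to rediscover divisibility rules from examples.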

Temperature

Temperature is a concept used when you need to generate a random number from a probability vector but want to over-emphasize the samples that have the highest probability.

This notebook shows the effects in practice.
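The mechanics can be sketched in a few lines of numpy (variable names are my own, not the notebook's): divide the log-probabilities by the temperature, then re-normalize with a softmax. A temperature below 1 sharpens the distribution toward the most likely entries; above 1 flattens it toward uniform.

```python
import numpy as np

def apply_temperature(probs, temperature):
    """Rescale a probability vector: T < 1 sharpens it, T > 1 flattens it."""
    logits = np.log(np.asarray(probs, dtype=float)) / temperature
    exp = np.exp(logits - logits.max())   # subtract max for numerical stability
    return exp / exp.sum()

p = np.array([0.5, 0.3, 0.2])
sharp = apply_temperature(p, 0.5)   # top entry's probability grows above 0.5
flat = apply_temperature(p, 2.0)    # probabilities move closer to uniform
```

Sampling from `sharp` instead of `p` makes generated text more conservative; sampling from `flat` makes it more surprising.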
