# TFLearn Examples

## Basics
- Linear Regression. Implement a linear regression with TFLearn; a minimal sketch follows this list.
- Logical Operators. Implement logical operators with TFLearn (also demonstrates the 'merge' op).
- Weights Persistence. Save and restore a model.
- Fine-Tuning. Fine-tune a pre-trained model on a new task.
- Using HDF5. Use HDF5 to handle large datasets.
- Using DASK. Use DASK to handle large datasets.
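As a taste of these basics, here is a minimal linear-regression sketch in the spirit of the first item above; the toy data is invented for illustration:

```python
import tflearn

# Toy data, invented for illustration: y is roughly 2*x
X = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
Y = [1.9, 4.1, 6.2, 7.8, 10.1, 11.9, 14.2, 15.8, 18.1, 19.9]

# A single linear unit trained with mean-square loss and SGD
input_ = tflearn.input_data(shape=[None])
linear = tflearn.single_unit(input_)
regression = tflearn.regression(linear, optimizer='sgd',
                                loss='mean_square', learning_rate=0.01)
m = tflearn.DNN(regression)
m.fit(X, Y, n_epoch=1000, show_metric=True, snapshot_epoch=False)

print("Learned W:", m.get_weights(linear.W), "b:", m.get_weights(linear.b))
print("Prediction for x=11:", m.predict([11]))
```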
## Extending TensorFlow
- Layers. Use TFLearn layers along with TensorFlow.
- Trainer. Use the TFLearn Trainer class to train any TensorFlow graph; see the sketch after this list.
- Built-in Ops. Use TFLearn built-in operations along with TensorFlow.
- Summaries. Use TFLearn summarizers along with TensorFlow.
- Variables. Use TFLearn variables along with TensorFlow.
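A minimal sketch of this mix-and-match style (assuming TensorFlow 1.x and TFLearn's bundled MNIST loader): the graph is built from TFLearn layers and built-in ops, and the training loop is driven by the Trainer class:

```python
import tensorflow as tf
import tflearn
import tflearn.datasets.mnist as mnist

trainX, trainY, testX, testY = mnist.load_data(one_hot=True)

# Plain TensorFlow placeholders, TFLearn layers in between
X = tf.placeholder(shape=(None, 784), dtype=tf.float32)
Y = tf.placeholder(shape=(None, 10), dtype=tf.float32)
net = tflearn.fully_connected(X, 128, activation='relu')
net = tflearn.fully_connected(net, 10, activation='softmax')

# TFLearn built-in ops for loss and accuracy, a raw TF optimizer
loss = tflearn.categorical_crossentropy(net, Y)
accuracy = tflearn.metrics.accuracy_op(net, Y)
optimizer = tf.train.AdamOptimizer(learning_rate=0.001)

# The Trainer class runs the loop over any TF graph
trainop = tflearn.TrainOp(loss=loss, optimizer=optimizer,
                          metric=accuracy, batch_size=128)
trainer = tflearn.Trainer(train_ops=trainop, tensorboard_verbose=0)
trainer.fit({X: trainX, Y: trainY},
            val_feed_dicts={X: testX, Y: testY},
            n_epoch=2, show_metric=True)
```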
## Computer Vision
### Supervised
- Multi-layer perceptron. A multi-layer perceptron implementation for the MNIST classification task.
- Convolutional Network (MNIST). A convolutional neural network for classifying the MNIST dataset; a minimal sketch follows this list.
- Convolutional Network (CIFAR-10). A convolutional neural network for classifying the CIFAR-10 dataset.
- Network in Network. A 'Network in Network' implementation for classifying the CIFAR-10 dataset.
- Alexnet. Apply AlexNet to the Oxford Flowers 17 classification task.
- VGGNet. Apply a VGG network to the Oxford Flowers 17 classification task.
- VGGNet Finetuning (Fast Training). Use a pre-trained VGG network and retrain it on your own data for fast training.
- RNN Pixels. Use an RNN (over the sequence of pixels) to classify images.
- Highway Network. A highway network implementation for classifying the MNIST dataset.
- Highway Convolutional Network. A highway convolutional network implementation for classifying the MNIST dataset.
- Residual Network (MNIST). A bottleneck residual network applied to the MNIST classification task.
- Residual Network (CIFAR-10). A residual network applied to the CIFAR-10 classification task.
- ResNeXt (CIFAR-10). An aggregated residual transformations network (ResNeXt) applied to the CIFAR-10 classification task.
- Google Inception (v3). Google's Inception v3 network applied to the Oxford Flowers 17 classification task.
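A minimal sketch of the MNIST convolutional network in the canonical TFLearn layer style (hyperparameters are illustrative):

```python
import tflearn
from tflearn.layers.core import input_data, dropout, fully_connected
from tflearn.layers.conv import conv_2d, max_pool_2d
from tflearn.layers.estimator import regression
import tflearn.datasets.mnist as mnist

X, Y, testX, testY = mnist.load_data(one_hot=True)
X = X.reshape([-1, 28, 28, 1])
testX = testX.reshape([-1, 28, 28, 1])

# Two conv/pool stages followed by a small classifier head
network = input_data(shape=[None, 28, 28, 1])
network = conv_2d(network, 32, 3, activation='relu')
network = max_pool_2d(network, 2)
network = conv_2d(network, 64, 3, activation='relu')
network = max_pool_2d(network, 2)
network = fully_connected(network, 128, activation='relu')
network = dropout(network, 0.5)
network = fully_connected(network, 10, activation='softmax')
network = regression(network, optimizer='adam', learning_rate=0.001,
                     loss='categorical_crossentropy')

model = tflearn.DNN(network, tensorboard_verbose=0)
model.fit(X, Y, n_epoch=1, validation_set=(testX, testY), show_metric=True)
```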
### Unsupervised
- Auto Encoder. An auto-encoder applied to MNIST handwritten digits; a minimal sketch follows this list.
- Variational Auto Encoder. A Variational Auto Encoder (VAE) trained to generate digit images.
- GAN (Generative Adversarial Networks). Use generative adversarial networks (GAN) to generate digit images from a noise distribution.
- DCGAN (Deep Convolutional Generative Adversarial Networks). Use deep convolutional generative adversarial networks (DCGAN) to generate digit images from a noise distribution.
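A minimal auto-encoder sketch: the network is trained to reconstruct its own input, so the targets are the images themselves (layer sizes are illustrative):

```python
import tflearn
import tflearn.datasets.mnist as mnist

X, Y, testX, testY = mnist.load_data(one_hot=True)

# Encoder squeezes 784 pixels down to a 64-d code; decoder reconstructs
encoder = tflearn.input_data(shape=[None, 784])
encoder = tflearn.fully_connected(encoder, 256, activation='relu')
encoder = tflearn.fully_connected(encoder, 64, activation='relu')
decoder = tflearn.fully_connected(encoder, 256, activation='relu')
decoder = tflearn.fully_connected(decoder, 784, activation='sigmoid')

# Targets are the inputs: pure reconstruction with mean-square loss
net = tflearn.regression(decoder, optimizer='adam', learning_rate=0.001,
                         loss='mean_square', metric=None)
model = tflearn.DNN(net, tensorboard_verbose=0)
model.fit(X, X, n_epoch=10, validation_set=(testX, testX),
          show_metric=True, batch_size=256)
```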
## Natural Language Processing
- Recurrent Neural Network (LSTM). Apply an LSTM to the IMDB sentiment classification task; a minimal sketch follows this list.
- Bi-Directional RNN (LSTM). Apply a bi-directional LSTM to the IMDB sentiment classification task.
- Dynamic RNN (LSTM). Apply a dynamic LSTM to classify variable-length text from the IMDB dataset.
- City Name Generation. Generate new US city names using an LSTM network.
- Shakespeare Scripts Generation. Generate new Shakespeare scripts using an LSTM network.
- Seq2seq. A pedagogical example of a seq2seq recurrent network. See this repo for full instructions.
- CNN Seq. Apply a 1-D convolutional network to classify sequences of words from the IMDB sentiment dataset.
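A minimal sketch of the LSTM sentiment classifier (vocabulary size and sequence length are illustrative):

```python
import tflearn
from tflearn.data_utils import to_categorical, pad_sequences
from tflearn.datasets import imdb

# IMDB dataset, restricted to the 10,000 most frequent words
train, test, _ = imdb.load_data(path='imdb.pkl', n_words=10000,
                                valid_portion=0.1)
trainX, trainY = train
testX, testY = test

# Pad every review to length 100, one-hot encode the two labels
trainX = pad_sequences(trainX, maxlen=100, value=0.)
testX = pad_sequences(testX, maxlen=100, value=0.)
trainY = to_categorical(trainY, 2)
testY = to_categorical(testY, 2)

# Embedding -> LSTM -> softmax over the two sentiment classes
net = tflearn.input_data([None, 100])
net = tflearn.embedding(net, input_dim=10000, output_dim=128)
net = tflearn.lstm(net, 128, dropout=0.8)
net = tflearn.fully_connected(net, 2, activation='softmax')
net = tflearn.regression(net, optimizer='adam', learning_rate=0.001,
                         loss='categorical_crossentropy')

model = tflearn.DNN(net, tensorboard_verbose=0)
model.fit(trainX, trainY, validation_set=(testX, testY),
          show_metric=True, batch_size=32)
```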
## Reinforcement Learning
- Atari Pacman 1-step Q-Learning. Teach a machine to play Atari games (Pacman by default) using 1-step Q-learning; a toy sketch of the update rule follows.
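The Atari example wraps the Q-function in a convolutional network, but the 1-step update itself is simple. Here is a framework-free toy sketch on a 5-state chain; all names and constants are illustrative, not taken from the example:

```python
import random

# Tabular 1-step Q-learning on a 5-state chain: reward only at the
# rightmost state, actions step left or right.
N_STATES, ACTIONS = 5, (-1, +1)
alpha, gamma, epsilon = 0.1, 0.9, 0.1    # learning rate, discount, exploration
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def greedy(s):
    # Greedy action with random tie-breaking
    best = max(Q[(s, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if Q[(s, a)] == best])

for episode in range(500):
    s = 0
    while s != N_STATES - 1:
        # Epsilon-greedy action selection
        a = random.choice(ACTIONS) if random.random() < epsilon else greedy(s)
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        # 1-step update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS) - Q[(s, a)])
        s = s2

print({k: round(v, 3) for k, v in Q.items()})
```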
## Others
- Recommender - Wide & Deep Network. A pedagogical example of wide & deep networks for recommender systems; a minimal sketch follows.
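The core idea is to sum a linear ('wide') branch and an MLP ('deep') branch over the same features before squashing to a probability. A minimal sketch; the feature width and layer sizes are invented here, whereas the real example engineers its features from census data:

```python
import tflearn

N_FEATURES = 10  # hypothetical input width; the real example derives it from data

inp = tflearn.input_data(shape=[None, N_FEATURES])

# "Wide" branch: a single linear unit over the raw features
wide = tflearn.fully_connected(inp, 1, activation='linear')

# "Deep" branch: a small MLP over the same features
deep = tflearn.fully_connected(inp, 32, activation='relu')
deep = tflearn.fully_connected(deep, 32, activation='relu')
deep = tflearn.fully_connected(deep, 1, activation='linear')

# Sum the two branches, then squash to a 0-1 score
both = tflearn.merge([wide, deep], mode='elemwise_sum')
both = tflearn.activation(both, activation='sigmoid')

net = tflearn.regression(both, optimizer='sgd', learning_rate=0.01,
                         loss='binary_crossentropy')
model = tflearn.DNN(net)
# model.fit(X, Y, ...) once features and 0/1 labels are prepared
```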
## Notebooks
- Spiral Classification Problem. A TFLearn implementation of the spiral classification problem from Stanford CS231n; a minimal sketch follows.
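For flavor, a minimal sketch that generates CS231n-style spiral data and fits a small TFLearn MLP to it (layer sizes are illustrative):

```python
import numpy as np
import tflearn
from tflearn.data_utils import to_categorical

# CS231n-style spiral data: K interleaved arms of N 2-D points each
N, K = 100, 3
X = np.zeros((N * K, 2), dtype='float32')
y = np.zeros(N * K, dtype='int32')
for k in range(K):
    ix = range(N * k, N * (k + 1))
    r = np.linspace(0.0, 1.0, N)                                           # radius
    t = np.linspace(k * 4.0, (k + 1) * 4.0, N) + np.random.randn(N) * 0.2  # angle
    X[ix] = np.c_[r * np.sin(t), r * np.cos(t)]
    y[ix] = k
Y = to_categorical(y, K)

# A small MLP is enough to carve out the three spiral arms
net = tflearn.input_data(shape=[None, 2])
net = tflearn.fully_connected(net, 100, activation='relu')
net = tflearn.fully_connected(net, K, activation='softmax')
net = tflearn.regression(net, optimizer='adam', learning_rate=0.01,
                         loss='categorical_crossentropy')
tflearn.DNN(net).fit(X, Y, n_epoch=100, show_metric=True)
```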