The first five steps are about creating the deep learning model. I trained it in a Jupyter notebook on Google Colab with fastai, as explained in the 2020 lectures.
1. download images using the Bing Image Search API;
2. manually remove the irrelevant images;
3. apply data augmentation;
4. train the deep learning model;
5. save all the scripts, including the export.pkl file, in a GitHub repository.
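The steps above can be sketched in fastai v2 code. This is a hypothetical outline, not the exact notebook from the repository: the emotion labels, folder layout, architecture, and epoch count are all assumptions.

```python
# Hypothetical sketch of steps 3-5 (augment, train, export), assuming
# images have already been downloaded into per-label subfolders of
# `images/` and manually cleaned. Labels and choices are illustrative.

EMOTIONS = ["happy", "sad", "angry", "surprised"]  # assumed labels

def train_and_export(data_path="images", out_file="export.pkl"):
    """Train an emotion classifier and export it for deployment."""
    # fastai is imported lazily so this module loads outside Colab too
    from fastai.vision.all import (
        DataBlock, ImageBlock, CategoryBlock, get_image_files,
        RandomSplitter, parent_label, Resize, aug_transforms,
        cnn_learner, resnet18, error_rate,
    )

    dls = DataBlock(
        blocks=(ImageBlock, CategoryBlock),
        get_items=get_image_files,
        splitter=RandomSplitter(valid_pct=0.2, seed=42),
        get_y=parent_label,            # folder name is the emotion label
        item_tfms=Resize(224),
        batch_tfms=aug_transforms(),   # step 3: data augmentation
    ).dataloaders(data_path)

    learn = cnn_learner(dls, resnet18, metrics=error_rate)
    learn.fine_tune(4)                 # step 4: train
    learn.export(out_file)             # step 5: produces export.pkl
```

Calling `train_and_export()` inside the Colab notebook writes export.pkl, the file that later gets pushed to GitHub and loaded by the deployed app.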
In this case the repository is https://github.com/enricodata/emotion-faces, which I am going to use as an example.
Since export.pkl is bigger than 25 MB, use git-lfs to push it to GitHub.
First, if you have two-factor authentication active on GitHub, create a personal access token as explained here: https://help.github.com/en/github/authenticating-to-github/creating-a-personal-access-token-for-the-command-line#creating-a-token
On your local terminal:
brew install git-lfs (I am using a Mac)
cd your_local_directory (in my case emotion-faces)
git lfs install
git lfs track "export.pkl"
git add .gitattributes export.pkl
git commit -m "your commit message here"
git push -u origin master
- Deploy on https://mybinder.org:
- In the field called “GitHub”, write: https://github.com/enricodata/emotion-faces/
- In the field called “Path to a notebook file (optional)”, select “URL” and write: /voila/render/emotion_classifier.ipynb
- Finally, I saved the link to the deployed application as a friendly short link: https://bit.ly/face-emotions
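For context, the notebook that Voilà renders typically contains an inference cell along these lines. This is a hedged sketch following the 2020 fastai lecture pattern; the widget layout and names are assumptions, not the actual contents of emotion_classifier.ipynb:

```python
# Hypothetical Voila app cell: load the exported learner (the file
# pushed via git-lfs) and classify an uploaded face image.
# Assumes fastai and ipywidgets are available in the Binder image.

def build_app(model_file="export.pkl"):
    """Return a small upload-and-predict widget for Voila to render."""
    from fastai.vision.all import load_learner, PILImage
    import ipywidgets as widgets

    learn = load_learner(model_file)
    upload = widgets.FileUpload()
    label = widgets.Label("Upload a face image")

    def on_upload(change):
        # Read the last uploaded file and run a prediction on it
        img = PILImage.create(upload.data[-1])
        pred, _, probs = learn.predict(img)
        label.value = f"{pred} ({probs.max():.0%})"

    upload.observe(on_upload, names="data")
    return widgets.VBox([label, upload])
```

In the notebook, the last cell would simply display `build_app()`; Voilà then serves only the widgets, hiding the code.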