This project implements a character classifier for the popular anime One Piece using transfer learning. The classifier can identify 18 different characters from the series with high accuracy, making it a useful tool for fans and developers working on One Piece-related projects.
- Utilizes MobileNet as the base model with additional custom layers
- Trained on a dataset of 18 One Piece character classes from Kaggle
- Achieves 96% accuracy on the validation dataset
- Deployed in multiple formats for versatile use (an export sketch follows below):
  - TensorFlow Lite
  - TensorFlow.js
  - SavedModel for TensorFlow Serving
- Data Split: 80% training, 20% testing
- Model Architecture: MobileNet base with custom layers (a training sketch follows below)
- Inference Methods: Web interface via Streamlit and API endpoints via TensorFlow Serving
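The exact training code lives in the repository's notebook, but the overall transfer-learning setup can be sketched as follows. The layer sizes, dropout rate, optimizer, and input resolution here are assumptions for illustration, not the repository's exact configuration:

```python
# Minimal transfer-learning sketch: frozen MobileNet base + custom head.
# Hyperparameters and preprocessing are assumptions, not the exact training code.
import tensorflow as tf

NUM_CLASSES = 18
IMG_SIZE = (224, 224)  # MobileNet's default input resolution

# Frozen MobileNet base pretrained on ImageNet
base = tf.keras.applications.MobileNet(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet"
)
base.trainable = False

# Custom classification head on top of the frozen base
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```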
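Similarly, exporting a trained Keras model to the three deployment formats listed above might look like the sketch below. The paths are placeholders, and the TensorFlow.js conversion assumes the `tensorflowjs` pip package is installed:

```python
# Export sketch for the three deployment formats (paths are placeholders).
import tensorflow as tf
import tensorflowjs as tfjs

model = tf.keras.models.load_model("model.h5")  # assumed path to the trained model

# SavedModel for TensorFlow Serving (Serving expects a numeric version subdirectory)
tf.saved_model.save(model, "saved_model/1")

# TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("model.tflite", "wb") as f:
    f.write(converter.convert())

# TensorFlow.js
tfjs.converters.save_keras_model(model, "tfjs_model")
```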
The model can be easily integrated into various applications, including:
- Fan websites and apps
- Character recognition tools
- Content moderation for One Piece-related platforms
- Available online at: https://onepiececlassifier.streamlit.app/
- Can be run locally using:
streamlit run app.py
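For orientation, a Streamlit inference app of this kind typically looks like the hypothetical sketch below; the actual `app.py` in this repository may differ, and the model path, class labels, and preprocessing are assumptions:

```python
# Hypothetical sketch of a Streamlit inference app (not the repository's app.py).
import numpy as np
import streamlit as st
import tensorflow as tf
from PIL import Image

model = tf.keras.models.load_model("model.h5")  # assumed model file
class_names = [f"class_{i}" for i in range(18)]  # replace with the actual character labels

st.title("One Piece Character Classifier")
uploaded = st.file_uploader("Upload a character image", type=["jpg", "jpeg", "png"])

if uploaded is not None:
    image = Image.open(uploaded).convert("RGB").resize((224, 224))
    st.image(image, caption="Input image")
    batch = np.expand_dims(np.asarray(image) / 255.0, axis=0)
    probs = model.predict(batch)[0]
    st.write(f"Prediction: {class_names[int(np.argmax(probs))]} ({probs.max():.2%})")
```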
To run this notebook, you will need to install the required packages. You can install them using the following command:
pip install -r requirements.txt
You can run the TensorFlow Serving Docker container using the following command (replace /repository_path with the absolute path to this repository so the SavedModel directory is mounted into the container):
sudo docker run -d --name tf_serving_predict \
  -v /repository_path/saved_model:/models/predict \
  -p 8501:8501 \
  -e MODEL_NAME=predict \
  tensorflow/serving:latest
You can now perform inference on a new image using the inference notebook provided in this repository or by sending a POST request to the TensorFlow Serving REST API. The prediction endpoint is:
http://localhost:8501/v1/models/predict:predict
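As a rough illustration, a request to this endpoint can be sent from Python as in the sketch below; the image path, input size, and preprocessing are assumptions and should match what the model was trained with:

```python
# Minimal client sketch for the TensorFlow Serving REST API.
import json

import numpy as np
import requests
from PIL import Image

URL = "http://localhost:8501/v1/models/predict:predict"

# Load and preprocess the image (size and scaling are assumptions)
image = Image.open("test_image.jpg").convert("RGB").resize((224, 224))
batch = (np.asarray(image, dtype=np.float32) / 255.0)[np.newaxis, ...]

# TensorFlow Serving expects a JSON body with an "instances" field
payload = {"instances": batch.tolist()}
response = requests.post(URL, data=json.dumps(payload))
response.raise_for_status()

predictions = np.array(response.json()["predictions"][0])
print("Predicted class index:", int(predictions.argmax()))
```

Before posting, you can confirm the model is loaded by sending a GET request to http://localhost:8501/v1/models/predict.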
After you are done with inference, you can stop and remove the Docker container using the following commands:
sudo docker stop tf_serving_predict
sudo docker rm tf_serving_predict