Taxi Fare Prediction Model Deployment
Video / blog with instructions on the run (Coming Soon)
The aim is to predict New York City taxi fares from input features such as pickup & dropoff coordinates, passenger count, and the trip timestamp.
Here is a link to the Kaggle dataset: https://www.kaggle.com/competitions/new-york-city-taxi-fare-prediction
Sample Data:
We trained an XGBRegressor model using the xgboost library and deployed it on the cloud using ServiceFoundry 🚀
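Before a regressor can use the raw coordinates, they are typically turned into a distance feature. A common choice for this dataset is the haversine (great-circle) distance between pickup and dropoff — this preprocessing step is an illustrative assumption, not necessarily what the training script in this repo does:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Distance for the sample trip used in the query example in this README (~2.3 km)
trip_km = haversine_km(40.7638053894043, -73.973320007324219,
                       40.74383544921875, -73.981430053710938)
```

A feature like `trip_km` usually carries far more signal for fare prediction than the four raw coordinates on their own.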
Setting up servicefoundry
Install and set up servicefoundry on your computer.
pip install servicefoundry
servicefoundry use server https://app.develop.truefoundry.tech
servicefoundry login
Training the model
To deploy the training job, follow the steps below:
- Change the working directory to the train folder:
cd train
- Create a workspace and an API key on the TrueFoundry platform
- Replace the MLF_API_KEY value in the train.yaml file with the API Key found in the Secrets tab of your TrueFoundry account <i>(Instructions here)</i>
- Copy the workspace FQN of the workspace that you want to use from the Workspaces tab of TrueFoundry and add it in the train.yaml file <i>(Instructions here)</i>
- To deploy using the Python script:
python train_deploy.py
- To deploy using the CLI:
servicefoundry deploy --file train_deploy.yaml
- Click on the dashboard link in the terminal
- Click on the <b>"Trigger Job"</b> button on the dashboard to run the training job
Deploying realtime inference
Note: <i>It is necessary to train a model before it can be deployed as a service</i>
- Change the working directory to the infer_realtime folder:
cd infer_realtime
- Create a workspace and an API key on the TrueFoundry platform
- Replace the MLF_API_KEY value in the infer_realtime_deploy.py file with the API Key found in the Secrets tab of your TrueFoundry account and add it in the infer.yaml file <i>(Instructions here)</i>
- Copy the workspace_fqn of the workspace that you want to use from the Workspaces tab of TrueFoundry and add it in the infer.yaml file <i>(Instructions here)</i>
- Find the model_version_fqn of the model that you want to deploy:
- Go to the Experiments Tracking tab of TrueFoundry
- Click on the project name that you trained (<i>taxi-fare-train by default</i>)
- Click on the Models tab
- Click on the name of the trained model to open the tab showing its different versions
- Copy the FQN of the latest version of the model
- Add the model_version_fqn of the latest version to the infer.yaml file
- To deploy using the Python script:
python infer_deploy.py
- To deploy using the CLI:
servicefoundry deploy --file infer/infer_deploy.yaml
- Click on the dashboard link in the terminal to open the service deployment page with the FastAPI endpoint
Querying the deployed model
You can query the model either directly via the FastAPI endpoint in the browser, or with a Python script:
import requests
from urllib.parse import urljoin

request_url = "https://taxi-fare-infer-vishank-betatest-ws.tfy-ctl-euwe1-develop.develop.truefoundry.tech"
features = {
    "pickup_datetime": "2015-01-27 13:08:24 UTC",
    "pickup_latitude": 40.7638053894043,
    "pickup_longitude": -73.973320007324219,
    "dropoff_latitude": 40.74383544921875,
    "dropoff_longitude": -73.981430053710938,
    "passenger_count": 3,
}
predictions_list = requests.post(
    url=urljoin(request_url, "/predict"), json=features
).json()
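On the server side, the pickup_datetime string presumably gets decomposed into time-based features before reaching the model. The exact features the service derives are an assumption here, but a typical split looks like:

```python
from datetime import datetime

def time_features(pickup_datetime):
    """Parse a 'YYYY-MM-DD HH:MM:SS UTC' string into simple time features (illustrative)."""
    dt = datetime.strptime(pickup_datetime, "%Y-%m-%d %H:%M:%S UTC")
    return {
        "hour": dt.hour,          # time of day drives traffic and surcharges
        "weekday": dt.weekday(),  # 0 = Monday ... 6 = Sunday
        "month": dt.month,
        "year": dt.year,          # fares drift over the years in this dataset
    }

feats = time_features("2015-01-27 13:08:24 UTC")  # the sample payload above
```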
Deploying Demo
Note: <i>It is necessary to deploy the live inference model before being able to deploy a demo</i>
- Create a workspace and an API key on the TrueFoundry platform
- Replace the MLF_API_KEY value with the API Key found in the Secrets tab of your TrueFoundry account and add it in the demo.yaml file <i>(Instructions here)</i>
- Copy the workspace_fqn of the workspace that you want to use from the Workspaces tab of TrueFoundry and add it in the demo.yaml file <i>(Instructions here)</i>
- Copy the inference_server_url:
- Go to the Deployments tab of TrueFoundry
- Open the service that was deployed as the live inference model <i>("taxi-fare-prediction" by default)</i>
- Copy the Endpoint link
- To deploy using the Python script:
python demo/demo_deploy.py
- To deploy using the CLI:
servicefoundry deploy --file demo/demo_deploy.yaml
- Click on the dashboard link in the terminal
- Click on the <b>"Endpoint"</b> link on the dashboard to open the Streamlit demo
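The Endpoint link copied in the steps above serves as the base URL that the query snippet earlier passes to urljoin. One stdlib subtlety worth knowing: joining with an absolute path like "/predict" replaces any existing path on the base URL, which is why the route is written with a leading slash. The host below is a hypothetical placeholder:

```python
from urllib.parse import urljoin

base = "https://taxi-fare-infer.example.truefoundry.tech"  # hypothetical endpoint link

# An absolute-path route replaces whatever path the base URL carries
print(urljoin(base, "/predict"))
# https://taxi-fare-infer.example.truefoundry.tech/predict

print(urljoin(base + "/some/subpage", "/predict"))
# https://taxi-fare-infer.example.truefoundry.tech/predict
```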