diff --git a/samples/nvidia-resnet/LICENSE b/samples/nvidia-resnet/LICENSE index 0ec773595971..94bdef64ebf2 100644 --- a/samples/nvidia-resnet/LICENSE +++ b/samples/nvidia-resnet/LICENSE @@ -1,25 +1,13 @@ -Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. - -Redistribution and use in source and binary forms, with or without -modification, are permitted provided that the following conditions -are met: - * Redistributions of source code must retain the above copyright - notice, this list of conditions and the following disclaimer. - * Redistributions in binary form must reproduce the above copyright - notice, this list of conditions and the following disclaimer in the - documentation and/or other materials provided with the distribution. - * Neither the name of NVIDIA CORPORATION nor the names of its - contributors may be used to endorse or promote products derived - from this software without specific prior written permission. - -THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY -EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR -CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, -EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, -PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR -PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY -OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. diff --git a/samples/nvidia-resnet/README.md b/samples/nvidia-resnet/README.md index 8287acc514a6..32229b4b0fe4 100644 --- a/samples/nvidia-resnet/README.md +++ b/samples/nvidia-resnet/README.md @@ -1,8 +1,9 @@ -# A simple NVIDIA-accelerated ResNet Kubeflow pipeline -### This example demonstrates a simple end-to-end training & deployment of a Keras Resnet model on the CIFAR10 dataset utilizing the following NVIDIA technologies: +# A simple GPU-accelerated ResNet Kubeflow pipeline +## Overview +This example demonstrates a simple end-to-end training & deployment of a Keras Resnet model on the CIFAR10 dataset utilizing the following technologies: * [NVIDIA-Docker2](https://github.com/NVIDIA/nvidia-docker) to make the Docker containers GPU aware. * [NVIDIA device plugin](https://github.com/NVIDIA/k8s-device-plugin) to allow Kubernetes to access GPU nodes. -* [TensorFlow-19.02](https://ngc.nvidia.com/catalog/containers/nvidia:tensorflow) containers from NVIDIA GPU Cloud container registry. +* [TensorFlow-19.03](https://ngc.nvidia.com/catalog/containers/nvidia:tensorflow) containers from NVIDIA GPU Cloud container registry. * [TensorRT](https://docs.nvidia.com/deeplearning/dgx/integrate-tf-trt/index.html) for optimizing the Inference Graph in FP16 for leveraging the dedicated use of Tensor Cores for Inference. * [TensorRT Inference Server](https://github.com/NVIDIA/tensorrt-inference-server) for serving the trained model. 
@@ -11,28 +12,29 @@ * NVIDIA GPU ## Quickstart -* Install NVIDIA Docker, Kubernetes and Kubeflow on your local machine: +* Install NVIDIA Docker, Kubernetes and Kubeflow on your local machine (on your first run): * `sudo ./install_kubeflow_and_dependencies.sh` -* Mount persistent volume to Kubeflow: - * `sudo ./mount_persistent_volume.sh` -* Build the Preprocessing, Training, Serving, and Pipeline containers using the following script: - * First, modify `build.sh` in `preprocess`, `train`, and `serve` directories to point to a container registry that is accessible to you +* Build the Docker image of each pipeline component and compile the Kubeflow pipeline: + * First, make sure the `IMAGE` variable in `build.sh` in each component directory under the `components` directory points to a public container registry + * Then, make sure the `image` used in each `ContainerOp` in `pipeline/src/pipeline.py` matches `IMAGE` in the step above + * Then, make sure the `image` of the webapp Deployment in `components/webapp_launcher/src/webapp-service-template.yaml` matches `IMAGE` in `components/webapp/build.sh` * Then, `sudo ./build_pipeline.sh` - * Note the `pipeline.py.tar.gz` file that appears on your working directory + * Note the `pipeline.py.tar.gz` file that appears in your working directory -* Determine the ambassador port using this command: +* Determine the ambassador port: * `sudo kubectl get svc -n kubeflow ambassador` -* Open the Kubeflow Dashboard on: - * https://local-machine-ip-address:port-determined-from-previous-step - * E.g. 
https://10.110.210.99:31342 -* Click on the tab Pipeline Dashboard, upload the `pipeline.py.tar.gz` file under you working directory and create a run -* Once the training has completed (should take about 20 minutes for 50 epochs) and the model is being served, port forward the port of the serving pod (8000) to the local system: - * Determine the name of the serving pod by selecting it on the Kubeflow Dashboard - * Modify accordingly the variable `SERVING_POD` within `portforward_serving_port.sh` - * Then, `sudo ./portforward_serving_port.sh` -* Build the client container and start a local server for the demo web UI on the host machine (about 15 mins): - * `sudo ./test_trtis_client.sh` -* Now you have successfully set up the client through which you can ping the server with an image URL and obtain predictions: - * Open the demo client UI on a web browser with the following IP address: -https://local-machine-ip-address:8050 - * The port is specified in `demo_client_ui.py` and can be changed as needed - * Copy an image URL (for one of the 10 CIFAR classes) and paste it in the UI +* Open the Kubeflow UI on: + * https://[local-machine-ip-address]:[ambassador-port]/ + * E.g. https://10.110.210.99:31342/ +* Click on the Pipeline Dashboard tab, upload the `pipeline.py.tar.gz` file you just compiled and create a run +* Training takes about 20 minutes for 50 epochs, and a web UI is deployed as part of the pipeline so users can interact with the served model +* Access the client web UI: + * https://[local-machine-ip-address]:[kubeflow-ambassador-port]/[webapp-prefix]/ + * E.g. 
https://10.110.210.99:31342/webapp/ +* Now you can test the trained model with random images and obtain class predictions and probability distributions + +## Cleanup +The following are optional scripts to clean up your cluster (useful for debugging) +* Delete deployments & services from previous runs: + * `sudo ./clean_utils/delete_all_previous_resources.sh` +* Uninstall Minikube and Kubeflow: + * `sudo ./clean_utils/remove_minikube_and_kubeflow.sh` \ No newline at end of file diff --git a/samples/nvidia-resnet/build_pipeline.sh b/samples/nvidia-resnet/build_pipeline.sh index 8d5bd2ea57bf..27d3dd94608f 100755 --- a/samples/nvidia-resnet/build_pipeline.sh +++ b/samples/nvidia-resnet/build_pipeline.sh @@ -1,39 +1,28 @@ #!/bin/bash -# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. # -# Redistribution and use in source and binary forms, with or without -# modification, are permitted provided that the following conditions -# are met: -# * Redistributions of source code must retain the above copyright -# notice, this list of conditions and the following disclaimer. -# * Redistributions in binary form must reproduce the above copyright -# notice, this list of conditions and the following disclaimer in the -# documentation and/or other materials provided with the distribution. -# * Neither the name of NVIDIA CORPORATION nor the names of its -# contributors may be used to endorse or promote products derived -# from this software without specific prior written permission. +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at # -# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY -# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -# PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT OWNER OR -# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, -# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, -# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR -# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY -# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. -WORK_DIR=$(pwd) +base_dir=$(pwd) +components_dir=$base_dir/components -# Build and push images of kubeflow pipeline components -cd $WORK_DIR/preprocess && ./build.sh && \ -cd $WORK_DIR/train && ./build.sh && \ -cd $WORK_DIR/serve && ./build.sh && \ +# Build and push images of Kubeflow Pipelines components +for component in $components_dir/*/; do + cd $component && ./build.sh +done # Compile kubeflow pipeline tar file -cd $WORK_DIR/pipeline && ./build.sh - - +cd $base_dir/pipeline && ./build.sh +(mv -f src/*.tar.gz $base_dir && \ +echo "Pipeline compiled successfully!") || \ +echo "Pipeline compilation failed!" diff --git a/samples/nvidia-resnet/clean_utils/delete_all_previous_resources.sh b/samples/nvidia-resnet/clean_utils/delete_all_previous_resources.sh new file mode 100755 index 000000000000..84c6d0246cc1 --- /dev/null +++ b/samples/nvidia-resnet/clean_utils/delete_all_previous_resources.sh @@ -0,0 +1,33 @@ +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. 
+# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +TRTIS_NAME=trtis +WEBAPP_NAME=webapp +PIPELINE_NAME=resnet-cifar10-pipeline +KF_NAMESPACE=kubeflow + +kubectl delete service/$TRTIS_NAME -n $KF_NAMESPACE +kubectl delete deployment.apps/$TRTIS_NAME -n $KF_NAMESPACE + +for service in $( kubectl get svc -n $KF_NAMESPACE | grep $WEBAPP_NAME | cut -d' ' -f1 ); do + kubectl delete -n $KF_NAMESPACE service/$service +done + +for deployment in $( kubectl get deployment -n $KF_NAMESPACE | grep $WEBAPP_NAME | cut -d' ' -f1 ); do + kubectl delete -n $KF_NAMESPACE deployment.apps/$deployment +done + +for pod in $(kubectl get pod -n $KF_NAMESPACE | grep $PIPELINE_NAME | cut -d' ' -f1); do + kubectl delete -n $KF_NAMESPACE pod/$pod +done diff --git a/samples/nvidia-resnet/clean_utils/remove_minikube_and_kubeflow.sh b/samples/nvidia-resnet/clean_utils/remove_minikube_and_kubeflow.sh new file mode 100755 index 000000000000..8d4ee4e835c1 --- /dev/null +++ b/samples/nvidia-resnet/clean_utils/remove_minikube_and_kubeflow.sh @@ -0,0 +1,22 @@ +#!/bin/bash +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +# Remove KubeFlow +cd ${KUBEFLOW_SRC}/${KFAPP} +${KUBEFLOW_SRC}/scripts/kfctl.sh delete k8s + +# Remove Minikube +minikube stop +minikube delete diff --git a/samples/nvidia-resnet/components/inference_server_launcher/Dockerfile b/samples/nvidia-resnet/components/inference_server_launcher/Dockerfile new file mode 100644 index 000000000000..2b59ba9756e7 --- /dev/null +++ b/samples/nvidia-resnet/components/inference_server_launcher/Dockerfile @@ -0,0 +1,30 @@ +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ +FROM ubuntu:16.04 + +RUN apt-get update -y && \ + apt-get install --no-install-recommends -y -q ca-certificates curl python-dev python-setuptools wget unzip +RUN easy_install pip && \ + pip install pyyaml six requests + +# Install kubectl +RUN curl -LO https://storage.googleapis.com/kubernetes-release/release/$(curl -s https://storage.googleapis.com/kubernetes-release/release/stable.txt)/bin/linux/amd64/kubectl +RUN chmod +x ./kubectl +RUN mv ./kubectl /usr/local/bin + +ADD src /workspace +WORKDIR /workspace + +ENTRYPOINT ["python", "deploy_trtis.py"] diff --git a/samples/nvidia-resnet/components/inference_server_launcher/build.sh b/samples/nvidia-resnet/components/inference_server_launcher/build.sh new file mode 100755 index 000000000000..24a08366afe3 --- /dev/null +++ b/samples/nvidia-resnet/components/inference_server_launcher/build.sh @@ -0,0 +1,19 @@ +#!/bin/bash +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +IMAGE= + +docker build -t $IMAGE . +docker push $IMAGE diff --git a/samples/nvidia-resnet/components/inference_server_launcher/src/deploy_trtis.py b/samples/nvidia-resnet/components/inference_server_launcher/src/deploy_trtis.py new file mode 100644 index 000000000000..bdcfa1e6de06 --- /dev/null +++ b/samples/nvidia-resnet/components/inference_server_launcher/src/deploy_trtis.py @@ -0,0 +1,59 @@ +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. 
+# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import argparse +import os +import logging +import subprocess +import requests + + +KUBEFLOW_NAMESPACE = 'kubeflow' +YAML_TEMPLATE = 'trtis-service-template.yaml' +YAML_FILE = 'trtis-service.yaml' + + +def main(): + parser = argparse.ArgumentParser(description='Inference server launcher') + parser.add_argument('--trtserver_name', help='Name of trtis service') + parser.add_argument('--model_path', help='...') + + args = parser.parse_args() + + logging.getLogger().setLevel(logging.INFO) + logging.info('Generating TRTIS service template') + + template_file = os.path.join(os.path.dirname( + os.path.realpath(__file__)), YAML_TEMPLATE) + target_file = os.path.join(os.path.dirname( + os.path.realpath(__file__)), YAML_FILE) + + with open(template_file, 'r') as template: + with open(target_file, "w") as target: + data = template.read() + changed = data.replace('TRTSERVER_NAME', args.trtserver_name) + changed1 = changed.replace( + 'KUBEFLOW_NAMESPACE', KUBEFLOW_NAMESPACE) + changed2 = changed1.replace('MODEL_PATH', args.model_path) + target.write(changed2) + + logging.info('Deploying TRTIS service') + subprocess.call(['kubectl', 'apply', '-f', YAML_FILE]) + + with open('/output.txt', 'w') as f: + f.write(args.trtserver_name) + + +if __name__ == "__main__": + main() diff --git a/samples/nvidia-resnet/components/inference_server_launcher/src/trtis-service-template.yaml 
b/samples/nvidia-resnet/components/inference_server_launcher/src/trtis-service-template.yaml new file mode 100644 index 000000000000..13900836b2ae --- /dev/null +++ b/samples/nvidia-resnet/components/inference_server_launcher/src/trtis-service-template.yaml @@ -0,0 +1,75 @@ +--- +apiVersion: v1 +kind: Service +metadata: + annotations: + getambassador.io/config: |- + --- + apiVersion: ambassador/v0 + kind: Mapping + name: trtisserving-predict-mapping-TRTSERVER_NAME + grpc: True + prefix: / + rewrite: / + service: TRTSERVER_NAME.KUBEFLOW_NAMESPACE:8001 + labels: + app: TRTSERVER_NAME + name: TRTSERVER_NAME + namespace: KUBEFLOW_NAMESPACE +spec: + ports: + - name: grpc-trtis-serving + port: 8001 + targetPort: 8001 + - name: http-trtis-serving + port: 8000 + targetPort: 8000 + - name: prometheus-metrics + port: 8002 + targetPort: 8002 + selector: + app: TRTSERVER_NAME + type: ClusterIP +--- +apiVersion: extensions/v1beta1 +kind: Deployment +metadata: + labels: + app: TRTSERVER_NAME + name: TRTSERVER_NAME + namespace: KUBEFLOW_NAMESPACE +spec: + replicas: 1 + template: + metadata: + labels: + app: TRTSERVER_NAME + version: v1 + spec: + containers: + - image: nvcr.io/nvidia/tensorrtserver:19.03-py3 + command: ["/bin/sh", "-c"] + args: ["trtserver --model-store=MODEL_PATH"] + imagePullPolicy: IfNotPresent + name: TRTSERVER_NAME + ports: + - containerPort: 9000 + - containerPort: 8000 + - containerPort: 8001 + - containerPort: 8002 + resources: + limits: + cpu: "2" + memory: 4Gi + nvidia.com/gpu: 1 + requests: + cpu: "2" + memory: 4Gi + nvidia.com/gpu: 1 + volumeMounts: + - name: persistent-data-store + mountPath: /mnt/workspace + volumes: + - name: persistent-data-store + persistentVolumeClaim: + claimName: nvidia-workspace-read-claim diff --git a/samples/nvidia-resnet/components/preprocess/Dockerfile b/samples/nvidia-resnet/components/preprocess/Dockerfile new file mode 100644 index 000000000000..d9b3907095d4 --- /dev/null +++ 
b/samples/nvidia-resnet/components/preprocess/Dockerfile @@ -0,0 +1,21 @@ +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +FROM nvcr.io/nvidia/tensorflow:19.03-py3 + +RUN pip install keras +ADD src /workspace +WORKDIR /workspace + +ENTRYPOINT ["python", "preprocess.py"] diff --git a/samples/nvidia-resnet/components/preprocess/build.sh b/samples/nvidia-resnet/components/preprocess/build.sh new file mode 100755 index 000000000000..cea03d282148 --- /dev/null +++ b/samples/nvidia-resnet/components/preprocess/build.sh @@ -0,0 +1,19 @@ +#!/bin/bash +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +IMAGE= + +docker build -t $IMAGE . 
+docker push $IMAGE diff --git a/samples/nvidia-resnet/components/preprocess/src/preprocess.py b/samples/nvidia-resnet/components/preprocess/src/preprocess.py new file mode 100644 index 000000000000..20f3141183fd --- /dev/null +++ b/samples/nvidia-resnet/components/preprocess/src/preprocess.py @@ -0,0 +1,52 @@ +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import os +import argparse +import numpy as np +from keras.datasets import cifar10 + + +def main(): + parser = argparse.ArgumentParser(description='Data processor') + parser.add_argument('--input_dir', help='Raw data directory') + parser.add_argument('--output_dir', help='Processed data directory') + + args = parser.parse_args() + + def load_and_process_data(input_dir): + processed_data = cifar10.load_data() + return processed_data + + def save_data(processed_data, output_dir): + (x_train, y_train), (x_test, y_test) = processed_data + if not os.path.isdir(output_dir): + os.mkdir(output_dir) + np.save(os.path.join(output_dir, 'x_train.npy'), x_train) + np.save(os.path.join(output_dir, 'y_train.npy'), y_train) + np.save(os.path.join(output_dir, 'x_test.npy'), x_test) + np.save(os.path.join(output_dir, 'y_test.npy'), y_test) + + processed_data = load_and_process_data(args.input_dir) + save_data(processed_data, args.output_dir) + + with open('/output.txt', 'w') as f: + f.write(args.output_dir) + + print('input_dir: {}'.format(args.input_dir)) + 
print('output_dir: {}'.format(args.output_dir)) + + +if __name__ == "__main__": + main() diff --git a/samples/nvidia-resnet/components/train/Dockerfile b/samples/nvidia-resnet/components/train/Dockerfile new file mode 100644 index 000000000000..e49a6dbd7442 --- /dev/null +++ b/samples/nvidia-resnet/components/train/Dockerfile @@ -0,0 +1,21 @@ +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +FROM nvcr.io/nvidia/tensorflow:19.03-py3 + +RUN pip install keras +ADD src /workspace +WORKDIR /workspace + +ENTRYPOINT ["python", "train.py"] diff --git a/samples/nvidia-resnet/components/train/build.sh b/samples/nvidia-resnet/components/train/build.sh new file mode 100755 index 000000000000..18644de4a8b1 --- /dev/null +++ b/samples/nvidia-resnet/components/train/build.sh @@ -0,0 +1,19 @@ +#!/bin/bash +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ +IMAGE= + +docker build -t $IMAGE . +docker push $IMAGE diff --git a/samples/nvidia-resnet/components/train/src/train.py b/samples/nvidia-resnet/components/train/src/train.py new file mode 100644 index 000000000000..359bd1d4383c --- /dev/null +++ b/samples/nvidia-resnet/components/train/src/train.py @@ -0,0 +1,590 @@ +# COPYRIGHT +# +# All contributions by François Chollet: +# Copyright (c) 2015 - 2018, François Chollet. +# All rights reserved. +# +# All contributions by Google: +# Copyright (c) 2015 - 2018, Google, Inc. +# All rights reserved. +# +# All contributions by Microsoft: +# Copyright (c) 2017 - 2018, Microsoft, Inc. +# All rights reserved. +# +# All other contributions: +# Copyright (c) 2015 - 2018, the respective contributors. +# All rights reserved. +# +# Each contributor holds copyright over their respective contributions. +# The project versioning (Git) records all such contribution source information. +# +# LICENSE +# +# The MIT License (MIT) +# +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in all +# copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. + +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + + +from __future__ import print_function + +import os +import shutil +import argparse +import numpy as np + +import tensorflow as tf +from tensorflow.python.saved_model import builder as saved_model_builder +from tensorflow.python.saved_model.signature_def_utils import predict_signature_def +from tensorflow.python.saved_model import tag_constants +from tensorflow.python.saved_model import signature_constants + +import keras +from keras.regularizers import l2 +from keras import backend as K +from keras.models import Model +from keras import backend as K +from keras.optimizers import Adam +from keras.models import load_model +from keras.layers import Dense, Conv2D +from keras.layers import BatchNormalization, Activation +from keras.layers import AveragePooling2D, Input, Flatten +from keras.callbacks import ModelCheckpoint, LearningRateScheduler +from keras.callbacks import ReduceLROnPlateau +from keras.preprocessing.image import ImageDataGenerator + +import tensorflow.contrib.tensorrt as trt +# keras.mixed_precision.experimental.set_policy("default_mixed") + + +CONT_TRTIS_RESOURCE_DIR = 
'trtis_resource' + + +def main(): + parser = argparse.ArgumentParser(description='Model trainer') + parser.add_argument('--input_dir', help='Processed data directory') + parser.add_argument('--output_dir', help='Output model directory') + parser.add_argument('--epochs', help='Number of training epochs') + parser.add_argument('--model_name', help='Output model name') + parser.add_argument('--model_version', help='Output model version') + + args = parser.parse_args() + + print(args.output_dir, args.model_name) + + # Copy TRTIS resource (containing config.pbtxt, labels.txt, ...) from container to mounted volume + model_dir = os.path.join(args.output_dir, args.model_name) + if os.path.isdir(model_dir): + shutil.rmtree(model_dir) + shutil.copytree(CONT_TRTIS_RESOURCE_DIR, model_dir) + os.mkdir(os.path.join(model_dir, args.model_version)) + + # Training parameters + batch_size = 128 # orig paper trained all networks with batch_size=128 + epochs = int(args.epochs) + data_augmentation = True + num_classes = 10 + + # Subtracting pixel mean improves accuracy + subtract_pixel_mean = True + + # Model parameter + # ---------------------------------------------------------------------------- + # | | 200-epoch | Orig Paper| 200-epoch | Orig Paper| sec/epoch + # Model | n | ResNet v1 | ResNet v1 | ResNet v2 | ResNet v2 | GTX1080Ti + # |v1(v2)| %Accuracy | %Accuracy | %Accuracy | %Accuracy | v1 (v2) + # ---------------------------------------------------------------------------- + # ResNet20 | 3 (2)| 92.16 | 91.25 | ----- | ----- | 35 (---) + # ResNet32 | 5(NA)| 92.46 | 92.49 | NA | NA | 50 ( NA) + # ResNet44 | 7(NA)| 92.50 | 92.83 | NA | NA | 70 ( NA) + # ResNet56 | 9 (6)| 92.71 | 93.03 | 93.01 | NA | 90 (100) + # ResNet110 |18(12)| 92.65 | 93.39+-.16| 93.15 | 93.63 | 165(180) + # ResNet164 |27(18)| ----- | 94.07 | ----- | 94.54 | ---(---) + # ResNet1001| (111)| ----- | 92.39 | ----- | 95.08+-.14| ---(---) + # 
--------------------------------------------------------------------------- + + n = 3 + + # Model version + # Orig paper: version = 1 (ResNet v1), Improved ResNet: version = 2 (ResNet v2) + version = 2 + + # Computed depth from supplied model parameter n + if version == 1: + depth = n * 6 + 2 + elif version == 2: + depth = n * 9 + 2 + + # Model name, depth and version + model_type = 'ResNet%dv%d' % (depth, version) + + # Load the CIFAR10 data. + def load_preprocessed_data(input_dir): + x_train = np.load(os.path.join(input_dir, "x_train.npy")) + y_train = np.load(os.path.join(input_dir, "y_train.npy")) + x_test = np.load(os.path.join(input_dir, "x_test.npy")) + y_test = np.load(os.path.join(input_dir, "y_test.npy")) + return x_train, y_train, x_test, y_test + + preprocessed_data = load_preprocessed_data(args.input_dir) + x_train, y_train, x_test, y_test = preprocessed_data + + # Input image dimensions. + input_shape = x_train.shape[1:] + + # Normalize data. + x_train = x_train.astype('float32') / 255 + x_test = x_test.astype('float32') / 255 + + # If subtract pixel mean is enabled + if subtract_pixel_mean: + x_train_mean = np.mean(x_train, axis=0) + x_train -= x_train_mean + x_test -= x_train_mean + + print('x_train shape:', x_train.shape) + print(x_train.shape[0], 'train samples') + print(x_test.shape[0], 'test samples') + print('y_train shape:', y_train.shape) + + # Convert class vectors to binary class matrices. + y_train = keras.utils.to_categorical(y_train, num_classes) + y_test = keras.utils.to_categorical(y_test, num_classes) + + def lr_schedule(epoch): + """Learning Rate Schedule + + Learning rate is scheduled to be reduced after 80, 120, 160, 180 epochs. + Called automatically every epoch as part of callbacks during training. 
+ + # Arguments + epoch (int): The number of epochs + + # Returns + lr (float32): learning rate + """ + lr = 1e-3 + if epoch > 180: + lr *= 0.5e-3 + elif epoch > 160: + lr *= 1e-3 + elif epoch > 120: + lr *= 1e-2 + elif epoch > 80: + lr *= 1e-1 + print('Learning rate: ', lr) + return lr + + def resnet_layer(inputs, + num_filters=16, + kernel_size=3, + strides=1, + activation='relu', + batch_normalization=True, + conv_first=True): + """2D Convolution-Batch Normalization-Activation stack builder + + # Arguments + inputs (tensor): input tensor from input image or previous layer + num_filters (int): Conv2D number of filters + kernel_size (int): Conv2D square kernel dimensions + strides (int): Conv2D square stride dimensions + activation (string): activation name + batch_normalization (bool): whether to include batch normalization + conv_first (bool): conv-bn-activation (True) or + bn-activation-conv (False) + + # Returns + x (tensor): tensor as input to the next layer + """ + conv = Conv2D(num_filters, + kernel_size=kernel_size, + strides=strides, + padding='same', + kernel_initializer='he_normal', + kernel_regularizer=l2(1e-4)) + + x = inputs + if conv_first: + x = conv(x) + if batch_normalization: + x = BatchNormalization()(x) + if activation is not None: + x = Activation(activation)(x) + else: + if batch_normalization: + x = BatchNormalization()(x) + if activation is not None: + x = Activation(activation)(x) + x = conv(x) + return x + + def resnet_v1(input_shape, depth, num_classes=10): + """ResNet Version 1 Model builder [a] + + Stacks of 2 x (3 x 3) Conv2D-BN-ReLU + Last ReLU is after the shortcut connection. + At the beginning of each stage, the feature map size is halved (downsampled) + by a convolutional layer with strides=2, while the number of filters is + doubled. Within each stage, the layers have the same number of filters and the + same feature map sizes. 
+ Feature map sizes: + stage 0: 32x32, 16 + stage 1: 16x16, 32 + stage 2: 8x8, 64 + The number of parameters is approximately the same as Table 6 of [a]: + ResNet20 0.27M + ResNet32 0.46M + ResNet44 0.66M + ResNet56 0.85M + ResNet110 1.7M + + # Arguments + input_shape (tensor): shape of input image tensor + depth (int): number of core convolutional layers + num_classes (int): number of classes (CIFAR10 has 10) + + # Returns + model (Model): Keras model instance + """ + if (depth - 2) % 6 != 0: + raise ValueError('depth should be 6n+2 (eg 20, 32, 44 in [a])') + # Start model definition. + num_filters = 16 + num_res_blocks = int((depth - 2) / 6) + + inputs = Input(shape=input_shape) + x = resnet_layer(inputs=inputs) + # Instantiate the stack of residual units + for stack in range(3): + for res_block in range(num_res_blocks): + strides = 1 + if stack > 0 and res_block == 0: # first layer but not first stack + strides = 2 # downsample + y = resnet_layer(inputs=x, + num_filters=num_filters, + strides=strides) + y = resnet_layer(inputs=y, + num_filters=num_filters, + activation=None) + if stack > 0 and res_block == 0: # first layer but not first stack + # linear projection residual shortcut connection to match + # changed dims + x = resnet_layer(inputs=x, + num_filters=num_filters, + kernel_size=1, + strides=strides, + activation=None, + batch_normalization=False) + x = keras.layers.add([x, y]) + x = Activation('relu')(x) + num_filters *= 2 + + # Add classifier on top. + # v1 does not use BN after last shortcut connection-ReLU + x = AveragePooling2D(pool_size=8)(x) + y = Flatten()(x) + outputs = Dense(num_classes, + activation='softmax', + kernel_initializer='he_normal')(y) + + # Instantiate model. 
+ model = Model(inputs=inputs, outputs=outputs) + return model + + def resnet_v2(input_shape, depth, num_classes=10): + """ResNet Version 2 Model builder [b] + + Stacks of (1 x 1)-(3 x 3)-(1 x 1) BN-ReLU-Conv2D, also known as the + bottleneck layer. + First shortcut connection per layer is 1 x 1 Conv2D. + Second and onwards shortcut connection is identity. + At the beginning of each stage, the feature map size is halved (downsampled) + by a convolutional layer with strides=2, while the number of filter maps is + doubled. Within each stage, the layers have the same number of filters and the + same filter map sizes. + Feature map sizes: + conv1 : 32x32, 16 + stage 0: 32x32, 64 + stage 1: 16x16, 128 + stage 2: 8x8, 256 + + # Arguments + input_shape (tensor): shape of input image tensor + depth (int): number of core convolutional layers + num_classes (int): number of classes (CIFAR10 has 10) + + # Returns + model (Model): Keras model instance + """ + if (depth - 2) % 9 != 0: + raise ValueError('depth should be 9n+2 (eg 56 or 110 in [b])') + # Start model definition. 
+ num_filters_in = 16 + num_res_blocks = int((depth - 2) / 9) + + inputs = Input(shape=input_shape) + # v2 performs Conv2D with BN-ReLU on input before splitting into 2 paths + x = resnet_layer(inputs=inputs, + num_filters=num_filters_in, + conv_first=True) + + # Instantiate the stack of residual units + for stage in range(3): + for res_block in range(num_res_blocks): + activation = 'relu' + batch_normalization = True + strides = 1 + if stage == 0: + num_filters_out = num_filters_in * 4 + if res_block == 0: # first layer and first stage + activation = None + batch_normalization = False + else: + num_filters_out = num_filters_in * 2 + if res_block == 0: # first layer but not first stage + strides = 2 # downsample + + # bottleneck residual unit + y = resnet_layer(inputs=x, + num_filters=num_filters_in, + kernel_size=1, + strides=strides, + activation=activation, + batch_normalization=batch_normalization, + conv_first=False) + y = resnet_layer(inputs=y, + num_filters=num_filters_in, + conv_first=False) + y = resnet_layer(inputs=y, + num_filters=num_filters_out, + kernel_size=1, + conv_first=False) + if res_block == 0: + # linear projection residual shortcut connection to match + # changed dims + x = resnet_layer(inputs=x, + num_filters=num_filters_out, + kernel_size=1, + strides=strides, + activation=None, + batch_normalization=False) + x = keras.layers.add([x, y]) + + num_filters_in = num_filters_out + + # Add classifier on top. + # v2 has BN-ReLU before Pooling + x = BatchNormalization()(x) + x = Activation('relu')(x) + x = AveragePooling2D(pool_size=8)(x) + y = Flatten()(x) + outputs = Dense(num_classes, + activation='softmax', + kernel_initializer='he_normal')(y) + + # Instantiate model. 
+ model = Model(inputs=inputs, outputs=outputs) + return model + + if version == 2: + model = resnet_v2(input_shape=input_shape, depth=depth) + else: + model = resnet_v1(input_shape=input_shape, depth=depth) + + model.compile(loss='categorical_crossentropy', + optimizer=Adam(lr=lr_schedule(0)), + metrics=['accuracy']) + model.summary() + print(model_type) + + # Prepare model saving directory. + save_dir = os.path.join(os.getcwd(), 'saved_models') + model_name = 'cifar10_%s_model.{epoch:03d}.h5' % model_type + if not os.path.isdir(save_dir): + os.makedirs(save_dir) + filepath = os.path.join(save_dir, model_name) + + # Prepare callbacks for model saving and for learning rate adjustment. + checkpoint = ModelCheckpoint(filepath=filepath, + monitor='val_acc', + verbose=1, + save_best_only=True) + + lr_scheduler = LearningRateScheduler(lr_schedule) + + lr_reducer = ReduceLROnPlateau(factor=np.sqrt(0.1), + cooldown=0, + patience=5, + min_lr=0.5e-6) + + callbacks = [checkpoint, lr_reducer, lr_scheduler] + + # Run training, with or without data augmentation. 
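An aside on the callbacks wired up here: the LearningRateScheduler reuses the lr_schedule function defined earlier in this script. Restated as a standalone sketch (with the print statement dropped), the step decay can be checked in isolation; note that past epoch 180 the rate bottoms out at 5e-7, the same floor passed to ReduceLROnPlateau as min_lr:

```python
# Standalone copy of the training script's lr_schedule: a base rate of 1e-3
# cut by 10x after epochs 80, 120, and 160, with a final drop past epoch 180.
def lr_schedule(epoch):
    lr = 1e-3
    if epoch > 180:
        lr *= 0.5e-3
    elif epoch > 160:
        lr *= 1e-3
    elif epoch > 120:
        lr *= 1e-2
    elif epoch > 80:
        lr *= 1e-1
    return lr

# Inspect the rate at each plateau boundary
for epoch in (0, 81, 121, 161, 181):
    print(epoch, lr_schedule(epoch))
```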
+ if not data_augmentation: + print('Not using data augmentation.') + model.fit(x_train, y_train, + batch_size=batch_size, + epochs=epochs, + validation_data=(x_test, y_test), + shuffle=True, + callbacks=callbacks) + else: + print('Using real-time data augmentation.') + # This will do preprocessing and realtime data augmentation: + datagen = ImageDataGenerator( + # set input mean to 0 over the dataset + featurewise_center=False, + # set each sample mean to 0 + samplewise_center=False, + # divide inputs by std of dataset + featurewise_std_normalization=False, + # divide each input by its std + samplewise_std_normalization=False, + # apply ZCA whitening + zca_whitening=False, + # epsilon for ZCA whitening + zca_epsilon=1e-06, + # randomly rotate images in the range (deg 0 to 180) + rotation_range=0, + # randomly shift images horizontally + width_shift_range=0.1, + # randomly shift images vertically + height_shift_range=0.1, + # set range for random shear + shear_range=0., + # set range for random zoom + zoom_range=0., + # set range for random channel shifts + channel_shift_range=0., + # set mode for filling points outside the input boundaries + fill_mode='nearest', + # value used for fill_mode = "constant" + cval=0., + # randomly flip images + horizontal_flip=True, + # randomly flip images + vertical_flip=False, + # set rescaling factor (applied before any other transformation) + rescale=None, + # set function that will be applied on each input + preprocessing_function=None, + # image data format, either "channels_first" or "channels_last" + data_format=None, + # fraction of images reserved for validation (strictly between 0 and 1) + validation_split=0.0) + + # Compute quantities required for featurewise normalization + # (std, mean, and principal components if ZCA whitening is applied). + datagen.fit(x_train) + + # Fit the model on the batches generated by datagen.flow(). 
+ model.fit_generator(datagen.flow(x_train, y_train, batch_size=batch_size), + steps_per_epoch=len(x_train) // batch_size, + validation_data=(x_test, y_test), + epochs=epochs, verbose=1, workers=4, + callbacks=callbacks) + + # Score trained model. + scores = model.evaluate(x_test, y_test, verbose=1) + print('Test loss:', scores[0]) + print('Test accuracy:', scores[1]) + + # Save Keras model + tmp_model_path = os.path.join(args.output_dir, "tmp") + if os.path.isdir(tmp_model_path): + shutil.rmtree(tmp_model_path) + os.mkdir(tmp_model_path) + + keras_model_path = os.path.join(tmp_model_path, 'keras_model.h5') + model.save(keras_model_path) + + # Convert Keras model to TensorFlow SavedModel + def export_h5_to_pb(path_to_h5, export_path): + # Set the learning phase to Test since the model is already trained. + K.set_learning_phase(0) + # Load the Keras model + keras_model = load_model(path_to_h5) + # Build the Protocol Buffer SavedModel at 'export_path' + builder = saved_model_builder.SavedModelBuilder(export_path) + # Create prediction signature to be used by TensorFlow Serving Predict API + signature = predict_signature_def(inputs={"input_1": keras_model.input}, + outputs={"dense_1": keras_model.output}) + with K.get_session() as sess: + # Save the meta graph and the variables + builder.add_meta_graph_and_variables(sess=sess, tags=[tag_constants.SERVING], + signature_def_map={signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature}) + builder.save() + + tf_model_path = os.path.join(args.output_dir, "tf_saved_model") + if os.path.isdir(tf_model_path): + shutil.rmtree(tf_model_path) + + export_h5_to_pb(keras_model_path, tf_model_path) + + # Apply TF-TRT to the TensorFlow SavedModel + graph = tf.Graph() + with graph.as_default(): + with tf.Session(): + # Create a TensorRT inference graph from a SavedModel: + trt_graph = trt.create_inference_graph( + input_graph_def=None, + outputs=None, + input_saved_model_dir=tf_model_path, + 
input_saved_model_tags=[tag_constants.SERVING], + max_batch_size=batch_size, + max_workspace_size_bytes=2 << 30, + precision_mode='fp16') + + print([n.name + '=>' + n.op for n in trt_graph.node]) + + tf.io.write_graph( + trt_graph, + os.path.join(model_dir, args.model_version), + 'model.graphdef', + as_text=False + ) + + # Remove tmp dirs + shutil.rmtree(tmp_model_path) + shutil.rmtree(tf_model_path) + + with open('/output.txt', 'w') as f: + f.write(args.output_dir) + + print('input_dir: {}'.format(args.input_dir)) + print('output_dir: {}'.format(args.output_dir)) + + +if __name__ == "__main__": + main() diff --git a/samples/nvidia-resnet/train/trtis_resource/config.pbtxt b/samples/nvidia-resnet/components/train/src/trtis_resource/config.pbtxt similarity index 84% rename from samples/nvidia-resnet/train/trtis_resource/config.pbtxt rename to samples/nvidia-resnet/components/train/src/trtis_resource/config.pbtxt index 073ab43b366f..21563ca7ddbe 100644 --- a/samples/nvidia-resnet/train/trtis_resource/config.pbtxt +++ b/samples/nvidia-resnet/components/train/src/trtis_resource/config.pbtxt @@ -1,29 +1,29 @@ -name: "resnet_graphdef" -platform: "tensorflow_graphdef" -max_batch_size: 128 - - -input [ - { - name: "input_1_1" - data_type: TYPE_FP32 - format: FORMAT_NHWC - dims: [ 32, 32, 3 ] - } -] - -output [ - { - name: "dense_1_1/Softmax" - data_type: TYPE_FP32 - dims: [ 10 ] - label_filename: "labels.txt" - } -] - -instance_group [ - { - kind: KIND_GPU, - count: 4 - } -] +name: "resnet_graphdef" +platform: "tensorflow_graphdef" +max_batch_size: 128 + + +input [ + { + name: "input_1_1" + data_type: TYPE_FP32 + format: FORMAT_NHWC + dims: [ 32, 32, 3 ] + } +] + +output [ + { + name: "dense_1_1/Softmax" + data_type: TYPE_FP32 + dims: [ 10 ] + label_filename: "labels.txt" + } +] + +instance_group [ + { + count: 2, + kind: KIND_GPU + } +] diff --git a/samples/nvidia-resnet/train/trtis_resource/labels.txt b/samples/nvidia-resnet/components/train/src/trtis_resource/labels.txt 
similarity index 90% rename from samples/nvidia-resnet/train/trtis_resource/labels.txt rename to samples/nvidia-resnet/components/train/src/trtis_resource/labels.txt index f1a0b5852787..fa30c22b95d7 100644 --- a/samples/nvidia-resnet/train/trtis_resource/labels.txt +++ b/samples/nvidia-resnet/components/train/src/trtis_resource/labels.txt @@ -7,4 +7,4 @@ dog frog horse ship -truck \ No newline at end of file +truck diff --git a/samples/nvidia-resnet/components/webapp/Dockerfile b/samples/nvidia-resnet/components/webapp/Dockerfile new file mode 100644 index 000000000000..9548491acd2b --- /dev/null +++ b/samples/nvidia-resnet/components/webapp/Dockerfile @@ -0,0 +1,22 @@ +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +FROM base-trtis-client + +RUN pip3 install flask +ADD src /workspace/web_server +WORKDIR /workspace/web_server +EXPOSE 8080 + +ENTRYPOINT ["python3", "flask_server.py"] diff --git a/samples/nvidia-resnet/components/webapp/build.sh b/samples/nvidia-resnet/components/webapp/build.sh new file mode 100755 index 000000000000..6568df6617af --- /dev/null +++ b/samples/nvidia-resnet/components/webapp/build.sh @@ -0,0 +1,26 @@ +#!/bin/bash +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +IMAGE= + +# Build base TRTIS client image +git clone https://github.com/NVIDIA/tensorrt-inference-server.git +base=tensorrt-inference-server +docker build -t base-trtis-client -f $base/Dockerfile.client $base +rm -rf $base + +# Build & push webapp image +docker build -t $IMAGE . +docker push $IMAGE diff --git a/samples/nvidia-resnet/components/webapp/src/README.md b/samples/nvidia-resnet/components/webapp/src/README.md new file mode 100644 index 000000000000..42bceb564533 --- /dev/null +++ b/samples/nvidia-resnet/components/webapp/src/README.md @@ -0,0 +1,14 @@ +# web-ui + +The files in this folder define a web interface that can be used to interact with the TensorRT Inference Server + +- flask_server.py + - main server code: handles incoming requests and renders HTML from a template +- trtis_client.py + - code to interact with the TensorRT Inference Server + - takes in an image and server details, and returns the server's response +- Dockerfile + - builds a runnable container out of the files in this directory + +--- +This is not an officially supported Google product diff --git a/samples/nvidia-resnet/components/webapp/src/flask_server.py b/samples/nvidia-resnet/components/webapp/src/flask_server.py new file mode 100644 index 000000000000..019f576f7411 --- /dev/null +++ b/samples/nvidia-resnet/components/webapp/src/flask_server.py @@ -0,0 +1,87 @@ +''' +Copyright 2018 Google LLC + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. 
+You may obtain a copy of the License at + + https://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. +''' + +import logging +import os +# from threading import Timer + +from flask import Flask, render_template, request +from trtis_client import get_prediction, random_image + +app = Flask(__name__) + +name_arg = os.getenv('MODEL_SERVE_NAME', 'resnet_graphdef') +addr_arg = os.getenv('TRTSERVER_HOST', '10.110.20.210') +port_arg = os.getenv('TRTSERVER_PORT', '8001') +model_version = os.getenv('MODEL_VERSION', '-1') + +# handle requests to the server +@app.route("/") +def main(): + args = {"name": name_arg, "addr": addr_arg, "port": port_arg, "version": str(model_version)} + logging.info("Request args: %s", args) + + output = None + connection = {"text": "", "success": False} + try: + # get a random CIFAR-10 test image + file_name, truth, serving_path = random_image('/workspace/web_server/static/images') + # get prediction from the TensorRT Inference Server + pred, scores = get_prediction(file_name, + server_host=addr_arg, + server_port=int(port_arg), + model_name=name_arg, + model_version=int(model_version)) + # if no exceptions thrown, server connection was a success + connection["text"] = "Connected (model version: {0})".format(model_version) + connection["success"] = True + # parse class confidence scores from server prediction + output = {"truth": truth, "prediction": pred, + "img_path": serving_path, "scores": scores} + except Exception as e: # pylint: disable=broad-except + logging.info("Exception occurred: %s", e) + # server connection failed + connection["text"] = "Exception making request: {0}".format(e) + # after 10 seconds, delete cached image file from server + # t = 
Timer(10.0, remove_resource, [img_path]) + # t.start() + # render results using HTML template + return render_template('index.html', output=output, + connection=connection, args=args) + + +def remove_resource(path): + """ + attempt to delete file from path. Used to clean up MNIST testing images + + :param path: the path of the file to delete + """ + try: + os.remove(path) + print("removed " + path) + except OSError: + print("no file at " + path) + + +if __name__ == '__main__': + logging.basicConfig(level=logging.INFO, + format=('%(levelname)s|%(asctime)s' + '|%(pathname)s|%(lineno)d| %(message)s'), + datefmt='%Y-%m-%dT%H:%M:%S', + ) + logging.getLogger().setLevel(logging.INFO) + logging.info("Starting flask.") + app.run(debug=True, host='0.0.0.0', port=8080) diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/0.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/0.jpg new file mode 100644 index 000000000000..904c5216d1fa Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/0.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/1.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/1.jpg new file mode 100644 index 000000000000..75289241606f Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/1.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/10.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/10.jpg new file mode 100644 index 000000000000..4cd3a3abb2c5 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/10.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/11.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/11.jpg new file mode 100644 index 000000000000..14d7e8e22a18 Binary files /dev/null and 
b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/11.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/12.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/12.jpg new file mode 100644 index 000000000000..54fb9a8188fb Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/12.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/13.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/13.jpg new file mode 100644 index 000000000000..327a8d601c5e Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/13.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/14.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/14.jpg new file mode 100644 index 000000000000..ca40fd3f3eb0 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/14.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/15.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/15.jpg new file mode 100644 index 000000000000..8308735627f1 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/15.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/16.png b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/16.png new file mode 100644 index 000000000000..81f2a47fc01c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/16.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/17.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/17.jpg new file mode 100644 index 000000000000..b04f21fbdca7 Binary files /dev/null and 
b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/17.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/18.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/18.jpg new file mode 100644 index 000000000000..159af1498a21 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/18.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/19.png b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/19.png new file mode 100644 index 000000000000..309db4dfe600 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/19.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/2.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/2.jpg new file mode 100644 index 000000000000..9e0cc5857854 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/2.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/3.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/3.jpg new file mode 100644 index 000000000000..a0f5e66c7232 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/3.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/4.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/4.jpg new file mode 100644 index 000000000000..b8d75d0e3504 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/4.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/5.png b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/5.png new file mode 100644 index 000000000000..0c4c05359221 Binary files /dev/null and 
b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/5.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/6.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/6.jpg new file mode 100644 index 000000000000..fcd0304ddbbd Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/6.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/7.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/7.jpg new file mode 100644 index 000000000000..1bc6f0d8ca9b Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/7.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/8.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/8.jpg new file mode 100644 index 000000000000..a45947ced0c3 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/8.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/airplane/9.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/9.jpg new file mode 100644 index 000000000000..1047f4d1a504 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/airplane/9.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/0.png b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/0.png new file mode 100644 index 000000000000..0c8126721967 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/0.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/1.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/1.jpg new file mode 100644 index 000000000000..3e4f37cbe13c Binary files /dev/null and 
b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/1.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/10.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/10.jpg new file mode 100644 index 000000000000..8ccc57a19027 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/10.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/11.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/11.jpg new file mode 100644 index 000000000000..1c163964f706 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/11.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/12.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/12.jpg new file mode 100644 index 000000000000..c48107d55849 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/12.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/13.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/13.jpg new file mode 100644 index 000000000000..c0feb1d4b7af Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/13.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/14.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/14.jpg new file mode 100644 index 000000000000..b1032b7d8647 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/14.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/15.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/15.jpg new file mode 100644 index 000000000000..9f14cab3d4bc Binary files /dev/null and 
b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/15.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/16.png b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/16.png new file mode 100644 index 000000000000..d041bb3bfa99 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/16.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/17.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/17.jpg new file mode 100644 index 000000000000..1701398e833b Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/17.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/18.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/18.jpg new file mode 100644 index 000000000000..61bca21dc83a Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/18.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/19.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/19.jpg new file mode 100644 index 000000000000..8bb29d1135db Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/19.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/2.png b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/2.png new file mode 100644 index 000000000000..b3f14705a52f Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/2.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/3.jpeg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/3.jpeg new file mode 100644 index 000000000000..0f64b56b38c7 Binary files /dev/null and 
b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/3.jpeg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/4.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/4.jpg new file mode 100644 index 000000000000..3b34bb9a591e Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/4.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/5.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/5.jpg new file mode 100644 index 000000000000..99d027f38d0f Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/5.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/6.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/6.jpg new file mode 100644 index 000000000000..a46b6c1327be Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/6.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/7.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/7.jpg new file mode 100644 index 000000000000..6add380a0378 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/7.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/8.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/8.jpg new file mode 100644 index 000000000000..a0e0a5b08cac Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/8.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/automobile/9.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/9.jpg new file mode 100644 index 000000000000..c9c05f1218b6 Binary files /dev/null and 
b/samples/nvidia-resnet/components/webapp/src/static/images/automobile/9.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/0.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/0.jpg new file mode 100644 index 000000000000..e20e12b098f7 --- /dev/null +++ b/samples/nvidia-resnet/components/webapp/src/static/images/bird/0.jpg @@ -0,0 +1,527 @@ + + + + + +1467x1007px Bird 73.51 KB #192166 + + + + + + + + + + + + + + + + + +
+ "Bird 192166" wallpaper by Daniel Bruno, 1467x1007px, 73.51 KB, Animals category, uploaded 09 April 2015.
diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/1.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/1.jpg new file mode 100644 index 000000000000..1d68b4e095f8 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/1.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/10.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/10.jpg new file mode 100644 index 000000000000..17886b780f78 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/10.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/11.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/11.jpg new file mode 100644 index 000000000000..9e23685705a7 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/11.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/12.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/12.jpg new file mode 100644 index 000000000000..fe47575618ae Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/12.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/13.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/13.jpg new file mode 100644 index 000000000000..e3ff5f2107e9 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/13.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/14.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/14.jpg new file mode 100644 index 000000000000..5fc355d355af Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/14.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/15.jpg
b/samples/nvidia-resnet/components/webapp/src/static/images/bird/15.jpg new file mode 100644 index 000000000000..83a548f3909c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/15.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/16.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/16.jpg new file mode 100644 index 000000000000..3a1449990415 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/16.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/17.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/17.jpg new file mode 100644 index 000000000000..162a2225ca05 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/17.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/18.jpeg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/18.jpeg new file mode 100644 index 000000000000..fa03dcfddfad Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/18.jpeg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/19.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/19.jpg new file mode 100644 index 000000000000..36d2ce29c879 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/19.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/2.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/2.jpg new file mode 100644 index 000000000000..39ac91e7e5b0 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/2.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/3.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/3.jpg new file mode 100644 index 
000000000000..a63b8de8bb19 --- /dev/null +++ b/samples/nvidia-resnet/components/webapp/src/static/images/bird/3.jpg @@ -0,0 +1,48 @@ +This evening I got lucky enough to spot... - Jeremy Black Photography | Facebook
\ No newline at end of file diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/4.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/4.jpg new file mode 100644 index 000000000000..f00981fcf104 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/4.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/5.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/5.jpg new file mode 100644 index 000000000000..8c1674592e63 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/5.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/6.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/6.jpg new file mode 100644 index 000000000000..700d93a3a896 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/6.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/7.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/7.jpg new file mode 100644 index 000000000000..49fd56e6dc76 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/7.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/8.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/8.jpg new file mode 100644 index 000000000000..864239f2f72b Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/8.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/bird/9.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/bird/9.jpg new file mode 100644 index 000000000000..b9b3ebc956e7 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/bird/9.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/0.jpg
b/samples/nvidia-resnet/components/webapp/src/static/images/cat/0.jpg new file mode 100644 index 000000000000..a439850df429 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/0.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/1.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/1.jpg new file mode 100644 index 000000000000..4724c3eea26b Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/1.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/10.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/10.jpg new file mode 100644 index 000000000000..09e80c165e63 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/10.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/11.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/11.jpg new file mode 100644 index 000000000000..3c5558c3ea75 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/11.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/12.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/12.jpg new file mode 100644 index 000000000000..4e8ff4a2703c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/12.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/13.gif b/samples/nvidia-resnet/components/webapp/src/static/images/cat/13.gif new file mode 100644 index 000000000000..0c771addc189 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/13.gif differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/14.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/14.jpg new file mode 100644 index 000000000000..2f21016bc85e Binary 
files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/14.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/15.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/15.jpg new file mode 100644 index 000000000000..05f08f730506 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/15.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/16.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/16.jpg new file mode 100644 index 000000000000..cb109b2b1bfa Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/16.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/17.jpeg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/17.jpeg new file mode 100644 index 000000000000..023c53ef476e Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/17.jpeg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/18.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/18.jpg new file mode 100644 index 000000000000..08b237f9d94c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/18.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/19.png b/samples/nvidia-resnet/components/webapp/src/static/images/cat/19.png new file mode 100644 index 000000000000..92d996510319 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/19.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/2.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/2.jpg new file mode 100644 index 000000000000..7c79c747147d Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/2.jpg differ diff --git 
a/samples/nvidia-resnet/components/webapp/src/static/images/cat/3.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/3.jpg new file mode 100644 index 000000000000..a260caf6e03c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/3.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/4.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/4.jpg new file mode 100644 index 000000000000..c994e81de49c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/4.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/5.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/5.jpg new file mode 100644 index 000000000000..8c53b6efd83d Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/5.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/6.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/6.jpg new file mode 100644 index 000000000000..8a05a1fc65f5 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/6.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/7.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/7.jpg new file mode 100644 index 000000000000..f7bc217f5c7e Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/7.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/8.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/8.jpg new file mode 100644 index 000000000000..3aecce0c8304 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/8.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/cat/9.jpeg b/samples/nvidia-resnet/components/webapp/src/static/images/cat/9.jpeg new 
file mode 100644 index 000000000000..6c5837a6635f Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/cat/9.jpeg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/0.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/0.jpg new file mode 100644 index 000000000000..89a5688fa614 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/0.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/1.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/1.jpg new file mode 100644 index 000000000000..e09dae7d159d Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/1.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/10.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/10.jpg new file mode 100644 index 000000000000..32384ddabdb4 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/10.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/11.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/11.jpg new file mode 100644 index 000000000000..da7596365863 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/11.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/12.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/12.jpg new file mode 100644 index 000000000000..fba4701e959a Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/12.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/13.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/13.jpg new file mode 100644 index 000000000000..75a77ef1a21a Binary files /dev/null and 
b/samples/nvidia-resnet/components/webapp/src/static/images/deer/13.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/14.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/14.jpg new file mode 100644 index 000000000000..e6c30296e09c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/14.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/15.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/15.jpg new file mode 100644 index 000000000000..0ceac44669c6 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/15.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/16.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/16.jpg new file mode 100644 index 000000000000..e93b0cde3042 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/16.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/17.png b/samples/nvidia-resnet/components/webapp/src/static/images/deer/17.png new file mode 100644 index 000000000000..12a7c912f8d4 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/17.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/18.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/18.jpg new file mode 100644 index 000000000000..d8569d243d47 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/18.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/19.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/19.jpg new file mode 100644 index 000000000000..f27faf236541 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/19.jpg differ diff --git 
a/samples/nvidia-resnet/components/webapp/src/static/images/deer/2.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/2.jpg new file mode 100644 index 000000000000..18caf7359209 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/2.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/3.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/3.jpg new file mode 100644 index 000000000000..f2ecf812814f Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/3.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/4.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/4.jpg new file mode 100644 index 000000000000..f7b74d28bb06 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/4.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/5.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/5.jpg new file mode 100644 index 000000000000..606519f490b6 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/5.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/6.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/6.jpg new file mode 100644 index 000000000000..1aa6fa24cfa6 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/6.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/7.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/7.jpg new file mode 100644 index 000000000000..26e5f27ec1eb Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/7.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/8.jpg 
b/samples/nvidia-resnet/components/webapp/src/static/images/deer/8.jpg new file mode 100644 index 000000000000..fc793b07126a Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/8.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/deer/9.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/deer/9.jpg new file mode 100644 index 000000000000..8cf0bb287231 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/deer/9.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/0.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/0.jpg new file mode 100644 index 000000000000..efab5b6cca88 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/0.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/1.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/1.jpg new file mode 100644 index 000000000000..4ce468208906 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/1.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/10.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/10.jpg new file mode 100644 index 000000000000..f1458b0822d8 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/10.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/11.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/11.jpg new file mode 100644 index 000000000000..a82c2f0be069 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/11.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/12.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/12.jpg new file mode 100644 index 000000000000..8e8959d46599 Binary 
files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/12.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/13.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/13.jpg new file mode 100644 index 000000000000..83681ea12715 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/13.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/14.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/14.jpg new file mode 100644 index 000000000000..87f0cdd63f92 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/14.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/15.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/15.jpg new file mode 100644 index 000000000000..135598f53a95 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/15.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/16.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/16.jpg new file mode 100644 index 000000000000..87bbd375a04b Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/16.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/17.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/17.jpg new file mode 100644 index 000000000000..6fae975e462f Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/17.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/18.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/18.jpg new file mode 100644 index 000000000000..5b9de13d7d0d Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/18.jpg differ diff --git 
a/samples/nvidia-resnet/components/webapp/src/static/images/dog/19.png b/samples/nvidia-resnet/components/webapp/src/static/images/dog/19.png new file mode 100644 index 000000000000..5e811f2044b7 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/19.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/2.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/2.jpg new file mode 100644 index 000000000000..051ede13d0dd Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/2.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/3.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/3.jpg new file mode 100644 index 000000000000..1f2181642618 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/3.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/4.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/4.jpg new file mode 100644 index 000000000000..351a8f3f68a9 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/4.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/5.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/5.jpg new file mode 100644 index 000000000000..ce06a1477ee2 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/5.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/6.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/6.jpg new file mode 100644 index 000000000000..656a38ecda3b Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/6.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/7.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/7.jpg 
new file mode 100644 index 000000000000..9b893dcb9934 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/7.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/8.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/8.jpg new file mode 100644 index 000000000000..e3937ea23ff4 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/8.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/dog/9.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/dog/9.jpg new file mode 100644 index 000000000000..14046f03352d Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/dog/9.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/0.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/0.jpg new file mode 100644 index 000000000000..1dbc29ff89b4 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/0.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/1.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/1.jpg new file mode 100644 index 000000000000..e3ddce6aa58f Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/1.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/10.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/10.jpg new file mode 100644 index 000000000000..688133c3a701 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/10.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/11.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/11.jpg new file mode 100644 index 000000000000..aa5b5506c20a Binary files /dev/null and 
b/samples/nvidia-resnet/components/webapp/src/static/images/frog/11.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/12.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/12.jpg new file mode 100644 index 000000000000..996b94b1416e Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/12.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/13.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/13.jpg new file mode 100644 index 000000000000..c3f09fc0be08 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/13.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/14.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/14.jpg new file mode 100644 index 000000000000..dd1af603d0fb Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/14.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/15.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/15.jpg new file mode 100644 index 000000000000..3ed0c44e98a5 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/15.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/16.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/16.jpg new file mode 100644 index 000000000000..3ae8d889c3fb Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/16.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/17.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/17.jpg new file mode 100644 index 000000000000..64786fdb436c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/17.jpg differ diff --git 
a/samples/nvidia-resnet/components/webapp/src/static/images/frog/18.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/18.jpg new file mode 100644 index 000000000000..3dbd79ed55b0 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/18.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/19.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/19.jpg new file mode 100644 index 000000000000..ff89ba65d319 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/19.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/2.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/2.jpg new file mode 100644 index 000000000000..54040025bf7b Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/2.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/3.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/3.jpg new file mode 100644 index 000000000000..a373d9baacdd Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/3.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/4.png b/samples/nvidia-resnet/components/webapp/src/static/images/frog/4.png new file mode 100644 index 000000000000..900eb8308676 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/4.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/5.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/5.jpg new file mode 100644 index 000000000000..c15bd9b9639d Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/5.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/6.jpg 
b/samples/nvidia-resnet/components/webapp/src/static/images/frog/6.jpg new file mode 100644 index 000000000000..6beecb41dbed Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/6.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/7.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/7.jpg new file mode 100644 index 000000000000..5cad0b296340 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/7.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/8.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/8.jpg new file mode 100644 index 000000000000..74d4c27cb805 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/8.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/frog/9.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/frog/9.jpg new file mode 100644 index 000000000000..34841845ae17 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/frog/9.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/0.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/0.jpg new file mode 100644 index 000000000000..0e6f776d5b89 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/0.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/1.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/1.jpg new file mode 100644 index 000000000000..5ca43ec71f06 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/1.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/10.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/10.jpg new file mode 100644 index 
000000000000..e1fe1ba83ebb Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/10.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/11.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/11.jpg new file mode 100644 index 000000000000..2e08dcfd78d8 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/11.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/12.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/12.jpg new file mode 100644 index 000000000000..60ad21b595a9 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/12.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/13.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/13.jpg new file mode 100644 index 000000000000..7f03168816d4 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/13.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/14.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/14.jpg new file mode 100644 index 000000000000..2af322744ef1 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/14.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/15.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/15.jpg new file mode 100644 index 000000000000..051d6d144bcb Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/15.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/16.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/16.jpg new file mode 100644 index 000000000000..4e00fd6f9fa2 Binary files /dev/null and 
b/samples/nvidia-resnet/components/webapp/src/static/images/horse/16.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/17.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/17.jpg new file mode 100644 index 000000000000..6ab60061d7f1 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/17.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/18.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/18.jpg new file mode 100644 index 000000000000..33c492ab26ac Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/18.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/19.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/19.jpg new file mode 100644 index 000000000000..78771e135a5c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/19.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/2.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/2.jpg new file mode 100644 index 000000000000..75a73cb3bb68 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/2.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/3.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/3.jpg new file mode 100644 index 000000000000..e66abc6d2499 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/3.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/4.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/4.jpg new file mode 100644 index 000000000000..7f481630e6aa Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/4.jpg differ diff --git 
a/samples/nvidia-resnet/components/webapp/src/static/images/horse/5.jpeg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/5.jpeg new file mode 100644 index 000000000000..04cd669a6807 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/5.jpeg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/6.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/6.jpg new file mode 100644 index 000000000000..eca5efea52cd Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/6.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/7.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/7.jpg new file mode 100644 index 000000000000..81ae8d2562eb Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/7.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/8.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/8.jpg new file mode 100644 index 000000000000..c3f6bc17cf6e Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/8.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/horse/9.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/horse/9.jpg new file mode 100644 index 000000000000..612ae6feaf42 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/horse/9.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/0.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/0.jpg new file mode 100644 index 000000000000..439d61ec4d3c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/0.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/1.jpg 
b/samples/nvidia-resnet/components/webapp/src/static/images/ship/1.jpg new file mode 100644 index 000000000000..a9f18a6b9405 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/1.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/10.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/10.jpg new file mode 100644 index 000000000000..c50b0fe92f17 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/10.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/11.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/11.jpg new file mode 100644 index 000000000000..2d20c2c3ad13 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/11.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/12.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/12.jpg new file mode 100644 index 000000000000..a1a7f22d530c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/12.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/13.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/13.jpg new file mode 100644 index 000000000000..fca8669f21da Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/13.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/14.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/14.jpg new file mode 100644 index 000000000000..0f5fb607e95d Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/14.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/15.jpeg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/15.jpeg new file mode 100644 index 
000000000000..9b143aebe053 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/15.jpeg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/16.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/16.jpg new file mode 100644 index 000000000000..e6fb5a9df6c9 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/16.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/17.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/17.jpg new file mode 100644 index 000000000000..5aa9d2c5391c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/17.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/18.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/18.jpg new file mode 100644 index 000000000000..d4a2d8639fbf Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/18.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/19.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/19.jpg new file mode 100644 index 000000000000..e69de29bb2d1 diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/2.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/2.jpg new file mode 100644 index 000000000000..064ab050ce8a Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/2.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/3.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/3.jpg new file mode 100644 index 000000000000..76046c1e0a5b Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/3.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/4.jpg 
b/samples/nvidia-resnet/components/webapp/src/static/images/ship/4.jpg new file mode 100644 index 000000000000..9b6d6cd1cb2a Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/4.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/5.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/5.jpg new file mode 100644 index 000000000000..6444ebc79b17 --- /dev/null +++ b/samples/nvidia-resnet/components/webapp/src/static/images/ship/5.jpg @@ -0,0 +1,302 @@
+[saved HTML page titled "Luxury cruises in 2019 & 2020 – Incredible vacations with Cunard"; 302 lines of markup omitted]
diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/6.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/6.jpg new file mode 100644 index 000000000000..48a9e288e31e Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/6.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/7.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/7.jpg new file mode 100644 index 000000000000..2d07ac04a193 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/7.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/8.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/8.jpg new file mode 100644 index 000000000000..ce5725aa4bb3 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/8.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/ship/9.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/ship/9.jpg new file mode 100644 index 000000000000..0b0f77c01533 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/ship/9.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/0.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/0.jpg new file mode 100644 index 000000000000..27888605675c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/0.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/1.png b/samples/nvidia-resnet/components/webapp/src/static/images/truck/1.png new file mode 100644 index 000000000000..35636aa03c5c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/1.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/10.jpg
b/samples/nvidia-resnet/components/webapp/src/static/images/truck/10.jpg new file mode 100644 index 000000000000..08d692f4e856 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/10.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/11.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/11.jpg new file mode 100644 index 000000000000..85ec82c86048 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/11.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/12.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/12.jpg new file mode 100644 index 000000000000..75955d865846 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/12.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/13.png b/samples/nvidia-resnet/components/webapp/src/static/images/truck/13.png new file mode 100644 index 000000000000..6c0a07a90897 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/13.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/14.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/14.jpg new file mode 100644 index 000000000000..2585f91f22b2 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/14.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/15.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/15.jpg new file mode 100644 index 000000000000..fc52ecf1a37f Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/15.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/16.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/16.jpg new file mode 
100644 index 000000000000..ecf3d3c1fc0c Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/16.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/17.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/17.jpg new file mode 100644 index 000000000000..efaa8cf74220 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/17.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/18.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/18.jpg new file mode 100644 index 000000000000..eef7e6e7371e Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/18.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/19.png b/samples/nvidia-resnet/components/webapp/src/static/images/truck/19.png new file mode 100644 index 000000000000..fd2a29125f8e Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/19.png differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/2.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/2.jpg new file mode 100644 index 000000000000..6192ead65a31 --- /dev/null +++ b/samples/nvidia-resnet/components/webapp/src/static/images/truck/2.jpg @@ -0,0 +1,42 @@
+[saved HTML page titled "Big Truck Driver updated their profile... - Big Truck Driver | Facebook"; 42 lines of markup omitted]
\ No newline at end of file
diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/3.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/3.jpg new file mode 100644 index 000000000000..12583282524b Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/3.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/4.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/4.jpg new file mode 100644 index 000000000000..a9e7d2e422c3 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/4.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/5.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/5.jpg new file mode 100644 index 000000000000..f1217de4c96e Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/5.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/6.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/6.jpg new file mode 100644 index 000000000000..fd6b19349801 Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/6.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/7.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/7.jpg new file mode 100644 index 000000000000..09b0d287f81d Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/7.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/images/truck/8.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/8.jpg new file mode 100644 index 000000000000..504c3f00765b Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/8.jpg differ diff --git
a/samples/nvidia-resnet/components/webapp/src/static/images/truck/9.jpg b/samples/nvidia-resnet/components/webapp/src/static/images/truck/9.jpg new file mode 100644 index 000000000000..c5ddc2f254bb Binary files /dev/null and b/samples/nvidia-resnet/components/webapp/src/static/images/truck/9.jpg differ diff --git a/samples/nvidia-resnet/components/webapp/src/static/scripts/material.min.js b/samples/nvidia-resnet/components/webapp/src/static/scripts/material.min.js new file mode 100644 index 000000000000..cb9bbca9edc5 --- /dev/null +++ b/samples/nvidia-resnet/components/webapp/src/static/scripts/material.min.js @@ -0,0 +1,10 @@ +/** + * material-design-lite - Material Design Components in CSS, JS and HTML + * @version v1.0.6 + * @license Apache-2.0 + * @copyright 2015 Google, Inc. + * @link https://github.com/google/material-design-lite + */ +!function(){"use strict";function e(e,t){if(e){if(t.element_.classList.contains(t.CssClasses_.MDL_JS_RIPPLE_EFFECT)){var s=document.createElement("span");s.classList.add(t.CssClasses_.MDL_RIPPLE_CONTAINER),s.classList.add(t.CssClasses_.MDL_JS_RIPPLE_EFFECT);var i=document.createElement("span");i.classList.add(t.CssClasses_.MDL_RIPPLE),s.appendChild(i),e.appendChild(s)}e.addEventListener("click",function(s){s.preventDefault();var i=e.href.split("#")[1],n=t.element_.querySelector("#"+i);t.resetTabState_(),t.resetPanelState_(),e.classList.add(t.CssClasses_.ACTIVE_CLASS),n.classList.add(t.CssClasses_.ACTIVE_CLASS)})}}function t(e,t,s,i){if(i.tabBar_.classList.contains(i.CssClasses_.JS_RIPPLE_EFFECT)){var n=document.createElement("span");n.classList.add(i.CssClasses_.RIPPLE_CONTAINER),n.classList.add(i.CssClasses_.JS_RIPPLE_EFFECT);var a=document.createElement("span");a.classList.add(i.CssClasses_.RIPPLE),n.appendChild(a),e.appendChild(n)}e.addEventListener("click",function(n){n.preventDefault();var 
a=e.href.split("#")[1],l=i.content_.querySelector("#"+a);i.resetTabState_(t),i.resetPanelState_(s),e.classList.add(i.CssClasses_.IS_ACTIVE),l.classList.add(i.CssClasses_.IS_ACTIVE)})}var s={upgradeDom:function(e,t){},upgradeElement:function(e,t){},upgradeElements:function(e){},upgradeAllRegistered:function(){},registerUpgradedCallback:function(e,t){},register:function(e){},downgradeElements:function(e){}};s=function(){function e(e,t){for(var s=0;sd;d++){if(r=l[d],!r)throw new Error("Unable to find a registered component for the given class.");a.push(r.className),i.setAttribute("data-upgraded",a.join(","));var h=new r.classConstructor(i);h[C]=r,c.push(h);for(var u=0,m=r.callbacks.length;m>u;u++)r.callbacks[u](i);r.widget&&(i[r.className]=h);var E=document.createEvent("Events");E.initEvent("mdl-componentupgraded",!0,!0),i.dispatchEvent(E)}}function a(e){Array.isArray(e)||(e="function"==typeof e.item?Array.prototype.slice.call(e):[e]);for(var t,s=0,i=e.length;i>s;s++)t=e[s],t instanceof HTMLElement&&(n(t),t.children.length>0&&a(t.children))}function l(t){var s="undefined"==typeof t.widget&&"undefined"==typeof t.widget,i=!0;s||(i=t.widget||t.widget);var n={classConstructor:t.constructor||t.constructor,className:t.classAsString||t.classAsString,cssClass:t.cssClass||t.cssClass,widget:i,callbacks:[]};if(p.forEach(function(e){if(e.cssClass===n.cssClass)throw new Error("The provided cssClass has already been registered: "+e.cssClass);if(e.className===n.className)throw new Error("The provided className has already been registered")}),t.constructor.prototype.hasOwnProperty(C))throw new Error("MDL component classes must not have "+C+" defined as a property.");var a=e(t.classAsString,n);a||p.push(n)}function o(t,s){var i=e(t);i&&i.callbacks.push(s)}function r(){for(var 
e=0;e0&&this.container_.classList.contains(this.CssClasses_.IS_VISIBLE)&&(e.keyCode===this.Keycodes_.UP_ARROW?(e.preventDefault(),t[t.length-1].focus()):e.keyCode===this.Keycodes_.DOWN_ARROW&&(e.preventDefault(),t[0].focus()))}},_.prototype.handleItemKeyboardEvent_=function(e){if(this.element_&&this.container_){var t=this.element_.querySelectorAll("."+this.CssClasses_.ITEM+":not([disabled])");if(t&&t.length>0&&this.container_.classList.contains(this.CssClasses_.IS_VISIBLE)){var s=Array.prototype.slice.call(t).indexOf(e.target);if(e.keyCode===this.Keycodes_.UP_ARROW)e.preventDefault(),s>0?t[s-1].focus():t[t.length-1].focus();else if(e.keyCode===this.Keycodes_.DOWN_ARROW)e.preventDefault(),t.length>s+1?t[s+1].focus():t[0].focus();else if(e.keyCode===this.Keycodes_.SPACE||e.keyCode===this.Keycodes_.ENTER){e.preventDefault();var i=new MouseEvent("mousedown");e.target.dispatchEvent(i),i=new MouseEvent("mouseup"),e.target.dispatchEvent(i),e.target.click()}else e.keyCode===this.Keycodes_.ESCAPE&&(e.preventDefault(),this.hide())}}},_.prototype.handleItemClick_=function(e){e.target.hasAttribute("disabled")?e.stopPropagation():(this.closing_=!0,window.setTimeout(function(e){this.hide(),this.closing_=!1}.bind(this),this.Constant_.CLOSE_TIMEOUT))},_.prototype.applyClip_=function(e,t){this.element_.classList.contains(this.CssClasses_.UNALIGNED)?this.element_.style.clip="":this.element_.classList.contains(this.CssClasses_.BOTTOM_RIGHT)?this.element_.style.clip="rect(0 "+t+"px 0 "+t+"px)":this.element_.classList.contains(this.CssClasses_.TOP_LEFT)?this.element_.style.clip="rect("+e+"px 0 "+e+"px 0)":this.element_.classList.contains(this.CssClasses_.TOP_RIGHT)?this.element_.style.clip="rect("+e+"px "+t+"px "+e+"px "+t+"px)":this.element_.style.clip=""},_.prototype.addAnimationEndListener_=function(){var 
e=function(){this.element_.removeEventListener("transitionend",e),this.element_.removeEventListener("webkitTransitionEnd",e),this.element_.classList.remove(this.CssClasses_.IS_ANIMATING)}.bind(this);this.element_.addEventListener("transitionend",e),this.element_.addEventListener("webkitTransitionEnd",e)},_.prototype.show=function(e){if(this.element_&&this.container_&&this.outline_){var t=this.element_.getBoundingClientRect().height,s=this.element_.getBoundingClientRect().width;this.container_.style.width=s+"px",this.container_.style.height=t+"px",this.outline_.style.width=s+"px",this.outline_.style.height=t+"px";for(var i=this.Constant_.TRANSITION_DURATION_SECONDS*this.Constant_.TRANSITION_DURATION_FRACTION,n=this.element_.querySelectorAll("."+this.CssClasses_.ITEM),a=0;a=this.maxRows&&e.preventDefault()},E.prototype.onFocus_=function(e){this.element_.classList.add(this.CssClasses_.IS_FOCUSED)},E.prototype.onBlur_=function(e){this.element_.classList.remove(this.CssClasses_.IS_FOCUSED)},E.prototype.updateClasses_=function(){this.checkDisabled(),this.checkValidity(),this.checkDirty()},E.prototype.checkDisabled=function(){this.input_.disabled?this.element_.classList.add(this.CssClasses_.IS_DISABLED):this.element_.classList.remove(this.CssClasses_.IS_DISABLED)},E.prototype.checkDisabled=E.prototype.checkDisabled,E.prototype.checkValidity=function(){this.input_.validity&&(this.input_.validity.valid?this.element_.classList.remove(this.CssClasses_.IS_INVALID):this.element_.classList.add(this.CssClasses_.IS_INVALID))},E.prototype.checkValidity=E.prototype.checkValidity,E.prototype.checkDirty=function(){this.input_.value&&this.input_.value.length>0?this.element_.classList.add(this.CssClasses_.IS_DIRTY):this.element_.classList.remove(this.CssClasses_.IS_DIRTY)},E.prototype.checkDirty=E.prototype.checkDirty,E.prototype.disable=function(){this.input_.disabled=!0,this.updateClasses_()},E.prototype.disable=E.prototype.disable,E.prototype.enable=function(){this.input_.disabled=!1,
this.updateClasses_()},E.prototype.enable=E.prototype.enable,E.prototype.change=function(e){this.input_.value=e||"",this.updateClasses_()},E.prototype.change=E.prototype.change,E.prototype.init=function(){if(this.element_&&(this.label_=this.element_.querySelector("."+this.CssClasses_.LABEL),this.input_=this.element_.querySelector("."+this.CssClasses_.INPUT),this.input_)){this.input_.hasAttribute(this.Constant_.MAX_ROWS_ATTRIBUTE)&&(this.maxRows=parseInt(this.input_.getAttribute(this.Constant_.MAX_ROWS_ATTRIBUTE),10),isNaN(this.maxRows)&&(this.maxRows=this.Constant_.NO_MAX_ROWS)),this.boundUpdateClassesHandler=this.updateClasses_.bind(this),this.boundFocusHandler=this.onFocus_.bind(this),this.boundBlurHandler=this.onBlur_.bind(this),this.input_.addEventListener("input",this.boundUpdateClassesHandler),this.input_.addEventListener("focus",this.boundFocusHandler),this.input_.addEventListener("blur",this.boundBlurHandler),this.maxRows!==this.Constant_.NO_MAX_ROWS&&(this.boundKeyDownHandler=this.onKeyDown_.bind(this),this.input_.addEventListener("keydown",this.boundKeyDownHandler));var e=this.element_.classList.contains(this.CssClasses_.IS_INVALID);this.updateClasses_(),this.element_.classList.add(this.CssClasses_.IS_UPGRADED),e&&this.element_.classList.add(this.CssClasses_.IS_INVALID)}},E.prototype.mdlDowngrade_=function(){this.input_.removeEventListener("input",this.boundUpdateClassesHandler),this.input_.removeEventListener("focus",this.boundFocusHandler),this.input_.removeEventListener("blur",this.boundBlurHandler),this.boundKeyDownHandler&&this.input_.removeEventListener("keydown",this.boundKeyDownHandler)},E.prototype.mdlDowngrade=E.prototype.mdlDowngrade_,E.prototype.mdlDowngrade=E.prototype.mdlDowngrade,s.register({constructor:E,classAsString:"MaterialTextfield",cssClass:"mdl-js-textfield",widget:!0});var 
L=function(e){this.element_=e,this.init()};window.MaterialTooltip=L,L.prototype.Constant_={},L.prototype.CssClasses_={IS_ACTIVE:"is-active"},L.prototype.handleMouseEnter_=function(e){e.stopPropagation();var t=e.target.getBoundingClientRect(),s=t.left+t.width/2,i=-1*(this.element_.offsetWidth/2);0>s+i?(this.element_.style.left=0,this.element_.style.marginLeft=0):(this.element_.style.left=s+"px",this.element_.style.marginLeft=i+"px"),this.element_.style.top=t.top+t.height+10+"px",this.element_.classList.add(this.CssClasses_.IS_ACTIVE),window.addEventListener("scroll",this.boundMouseLeaveHandler,!1),window.addEventListener("touchmove",this.boundMouseLeaveHandler,!1)},L.prototype.handleMouseLeave_=function(e){e.stopPropagation(),this.element_.classList.remove(this.CssClasses_.IS_ACTIVE),window.removeEventListener("scroll",this.boundMouseLeaveHandler),window.removeEventListener("touchmove",this.boundMouseLeaveHandler,!1)},L.prototype.init=function(){if(this.element_){var e=this.element_.getAttribute("for");e&&(this.forElement_=document.getElementById(e)),this.forElement_&&(this.forElement_.hasAttribute("tabindex")||this.forElement_.setAttribute("tabindex","0"),this.boundMouseEnterHandler=this.handleMouseEnter_.bind(this),this.boundMouseLeaveHandler=this.handleMouseLeave_.bind(this),this.forElement_.addEventListener("mouseenter",this.boundMouseEnterHandler,!1),this.forElement_.addEventListener("click",this.boundMouseEnterHandler,!1),this.forElement_.addEventListener("blur",this.boundMouseLeaveHandler),this.forElement_.addEventListener("touchstart",this.boundMouseEnterHandler,!1),this.forElement_.addEventListener("mouseleave",this.boundMouseLeaveHandler))}},L.prototype.mdlDowngrade_=function(){this.forElement_&&(this.forElement_.removeEventListener("mouseenter",this.boundMouseEnterHandler,!1),this.forElement_.removeEventListener("click",this.boundMouseEnterHandler,!1),this.forElement_.removeEventListener("touchstart",this.boundMouseEnterHandler,!1),this.forElement_.removeE
ventListener("mouseleave",this.boundMouseLeaveHandler))},L.prototype.mdlDowngrade=L.prototype.mdlDowngrade_,L.prototype.mdlDowngrade=L.prototype.mdlDowngrade,s.register({constructor:L,classAsString:"MaterialTooltip",cssClass:"mdl-tooltip"});var I=function(e){this.element_=e,this.init()};window.MaterialLayout=I,I.prototype.Constant_={MAX_WIDTH:"(max-width: 1024px)",TAB_SCROLL_PIXELS:100,MENU_ICON:"menu",CHEVRON_LEFT:"chevron_left",CHEVRON_RIGHT:"chevron_right"},I.prototype.Mode_={STANDARD:0,SEAMED:1,WATERFALL:2,SCROLL:3},I.prototype.CssClasses_={CONTAINER:"mdl-layout__container",HEADER:"mdl-layout__header",DRAWER:"mdl-layout__drawer",CONTENT:"mdl-layout__content",DRAWER_BTN:"mdl-layout__drawer-button",ICON:"material-icons",JS_RIPPLE_EFFECT:"mdl-js-ripple-effect",RIPPLE_CONTAINER:"mdl-layout__tab-ripple-container",RIPPLE:"mdl-ripple",RIPPLE_IGNORE_EVENTS:"mdl-js-ripple-effect--ignore-events",HEADER_SEAMED:"mdl-layout__header--seamed",HEADER_WATERFALL:"mdl-layout__header--waterfall",HEADER_SCROLL:"mdl-layout__header--scroll",FIXED_HEADER:"mdl-layout--fixed-header",OBFUSCATOR:"mdl-layout__obfuscator",TAB_BAR:"mdl-layout__tab-bar",TAB_CONTAINER:"mdl-layout__tab-bar-container",TAB:"mdl-layout__tab",TAB_BAR_BUTTON:"mdl-layout__tab-bar-button",TAB_BAR_LEFT_BUTTON:"mdl-layout__tab-bar-left-button",TAB_BAR_RIGHT_BUTTON:"mdl-layout__tab-bar-right-button",PANEL:"mdl-layout__tab-panel",HAS_DRAWER:"has-drawer",HAS_TABS:"has-tabs",HAS_SCROLLING_HEADER:"has-scrolling-header",CASTING_SHADOW:"is-casting-shadow",IS_COMPACT:"is-compact",IS_SMALL_SCREEN:"is-small-screen",IS_DRAWER_OPEN:"is-visible",IS_ACTIVE:"is-active",IS_UPGRADED:"is-upgraded",IS_ANIMATING:"is-animating",ON_LARGE_SCREEN:"mdl-layout--large-screen-only",ON_SMALL_SCREEN:"mdl-layout--small-screen-only"},I.prototype.contentScrollHandler_=function(){this.header_.classList.contains(this.CssClasses_.IS_ANIMATING)||(this.content_.scrollTop>0&&!this.header_.classList.contains(this.CssClasses_.IS_COMPACT)?(this.header_.classList
.add(this.CssClasses_.CASTING_SHADOW),this.header_.classList.add(this.CssClasses_.IS_COMPACT),this.header_.classList.add(this.CssClasses_.IS_ANIMATING)):this.content_.scrollTop<=0&&this.header_.classList.contains(this.CssClasses_.IS_COMPACT)&&(this.header_.classList.remove(this.CssClasses_.CASTING_SHADOW),this.header_.classList.remove(this.CssClasses_.IS_COMPACT),this.header_.classList.add(this.CssClasses_.IS_ANIMATING)))},I.prototype.screenSizeHandler_=function(){this.screenSizeMediaQuery_.matches?this.element_.classList.add(this.CssClasses_.IS_SMALL_SCREEN):(this.element_.classList.remove(this.CssClasses_.IS_SMALL_SCREEN),this.drawer_&&(this.drawer_.classList.remove(this.CssClasses_.IS_DRAWER_OPEN),this.obfuscator_.classList.remove(this.CssClasses_.IS_DRAWER_OPEN)))},I.prototype.drawerToggleHandler_=function(){this.drawer_.classList.toggle(this.CssClasses_.IS_DRAWER_OPEN),this.obfuscator_.classList.toggle(this.CssClasses_.IS_DRAWER_OPEN)},I.prototype.headerTransitionEndHandler_=function(){this.header_.classList.remove(this.CssClasses_.IS_ANIMATING)},I.prototype.headerClickHandler_=function(){this.header_.classList.contains(this.CssClasses_.IS_COMPACT)&&(this.header_.classList.remove(this.CssClasses_.IS_COMPACT),this.header_.classList.add(this.CssClasses_.IS_ANIMATING))},I.prototype.resetTabState_=function(e){for(var t=0;tn;n++){var a=s[n];a.classList&&a.classList.contains(this.CssClasses_.HEADER)&&(this.header_=a),a.classList&&a.classList.contains(this.CssClasses_.DRAWER)&&(this.drawer_=a),a.classList&&a.classList.contains(this.CssClasses_.CONTENT)&&(this.content_=a)}this.header_&&(this.tabBar_=this.header_.querySelector("."+this.CssClasses_.TAB_BAR));var 
l=this.Mode_.STANDARD;if(this.header_&&(this.header_.classList.contains(this.CssClasses_.HEADER_SEAMED)?l=this.Mode_.SEAMED:this.header_.classList.contains(this.CssClasses_.HEADER_WATERFALL)?(l=this.Mode_.WATERFALL,this.header_.addEventListener("transitionend",this.headerTransitionEndHandler_.bind(this)),this.header_.addEventListener("click",this.headerClickHandler_.bind(this))):this.header_.classList.contains(this.CssClasses_.HEADER_SCROLL)&&(l=this.Mode_.SCROLL,e.classList.add(this.CssClasses_.HAS_SCROLLING_HEADER)),l===this.Mode_.STANDARD?(this.header_.classList.add(this.CssClasses_.CASTING_SHADOW),this.tabBar_&&this.tabBar_.classList.add(this.CssClasses_.CASTING_SHADOW)):l===this.Mode_.SEAMED||l===this.Mode_.SCROLL?(this.header_.classList.remove(this.CssClasses_.CASTING_SHADOW),this.tabBar_&&this.tabBar_.classList.remove(this.CssClasses_.CASTING_SHADOW)):l===this.Mode_.WATERFALL&&(this.content_.addEventListener("scroll",this.contentScrollHandler_.bind(this)),this.contentScrollHandler_())),this.drawer_){var o=this.element_.querySelector("."+this.CssClasses_.DRAWER_BTN);if(!o){o=document.createElement("div"),o.classList.add(this.CssClasses_.DRAWER_BTN);var r=document.createElement("i");r.classList.add(this.CssClasses_.ICON),r.textContent=this.Constant_.MENU_ICON,o.appendChild(r)}this.drawer_.classList.contains(this.CssClasses_.ON_LARGE_SCREEN)?o.classList.add(this.CssClasses_.ON_LARGE_SCREEN):this.drawer_.classList.contains(this.CssClasses_.ON_SMALL_SCREEN)&&o.classList.add(this.CssClasses_.ON_SMALL_SCREEN),o.addEventListener("click",this.drawerToggleHandler_.bind(this)),this.element_.classList.add(this.CssClasses_.HAS_DRAWER),this.element_.classList.contains(this.CssClasses_.FIXED_HEADER)?this.header_.insertBefore(o,this.header_.firstChild):this.element_.insertBefore(o,this.content_);var 
d=document.createElement("div");d.classList.add(this.CssClasses_.OBFUSCATOR),this.element_.appendChild(d),d.addEventListener("click",this.drawerToggleHandler_.bind(this)),this.obfuscator_=d}if(this.screenSizeMediaQuery_=window.matchMedia(this.Constant_.MAX_WIDTH),this.screenSizeMediaQuery_.addListener(this.screenSizeHandler_.bind(this)),this.screenSizeHandler_(),this.header_&&this.tabBar_){this.element_.classList.add(this.CssClasses_.HAS_TABS);var _=document.createElement("div");_.classList.add(this.CssClasses_.TAB_CONTAINER),this.header_.insertBefore(_,this.tabBar_),this.header_.removeChild(this.tabBar_);var h=document.createElement("div");h.classList.add(this.CssClasses_.TAB_BAR_BUTTON),h.classList.add(this.CssClasses_.TAB_BAR_LEFT_BUTTON);var p=document.createElement("i");p.classList.add(this.CssClasses_.ICON),p.textContent=this.Constant_.CHEVRON_LEFT,h.appendChild(p),h.addEventListener("click",function(){this.tabBar_.scrollLeft-=this.Constant_.TAB_SCROLL_PIXELS}.bind(this));var c=document.createElement("div");c.classList.add(this.CssClasses_.TAB_BAR_BUTTON),c.classList.add(this.CssClasses_.TAB_BAR_RIGHT_BUTTON);var u=document.createElement("i");u.classList.add(this.CssClasses_.ICON),u.textContent=this.Constant_.CHEVRON_RIGHT,c.appendChild(u),c.addEventListener("click",function(){this.tabBar_.scrollLeft+=this.Constant_.TAB_SCROLL_PIXELS}.bind(this)),_.appendChild(h),_.appendChild(this.tabBar_),_.appendChild(c);var C=function(){this.tabBar_.scrollLeft>0?h.classList.add(this.CssClasses_.IS_ACTIVE):h.classList.remove(this.CssClasses_.IS_ACTIVE),this.tabBar_.scrollLeft0)return;this.setFrameCount(1);var i,n,a=e.currentTarget.getBoundingClientRect();if(0===e.clientX&&0===e.clientY)i=Math.round(a.width/2),n=Math.round(a.height/2);else{var 
l=e.clientX?e.clientX:e.touches[0].clientX,o=e.clientY?e.clientY:e.touches[0].clientY;i=Math.round(l-a.left),n=Math.round(o-a.top)}this.setRippleXY(i,n),this.setRippleStyles(!0),window.requestAnimationFrame(this.animFrameHandler.bind(this))}},b.prototype.upHandler_=function(e){e&&2!==e.detail&&this.rippleElement_.classList.remove(this.CssClasses_.IS_VISIBLE),window.setTimeout(function(){this.rippleElement_.classList.remove(this.CssClasses_.IS_VISIBLE)}.bind(this),0)},b.prototype.init=function(){if(this.element_){var e=this.element_.classList.contains(this.CssClasses_.RIPPLE_CENTER);this.element_.classList.contains(this.CssClasses_.RIPPLE_EFFECT_IGNORE_EVENTS)||(this.rippleElement_=this.element_.querySelector("."+this.CssClasses_.RIPPLE),this.frameCount_=0,this.rippleSize_=0,this.x_=0,this.y_=0,this.ignoringMouseDown_=!1,this.boundDownHandler=this.downHandler_.bind(this),this.element_.addEventListener("mousedown",this.boundDownHandler),this.element_.addEventListener("touchstart",this.boundDownHandler),this.boundUpHandler=this.upHandler_.bind(this),this.element_.addEventListener("mouseup",this.boundUpHandler),this.element_.addEventListener("mouseleave",this.boundUpHandler),this.element_.addEventListener("touchend",this.boundUpHandler),this.element_.addEventListener("blur",this.boundUpHandler),this.getFrameCount=function(){return this.frameCount_},this.setFrameCount=function(e){this.frameCount_=e},this.getRippleElement=function(){return this.rippleElement_},this.setRippleXY=function(e,t){this.x_=e,this.y_=t},this.setRippleStyles=function(t){if(null!==this.rippleElement_){var s,i,n,a="translate("+this.x_+"px, "+this.y_+"px)";t?(i=this.Constant_.INITIAL_SCALE,n=this.Constant_.INITIAL_SIZE):(i=this.Constant_.FINAL_SCALE,n=this.rippleSize_+"px",e&&(a="translate("+this.boundWidth/2+"px, "+this.boundHeight/2+"px)")),s="translate(-50%, -50%) 
"+a+i,this.rippleElement_.style.webkitTransform=s,this.rippleElement_.style.msTransform=s,this.rippleElement_.style.transform=s,t?this.rippleElement_.classList.remove(this.CssClasses_.IS_ANIMATING):this.rippleElement_.classList.add(this.CssClasses_.IS_ANIMATING)}},this.animFrameHandler=function(){this.frameCount_-->0?window.requestAnimationFrame(this.animFrameHandler.bind(this)):this.setRippleStyles(!1)})}},b.prototype.mdlDowngrade_=function(){this.element_.removeEventListener("mousedown",this.boundDownHandler),this.element_.removeEventListener("touchstart",this.boundDownHandler),this.element_.removeEventListener("mouseup",this.boundUpHandler),this.element_.removeEventListener("mouseleave",this.boundUpHandler),this.element_.removeEventListener("touchend",this.boundUpHandler),this.element_.removeEventListener("blur",this.boundUpHandler)},b.prototype.mdlDowngrade=b.prototype.mdlDowngrade_,b.prototype.mdlDowngrade=b.prototype.mdlDowngrade,s.register({constructor:b,classAsString:"MaterialRipple",cssClass:"mdl-js-ripple-effect",widget:!1})}(); +//# sourceMappingURL=material.min.js.map diff --git a/samples/nvidia-resnet/components/webapp/src/static/styles/demo.css b/samples/nvidia-resnet/components/webapp/src/static/styles/demo.css new file mode 100644 index 000000000000..5cbfd26797b0 --- /dev/null +++ b/samples/nvidia-resnet/components/webapp/src/static/styles/demo.css @@ -0,0 +1,238 @@ +/** + * Copyright 2015 Google Inc. All Rights Reserved. + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + */ + +html, body { + font-family: 'Roboto', 'Helvetica', sans-serif; + margin: 0; + padding: 0; +} +.mdl-demo .mdl-layout__header-row { + padding-left: 40px; +} +.mdl-demo .mdl-layout.is-small-screen .mdl-layout__header-row h3 { + font-size: inherit; +} +.mdl-demo .mdl-layout__tab-bar-button { + display: none; +} +.mdl-demo .mdl-layout.is-small-screen .mdl-layout__tab-bar .mdl-button { + display: none; +} +.mdl-demo .mdl-layout:not(.is-small-screen) .mdl-layout__tab-bar, +.mdl-demo .mdl-layout:not(.is-small-screen) .mdl-layout__tab-bar-container { + overflow: visible; +} +.mdl-demo .mdl-layout__tab-bar-container { + height: 64px; +} +.mdl-demo .mdl-layout__tab-bar { + padding: 0; + padding-left: 16px; + box-sizing: border-box; + height: 100%; + width: 100%; +} +.mdl-demo .mdl-layout__tab-bar .mdl-layout__tab { + height: 64px; + line-height: 64px; +} +.mdl-demo .mdl-layout__tab-bar .mdl-layout__tab.is-active::after { + background-color: white; + height: 4px; +} +.mdl-demo main > .mdl-layout__tab-panel { + padding: 8px; + padding-top: 48px; +} +.mdl-demo .mdl-card { + height: auto; + display: -webkit-flex; + display: -ms-flexbox; + display: flex; + -webkit-flex-direction: column; + -ms-flex-direction: column; + flex-direction: column; +} +.mdl-demo .mdl-card > * { + height: auto; +} +.mdl-demo .mdl-card .mdl-card__supporting-text { + margin: 40px; + -webkit-flex-grow: 1; + -ms-flex-positive: 1; + flex-grow: 1; + padding: 0; + color: inherit; + width: calc(100% - 80px); +} +.mdl-demo.mdl-demo .mdl-card__supporting-text h4 { + margin-top: 0; + margin-bottom: 20px; +} +.mdl-demo .mdl-card__actions { + margin: 0; + padding: 4px 40px; + color: inherit; +} +.mdl-demo .mdl-card__actions a { + color: #00BCD4; + margin: 0; +} +.mdl-demo .mdl-card__actions a:hover, +.mdl-demo .mdl-card__actions a:active { + color: inherit; + background-color: transparent; +} +.mdl-demo 
.mdl-card__supporting-text + .mdl-card__actions { + border-top: 1px solid rgba(0, 0, 0, 0.12); +} +.mdl-demo #add { + position: absolute; + right: 40px; + top: 36px; + z-index: 999; +} + +.mdl-demo .mdl-layout__content section:not(:last-of-type) { + position: relative; + margin-bottom: 48px; +} +.mdl-demo section.section--center { + max-width: 860px; +} +.mdl-demo #features section.section--center { + max-width: 620px; +} +.mdl-demo section > header{ + display: -webkit-flex; + display: -ms-flexbox; + display: flex; + -webkit-align-items: center; + -ms-flex-align: center; + align-items: center; + -webkit-justify-content: center; + -ms-flex-pack: center; + justify-content: center; +} +.mdl-demo section > .section__play-btn { + min-height: 200px; +} +.mdl-demo section > header > .material-icons { + font-size: 3rem; +} +.mdl-demo section > button { + position: absolute; + z-index: 99; + top: 8px; + right: 8px; +} +.mdl-demo section .section__circle { + display: -webkit-flex; + display: -ms-flexbox; + display: flex; + -webkit-align-items: center; + -ms-flex-align: center; + align-items: center; + -webkit-justify-content: flex-start; + -ms-flex-pack: start; + justify-content: flex-start; + -webkit-flex-grow: 0; + -ms-flex-positive: 0; + flex-grow: 0; + -webkit-flex-shrink: 1; + -ms-flex-negative: 1; + flex-shrink: 1; +} +.mdl-demo section .section__text { + -webkit-flex-grow: 1; + -ms-flex-positive: 1; + flex-grow: 1; + -webkit-flex-shrink: 0; + -ms-flex-negative: 0; + flex-shrink: 0; + padding-top: 8px; +} +.mdl-demo section .section__text h5 { + font-size: inherit; + margin: 0; + margin-bottom: 0.5em; +} +.mdl-demo section .section__text a { + text-decoration: none; +} +.mdl-demo section .section__circle-container > .section__circle-container__circle { + width: 64px; + height: 64px; + border-radius: 32px; + margin: 8px 0; +} +.mdl-demo section.section--footer .section__circle--big { + width: 100px; + height: 100px; + border-radius: 50px; + margin: 8px 32px; +} 
+.mdl-demo .is-small-screen section.section--footer .section__circle--big { + width: 50px; + height: 50px; + border-radius: 25px; + margin: 8px 16px; +} +.mdl-demo section.section--footer { + padding: 64px 0; + margin: 0 -8px -8px -8px; +} +.mdl-demo section.section--center .section__text:not(:last-child) { + border-bottom: 1px solid rgba(0,0,0,.13); +} +.mdl-demo .mdl-card .mdl-card__supporting-text > h3:first-child { + margin-bottom: 24px; +} +.mdl-demo .mdl-layout__tab-panel:not(#overview) { + background-color: white; +} +.mdl-demo #features section { + margin-bottom: 72px; +} +.mdl-demo #features h4, #features h5 { + margin-bottom: 16px; +} +.mdl-demo .toc { + border-left: 4px solid #C1EEF4; + margin: 24px; + padding: 0; + padding-left: 8px; + display: -webkit-flex; + display: -ms-flexbox; + display: flex; + -webkit-flex-direction: column; + -ms-flex-direction: column; + flex-direction: column; +} +.mdl-demo .toc h4 { + font-size: 0.9rem; + margin-top: 0; +} +.mdl-demo .toc a { + color: #4DD0E1; + text-decoration: none; + font-size: 16px; + line-height: 28px; + display: block; +} +.mdl-demo .mdl-menu__container { + z-index: 99; +} diff --git a/samples/nvidia-resnet/components/webapp/src/static/styles/material.deep_purple-pink.min.css b/samples/nvidia-resnet/components/webapp/src/static/styles/material.deep_purple-pink.min.css new file mode 100644 index 000000000000..818190a37bbc --- /dev/null +++ b/samples/nvidia-resnet/components/webapp/src/static/styles/material.deep_purple-pink.min.css @@ -0,0 +1,8 @@ +/** + * material-design-lite - Material Design Components in CSS, JS and HTML + * @version v1.3.0 + * @license Apache-2.0 + * @copyright 2015 Google, Inc. 
+ * @link https://github.com/google/material-design-lite + */ +@charset "UTF-8";html{color:rgba(0,0,0,.87)}::-moz-selection{background:#b3d4fc;text-shadow:none}::selection{background:#b3d4fc;text-shadow:none}hr{display:block;height:1px;border:0;border-top:1px solid #ccc;margin:1em 0;padding:0}audio,canvas,iframe,img,svg,video{vertical-align:middle}fieldset{border:0;margin:0;padding:0}textarea{resize:vertical}.browserupgrade{margin:.2em 0;background:#ccc;color:#000;padding:.2em 0}.hidden{display:none!important}.visuallyhidden{border:0;clip:rect(0 0 0 0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.clearfix:before,.clearfix:after{content:" ";display:table}.clearfix:after{clear:both}@media print{*,*:before,*:after,*:first-letter{background:transparent!important;color:#000!important;box-shadow:none!important}a,a:visited{text-decoration:underline}a[href]:after{content:" (" attr(href)")"}abbr[title]:after{content:" (" attr(title)")"}a[href^="#"]:after,a[href^="javascript:"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100%!important}p,h2,h3{orphans:3;widows:3}h2,h3{page-break-after:avoid}}a,.mdl-accordion,.mdl-button,.mdl-card,.mdl-checkbox,.mdl-dropdown-menu,.mdl-icon-toggle,.mdl-item,.mdl-radio,.mdl-slider,.mdl-switch,.mdl-tabs__tab{-webkit-tap-highlight-color:transparent;-webkit-tap-highlight-color:rgba(255,255,255,0)}html{width:100%;height:100%;-ms-touch-action:manipulation;touch-action:manipulation}body{width:100%;min-height:100%}main{display:block}*[hidden]{display:none!important}html,body{font-family:"Helvetica","Arial",sans-serif;font-size:14px;font-weight:400;line-height:20px}h1,h2,h3,h4,h5,h6,p{padding:0}h1 small,h2 small,h3 small,h4 small,h5 
small,h6 small{font-family:"Roboto","Helvetica","Arial",sans-serif;font-weight:400;line-height:1.35;letter-spacing:-.02em;opacity:.54;font-size:.6em}h1{font-size:56px;line-height:1.35;letter-spacing:-.02em;margin:24px 0}h1,h2{font-family:"Roboto","Helvetica","Arial",sans-serif;font-weight:400}h2{font-size:45px;line-height:48px}h2,h3{margin:24px 0}h3{font-size:34px;line-height:40px}h3,h4{font-family:"Roboto","Helvetica","Arial",sans-serif;font-weight:400}h4{font-size:24px;line-height:32px;-moz-osx-font-smoothing:grayscale;margin:24px 0 16px}h5{font-size:20px;font-weight:500;line-height:1;letter-spacing:.02em}h5,h6{font-family:"Roboto","Helvetica","Arial",sans-serif;margin:24px 0 16px}h6{font-size:16px;letter-spacing:.04em}h6,p{font-weight:400;line-height:24px}p{font-size:14px;letter-spacing:0;margin:0 0 16px}a{color:rgb(255,64,129);font-weight:500}blockquote{font-family:"Roboto","Helvetica","Arial",sans-serif;position:relative;font-size:24px;font-weight:300;font-style:italic;line-height:1.35;letter-spacing:.08em}blockquote:before{position:absolute;left:-.5em;content:'“'}blockquote:after{content:'”';margin-left:-.05em}mark{background-color:#f4ff81}dt{font-weight:700}address{font-size:12px;line-height:1;font-style:normal}address,ul,ol{font-weight:400;letter-spacing:0}ul,ol{font-size:14px;line-height:24px}.mdl-typography--display-4,.mdl-typography--display-4-color-contrast{font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:112px;font-weight:300;line-height:1;letter-spacing:-.04em}.mdl-typography--display-4-color-contrast{opacity:.54}.mdl-typography--display-3,.mdl-typography--display-3-color-contrast{font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:56px;font-weight:400;line-height:1.35;letter-spacing:-.02em}.mdl-typography--display-3-color-contrast{opacity:.54}.mdl-typography--display-2,.mdl-typography--display-2-color-contrast{font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:45px;font-weight:400;line-height:48px}.mdl-typography--dis
play-2-color-contrast{opacity:.54}.mdl-typography--display-1,.mdl-typography--display-1-color-contrast{font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:34px;font-weight:400;line-height:40px}.mdl-typography--display-1-color-contrast{opacity:.54}.mdl-typography--headline,.mdl-typography--headline-color-contrast{font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:24px;font-weight:400;line-height:32px;-moz-osx-font-smoothing:grayscale}.mdl-typography--headline-color-contrast{opacity:.87}.mdl-typography--title,.mdl-typography--title-color-contrast{font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:20px;font-weight:500;line-height:1;letter-spacing:.02em}.mdl-typography--title-color-contrast{opacity:.87}.mdl-typography--subhead,.mdl-typography--subhead-color-contrast{font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:16px;font-weight:400;line-height:24px;letter-spacing:.04em}.mdl-typography--subhead-color-contrast{opacity:.87}.mdl-typography--body-2,.mdl-typography--body-2-color-contrast{font-size:14px;font-weight:700;line-height:24px;letter-spacing:0}.mdl-typography--body-2-color-contrast{opacity:.87}.mdl-typography--body-1,.mdl-typography--body-1-color-contrast{font-size:14px;font-weight:400;line-height:24px;letter-spacing:0}.mdl-typography--body-1-color-contrast{opacity:.87}.mdl-typography--body-2-force-preferred-font,.mdl-typography--body-2-force-preferred-font-color-contrast{font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:14px;font-weight:500;line-height:24px;letter-spacing:0}.mdl-typography--body-2-force-preferred-font-color-contrast{opacity:.87}.mdl-typography--body-1-force-preferred-font,.mdl-typography--body-1-force-preferred-font-color-contrast{font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:14px;font-weight:400;line-height:24px;letter-spacing:0}.mdl-typography--body-1-force-preferred-font-color-contrast{opacity:.87}.mdl-typography--caption,.mdl-typography--caption-force-preferred-font{font-
size:12px;font-weight:400;line-height:1;letter-spacing:0}.mdl-typography--caption-force-preferred-font{font-family:"Roboto","Helvetica","Arial",sans-serif}.mdl-typography--caption-color-contrast,.mdl-typography--caption-force-preferred-font-color-contrast{font-size:12px;font-weight:400;line-height:1;letter-spacing:0;opacity:.54}.mdl-typography--caption-force-preferred-font-color-contrast,.mdl-typography--menu{font-family:"Roboto","Helvetica","Arial",sans-serif}.mdl-typography--menu{font-size:14px;font-weight:500;line-height:1;letter-spacing:0}.mdl-typography--menu-color-contrast{opacity:.87}.mdl-typography--menu-color-contrast,.mdl-typography--button,.mdl-typography--button-color-contrast{font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:14px;font-weight:500;line-height:1;letter-spacing:0}.mdl-typography--button,.mdl-typography--button-color-contrast{text-transform:uppercase}.mdl-typography--button-color-contrast{opacity:.87}.mdl-typography--text-left{text-align:left}.mdl-typography--text-right{text-align:right}.mdl-typography--text-center{text-align:center}.mdl-typography--text-justify{text-align:justify}.mdl-typography--text-nowrap{white-space:nowrap}.mdl-typography--text-lowercase{text-transform:lowercase}.mdl-typography--text-uppercase{text-transform:uppercase}.mdl-typography--text-capitalize{text-transform:capitalize}.mdl-typography--font-thin{font-weight:200!important}.mdl-typography--font-light{font-weight:300!important}.mdl-typography--font-regular{font-weight:400!important}.mdl-typography--font-medium{font-weight:500!important}.mdl-typography--font-bold{font-weight:700!important}.mdl-typography--font-black{font-weight:900!important}.material-icons{font-family:'Material 
Icons';font-weight:400;font-style:normal;font-size:24px;line-height:1;letter-spacing:normal;text-transform:none;display:inline-block;word-wrap:normal;-moz-font-feature-settings:'liga';font-feature-settings:'liga';-webkit-font-feature-settings:'liga';-webkit-font-smoothing:antialiased}.mdl-color-text--red{color:#f44336 !important}.mdl-color--red{background-color:#f44336 !important}.mdl-color-text--red-50{color:#ffebee !important}.mdl-color--red-50{background-color:#ffebee !important}.mdl-color-text--red-100{color:#ffcdd2 !important}.mdl-color--red-100{background-color:#ffcdd2 !important}.mdl-color-text--red-200{color:#ef9a9a !important}.mdl-color--red-200{background-color:#ef9a9a !important}.mdl-color-text--red-300{color:#e57373 !important}.mdl-color--red-300{background-color:#e57373 !important}.mdl-color-text--red-400{color:#ef5350 !important}.mdl-color--red-400{background-color:#ef5350 !important}.mdl-color-text--red-500{color:#f44336 !important}.mdl-color--red-500{background-color:#f44336 !important}.mdl-color-text--red-600{color:#e53935 !important}.mdl-color--red-600{background-color:#e53935 !important}.mdl-color-text--red-700{color:#d32f2f !important}.mdl-color--red-700{background-color:#d32f2f !important}.mdl-color-text--red-800{color:#c62828 !important}.mdl-color--red-800{background-color:#c62828 !important}.mdl-color-text--red-900{color:#b71c1c !important}.mdl-color--red-900{background-color:#b71c1c !important}.mdl-color-text--red-A100{color:#ff8a80 !important}.mdl-color--red-A100{background-color:#ff8a80 !important}.mdl-color-text--red-A200{color:#ff5252 !important}.mdl-color--red-A200{background-color:#ff5252 !important}.mdl-color-text--red-A400{color:#ff1744 !important}.mdl-color--red-A400{background-color:#ff1744 !important}.mdl-color-text--red-A700{color:#d50000 !important}.mdl-color--red-A700{background-color:#d50000 !important}.mdl-color-text--pink{color:#e91e63 !important}.mdl-color--pink{background-color:#e91e63 
!important}.mdl-color-text--pink-50{color:#fce4ec !important}.mdl-color--pink-50{background-color:#fce4ec !important}.mdl-color-text--pink-100{color:#f8bbd0 !important}.mdl-color--pink-100{background-color:#f8bbd0 !important}.mdl-color-text--pink-200{color:#f48fb1 !important}.mdl-color--pink-200{background-color:#f48fb1 !important}.mdl-color-text--pink-300{color:#f06292 !important}.mdl-color--pink-300{background-color:#f06292 !important}.mdl-color-text--pink-400{color:#ec407a !important}.mdl-color--pink-400{background-color:#ec407a !important}.mdl-color-text--pink-500{color:#e91e63 !important}.mdl-color--pink-500{background-color:#e91e63 !important}.mdl-color-text--pink-600{color:#d81b60 !important}.mdl-color--pink-600{background-color:#d81b60 !important}.mdl-color-text--pink-700{color:#c2185b !important}.mdl-color--pink-700{background-color:#c2185b !important}.mdl-color-text--pink-800{color:#ad1457 !important}.mdl-color--pink-800{background-color:#ad1457 !important}.mdl-color-text--pink-900{color:#880e4f !important}.mdl-color--pink-900{background-color:#880e4f !important}.mdl-color-text--pink-A100{color:#ff80ab !important}.mdl-color--pink-A100{background-color:#ff80ab !important}.mdl-color-text--pink-A200{color:#ff4081 !important}.mdl-color--pink-A200{background-color:#ff4081 !important}.mdl-color-text--pink-A400{color:#f50057 !important}.mdl-color--pink-A400{background-color:#f50057 !important}.mdl-color-text--pink-A700{color:#c51162 !important}.mdl-color--pink-A700{background-color:#c51162 !important}.mdl-color-text--purple{color:#9c27b0 !important}.mdl-color--purple{background-color:#9c27b0 !important}.mdl-color-text--purple-50{color:#f3e5f5 !important}.mdl-color--purple-50{background-color:#f3e5f5 !important}.mdl-color-text--purple-100{color:#e1bee7 !important}.mdl-color--purple-100{background-color:#e1bee7 !important}.mdl-color-text--purple-200{color:#ce93d8 !important}.mdl-color--purple-200{background-color:#ce93d8 
!important}.mdl-color-text--purple-300{color:#ba68c8 !important}.mdl-color--purple-300{background-color:#ba68c8 !important}.mdl-color-text--purple-400{color:#ab47bc !important}.mdl-color--purple-400{background-color:#ab47bc !important}.mdl-color-text--purple-500{color:#9c27b0 !important}.mdl-color--purple-500{background-color:#9c27b0 !important}.mdl-color-text--purple-600{color:#8e24aa !important}.mdl-color--purple-600{background-color:#8e24aa !important}.mdl-color-text--purple-700{color:#7b1fa2 !important}.mdl-color--purple-700{background-color:#7b1fa2 !important}.mdl-color-text--purple-800{color:#6a1b9a !important}.mdl-color--purple-800{background-color:#6a1b9a !important}.mdl-color-text--purple-900{color:#4a148c !important}.mdl-color--purple-900{background-color:#4a148c !important}.mdl-color-text--purple-A100{color:#ea80fc !important}.mdl-color--purple-A100{background-color:#ea80fc !important}.mdl-color-text--purple-A200{color:#e040fb !important}.mdl-color--purple-A200{background-color:#e040fb !important}.mdl-color-text--purple-A400{color:#d500f9 !important}.mdl-color--purple-A400{background-color:#d500f9 !important}.mdl-color-text--purple-A700{color:#a0f !important}.mdl-color--purple-A700{background-color:#a0f !important}.mdl-color-text--deep-purple{color:#673ab7 !important}.mdl-color--deep-purple{background-color:#673ab7 !important}.mdl-color-text--deep-purple-50{color:#ede7f6 !important}.mdl-color--deep-purple-50{background-color:#ede7f6 !important}.mdl-color-text--deep-purple-100{color:#d1c4e9 !important}.mdl-color--deep-purple-100{background-color:#d1c4e9 !important}.mdl-color-text--deep-purple-200{color:#b39ddb !important}.mdl-color--deep-purple-200{background-color:#b39ddb !important}.mdl-color-text--deep-purple-300{color:#9575cd !important}.mdl-color--deep-purple-300{background-color:#9575cd !important}.mdl-color-text--deep-purple-400{color:#7e57c2 !important}.mdl-color--deep-purple-400{background-color:#7e57c2 
!important}.mdl-color-text--deep-purple-500{color:#673ab7 !important}.mdl-color--deep-purple-500{background-color:#673ab7 !important}.mdl-color-text--deep-purple-600{color:#5e35b1 !important}.mdl-color--deep-purple-600{background-color:#5e35b1 !important}.mdl-color-text--deep-purple-700{color:#512da8 !important}.mdl-color--deep-purple-700{background-color:#512da8 !important}.mdl-color-text--deep-purple-800{color:#4527a0 !important}.mdl-color--deep-purple-800{background-color:#4527a0 !important}.mdl-color-text--deep-purple-900{color:#311b92 !important}.mdl-color--deep-purple-900{background-color:#311b92 !important}.mdl-color-text--deep-purple-A100{color:#b388ff !important}.mdl-color--deep-purple-A100{background-color:#b388ff !important}.mdl-color-text--deep-purple-A200{color:#7c4dff !important}.mdl-color--deep-purple-A200{background-color:#7c4dff !important}.mdl-color-text--deep-purple-A400{color:#651fff !important}.mdl-color--deep-purple-A400{background-color:#651fff !important}.mdl-color-text--deep-purple-A700{color:#6200ea !important}.mdl-color--deep-purple-A700{background-color:#6200ea !important}.mdl-color-text--indigo{color:#3f51b5 !important}.mdl-color--indigo{background-color:#3f51b5 !important}.mdl-color-text--indigo-50{color:#e8eaf6 !important}.mdl-color--indigo-50{background-color:#e8eaf6 !important}.mdl-color-text--indigo-100{color:#c5cae9 !important}.mdl-color--indigo-100{background-color:#c5cae9 !important}.mdl-color-text--indigo-200{color:#9fa8da !important}.mdl-color--indigo-200{background-color:#9fa8da !important}.mdl-color-text--indigo-300{color:#7986cb !important}.mdl-color--indigo-300{background-color:#7986cb !important}.mdl-color-text--indigo-400{color:#5c6bc0 !important}.mdl-color--indigo-400{background-color:#5c6bc0 !important}.mdl-color-text--indigo-500{color:#3f51b5 !important}.mdl-color--indigo-500{background-color:#3f51b5 !important}.mdl-color-text--indigo-600{color:#3949ab !important}.mdl-color--indigo-600{background-color:#3949ab 
!important}.mdl-color-text--indigo-700{color:#303f9f !important}.mdl-color--indigo-700{background-color:#303f9f !important}.mdl-color-text--indigo-800{color:#283593 !important}.mdl-color--indigo-800{background-color:#283593 !important}.mdl-color-text--indigo-900{color:#1a237e !important}.mdl-color--indigo-900{background-color:#1a237e !important}.mdl-color-text--indigo-A100{color:#8c9eff !important}.mdl-color--indigo-A100{background-color:#8c9eff !important}.mdl-color-text--indigo-A200{color:#536dfe !important}.mdl-color--indigo-A200{background-color:#536dfe !important}.mdl-color-text--indigo-A400{color:#3d5afe !important}.mdl-color--indigo-A400{background-color:#3d5afe !important}.mdl-color-text--indigo-A700{color:#304ffe !important}.mdl-color--indigo-A700{background-color:#304ffe !important}.mdl-color-text--blue{color:#2196f3 !important}.mdl-color--blue{background-color:#2196f3 !important}.mdl-color-text--blue-50{color:#e3f2fd !important}.mdl-color--blue-50{background-color:#e3f2fd !important}.mdl-color-text--blue-100{color:#bbdefb !important}.mdl-color--blue-100{background-color:#bbdefb !important}.mdl-color-text--blue-200{color:#90caf9 !important}.mdl-color--blue-200{background-color:#90caf9 !important}.mdl-color-text--blue-300{color:#64b5f6 !important}.mdl-color--blue-300{background-color:#64b5f6 !important}.mdl-color-text--blue-400{color:#42a5f5 !important}.mdl-color--blue-400{background-color:#42a5f5 !important}.mdl-color-text--blue-500{color:#2196f3 !important}.mdl-color--blue-500{background-color:#2196f3 !important}.mdl-color-text--blue-600{color:#1e88e5 !important}.mdl-color--blue-600{background-color:#1e88e5 !important}.mdl-color-text--blue-700{color:#1976d2 !important}.mdl-color--blue-700{background-color:#1976d2 !important}.mdl-color-text--blue-800{color:#1565c0 !important}.mdl-color--blue-800{background-color:#1565c0 !important}.mdl-color-text--blue-900{color:#0d47a1 !important}.mdl-color--blue-900{background-color:#0d47a1 
!important}.mdl-color-text--blue-A100{color:#82b1ff !important}.mdl-color--blue-A100{background-color:#82b1ff !important}.mdl-color-text--blue-A200{color:#448aff !important}.mdl-color--blue-A200{background-color:#448aff !important}.mdl-color-text--blue-A400{color:#2979ff !important}.mdl-color--blue-A400{background-color:#2979ff !important}.mdl-color-text--blue-A700{color:#2962ff !important}.mdl-color--blue-A700{background-color:#2962ff !important}.mdl-color-text--light-blue{color:#03a9f4 !important}.mdl-color--light-blue{background-color:#03a9f4 !important}.mdl-color-text--light-blue-50{color:#e1f5fe !important}.mdl-color--light-blue-50{background-color:#e1f5fe !important}.mdl-color-text--light-blue-100{color:#b3e5fc !important}.mdl-color--light-blue-100{background-color:#b3e5fc !important}.mdl-color-text--light-blue-200{color:#81d4fa !important}.mdl-color--light-blue-200{background-color:#81d4fa !important}.mdl-color-text--light-blue-300{color:#4fc3f7 !important}.mdl-color--light-blue-300{background-color:#4fc3f7 !important}.mdl-color-text--light-blue-400{color:#29b6f6 !important}.mdl-color--light-blue-400{background-color:#29b6f6 !important}.mdl-color-text--light-blue-500{color:#03a9f4 !important}.mdl-color--light-blue-500{background-color:#03a9f4 !important}.mdl-color-text--light-blue-600{color:#039be5 !important}.mdl-color--light-blue-600{background-color:#039be5 !important}.mdl-color-text--light-blue-700{color:#0288d1 !important}.mdl-color--light-blue-700{background-color:#0288d1 !important}.mdl-color-text--light-blue-800{color:#0277bd !important}.mdl-color--light-blue-800{background-color:#0277bd !important}.mdl-color-text--light-blue-900{color:#01579b !important}.mdl-color--light-blue-900{background-color:#01579b !important}.mdl-color-text--light-blue-A100{color:#80d8ff !important}.mdl-color--light-blue-A100{background-color:#80d8ff !important}.mdl-color-text--light-blue-A200{color:#40c4ff !important}.mdl-color--light-blue-A200{background-color:#40c4ff 
!important}.mdl-color-text--light-blue-A400{color:#00b0ff !important}.mdl-color--light-blue-A400{background-color:#00b0ff !important}.mdl-color-text--light-blue-A700{color:#0091ea !important}.mdl-color--light-blue-A700{background-color:#0091ea !important}.mdl-color-text--cyan{color:#00bcd4 !important}.mdl-color--cyan{background-color:#00bcd4 !important}.mdl-color-text--cyan-50{color:#e0f7fa !important}.mdl-color--cyan-50{background-color:#e0f7fa !important}.mdl-color-text--cyan-100{color:#b2ebf2 !important}.mdl-color--cyan-100{background-color:#b2ebf2 !important}.mdl-color-text--cyan-200{color:#80deea !important}.mdl-color--cyan-200{background-color:#80deea !important}.mdl-color-text--cyan-300{color:#4dd0e1 !important}.mdl-color--cyan-300{background-color:#4dd0e1 !important}.mdl-color-text--cyan-400{color:#26c6da !important}.mdl-color--cyan-400{background-color:#26c6da !important}.mdl-color-text--cyan-500{color:#00bcd4 !important}.mdl-color--cyan-500{background-color:#00bcd4 !important}.mdl-color-text--cyan-600{color:#00acc1 !important}.mdl-color--cyan-600{background-color:#00acc1 !important}.mdl-color-text--cyan-700{color:#0097a7 !important}.mdl-color--cyan-700{background-color:#0097a7 !important}.mdl-color-text--cyan-800{color:#00838f !important}.mdl-color--cyan-800{background-color:#00838f !important}.mdl-color-text--cyan-900{color:#006064 !important}.mdl-color--cyan-900{background-color:#006064 !important}.mdl-color-text--cyan-A100{color:#84ffff !important}.mdl-color--cyan-A100{background-color:#84ffff !important}.mdl-color-text--cyan-A200{color:#18ffff !important}.mdl-color--cyan-A200{background-color:#18ffff !important}.mdl-color-text--cyan-A400{color:#00e5ff !important}.mdl-color--cyan-A400{background-color:#00e5ff !important}.mdl-color-text--cyan-A700{color:#00b8d4 !important}.mdl-color--cyan-A700{background-color:#00b8d4 !important}.mdl-color-text--teal{color:#009688 !important}.mdl-color--teal{background-color:#009688 
!important}.mdl-color-text--teal-50{color:#e0f2f1 !important}.mdl-color--teal-50{background-color:#e0f2f1 !important}.mdl-color-text--teal-100{color:#b2dfdb !important}.mdl-color--teal-100{background-color:#b2dfdb !important}.mdl-color-text--teal-200{color:#80cbc4 !important}.mdl-color--teal-200{background-color:#80cbc4 !important}.mdl-color-text--teal-300{color:#4db6ac !important}.mdl-color--teal-300{background-color:#4db6ac !important}.mdl-color-text--teal-400{color:#26a69a !important}.mdl-color--teal-400{background-color:#26a69a !important}.mdl-color-text--teal-500{color:#009688 !important}.mdl-color--teal-500{background-color:#009688 !important}.mdl-color-text--teal-600{color:#00897b !important}.mdl-color--teal-600{background-color:#00897b !important}.mdl-color-text--teal-700{color:#00796b !important}.mdl-color--teal-700{background-color:#00796b !important}.mdl-color-text--teal-800{color:#00695c !important}.mdl-color--teal-800{background-color:#00695c !important}.mdl-color-text--teal-900{color:#004d40 !important}.mdl-color--teal-900{background-color:#004d40 !important}.mdl-color-text--teal-A100{color:#a7ffeb !important}.mdl-color--teal-A100{background-color:#a7ffeb !important}.mdl-color-text--teal-A200{color:#64ffda !important}.mdl-color--teal-A200{background-color:#64ffda !important}.mdl-color-text--teal-A400{color:#1de9b6 !important}.mdl-color--teal-A400{background-color:#1de9b6 !important}.mdl-color-text--teal-A700{color:#00bfa5 !important}.mdl-color--teal-A700{background-color:#00bfa5 !important}.mdl-color-text--green{color:#4caf50 !important}.mdl-color--green{background-color:#4caf50 !important}.mdl-color-text--green-50{color:#e8f5e9 !important}.mdl-color--green-50{background-color:#e8f5e9 !important}.mdl-color-text--green-100{color:#c8e6c9 !important}.mdl-color--green-100{background-color:#c8e6c9 !important}.mdl-color-text--green-200{color:#a5d6a7 !important}.mdl-color--green-200{background-color:#a5d6a7 !important}.mdl-color-text--green-300{color:#81c784 
!important}.mdl-color--green-300{background-color:#81c784 !important}.mdl-color-text--green-400{color:#66bb6a !important}.mdl-color--green-400{background-color:#66bb6a !important}.mdl-color-text--green-500{color:#4caf50 !important}.mdl-color--green-500{background-color:#4caf50 !important}.mdl-color-text--green-600{color:#43a047 !important}.mdl-color--green-600{background-color:#43a047 !important}.mdl-color-text--green-700{color:#388e3c !important}.mdl-color--green-700{background-color:#388e3c !important}.mdl-color-text--green-800{color:#2e7d32 !important}.mdl-color--green-800{background-color:#2e7d32 !important}.mdl-color-text--green-900{color:#1b5e20 !important}.mdl-color--green-900{background-color:#1b5e20 !important}.mdl-color-text--green-A100{color:#b9f6ca !important}.mdl-color--green-A100{background-color:#b9f6ca !important}.mdl-color-text--green-A200{color:#69f0ae !important}.mdl-color--green-A200{background-color:#69f0ae !important}.mdl-color-text--green-A400{color:#00e676 !important}.mdl-color--green-A400{background-color:#00e676 !important}.mdl-color-text--green-A700{color:#00c853 !important}.mdl-color--green-A700{background-color:#00c853 !important}.mdl-color-text--light-green{color:#8bc34a !important}.mdl-color--light-green{background-color:#8bc34a !important}.mdl-color-text--light-green-50{color:#f1f8e9 !important}.mdl-color--light-green-50{background-color:#f1f8e9 !important}.mdl-color-text--light-green-100{color:#dcedc8 !important}.mdl-color--light-green-100{background-color:#dcedc8 !important}.mdl-color-text--light-green-200{color:#c5e1a5 !important}.mdl-color--light-green-200{background-color:#c5e1a5 !important}.mdl-color-text--light-green-300{color:#aed581 !important}.mdl-color--light-green-300{background-color:#aed581 !important}.mdl-color-text--light-green-400{color:#9ccc65 !important}.mdl-color--light-green-400{background-color:#9ccc65 !important}.mdl-color-text--light-green-500{color:#8bc34a 
!important}.mdl-color--light-green-500{background-color:#8bc34a !important}.mdl-color-text--light-green-600{color:#7cb342 !important}.mdl-color--light-green-600{background-color:#7cb342 !important}.mdl-color-text--light-green-700{color:#689f38 !important}.mdl-color--light-green-700{background-color:#689f38 !important}.mdl-color-text--light-green-800{color:#558b2f !important}.mdl-color--light-green-800{background-color:#558b2f !important}.mdl-color-text--light-green-900{color:#33691e !important}.mdl-color--light-green-900{background-color:#33691e !important}.mdl-color-text--light-green-A100{color:#ccff90 !important}.mdl-color--light-green-A100{background-color:#ccff90 !important}.mdl-color-text--light-green-A200{color:#b2ff59 !important}.mdl-color--light-green-A200{background-color:#b2ff59 !important}.mdl-color-text--light-green-A400{color:#76ff03 !important}.mdl-color--light-green-A400{background-color:#76ff03 !important}.mdl-color-text--light-green-A700{color:#64dd17 !important}.mdl-color--light-green-A700{background-color:#64dd17 !important}.mdl-color-text--lime{color:#cddc39 !important}.mdl-color--lime{background-color:#cddc39 !important}.mdl-color-text--lime-50{color:#f9fbe7 !important}.mdl-color--lime-50{background-color:#f9fbe7 !important}.mdl-color-text--lime-100{color:#f0f4c3 !important}.mdl-color--lime-100{background-color:#f0f4c3 !important}.mdl-color-text--lime-200{color:#e6ee9c !important}.mdl-color--lime-200{background-color:#e6ee9c !important}.mdl-color-text--lime-300{color:#dce775 !important}.mdl-color--lime-300{background-color:#dce775 !important}.mdl-color-text--lime-400{color:#d4e157 !important}.mdl-color--lime-400{background-color:#d4e157 !important}.mdl-color-text--lime-500{color:#cddc39 !important}.mdl-color--lime-500{background-color:#cddc39 !important}.mdl-color-text--lime-600{color:#c0ca33 !important}.mdl-color--lime-600{background-color:#c0ca33 !important}.mdl-color-text--lime-700{color:#afb42b 
!important}.mdl-color--lime-700{background-color:#afb42b !important}.mdl-color-text--lime-800{color:#9e9d24 !important}.mdl-color--lime-800{background-color:#9e9d24 !important}.mdl-color-text--lime-900{color:#827717 !important}.mdl-color--lime-900{background-color:#827717 !important}.mdl-color-text--lime-A100{color:#f4ff81 !important}.mdl-color--lime-A100{background-color:#f4ff81 !important}.mdl-color-text--lime-A200{color:#eeff41 !important}.mdl-color--lime-A200{background-color:#eeff41 !important}.mdl-color-text--lime-A400{color:#c6ff00 !important}.mdl-color--lime-A400{background-color:#c6ff00 !important}.mdl-color-text--lime-A700{color:#aeea00 !important}.mdl-color--lime-A700{background-color:#aeea00 !important}.mdl-color-text--yellow{color:#ffeb3b !important}.mdl-color--yellow{background-color:#ffeb3b !important}.mdl-color-text--yellow-50{color:#fffde7 !important}.mdl-color--yellow-50{background-color:#fffde7 !important}.mdl-color-text--yellow-100{color:#fff9c4 !important}.mdl-color--yellow-100{background-color:#fff9c4 !important}.mdl-color-text--yellow-200{color:#fff59d !important}.mdl-color--yellow-200{background-color:#fff59d !important}.mdl-color-text--yellow-300{color:#fff176 !important}.mdl-color--yellow-300{background-color:#fff176 !important}.mdl-color-text--yellow-400{color:#ffee58 !important}.mdl-color--yellow-400{background-color:#ffee58 !important}.mdl-color-text--yellow-500{color:#ffeb3b !important}.mdl-color--yellow-500{background-color:#ffeb3b !important}.mdl-color-text--yellow-600{color:#fdd835 !important}.mdl-color--yellow-600{background-color:#fdd835 !important}.mdl-color-text--yellow-700{color:#fbc02d !important}.mdl-color--yellow-700{background-color:#fbc02d !important}.mdl-color-text--yellow-800{color:#f9a825 !important}.mdl-color--yellow-800{background-color:#f9a825 !important}.mdl-color-text--yellow-900{color:#f57f17 !important}.mdl-color--yellow-900{background-color:#f57f17 !important}.mdl-color-text--yellow-A100{color:#ffff8d 
!important}.mdl-color--yellow-A100{background-color:#ffff8d !important}.mdl-color-text--yellow-A200{color:#ff0 !important}.mdl-color--yellow-A200{background-color:#ff0 !important}.mdl-color-text--yellow-A400{color:#ffea00 !important}.mdl-color--yellow-A400{background-color:#ffea00 !important}.mdl-color-text--yellow-A700{color:#ffd600 !important}.mdl-color--yellow-A700{background-color:#ffd600 !important}.mdl-color-text--amber{color:#ffc107 !important}.mdl-color--amber{background-color:#ffc107 !important}.mdl-color-text--amber-50{color:#fff8e1 !important}.mdl-color--amber-50{background-color:#fff8e1 !important}.mdl-color-text--amber-100{color:#ffecb3 !important}.mdl-color--amber-100{background-color:#ffecb3 !important}.mdl-color-text--amber-200{color:#ffe082 !important}.mdl-color--amber-200{background-color:#ffe082 !important}.mdl-color-text--amber-300{color:#ffd54f !important}.mdl-color--amber-300{background-color:#ffd54f !important}.mdl-color-text--amber-400{color:#ffca28 !important}.mdl-color--amber-400{background-color:#ffca28 !important}.mdl-color-text--amber-500{color:#ffc107 !important}.mdl-color--amber-500{background-color:#ffc107 !important}.mdl-color-text--amber-600{color:#ffb300 !important}.mdl-color--amber-600{background-color:#ffb300 !important}.mdl-color-text--amber-700{color:#ffa000 !important}.mdl-color--amber-700{background-color:#ffa000 !important}.mdl-color-text--amber-800{color:#ff8f00 !important}.mdl-color--amber-800{background-color:#ff8f00 !important}.mdl-color-text--amber-900{color:#ff6f00 !important}.mdl-color--amber-900{background-color:#ff6f00 !important}.mdl-color-text--amber-A100{color:#ffe57f !important}.mdl-color--amber-A100{background-color:#ffe57f !important}.mdl-color-text--amber-A200{color:#ffd740 !important}.mdl-color--amber-A200{background-color:#ffd740 !important}.mdl-color-text--amber-A400{color:#ffc400 !important}.mdl-color--amber-A400{background-color:#ffc400 !important}.mdl-color-text--amber-A700{color:#ffab00 
!important}.mdl-color--amber-A700{background-color:#ffab00 !important}.mdl-color-text--orange{color:#ff9800 !important}.mdl-color--orange{background-color:#ff9800 !important}.mdl-color-text--orange-50{color:#fff3e0 !important}.mdl-color--orange-50{background-color:#fff3e0 !important}.mdl-color-text--orange-100{color:#ffe0b2 !important}.mdl-color--orange-100{background-color:#ffe0b2 !important}.mdl-color-text--orange-200{color:#ffcc80 !important}.mdl-color--orange-200{background-color:#ffcc80 !important}.mdl-color-text--orange-300{color:#ffb74d !important}.mdl-color--orange-300{background-color:#ffb74d !important}.mdl-color-text--orange-400{color:#ffa726 !important}.mdl-color--orange-400{background-color:#ffa726 !important}.mdl-color-text--orange-500{color:#ff9800 !important}.mdl-color--orange-500{background-color:#ff9800 !important}.mdl-color-text--orange-600{color:#fb8c00 !important}.mdl-color--orange-600{background-color:#fb8c00 !important}.mdl-color-text--orange-700{color:#f57c00 !important}.mdl-color--orange-700{background-color:#f57c00 !important}.mdl-color-text--orange-800{color:#ef6c00 !important}.mdl-color--orange-800{background-color:#ef6c00 !important}.mdl-color-text--orange-900{color:#e65100 !important}.mdl-color--orange-900{background-color:#e65100 !important}.mdl-color-text--orange-A100{color:#ffd180 !important}.mdl-color--orange-A100{background-color:#ffd180 !important}.mdl-color-text--orange-A200{color:#ffab40 !important}.mdl-color--orange-A200{background-color:#ffab40 !important}.mdl-color-text--orange-A400{color:#ff9100 !important}.mdl-color--orange-A400{background-color:#ff9100 !important}.mdl-color-text--orange-A700{color:#ff6d00 !important}.mdl-color--orange-A700{background-color:#ff6d00 !important}.mdl-color-text--deep-orange{color:#ff5722 !important}.mdl-color--deep-orange{background-color:#ff5722 !important}.mdl-color-text--deep-orange-50{color:#fbe9e7 !important}.mdl-color--deep-orange-50{background-color:#fbe9e7 
!important}.mdl-color-text--deep-orange-100{color:#ffccbc !important}.mdl-color--deep-orange-100{background-color:#ffccbc !important}.mdl-color-text--deep-orange-200{color:#ffab91 !important}.mdl-color--deep-orange-200{background-color:#ffab91 !important}.mdl-color-text--deep-orange-300{color:#ff8a65 !important}.mdl-color--deep-orange-300{background-color:#ff8a65 !important}.mdl-color-text--deep-orange-400{color:#ff7043 !important}.mdl-color--deep-orange-400{background-color:#ff7043 !important}.mdl-color-text--deep-orange-500{color:#ff5722 !important}.mdl-color--deep-orange-500{background-color:#ff5722 !important}.mdl-color-text--deep-orange-600{color:#f4511e !important}.mdl-color--deep-orange-600{background-color:#f4511e !important}.mdl-color-text--deep-orange-700{color:#e64a19 !important}.mdl-color--deep-orange-700{background-color:#e64a19 !important}.mdl-color-text--deep-orange-800{color:#d84315 !important}.mdl-color--deep-orange-800{background-color:#d84315 !important}.mdl-color-text--deep-orange-900{color:#bf360c !important}.mdl-color--deep-orange-900{background-color:#bf360c !important}.mdl-color-text--deep-orange-A100{color:#ff9e80 !important}.mdl-color--deep-orange-A100{background-color:#ff9e80 !important}.mdl-color-text--deep-orange-A200{color:#ff6e40 !important}.mdl-color--deep-orange-A200{background-color:#ff6e40 !important}.mdl-color-text--deep-orange-A400{color:#ff3d00 !important}.mdl-color--deep-orange-A400{background-color:#ff3d00 !important}.mdl-color-text--deep-orange-A700{color:#dd2c00 !important}.mdl-color--deep-orange-A700{background-color:#dd2c00 !important}.mdl-color-text--brown{color:#795548 !important}.mdl-color--brown{background-color:#795548 !important}.mdl-color-text--brown-50{color:#efebe9 !important}.mdl-color--brown-50{background-color:#efebe9 !important}.mdl-color-text--brown-100{color:#d7ccc8 !important}.mdl-color--brown-100{background-color:#d7ccc8 !important}.mdl-color-text--brown-200{color:#bcaaa4 
!important}.mdl-color--brown-200{background-color:#bcaaa4 !important}.mdl-color-text--brown-300{color:#a1887f !important}.mdl-color--brown-300{background-color:#a1887f !important}.mdl-color-text--brown-400{color:#8d6e63 !important}.mdl-color--brown-400{background-color:#8d6e63 !important}.mdl-color-text--brown-500{color:#795548 !important}.mdl-color--brown-500{background-color:#795548 !important}.mdl-color-text--brown-600{color:#6d4c41 !important}.mdl-color--brown-600{background-color:#6d4c41 !important}.mdl-color-text--brown-700{color:#5d4037 !important}.mdl-color--brown-700{background-color:#5d4037 !important}.mdl-color-text--brown-800{color:#4e342e !important}.mdl-color--brown-800{background-color:#4e342e !important}.mdl-color-text--brown-900{color:#3e2723 !important}.mdl-color--brown-900{background-color:#3e2723 !important}.mdl-color-text--grey{color:#9e9e9e !important}.mdl-color--grey{background-color:#9e9e9e !important}.mdl-color-text--grey-50{color:#fafafa !important}.mdl-color--grey-50{background-color:#fafafa !important}.mdl-color-text--grey-100{color:#f5f5f5 !important}.mdl-color--grey-100{background-color:#f5f5f5 !important}.mdl-color-text--grey-200{color:#eee !important}.mdl-color--grey-200{background-color:#eee !important}.mdl-color-text--grey-300{color:#e0e0e0 !important}.mdl-color--grey-300{background-color:#e0e0e0 !important}.mdl-color-text--grey-400{color:#bdbdbd !important}.mdl-color--grey-400{background-color:#bdbdbd !important}.mdl-color-text--grey-500{color:#9e9e9e !important}.mdl-color--grey-500{background-color:#9e9e9e !important}.mdl-color-text--grey-600{color:#757575 !important}.mdl-color--grey-600{background-color:#757575 !important}.mdl-color-text--grey-700{color:#616161 !important}.mdl-color--grey-700{background-color:#616161 !important}.mdl-color-text--grey-800{color:#424242 !important}.mdl-color--grey-800{background-color:#424242 !important}.mdl-color-text--grey-900{color:#212121 !important}.mdl-color--grey-900{background-color:#212121 
!important}.mdl-color-text--blue-grey{color:#607d8b !important}.mdl-color--blue-grey{background-color:#607d8b !important}.mdl-color-text--blue-grey-50{color:#eceff1 !important}.mdl-color--blue-grey-50{background-color:#eceff1 !important}.mdl-color-text--blue-grey-100{color:#cfd8dc !important}.mdl-color--blue-grey-100{background-color:#cfd8dc !important}.mdl-color-text--blue-grey-200{color:#b0bec5 !important}.mdl-color--blue-grey-200{background-color:#b0bec5 !important}.mdl-color-text--blue-grey-300{color:#90a4ae !important}.mdl-color--blue-grey-300{background-color:#90a4ae !important}.mdl-color-text--blue-grey-400{color:#78909c !important}.mdl-color--blue-grey-400{background-color:#78909c !important}.mdl-color-text--blue-grey-500{color:#607d8b !important}.mdl-color--blue-grey-500{background-color:#607d8b !important}.mdl-color-text--blue-grey-600{color:#546e7a !important}.mdl-color--blue-grey-600{background-color:#546e7a !important}.mdl-color-text--blue-grey-700{color:#455a64 !important}.mdl-color--blue-grey-700{background-color:#455a64 !important}.mdl-color-text--blue-grey-800{color:#37474f !important}.mdl-color--blue-grey-800{background-color:#37474f !important}.mdl-color-text--blue-grey-900{color:#263238 !important}.mdl-color--blue-grey-900{background-color:#263238 !important}.mdl-color--black{background-color:#000 !important}.mdl-color-text--black{color:#000 !important}.mdl-color--white{background-color:#fff !important}.mdl-color-text--white{color:#fff 
!important}.mdl-color--primary{background-color:rgb(103,58,183)!important}.mdl-color--primary-contrast{background-color:rgb(255,255,255)!important}.mdl-color--primary-dark{background-color:rgb(81,45,168)!important}.mdl-color--accent{background-color:rgb(255,64,129)!important}.mdl-color--accent-contrast{background-color:rgb(255,255,255)!important}.mdl-color-text--primary{color:rgb(103,58,183)!important}.mdl-color-text--primary-contrast{color:rgb(255,255,255)!important}.mdl-color-text--primary-dark{color:rgb(81,45,168)!important}.mdl-color-text--accent{color:rgb(255,64,129)!important}.mdl-color-text--accent-contrast{color:rgb(255,255,255)!important}.mdl-ripple{background:#000;border-radius:50%;height:50px;left:0;opacity:0;pointer-events:none;position:absolute;top:0;-webkit-transform:translate(-50%,-50%);transform:translate(-50%,-50%);width:50px;overflow:hidden}.mdl-ripple.is-animating{transition:transform .3s cubic-bezier(0,0,.2,1),width .3s cubic-bezier(0,0,.2,1),height .3s cubic-bezier(0,0,.2,1),opacity .6s cubic-bezier(0,0,.2,1);transition:transform .3s cubic-bezier(0,0,.2,1),width .3s cubic-bezier(0,0,.2,1),height .3s cubic-bezier(0,0,.2,1),opacity .6s cubic-bezier(0,0,.2,1),-webkit-transform .3s 
cubic-bezier(0,0,.2,1)}.mdl-ripple.is-visible{opacity:.3}.mdl-animation--default,.mdl-animation--fast-out-slow-in{transition-timing-function:cubic-bezier(.4,0,.2,1)}.mdl-animation--linear-out-slow-in{transition-timing-function:cubic-bezier(0,0,.2,1)}.mdl-animation--fast-out-linear-in{transition-timing-function:cubic-bezier(.4,0,1,1)}.mdl-badge{position:relative;white-space:nowrap;margin-right:24px}.mdl-badge:not([data-badge]){margin-right:auto}.mdl-badge[data-badge]:after{content:attr(data-badge);display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-direction:row;-ms-flex-direction:row;flex-direction:row;-webkit-flex-wrap:wrap;-ms-flex-wrap:wrap;flex-wrap:wrap;-webkit-justify-content:center;-ms-flex-pack:center;justify-content:center;-webkit-align-content:center;-ms-flex-line-pack:center;align-content:center;-webkit-align-items:center;-ms-flex-align:center;align-items:center;position:absolute;top:-11px;right:-24px;font-family:"Roboto","Helvetica","Arial",sans-serif;font-weight:600;font-size:12px;width:22px;height:22px;border-radius:50%;background:rgb(255,64,129);color:rgb(255,255,255)}.mdl-button .mdl-badge[data-badge]:after{top:-10px;right:-5px}.mdl-badge.mdl-badge--no-background[data-badge]:after{color:rgb(255,64,129);background:rgba(255,255,255,.2);box-shadow:0 0 1px gray}.mdl-badge.mdl-badge--overlap{margin-right:10px}.mdl-badge.mdl-badge--overlap:after{right:-10px}.mdl-button{background:0 0;border:none;border-radius:2px;color:#000;position:relative;height:36px;margin:0;min-width:64px;padding:0 16px;display:inline-block;font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:14px;font-weight:500;text-transform:uppercase;letter-spacing:0;overflow:hidden;will-change:box-shadow;transition:box-shadow .2s cubic-bezier(.4,0,1,1),background-color .2s cubic-bezier(.4,0,.2,1),color .2s 
cubic-bezier(.4,0,.2,1);outline:none;cursor:pointer;text-decoration:none;text-align:center;line-height:36px;vertical-align:middle}.mdl-button::-moz-focus-inner{border:0}.mdl-button:hover{background-color:rgba(158,158,158,.2)}.mdl-button:focus:not(:active){background-color:rgba(0,0,0,.12)}.mdl-button:active{background-color:rgba(158,158,158,.4)}.mdl-button.mdl-button--colored{color:rgb(103,58,183)}.mdl-button.mdl-button--colored:focus:not(:active){background-color:rgba(0,0,0,.12)}input.mdl-button[type="submit"]{-webkit-appearance:none}.mdl-button--raised{background:rgba(158,158,158,.2);box-shadow:0 2px 2px 0 rgba(0,0,0,.14),0 3px 1px -2px rgba(0,0,0,.2),0 1px 5px 0 rgba(0,0,0,.12)}.mdl-button--raised:active{box-shadow:0 4px 5px 0 rgba(0,0,0,.14),0 1px 10px 0 rgba(0,0,0,.12),0 2px 4px -1px rgba(0,0,0,.2);background-color:rgba(158,158,158,.4)}.mdl-button--raised:focus:not(:active){box-shadow:0 0 8px rgba(0,0,0,.18),0 8px 16px rgba(0,0,0,.36);background-color:rgba(158,158,158,.4)}.mdl-button--raised.mdl-button--colored{background:rgb(103,58,183);color:rgb(255,255,255)}.mdl-button--raised.mdl-button--colored:hover{background-color:rgb(103,58,183)}.mdl-button--raised.mdl-button--colored:active{background-color:rgb(103,58,183)}.mdl-button--raised.mdl-button--colored:focus:not(:active){background-color:rgb(103,58,183)}.mdl-button--raised.mdl-button--colored .mdl-ripple{background:rgb(255,255,255)}.mdl-button--fab{border-radius:50%;font-size:24px;height:56px;margin:auto;min-width:56px;width:56px;padding:0;overflow:hidden;background:rgba(158,158,158,.2);box-shadow:0 1px 1.5px 0 rgba(0,0,0,.12),0 1px 1px 0 rgba(0,0,0,.24);position:relative;line-height:normal}.mdl-button--fab .material-icons{position:absolute;top:50%;left:50%;-webkit-transform:translate(-12px,-12px);transform:translate(-12px,-12px);line-height:24px;width:24px}.mdl-button--fab.mdl-button--mini-fab{height:40px;min-width:40px;width:40px}.mdl-button--fab 
.mdl-button__ripple-container{border-radius:50%;-webkit-mask-image:-webkit-radial-gradient(circle,#fff,#000)}.mdl-button--fab:active{box-shadow:0 4px 5px 0 rgba(0,0,0,.14),0 1px 10px 0 rgba(0,0,0,.12),0 2px 4px -1px rgba(0,0,0,.2);background-color:rgba(158,158,158,.4)}.mdl-button--fab:focus:not(:active){box-shadow:0 0 8px rgba(0,0,0,.18),0 8px 16px rgba(0,0,0,.36);background-color:rgba(158,158,158,.4)}.mdl-button--fab.mdl-button--colored{background:rgb(255,64,129);color:rgb(255,255,255)}.mdl-button--fab.mdl-button--colored:hover{background-color:rgb(255,64,129)}.mdl-button--fab.mdl-button--colored:focus:not(:active){background-color:rgb(255,64,129)}.mdl-button--fab.mdl-button--colored:active{background-color:rgb(255,64,129)}.mdl-button--fab.mdl-button--colored .mdl-ripple{background:rgb(255,255,255)}.mdl-button--icon{border-radius:50%;font-size:24px;height:32px;margin-left:0;margin-right:0;min-width:32px;width:32px;padding:0;overflow:hidden;color:inherit;line-height:normal}.mdl-button--icon .material-icons{position:absolute;top:50%;left:50%;-webkit-transform:translate(-12px,-12px);transform:translate(-12px,-12px);line-height:24px;width:24px}.mdl-button--icon.mdl-button--mini-icon{height:24px;min-width:24px;width:24px}.mdl-button--icon.mdl-button--mini-icon .material-icons{top:0;left:0}.mdl-button--icon .mdl-button__ripple-container{border-radius:50%;-webkit-mask-image:-webkit-radial-gradient(circle,#fff,#000)}.mdl-button__ripple-container{display:block;height:100%;left:0;position:absolute;top:0;width:100%;z-index:0;overflow:hidden}.mdl-button[disabled] .mdl-button__ripple-container .mdl-ripple,.mdl-button.mdl-button--disabled .mdl-button__ripple-container .mdl-ripple{background-color:transparent}.mdl-button--primary.mdl-button--primary{color:rgb(103,58,183)}.mdl-button--primary.mdl-button--primary 
.mdl-ripple{background:rgb(255,255,255)}.mdl-button--primary.mdl-button--primary.mdl-button--raised,.mdl-button--primary.mdl-button--primary.mdl-button--fab{color:rgb(255,255,255);background-color:rgb(103,58,183)}.mdl-button--accent.mdl-button--accent{color:rgb(255,64,129)}.mdl-button--accent.mdl-button--accent .mdl-ripple{background:rgb(255,255,255)}.mdl-button--accent.mdl-button--accent.mdl-button--raised,.mdl-button--accent.mdl-button--accent.mdl-button--fab{color:rgb(255,255,255);background-color:rgb(255,64,129)}.mdl-button[disabled][disabled],.mdl-button.mdl-button--disabled.mdl-button--disabled{color:rgba(0,0,0,.26);cursor:default;background-color:transparent}.mdl-button--fab[disabled][disabled],.mdl-button--fab.mdl-button--disabled.mdl-button--disabled{background-color:rgba(0,0,0,.12);color:rgba(0,0,0,.26)}.mdl-button--raised[disabled][disabled],.mdl-button--raised.mdl-button--disabled.mdl-button--disabled{background-color:rgba(0,0,0,.12);color:rgba(0,0,0,.26);box-shadow:none}.mdl-button--colored[disabled][disabled],.mdl-button--colored.mdl-button--disabled.mdl-button--disabled{color:rgba(0,0,0,.26)}.mdl-button .material-icons{vertical-align:middle}.mdl-card{display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-direction:column;-ms-flex-direction:column;flex-direction:column;font-size:16px;font-weight:400;min-height:200px;overflow:hidden;width:330px;z-index:1;position:relative;background:#fff;border-radius:2px;box-sizing:border-box}.mdl-card__media{background-color:rgb(255,64,129);background-repeat:repeat;background-position:50% 50%;background-size:cover;background-origin:padding-box;background-attachment:scroll;box-sizing:border-box}.mdl-card__title{-webkit-align-items:center;-ms-flex-align:center;align-items:center;color:#000;display:block;display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-justify-content:stretch;-ms-flex-pack:stretch;justify-content:stretch;line-height:normal;padding:16px;-webkit-perspective-origin:165px 
56px;perspective-origin:165px 56px;-webkit-transform-origin:165px 56px;transform-origin:165px 56px;box-sizing:border-box}.mdl-card__title.mdl-card--border{border-bottom:1px solid rgba(0,0,0,.1)}.mdl-card__title-text{-webkit-align-self:flex-end;-ms-flex-item-align:end;align-self:flex-end;color:inherit;display:block;display:-webkit-flex;display:-ms-flexbox;display:flex;font-size:24px;font-weight:300;line-height:normal;overflow:hidden;-webkit-transform-origin:149px 48px;transform-origin:149px 48px;margin:0}.mdl-card__subtitle-text{font-size:14px;color:rgba(0,0,0,.54);margin:0}.mdl-card__supporting-text{color:rgba(0,0,0,.54);font-size:1rem;line-height:18px;overflow:hidden;padding:16px;width:90%}.mdl-card__supporting-text.mdl-card--border{border-bottom:1px solid rgba(0,0,0,.1)}.mdl-card__actions{font-size:16px;line-height:normal;width:100%;background-color:transparent;padding:8px;box-sizing:border-box}.mdl-card__actions.mdl-card--border{border-top:1px solid rgba(0,0,0,.1)}.mdl-card--expand{-webkit-flex-grow:1;-ms-flex-positive:1;flex-grow:1}.mdl-card__menu{position:absolute;right:16px;top:16px}.mdl-checkbox{position:relative;z-index:1;vertical-align:middle;display:inline-block;box-sizing:border-box;width:100%;height:24px;margin:0;padding:0}.mdl-checkbox.is-upgraded{padding-left:24px}.mdl-checkbox__input{line-height:24px}.mdl-checkbox.is-upgraded .mdl-checkbox__input{position:absolute;width:0;height:0;margin:0;padding:0;opacity:0;-ms-appearance:none;-moz-appearance:none;-webkit-appearance:none;appearance:none;border:none}.mdl-checkbox__box-outline{position:absolute;top:3px;left:0;display:inline-block;box-sizing:border-box;width:16px;height:16px;margin:0;cursor:pointer;overflow:hidden;border:2px solid rgba(0,0,0,.54);border-radius:2px;z-index:2}.mdl-checkbox.is-checked .mdl-checkbox__box-outline{border:2px solid rgb(103,58,183)}fieldset[disabled] .mdl-checkbox .mdl-checkbox__box-outline,.mdl-checkbox.is-disabled .mdl-checkbox__box-outline{border:2px solid 
rgba(0,0,0,.26);cursor:auto}.mdl-checkbox__focus-helper{position:absolute;top:3px;left:0;display:inline-block;box-sizing:border-box;width:16px;height:16px;border-radius:50%;background-color:transparent}.mdl-checkbox.is-focused .mdl-checkbox__focus-helper{box-shadow:0 0 0 8px rgba(0,0,0,.1);background-color:rgba(0,0,0,.1)}.mdl-checkbox.is-focused.is-checked .mdl-checkbox__focus-helper{box-shadow:0 0 0 8px rgba(103,58,183,.26);background-color:rgba(103,58,183,.26)}.mdl-checkbox__tick-outline{position:absolute;top:0;left:0;height:100%;width:100%;-webkit-mask:url("data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiIHN0YW5kYWxvbmU9Im5vIj8+CjxzdmcKICAgeG1sbnM6ZGM9Imh0dHA6Ly9wdXJsLm9yZy9kYy9lbGVtZW50cy8xLjEvIgogICB4bWxuczpjYz0iaHR0cDovL2NyZWF0aXZlY29tbW9ucy5vcmcvbnMjIgogICB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5OTkvMDIvMjItcmRmLXN5bnRheC1ucyMiCiAgIHhtbG5zOnN2Zz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciCiAgIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIKICAgdmVyc2lvbj0iMS4xIgogICB2aWV3Qm94PSIwIDAgMSAxIgogICBwcmVzZXJ2ZUFzcGVjdFJhdGlvPSJ4TWluWU1pbiBtZWV0Ij4KICA8ZGVmcz4KICAgIDxjbGlwUGF0aCBpZD0iY2xpcCI+CiAgICAgIDxwYXRoCiAgICAgICAgIGQ9Ik0gMCwwIDAsMSAxLDEgMSwwIDAsMCB6IE0gMC44NTM0Mzc1LDAuMTY3MTg3NSAwLjk1OTY4NzUsMC4yNzMxMjUgMC40MjkzNzUsMC44MDM0Mzc1IDAuMzIzMTI1LDAuOTA5Njg3NSAwLjIxNzE4NzUsMC44MDM0Mzc1IDAuMDQwMzEyNSwwLjYyNjg3NSAwLjE0NjU2MjUsMC41MjA2MjUgMC4zMjMxMjUsMC42OTc1IDAuODUzNDM3NSwwLjE2NzE4NzUgeiIKICAgICAgICAgc3R5bGU9ImZpbGw6I2ZmZmZmZjtmaWxsLW9wYWNpdHk6MTtzdHJva2U6bm9uZSIgLz4KICAgIDwvY2xpcFBhdGg+CiAgICA8bWFzayBpZD0ibWFzayIgbWFza1VuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgbWFza0NvbnRlbnRVbml0cz0ib2JqZWN0Qm91bmRpbmdCb3giPgogICAgICA8cGF0aAogICAgICAgICBkPSJNIDAsMCAwLDEgMSwxIDEsMCAwLDAgeiBNIDAuODUzNDM3NSwwLjE2NzE4NzUgMC45NTk2ODc1LDAuMjczMTI1IDAuNDI5Mzc1LDAuODAzNDM3NSAwLjMyMzEyNSwwLjkwOTY4NzUgMC4yMTcxODc1LDAuODAzNDM3NSAwLjA0MDMxMjUsMC42MjY4NzUgMC4xNDY1NjI1LDAuNTIwNjI1IDAuMzIzMTI1LDAuNjk3NSAwLjg1MzQzNzUsMC4xNjcxODc1IHoiCiAgICAgICAgIHN0eWxlPSJmaWxsOiNmZmZmZmY7ZmlsbC1vcGFjaXR
5OjE7c3Ryb2tlOm5vbmUiIC8+CiAgICA8L21hc2s+CiAgPC9kZWZzPgogIDxyZWN0CiAgICAgd2lkdGg9IjEiCiAgICAgaGVpZ2h0PSIxIgogICAgIHg9IjAiCiAgICAgeT0iMCIKICAgICBjbGlwLXBhdGg9InVybCgjY2xpcCkiCiAgICAgc3R5bGU9ImZpbGw6IzAwMDAwMDtmaWxsLW9wYWNpdHk6MTtzdHJva2U6bm9uZSIgLz4KPC9zdmc+Cg==");mask:url("data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiIHN0YW5kYWxvbmU9Im5vIj8+CjxzdmcKICAgeG1sbnM6ZGM9Imh0dHA6Ly9wdXJsLm9yZy9kYy9lbGVtZW50cy8xLjEvIgogICB4bWxuczpjYz0iaHR0cDovL2NyZWF0aXZlY29tbW9ucy5vcmcvbnMjIgogICB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5OTkvMDIvMjItcmRmLXN5bnRheC1ucyMiCiAgIHhtbG5zOnN2Zz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciCiAgIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIKICAgdmVyc2lvbj0iMS4xIgogICB2aWV3Qm94PSIwIDAgMSAxIgogICBwcmVzZXJ2ZUFzcGVjdFJhdGlvPSJ4TWluWU1pbiBtZWV0Ij4KICA8ZGVmcz4KICAgIDxjbGlwUGF0aCBpZD0iY2xpcCI+CiAgICAgIDxwYXRoCiAgICAgICAgIGQ9Ik0gMCwwIDAsMSAxLDEgMSwwIDAsMCB6IE0gMC44NTM0Mzc1LDAuMTY3MTg3NSAwLjk1OTY4NzUsMC4yNzMxMjUgMC40MjkzNzUsMC44MDM0Mzc1IDAuMzIzMTI1LDAuOTA5Njg3NSAwLjIxNzE4NzUsMC44MDM0Mzc1IDAuMDQwMzEyNSwwLjYyNjg3NSAwLjE0NjU2MjUsMC41MjA2MjUgMC4zMjMxMjUsMC42OTc1IDAuODUzNDM3NSwwLjE2NzE4NzUgeiIKICAgICAgICAgc3R5bGU9ImZpbGw6I2ZmZmZmZjtmaWxsLW9wYWNpdHk6MTtzdHJva2U6bm9uZSIgLz4KICAgIDwvY2xpcFBhdGg+CiAgICA8bWFzayBpZD0ibWFzayIgbWFza1VuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgbWFza0NvbnRlbnRVbml0cz0ib2JqZWN0Qm91bmRpbmdCb3giPgogICAgICA8cGF0aAogICAgICAgICBkPSJNIDAsMCAwLDEgMSwxIDEsMCAwLDAgeiBNIDAuODUzNDM3NSwwLjE2NzE4NzUgMC45NTk2ODc1LDAuMjczMTI1IDAuNDI5Mzc1LDAuODAzNDM3NSAwLjMyMzEyNSwwLjkwOTY4NzUgMC4yMTcxODc1LDAuODAzNDM3NSAwLjA0MDMxMjUsMC42MjY4NzUgMC4xNDY1NjI1LDAuNTIwNjI1IDAuMzIzMTI1LDAuNjk3NSAwLjg1MzQzNzUsMC4xNjcxODc1IHoiCiAgICAgICAgIHN0eWxlPSJmaWxsOiNmZmZmZmY7ZmlsbC1vcGFjaXR5OjE7c3Ryb2tlOm5vbmUiIC8+CiAgICA8L21hc2s+CiAgPC9kZWZzPgogIDxyZWN0CiAgICAgd2lkdGg9IjEiCiAgICAgaGVpZ2h0PSIxIgogICAgIHg9IjAiCiAgICAgeT0iMCIKICAgICBjbGlwLXBhdGg9InVybCgjY2xpcCkiCiAgICAgc3R5bGU9ImZpbGw6IzAwMDAwMDtmaWxsLW9wYWNpdHk6MTtzdHJva2U6bm9uZSIgLz4KPC9zdmc+Cg==");background:0 
0;transition-duration:.28s;transition-timing-function:cubic-bezier(.4,0,.2,1);transition-property:background}.mdl-checkbox.is-checked .mdl-checkbox__tick-outline{background:rgb(103,58,183)url("data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiIHN0YW5kYWxvbmU9Im5vIj8+CjxzdmcKICAgeG1sbnM6ZGM9Imh0dHA6Ly9wdXJsLm9yZy9kYy9lbGVtZW50cy8xLjEvIgogICB4bWxuczpjYz0iaHR0cDovL2NyZWF0aXZlY29tbW9ucy5vcmcvbnMjIgogICB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5OTkvMDIvMjItcmRmLXN5bnRheC1ucyMiCiAgIHhtbG5zOnN2Zz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciCiAgIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIKICAgdmVyc2lvbj0iMS4xIgogICB2aWV3Qm94PSIwIDAgMSAxIgogICBwcmVzZXJ2ZUFzcGVjdFJhdGlvPSJ4TWluWU1pbiBtZWV0Ij4KICA8cGF0aAogICAgIGQ9Ik0gMC4wNDAzODA1OSwwLjYyNjc3NjcgMC4xNDY0NDY2MSwwLjUyMDcxMDY4IDAuNDI5Mjg5MzIsMC44MDM1NTMzOSAwLjMyMzIyMzMsMC45MDk2MTk0MSB6IE0gMC4yMTcxNTcyOSwwLjgwMzU1MzM5IDAuODUzNTUzMzksMC4xNjcxNTcyOSAwLjk1OTYxOTQxLDAuMjczMjIzMyAwLjMyMzIyMzMsMC45MDk2MTk0MSB6IgogICAgIGlkPSJyZWN0Mzc4MCIKICAgICBzdHlsZT0iZmlsbDojZmZmZmZmO2ZpbGwtb3BhY2l0eToxO3N0cm9rZTpub25lIiAvPgo8L3N2Zz4K")}fieldset[disabled] .mdl-checkbox.is-checked .mdl-checkbox__tick-outline,.mdl-checkbox.is-checked.is-disabled 
.mdl-checkbox__tick-outline{background:rgba(0,0,0,.26)url("data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiIHN0YW5kYWxvbmU9Im5vIj8+CjxzdmcKICAgeG1sbnM6ZGM9Imh0dHA6Ly9wdXJsLm9yZy9kYy9lbGVtZW50cy8xLjEvIgogICB4bWxuczpjYz0iaHR0cDovL2NyZWF0aXZlY29tbW9ucy5vcmcvbnMjIgogICB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5OTkvMDIvMjItcmRmLXN5bnRheC1ucyMiCiAgIHhtbG5zOnN2Zz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciCiAgIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIKICAgdmVyc2lvbj0iMS4xIgogICB2aWV3Qm94PSIwIDAgMSAxIgogICBwcmVzZXJ2ZUFzcGVjdFJhdGlvPSJ4TWluWU1pbiBtZWV0Ij4KICA8cGF0aAogICAgIGQ9Ik0gMC4wNDAzODA1OSwwLjYyNjc3NjcgMC4xNDY0NDY2MSwwLjUyMDcxMDY4IDAuNDI5Mjg5MzIsMC44MDM1NTMzOSAwLjMyMzIyMzMsMC45MDk2MTk0MSB6IE0gMC4yMTcxNTcyOSwwLjgwMzU1MzM5IDAuODUzNTUzMzksMC4xNjcxNTcyOSAwLjk1OTYxOTQxLDAuMjczMjIzMyAwLjMyMzIyMzMsMC45MDk2MTk0MSB6IgogICAgIGlkPSJyZWN0Mzc4MCIKICAgICBzdHlsZT0iZmlsbDojZmZmZmZmO2ZpbGwtb3BhY2l0eToxO3N0cm9rZTpub25lIiAvPgo8L3N2Zz4K")}.mdl-checkbox__label{position:relative;cursor:pointer;font-size:16px;line-height:24px;margin:0}fieldset[disabled] .mdl-checkbox .mdl-checkbox__label,.mdl-checkbox.is-disabled .mdl-checkbox__label{color:rgba(0,0,0,.26);cursor:auto}.mdl-checkbox__ripple-container{position:absolute;z-index:2;top:-6px;left:-10px;box-sizing:border-box;width:36px;height:36px;border-radius:50%;cursor:pointer;overflow:hidden;-webkit-mask-image:-webkit-radial-gradient(circle,#fff,#000)}.mdl-checkbox__ripple-container .mdl-ripple{background:rgb(103,58,183)}fieldset[disabled] .mdl-checkbox .mdl-checkbox__ripple-container,.mdl-checkbox.is-disabled .mdl-checkbox__ripple-container{cursor:auto}fieldset[disabled] .mdl-checkbox .mdl-checkbox__ripple-container .mdl-ripple,.mdl-checkbox.is-disabled .mdl-checkbox__ripple-container .mdl-ripple{background:0 0}.mdl-chip{height:32px;font-family:"Roboto","Helvetica","Arial",sans-serif;line-height:32px;padding:0 12px;border:0;border-radius:16px;background-color:#dedede;display:inline-block;color:rgba(0,0,0,.87);margin:2px 
0;font-size:0;white-space:nowrap}.mdl-chip__text{font-size:13px;vertical-align:middle;display:inline-block}.mdl-chip__action{height:24px;width:24px;background:0 0;opacity:.54;cursor:pointer;padding:0;margin:0 0 0 4px;font-size:13px;text-decoration:none;color:rgba(0,0,0,.87);border:none;outline:none}.mdl-chip__action,.mdl-chip__contact{display:inline-block;vertical-align:middle;overflow:hidden;text-align:center}.mdl-chip__contact{height:32px;width:32px;border-radius:16px;margin-right:8px;font-size:18px;line-height:32px}.mdl-chip:focus{outline:0;box-shadow:0 2px 2px 0 rgba(0,0,0,.14),0 3px 1px -2px rgba(0,0,0,.2),0 1px 5px 0 rgba(0,0,0,.12)}.mdl-chip:active{background-color:#d6d6d6}.mdl-chip--deletable{padding-right:4px}.mdl-chip--contact{padding-left:0}.mdl-data-table{position:relative;border:1px solid rgba(0,0,0,.12);border-collapse:collapse;white-space:nowrap;font-size:13px;background-color:#fff}.mdl-data-table thead{padding-bottom:3px}.mdl-data-table thead .mdl-data-table__select{margin-top:0}.mdl-data-table tbody tr{position:relative;height:48px;transition-duration:.28s;transition-timing-function:cubic-bezier(.4,0,.2,1);transition-property:background-color}.mdl-data-table tbody tr.is-selected{background-color:#e0e0e0}.mdl-data-table tbody tr:hover{background-color:#eee}.mdl-data-table td{text-align:right}.mdl-data-table th{padding:0 18px 12px 18px;text-align:right}.mdl-data-table td:first-of-type,.mdl-data-table th:first-of-type{padding-left:24px}.mdl-data-table td:last-of-type,.mdl-data-table th:last-of-type{padding-right:24px}.mdl-data-table td{position:relative;height:48px;border-top:1px solid rgba(0,0,0,.12);border-bottom:1px solid rgba(0,0,0,.12);padding:12px 18px;box-sizing:border-box}.mdl-data-table td,.mdl-data-table td .mdl-data-table__select{vertical-align:middle}.mdl-data-table 
th{position:relative;vertical-align:bottom;text-overflow:ellipsis;font-weight:700;line-height:24px;letter-spacing:0;height:48px;font-size:12px;color:rgba(0,0,0,.54);padding-bottom:8px;box-sizing:border-box}.mdl-data-table th.mdl-data-table__header--sorted-ascending,.mdl-data-table th.mdl-data-table__header--sorted-descending{color:rgba(0,0,0,.87)}.mdl-data-table th.mdl-data-table__header--sorted-ascending:before,.mdl-data-table th.mdl-data-table__header--sorted-descending:before{font-family:'Material Icons';font-weight:400;font-style:normal;line-height:1;letter-spacing:normal;text-transform:none;display:inline-block;word-wrap:normal;-moz-font-feature-settings:'liga';font-feature-settings:'liga';-webkit-font-feature-settings:'liga';-webkit-font-smoothing:antialiased;font-size:16px;content:"\e5d8";margin-right:5px;vertical-align:sub}.mdl-data-table th.mdl-data-table__header--sorted-ascending:hover,.mdl-data-table th.mdl-data-table__header--sorted-descending:hover{cursor:pointer}.mdl-data-table th.mdl-data-table__header--sorted-ascending:hover:before,.mdl-data-table th.mdl-data-table__header--sorted-descending:hover:before{color:rgba(0,0,0,.26)}.mdl-data-table th.mdl-data-table__header--sorted-descending:before{content:"\e5db"}.mdl-data-table__select{width:16px}.mdl-data-table__cell--non-numeric.mdl-data-table__cell--non-numeric{text-align:left}.mdl-dialog{border:none;box-shadow:0 9px 46px 8px rgba(0,0,0,.14),0 11px 15px -7px rgba(0,0,0,.12),0 24px 38px 3px rgba(0,0,0,.2);width:280px}.mdl-dialog__title{padding:24px 24px 0;margin:0;font-size:2.5rem}.mdl-dialog__actions{padding:8px 8px 8px 24px;display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-direction:row-reverse;-ms-flex-direction:row-reverse;flex-direction:row-reverse;-webkit-flex-wrap:wrap;-ms-flex-wrap:wrap;flex-wrap:wrap}.mdl-dialog__actions>*{margin-right:8px;height:36px}.mdl-dialog__actions>*:first-child{margin-right:0}.mdl-dialog__actions--full-width{padding:0 0 
8px}.mdl-dialog__actions--full-width>*{height:48px;-webkit-flex:0 0 100%;-ms-flex:0 0 100%;flex:0 0 100%;padding-right:16px;margin-right:0;text-align:right}.mdl-dialog__content{padding:20px 24px 24px;color:rgba(0,0,0,.54)}.mdl-mega-footer{padding:16px 40px;color:#9e9e9e;background-color:#424242}.mdl-mega-footer--top-section:after,.mdl-mega-footer--middle-section:after,.mdl-mega-footer--bottom-section:after,.mdl-mega-footer__top-section:after,.mdl-mega-footer__middle-section:after,.mdl-mega-footer__bottom-section:after{content:'';display:block;clear:both}.mdl-mega-footer--left-section,.mdl-mega-footer__left-section,.mdl-mega-footer--right-section,.mdl-mega-footer__right-section{margin-bottom:16px}.mdl-mega-footer--right-section a,.mdl-mega-footer__right-section a{display:block;margin-bottom:16px;color:inherit;text-decoration:none}@media screen and (min-width:760px){.mdl-mega-footer--left-section,.mdl-mega-footer__left-section{float:left}.mdl-mega-footer--right-section,.mdl-mega-footer__right-section{float:right}.mdl-mega-footer--right-section a,.mdl-mega-footer__right-section a{display:inline-block;margin-left:16px;line-height:36px;vertical-align:middle}}.mdl-mega-footer--social-btn,.mdl-mega-footer__social-btn{width:36px;height:36px;padding:0;margin:0;background-color:#9e9e9e;border:none}.mdl-mega-footer--drop-down-section,.mdl-mega-footer__drop-down-section{display:block;position:relative}@media screen and 
(min-width:760px){.mdl-mega-footer--drop-down-section,.mdl-mega-footer__drop-down-section{width:33%}.mdl-mega-footer--drop-down-section:nth-child(1),.mdl-mega-footer--drop-down-section:nth-child(2),.mdl-mega-footer__drop-down-section:nth-child(1),.mdl-mega-footer__drop-down-section:nth-child(2){float:left}.mdl-mega-footer--drop-down-section:nth-child(3),.mdl-mega-footer__drop-down-section:nth-child(3){float:right}.mdl-mega-footer--drop-down-section:nth-child(3):after,.mdl-mega-footer__drop-down-section:nth-child(3):after{clear:right}.mdl-mega-footer--drop-down-section:nth-child(4),.mdl-mega-footer__drop-down-section:nth-child(4){clear:right;float:right}.mdl-mega-footer--middle-section:after,.mdl-mega-footer__middle-section:after{content:'';display:block;clear:both}.mdl-mega-footer--bottom-section,.mdl-mega-footer__bottom-section{padding-top:0}}@media screen and (min-width:1024px){.mdl-mega-footer--drop-down-section,.mdl-mega-footer--drop-down-section:nth-child(3),.mdl-mega-footer--drop-down-section:nth-child(4),.mdl-mega-footer__drop-down-section,.mdl-mega-footer__drop-down-section:nth-child(3),.mdl-mega-footer__drop-down-section:nth-child(4){width:24%;float:left}}.mdl-mega-footer--heading-checkbox,.mdl-mega-footer__heading-checkbox{position:absolute;width:100%;height:55.8px;padding:32px;margin:-16px 0 0;cursor:pointer;z-index:1;opacity:0}.mdl-mega-footer--heading-checkbox+.mdl-mega-footer--heading:after,.mdl-mega-footer--heading-checkbox+.mdl-mega-footer__heading:after,.mdl-mega-footer__heading-checkbox+.mdl-mega-footer--heading:after,.mdl-mega-footer__heading-checkbox+.mdl-mega-footer__heading:after{font-family:'Material 
Icons';content:'\E5CE'}.mdl-mega-footer--heading-checkbox:checked~.mdl-mega-footer--link-list,.mdl-mega-footer--heading-checkbox:checked~.mdl-mega-footer__link-list,.mdl-mega-footer--heading-checkbox:checked+.mdl-mega-footer--heading+.mdl-mega-footer--link-list,.mdl-mega-footer--heading-checkbox:checked+.mdl-mega-footer__heading+.mdl-mega-footer__link-list,.mdl-mega-footer__heading-checkbox:checked~.mdl-mega-footer--link-list,.mdl-mega-footer__heading-checkbox:checked~.mdl-mega-footer__link-list,.mdl-mega-footer__heading-checkbox:checked+.mdl-mega-footer--heading+.mdl-mega-footer--link-list,.mdl-mega-footer__heading-checkbox:checked+.mdl-mega-footer__heading+.mdl-mega-footer__link-list{display:none}.mdl-mega-footer--heading-checkbox:checked+.mdl-mega-footer--heading:after,.mdl-mega-footer--heading-checkbox:checked+.mdl-mega-footer__heading:after,.mdl-mega-footer__heading-checkbox:checked+.mdl-mega-footer--heading:after,.mdl-mega-footer__heading-checkbox:checked+.mdl-mega-footer__heading:after{font-family:'Material Icons';content:'\E5CF'}.mdl-mega-footer--heading,.mdl-mega-footer__heading{position:relative;width:100%;padding-right:39.8px;margin-bottom:16px;box-sizing:border-box;font-size:14px;line-height:23.8px;font-weight:500;white-space:nowrap;text-overflow:ellipsis;overflow:hidden;color:#e0e0e0}.mdl-mega-footer--heading:after,.mdl-mega-footer__heading:after{content:'';position:absolute;top:0;right:0;display:block;width:23.8px;height:23.8px;background-size:cover}.mdl-mega-footer--link-list,.mdl-mega-footer__link-list{list-style:none;padding:0;margin:0 0 32px}.mdl-mega-footer--link-list:after,.mdl-mega-footer__link-list:after{clear:both;display:block;content:''}.mdl-mega-footer--link-list li,.mdl-mega-footer__link-list li{font-size:14px;font-weight:400;letter-spacing:0;line-height:20px}.mdl-mega-footer--link-list a,.mdl-mega-footer__link-list a{color:inherit;text-decoration:none;white-space:nowrap}@media screen and 
(min-width:760px){.mdl-mega-footer--heading-checkbox,.mdl-mega-footer__heading-checkbox{display:none}.mdl-mega-footer--heading-checkbox+.mdl-mega-footer--heading:after,.mdl-mega-footer--heading-checkbox+.mdl-mega-footer__heading:after,.mdl-mega-footer__heading-checkbox+.mdl-mega-footer--heading:after,.mdl-mega-footer__heading-checkbox+.mdl-mega-footer__heading:after{content:''}.mdl-mega-footer--heading-checkbox:checked~.mdl-mega-footer--link-list,.mdl-mega-footer--heading-checkbox:checked~.mdl-mega-footer__link-list,.mdl-mega-footer--heading-checkbox:checked+.mdl-mega-footer__heading+.mdl-mega-footer__link-list,.mdl-mega-footer--heading-checkbox:checked+.mdl-mega-footer--heading+.mdl-mega-footer--link-list,.mdl-mega-footer__heading-checkbox:checked~.mdl-mega-footer--link-list,.mdl-mega-footer__heading-checkbox:checked~.mdl-mega-footer__link-list,.mdl-mega-footer__heading-checkbox:checked+.mdl-mega-footer__heading+.mdl-mega-footer__link-list,.mdl-mega-footer__heading-checkbox:checked+.mdl-mega-footer--heading+.mdl-mega-footer--link-list{display:block}.mdl-mega-footer--heading-checkbox:checked+.mdl-mega-footer--heading:after,.mdl-mega-footer--heading-checkbox:checked+.mdl-mega-footer__heading:after,.mdl-mega-footer__heading-checkbox:checked+.mdl-mega-footer--heading:after,.mdl-mega-footer__heading-checkbox:checked+.mdl-mega-footer__heading:after{content:''}}.mdl-mega-footer--bottom-section,.mdl-mega-footer__bottom-section{padding-top:16px;margin-bottom:16px}.mdl-logo{margin-bottom:16px;color:#fff}.mdl-mega-footer--bottom-section .mdl-mega-footer--link-list li,.mdl-mega-footer__bottom-section .mdl-mega-footer__link-list li{float:left;margin-bottom:0;margin-right:16px}@media screen and (min-width:760px){.mdl-logo{float:left;margin-bottom:0;margin-right:16px}}.mdl-mini-footer{display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-flow:row wrap;-ms-flex-flow:row wrap;flex-flow:row 
wrap;-webkit-justify-content:space-between;-ms-flex-pack:justify;justify-content:space-between;padding:32px 16px;color:#9e9e9e;background-color:#424242}.mdl-mini-footer:after{content:'';display:block}.mdl-mini-footer .mdl-logo{line-height:36px}.mdl-mini-footer--link-list,.mdl-mini-footer__link-list{display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-flow:row nowrap;-ms-flex-flow:row nowrap;flex-flow:row nowrap;list-style:none;margin:0;padding:0}.mdl-mini-footer--link-list li,.mdl-mini-footer__link-list li{margin-bottom:0;margin-right:16px}@media screen and (min-width:760px){.mdl-mini-footer--link-list li,.mdl-mini-footer__link-list li{line-height:36px}}.mdl-mini-footer--link-list a,.mdl-mini-footer__link-list a{color:inherit;text-decoration:none;white-space:nowrap}.mdl-mini-footer--left-section,.mdl-mini-footer__left-section{display:inline-block;-webkit-order:0;-ms-flex-order:0;order:0}.mdl-mini-footer--right-section,.mdl-mini-footer__right-section{display:inline-block;-webkit-order:1;-ms-flex-order:1;order:1}.mdl-mini-footer--social-btn,.mdl-mini-footer__social-btn{width:36px;height:36px;padding:0;margin:0;background-color:#9e9e9e;border:none}.mdl-icon-toggle{position:relative;z-index:1;vertical-align:middle;display:inline-block;height:32px;margin:0;padding:0}.mdl-icon-toggle__input{line-height:32px}.mdl-icon-toggle.is-upgraded .mdl-icon-toggle__input{position:absolute;width:0;height:0;margin:0;padding:0;opacity:0;-ms-appearance:none;-moz-appearance:none;-webkit-appearance:none;appearance:none;border:none}.mdl-icon-toggle__label{display:inline-block;position:relative;cursor:pointer;height:32px;width:32px;min-width:32px;color:#616161;border-radius:50%;padding:0;margin-left:0;margin-right:0;text-align:center;background-color:transparent;will-change:background-color;transition:background-color .2s cubic-bezier(.4,0,.2,1),color .2s cubic-bezier(.4,0,.2,1)}.mdl-icon-toggle__label.material-icons{line-height:32px;font-size:24px}.mdl-icon-toggle.is-checked 
.mdl-icon-toggle__label{color:rgb(103,58,183)}.mdl-icon-toggle.is-disabled .mdl-icon-toggle__label{color:rgba(0,0,0,.26);cursor:auto;transition:none}.mdl-icon-toggle.is-focused .mdl-icon-toggle__label{background-color:rgba(0,0,0,.12)}.mdl-icon-toggle.is-focused.is-checked .mdl-icon-toggle__label{background-color:rgba(103,58,183,.26)}.mdl-icon-toggle__ripple-container{position:absolute;z-index:2;top:-2px;left:-2px;box-sizing:border-box;width:36px;height:36px;border-radius:50%;cursor:pointer;overflow:hidden;-webkit-mask-image:-webkit-radial-gradient(circle,#fff,#000)}.mdl-icon-toggle__ripple-container .mdl-ripple{background:#616161}.mdl-icon-toggle.is-disabled .mdl-icon-toggle__ripple-container{cursor:auto}.mdl-icon-toggle.is-disabled .mdl-icon-toggle__ripple-container .mdl-ripple{background:0 0}.mdl-list{display:block;padding:8px 0;list-style:none}.mdl-list__item{font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:16px;font-weight:400;letter-spacing:.04em;line-height:1;min-height:48px;-webkit-flex-direction:row;-ms-flex-direction:row;flex-direction:row;-webkit-flex-wrap:nowrap;-ms-flex-wrap:nowrap;flex-wrap:nowrap;padding:16px;cursor:default;color:rgba(0,0,0,.87);overflow:hidden}.mdl-list__item,.mdl-list__item .mdl-list__item-primary-content{box-sizing:border-box;display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-align-items:center;-ms-flex-align:center;align-items:center}.mdl-list__item .mdl-list__item-primary-content{-webkit-order:0;-ms-flex-order:0;order:0;-webkit-flex-grow:2;-ms-flex-positive:2;flex-grow:2;text-decoration:none}.mdl-list__item .mdl-list__item-primary-content .mdl-list__item-icon{margin-right:32px}.mdl-list__item .mdl-list__item-primary-content .mdl-list__item-avatar{margin-right:16px}.mdl-list__item 
.mdl-list__item-secondary-content{display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-flow:column;-ms-flex-flow:column;flex-flow:column;-webkit-align-items:flex-end;-ms-flex-align:end;align-items:flex-end;margin-left:16px}.mdl-list__item .mdl-list__item-secondary-content .mdl-list__item-secondary-action label{display:inline}.mdl-list__item .mdl-list__item-secondary-content .mdl-list__item-secondary-info{font-size:12px;font-weight:400;line-height:1;letter-spacing:0;color:rgba(0,0,0,.54)}.mdl-list__item .mdl-list__item-secondary-content .mdl-list__item-sub-header{padding:0 0 0 16px}.mdl-list__item-icon,.mdl-list__item-icon.material-icons{height:24px;width:24px;font-size:24px;box-sizing:border-box;color:#757575}.mdl-list__item-avatar,.mdl-list__item-avatar.material-icons{height:40px;width:40px;box-sizing:border-box;border-radius:50%;background-color:#757575;font-size:40px;color:#fff}.mdl-list__item--two-line{height:72px}.mdl-list__item--two-line .mdl-list__item-primary-content{height:36px;line-height:20px;display:block}.mdl-list__item--two-line .mdl-list__item-primary-content .mdl-list__item-avatar{float:left}.mdl-list__item--two-line .mdl-list__item-primary-content .mdl-list__item-icon{float:left;margin-top:6px}.mdl-list__item--two-line .mdl-list__item-primary-content .mdl-list__item-secondary-content{height:36px}.mdl-list__item--two-line .mdl-list__item-primary-content .mdl-list__item-sub-title{font-size:14px;font-weight:400;letter-spacing:0;line-height:18px;color:rgba(0,0,0,.54);display:block;padding:0}.mdl-list__item--three-line{height:88px}.mdl-list__item--three-line .mdl-list__item-primary-content{height:52px;line-height:20px;display:block}.mdl-list__item--three-line .mdl-list__item-primary-content .mdl-list__item-avatar,.mdl-list__item--three-line .mdl-list__item-primary-content .mdl-list__item-icon{float:left}.mdl-list__item--three-line .mdl-list__item-secondary-content{height:52px}.mdl-list__item--three-line 
.mdl-list__item-text-body{font-size:14px;font-weight:400;letter-spacing:0;line-height:18px;height:52px;color:rgba(0,0,0,.54);display:block;padding:0}.mdl-menu__container{display:block;margin:0;padding:0;border:none;position:absolute;overflow:visible;height:0;width:0;visibility:hidden;z-index:-1}.mdl-menu__container.is-visible,.mdl-menu__container.is-animating{z-index:999;visibility:visible}.mdl-menu__outline{display:block;background:#fff;margin:0;padding:0;border:none;border-radius:2px;position:absolute;top:0;left:0;overflow:hidden;opacity:0;-webkit-transform:scale(0);transform:scale(0);-webkit-transform-origin:0 0;transform-origin:0 0;box-shadow:0 2px 2px 0 rgba(0,0,0,.14),0 3px 1px -2px rgba(0,0,0,.2),0 1px 5px 0 rgba(0,0,0,.12);will-change:transform;transition:transform .3s cubic-bezier(.4,0,.2,1),opacity .2s cubic-bezier(.4,0,.2,1);transition:transform .3s cubic-bezier(.4,0,.2,1),opacity .2s cubic-bezier(.4,0,.2,1),-webkit-transform .3s cubic-bezier(.4,0,.2,1);z-index:-1}.mdl-menu__container.is-visible .mdl-menu__outline{opacity:1;-webkit-transform:scale(1);transform:scale(1);z-index:999}.mdl-menu__outline.mdl-menu--bottom-right{-webkit-transform-origin:100% 0;transform-origin:100% 0}.mdl-menu__outline.mdl-menu--top-left{-webkit-transform-origin:0 100%;transform-origin:0 100%}.mdl-menu__outline.mdl-menu--top-right{-webkit-transform-origin:100% 100%;transform-origin:100% 100%}.mdl-menu{position:absolute;list-style:none;top:0;left:0;height:auto;width:auto;min-width:124px;padding:8px 0;margin:0;opacity:0;clip:rect(0 0 0 0);z-index:-1}.mdl-menu__container.is-visible .mdl-menu{opacity:1;z-index:999}.mdl-menu.is-animating{transition:opacity .2s cubic-bezier(.4,0,.2,1),clip .3s 
cubic-bezier(.4,0,.2,1)}.mdl-menu.mdl-menu--bottom-right{left:auto;right:0}.mdl-menu.mdl-menu--top-left{top:auto;bottom:0}.mdl-menu.mdl-menu--top-right{top:auto;left:auto;bottom:0;right:0}.mdl-menu.mdl-menu--unaligned{top:auto;left:auto}.mdl-menu__item{display:block;border:none;color:rgba(0,0,0,.87);background-color:transparent;text-align:left;margin:0;padding:0 16px;outline-color:#bdbdbd;position:relative;overflow:hidden;font-size:14px;font-weight:400;letter-spacing:0;text-decoration:none;cursor:pointer;height:48px;line-height:48px;white-space:nowrap;opacity:0;transition:opacity .2s cubic-bezier(.4,0,.2,1);-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.mdl-menu__container.is-visible .mdl-menu__item{opacity:1}.mdl-menu__item::-moz-focus-inner{border:0}.mdl-menu__item--full-bleed-divider{border-bottom:1px solid rgba(0,0,0,.12)}.mdl-menu__item[disabled],.mdl-menu__item[data-mdl-disabled]{color:#bdbdbd;background-color:transparent;cursor:auto}.mdl-menu__item[disabled]:hover,.mdl-menu__item[data-mdl-disabled]:hover{background-color:transparent}.mdl-menu__item[disabled]:focus,.mdl-menu__item[data-mdl-disabled]:focus{background-color:transparent}.mdl-menu__item[disabled] .mdl-ripple,.mdl-menu__item[data-mdl-disabled] .mdl-ripple{background:0 0}.mdl-menu__item:hover{background-color:#eee}.mdl-menu__item:focus{outline:none;background-color:#eee}.mdl-menu__item:active{background-color:#e0e0e0}.mdl-menu__item--ripple-container{display:block;height:100%;left:0;position:absolute;top:0;width:100%;z-index:0;overflow:hidden}.mdl-progress{display:block;position:relative;height:4px;width:500px;max-width:100%}.mdl-progress>.bar{display:block;position:absolute;top:0;bottom:0;width:0%;transition:width .2s cubic-bezier(.4,0,.2,1)}.mdl-progress>.progressbar{background-color:rgb(103,58,183);z-index:1;left:0}.mdl-progress>.bufferbar{background-image:linear-gradient(to right,rgba(255,255,255,.7),rgba(255,255,255,.7)),linear-gradient(to 
right,rgb(103,58,183),rgb(103,58,183));z-index:0;left:0}.mdl-progress>.auxbar{right:0}@supports (-webkit-appearance:none){.mdl-progress:not(.mdl-progress--indeterminate):not(.mdl-progress--indeterminate)>.auxbar,.mdl-progress:not(.mdl-progress__indeterminate):not(.mdl-progress__indeterminate)>.auxbar{background-image:linear-gradient(to right,rgba(255,255,255,.7),rgba(255,255,255,.7)),linear-gradient(to right,rgb(103,58,183),rgb(103,58,183));-webkit-mask:url("data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIj8+Cjxzdmcgd2lkdGg9IjEyIiBoZWlnaHQ9IjQiIHZpZXdQb3J0PSIwIDAgMTIgNCIgdmVyc2lvbj0iMS4xIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPgogIDxlbGxpcHNlIGN4PSIyIiBjeT0iMiIgcng9IjIiIHJ5PSIyIj4KICAgIDxhbmltYXRlIGF0dHJpYnV0ZU5hbWU9ImN4IiBmcm9tPSIyIiB0bz0iLTEwIiBkdXI9IjAuNnMiIHJlcGVhdENvdW50PSJpbmRlZmluaXRlIiAvPgogIDwvZWxsaXBzZT4KICA8ZWxsaXBzZSBjeD0iMTQiIGN5PSIyIiByeD0iMiIgcnk9IjIiIGNsYXNzPSJsb2FkZXIiPgogICAgPGFuaW1hdGUgYXR0cmlidXRlTmFtZT0iY3giIGZyb209IjE0IiB0bz0iMiIgZHVyPSIwLjZzIiByZXBlYXRDb3VudD0iaW5kZWZpbml0ZSIgLz4KICA8L2VsbGlwc2U+Cjwvc3ZnPgo=");mask:url("data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIj8+Cjxzdmcgd2lkdGg9IjEyIiBoZWlnaHQ9IjQiIHZpZXdQb3J0PSIwIDAgMTIgNCIgdmVyc2lvbj0iMS4xIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPgogIDxlbGxpcHNlIGN4PSIyIiBjeT0iMiIgcng9IjIiIHJ5PSIyIj4KICAgIDxhbmltYXRlIGF0dHJpYnV0ZU5hbWU9ImN4IiBmcm9tPSIyIiB0bz0iLTEwIiBkdXI9IjAuNnMiIHJlcGVhdENvdW50PSJpbmRlZmluaXRlIiAvPgogIDwvZWxsaXBzZT4KICA8ZWxsaXBzZSBjeD0iMTQiIGN5PSIyIiByeD0iMiIgcnk9IjIiIGNsYXNzPSJsb2FkZXIiPgogICAgPGFuaW1hdGUgYXR0cmlidXRlTmFtZT0iY3giIGZyb209IjE0IiB0bz0iMiIgZHVyPSIwLjZzIiByZXBlYXRDb3VudD0iaW5kZWZpbml0ZSIgLz4KICA8L2VsbGlwc2U+Cjwvc3ZnPgo=")}}.mdl-progress:not(.mdl-progress--indeterminate)>.auxbar,.mdl-progress:not(.mdl-progress__indeterminate)>.auxbar{background-image:linear-gradient(to right,rgba(255,255,255,.9),rgba(255,255,255,.9)),linear-gradient(to 
right,rgb(103,58,183),rgb(103,58,183))}.mdl-progress.mdl-progress--indeterminate>.bar1,.mdl-progress.mdl-progress__indeterminate>.bar1{-webkit-animation-name:indeterminate1;animation-name:indeterminate1}.mdl-progress.mdl-progress--indeterminate>.bar1,.mdl-progress.mdl-progress__indeterminate>.bar1,.mdl-progress.mdl-progress--indeterminate>.bar3,.mdl-progress.mdl-progress__indeterminate>.bar3{background-color:rgb(103,58,183);-webkit-animation-duration:2s;animation-duration:2s;-webkit-animation-iteration-count:infinite;animation-iteration-count:infinite;-webkit-animation-timing-function:linear;animation-timing-function:linear}.mdl-progress.mdl-progress--indeterminate>.bar3,.mdl-progress.mdl-progress__indeterminate>.bar3{background-image:none;-webkit-animation-name:indeterminate2;animation-name:indeterminate2}@-webkit-keyframes indeterminate1{0%{left:0%;width:0%}50%{left:25%;width:75%}75%{left:100%;width:0%}}@keyframes indeterminate1{0%{left:0%;width:0%}50%{left:25%;width:75%}75%{left:100%;width:0%}}@-webkit-keyframes indeterminate2{0%,50%{left:0%;width:0%}75%{left:0%;width:25%}100%{left:100%;width:0%}}@keyframes indeterminate2{0%,50%{left:0%;width:0%}75%{left:0%;width:25%}100%{left:100%;width:0%}}.mdl-navigation{display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-wrap:nowrap;-ms-flex-wrap:nowrap;flex-wrap:nowrap;box-sizing:border-box}.mdl-navigation__link{color:#424242;text-decoration:none;margin:0;font-size:14px;font-weight:400;line-height:24px;letter-spacing:0;opacity:.87}.mdl-navigation__link .material-icons{vertical-align:middle}.mdl-layout{width:100%;height:100%;display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-direction:column;-ms-flex-direction:column;flex-direction:column;overflow-y:auto;overflow-x:hidden;position:relative;-webkit-overflow-scrolling:touch}.mdl-layout.is-small-screen .mdl-layout--large-screen-only{display:none}.mdl-layout:not(.is-small-screen) 
.mdl-layout--small-screen-only{display:none}.mdl-layout__container{position:absolute;width:100%;height:100%}.mdl-layout__title,.mdl-layout-title{display:block;position:relative;font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:20px;line-height:1;letter-spacing:.02em;font-weight:400;box-sizing:border-box}.mdl-layout-spacer{-webkit-flex-grow:1;-ms-flex-positive:1;flex-grow:1}.mdl-layout__drawer{display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-direction:column;-ms-flex-direction:column;flex-direction:column;-webkit-flex-wrap:nowrap;-ms-flex-wrap:nowrap;flex-wrap:nowrap;width:240px;height:100%;max-height:100%;position:absolute;top:0;left:0;box-shadow:0 2px 2px 0 rgba(0,0,0,.14),0 3px 1px -2px rgba(0,0,0,.2),0 1px 5px 0 rgba(0,0,0,.12);box-sizing:border-box;border-right:1px solid #e0e0e0;background:#fafafa;-webkit-transform:translateX(-250px);transform:translateX(-250px);-webkit-transform-style:preserve-3d;transform-style:preserve-3d;will-change:transform;transition-duration:.2s;transition-timing-function:cubic-bezier(.4,0,.2,1);transition-property:transform;transition-property:transform,-webkit-transform;color:#424242;overflow:visible;overflow-y:auto;z-index:5}.mdl-layout__drawer.is-visible{-webkit-transform:translateX(0);transform:translateX(0)}.mdl-layout__drawer.is-visible~.mdl-layout__content.mdl-layout__content{overflow:hidden}.mdl-layout__drawer>*{-webkit-flex-shrink:0;-ms-flex-negative:0;flex-shrink:0}.mdl-layout__drawer>.mdl-layout__title,.mdl-layout__drawer>.mdl-layout-title{line-height:64px;padding-left:40px}@media screen and (max-width:1024px){.mdl-layout__drawer>.mdl-layout__title,.mdl-layout__drawer>.mdl-layout-title{line-height:56px;padding-left:16px}}.mdl-layout__drawer .mdl-navigation{-webkit-flex-direction:column;-ms-flex-direction:column;flex-direction:column;-webkit-align-items:stretch;-ms-flex-align:stretch;align-items:stretch;padding-top:16px}.mdl-layout__drawer .mdl-navigation 
.mdl-navigation__link{display:block;-webkit-flex-shrink:0;-ms-flex-negative:0;flex-shrink:0;padding:16px 40px;margin:0;color:#757575}@media screen and (max-width:1024px){.mdl-layout__drawer .mdl-navigation .mdl-navigation__link{padding:16px}}.mdl-layout__drawer .mdl-navigation .mdl-navigation__link:hover{background-color:#e0e0e0}.mdl-layout__drawer .mdl-navigation .mdl-navigation__link--current{background-color:#e0e0e0;color:#000}@media screen and (min-width:1025px){.mdl-layout--fixed-drawer>.mdl-layout__drawer{-webkit-transform:translateX(0);transform:translateX(0)}}.mdl-layout__drawer-button{display:block;position:absolute;height:48px;width:48px;border:0;-webkit-flex-shrink:0;-ms-flex-negative:0;flex-shrink:0;overflow:hidden;text-align:center;cursor:pointer;font-size:26px;line-height:56px;font-family:Helvetica,Arial,sans-serif;margin:8px 12px;top:0;left:0;color:rgb(255,255,255);z-index:4}.mdl-layout__header .mdl-layout__drawer-button{position:absolute;color:rgb(255,255,255);background-color:inherit}@media screen and (max-width:1024px){.mdl-layout__header .mdl-layout__drawer-button{margin:4px}}@media screen and (max-width:1024px){.mdl-layout__drawer-button{margin:4px;color:rgba(0,0,0,.5)}}@media screen and (min-width:1025px){.mdl-layout__drawer-button{line-height:54px}.mdl-layout--no-desktop-drawer-button .mdl-layout__drawer-button,.mdl-layout--fixed-drawer>.mdl-layout__drawer-button,.mdl-layout--no-drawer-button 
.mdl-layout__drawer-button{display:none}}.mdl-layout__header{display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-direction:column;-ms-flex-direction:column;flex-direction:column;-webkit-flex-wrap:nowrap;-ms-flex-wrap:nowrap;flex-wrap:nowrap;-webkit-justify-content:flex-start;-ms-flex-pack:start;justify-content:flex-start;box-sizing:border-box;-webkit-flex-shrink:0;-ms-flex-negative:0;flex-shrink:0;width:100%;margin:0;padding:0;border:none;min-height:64px;max-height:1000px;z-index:3;background-color:rgb(103,58,183);color:rgb(255,255,255);box-shadow:0 2px 2px 0 rgba(0,0,0,.14),0 3px 1px -2px rgba(0,0,0,.2),0 1px 5px 0 rgba(0,0,0,.12);transition-duration:.2s;transition-timing-function:cubic-bezier(.4,0,.2,1);transition-property:max-height,box-shadow}@media screen and (max-width:1024px){.mdl-layout__header{min-height:56px}}.mdl-layout--fixed-drawer.is-upgraded:not(.is-small-screen)>.mdl-layout__header{margin-left:240px;width:calc(100% - 240px)}@media screen and (min-width:1025px){.mdl-layout--fixed-drawer>.mdl-layout__header .mdl-layout__header-row{padding-left:40px}}.mdl-layout__header>.mdl-layout-icon{position:absolute;left:40px;top:16px;height:32px;width:32px;overflow:hidden;z-index:3;display:block}@media screen and (max-width:1024px){.mdl-layout__header>.mdl-layout-icon{left:16px;top:12px}}.mdl-layout.has-drawer .mdl-layout__header>.mdl-layout-icon{display:none}.mdl-layout__header.is-compact{max-height:64px}@media screen and (max-width:1024px){.mdl-layout__header.is-compact{max-height:56px}}.mdl-layout__header.is-compact.has-tabs{height:112px}@media screen and (max-width:1024px){.mdl-layout__header.is-compact.has-tabs{min-height:104px}}@media screen and 
(max-width:1024px){.mdl-layout__header{display:none}.mdl-layout--fixed-header>.mdl-layout__header{display:-webkit-flex;display:-ms-flexbox;display:flex}}.mdl-layout__header--transparent.mdl-layout__header--transparent{background-color:transparent;box-shadow:none}.mdl-layout__header--seamed,.mdl-layout__header--scroll{box-shadow:none}.mdl-layout__header--waterfall{box-shadow:none;overflow:hidden}.mdl-layout__header--waterfall.is-casting-shadow{box-shadow:0 2px 2px 0 rgba(0,0,0,.14),0 3px 1px -2px rgba(0,0,0,.2),0 1px 5px 0 rgba(0,0,0,.12)}.mdl-layout__header--waterfall.mdl-layout__header--waterfall-hide-top{-webkit-justify-content:flex-end;-ms-flex-pack:end;justify-content:flex-end}.mdl-layout__header-row{display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-direction:row;-ms-flex-direction:row;flex-direction:row;-webkit-flex-wrap:nowrap;-ms-flex-wrap:nowrap;flex-wrap:nowrap;-webkit-flex-shrink:0;-ms-flex-negative:0;flex-shrink:0;box-sizing:border-box;-webkit-align-self:stretch;-ms-flex-item-align:stretch;align-self:stretch;-webkit-align-items:center;-ms-flex-align:center;align-items:center;height:64px;margin:0;padding:0 40px 0 80px}.mdl-layout--no-drawer-button .mdl-layout__header-row{padding-left:40px}@media screen and (min-width:1025px){.mdl-layout--no-desktop-drawer-button .mdl-layout__header-row{padding-left:40px}}@media screen and (max-width:1024px){.mdl-layout__header-row{height:56px;padding:0 16px 0 72px}.mdl-layout--no-drawer-button .mdl-layout__header-row{padding-left:16px}}.mdl-layout__header-row>*{-webkit-flex-shrink:0;-ms-flex-negative:0;flex-shrink:0}.mdl-layout__header--scroll .mdl-layout__header-row{width:100%}.mdl-layout__header-row .mdl-navigation{margin:0;padding:0;height:64px;-webkit-flex-direction:row;-ms-flex-direction:row;flex-direction:row;-webkit-align-items:center;-ms-flex-align:center;align-items:center}@media screen and (max-width:1024px){.mdl-layout__header-row .mdl-navigation{height:56px}}.mdl-layout__header-row 
.mdl-navigation__link{display:block;color:rgb(255,255,255);line-height:64px;padding:0 24px}@media screen and (max-width:1024px){.mdl-layout__header-row .mdl-navigation__link{line-height:56px;padding:0 16px}}.mdl-layout__obfuscator{background-color:transparent;position:absolute;top:0;left:0;height:100%;width:100%;z-index:4;visibility:hidden;transition-property:background-color;transition-duration:.2s;transition-timing-function:cubic-bezier(.4,0,.2,1)}.mdl-layout__obfuscator.is-visible{background-color:rgba(0,0,0,.5);visibility:visible}@supports (pointer-events:auto){.mdl-layout__obfuscator{background-color:rgba(0,0,0,.5);opacity:0;transition-property:opacity;visibility:visible;pointer-events:none}.mdl-layout__obfuscator.is-visible{pointer-events:auto;opacity:1}}.mdl-layout__content{-ms-flex:0 1 auto;position:relative;display:inline-block;overflow-y:auto;overflow-x:hidden;-webkit-flex-grow:1;-ms-flex-positive:1;flex-grow:1;z-index:1;-webkit-overflow-scrolling:touch}.mdl-layout--fixed-drawer>.mdl-layout__content{margin-left:240px}.mdl-layout__container.has-scrolling-header .mdl-layout__content{overflow:visible}@media screen and (max-width:1024px){.mdl-layout--fixed-drawer>.mdl-layout__content{margin-left:0}.mdl-layout__container.has-scrolling-header .mdl-layout__content{overflow-y:auto;overflow-x:hidden}}.mdl-layout__tab-bar{height:96px;margin:0;width:calc(100% - 112px);padding:0 0 0 56px;display:-webkit-flex;display:-ms-flexbox;display:flex;background-color:rgb(103,58,183);overflow-y:hidden;overflow-x:scroll}.mdl-layout__tab-bar::-webkit-scrollbar{display:none}.mdl-layout--no-drawer-button .mdl-layout__tab-bar{padding-left:16px;width:calc(100% - 32px)}@media screen and (min-width:1025px){.mdl-layout--no-desktop-drawer-button .mdl-layout__tab-bar{padding-left:16px;width:calc(100% - 32px)}}@media screen and (max-width:1024px){.mdl-layout__tab-bar{width:calc(100% - 60px);padding:0 0 0 60px}.mdl-layout--no-drawer-button .mdl-layout__tab-bar{width:calc(100% - 
8px);padding-left:4px}}.mdl-layout--fixed-tabs .mdl-layout__tab-bar{padding:0;overflow:hidden;width:100%}.mdl-layout__tab-bar-container{position:relative;height:48px;width:100%;border:none;margin:0;z-index:2;-webkit-flex-grow:0;-ms-flex-positive:0;flex-grow:0;-webkit-flex-shrink:0;-ms-flex-negative:0;flex-shrink:0;overflow:hidden}.mdl-layout__container>.mdl-layout__tab-bar-container{position:absolute;top:0;left:0}.mdl-layout__tab-bar-button{display:inline-block;position:absolute;top:0;height:48px;width:56px;z-index:4;text-align:center;background-color:rgb(103,58,183);color:transparent;cursor:pointer;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.mdl-layout--no-desktop-drawer-button .mdl-layout__tab-bar-button,.mdl-layout--no-drawer-button .mdl-layout__tab-bar-button{width:16px}.mdl-layout--no-desktop-drawer-button .mdl-layout__tab-bar-button .material-icons,.mdl-layout--no-drawer-button .mdl-layout__tab-bar-button .material-icons{position:relative;left:-4px}@media screen and (max-width:1024px){.mdl-layout__tab-bar-button{width:60px}}.mdl-layout--fixed-tabs .mdl-layout__tab-bar-button{display:none}.mdl-layout__tab-bar-button .material-icons{line-height:48px}.mdl-layout__tab-bar-button.is-active{color:rgb(255,255,255)}.mdl-layout__tab-bar-left-button{left:0}.mdl-layout__tab-bar-right-button{right:0}.mdl-layout__tab{margin:0;border:none;padding:0 24px;float:left;position:relative;display:block;-webkit-flex-grow:0;-ms-flex-positive:0;flex-grow:0;-webkit-flex-shrink:0;-ms-flex-negative:0;flex-shrink:0;text-decoration:none;height:48px;line-height:48px;text-align:center;font-weight:500;font-size:14px;text-transform:uppercase;color:rgba(255,255,255,.6);overflow:hidden}@media screen and (max-width:1024px){.mdl-layout__tab{padding:0 12px}}.mdl-layout--fixed-tabs .mdl-layout__tab{float:none;-webkit-flex-grow:1;-ms-flex-positive:1;flex-grow:1;padding:0}.mdl-layout.is-upgraded 
.mdl-layout__tab.is-active{color:rgb(255,255,255)}.mdl-layout.is-upgraded .mdl-layout__tab.is-active::after{height:2px;width:100%;display:block;content:" ";bottom:0;left:0;position:absolute;background:rgb(255,64,129);-webkit-animation:border-expand .2s cubic-bezier(.4,0,.4,1).01s alternate forwards;animation:border-expand .2s cubic-bezier(.4,0,.4,1).01s alternate forwards;transition:all 1s cubic-bezier(.4,0,1,1)}.mdl-layout__tab .mdl-layout__tab-ripple-container{display:block;position:absolute;height:100%;width:100%;left:0;top:0;z-index:1;overflow:hidden}.mdl-layout__tab .mdl-layout__tab-ripple-container .mdl-ripple{background-color:rgb(255,255,255)}.mdl-layout__tab-panel{display:block}.mdl-layout.is-upgraded .mdl-layout__tab-panel{display:none}.mdl-layout.is-upgraded .mdl-layout__tab-panel.is-active{display:block}.mdl-radio{position:relative;font-size:16px;line-height:24px;display:inline-block;vertical-align:middle;box-sizing:border-box;height:24px;margin:0;padding-left:0}.mdl-radio.is-upgraded{padding-left:24px}.mdl-radio__button{line-height:24px}.mdl-radio.is-upgraded .mdl-radio__button{position:absolute;width:0;height:0;margin:0;padding:0;opacity:0;-ms-appearance:none;-moz-appearance:none;-webkit-appearance:none;appearance:none;border:none}.mdl-radio__outer-circle{position:absolute;top:4px;left:0;display:inline-block;box-sizing:border-box;width:16px;height:16px;margin:0;cursor:pointer;border:2px solid rgba(0,0,0,.54);border-radius:50%;z-index:2}.mdl-radio.is-checked .mdl-radio__outer-circle{border:2px solid rgb(103,58,183)}.mdl-radio__outer-circle fieldset[disabled] .mdl-radio,.mdl-radio.is-disabled .mdl-radio__outer-circle{border:2px solid 
rgba(0,0,0,.26);cursor:auto}.mdl-radio__inner-circle{position:absolute;z-index:1;margin:0;top:8px;left:4px;box-sizing:border-box;width:8px;height:8px;cursor:pointer;transition-duration:.28s;transition-timing-function:cubic-bezier(.4,0,.2,1);transition-property:transform;transition-property:transform,-webkit-transform;-webkit-transform:scale(0,0);transform:scale(0,0);border-radius:50%;background:rgb(103,58,183)}.mdl-radio.is-checked .mdl-radio__inner-circle{-webkit-transform:scale(1,1);transform:scale(1,1)}fieldset[disabled] .mdl-radio .mdl-radio__inner-circle,.mdl-radio.is-disabled .mdl-radio__inner-circle{background:rgba(0,0,0,.26);cursor:auto}.mdl-radio.is-focused .mdl-radio__inner-circle{box-shadow:0 0 0 10px rgba(0,0,0,.1)}.mdl-radio__label{cursor:pointer}fieldset[disabled] .mdl-radio .mdl-radio__label,.mdl-radio.is-disabled .mdl-radio__label{color:rgba(0,0,0,.26);cursor:auto}.mdl-radio__ripple-container{position:absolute;z-index:2;top:-9px;left:-13px;box-sizing:border-box;width:42px;height:42px;border-radius:50%;cursor:pointer;overflow:hidden;-webkit-mask-image:-webkit-radial-gradient(circle,#fff,#000)}.mdl-radio__ripple-container .mdl-ripple{background:rgb(103,58,183)}fieldset[disabled] .mdl-radio .mdl-radio__ripple-container,.mdl-radio.is-disabled .mdl-radio__ripple-container{cursor:auto}fieldset[disabled] .mdl-radio .mdl-radio__ripple-container .mdl-ripple,.mdl-radio.is-disabled .mdl-radio__ripple-container .mdl-ripple{background:0 0}_:-ms-input-placeholder,:root .mdl-slider.mdl-slider.is-upgraded{-ms-appearance:none;height:32px;margin:0}.mdl-slider{width:calc(100% - 40px);margin:0 20px}.mdl-slider.is-upgraded{-webkit-appearance:none;-moz-appearance:none;appearance:none;height:2px;background:0 
0;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none;outline:0;padding:0;color:rgb(103,58,183);-webkit-align-self:center;-ms-flex-item-align:center;-ms-grid-row-align:center;align-self:center;z-index:1;cursor:pointer}.mdl-slider.is-upgraded::-moz-focus-outer{border:0}.mdl-slider.is-upgraded::-ms-tooltip{display:none}.mdl-slider.is-upgraded::-webkit-slider-runnable-track{background:0 0}.mdl-slider.is-upgraded::-moz-range-track{background:0 0;border:none}.mdl-slider.is-upgraded::-ms-track{background:0 0;color:transparent;height:2px;width:100%;border:none}.mdl-slider.is-upgraded::-ms-fill-lower{padding:0;background:linear-gradient(to right,transparent,transparent 16px,rgb(103,58,183)16px,rgb(103,58,183)0)}.mdl-slider.is-upgraded::-ms-fill-upper{padding:0;background:linear-gradient(to left,transparent,transparent 16px,rgba(0,0,0,.26)16px,rgba(0,0,0,.26)0)}.mdl-slider.is-upgraded::-webkit-slider-thumb{-webkit-appearance:none;width:12px;height:12px;box-sizing:border-box;border-radius:50%;background:rgb(103,58,183);border:none;transition:transform .18s cubic-bezier(.4,0,.2,1),border .18s cubic-bezier(.4,0,.2,1),box-shadow .18s cubic-bezier(.4,0,.2,1),background .28s cubic-bezier(.4,0,.2,1);transition:transform .18s cubic-bezier(.4,0,.2,1),border .18s cubic-bezier(.4,0,.2,1),box-shadow .18s cubic-bezier(.4,0,.2,1),background .28s cubic-bezier(.4,0,.2,1),-webkit-transform .18s cubic-bezier(.4,0,.2,1)}.mdl-slider.is-upgraded::-moz-range-thumb{-moz-appearance:none;width:12px;height:12px;box-sizing:border-box;border-radius:50%;background-image:none;background:rgb(103,58,183);border:none}.mdl-slider.is-upgraded:focus:not(:active)::-webkit-slider-thumb{box-shadow:0 0 0 10px rgba(103,58,183,.26)}.mdl-slider.is-upgraded:focus:not(:active)::-moz-range-thumb{box-shadow:0 0 0 10px 
rgba(103,58,183,.26)}.mdl-slider.is-upgraded:active::-webkit-slider-thumb{background-image:none;background:rgb(103,58,183);-webkit-transform:scale(1.5);transform:scale(1.5)}.mdl-slider.is-upgraded:active::-moz-range-thumb{background-image:none;background:rgb(103,58,183);transform:scale(1.5)}.mdl-slider.is-upgraded::-ms-thumb{width:32px;height:32px;border:none;border-radius:50%;background:rgb(103,58,183);transform:scale(.375);transition:transform .18s cubic-bezier(.4,0,.2,1),background .28s cubic-bezier(.4,0,.2,1);transition:transform .18s cubic-bezier(.4,0,.2,1),background .28s cubic-bezier(.4,0,.2,1),-webkit-transform .18s cubic-bezier(.4,0,.2,1)}.mdl-slider.is-upgraded:focus:not(:active)::-ms-thumb{background:radial-gradient(circle closest-side,rgb(103,58,183)0%,rgb(103,58,183)37.5%,rgba(103,58,183,.26)37.5%,rgba(103,58,183,.26)100%);transform:scale(1)}.mdl-slider.is-upgraded:active::-ms-thumb{background:rgb(103,58,183);transform:scale(.5625)}.mdl-slider.is-upgraded.is-lowest-value::-webkit-slider-thumb{border:2px solid rgba(0,0,0,.26);background:0 0}.mdl-slider.is-upgraded.is-lowest-value::-moz-range-thumb{border:2px solid rgba(0,0,0,.26);background:0 0}.mdl-slider.is-upgraded.is-lowest-value+.mdl-slider__background-flex>.mdl-slider__background-upper{left:6px}.mdl-slider.is-upgraded.is-lowest-value:focus:not(:active)::-webkit-slider-thumb{box-shadow:0 0 0 10px rgba(0,0,0,.12);background:rgba(0,0,0,.12)}.mdl-slider.is-upgraded.is-lowest-value:focus:not(:active)::-moz-range-thumb{box-shadow:0 0 0 10px rgba(0,0,0,.12);background:rgba(0,0,0,.12)}.mdl-slider.is-upgraded.is-lowest-value:active::-webkit-slider-thumb{border:1.6px solid rgba(0,0,0,.26);-webkit-transform:scale(1.5);transform:scale(1.5)}.mdl-slider.is-upgraded.is-lowest-value:active+.mdl-slider__background-flex>.mdl-slider__background-upper{left:9px}.mdl-slider.is-upgraded.is-lowest-value:active::-moz-range-thumb{border:1.5px solid 
rgba(0,0,0,.26);transform:scale(1.5)}.mdl-slider.is-upgraded.is-lowest-value::-ms-thumb{background:radial-gradient(circle closest-side,transparent 0%,transparent 66.67%,rgba(0,0,0,.26)66.67%,rgba(0,0,0,.26)100%)}.mdl-slider.is-upgraded.is-lowest-value:focus:not(:active)::-ms-thumb{background:radial-gradient(circle closest-side,rgba(0,0,0,.12)0%,rgba(0,0,0,.12)25%,rgba(0,0,0,.26)25%,rgba(0,0,0,.26)37.5%,rgba(0,0,0,.12)37.5%,rgba(0,0,0,.12)100%);transform:scale(1)}.mdl-slider.is-upgraded.is-lowest-value:active::-ms-thumb{transform:scale(.5625);background:radial-gradient(circle closest-side,transparent 0%,transparent 77.78%,rgba(0,0,0,.26)77.78%,rgba(0,0,0,.26)100%)}.mdl-slider.is-upgraded.is-lowest-value::-ms-fill-lower{background:0 0}.mdl-slider.is-upgraded.is-lowest-value::-ms-fill-upper{margin-left:6px}.mdl-slider.is-upgraded.is-lowest-value:active::-ms-fill-upper{margin-left:9px}.mdl-slider.is-upgraded:disabled:focus::-webkit-slider-thumb,.mdl-slider.is-upgraded:disabled:active::-webkit-slider-thumb,.mdl-slider.is-upgraded:disabled::-webkit-slider-thumb{-webkit-transform:scale(.667);transform:scale(.667);background:rgba(0,0,0,.26)}.mdl-slider.is-upgraded:disabled:focus::-moz-range-thumb,.mdl-slider.is-upgraded:disabled:active::-moz-range-thumb,.mdl-slider.is-upgraded:disabled::-moz-range-thumb{transform:scale(.667);background:rgba(0,0,0,.26)}.mdl-slider.is-upgraded:disabled+.mdl-slider__background-flex>.mdl-slider__background-lower{background-color:rgba(0,0,0,.26);left:-6px}.mdl-slider.is-upgraded:disabled+.mdl-slider__background-flex>.mdl-slider__background-upper{left:6px}.mdl-slider.is-upgraded.is-lowest-value:disabled:focus::-webkit-slider-thumb,.mdl-slider.is-upgraded.is-lowest-value:disabled:active::-webkit-slider-thumb,.mdl-slider.is-upgraded.is-lowest-value:disabled::-webkit-slider-thumb{border:3px solid rgba(0,0,0,.26);background:0 
0;-webkit-transform:scale(.667);transform:scale(.667)}.mdl-slider.is-upgraded.is-lowest-value:disabled:focus::-moz-range-thumb,.mdl-slider.is-upgraded.is-lowest-value:disabled:active::-moz-range-thumb,.mdl-slider.is-upgraded.is-lowest-value:disabled::-moz-range-thumb{border:3px solid rgba(0,0,0,.26);background:0 0;transform:scale(.667)}.mdl-slider.is-upgraded.is-lowest-value:disabled:active+.mdl-slider__background-flex>.mdl-slider__background-upper{left:6px}.mdl-slider.is-upgraded:disabled:focus::-ms-thumb,.mdl-slider.is-upgraded:disabled:active::-ms-thumb,.mdl-slider.is-upgraded:disabled::-ms-thumb{transform:scale(.25);background:rgba(0,0,0,.26)}.mdl-slider.is-upgraded.is-lowest-value:disabled:focus::-ms-thumb,.mdl-slider.is-upgraded.is-lowest-value:disabled:active::-ms-thumb,.mdl-slider.is-upgraded.is-lowest-value:disabled::-ms-thumb{transform:scale(.25);background:radial-gradient(circle closest-side,transparent 0%,transparent 50%,rgba(0,0,0,.26)50%,rgba(0,0,0,.26)100%)}.mdl-slider.is-upgraded:disabled::-ms-fill-lower{margin-right:6px;background:linear-gradient(to right,transparent,transparent 25px,rgba(0,0,0,.26)25px,rgba(0,0,0,.26)0)}.mdl-slider.is-upgraded:disabled::-ms-fill-upper{margin-left:6px}.mdl-slider.is-upgraded.is-lowest-value:disabled:active::-ms-fill-upper{margin-left:6px}.mdl-slider__ie-container{height:18px;overflow:visible;border:none;margin:none;padding:none}.mdl-slider__container{height:18px;position:relative;-webkit-flex-direction:row;-ms-flex-direction:row;flex-direction:row}.mdl-slider__container,.mdl-slider__background-flex{background:0 0;display:-webkit-flex;display:-ms-flexbox;display:flex}.mdl-slider__background-flex{position:absolute;height:2px;width:calc(100% - 52px);top:50%;left:0;margin:0 
26px;overflow:hidden;border:0;padding:0;-webkit-transform:translate(0,-1px);transform:translate(0,-1px)}.mdl-slider__background-lower{background:rgb(103,58,183)}.mdl-slider__background-lower,.mdl-slider__background-upper{-webkit-flex:0;-ms-flex:0;flex:0;position:relative;border:0;padding:0}.mdl-slider__background-upper{background:rgba(0,0,0,.26);transition:left .18s cubic-bezier(.4,0,.2,1)}.mdl-snackbar{position:fixed;bottom:0;left:50%;cursor:default;background-color:#323232;z-index:3;display:block;display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-justify-content:space-between;-ms-flex-pack:justify;justify-content:space-between;font-family:"Roboto","Helvetica","Arial",sans-serif;will-change:transform;-webkit-transform:translate(0,80px);transform:translate(0,80px);transition:transform .25s cubic-bezier(.4,0,1,1);transition:transform .25s cubic-bezier(.4,0,1,1),-webkit-transform .25s cubic-bezier(.4,0,1,1);pointer-events:none}@media (max-width:479px){.mdl-snackbar{width:100%;left:0;min-height:48px;max-height:80px}}@media (min-width:480px){.mdl-snackbar{min-width:288px;max-width:568px;border-radius:2px;-webkit-transform:translate(-50%,80px);transform:translate(-50%,80px)}}.mdl-snackbar--active{-webkit-transform:translate(0,0);transform:translate(0,0);pointer-events:auto;transition:transform .25s cubic-bezier(0,0,.2,1);transition:transform .25s cubic-bezier(0,0,.2,1),-webkit-transform .25s cubic-bezier(0,0,.2,1)}@media (min-width:480px){.mdl-snackbar--active{-webkit-transform:translate(-50%,0);transform:translate(-50%,0)}}.mdl-snackbar__text{padding:14px 12px 14px 24px;vertical-align:middle;color:#fff;float:left}.mdl-snackbar__action{background:0 0;border:none;color:rgb(255,64,129);float:right;padding:14px 24px 14px 
12px;font-family:"Roboto","Helvetica","Arial",sans-serif;font-size:14px;font-weight:500;text-transform:uppercase;line-height:1;letter-spacing:0;overflow:hidden;outline:none;opacity:0;pointer-events:none;cursor:pointer;text-decoration:none;text-align:center;-webkit-align-self:center;-ms-flex-item-align:center;-ms-grid-row-align:center;align-self:center}.mdl-snackbar__action::-moz-focus-inner{border:0}.mdl-snackbar__action:not([aria-hidden]){opacity:1;pointer-events:auto}.mdl-spinner{display:inline-block;position:relative;width:28px;height:28px}.mdl-spinner:not(.is-upgraded).is-active:after{content:"Loading..."}.mdl-spinner.is-upgraded.is-active{-webkit-animation:mdl-spinner__container-rotate 1568.23529412ms linear infinite;animation:mdl-spinner__container-rotate 1568.23529412ms linear infinite}@-webkit-keyframes mdl-spinner__container-rotate{to{-webkit-transform:rotate(360deg);transform:rotate(360deg)}}@keyframes mdl-spinner__container-rotate{to{-webkit-transform:rotate(360deg);transform:rotate(360deg)}}.mdl-spinner__layer{position:absolute;width:100%;height:100%;opacity:0}.mdl-spinner__layer-1{border-color:#42a5f5}.mdl-spinner--single-color .mdl-spinner__layer-1{border-color:rgb(103,58,183)}.mdl-spinner.is-active .mdl-spinner__layer-1{-webkit-animation:mdl-spinner__fill-unfill-rotate 5332ms cubic-bezier(.4,0,.2,1)infinite both,mdl-spinner__layer-1-fade-in-out 5332ms cubic-bezier(.4,0,.2,1)infinite both;animation:mdl-spinner__fill-unfill-rotate 5332ms cubic-bezier(.4,0,.2,1)infinite both,mdl-spinner__layer-1-fade-in-out 5332ms cubic-bezier(.4,0,.2,1)infinite both}.mdl-spinner__layer-2{border-color:#f44336}.mdl-spinner--single-color .mdl-spinner__layer-2{border-color:rgb(103,58,183)}.mdl-spinner.is-active .mdl-spinner__layer-2{-webkit-animation:mdl-spinner__fill-unfill-rotate 5332ms cubic-bezier(.4,0,.2,1)infinite both,mdl-spinner__layer-2-fade-in-out 5332ms cubic-bezier(.4,0,.2,1)infinite both;animation:mdl-spinner__fill-unfill-rotate 5332ms 
cubic-bezier(.4,0,.2,1)infinite both,mdl-spinner__layer-2-fade-in-out 5332ms cubic-bezier(.4,0,.2,1)infinite both}.mdl-spinner__layer-3{border-color:#fdd835}.mdl-spinner--single-color .mdl-spinner__layer-3{border-color:rgb(103,58,183)}.mdl-spinner.is-active .mdl-spinner__layer-3{-webkit-animation:mdl-spinner__fill-unfill-rotate 5332ms cubic-bezier(.4,0,.2,1)infinite both,mdl-spinner__layer-3-fade-in-out 5332ms cubic-bezier(.4,0,.2,1)infinite both;animation:mdl-spinner__fill-unfill-rotate 5332ms cubic-bezier(.4,0,.2,1)infinite both,mdl-spinner__layer-3-fade-in-out 5332ms cubic-bezier(.4,0,.2,1)infinite both}.mdl-spinner__layer-4{border-color:#4caf50}.mdl-spinner--single-color .mdl-spinner__layer-4{border-color:rgb(103,58,183)}.mdl-spinner.is-active .mdl-spinner__layer-4{-webkit-animation:mdl-spinner__fill-unfill-rotate 5332ms cubic-bezier(.4,0,.2,1)infinite both,mdl-spinner__layer-4-fade-in-out 5332ms cubic-bezier(.4,0,.2,1)infinite both;animation:mdl-spinner__fill-unfill-rotate 5332ms cubic-bezier(.4,0,.2,1)infinite both,mdl-spinner__layer-4-fade-in-out 5332ms cubic-bezier(.4,0,.2,1)infinite both}@-webkit-keyframes mdl-spinner__fill-unfill-rotate{12.5%{-webkit-transform:rotate(135deg);transform:rotate(135deg)}25%{-webkit-transform:rotate(270deg);transform:rotate(270deg)}37.5%{-webkit-transform:rotate(405deg);transform:rotate(405deg)}50%{-webkit-transform:rotate(540deg);transform:rotate(540deg)}62.5%{-webkit-transform:rotate(675deg);transform:rotate(675deg)}75%{-webkit-transform:rotate(810deg);transform:rotate(810deg)}87.5%{-webkit-transform:rotate(945deg);transform:rotate(945deg)}to{-webkit-transform:rotate(1080deg);transform:rotate(1080deg)}}@keyframes 
mdl-spinner__fill-unfill-rotate{12.5%{-webkit-transform:rotate(135deg);transform:rotate(135deg)}25%{-webkit-transform:rotate(270deg);transform:rotate(270deg)}37.5%{-webkit-transform:rotate(405deg);transform:rotate(405deg)}50%{-webkit-transform:rotate(540deg);transform:rotate(540deg)}62.5%{-webkit-transform:rotate(675deg);transform:rotate(675deg)}75%{-webkit-transform:rotate(810deg);transform:rotate(810deg)}87.5%{-webkit-transform:rotate(945deg);transform:rotate(945deg)}to{-webkit-transform:rotate(1080deg);transform:rotate(1080deg)}}@-webkit-keyframes mdl-spinner__layer-1-fade-in-out{from,25%{opacity:.99}26%,89%{opacity:0}90%,100%{opacity:.99}}@keyframes mdl-spinner__layer-1-fade-in-out{from,25%{opacity:.99}26%,89%{opacity:0}90%,100%{opacity:.99}}@-webkit-keyframes mdl-spinner__layer-2-fade-in-out{from,15%{opacity:0}25%,50%{opacity:.99}51%{opacity:0}}@keyframes mdl-spinner__layer-2-fade-in-out{from,15%{opacity:0}25%,50%{opacity:.99}51%{opacity:0}}@-webkit-keyframes mdl-spinner__layer-3-fade-in-out{from,40%{opacity:0}50%,75%{opacity:.99}76%{opacity:0}}@keyframes mdl-spinner__layer-3-fade-in-out{from,40%{opacity:0}50%,75%{opacity:.99}76%{opacity:0}}@-webkit-keyframes mdl-spinner__layer-4-fade-in-out{from,65%{opacity:0}75%,90%{opacity:.99}100%{opacity:0}}@keyframes mdl-spinner__layer-4-fade-in-out{from,65%{opacity:0}75%,90%{opacity:.99}100%{opacity:0}}.mdl-spinner__gap-patch{position:absolute;box-sizing:border-box;top:0;left:45%;width:10%;height:100%;overflow:hidden;border-color:inherit}.mdl-spinner__gap-patch .mdl-spinner__circle{width:1000%;left:-450%}.mdl-spinner__circle-clipper{display:inline-block;position:relative;width:50%;height:100%;overflow:hidden;border-color:inherit}.mdl-spinner__circle-clipper.mdl-spinner__left{float:left}.mdl-spinner__circle-clipper.mdl-spinner__right{float:right}.mdl-spinner__circle-clipper 
.mdl-spinner__circle{width:200%}.mdl-spinner__circle{box-sizing:border-box;height:100%;border-width:3px;border-style:solid;border-color:inherit;border-bottom-color:transparent!important;border-radius:50%;-webkit-animation:none;animation:none;position:absolute;top:0;right:0;bottom:0;left:0}.mdl-spinner__left .mdl-spinner__circle{border-right-color:transparent!important;-webkit-transform:rotate(129deg);transform:rotate(129deg)}.mdl-spinner.is-active .mdl-spinner__left .mdl-spinner__circle{-webkit-animation:mdl-spinner__left-spin 1333ms cubic-bezier(.4,0,.2,1)infinite both;animation:mdl-spinner__left-spin 1333ms cubic-bezier(.4,0,.2,1)infinite both}.mdl-spinner__right .mdl-spinner__circle{left:-100%;border-left-color:transparent!important;-webkit-transform:rotate(-129deg);transform:rotate(-129deg)}.mdl-spinner.is-active .mdl-spinner__right .mdl-spinner__circle{-webkit-animation:mdl-spinner__right-spin 1333ms cubic-bezier(.4,0,.2,1)infinite both;animation:mdl-spinner__right-spin 1333ms cubic-bezier(.4,0,.2,1)infinite both}@-webkit-keyframes mdl-spinner__left-spin{from{-webkit-transform:rotate(130deg);transform:rotate(130deg)}50%{-webkit-transform:rotate(-5deg);transform:rotate(-5deg)}to{-webkit-transform:rotate(130deg);transform:rotate(130deg)}}@keyframes mdl-spinner__left-spin{from{-webkit-transform:rotate(130deg);transform:rotate(130deg)}50%{-webkit-transform:rotate(-5deg);transform:rotate(-5deg)}to{-webkit-transform:rotate(130deg);transform:rotate(130deg)}}@-webkit-keyframes mdl-spinner__right-spin{from{-webkit-transform:rotate(-130deg);transform:rotate(-130deg)}50%{-webkit-transform:rotate(5deg);transform:rotate(5deg)}to{-webkit-transform:rotate(-130deg);transform:rotate(-130deg)}}@keyframes 
mdl-spinner__right-spin{from{-webkit-transform:rotate(-130deg);transform:rotate(-130deg)}50%{-webkit-transform:rotate(5deg);transform:rotate(5deg)}to{-webkit-transform:rotate(-130deg);transform:rotate(-130deg)}}.mdl-switch{position:relative;z-index:1;vertical-align:middle;display:inline-block;box-sizing:border-box;width:100%;height:24px;margin:0;padding:0;overflow:visible;-webkit-touch-callout:none;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.mdl-switch.is-upgraded{padding-left:28px}.mdl-switch__input{line-height:24px}.mdl-switch.is-upgraded .mdl-switch__input{position:absolute;width:0;height:0;margin:0;padding:0;opacity:0;-ms-appearance:none;-moz-appearance:none;-webkit-appearance:none;appearance:none;border:none}.mdl-switch__track{background:rgba(0,0,0,.26);position:absolute;left:0;top:5px;height:14px;width:36px;border-radius:14px;cursor:pointer}.mdl-switch.is-checked .mdl-switch__track{background:rgba(103,58,183,.5)}.mdl-switch__track fieldset[disabled] .mdl-switch,.mdl-switch.is-disabled .mdl-switch__track{background:rgba(0,0,0,.12);cursor:auto}.mdl-switch__thumb{background:#fafafa;position:absolute;left:0;top:2px;height:20px;width:20px;border-radius:50%;cursor:pointer;box-shadow:0 2px 2px 0 rgba(0,0,0,.14),0 3px 1px -2px rgba(0,0,0,.2),0 1px 5px 0 rgba(0,0,0,.12);transition-duration:.28s;transition-timing-function:cubic-bezier(.4,0,.2,1);transition-property:left}.mdl-switch.is-checked .mdl-switch__thumb{background:rgb(103,58,183);left:16px;box-shadow:0 3px 4px 0 rgba(0,0,0,.14),0 3px 3px -2px rgba(0,0,0,.2),0 1px 8px 0 rgba(0,0,0,.12)}.mdl-switch__thumb fieldset[disabled] .mdl-switch,.mdl-switch.is-disabled .mdl-switch__thumb{background:#bdbdbd;cursor:auto}.mdl-switch__focus-helper{position:absolute;top:50%;left:50%;-webkit-transform:translate(-4px,-4px);transform:translate(-4px,-4px);display:inline-block;box-sizing:border-box;width:8px;height:8px;border-radius:50%;background-color:transparent}.mdl-switch.is-focused 
.mdl-switch__focus-helper{box-shadow:0 0 0 20px rgba(0,0,0,.1);background-color:rgba(0,0,0,.1)}.mdl-switch.is-focused.is-checked .mdl-switch__focus-helper{box-shadow:0 0 0 20px rgba(103,58,183,.26);background-color:rgba(103,58,183,.26)}.mdl-switch__label{position:relative;cursor:pointer;font-size:16px;line-height:24px;margin:0;left:24px}.mdl-switch__label fieldset[disabled] .mdl-switch,.mdl-switch.is-disabled .mdl-switch__label{color:#bdbdbd;cursor:auto}.mdl-switch__ripple-container{position:absolute;z-index:2;top:-12px;left:-14px;box-sizing:border-box;width:48px;height:48px;border-radius:50%;cursor:pointer;overflow:hidden;-webkit-mask-image:-webkit-radial-gradient(circle,#fff,#000);transition-duration:.4s;transition-timing-function:step-end;transition-property:left}.mdl-switch__ripple-container .mdl-ripple{background:rgb(103,58,183)}.mdl-switch__ripple-container fieldset[disabled] .mdl-switch,.mdl-switch.is-disabled .mdl-switch__ripple-container{cursor:auto}fieldset[disabled] .mdl-switch .mdl-switch__ripple-container .mdl-ripple,.mdl-switch.is-disabled .mdl-switch__ripple-container .mdl-ripple{background:0 0}.mdl-switch.is-checked .mdl-switch__ripple-container{left:2px}.mdl-tabs{display:block;width:100%}.mdl-tabs__tab-bar{display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-direction:row;-ms-flex-direction:row;flex-direction:row;-webkit-justify-content:center;-ms-flex-pack:center;justify-content:center;-webkit-align-content:space-between;-ms-flex-line-pack:justify;align-content:space-between;-webkit-align-items:flex-start;-ms-flex-align:start;align-items:flex-start;height:48px;padding:0;margin:0;border-bottom:1px solid #e0e0e0}.mdl-tabs__tab{margin:0;border:none;padding:0 24px;float:left;position:relative;display:block;text-decoration:none;height:48px;line-height:48px;text-align:center;font-weight:500;font-size:14px;text-transform:uppercase;color:rgba(0,0,0,.54);overflow:hidden}.mdl-tabs.is-upgraded 
.mdl-tabs__tab.is-active{color:rgba(0,0,0,.87)}.mdl-tabs.is-upgraded .mdl-tabs__tab.is-active:after{height:2px;width:100%;display:block;content:" ";bottom:0;left:0;position:absolute;background:rgb(103,58,183);-webkit-animation:border-expand .2s cubic-bezier(.4,0,.4,1).01s alternate forwards;animation:border-expand .2s cubic-bezier(.4,0,.4,1).01s alternate forwards;transition:all 1s cubic-bezier(.4,0,1,1)}.mdl-tabs__tab .mdl-tabs__ripple-container{display:block;position:absolute;height:100%;width:100%;left:0;top:0;z-index:1;overflow:hidden}.mdl-tabs__tab .mdl-tabs__ripple-container .mdl-ripple{background:rgb(103,58,183)}.mdl-tabs__panel{display:block}.mdl-tabs.is-upgraded .mdl-tabs__panel{display:none}.mdl-tabs.is-upgraded .mdl-tabs__panel.is-active{display:block}@-webkit-keyframes border-expand{0%{opacity:0;width:0}100%{opacity:1;width:100%}}@keyframes border-expand{0%{opacity:0;width:0}100%{opacity:1;width:100%}}.mdl-textfield{position:relative;font-size:16px;display:inline-block;box-sizing:border-box;width:300px;max-width:100%;margin:0;padding:20px 0}.mdl-textfield .mdl-button{position:absolute;bottom:20px}.mdl-textfield--align-right{text-align:right}.mdl-textfield--full-width{width:100%}.mdl-textfield--expandable{min-width:32px;width:auto;min-height:32px}.mdl-textfield--expandable .mdl-button--icon{top:16px}.mdl-textfield__input{border:none;border-bottom:1px solid rgba(0,0,0,.12);display:block;font-size:16px;font-family:"Helvetica","Arial",sans-serif;margin:0;padding:4px 0;width:100%;background:0 0;text-align:left;color:inherit}.mdl-textfield__input[type="number"]{-moz-appearance:textfield}.mdl-textfield__input[type="number"]::-webkit-inner-spin-button,.mdl-textfield__input[type="number"]::-webkit-outer-spin-button{-webkit-appearance:none;margin:0}.mdl-textfield.is-focused .mdl-textfield__input{outline:none}.mdl-textfield.is-invalid .mdl-textfield__input{border-color:#d50000;box-shadow:none}fieldset[disabled] .mdl-textfield 
.mdl-textfield__input,.mdl-textfield.is-disabled .mdl-textfield__input{background-color:transparent;border-bottom:1px dotted rgba(0,0,0,.12);color:rgba(0,0,0,.26)}.mdl-textfield textarea.mdl-textfield__input{display:block}.mdl-textfield__label{bottom:0;color:rgba(0,0,0,.26);font-size:16px;left:0;right:0;pointer-events:none;position:absolute;display:block;top:24px;width:100%;overflow:hidden;white-space:nowrap;text-align:left}.mdl-textfield.is-dirty .mdl-textfield__label,.mdl-textfield.has-placeholder .mdl-textfield__label{visibility:hidden}.mdl-textfield--floating-label .mdl-textfield__label{transition-duration:.2s;transition-timing-function:cubic-bezier(.4,0,.2,1)}.mdl-textfield--floating-label.has-placeholder .mdl-textfield__label{transition:none}fieldset[disabled] .mdl-textfield .mdl-textfield__label,.mdl-textfield.is-disabled.is-disabled .mdl-textfield__label{color:rgba(0,0,0,.26)}.mdl-textfield--floating-label.is-focused .mdl-textfield__label,.mdl-textfield--floating-label.is-dirty .mdl-textfield__label,.mdl-textfield--floating-label.has-placeholder .mdl-textfield__label{color:rgb(103,58,183);font-size:12px;top:4px;visibility:visible}.mdl-textfield--floating-label.is-focused .mdl-textfield__expandable-holder .mdl-textfield__label,.mdl-textfield--floating-label.is-dirty .mdl-textfield__expandable-holder .mdl-textfield__label,.mdl-textfield--floating-label.has-placeholder .mdl-textfield__expandable-holder .mdl-textfield__label{top:-16px}.mdl-textfield--floating-label.is-invalid .mdl-textfield__label{color:#d50000;font-size:12px}.mdl-textfield__label:after{background-color:rgb(103,58,183);bottom:20px;content:'';height:2px;left:45%;position:absolute;transition-duration:.2s;transition-timing-function:cubic-bezier(.4,0,.2,1);visibility:hidden;width:10px}.mdl-textfield.is-focused .mdl-textfield__label:after{left:0;visibility:visible;width:100%}.mdl-textfield.is-invalid 
.mdl-textfield__label:after{background-color:#d50000}.mdl-textfield__error{color:#d50000;position:absolute;font-size:12px;margin-top:3px;visibility:hidden;display:block}.mdl-textfield.is-invalid .mdl-textfield__error{visibility:visible}.mdl-textfield__expandable-holder{display:inline-block;position:relative;margin-left:32px;transition-duration:.2s;transition-timing-function:cubic-bezier(.4,0,.2,1);display:inline-block;max-width:.1px}.mdl-textfield.is-focused .mdl-textfield__expandable-holder,.mdl-textfield.is-dirty .mdl-textfield__expandable-holder{max-width:600px}.mdl-textfield__expandable-holder .mdl-textfield__label:after{bottom:0}.mdl-tooltip{-webkit-transform:scale(0);transform:scale(0);-webkit-transform-origin:top center;transform-origin:top center;z-index:999;background:rgba(97,97,97,.9);border-radius:2px;color:#fff;display:inline-block;font-size:10px;font-weight:500;line-height:14px;max-width:170px;position:fixed;top:-500px;left:-500px;padding:8px;text-align:center}.mdl-tooltip.is-active{-webkit-animation:pulse 200ms cubic-bezier(0,0,.2,1)forwards;animation:pulse 200ms cubic-bezier(0,0,.2,1)forwards}.mdl-tooltip--large{line-height:14px;font-size:14px;padding:16px}@-webkit-keyframes pulse{0%{-webkit-transform:scale(0);transform:scale(0);opacity:0}50%{-webkit-transform:scale(.99);transform:scale(.99)}100%{-webkit-transform:scale(1);transform:scale(1);opacity:1;visibility:visible}}@keyframes pulse{0%{-webkit-transform:scale(0);transform:scale(0);opacity:0}50%{-webkit-transform:scale(.99);transform:scale(.99)}100%{-webkit-transform:scale(1);transform:scale(1);opacity:1;visibility:visible}}.mdl-shadow--2dp{box-shadow:0 2px 2px 0 rgba(0,0,0,.14),0 3px 1px -2px rgba(0,0,0,.2),0 1px 5px 0 rgba(0,0,0,.12)}.mdl-shadow--3dp{box-shadow:0 3px 4px 0 rgba(0,0,0,.14),0 3px 3px -2px rgba(0,0,0,.2),0 1px 8px 0 rgba(0,0,0,.12)}.mdl-shadow--4dp{box-shadow:0 4px 5px 0 rgba(0,0,0,.14),0 1px 10px 0 rgba(0,0,0,.12),0 2px 4px -1px rgba(0,0,0,.2)}.mdl-shadow--6dp{box-shadow:0 6px 
10px 0 rgba(0,0,0,.14),0 1px 18px 0 rgba(0,0,0,.12),0 3px 5px -1px rgba(0,0,0,.2)}.mdl-shadow--8dp{box-shadow:0 8px 10px 1px rgba(0,0,0,.14),0 3px 14px 2px rgba(0,0,0,.12),0 5px 5px -3px rgba(0,0,0,.2)}.mdl-shadow--16dp{box-shadow:0 16px 24px 2px rgba(0,0,0,.14),0 6px 30px 5px rgba(0,0,0,.12),0 8px 10px -5px rgba(0,0,0,.2)}.mdl-shadow--24dp{box-shadow:0 9px 46px 8px rgba(0,0,0,.14),0 11px 15px -7px rgba(0,0,0,.12),0 24px 38px 3px rgba(0,0,0,.2)}.mdl-grid{display:-webkit-flex;display:-ms-flexbox;display:flex;-webkit-flex-flow:row wrap;-ms-flex-flow:row wrap;flex-flow:row wrap;margin:0 auto;-webkit-align-items:stretch;-ms-flex-align:stretch;align-items:stretch}.mdl-grid.mdl-grid--no-spacing{padding:0}.mdl-cell{box-sizing:border-box}.mdl-cell--top{-webkit-align-self:flex-start;-ms-flex-item-align:start;align-self:flex-start}.mdl-cell--middle{-webkit-align-self:center;-ms-flex-item-align:center;-ms-grid-row-align:center;align-self:center}.mdl-cell--bottom{-webkit-align-self:flex-end;-ms-flex-item-align:end;align-self:flex-end}.mdl-cell--stretch{-webkit-align-self:stretch;-ms-flex-item-align:stretch;-ms-grid-row-align:stretch;align-self:stretch}.mdl-grid.mdl-grid--no-spacing>.mdl-cell{margin:0}.mdl-cell--order-1{-webkit-order:1;-ms-flex-order:1;order:1}.mdl-cell--order-2{-webkit-order:2;-ms-flex-order:2;order:2}.mdl-cell--order-3{-webkit-order:3;-ms-flex-order:3;order:3}.mdl-cell--order-4{-webkit-order:4;-ms-flex-order:4;order:4}.mdl-cell--order-5{-webkit-order:5;-ms-flex-order:5;order:5}.mdl-cell--order-6{-webkit-order:6;-ms-flex-order:6;order:6}.mdl-cell--order-7{-webkit-order:7;-ms-flex-order:7;order:7}.mdl-cell--order-8{-webkit-order:8;-ms-flex-order:8;order:8}.mdl-cell--order-9{-webkit-order:9;-ms-flex-order:9;order:9}.mdl-cell--order-10{-webkit-order:10;-ms-flex-order:10;order:10}.mdl-cell--order-11{-webkit-order:11;-ms-flex-order:11;order:11}.mdl-cell--order-12{-webkit-order:12;-ms-flex-order:12;order:12}@media 
(max-width:479px){.mdl-grid{padding:8px}.mdl-cell{margin:8px;width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell{width:100%}.mdl-cell--hide-phone{display:none!important}.mdl-cell--order-1-phone.mdl-cell--order-1-phone{-webkit-order:1;-ms-flex-order:1;order:1}.mdl-cell--order-2-phone.mdl-cell--order-2-phone{-webkit-order:2;-ms-flex-order:2;order:2}.mdl-cell--order-3-phone.mdl-cell--order-3-phone{-webkit-order:3;-ms-flex-order:3;order:3}.mdl-cell--order-4-phone.mdl-cell--order-4-phone{-webkit-order:4;-ms-flex-order:4;order:4}.mdl-cell--order-5-phone.mdl-cell--order-5-phone{-webkit-order:5;-ms-flex-order:5;order:5}.mdl-cell--order-6-phone.mdl-cell--order-6-phone{-webkit-order:6;-ms-flex-order:6;order:6}.mdl-cell--order-7-phone.mdl-cell--order-7-phone{-webkit-order:7;-ms-flex-order:7;order:7}.mdl-cell--order-8-phone.mdl-cell--order-8-phone{-webkit-order:8;-ms-flex-order:8;order:8}.mdl-cell--order-9-phone.mdl-cell--order-9-phone{-webkit-order:9;-ms-flex-order:9;order:9}.mdl-cell--order-10-phone.mdl-cell--order-10-phone{-webkit-order:10;-ms-flex-order:10;order:10}.mdl-cell--order-11-phone.mdl-cell--order-11-phone{-webkit-order:11;-ms-flex-order:11;order:11}.mdl-cell--order-12-phone.mdl-cell--order-12-phone{-webkit-order:12;-ms-flex-order:12;order:12}.mdl-cell--1-col,.mdl-cell--1-col-phone.mdl-cell--1-col-phone{width:calc(25% - 16px)}.mdl-grid--no-spacing>.mdl-cell--1-col,.mdl-grid--no-spacing>.mdl-cell--1-col-phone.mdl-cell--1-col-phone{width:25%}.mdl-cell--2-col,.mdl-cell--2-col-phone.mdl-cell--2-col-phone{width:calc(50% - 16px)}.mdl-grid--no-spacing>.mdl-cell--2-col,.mdl-grid--no-spacing>.mdl-cell--2-col-phone.mdl-cell--2-col-phone{width:50%}.mdl-cell--3-col,.mdl-cell--3-col-phone.mdl-cell--3-col-phone{width:calc(75% - 16px)}.mdl-grid--no-spacing>.mdl-cell--3-col,.mdl-grid--no-spacing>.mdl-cell--3-col-phone.mdl-cell--3-col-phone{width:75%}.mdl-cell--4-col,.mdl-cell--4-col-phone.mdl-cell--4-col-phone{width:calc(100% - 
16px)}.mdl-grid--no-spacing>.mdl-cell--4-col,.mdl-grid--no-spacing>.mdl-cell--4-col-phone.mdl-cell--4-col-phone{width:100%}.mdl-cell--5-col,.mdl-cell--5-col-phone.mdl-cell--5-col-phone{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--5-col,.mdl-grid--no-spacing>.mdl-cell--5-col-phone.mdl-cell--5-col-phone{width:100%}.mdl-cell--6-col,.mdl-cell--6-col-phone.mdl-cell--6-col-phone{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--6-col,.mdl-grid--no-spacing>.mdl-cell--6-col-phone.mdl-cell--6-col-phone{width:100%}.mdl-cell--7-col,.mdl-cell--7-col-phone.mdl-cell--7-col-phone{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--7-col,.mdl-grid--no-spacing>.mdl-cell--7-col-phone.mdl-cell--7-col-phone{width:100%}.mdl-cell--8-col,.mdl-cell--8-col-phone.mdl-cell--8-col-phone{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--8-col,.mdl-grid--no-spacing>.mdl-cell--8-col-phone.mdl-cell--8-col-phone{width:100%}.mdl-cell--9-col,.mdl-cell--9-col-phone.mdl-cell--9-col-phone{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--9-col,.mdl-grid--no-spacing>.mdl-cell--9-col-phone.mdl-cell--9-col-phone{width:100%}.mdl-cell--10-col,.mdl-cell--10-col-phone.mdl-cell--10-col-phone{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--10-col,.mdl-grid--no-spacing>.mdl-cell--10-col-phone.mdl-cell--10-col-phone{width:100%}.mdl-cell--11-col,.mdl-cell--11-col-phone.mdl-cell--11-col-phone{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--11-col,.mdl-grid--no-spacing>.mdl-cell--11-col-phone.mdl-cell--11-col-phone{width:100%}.mdl-cell--12-col,.mdl-cell--12-col-phone.mdl-cell--12-col-phone{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--12-col,.mdl-grid--no-spacing>.mdl-cell--12-col-phone.mdl-cell--12-col-phone{width:100%}.mdl-cell--1-offset,.mdl-cell--1-offset-phone.mdl-cell--1-offset-phone{margin-left:calc(25% + 
8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--1-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--1-offset-phone.mdl-cell--1-offset-phone{margin-left:25%}.mdl-cell--2-offset,.mdl-cell--2-offset-phone.mdl-cell--2-offset-phone{margin-left:calc(50% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--2-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--2-offset-phone.mdl-cell--2-offset-phone{margin-left:50%}.mdl-cell--3-offset,.mdl-cell--3-offset-phone.mdl-cell--3-offset-phone{margin-left:calc(75% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--3-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--3-offset-phone.mdl-cell--3-offset-phone{margin-left:75%}}@media (min-width:480px) and (max-width:839px){.mdl-grid{padding:8px}.mdl-cell{margin:8px;width:calc(50% - 16px)}.mdl-grid--no-spacing>.mdl-cell{width:50%}.mdl-cell--hide-tablet{display:none!important}.mdl-cell--order-1-tablet.mdl-cell--order-1-tablet{-webkit-order:1;-ms-flex-order:1;order:1}.mdl-cell--order-2-tablet.mdl-cell--order-2-tablet{-webkit-order:2;-ms-flex-order:2;order:2}.mdl-cell--order-3-tablet.mdl-cell--order-3-tablet{-webkit-order:3;-ms-flex-order:3;order:3}.mdl-cell--order-4-tablet.mdl-cell--order-4-tablet{-webkit-order:4;-ms-flex-order:4;order:4}.mdl-cell--order-5-tablet.mdl-cell--order-5-tablet{-webkit-order:5;-ms-flex-order:5;order:5}.mdl-cell--order-6-tablet.mdl-cell--order-6-tablet{-webkit-order:6;-ms-flex-order:6;order:6}.mdl-cell--order-7-tablet.mdl-cell--order-7-tablet{-webkit-order:7;-ms-flex-order:7;order:7}.mdl-cell--order-8-tablet.mdl-cell--order-8-tablet{-webkit-order:8;-ms-flex-order:8;order:8}.mdl-cell--order-9-tablet.mdl-cell--order-9-tablet{-webkit-order:9;-ms-flex-order:9;order:9}.mdl-cell--order-10-tablet.mdl-cell--order-10-tablet{-webkit-order:10;-ms-flex-order:10;order:10}.mdl-cell--order-11-tablet.mdl-cell--order-11-tablet{-webkit-order:11;-ms-flex-order:11;order:11}.mdl-cell--order-12-tablet.mdl-cell--order-12-tablet{-webkit-order:12;-ms-flex-order:12;order:12}.mdl-cell--1-col,.mdl-cell--1
-col-tablet.mdl-cell--1-col-tablet{width:calc(12.5% - 16px)}.mdl-grid--no-spacing>.mdl-cell--1-col,.mdl-grid--no-spacing>.mdl-cell--1-col-tablet.mdl-cell--1-col-tablet{width:12.5%}.mdl-cell--2-col,.mdl-cell--2-col-tablet.mdl-cell--2-col-tablet{width:calc(25% - 16px)}.mdl-grid--no-spacing>.mdl-cell--2-col,.mdl-grid--no-spacing>.mdl-cell--2-col-tablet.mdl-cell--2-col-tablet{width:25%}.mdl-cell--3-col,.mdl-cell--3-col-tablet.mdl-cell--3-col-tablet{width:calc(37.5% - 16px)}.mdl-grid--no-spacing>.mdl-cell--3-col,.mdl-grid--no-spacing>.mdl-cell--3-col-tablet.mdl-cell--3-col-tablet{width:37.5%}.mdl-cell--4-col,.mdl-cell--4-col-tablet.mdl-cell--4-col-tablet{width:calc(50% - 16px)}.mdl-grid--no-spacing>.mdl-cell--4-col,.mdl-grid--no-spacing>.mdl-cell--4-col-tablet.mdl-cell--4-col-tablet{width:50%}.mdl-cell--5-col,.mdl-cell--5-col-tablet.mdl-cell--5-col-tablet{width:calc(62.5% - 16px)}.mdl-grid--no-spacing>.mdl-cell--5-col,.mdl-grid--no-spacing>.mdl-cell--5-col-tablet.mdl-cell--5-col-tablet{width:62.5%}.mdl-cell--6-col,.mdl-cell--6-col-tablet.mdl-cell--6-col-tablet{width:calc(75% - 16px)}.mdl-grid--no-spacing>.mdl-cell--6-col,.mdl-grid--no-spacing>.mdl-cell--6-col-tablet.mdl-cell--6-col-tablet{width:75%}.mdl-cell--7-col,.mdl-cell--7-col-tablet.mdl-cell--7-col-tablet{width:calc(87.5% - 16px)}.mdl-grid--no-spacing>.mdl-cell--7-col,.mdl-grid--no-spacing>.mdl-cell--7-col-tablet.mdl-cell--7-col-tablet{width:87.5%}.mdl-cell--8-col,.mdl-cell--8-col-tablet.mdl-cell--8-col-tablet{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--8-col,.mdl-grid--no-spacing>.mdl-cell--8-col-tablet.mdl-cell--8-col-tablet{width:100%}.mdl-cell--9-col,.mdl-cell--9-col-tablet.mdl-cell--9-col-tablet{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--9-col,.mdl-grid--no-spacing>.mdl-cell--9-col-tablet.mdl-cell--9-col-tablet{width:100%}.mdl-cell--10-col,.mdl-cell--10-col-tablet.mdl-cell--10-col-tablet{width:calc(100% - 
16px)}.mdl-grid--no-spacing>.mdl-cell--10-col,.mdl-grid--no-spacing>.mdl-cell--10-col-tablet.mdl-cell--10-col-tablet{width:100%}.mdl-cell--11-col,.mdl-cell--11-col-tablet.mdl-cell--11-col-tablet{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--11-col,.mdl-grid--no-spacing>.mdl-cell--11-col-tablet.mdl-cell--11-col-tablet{width:100%}.mdl-cell--12-col,.mdl-cell--12-col-tablet.mdl-cell--12-col-tablet{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--12-col,.mdl-grid--no-spacing>.mdl-cell--12-col-tablet.mdl-cell--12-col-tablet{width:100%}.mdl-cell--1-offset,.mdl-cell--1-offset-tablet.mdl-cell--1-offset-tablet{margin-left:calc(12.5% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--1-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--1-offset-tablet.mdl-cell--1-offset-tablet{margin-left:12.5%}.mdl-cell--2-offset,.mdl-cell--2-offset-tablet.mdl-cell--2-offset-tablet{margin-left:calc(25% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--2-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--2-offset-tablet.mdl-cell--2-offset-tablet{margin-left:25%}.mdl-cell--3-offset,.mdl-cell--3-offset-tablet.mdl-cell--3-offset-tablet{margin-left:calc(37.5% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--3-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--3-offset-tablet.mdl-cell--3-offset-tablet{margin-left:37.5%}.mdl-cell--4-offset,.mdl-cell--4-offset-tablet.mdl-cell--4-offset-tablet{margin-left:calc(50% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--4-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--4-offset-tablet.mdl-cell--4-offset-tablet{margin-left:50%}.mdl-cell--5-offset,.mdl-cell--5-offset-tablet.mdl-cell--5-offset-tablet{margin-left:calc(62.5% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--5-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--5-offset-tablet.mdl-cell--5-offset-tablet{margin-left:62.5%}.mdl-cell--6-offset,.mdl-cell--6-offset-tablet.mdl-cell--6-offset-tablet{margin-left:calc(75% + 
8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--6-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--6-offset-tablet.mdl-cell--6-offset-tablet{margin-left:75%}.mdl-cell--7-offset,.mdl-cell--7-offset-tablet.mdl-cell--7-offset-tablet{margin-left:calc(87.5% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--7-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--7-offset-tablet.mdl-cell--7-offset-tablet{margin-left:87.5%}}@media (min-width:840px){.mdl-grid{padding:8px}.mdl-cell{margin:8px;width:calc(33.3333333333% - 16px)}.mdl-grid--no-spacing>.mdl-cell{width:33.3333333333%}.mdl-cell--hide-desktop{display:none!important}.mdl-cell--order-1-desktop.mdl-cell--order-1-desktop{-webkit-order:1;-ms-flex-order:1;order:1}.mdl-cell--order-2-desktop.mdl-cell--order-2-desktop{-webkit-order:2;-ms-flex-order:2;order:2}.mdl-cell--order-3-desktop.mdl-cell--order-3-desktop{-webkit-order:3;-ms-flex-order:3;order:3}.mdl-cell--order-4-desktop.mdl-cell--order-4-desktop{-webkit-order:4;-ms-flex-order:4;order:4}.mdl-cell--order-5-desktop.mdl-cell--order-5-desktop{-webkit-order:5;-ms-flex-order:5;order:5}.mdl-cell--order-6-desktop.mdl-cell--order-6-desktop{-webkit-order:6;-ms-flex-order:6;order:6}.mdl-cell--order-7-desktop.mdl-cell--order-7-desktop{-webkit-order:7;-ms-flex-order:7;order:7}.mdl-cell--order-8-desktop.mdl-cell--order-8-desktop{-webkit-order:8;-ms-flex-order:8;order:8}.mdl-cell--order-9-desktop.mdl-cell--order-9-desktop{-webkit-order:9;-ms-flex-order:9;order:9}.mdl-cell--order-10-desktop.mdl-cell--order-10-desktop{-webkit-order:10;-ms-flex-order:10;order:10}.mdl-cell--order-11-desktop.mdl-cell--order-11-desktop{-webkit-order:11;-ms-flex-order:11;order:11}.mdl-cell--order-12-desktop.mdl-cell--order-12-desktop{-webkit-order:12;-ms-flex-order:12;order:12}.mdl-cell--1-col,.mdl-cell--1-col-desktop.mdl-cell--1-col-desktop{width:calc(8.3333333333% - 
16px)}.mdl-grid--no-spacing>.mdl-cell--1-col,.mdl-grid--no-spacing>.mdl-cell--1-col-desktop.mdl-cell--1-col-desktop{width:8.3333333333%}.mdl-cell--2-col,.mdl-cell--2-col-desktop.mdl-cell--2-col-desktop{width:calc(16.6666666667% - 16px)}.mdl-grid--no-spacing>.mdl-cell--2-col,.mdl-grid--no-spacing>.mdl-cell--2-col-desktop.mdl-cell--2-col-desktop{width:16.6666666667%}.mdl-cell--3-col,.mdl-cell--3-col-desktop.mdl-cell--3-col-desktop{width:calc(25% - 16px)}.mdl-grid--no-spacing>.mdl-cell--3-col,.mdl-grid--no-spacing>.mdl-cell--3-col-desktop.mdl-cell--3-col-desktop{width:25%}.mdl-cell--4-col,.mdl-cell--4-col-desktop.mdl-cell--4-col-desktop{width:calc(33.3333333333% - 16px)}.mdl-grid--no-spacing>.mdl-cell--4-col,.mdl-grid--no-spacing>.mdl-cell--4-col-desktop.mdl-cell--4-col-desktop{width:33.3333333333%}.mdl-cell--5-col,.mdl-cell--5-col-desktop.mdl-cell--5-col-desktop{width:calc(41.6666666667% - 16px)}.mdl-grid--no-spacing>.mdl-cell--5-col,.mdl-grid--no-spacing>.mdl-cell--5-col-desktop.mdl-cell--5-col-desktop{width:41.6666666667%}.mdl-cell--6-col,.mdl-cell--6-col-desktop.mdl-cell--6-col-desktop{width:calc(50% - 16px)}.mdl-grid--no-spacing>.mdl-cell--6-col,.mdl-grid--no-spacing>.mdl-cell--6-col-desktop.mdl-cell--6-col-desktop{width:50%}.mdl-cell--7-col,.mdl-cell--7-col-desktop.mdl-cell--7-col-desktop{width:calc(58.3333333333% - 16px)}.mdl-grid--no-spacing>.mdl-cell--7-col,.mdl-grid--no-spacing>.mdl-cell--7-col-desktop.mdl-cell--7-col-desktop{width:58.3333333333%}.mdl-cell--8-col,.mdl-cell--8-col-desktop.mdl-cell--8-col-desktop{width:calc(66.6666666667% - 16px)}.mdl-grid--no-spacing>.mdl-cell--8-col,.mdl-grid--no-spacing>.mdl-cell--8-col-desktop.mdl-cell--8-col-desktop{width:66.6666666667%}.mdl-cell--9-col,.mdl-cell--9-col-desktop.mdl-cell--9-col-desktop{width:calc(75% - 
16px)}.mdl-grid--no-spacing>.mdl-cell--9-col,.mdl-grid--no-spacing>.mdl-cell--9-col-desktop.mdl-cell--9-col-desktop{width:75%}.mdl-cell--10-col,.mdl-cell--10-col-desktop.mdl-cell--10-col-desktop{width:calc(83.3333333333% - 16px)}.mdl-grid--no-spacing>.mdl-cell--10-col,.mdl-grid--no-spacing>.mdl-cell--10-col-desktop.mdl-cell--10-col-desktop{width:83.3333333333%}.mdl-cell--11-col,.mdl-cell--11-col-desktop.mdl-cell--11-col-desktop{width:calc(91.6666666667% - 16px)}.mdl-grid--no-spacing>.mdl-cell--11-col,.mdl-grid--no-spacing>.mdl-cell--11-col-desktop.mdl-cell--11-col-desktop{width:91.6666666667%}.mdl-cell--12-col,.mdl-cell--12-col-desktop.mdl-cell--12-col-desktop{width:calc(100% - 16px)}.mdl-grid--no-spacing>.mdl-cell--12-col,.mdl-grid--no-spacing>.mdl-cell--12-col-desktop.mdl-cell--12-col-desktop{width:100%}.mdl-cell--1-offset,.mdl-cell--1-offset-desktop.mdl-cell--1-offset-desktop{margin-left:calc(8.3333333333% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--1-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--1-offset-desktop.mdl-cell--1-offset-desktop{margin-left:8.3333333333%}.mdl-cell--2-offset,.mdl-cell--2-offset-desktop.mdl-cell--2-offset-desktop{margin-left:calc(16.6666666667% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--2-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--2-offset-desktop.mdl-cell--2-offset-desktop{margin-left:16.6666666667%}.mdl-cell--3-offset,.mdl-cell--3-offset-desktop.mdl-cell--3-offset-desktop{margin-left:calc(25% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--3-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--3-offset-desktop.mdl-cell--3-offset-desktop{margin-left:25%}.mdl-cell--4-offset,.mdl-cell--4-offset-desktop.mdl-cell--4-offset-desktop{margin-left:calc(33.3333333333% + 
8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--4-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--4-offset-desktop.mdl-cell--4-offset-desktop{margin-left:33.3333333333%}.mdl-cell--5-offset,.mdl-cell--5-offset-desktop.mdl-cell--5-offset-desktop{margin-left:calc(41.6666666667% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--5-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--5-offset-desktop.mdl-cell--5-offset-desktop{margin-left:41.6666666667%}.mdl-cell--6-offset,.mdl-cell--6-offset-desktop.mdl-cell--6-offset-desktop{margin-left:calc(50% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--6-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--6-offset-desktop.mdl-cell--6-offset-desktop{margin-left:50%}.mdl-cell--7-offset,.mdl-cell--7-offset-desktop.mdl-cell--7-offset-desktop{margin-left:calc(58.3333333333% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--7-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--7-offset-desktop.mdl-cell--7-offset-desktop{margin-left:58.3333333333%}.mdl-cell--8-offset,.mdl-cell--8-offset-desktop.mdl-cell--8-offset-desktop{margin-left:calc(66.6666666667% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--8-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--8-offset-desktop.mdl-cell--8-offset-desktop{margin-left:66.6666666667%}.mdl-cell--9-offset,.mdl-cell--9-offset-desktop.mdl-cell--9-offset-desktop{margin-left:calc(75% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--9-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--9-offset-desktop.mdl-cell--9-offset-desktop{margin-left:75%}.mdl-cell--10-offset,.mdl-cell--10-offset-desktop.mdl-cell--10-offset-desktop{margin-left:calc(83.3333333333% + 8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--10-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--10-offset-desktop.mdl-cell--10-offset-desktop{margin-left:83.3333333333%}.mdl-cell--11-offset,.mdl-cell--11-offset-desktop.mdl-cell--11-offset-desktop{margin-left:calc(91.6666666667% + 
8px)}.mdl-grid.mdl-grid--no-spacing>.mdl-cell--11-offset,.mdl-grid.mdl-grid--no-spacing>.mdl-cell--11-offset-desktop.mdl-cell--11-offset-desktop{margin-left:91.6666666667%}}body{margin:0}.styleguide-demo h1{margin:48px 24px 0}.styleguide-demo h1:after{content:'';display:block;width:100%;border-bottom:1px solid rgba(0,0,0,.5);margin-top:24px}.styleguide-demo{opacity:0;transition:opacity .6s ease}.styleguide-masthead{height:256px;background:#212121;padding:115px 16px 0}.styleguide-container{position:relative;max-width:960px;width:100%}.styleguide-title{color:#fff;bottom:auto;position:relative;font-size:56px;font-weight:300;line-height:1;letter-spacing:-.02em}.styleguide-title:after{border-bottom:0}.styleguide-title span{font-weight:300}.mdl-styleguide .mdl-layout__drawer .mdl-navigation__link{padding:10px 24px}.demosLoaded .styleguide-demo{opacity:1}iframe{display:block;width:100%;border:none}iframe.heightSet{overflow:hidden}.demo-wrapper{margin:24px}.demo-wrapper iframe{border:1px solid rgba(0,0,0,.5)}
\ No newline at end of file
diff --git a/samples/nvidia-resnet/components/webapp/src/templates/index.html b/samples/nvidia-resnet/components/webapp/src/templates/index.html
new file mode 100644
index 000000000000..7b81ccadf512
--- /dev/null
+++ b/samples/nvidia-resnet/components/webapp/src/templates/index.html
@@ -0,0 +1,119 @@
+<!DOCTYPE html>
+<html>
+<head>
+  <meta charset="utf-8">
+  <title>Kubeflow UI</title>
+  <!-- stylesheet and script includes elided -->
+</head>
+<body>
+  <h1>End2end Resnet50 Using NVIDIA TensorRT Inference Server, TF-AMP and TensorRT</h1>
+
+  <form method="POST">
+    <h2>TRTIS Model Server</h2>
+    <!-- text inputs for the server address and port elided -->
+    <span>Input is not a valid port</span>
+    <!-- submit button elided -->
+
+    {% if connection.success %}
+    <div>&#10003; {{ connection.text }}</div>
+    {% else %}
+    <div>&#10071; {{ connection.text }}</div>
+    {% endif %}
+  </form>
+
+  {% if output %}
+  <h2>Test Results</h2>
+  <table>
+    <tr>
+      <td>Truth</td>
+      <td>{{ output.truth }}</td>
+    </tr>
+    <tr>
+      <td>Prediction</td>
+      <td>{{ output.prediction }}</td>
+    </tr>
+    {% for score in output.scores %}
+    <tr>
+      <td>Probability {{ score.index }}:</td>
+      <td>{{ score.val }}</td>
+    </tr>
+    {% endfor %}
+  </table>
+  {% endif %}
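The Jinja2 template above dereferences `connection.success`, `connection.text`, `output.truth`, `output.prediction`, and per-score `score.index` / `score.val` — the latter matching the `{"index": ..., "val": ...}` dicts built by `postprocess()` in `trtis_client.py` below. A minimal sketch of the render context a Flask view could assemble; the helper name and the default status string are hypothetical, not part of this PR:

```python
from types import SimpleNamespace


def make_template_context(truth, prediction, scores, connected=True,
                          text="Connected to TRTIS"):
    """Hypothetical helper: build the objects index.html dereferences.

    `scores` mirrors what postprocess() returns: a list of
    {"index": label, "val": probability} dicts.
    """
    connection = SimpleNamespace(success=connected, text=text)
    output = SimpleNamespace(
        truth=truth,
        prediction=prediction,
        # SimpleNamespace gives the attribute access the template uses
        # (score.index, score.val); plain dicts would also work, since
        # Jinja2 falls back to item lookup when attribute lookup fails.
        scores=[SimpleNamespace(**s) for s in scores],
    )
    return {"connection": connection, "output": output}


ctx = make_template_context(
    truth="cat",
    prediction="cat",
    scores=[{"index": "cat", "val": 0.92}, {"index": "dog", "val": 0.05}],
)
print(ctx["output"].prediction)     # cat
print(ctx["output"].scores[1].val)  # 0.05
```

With Flask, such a context would be handed to the template as `render_template('index.html', **ctx)`.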
+</body>
+</html>
diff --git a/samples/nvidia-resnet/components/webapp/src/trtis_client.py b/samples/nvidia-resnet/components/webapp/src/trtis_client.py
new file mode 100644
index 000000000000..0e33b9362326
--- /dev/null
+++ b/samples/nvidia-resnet/components/webapp/src/trtis_client.py
@@ -0,0 +1,315 @@
+#!/usr/bin/env python2.7
+'''
+Copyright 2018 Google LLC
+
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+    https://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+'''
+
+from __future__ import print_function
+
+import numpy as np
+import os
+import random
+from builtins import range
+from functools import partial
+import grpc
+
+from tensorrtserver.api import api_pb2
+from tensorrtserver.api import grpc_service_pb2
+from tensorrtserver.api import grpc_service_pb2_grpc
+import tensorrtserver.api.model_config_pb2 as model_config
+
+from PIL import Image
+
+
+def model_dtype_to_np(model_dtype):
+    if model_dtype == model_config.TYPE_BOOL:
+        return np.bool
+    elif model_dtype == model_config.TYPE_INT8:
+        return np.int8
+    elif model_dtype == model_config.TYPE_INT16:
+        return np.int16
+    elif model_dtype == model_config.TYPE_INT32:
+        return np.int32
+    elif model_dtype == model_config.TYPE_INT64:
+        return np.int64
+    elif model_dtype == model_config.TYPE_UINT8:
+        return np.uint8
+    elif model_dtype == model_config.TYPE_UINT16:
+        return np.uint16
+    elif model_dtype == model_config.TYPE_FP16:
+        return np.float16
+    elif model_dtype == model_config.TYPE_FP32:
+        return np.float32
+    elif model_dtype == model_config.TYPE_FP64:
+        return np.float64
+    elif model_dtype == model_config.TYPE_STRING:
+        return np.dtype(object)
+    return None
+
+
+def parse_model(status, model_name, batch_size, verbose=False):
+    """
+    Check the configuration of a model to make sure it meets the
+    requirements for an image classification network (as expected by
+    this client)
+    """
+    server_status = status.server_status
+    if model_name not in server_status.model_status.keys():
+        raise Exception("unable to get status for '" + model_name + "'")
+
+    status = server_status.model_status[model_name]
+    config = status.config
+
+    if len(config.input) != 1:
+        raise Exception("expecting 1 input, got {}".format(len(config.input)))
+    if len(config.output) != 1:
+        raise Exception("expecting 1 output, got {}".format(len(config.output)))
+
+    input = config.input[0]
+    output = config.output[0]
+
+    if output.data_type != model_config.TYPE_FP32:
+        raise Exception("expecting output datatype to be TYPE_FP32, model '" +
+                        model_name + "' output type is " +
+                        model_config.DataType.Name(output.data_type))
+
+    # Output is expected to be a vector. But allow any number of
+    # dimensions as long as all but 1 is size 1 (e.g. { 10 }, { 1, 10
+    # }, { 10, 1, 1 } are all ok).
+    non_one_cnt = 0
+    for dim in output.dims:
+        if dim > 1:
+            non_one_cnt += 1
+            if non_one_cnt > 1:
+                raise Exception("expecting model output to be a vector")
+
+    # Model specifying maximum batch size of 0 indicates that batching
+    # is not supported and so the input tensors do not expect an "N"
+    # dimension (and 'batch_size' should be 1 so that only a single
+    # image instance is inferred at a time).
+    max_batch_size = config.max_batch_size
+    if max_batch_size == 0:
+        if batch_size != 1:
+            raise Exception("batching not supported for model '" + model_name + "'")
+    else:  # max_batch_size > 0
+        if batch_size > max_batch_size:
+            raise Exception(
+                "expecting batch size <= {} for model '{}'".format(max_batch_size, model_name))
+
+    # Model input must have 3 dims, either CHW or HWC
+    if len(input.dims) != 3:
+        raise Exception(
+            "expecting input to have 3 dimensions, model '{}' input has {}".format(
+                model_name, len(input.dims)))
+
+    if ((input.format != model_config.ModelInput.FORMAT_NCHW) and
+            (input.format != model_config.ModelInput.FORMAT_NHWC)):
+        raise Exception("unexpected input format " +
+                        model_config.ModelInput.Format.Name(input.format) +
+                        ", expecting " +
+                        model_config.ModelInput.Format.Name(model_config.ModelInput.FORMAT_NCHW) +
+                        " or " +
+                        model_config.ModelInput.Format.Name(model_config.ModelInput.FORMAT_NHWC))
+
+    if input.format == model_config.ModelInput.FORMAT_NHWC:
+        h = input.dims[0]
+        w = input.dims[1]
+        c = input.dims[2]
+    else:
+        c = input.dims[0]
+        h = input.dims[1]
+        w = input.dims[2]
+
+    return (input.name, output.name, c, h, w, input.format, model_dtype_to_np(input.data_type))
+
+
+def preprocess(img, format, dtype, c, h, w):
+    """
+    Pre-process an image to meet the size, type and format
+    requirements specified by the parameters.
+    """
+    # np.set_printoptions(threshold='nan')
+
+    if c == 1:
+        sample_img = img.convert('L')
+    else:
+        sample_img = img.convert('RGB')
+
+    resized_img = sample_img.resize((w, h), Image.BILINEAR)
+    resized = np.array(resized_img)
+    if resized.ndim == 2:
+        resized = resized[:, :, np.newaxis]
+
+    typed = resized.astype(dtype)
+
+    scaled = (typed / 255) - 0.5
+
+    # Channels are in RGB order. Currently model configuration data
+    # doesn't provide any information as to other channel orderings
+    # (like BGR) so we just assume RGB.
+ return scaled + + +def postprocess(results, filenames, batch_size): + """ + Post-process results to show classifications. + """ + if len(results) != 1: + raise Exception("expected 1 result, got {}".format(len(results))) + + batched_result = results[0].batch_classes + if len(batched_result) != batch_size: + raise Exception("expected {} results, got {}".format(batch_size, len(batched_result))) + if len(filenames) != batch_size: + raise Exception("expected {} filenames, got {}".format(batch_size, len(filenames))) + + label, score = [], [] + # batch_size is always 1 here; this loop needs modification to support larger batch sizes + for (index, result) in enumerate(batched_result): + print("Image '{}':".format(filenames[index])) + for cls in result.cls: + label.append(cls.label) + score += [{"index": cls.label, "val": cls.value}] + print(" {} ({}) = {}".format(cls.idx, cls.label, cls.value)) + return label[0], score + + +def requestGenerator(input_name, output_name, c, h, w, format, dtype, model_name, model_version, image_filename, + result_filenames): + # Prepare request for Infer gRPC + # The meta data part can be reused across requests + request = grpc_service_pb2.InferRequest() + request.model_name = model_name + if model_version is None: + request.model_version = -1 + else: + request.model_version = model_version + # A batch size could optionally be passed in to generate requests over a set of image files; needs refactoring + batch_size = 1 + request.meta_data.batch_size = batch_size + output_message = api_pb2.InferRequestHeader.Output() + output_message.name = output_name + # Number of class results to report. Default is 10 to match the demo.
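The preprocess() function shown above resizes the image and maps its 8-bit pixel values into [-0.5, 0.5]. A minimal NumPy sketch of just the scaling step (the function name is illustrative):

```python
import numpy as np

# Hypothetical standalone version of the scaling done in preprocess() above:
# cast to the model's dtype, then map [0, 255] pixel values to [-0.5, 0.5].
def scale_pixels(img_array, dtype=np.float32):
    typed = img_array.astype(dtype)
    return (typed / 255) - 0.5

pixels = np.array([[0, 127, 255]], dtype=np.uint8)
scaled = scale_pixels(pixels)
print(scaled.min(), scaled.max())  # -0.5 0.5
```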
+ output_message.cls.count = 10 + request.meta_data.output.extend([output_message]) + + filenames = [] + if os.path.isdir(image_filename): + filenames = [os.path.join(image_filename, f) + for f in os.listdir(image_filename) + if os.path.isfile(os.path.join(image_filename, f))] + else: + filenames = [image_filename, ] + + filenames.sort() + + # Preprocess the images into input data according to model + # requirements + image_data = [] + for filename in filenames: + img = Image.open(filename) + image_data.append(preprocess(img, format, dtype, c, h, w)) + + request.meta_data.input.add(name=input_name) + + # Send requests of batch_size images. If the number of + # images isn't an exact multiple of batch_size then just + # start over with the first images until the batch is filled. + image_idx = 0 + last_request = False + while not last_request: + input_bytes = None + input_filenames = [] + del request.raw_input[:] + for idx in range(batch_size): + input_filenames.append(filenames[image_idx]) + if input_bytes is None: + input_bytes = image_data[image_idx].tobytes() + else: + input_bytes += image_data[image_idx].tobytes() + + image_idx = (image_idx + 1) % len(image_data) + if image_idx == 0: + last_request = True + + request.raw_input.extend([input_bytes]) + result_filenames.append(input_filenames) + yield request + + +def get_prediction(image_filename, server_host='localhost', server_port=8001, + model_name="end2end-demo", model_version=None): + """ + Retrieve a prediction from a TensorRT inference server + + :param image_filename: an end2end-demo image, or a directory of such images + :param server_host: the address of the TensorRT inference server + :param server_port: the port used by the server + :param model_name: the name of the model + :param model_version: the version of the model (None selects the latest) + :return 0: the integer predicted in the end2end-demo image + :return 1: the confidence scores for all classes + """ + channel = grpc.insecure_channel(server_host + ':' + str(server_port)) + grpc_stub
= grpc_service_pb2_grpc.GRPCServiceStub(channel) + + # Prepare request for Status gRPC + request = grpc_service_pb2.StatusRequest(model_name=model_name) + # Call and receive response from Status gRPC + response = grpc_stub.Status(request) + # Make sure the model matches our requirements, and get some + # properties of the model that we need for preprocessing + batch_size = 1 + verbose = False + input_name, output_name, c, h, w, format, dtype = parse_model( + response, model_name, batch_size, verbose) + + filledRequestGenerator = partial(requestGenerator, input_name, output_name, c, h, w, format, dtype, model_name, + model_version, image_filename) + + # Send requests of batch_size images. If the number of + # images isn't an exact multiple of batch_size then just + # start over with the first images until the batch is filled. + result_filenames = [] + responses = [] + + # Send each request and collect the response synchronously + for request in filledRequestGenerator(result_filenames): + responses.append(grpc_stub.Infer(request)) + + idx = 0 + for response in responses: + print("Request {}, batch size {}".format(idx, batch_size)) + label, score = postprocess(response.meta_data.output, result_filenames[idx], batch_size) + idx += 1 + + return label, score + + +def random_image(img_path='/workspace/web_server/static/images'): + """ + Pull a random image out of the small end2end-demo dataset + + :param img_path: the directory to pull a random image from + :return 0: the path of the file selected + :return 1: the label selected + :return 2: the web-relative path of the file selected + """ + random_dir = random.choice(os.listdir(img_path)) + random_file = random.choice(os.listdir(img_path + '/' + random_dir)) + + return img_path + '/' + random_dir + '/' + random_file, random_dir, 'static/images' + '/' + random_dir + '/' + random_file diff --git a/samples/nvidia-resnet/components/webapp_launcher/Dockerfile b/samples/nvidia-resnet/components/webapp_launcher/Dockerfile new file mode 100644 index 000000000000..8f4884ba3868 --- /dev/null +++ b/samples/nvidia-resnet/components/webapp_launcher/Dockerfile @@ -0,0 +1,32 @@ +# Copyright 2018 Google Inc. All Rights Reserved. +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License.
+ +FROM ubuntu:16.04 + +RUN apt-get update -y && \ + apt-get install --no-install-recommends -y -q ca-certificates curl python-dev python-setuptools wget unzip +RUN easy_install pip && \ + pip install pyyaml six requests + +# Install kubectl +RUN curl -LO https://storage.googleapis.com/kubernetes-release/release/$(curl -s https://storage.googleapis.com/kubernetes-release/release/stable.txt)/bin/linux/amd64/kubectl +RUN chmod +x ./kubectl +RUN mv ./kubectl /usr/local/bin + +ADD src /workspace +WORKDIR /workspace + +ENTRYPOINT ["python", "deploy_webapp.py"] + diff --git a/samples/nvidia-resnet/components/webapp_launcher/build.sh b/samples/nvidia-resnet/components/webapp_launcher/build.sh new file mode 100755 index 000000000000..f5afd4682dfc --- /dev/null +++ b/samples/nvidia-resnet/components/webapp_launcher/build.sh @@ -0,0 +1,19 @@ +#!/bin/bash +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +IMAGE= + +docker build -t $IMAGE . +docker push $IMAGE diff --git a/samples/nvidia-resnet/components/webapp_launcher/src/deploy_webapp.py b/samples/nvidia-resnet/components/webapp_launcher/src/deploy_webapp.py new file mode 100644 index 000000000000..c03134dce5d6 --- /dev/null +++ b/samples/nvidia-resnet/components/webapp_launcher/src/deploy_webapp.py @@ -0,0 +1,71 @@ +# Copyright 2018 Google Inc. All Rights Reserved. +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. 
+# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import argparse +import os +import logging +import subprocess +import requests + + +KUBEFLOW_NAMESPACE = 'kubeflow' +YAML_TEMPLATE = 'webapp-service-template.yaml' +YAML_FILE = 'webapp-service.yaml' + + +def main(): + parser = argparse.ArgumentParser(description='Webapp launcher') + parser.add_argument('--trtserver_name', help='Name of trtis service') + parser.add_argument('--workflow_name', help='Workflow name') + parser.add_argument('--model_name', help='Name of default model') + parser.add_argument('--model_version', help='Model version') + parser.add_argument('--webapp_prefix', + help='Webapp prefix as subpath of Kubeflow UI') + parser.add_argument( + '--webapp_port', help='Webapp port inside the Kubernetes cluster') + + args = parser.parse_args() + + print("using model name: %s and namespace: %s" % + (args.model_name, KUBEFLOW_NAMESPACE)) + + logging.getLogger().setLevel(logging.INFO) + logging.info('Generating webapp service template') + + template_file = os.path.join(os.path.dirname( + os.path.realpath(__file__)), YAML_TEMPLATE) + target_file = os.path.join(os.path.dirname( + os.path.realpath(__file__)), YAML_FILE) + + with open(template_file, 'r') as template: + with open(target_file, "w") as target: + data = template.read() + changed = data.replace('MODEL_PASSIN_NAME', args.model_name) + changed1 = changed.replace( + 'KUBEFLOW_NAMESPACE', KUBEFLOW_NAMESPACE) + changed2 = changed1.replace( + 
'MODEL_PASSIN_VERSION', args.model_version) + changed3 = changed2.replace('TRTSERVER_NAME', args.trtserver_name) + changed4 = changed3.replace('WORKFLOW_NAME', args.workflow_name) + changed5 = changed4.replace('WEBAPP_PREFIX', args.webapp_prefix) + changed6 = changed5.replace('WEBAPP_PORT', args.webapp_port) + target.write(changed6) + + subprocess.call(['kubectl', 'apply', '-f', YAML_FILE]) + logging.info('Deploying webapp service') + + +if __name__ == "__main__": + main() diff --git a/samples/nvidia-resnet/components/webapp_launcher/src/webapp-service-template.yaml b/samples/nvidia-resnet/components/webapp_launcher/src/webapp-service-template.yaml new file mode 100644 index 000000000000..714ce8ed94f3 --- /dev/null +++ b/samples/nvidia-resnet/components/webapp_launcher/src/webapp-service-template.yaml @@ -0,0 +1,56 @@ +apiVersion: v1 +kind: Service +metadata: + annotations: + getambassador.io/config: |- + --- + apiVersion: ambassador/v0 + kind: Mapping + name: webapp-WORKFLOW_NAME + prefix: /WEBAPP_PREFIX/ + rewrite: / + timeout_ms: 1200000 + service: webappsvc.KUBEFLOW_NAMESPACE:WEBAPP_PORT + name: webappsvc + labels: + app: demo-client-ui + role: frontend +spec: + type: ClusterIP + ports: + - port: WEBAPP_PORT + targetPort: "http-server" + selector: + app: demo-client-ui + role: frontend + +--- + +apiVersion: extensions/v1beta1 +kind: Deployment +metadata: + name: webapp +spec: + replicas: 1 + template: + metadata: + labels: + app: demo-client-ui + role: frontend + spec: + containers: + - name: webapp + image: + imagePullPolicy: Always + env: + - name: TRTSERVER_HOST + value: TRTSERVER_NAME.KUBEFLOW_NAMESPACE + - name: MODEL_SERVE_NAME + value: MODEL_PASSIN_NAME + - name: MODEL_VERSION + value: "MODEL_PASSIN_VERSION" + - name: TRTSERVER_PORT + value: "8001" + ports: + - name: http-server + containerPort: 8080 diff --git a/samples/nvidia-resnet/install_kubeflow_and_dependencies.sh b/samples/nvidia-resnet/install_kubeflow_and_dependencies.sh index 
0c9664f2bf3e..ff2132af833a 100755 --- a/samples/nvidia-resnet/install_kubeflow_and_dependencies.sh +++ b/samples/nvidia-resnet/install_kubeflow_and_dependencies.sh @@ -1,30 +1,17 @@ #!/bin/bash # Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. # -# Redistribution and use in source and binary forms, with or without -# modification, are permitted provided that the following conditions -# are met: -# * Redistributions of source code must retain the above copyright -# notice, this list of conditions and the following disclaimer. -# * Redistributions in binary form must reproduce the above copyright -# notice, this list of conditions and the following disclaimer in the -# documentation and/or other materials provided with the distribution. -# * Neither the name of NVIDIA CORPORATION nor the names of its -# contributors may be used to endorse or promote products derived -# from this software without specific prior written permission. +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at # -# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY -# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR -# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, -# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, -# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR -# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY -# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
- +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. # Install nvidia-docker 2 sudo tee /etc/docker/daemon.json </dev/null || kubectl apply -f $PV +kubectl replace -f $PVC 2>/dev/null || kubectl apply -f $PVC -docker build -t $IMAGE . && \ -docker run --rm -v $WORK_DIR:$WORK_DIR $IMAGE $CMD && \ -mv -f *.tar.gz ../ +docker build -t $IMAGE . +docker run --rm -v $(pwd)/src:/workspace $IMAGE diff --git a/samples/nvidia-resnet/pipeline/pipeline.py b/samples/nvidia-resnet/pipeline/pipeline.py deleted file mode 100644 index 93ae9c4e4113..000000000000 --- a/samples/nvidia-resnet/pipeline/pipeline.py +++ /dev/null @@ -1,115 +0,0 @@ -# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. -# -# Redistribution and use in source and binary forms, with or without -# modification, are permitted provided that the following conditions -# are met: -# * Redistributions of source code must retain the above copyright -# notice, this list of conditions and the following disclaimer. -# * Redistributions in binary form must reproduce the above copyright -# notice, this list of conditions and the following disclaimer in the -# documentation and/or other materials provided with the distribution. -# * Neither the name of NVIDIA CORPORATION nor the names of its -# contributors may be used to endorse or promote products derived -# from this software without specific prior written permission. -# -# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY -# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -# PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT OWNER OR -# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, -# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, -# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR -# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY -# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - - -import kfp.dsl as dsl -import datetime -import os -from kubernetes import client as k8s_client - - -PIPELINE_NAME = 'resnet_cifar10_pipeline' - - -def preprocess_op(persistent_volume_path, input_dir, output_dir, step_name='preprocess'): - return dsl.ContainerOp( - name=step_name, - image='nvcr.io/nvidian/sae/ananths:kubeflow-preprocess', - command=['python'], - arguments=[ - '/scripts/preprocess.py', - '--input_dir', '%s/%s' % (persistent_volume_path, input_dir), - '--output_dir', '%s/%s' % (persistent_volume_path, output_dir), - ], - file_outputs={} - ) - - -def train_op(persistent_volume_path, input_dir, output_dir, step_name='train'): - return dsl.ContainerOp( - name=step_name, - image='nvcr.io/nvidian/sae/ananths:kubeflow-train', - command=['python'], - arguments=[ - '/scripts/train.py', - '--input_dir', '%s/%s' % (persistent_volume_path, input_dir), - '--output_dir', '%s/%s' % (persistent_volume_path, output_dir), - ], - file_outputs={} - ) - - -def serve_op(persistent_volume_path, input_dir, step_name='serve'): - return dsl.ContainerOp( - name=step_name, - image='nvcr.io/nvidian/sae/ananths:kubeflow-serve', - command=['python3'], - arguments=[ - '/scripts/serve.py', - '--input_dir', '%s/%s' % (persistent_volume_path, input_dir), - ], - file_outputs={} - ) - - -@dsl.pipeline( - name=PIPELINE_NAME, - description='Demonstrate the ResNet50 predict.' 
-) -def resnet_pipeline( - raw_data_dir=dsl.PipelineParam(name='raw-data-dir', value='raw_data'), - processed_data_dir=dsl.PipelineParam(name='processed-data-dir', value='processed_data'), - model_dir=dsl.PipelineParam(name='saved-model-dir', value='saved_model') -): - - op_dict = {} - - persistent_volume_name = 'nvidia-workspace' - persistent_volume_path = '/mnt/workspace/' - - op_dict['preprocess'] = preprocess_op( - persistent_volume_path, raw_data_dir, processed_data_dir) - - op_dict['train'] = train_op( - persistent_volume_path, processed_data_dir, model_dir) - op_dict['train'].after(op_dict['preprocess']) - - op_dict['serve'] = serve_op( - persistent_volume_path, model_dir) - op_dict['serve'].after(op_dict['train']) - - for _, container_op in op_dict.items(): - container_op.add_volume(k8s_client.V1Volume( - host_path=k8s_client.V1HostPathVolumeSource( - path=persistent_volume_path), - name=persistent_volume_name)) - container_op.add_volume_mount(k8s_client.V1VolumeMount( - mount_path=persistent_volume_path, name=persistent_volume_name)) - - -if __name__ == '__main__': - import kfp.compiler as compiler - compiler.Compiler().compile(resnet_pipeline, __file__ + '.tar.gz') diff --git a/samples/nvidia-resnet/pipeline/persistent-volume-claim.yaml b/samples/nvidia-resnet/pipeline/src/persistent-volume-claim.yaml similarity index 81% rename from samples/nvidia-resnet/pipeline/persistent-volume-claim.yaml rename to samples/nvidia-resnet/pipeline/src/persistent-volume-claim.yaml index a76d85ba53e9..697c2088a91f 100644 --- a/samples/nvidia-resnet/pipeline/persistent-volume-claim.yaml +++ b/samples/nvidia-resnet/pipeline/src/persistent-volume-claim.yaml @@ -2,10 +2,11 @@ kind: PersistentVolumeClaim apiVersion: v1 metadata: name: nvidia-workspace-read-claim + namespace: kubeflow spec: storageClassName: manual accessModes: - ReadWriteOnce resources: requests: - storage: 20Gi \ No newline at end of file + storage: 20Gi diff --git 
a/samples/nvidia-resnet/pipeline/persistent-volume.yaml b/samples/nvidia-resnet/pipeline/src/persistent-volume.yaml similarity index 91% rename from samples/nvidia-resnet/pipeline/persistent-volume.yaml rename to samples/nvidia-resnet/pipeline/src/persistent-volume.yaml index 8fbfb4e5d018..65464bbd7c2f 100644 --- a/samples/nvidia-resnet/pipeline/persistent-volume.yaml +++ b/samples/nvidia-resnet/pipeline/src/persistent-volume.yaml @@ -2,6 +2,7 @@ kind: PersistentVolume apiVersion: v1 metadata: name: nvidia-workspace + namespace: kubeflow labels: type: local spec: diff --git a/samples/nvidia-resnet/pipeline/src/pipeline.py b/samples/nvidia-resnet/pipeline/src/pipeline.py new file mode 100644 index 000000000000..9154968d43a1 --- /dev/null +++ b/samples/nvidia-resnet/pipeline/src/pipeline.py @@ -0,0 +1,124 @@ +# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ + +import kfp.dsl as dsl +import datetime +import os +from kubernetes import client as k8s_client + + +# Modify image='' in each op to match IMAGE in the build.sh of its corresponding component + +def PreprocessOp(name, input_dir, output_dir): + return dsl.ContainerOp( + name=name, + image='', + arguments=[ + '--input_dir', input_dir, + '--output_dir', output_dir, + ], + file_outputs={'output': '/output.txt'} + ) + + +def TrainOp(name, input_dir, output_dir, model_name, model_version, epochs): + return dsl.ContainerOp( + name=name, + image='', + arguments=[ + '--input_dir', input_dir, + '--output_dir', output_dir, + '--model_name', model_name, + '--model_version', model_version, + '--epochs', epochs + ], + file_outputs={'output': '/output.txt'} + ) + + +def InferenceServerLauncherOp(name, input_dir, trtserver_name): + return dsl.ContainerOp( + name=name, + image='', + arguments=[ + '--trtserver_name', trtserver_name, + '--model_path', input_dir, + ], + file_outputs={'output': '/output.txt'} + ) + + +def WebappLauncherOp(name, trtserver_name, model_name, model_version, webapp_prefix, webapp_port): + return dsl.ContainerOp( + name=name, + image='', + arguments=[ + '--workflow_name', '{{workflow.name}}', + '--trtserver_name', trtserver_name, + '--model_name', model_name, + '--model_version', str(model_version), + '--webapp_prefix', webapp_prefix, + '--webapp_port', str(webapp_port) + ], + file_outputs={} + ) + + +@dsl.pipeline( + name='resnet_cifar10_pipeline', + description='Demonstrate an end-to-end training & serving pipeline using ResNet and CIFAR-10' +) +def resnet_pipeline( + raw_data_dir='/mnt/workspace/raw_data', + processed_data_dir='/mnt/workspace/processed_data', + model_dir='/mnt/workspace/saved_model', + epochs=50, + trtserver_name='trtis', + model_name='resnet_graphdef', + model_version=1, + webapp_prefix='webapp', + webapp_port=80 +): + + persistent_volume_name = 'nvidia-workspace' + persistent_volume_path = '/mnt/workspace' + + op_dict = {} + + 
op_dict['preprocess'] = PreprocessOp( + 'preprocess', raw_data_dir, processed_data_dir) + + op_dict['train'] = TrainOp( + 'train', op_dict['preprocess'].output, model_dir, model_name, model_version, epochs) + + op_dict['deploy_inference_server'] = InferenceServerLauncherOp( + 'deploy_inference_server', op_dict['train'].output, trtserver_name) + + op_dict['deploy_webapp'] = WebappLauncherOp( + 'deploy_webapp', op_dict['deploy_inference_server'].output, model_name, model_version, webapp_prefix, webapp_port) + + for _, container_op in op_dict.items(): + container_op.add_volume(k8s_client.V1Volume( + host_path=k8s_client.V1HostPathVolumeSource( + path=persistent_volume_path), + name=persistent_volume_name)) + container_op.add_volume_mount(k8s_client.V1VolumeMount( + mount_path=persistent_volume_path, + name=persistent_volume_name)) + + +if __name__ == '__main__': + import kfp.compiler as compiler + compiler.Compiler().compile(resnet_pipeline, __file__ + '.tar.gz') diff --git a/samples/nvidia-resnet/portforward_serving_port.sh b/samples/nvidia-resnet/portforward_serving_port.sh deleted file mode 100755 index 5b0948ca3e28..000000000000 --- a/samples/nvidia-resnet/portforward_serving_port.sh +++ /dev/null @@ -1,39 +0,0 @@ -#!/bin/bash -# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. -# -# Redistribution and use in source and binary forms, with or without -# modification, are permitted provided that the following conditions -# are met: -# * Redistributions of source code must retain the above copyright -# notice, this list of conditions and the following disclaimer. -# * Redistributions in binary form must reproduce the above copyright -# notice, this list of conditions and the following disclaimer in the -# documentation and/or other materials provided with the distribution. 
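The launcher components earlier in this diff (e.g. deploy_webapp.py) fill a YAML template with a chain of numbered str.replace() intermediates. A minimal stdlib sketch of the same idea, driven by a dict instead (the placeholder names and values below are illustrative only):

```python
# Hypothetical dict-driven variant of the chained .replace() calls used by
# the launcher scripts; placeholders and values here are illustrative only.
def fill_template(template, substitutions):
    for placeholder, value in substitutions.items():
        template = template.replace(placeholder, value)
    return template

template = "service: webappsvc.KUBEFLOW_NAMESPACE:WEBAPP_PORT"
filled = fill_template(template, {
    "KUBEFLOW_NAMESPACE": "kubeflow",
    "WEBAPP_PORT": "80",
})
print(filled)  # service: webappsvc.kubeflow:80
```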
-# * Neither the name of NVIDIA CORPORATION nor the names of its -# contributors may be used to endorse or promote products derived -# from this software without specific prior written permission. -# -# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY -# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR -# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, -# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, -# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR -# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY -# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - - -# Look for the name of the serving POD mentioned on the Kubeflow pipeline dashboard -SERVING_POD=resnet-cifar10-pipeline-8tkrt-1130129541 - -HOST_PORT=8000 - -# Model name specified in the file cofig.pbtxt -MODEL_NAME=resnet_graphdef - -apt-get update && apt-get install -y socat -kubectl port-forward pod/$SERVING_POD $HOST_PORT:8000 -n kubeflow & -curl localhost:$HOST_PORT/api/status/$MODEL_NAME diff --git a/samples/nvidia-resnet/preprocess/Dockerfile b/samples/nvidia-resnet/preprocess/Dockerfile deleted file mode 100644 index db94f5337bd1..000000000000 --- a/samples/nvidia-resnet/preprocess/Dockerfile +++ /dev/null @@ -1,32 +0,0 @@ -# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. -# -# Redistribution and use in source and binary forms, with or without -# modification, are permitted provided that the following conditions -# are met: -# * Redistributions of source code must retain the above copyright -# notice, this list of conditions and the following disclaimer. 
-# * Redistributions in binary form must reproduce the above copyright -# notice, this list of conditions and the following disclaimer in the -# documentation and/or other materials provided with the distribution. -# * Neither the name of NVIDIA CORPORATION nor the names of its -# contributors may be used to endorse or promote products derived -# from this software without specific prior written permission. -# -# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY -# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR -# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, -# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, -# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR -# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY -# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - - -FROM nvcr.io/nvidia/tensorflow:19.02-py3 - -RUN pip install keras - -COPY preprocess.py /scripts/preprocess.py diff --git a/samples/nvidia-resnet/preprocess/build.sh b/samples/nvidia-resnet/preprocess/build.sh deleted file mode 100755 index a71f976a6414..000000000000 --- a/samples/nvidia-resnet/preprocess/build.sh +++ /dev/null @@ -1,32 +0,0 @@ -#!/bin/bash -# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. -# -# Redistribution and use in source and binary forms, with or without -# modification, are permitted provided that the following conditions -# are met: -# * Redistributions of source code must retain the above copyright -# notice, this list of conditions and the following disclaimer. 
-# * Redistributions in binary form must reproduce the above copyright -# notice, this list of conditions and the following disclaimer in the -# documentation and/or other materials provided with the distribution. -# * Neither the name of NVIDIA CORPORATION nor the names of its -# contributors may be used to endorse or promote products derived -# from this software without specific prior written permission. -# -# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY -# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR -# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, -# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, -# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR -# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY -# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - - -IMAGE=:resnet-preprocess - -docker build -t $IMAGE . -docker push $IMAGE diff --git a/samples/nvidia-resnet/preprocess/preprocess.py b/samples/nvidia-resnet/preprocess/preprocess.py deleted file mode 100644 index 00f3be206bbb..000000000000 --- a/samples/nvidia-resnet/preprocess/preprocess.py +++ /dev/null @@ -1,62 +0,0 @@ -# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. -# -# Redistribution and use in source and binary forms, with or without -# modification, are permitted provided that the following conditions -# are met: -# * Redistributions of source code must retain the above copyright -# notice, this list of conditions and the following disclaimer. 
-# * Redistributions in binary form must reproduce the above copyright -# notice, this list of conditions and the following disclaimer in the -# documentation and/or other materials provided with the distribution. -# * Neither the name of NVIDIA CORPORATION nor the names of its -# contributors may be used to endorse or promote products derived -# from this software without specific prior written permission. -# -# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY -# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR -# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, -# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, -# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR -# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY -# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
-
-
-import os
-import argparse
-import numpy as np
-from keras.datasets import cifar10
-
-
-parser = argparse.ArgumentParser(
-    description='Provide input and output directories')
-parser.add_argument('--input_dir',
-                    help='provide input directory')
-parser.add_argument('--output_dir',
-                    help='provide output directory')
-args = parser.parse_args()
-
-
-def load_and_process_data(input_dir):
-    processed_data = cifar10.load_data()
-    return processed_data
-
-
-def save_data(processed_data, output_dir):
-    (x_train, y_train), (x_test, y_test) = processed_data
-    if not os.path.isdir(output_dir):
-        os.mkdir(output_dir)
-    np.save(os.path.join(output_dir, 'x_train.npy'), x_train)
-    np.save(os.path.join(output_dir, 'y_train.npy'), y_train)
-    np.save(os.path.join(output_dir, 'x_test.npy'), x_test)
-    np.save(os.path.join(output_dir, 'y_test.npy'), y_test)
-
-
-processed_data = load_and_process_data(args.input_dir)
-save_data(processed_data, args.output_dir)
-
-print('input_dir: {}'.format(args.input_dir))
-print('output_dir: {}'.format(args.output_dir))
diff --git a/samples/nvidia-resnet/remove_minikube_and_kubeflow.sh b/samples/nvidia-resnet/remove_minikube_and_kubeflow.sh
deleted file mode 100755
index 1148bfea03a7..000000000000
--- a/samples/nvidia-resnet/remove_minikube_and_kubeflow.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash
-# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions
-# are met:
-# * Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# * Redistributions in binary form must reproduce the above copyright
-# notice, this list of conditions and the following disclaimer in the
-# documentation and/or other materials provided with the distribution.
-# * Neither the name of NVIDIA CORPORATION nor the names of its
-# contributors may be used to endorse or promote products derived
-# from this software without specific prior written permission.
-#
-# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
-# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
-# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
-# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
-# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
-# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
-# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-
-# Remove KubeFlow
-cd ${KUBEFLOW_SRC}/${KFAPP}
-${KUBEFLOW_SRC}/scripts/kfctl.sh delete k8s
-
-# Remove Minikube
-minikube stop
-minikube delete
diff --git a/samples/nvidia-resnet/serve/Dockerfile b/samples/nvidia-resnet/serve/Dockerfile
deleted file mode 100644
index 5d8eee89bcea..000000000000
--- a/samples/nvidia-resnet/serve/Dockerfile
+++ /dev/null
@@ -1,35 +0,0 @@
-# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions
-# are met:
-# * Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# * Redistributions in binary form must reproduce the above copyright
-# notice, this list of conditions and the following disclaimer in the
-# documentation and/or other materials provided with the distribution.
-# * Neither the name of NVIDIA CORPORATION nor the names of its
-# contributors may be used to endorse or promote products derived
-# from this software without specific prior written permission.
-#
-# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
-# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
-# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
-# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
-# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
-# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
-# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-
-FROM nvcr.io/nvidia/tensorrtserver:19.02-py3
-
-RUN apt-get install -y software-properties-common && \
-    add-apt-repository -y ppa:deadsnakes/ppa && \
-    apt-get update && \
-    apt-get install -y python3.6
-
-COPY serve.py /scripts/serve.py
diff --git a/samples/nvidia-resnet/serve/build.sh b/samples/nvidia-resnet/serve/build.sh
deleted file mode 100755
index c3310cf605b7..000000000000
--- a/samples/nvidia-resnet/serve/build.sh
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/bin/bash
-# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions
-# are met:
-# * Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# * Redistributions in binary form must reproduce the above copyright
-# notice, this list of conditions and the following disclaimer in the
-# documentation and/or other materials provided with the distribution.
-# * Neither the name of NVIDIA CORPORATION nor the names of its
-# contributors may be used to endorse or promote products derived
-# from this software without specific prior written permission.
-#
-# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
-# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
-# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
-# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
-# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
-# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
-# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-
-IMAGE=:resnet-serve
-
-docker build -t $IMAGE .
-docker push $IMAGE
diff --git a/samples/nvidia-resnet/serve/serve.py b/samples/nvidia-resnet/serve/serve.py
deleted file mode 100644
index bce58492507b..000000000000
--- a/samples/nvidia-resnet/serve/serve.py
+++ /dev/null
@@ -1,62 +0,0 @@
-# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions
-# are met:
-# * Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# * Redistributions in binary form must reproduce the above copyright
-# notice, this list of conditions and the following disclaimer in the
-# documentation and/or other materials provided with the distribution.
-# * Neither the name of NVIDIA CORPORATION nor the names of its
-# contributors may be used to endorse or promote products derived
-# from this software without specific prior written permission.
-#
-# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
-# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
-# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
-# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
-# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
-# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
-# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-
-import os
-import shutil
-import argparse
-
-
-parser = argparse.ArgumentParser(
-    description='Provide input directory')
-parser.add_argument('--input_dir',
-                    help='provide input directory')
-args = parser.parse_args()
-
-
-TRTIS_RESOURCE_DIR = 'trtis_resource'
-
-# Modify according to your training script and config.pbtxt
-GRAPHDEF_FILE = 'trt_graphdef.pb'
-MODEL_NAME = 'resnet_graphdef'
-
-# Check for resource dir
-resource_dir = os.path.join(args.input_dir, TRTIS_RESOURCE_DIR)
-if not os.path.isdir(resource_dir):
-    raise IOError('Resource dir for TRTIS not found')
-
-# Create a directory structure that TRTIS expects
-config_file_path = os.path.join(resource_dir, 'config.pbtxt')
-label_file_path = os.path.join(resource_dir, 'labels.txt')
-graphdef_path = os.path.join(args.input_dir, GRAPHDEF_FILE)
-model_dir = '/models/%s/1' % MODEL_NAME
-
-os.makedirs(model_dir)
-shutil.copy(config_file_path, '/models/%s' % MODEL_NAME)
-shutil.copy(label_file_path, '/models/%s' % MODEL_NAME)
-shutil.copyfile(graphdef_path, os.path.join(model_dir, 'model.graphdef'))
-
-os.system('trtserver --model-store=/models')
diff --git a/samples/nvidia-resnet/test_trtis_client.sh b/samples/nvidia-resnet/test_trtis_client.sh
deleted file mode 100755
index 592a965e5133..000000000000
--- a/samples/nvidia-resnet/test_trtis_client.sh
+++ /dev/null
@@ -1,44 +0,0 @@
-#!/bin/bash
-# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions
-# are met:
-# * Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# * Redistributions in binary form must reproduce the above copyright
-# notice, this list of conditions and the following disclaimer in the
-# documentation and/or other materials provided with the distribution.
-# * Neither the name of NVIDIA CORPORATION nor the names of its
-# contributors may be used to endorse or promote products derived
-# from this software without specific prior written permission.
-#
-# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
-# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
-# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
-# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
-# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
-# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
-# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-
-CLIENT_DIR=$(pwd)/trtis_client
-REPO_DIR=/tmp/tensorrt-inference-server
-SCRIPT_DIR=$REPO_DIR/src/clients/python/
-IMAGE=tensorrtserver_clients
-UPDATED_CLIENT_FILE=updated_image_client.py
-DEMO_CLIENT_UI=demo_client_ui.py
-
-CMD="pip install dash && python3 /workspace/src/clients/python/$DEMO_CLIENT_UI"
-# If you don't want to test the UI, just use the command below
-# CMD="python3 /workspace/src/clients/python/$UPDATED_CLIENT_FILE -m resnet_graphdef -s RESNET images/mug.jpg"
-
-git clone https://github.com/NVIDIA/tensorrt-inference-server.git $REPO_DIR
-cp $CLIENT_DIR/$UPDATED_CLIENT_FILE $SCRIPT_DIR
-cp $CLIENT_DIR/$DEMO_CLIENT_UI $SCRIPT_DIR
-docker build -t $IMAGE -f $REPO_DIR/Dockerfile.client $REPO_DIR
-docker run -it --rm --net=host $IMAGE /bin/bash -c "$CMD"
diff --git a/samples/nvidia-resnet/train/Dockerfile b/samples/nvidia-resnet/train/Dockerfile
deleted file mode 100644
index a02e6770ac2a..000000000000
--- a/samples/nvidia-resnet/train/Dockerfile
+++ /dev/null
@@ -1,33 +0,0 @@
-# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions
-# are met:
-# * Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# * Redistributions in binary form must reproduce the above copyright
-# notice, this list of conditions and the following disclaimer in the
-# documentation and/or other materials provided with the distribution.
-# * Neither the name of NVIDIA CORPORATION nor the names of its
-# contributors may be used to endorse or promote products derived
-# from this software without specific prior written permission.
-#
-# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
-# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
-# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
-# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
-# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
-# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
-# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-
-FROM nvcr.io/nvidia/tensorflow:19.02-py3
-
-RUN pip install keras
-
-COPY train.py /scripts/train.py
-COPY trtis_resource /trtis_resource
diff --git a/samples/nvidia-resnet/train/build.sh b/samples/nvidia-resnet/train/build.sh
deleted file mode 100755
index cd334b0360a7..000000000000
--- a/samples/nvidia-resnet/train/build.sh
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/bin/bash
-# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions
-# are met:
-# * Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# * Redistributions in binary form must reproduce the above copyright
-# notice, this list of conditions and the following disclaimer in the
-# documentation and/or other materials provided with the distribution.
-# * Neither the name of NVIDIA CORPORATION nor the names of its
-# contributors may be used to endorse or promote products derived
-# from this software without specific prior written permission.
-#
-# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
-# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
-# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
-# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
-# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
-# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
-# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-
-IMAGE=:resnet-train
-
-docker build -t $IMAGE .
-docker push $IMAGE
diff --git a/samples/nvidia-resnet/train/train.py b/samples/nvidia-resnet/train/train.py
deleted file mode 100644
index fd85f271c651..000000000000
--- a/samples/nvidia-resnet/train/train.py
+++ /dev/null
@@ -1,595 +0,0 @@
-# COPYRIGHT
-#
-# All contributions by François Chollet:
-# Copyright (c) 2015 - 2018, François Chollet.
-# All rights reserved.
-#
-# All contributions by Google:
-# Copyright (c) 2015 - 2018, Google, Inc.
-# All rights reserved.
-#
-# All contributions by Microsoft:
-# Copyright (c) 2017 - 2018, Microsoft, Inc.
-# All rights reserved.
-#
-# All other contributions:
-# Copyright (c) 2015 - 2018, the respective contributors.
-# All rights reserved.
-#
-# Each contributor holds copyright over their respective contributions.
-# The project versioning (Git) records all such contribution source information.
-#
-# LICENSE
-#
-# The MIT License (MIT)
-#
-# Permission is hereby granted, free of charge, to any person obtaining a copy
-# of this software and associated documentation files (the "Software"), to deal
-# in the Software without restriction, including without limitation the rights
-# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-# copies of the Software, and to permit persons to whom the Software is
-# furnished to do so, subject to the following conditions:
-#
-# The above copyright notice and this permission notice shall be included in all
-# copies or substantial portions of the Software.
-#
-# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-# SOFTWARE.
-
-# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions
-# are met:
-# * Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# * Redistributions in binary form must reproduce the above copyright
-# notice, this list of conditions and the following disclaimer in the
-# documentation and/or other materials provided with the distribution.
-# * Neither the name of NVIDIA CORPORATION nor the names of its
-# contributors may be used to endorse or promote products derived
-# from this software without specific prior written permission.
-#
-# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
-# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
-# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
-# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
-# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
-# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
-# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
-# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-
-from __future__ import print_function
-
-import os
-import shutil
-import argparse
-import numpy as np
-
-import keras
-from keras.regularizers import l2
-from keras import backend as K
-from keras.models import Model
-from keras import backend as K
-from keras.optimizers import Adam
-from keras.models import load_model
-from keras.layers import Dense, Conv2D
-from keras.layers import BatchNormalization, Activation
-from keras.layers import AveragePooling2D, Input, Flatten
-from keras.callbacks import ModelCheckpoint, LearningRateScheduler
-from keras.callbacks import ReduceLROnPlateau
-from keras.preprocessing.image import ImageDataGenerator
-
-import tensorflow as tf
-from tensorflow.python.saved_model import builder as saved_model_builder
-from tensorflow.python.saved_model.signature_def_utils import predict_signature_def
-from tensorflow.python.saved_model import tag_constants
-from tensorflow.python.saved_model import signature_constants
-
-import tensorflow.contrib.tensorrt as trt
-
-
-parser = argparse.ArgumentParser(
-    description='Provide input and output directories')
-parser.add_argument('--input_dir',
-                    help='provide input directory')
-parser.add_argument('--output_dir',
-                    help='provide output directory')
-args = parser.parse_args()
-
-
-# Copy TRTIS resource (containing config.pbtxt, labels.txt, ...) from container to mounted volume
-CONT_TRTIS_RESOURCE_PATH = '/trtis_resource'
-external_trtis_resource_path = os.path.join(args.output_dir, 'trtis_resource')
-if os.path.isdir(external_trtis_resource_path):
-    shutil.rmtree(external_trtis_resource_path)
-shutil.copytree(CONT_TRTIS_RESOURCE_PATH, external_trtis_resource_path)
-
-# Training parameters
-batch_size = 128  # orig paper trained all networks with batch_size=128
-epochs = 20
-data_augmentation = True
-num_classes = 10
-
-# Subtracting pixel mean improves accuracy
-subtract_pixel_mean = True
-
-# Model parameter
-# ----------------------------------------------------------------------------
-#           |      | 200-epoch | Orig Paper| 200-epoch | Orig Paper| sec/epoch
-# Model     |  n   | ResNet v1 | ResNet v1 | ResNet v2 | ResNet v2 | GTX1080Ti
-#           |v1(v2)| %Accuracy | %Accuracy | %Accuracy | %Accuracy | v1 (v2)
-# ----------------------------------------------------------------------------
-# ResNet20  | 3 (2)| 92.16     | 91.25     | -----     | -----     | 35 (---)
-# ResNet32  | 5(NA)| 92.46     | 92.49     | NA        | NA        | 50 ( NA)
-# ResNet44  | 7(NA)| 92.50     | 92.83     | NA        | NA        | 70 ( NA)
-# ResNet56  | 9 (6)| 92.71     | 93.03     | 93.01     | NA        | 90 (100)
-# ResNet110 |18(12)| 92.65     | 93.39+-.16| 93.15     | 93.63     | 165(180)
-# ResNet164 |27(18)| -----     | 94.07     | -----     | 94.54     | ---(---)
-# ResNet1001| (111)| -----     | 92.39     | -----     | 95.08+-.14| ---(---)
-# ---------------------------------------------------------------------------

-n = 3
-
-# Model version
-# Orig paper: version = 1 (ResNet v1), Improved ResNet: version = 2 (ResNet v2)
-version = 2
-
-# Computed depth from supplied model parameter n
-if version == 1:
-    depth = n * 6 + 2
-elif version == 2:
-    depth = n * 9 + 2
-
-# Model name, depth and version
-model_type = 'ResNet%dv%d' % (depth, version)
-
-# Load the CIFAR10 data.
-def load_preprocessed_data(input_dir):
-    x_train = np.load(os.path.join(input_dir, "x_train.npy"))
-    y_train = np.load(os.path.join(input_dir, "y_train.npy"))
-    x_test = np.load(os.path.join(input_dir, "x_test.npy"))
-    y_test = np.load(os.path.join(input_dir, "y_test.npy"))
-    return x_train, y_train, x_test, y_test
-
-
-preprocessed_data = load_preprocessed_data(args.input_dir)
-x_train, y_train, x_test, y_test = preprocessed_data
-
-# Input image dimensions.
-input_shape = x_train.shape[1:]
-
-# Normalize data.
-x_train = x_train.astype('float32') / 255
-x_test = x_test.astype('float32') / 255
-
-# If subtract pixel mean is enabled
-if subtract_pixel_mean:
-    x_train_mean = np.mean(x_train, axis=0)
-    x_train -= x_train_mean
-    x_test -= x_train_mean
-
-print('x_train shape:', x_train.shape)
-print(x_train.shape[0], 'train samples')
-print(x_test.shape[0], 'test samples')
-print('y_train shape:', y_train.shape)
-
-# Convert class vectors to binary class matrices.
-y_train = keras.utils.to_categorical(y_train, num_classes)
-y_test = keras.utils.to_categorical(y_test, num_classes)
-
-
-def lr_schedule(epoch):
-    """Learning Rate Schedule
-
-    Learning rate is scheduled to be reduced after 80, 120, 160, 180 epochs.
-    Called automatically every epoch as part of callbacks during training.
-
-    # Arguments
-        epoch (int): The number of epochs
-
-    # Returns
-        lr (float32): learning rate
-    """
-    lr = 1e-3
-    if epoch > 180:
-        lr *= 0.5e-3
-    elif epoch > 160:
-        lr *= 1e-3
-    elif epoch > 120:
-        lr *= 1e-2
-    elif epoch > 80:
-        lr *= 1e-1
-    print('Learning rate: ', lr)
-    return lr
-
-
-def resnet_layer(inputs,
-                 num_filters=16,
-                 kernel_size=3,
-                 strides=1,
-                 activation='relu',
-                 batch_normalization=True,
-                 conv_first=True):
-    """2D Convolution-Batch Normalization-Activation stack builder
-
-    # Arguments
-        inputs (tensor): input tensor from input image or previous layer
-        num_filters (int): Conv2D number of filters
-        kernel_size (int): Conv2D square kernel dimensions
-        strides (int): Conv2D square stride dimensions
-        activation (string): activation name
-        batch_normalization (bool): whether to include batch normalization
-        conv_first (bool): conv-bn-activation (True) or
-            bn-activation-conv (False)
-
-    # Returns
-        x (tensor): tensor as input to the next layer
-    """
-    conv = Conv2D(num_filters,
-                  kernel_size=kernel_size,
-                  strides=strides,
-                  padding='same',
-                  kernel_initializer='he_normal',
-                  kernel_regularizer=l2(1e-4))
-
-    x = inputs
-    if conv_first:
-        x = conv(x)
-        if batch_normalization:
-            x = BatchNormalization()(x)
-        if activation is not None:
-            x = Activation(activation)(x)
-    else:
-        if batch_normalization:
-            x = BatchNormalization()(x)
-        if activation is not None:
-            x = Activation(activation)(x)
-        x = conv(x)
-    return x
-
-
-def resnet_v1(input_shape, depth, num_classes=10):
-    """ResNet Version 1 Model builder [a]
-
-    Stacks of 2 x (3 x 3) Conv2D-BN-ReLU
-    Last ReLU is after the shortcut connection.
-    At the beginning of each stage, the feature map size is halved (downsampled)
-    by a convolutional layer with strides=2, while the number of filters is
-    doubled. Within each stage, the layers have the same number filters and the
-    same number of filters.
-    Features maps sizes:
-    stage 0: 32x32, 16
-    stage 1: 16x16, 32
-    stage 2: 8x8, 64
-    The Number of parameters is approx the same as Table 6 of [a]:
-    ResNet20 0.27M
-    ResNet32 0.46M
-    ResNet44 0.66M
-    ResNet56 0.85M
-    ResNet110 1.7M
-
-    # Arguments
-        input_shape (tensor): shape of input image tensor
-        depth (int): number of core convolutional layers
-        num_classes (int): number of classes (CIFAR10 has 10)
-
-    # Returns
-        model (Model): Keras model instance
-    """
-    if (depth - 2) % 6 != 0:
-        raise ValueError('depth should be 6n+2 (eg 20, 32, 44 in [a])')
-    # Start model definition.
-    num_filters = 16
-    num_res_blocks = int((depth - 2) / 6)
-
-    inputs = Input(shape=input_shape)
-    x = resnet_layer(inputs=inputs)
-    # Instantiate the stack of residual units
-    for stack in range(3):
-        for res_block in range(num_res_blocks):
-            strides = 1
-            if stack > 0 and res_block == 0:  # first layer but not first stack
-                strides = 2  # downsample
-            y = resnet_layer(inputs=x,
-                             num_filters=num_filters,
-                             strides=strides)
-            y = resnet_layer(inputs=y,
-                             num_filters=num_filters,
-                             activation=None)
-            if stack > 0 and res_block == 0:  # first layer but not first stack
-                # linear projection residual shortcut connection to match
-                # changed dims
-                x = resnet_layer(inputs=x,
-                                 num_filters=num_filters,
-                                 kernel_size=1,
-                                 strides=strides,
-                                 activation=None,
-                                 batch_normalization=False)
-            x = keras.layers.add([x, y])
-            x = Activation('relu')(x)
-        num_filters *= 2
-
-    # Add classifier on top.
-    # v1 does not use BN after last shortcut connection-ReLU
-    x = AveragePooling2D(pool_size=8)(x)
-    y = Flatten()(x)
-    outputs = Dense(num_classes,
-                    activation='softmax',
-                    kernel_initializer='he_normal')(y)
-
-    # Instantiate model.
-    model = Model(inputs=inputs, outputs=outputs)
-    return model
-
-
-def resnet_v2(input_shape, depth, num_classes=10):
-    """ResNet Version 2 Model builder [b]
-
-    Stacks of (1 x 1)-(3 x 3)-(1 x 1) BN-ReLU-Conv2D or also known as
-    bottleneck layer
-    First shortcut connection per layer is 1 x 1 Conv2D.
-    Second and onwards shortcut connection is identity.
-    At the beginning of each stage, the feature map size is halved (downsampled)
-    by a convolutional layer with strides=2, while the number of filter maps is
-    doubled. Within each stage, the layers have the same number filters and the
-    same filter map sizes.
-    Features maps sizes:
-    conv1: 32x32, 16
-    stage 0: 32x32, 64
-    stage 1: 16x16, 128
-    stage 2: 8x8, 256
-
-    # Arguments
-        input_shape (tensor): shape of input image tensor
-        depth (int): number of core convolutional layers
-        num_classes (int): number of classes (CIFAR10 has 10)
-
-    # Returns
-        model (Model): Keras model instance
-    """
-    if (depth - 2) % 9 != 0:
-        raise ValueError('depth should be 9n+2 (eg 56 or 110 in [b])')
-    # Start model definition.
-    num_filters_in = 16
-    num_res_blocks = int((depth - 2) / 9)
-
-    inputs = Input(shape=input_shape)
-    # v2 performs Conv2D with BN-ReLU on input before splitting into 2 paths
-    x = resnet_layer(inputs=inputs,
-                     num_filters=num_filters_in,
-                     conv_first=True)
-
-    # Instantiate the stack of residual units
-    for stage in range(3):
-        for res_block in range(num_res_blocks):
-            activation = 'relu'
-            batch_normalization = True
-            strides = 1
-            if stage == 0:
-                num_filters_out = num_filters_in * 4
-                if res_block == 0:  # first layer and first stage
-                    activation = None
-                    batch_normalization = False
-            else:
-                num_filters_out = num_filters_in * 2
-                if res_block == 0:  # first layer but not first stage
-                    strides = 2  # downsample
-
-            # bottleneck residual unit
-            y = resnet_layer(inputs=x,
-                             num_filters=num_filters_in,
-                             kernel_size=1,
-                             strides=strides,
-                             activation=activation,
-                             batch_normalization=batch_normalization,
-                             conv_first=False)
-            y = resnet_layer(inputs=y,
-                             num_filters=num_filters_in,
-                             conv_first=False)
-            y = resnet_layer(inputs=y,
-                             num_filters=num_filters_out,
-                             kernel_size=1,
-                             conv_first=False)
-            if res_block == 0:
-                # linear projection residual shortcut connection to match
-                # changed dims
-                x = resnet_layer(inputs=x,
-                                 num_filters=num_filters_out,
-                                 kernel_size=1,
-                                 strides=strides,
-                                 activation=None,
-                                 batch_normalization=False)
-            x = keras.layers.add([x, y])
-
-        num_filters_in = num_filters_out
-
-    # Add classifier on top.
-    # v2 has BN-ReLU before Pooling
-    x = BatchNormalization()(x)
-    x = Activation('relu')(x)
-    x = AveragePooling2D(pool_size=8)(x)
-    y = Flatten()(x)
-    outputs = Dense(num_classes,
-                    activation='softmax',
-                    kernel_initializer='he_normal')(y)
-
-    # Instantiate model.
-    model = Model(inputs=inputs, outputs=outputs)
-    return model
-
-
-if version == 2:
-    model = resnet_v2(input_shape=input_shape, depth=depth)
-else:
-    model = resnet_v1(input_shape=input_shape, depth=depth)
-
-model.compile(loss='categorical_crossentropy',
-              optimizer=Adam(lr=lr_schedule(0)),
-              metrics=['accuracy'])
-model.summary()
-print(model_type)
-
-# Prepare model model saving directory.
-save_dir = os.path.join(os.getcwd(), 'saved_models')
-model_name = 'cifar10_%s_model.{epoch:03d}.h5' % model_type
-if not os.path.isdir(save_dir):
-    os.makedirs(save_dir)
-filepath = os.path.join(save_dir, model_name)
-
-# Prepare callbacks for model saving and for learning rate adjustment.
-checkpoint = ModelCheckpoint(filepath=filepath,
-                             monitor='val_acc',
-                             verbose=1,
-                             save_best_only=True)
-
-lr_scheduler = LearningRateScheduler(lr_schedule)
-
-lr_reducer = ReduceLROnPlateau(factor=np.sqrt(0.1),
-                               cooldown=0,
-                               patience=5,
-                               min_lr=0.5e-6)
-
-callbacks = [checkpoint, lr_reducer, lr_scheduler]
-
-# Run training, with or without data augmentation.
-if not data_augmentation:
-    print('Not using data augmentation.')
-    model.fit(x_train, y_train,
-              batch_size=batch_size,
-              epochs=epochs,
-              validation_data=(x_test, y_test),
-              shuffle=True,
-              callbacks=callbacks)
-else:
-    print('Using real-time data augmentation.')
-    # This will do preprocessing and realtime data augmentation:
-    datagen = ImageDataGenerator(
-        # set input mean to 0 over the dataset
-        featurewise_center=False,
-        # set each sample mean to 0
-        samplewise_center=False,
-        # divide inputs by std of dataset
-        featurewise_std_normalization=False,
-        # divide each input by its std
-        samplewise_std_normalization=False,
-        # apply ZCA whitening
-        zca_whitening=False,
-        # epsilon for ZCA whitening
-        zca_epsilon=1e-06,
-        # randomly rotate images in the range (deg 0 to 180)
-        rotation_range=0,
-        # randomly shift images horizontally
-        width_shift_range=0.1,
-        # randomly shift images vertically
-        height_shift_range=0.1,
-        # set range for random shear
-        shear_range=0.,
-        # set range for random zoom
-        zoom_range=0.,
-        # set range for random channel shifts
-        channel_shift_range=0.,
-        # set mode for filling points outside the input boundaries
-        fill_mode='nearest',
-        # value used for fill_mode = "constant"
-        cval=0.,
-        # randomly flip images
-        horizontal_flip=True,
-        # randomly flip images
-        vertical_flip=False,
-        # set rescaling factor (applied before any other transformation)
-        rescale=None,
-        # set function that will be applied on each input
-        preprocessing_function=None,
-        # image data format, either "channels_first" or "channels_last"
-        data_format=None,
-        # fraction of images reserved for validation (strictly between 0 and 1)
-        validation_split=0.0)
-
-    # Compute quantities required for featurewise normalization
-    # (std, mean, and principal components if ZCA whitening is applied).
-    datagen.fit(x_train)
-
-    # Fit the model on the batches generated by datagen.flow().
- model.fit_generator(datagen.flow(x_train, y_train, batch_size=batch_size), - steps_per_epoch=len(x_train)/batch_size, - validation_data=(x_test, y_test), - epochs=epochs, verbose=1, workers=4, - callbacks=callbacks) - -# Score trained model. -scores = model.evaluate(x_test, y_test, verbose=1) -print('Test loss:', scores[0]) -print('Test accuracy:', scores[1]) - -# Save Keras model -tmp_model_path = os.path.join(args.output_dir, "tmp") -if os.path.isdir(tmp_model_path): - shutil.rmtree(tmp_model_path) -os.mkdir(tmp_model_path) - -keras_model_path = os.path.join(tmp_model_path, 'keras_model.h5') -model.save(keras_model_path) - -# Convert Keras model to Tensorflow SavedModel -def export_h5_to_pb(path_to_h5, export_path): - # Set the learning phase to Test since the model is already trained. - K.set_learning_phase(0) - # Load the Keras model - keras_model = load_model(path_to_h5) - # Build the Protocol Buffer SavedModel at 'export_path' - builder = saved_model_builder.SavedModelBuilder(export_path) - # Create prediction signature to be used by TensorFlow Serving Predict API - signature = predict_signature_def(inputs={"input_1": keras_model.input}, - outputs={"dense_1": keras_model.output}) - with K.get_session() as sess: - # Save the meta graph and the variables - builder.add_meta_graph_and_variables(sess=sess, tags=[tag_constants.SERVING], - signature_def_map={signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature}) - builder.save() - - -tf_model_path = os.path.join(args.output_dir, "tf_saved_model") -if os.path.isdir(tf_model_path): - shutil.rmtree(tf_model_path) - -export_h5_to_pb(keras_model_path, tf_model_path) - -# Apply TF_TRT on the Tensorflow SavedModel -graph = tf.Graph() -with graph.as_default(): - with tf.Session() as sess: - # Create a TensorRT inference graph from a SavedModel: - trt_graph = trt.create_inference_graph( - input_graph_def=None, - outputs=None, - input_saved_model_dir=tf_model_path, - 
input_saved_model_tags=[tag_constants.SERVING], - max_batch_size=batch_size, - max_workspace_size_bytes=2 << 30, - precision_mode='fp16') - - print([n.name + '=>' + n.op for n in trt_graph.node]) - - tf.io.write_graph( - trt_graph, - args.output_dir, - 'trt_graphdef.pb', - as_text=False - ) - -# Remove tmp dirs -shutil.rmtree(tmp_model_path) -shutil.rmtree(tf_model_path) - -print('input_dir: {}'.format(args.input_dir)) -print('output_dir: {}'.format(args.output_dir)) diff --git a/samples/nvidia-resnet/trtis_client/demo_client_ui.py b/samples/nvidia-resnet/trtis_client/demo_client_ui.py deleted file mode 100644 index 080631849b0e..000000000000 --- a/samples/nvidia-resnet/trtis_client/demo_client_ui.py +++ /dev/null @@ -1,93 +0,0 @@ -# Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved. -# -# Redistribution and use in source and binary forms, with or without -# modification, are permitted provided that the following conditions -# are met: -# * Redistributions of source code must retain the above copyright -# notice, this list of conditions and the following disclaimer. -# * Redistributions in binary form must reproduce the above copyright -# notice, this list of conditions and the following disclaimer in the -# documentation and/or other materials provided with the distribution. -# * Neither the name of NVIDIA CORPORATION nor the names of its -# contributors may be used to endorse or promote products derived -# from this software without specific prior written permission. -# -# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY -# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -# PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT OWNER OR -# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, -# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, -# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR -# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY -# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - - -import os -import urllib.request as req -import dash -import dash_core_components as dcc -import dash_html_components as html -from dash.dependencies import Input, Output -from dash.exceptions import PreventUpdate - - -external_stylesheets = ['https://codepen.io/chriddyp/pen/bWLwgP.css'] - -app = dash.Dash(__name__, external_stylesheets=external_stylesheets) - -app.layout = html.Div([ - html.H3('Client UI'), - html.Div('Image URL:'), - dcc.Input( - id='url-input', - placeholder='Enter the image url...', - type='text', - value='', - style={'display': 'block'} - ), - html.Img( - id='web-image', - width='500' - ), - html.Div('Prediction:'), - html.Div( - id='prediction-div', - style={'whiteSpace': 'pre-wrap'} - ) -]) - - -@app.callback( - Output('web-image', 'src'), - [Input('url-input', 'value')]) -def display_image(url): - return url - - -@app.callback( - Output('prediction-div', 'children'), - [Input('url-input', 'value')]) -def get_prediction(url): - if not url: - raise PreventUpdate - img_path = 'sample.jpg' - result_path = 'result.txt' - script = '/workspace/src/clients/python/updated_image_client.py' - model_name = 'resnet_graphdef' - scaling_style = 'RESNET' - req.urlretrieve(url, img_path) - if os.path.exists(result_path): - os.remove(result_path) - cmd = 'python3 %s -m %s -s %s %s >> %s' % ( - script, model_name, scaling_style, img_path, result_path) - os.system(cmd) - with open(result_path, 'r') as f: - result = f.read() - 
return result - - -if __name__ == '__main__': - app.run_server(debug=True, host='0.0.0.0', port=8050) \ No newline at end of file diff --git a/samples/nvidia-resnet/trtis_client/updated_image_client.py b/samples/nvidia-resnet/trtis_client/updated_image_client.py deleted file mode 100644 index 16d6fae07608..000000000000 --- a/samples/nvidia-resnet/trtis_client/updated_image_client.py +++ /dev/null @@ -1,308 +0,0 @@ -#!/usr/bin/python - -# Copyright (c) 2018-2019, NVIDIA CORPORATION. All rights reserved. -# -# Redistribution and use in source and binary forms, with or without -# modification, are permitted provided that the following conditions -# are met: -# * Redistributions of source code must retain the above copyright -# notice, this list of conditions and the following disclaimer. -# * Redistributions in binary form must reproduce the above copyright -# notice, this list of conditions and the following disclaimer in the -# documentation and/or other materials provided with the distribution. -# * Neither the name of NVIDIA CORPORATION nor the names of its -# contributors may be used to endorse or promote products derived -# from this software without specific prior written permission. -# -# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY -# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR -# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, -# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, -# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR -# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY -# OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
- - -import argparse -import numpy as np -import os -from builtins import range -from PIL import Image -from tensorrtserver.api import * -import tensorrtserver.api.model_config_pb2 as model_config - -FLAGS = None - -def model_dtype_to_np(model_dtype): - if model_dtype == model_config.TYPE_BOOL: - return np.bool - elif model_dtype == model_config.TYPE_INT8: - return np.int8 - elif model_dtype == model_config.TYPE_INT16: - return np.int16 - elif model_dtype == model_config.TYPE_INT32: - return np.int32 - elif model_dtype == model_config.TYPE_INT64: - return np.int64 - elif model_dtype == model_config.TYPE_UINT8: - return np.uint8 - elif model_dtype == model_config.TYPE_UINT16: - return np.uint16 - elif model_dtype == model_config.TYPE_FP16: - return np.float16 - elif model_dtype == model_config.TYPE_FP32: - return np.float32 - elif model_dtype == model_config.TYPE_FP64: - return np.float64 - elif model_dtype == model_config.TYPE_STRING: - return np.dtype(object) - return None - -def parse_model(url, protocol, model_name, batch_size, verbose=False): - """ - Check the configuration of a model to make sure it meets the - requirements for an image classification network (as expected by - this client) - """ - ctx = ServerStatusContext(url, protocol, model_name, verbose) - server_status = ctx.get_server_status() - - if model_name not in server_status.model_status: - raise Exception("unable to get status for '" + model_name + "'") - - status = server_status.model_status[model_name] - config = status.config - - if len(config.input) != 1: - raise Exception("expecting 1 input, got {}".format(len(config.input))) - if len(config.output) != 1: - raise Exception("expecting 1 output, got {}".format(len(config.output))) - - input = config.input[0] - output = config.output[0] - - if output.data_type != model_config.TYPE_FP32: - raise Exception("expecting output datatype to be TYPE_FP32, model '" + - model_name + "' output type is " + - model_config.DataType.Name(output.data_type)) - 
- # Output is expected to be a vector. But allow any number of - # dimensions as long as all but 1 is size 1 (e.g. { 10 }, { 1, 10 - # }, { 10, 1, 1 } are all ok). Variable-size dimensions are not - # currently supported. - non_one_cnt = 0 - for dim in output.dims: - if dim == -1: - raise Exception("variable-size dimension in model output not supported") - if dim > 1: - non_one_cnt += 1 - if non_one_cnt > 1: - raise Exception("expecting model output to be a vector") - - # Model specifying maximum batch size of 0 indicates that batching - # is not supported and so the input tensors do not expect an "N" - # dimension (and 'batch_size' should be 1 so that only a single - # image instance is inferred at a time). - max_batch_size = config.max_batch_size - if max_batch_size == 0: - if batch_size != 1: - raise Exception("batching not supported for model '" + model_name + "'") - else: # max_batch_size > 0 - if batch_size > max_batch_size: - raise Exception("expecting batch size <= {} for model {}".format(max_batch_size, model_name)) - - # Model input must have 3 dims, either CHW or HWC - if len(input.dims) != 3: - raise Exception( - "expecting input to have 3 dimensions, model '{}' input has {}".format( - model_name, len(input.dims))) - - # Variable-size dimensions are not currently supported. 
- for dim in input.dims: - if dim == -1: - raise Exception("variable-size dimension in model input not supported") - - if ((input.format != model_config.ModelInput.FORMAT_NCHW) and - (input.format != model_config.ModelInput.FORMAT_NHWC)): - raise Exception("unexpected input format " + model_config.ModelInput.Format.Name(input.format) + - ", expecting " + - model_config.ModelInput.Format.Name(model_config.ModelInput.FORMAT_NCHW) + - " or " + - model_config.ModelInput.Format.Name(model_config.ModelInput.FORMAT_NHWC)) - - if input.format == model_config.ModelInput.FORMAT_NHWC: - h = input.dims[0] - w = input.dims[1] - c = input.dims[2] - else: - c = input.dims[0] - h = input.dims[1] - w = input.dims[2] - - return (input.name, output.name, c, h, w, input.format, model_dtype_to_np(input.data_type)) - -def preprocess(img, format, dtype, c, h, w, scaling): - """ - Pre-process an image to meet the size, type and format - requirements specified by the parameters. - """ - #np.set_printoptions(threshold='nan') - - if c == 1: - sample_img = img.convert('L') - else: - sample_img = img.convert('RGB') - - resized_img = sample_img.resize((w, h), Image.BILINEAR) - resized = np.array(resized_img) - if resized.ndim == 2: - resized = resized[:,:,np.newaxis] - - typed = resized.astype(dtype) - - if scaling == 'INCEPTION': - scaled = (typed / 128) - 1 - elif scaling == 'RESNET': - scaled = (typed / 255) - 0.5 - elif scaling == 'VGG': - if c == 1: - scaled = typed - np.asarray((128,), dtype=dtype) - else: - scaled = typed - np.asarray((123, 117, 104), dtype=dtype) - else: - scaled = typed - - # Swap to CHW if necessary - if format == model_config.ModelInput.FORMAT_NCHW: - ordered = np.transpose(scaled, (2, 0, 1)) - else: - ordered = scaled - - # Channels are in RGB order. Currently model configuration data - # doesn't provide any information as to other channel orderings - # (like BGR) so we just assume RGB. 
- return ordered - -def postprocess(results, filenames, batch_size): - """ - Post-process results to show classifications. - """ - if len(results) != 1: - raise Exception("expected 1 result, got {}".format(len(results))) - - batched_result = list(results.values())[0] - if len(batched_result) != batch_size: - raise Exception("expected {} results, got {}".format(batch_size, len(batched_result))) - if len(filenames) != batch_size: - raise Exception("expected {} filenames, got {}".format(batch_size, len(filenames))) - - for (index, result) in enumerate(batched_result): - print("Image '{}':".format(filenames[index])) - for cls in result: - print(" {} ({}) = {}".format(cls[0], cls[2], cls[1])) - - -if __name__ == '__main__': - parser = argparse.ArgumentParser() - parser.add_argument('-v', '--verbose', action="store_true", required=False, default=False, - help='Enable verbose output') - parser.add_argument('-a', '--async', action="store_true", required=False, default=False, - help='Use asynchronous inference API') - parser.add_argument('--streaming', action="store_true", required=False, default=False, - help='Use streaming inference API. ' + - 'The flag is only available with gRPC protocol.') - parser.add_argument('-m', '--model-name', type=str, required=True, - help='Name of model') - parser.add_argument('-x', '--model-version', type=int, required=False, - help='Version of model. Default is to use latest version.') - parser.add_argument('-b', '--batch-size', type=int, required=False, default=1, - help='Batch size. Default is 1.') - parser.add_argument('-c', '--classes', type=int, required=False, default=1, - help='Number of class results to report. Default is 1.') - parser.add_argument('-s', '--scaling', type=str, choices=['NONE', 'INCEPTION', 'RESNET', 'VGG'], - required=False, default='NONE', - help='Type of scaling to apply to image pixels. 
Default is NONE.') - parser.add_argument('-u', '--url', type=str, required=False, default='localhost:8000', - help='Inference server URL. Default is localhost:8000.') - parser.add_argument('-i', '--protocol', type=str, required=False, default='HTTP', - help='Protocol (HTTP/gRPC) used to ' + - 'communicate with inference service. Default is HTTP.') - parser.add_argument('image_filename', type=str, nargs='?', default=None, - help='Input image / Input folder.') - FLAGS = parser.parse_args() - - protocol = ProtocolType.from_str(FLAGS.protocol) - - if FLAGS.streaming and protocol != ProtocolType.GRPC: - raise Exception("Streaming is only allowed with gRPC protocol") - - # Make sure the model matches our requirements, and get some - # properties of the model that we need for preprocessing - input_name, output_name, c, h, w, format, dtype = parse_model( - FLAGS.url, protocol, FLAGS.model_name, - FLAGS.batch_size, FLAGS.verbose) - - ctx = InferContext(FLAGS.url, protocol, FLAGS.model_name, - FLAGS.model_version, FLAGS.verbose, 0, FLAGS.streaming) - - filenames = [] - if os.path.isdir(FLAGS.image_filename): - filenames = [os.path.join(FLAGS.image_filename, f) - for f in os.listdir(FLAGS.image_filename) - if os.path.isfile(os.path.join(FLAGS.image_filename, f))] - else: - filenames = [FLAGS.image_filename,] - - filenames.sort() - - # Preprocess the images into input data according to model - # requirements - image_data = [] - for filename in filenames: - img = Image.open(filename) - image_data.append(preprocess(img, format, dtype, c, h, w, FLAGS.scaling)) - - # Send requests of FLAGS.batch_size images. If the number of - # images isn't an exact multiple of FLAGS.batch_size then just - # start over with the first images until the batch is filled. 
- results = [] - result_filenames = [] - request_ids = [] - image_idx = 0 - last_request = False - while not last_request: - input_filenames = [] - input_batch = [] - for idx in range(FLAGS.batch_size): - input_filenames.append(filenames[image_idx]) - input_batch.append(image_data[image_idx]) - image_idx = (image_idx + 1) % len(image_data) - if image_idx == 0: - last_request = True - - result_filenames.append(input_filenames) - - # Send request - if not FLAGS.async: - results.append(ctx.run( - { input_name : input_batch }, - { output_name : (InferContext.ResultFormat.CLASS, FLAGS.classes) }, - FLAGS.batch_size)) - else: - request_ids.append(ctx.async_run( - { input_name : input_batch }, - { output_name : (InferContext.ResultFormat.CLASS, FLAGS.classes) }, - FLAGS.batch_size)) - - # For async, retrieve results according to the send order - if FLAGS.async: - for request_id in request_ids: - results.append(ctx.get_async_run_results(request_id, True)) - - for idx in range(len(results)): - print("Request {}, batch size {}".format(idx, FLAGS.batch_size)) - postprocess(results[idx], result_filenames[idx], FLAGS.batch_size)
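The deleted `updated_image_client.py` above handles the case where the number of input images is not an exact multiple of `--batch-size` by wrapping back to the first images until the final batch is filled. A minimal standalone sketch of that wrap-around batching logic, extracted from the client's main loop (the name `fill_batches` is illustrative, not part of the original client):

```python
def fill_batches(items, batch_size):
    """Group items into fixed-size batches, wrapping around to the
    start of the list to pad the final batch -- mirroring the
    image_idx / last_request loop in the deleted client."""
    batches = []
    idx = 0
    last_request = False
    while not last_request:
        batch = []
        for _ in range(batch_size):
            batch.append(items[idx])
            idx = (idx + 1) % len(items)
            # Once we wrap past the end, this is the final request.
            if idx == 0:
                last_request = True
        batches.append(batch)
    return batches
```

With three images and a batch size of two, the second batch is padded with the first image again, so every request sent to the server is a full batch; the server never sees a short batch, at the cost of re-inferring a few images.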