diff --git a/docs/samples/explanation/art/mnist/README.md b/docs/samples/explanation/art/mnist/README.md
index c2dd2b188df..fbc9f11f7fa 100644
--- a/docs/samples/explanation/art/mnist/README.md
+++ b/docs/samples/explanation/art/mnist/README.md
@@ -23,7 +23,7 @@ The first step is to [determine the ingress IP and ports](../../../../../README.
 ```
 MODEL_NAME=artserver
 SERVICE_HOSTNAME=$(kubectl get inferenceservice ${MODEL_NAME} -o jsonpath='{.status.url}' | cut -d "/" -f 3)
-python query_explain.py http://${INGRESS_HOST}:${INGRESS_PORT}/v1/models/$MODEL_NAME:explain ${SERVICE_HOSTNAME}
+python query_explain.py http://${INGRESS_HOST}:${INGRESS_PORT}/v1/models/${MODEL_NAME}:explain ${SERVICE_HOSTNAME}
 ```
 After some time you should see a pop up containing the explanation, similar to the image below. If a pop up does not display and the message "Unable to find an adversarial example." appears then an adversarial example could not be found for the image given in a timely manner. If a pop up does display then the image on the left is the original image and the image on the right is the adversarial example. The labels above both images represent what classification the model made for each individual image.
@@ -35,8 +35,8 @@ The [Square Attack method](https://arxiv.org/abs/1912.00049) used in this exampl
 To try a different MNIST example add an integer to the end of the query between 0-9,999. The integer chosen will be the index of the image to be chosen in the MNIST dataset. Or to try a file with custom data add the file path to the end. Keep in mind that the data format must be `{"instances": [,
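
The corrected command above interpolates `${MODEL_NAME}` into the KServe v1 `:explain` URL. As a rough sketch of what `query_explain.py` has to assemble — the host, port, model name, and hostname here are placeholder values, and the zero-filled 28x28 image is purely an illustrative payload shape, not real MNIST data:

```python
import json

# Placeholder values; in the README these come from INGRESS_HOST, INGRESS_PORT,
# MODEL_NAME, and the SERVICE_HOSTNAME kubectl lookup.
ingress_host, ingress_port = "localhost", 8080
model_name = "artserver"
service_hostname = "artserver.default.example.com"

# KServe v1 protocol: the explain endpoint is /v1/models/<name>:explain
url = f"http://{ingress_host}:{ingress_port}/v1/models/{model_name}:explain"

# The body must be {"instances": [...]}; here one all-zero 28x28 image as a
# stand-in for an MNIST sample.
payload = {"instances": [[[0.0] * 28 for _ in range(28)]]}

# Sending it (needs the `requests` package and a running InferenceService) would
# look like:
#   resp = requests.post(url, json=payload, headers={"Host": service_hostname})
print(url)
print(json.dumps(payload)[:30])
```

The `Host` header matters because the ingress gateway routes on the service hostname, not on the IP/port the request is sent to.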