🧪 📒 Add nbmake to test the Jupyter notebooks (#863)
* Fix fastflow notebook

* Add nbmake to the requirements and add a pytest nbmake step to tox to test the Jupyter notebooks.

* Address cwd issue in getting started notebook

* Address cwd issue in btech notebook

* Address cwd issue in mvtec notebook

* Address cwd issue in folder dataset notebook

* Address cwd issue in tiling notebook

* Address cwd issue in model notebook

* Add the dataset directory to run the notebooks

* Refactor datamodule and dataset notebooks (#880)

re-order dataset notebooks and call prepare_data

* 🐞 Fix minor logic bug detecting empty list of images (#882)

This fixes a regression in `get_image_filenames()` introduced when a previous commit replaced the `len(image_filenames) == 0` check with a truthiness check. An empty list evaluates as `False`, not `True`, so the condition was inverted and the error never fired; adding `not` restores the intended behavior.
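The fix above hinges on Python truthiness: an empty list is falsy, so a bare `if image_filenames:` raises exactly when images *were* found. A minimal sketch of the corrected guard (the `find_images` helper and extension set are illustrative, not the library's actual API):

```python
from pathlib import Path

# Hypothetical subset of recognized image extensions.
IMG_EXTENSIONS = {".png", ".jpg", ".jpeg"}


def find_images(path: Path) -> list[Path]:
    """Collect image files under a directory, mirroring the fixed check."""
    image_filenames = [p for p in path.glob("**/*") if p.suffix in IMG_EXTENSIONS]
    # An empty list is falsy, so `not image_filenames` is True only when
    # nothing was found -- the buggy `if image_filenames:` had this inverted.
    if not image_filenames:
        raise ValueError(f"Found 0 images in {path}")
    return image_filenames


# Truthiness demonstration underlying the bug:
assert bool([]) is False
assert bool([Path("a.png")]) is True
```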

* Fixed typo

* Modified hpo notebooks to pass the tests

* Fix nncf jupyter notebook

* Loosen nncf version

* Fix isort, ignore benchmarking and openvino notebooks for now

---------
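The tox change mentioned in the bullets above is not part of this excerpt; as a rough sketch only, a dedicated tox environment driving nbmake through pytest's `--nbmake` flag could look like this (the environment name and notebook path are assumptions):

```ini
[testenv:notebooks]
deps =
    pytest
    nbmake
commands =
    pytest --nbmake notebooks/
```

nbmake executes each notebook top to bottom and fails the run if any cell raises, which is why the working-directory fixes listed in the bullets were needed before the notebooks could pass.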

Co-authored-by: Dick Ameln <dick.ameln@intel.com>
Co-authored-by: Tom Gambone <stillgreyfox@users.noreply.github.com>
3 people authored Feb 6, 2023
1 parent 3ba78bf commit 476655a
Showing 15 changed files with 1,741 additions and 1,687 deletions.
2 changes: 2 additions & 0 deletions .github/workflows/pre_merge.yml
@@ -37,6 +37,8 @@ jobs:
uses: actions/checkout@v2
- name: Install Tox
run: pip install tox
- name: Link the dataset path to the dataset directory in the repository root.
run: ln -s $ANOMALIB_DATASET_PATH ./datasets
- name: Coverage
run: tox -e pre_merge
- name: Upload coverage report
3 changes: 1 addition & 2 deletions .pre-commit-config.yaml
@@ -21,7 +21,7 @@ repos:

# python import sorting
- repo: https://github.com/PyCQA/isort
rev: 5.10.1
rev: 5.11.5
hooks:
- id: isort

@@ -75,7 +75,6 @@ repos:
- id: nbqa-black
- id: nbqa-isort
- id: nbqa-flake8
args: ["--max-line-length=120", "--ignore=E203,W503"]
- id: nbqa-pylint

- repo: https://github.com/pre-commit/mirrors-prettier
2 changes: 1 addition & 1 deletion anomalib/data/utils/image.py
@@ -37,7 +37,7 @@ def get_image_filenames(path: str | Path) -> list[Path]:
if path.is_dir():
image_filenames = [p for p in path.glob("**/*") if p.suffix in IMG_EXTENSIONS]

if image_filenames:
if not image_filenames:
raise ValueError(f"Found 0 images in {path}")

return image_filenames
70 changes: 41 additions & 29 deletions notebooks/000_getting_started/001_getting_started.ipynb
@@ -23,9 +23,15 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Installing Anomalib\n",
"\n",
"Installation can be done in two ways: (i) install via PyPI, or (ii) installing from source. In this notebook, we'll install it from source in order to get the most recent version of anomalib."
"## Installing Anomalib"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"Installation can be done in two ways: (i) install via PyPI, or (ii) install from source, both of which are shown below:"
]
},
{
@@ -51,15 +57,8 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### II. Install from Source"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"> NOTE: If you are running the notebook on Google Colab, make sure that you install the library from source to have access to the entire material, including configuration files, etc."
"### II. Install from Source\n",
"This option first clones the anomalib repository from GitHub and then installs `anomalib` from source, as shown below:"
]
},
{
@@ -69,17 +68,17 @@
"outputs": [],
"source": [
"# Option - II: Uncomment the next three lines if you want to install from the source.\n",
"# %git clone https://github.com/openvinotoolkit/anomalib.git\n",
"# !git clone https://github.com/openvinotoolkit/anomalib.git\n",
"# %cd anomalib\n",
"# %pip install -e ."
"# %pip install ."
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"> NOTE: If you are running the notebook from the `notebooks/000_getting_started` directory of the repository, you could install the library as follows:"
"Now let's verify the working directory. This ensures that the datasets and configs are accessible whether the notebook is run locally or on a platform such as Google Colab."
]
},
{
@@ -88,9 +87,28 @@
"metadata": {},
"outputs": [],
"source": [
"# If you are in `~/anomalib/notebooks/000_getting_started` directory, uncomment the following two lines.\n",
"# %cd ../..\n",
"# %pip install -e ."
"from __future__ import annotations\n",
"\n",
"import os\n",
"from pathlib import Path\n",
"from typing import Any\n",
"\n",
"from git.repo import Repo\n",
"\n",
"current_directory = Path.cwd()\n",
"if current_directory.name == \"000_getting_started\":\n",
" # On the assumption that the notebook is located in\n",
" # ~/anomalib/notebooks/000_getting_started/\n",
" root_directory = current_directory.parent.parent\n",
"elif current_directory.name == \"anomalib\":\n",
" # This means that the notebook is run from the main anomalib directory.\n",
" root_directory = current_directory\n",
"else:\n",
" # Otherwise, we'll need to clone the anomalib repo to the `current_directory`\n",
" repo = Repo.clone_from(url=\"https://github.com/openvinotoolkit/anomalib.git\", to_path=current_directory)\n",
" root_directory = current_directory / \"anomalib\"\n",
"\n",
"os.chdir(root_directory)"
]
},
{
@@ -106,11 +124,6 @@
"metadata": {},
"outputs": [],
"source": [
"from __future__ import annotations\n",
"\n",
"from pathlib import Path\n",
"from typing import Any\n",
"\n",
"import numpy as np\n",
"from IPython.display import display\n",
"from PIL import Image\n",
@@ -166,7 +179,7 @@
"outputs": [],
"source": [
"MODEL = \"padim\" # 'padim', 'cflow', 'stfpm', 'ganomaly', 'dfkde', 'patchcore'\n",
"CONFIG_PATH = f\"../../anomalib/models/{MODEL}/config.yaml\"\n",
"CONFIG_PATH = root_directory / f\"anomalib/models/{MODEL}/config.yaml\"\n",
"with open(file=CONFIG_PATH, mode=\"r\", encoding=\"utf-8\") as file:\n",
" print(file.read())"
]
@@ -185,8 +198,7 @@
"outputs": [],
"source": [
"# pass the config file to model, callbacks and datamodule\n",
"config = get_configurable_parameters(config_path=CONFIG_PATH)\n",
"config[\"dataset\"][\"path\"] = \"../../datasets/MVTec\" # or wherever the MVTec dataset is stored."
"config = get_configurable_parameters(config_path=CONFIG_PATH)"
]
},
{
@@ -211,7 +223,7 @@
"datamodule.prepare_data() # Create train/val/test/prediction sets.\n",
"\n",
"i, data = next(enumerate(datamodule.val_dataloader()))\n",
"data.keys()"
"print(data.keys())"
]
},
{
@@ -367,12 +379,12 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.13 (default, Oct 21 2022, 23:50:54) \n[GCC 11.2.0]"
"version": "3.8.13 (default, Nov 6 2022, 23:15:27) \n[GCC 9.3.0]"
},
"orig_nbformat": 4,
"vscode": {
"interpreter": {
"hash": "f26beec5b578f06009232863ae217b956681fd13da2e828fa5a0ecf8cf2ccd29"
"hash": "ae223df28f60859a2f400fae8b3a1034248e0a469f5599fd9a89c32908ed7a84"
}
}
},