Clean up and refactor the output of the OTX CLI #1946

Merged 22 commits on Mar 30, 2023

Changes from 10 commits
@@ -92,7 +92,7 @@ Building workspace folder
Comma-separated paths to unlabeled file list
--task TASK The currently supported options: ('CLASSIFICATION', 'DETECTION', 'INSTANCE_SEGMENTATION', 'SEGMENTATION', 'ACTION_CLASSIFICATION', 'ACTION_DETECTION', 'ANOMALY_CLASSIFICATION', 'ANOMALY_DETECTION', 'ANOMALY_SEGMENTATION').
--train-type TRAIN_TYPE
- The currently supported options: dict_keys(['INCREMENTAL', 'SEMISUPERVISED', 'SELFSUPERVISED']).
+ The currently supported options: dict_keys(['Incremental', 'Semisupervised', 'Selfsupervised']).
--workspace WORKSPACE Location where the workspace.
--model MODEL Enter the name of the model you want to use. (Ex. EfficientNet-B0).
--backbone BACKBONE Available Backbone Type can be found using 'otx find --backbone {framework}'.
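The casing change above makes the canonical train-type names capitalized (`Incremental`, `Semisupervised`, `Selfsupervised`), while comparisons elsewhere in this diff stay case-insensitive (e.g. `str(self.train_type).upper() == "SELFSUPERVISED"`). A minimal sketch of such a normalization, under the assumption that input is matched case-insensitively — the helper name is hypothetical and not part of OTX:

```python
# Hypothetical sketch: map user-supplied train-type strings to the
# canonical capitalized names, case-insensitively. Not OTX code.
CANONICAL_TRAIN_TYPES = ("Incremental", "Semisupervised", "Selfsupervised")


def normalize_train_type(value: str) -> str:
    """Return the canonical spelling for a train type, however it was cased."""
    for canonical in CANONICAL_TRAIN_TYPES:
        if value.upper() == canonical.upper():
            return canonical
    raise ValueError(f"{value} is not currently supported.")
```

Under this sketch, `normalize_train_type("SELFSUPERVISED")` and `normalize_train_type("Selfsupervised")` both yield the canonical `"Selfsupervised"`.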
2 changes: 1 addition & 1 deletion docs/source/guide/tutorials/advanced/self_sl.rst
@@ -64,7 +64,7 @@ for **self-supervised learning** by running the following command:

.. code-block::

- (otx) ...$ otx build --train-data-roots data/flower_photos --model MobileNet-V3-large-1x --train-type SELFSUPERVISED --workspace otx-workspace-CLASSIFICATION-SELFSUPERVISED
+ (otx) ...$ otx build --train-data-roots data/flower_photos --model MobileNet-V3-large-1x --train-type SELFSUPERVISED --workspace otx-workspace-CLASSIFICATION-Selfsupervised

[*] Workspace Path: otx-workspace-CLASSIFICATION-Selfsupervised
[*] Load Model Template ID: Custom_Image_Classification_MobileNet-V3-large-1x
2 changes: 1 addition & 1 deletion docs/source/guide/tutorials/advanced/semi_sl.rst
@@ -150,7 +150,7 @@ In the train log, you can check that the train type is set to **Semisupervised**
...


- After training ends, a trained model is saved in the ``latest`` sub-directory in the workspace named ``otx-workspace-CLASSIFICATION`` by default.
+ After training ends, a trained model is saved in the ``latest_trained_model`` sub-directory in the workspace named ``otx-workspace-CLASSIFICATION`` by default.


***************************
@@ -215,12 +215,12 @@ Export
It allows running the model on the Intel hardware much more efficiently, especially on the CPU. Also, the resulting IR model is required to run POT optimization. IR model consists of two files: ``openvino.xml`` for weights and ``openvino.bin`` for architecture.

2. Run the command line below to export the trained model
- and save the exported model to the ``openvino_models`` folder.
+ and save the exported model to the ``openvino`` folder.

.. code-block::

(otx) ...$ otx export --load-weights models/weights.pth \
-     --output openvino_models
+     --output openvino

...
2023-02-21 22:54:32,518 - mmaction - INFO - Model architecture: X3D
@@ -241,8 +241,8 @@ using ``otx eval`` and passing the IR model path to the ``--load-weights`` parameter.
.. code-block::

(otx) ...$ otx eval --test-data-roots ../data/hmdb51/CVAT/valid \
-     --load-weights openvino_models/openvino.xml \
-     --output outputs/openvino_models
+     --load-weights openvino/openvino.xml \
+     --output outputs/openvino

...

@@ -262,7 +262,7 @@ OpenVINO™ model (.xml) with OpenVINO™ POT.

.. code-block::

- (otx) ...$ otx optimize --load-weights openvino_models/openvino.xml \
+ (otx) ...$ otx optimize --load-weights openvino/openvino.xml \
--output pot_model

...
@@ -163,7 +163,7 @@ Export
It allows running the model on the Intel hardware much more efficiently, especially on the CPU. Also, the resulting IR model is required to run POT optimization. IR model consists of two files: ``openvino.xml`` for weights and ``openvino.bin`` for architecture.

2. Run the command line below to export the trained model
- and save the exported model to the ``openvino_models`` folder.
+ and save the exported model to the ``openvino`` folder.

.. code-block::

@@ -213,7 +213,7 @@ OpenVINO™ model (.xml) with OpenVINO™ POT.

.. code-block::

- (otx) ...$ otx optimize --load-weights openvino_models/openvino.xml \
+ (otx) ...$ otx optimize --load-weights openvino/openvino.xml \
--save-model-to pot_model

...
@@ -159,13 +159,13 @@ Export
It allows running the model on the Intel hardware much more efficient, especially on the CPU. Also, the resulting IR model is required to run POT optimization. IR model consists of 2 files: ``openvino.xml`` for weights and ``openvino.bin`` for architecture.

2. We can run the below command line to export the trained model
- and save the exported model to the ``openvino_models`` folder:
+ and save the exported model to the ``openvino`` folder:

.. code-block::

otx export ote_anomaly_detection_padim \
--load-weights otx-workspace-ANOMALY_DETECTION/models/weights.pth \
-     --output otx-workspace-ANOMALY_DETECTION/openvino_models
+     --output otx-workspace-ANOMALY_DETECTION/openvino

You will see the outputs similar to the following:

@@ -187,8 +187,8 @@ Now that we have the exported model, let's check its performance using ``otx eval``:

otx eval ote_anomaly_detection_padim \
--test-data-roots datasets/MVTec/bottle/test \
-     --load-weights otx-workspace-ANOMALY_DETECTION/openvino_models/openvino.xml \
-     --output otx-workspace-ANOMALY_DETECTION/openvino_models
+     --load-weights otx-workspace-ANOMALY_DETECTION/openvino/openvino.xml \
+     --output otx-workspace-ANOMALY_DETECTION/openvino

This gives the following results:

@@ -210,7 +210,7 @@ optimization.

otx optimize ote_anomaly_detection_padim \
--train-data-roots datasets/MVTec/bottle/train \
-     --load-weights otx-workspace-ANOMALY_DETECTION/openvino_models/openvino.xml \
+     --load-weights otx-workspace-ANOMALY_DETECTION/openvino/openvino.xml \
--output otx-workspace-ANOMALY_DETECTION/pot_model

This command generates the following files that can be used to run :doc:`otx demo <../demo>`:
17 changes: 11 additions & 6 deletions otx/cli/manager/config_manager.py
@@ -133,8 +133,9 @@ def output_path(self) -> Path:
if "output" in self.args and self.args.output:
output_path = Path(self.args.output)
else:
- output_path = self.workspace_root / "outputs" / self.create_date
- output_path.mkdir(exist_ok=True, parents=True)
+ output_path = self.workspace_root / "outputs" / f"{self.create_date}_{self.mode}"
+ if not output_path.exists():
+     output_path.mkdir(exist_ok=True, parents=True)
return output_path

def check_workspace(self) -> bool:
@@ -189,7 +190,7 @@ def configure_data_config(self, update_data_yaml: bool = True) -> None:
"""Configure data_config according to the situation and create data.yaml."""
data_yaml_path = self.data_config_file_path
data_yaml = configure_dataset(self.args, data_yaml_path=data_yaml_path)
- if self.mode in ("train", "build"):
+ if self.mode in ("train", "build", "optimize"):
use_auto_split = data_yaml["data"]["train"]["data-roots"] and not data_yaml["data"]["val"]["data-roots"]
# FIXME: Hardcoded for Self-Supervised Learning
if use_auto_split and str(self.train_type).upper() != "SELFSUPERVISED":
@@ -210,7 +211,11 @@ def _get_train_type(self, ignore_args: bool = False) -> str:
if arg_algo_backend:
train_type = arg_algo_backend.get("train_type", {"value": "Incremental"}) # type: ignore
return train_type.get("value", "Incremental")
- if hasattr(self.args, "train_type") and self.mode in ("build", "train") and self.args.train_type:
+ if (
+     hasattr(self.args, "train_type")
+     and self.mode in ("build", "train", "optimize")
+     and self.args.train_type
+ ):
self.train_type = self.args.train_type
if self.train_type not in TASK_TYPE_TO_SUB_DIR_NAME:
raise NotSupportedError(f"{self.train_type} is not currently supported by otx.")
@@ -276,7 +281,7 @@ def _get_arg_data_yaml(self):
# TODO: This should modify data yaml format to data_config format.
"""Save the splitted dataset and data.yaml to the workspace."""
data_yaml = self._create_empty_data_cfg()
- if self.mode == "train":
+ if self.mode in ("train", "optimize"):
if self.args.train_data_roots:
data_yaml["data"]["train"]["data-roots"] = self.args.train_data_roots
if self.args.val_data_roots:
@@ -379,7 +384,7 @@ def update_data_config(self, data_yaml: dict) -> None:
"file_list": data_yaml["data"]["unlabeled"]["file-list"],
}
# FIXME: Hardcoded for Self-Supervised Learning
- if self.mode == "train" and str(self.train_type).upper() == "SELFSUPERVISED":
+ if self.mode in ("train", "optimize") and str(self.train_type).upper() == "SELFSUPERVISED":
self.data_config["val_subset"] = {"data_root": None}

def _get_template(self, task_type: str, model: Optional[str] = None) -> ModelTemplate:
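The `output_path` change in `config_manager.py` appends the CLI mode to the timestamped folder name, so runs of different subcommands no longer collide under `outputs/`. A rough standalone sketch of that naming scheme — the timestamp format here is made up, since the real `create_date` format is not shown in this diff:

```python
from pathlib import Path


def make_output_path(workspace_root: Path, create_date: str, mode: str) -> Path:
    """Sketch of the new per-run naming: outputs/<create_date>_<mode>."""
    output_path = workspace_root / "outputs" / f"{create_date}_{mode}"
    # Idempotent: creating the directory twice is harmless.
    output_path.mkdir(exist_ok=True, parents=True)
    return output_path
```

With this scheme, a `train` run and an `optimize` run started at the same minute land in `..._train` and `..._optimize` directories rather than sharing one folder.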
6 changes: 1 addition & 5 deletions otx/cli/tools/build.py
@@ -17,8 +17,6 @@
# See the License for the specific language governing permissions
# and limitations under the License.

- from pathlib import Path

from otx.cli.manager.config_manager import TASK_TYPE_TO_SUB_DIR_NAME, ConfigManager
from otx.cli.utils.parser import get_parser_and_hprams_data

@@ -88,11 +86,9 @@ def main():
"""Main function for model or backbone or task building."""

args = get_args()
- config_manager = ConfigManager(args, mode="build")
+ config_manager = ConfigManager(args, workspace_root=args.workspace, mode="build")
if args.task:
config_manager.task_type = args.task.upper()
- if args.workspace:
-     config_manager.workspace_root = Path(args.workspace)

# Auto-Configuration for model template
config_manager.configure_template(model=args.model)
2 changes: 1 addition & 1 deletion otx/cli/tools/demo.py
@@ -106,7 +106,7 @@ def main():
# Dynamically create an argument parser based on override parameters.
args, override_param = get_args()

- config_manager = ConfigManager(args, mode="eval")
+ config_manager = ConfigManager(args, mode="demo")
# Auto-Configuration for model template
config_manager.configure_template()

7 changes: 2 additions & 5 deletions otx/cli/tools/deploy.py
@@ -58,11 +58,8 @@ def main():
hyper_parameters = template.hyper_parameters.data
assert hyper_parameters

- if not args.load_weights and config_manager.check_workspace():
-     exported_weight_path = config_manager.workspace_root / "latest/openvino_models/openvino.xml"
-     if not exported_weight_path.exists():
-         raise RuntimeError("No appropriate OpenVINO exported model was found.")
-     args.load_weights = str(exported_weight_path)
+ if not args.load_weights:
+     raise RuntimeError("No appropriate OpenVINO exported model was found.")

# Get classes for Task, ConfigurableParameters and Dataset.
if not args.load_weights.endswith(".bin") and not args.load_weights.endswith(".xml"):
5 changes: 4 additions & 1 deletion otx/cli/tools/eval.py
@@ -88,7 +88,10 @@ def main():
config_manager.configure_template()

if not args.load_weights and config_manager.check_workspace():
- args.load_weights = str(config_manager.workspace_root / "latest" / "weights.pth")
+ latest_model_path = (
+     config_manager.workspace_root / "outputs" / "latest_trained_model" / "models" / "weights.pth"
+ )
+ args.load_weights = str(latest_model_path)

# Update Hyper Parameter Configs
hyper_parameters = config_manager.get_hyparams_config(override_param)
2 changes: 1 addition & 1 deletion otx/cli/tools/explain.py
@@ -82,7 +82,7 @@ def main():

args, override_param = get_args()

- config_manager = ConfigManager(args, mode="eval")
+ config_manager = ConfigManager(args, mode="explain")
# Auto-Configuration for model template
config_manager.configure_template()

15 changes: 6 additions & 9 deletions otx/cli/tools/export.py
@@ -63,7 +63,7 @@ def get_args():
def main():
"""Main function that is used for model exporting."""
args = get_args()
- config_manager = ConfigManager(args, mode="eval", workspace_root=args.workspace)
+ config_manager = ConfigManager(args, mode="export", workspace_root=args.workspace)
# Auto-Configuration for model template
config_manager.configure_template()

@@ -72,7 +72,10 @@ def main():

# Get class for Task.
if not args.load_weights and config_manager.check_workspace():
- args.load_weights = str(config_manager.workspace_root / "latest" / "weights.pth")
+ latest_model_path = (
+     config_manager.workspace_root / "outputs" / "latest_trained_model" / "models" / "weights.pth"
+ )
+ args.load_weights = str(latest_model_path)

is_nncf = is_checkpoint_nncf(args.load_weights)
task_class = get_impl_class(template.entrypoints.nncf if is_nncf else template.entrypoints.base)
@@ -107,18 +110,12 @@ def main():

if not args.output:
output_path = config_manager.output_path
- output_path = output_path / "openvino_models"
+ output_path = output_path / "openvino"
else:
output_path = Path(args.output)
output_path.mkdir(exist_ok=True, parents=True)
save_model_data(exported_model, str(output_path))

- # Softlink to weights & openvino_models
- pre_weight_path = Path(args.load_weights).resolve().parent / "openvino_models"
- if pre_weight_path.exists():
-     pre_weight_path.unlink()
- pre_weight_path.symlink_to(output_path.resolve())

return dict(retcode=0, template=template.name)


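The same fallback block — defaulting `--load-weights` to `outputs/latest_trained_model/models/weights.pth` inside the workspace — now appears verbatim in `eval.py`, `export.py`, and `optimize.py`. It could be factored into a helper along these lines (a sketch only; this function does not exist in the PR):

```python
from pathlib import Path
from typing import Optional


def default_load_weights(workspace_root: Path, load_weights: Optional[str]) -> str:
    """Use explicit weights if given, else fall back to the latest trained model."""
    if load_weights:
        return load_weights
    # Path introduced by this PR: the symlink train.py keeps up to date.
    return str(workspace_root / "outputs" / "latest_trained_model" / "models" / "weights.pth")
```

Centralizing the fallback would keep the three tools from drifting apart if the workspace layout changes again.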
13 changes: 5 additions & 8 deletions otx/cli/tools/optimize.py
@@ -78,13 +78,16 @@ def main():
# Dynamically create an argument parser based on override parameters.
args, override_param = get_args()

- config_manager = ConfigManager(args, workspace_root=args.workspace, mode="train")
+ config_manager = ConfigManager(args, workspace_root=args.workspace, mode="optimize")
# Auto-Configuration for model template
config_manager.configure_template()

# The default in the workspace is the model weight of the OTX train.
if not args.load_weights and config_manager.check_workspace():
- args.load_weights = str(config_manager.workspace_root / "latest" / "weights.pth")
+ latest_model_path = (
+     config_manager.workspace_root / "outputs" / "latest_trained_model" / "models" / "weights.pth"
+ )
+ args.load_weights = str(latest_model_path)

is_pot = False
if args.load_weights.endswith(".bin") or args.load_weights.endswith(".xml"):
@@ -135,12 +138,6 @@ def main():
output_path.mkdir(exist_ok=True, parents=True)
save_model_data(output_model, output_path)

- # Softlink to weights & optimized models
- pre_weight_path = Path(args.load_weights).resolve().parent / opt_method
- if pre_weight_path.exists():
-     pre_weight_path.unlink()
- pre_weight_path.symlink_to(output_path.resolve())

validation_dataset = dataset.get_subset(Subset.VALIDATION)
predicted_validation_dataset = task.infer(
validation_dataset.with_empty_annotations(),
9 changes: 6 additions & 3 deletions otx/cli/tools/train.py
@@ -230,9 +230,12 @@ def main():  # pylint: disable=too-many-branches

save_model_data(output_model, str(config_manager.output_path / "models"))
# Latest model folder symbolic link to models
- if (config_manager.workspace_root / "latest").exists():
-     (config_manager.workspace_root / "latest").unlink()
- (config_manager.workspace_root / "latest").symlink_to((config_manager.output_path / "models").resolve())
+ latest_path = config_manager.workspace_root / "outputs" / "latest_trained_model"
+ if latest_path.exists():
+     latest_path.unlink()
+ elif not latest_path.parent.exists():
+     latest_path.parent.mkdir(exist_ok=True, parents=True)
+ latest_path.symlink_to(config_manager.output_path.resolve())

if config_manager.data_config["val_subset"]["data_root"]:
validation_dataset = dataset.get_subset(Subset.VALIDATION)
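The `train.py` change replaces the old `<workspace>/latest` link with `outputs/latest_trained_model`, re-pointed after every training run. The refresh pattern can be sketched in isolation (function name and paths are illustrative, not OTX API):

```python
from pathlib import Path


def refresh_latest_link(workspace_root: Path, run_output: Path) -> Path:
    """Point outputs/latest_trained_model at the most recent run's output."""
    latest = workspace_root / "outputs" / "latest_trained_model"
    latest.parent.mkdir(exist_ok=True, parents=True)
    # Drop a stale link first; is_symlink() also catches links whose
    # target has been deleted (exists() would follow and return False).
    if latest.is_symlink() or latest.exists():
        latest.unlink()
    latest.symlink_to(run_output.resolve())
    return latest
```

Checking `is_symlink()` in addition to `exists()` is a defensive variation on the PR's code, which only calls `exists()` and could leave a broken link in place if a previous run's output directory was removed.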