
Coverage update #42

Merged
merged 7 commits into from
Jan 24, 2025

Conversation

rigoudyg

Hi,
Here is a new pull request covering only tests and coverage usage. @matthew-mizielinski it should be complementary to your pull request #39 for testing purposes.
To launch coverage, there is now a dedicated script (launch_test_with_coverage.sh) which launches the basic tests using pytest as well as all the scripts contained in the scripts directory.
To launch coverage by hand (from the root directory of the repository):
coverage erase
coverage run
coverage combine
coverage html
Launching the scripts works from either the scripts directory or the root directory.
The current coverage level is 67% (on everything under the stable version). The minimum level below which the coverage script fails is currently 50%; it can be customised in the .coveragerc file.
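The 50% threshold lives in .coveragerc. A minimal sketch of the relevant options (the section and option names are coverage.py's documented ones; the values here are illustrative, not necessarily the repository's actual settings):

```ini
# .coveragerc -- illustrative sketch
[run]
parallel = True        ; write per-process data files so `coverage combine` can merge them

[report]
fail_under = 50        ; a total below this percentage makes `coverage report` exit non-zero
```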
coverage.json

Maybe we could consider the following changes:

  • allow scripts to specify an output directory for the test case (using tempfile to create a temporary directory that is automatically deleted once the test is finished, making it easier to tidy the run directory) -> done in database_transformation.py and workflow_example_2.py
  • allow specifying the directory into which the databases are downloaded (useful for users who work from an administrator install and cannot write in the source directory)
  • allow running the query content offline to get the last cached version (currently not possible because the version is checked), and maybe have a function to retrieve cached versions
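The first bullet above can be sketched with the standard library's tempfile module. The wrapper name run_script_in_tmpdir is hypothetical, but the automatic-cleanup behaviour is exactly what TemporaryDirectory provides:

```python
import os
import tempfile

def run_script_in_tmpdir(script):
    # Hypothetical wrapper: give the script a throwaway output directory
    # that is removed automatically once the test is finished.
    with tempfile.TemporaryDirectory() as output_dir:
        script(output_dir)                 # the script writes its files here
        produced = os.listdir(output_dir)  # inspect results while the dir still exists
    # On exit from the `with` block the directory and its contents are gone,
    # so the run directory stays tidy.
    return produced

# Stand-in "script" that writes a single output file
files = run_script_in_tmpdir(
    lambda d: open(os.path.join(d, "DR_content.json"), "w").close()
)
```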

Comment on lines 11 to 12
from . import dreq_query as dq
from . import dreq_classes

may be better:

from data_request_api.stable.query import dreq_query as dq
from data_request_api.stable.query import dreq_classes

? in case this script moves


if args.output_dir in ["default", "customed"]:

does this implement this feature:

  • be able to specify the directory in which the database are downloaded (useful for users which could use an administrator install and could not write in the source directory)

from the PR description?

@rigoudyg (Author), Jan 23, 2025

No, it only deals with the files derived from the export file.

parser.add_argument("--dreq_export_version", default="release", help="Export version to be used")
parser.add_argument("--use_consolidation", default=False, help="Should content consolidation be used?")
parser.add_argument("--output_dir", choices=["default", "test", "customed"], default="default",
                    help="Output directory management")

customed --> custom

Maybe simpler to have just one option, --output_dir, which gives the name of the output dir; if it is not given, the default is used? I had to run:

python workflow_example_2.py --output_dir customed --output_dir_customed tmp

@rigoudyg (Author)

Change done in the branch. It may be less robust (no check on the value is possible in this case).
The possible values are now:

  • "default": the default location of dreq_api
  • "test": a temporary directory for the tests, deleted once the tests are finished
  • the path of the output directory
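The single-option scheme described above can be sketched with argparse. The resolver below is an illustrative reconstruction, not the code from the branch; the "default" location is a placeholder:

```python
import argparse
import tempfile

def resolve_output_dir(value, default_dir="dreq_api_default"):
    # "default" -> the default dreq_api location (placeholder path here)
    # "test"    -> a temporary directory, deleted by the caller after the tests
    # anything else is taken verbatim as the output path
    if value == "default":
        return default_dir
    if value == "test":
        return tempfile.mkdtemp()
    return value

parser = argparse.ArgumentParser()
# No `choices=`: an arbitrary path must be accepted, so values
# cannot be validated up front (the robustness trade-off noted above).
parser.add_argument("--output_dir", default="default", help="Output directory management")

args = parser.parse_args(["--output_dir", "tmp"])
print(resolve_output_dir(args.output_dir))  # -> "tmp"
```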


def get_information_from_data_request(dreq_version, dreq_export_version, use_consolidation, output_dir):
### Step 1: Get the data request content
DR_content, VS_content = get_transformed_content(version=dreq_version, export_version=dreq_export_version,

when I run this with a custom output dir, the separated json files appear in that dir (DR_release_content.json, VS_release_content.json). Would the original export file also appear there if it wasn't already cached?

@rigoudyg (Author)

No, it wouldn't; the directory into which the original export file is put does not seem to be customisable. But I may be wrong. This is one of the things I highlighted in my first comment on this pull request.
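Making the download location configurable could look like the sketch below. get_export_file is a hypothetical helper, shown only to illustrate threading a cache_dir parameter through; the real dreq_api functions and their signatures may differ:

```python
import os

def get_export_file(version, cache_dir=None):
    # Hypothetical: fall back to a fixed location when no cache_dir is given,
    # mirroring the current non-configurable behaviour.
    cache_dir = cache_dir or os.path.join(os.getcwd(), ".dreq_cache")
    os.makedirs(cache_dir, exist_ok=True)
    path = os.path.join(cache_dir, f"dreq_{version}.json")
    if os.path.exists(path):
        return path  # offline case: reuse the cached copy
    return None      # the caller would trigger a download here

import tempfile
demo_dir = tempfile.mkdtemp()
print(get_export_file("release", cache_dir=demo_dir))  # no cached file yet -> None
```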

Ran for me, and the html report htmlcov/index.html is very nice! It took a long time to run, maybe 20 min... should it?

Does the six module need to be added as a dependency?

Can this file and .coveragerc be moved into tests/ just to keep the top-level repo dir uncluttered?

@rigoudyg (Author)

I don't think we need to add the six package as a dependency, but I may be wrong, as we are in Python 3. Did you have issues with it when you ran the API for the first time?

The htmlcov directory can be customised through .coveragerc. I will change the default value to a place in the tests directory in my branch.
I'm not sure that we can move .coveragerc (it won't be taken into account if I do so).
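Relocating the HTML report can be done with the [html] section's directory option (a documented coverage.py setting; the path here is illustrative):

```ini
; Illustrative .coveragerc fragment -- put the HTML report under tests/
[html]
directory = tests/htmlcov
```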

@rigoudyg (Author)

I have checked, and I have changed the locations of the temporary files and of .coveragerc too.

@rigoudyg (Author), Jan 23, 2025

To be done once #39 is merged: change the default cache for pytest in the pyproject.toml file (note that in pyproject.toml the pytest section is [tool.pytest.ini_options] and string values are quoted):
[tool.pytest.ini_options]
cache_dir = "tests/.pytest_cache"

@rigoudyg (Author)

The tests are quite long because some scripts perform the splitting of the single database into the several databases.

I have just tested the launch_test_with_coverage.sh script and it works fine, running all the coverage tests (about 20 min, as for @JamesAnstey) with a nice html view. Well done @rigoudyg 👍

# Conflicts:
#	data_request_api/command_line/export_dreq_lists_json.py
#	data_request_api/dev/JA/workflow_example_test.py
#	data_request_api/stable/content/dreq_api/consolidate_export.py
#	data_request_api/stable/content/dreq_api/dreq_content.py
#	data_request_api/stable/content/dreq_api/test_dreq_content.py
#	data_request_api/stable/content/dump_transformation.py
#	data_request_api/stable/query/dreq_query.py
#	data_request_api/stable/query/get_variables_metadata.py
#	data_request_api/stable/utilities/tools.py
#	requirements.txt
#	scripts/database_transformation.py
#	scripts/workflow_example.py
#	scripts/workflow_example_2.py
#	tests/test_data_request.py
#	tests/test_dump_transformation.py
#	tests/test_vocabulary_server.py
#	tests/test_workflow.py
@rigoudyg rigoudyg merged commit 4fe1972 into CMIP-Data-Request:main Jan 24, 2025
0 of 16 checks passed
@rigoudyg rigoudyg deleted the gr_coverage branch January 24, 2025 15:37
3 participants