Coverage update #42
Conversation
…erage configuration file
from . import dreq_query as dq
from . import dreq_classes
may be better:
from data_request_api.stable.query import dreq_query as dq
from data_request_api.stable.query import dreq_classes
in case this script moves?
scripts/database_transformation.py
Outdated
if args.output_dir in ["default", "customed"]:
does this implement this feature:
- be able to specify the directory into which the databases are downloaded (useful for users who have an administrator install and cannot write in the source directory)
from the PR description?
No, it only deals with the files derived from the export file.
scripts/workflow_example_2.py
Outdated
parser.add_argument("--dreq_export_version", default="release", help="Export version to be used")
parser.add_argument("--use_consolidation", default=False, help="Should content consolidation be used?")
parser.add_argument("--output_dir", choices=["default", "test", "customed"], default="default",
                    help="Output directory management")
customed --> custom
Maybe simpler to have just one option, --output_dir, which gives the name of the output dir, with the default used if it isn't given? I had to run:
python workflow_example_2.py --output_dir customed --output_dir_customed tmp
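The suggested single-option interface could look like this (a sketch only; the option name and default value are illustrative, not the actual code of workflow_example_2.py):

```python
import argparse

# Sketch of the simplification suggested above: one --output_dir option
# whose value is the output directory itself, with a sentinel value
# "default" standing for the standard location when the option is absent.
parser = argparse.ArgumentParser()
parser.add_argument("--output_dir", default="default",
                    help="Output directory (or 'default' for the standard location)")

# With this interface, the command above would reduce to:
#   python workflow_example_2.py --output_dir tmp
args = parser.parse_args(["--output_dir", "tmp"])
```

As noted in the reply below this comment, the trade-off is that argparse can no longer validate the value via `choices`.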
Change made in the branch. It may be less robust (no check on the value is possible in this case).
Now, possible values are:
- "default": default location of dreq_api
- "test": temporary directory for the tests, which will be deleted once the tests are finished
- the path of the output directory
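The three cases could be resolved along these lines (a hypothetical helper for illustration; the actual implementation in the branch may differ):

```python
import tempfile

def resolve_output_dir(value, default_dir="dreq_api"):
    """Illustrative resolution of the three accepted --output_dir values."""
    if value == "default":
        # default location of dreq_api (name assumed here)
        return default_dir
    if value == "test":
        # temporary directory, to be deleted once the tests are finished
        return tempfile.mkdtemp(prefix="dreq_test_")
    # any other value is taken as the output directory path itself
    return value
```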
def get_information_from_data_request(dreq_version, dreq_export_version, use_consolidation, output_dir):
    ### Step 1: Get the data request content
    DR_content, VS_content = get_transformed_content(version=dreq_version, export_version=dreq_export_version,
When I ran this with a custom output dir, the separated json files appeared in that dir (DR_release_content.json, VS_release_content.json). Would the original export file also appear there if it wasn't already cached?
No, it wouldn't; the directory in which the original export file is put does not seem to be customisable. But I may be wrong. This is one of the things I highlighted in my first comment about this pull request.
launch_test_with_coverage.sh
Outdated
Ran for me, and the html report htmlcov/index.html is very nice! It took a long time to run, maybe 20 min... should it?
Does the six module need to be added as a dependency?
Can this file and .coveragerc be moved into tests/ just to keep the top-level repo dir uncluttered?
I don't think we need to add the six package as a dependency, but I may be wrong as we are in Python 3. Did you have issues with it when you ran the API the first time?
The htmlcov directory can be customised through .coveragerc. I will change the default value to a place in the tests directory in my branch.
I'm not sure that we can move .coveragerc (it won't be taken into account if I do so).
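(An aside based on the coverage.py documentation rather than on this repository: coverage can be pointed at a config file in a non-default location, which might make moving .coveragerc under tests/ feasible.)

```shell
# coverage.py accepts an explicit config path (see its docs),
# so the file could in principle live under tests/:
coverage run --rcfile=tests/.coveragerc
# or via an environment variable:
COVERAGE_RCFILE=tests/.coveragerc coverage run
```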
I have checked, and I have changed the locations of the temporary files and of .coveragerc too.
To be done once #39 is merged: change the default cache directory for pytest in the pyproject.toml file:
[tool.pytest.ini_options]
cache_dir = "tests/.pytest_cache"
The tests are quite long because some scripts perform the splitting of the single database into the several databases.
I have just tested the launch_test_with_coverage.sh script and it works fine, running all the coverage tests (about 20 min, as for @JamesAnstey) with a nice html view. Well done @rigoudyg 👍
…output directories.
# Conflicts:
#   data_request_api/command_line/export_dreq_lists_json.py
#   data_request_api/dev/JA/workflow_example_test.py
#   data_request_api/stable/content/dreq_api/consolidate_export.py
#   data_request_api/stable/content/dreq_api/dreq_content.py
#   data_request_api/stable/content/dreq_api/test_dreq_content.py
#   data_request_api/stable/content/dump_transformation.py
#   data_request_api/stable/query/dreq_query.py
#   data_request_api/stable/query/get_variables_metadata.py
#   data_request_api/stable/utilities/tools.py
#   requirements.txt
#   scripts/database_transformation.py
#   scripts/workflow_example.py
#   scripts/workflow_example_2.py
#   tests/test_data_request.py
#   tests/test_dump_transformation.py
#   tests/test_vocabulary_server.py
#   tests/test_workflow.py
Hi,
Here is a new pull request only on tests and coverage usage. @matthew-mizielinski it should be complementary to your pull request #39 for testing purposes.
To launch coverage, there is now a dedicated script (launch_test_with_coverage.sh) which launches the basic tests using pytest and all scripts contained in the scripts directory.
To launch coverage by hand (from the root directory of the repository):
coverage erase
coverage run
coverage combine
coverage html
Launching the scripts works from either the scripts directory or the root directory of the repository.
The current level of coverage is 67% (on everything under the stable version). The minimum level below which the coverage script fails is currently 50% and can be customised in the .coveragerc file.
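For illustration, that failure threshold typically lives in the [report] section of .coveragerc (a sketch; the actual file in the repository may differ):

```ini
# .coveragerc (sketch) — report section controlling the failure threshold
[report]
fail_under = 50    ; `coverage report` exits non-zero below this percentage
```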
Maybe we could consider doing the following changes: