Add unit tests + CI + code coverage #10
New CircleCI configuration:

@@ -0,0 +1,43 @@
```yaml
version: 2
jobs:
  build:
    docker:
      - image: circleci/python:3.5
    working_directory: ~/repo

    steps:
      - checkout

      - restore_cache:
          keys:
            - v1-dependencies-{{ checksum "dev-requirements.txt" }}-{{ checksum "setup.py" }}
            # fallback to using the latest cache if no exact match is found
            - v1-dependencies-

      - run:
          name: install dependencies
          command: |
            python3 -m venv venv
            . venv/bin/activate
            pip install -r dev-requirements.txt
            pip install -e .

      - save_cache:
          paths:
            - ./venv
          key: v1-dependencies-{{ checksum "dev-requirements.txt" }}-{{ checksum "setup.py" }}

      - run:
          name: run tests
          command: |
            . venv/bin/activate
            # Have CircleCI display pretty test result output
            mkdir test-reports
            py.test --cov=jupyter_telemetry --junitxml=test-reports/junit.xml tests/
            # Upload coverage info to codecov
            codecov

      - store_test_results:
          path: test-reports
      - store_artifacts:
          path: test-reports
```
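As a side note (not part of the diff): the "run tests" step above can be reproduced locally through pytest's Python API. The sketch below is only an assumed local equivalent; the flags and the test-reports path are copied from the config, and the codecov upload would still be run separately.

```python
# Rough local equivalent of the CI "run tests" step above (illustrative only).
import os
import pytest

os.makedirs("test-reports", exist_ok=True)  # the CI step does this with `mkdir test-reports`
exit_code = pytest.main([
    "--cov=jupyter_telemetry",            # coverage for the package, as in the config
    "--junitxml=test-reports/junit.xml",  # JUnit XML that CircleCI renders as test results
    "tests/",
])
raise SystemExit(exit_code)
```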
README changes:

@@ -1,2 +1,6 @@
```markdown
# Telemetry

[![circleci](https://circleci.com/gh/jupyter/telemetry?style=shield)](https://circleci.com/gh/jupyter/telemetry)
[![codecov](https://codecov.io/gh/jupyter/telemetry/branch/master/graph/badge.svg)](https://codecov.io/gh/jupyter/telemetry)

Telemetry for Jupyter Applications and extensions.
```
New dev requirements file:

@@ -0,0 +1,3 @@
```
pytest
pytest-cov
codecov
```
One existing file is deleted by this PR.
New test file:

@@ -0,0 +1,148 @@
```python
from contextlib import redirect_stderr
import json
import jsonschema
import logging
import pytest
import io

from jupyter_telemetry.eventlog import EventLog


def test_register_invalid_schema():
    """
    Invalid JSON Schemas should fail registration
    """
    el = EventLog()
    with pytest.raises(jsonschema.SchemaError):
        el.register_schema({
            # Totally invalid
            'properties': True
        })
```

Review comment: I think it would be worth adding comments that describe the specific "failures" we are testing in each chunk below. Another approach would be to break each chunk into a separate unit test. It doesn't matter to me, but some of these failures aren't what I expect. See comments below.

```python
def test_missing_required_properties():
    """
    id and $version are required properties in our schemas.

    They aren't required by JSON Schema itself
    """
    el = EventLog()
    with pytest.raises(ValueError):
        el.register_schema({
            'properties': {}
        })

    with pytest.raises(ValueError):
        el.register_schema({
            '$id': 'something',
            '$version': 1,  # This should have been 'version'
        })
```

Review comment (on the `'properties': {}` registration): Should this throw a `KeyError` instead?

Reply: `KeyError` says "Raised when a mapping (dictionary) key is not found in the set of existing keys.", while `ValueError` says "Raised when an operation or function receives an argument that has the right type but an inappropriate value, and the situation is not described by a more precise exception such as IndexError." This makes me feel `ValueError` is the right thing, since we aren't passing anything in directly with the key.

Reply: 👍

Review comment (on the second `pytest.raises(ValueError)` block): It's not immediately obvious to me why this should fail. I certainly didn't notice that the failure is caused by an extra `$` in `$version`.

Reply: Done!

Review comment: Should we add a test that fails because a schema uses a reserved dunder (`__`-prefixed) key?

Reply: Added!

```python
def test_reserved_properties():
    """
    User schemas can't have properties starting with __

    These are reserved
    """
    el = EventLog()
    with pytest.raises(ValueError):
        el.register_schema({
            '$id': 'test/test',
            'version': 1,
            'properties': {
                '__fail__': {
                    'type': 'string'
                },
            },
        })
```

Review comment: Nice test! :)
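The registration checks exercised by `test_missing_required_properties` and `test_reserved_properties` are not visible in this diff. As a rough sketch of the behaviour the tests expect, assuming `EventLog.register_schema` performs checks along these lines (the function below is illustrative, not the actual implementation):

```python
# Illustrative sketch of the checks the registration tests expect (not the real code).
import jsonschema


def check_registered_schema(schema):
    # Structurally invalid schemas are rejected by jsonschema itself (SchemaError),
    # as in test_register_invalid_schema.
    jsonschema.Draft7Validator.check_schema(schema)

    # '$id' and 'version' are required by this project, not by JSON Schema.
    for key in ('$id', 'version'):
        if key not in schema:
            raise ValueError('{} is required in a registered schema'.format(key))

    # Property names starting with '__' are reserved for capsule fields.
    for name in schema.get('properties', {}):
        if name.startswith('__'):
            raise ValueError('{} is reserved; properties may not start with __'.format(name))
```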

```python
def test_record_event():
    """
    Simple test for emitting valid events
    """
    schema = {
        '$id': 'test/test',
        'version': 1,
        'properties': {
            'something': {
                'type': 'string'
            },
        },
    }

    output = io.StringIO()
    handler = logging.StreamHandler(output)
    el = EventLog(handlers_maker=lambda el: [handler])
    el.register_schema(schema)
    el.allowed_schemas = ['test/test']

    el.record_event('test/test', 1, {
        'something': 'blah',
    })
    handler.flush()

    event_capsule = json.loads(output.getvalue())

    assert '__timestamp__' in event_capsule
    # Remove timestamp from capsule when checking equality, since it is gonna vary
    del event_capsule['__timestamp__']
    assert event_capsule == {
        '__schema__': 'test/test',
        '__version__': 1,
        'something': 'blah'
    }
```

Review comment (on the `event_capsule` equality assertion): This isn't guaranteed to work in Python 2, but we don't have to worry about that, right? ;)

Reply: Yeah, let's not worry about that for right now :) We can later if we wanna.
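For context on the assertions above: each emitted record appears to be the event payload wrapped with `__schema__`, `__version__`, and `__timestamp__` fields and serialized as JSON to the handler. A rough illustration of that capsule shape, inferred from the test rather than taken from the implementation:

```python
# Assumed capsule shape, inferred from the assertions in test_record_event.
import json
from datetime import datetime, timezone


def make_capsule(schema_name, version, event):
    capsule = {
        '__schema__': schema_name,
        '__version__': version,
        '__timestamp__': datetime.now(timezone.utc).isoformat(),  # timestamp format is a guess
    }
    capsule.update(event)
    return json.dumps(capsule)


print(make_capsule('test/test', 1, {'something': 'blah'}))
```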

```python
def test_allowed_schemas():
    """
    Events should be emitted only if their schemas are allowed
    """
    schema = {
        '$id': 'test/test',
        'version': 1,
        'properties': {
            'something': {
                'type': 'string'
            },
        },
    }

    output = io.StringIO()
    handler = logging.StreamHandler(output)
    el = EventLog(handlers_maker=lambda el: [handler])
    # Just register schema, but do not mark it as allowed
    el.register_schema(schema)

    el.record_event('test/test', 1, {
        'something': 'blah',
    })
    handler.flush()

    assert output.getvalue() == ''
```

```python
def test_record_event_badschema():
    """
    Fail fast when an event doesn't conform to its schema
    """
    schema = {
        '$id': 'test/test',
        'version': 1,
        'properties': {
            'something': {
                'type': 'string'
            },
            'status': {
                'enum': ['success', 'failure']
            }
        }
    }

    el = EventLog(handlers_maker=lambda el: [logging.NullHandler()])
    el.register_schema(schema)
    el.allowed_schemas = ['test/test']

    with pytest.raises(jsonschema.ValidationError):
        el.record_event('test/test', 1, {
            'something': 'blah',
            'status': 'not-in-enum'
        })
```
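The `jsonschema.ValidationError` expected above is standard jsonschema behaviour for an enum violation. As a standalone illustration (not code from this PR), the same failure can be reproduced directly:

```python
# Standalone illustration of the enum violation the last test relies on.
import jsonschema

schema = {
    'properties': {
        'status': {'enum': ['success', 'failure']}
    }
}

jsonschema.validate({'status': 'success'}, schema)      # passes: value is in the enum
jsonschema.validate({'status': 'not-in-enum'}, schema)  # raises jsonschema.ValidationError
```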
Review comment (on the pinned Python 3.5): I've come across this on another Git extension as well. Is there somewhere that Jupyter mandates Python 3.5 compatibility?

Reply: I added it to make jupyter/notebook#4752 pass, since notebook's tests seem to require a minimum of Python 3.5.