Add unit tests + CI + code coverage #10

Merged 10 commits on Jul 9, 2019
43 changes: 43 additions & 0 deletions .circleci/config.yml
@@ -0,0 +1,43 @@
version: 2
jobs:
  build:
    docker:
      - image: circleci/python:3.5
    working_directory: ~/repo

    steps:
      - checkout

      - restore_cache:
          keys:
            - v1-dependencies-{{ checksum "dev-requirements.txt" }}-{{ checksum "setup.py" }}
            # fallback to using the latest cache if no exact match is found
            - v1-dependencies-

      - run:
          name: install dependencies
          command: |
            python3 -m venv venv
            . venv/bin/activate
            pip install -r dev-requirements.txt
            pip install -e .

      - save_cache:
          paths:
            - ./venv
          key: v1-dependencies-{{ checksum "dev-requirements.txt" }}-{{ checksum "setup.py" }}

      - run:
          name: run tests
          command: |
            . venv/bin/activate
            # Have CircleCI display pretty test result output
            mkdir test-reports
            py.test --cov=jupyter_telemetry --junitxml=test-reports/junit.xml tests/
            # Upload coverage info to codecov
            codecov

      - store_test_results:
          path: test-reports
      - store_artifacts:
          path: test-reports
4 changes: 4 additions & 0 deletions README.md
@@ -1,2 +1,6 @@
# Telemetry

[![circleci](https://circleci.com/gh/jupyter/telemetry.svg?style=shield)](https://circleci.com/gh/jupyter/telemetry)
[![codecov](https://codecov.io/gh/jupyter/telemetry/branch/master/graph/badge.svg)](https://codecov.io/gh/jupyter/telemetry)

Telemetry for Jupyter Applications and extensions.
3 changes: 3 additions & 0 deletions dev-requirements.txt
@@ -0,0 +1,3 @@
pytest
pytest-cov
codecov
42 changes: 0 additions & 42 deletions jupyter_telemetry/handler.py

This file was deleted.

5 changes: 2 additions & 3 deletions setup.py
@@ -14,7 +14,7 @@
    license = 'BSD',
    platforms = "Linux, Mac OS X, Windows",
    keywords = ['Jupyter', 'JupyterLab'],
-   python_requires = '>=3.6',
+   python_requires = '>=3.5',
Collaborator: I've come across this on another Git extension as well. Is there somewhere Jupyter mandates Py3.5 compatibility?

Collaborator (author): I added it to make jupyter/notebook#4752 pass, since notebook's tests seem to require a minimum of Python 3.5.

    classifiers = [
        'Intended Audience :: Developers',
        'Intended Audience :: System Administrators',
@@ -26,7 +26,6 @@
    install_requires=[
        'jsonschema',
        'python-json-logger',
-       'traitlets',
        'notebook',
        'traitlets'
    ],
)
148 changes: 148 additions & 0 deletions tests/test_register_schema.py
@@ -0,0 +1,148 @@
from contextlib import redirect_stderr
import json
import jsonschema
import logging
import pytest
import io

from jupyter_telemetry.eventlog import EventLog


Member: I think it would be worth adding comments that describe the specific "failures" we are testing in each chunk below. Another approach would be to break each chunk into a separate unit test. It doesn't matter to me, but some of these failures aren't what I expect. See comments below.

def test_register_invalid_schema():
    """
    Invalid JSON Schemas should fail registration
    """
    el = EventLog()
    with pytest.raises(jsonschema.SchemaError):
        el.register_schema({
            # Totally invalid
            'properties': True
        })

def test_missing_required_properties():
    """
    id and $version are required properties in our schemas.

    They aren't required by JSON Schema itself
    """
    el = EventLog()
    with pytest.raises(ValueError):
        el.register_schema({
            'properties': {}
        })

Member: Should this throw a KeyError instead? The problem is caused by missing keys (i.e. $id and $version), but I can't decide if that should result in a ValueError or KeyError.

Collaborator (author): KeyError says 'Raised when a mapping (dictionary) key is not found in the set of existing keys.', while ValueError says 'Raised when an operation or function receives an argument that has the right type but an inappropriate value, and the situation is not described by a more precise exception such as IndexError.' This makes me feel ValueError is the right thing, since we aren't passing anything in directly with the key.

Member: 👍
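The ValueError-vs-KeyError question above can be illustrated with a minimal sketch. This is not the actual EventLog code; `check_required_keys` is an invented name used purely to show why an explicit up-front check naturally raises ValueError rather than leaking a KeyError from a dict lookup.

```python
# Hypothetical sketch, not EventLog's real implementation: the required
# top-level keys are checked explicitly before the schema is used, so the
# caller gets a descriptive ValueError instead of a raw KeyError later.
def check_required_keys(schema):
    """Raise ValueError if the schema lacks '$id' or 'version'."""
    for key in ('$id', 'version'):
        if key not in schema:
            raise ValueError('Schema is missing required key: {!r}'.format(key))

# A schema with both keys passes silently
check_required_keys({'$id': 'test/test', 'version': 1})
```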

    with pytest.raises(ValueError):
        el.register_schema({
            '$id': 'something',
            '$version': 1,  # This should be 'version'
        })

Member: It's not immediately obvious to me why this should fail. I certainly didn't notice that the failure is caused by an extra $ before version. Maybe add a comment to this source code saying: # testing invalid key: "$version" when it should be "version".

Collaborator (author): Done!

Member: Should we add a test that fails because a dunder key of properties is set in the given schema?

Collaborator (author): Added!

def test_reserved_properties():
    """
    User schemas can't have properties starting with __

    These are reserved
    """
    el = EventLog()
    with pytest.raises(ValueError):
        el.register_schema({
            '$id': 'test/test',
            'version': 1,
            'properties': {
                '__fail__': {
                    'type': 'string'
                },
            },
        })

Collaborator: Nice test! :)
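The reserved-prefix rule this test exercises can be sketched in a few lines. This is a hypothetical illustration, not the library's actual code; `check_reserved_properties` is an invented name.

```python
# Hypothetical sketch of the reserved-property rule: names starting with
# '__' are reserved for capsule metadata (__timestamp__, __schema__,
# __version__), so user schemas may not declare them.
def check_reserved_properties(schema):
    for name in schema.get('properties', {}):
        if name.startswith('__'):
            raise ValueError('Property name {!r} is reserved'.format(name))

# An ordinary property name is fine
check_reserved_properties({'properties': {'something': {'type': 'string'}}})
```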

def test_record_event():
    """
    Simple test for emitting valid events
    """
    schema = {
        '$id': 'test/test',
        'version': 1,
        'properties': {
            'something': {
                'type': 'string'
            },
        },
    }

    output = io.StringIO()
    handler = logging.StreamHandler(output)
    el = EventLog(handlers_maker=lambda el: [handler])
    el.register_schema(schema)
    el.allowed_schemas = ['test/test']

    el.record_event('test/test', 1, {
        'something': 'blah',
    })
    handler.flush()

    event_capsule = json.loads(output.getvalue())

    assert '__timestamp__' in event_capsule
    # Remove timestamp from capsule when checking equality, since it will vary
    del event_capsule['__timestamp__']
    assert event_capsule == {
        '__schema__': 'test/test',
        '__version__': 1,
        'something': 'blah'
    }

Member: This isn't guaranteed to work in Python 2, but we don't have to worry about that, right? ;)

Collaborator (author): Yeah, let's not worry about that for right now :) We can later if we wanna.


def test_allowed_schemas():
    """
    Events should be emitted only if their schemas are allowed
    """
    schema = {
        '$id': 'test/test',
        'version': 1,
        'properties': {
            'something': {
                'type': 'string'
            },
        },
    }

    output = io.StringIO()
    handler = logging.StreamHandler(output)
    el = EventLog(handlers_maker=lambda el: [handler])
    # Just register the schema, but do not mark it as allowed
    el.register_schema(schema)

    el.record_event('test/test', 1, {
        'something': 'blah',
    })
    handler.flush()

    assert output.getvalue() == ''
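The allowlist behavior this test checks can be sketched with a tiny stand-in class. `MiniLog` is invented for illustration and is not EventLog's real implementation; it only shows the gate: registration alone does not emit, the schema id must also appear in `allowed_schemas`.

```python
# Hypothetical sketch of the allowlist gate exercised by the test above.
class MiniLog:
    def __init__(self):
        self.schemas = {}
        self.allowed_schemas = []
        self.emitted = []

    def register_schema(self, schema):
        self.schemas[schema['$id']] = schema

    def record_event(self, schema_id, version, event):
        if schema_id not in self.allowed_schemas:
            return  # silently drop events whose schema is not allowed
        self.emitted.append(dict(event, __schema__=schema_id, __version__=version))

log = MiniLog()
log.register_schema({'$id': 'test/test', 'version': 1})
log.record_event('test/test', 1, {'something': 'blah'})  # dropped: not allowed yet
log.allowed_schemas = ['test/test']
log.record_event('test/test', 1, {'something': 'blah'})  # emitted
```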

def test_record_event_badschema():
    """
    Fail fast when an event doesn't conform to its schema
    """
    schema = {
        '$id': 'test/test',
        'version': 1,
        'properties': {
            'something': {
                'type': 'string'
            },
            'status': {
                'enum': ['success', 'failure']
            }
        }
    }

    el = EventLog(handlers_maker=lambda el: [logging.NullHandler()])
    el.register_schema(schema)
    el.allowed_schemas = ['test/test']

    with pytest.raises(jsonschema.ValidationError):
        el.record_event('test/test', 1, {
            'something': 'blah',
            'status': 'not-in-enum'
        })
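The fail-fast idea in this last test can be sketched without the jsonschema dependency. This is a hand-rolled enum check for illustration only (`validate_enum` is an invented name); the real library delegates full validation to jsonschema, which raises ValidationError for the same case.

```python
# Hypothetical, dependency-free sketch of fail-fast enum validation:
# reject the event before it is ever logged.
def validate_enum(event, schema):
    for name, spec in schema.get('properties', {}).items():
        if 'enum' in spec and name in event and event[name] not in spec['enum']:
            raise ValueError('{!r} is not one of {!r} for property {!r}'.format(
                event[name], spec['enum'], name))

schema = {'properties': {'status': {'enum': ['success', 'failure']}}}
validate_enum({'status': 'success'}, schema)  # conforms: no exception
```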