Tox support (#318)
* update gitignore for pyenv files

* ignore tox files

* hardcode version instead of pulling from gh

Importing semver in setup.py caused an error because it was imported before the requirements file could install it.

* move coverage requirement to tox config

* remove version py file since it's hardcoded in setup

* initial working tox commit

* update required coverage versions to be more flexible

* switch order formatters run

* ignore env specific coverage files

* update tox config to use coverage run instead of base python to execute tests

* add coverage report env to tox config

* update coverage-report description

* update an env name for type checking

* move black and flake8 out of the requirements file and into tox envs

* update contributing md file with tox instructions

* add pyenv and tox instructions to CONTRIBUTING md

* Update test.yml

* adding comment to force testing on GH actions

* removing comment to force testing on GH actions

* Update test.yml to fix error caused by typo

* Revert "remove version py file since it's hardcoded in setup"

This reverts commit c41d140.

* Revert "hardcode version instead of pulling from gh"

This reverts commit e0f42e6.

* add semver as a dep for testenv in tox ini

* add pyproject toml to install semver as build requirement

---------

Co-authored-by: Lorin Dawson <lorin@databricks.com>
R7L208 and Lorin Dawson authored Apr 25, 2023
1 parent 26fb674 commit 44f2235
Showing 6 changed files with 178 additions and 61 deletions.
5 changes: 2 additions & 3 deletions .github/workflows/test.yml
@@ -54,9 +54,8 @@ jobs:
- name: Generate coverage report
working-directory: ./python
run: |
pip install -r requirements.txt
pip install coverage
python -I -m pip install 'coverage<8,>=7' pyspark==3.2.1 -r requirements.txt
coverage run -m unittest discover -s tests -p '*_tests.py'
coverage xml
- name: Publish test coverage
uses: codecov/codecov-action@v1
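The `coverage run -m unittest discover` command above selects test modules by shell-style filename matching. A minimal sketch of how the `-p '*_tests.py'` pattern behaves (the candidate filenames here are hypothetical, not from the repo):

```python
from fnmatch import fnmatch

# unittest's discovery compares module filenames against the -p pattern
# using shell-style wildcards; '*_tests.py' matches names ending in _tests.py.
pattern = "*_tests.py"
candidates = ["tsdf_tests.py", "test_tsdf.py", "utils_tests.py", "helpers.py"]
matched = [name for name in candidates if fnmatch(name, pattern)]
print(matched)
```

Note that a file named `test_tsdf.py` would not match this pattern, so new test modules must follow the `*_tests.py` naming convention to be discovered.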
9 changes: 9 additions & 0 deletions .gitignore
@@ -6,6 +6,7 @@

# coverage files
.coverage
.coverage.*
coverage.xml

# local delta tables
@@ -22,6 +23,14 @@ coverage.xml
*.pyc
**/__pycache__

## pyenv files
.python-version
python/.python-version

## tox files
.tox
python/.tox

# ignore virtual environments
python/venv
python/.venv
111 changes: 57 additions & 54 deletions CONTRIBUTING.md
@@ -1,54 +1,57 @@
Thank you for your interest in contributing to the tempo project (the “Project”). In order to clarify the intellectual property license granted with Contributions from any person or entity who contributes to the Project, Databricks, Inc. ("Databricks") must have a Contributor License Agreement (CLA) on file that has been signed by each such Contributor (or if an entity, an authorized representative of such entity). This license is for your protection as a Contributor as well as the protection of Databricks and its users; it does not change your rights to use your own Contributions for any other purpose.
You may sign this CLA either on your own behalf (with respect to any Contributions that are owned by you) and/or on behalf of an entity (the "Corporation") (with respect to any Contributions that are owned by such Corporation (e.g., those Contributions you make during the performance of your employment duties to the Corporation)). Please mark the corresponding box below.
You accept and agree to the following terms and conditions for Your present and future Contributions submitted to Databricks. Except for the licenses granted herein to Databricks, You reserve all right, title, and interest in and to Your Contributions.
Definitions.
"You" (or "Your") shall mean the copyright owner or legal entity authorized by the copyright owner that is making this Agreement with Databricks. For legal entities, the entity making a Contribution and all other entities that control, are controlled by, or are under common control with that entity are considered to be a single Contributor. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
"Contribution" shall mean the code, documentation or any original work of authorship, including any modifications or additions to an existing work, that is submitted by You to Databricks for inclusion in, or documentation of, any of the products owned or managed by Databricks, including the Project, whether on, before or after the date You sign this CLA. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to Databricks or its representatives, including but not limited to communication on electronic mailing lists, source code control systems (e.g., Github), and issue tracking systems that are managed by, or on behalf of, Databricks for the purpose of discussing and improving the Project, but excluding communication that is conspicuously marked or otherwise designated in writing by You as "Not a Contribution."
Grant of Copyright License. Subject to the terms and conditions of this Agreement, You hereby grant to Databricks a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, sublicense (through multiple tiers), and distribute Your Contributions and such derivative works. For the avoidance of doubt, and without limitation, this includes, at our option, the right to sublicense this license to recipients or users of any products or services (including software) distributed or otherwise made available (e.g., by SaaS offering) by Databricks (each, a “Downstream Recipient”).
Grant of Patent License. Subject to the terms and conditions of this Agreement, You hereby grant to Databricks a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable, sublicensable (through multiple tiers) (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Contribution in whole or in part, alone or in combination with any other products or services (including for the avoidance of doubt the Project), where such license applies only to those patent claims licensable by You that are necessarily infringed by Your Contribution(s) alone or by combination of Your Contribution(s) with the Project to which such Contribution(s) was submitted. For the avoidance of doubt, and without limitation, this includes, at our option, the right to sublicense this license to Downstream Recipients.
Authorized Users. If you are signing this CLA on behalf of a Corporation, you may also add additional designated employees of the Corporation who will be covered by this CLA without the need to separately sign it (“Authorized Users”). Your Primary Point of Contact (you or the individual specified below) may add additional Authorized Users at any time by contacting Databricks at cla@databricks.com (or such other method as Databricks informs you).
Representations. You represent that:
You are legally entitled to grant the above licenses, and, if You are signing on behalf of a Corporation and have added any Authorized Users, You represent further that each employee of the Corporation designated by You is authorized to submit Contributions on behalf of the Corporation;
each of Your Contributions is Your original creation;
to your knowledge, Your Contributions do not infringe or otherwise misappropriate the intellectual property rights of a third person; and
you will not assert any moral rights in your Contribution against us or any Downstream Recipients.
Support. You are not expected to provide support for Your Contributions, except to the extent You desire to provide support. You may provide support for free, for a fee, or not at all. Unless required by applicable law or agreed to in writing, and except as specified above, You provide Your Contributions on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE.
Notification. It is your responsibility to notify Databricks when any change is required to the list of Authorized Users, or to the Corporation's Primary Point of Contact with Databricks. You agree to notify Databricks of any facts or circumstances of which you become aware that would make the representations or warranties herein inaccurate in any respect.
This CLA is governed by the laws of the State of California and applicable U.S. Federal law. Any choice of law rules will not apply.
Please check one of the applicable statements below. Please do NOT mark both statements:
I am signing on behalf of myself as an individual and no other person or entity, including my employer, has or will have rights with respect my Contributions.
I am signing on behalf of my employer or a legal entity and I have the actual authority to contractually bind such entity (the Corporation).

Name*:


Corporation Entity Name (if applicable):


Title or Role (if applicable):


Mailing Address*:


Email*:


Signature*:


Date*:


Github Username (if applicable):


Primary Point of Contact (if not you) (please provide name and email and Github username, if applicable):


Authorized Users (please list Github usernames):**



* Required field
** Please note that Authorized Users may not immediately be granted authorization to submit Contributions; should more than one individual attempt to sign a CLA on behalf of a Corporation, the first such CLA will apply and later CLAs will be deemed void.
# Tox Setup instructions

`tox` is a testing tool that helps you automate and standardize testing in Python across multiple environments.

`pyenv` is a tool that allows you to manage multiple versions of Python on your computer and easily switch between them.

Since `tox` supports creating virtual environments using multiple Python versions, it is recommended to use `pyenv` to manage Python versions on your computer.

Install `tox` with pip, and install `pyenv` using its own installer (it is not distributed as a pip package; see the pyenv README), then install the Python versions used by the test matrix:
```bash
pip install -U tox
pyenv install 3.7 3.8 3.9
```

Within the `python` folder, run the command below to create a `.python-version` file that tells `pyenv` which Python versions to use when running commands in this directory:
```bash
pyenv local 3.7 3.8 3.9
```

This allows `tox` to create virtual environments using any of the Python versions listed in the `.python-version` file.

A brief description of each managed `tox` environment can be found by running `tox list` or in the `tox.ini` file.

## Create a development environment
Run the following command in your terminal to create a virtual environment in the `.venv` folder:
```bash
tox --devenv .venv -e {environment-name}
```
The `--devenv` flag tells `tox` to create a development environment, and `.venv` is the folder where the virtual environment will be created.
Pre-defined environments for different Python versions and their corresponding PySpark versions can be found within the `tox.ini` file. They include:
- py37-pyspark300
- py38-pyspark312
- py38-pyspark321
- py39-pyspark330
- py39-pyspark332

## Run tests locally for one or more environments
You can run tests locally for one or more of the defined environments without setting up a development environment first.

### To run tests for a single environment, use the `-e` flag followed by the environment name:
```bash
tox -e {environment-name}
```

### To run tests for multiple environments, specify the environment names separated by commas (no spaces):
```bash
tox -e {environment-name1},{environment-name2}
```
This will run tests for all listed environments.

### Run additional checks locally
`tox` has special environments for additional checks that must be performed as part of the PR process. These include formatting, linting, type checking, etc.
These environments are also defined in the `tox.ini` file; they skip installing the dependencies listed in `requirements.txt` and skip building the distribution when those are not required. They can be specified using the `-e` flag:
- format
- lint
- type-check
- coverage-report
2 changes: 2 additions & 0 deletions python/pyproject.toml
@@ -0,0 +1,2 @@
[build-system]
requires = ["semver"] # PEP 518 - what is required to build
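This `pyproject.toml` is the PEP 518 fix for the build error described in the commit message: declaring `semver` under `build-system.requires` makes the build frontend install it into the isolated build environment before `setup.py` runs, so a top-level import no longer fails. A hedged sketch of the import pattern involved (the fallback branch and the version string are hypothetical, for illustration only):

```python
# Sketch: a top-level `import semver` in setup.py fails at build time unless
# the build frontend pre-installs it, which build-system.requires guarantees.
# The guarded import below illustrates the failure mode being avoided.
try:
    import semver

    def bump_patch(version: str) -> str:
        # Use semver's parser when it is available.
        return str(semver.VersionInfo.parse(version).bump_patch())
except ImportError:
    def bump_patch(version: str) -> str:
        # Naive fallback if semver is not installed in this environment.
        major, minor, patch = version.split(".")
        return f"{major}.{minor}.{int(patch) + 1}"

print(bump_patch("0.1.9"))
```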
4 changes: 0 additions & 4 deletions python/requirements.txt
@@ -1,11 +1,8 @@
black==23.3.0
flake8==6.0.0
ipython==8.10.0
numpy==1.24.2
chispa==0.9.2
pandas==1.5.2
pyarrow==10.0.1
pyspark==3.2.1
python-dateutil==2.8.2
pytz==2022.7.1
scipy==1.9.3
@@ -20,4 +17,3 @@ sphinx-design==0.2.0
sphinx-panels==0.6.0
jsonref==1.0.1
python-dateutil==2.8.2
coverage==6.5.0
108 changes: 108 additions & 0 deletions python/tox.ini
@@ -0,0 +1,108 @@
[tox]
requires =
tox>4,<5
virtualenv>20,<21
wheel>=0.38,<1
isolated_build = true
envlist =
format
lint
type
build-dist
; Mirror Supported LTS DBR versions here: https://docs.databricks.com/release-notes/runtime/
; Use correct PySpark version based on Python version present in env name
py37-pyspark300,
py38-pyspark{312,321},
py39-pyspark{330,332}
skip_missing_interpreters = true


[testenv]
description = run the tests under {envname}
package = wheel
wheel_build_env = .pkg
setenv =
COVERAGE_FILE = .coverage.{envname}
deps =
pyspark300: pyspark==3.0.0
pyspark312: pyspark==3.1.2
pyspark321: pyspark==3.2.1
pyspark330: pyspark==3.3.0
pyspark332: pyspark==3.3.2
coverage>=7,<8
-rrequirements.txt
commands =
coverage run -m unittest discover -s tests -p '*_tests.py'

[testenv:format]
description = run formatters
skipsdist = true
skip_install = true
deps =
black
commands =
black {toxinidir}

[testenv:lint]
description = run linters
skipsdist = true
skip_install = true
deps =
flake8
black
commands =
black --check {toxinidir}
flake8

[testenv:type-check]
description = run type checks
; todo - configure mypy
skipsdist = true
skip_install = true
deps =
mypy
commands =
mypy {toxinidir}/tempo

[testenv:build-dist]
description = build distribution
skip_install = true
deps =
build
commands =
python -m build --sdist --wheel {posargs: {toxinidir}}

[testenv:coverage-report]
description = combine coverage data and generate reports
deps = coverage
skipsdist = true
skip_install = true
commands =
coverage combine
coverage report -m
coverage xml

[coverage:run]
source = tempo
parallel = true

[flake8]
exclude =
.git
__pycache__
env
.tox
build
.venv
venv
.coverage.py
.coverage
.coveragerc
.eggs
.mypy_cache
.pytest_cache
dbl_tempo.egg-info
max-line-length = 88
extend-ignore =
; See https://github.com/PyCQA/pycodestyle/issues/373
E203
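With `COVERAGE_FILE = .coverage.{envname}` set in `[testenv]` and `parallel = true` under `[coverage:run]`, each tox environment writes its own coverage data file, which the `coverage-report` env later merges with `coverage combine`. A small sketch of the resulting file names (the particular env pairing is illustrative):

```python
# Each tox env writes coverage data to its own .coverage.<envname> file;
# `coverage combine` then merges all .coverage.* files into a single
# .coverage data file before `coverage report` / `coverage xml` run.
envs = ["py38-pyspark321", "py39-pyspark332"]
data_files = [f".coverage.{env}" for env in envs]
print(data_files)
```

This is also why `.coverage.*` was added to `.gitignore` in this commit: the per-env data files are transient build artifacts.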
