Ct 177/part2 format code over (#108)
* use checks over code and rewrite repo to conform to style conventions
* Add query_id to adapter payload and tests. (#109)
* Mix in released snowflake 1.1.0b1
* Resolve a Black library issue
VersusFacit authored Mar 29, 2022
1 parent b03c472 commit dadbd42
Showing 26 changed files with 410 additions and 272 deletions.
.bumpversion.cfg: 6 changes (3 additions, 3 deletions)
@@ -1,10 +1,10 @@
[bumpversion]
current_version = 1.0.0
current_version = 1.1.0b1
parse = (?P<major>\d+)
\.(?P<minor>\d+)
\.(?P<patch>\d+)
((?P<prerelease>a|b|rc)(?P<num>\d+))?
serialize =
serialize =
{major}.{minor}.{patch}{prerelease}{num}
{major}.{minor}.{patch}
commit = False
@@ -13,7 +13,7 @@ tag = False
[bumpversion:part:prerelease]
first_value = a
optional_value = final
values =
values =
a
b
rc
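
For reference (not part of the diff), the prerelease-aware `parse` pattern above can be sanity-checked against the new version string. The regex below is copied from the config; everything else is illustrative:

```python
import re

# Pattern from the [bumpversion] parse setting above, joined onto one line.
VERSION_RE = re.compile(
    r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)"
    r"((?P<prerelease>a|b|rc)(?P<num>\d+))?"
)

match = VERSION_RE.match("1.1.0b1")
assert match is not None
print(match.groupdict())
# {'major': '1', 'minor': '1', 'patch': '0', 'prerelease': 'b', 'num': '1'}
```
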
.github/pull_request_template.md: 2 changes (1 addition, 1 deletion)
@@ -18,4 +18,4 @@ resolves #
- [ ] I have signed the [CLA](https://docs.getdbt.com/docs/contributor-license-agreements)
- [ ] I have run this code in development and it appears to resolve the stated issue
- [ ] This PR includes tests, or tests are not required/relevant for this PR
- [ ] I have updated the `CHANGELOG.md` and added information about my change to the "dbt-snowflake next" section.
- [ ] I have updated the `CHANGELOG.md` and added information about my change to the "dbt-snowflake next" section.
.github/workflows/integration.yml: 4 changes (2 additions, 2 deletions)
@@ -221,9 +221,9 @@ jobs:

post-failure:
runs-on: ubuntu-latest
needs: test
needs: test
if: ${{ failure() }}

steps:
- name: Posting scheduled run failures
uses: ravsamhq/notify-slack-action@v1
.github/workflows/jira-creation.yml: 2 changes (1 addition, 1 deletion)
@@ -13,7 +13,7 @@ name: Jira Issue Creation
on:
issues:
types: [opened, labeled]

permissions:
issues: write

.github/workflows/jira-label.yml: 3 changes (1 addition, 2 deletions)
@@ -13,7 +13,7 @@ name: Jira Label Mirroring
on:
issues:
types: [labeled, unlabeled]

permissions:
issues: read

@@ -24,4 +24,3 @@ jobs:
JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}

.github/workflows/jira-transition.yml: 2 changes (1 addition, 1 deletion)
@@ -21,4 +21,4 @@ jobs:
secrets:
JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}
JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}
.github/workflows/stale.yml: 2 changes (0 additions, 2 deletions)
@@ -13,5 +13,3 @@ jobs:
stale-pr-message: "This PR has been marked as Stale because it has been open for 180 days with no activity. If you would like the PR to remain open, please remove the stale label or comment on the PR, or it will be closed in 7 days."
# mark issues/PRs stale when they haven't seen activity in 180 days
days-before-stale: 180
# ignore checking issues with the following labels
exempt-issue-labels: "epic, discussion"
.github/workflows/version-bump.yml: 20 changes (10 additions, 10 deletions)
@@ -1,16 +1,16 @@
# **what?**
# This workflow will take a version number and a dry run flag. With that
# it will run versionbump to update the version number everywhere in the
# it will run versionbump to update the version number everywhere in the
# code base and then generate an updated Docker requirements file. If this
# is a dry run, a draft PR will open with the changes. If this isn't a dry
# run, the changes will be committed to the branch this is run on.

# **why?**
# This is to aid in releasing dbt and making sure we have updated
# This is to aid in releasing dbt and making sure we have updated
# the versions and Docker requirements in all places.

# **when?**
# This is triggered either manually OR
# This is triggered either manually OR
# from the repository_dispatch event "version-bump" which is sent from
# the dbt-release repo Action

@@ -25,11 +25,11 @@ on:
is_dry_run:
description: 'Creates a draft PR to allow testing instead of committing to a branch'
required: true
default: 'true'
default: 'true'
repository_dispatch:
types: [version-bump]

jobs:
jobs:
bump:
runs-on: ubuntu-latest
steps:
@@ -57,19 +57,19 @@ jobs:
run: |
python3 -m venv env
source env/bin/activate
pip install --upgrade pip
pip install --upgrade pip
- name: Create PR branch
if: ${{ steps.variables.outputs.IS_DRY_RUN == 'true' }}
run: |
git checkout -b bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID
git push origin bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID
git branch --set-upstream-to=origin/bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID
- name: Bumping version
run: |
source env/bin/activate
pip install -r dev_requirements.txt
pip install -r dev_requirements.txt
env/bin/bumpversion --allow-dirty --new-version ${{steps.variables.outputs.VERSION_NUMBER}} major
git status
@@ -99,4 +99,4 @@ jobs:
draft: true
base: ${{github.ref}}
title: 'Bumping version to ${{steps.variables.outputs.VERSION_NUMBER}}'
branch: 'bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_${{GITHUB.RUN_ID}}'
branch: 'bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_${{GITHUB.RUN_ID}}'
.pre-commit-config.yaml: 2 changes (2 additions, 0 deletions)
@@ -21,12 +21,14 @@ repos:
rev: 21.12b0
hooks:
- id: black
additional_dependencies: ['click==8.0.4']
args:
- "--line-length=99"
- "--target-version=py38"
- id: black
alias: black-check
stages: [manual]
additional_dependencies: ['click==8.0.4']
args:
- "--line-length=99"
- "--target-version=py38"
CHANGELOG.md: 15 changes (14 additions, 1 deletion)
@@ -1,11 +1,22 @@
## dbt-snowflake 1.1.0 (TBD)
## dbt-snowflake 1.1.0b1 (March 23, 2022)

### Features
- Adds tests for incremental model unique key parameter ([#91](https://github.com/dbt-labs/dbt-snowflake/issues/91))
- enables mfa token caching for linux when using the username_password_mfa authenticator ([#65](https://github.com/dbt-labs/dbt-snowflake/pull/65))

### Fixes
- Add unique\_id field to docs generation test catalogs; a follow-on PR to core PR ([#4168](https://github.com/dbt-labs/dbt-core/pull/4618))

### Under the hood
- Add `query_id` for a query to `run_result.json` ([#40](https://github.com/dbt-labs/dbt-snowflake/pull/40))
- Change logic for Post-failure job run ([#67](https://github.com/dbt-labs/dbt-snowflake/pull/67))
- Update to version bumping script ([#68](https://github.com/dbt-labs/dbt-snowflake/pull/68))
- Add contributing.md file for snowflake adapter repo ([#79](https://github.com/dbt-labs/dbt-snowflake/pull/79))

### Contributors
- [@joshuataylor](https://github.com/joshuataylor) ([#40](https://github.com/dbt-labs/dbt-snowflake/pull/40))
- [@devoted](https://github.com/devoted) ([#40](https://github.com/dbt-labs/dbt-snowflake/pull/40))

## dbt-snowflake 1.0.0 (December 3rd, 2021)

## dbt-snowflake 1.0.0rc2 (November 24, 2021)
@@ -17,6 +28,8 @@
### Under the hood
- Resolves an issue caused when the Snowflake OCSP server is not accessible, by exposing the `insecure_mode` boolean available in the Snowflake python connector ([#31](https://github.com/dbt-labs/dbt-snowflake/issues/31), [#49](https://github.com/dbt-labs/dbt-snowflake/pull/49))
- Fix test related to preventing coercion of boolean values (True, False) to numeric values (0, 1) in query results ([#76](https://github.com/dbt-labs/dbt-snowflake/issues/76))
- Add Stale messaging Github Action workflow ([#84](https://github.com/dbt-labs/dbt-snowflake/pull/84))


### Contributors
- [@anthu](https://github.com/anthu) ([#48](https://github.com/dbt-labs/dbt-snowflake/pull/48))
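
Context for the `query_id` entry above: with this release, the Snowflake query ID is recorded in the adapter response inside `run_results.json`. A minimal sketch of reading it back, assuming the usual `target/run_results.json` layout with an `adapter_response` dict per result (field names assumed, not guaranteed):

```python
import json
from pathlib import Path

# Assumes a dbt project has already been run and produced target/run_results.json.
run_results = json.loads(Path("target/run_results.json").read_text())

for result in run_results.get("results", []):
    adapter_response = result.get("adapter_response") or {}
    # query_id is populated by dbt-snowflake as of 1.1.0b1 (per the changelog entry above).
    print(result.get("unique_id"), adapter_response.get("query_id"))
```
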
CONTRIBUTING.MD: 118 changes (118 additions, 0 deletions)
@@ -0,0 +1,118 @@
# Contributing to `dbt-snowflake`

1. [About this document](#about-this-document)
2. [Getting the code](#getting-the-code)
3. [Running `dbt-snowflake` in development](#running-dbt-snowflake-in-development)
4. [Testing](#testing)
5. [Updating Docs](#updating-docs)
6. [Submitting a Pull Request](#submitting-a-pull-request)

## About this document
This document is a guide for anyone interested in contributing to the `dbt-snowflake` repository. It outlines how to create issues and submit pull requests (PRs).

This is not intended as a guide for using `dbt-snowflake` in a project. For configuring and using this adapter, see [Snowflake Profile](https://docs.getdbt.com/reference/warehouse-profiles/snowflake-profile) and [Snowflake Configs](https://docs.getdbt.com/reference/resource-configs/snowflake-configs).

We assume users have a Linux or MacOS system. You should have familiarity with:

- Python `virtualenv`s
- Python modules
- `pip`
- common command line utilities like `git`.

In addition to this guide, we highly encourage you to read the [dbt-core contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md). Almost all information there is applicable here!

### Signing the CLA

Please note that all contributors to `dbt-snowflake` must sign the [Contributor License Agreement](https://docs.getdbt.com/docs/contributor-license-agreements) (CLA) before their pull request(s) can be merged into the `dbt-snowflake` codebase. Given this, `dbt-snowflake` maintainers will unfortunately be unable to merge your contribution(s) until you've signed the CLA. You are, however, welcome to open issues and comment on existing ones.

## Getting the code

`git` is needed in order to download and modify the `dbt-snowflake` code. There are several ways to install Git. For MacOS, we suggest installing [Xcode](https://developer.apple.com/support/xcode/) or [Xcode Command Line Tools](https://mac.install.guide/commandlinetools/index.html).

### External contributors

If you are not a member of the `dbt-labs` GitHub organization, you can contribute to `dbt-snowflake` by forking the `dbt-snowflake` repository. For more on forking, check out the [GitHub docs on forking](https://help.github.com/en/articles/fork-a-repo). In short, you will need to:

1. fork the `dbt-snowflake` repository
2. clone your fork locally
3. check out a new branch for your proposed changes
4. push changes to your fork
5. open a pull request of your forked repository against `dbt-labs/dbt-snowflake`

### dbt Labs contributors

If you are a member of the `dbt Labs` GitHub organization, you will have push access to the `dbt-snowflake` repo. Rather than forking `dbt-snowflake` to make your changes, clone the repository like normal, and check out feature branches.

## Running `dbt-snowflake` in development

### Installation

1. Ensure you have the latest version of `pip` installed by running `pip install --upgrade pip` in terminal.

2. Configure and activate a `virtualenv` as described in [Setting up an environment](https://github.com/dbt-labs/dbt-core/blob/HEAD/CONTRIBUTING.md#setting-up-an-environment).

3. Install `dbt-core` in the active `virtualenv`. To confirm you installed dbt correctly, run `dbt --version` and `which dbt`.

4. Install `dbt-snowflake` and development dependencies in the active `virtualenv`. Run `pip install -e . -r dev-requirements.txt`.

When `dbt-snowflake` is installed this way, any changes you make to the `dbt-snowflake` source code will be reflected immediately (i.e. in your next local dbt invocation against a Snowflake target).

## Testing

### Initial setup

`dbt-snowflake` contains [unit](https://github.com/dbt-labs/dbt-snowflake/tree/main/tests/unit) and [integration](https://github.com/dbt-labs/dbt-snowflake/tree/main/tests/integration) tests. Integration tests require an actual Snowflake warehouse to test against. There are two primary ways to do this:

- This repo has CI/CD GitHub Actions set up. Both unit and integration tests will run against an already configured Snowflake warehouse during PR checks.

- You can also run integration tests "locally" by configuring a `test.env` file with the appropriate environment variables.

```sh
cp test.env.example test.env
$EDITOR test.env
```

WARNING: The parameters in your `test.env` file must link to a valid Snowflake account. The `test.env` file you create is git-ignored, but please be _extra_ careful to never check in credentials or other sensitive information when developing.
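
If you want a quick pre-flight check that `test.env` is filled in before running the suite, a small script like the one below can help. The variable names are hypothetical placeholders; the authoritative list is whatever `test.env.example` contains:

```python
from pathlib import Path

# Hypothetical placeholder names; consult test.env.example for the real list.
REQUIRED_VARS = {
    "SNOWFLAKE_TEST_ACCOUNT",
    "SNOWFLAKE_TEST_USER",
    "SNOWFLAKE_TEST_PASSWORD",
    "SNOWFLAKE_TEST_DATABASE",
    "SNOWFLAKE_TEST_WAREHOUSE",
}

# Read KEY=VALUE pairs from test.env, ignoring blank lines and comments.
defined = {}
for line in Path("test.env").read_text().splitlines():
    line = line.strip()
    if not line or line.startswith("#") or "=" not in line:
        continue
    key, _, value = line.partition("=")
    defined[key.strip()] = value.strip()

missing = sorted(name for name in REQUIRED_VARS if not defined.get(name))
if missing:
    raise SystemExit(f"test.env is missing values for: {', '.join(missing)}")
print("test.env looks complete.")
```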


### "Local" test commands
There are a few methods for running tests locally.

#### `tox`
`tox` automatically runs unit tests against several Python versions using its own virtualenvs. Run `tox -p` to run unit tests for Python 3.7, Python 3.8, Python 3.9, and `flake8` in parallel. Run `tox -e py37` to invoke tests on Python version 3.7 only (use py37, py38, or py39). Tox recipes are found in `tox.ini`.

#### `pytest`
You may run a specific test or group of tests using `pytest` directly. Activate a Python virtualenv with dev dependencies installed. Then, run tests like so:

```sh
# Note: replace $strings with valid names

# run all snowflake integration tests in a directory
python -m pytest -m profile_snowflake tests/integration/$test_directory
# run all snowflake integration tests in a module
python -m pytest -m profile_snowflake tests/integration/$test_dir_and_filename.py
# run all snowflake integration tests in a class
python -m pytest -m profile_snowflake tests/integration/$test_dir_and_filename.py::$test_class_name
# run a specific snowflake integration test
python -m pytest -m profile_snowflake tests/integration/$test_dir_and_filename.py::$test_class_name::$test__method_name

# run all unit tests in a module
python -m pytest tests/unit/$test_file_name.py
# run a specific unit test
python -m pytest tests/unit/$test_file_name.py::$test_class_name::$test_method_name
```

## Updating documentation

Many changes will require an update to `dbt-snowflake` documentation. Here are some relevant links.

- Docs are [here](https://docs.getdbt.com/).
- The docs repo for making changes is located [here](https://github.com/dbt-labs/docs.getdbt.com).
- The changes made are likely to impact one or both of [Snowflake Profile](https://docs.getdbt.com/reference/warehouse-profiles/snowflake-profile) and [Snowflake Configs](https://docs.getdbt.com/reference/resource-configs/snowflake-configs).
- We ask every community member who makes a user-facing change to open an issue or PR regarding doc changes.

## Submitting a Pull Request

A `dbt-snowflake` maintainer will review your PR and will determine if it has passed regression tests. They may suggest code revisions for style and clarity, or they may request that you add unit or integration tests. These are good things! We believe that, with a little bit of help, anyone can contribute high-quality code.

Once all tests are passing and your PR has been approved, a `dbt-snowflake` maintainer will merge your changes into the active development branch. And that's it! Happy developing :tada:
MANIFEST.in: 2 changes (1 addition, 1 deletion)
@@ -1 +1 @@
recursive-include dbt/include *.sql *.yml *.md
recursive-include dbt/include *.sql *.yml *.md
dbt/adapters/snowflake/__init__.py: 5 changes (2 additions, 3 deletions)
@@ -8,6 +8,5 @@
from dbt.include import snowflake # type: ignore

Plugin = AdapterPlugin(
adapter=SnowflakeAdapter,
credentials=SnowflakeCredentials,
include_path=snowflake.PACKAGE_PATH)
adapter=SnowflakeAdapter, credentials=SnowflakeCredentials, include_path=snowflake.PACKAGE_PATH
)
dbt/adapters/snowflake/__version__.py: 2 changes (1 addition, 1 deletion)
@@ -1 +1 @@
version = '1.0.0'
version = "1.1.0b1"
dbt/adapters/snowflake/column.py: 20 changes (16 additions, 4 deletions)
@@ -12,20 +12,32 @@ def is_integer(self) -> bool:

def is_numeric(self) -> bool:
return self.dtype.lower() in [
'int', 'integer', 'bigint', 'smallint', 'tinyint', 'byteint',
'numeric', 'decimal', 'number'
"int",
"integer",
"bigint",
"smallint",
"tinyint",
"byteint",
"numeric",
"decimal",
"number",
]

def is_float(self):
return self.dtype.lower() in [
'float', 'float4', 'float8', 'double', 'double precision', 'real',
"float",
"float4",
"float8",
"double",
"double precision",
"real",
]

def string_size(self) -> int:
if not self.is_string():
raise RuntimeException("Called string_size() on non-string field!")

if self.dtype == 'text' or self.char_size is None:
if self.dtype == "text" or self.char_size is None:
return 16777216
else:
return int(self.char_size)
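
For orientation, here is a minimal sketch of how these helpers classify Snowflake types, assuming `SnowflakeColumn` keeps the base `Column` constructor from dbt-core (`column`, `dtype`, `char_size`, ...); illustrative only, not part of the diff:

```python
from dbt.adapters.snowflake.column import SnowflakeColumn

# Constructor signature assumed from dbt-core's base Column dataclass:
# (column, dtype, char_size=None, numeric_precision=None, numeric_scale=None).
amount = SnowflakeColumn("amount", "number")
note = SnowflakeColumn("note", "text")

print(amount.is_numeric())  # True: "number" is in the numeric dtype list above
print(note.is_string())     # True for text/character types
print(note.string_size())   # 16777216, Snowflake's maximum VARCHAR length
```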