
Merge branch 'dev/0.15.1' into dev/barbara-gittings
Jacob Beck committed Dec 23, 2019
2 parents 74672c8 + 5a4ddd6 commit 1235b3f
Showing 18 changed files with 107 additions and 21 deletions.
2 changes: 1 addition & 1 deletion .bumpversion.cfg
@@ -1,5 +1,5 @@
[bumpversion]
current_version = 0.15.0
current_version = 0.15.1rc1
parse = (?P<major>\d+)
\.(?P<minor>\d+)
\.(?P<patch>\d+)
32 changes: 32 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,35 @@
## dbt 0.15.1 (To be released)

This is a bugfix release.

### Features
- Lazily load database connections ([#1584](https://github.com/fishtown-analytics/dbt/issues/1584), [#1992](https://github.com/fishtown-analytics/dbt/pull/1992))
- Support raising warnings in user-space ([#1970](https://github.com/fishtown-analytics/dbt/issues/1970), [#1977](https://github.com/fishtown-analytics/dbt/pull/1977))
- Support BigQuery label configuration for models ([#1942](https://github.com/fishtown-analytics/dbt/issues/1942), [#1964](https://github.com/fishtown-analytics/dbt/pull/1964))
- Support retrying when BigQuery models fail with server errors ([#1579](https://github.com/fishtown-analytics/dbt/issues/1579), [#1963](https://github.com/fishtown-analytics/dbt/pull/1963))

### Fixes
- Fix for catalog generation error when datasets are missing on BigQuery ([#1984](https://github.com/fishtown-analytics/dbt/issues/1984), [#2005](https://github.com/fishtown-analytics/dbt/pull/2005))
- Fix for invalid SQL generated when "check" strategy is used in Snapshots with changing schemas ([#1797](https://github.com/fishtown-analytics/dbt/issues/1797), [#2001](https://github.com/fishtown-analytics/dbt/pull/2001))
- Fix for gaps in valid_from and valid_to timestamps when "check" strategy is used in Snapshots on some databases ([#1736](https://github.com/fishtown-analytics/dbt/issues/1736), [#1994](https://github.com/fishtown-analytics/dbt/pull/1994))
- Fix incorrect thread names in dbt server logs ([#1905](https://github.com/fishtown-analytics/dbt/issues/1905), [#2002](https://github.com/fishtown-analytics/dbt/pull/2002))
- Fix for ignored catalog data when user schemas begin with `pg*` on Postgres and Redshift ([#1960](https://github.com/fishtown-analytics/dbt/issues/1960), [#2003](https://github.com/fishtown-analytics/dbt/pull/2003))
- Fix for poorly defined materialization resolution logic ([#1962](https://github.com/fishtown-analytics/dbt/issues/1962), [#1976](https://github.com/fishtown-analytics/dbt/pull/1976))
- Fix missing `drop_schema` method in adapter namespace ([#1980](https://github.com/fishtown-analytics/dbt/issues/1980), [#1983](https://github.com/fishtown-analytics/dbt/pull/1983))
- Fix incorrect `generated_at` value in the catalog ([#1988](https://github.com/fishtown-analytics/dbt/pull/1988))

### Under the hood
- Fail more gracefully at install time when setuptools is downlevel ([#1975](https://github.com/fishtown-analytics/dbt/issues/1975), [#1978](https://github.com/fishtown-analytics/dbt/pull/1978))
- Make the `DBT_TEST_ALT` integration test warehouse configurable on Snowflake ([#1939](https://github.com/fishtown-analytics/dbt/issues/1939), [#1979](https://github.com/fishtown-analytics/dbt/pull/1979))
- Pin upper bound on `google-cloud-bigquery` dependency to `1.24.0`. ([#2007](https://github.com/fishtown-analytics/dbt/pull/2007))
- Remove duplicate `get_context_modules` method ([#1996](https://github.com/fishtown-analytics/dbt/pull/1996))
- Add type annotations to base adapter code ([#1982](https://github.com/fishtown-analytics/dbt/pull/1982))

Contributors:
- [@Fokko](https://github.com/Fokko) ([#1996](https://github.com/fishtown-analytics/dbt/pull/1996), [#1988](https://github.com/fishtown-analytics/dbt/pull/1988), [#1982](https://github.com/fishtown-analytics/dbt/pull/1982))



## dbt 0.15.0 (November 25, 2019)

### Breaking changes
12 changes: 12 additions & 0 deletions core/dbt/include/global_project/macros/adapters/common.sql
@@ -69,6 +69,10 @@
{%- endmacro %}

{% macro default__create_table_as(temporary, relation, sql) -%}
{%- set sql_header = config.get('sql_header', none) -%}

{{ sql_header if sql_header is not none }}

create {% if temporary: -%}temporary{%- endif %} table
{{ relation.include(database=(not temporary), schema=(not temporary)) }}
as (
@@ -81,6 +85,10 @@
{%- endmacro %}

{% macro default__create_view_as(relation, sql) -%}
{%- set sql_header = config.get('sql_header', none) -%}

{{ sql_header if sql_header is not none }}

create view {{ relation }} as (
{{ sql }}
);
@@ -269,3 +277,7 @@

{% do return(tmp_relation) %}
{% endmacro %}

{% macro set_sql_header(config) -%}
{{ config.set('sql_header', caller()) }}
{%- endmacro %}
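
The new `set_sql_header` macro captures its caller block and stores it in the model config under `sql_header`; the `create_table_as` / `create_view_as` macros above then emit that block ahead of the generated DDL. A minimal usage sketch in a model file (the model name and session-setting statement are illustrative, not from this commit):

```sql
-- any_model.sql (hypothetical model file)
{{ config(materialized='table') }}

{% call set_sql_header(config) %}
    -- everything in this block runs before the `create table ... as` statement
    set time zone 'UTC';
{% endcall %}

select current_timestamp as loaded_at
```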
2 changes: 1 addition & 1 deletion core/dbt/source_config.py
@@ -16,7 +16,7 @@ class SourceConfig:
'unique_key',
'database',
'severity',

'sql_header',
'incremental_strategy',

# snapshots
2 changes: 1 addition & 1 deletion core/dbt/version.py
@@ -56,5 +56,5 @@ def get_version_information():
.format(version_msg))


__version__ = '0.15.0'
__version__ = '0.15.1rc1'
installed = get_installed_version()
2 changes: 1 addition & 1 deletion core/setup.py
@@ -18,7 +18,7 @@ def read(fname):


package_name = "dbt-core"
package_version = "0.15.0"
package_version = "0.15.1rc1"
description = """dbt (data build tool) is a command line tool that helps \
analysts and engineers transform data in their warehouse more effectively"""

24 changes: 15 additions & 9 deletions plugins/bigquery/dbt/adapters/bigquery/connections.py
@@ -176,9 +176,12 @@ def get_timeout(cls, conn):
return credentials.timeout_seconds

@classmethod
def get_retries(cls, conn):
def get_retries(cls, conn) -> int:
credentials = conn.credentials
return credentials.retries
if credentials.retries is not None:
return credentials.retries
else:
return 1

@classmethod
def get_table_from_response(cls, resp):
@@ -270,8 +273,11 @@ def create_table(self, database, schema, table_name, sql):
job_params = {'destination': table_ref,
'write_disposition': 'WRITE_TRUNCATE'}

timeout = self.get_timeout(conn)

def fn():
return self._query_and_results(client, sql, conn, job_params)
return self._query_and_results(client, sql, conn, job_params,
timeout=timeout)
self._retry_and_handle(msg=sql, conn=conn, fn=fn)

def create_date_partitioned_table(self, database, schema, table_name):
@@ -317,12 +323,12 @@ def fn():
return client.create_dataset(dataset, exists_ok=True)
self._retry_and_handle(msg='create dataset', conn=conn, fn=fn)

def _query_and_results(self, client, sql, conn, job_params):
def _query_and_results(self, client, sql, conn, job_params, timeout=None):
"""Query the client and wait for results."""
# Cannot reuse job_config if destination is set and ddl is used
job_config = google.cloud.bigquery.QueryJobConfig(**job_params)
query_job = client.query(sql, job_config=job_config)
iterator = query_job.result(timeout=self.get_timeout(conn))
iterator = query_job.result(timeout=timeout)

return query_job, iterator

@@ -354,13 +360,13 @@ def count_error(self, error):
return False # Don't log
self.error_count += 1
if _is_retryable(error) and self.error_count <= self.retries:
logger.warning(
'Retry attempt %s of %s after error: %s',
logger.debug(
'Retry attempt {} of {} after error: {}',
self.error_count, self.retries, repr(error))
return True
else:
logger.warning(
'Not Retrying after %s previous attempts. Error: %s',
logger.debug(
'Not Retrying after {} previous attempts. Error: {}',
self.error_count - 1, repr(error))
return False
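
The retry predicate above can be sketched in isolation (class and argument names here are hypothetical simplifications, not dbt's actual API): an object counts the errors it has seen and allows a retry only while the count stays within the configured budget.

```python
class RetryCounter:
    """Sketch of a retry predicate: allow retries until the budget is spent."""

    def __init__(self, retries: int):
        self.retries = retries
        self.error_count = 0

    def count_error(self, retryable: bool) -> bool:
        """Record one error; return True if the caller should retry."""
        self.error_count += 1
        if retryable and self.error_count <= self.retries:
            return True   # still within budget: retry
        return False      # non-retryable error or budget exhausted: give up


counter = RetryCounter(retries=1)
print(counter.count_error(retryable=True))  # first server error: retry
print(counter.count_error(retryable=True))  # second error: budget spent, stop
```

With `get_retries` now defaulting to 1 when unset, a transient BigQuery server error gets exactly one retry before the failure is surfaced.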

7 changes: 7 additions & 0 deletions plugins/bigquery/dbt/include/bigquery/macros/adapters.sql
@@ -61,6 +61,10 @@
{%- set raw_persist_docs = config.get('persist_docs', {}) -%}
{%- set raw_kms_key_name = config.get('kms_key_name', none) -%}
{%- set raw_labels = config.get('labels', []) -%}
{%- set sql_header = config.get('sql_header', none) -%}

{{ sql_header if sql_header is not none }}

create or replace table {{ relation }}
{{ partition_by(raw_partition_by) }}
{{ cluster_by(raw_cluster_by) }}
@@ -76,6 +80,9 @@
{% macro bigquery__create_view_as(relation, sql) -%}
{%- set raw_persist_docs = config.get('persist_docs', {}) -%}
{%- set raw_labels = config.get('labels', []) -%}
{%- set sql_header = config.get('sql_header', none) -%}

{{ sql_header if sql_header is not none }}

create or replace view {{ relation }}
{{ bigquery_table_options(persist_docs=raw_persist_docs, temporary=false, labels=raw_labels) }}
2 changes: 1 addition & 1 deletion plugins/bigquery/setup.py
@@ -14,7 +14,7 @@


package_name = "dbt-bigquery"
package_version = "0.15.0"
package_version = "0.15.1rc1"
description = """The bigquery adapter plugin for dbt (data build tool)"""

this_directory = os.path.abspath(os.path.dirname(__file__))
3 changes: 3 additions & 0 deletions plugins/postgres/dbt/include/postgres/macros/adapters.sql
@@ -1,5 +1,8 @@
{% macro postgres__create_table_as(temporary, relation, sql) -%}
{%- set unlogged = config.get('unlogged', default=false) -%}
{%- set sql_header = config.get('sql_header', none) -%}

{{ sql_header if sql_header is not none }}

create {% if temporary -%}
temporary
2 changes: 1 addition & 1 deletion plugins/postgres/setup.py
@@ -35,7 +35,7 @@ def _dbt_psycopg2_name():


package_name = "dbt-postgres"
package_version = "0.15.0"
package_version = "0.15.1rc1"
description = """The postgres adapter plugin for dbt (data build tool)"""

this_directory = os.path.abspath(os.path.dirname(__file__))
6 changes: 6 additions & 0 deletions plugins/redshift/dbt/include/redshift/macros/adapters.sql
@@ -37,6 +37,9 @@
{%- set _sort = config.get(
'sort',
validator=validation.any[list, basestring]) -%}
{%- set sql_header = config.get('sql_header', none) -%}

{{ sql_header if sql_header is not none }}

create {% if temporary -%}temporary{%- endif %} table
{{ relation.include(database=(not temporary), schema=(not temporary)) }}
@@ -51,6 +54,9 @@
{% macro redshift__create_view_as(relation, sql) -%}

{% set bind_qualifier = '' if config.get('bind', default=True) else 'with no schema binding' %}
{%- set sql_header = config.get('sql_header', none) -%}

{{ sql_header if sql_header is not none }}

create view {{ relation }} as (
{{ sql }}
2 changes: 1 addition & 1 deletion plugins/redshift/setup.py
@@ -14,7 +14,7 @@


package_name = "dbt-redshift"
package_version = "0.15.0"
package_version = "0.15.1rc1"
description = """The redshift adapter plugin for dbt (data build tool)"""

this_directory = os.path.abspath(os.path.dirname(__file__))
6 changes: 6 additions & 0 deletions plugins/snowflake/dbt/include/snowflake/macros/adapters.sql
@@ -11,6 +11,9 @@
{% else %}
{%- set cluster_by_string = none -%}
{%- endif -%}
{%- set sql_header = config.get('sql_header', none) -%}

{{ sql_header if sql_header is not none }}

create or replace {% if temporary -%}
temporary
@@ -38,6 +41,9 @@
{% macro snowflake__create_view_as(relation, sql) -%}
{%- set secure = config.get('secure', default=false) -%}
{%- set copy_grants = config.get('copy_grants', default=false) -%}
{%- set sql_header = config.get('sql_header', none) -%}

{{ sql_header if sql_header is not none }}
create or replace {% if secure -%}
secure
{%- endif %} view {{ relation }} {% if copy_grants -%} copy grants {%- endif %} as (
2 changes: 1 addition & 1 deletion plugins/snowflake/setup.py
@@ -14,7 +14,7 @@


package_name = "dbt-snowflake"
package_version = "0.15.0"
package_version = "0.15.1rc1"
description = """The snowflake adapter plugin for dbt (data build tool)"""

this_directory = os.path.abspath(os.path.dirname(__file__))
2 changes: 1 addition & 1 deletion setup.py
@@ -9,7 +9,7 @@


package_name = "dbt"
package_version = "0.15.0"
package_version = "0.15.1rc1"
description = """With dbt, data analysts and engineers can build analytics \
the way engineers build applications."""

14 changes: 14 additions & 0 deletions test/integration/022_bigquery_test/models/sql_header_model.sql
@@ -0,0 +1,14 @@
{{ config(materialized="table") }}

{# This will fail if it is not extracted correctly #}
{% call set_sql_header(config) %}
CREATE TEMPORARY FUNCTION a_to_b(str STRING)
RETURNS STRING AS (
CASE
WHEN LOWER(str) = 'a' THEN 'b'
ELSE str
END
);
{% endcall %}

select a_to_b(dupe) as dupe from {{ ref('view_model') }}
@@ -53,7 +53,7 @@ def test__bigquery_simple_run(self):
self.run_dbt(['seed', '--full-refresh'])
results = self.run_dbt()
# Bump expected number of results when adding new model
self.assertEqual(len(results), 7)
self.assertEqual(len(results), 8)
self.assert_nondupes_pass()


@@ -64,7 +64,7 @@ class TestUnderscoreBigQueryRun(TestBaseBigQueryRun):
def test_bigquery_run_twice(self):
self.run_dbt(['seed'])
results = self.run_dbt()
self.assertEqual(len(results), 7)
self.assertEqual(len(results), 8)
results = self.run_dbt()
self.assertEqual(len(results), 7)
self.assertEqual(len(results), 8)
self.assert_nondupes_pass()
