Commit
Merge branch 'main' into b323176126-pandas-gbq
tswast authored Jan 16, 2025
2 parents aff7021 + 9c50418 commit ce368d9
Showing 16 changed files with 1,066 additions and 13 deletions.
6 changes: 3 additions & 3 deletions .github/.OwlBot.lock.yaml
@@ -1,4 +1,4 @@
# Copyright 2024 Google LLC
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -13,5 +13,5 @@
# limitations under the License.
docker:
image: gcr.io/cloud-devrel-public-resources/owlbot-python:latest
digest: sha256:8e3e7e18255c22d1489258d0374c901c01f9c4fd77a12088670cd73d580aa737
# created: 2024-12-17T00:59:58.625514486Z
digest: sha256:8ff1efe878e18bd82a0fb7b70bb86f77e7ab6901fed394440b6135db0ba8d84a
# created: 2025-01-09T12:01:16.422459506Z
28 changes: 28 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,34 @@
[1]: https://pypi.org/project/google-cloud-bigquery/#history


## [3.28.0](https://github.com/googleapis/python-bigquery/compare/v3.27.0...v3.28.0) (2025-01-15)


### Features

* Add property for `allowNonIncrementalDefinition` for materialized view ([#2084](https://github.com/googleapis/python-bigquery/issues/2084)) ([3359ef3](https://github.com/googleapis/python-bigquery/commit/3359ef37b90243bea2d9e68bb996fe5d736f304c))
* Add property for maxStaleness in table definitions ([#2087](https://github.com/googleapis/python-bigquery/issues/2087)) ([729322c](https://github.com/googleapis/python-bigquery/commit/729322c2288a30464f2f135ba18b9c4aa7d2f0da))
* Add type hints to Client ([#2044](https://github.com/googleapis/python-bigquery/issues/2044)) ([40529de](https://github.com/googleapis/python-bigquery/commit/40529de923e25c41c6728c121b9c82a042967ada))
* Adds ExternalCatalogDatasetOptions and tests ([#2111](https://github.com/googleapis/python-bigquery/issues/2111)) ([b929a90](https://github.com/googleapis/python-bigquery/commit/b929a900d49e2c15897134209ed9de5fc7f238cd))
* Adds ForeignTypeInfo class and tests ([#2110](https://github.com/googleapis/python-bigquery/issues/2110)) ([55ca63c](https://github.com/googleapis/python-bigquery/commit/55ca63c23fcb56573e2de67e4f7899939628c4a1))
* Adds new input validation function similar to isinstance. ([#2107](https://github.com/googleapis/python-bigquery/issues/2107)) ([a2bebb9](https://github.com/googleapis/python-bigquery/commit/a2bebb95c5ef32ac7c7cbe19c3e7a9412cbee60d))
* Adds StorageDescriptor and tests ([#2109](https://github.com/googleapis/python-bigquery/issues/2109)) ([6be0272](https://github.com/googleapis/python-bigquery/commit/6be0272ff25dac97a38ae4ee5aa02016dc82a0d8))
* Adds the SerDeInfo class and tests ([#2108](https://github.com/googleapis/python-bigquery/issues/2108)) ([62960f2](https://github.com/googleapis/python-bigquery/commit/62960f255d05b15940a8d2cdc595592175fada11))
* Migrate to pyproject.toml ([#2041](https://github.com/googleapis/python-bigquery/issues/2041)) ([1061611](https://github.com/googleapis/python-bigquery/commit/106161180ead01aca1ead909cf06ca559f68666d))
* Preserve unknown fields from the REST API representation in `SchemaField` ([#2097](https://github.com/googleapis/python-bigquery/issues/2097)) ([aaf1eb8](https://github.com/googleapis/python-bigquery/commit/aaf1eb85ada95ab866be0199812ea7f5c7f50766))
* Resource tags in dataset ([#2090](https://github.com/googleapis/python-bigquery/issues/2090)) ([3e13016](https://github.com/googleapis/python-bigquery/commit/3e130166f43dcc06704fe90edf9068dfd44842a6))
* Support setting max_stream_count when fetching query result ([#2051](https://github.com/googleapis/python-bigquery/issues/2051)) ([d461297](https://github.com/googleapis/python-bigquery/commit/d4612979b812d2a835e47200f27a87a66bcb856a))


### Bug Fixes

* Allow geopandas 1.x ([#2065](https://github.com/googleapis/python-bigquery/issues/2065)) ([f2ab8cb](https://github.com/googleapis/python-bigquery/commit/f2ab8cbfe00d442ad3b40683ecfec320e53b4688))


### Documentation

* Render fields correctly for update calls ([#2055](https://github.com/googleapis/python-bigquery/issues/2055)) ([a4d9534](https://github.com/googleapis/python-bigquery/commit/a4d9534a900f13ae7355904cda05097d781f27e3))

## [3.27.0](https://github.com/googleapis/python-bigquery/compare/v3.26.0...v3.27.0) (2024-11-01)


32 changes: 31 additions & 1 deletion google/cloud/bigquery/_helpers.py
@@ -22,7 +22,7 @@
import re
import os
import warnings
from typing import Optional, Union
from typing import Optional, Union, Any, Tuple, Type

from dateutil import relativedelta
from google.cloud._helpers import UTC # type: ignore
@@ -1004,3 +1004,33 @@ def _verify_job_config_type(job_config, expected_type, param_name="job_config"):
job_config=job_config,
)
)


def _isinstance_or_raise(
    value: Any,
    dtype: Union[Type, Tuple[Type, ...]],
    none_allowed: bool = False,
) -> Any:
    """Determine whether a value type matches a given datatype or None.

    Args:
        value (Any): Value to be checked.
        dtype (Union[Type, Tuple[Type, ...]]): Expected data type or tuple of
            data types.
        none_allowed (bool): Whether the value is allowed to be None.
            Defaults to False.

    Returns:
        Any: Returns the input value if the type check is successful.

    Raises:
        TypeError: If the input value's type does not match the expected
            data type(s).
    """
    if none_allowed and value is None:
        return value

    if isinstance(value, dtype):
        return value

    or_none = ""
    if none_allowed:
        or_none = " (or None)"

    msg = f"Pass {value} as a '{dtype}'{or_none}. Got {type(value)}."
    raise TypeError(msg)
48 changes: 48 additions & 0 deletions google/cloud/bigquery/dataset.py
@@ -27,6 +27,7 @@
from google.cloud.bigquery.routine import Routine, RoutineReference
from google.cloud.bigquery.table import Table, TableReference
from google.cloud.bigquery.encryption_configuration import EncryptionConfiguration
from google.cloud.bigquery import external_config

from typing import Optional, List, Dict, Any, Union

@@ -530,6 +531,8 @@ class Dataset(object):
        "storage_billing_model": "storageBillingModel",
        "max_time_travel_hours": "maxTimeTravelHours",
        "default_rounding_mode": "defaultRoundingMode",
        "resource_tags": "resourceTags",
        "external_catalog_dataset_options": "externalCatalogDatasetOptions",
    }

    def __init__(self, dataset_ref) -> None:
@@ -801,6 +804,28 @@ def labels(self, value):
            raise ValueError("Pass a dict")
        self._properties["labels"] = value

    @property
    def resource_tags(self):
        """Dict[str, str]: Resource tags of the dataset.

        Optional. The tags attached to this dataset. Tag keys are globally
        unique. Tag key is expected to be in the namespaced format, for
        example "123456789012/environment" where 123456789012 is
        the ID of the parent organization or project resource for this tag
        key. Tag value is expected to be the short name, for example
        "Production".

        Raises:
            ValueError: for invalid value types.
        """
        return self._properties.setdefault("resourceTags", {})

    @resource_tags.setter
    def resource_tags(self, value):
        if not isinstance(value, dict) and value is not None:
            raise ValueError("Pass a dict")
        self._properties["resourceTags"] = value

@property
def default_encryption_configuration(self):
"""google.cloud.bigquery.encryption_configuration.EncryptionConfiguration: Custom
@@ -875,6 +900,29 @@ def storage_billing_model(self, value):
)
self._properties["storageBillingModel"] = value

    @property
    def external_catalog_dataset_options(self):
        """Options defining open source compatible datasets living in the
        BigQuery catalog. Contains metadata of open source database, schema
        or namespace represented by the current dataset."""

        prop = _helpers._get_sub_prop(
            self._properties, ["externalCatalogDatasetOptions"]
        )

        if prop is not None:
            prop = external_config.ExternalCatalogDatasetOptions.from_api_repr(prop)
        return prop

    @external_catalog_dataset_options.setter
    def external_catalog_dataset_options(self, value):
        value = _helpers._isinstance_or_raise(
            value, external_config.ExternalCatalogDatasetOptions, none_allowed=True
        )
        self._properties[
            self._PROPERTY_TO_API_FIELD["external_catalog_dataset_options"]
        ] = (value.to_api_repr() if value is not None else None)
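Both new `Dataset` properties follow the library's convention of keeping all state in a `_properties` dict that mirrors the REST payload, translating snake_case attribute names to camelCase API fields. A minimal standalone sketch of that pattern (the `_DatasetSketch` class name and the tag values are invented for illustration, not part of the library):

```python
class _DatasetSketch:
    """Toy stand-in for bigquery.Dataset showing the _properties pattern."""

    _PROPERTY_TO_API_FIELD = {"resource_tags": "resourceTags"}

    def __init__(self):
        # All state lives in a dict that mirrors the REST API payload.
        self._properties = {}

    @property
    def resource_tags(self):
        # setdefault means a read materializes an empty map, so callers
        # can mutate dataset.resource_tags in place.
        return self._properties.setdefault("resourceTags", {})

    @resource_tags.setter
    def resource_tags(self, value):
        if not isinstance(value, dict) and value is not None:
            raise ValueError("Pass a dict")
        self._properties[self._PROPERTY_TO_API_FIELD["resource_tags"]] = value


ds = _DatasetSketch()
ds.resource_tags = {"123456789012/environment": "Production"}
print(ds._properties)  # {'resourceTags': {'123456789012/environment': 'Production'}}
```

Because only `_properties` is serialized, a client update call can send exactly this dict as the request body, which is why the setters write the camelCase key rather than a separate attribute.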

    @classmethod
    def from_string(cls, full_dataset_id: str) -> "Dataset":
        """Construct a dataset from fully-qualified dataset ID.
76 changes: 75 additions & 1 deletion google/cloud/bigquery/external_config.py
@@ -18,7 +18,7 @@
Job.configuration.query.tableDefinitions.
"""

from __future__ import absolute_import
from __future__ import absolute_import, annotations

import base64
import copy
@@ -28,6 +28,7 @@
from google.cloud.bigquery._helpers import _bytes_to_json
from google.cloud.bigquery._helpers import _int_or_none
from google.cloud.bigquery._helpers import _str_or_none
from google.cloud.bigquery import _helpers
from google.cloud.bigquery.format_options import AvroOptions, ParquetOptions
from google.cloud.bigquery.schema import SchemaField

@@ -1003,3 +1004,76 @@ def from_api_repr(cls, resource: dict) -> "ExternalConfig":
        config = cls(resource["sourceFormat"])
        config._properties = copy.deepcopy(resource)
        return config


class ExternalCatalogDatasetOptions:
    """Options defining open source compatible datasets living in the BigQuery catalog.

    Contains metadata of open source database, schema or namespace represented
    by the current dataset.

    Args:
        default_storage_location_uri (Optional[str]): The storage location URI
            for all tables in the dataset. Equivalent to hive metastore's
            database locationUri. Maximum length of 1024 characters.
        parameters (Optional[dict[str, Any]]): A map of key value pairs defining
            the parameters and properties of the open source schema. Maximum
            size of 2 MiB.
    """

    def __init__(
        self,
        default_storage_location_uri: Optional[str] = None,
        parameters: Optional[Dict[str, Any]] = None,
    ):
        self._properties: Dict[str, Any] = {}
        self.default_storage_location_uri = default_storage_location_uri
        self.parameters = parameters

    @property
    def default_storage_location_uri(self) -> Optional[str]:
        """Optional. The storage location URI for all tables in the dataset.
        Equivalent to hive metastore's database locationUri. Maximum length of
        1024 characters."""
        return self._properties.get("defaultStorageLocationUri")

    @default_storage_location_uri.setter
    def default_storage_location_uri(self, value: Optional[str]):
        value = _helpers._isinstance_or_raise(value, str, none_allowed=True)
        self._properties["defaultStorageLocationUri"] = value

    @property
    def parameters(self) -> Optional[Dict[str, Any]]:
        """Optional. A map of key value pairs defining the parameters and
        properties of the open source schema. Maximum size of 2 MiB."""
        return self._properties.get("parameters")

    @parameters.setter
    def parameters(self, value: Optional[Dict[str, Any]]):
        value = _helpers._isinstance_or_raise(value, dict, none_allowed=True)
        self._properties["parameters"] = value

    def to_api_repr(self) -> dict:
        """Build an API representation of this object.

        Returns:
            Dict[str, Any]:
                A dictionary in the format used by the BigQuery API.
        """
        return self._properties

    @classmethod
    def from_api_repr(cls, api_repr: dict) -> ExternalCatalogDatasetOptions:
        """Factory: constructs an instance of the class (cls)
        given its API representation.

        Args:
            api_repr (Dict[str, Any]):
                API representation of the object to be instantiated.

        Returns:
            An instance of the class initialized with data from ``api_repr``.
        """
        config = cls()
        config._properties = api_repr
        return config
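Since `to_api_repr` returns the backing dict and `from_api_repr` installs one, the class round-trips losslessly between the Python object and the REST payload. A standalone sketch of that round trip (the `Sketch` class, the bucket URI, and the parameter values are illustrative only, and the validation helper is dropped for self-containment):

```python
from typing import Any, Dict, Optional


class ExternalCatalogDatasetOptionsSketch:
    """Simplified stand-in mirroring the to_api_repr/from_api_repr contract."""

    def __init__(
        self,
        default_storage_location_uri: Optional[str] = None,
        parameters: Optional[Dict[str, Any]] = None,
    ):
        self._properties: Dict[str, Any] = {}
        if default_storage_location_uri is not None:
            self._properties["defaultStorageLocationUri"] = default_storage_location_uri
        if parameters is not None:
            self._properties["parameters"] = parameters

    def to_api_repr(self) -> dict:
        # The stored dict already *is* the API representation.
        return self._properties

    @classmethod
    def from_api_repr(cls, api_repr: dict) -> "ExternalCatalogDatasetOptionsSketch":
        # Factory: adopt the payload wholesale instead of copying field by field.
        config = cls()
        config._properties = api_repr
        return config


opts = ExternalCatalogDatasetOptionsSketch(
    default_storage_location_uri="gs://bucket/warehouse",
    parameters={"owner": "data-eng"},
)
payload = opts.to_api_repr()
restored = ExternalCatalogDatasetOptionsSketch.from_api_repr(payload)
assert restored.to_api_repr() == {
    "defaultStorageLocationUri": "gs://bucket/warehouse",
    "parameters": {"owner": "data-eng"},
}
```

This is the same serialization pattern used by the new `Dataset.external_catalog_dataset_options` setter in this commit, which calls `to_api_repr()` before storing the value into the dataset's own `_properties`.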