
Merging in datastore-v1beta3 branch. #1688

Merged Apr 1, 2016 · 31 commits
Commits (31)
64c28fd
Updating datastore URI template for v1beta3.
dhermes Jan 21, 2016
712077e
Removing use of isolation level in datastore.
dhermes Jan 3, 2016
323eff8
Renaming Makefile dirs to be Bigtable specific.
dhermes Jan 7, 2016
3fbe2ec
Upgrading Makefile to generate datastore v1beta3.
dhermes Jan 7, 2016
62abc15
Updating lint rules to account for new generated files.
dhermes Jan 7, 2016
5e6ab59
Using more descriptive shell var names in Makefile.
dhermes Feb 12, 2016
77322e0
Renaming dataset_id->project_id on PartitionId.
dhermes Jan 7, 2016
9957fb5
Renaming namespace->namespace_id on PartitionId.
dhermes Jan 7, 2016
a2493e2
Rename Value.indexed->exclude_from_indexes.
dhermes Jan 15, 2016
f734f0a
Renaming datastore operation->op on CompositeFilter and PropertyFilter.
dhermes Jan 7, 2016
ad509aa
Renaming filter->filters in CompositeFilter.
dhermes Jan 7, 2016
bced58c
Updating to new default for CompositeFilter.Operator.
dhermes Jan 7, 2016
47540b8
Updating to new default for ReadOptions.ReadConsistency.
dhermes Jan 7, 2016
eccd475
Renaming group_by->distinct_on in Query.
dhermes Jan 7, 2016
de20e17
Updating Query.limit from int32 to google.protobuf.Int32Value.
dhermes Jan 7, 2016
01a6b3c
Renaming entity_result->entity_results in QueryResultBatch.
dhermes Jan 7, 2016
b925c1a
Renaming key->keys in AllocateIdsResponse/AllocateIdsRequest.
dhermes Jan 7, 2016
43ba890
Renaming key->keys in LookupRequest.
dhermes Jan 7, 2016
e7a6e20
Renaming path_element->path in Key.
dhermes Jan 7, 2016
08789e3
Upgrading Entity.property to properties map in datastore.
dhermes Jan 7, 2016
8f06584
Upgrading timestamp_microseconds_value to timestamp_value.
dhermes Jan 7, 2016
58855a3
Upgrading list_value -> array_value for v1beta3.
dhermes Jan 7, 2016
ff76cb4
Updating CommitRequest, Mutation and helpers for v1beta3.
dhermes Jan 7, 2016
1f17183
Follow-on commit to clean-up out-of-sync code.
dhermes Feb 13, 2016
10c4393
Using WhichOneof when parsing a Value protobuf.
dhermes Feb 13, 2016
2336b8f
Adding support for null and geo point values in v1beta3.
dhermes Feb 13, 2016
c0a5306
Removing custom dataset ID environment variable.
dhermes Feb 13, 2016
ba8c262
Removing hacks that avoid using project ID in key protos.
dhermes Feb 13, 2016
4524b53
Removing checks for dataset ID prefixes.
dhermes Feb 13, 2016
44e12b1
Parse Datastore error message using Status protocol buffer.
pcostell Mar 15, 2016
8538095
Merge branch master into datastore-v1beta3
dhermes Apr 1, 2016
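Commit 10c4393 switches `Value` parsing to the protobuf `WhichOneof` API instead of probing each field in turn. The datastore v1beta3 messages are not shown on this page, so the sketch below illustrates the same pattern with the well-known `google.protobuf.Struct` `Value` message, whose fields also live in a oneof (named `kind`):

```python
# Illustrative only: uses the well-known Struct Value message rather than
# the datastore v1beta3 Value message, which is not included here.
from google.protobuf import struct_pb2

value = struct_pb2.Value(number_value=42.0)

# WhichOneof returns the name of whichever oneof field is currently set,
# so parsing needs a single dispatch instead of a chain of HasField checks.
set_field = value.WhichOneof('kind')
print(set_field, getattr(value, set_field))
```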
1 change: 1 addition & 0 deletions .gitignore
@@ -53,3 +53,4 @@ scripts/pylintrc_reduced
# Directories used for creating generated PB2 files
generated_python/
cloud-bigtable-client/
googleapis-pb/
4 changes: 1 addition & 3 deletions CONTRIBUTING.rst
@@ -162,8 +162,6 @@ Running System Tests

- ``GCLOUD_TESTS_PROJECT_ID``: Developers Console project ID (e.g.
bamboo-shift-455).
- ``GCLOUD_TESTS_DATASET_ID``: The name of the dataset your tests connect to.
This is typically the same as ``GCLOUD_TESTS_PROJECT_ID``.
- ``GOOGLE_APPLICATION_CREDENTIALS``: The path to a JSON key file;
see ``system_tests/app_credentials.json.sample`` as an example. Such a file
can be downloaded directly from the developer's console by clicking
@@ -195,7 +193,7 @@ Running System Tests

# Create the indexes
$ gcloud preview datastore create-indexes system_tests/data/index.yaml \
> --project=$GCLOUD_TESTS_DATASET_ID
> --project=$GCLOUD_TESTS_PROJECT_ID

# Restore your environment to its previous state.
$ unset CLOUDSDK_PYTHON_SITEPACKAGES
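The CONTRIBUTING.rst hunk above drops `GCLOUD_TESTS_DATASET_ID`, leaving the project ID as the single variable system tests need. A minimal sketch of how a test harness might read it — the helper name below is hypothetical, not part of the repo:

```python
import os


def get_test_project_id(environ=None):
    """Return the system-test project ID (hypothetical helper).

    With GCLOUD_TESTS_DATASET_ID removed, the project ID alone is enough.
    """
    environ = os.environ if environ is None else environ
    project_id = environ.get('GCLOUD_TESTS_PROJECT_ID')
    if project_id is None:
        raise EnvironmentError('GCLOUD_TESTS_PROJECT_ID must be set')
    return project_id


print(get_test_project_id({'GCLOUD_TESTS_PROJECT_ID': 'bamboo-shift-455'}))
```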
67 changes: 44 additions & 23 deletions Makefile
@@ -1,8 +1,10 @@
GENERATED_DIR=$(shell pwd)/generated_python
FINAL_DIR=$(shell pwd)/gcloud/bigtable/_generated
BIGTABLE_DIR=$(shell pwd)/gcloud/bigtable/_generated
DATASTORE_DIR=$(shell pwd)/gcloud/datastore/_generated
GRPC_PLUGIN=grpc_python_plugin
PROTOC_CMD=protoc
PROTOS_DIR=$(shell pwd)/cloud-bigtable-client/bigtable-protos/src/main/proto
BIGTABLE_PROTOS_DIR=$(shell pwd)/cloud-bigtable-client/bigtable-protos/src/main/proto
GOOGLEAPIS_PROTOS_DIR=$(shell pwd)/googleapis-pb

help:
@echo 'Makefile for gcloud-python Bigtable protos '
@@ -12,42 +14,58 @@ help:
@echo ' make clean Clean generated files '

generate:
[ -d cloud-bigtable-client ] || git clone https://github.com/GoogleCloudPlatform/cloud-bigtable-client
# Retrieve git repos that have our *.proto files.
[ -d cloud-bigtable-client ] || git clone https://github.com/GoogleCloudPlatform/cloud-bigtable-client --depth=1
cd cloud-bigtable-client && git pull origin master
[ -d googleapis-pb ] || git clone https://github.com/google/googleapis googleapis-pb --depth=1
cd googleapis-pb && git pull origin master
# Make the directory where our *_pb2.py files will go.
mkdir -p $(GENERATED_DIR)
# Generate all *_pb2.py files that require gRPC.
$(PROTOC_CMD) \
--proto_path=$(PROTOS_DIR) \
--proto_path=$(BIGTABLE_PROTOS_DIR) \
--python_out=$(GENERATED_DIR) \
--plugin=protoc-gen-grpc=$(GRPC_PLUGIN) \
--grpc_out=$(GENERATED_DIR) \
$(PROTOS_DIR)/google/bigtable/v1/bigtable_service.proto \
$(PROTOS_DIR)/google/bigtable/admin/cluster/v1/bigtable_cluster_service.proto \
$(PROTOS_DIR)/google/bigtable/admin/table/v1/bigtable_table_service.proto
$(BIGTABLE_PROTOS_DIR)/google/bigtable/v1/bigtable_service.proto \
$(BIGTABLE_PROTOS_DIR)/google/bigtable/admin/cluster/v1/bigtable_cluster_service.proto \
$(BIGTABLE_PROTOS_DIR)/google/bigtable/admin/table/v1/bigtable_table_service.proto
# Generate all *_pb2.py files that do not require gRPC.
$(PROTOC_CMD) \
--proto_path=$(PROTOS_DIR) \
--proto_path=$(BIGTABLE_PROTOS_DIR) \
--proto_path=$(GOOGLEAPIS_PROTOS_DIR) \
--python_out=$(GENERATED_DIR) \
$(PROTOS_DIR)/google/bigtable/v1/bigtable_data.proto \
$(PROTOS_DIR)/google/bigtable/v1/bigtable_service_messages.proto \
$(PROTOS_DIR)/google/bigtable/admin/cluster/v1/bigtable_cluster_data.proto \
$(PROTOS_DIR)/google/bigtable/admin/cluster/v1/bigtable_cluster_service_messages.proto \
$(PROTOS_DIR)/google/bigtable/admin/table/v1/bigtable_table_data.proto \
$(PROTOS_DIR)/google/bigtable/admin/table/v1/bigtable_table_service_messages.proto
$(BIGTABLE_PROTOS_DIR)/google/bigtable/v1/bigtable_data.proto \
$(BIGTABLE_PROTOS_DIR)/google/bigtable/v1/bigtable_service_messages.proto \
$(BIGTABLE_PROTOS_DIR)/google/bigtable/admin/cluster/v1/bigtable_cluster_data.proto \
$(BIGTABLE_PROTOS_DIR)/google/bigtable/admin/cluster/v1/bigtable_cluster_service_messages.proto \
$(BIGTABLE_PROTOS_DIR)/google/bigtable/admin/table/v1/bigtable_table_data.proto \
$(BIGTABLE_PROTOS_DIR)/google/bigtable/admin/table/v1/bigtable_table_service_messages.proto \
$(GOOGLEAPIS_PROTOS_DIR)/google/datastore/v1beta3/datastore.proto \
$(GOOGLEAPIS_PROTOS_DIR)/google/datastore/v1beta3/entity.proto \
$(GOOGLEAPIS_PROTOS_DIR)/google/datastore/v1beta3/query.proto
# Move the newly generated *_pb2.py files into our library.
mv $(GENERATED_DIR)/google/bigtable/v1/* $(FINAL_DIR)
mv $(GENERATED_DIR)/google/bigtable/admin/cluster/v1/* $(FINAL_DIR)
mv $(GENERATED_DIR)/google/bigtable/admin/table/v1/* $(FINAL_DIR)
mv $(GENERATED_DIR)/google/bigtable/v1/* $(BIGTABLE_DIR)
mv $(GENERATED_DIR)/google/bigtable/admin/cluster/v1/* $(BIGTABLE_DIR)
mv $(GENERATED_DIR)/google/bigtable/admin/table/v1/* $(BIGTABLE_DIR)
mv $(GENERATED_DIR)/google/datastore/v1beta3/* $(DATASTORE_DIR)
# Remove all existing *.proto files before we replace
rm -f $(FINAL_DIR)/*.proto
rm -f $(BIGTABLE_DIR)/*.proto
rm -f $(DATASTORE_DIR)/*.proto
# Copy over the *.proto files into our library.
cp $(PROTOS_DIR)/google/bigtable/v1/*.proto $(FINAL_DIR)
cp $(PROTOS_DIR)/google/bigtable/admin/cluster/v1/*.proto $(FINAL_DIR)
cp $(PROTOS_DIR)/google/bigtable/admin/table/v1/*.proto $(FINAL_DIR)
cp $(PROTOS_DIR)/google/longrunning/operations.proto $(FINAL_DIR)
cp $(BIGTABLE_PROTOS_DIR)/google/bigtable/v1/*.proto $(BIGTABLE_DIR)
cp $(BIGTABLE_PROTOS_DIR)/google/bigtable/admin/cluster/v1/*.proto $(BIGTABLE_DIR)
cp $(BIGTABLE_PROTOS_DIR)/google/bigtable/admin/table/v1/*.proto $(BIGTABLE_DIR)
cp $(BIGTABLE_PROTOS_DIR)/google/longrunning/operations.proto $(BIGTABLE_DIR)
cp $(GOOGLEAPIS_PROTOS_DIR)/google/datastore/v1beta3/*.proto $(DATASTORE_DIR)
# Rename all *.proto files in our library with an
# underscore and remove executable bit.
cd $(FINAL_DIR) && \
cd $(BIGTABLE_DIR) && \
for filename in *.proto; do \
chmod -x $$filename ; \
mv $$filename _$$filename ; \
done
cd $(DATASTORE_DIR) && \
for filename in *.proto; do \
chmod -x $$filename ; \
mv $$filename _$$filename ; \
@@ -56,6 +74,9 @@ generate:
# non-gRPC parts so that the protos from `googleapis-common-protos`
# can be used without gRPC.
python scripts/make_operations_grpc.py
# Separate the gRPC parts of the datastore service from the
# non-gRPC parts so that the protos can be used without gRPC.
python scripts/make_datastore_grpc.py
# Rewrite the imports in the generated *_pb2.py files.
python scripts/rewrite_imports.py

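The underscore-rename loop in the `generate` target above can be exercised on its own. This sketch runs the same loop against a scratch directory with dummy files; the `/tmp/proto_rename_demo` path is purely illustrative:

```shell
# Create a scratch directory with dummy .proto files.
demo_dir=/tmp/proto_rename_demo
rm -rf "$demo_dir" && mkdir -p "$demo_dir" && cd "$demo_dir"
touch entity.proto query.proto
chmod +x entity.proto

# Same loop as the Makefile recipe: drop the executable bit and
# prefix each file with an underscore.
for filename in *.proto; do
  chmod -x "$filename"
  mv "$filename" "_$filename"
done

# The directory now holds _entity.proto and _query.proto.
ls "$demo_dir"
```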
2 changes: 1 addition & 1 deletion README.rst
@@ -70,7 +70,7 @@ writes, strong consistency for reads and ancestor queries, and eventual
consistency for all other queries.

.. _Cloud Datastore: https://cloud.google.com/datastore/docs
.. _Datastore API docs: https://cloud.google.com/datastore/docs/apis/v1beta2/
.. _Datastore API docs: https://cloud.google.com/datastore/docs/apis/v1beta3/

See the ``gcloud-python`` API `datastore documentation`_ to learn how to
interact with the Cloud Datastore using this Client Library.