From b9005396b318557b88870def7819570ed68f1687 Mon Sep 17 00:00:00 2001
From: Alena
Date: Mon, 13 Jan 2025 17:49:17 +0100
Subject: [PATCH] move example to advanced

---
 .../general-usage/credentials/advanced.md     | 27 +++++++++++++++++-
 .../docs/walkthroughs/add_credentials.md      | 28 -------------------
 2 files changed, 26 insertions(+), 29 deletions(-)

diff --git a/docs/website/docs/general-usage/credentials/advanced.md b/docs/website/docs/general-usage/credentials/advanced.md
index ef33e3f1d4..39882331b4 100644
--- a/docs/website/docs/general-usage/credentials/advanced.md
+++ b/docs/website/docs/general-usage/credentials/advanced.md
@@ -160,7 +160,32 @@ dlt.secrets["destination.postgres.credentials"] = BaseHook.get_connection('postg
 
 This will mock the TOML provider to desired values.
 
-### Example
+## Configuring destination credentials in code
+
+If you need to manage destination credentials programmatically, such as retrieving them from environment variables, you can define them directly in your pipeline code.
+
+The following example demonstrates how to use [GcpServiceAccountCredentials](complex_types#gcp-credentials) to set up a pipeline with a BigQuery destination.
+
+```py
+import os
+
+import dlt
+from dlt.sources.credentials import GcpServiceAccountCredentials
+from dlt.destinations import bigquery
+
+# Retrieve credentials from the environment variable
+creds_dict = os.getenv('BIGQUERY_CREDENTIALS')
+
+# Create credentials instance and parse them from a native representation
+gcp_credentials = GcpServiceAccountCredentials()
+gcp_credentials.parse_native_representation(creds_dict)
+
+# Pass the credentials to the BigQuery destination
+pipeline = dlt.pipeline(destination=bigquery(credentials=gcp_credentials))
+pipeline.run([{"key1": "value1"}], table_name="temp")
+```
+
+## Example
 
 In the example below, the `google_sheets` source function is used to read selected tabs from Google Sheets.
 It takes several arguments that specify the spreadsheet, the tab names, and the Google credentials to be used when extracting data.
diff --git a/docs/website/docs/walkthroughs/add_credentials.md b/docs/website/docs/walkthroughs/add_credentials.md
index 3dd1fa5066..34616bc154 100644
--- a/docs/website/docs/walkthroughs/add_credentials.md
+++ b/docs/website/docs/walkthroughs/add_credentials.md
@@ -74,34 +74,6 @@ DESTINATION__BIGQUERY__CREDENTIALS__PRIVATE_KEY
 DESTINATION__BIGQUERY__CREDENTIALS__CLIENT_EMAIL
 DESTINATION__BIGQUERY__LOCATION
 ```
-### Configuring destination credentials in code
-
-If you need to manage destination credentials programmatically, such as retrieving them from environment variables, you can define them directly in your pipeline code.
-
-The following example demonstrates how to use `GcpServiceAccountCredentials` to set up a pipeline with a BigQuery destination.
-
-```py
-import os
-
-import dlt
-from dlt.sources.credentials import GcpServiceAccountCredentials
-from dlt.destinations import bigquery
-
-# Retrieve credentials from the environment variable
-creds_dict = os.getenv('BIGQUERY_CREDENTIALS')
-
-# Create credentials instance and parse them from a native representation
-gcp_credentials = GcpServiceAccountCredentials()
-gcp_credentials.parse_native_representation(creds_dict)
-
-# Pass the credentials to the BigQuery destination
-pipeline = dlt.pipeline(destination=bigquery(credentials=gcp_credentials))
-pipeline.run([{"key1": "value1"}], table_name="temp")
-```
-
-In the example above, we retrieve service account credentials using `os.getenv`, parse them, and attach them to the pipeline's destination.
-
-For more details on GCP credentials, see [our documentation.](../general-usage/credentials/complex_types#gcp-credentials)
 
 ## Retrieving credentials from Google Cloud Secret Manager
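The snippet this patch moves reads a service-account key as a JSON string from the `BIGQUERY_CREDENTIALS` environment variable and hands it to `parse_native_representation`. As a minimal stdlib-only sketch of that pattern (so it can be tried without dlt installed), the parsing-and-validation step might look like the following; the `parse_service_account_json` helper and its `REQUIRED_FIELDS` check are illustrative assumptions, not dlt's actual implementation:

```python
import json
import os

# Illustrative stand-in for GcpServiceAccountCredentials.parse_native_representation:
# accept a JSON string (as stored in an env var) and check for fields that a
# GCP service-account key normally contains. Hypothetical helper, not dlt code.
REQUIRED_FIELDS = ("project_id", "private_key", "client_email")

def parse_service_account_json(native_value: str) -> dict:
    creds = json.loads(native_value)
    missing = [f for f in REQUIRED_FIELDS if f not in creds]
    if missing:
        raise ValueError(f"service account key is missing fields: {missing}")
    return creds

# Simulate the env var the moved example reads via os.getenv('BIGQUERY_CREDENTIALS')
os.environ["BIGQUERY_CREDENTIALS"] = json.dumps({
    "project_id": "example-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_email": "pipeline@example-project.iam.gserviceaccount.com",
})

creds = parse_service_account_json(os.getenv("BIGQUERY_CREDENTIALS"))
```

Note that `os.getenv` returns `None` when the variable is unset, so real pipeline code may want to fail fast with a clear message before handing the value to the credentials object.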