Updated credentials docs "Added section Destination configuration via Python" #2182

Merged · 8 commits · Jan 13, 2025
Changes from 4 commits
28 changes: 27 additions & 1 deletion docs/website/docs/general-usage/credentials/advanced.md
@@ -160,7 +160,7 @@ dlt.secrets["destination.postgres.credentials"] = BaseHook.get_connection('postg

This will mock the TOML provider to desired values.
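
For instance, here is a minimal sketch of assigning a secret at runtime from an environment variable (the variable name `PG_CONNECTION_URI` and the connection string format are placeholders, not part of dlt's API):

```py
import os

import dlt

# Assigning the value in code overrides what the TOML provider would
# otherwise read from secrets.toml for this key.
dlt.secrets["destination.postgres.credentials"] = os.environ["PG_CONNECTION_URI"]
```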

## Example
### Example

In the example below, the `google_sheets` source function is used to read selected tabs from Google Sheets.
It takes several arguments that specify the spreadsheet, the tab names, and the Google credentials to be used when extracting data.
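
As a rough illustration of the shape such a source function might take (the default values and the resource body below are assumptions for the sketch, not the documented implementation):

```py
from typing import Any, Dict, List

import dlt

@dlt.source
def google_sheets(
    spreadsheet_id: str = dlt.config.value,           # which spreadsheet to read
    tab_names: List[str] = dlt.config.value,          # which tabs to extract
    credentials: Dict[str, Any] = dlt.secrets.value,  # Google credentials injected by dlt
):
    # One resource per tab; a real implementation would call the Sheets API here.
    return [dlt.resource([], name=tab) for tab in tab_names]
```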
@@ -202,3 +202,29 @@ In the example above:
of a **source**)
:::

### Configuring destination credentials in code

You can also configure credentials directly for your destination in code. The following example demonstrates how to use `GcpServiceAccountCredentials` to set up a pipeline with a BigQuery destination.

```py
import os

import dlt
from dlt.sources.credentials import GcpServiceAccountCredentials
from dlt.destinations import bigquery

# Retrieve the service account info (a JSON string) from the environment variable
creds_dict = os.getenv('BIGQUERY_CREDENTIALS')

# Create a credentials instance and parse the native representation
gcp_credentials = GcpServiceAccountCredentials()
gcp_credentials.parse_native_representation(creds_dict)

# Pass the credentials to the BigQuery destination
pipeline = dlt.pipeline(destination=bigquery(credentials=gcp_credentials))
pipeline.run([{"key1": "value1"}], table_name="temp")
```

In the example above, we retrieve service account credentials using `os.getenv`, parse them, and attach them to the pipeline's destination.
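
Alternatively, the same service account JSON could be routed through `dlt.secrets`, mirroring the secrets assignment shown earlier on this page; this sketch assumes the destination resolves the JSON string the same way the explicit `parse_native_representation` call does:

```py
import os

import dlt

# Place the service account JSON under the key the BigQuery destination
# reads its credentials from, then refer to the destination by name.
dlt.secrets["destination.bigquery.credentials"] = os.getenv("BIGQUERY_CREDENTIALS")

pipeline = dlt.pipeline(destination="bigquery")
pipeline.run([{"key1": "value1"}], table_name="temp")
```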

For more details on GCP credentials, see [our documentation](../credentials/complex_types#gcp-credentials).