It seems that the Python SDK for Databricks allows uploading files.
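For reference, a minimal sketch of such an upload via the Databricks Python SDK Files API; the volume path is a placeholder and authentication is assumed to come from the ambient environment (notebook context or `DATABRICKS_*` variables):

```python
# Minimal sketch: upload a local file into a Unity Catalog volume.
# The volume path is hypothetical; auth comes from the environment.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

with open("data.parquet", "rb") as f:
    # Files API writes into a Unity Catalog volume path
    w.files.upload("/Volumes/main/default/staging/data.parquet", f, overwrite=True)
```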
Research whether it is possible to load files into tables the way we do with BigQuery, where a local file can be copied directly into a table without any stage.
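For comparison, this is the stage-less load BigQuery offers: a local file handle is streamed straight into a table via a load job (table and file names below are placeholders):

```python
# BigQuery's direct load: no bucket or stage, the file object is
# streamed into the table via a load job.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.PARQUET)

with open("data.parquet", "rb") as f:
    job = client.load_table_from_file(
        f, "my_project.my_dataset.my_table", job_config=job_config
    )
job.result()  # wait for the load job to finish
```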
If that does not work, research how to use Volumes on Databricks: copy files there and use COPY INTO to move them into the table.
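A sketch of that fallback path, assuming a SQL warehouse is available; the volume path, warehouse id, and table name are illustrative:

```python
# Sketch: copy the file into a volume, then let the SQL warehouse
# load it with COPY INTO. Paths and ids are assumptions.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
volume_path = "/Volumes/main/default/staging/data.parquet"

with open("data.parquet", "rb") as f:
    w.files.upload(volume_path, f, overwrite=True)

w.statement_execution.execute_statement(
    warehouse_id="<warehouse-id>",
    statement=f"""
        COPY INTO main.default.events
        FROM '{volume_path}'
        FILEFORMAT = PARQUET
    """,
)
```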
If authentication is not configured, enable default credentials (i.e. if present on serverless compute). You can take a look at how CredentialsWithDefault is used (most implementations check whether default credentials are present in def on_partial(self) -> None:, but in your case you should do it in on_resolved, when all fields holding credentials are empty).
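A hedged sketch of that hook, not dlt's actual Databricks spec: the field names and the notebook probe are illustrative, and the defaults are only tried when every explicit credential field is empty.

```python
# Illustrative only: probe default (notebook / serverless) credentials
# in on_resolved instead of on_partial, and only when no explicit
# credential fields were provided.
from typing import Optional

from dlt.common.configuration import configspec
from dlt.common.configuration.exceptions import ConfigurationValueError
from dlt.common.configuration.specs import CredentialsConfiguration, CredentialsWithDefault
from dlt.common.typing import TSecretStrValue


@configspec
class DatabricksCredentialsSketch(CredentialsConfiguration, CredentialsWithDefault):
    server_hostname: Optional[str] = None
    http_path: Optional[str] = None
    access_token: Optional[TSecretStrValue] = None

    def on_resolved(self) -> None:
        if self.server_hostname or self.http_path or self.access_token:
            return  # explicit credentials win
        try:
            # WorkspaceClient() picks up ambient auth when running on
            # Databricks compute (notebook, serverless) or from env vars
            from databricks.sdk import WorkspaceClient

            self._set_default_credentials(WorkspaceClient().config)
        except Exception:
            raise ConfigurationValueError(
                "No explicit Databricks credentials and no default context available"
            )
```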
The ideal scenario, when running in a Notebook, is that we can load a source (e.g. rest_api) without any additional configuration, staging, or authorization, just as we are able to do with duckdb.
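That target developer experience, sketched with dlt's rest_api source (the PokeAPI config is just an example): in a Databricks notebook this should run with no secrets.toml, no staging bucket, and no explicit authorization.

```python
# Desired zero-config experience in a Databricks notebook.
import dlt
from dlt.sources.rest_api import rest_api_source

source = rest_api_source({
    "client": {"base_url": "https://pokeapi.co/api/v2/"},
    "resources": ["pokemon"],
})

pipeline = dlt.pipeline(pipeline_name="rest_api_pokemon", destination="databricks")
print(pipeline.run(source))
```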
* databricks: enable local files
* fix: databricks test config
* work in progress
* added create and drop volume to interface
* refactor direct load authentication
* fix databricks volume file name
* refactor databricks direct loading
* format and lint
* revert config.toml changes
* force notebook auth
* enhanced config validations
* force exception
* fix config resolve
* remove imports
* test: config exceptions
* restore comments
* restored destination_config
* fix pokemon api values
* enables databricks no stage tests
* fix databricks config on_resolved
* adjusted direct load file management
* direct load docs
* filters by bucket when subset of destinations is set when creating test cases
* simpler file upload
* fix comment
* passes authentication directly from workspace, adds proper fingerprinting
* use real client_id in tests
* fixes config resolver to not pass NotResolved hints to config providers
---------
Co-authored-by: Marcin Rudolf <rudolfix@rudolfix.org>