Include the snowflake query id in the run results output #7
Hey @jsnb-devoted, thanks for opening this awesome issue, and so sorry for the delay getting back to you! Good news is, I think this is going to be quite straightforward to implement.

There's a carved-out place for just this sort of thing. We'll need to define a new subclass of `AdapterResponse`:

```python
@dataclass
class SnowflakeAdapterResponse(AdapterResponse):
    query_id: Optional[str] = None
```

Then, we can simply update the `get_response` classmethod to return it:

```python
@classmethod
def get_response(cls, cursor) -> SnowflakeAdapterResponse:
    code = cursor.sqlstate
    if code is None:
        code = "SUCCESS"
    return SnowflakeAdapterResponse(
        _message="{} {}".format(code, cursor.rowcount),
        rows_affected=cursor.rowcount,
        code=code,
        query_id=cursor.sfqid,
    )
```

It doesn't look like we have automated tests to verify the contents of the run results, so we should add some. One limitation to call out: for materializations that execute multiple queries, only the last query's id will be captured.
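Since automated coverage for this is missing, a unit test could exercise `get_response` with a fake cursor. This is a minimal sketch, not dbt's actual test harness: the stand-in dataclasses and the `SimpleNamespace` fake cursor below are assumptions for illustration, mimicking only the connector-cursor attributes (`sqlstate`, `rowcount`, `sfqid`) the method reads.

```python
from dataclasses import dataclass
from types import SimpleNamespace
from typing import Optional

# Stand-ins for dbt's classes, so the sketch runs without dbt installed.
@dataclass
class AdapterResponse:
    _message: str
    code: Optional[str] = None
    rows_affected: Optional[int] = None

@dataclass
class SnowflakeAdapterResponse(AdapterResponse):
    query_id: Optional[str] = None

def get_response(cursor) -> SnowflakeAdapterResponse:
    # Same logic as the proposed classmethod, as a free function for testing.
    code = cursor.sqlstate
    if code is None:
        code = "SUCCESS"
    return SnowflakeAdapterResponse(
        _message="{} {}".format(code, cursor.rowcount),
        rows_affected=cursor.rowcount,
        code=code,
        query_id=cursor.sfqid,
    )

# Fake cursor mimicking snowflake-connector-python's attributes.
fake_cursor = SimpleNamespace(sqlstate=None, rowcount=42, sfqid="01a2b3c4-0000")
resp = get_response(fake_cursor)
assert resp.query_id == "01a2b3c4-0000"
assert resp.code == "SUCCESS"
assert resp.rows_affected == 42
```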
@jtcohen6 thanks so much! I'll try to carve out some time to contribute. I took a quick look at the BigQuery implementation and it looks pretty straightforward. I hear you re: the multiple query materializations. The macro we wrote to log the queries had the same limitation.
Describe the feature
As an analytics engineer I want a quick way to get the query id of a model run (or even test/seed/snapshot) so I can quickly see the query in the snowflake UI
Describe alternatives you've considered
We've implemented a macro that runs in the post-hook, gets the full URL to the query, and logs it to the console. This works great, except that we have to call
SELECT last_query_id()
on every model run, and, more importantly, the hook does not execute if the query fails. I acknowledge that the compiled SQL is available as an alternative, but we run Airflow on Kubernetes so the artifacts are not readily available. We do persist the artifacts in S3, but I think it would be an improved experience to go from the Airflow log to the Snowflake UI in one click.
Additional context
I can see that the Snowflake query id is already logged for failed queries if you enable debug mode, but that output is far too verbose for this use case. I would want just the query URL/id logged on every run, so that a user can go from log => Snowflake UI in the fewest steps possible.
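For the one-click jump from log to UI, the logged query id could be interpolated into a console URL. A minimal sketch, with the caveats that the URL pattern below follows the classic Snowflake web console (Snowsight uses a different shape) and the account locator and query id are hypothetical values for illustration:

```python
def snowflake_query_url(account: str, query_id: str) -> str:
    # Classic console URL pattern; may differ for Snowsight or other editions.
    return (
        f"https://{account}.snowflakecomputing.com"
        f"/console#/monitoring/queries/detail?queryId={query_id}"
    )

# Hypothetical account locator and query id, for illustration only.
print(snowflake_query_url("xy12345.us-east-1", "01a2b3c4-0000"))
```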
Who will this benefit?
I think this would benefit all dbt-snowflake users who use the Snowflake UI for debugging and performance monitoring.
Are you interested in contributing this feature?
I wouldn't mind contributing but I would need some guidance on what option is most amenable to dbt's design. I'm not sure if the plugins are meant to interact with the run results functionality in dbt core.