Replies: 1 comment 1 reply
Hey @sonic-x11 -- that seems like a good feature request if you'd like to open an issue. In the interim, here is a (hacky) workaround: if you have your expression in a variable `t`, then

```python
spark_df = con._session.sql(con.compile(t))
```

gives you back a native PySpark DataFrame.
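The workaround in the reply can be wrapped in a small helper so the private-attribute access lives in one place. A minimal sketch: `to_pyspark_df` is a hypothetical name, and it assumes `con.compile(expr)` returns a SQL string and that the backend's underlying SparkSession is reachable via the private `_session` attribute (an implementation detail that may change between Ibis releases):

```python
def to_pyspark_df(con, expr):
    """Turn an Ibis expression into a native PySpark DataFrame.

    Hacky: `con._session` is a private attribute of the Ibis PySpark
    backend, so this may break in future Ibis releases.
    """
    # Compile the Ibis expression to a SQL string, then run it on the
    # backend's underlying SparkSession to get a pyspark DataFrame.
    sql = con.compile(expr)
    return con._session.sql(sql)
```

Keeping the hack behind one function means only this helper needs updating if a supported `to_pyspark()` conversion lands later.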
-
I have created an Ibis connection using the PySpark backend, created a table, and applied some filters. Now, I want to get the result as a PySpark DataFrame, but the available options for conversion are Polars, Arrow, and Pandas DataFrames. Thank you!