Problem Description
It has been reported that the connector is creating multiple JDBC sessions. The connector should open only a constant number of JDBC connections on the driver node to prepare for a read or write; the number of connections should not scale with the number of executors.
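For context, a typical read through the connector from pyspark looks roughly like the sketch below. This is an illustrative reproduction setup, not taken from the report: the host, credentials, table, and staging URL are placeholders, and the option names follow the v3.x connector's documented options.

```python
# Hypothetical reproduction sketch for a single connector read from pyspark.
# All values below are placeholders -- adjust for your environment.
vertica_opts = {
    "host": "vertica.example.com",   # placeholder Vertica host
    "user": "dbadmin",               # placeholder user
    "db": "testdb",                  # placeholder database
    "password": "",                  # placeholder password
    "table": "my_table",             # placeholder table
    # Intermediate filestore used by the v3.x connector (placeholder URL):
    "staging_fs_url": "webhdfs://hdfs.example.com:50070/data/",
}

# With a live SparkSession, the read would be issued as:
# df = (spark.read
#       .format("com.vertica.spark.datasource.VerticaSource")
#       .options(**vertica_opts)
#       .load())
```

While such a read runs, the session-monitoring queries below can be used to watch how many JDBC sessions the connector actually opens.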
Environment
Spark version: N/A
Hadoop version: N/A
Vertica version: N/A
Vertica Spark Connector version: 3.2.1
Java version: N/A
Additional Environment Information: pyspark on Jupyter client
FYI: when testing connections to Vertica, you can run any of the following queries in vsql to monitor the number of open sessions (note that the monitoring query itself creates one session):
$ docker exec -it docker_vertica_1 vsql
> SELECT COUNT(*) FROM v_monitor.sessions;
> SELECT session_id, user_name, current_statement, last_statement, statement_start FROM v_monitor.sessions;
> SELECT session_id, node_name, user_name, client_hostname, login_timestamp, statement_start, current_statement, last_statement, client_type, client_os FROM v_monitor.sessions ORDER BY user_name;
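The same check can be scripted, e.g. to sample the session count before and after a connector read and confirm it stays constant as executors are added. The sketch below assumes a DB-API-style cursor such as one from the vertica-python client; connection setup is omitted.

```python
# Sketch: sample the open-session count from v_monitor.sessions around a
# connector read to verify sessions do not scale with executor count.
# Assumes a DB-API cursor (e.g. from the vertica-python client).

SESSION_COUNT_SQL = "SELECT COUNT(*) FROM v_monitor.sessions;"

def open_session_count(cursor):
    """Return the number of open sessions reported by v_monitor.sessions.

    The monitoring connection itself holds one session, so compare deltas
    taken through the same connection rather than absolute counts.
    """
    cursor.execute(SESSION_COUNT_SQL)
    return cursor.fetchone()[0]

def session_delta(before, after):
    """Extra sessions opened between two samples.

    The monitoring session appears in both samples and cancels out.
    """
    return after - before
```

Usage: call `open_session_count` once before starting the Spark read, again while it runs, and check that `session_delta` stays the same when you scale the number of executors up.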