[Feature][Connector-v2] Add Snowflake Source&Sink connector #4470

Merged · 29 commits · May 25, 2023
Commits (29):
- 60b8cca add snowflake connector (HaoXuAI, Apr 1, 2023)
- 4333c99 add snowflake connector (HaoXuAI, Apr 1, 2023)
- d0b560f add test (HaoXuAI, Apr 3, 2023)
- 5a28622 update README.md (HaoXuAI, Apr 3, 2023)
- 0686615 unchange gitignore (HaoXuAI, Apr 4, 2023)
- 4856b97 Merge branch 'dev' into snowflake-connector (HaoXuAI, Apr 10, 2023)
- 7818c62 update connector pom (HaoXuAI, Apr 13, 2023)
- 9796170 update doc (HaoXuAI, Apr 13, 2023)
- 06c950e Merge branch 'dev' into snowflake-connector (HaoXuAI, Apr 17, 2023)
- f852273 update ci (HaoXuAI, Apr 25, 2023)
- b323cac Merge branch 'snowflake-connector' of https://github.com/HaoXuAI/incu… (HaoXuAI, Apr 25, 2023)
- 197e140 update type (HaoXuAI, May 1, 2023)
- 0141dbb Merge branch 'dev' into snowflake-connector (HaoXuAI, May 1, 2023)
- b4bd5be update type (HaoXuAI, May 2, 2023)
- 8537c9d update doc (HaoXuAI, May 2, 2023)
- 2a4a5f9 update test (HaoXuAI, May 3, 2023)
- 60fd3fe update type (HaoXuAI, May 7, 2023)
- df70208 update test (HaoXuAI, May 10, 2023)
- 7635de8 update doc (HaoXuAI, May 10, 2023)
- 3e203e7 Merge branch 'dev' into snowflake-connector (HaoXuAI, May 11, 2023)
- c224f49 fix typo (HaoXuAI, May 11, 2023)
- 98f4ecf update geo type (HaoXuAI, May 12, 2023)
- 018437c update doc (HaoXuAI, May 13, 2023)
- 182b523 Merge branch 'dev' into snowflake-connector (EricJoy2048, May 16, 2023)
- 16bbef1 Update docs/en/connector-v2/sink/Snowflake.md (EricJoy2048, May 16, 2023)
- 8d49407 fix code style (HaoXuAI, May 17, 2023)
- d0896db Merge branch 'snowflake-connector' of https://github.com/HaoXuAI/incu… (HaoXuAI, May 17, 2023)
- 917d878 update doc (HaoXuAI, May 19, 2023)
- c72ede9 Merge branch 'dev' into snowflake-connector (hailin0, May 21, 2023)
docs/en/connector-v2/sink/Jdbc.md (1 addition, 0 deletions)

@@ -161,6 +161,7 @@ there are some reference value for params above.
| Doris | com.mysql.cj.jdbc.Driver | jdbc:mysql://localhost:3306/test | / | https://mvnrepository.com/artifact/mysql/mysql-connector-java |
| teradata | com.teradata.jdbc.TeraDriver | jdbc:teradata://localhost/DBS_PORT=1025,DATABASE=test | / | https://mvnrepository.com/artifact/com.teradata.jdbc/terajdbc |
| Redshift | com.amazon.redshift.jdbc42.Driver | jdbc:redshift://localhost:5439/testdb | com.amazon.redshift.xa.RedshiftXADataSource | https://mvnrepository.com/artifact/com.amazon.redshift/redshift-jdbc42 |
| Snowflake | net.snowflake.client.jdbc.SnowflakeDriver | jdbc:snowflake://<account_name>.snowflakecomputing.com | / | https://mvnrepository.com/artifact/net.snowflake/snowflake-jdbc |
| Vertica | com.vertica.jdbc.Driver | jdbc:vertica://localhost:5433 | / | https://repo1.maven.org/maven2/com/vertica/jdbc/vertica-jdbc/12.0.3-0/vertica-jdbc-12.0.3-0.jar |

## Example
docs/en/connector-v2/sink/Snowflake.md (new file, 144 additions)

@@ -0,0 +1,144 @@
# Snowflake

> JDBC Snowflake Sink Connector
>
> ## Support Those Engines
>
> Spark<br/>
> Flink<br/>
> SeaTunnel Zeta<br/>
>
## Key features

- [ ] [exactly-once](../../concept/connector-v2-features.md)
- [x] [cdc](../../concept/connector-v2-features.md)

## Description

Write data through JDBC. Supports batch mode and streaming mode, and supports concurrent writing.

## Supported DataSource list

| datasource | supported versions | driver | url | maven |
|------------|---------------------------------------------------------------|-------------------------------------------|--------------------------------------------------------|------------------------------------------------------------------------------|
| snowflake | Different dependency versions have different driver classes. | net.snowflake.client.jdbc.SnowflakeDriver | jdbc:snowflake://<account_name>.snowflakecomputing.com | [Download](https://mvnrepository.com/artifact/net.snowflake/snowflake-jdbc) |

## Database dependency

> Please download the driver jar from the 'Maven' link in the support list above and copy it to the '$SEATUNNEL_HOME/plugins/jdbc/lib/' working directory<br/>
> For example, for the Snowflake datasource: cp snowflake-jdbc-xxx.jar $SEATUNNEL_HOME/plugins/jdbc/lib/
>
## Data Type Mapping

| Snowflake Data type | Seatunnel Data type |
|-----------------------------------------------------------------------------|---------------------|
| BOOLEAN | BOOLEAN |
| TINYINT<br/>SMALLINT<br/>BYTEINT | SHORT |
| INT<br/>INTEGER<br/> | INT |
| BIGINT | LONG |
| DECIMAL<br/>NUMERIC<br/>NUMBER<br/> | DECIMAL(x,y) |
| DECIMAL(x,y) (where the designated column's precision x exceeds 38) | DECIMAL(38,18) |
| REAL<br/>FLOAT4 | FLOAT |
| DOUBLE<br/>DOUBLE PRECISION<br/>FLOAT8<br/>FLOAT<br/> | DOUBLE |
| CHAR<br/>CHARACTER<br/>VARCHAR<br/>STRING<br/>TEXT<br/>VARIANT<br/>OBJECT | STRING |
| DATE | DATE |
| TIME | TIME |
| DATETIME<br/>TIMESTAMP<br/>TIMESTAMP_LTZ<br/>TIMESTAMP_NTZ<br/>TIMESTAMP_TZ | TIMESTAMP |
| BINARY<br/>VARBINARY<br/>GEOGRAPHY<br/>GEOMETRY | BYTES |

## Options

| name | type | required | default value | description |
|-------------------------------------------|---------|----------|---------------|-------------|
| url | String | Yes | - | The URL of the JDBC connection, for example: jdbc:snowflake://<account_name>.snowflakecomputing.com |
| driver | String | Yes | - | The JDBC class name used to connect to the remote data source;<br/> for Snowflake the value is `net.snowflake.client.jdbc.SnowflakeDriver`. |
| user | String | No | - | Connection instance user name |
| password | String | No | - | Connection instance password |
| query | String | No | - | Use this SQL to write upstream input data to the database, e.g. `INSERT ...`; `query` has the higher priority |
| database | String | No | - | Use this `database` and `table-name` to auto-generate SQL and write upstream input data to the database.<br/>This option is mutually exclusive with `query` and has a higher priority. |
| table | String | No | - | Use `database` and this table name to auto-generate SQL and write upstream input data to the database.<br/>This option is mutually exclusive with `query` and has a higher priority. |
| primary_keys | Array | No | - | This option is used to support `insert`, `delete`, and `update` operations when SQL is automatically generated. |
| support_upsert_by_query_primary_key_exist | Boolean | No | false | Choose to use INSERT or UPDATE SQL to process update events (INSERT, UPDATE_AFTER) based on whether the queried primary key exists. This configuration is only used when the database does not support upsert syntax. **Note**: this method has low performance |
| connection_check_timeout_sec | Int | No | 30 | The time in seconds to wait for the database operation used to validate the connection to complete. |
| max_retries | Int | No | 0 | The number of retries for a failed submission (executeBatch) |
| batch_size | Int | No | 1000 | For batch writing, when the number of buffered records reaches `batch_size` or the time reaches `batch_interval_ms`, the data will be flushed into the database |
| batch_interval_ms | Int | No | 1000 | For batch writing, when the number of buffered records reaches `batch_size` or the time reaches `batch_interval_ms`, the data will be flushed into the database |
| max_commit_attempts | Int | No | 3 | The number of retries for transaction commit failures |
| transaction_timeout_sec | Int | No | -1 | The timeout in seconds after the transaction is opened; the default is -1 (never time out). Note that setting a timeout may affect<br/>exactly-once semantics |
| auto_commit | Boolean | No | true | Automatic transaction commit is enabled by default |
| common-options | | No | - | Sink plugin common parameters, please refer to [Sink Common Options](common-options.md) for details |
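
A minimal sketch of how the batch-tuning options above combine in a sink block (the account name, credentials, target table, and retry count are illustrative placeholders, not values from this PR):

```
sink {
  jdbc {
    url = "jdbc:snowflake://<account_name>.snowflakecomputing.com"
    driver = "net.snowflake.client.jdbc.SnowflakeDriver"
    user = "root"
    password = "123456"
    query = "insert into test_table(name,age) values(?,?)"

    # flush a buffered batch when either threshold is reached
    batch_size = 1000
    batch_interval_ms = 1000

    # retry a failed executeBatch submission a few times before failing the task
    max_retries = 3
  }
}
```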

## Tips

> If `partition_column` is not set, it will run with single concurrency; if `partition_column` is set, it will be executed in parallel according to the task concurrency.
>
## Task Example

### Simple

> This example defines a SeaTunnel synchronization task that automatically generates data through FakeSource and sends it to the JDBC sink. FakeSource generates a total of 16 rows of data (row.num=16), each row having two fields, name (string type) and age (int type). The target table test_table will then also contain 16 rows. Before running this job, you need to create the database test and the table test_table in your Snowflake database. If you have not yet installed and deployed SeaTunnel, follow the instructions in [Install SeaTunnel](../../start-v2/locally/deployment.md) to install and deploy it, and then follow the instructions in [Quick Start With SeaTunnel Engine](../../start-v2/locally/quick-start-seatunnel-engine.md) to run this job.
>
> ```
> # Defining the runtime environment
> env {
>   # You can set flink configuration here
>   execution.parallelism = 1
>   job.mode = "BATCH"
> }
> source {
>   # This is an example source plugin **only for testing and demonstrating the feature source plugin**
>   FakeSource {
>     parallelism = 1
>     result_table_name = "fake"
>     row.num = 16
>     schema = {
>       fields {
>         name = "string"
>         age = "int"
>       }
>     }
>   }
>   # If you would like to get more information about how to configure seatunnel and see full list of source plugins,
>   # please go to https://seatunnel.apache.org/docs/category/source-v2
> }
> transform {
>   # If you would like to get more information about how to configure seatunnel and see full list of transform plugins,
>   # please go to https://seatunnel.apache.org/docs/category/transform-v2
> }
> sink {
>   jdbc {
>     url = "jdbc:snowflake://<account_name>.snowflakecomputing.com"
>     driver = "net.snowflake.client.jdbc.SnowflakeDriver"
>     user = "root"
>     password = "123456"
>     query = "insert into test_table(name,age) values(?,?)"
>   }
>   # If you would like to get more information about how to configure seatunnel and see full list of sink plugins,
>   # please go to https://seatunnel.apache.org/docs/category/sink-v2
> }
> ```

### CDC (Change Data Capture) Event

> CDC change data is also supported. In this case, you need to configure `database`, `table`, and `primary_keys`.
>
> ```
> jdbc {
>   url = "jdbc:snowflake://<account_name>.snowflakecomputing.com"
>   driver = "net.snowflake.client.jdbc.SnowflakeDriver"
>   user = "root"
>   password = "123456"
>
>   # You need to configure both database and table
>   database = test
>   table = sink_table
>   primary_keys = ["id","name"]
> }
> ```
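
> If the database cannot rely on native upsert syntax, the `support_upsert_by_query_primary_key_exist` option from the table above can be added to the same configuration. A minimal sketch, assuming `id` and `name` form the primary key (all values are placeholders):
>
> ```
> jdbc {
>   url = "jdbc:snowflake://<account_name>.snowflakecomputing.com"
>   driver = "net.snowflake.client.jdbc.SnowflakeDriver"
>   user = "root"
>   password = "123456"
>   database = test
>   table = sink_table
>   primary_keys = ["id","name"]
>
>   # query by primary key first, then issue INSERT or UPDATE accordingly;
>   # as noted in the options table, this path has low performance
>   support_upsert_by_query_primary_key_exist = true
> }
> ```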

docs/en/connector-v2/source/Jdbc.md (1 addition, 0 deletions)

@@ -117,6 +117,7 @@ there are some reference value for params above.
| saphana | com.sap.db.jdbc.Driver | jdbc:sap://localhost:39015 | https://mvnrepository.com/artifact/com.sap.cloud.db.jdbc/ngdbc |
| doris | com.mysql.cj.jdbc.Driver | jdbc:mysql://localhost:3306/test | https://mvnrepository.com/artifact/mysql/mysql-connector-java |
| teradata | com.teradata.jdbc.TeraDriver | jdbc:teradata://localhost/DBS_PORT=1025,DATABASE=test | https://mvnrepository.com/artifact/com.teradata.jdbc/terajdbc |
| Snowflake | net.snowflake.client.jdbc.SnowflakeDriver | jdbc:snowflake://<account_name>.snowflakecomputing.com | https://mvnrepository.com/artifact/net.snowflake/snowflake-jdbc |
| Redshift | com.amazon.redshift.jdbc42.Driver | jdbc:redshift://localhost:5439/testdb?defaultRowFetchSize=1000 | https://mvnrepository.com/artifact/com.amazon.redshift/redshift-jdbc42 |
| Vertica | com.vertica.jdbc.Driver | jdbc:vertica://localhost:5433 | https://repo1.maven.org/maven2/com/vertica/jdbc/vertica-jdbc/12.0.3-0/vertica-jdbc-12.0.3-0.jar |
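
A minimal source-side sketch using the Snowflake row above (account name, credentials, and table name are illustrative placeholders, not values from this change):

```
source {
  Jdbc {
    url = "jdbc:snowflake://<account_name>.snowflakecomputing.com"
    driver = "net.snowflake.client.jdbc.SnowflakeDriver"
    user = "root"
    password = "123456"
    query = "select * from test_table"
  }
}
```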
