
The connector using Debezium and Kafka does not create tables and does not report any errors. #930

Closed
DmitryTuryshev opened this issue Nov 22, 2024 · 3 comments
Labels: dev-complete (Development completed), kafka (Issues related to Kafka version), qa-verified (verified by QA)


DmitryTuryshev commented Nov 22, 2024

I'm trying to write messages from Kafka to ClickHouse. The messages are being received via Debezium, and I am using the image altinity/clickhouse-sink-connector:2.4.0-kafka.

Problem: The connector shows zero lag, and offsets exist, but no logs are available in the container, and data is not being written to ClickHouse.

Kafka Offsets
The offsets exist, but no processing is happening.
[Screenshot: consumer group offsets exist, lag is zero]

Configuration File
Here is my configuration for the connector:
{
  "name": "clickhouse-sink-connector-10",
  "config": {
    "connector.class": "com.altinity.clickhouse.sink.connector.ClickHouseSinkConnector",
    "tasks.max": "1",
    "topics": "pg-source-1.public.t1",
    "topics.regex": "pg-source-1.*",
    "clickhouse.server.url": "http://clickhouse",
    "clickhouse.server.user": "default",
    "clickhouse.server.password": "",
    "clickhouse.topic2table.map": "pg-source-1.public.t1:default.t1",
    "metrics.enable": "false",
    "auto.create.tables": "true",
    "auto.create.tables.replicated": "false",
    "schema.evolution": "true",
    "snowflake.id": "false",
    "replacingmergetree.delete.column": "id",
    "enable.kafka.offset": "true",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": "true",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true"
  }
}
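One thing worth checking in the configuration above: in Kafka Connect, `topics` and `topics.regex` are mutually exclusive for sink connectors, and a config that sets both is normally rejected at validation time. A possible fix is to keep only one of them, for example the explicit topic list (a suggestion to try, not a confirmed root cause):

```json
"topics": "pg-source-1.public.t1",
"clickhouse.topic2table.map": "pg-source-1.public.t1:default.t1"
```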

Observations
The messages from Debezium are being received successfully:
[Screenshot: Debezium change events present in the Kafka topic]

Troubleshooting Steps Already Tried

  1. Verified that messages are present in Kafka topics.
  2. Checked for container logs, but no output is present.
  3. Confirmed that offsets and lags are being updated correctly.

Let me know if additional details are needed, or if you have any suggestions!
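For anyone debugging the mapping side of a setup like this: the `clickhouse.topic2table.map` value uses a comma-separated `topic:db.table` format. A minimal sketch (not the connector's actual parser) of how such a map resolves a topic to a target table:

```python
# Sketch: parse a "topic:db.table,topic2:db.table2" mapping string, as used
# in the clickhouse.topic2table.map setting shown in the config above.
def parse_topic2table_map(raw: str) -> dict:
    """Parse 'topic:db.table,...' into {topic: 'db.table'}."""
    mapping = {}
    for pair in raw.split(","):
        if not pair.strip():
            continue
        # Split at the first ':' only; topic names may contain dots.
        topic, _, table = pair.strip().partition(":")
        mapping[topic.strip()] = table.strip()
    return mapping

m = parse_topic2table_map("pg-source-1.public.t1:default.t1")
print(m["pg-source-1.public.t1"])  # default.t1
```

If a topic is missing from this map (or the map entry has a typo), a sink connector typically has nothing to route the records to, which can look exactly like "offsets advance but no tables appear".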

@subkanthi subkanthi added this to the 2.5.0 milestone Nov 26, 2024
subkanthi (Collaborator) commented:

https://groups.google.com/g/debezium/c/l9uFOwUm7XQ

In MySQL 8.4, the deprecated `SHOW MASTER STATUS` statement was removed (replaced by `SHOW BINARY LOG STATUS`), so Debezium fails with:

debezium          | Caused by: java.sql.SQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'MASTER STATUS' at line 1

subkanthi (Collaborator) commented:

[INFO ] 2024-12-04 20:39:27.483 [Task: Sink Connector thread-pool-0askId] PreparedStatementExecutor - *** INSERT QUERY for Database(datatypes) ***: insert into `binary_types_LONGBLOB`(`Type`,`Value`,`Null_Value`,`_version`,`_sign`) select `Type`,`Value`,`Null_Value`,`_version`,`_sign` from input('`Type` String,`Value` String,`Null_Value` Nullable(String),`_version` UInt64,`_sign` UInt8')
[INFO ] 2024-12-04 20:39:27.492 [Task: Sink Connector thread-pool-0askId] PreparedStatementExecutor - *** INSERT QUERY for Database(datatypes) ***: insert into `binary_types_BLOB`(`Type`,`Value`,`Null_Value`,`_version`,`_sign`) select `Type`,`Value`,`Null_Value`,`_version`,`_sign` from input('`Type` String,`Value` String,`Null_Value` Nullable(String),`_version` UInt64,`_sign` UInt8')
[INFO ] 2024-12-04 20:39:27.645 [Task: Sink Connector thread-pool-0askId] PreparedStatementExecutor - *************** EXECUTED BATCH Successfully Records: 1************** task(2) Thread ID: Sink Connector thread-pool-0 Result: [I@16ef1f16 Database: datatypes Table: binary_types_BINARY
[INFO ] 2024-12-04 20:39:27.645 [Task: Sink Connector thread-pool-0askId] PreparedStatementExecutor - *************** EXECUTED BATCH Successfully Records: 1************** task(3) Thread ID: Sink Connector thread-pool-0 Result: [I@87068e2 Database: datatypes Table: binary_types_BLOB
[INFO ] 2024-12-04 20:39:27.647 [Task: Sink Connector thread-pool-0askId] PreparedStatementExecutor - *************** EXECUTED BATCH Successfully Records: 1************** task(5) Thread ID: Sink Connector thread-pool-0 Result: [I@2da69dba Database: datatypes Table: binary_types_LONGBLOB
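The INSERT statements in the log above follow ClickHouse's `INSERT ... SELECT ... FROM input(...)` pattern. A minimal sketch (illustrative, not the connector's code) of how such a statement can be built from a column-to-type map:

```python
# Sketch: build an "insert into ... select ... from input('...')" statement
# in the same shape as the queries logged above. Column names and types
# here are taken from the binary_types_BLOB log line.
def build_insert_query(table: str, columns: dict) -> str:
    """columns maps column name -> ClickHouse type string."""
    names = ",".join(f"`{c}`" for c in columns)
    schema = ",".join(f"`{c}` {t}" for c, t in columns.items())
    return (f"insert into `{table}`({names}) "
            f"select {names} from input('{schema}')")

q = build_insert_query("binary_types_BLOB", {
    "Type": "String",
    "Value": "String",
    "Null_Value": "Nullable(String)",
    "_version": "UInt64",
    "_sign": "UInt8",
})
print(q)
```

The `input()` table function lets the client stream rows with an explicit schema, which is why the extra `_version` and `_sign` columns (used by ReplacingMergeTree versioning and deletes) appear in every query.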

@subkanthi subkanthi added dev-complete Development completed kafka Issues related to Kafka version labels Dec 4, 2024
@Selfeer Selfeer added the qa-verified label to mark issues that were verified by QA label Dec 5, 2024

DmitryTuryshev commented Dec 9, 2024

I set up Kafka Connect with Debezium to extract data from the source and the Altinity sink connector to write it to ClickHouse. I configured everything: Kafka, the Debezium Postgres connector, and the Altinity sink connector. After launching, I expected data to flow from the source through Kafka into ClickHouse; instead, I only see offsets advancing in Kafka, with no data in the target database.

I checked the logs of all components, and each of them appears to be running without errors. Nevertheless, the data that should have been written to ClickHouse via the Altinity connector never appeared. Perhaps the issue lies in the sink connector configuration.
