diff --git a/docs/guide/src/docs/asciidoc/_links.adoc b/docs/guide/src/docs/asciidoc/_links.adoc
index d0fe1e8..1df5fa6 100644
--- a/docs/guide/src/docs/asciidoc/_links.adoc
+++ b/docs/guide/src/docs/asciidoc/_links.adoc
@@ -1,4 +1,3 @@
-:link_releases: link:https://github.com/redis-field-engineering/redis-kafka-connect/releases[releases page]
:link_redis_enterprise: link:https://redis.com/redis-enterprise-software/overview/[Redis Enterprise]
:link_lettuce_uri: link:https://github.com/lettuce-io/lettuce-core/wiki/Redis-URI-and-connection-details#uri-syntax[Redis URI Syntax]
:link_redis_notif: link:https://redis.io/docs/manual/keyspace-notifications[Redis Keyspace Notifications]
diff --git a/docs/guide/src/docs/asciidoc/docker.adoc b/docs/guide/src/docs/asciidoc/docker.adoc
deleted file mode 100644
index 3926e63..0000000
--- a/docs/guide/src/docs/asciidoc/docker.adoc
+++ /dev/null
@@ -1,55 +0,0 @@
-[[_docker]]
-= Quick Start with Docker
-
-This section provides a hands-on look at the functionality of the Redis Kafka Source and Sink Connectors:
-
-* The *redis-sink* connector reads data from a Kafka topic and writes it to a Redis stream
-* The *redis-source* connector reads data from a Redis stream and writes it to a Kafka topic
-
-== Requirements
-
-https://docs.docker.com/get-docker/[Docker]
-
-== Run the example
-
-Clone the link:{project-scm}[github repository] and execute `run.sh` in `docker` directory:
-
-[source,console,subs="attributes"]
-----
-git clone {project-scm}
-./run.sh
-----
-
-This will:
-
-* Run `docker-compose up`
-* Wait for Redis, Kafka, and Kafka Connect to be ready
-* Register the Confluent Datagen Connector
-* Register the Redis Kafka Sink Connector
-* Register the Redis Kafka Source Connector
-* Publish some events to Kafka via the Datagen connector
-* Write the events to Redis
-* Send messages to a Redis stream
-* Write the Redis stream messages back into Kafka
-
-Once running, examine the topics in the Kafka control center: http://localhost:9021/
-
-* The `pageviews` topic should contain the 10 simple documents added, each similar to:
-
-[source,json]
-----
-include::{includedir}/../resources/pageviews.json[]
-----
-
-* The `pageviews` stream should contain the 10 change events.
-
-Examine the stream in Redis:
-[source,console]
-----
-docker-compose exec redis /usr/local/bin/redis-cli
-xread COUNT 10 STREAMS pageviews 0
-----
-
-Messages added to the `mystream` stream will show up in the `mystream` topic
-
-
diff --git a/docs/guide/src/docs/asciidoc/index.adoc b/docs/guide/src/docs/asciidoc/index.adoc
index 33677c5..94e93d5 100644
--- a/docs/guide/src/docs/asciidoc/index.adoc
+++ b/docs/guide/src/docs/asciidoc/index.adoc
@@ -6,12 +6,11 @@

include::{includedir}/_links.adoc[]

-:leveloffset: +1
-include::{includedir}/introduction.adoc[]
+:leveloffset: 1
+include::{includedir}/overview.adoc[]
+include::{includedir}/quickstart.adoc[]
include::{includedir}/install.adoc[]
include::{includedir}/connect.adoc[]
include::{includedir}/sink.adoc[]
include::{includedir}/source.adoc[]
-include::{includedir}/docker.adoc[]
include::{includedir}/resources.adoc[]
-:leveloffset: -1
diff --git a/docs/guide/src/docs/asciidoc/install.adoc b/docs/guide/src/docs/asciidoc/install.adoc
index dea021f..b984dde 100644
--- a/docs/guide/src/docs/asciidoc/install.adoc
+++ b/docs/guide/src/docs/asciidoc/install.adoc
@@ -5,7 +5,7 @@ Select one of the methods below to install {project-title}.
== Download

-Download the latest release archive from the link:{project-url}/releases[releases page].
+Download the latest release archive from the https://github.com/{github-owner}/{github-repo}/releases[releases page].

== Confluent Hub

diff --git a/docs/guide/src/docs/asciidoc/introduction.adoc b/docs/guide/src/docs/asciidoc/overview.adoc
similarity index 63%
rename from docs/guide/src/docs/asciidoc/introduction.adoc
rename to docs/guide/src/docs/asciidoc/overview.adoc
index 0f23c00..5300c98 100644
--- a/docs/guide/src/docs/asciidoc/introduction.adoc
+++ b/docs/guide/src/docs/asciidoc/overview.adoc
@@ -1,7 +1,7 @@
-[[_introduction]]
-= Introduction
+[[_overview]]
+= Overview

-{project-title} is used to import and export data between Apache Kafka and Redis.
+{project-title} is a Confluent-verified connector that stores data from Kafka topics into Redis and pushes data from Redis into Kafka topics.

image:redis-kafka-connector.svg[]
diff --git a/docs/guide/src/docs/asciidoc/quickstart.adoc b/docs/guide/src/docs/asciidoc/quickstart.adoc
new file mode 100644
index 0000000..fdec487
--- /dev/null
+++ b/docs/guide/src/docs/asciidoc/quickstart.adoc
@@ -0,0 +1,226 @@
+[[_quick_start]]
+= Quick Start
+
+This section shows how to configure the {project-title} to import/export data between Redis and Apache Kafka, and provides a hands-on look at the functionality of the source and sink connectors.
+
+== Requirements
+
+Download and install the following software:
+
+* https://docs.docker.com/get-docker/[Docker]
+* https://git-scm.com/book/en/v2/Getting-Started-Installing-Git[Git]
+
+== Start the Sandbox
+
+The sandbox starts the following Docker services:
+
+* Redis Stack
+* Apache Kafka
+* Kafka Connect with the {project-title} installed
+
+To start the sandbox, run the following command:
+
+`docker compose up`
+
+After Docker downloads and starts the services, you should see the following output:
+
+[source,console]
+-----
+[+] Running 8/0
+ ✔ Container redis            Created
+ ✔ Container zookeeper        Created
+ ✔ Container broker           Created
+ ✔ Container schema-registry  Created
+ ✔ Container rest-proxy       Created
+ ✔ Container connect          Created
+ ✔ Container ksqldb-server    Created
+ ✔ Container control-center   Created
+-----
+
+== Add Connectors
+
+Now that the required services are up and running, we can add connectors to Kafka Connect to transfer data between Redis and Kafka:
+
+* Add a sink connector to transfer data from Kafka to Redis
+* Add a source connector to transfer data from Redis to Kafka
+
+=== Add a Datagen Connector
+
+https://github.com/confluentinc/kafka-connect-datagen/[Kafka Connect Datagen] is a Kafka Connect source connector for generating mock data.
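+
+Before creating the Datagen connector, you can optionally confirm that the plugin is available on the Connect worker. This sanity check queries the Connect REST API, which this sandbox exposes on port 8083:
+
+[source,console]
+-----
+curl -s http://localhost:8083/connector-plugins | grep -i datagen
+-----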
+
+Create the Datagen connector with the following command:
+
+[source,console]
+-----
+curl -X POST -H "Content-Type: application/json" --data '
+  { "name": "datagen-pageviews",
+    "config": {
+      "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
+      "kafka.topic": "pageviews",
+      "quickstart": "pageviews",
+      "key.converter": "org.apache.kafka.connect.json.JsonConverter",
+      "value.converter": "org.apache.kafka.connect.json.JsonConverter",
+      "value.converter.schemas.enable": "false",
+      "producer.interceptor.classes": "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor",
+      "max.interval": 200,
+      "iterations": 10000000,
+      "tasks.max": "1"
+}}' http://localhost:8083/connectors -w "\n"
-----
+
+This automatically creates the Kafka topic `pageviews` and produces data based on the schema defined in https://github.com/confluentinc/kafka-connect-datagen/blob/master/src/main/resources/pageviews_schema.avro[pageviews_schema.avro].
+
+[NOTE]
+====
+Why do I see the message 'Failed to connect'?
+
+It takes up to three minutes for the Kafka Connect REST API to start.
+If you receive the following error, wait three minutes and run the preceding command again.
+
+`curl: (7) Failed to connect to connect port 8083: Connection refused`
+====
+
+To confirm that you added the Datagen connector, run the following command:
+
+`curl -X GET http://localhost:8083/connectors`
+
+
+=== Add a Sink Connector
+
+The command below adds a {project-title} sink connector configured with these properties:
+
+* The class Kafka Connect uses to instantiate the connector
+* The Kafka topic from which the connector reads data
+* The connection URI of the Redis database to which the connector writes data
+* The Redis command to use for writing data (`JSONSET`)
+* Key and value converters to correctly handle incoming `pageviews` data
+* A https://docs.confluent.io/platform/current/connect/transforms/overview.html[Single Message Transform] that casts the key of `pageviews` messages to a string
+
+[source,console]
+-----
+curl -X POST -H "Content-Type: application/json" --data '
+  {"name": "redis-sink-json",
+   "config": {
+     "connector.class":"com.redis.kafka.connect.RedisSinkConnector",
+     "tasks.max":"1",
+     "topics":"pageviews",
+     "redis.uri":"redis://redis:6379",
+     "redis.command":"JSONSET",
+     "key.converter": "org.apache.kafka.connect.json.JsonConverter",
+     "value.converter": "org.apache.kafka.connect.storage.StringConverter",
+     "value.converter.schemas.enable": "false",
+     "transforms": "Cast",
+     "transforms.Cast.type": "org.apache.kafka.connect.transforms.Cast$Key",
+     "transforms.Cast.spec": "string"
+}}' http://localhost:8083/connectors -w "\n"
+-----
+
+You can check that Kafka messages are being written to Redis with this command:
+
+`docker compose exec redis /opt/redis-stack/bin/redis-cli "keys" "*"`
+
+You should see output similar to the following:
+
+[source,console]
+-----
+ 1) "pageviews:6021"
+ 2) "pageviews:211"
+ 3) "pageviews:281"
+ ...
+-----
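+
+Because Datagen keeps producing events, the key count keeps growing. As an optional check, you can watch it grow with the standard Redis `DBSIZE` command, which returns the total number of keys in the database:
+
+[source,console]
+-----
+docker compose exec redis /opt/redis-stack/bin/redis-cli dbsize
+-----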
+
+To retrieve the contents of a specific key, use this command:
+
+`docker compose exec redis /opt/redis-stack/bin/redis-cli "JSON.GET" "pageviews:1451"`
+
+=> `"{\"viewtime\":1451,\"userid\":\"User_6\",\"pageid\":\"Page_35\"}"`
+
+=== Add a Source Connector
+
+The following command adds a source connector configured with these properties:
+
+* The class Kafka Connect uses to instantiate the connector
+* The connection URI of the Redis database the connector connects to
+* The name of the Redis stream from which the connector reads messages
+* The Kafka topic to which the connector writes data
+
+[source,console]
+-----
+curl -X POST -H "Content-Type: application/json" --data '
+{ "name": "redis-source",
+  "config": {
+    "tasks.max":"1",
+    "connector.class":"com.redis.kafka.connect.RedisStreamSourceConnector",
+    "redis.uri":"redis://redis:6379",
+    "redis.stream.name":"mystream",
+    "topic": "mystream"
+  }
+}' http://localhost:8083/connectors -w "\n"
+-----
+
+Now add a message to the `mystream` Redis stream:
+
+`docker compose exec redis /opt/redis-stack/bin/redis-cli "xadd" "mystream" "*" "field1" "value11" "field2" "value21"`
+
+Examine the topics in the Kafka UI: http://localhost:9021 or http://localhost:8000/.
+The `mystream` topic should now contain the previously sent stream message.
+
+
+== End-to-end Example
+
+The project {project-scm}[repository] contains a script that runs all the steps shown previously.
+
+Clone the {project-scm}[{project-name}] repository and execute `run.sh` in the `docker` directory:
+
+[source,console,subs="attributes"]
+----
+git clone {project-scm}
+cd {project-name}
+./run.sh
+----
+
+This will:
+
+* Run `docker compose up`
+* Wait for Redis, Kafka, and Kafka Connect to be ready
+* Register the Confluent Datagen Connector
+* Register the Redis Kafka Sink Connector
+* Register the Redis Kafka Source Connector
+* Publish some events to Kafka via the Datagen connector
+* Write the events to Redis
+* Send messages to a Redis stream
+* Write the Redis stream messages back into Kafka
+
+Once running, examine the topics in the Kafka http://localhost:9021/[control center]:
+
+The `pageviews` topic should contain the 10 simple documents added, each similar to:
+
+[source,json]
+----
+include::{includedir}/../resources/pageviews.json[]
+----
+
+The `pageviews` stream should contain the 10 change events.
+
+Examine the stream in Redis:
+[source,console]
+----
+docker compose exec redis /opt/redis-stack/bin/redis-cli
+xread COUNT 10 STREAMS pageviews 0
+----
+
+Messages added to the `mystream` stream will show up in the `mystream` topic.
diff --git a/docs/guide/src/docs/asciidoc/sink.adoc b/docs/guide/src/docs/asciidoc/sink.adoc
index 82a0dba..3c4f88c 100644
--- a/docs/guide/src/docs/asciidoc/sink.adoc
+++ b/docs/guide/src/docs/asciidoc/sink.adoc
@@ -1,7 +1,8 @@
[[_sink]]
= Sink Connector Guide
+:name: Redis Kafka Sink Connector

-The sink connector consumes records from a Kafka topic and writes the data to Redis.
+The {name} consumes records from a Kafka topic and writes the data to Redis.
It includes the following features:

* <<_sink_at_least_once_delivery,At least once delivery>>
@@ -11,17 +12,17 @@ It includes the following features:
[[_sink_at_least_once_delivery]]
== At least once delivery

-The sink connector guarantees that records from the Kafka topic are delivered at least once.
+The {name} guarantees that records from the Kafka topic are delivered at least once.

[[_sink_tasks]]
== Multiple tasks

-The sink connector supports running one or more tasks.
+The {name} supports running one or more tasks.
You can specify the number of tasks with the `tasks.max` configuration property.

[[_sink_data_structures]]
== Redis Data Structures

-The sink connector supports the following Redis data-structure types as targets:
+The {name} supports the following Redis data-structure types as targets:

[[_collection_key]]
* Collections: <<_sink_stream,stream>>, <<_sink_list,list>>, <<_sink_set,set>>, <<_sink_zset,sorted set>>, <<_sink_timeseries,time series>>
@@ -167,10 +168,10 @@ The Kafka record value must be a number (e.g. `float64`) as it is used as the sa
[[_sink_data_formats]]
== Data Formats

-The sink connector supports different data formats for record keys and values depending on the target Redis data structure.
+The {name} supports different data formats for record keys and values depending on the target Redis data structure.

=== Kafka Record Keys
-The sink connector expects Kafka record keys in a specific format depending on the configured target <<_sink_data_structures,Redis data structure>>:
+The {name} expects Kafka record keys in a specific format depending on the configured target <<_sink_data_structures,Redis data structure>>:

[options="header",cols="h,1,1"]
|====
diff --git a/docs/guide/src/docs/asciidoc/source.adoc b/docs/guide/src/docs/asciidoc/source.adoc
index 247f66c..3023846 100644
--- a/docs/guide/src/docs/asciidoc/source.adoc
+++ b/docs/guide/src/docs/asciidoc/source.adoc
@@ -1,7 +1,8 @@
[[_source]]
= Source Connector Guide
+:name: Redis Kafka Source Connector

{project-title} includes 2 source connectors:

* <<_stream_source,Stream>>
* <<_keys_source,Keys>>
@@ -20,7 +21,7 @@ It includes the following features:

=== Delivery Guarantees

-The stream source connector can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
+The {name} can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
The default is at-least-once delivery.

[[_stream_source_at_least_once_delivery]]