docs: Added quick start section
jruaux committed May 2, 2024
1 parent 43d6987 commit 137ae4b
Showing 8 changed files with 230 additions and 73 deletions.
1 change: 0 additions & 1 deletion docs/guide/src/docs/asciidoc/_links.adoc
@@ -1,4 +1,3 @@
:link_releases: link:https://github.com/redis-field-engineering/redis-kafka-connect/releases[releases page]
:link_redis_enterprise: link:https://redis.com/redis-enterprise-software/overview/[Redis Enterprise]
:link_lettuce_uri: link:https://github.com/lettuce-io/lettuce-core/wiki/Redis-URI-and-connection-details#uri-syntax[Redis URI Syntax]
:link_redis_notif: link:https://redis.io/docs/manual/keyspace-notifications[Redis Keyspace Notifications]
55 changes: 0 additions & 55 deletions docs/guide/src/docs/asciidoc/docker.adoc

This file was deleted.

7 changes: 3 additions & 4 deletions docs/guide/src/docs/asciidoc/index.adoc
@@ -6,12 +6,11 @@

include::{includedir}/_links.adoc[]

:leveloffset: +1
include::{includedir}/introduction.adoc[]
:leveloffset: 1
include::{includedir}/overview.adoc[]
include::{includedir}/quickstart.adoc[]
include::{includedir}/install.adoc[]
include::{includedir}/connect.adoc[]
include::{includedir}/sink.adoc[]
include::{includedir}/source.adoc[]
include::{includedir}/docker.adoc[]
include::{includedir}/resources.adoc[]
:leveloffset: -1
4 changes: 2 additions & 2 deletions docs/guide/src/docs/asciidoc/install.adoc
@@ -5,7 +5,7 @@ Select one of the methods below to install {project-title}.

== Download

Download the latest release archive from the link:{project-url}/releases[releases page].
Download the latest release archive from https://github.com/{github-owner}/{github-repo}/releases[here].

== Confluent Hub

@@ -14,4 +14,4 @@ Download the latest release archive from the link:{project-url}/releases[release

== Manually

Follow the instructions in {link_manual_install}.
Follow the instructions in {link_manual_install}
docs/guide/src/docs/asciidoc/{introduction.adoc → overview.adoc}
@@ -1,7 +1,7 @@
[[_introduction]]
= Introduction
[[_overview]]
= Overview

{project-title} is used to import and export data between Apache Kafka and Redis.
{project-title} is a Confluent-verified connector that stores data from Kafka topics into Redis and pushes data from Redis into Kafka topics.

image:redis-kafka-connector.svg[]

212 changes: 212 additions & 0 deletions docs/guide/src/docs/asciidoc/quickstart.adoc
@@ -0,0 +1,212 @@
[[_quick_start]]
= Quick Start

This section shows how to configure the {project-title} to import/export data between Redis and Apache Kafka and provides a hands-on look at the functionality of the source and sink connectors.

== Requirements

Download and install the following software:

* https://docs.docker.com/get-docker/[Docker]
* https://git-scm.com/book/en/v2/Getting-Started-Installing-Git[Git]

== Start the Sandbox

The sandbox starts the following Docker services:

* Redis Stack
* Apache Kafka
* Kafka Connect with the {project-title} installed

To start the sandbox run the following command:

`docker compose up`
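
If you prefer to keep the terminal free, a common variant is to start the stack in the background and follow the Kafka Connect logs while it boots; the `connect` service name here is taken from the container list shown in the output below.

[source,console]
-----
docker compose up -d             # start all services in the background
docker compose logs -f connect   # follow Kafka Connect startup until the REST API is ready
-----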

After Docker downloads and starts the services, you should see the following output:

[source,console]
-----
[+] Running 8/0
✔ Container redis Created
✔ Container zookeeper Created
✔ Container broker Created
✔ Container schema-registry Created
✔ Container rest-proxy Created
✔ Container connect Created
✔ Container ksqldb-server Created
✔ Container control-center Created
-----

== Add Connectors

Now that the required services are up and running, we can add connectors to Kafka Connect to transfer data between Redis and Kafka:

* Add a sink connector to transfer data from Kafka to Redis
* Add a source connector to transfer data from Redis to Kafka

=== Add a Datagen Connector

https://github.com/confluentinc/kafka-connect-datagen/[Kafka Connect Datagen] is a Kafka Connect source connector for generating mock data.

Create the Datagen connector with the following command:

[source,console]
-----
curl -X POST -H "Content-Type: application/json" --data '
{
  "name": "datagen-pageviews",
  "config": {
    "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
    "kafka.topic": "pageviews",
    "quickstart": "pageviews",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false",
    "producer.interceptor.classes": "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor",
    "max.interval": 200,
    "iterations": 10000000,
    "tasks.max": "1"
  }
}' http://localhost:8083/connectors -w "\n"
-----

This automatically creates the Kafka topic `pageviews` and produces data with a schema configuration from https://github.com/confluentinc/kafka-connect-datagen/blob/master/src/main/resources/pageviews_schema.avro

[NOTE]
====
Why do I see the message 'Failed to connect'?
It takes up to three minutes for the Kafka Connect REST API to start.
If you receive the following error, wait three minutes and run the preceding command again.
`curl: (7) Failed to connect to connect port 8083: Connection refused`
====

To confirm that you added the Datagen connector, run the following command:

`curl -X GET http://localhost:8083/connectors`
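
If the Datagen connector registered successfully, the response is a JSON array that includes its name, similar to:

=> `["datagen-pageviews"]`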


=== Add a Sink Connector

The command below adds a {project-title} sink connector configured with these properties:

* The class Kafka Connect uses to instantiate the connector
* The Kafka topic from which the connector reads data
* The connection URI of the Redis database to which the connector writes data
* The Redis command to use for writing data (`JSONSET`)
* Key and value converters to correctly handle incoming `pageviews` data
* A https://docs.confluent.io/platform/current/connect/transforms/overview.html[Single Message Transform] to extract a key from `pageviews` messages.

[source,console]
-----
curl -X POST -H "Content-Type: application/json" --data '
{
  "name": "redis-sink-json",
  "config": {
    "connector.class": "com.redis.kafka.connect.RedisSinkConnector",
    "tasks.max": "1",
    "topics": "pageviews",
    "redis.uri": "redis://redis:6379",
    "redis.command": "JSONSET",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter.schemas.enable": "false",
    "transforms": "Cast",
    "transforms.Cast.type": "org.apache.kafka.connect.transforms.Cast$Key",
    "transforms.Cast.spec": "string"
  }
}' http://localhost:8083/connectors -w "\n"
-----

You can check that Kafka messages are being written to Redis with this command:

`docker compose exec redis /opt/redis-stack/bin/redis-cli "keys" "*"`

You should see the following output:

[source,console]
-----
1) "pageviews:6021"
2) "pageviews:211"
3) "pageviews:281"
...
-----
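
`KEYS` is fine for this sandbox, but it blocks Redis while it scans the whole keyspace. On a larger dataset, the incremental `SCAN` lists the same keys without blocking, for example:

`docker compose exec redis /opt/redis-stack/bin/redis-cli --scan --pattern "pageviews:*"`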

To retrieve the contents of a specific key, use this command:

`docker compose exec redis /opt/redis-stack/bin/redis-cli "JSON.GET" "pageviews:1451"`

=> `"{\"viewtime\":1451,\"userid\":\"User_6\",\"pageid\":\"Page_35\"}"`

=== Add a Source Connector

The following command adds a source connector configured with these properties:

* The class Kafka Connect uses to instantiate the connector
* The connection URI of the Redis database the connector connects to
* The name of the Redis stream from which the connector reads messages
* The Kafka topic to which the connector writes data

[source,console]
-----
curl -X POST -H "Content-Type: application/json" --data '
{
  "name": "redis-source",
  "config": {
    "tasks.max": "1",
    "connector.class": "com.redis.kafka.connect.RedisStreamSourceConnector",
    "redis.uri": "redis://redis:6379",
    "redis.stream.name": "mystream",
    "topic": "mystream"
  }
}' http://localhost:8083/connectors -w "\n"
-----

Now add a message to the `mystream` Redis stream:

`docker compose exec redis /opt/redis-stack/bin/redis-cli "xadd" "mystream" "*" "field1" "value11" "field2" "value21"`
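
`XADD` replies with the auto-generated ID of the new entry (IDs are timestamp-based, so yours will differ):

=> `"1714650000000-0"`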

Examine the topics in the Kafka UI: http://localhost:9021 or http://localhost:8000/.
The `mystream` topic should have the previously sent stream message.
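
If you prefer the command line to the UI, the sketch below is one way to read the topic directly; it assumes the Confluent broker image puts the standard Kafka CLI tools on its `PATH` and that the broker listens on `localhost:9092` inside its own container.

[source,console]
-----
docker compose exec broker kafka-console-consumer \
  --bootstrap-server localhost:9092 \
  --topic mystream \
  --from-beginning \
  --max-messages 1
-----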


== End-to-end Example

The project {project-scm}[repository] contains a script that runs all the steps shown previously.

Clone the {project-scm}[{project-name}] repository and execute `run.sh` in the `docker` directory:

[source,console,subs="attributes"]
----
git clone {project-scm}
cd {project-name}
./run.sh
----

This will:

* Run `docker compose up`
* Wait for Redis, Kafka, and Kafka Connect to be ready
* Register the Confluent Datagen Connector
* Register the Redis Kafka Sink Connector
* Register the Redis Kafka Source Connector
* Publish some events to Kafka via the Datagen connector
* Write the events to Redis
* Send messages to a Redis stream
* Write the Redis stream messages back into Kafka

Once running, examine the topics in the Kafka http://localhost:9021/[control center]:

The `pageviews` topic should contain the 10 simple documents added, each similar to:

[source,json]
----
include::{includedir}/../resources/pageviews.json[]
----

The `pageviews` stream should contain the 10 change events.

Examine the stream in Redis:

[source,console]
----
docker compose exec redis /usr/local/bin/redis-cli
xread COUNT 10 STREAMS pageviews 0
----
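
The reply is a nested array of stream entries. Assuming the change events carry the same fields as the `pageviews` records, the abridged output looks roughly like this (entry IDs and values will differ):

[source,console]
----
1) 1) "pageviews"
   2) 1) 1) "1714650000000-0"
         2) 1) "viewtime"
            2) "1451"
            3) "userid"
            4) "User_6"
            5) "pageid"
            6) "Page_35"
      ...
----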

Messages added to the `mystream` stream will show up in the `mystream` topic.
13 changes: 7 additions & 6 deletions docs/guide/src/docs/asciidoc/sink.adoc
@@ -1,7 +1,8 @@
[[_sink]]
= Sink Connector Guide
:name: Redis Kafka Sink Connector

The sink connector consumes records from a Kafka topic and writes the data to Redis.
The {name} consumes records from a Kafka topic and writes the data to Redis.
It includes the following features:

* <<_sink_at_least_once_delivery,At least once delivery>>
@@ -11,17 +12,17 @@ It includes the following features:

[[_sink_at_least_once_delivery]]
== At least once delivery
The sink connector guarantees that records from the Kafka topic are delivered at least once.
The {name} guarantees that records from the Kafka topic are delivered at least once.

[[_sink_tasks]]
== Multiple tasks

The sink connector supports running one or more tasks.
The {name} supports running one or more tasks.
You can specify the number of tasks with the `tasks.max` configuration property.

[[_sink_data_structures]]
== Redis Data Structures
The sink connector supports the following Redis data-structure types as targets:
The {name} supports the following Redis data-structure types as targets:

[[_collection_key]]
* Collections: <<_sink_stream,stream>>, <<_sink_list,list>>, <<_sink_set,set>>, <<_sink_zset,sorted set>>, <<_sink_timeseries,time series>>
@@ -167,10 +168,10 @@ The Kafka record value must be a number (e.g. `float64`) as it is used as the sa
[[_sink_data_formats]]
== Data Formats

The sink connector supports different data formats for record keys and values depending on the target Redis data structure.
The {name} supports different data formats for record keys and values depending on the target Redis data structure.

=== Kafka Record Keys
The sink connector expects Kafka record keys in a specific format depending on the configured target <<_sink_data_structures,Redis data structure>>:
The {name} expects Kafka record keys in a specific format depending on the configured target <<_sink_data_structures,Redis data structure>>:

[options="header",cols="h,1,1"]
|====
5 changes: 3 additions & 2 deletions docs/guide/src/docs/asciidoc/source.adoc
@@ -1,7 +1,8 @@
[[_source]]
= Source Connector Guide
:name: Redis Kafka Source Connector

{project-title} includes 2 source connectors:
The {name} includes 2 source connectors:

* <<_stream_source,Stream>>
* <<_keys_source,Keys>>
@@ -20,7 +21,7 @@ It includes the following features:

=== Delivery Guarantees

The stream source connector can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
The {name} can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
The default is at-least-once delivery.

[[_stream_source_at_least_once_delivery]]
