docs: Added Confluent Cloud documentation
jruaux committed May 2, 2024
1 parent 137ae4b commit 7eb6d57
Showing 3 changed files with 53 additions and 1 deletion.
20 changes: 19 additions & 1 deletion docs/guide/src/docs/asciidoc/quickstart.adoc
@@ -209,4 +209,22 @@ docker compose exec redis /usr/local/bin/redis-cli
xread COUNT 10 STREAMS pageviews 0
----

Messages added to the `mystream` stream will show up in the `mystream` topic
Messages added to the `mystream` stream will show up in the `mystream` topic.


== Confluent Cloud

This section describes configuration aspects that are specific to using {project-title} in Confluent Cloud.

=== Egress Endpoints

You must specify https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html#cc-byoc-endpoints[egress endpoints] so that the connector can reach the Redis database.
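For example, for a database at `redis://redis-12000.redis.com:12000`, the egress endpoint to allow is the host `redis-12000.redis.com` on port `12000`.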

=== Sensitive Properties

The following https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html#sensitive[sensitive properties] must be marked as such in the Confluent Cloud UI, as illustrated in the example after this list.

* `redis.uri`: URI of the Redis database to connect to, e.g. `redis://redis-12000.redis.com:12000`
* `redis.username`: Username to use to connect to Redis
* `redis.password`: Password to use to connect to Redis
* `redis.key.password`: Password of the private key file
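
For illustration, the corresponding part of a connector configuration might look like the following sketch; all values are placeholders:

[source,properties]
----
# Mark each of these properties as sensitive in the Confluent Cloud UI
redis.uri = redis://redis-12000.redis.com:12000
redis.username = <username>
redis.password = <password>
redis.key.password = <key-password>
----
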
12 changes: 12 additions & 0 deletions docs/guide/src/docs/asciidoc/sink.adoc
@@ -10,6 +10,17 @@ It includes the following features:
* <<_sink_data_structures,Redis Data Structures>>
* <<_sink_data_formats,Supported Data Formats>>

== Class Name

The sink connector class name is `com.redis.kafka.connect.RedisSinkConnector`.

The corresponding configuration property is:

[source,properties]
----
connector.class = com.redis.kafka.connect.RedisSinkConnector
----
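
Putting this together, a minimal sink configuration might start from the following sketch; the topic name and Redis URI are placeholder values, and the remaining properties are described in the sections below:

[source,properties]
----
# Illustrative values only -- adjust the topic and Redis URI for your deployment
connector.class = com.redis.kafka.connect.RedisSinkConnector
tasks.max = 1
topics = pageviews
redis.uri = redis://redis-12000.redis.com:12000
----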

[[_sink_at_least_once_delivery]]
== At least once delivery
The {name} guarantees that records from the Kafka topic are delivered at least once.
@@ -20,6 +31,7 @@ The {name} guarantees that records from the Kafka topic are delivered at least once.
The {name} supports running one or more tasks.
You can specify the number of tasks with the `tasks.max` configuration property.
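
For example, to allow up to three parallel tasks (an illustrative value):

[source,properties]
----
tasks.max = 3
----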


[[_sink_data_structures]]
== Redis Data Structures
The {name} supports the following Redis data-structure types as targets:
22 changes: 22 additions & 0 deletions docs/guide/src/docs/asciidoc/source.adoc
@@ -19,6 +19,17 @@ It includes the following features:
* <<_stream_source_schema,Schema>>
* <<_stream_source_config,Configuration>>

=== Class Name

The stream source connector class name is `com.redis.kafka.connect.RedisStreamSourceConnector`.

The corresponding configuration property is:

[source,properties]
----
connector.class = com.redis.kafka.connect.RedisStreamSourceConnector
----
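
As a sketch, a minimal stream source configuration might start from the following; the Redis URI is a placeholder, and the stream-specific settings are covered in <<_stream_source_config,Configuration>>:

[source,properties]
----
# Minimal starting point -- see the Configuration section for stream-specific settings
connector.class = com.redis.kafka.connect.RedisStreamSourceConnector
tasks.max = 1
redis.uri = redis://redis-12000.redis.com:12000
----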

=== Delivery Guarantees

The {name} can be configured to acknowledge stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
@@ -115,6 +126,17 @@ Some preliminary sizing using Redis statistics and `bigkeys`/`memkeys` is recommended.
If you need assistance, please contact your Redis account team.
====
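
For example, the `redis-cli` sampling options mentioned above can provide a first estimate of key sizes and memory usage:

[source,console]
----
redis-cli --bigkeys
redis-cli --memkeys
----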

=== Class Name

The keys source connector class name is `com.redis.kafka.connect.RedisKeysSourceConnector`.

The corresponding configuration property is:

[source,properties]
----
connector.class = com.redis.kafka.connect.RedisKeysSourceConnector
----
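
As with the other connectors, a minimal keys source configuration might combine the class name with the Redis connection properties; all values below are placeholders, and key-related settings are covered in <<_keys_source_config,Configuration>>:

[source,properties]
----
# Placeholder values -- see the Configuration section for key-related settings
connector.class = com.redis.kafka.connect.RedisKeysSourceConnector
tasks.max = 1
redis.uri = redis://redis-12000.redis.com:12000
redis.username = <username>
redis.password = <password>
----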

[[_keys_source_config]]
=== Configuration
[source,properties]
