
Commit 699bfeb

authored Jul 8, 2024
docs: ssl example added (#196)
1 parent 11f05a5 commit 699bfeb

18 files changed (+788, -485 lines)
 

‎.gitignore

+3

@@ -229,3 +229,6 @@ tags
 pyvenv.cfg
 pip-selfcheck.json
 poetry.toml
+
+# local SSL cluster certificates
+kafka-certs/

‎examples/fastapi-sse/poetry.lock

-485
This file was deleted.

‎examples/ssl-example/README.md

+88

# Kstreams SSL example

This example shows how to set up an SSL connection with `kstreams`. For this purpose we have to set up a local Kafka SSL cluster.

`ENV` variables are exported using the script `/scripts/ssl/export-env-variables` and loaded in Python using [pydantic-settings](https://docs.pydantic.dev/latest/concepts/pydantic_settings/).
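
A rough sketch of how that loading can look with `pydantic-settings` (the field names and environment variable names below are illustrative assumptions, not necessarily the ones the example exports):

```python
# Hypothetical sketch: the settings fields and env variable names are assumptions,
# not taken from the actual ssl_example package.
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    # Each field is populated from the environment variable with the same
    # upper-cased name, e.g. BOOTSTRAP_SERVERS, SSL_CERT_DATA, ...
    bootstrap_servers: str = "localhost:9092"
    ssl_cert_data: str = ""       # client certificate (PEM)
    ssl_key_data: str = ""        # client private key (PEM)
    ssl_cabundle_data: str = ""   # CA bundle used to verify the broker


settings = Settings()  # values are read from the environment on instantiation
```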

Check [resources.py](https://github.com/kpn/kstreams/blob/master/examples/ssl-example/ssl_example/resources.py) to see how the backend with `SSL` is created.
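
A minimal sketch of what that wiring can look like, assuming the `kstreams.backends.kafka.Kafka` backend together with aiokafka's `create_ssl_context` helper (the certificate paths below are placeholders, not the files generated by the scripts):

```python
# Hypothetical sketch; certificate paths are placeholders.
from aiokafka.helpers import create_ssl_context

from kstreams import create_engine
from kstreams.backends.kafka import Kafka

# Build an ssl.SSLContext from the client certificate, key and CA bundle.
ssl_context = create_ssl_context(
    cafile="kafka-certs/client/ca.pem",
    certfile="kafka-certs/client/client.pem",
    keyfile="kafka-certs/client/client.key",
)

# Point the backend at the SSL listener exposed by the local cluster.
backend = Kafka(
    bootstrap_servers=["localhost:9092"],
    security_protocol="SSL",
    ssl_context=ssl_context,
)

stream_engine = create_engine(title="ssl-example-engine", backend=backend)
```

The engine then uses this backend configuration for both the producer and the streams it creates, so everything talks to the broker over TLS.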

## Requirements

`python 3.8+`, `poetry`, `docker-compose`, `openssl`

### Usage

First, create the server and client certificates:

```bash
./scripts/ssl/ssl-setup
```

After executing the previous script, you will see a new folder called `kafka-certs`. The folder contains the `server` (inside the `server` folder), `admin` (inside the `admin` folder) and `client` certificates. Do not worry about their content; it is just an example and they can be deleted, shared and recreated (it is only a local example).

Now you can run the local SSL cluster:

```bash
./scripts/cluster/start
```

Next, install the project dependencies. In a different terminal execute:

```bash
poetry install
```

Export the env variables:

```bash
. ./scripts/ssl/export-env-variables
```

Then we can run the project:

```bash
poetry run app
```
49+
You should see something similar to the following logs:
50+
51+
```bash
52+
kstreams/examples/ssl-example via 🐳 colima is 📦 v0.1.0 via 🐍 v3.12.4
53+
❯ poetry run app
54+
55+
INFO:ssl_example.app:Starting application...
56+
INFO:aiokafka.consumer.subscription_state:Updating subscribed topics to: frozenset({'local--kstreams'})
57+
INFO:aiokafka.consumer.consumer:Subscribed to topic(s): {'local--kstreams'}
58+
INFO:kstreams.prometheus.monitor:Starting Prometheus Monitoring started...
59+
INFO:ssl_example.app:Producing event 0
60+
INFO:ssl_example.app:Producing event 1
61+
INFO:ssl_example.app:Producing event 2
62+
INFO:ssl_example.app:Producing event 3
63+
INFO:ssl_example.app:Producing event 4
64+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
65+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
66+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
67+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
68+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
69+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
70+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
71+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
72+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
73+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
74+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
75+
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
76+
INFO:aiokafka.consumer.group_coordinator:Discovered coordinator 1 for group example-group
77+
INFO:aiokafka.consumer.group_coordinator:Revoking previously assigned partitions set() for group example-group
78+
INFO:aiokafka.consumer.group_coordinator:(Re-)joining group example-group
79+
INFO:aiokafka.consumer.group_coordinator:Joined group 'example-group' (generation 1) with member_id aiokafka-0.11.0-5fb10c73-64b2-42a8-ae8a-23f59d4a3b6b
80+
INFO:aiokafka.consumer.group_coordinator:Elected group leader -- performing partition assignments using roundrobin
81+
INFO:aiokafka.consumer.group_coordinator:Successfully synced group example-group with generation 1
82+
INFO:aiokafka.consumer.group_coordinator:Setting newly assigned partitions {TopicPartition(topic='local--kstreams', partition=0)} for group example-group
83+
```
84+
85+
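
The topic `local--kstreams` and the group `example-group` in the logs come from a small produce/consume loop. A rough sketch of such an app with `kstreams` is shown below; apart from the topic and group names taken from the logs, everything is illustrative, and the SSL backend from `resources.py` is omitted for brevity:

```python
# Hypothetical sketch of the produce/consume loop suggested by the logs above.
# In the real example the engine would be created with the SSL backend,
# e.g. create_engine(title=..., backend=...).
import asyncio

from kstreams import Stream, create_engine

stream_engine = create_engine(title="ssl-example-engine")


@stream_engine.stream("local--kstreams", group_id="example-group")
async def consume(stream: Stream):
    # Iterate over consumer records as they arrive on the topic.
    async for cr in stream:
        print(f"Consumed event: {cr.value}")


async def produce() -> None:
    for i in range(5):
        print(f"Producing event {i}")
        await stream_engine.send("local--kstreams", value=f"event {i}".encode())


async def main() -> None:
    await stream_engine.start()   # starts the producer and the stream above
    await produce()
    await asyncio.sleep(10)       # give the consumer time to pick the events up
    await stream_engine.stop()


if __name__ == "__main__":
    asyncio.run(main())
```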

## Note

If you plan on using this example, pay attention to the `pyproject.toml` dependencies, where `kstreams` points to the parent folder. You will have to set the latest released version instead.

+45

version: '3'
services:

  zookeeper:
    image: "confluentinc/cp-zookeeper:7.3.0"
    hostname: zookeeper
    container_name: zookeeper
    ports:
      - 32181:32181
    environment:
      - ZOOKEEPER_CLIENT_PORT=32181
  kafka:
    volumes:
      - ./kafka-certs/server/:/etc/kafka/secrets
      - ./kafka-certs/admin/client.properties:/etc/kafka/config/client.properties
    image: "confluentinc/cp-kafka:7.3.0"
    hostname: kafka
    container_name: kafka
    environment:
      KAFKA_BROKER_ID: "1"
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: "1"
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"
      KAFKA_ZOOKEEPER_CONNECT: "zookeeper:32181"
      KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
      CONFLUENT_METRICS_ENABLE: 'true'
      CONFLUENT_SUPPORT_CUSTOMER_ID: anonymous
      # Kafka security
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: SSL:SSL,SSL2:SSL
      KAFKA_SECURITY_INTER_BROKER_PROTOCOL: SSL
      KAFKA_ADVERTISED_LISTENERS: SSL://localhost:9092,SSL2://kafka:9093
      KAFKA_SSL_KEYSTORE_FILENAME: keystore.jks
      KAFKA_SSL_TRUSTSTORE_FILENAME: truststore.jks
      KAFKA_SSL_KEY_CREDENTIALS: ssl-key-credentials
      KAFKA_SSL_KEYSTORE_CREDENTIALS: key-store-credentials
      KAFKA_SSL_TRUSTSTORE_CREDENTIALS: trust-store-credentials
      KAFKA_SSL_CLIENT_AUTH: required
      KAFKA_AUTHORIZER_CLASS_NAME: kafka.security.authorizer.AclAuthorizer
      KAFKA_SSL_PRINCIPAL_MAPPING_RULES: RULE:^.*[Cc][Nn]=([a-zA-Z0-9._-]*).*$$/CN=$$1/,DEFAULT
      KAFKA_SUPER_USERS: User:CN=localhost;User:CN=admin.client.company.org;
    ports:
      - "9092:9092"
      - "29092:29092"
    depends_on:
      - zookeeper
