Add Apache Kafka basic support (#105)
Kafka
========

To enable Kafka support, add ``testsuite.databases.kafka.pytest_plugin``
to the ``pytest_plugins`` list in your ``conftest.py``.

By default testsuite starts the Kafka_ service itself. In this case a Kafka
installation is required.

Currently the Kafka plugin uses the asynchronous aiokafka_ driver.
Kafka installation
------------------

Consult the official docs at https://kafka.apache.org/quickstart

If you already have Kafka installed and its location differs from
``/etc/kafka``, set the ``KAFKA_HOME`` environment variable accordingly.

The installed Kafka **must** support the KRaft_ protocol.
Environment variables
---------------------

KAFKA_HOME
~~~~~~~~~~

Overrides the Kafka binaries directory. Default is ``/etc/kafka``.

TESTSUITE_KAFKA_SERVER_HOST
~~~~~~~~~~~~~~~~~~~~~~~~~~~

Overrides the Kafka server host. Default is ``localhost``.

TESTSUITE_KAFKA_SERVER_PORT
~~~~~~~~~~~~~~~~~~~~~~~~~~~

Overrides the Kafka server port. Default is ``9099``.

TESTSUITE_KAFKA_CONTROLLER_PORT
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Overrides the Kafka controller port. Default is ``9100``.

TESTSUITE_KAFKA_SERVER_START_TIMEOUT
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

By default testsuite waits up to 10 seconds for Kafka to start. This
timeout can be customized via the
``TESTSUITE_KAFKA_SERVER_START_TIMEOUT`` environment variable.
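As a sketch of how such a timeout variable is typically consumed (the
variable name and the 10-second default come from this page; the parsing
code itself is illustrative, not the plugin's actual implementation):

```python
import os

# Fall back to the documented default of 10 seconds when the
# variable is unset.
timeout = float(os.environ.get('TESTSUITE_KAFKA_SERVER_START_TIMEOUT', '10'))
print(timeout)
```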
TESTSUITE_KAFKA_CUSTOM_TOPICS
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

All topics in tests are created automatically by the Kafka broker at
runtime with **only 1 partition**. To create topics with several
partitions, either set the ``TESTSUITE_KAFKA_CUSTOM_TOPICS`` environment
variable to a ``,``-separated list of ``topic:partitions`` pairs, or
override the ``kafka_custom_topics`` fixture.
For example, ``TESTSUITE_KAFKA_CUSTOM_TOPICS=large-topic-1:7,large-topic-2:20``
creates topic ``large-topic-1`` with 7 partitions and ``large-topic-2``
with 20 partitions.
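The ``topic:partitions`` mapping format described above can be parsed in
a few lines of Python. This hypothetical helper only illustrates the
format; it is not part of the plugin API:

```python
def parse_custom_topics(raw: str) -> dict:
    """Parse 'topic-1:7,topic-2:20' into {'topic-1': 7, 'topic-2': 20}."""
    result = {}
    for entry in raw.split(','):
        # Split each entry on the first ':' into topic name and count.
        topic, _, partitions = entry.partition(':')
        result[topic] = int(partitions)
    return result


print(parse_custom_topics('large-topic-1:7,large-topic-2:20'))
# {'large-topic-1': 7, 'large-topic-2': 20}
```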
Customize ports
---------------

Testsuite may start Kafka with custom ports if the
``TESTSUITE_KAFKA_SERVER_PORT`` or ``TESTSUITE_KAFKA_CONTROLLER_PORT``
environment variables are set.

Use external instance
---------------------

If your instance is local, you may try setting the
``TESTSUITE_KAFKA_SERVER_PORT`` environment variable together with the
pytest option ``--kafka=1`` and see if it works.

Note that topic creation remains on the user's side in this case.
Usage example
-------------

.. code-block:: python

   async def test_kafka_producer_consumer_chain(kafka_producer, kafka_consumer):
       TOPIC = 'Test-topic-chain'
       KEY = 'test-key'
       MESSAGE = 'test-message'

       await kafka_producer.send(TOPIC, KEY, MESSAGE)

       consumed_message = await kafka_consumer.receive_one([TOPIC])

       assert consumed_message.topic == TOPIC
       assert consumed_message.key == KEY
       assert consumed_message.value == MESSAGE

.. _Kafka: https://kafka.apache.org/
.. _aiokafka: https://github.com/aio-libs/aiokafka
.. _KRaft: https://developer.confluent.io/learn/kraft/
Example integration
-------------------

.. code-block:: python

   import pytest

   pytest_plugins = [
       'testsuite.pytest_plugin',
       'testsuite.databases.kafka.pytest_plugin',
   ]

   KAFKA_CUSTOM_TOPICS = {
       'Large-topic-1': 7,
       'Large-topic-2': 3,
   }


   @pytest.fixture(scope='session')
   def kafka_custom_topics():
       return KAFKA_CUSTOM_TOPICS
Fixtures
--------

.. currentmodule:: testsuite.databases.kafka.pytest_plugin

kafka_producer
~~~~~~~~~~~~~~

.. autofunction:: kafka_producer()
   :noindex:

kafka_consumer
~~~~~~~~~~~~~~

.. autofunction:: kafka_consumer()
   :noindex:

kafka_custom_topics
~~~~~~~~~~~~~~~~~~~

.. autofunction:: kafka_custom_topics()
   :noindex:

kafka_local
~~~~~~~~~~~

.. autofunction:: kafka_local()
   :noindex:

Classes
-------

.. currentmodule:: testsuite.databases.kafka.classes

.. autoclass:: KafkaProducer()
   :members: send, send_async

.. autoclass:: KafkaConsumer()
   :members: receive_one, receive_batch

.. autoclass:: ConsumedMessage()
   :members: topic, key, value, partition, offset
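For orientation, the documented ``ConsumedMessage`` members can be
modeled as a plain dataclass. This stub is purely illustrative and is
not the class's actual definition:

```python
import dataclasses


@dataclasses.dataclass
class ConsumedMessageStub:
    """Illustrative stand-in mirroring ConsumedMessage's documented members."""
    topic: str
    key: str
    value: str
    partition: int
    offset: int


msg = ConsumedMessageStub('Test-topic', 'test-key', 'test-message', 0, 0)
print(msg.topic)  # Test-topic
```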
import typing

import pytest


@pytest.fixture(scope='session')
def kafka_custom_topics() -> typing.Dict[str, int]:
    return {'Large-Topic': 7}
import asyncio


async def test_kafka_producer_basic(kafka_producer):
    await kafka_producer.send('Test-topic', 'test-key', 'test-message')


async def test_kafka_producer_many_sends(kafka_producer):
    TOPIC = 'Test-topic'
    SEND_COUNT = 10
    for send in range(SEND_COUNT):
        await kafka_producer.send(
            TOPIC, f'test-key-{send}', f'test-message-{send}'
        )


async def test_kafka_producer_many_send_async(kafka_producer):
    TOPIC = 'Test-topic'
    SEND_COUNT = 100
    send_futures = []
    for send in range(SEND_COUNT):
        send_futures.append(
            await kafka_producer.send_async(
                TOPIC, f'test-key-{send}', f'test-message-{send}'
            )
        )
    await asyncio.wait(send_futures)


async def test_kafka_producer_large_topic(kafka_producer):
    TOPIC = 'Large-Topic'
    PARTITION_COUNT = 7

    for partition in range(PARTITION_COUNT):
        await kafka_producer.send(
            TOPIC,
            f'key-to-{partition}',
            f'message-to-{partition}',
            partition=partition,
        )
import logging
import typing


async def test_kafka_producer_consumer_chain(kafka_producer, kafka_consumer):
    TOPIC = 'Test-topic-chain'
    KEY = 'test-key'
    MESSAGE = 'test-message'

    await kafka_producer.send(TOPIC, KEY, MESSAGE)

    consumed_message = await kafka_consumer.receive_one([TOPIC])

    assert consumed_message.topic == TOPIC
    assert consumed_message.key == KEY
    assert consumed_message.value == MESSAGE


async def test_kafka_producer_consumer_chain_many_messages(
    kafka_producer, kafka_consumer
):
    TOPIC = 'Test-topic-chain'
    SEND_COUNT = 10
    BATCH_SIZE = 5

    for send in range(SEND_COUNT):
        await kafka_producer.send(
            TOPIC, f'test-key-{send}', f'test-message-{send}'
        )

    sends_received: typing.Set[int] = set()

    while len(sends_received) < SEND_COUNT:
        consumed_messages = await kafka_consumer.receive_batch(
            topics=[TOPIC], max_batch_size=BATCH_SIZE
        )
        logging.info('Received batch of %d messages', len(consumed_messages))
        for message in consumed_messages:
            sends_received.add(int(message.key.split('-')[-1]))


async def test_kafka_producer_consumer_chain_many_topics(
    kafka_producer, kafka_consumer
):
    TOPIC_COUNT = 3
    TOPICS = [f'Test-topic-chain-{i}' for i in range(TOPIC_COUNT)]
    SEND_PER_TOPIC_COUNT = 10
    MESSAGE_COUNT = TOPIC_COUNT * SEND_PER_TOPIC_COUNT
    BATCH_SIZE = 5

    for i, topic in enumerate(TOPICS):
        for send in range(SEND_PER_TOPIC_COUNT):
            send_number = i * SEND_PER_TOPIC_COUNT + send
            await kafka_producer.send(
                topic, f'test-key-{send_number}', f'test-message-{send_number}'
            )

    sends_received: typing.Set[int] = set()

    while len(sends_received) < MESSAGE_COUNT:
        consumed_messages = await kafka_consumer.receive_batch(
            topics=TOPICS, max_batch_size=BATCH_SIZE
        )
        logging.info('Received batch of %d messages', len(consumed_messages))
        for message in consumed_messages:
            sends_received.add(int(message.key.split('-')[-1]))
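The receive loops above track the numeric suffix of each consumed key in
a set, so duplicate deliveries and out-of-order batches are handled
naturally. Isolated from Kafka, the bookkeeping can be sketched like
this (``fake_batches`` is an illustrative stand-in for batches returned
by ``kafka_consumer.receive_batch``):

```python
def collect_until_complete(batches, send_count):
    """Accumulate numeric key suffixes until all sends are accounted for."""
    sends_received = set()
    batch_iter = iter(batches)
    while len(sends_received) < send_count:
        for key in next(batch_iter):
            # Deduplicate by the numeric suffix of the key.
            sends_received.add(int(key.split('-')[-1]))
    return sends_received


# Keys may arrive out of order and duplicated across batches.
fake_batches = [['test-key-1', 'test-key-0'], ['test-key-1', 'test-key-2']]
print(collect_until_complete(fake_batches, 3))  # {0, 1, 2}
```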