Create environments for the new kafka client #14022
Conversation
Thanks for this PR!
# return self.confluent_kafka_client.get_consumer_offsets_dict()
return self.python_kafka_client.get_consumer_offsets_dict()

def create_kafka_admin_client(self):
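For context, the delegation shown in this hunk works roughly like this (a minimal sketch; the confluent-kafka client class and its module path are assumptions, not verbatim from this PR):

```python
from datadog_checks.kafka_consumer.client.kafka_python_client import KafkaPythonClient
from datadog_checks.kafka_consumer.client.confluent_kafka_client import ConfluentKafkaClient  # assumed module


class GenericKafkaClient:
    def __init__(self, config, tls_context, log) -> None:
        # The proxy owns an instance of each implementation.
        self.python_kafka_client = KafkaPythonClient(config, tls_context, log)
        self.confluent_kafka_client = ConfluentKafkaClient(config, tls_context, log)

    def get_consumer_offsets_dict(self):
        # Each method can be flipped to the new client independently
        # once its confluent-kafka implementation is ready:
        # return self.confluent_kafka_client.get_consumer_offsets_dict()
        return self.python_kafka_client.get_consumer_offsets_dict()
```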
FYI I think we may need to remove create_kafka_admin_client() in the kafka_python_client.py code eventually, since it was vestigial from the legacy implementation and couldn't be outright removed because it's used in the tests. I'm OK with keeping this for now, but I don't think we need to implement a similar function in confluent_kafka_client.
Yeah, I agree. I did not do it in this PR to keep the focus on the global test structure. We can remove it in follow-up PRs :)
@@ -29,14 +29,6 @@ def token(self):

class KafkaPythonClient(KafkaClient):
    def __init__(self, config, tls_context, log) -> None:
How come we wouldn't need this part anymore?
Because I moved it to the superclass: https://github.com/DataDog/integrations-core/pull/14022/files#diff-414e25ef3b0e9d3114346cba1b8500545ef67e2bf3598b76b1c6d9b6f9dbcd7eR8-R14
Ah gotcha, missed that!
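For reference, a rough sketch of the superclass move described in this thread (attribute names are assumed, not taken from the linked diff):

```python
class KafkaClient:
    def __init__(self, config, tls_context, log) -> None:
        # Shared setup moved up from the subclasses.
        self.config = config
        self.log = log
        self._tls_context = tls_context


class KafkaPythonClient(KafkaClient):
    # No __init__ needed anymore; it is inherited from KafkaClient,
    # so the subclass only implements client-specific behavior.
    pass
```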
Failing style right now, I can submit a commit to fix this.
Codecov Report
Flags with carried forward coverage won't be shown.
Thank you 🙇
* Remove deprecated implementation of kafka_consumer (#13915)
* Remove deprecated implementation of kafka_consumer
* Apply suggestions
* Remove DSM (#13914)
* remove dsm
* remove dsm from metadata.csv
* Remove more unused code (#13922)
* remove more unused code
* revert changes in check
* Flatten kafka consumer check (#13929)
* Add more tests to increase code coverage (#13921)
* Add more tests to increase code coverage
* change to configerror
* unsplit test files
* update comments
* apply review suggestions
* Flatten the check structure
* Revert "Flatten the check structure" This reverts commit 1492138.
* Refactor Kafka Consumer (#13931)
* Map out structure
* Combine classes
* Remove deprecated call
* Remove clazz
* Create structure for kafka client classes
* Undo
* Fix style
* Add consumer offset and log collection (#13944)
* Refactor broker offset metric collection (#13934)
* Add broker offset metric collection
* Change import
* Clean up broker offset functions and change names
* Fix style
* Use updated values for check
* Clean up functions
* Refactor client creation (#13946)
* Refactor client creation
* Add back e2e test
* Remove commented out line
* Remove KafkaClient and refactor tests (#13954)
* Revert "Remove KafkaClient and refactor tests (#13954)" This reverts commit e327d71.
---------
Co-authored-by: Fanny Jiang <fanny.jiang@datadoghq.com>
* Remove KafkaClient and refactor tests (#13967)
* Pass in config to client (#13970)
* Move metric reporting back into main check (#13973)
* Refactor metric submissions back into check
* fix spaces
* remove todo note
* fix style
* move get broker metadata
* remove broker metadata method from classes
* reset client offsets
* Drop Python 2 support (#13961)
* Drop Python 2 support
* style
* Update kafka_consumer/pyproject.toml
Co-authored-by: Ofek Lev <ofekmeister@gmail.com>
---------
Co-authored-by: Ofek Lev <ofekmeister@gmail.com>
* Fix agent deps (#13979)
* Split the tests (#13983)
* Add missing license headers (#13985)
* Separate config logic (#13989)
* Separate config logic
* Apply changes from merge
* Fix style
* Change name to config
* Fix style
* Update for crlfile
* move tls_context back into check (#13987)
* Fix license headers (#13993)
* Fix license headers
* test
* Revert "test" This reverts commit 28518f3.
* Add healthchecks to zookeeper (#13998)
* Refactor the tests (#13997)
* Remove self.check and cleanup (#13992)
* Remove self.check and cleanup
* Fix instance level variables
* Fix style
* Move consumer offsets up
* Rename variables to be consistent
* Refactor and fix tests (#14019)
* fix unit tests
* fix tls test
* remove irrelevant changes
* revert client param
* Disable one unit test (#14025)
* Create environments for the new kafka client (#14022)
* Create environments for the new kafka client
* Fix style
---------
Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
* Increase test coverage (#14021)
* Map out new tests to add
* Implement tests
* Update comments
* Fix style
* Refactor GenericKafkaClient
* Add dependency (#14076)
* Pass consumer offsets into highwater offsets (#14077)
* Create Kafka client for confluent lib (#14078)
* Create Kafka client for confluent lib
* Fix style
* Validate kafka_connect_str
* Remove collect_broker_version (#14095)
* Remove collect_broker_version
* Remove commented out code
* Implement reset offsets (#14103)
* Implement get_partitions_for_topic (#14079)
* Implement get_partitions_for_topic
* Add exception handling
* Fix style
* Implement consumer offsets (#14080)
* Use confluent-kafka during the test setup (#14122)
* Implement get_highwater_offsets and get_highwater_offsets_dict (#14094)
* Implement get_highwater_offsets
* Add TODO and note
* Remove extraneous conditional
* Add comment
* Clarify TODOs
* Make the tests pass with the legacy implementation (#14138)
* Make the tests pass with the legacy implementation
* skip test_gssapi as well
* style
* Remove TODO and update tests
* Remove extra TODO
* Add timeouts to fix tests
* Fix config and tests
---------
Co-authored-by: Florent Clarret <florent.clarret@datadoghq.com>
* Modify the hatch environment to support several authentication method (#14135)
* Create the topics from the python code instead of the docker image
* drop KAFKA_VERSION
* Remove some unused functions (#14145)
* Remove some unused functions
* style
* Update all the tests to use the `kafka_instance` instead of a custom one (#14144)
* Update all the tests to use the `kafka_instance` instead of a custom one
* move the tests one folder up
* style
* Update kafka_consumer/tests/test_unit.py
Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
* address
---------
Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
* Implement the `request_metadata_update` method (#14152)
* Remove the `get_dict` methods from the clients (#14149)
* Remove the `get_dict` methods from the clients
* Update kafka_consumer/datadog_checks/kafka_consumer/kafka_consumer.py
Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
---------
Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
* Manually build confluent-kafka in the test env (#14173)
* Refactor the confluent kafka client (#14158)
* Add a tls e2e env and implement it (#14137)
* Add a kerberos e2e env and implement it (#14120)
* Add a krb5 config file to run the tests locally (#14251)
* Implement OAuth config (#14247)
* Implement OAuth config
* Remove commented out code
* Remove tuple
* Fix style
* Drop the legacy client (#14243)
* Drop the legacy client
* Fix tests and style
---------
Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
* Fix style
* Apply suggestions
* Make try-except smaller
* Change asserts into config errors
* Add back disable e2e for kerberos
* Remove licenses for removed dependencies
---------
Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
Co-authored-by: Florent Clarret <florent.clarret@datadoghq.com>
Co-authored-by: Ofek Lev <ofekmeister@gmail.com>
What does this PR do?
Depends on #14019
Leverage @fanny-jiang's script to set up e2e environments using the librdkafka & confluent-kafka libraries, and add new options to easily switch between the two implementations.
This will be removed once the new client is ready and the tests are green. I also added an indirection to the client instead of a factory, to allow us to implement the functions one by one.
Motivation
This script is needed for us to test the new client because it's not yet installed at the agent level (PR in progress). At the moment, we need to manually switch from one implementation to the other (e.g. https://github.com/DataDog/integrations-core/pull/14014/files#diff-5985e39a8c7486fac40276b2ffa70750ab902d1e6a25da1ba0e00648d8ce4cf3R11-R12).
With this PR, we'll be able to switch easily from one implementation to the other, and it should also ensure our tests do not depend on the client implementation.
Having a proxy instead of a factory gives us this flexibility, as sketched below.
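To make the contrast concrete, a factory would have to pick one implementation for the whole check up front (a hypothetical sketch; the `use_legacy_client` option name is assumed, not taken from this PR, and the two client classes are the ones from this refactor):

```python
def make_client(config, tls_context, log):
    # All-or-nothing choice: every method comes from one library.
    if config.use_legacy_client:  # assumed config option
        return KafkaPythonClient(config, tls_context, log)
    return ConfluentKafkaClient(config, tls_context, log)
```

The proxy, by contrast, owns both clients and routes each call, so an individual method such as get_partitions_for_topic can move to confluent-kafka while the others stay on kafka-python.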
Let me know what you think; I'd like to discuss this during our sync.
Additional Notes
Review checklist (to be filled by reviewers)
- changelog/ and integration/ labels attached
- qa/skip-qa label added if the PR doesn't need to be tested during QA.