Commit
Revamp Kafka consumer check (#13918)
* Remove deprecated implementation of kafka_consumer (#13915)
  * Remove deprecated implementation of kafka_consumer
  * Apply suggestions
* Remove DSM (#13914)
  * remove dsm
  * remove dsm from metadata.csv
* Remove more unused code (#13922)
  * remove more unused code
  * revert changes in check
* Flatten kafka consumer check (#13929)
* Add more tests to increase code coverage (#13921)
  * Add more tests to increase code coverage
  * change to configerror
  * unsplit test files
  * update comments
  * apply review suggestions
  * Flatten the check structure
  * Revert "Flatten the check structure" (reverts commit 1492138)
* Refactor Kafka Consumer (#13931)
  * Map out structure
  * Combine classes
  * Remove deprecated call
  * Remove clazz
  * Create structure for kafka client classes
  * Undo
  * Fix style
* Add consumer offset and log collection (#13944)
* Refactor broker offset metric collection (#13934)
  * Add broker offset metric collection
  * Change import
  * Clean up broker offset functions and change names
  * Fix style
  * Use updated values for check
  * Clean up functions
* Refactor client creation (#13946)
  * Refactor client creation
  * Add back e2e test
  * Remove commented out line
* Remove KafkaClient and refactor tests (#13954)
* Revert "Remove KafkaClient and refactor tests (#13954)" (reverts commit e327d71)
  * Co-authored-by: Fanny Jiang <fanny.jiang@datadoghq.com>
* Remove KafkaClient and refactor tests (#13967)
* Pass in config to client (#13970)
* Move metric reporting back into main check (#13973)
  * Refactor metric submissions back into check
  * fix spaces
  * remove todo note
  * fix style
  * move get broker metadata
  * remove broker metadata method from classes
  * reset client offsets
* Drop Python 2 support (#13961)
  * Drop Python 2 support
  * style
  * Update kafka_consumer/pyproject.toml
  * Co-authored-by: Ofek Lev <ofekmeister@gmail.com>
* Fix agent deps (#13979)
* Split the tests (#13983)
* Add missing license headers (#13985)
* Separate config logic (#13989)
  * Separate config logic
  * Apply changes from merge
  * Fix style
  * Change name to config
  * Fix style
  * Update for crlfile
* move tls_context back into check (#13987)
* Fix license headers (#13993)
  * Fix license headers
  * test
  * Revert "test" (reverts commit 28518f3)
* Add healthchecks to zookeeper (#13998)
* Refactor the tests (#13997)
* Remove self.check and cleanup (#13992)
  * Remove self.check and cleanup
  * Fix instance level variables
  * Fix style
  * Move consumer offsets up
  * Rename variables to be consistent
* Refactor and fix tests (#14019)
  * fix unit tests
  * fix tls test
  * remove irrelevant changes
  * revert client param
* Disable one unit test (#14025)
* Create environments for the new kafka client (#14022)
  * Create environments for the new kafka client
  * Fix style
  * Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
* Increase test coverage (#14021)
  * Map out new tests to add
  * Implement tests
  * Update comments
  * Fix style
  * Refactor GenericKafkaClient
* Add dependency (#14076)
* Pass consumer offsets into highwater offsets (#14077)
* Create Kafka client for confluent lib (#14078)
  * Create Kafka client for confluent lib
  * Fix style
  * Validate kafka_connect_str
* Remove collect_broker_version (#14095)
  * Remove collect_broker_version
  * Remove commented out code
* Implement reset offsets (#14103)
* Implement get_partitions_for_topic (#14079)
  * Implement get_partitions_for_topic
  * Add exception handling
  * Fix style
* Implement consumer offsets (#14080)
* Use confluent-kafka during the test setup (#14122)
* Implement get_highwater_offsets and get_highwater_offsets_dict (#14094)
  * Implement get_highwater_offsets
  * Add TODO and note
  * Remove extraneous conditional
  * Add comment
  * Clarify TODOs
* Make the tests pass with the legacy implementation (#14138)
  * Make the tests pass with the legacy implementation
  * skip test_gssapi as well
  * style
  * Remove TODO and update tests
  * Remove extra TODO
  * Add timeouts to fix tests
  * Fix config and tests
  * Co-authored-by: Florent Clarret <florent.clarret@datadoghq.com>
* Modify the hatch environment to support several authentication methods (#14135)
  * Create the topics from the python code instead of the docker image
  * drop KAFKA_VERSION
* Remove some unused functions (#14145)
  * Remove some unused functions
  * style
* Update all the tests to use the `kafka_instance` instead of a custom one (#14144)
  * Update all the tests to use the `kafka_instance` instead of a custom one
  * move the tests one folder up
  * style
  * Update kafka_consumer/tests/test_unit.py
  * address
  * Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
* Implement the `request_metadata_update` method (#14152)
* Remove the `get_dict` methods from the clients (#14149)
  * Remove the `get_dict` methods from the clients
  * Update kafka_consumer/datadog_checks/kafka_consumer/kafka_consumer.py
  * Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
* Manually build confluent-kafka in the test env (#14173)
* Refactor the confluent kafka client (#14158)
* Add a tls e2e env and implement it (#14137)
* Add a kerberos e2e env and implement it (#14120)
* Add a krb5 config file to run the tests locally (#14251)
* Implement OAuth config (#14247)
  * Implement OAuth config
  * Remove commented out code
  * Remove tuple
  * Fix style
* Drop the legacy client (#14243)
  * Drop the legacy client
  * Fix tests and style
  * Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
* Fix style
* Apply suggestions
* Make try-except smaller
* Change asserts into config errors
* Add back disable e2e for kerberos
* Remove licenses for removed dependencies

Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
Co-authored-by: Florent Clarret <florent.clarret@datadoghq.com>
Co-authored-by: Ofek Lev <ofekmeister@gmail.com>

Commit: a41ad12
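Several of the PRs above (#14077, #14080, #14094) deal with passing consumer offsets and highwater offsets around; the check's core metric, consumer lag, is derived from the two. A minimal sketch of that derivation, assuming both offset collections are dicts keyed by `(topic, partition)` (the actual structures in the check may differ, and `compute_lag` is a hypothetical helper, not the check's code):

```python
def compute_lag(consumer_offsets, highwater_offsets):
    """Consumer lag per (topic, partition): the broker's highwater offset
    minus the consumer group's committed offset. Both arguments map
    (topic, partition) tuples to integer offsets."""
    lag = {}
    for (topic, partition), committed in consumer_offsets.items():
        highwater = highwater_offsets.get((topic, partition))
        if highwater is None:
            # No highwater collected for this partition; skip rather than guess.
            continue
        lag[(topic, partition)] = highwater - committed
    return lag


# A group committed offset 95 on a partition whose log ends at 100 is 5 behind;
# a partition with no collected highwater is omitted.
compute_lag({("t", 0): 95, ("t", 1): 10}, {("t", 0): 100})  # → {("t", 0): 5}
```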