
Remove the get_dict methods from the clients #14149

Conversation

FlorentClarret
Member

What does this PR do?

Motivation

Additional Notes

Review checklist (to be filled by reviewers)

  • Feature or bugfix MUST have appropriate tests (unit, integration, e2e)
  • PR title must be written as a CHANGELOG entry (see why)
  • File changes must correspond to the primary purpose of the PR as described in the title (small unrelated changes should have their own PR)
  • PR must have changelog/ and integration/ labels attached
  • If the PR doesn't need to be tested during QA, please add a qa/skip-qa label.

FlorentClarret requested a review from a team as a code owner on March 14, 2023 09:46
Comment on lines +270 to +274
    kafka_consumer_check = KafkaCheck('kafka_consumer', {'max_partition_contexts': 2}, [kafka_instance])
    dd_run_check(kafka_consumer_check)

    # Then
    aggregator.assert_metric("kafka.broker_offset", count=1)
    aggregator.assert_metric("kafka.broker_offset", count=2)
Member Author

To reviewer: This test with max_partition_contexts set to 1 does not make sense to me because:

  • We mocked get_consumer_offsets_dict to return {("consumer_group1", "topic1", "partition1"): 2}, so its length is 1.
  • We mocked get_highwater_offsets_dict to return {("topic1", "partition1"): 3, ("topic2", "partition2"): 3}.
  • However, when len(get_consumer_offsets_dict) >= limit with limit=1, we do not even run the get_highwater_offsets method, so get_highwater_offsets_dict should return an empty dict in that case.
  • See:

        highwater_offsets = {}
        try:
            if len(consumer_offsets) < self._context_limit:
                highwater_offsets = self.client.get_highwater_offsets(consumer_offsets)
            else:
                self.warning("Context limit reached. Skipping highwater offset collection.")

To me, this test should run both methods, have the highwater method return more contexts than we can handle, and then skip the excess contexts. So here we now have 3 contexts (1 consumer offset, 2 highwater offsets) with the limit set to 2, and we should only emit 2 metrics.
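
Something like the following sketch, assuming pytest fixtures similar to the snippet above; the mocked_client fixture and its wiring into the check are assumptions, not the actual test scaffolding:

    from datadog_checks.kafka_consumer import KafkaCheck

    def test_context_limit_truncates_metrics(kafka_instance, dd_run_check, aggregator, mocked_client):
        # Given: 1 consumer offset context and 2 highwater offset contexts, 3 in total
        mocked_client.get_consumer_offsets.return_value = {("consumer_group1", "topic1", "partition1"): 2}
        mocked_client.get_highwater_offsets.return_value = {("topic1", "partition1"): 3, ("topic2", "partition2"): 3}

        # When: the limit is 2, so len(consumer_offsets) < limit holds and both methods run
        kafka_consumer_check = KafkaCheck('kafka_consumer', {'max_partition_contexts': 2}, [kafka_instance])
        dd_run_check(kafka_consumer_check)

        # Then: 3 contexts against a limit of 2, so only 2 metrics should be emitted
        aggregator.assert_metric("kafka.broker_offset", count=2)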

Contributor

Ah, that is a good catch! I should have added an assert here to verify that get_highwater_offsets() was actually called in the first place, rather than just returning a mocked get_highwater_offsets_dict().
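
Something like this would do it (a sketch, reusing the hypothetical mocked_client from above):

    # Ensure the highwater method actually ran rather than only stubbing its return value
    mocked_client.get_highwater_offsets.assert_called_once()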

FlorentClarret force-pushed the florentclarret/kafka_consumer/drop_dict_methods branch from dc5d5cb to 26335f2 on March 14, 2023 10:21
FlorentClarret force-pushed the florentclarret/kafka_consumer/drop_dict_methods branch 2 times, most recently from d420669 to 5674f27 on March 14, 2023 12:38
codecov bot commented Mar 14, 2023

Codecov Report

❗ No coverage uploaded for pull request base (AI-2904/kafka-consumer-revamp@65751ea). Click here to learn what that means.
The diff coverage is n/a.

Flag | Coverage Δ
kafka_consumer | 91.30% <0.00%> (?)

Flags with carried forward coverage won't be shown. Click here to find out more.

FlorentClarret force-pushed the florentclarret/kafka_consumer/drop_dict_methods branch from 025c343 to a64e180 on March 15, 2023 09:00
FlorentClarret merged commit 11aae44 into AI-2904/kafka-consumer-revamp on Mar 15, 2023
FlorentClarret deleted the florentclarret/kafka_consumer/drop_dict_methods branch on March 15, 2023 09:20
This was referenced Mar 17, 2023
yzhan289 mentioned this pull request on Mar 27, 2023
yzhan289 added a commit that referenced this pull request Apr 14, 2023
* Remove deprecated implementation of kafka_consumer (#13915)

* Remove deprecated implementation of kafka_consumer

* Apply suggestions

* Remove DSM (#13914)

* remove dsm

* remove dsm from metadata.csv

* Remove more unused code (#13922)

* remove more unused code

* revert changes in check

* Flatten kafka consumer check (#13929)

* Add more tests to increase code coverage (#13921)

* Add more tests to increase code coverage

* change to configerror

* unsplit test files

* update comments

* apply review suggestions

* Flatten the check structure

* Revert "Flatten the check structure"

This reverts commit 1492138.

* Refactor Kafka Consumer (#13931)

* Map out structure

* Combine classes

* Remove deprecated call

* Remove clazz

* Create structure for kafka client classes

* Undo

* Fix style

* Add consumer offset and log collection (#13944)

* Refactor broker offset metric collection (#13934)

* Add broker offset metric collection

* Change import

* Clean up broker offset functions and change names

* Fix style

* Use updated values for check

* Clean up functions

* Refactor client creation (#13946)

* Refactor client creation

* Add back e2e test

* Remove commented out line

* Remove KafkaClient and refactor tests (#13954)

* Revert "Remove KafkaClient and refactor tests (#13954)"

This reverts commit e327d71.

---------

Co-authored-by: Fanny Jiang <fanny.jiang@datadoghq.com>

* Remove KafkaClient and refactor tests (#13967)

* Pass in config to client (#13970)

* Move metric reporting back into main check (#13973)

* Refactor metric submissions back into check

* fix spaces

* remove todo note

* fix style

* move get broker metadata

* remove broker metadata method from classes

* reset client offsets

* Drop Python 2 support (#13961)

* Drop Python 2 support

* style

* Update kafka_consumer/pyproject.toml

Co-authored-by: Ofek Lev <ofekmeister@gmail.com>

---------

Co-authored-by: Ofek Lev <ofekmeister@gmail.com>

* Fix agent deps (#13979)

* Split the tests (#13983)

* Add missing license headers (#13985)

* Separate config logic (#13989)

* Separate config logic

* Apply changes from merge

* Fix style

* Change name to config

* Fix style

* Update for crlfile

* move tls_context back into check (#13987)

* Fix license headers (#13993)

* Fix license headers

* test

* Revert "test"

This reverts commit 28518f3.

* Add healthchecks to zookeeper (#13998)

* Refactor the tests (#13997)

* Remove self.check and cleanup (#13992)

* Remove self.check and cleanup

* Fix instance level variables

* Fix style

* Move consumer offsets up

* Rename variables to be consistent

* Refactor and fix tests (#14019)

* fix unit tests

* fix tls test

* remove irrelevant changes

* revert client param

* Disable one unit test (#14025)

* Create environments for the new kafka client (#14022)

* Create environments for the new kafka client

* Fix style

---------

Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>

* Increase test coverage (#14021)

* Map out new tests to add

* Implement tests

* Update comments

* Fix style

* Refactor GenericKafkaClient

* Add dependency (#14076)

* Pass consumer offsets into highwater offsets (#14077)

* Create Kafka client for confluent lib (#14078)

* Create Kafka client for confluent lib

* Fix style

* Validate kafka_connect_str

* Remove collect_broker_version (#14095)

* Remove collect_broker_version

* Remove commented out code

* Implement reset offsets (#14103)

* Implement get_partitions_for_topic (#14079)

* Implement get_partitions_for_topic

* Add exception handling

* Fix style

* Implement consumer offsets (#14080)

* Use confluent-kafka during the test setup (#14122)

* Implement get_highwater_offsets and get_highwater_offsets_dict (#14094)

* Implement get_highwater_offsets

* Add TODO and note

* Remove extraneous conditional

* Add comment

* Clarify TODOs

* Make the tests pass with the legacy implementation (#14138)

* Make the tests pass with the legacy implementation

* skip test_gssapi as well

* style

* Remove TODO and update tests

* Remove extra TODO

* Add timeouts to fix tests

* Fix config and tests

---------

Co-authored-by: Florent Clarret <florent.clarret@datadoghq.com>

* Modify the hatch environment to support several authentication method (#14135)

* Create the topics from the python code instead of the docker image

* drop KAFKA_VERSION

* Remove some unused functions (#14145)

* Remove some unused functions

* style

* Update all the tests to use the `kafka_instance` instead of a custom one (#14144)

* Update all the tests to use the `kafka_instance` instead of a custom one

* move the tests one folder up

* style

* Update kafka_consumer/tests/test_unit.py

Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>

* address

---------

Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>

* Implement the `request_metadata_update` method (#14152)

* Remove the `get_dict` methods from the clients (#14149)

* Remove the `get_dict` methods from the clients

* Update kafka_consumer/datadog_checks/kafka_consumer/kafka_consumer.py

Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>

---------

Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>

* Manually build confluent-kafka in the test env (#14173)

* Refactor the confluent kafka client (#14158)

* Add a tls e2e env and implement it (#14137)

* Add a kerberos e2e env and implement it (#14120)

* Add a krb5 config file to run the tests locally (#14251)

* Implement OAuth config (#14247)

* Implement OAuth config

* Remove commented out code

* Remove tuple

* Fix style

* Drop the legacy client (#14243)

* Drop the legacy client

* Fix tests and style

---------

Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>

* Fix style

* Apply suggestions

* Make try-except smaller

* Change asserts into config errors

* Add back disable e2e for kerberos

* Remove licenses for removed dependencies

---------

Co-authored-by: Andrew Zhang <andrew.zhang@datadoghq.com>
Co-authored-by: Florent Clarret <florent.clarret@datadoghq.com>
Co-authored-by: Ofek Lev <ofekmeister@gmail.com>
github-actions bot pushed a commit that referenced this pull request Apr 14, 2023
a41ad12