fixes #82 update the kafka-producer.yml and kafka-consumer.yml based … #83

Merged (1 commit, Aug 23, 2021)
kafka-common/src/test/resources/config/kafka-consumer.yml (2 changes: 1 addition & 1 deletion)

@@ -26,7 +26,7 @@ properties:
  # basic.auth.credentials.source: ${kafka-consumer.basic.auth.credentials.source:USER_INFO}
  # max fetch size from the Kafka cluster. The default 50MB is too big for cache consumption on the sidecar
  fetch.max.bytes: ${kafka-consumer.fetch.max.bytes:102400}
- # max pol records default is 500. Adjust it based on the size of the records to make sure each poll
+ # max poll records default is 500. Adjust it based on the size of the records to make sure each poll
  # is similar to requestMaxBytes down below.
  max.poll.records: ${kafka-consumer.max.poll.records:100}
  # The maximum amount of data per-partition the server will return. Records are fetched in batches by the consumer.
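The consumer comments above suggest keeping each poll roughly in line with the fetch size limit. As a rough illustration (not part of the PR, and the average record size is an assumption you would have to measure for your topic), the number of records returned by one poll is bounded both by `max.poll.records` and by how many average-sized records fit into `fetch.max.bytes`:

```python
# Rough sizing sketch for the consumer settings above (illustrative only).

def effective_records_per_poll(fetch_max_bytes: int,
                               max_poll_records: int,
                               avg_record_bytes: int) -> int:
    """Upper bound on records returned by a single poll()."""
    # The broker stops adding data once fetch.max.bytes is reached,
    # and the client additionally caps the batch at max.poll.records.
    by_bytes = fetch_max_bytes // avg_record_bytes
    return max(1, min(max_poll_records, by_bytes))

# With the values in this file: fetch.max.bytes=102400, max.poll.records=100.
# If records average ~1 KB, both limits line up at about 100 records per poll.
print(effective_records_per_poll(102400, 100, 1024))  # 100
```

If your records are much larger than 1 KB, the byte limit dominates and `max.poll.records` has no effect; if they are much smaller, lowering `fetch.max.bytes` further is what actually reduces sidecar memory pressure.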
kafka-common/src/test/resources/config/kafka-producer.yml (8 changes: 5 additions & 3 deletions)

@@ -39,20 +39,22 @@ properties:
  # basic authentication user:pass for the schema registry
  # basic.auth.user.info: ${kafka-producer.username:username}:${kafka-producer.password:password}
  # basic.auth.credentials.source: ${kafka-producer.basic.auth.credentials.source:USER_INFO}
+ # If the message to produce is bigger than 1 MB, increase this value.
+ max.message.size: ${kafka-producer.max.message.size:1048576}

  # The default topic for the producer. Only certain producer implementations will use it.
  topic: ${kafka-producer.topic:portal-event}
  # Default key format if there is no schema for the topic key
- keyFormat: string
+ keyFormat: ${kafka-producer.keyFormat:jsonschema}
  # Default value format if there is no schema for the topic value
- valueFormat: string
+ valueFormat: ${kafka-producer.valueFormat:jsonschema}
  # If OpenTracing is enabled. traceability, correlation and metrics should not be in the chain if OpenTracing is used.
  injectOpenTracing: ${kafka-producer.injectOpenTracing:false}
  # Inject serviceId as callerId into the HTTP header for metrics to collect the caller. The serviceId is from server.yml
  injectCallerId: ${kafka-producer.injectCallerId:false}
  # Indicator of whether the audit is enabled.
  auditEnabled: ${kafka-producer.auditEnabled:true}
  # Audit log destination: topic or logfile. Default to topic
- auditTarget: ${kafka-producer.auditTarget:topic}
+ auditTarget: ${kafka-producer.auditTarget:logfile}
  # The consumer audit topic name if the auditTarget is topic
  auditTopic: ${kafka-producer.auditTopic:sidecar-audit}
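The `${key:default}` syntax used throughout both files is resolved by the framework's config module at startup: the value before the colon is looked up in the external override sources, and the value after the colon is used as the fallback. As a minimal sketch (the `resolve` helper and the dict-based override source are illustrative assumptions, not the actual implementation):

```python
import re

# Minimal sketch of ${key:default} placeholder resolution as used in the
# YAML files above. The real resolution is handled by the framework's
# config module; overrides are simulated here with a plain dict.
PLACEHOLDER = re.compile(r"\$\{([^:}]+):([^}]*)\}")

def resolve(value: str, overrides: dict) -> str:
    """Replace each ${key:default} with overrides[key], else the default."""
    def repl(match):
        key, default = match.group(1), match.group(2)
        return str(overrides.get(key, default))
    return PLACEHOLDER.sub(repl, value)

# An override switches auditTarget back to "topic" without touching the file.
print(resolve("${kafka-producer.auditTarget:logfile}",
              {"kafka-producer.auditTarget": "topic"}))  # topic
# With no override, the default after the colon wins.
print(resolve("${kafka-producer.auditTopic:sidecar-audit}", {}))  # sidecar-audit
```

This is why the PR's change of the shipped defaults (for example `auditTarget` from `topic` to `logfile`) is low-risk for deployments that already set these keys externally: their overrides continue to take precedence.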