
org.sourcelab.kafka.webview.ui.controller.api.exceptions.ApiException: Failed to construct kafka consumer #114

Closed
cezarywisniewski opened this issue Dec 20, 2018 · 17 comments


@cezarywisniewski

I have followed the instructions in issue #81 to create a new message format, as per the screenshot below.
[screenshot]
But when trying to view the topic, I get a "Failed to construct kafka consumer" message.
The view works to some extent when I use the String format instead of Avro, but then obviously the messages are unreadable.

```
2018-12-20 23:17:22.264 INFO 9 --- [nio-8080-exec-5] o.a.kafka.common.utils.AppInfoParser : Kafka version : 1.1.1
2018-12-20 23:17:22.264 INFO 9 --- [nio-8080-exec-5] o.a.kafka.common.utils.AppInfoParser : Kafka commitId : 98b6346a977495f6
2018-12-20 23:17:22.399 WARN 9 --- [nio-8080-exec-9] .m.m.a.ExceptionHandlerExceptionResolver : Resolved [org.sourcelab.kafka.webview.ui.controller.api.exceptions.ApiException: Failed to construct kafka consumer]
```

@Crim
Collaborator

Crim commented Dec 21, 2018

Hey! Are you able to send me the JAR so I can test locally, or even better fork the examples project and commit your changes there for review?

Thanks!

@Crim
Collaborator

Crim commented Jan 2, 2019

@cezarywisniewski When I build your JAR and upload it to my local Kafka WebView instance using the same configuration you've defined above (only entering a `schema.registry.url` property), I end up getting this underlying error:

```
Invalid value io.confluent.kafka.serializers.subject.TopicNameStrategy for configuration key.subject.name.strategy: Class io.confluent.kafka.serializers.subject.TopicNameStrategy could not be found.
```

Since I'm not super familiar with the Schema Registry product, I'll need to do some additional research to understand what exactly is going on.

I also need to better expose the real underlying exception cause in these scenarios.
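
For context, `key.subject.name.strategy` is one of the Confluent deserializer's own config settings, and `TopicNameStrategy` is its default value. A rough, illustrative sketch of the consumer properties in play (all values here are assumptions, not Kafka WebView's actual code):

```java
import java.util.Properties;

public class ExampleConsumerConfig {
    // Illustrative only: roughly the consumer config involved. The Confluent
    // deserializer defaults key.subject.name.strategy to
    // io.confluent.kafka.serializers.subject.TopicNameStrategy, so consumer
    // construction fails when the active classloader cannot find that class.
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");           // assumed
        props.put("schema.registry.url", "http://localhost:8081");  // assumed
        props.put("key.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        return props;
    }
}
```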

@Crim
Collaborator

Crim commented Jan 8, 2019

So I've narrowed this down to a classloader issue. The classloader Kafka WebView uses to load the deserializer from the uploaded JAR is not the same classloader that the deserializer attempts to load its own classes from.

The AvroDeserializer uses Thread.currentThread().getContextClassLoader(), which references a different classloader.

I'm still working through the best possible solution to this issue.
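
One possible direction, sketched below under the assumption that the fix lives where the deserializer is invoked (names here are illustrative, not actual Kafka WebView code): temporarily swap the thread's context classloader to the uploaded JAR's classloader around each call.

```java
import org.apache.kafka.common.serialization.Deserializer;

// Sketch of one possible workaround: run the uploaded deserializer with its
// own JAR's classloader installed as the thread context classloader, then
// restore the previous one afterwards.
public final class ContextClassLoaderSwap {
    public static <T> T deserializeWithPluginClassLoader(
            Deserializer<T> deserializer, String topic, byte[] payload) {
        ClassLoader pluginLoader = deserializer.getClass().getClassLoader();
        ClassLoader previous = Thread.currentThread().getContextClassLoader();
        try {
            Thread.currentThread().setContextClassLoader(pluginLoader);
            return deserializer.deserialize(topic, payload);
        } finally {
            Thread.currentThread().setContextClassLoader(previous);
        }
    }
}
```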

@cezarywisniewski
Author

cezarywisniewski commented Jan 8, 2019 via email

@Crim
Collaborator

Crim commented Jan 8, 2019

@cezarywisniewski So after poking around a bit more in Confluent's source code, it looks like their use of the context classloader instead of the current classloader was a bug, and it was resolved around version ~5.x.

See this PR here

Are you able to try version 5.1.0 of their deserializer package? When I build and upload a JAR using version 5.1.0, I seem to get further along in the consuming process. Unfortunately, at the moment I don't have a copy of Schema Registry set up anywhere to completely validate that the version works as expected end to end.

Let me know if you're able to test with version 5.1.0 and whether things work, or whether you run into other errors.

Thanks

@cezarywisniewski
Author

It progressed a bit further but at some stage ran into this error:
`Error Could not write JSON: Not a map: {"type":"record","name":.....` (rest of the schema follows)
When I set the formats to String, String, the messages are displayed. It's just that portions of the data are not human readable, which would mean the format is correct.

@Crim
Collaborator

Crim commented Jan 8, 2019

Thanks for the update. I'll stand up a schema registry service today and see if I can replicate it as well.

@Crim
Collaborator

Crim commented Jan 9, 2019

I've made some progress locally, but it's still not in an ideal state. It looks like it's just serializing the record as a flat string instead of JSON. A little more playing and hopefully I'll get it sorted out :)

[screenshot]

@Crim
Collaborator

Crim commented Jan 9, 2019

I think I've put together a solution.

[screenshot]

@Crim
Collaborator

Crim commented Jan 9, 2019

I'm debating whether I should just include/package Confluent's schema registry deserializer with Kafka WebView. I need to check how it's licensed.

@Crim
Collaborator

Crim commented Jan 9, 2019

I've published a pre-release package which will (hopefully) resolve the issue for you, here: https://github.com/SourceLabOrg/kafka-webview/releases/tag/v2.1.2-RC1

You should be able to download and extract the release and overwrite your existing install's .jar file. All of your existing settings should carry over.

Let me know whether you're able to validate this works end to end for you, and if so I'll cut a proper release.

Thanks for your help!
Stephen

@cezarywisniewski
Author

This looks good. I can see the data now. The only problem (and I am not sure if you can fix it; it could be a bug or limitation in the Avro deserialiser) is that data encoded as the BYTES type is not converted to its logical type (decimal), so it is not readable.
[screenshot]

@Crim
Collaborator

Crim commented Jan 9, 2019

Are you able to share a condensed snippet of the Avro schema that I could import locally?

@cezarywisniewski
Author

Here is a condensed Avro schema. I have removed some info and changed the field names.
```json
{
  "type": "record",
  "name": "Future",
  "namespace": "au.com.xxx.models",
  "fields": [
    { "name": "index", "type": "int" },
    { "name": "description", "type": "string" },
    {
      "name": "inv_return",
      "type": [
        {
          "type": "record",
          "name": "Dollars",
          "fields": [
            {
              "name": "dollars",
              "type": { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 2 }
            }
          ]
        },
        {
          "type": "record",
          "name": "LinePay",
          "fields": [
            {
              "name": "money",
              "type": { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 2 }
            }
          ]
        }
      ]
    },
    { "name": "place_return", "type": ["Dollars", "LinePay"] },
    {
      "name": "take",
      "type": { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 2 }
    },
    {
      "name": "liability",
      "type": { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 2 }
    },
    {
      "name": "laid_off_liability",
      "type": { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 2 }
    },
    {
      "name": "limit",
      "type": { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 2 }
    },
    { "name": "status", "type": "string" },
    { "name": "official", "type": "boolean" },
    {
      "name": "last_tkt_time",
      "type": ["null", { "type": "long", "logicalType": "timestamp-millis" }]
    },
    {
      "name": "last_take",
      "type": { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 2 }
    },
    { "name": "sort_order", "type": "int" },
    { "name": "deleted", "type": "boolean" },
    { "name": "override_reduction_percent", "type": "int" },
    { "name": "diff_price", "type": "boolean" },
    {
      "name": "session_take",
      "type": { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 2 }
    },
    {
      "name": "session_liability",
      "type": { "type": "bytes", "logicalType": "decimal", "precision": 20, "scale": 2 }
    },
    { "name": "feed_id", "type": "string" },
    { "name": "value_type_str", "type": "string" },
    { "name": "group_id", "type": "string" },
    {
      "name": "last_official_time",
      "type": ["null", { "type": "long", "logicalType": "timestamp-millis" }]
    },
    { "name": "future_id", "type": "int" },
    { "name": "feed_return", "type": ["Dollars", "LinePay", "Odds"] }
  ]
}
```

@Crim
Collaborator

Crim commented Jan 10, 2019

Do you use a particular library for populating your logical data types when generating the Avro messages? It looks like the generated class files only deal with ByteBuffers for logical types using the standard Avro 1.8.2 Java library, unless I'm missing something. The Avro docs seem to indicate logical types are for extending custom or non-native data types, so unfortunately there may not be much I can add to generically support them :/

Schema file:

{"namespace": "io.confluent.examples.clients.basicavro",
 "type": "record",
 "name": "Payment",
 "fields": [
     {"name": "id", "type": "string"},
     {"name": "amount", "type": "double"},
     {
         "name": "take",
         "type": {
             "type": "bytes",
             "logicalType": "decimal",
             "precision": 20,
             "scale": 2
         }
     }
 ]
}

[screenshot]
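
For what it's worth, the standard Avro library does ship a conversion for decoding these ByteBuffers by hand. A minimal sketch (class and method names are hypothetical; the field name `take` and the precision/scale come from the schema above):

```java
import java.math.BigDecimal;
import java.nio.ByteBuffer;
import org.apache.avro.Conversions;
import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;

public class DecimalFieldDecoder {
    // Decode one decimal logical-type field (precision 20, scale 2) from the
    // raw ByteBuffer that the Avro 1.8.2 generic API hands back.
    public static BigDecimal decodeTake(GenericRecord record) {
        Schema bytesSchema = Schema.create(Schema.Type.BYTES);
        LogicalTypes.decimal(20, 2).addToSchema(bytesSchema);
        ByteBuffer raw = (ByteBuffer) record.get("take"); // unscaled big-endian bytes
        return new Conversions.DecimalConversion()
            .fromBytes(raw, bytesSchema, bytesSchema.getLogicalType());
    }
}
```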

@cezarywisniewski
Author

You are right. I do not think you can improve it any more, given the limitations of the Avro library.
Anyway, thanks for your help.

@Crim
Collaborator

Crim commented Jan 11, 2019

Yeah, you may be able to do better by adding a wrapper around the Avro deserializer and handling deserialization of those types yourself; see the sketch below.
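
A rough sketch of that wrapper idea (class name and structure are illustrative; union and nested-record fields are omitted for brevity):

```java
import java.math.BigDecimal;
import java.nio.ByteBuffer;
import java.util.Map;
import org.apache.avro.Conversions;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.common.serialization.Deserializer;

// Delegates to an underlying Avro deserializer (e.g. Confluent's
// KafkaAvroDeserializer) and rewrites top-level decimal logical-type fields
// from raw ByteBuffers into BigDecimal so they render as readable values.
public class DecimalAwareAvroDeserializer implements Deserializer<Object> {
    private final Deserializer<Object> inner;
    private final Conversions.DecimalConversion conversion = new Conversions.DecimalConversion();

    public DecimalAwareAvroDeserializer(Deserializer<Object> inner) {
        this.inner = inner;
    }

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        inner.configure(configs, isKey);
    }

    @Override
    public Object deserialize(String topic, byte[] data) {
        Object result = inner.deserialize(topic, data);
        if (result instanceof GenericRecord) {
            convertDecimals((GenericRecord) result);
        }
        return result;
    }

    // Scan top-level fields only; unions and nested records are left as-is.
    private void convertDecimals(GenericRecord record) {
        for (Schema.Field field : record.getSchema().getFields()) {
            Schema fieldSchema = field.schema();
            Object value = record.get(field.pos());
            if (value instanceof ByteBuffer
                    && fieldSchema.getLogicalType() != null
                    && "decimal".equals(fieldSchema.getLogicalType().getName())) {
                BigDecimal decoded = conversion.fromBytes(
                        (ByteBuffer) value, fieldSchema, fieldSchema.getLogicalType());
                record.put(field.pos(), decoded);
            }
        }
    }

    @Override
    public void close() {
        inner.close();
    }
}
```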

Anyhow, thanks for your continued help! If you have other issues or ideas, feel free to post up an issue!

@Crim Crim closed this as completed Jan 19, 2019