Can we use the schema embedded in the Avro record to deserialize the value?

We are using Avro for our Kafka topic records, and they need to be deserialized in the Kafka Connect sink we have for Elasticsearch. We don't have a schema registry at the moment, so we are trying to use registryless-avro-converter to read the records in the connector. I see that this converter also requires a schema to be passed in as input in order to read the record. But I also learned that the schema is embedded in the Avro record when it is serialized.

So I am trying to understand whether there is any way we can use this embedded schema to deserialize the value.

Since this schema will always be the one used for serializing, the deserializer on the consumer side doesn't need to worry about maintaining the latest schema (the overhead of carrying the schema with each record is not a concern for us at the moment). My knowledge here is at a beginner level, so I am trying to understand the basics. I understand JsonConverter can achieve this, but I was curious whether it can be done with Avro in some manner.
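For intuition on where the embedded schema lives: Avro's Object Container File format stores the writer schema as JSON in a metadata map at the start of the file, so a reader can recover it from the bytes alone (note that this applies to Avro's file format; a bare per-record datum encoding does not carry the schema). Below is a minimal, standard-library-only sketch of that header layout; real code would use the avro or fastavro libraries rather than hand-rolling this.

```python
import json

MAGIC = b"Obj\x01"  # Avro Object Container File magic bytes

def zigzag_encode(n: int) -> bytes:
    """Encode an int as an Avro zig-zag varint."""
    z = (n << 1) ^ (n >> 63)
    out = bytearray()
    while True:
        b = z & 0x7F
        z >>= 7
        if z:
            out.append(b | 0x80)
        else:
            out.append(b)
            return bytes(out)

def zigzag_decode(buf: bytes, pos: int):
    """Decode an Avro zig-zag varint; return (value, new_pos)."""
    shift = acc = 0
    while True:
        b = buf[pos]
        pos += 1
        acc |= (b & 0x7F) << shift
        if not (b & 0x80):
            break
        shift += 7
    return (acc >> 1) ^ -(acc & 1), pos

def build_header(schema: dict) -> bytes:
    """Build a container-file header embedding the writer schema."""
    meta = {"avro.schema": json.dumps(schema).encode(),
            "avro.codec": b"null"}
    out = bytearray(MAGIC)
    out += zigzag_encode(len(meta))          # map block: entry count
    for key, value in meta.items():
        kb = key.encode()
        out += zigzag_encode(len(kb)) + kb   # key: string
        out += zigzag_encode(len(value)) + value  # value: bytes
    out += zigzag_encode(0)                  # end of metadata map
    out += b"\x00" * 16                      # 16-byte sync marker
    return bytes(out)

def read_embedded_schema(data: bytes) -> dict:
    """Recover the writer schema from the header, with no schema supplied."""
    assert data[:4] == MAGIC, "not an Avro container file"
    pos, meta = 4, {}
    count, pos = zigzag_decode(data, pos)
    while count != 0:
        for _ in range(count):
            klen, pos = zigzag_decode(data, pos)
            key = data[pos:pos + klen].decode()
            pos += klen
            vlen, pos = zigzag_decode(data, pos)
            meta[key] = data[pos:pos + vlen]
            pos += vlen
        count, pos = zigzag_decode(data, pos)
    return json.loads(meta["avro.schema"])
```

This is exactly what "the schema is embedded" means in practice: the deserializer reads the `avro.schema` metadata entry first and then uses it to decode the data blocks that follow.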



Solution 1:[1]

According to your link, providing schema.path is optional:

To use the RegistrylessAvroConverter, simply provide it in the key.converter or value.converter setting for your connector. RAC can run with or without an explicit reader or writer schema. If an explicit schema is not provided, the schema used will be determined at runtime.

N.B. Schemas determined at runtime could vary depending on how your connector is implemented and how it generates Connect Data Schemas. They recommend understanding the semantics of your Connectors before using the schemaless configuration for sources.
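As a concrete illustration, a sink connector could then be configured without an explicit schema file. This is a sketch based on the converter's README; verify the class name and property key against the RAC version you are running:

```properties
# use RAC for record values; no schema.path, so the schema is determined at runtime
value.converter=me.frmr.kafka.connect.RegistrylessAvroConverter
# optional: pin an explicit reader schema instead of relying on runtime detection
# value.converter.schema.path=/path/to/schema.avsc
```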

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Ran Lupovich