I added this setting to my consumer config, and now it works fine:
receive.buffer.bytes=1048576
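In case it helps anyone hitting the same thing, here is a minimal sketch (in Java) of building the consumer with the configs from this thread plus the receive.buffer.bytes fix; the broker address and group id are placeholders, not values from the original mail:

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.util.Properties;

public class CrossDcConsumerConfig {
    static KafkaConsumer<byte[], byte[]> buildConsumer() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker.remote-dc.example:9092"); // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "cross-dc-reader");                        // placeholder
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");
        props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, "4000");
        props.put(ConsumerConfig.MAX_PARTITION_FETCH_BYTES_CONFIG, "4096000");
        // The fix from this reply: bump the TCP receive buffer (receive.buffer.bytes) to 1 MB
        props.put(ConsumerConfig.RECEIVE_BUFFER_CONFIG, "1048576");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        return new KafkaConsumer<>(props);
    }
}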
On Wed, Nov 13, 2019 at 10:41 AM Upendra Yadav <upendra1024@gmail.com>
wrote:
> Hi,
>
> I am using the consumer assign method and polling with a 15000 ms timeout to
> consume single-partition data from another DC.
>
> Below are my consumer configs:
> enable.auto.commit=false
> max.poll.records=4000
> max.partition.fetch.bytes=4096000
> key.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
> value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
>
> With this, my consumer works fine. But when I change
> max.partition.fetch.bytes to 16384000, my consumer does not receive any
> messages, and there is no exception. Since I'm using consumer assign, do I
> need to tune the properties below?
> fetch.max.bytes
> session.timeout.ms
> heartbeat.interval.ms
> Please let me know if I'm missing something.
>
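For completeness, a sketch of the assign-and-poll pattern described in the question above: the topic name and partition number are placeholders, the consumer is assumed to be built with the properties shown earlier in the thread, and offsets are committed manually since enable.auto.commit=false.

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

import java.time.Duration;
import java.util.Collections;

public class AssignPollLoop {
    static void consumeOnePartition(KafkaConsumer<byte[], byte[]> consumer) {
        // Assign a single partition directly (no consumer-group rebalancing)
        TopicPartition tp = new TopicPartition("my-topic", 0); // placeholder topic/partition
        consumer.assign(Collections.singletonList(tp));

        while (true) {
            // 15000 ms poll timeout, as described in the question
            ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofMillis(15000));
            for (ConsumerRecord<byte[], byte[]> record : records) {
                // process record.key() / record.value() here
            }
            // enable.auto.commit=false, so commit offsets explicitly
            consumer.commitSync();
        }
    }
}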