Sentry-Native and Kafka MessageSizeTooLarge Error

Hi, thanks for a great product.

I'm trying to integrate crash reporting on our intranet, using the sentry-native SDK (master, 0.3, Windows + Crashpad) and an on-premise Sentry.
I used the master branch as-is, except for nginx (modified to use an external nginx), and pretty much copied all the nginx settings.
I noticed that the Crashpad handler works as expected: it generates minidumps and sends them to my Sentry instance with HTTP 200 OK (verified with Wireshark) in every case I tested.

But if my minidump is smaller than a megabyte, everything is OK, while if my minidump is 2 MB, I see

    [relay_server::actors::events] ERROR: error processing event: could not store event
      caused by: failed to send kafka message
      caused by: Message production error: MessageSizeTooLarge (Broker: Message size too large)
and the event is not delivered to the Sentry web UI.
I did not touch KAFKA_MESSAGE_MAX_BYTES or KAFKA_MAX_REQUEST_SIZE, so the 50 MB default should be fine in my case. It could be that Kafka is not picking up these settings, but I'm not a Kafka expert, so I'm not sure how to verify this.

I can send you my modified docker-compose (basically I just switched off nginx and exposed the web and relay ports) if needed. The nginx access log looks fine, and I'm fairly sure this is a Docker internal network issue, but I could be wrong.

Any suggestions as to what could be wrong with my setup?


Just to note: I also tried the latest sentry-native commit with gzip compression enabled, and yes, it now sends gzipped minidumps, but the problem still exists and I get the same error.
Maybe the payload is decompressed inside the Relay service?

Problem solved.
I completely missed the fact that you can also configure the Kafka client from the Relay server.

So I added this line to the Relay config:

- {name: "message.max.bytes", value: 50000000}
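For context, this line belongs in Relay's config file under `processing.kafka_config`. The surrounding keys below are a sketch of a typical on-premise setup (the `bootstrap.servers` value and the file layout are assumptions about the standard docker-compose configuration, not my exact file):

```yaml
# relay/config.yml (sketch)
processing:
  enabled: true
  kafka_config:
    - {name: "bootstrap.servers", value: "kafka:9092"}
    # Allow Relay's Kafka producer to send messages up to 50 MB,
    # matching the broker-side message.max.bytes setting.
    - {name: "message.max.bytes", value: 50000000}
```

Note that the broker and the producer each have their own limit: raising KAFKA_MESSAGE_MAX_BYTES on the broker alone is not enough if Relay's producer still rejects large messages client-side.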

Inspired by this issue comment



And you’ve gone ahead and submitted a PR to fix this for everyone — thanks!