Issue with displaying events

Hello!
I have a problem with events: they aren’t displayed.
Background: after upgrading Sentry from 10.x to 20.x, it works and displays new events for 3-4 hours, but after that it stops showing them.
I use this script for testing:

	import sentry_sdk
	from raven import Client  # legacy SDK, used here alongside the new one for testing

	# New unified SDK client
	sentry_sdk.init(
	    'http://56d84af456444fefa2cf2e70f4580a42@sentry.dev.sp.corp/1',
	    max_breadcrumbs=50,
	    debug=True,
	)

	# Legacy raven client pointed at the same DSN
	client = Client(
	    'http://56d84af456444fefa2cf2e70f4580a42@sentry.dev.sp.corp/1')

	try:
	    1 / 0
	except ZeroDivisionError:
	    client.captureException()

And I can see that the event is processed and lands in the ‘ingest-events’ Kafka topic:

	root@ee8a4cdf19ca:/# kafka-console-consumer --bootstrap-server localhost:9092 --topic ingest-events --offset 27489 --partition 0 | grep "ZeroDivisionError"


	typeeventpayload��"event_id":"65fc03ce45c8443ca566c6e549ab1d53","level":"error","version":"6","type":"error","logentry":{"formatted":"ZeroDivisionError: division by zero"},"logger":"","modules":{"python":"3.8.5"},"platform":"python","timestamp":1597152869.0,"received":1597152869.593785,"exception":{"values":[{"type":"ZeroDivisionError","value":"division by zero","module":"builtins","stacktrace":{"frames":[{"function":"<module>","module":"__main__","filename":"test.py","abs_path":"test.py","lineno":14,"pre_context":["","client = Client(","    'http://56d84af456444fefa2cf2e70f4580a42@sentry.dev.sp.corp/1')","","try:"],"context_line":"    1 / 0","post_context":["except ZeroDivisionError:","    client.captureException()"],"vars":{"Client":"<class 'raven.base.Client'>","__annotations__":{},"__builtins__":"<module 'builtins' (built-in)>","__cached__":null,"__doc__":null,"__file__":"'test.py'","__loader__":"<_frozen_importlib_external.SourceFileLoader object at 0x7f77bb912eb0>","__name__":"'__main__'","__package__":null,"__spec__":null,"client":"<raven.base.Client object at 0x7f77ba369910>","sentry_sdk":"<module 'sentry_sdk' from '/usr/lib/python3.8/site-packages/sentry_sdk/__init__.py'>"}}]}}]},"tags":[["server_name","yakovlev-pc"]],"extra":{"sys.argv":["'test.py'"]},"sdk":{"name":"raven-python","version":"6.10.0"},"key_id":"1","project":1,"grouping_config":{"enhancements":"eJybzDhxY3J-bm5-npWRgaGlroGxrpHxBABcTQcY","id":"newstyle:2019-10-29"},"_metrics":{"bytes.ingested.event":1362}}start_time�2eevent_id�65fc03ce45c8443ca566c6e549ab1d53project_idremote_addr172.31.21.1attachments
	^CProcessed a total of 1626 messages
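For reference, the message above wraps a JSON event payload inside a msgpack envelope. A minimal stdlib-only sketch of how one might verify the exception type programmatically, with a trimmed copy of the JSON body pasted in by hand (the surrounding msgpack framing bytes are stripped for this sketch):

```python
import json

# A trimmed copy of the JSON body embedded in the ingest-events message above.
raw = '''{"event_id": "65fc03ce45c8443ca566c6e549ab1d53",
          "level": "error",
          "platform": "python",
          "exception": {"values": [{"type": "ZeroDivisionError",
                                    "value": "division by zero"}]}}'''

event = json.loads(raw)
exc = event["exception"]["values"][0]
print(exc["type"], "-", exc["value"])  # → ZeroDivisionError - division by zero
```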

But I can’t see it in the web UI, and I also noticed that the ‘events’ Kafka topic is empty:

	root@ee8a4cdf19ca:/# kafka-run-class kafka.tools.GetOffsetShell --broker-list localhost:9092 --topic events
	events:0:0
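GetOffsetShell prints one line per partition in the form `topic:partition:latest-offset`, so `events:0:0` means partition 0 of `events` has never received a message. A small stdlib sketch of parsing that output to flag empty topics (the sample line is the one above):

```python
def parse_offsets(line: str):
    # GetOffsetShell output format: <topic>:<partition>:<latest offset>
    topic, partition, offset = line.strip().rsplit(":", 2)
    return topic, int(partition), int(offset)

topic, partition, offset = parse_offsets("events:0:0")
if offset == 0:
    print(f"topic {topic!r} partition {partition} is empty")
    # → topic 'events' partition 0 is empty
```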

Can you help us fix this issue? What can we do about it?

Can you share your logs from snuba-consumer, post-process-forwarder, and relay?

Hello!
snuba-consumer:

	|13:22:53|sentry-01:[sentry]# docker-compose logs snuba-consumer
	Attaching to sentry_onpremise_snuba-consumer_1
	snuba-consumer_1               | + '[' c = - ']'
	snuba-consumer_1               | + snuba consumer --help
	snuba-consumer_1               | + set -- snuba consumer --storage events --auto-offset-reset=latest --max-batch-time-ms 750
	snuba-consumer_1               | + set gosu snuba snuba consumer --storage events --auto-offset-reset=latest --max-batch-time-ms 750
	snuba-consumer_1               | + exec gosu snuba snuba consumer --storage events --auto-offset-reset=latest --max-batch-time-ms 750
	snuba-consumer_1               | 2020-08-11 12:56:38,075 New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 0}
	snuba-consumer_1               | 2020-08-11 12:56:52,298 Partitions revoked: [Partition(topic=Topic(name='events'), index=0)]
	snuba-consumer_1               | + '[' c = - ']'
	snuba-consumer_1               | + snuba consumer --help
	snuba-consumer_1               | + set -- snuba consumer --storage events --auto-offset-reset=latest --max-batch-time-ms 750
	snuba-consumer_1               | + set gosu snuba snuba consumer --storage events --auto-offset-reset=latest --max-batch-time-ms 750
	snuba-consumer_1               | + exec gosu snuba snuba consumer --storage events --auto-offset-reset=latest --max-batch-time-ms 750
	snuba-consumer_1               | %3|1597150706.369|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.31.21.10:9092 failed: Connection refused (after 2ms in state CONNECT)
	snuba-consumer_1               | %3|1597150706.370|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.31.21.10:9092 failed: Connection refused (after 0ms in state CONNECT)
	snuba-consumer_1               | %3|1597150707.368|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.31.21.10:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
	snuba-consumer_1               | %3|1597150707.370|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.31.21.10:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
	snuba-consumer_1               | 2020-08-11 12:58:37,215 New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 0}
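The broker address and failure reason can be pulled out of those rdkafka `FAIL` lines mechanically; a small stdlib sketch parsing one of the log lines above:

```python
import re

# One of the rdkafka FAIL lines from the snuba-consumer log above.
line = ("%3|1597150706.369|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: "
        "kafka:9092/bootstrap: Connect to ipv4#172.31.21.10:9092 failed: "
        "Connection refused (after 2 ms in state CONNECT)")

m = re.search(r"Connect to ipv4#([\d.]+):(\d+) failed: ([^(]+)", line)
if m:
    host, port, reason = m.group(1), int(m.group(2)), m.group(3).strip()
    print(f"broker {host}:{port} unreachable: {reason}")
    # → broker 172.31.21.10:9092 unreachable: Connection refused
```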

post-process-forwarder:

	|13:23:08|sentry-01:[sentry]# docker-compose logs post-process-forwarder
	Attaching to sentry_onpremise_post-process-forwarder_1
	post-process-forwarder_1       | 12:58:34 [WARNING] sentry.utils.geo: settings.GEOIP_PATH_MMDB not configured.
	post-process-forwarder_1       | /usr/local/lib/python2.7/site-packages/cryptography/__init__.py:39: CryptographyDeprecationWarning: Python 2 is no longer supported by the Python core team. Support for it is now deprecated in cryptography, and will be removed in a future release.
	post-process-forwarder_1       |   CryptographyDeprecationWarning,
	post-process-forwarder_1       | 12:58:42 [INFO] sentry.plugins.github: apps-not-configured

relay:

	|13:26:45|sentry-01:[sentry]# docker-compose logs --tail=10 relay
	Attaching to sentry_onpremise_relay_1
	relay_1                        | 2020-08-11T12:59:09Z [relay_server::actors::events] ERROR: error processing event: failed to resolve project information
	relay_1                        |   caused by: failed to fetch project state from upstream
	relay_1                        | 2020-08-11T12:59:09Z [relay_server::actors::events] ERROR: error processing event: failed to resolve project information
	relay_1                        |   caused by: failed to fetch project state from upstream
	relay_1                        | 2020-08-11T12:59:09Z [relay_server::actors::events] ERROR: error processing event: failed to resolve project information
	relay_1                        |   caused by: failed to fetch project state from upstream
	relay_1                        | 2020-08-11T12:59:09Z [relay_server::actors::events] ERROR: error processing event: failed to resolve project information
	relay_1                        |   caused by: failed to fetch project state from upstream
	relay_1                        | 2020-08-11T12:59:09Z [relay_server::actors::events] ERROR: error processing event: failed to resolve project information
	relay_1                        |   caused by: failed to fetch project state from upstream

The timestamp 2020-08-11T12:59 corresponds to the last restart.
I also ran relay in debug mode, and everything looked fine there.

Right now I’ve downgraded from the latest version to 20.6.0, and everything started working again!

@selfuryon this is quite peculiar. I still suspect a networking issue, as the relay logs suggest it cannot communicate with web, and the snuba-consumer logs suggest it cannot talk to Kafka (post-process-forwarder seems fine).
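One quick way to confirm or rule out the networking theory is a plain TCP probe from inside each failing container toward the service it cannot reach, e.g. `kafka:9092` for snuba-consumer. A minimal stdlib sketch (the host and port are taken from the logs above; in a docker-compose setup this would be run from inside the failing container, where those names resolve):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections, timeouts, and DNS failures
        return False

# Example: probe the broker that snuba-consumer failed to reach.
print("kafka:9092 reachable:", can_connect("kafka", 9092))
```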

That said, we’ll be looking into one potential issue, so stay tuned.
