I was able to figure it out. I'm running this in Docker Swarm, so my setup deviates from install.sh, the officially supported method from Sentry. Basically, I cross-compare what changed between the last tag and the latest tag and slowly update everything in Swarm (I can document this for others).
I went to the main Sentry GitHub repository and found this issue: https://github.com/getsentry/sentry/issues/20435, which made me realize I was missing both of these entries in the stack file:
```yaml
# Kafka consumer responsible for feeding session data into ClickHouse
snuba-sessions-consumer:
  <<: *snuba_defaults
  command: consumer --storage sessions_raw --auto-offset-reset=latest --max-batch-time-ms 750

# Kafka consumer responsible for feeding transactions data into ClickHouse
snuba-transactions-consumer:
  <<: *snuba_defaults
  command: consumer --storage transactions --consumer-group transactions_group --auto-offset-reset=latest --max-batch-time-ms 750
```
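Both services merge in a shared `snuba_defaults` YAML anchor. For context, here is a minimal sketch of how that anchor is typically defined in the self-hosted compose file; the exact image tag, environment variables, and depends_on entries are assumptions, so check them against your own stack file:

```yaml
# Extension field defining defaults shared by all snuba services.
x-snuba-defaults: &snuba_defaults
  depends_on:
    - redis
    - clickhouse
    - kafka
  image: getsentry/snuba:latest   # pin to your Sentry version in practice
  environment:
    SNUBA_SETTINGS: docker
    CLICKHOUSE_HOST: clickhouse
    DEFAULT_BROKERS: "kafka:9092"
    REDIS_HOST: redis
```

The `<<: *snuba_defaults` line in each consumer merges these keys in, so each service only needs to override `command`.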
Once I added them, I monitored snuba-transactions-consumer and saw:
```
2020-09-30 22:18:40,616 New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 106213}
2020-09-30 22:18:46,167 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:18:51,321 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:18:52,933 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:19:25,151 Completed processing <Batch: 2 messages, open for 1.15 seconds>.
2020-09-30 22:19:46,260 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:19:48,863 Completed processing <Batch: 2 messages, open for 1.01 seconds>.
2020-09-30 22:20:25,886 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:20:29,612 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:20:32,689 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:21:09,783 Completed processing <Batch: 2 messages, open for 1.00 seconds>.
2020-09-30 22:21:13,149 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:21:46,319 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:21:53,003 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:21:59,397 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:22:31,465 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:22:54,054 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:23:04,539 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:23:34,027 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:23:50,282 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:24:09,427 Completed processing <Batch: 2 messages, open for 1.01 seconds>.
```
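With the services deployed, a quick sanity check is to count the "Completed processing" lines in the consumer's output. A minimal sketch; the service name `sentry_snuba-transactions-consumer` is an assumption based on a stack deployed as `sentry`, and the inline sample stands in for real log output so the check itself is self-contained:

```shell
# In Swarm, real logs would come from something like:
#   docker service logs sentry_snuba-transactions-consumer
# (service name is an assumption -- check `docker service ls` for yours).
# Inline sample used here in place of live log output:
logs='2020-09-30 22:18:46,167 Completed processing <Batch: 1 message, open for 1.00 seconds>.
2020-09-30 22:19:25,151 Completed processing <Batch: 2 messages, open for 1.15 seconds>.'

# Count processed batches: a steadily growing count means the consumer is alive.
printf '%s\n' "$logs" | grep -c 'Completed processing'   # prints 2
```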
I then waited a bit, generated another transaction, and now have performance data.
Note that the old performance data never came back. It seems that without these Snuba consumers running, the incoming data is discarded, which is fine for my purposes.