Sentry responds HTTP 200 with event ID but no event on dashboard

It might be IPv6 related. Are you able to apply this patch and try with it: https://github.com/getsentry/onpremise/pull/496/files

How do I do that?

In case it would be enough, I downloaded the docker-compose.yml file from the "View file" menu and then ran docker-compose restart, but that didn’t make the events appear.
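For reference, here is a sketch of applying the linked PR as a patch, assuming the onpremise directory is a git clone (not just a single downloaded docker-compose.yml). Note that `docker-compose restart` does not pick up compose-file changes; `up -d` recreates the containers whose configuration changed.

```shell
cd ~/onpremise

# Fetch the PR in patch form (GitHub serves any PR at <PR URL>.patch)
# and apply it to the working tree.
curl -fsSL https://github.com/getsentry/onpremise/pull/496.patch | git apply

# Recreate the affected containers so the new configuration takes effect.
docker-compose up -d
```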

Here are the most recent logs:

foobar@Sentry:~/onpremise$ docker-compose logs --tail=0 -f
Attaching to sentry_onpremise_nginx_1, sentry_onpremise_worker_1, sentry_onpremise_web_1, sentry_onpremise_cron_1, sentry_onpremise_sentry-cleanup_1, sentry_onpremise_ingest-consumer_1, sentry_onpremise_snuba-cleanup_1, sentry_onpremise_relay_1, sentry_onpremise_snuba-api_1, sentry_onpremise_postgres_1, sentry_onpremise_smtp_1, sentry_onpremise_redis_1, sentry_onpremise_memcached_1, sentry_onpremise_symbolicator_1, sentry_onpremise_zookeeper_1, sentry_onpremise_clickhouse_1, sentry_onpremise_symbolicator-cleanup_1, sentry_onpremise_post-process-forwarder_1, sentry_onpremise_snuba-replacer_1, sentry_onpremise_snuba-outcomes-consumer_1, sentry_onpremise_snuba-consumer_1, sentry_onpremise_kafka_1
nginx_1                    | 172.18.0.1 - - [31/Aug/2020:12:17:31 +0000] "POST /api/3/store/?sentry_key=5c24fe4d25f845918177fc3ba324347d&sentry_version=7 HTTP/1.1" 200 41 "http://dclients.foobar.live/" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1                    | 172.18.0.1 - - [31/Aug/2020:12:17:36 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1                    | 172.18.0.1 - - [31/Aug/2020:12:17:52 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1                    | 172.18.0.1 - - [31/Aug/2020:12:18:11 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1                    | 172.18.0.1 - - [31/Aug/2020:12:18:33 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
redis_1                    | 1:M 31 Aug 2020 12:18:49.045 * 100 changes in 300 seconds. Saving...
redis_1                    | 1:M 31 Aug 2020 12:18:49.046 * Background saving started by pid 12
redis_1                    | 12:C 31 Aug 2020 12:18:49.146 * DB saved on disk
redis_1                    | 12:C 31 Aug 2020 12:18:49.146 * RDB: 6 MB of memory used by copy-on-write
redis_1                    | 1:M 31 Aug 2020 12:18:49.147 * Background saving terminated with success
nginx_1                    | 172.18.0.1 - - [31/Aug/2020:12:18:58 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1                    | 172.18.0.1 - - [31/Aug/2020:12:19:26 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1                    | 172.18.0.1 - - [31/Aug/2020:12:19:57 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
snuba-cleanup_1            | 2020-08-31 12:20:02,475 Dropped 0 partitions on None
nginx_1                    | 172.18.0.1 - - [31/Aug/2020:12:20:32 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
worker_1                   | 12:20:46 [INFO] sentry.tasks.update_user_reports: update_user_reports.records_updated (reports_with_event=0 updated_reports=0 reports_to_update=0)
relay_1                    | 2020-08-31T12:21:02Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)
relay_1                    | 2020-08-31T12:21:02Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)
relay_1                    | 2020-08-31T12:21:03Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)
relay_1                    | 2020-08-31T12:21:03Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)
relay_1                    | 2020-08-31T12:21:03Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)
nginx_1                    | 172.18.0.1 - - [31/Aug/2020:12:21:09 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1                    | 172.18.0.1 - - [31/Aug/2020:12:21:49 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
relay_1                    | 2020-08-31T12:21:55Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)

I think you either have some network congestion issues (and since this is a virtual network, CPU might be the bottleneck) or Kafka itself is having trouble responding which I would also attribute to heavy CPU load.
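A few hedged checks for the `MessageTimedOut` errors above, using the service names shown in the `docker-compose logs` output earlier in the thread (the exact `kafka-topics` invocation may differ depending on the Kafka image version):

```shell
cd ~/onpremise

# 1. Is the Kafka container actually up?
docker-compose ps kafka

# 2. Can the broker answer a metadata request? A hang or error here
#    points at the broker itself rather than the network.
docker-compose exec kafka kafka-topics --list --bootstrap-server localhost:9092

# 3. Follow only the broker-side logs for errors around the timeouts.
docker-compose logs --tail=100 kafka zookeeper
```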

The maximum load average the server ever experienced following a Sentry event is 0.15.

Alright then, I’m out of ideas, sorry. It still looks like a networking issue.

What if I give you access to the server that hosts Sentry?

Sorry, we don’t offer that kind of support.

Okay, then could you tell me how I can install a stable (non-nightly) version of Sentry? Preferably the last patch release of the previous major: for instance, since my currently installed version is 20.8.0, my preferred version would be the latest 19.x.x.

Thanks

You can find all the releases here: https://github.com/getsentry/onpremise/releases/

Downgrading is not safe due to data migrations, though, so I would not recommend it unless you have backups from before the upgrade.
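For what it's worth, a minimal backup sketch before attempting a downgrade, assuming the default onpremise service names and the default `postgres` database user (both may differ in a customized setup):

```shell
cd ~/onpremise

# Export Sentry's low-volume metadata (users, organizations, projects,
# DSNs) as JSON via the built-in export command.
docker-compose run --rm -T web export > sentry-backup.json

# Dump the full Postgres database as a complete restore point.
docker-compose exec -T postgres pg_dump -U postgres > postgres-backup.sql
```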

Well, my events aren’t showing, remember? So I actually have nothing to back up. At this point, without any other solution available, I have no choice but to delete everything and reinstall.

However, I can’t find anything between 10.x and 20.x?

I wasn’t sure if this was an upgrade from an earlier version or a fresh installation, sorry. You’ll still need Kafka with any modern version of Sentry so I’d really try to fix that issue.

Yes, because we switched our versioning scheme from SemVer to CalVer after 10.0.0, so the next version after that is 20.6.0, and I’d try that first. If it still doesn’t work as expected, you can go back as far as 9.1.2, sacrificing a bunch of new goodness in exchange for not having Kafka, Snuba, etc., but you should be able to upgrade to newer versions if you change your mind.

More information about our releases can be found here: https://develop.sentry.dev/self-hosted/releases/ (this page went live just yesterday so this is not a snarky comment saying you didn’t read the docs first :smiley:) and more information about the switch to CalVer is over at our blog: https://blog.sentry.io/2020/06/22/self-hosted-sentry-switching-to-calver/

That’s funny, because I’m French and in my language, “CalVer” is homonymous with calvaire, which means pain, suffering. :joy:

So actually I’ve installed version 20.7.2 and it works fine.

Interesting.
Is there a full changelog too?

Also, I’m sorry if the comparison feels inappropriate to you, but I have a GitLab “omnibus” installation, and both GitLab and Sentry look the same to me, i.e. big factories containing so many components that I can’t understand how they work.

For this reason, I chose to always keep my installation at the last patch version of the penultimate major version. For instance, the latest available version of GitLab is 13.3.2, so I’m staying on version 12.10.14.

The thing is, CalVer doesn’t seem to handle major versions; by “major” I mean the versions that include breaking changes. So how do I find out which versions of Sentry include breaking changes?

Thanks.

So, 20.7.2 worked for several hours, then stopped displaying events just like 20.8.0.

Logs:

redis_1                        | 1:M 03 Sep 2020 08:38:09.053 * 100 changes in 300 seconds. Saving...
redis_1                        | 1:M 03 Sep 2020 08:38:09.055 * Background saving started by pid 14
redis_1                        | 14:C 03 Sep 2020 08:38:09.091 * DB saved on disk
redis_1                        | 14:C 03 Sep 2020 08:38:09.091 * RDB: 4 MB of memory used by copy-on-write
redis_1                        | 1:M 03 Sep 2020 08:38:09.155 * Background saving terminated with success
nginx_1                        | 2020/09/03 08:38:19 [warn] 7#7: *244 a client request body is buffered to a temporary file /var/cache/nginx/client_temp/0000000006, client: 172.19.0.1, server: , request: "POST /api/4/store/?sentry_key=26c2f8a91f3a444083c69e27cfdbc2ef&sentry_version=7 HTTP/1.1", host: "logs.foobar.com", referrer: "http://localhost:8081/"
nginx_1                        | 172.19.0.1 - - [03/Sep/2020:08:38:19 +0000] "POST /api/4/store/?sentry_key=26c2f8a91f3a444083c69e27cfdbc2ef&sentry_version=7 HTTP/1.1" 200 41 "http://localhost:8081/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.135 Safari/537.36"
postgres_1                     | ERROR:  duplicate key value violates unique constraint "sentry_release_organization_id_version_65da5843_uniq"
postgres_1                     | DETAIL:  Key (organization_id, version)=(1, c9c049b01908960ae8db863facf31b73398ff856) already exists.
postgres_1                     | STATEMENT:  INSERT INTO "sentry_release" ("organization_id", "project_id", "version", "ref", "url", "date_added", "date_started", "date_released", "data", "new_groups", "owner_id", "commit_count", "last_commit_id", "authors", "total_deploys", "last_deploy_id") VALUES (1, NULL, 'c9c049b01908960ae8db863facf31b73398ff856', NULL, NULL, '2020-09-03T08:38:19.927801+00:00'::timestamptz, NULL, NULL, '{}', 0, NULL, 0, NULL, '{}', 0, NULL) RETURNING "sentry_release"."id"
postgres_1                     | ERROR:  duplicate key value violates unique constraint "sentry_release_project_project_id_release_id_44ff55de_uniq"
postgres_1                     | DETAIL:  Key (project_id, release_id)=(3, 113) already exists.
postgres_1                     | STATEMENT:  INSERT INTO "sentry_release_project" ("project_id", "release_id", "new_groups") VALUES (3, 113, 0) RETURNING "sentry_release_project"."id"
nginx_1                        | 172.19.0.1 - - [03/Sep/2020:08:38:19 +0000] "POST /api/0/projects/foobar/live-clients/releases/ HTTP/1.1" 208 687 "-" "sentry-cli/1.55.2"
postgres_1                     | ERROR:  duplicate key value violates unique constraint "sentry_release_organization_id_version_65da5843_uniq"
postgres_1                     | DETAIL:  Key (organization_id, version)=(1, c9c049b01908960ae8db863facf31b73398ff856) already exists.
postgres_1                     | STATEMENT:  INSERT INTO "sentry_release" ("organization_id", "project_id", "version", "ref", "url", "date_added", "date_started", "date_released", "data", "new_groups", "owner_id", "commit_count", "last_commit_id", "authors", "total_deploys", "last_deploy_id") VALUES (1, NULL, 'c9c049b01908960ae8db863facf31b73398ff856', NULL, NULL, '2020-09-03T08:38:20.236029+00:00'::timestamptz, NULL, NULL, '{}', 0, NULL, 0, NULL, '{}', 0, NULL) RETURNING "sentry_release"."id"
postgres_1                     | ERROR:  duplicate key value violates unique constraint "sentry_release_project_project_id_release_id_44ff55de_uniq"
postgres_1                     | DETAIL:  Key (project_id, release_id)=(3, 113) already exists.
postgres_1                     | STATEMENT:  INSERT INTO "sentry_release_project" ("project_id", "release_id", "new_groups") VALUES (3, 113, 0) RETURNING "sentry_release_project"."id"

Do you have more logs? These don’t really tell us much about the failure.
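One way to get more useful logs than the nginx/redis noise above is to follow only the services on the event-ingestion path; these service names all appear in the `docker-compose logs` attach line earlier in the thread:

```shell
cd ~/onpremise

# Follow only the ingestion pipeline: relay accepts the event, Kafka
# queues it, the consumers and worker process and store it.
docker-compose logs --tail=200 -f \
    relay kafka snuba-consumer ingest-consumer post-process-forwarder worker
```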

You can compare tags on GitHub but there’s no curated changelog yet.

That’s quite reasonable, but I’d say it doesn’t really apply to Sentry much. At least, it won’t in the near future, as we are going to cut releases from the latest deployed SHA from two weeks prior to the release date.

We expect people to treat the onpremise repository as a black box. With that, and considering that Sentry itself is an application, a breaking change doesn’t mean much. The limited breaking changes we could make will either not be made at all or be communicated very clearly with a gradual deprecation strategy (like React does).

I think the blog post on the switch to CalVer can answer this last question a bit better.

Well, I guess this summarizes the thread, lol.
What about open source?
Open source isn’t only about publishing the code, but also and especially about being open, in the broadest sense.

Hi. Just because we want an experience to just work has absolutely nothing to do with open source. The intention of making it a “black box” is more so that you as a user don’t need to be an expert on all the software we run, or how we run it, or how we configure every little thing. It’s not really feasible. Do you want to be an expert on Python, uwsgi, nginx, Redis, memcached, ClickHouse, ZooKeeper, Kafka, and the other custom software we’ve written just to run Sentry? Maybe. But most don’t. To you, the user, our goal is that these components just work. They’re a part of the “sentry” package. If we introduce new stuff, it should also just work, etc. This is what we’re referring to as a “black box”. Other systems call this an “omnibus” package. We want to abstract away the complexities. Do we still have issues with stuff? Sure. This is brand new for us. We’ll get better.

Every component we run and every piece of software here is still open source and you’re free to use as little or as much of this “onpremise” repo as you’d like. THAT is still open source and has nothing to do with our motivations on making this a “black box”.

Enjoy your free software that we’ve invested millions of dollars into making.


I came here to say the “black-box” is only for the developer experience and how to treat this package in your infrastructure but @matt beat me to it :slight_smile:

Every single piece needed to run Sentry is open-source and you are free to use our recommended setup or use that as a blueprint to create your own.

Yet they don’t.
And because of this, there is no way for me or for you to debug it and find the issue, because the only tools we have are the logs, and the logs don’t help because there aren’t enough of them, nor enough details…