It might be IPv6 related. Are you able to apply this patch and try with it: https://github.com/getsentry/onpremise/pull/496/files
How do I do that?
In case it would be enough, I downloaded the docker-compose.yml file from the view file menu, then I ran docker-compose restart, but it didn't make the events appear.
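(Note: downloading only docker-compose.yml does not pull in the rest of the PR, and docker-compose restart restarts containers without recreating them, so compose-file changes are not applied. A minimal sketch of applying the PR properly, assuming the onpremise repository was cloned with git:)

cd onpremise
# GitHub serves any pull request as a plain diff by appending .diff to its URL
curl -L https://github.com/getsentry/onpremise/pull/496.diff | git apply
# Recreate the containers so the changed configuration actually takes effect
docker-compose down
docker-compose up -d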
Here are the most recent logs:
foobar@Sentry:~/onpremise$ docker-compose logs --tail=0 -f
Attaching to sentry_onpremise_nginx_1, sentry_onpremise_worker_1, sentry_onpremise_web_1, sentry_onpremise_cron_1, sentry_onpremise_sentry-cleanup_1, sentry_onpremise_ingest-consumer_1, sentry_onpremise_snuba-cleanup_1, sentry_onpremise_relay_1, sentry_onpremise_snuba-api_1, sentry_onpremise_postgres_1, sentry_onpremise_smtp_1, sentry_onpremise_redis_1, sentry_onpremise_memcached_1, sentry_onpremise_symbolicator_1, sentry_onpremise_zookeeper_1, sentry_onpremise_clickhouse_1, sentry_onpremise_symbolicator-cleanup_1, sentry_onpremise_post-process-forwarder_1, sentry_onpremise_snuba-replacer_1, sentry_onpremise_snuba-outcomes-consumer_1, sentry_onpremise_snuba-consumer_1, sentry_onpremise_kafka_1
nginx_1 | 172.18.0.1 - - [31/Aug/2020:12:17:31 +0000] "POST /api/3/store/?sentry_key=5c24fe4d25f845918177fc3ba324347d&sentry_version=7 HTTP/1.1" 200 41 "http://dclients.foobar.live/" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1 | 172.18.0.1 - - [31/Aug/2020:12:17:36 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1 | 172.18.0.1 - - [31/Aug/2020:12:17:52 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1 | 172.18.0.1 - - [31/Aug/2020:12:18:11 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1 | 172.18.0.1 - - [31/Aug/2020:12:18:33 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
redis_1 | 1:M 31 Aug 2020 12:18:49.045 * 100 changes in 300 seconds. Saving...
redis_1 | 1:M 31 Aug 2020 12:18:49.046 * Background saving started by pid 12
redis_1 | 12:C 31 Aug 2020 12:18:49.146 * DB saved on disk
redis_1 | 12:C 31 Aug 2020 12:18:49.146 * RDB: 6 MB of memory used by copy-on-write
redis_1 | 1:M 31 Aug 2020 12:18:49.147 * Background saving terminated with success
nginx_1 | 172.18.0.1 - - [31/Aug/2020:12:18:58 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1 | 172.18.0.1 - - [31/Aug/2020:12:19:26 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1 | 172.18.0.1 - - [31/Aug/2020:12:19:57 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
snuba-cleanup_1 | 2020-08-31 12:20:02,475 Dropped 0 partitions on None
nginx_1 | 172.18.0.1 - - [31/Aug/2020:12:20:32 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
worker_1 | 12:20:46 [INFO] sentry.tasks.update_user_reports: update_user_reports.records_updated (reports_with_event=0 updated_reports=0 reports_to_update=0)
relay_1 | 2020-08-31T12:21:02Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)
relay_1 | 2020-08-31T12:21:02Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)
relay_1 | 2020-08-31T12:21:03Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)
relay_1 | 2020-08-31T12:21:03Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)
relay_1 | 2020-08-31T12:21:03Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)
nginx_1 | 172.18.0.1 - - [31/Aug/2020:12:21:09 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
nginx_1 | 172.18.0.1 - - [31/Aug/2020:12:21:49 +0000] "GET /api/0/organizations/foobar/issues/?project=4&query=is%3Aunresolved&limit=25&statsPeriod=14d&shortIdLookup=1&cursor=0:0:1 HTTP/1.1" 200 2 "https://logs.foobar.com/organizations/foobar/issues/?project=2&project=3&project=4" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36"
relay_1 | 2020-08-31T12:21:55Z [relay_server::utils::kafka] ERROR: failed to produce message to Kafka (delivery callback): Message production error: MessageTimedOut (Local: Message timed out)
I think you either have some network congestion issues (and since this is a virtual network, CPU might be the bottleneck) or Kafka itself is having trouble responding which I would also attribute to heavy CPU load.
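One quick way to check that theory (a sketch, assuming the stock service names from the onpremise docker-compose.yml; the kafka-topics flags can differ between Kafka versions):

# Host load and per-container CPU usage
uptime
docker stats --no-stream
# Ask the broker to list its topics; a hang or an error here points at Kafka itself
docker-compose exec kafka kafka-topics --bootstrap-server localhost:9092 --list
# Look for broker-side errors around the relay timeouts
docker-compose logs --tail=100 kafka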
The maximum load average the server ever experienced following a Sentry event is 0.15.
Alright, then I'm out of ideas, sorry. It still looks like a networking issue.
What if I give you access to the server that hosts Sentry?
Sorry, we don't offer that kind of support.
Okay, then could you tell me how I can install a stable (non-nightly) version of Sentry, preferably the last patch of the previous major? For instance, my currently installed version being 20.8.0, my preferred version would be the latest 19.x.x.
Thanks
You can find all the releases here: https://github.com/getsentry/onpremise/releases/
Downgrading is not safe due to data migrations though, so I would not recommend it unless you have backups from before the upgrade.
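For reference, a minimal backup sketch before any up- or downgrade (assuming the stock postgres service and default superuser from the onpremise docker-compose.yml):

# Dump the Postgres database that holds Sentry's primary data
docker-compose exec -T postgres pg_dump -U postgres > sentry-postgres-backup.sql
# Sentry can also export core metadata (users, orgs, projects) as JSON
docker-compose run --rm -T web export > sentry-export.json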
Well, my events aren't showing, remember? So I actually have nothing to back up; at this point, without any other solution available, I have no other choice but to delete and reinstall.
However, I can't find anything between 10.x and 20.x?
I wasn't sure if this was an upgrade from an earlier version or a fresh installation, sorry. You'll still need Kafka with any modern version of Sentry, so I'd really try to fix that issue.
Yes, because we switched our versioning scheme from SemVer to CalVer after 10.0.0, so the next version after that is 20.6.0, and I'd try that first. If it still doesn't work as expected, you can go back as far as 9.1.2, sacrificing a bunch of new goodness for not having Kafka, Snuba, etc., but you should be able to upgrade to newer versions if you change your mind.
More information about our releases can be found here: https://develop.sentry.dev/self-hosted/releases/ (this page went live just yesterday, so this is not a snarky comment saying you didn't read the docs first) and more information about the switch to CalVer is over at our blog: https://blog.sentry.io/2020/06/22/self-hosted-sentry-switching-to-calver/
That's funny, because I'm French, and in my language "CalVer" is homonymous with calvaire, which means pain, suffering.
So actually I've installed version 20.7.2 and it works fine.
Interesting.
Is there a full changelog too?
Also, I'm sorry if the comparison feels inappropriate to you, but I have a GitLab "omnibus" installation, and both GitLab and Sentry look the same to me, i.e. big factories containing so many components that I can't understand how they work.
For this reason, I chose to always keep my current version at the last patch version of the penultimate major version. For instance, the last version available of GitLab is 13.3.2, so I'm staying at version 12.10.14.
The thing is, CalVer doesn't seem to handle major versions; by "major" I mean versions that include breaking changes. So how do I find out which versions of Sentry include breaking changes?
Thanks.
So, 20.7.2 worked for several hours, then stopped displaying events, just like 20.8.0.
Logs:
redis_1 | 1:M 03 Sep 2020 08:38:09.053 * 100 changes in 300 seconds. Saving...
redis_1 | 1:M 03 Sep 2020 08:38:09.055 * Background saving started by pid 14
redis_1 | 14:C 03 Sep 2020 08:38:09.091 * DB saved on disk
redis_1 | 14:C 03 Sep 2020 08:38:09.091 * RDB: 4 MB of memory used by copy-on-write
redis_1 | 1:M 03 Sep 2020 08:38:09.155 * Background saving terminated with success
nginx_1 | 2020/09/03 08:38:19 [warn] 7#7: *244 a client request body is buffered to a temporary file /var/cache/nginx/client_temp/0000000006, client: 172.19.0.1, server: , request: "POST /api/4/store/?sentry_key=26c2f8a91f3a444083c69e27cfdbc2ef&sentry_version=7 HTTP/1.1", host: "logs.foobar.com", referrer: "http://localhost:8081/"
nginx_1 | 172.19.0.1 - - [03/Sep/2020:08:38:19 +0000] "POST /api/4/store/?sentry_key=26c2f8a91f3a444083c69e27cfdbc2ef&sentry_version=7 HTTP/1.1" 200 41 "http://localhost:8081/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.135 Safari/537.36"
postgres_1 | ERROR: duplicate key value violates unique constraint "sentry_release_organization_id_version_65da5843_uniq"
postgres_1 | DETAIL: Key (organization_id, version)=(1, c9c049b01908960ae8db863facf31b73398ff856) already exists.
postgres_1 | STATEMENT: INSERT INTO "sentry_release" ("organization_id", "project_id", "version", "ref", "url", "date_added", "date_started", "date_released", "data", "new_groups", "owner_id", "commit_count", "last_commit_id", "authors", "total_deploys", "last_deploy_id") VALUES (1, NULL, 'c9c049b01908960ae8db863facf31b73398ff856', NULL, NULL, '2020-09-03T08:38:19.927801+00:00'::timestamptz, NULL, NULL, '{}', 0, NULL, 0, NULL, '{}', 0, NULL) RETURNING "sentry_release"."id"
postgres_1 | ERROR: duplicate key value violates unique constraint "sentry_release_project_project_id_release_id_44ff55de_uniq"
postgres_1 | DETAIL: Key (project_id, release_id)=(3, 113) already exists.
postgres_1 | STATEMENT: INSERT INTO "sentry_release_project" ("project_id", "release_id", "new_groups") VALUES (3, 113, 0) RETURNING "sentry_release_project"."id"
nginx_1 | 172.19.0.1 - - [03/Sep/2020:08:38:19 +0000] "POST /api/0/projects/foobar/live-clients/releases/ HTTP/1.1" 208 687 "-" "sentry-cli/1.55.2"
postgres_1 | ERROR: duplicate key value violates unique constraint "sentry_release_organization_id_version_65da5843_uniq"
postgres_1 | DETAIL: Key (organization_id, version)=(1, c9c049b01908960ae8db863facf31b73398ff856) already exists.
postgres_1 | STATEMENT: INSERT INTO "sentry_release" ("organization_id", "project_id", "version", "ref", "url", "date_added", "date_started", "date_released", "data", "new_groups", "owner_id", "commit_count", "last_commit_id", "authors", "total_deploys", "last_deploy_id") VALUES (1, NULL, 'c9c049b01908960ae8db863facf31b73398ff856', NULL, NULL, '2020-09-03T08:38:20.236029+00:00'::timestamptz, NULL, NULL, '{}', 0, NULL, 0, NULL, '{}', 0, NULL) RETURNING "sentry_release"."id"
postgres_1 | ERROR: duplicate key value violates unique constraint "sentry_release_project_project_id_release_id_44ff55de_uniq"
postgres_1 | DETAIL: Key (project_id, release_id)=(3, 113) already exists.
postgres_1 | STATEMENT: INSERT INTO "sentry_release_project" ("project_id", "release_id", "new_groups") VALUES (3, 113, 0) RETURNING "sentry_release_project"."id"
Do you have more logs? These ones don't really tell much about the failure.
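The services most involved in the ingestion path can be tailed directly; for example, using the service names from your earlier log output:

docker-compose logs --tail=200 relay kafka snuba-consumer ingest-consumer post-process-forwarder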
You can compare tags on GitHub, but there's no curated changelog yet.
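Comparing two tags is a reasonable stand-in; a sketch using the two releases discussed in this thread, either in a clone of the repository or through GitHub's compare view:

# Commits between the two releases
git log --oneline 20.7.2..20.8.0
# The equivalent on the web:
# https://github.com/getsentry/onpremise/compare/20.7.2...20.8.0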
That's quite reasonable, but I'd say it doesn't really apply to Sentry much. At least, it won't in the near future, as we are going to cut releases from the latest deployed SHA two weeks prior to the release date.
We expect people to treat the onpremise repository as a black box. With that, and considering that Sentry itself is an application, a breaking change doesn't mean much. The limited breaking things we could do will either not be done or be communicated very clearly with a gradual deprecation strategy (like React does).
I think the blog post on the switch to CalVer can answer this last question a bit better.
Well, I guess this summarizes the thread, lol.
What about open source?
Open source isn't only about publishing the code, but also and especially about being open, in the broadest sense.
Hi. Just because we want an experience to just work has absolutely nothing to do with open source. The intention of making it a "black box" is more so you as a user don't need to be an expert on all the software we run, how we run it, or how we configure every little thing. It's not feasible, really. You wanna be an expert on Python, uwsgi, nginx, Redis, memcached, clickhouse, zookeeper, Kafka, and other custom software we've written to just run Sentry? Maybe. But most don't. To you, the user, our goal is that these components just work. They're a part of the "sentry" package. If we introduce new stuff, it should also just work, etc. This is what we're referring to as "black box". Other systems call this an "omnibus" package. We want to abstract the complexities. Do we have issues with stuff still? Sure. This is brand new for us. We'll get better.
Every component we run and every piece of software here is still open source, and you're free to use as little or as much of this "onpremise" repo as you'd like. THAT is still open source and has nothing to do with our motivations for making this a "black box".
Enjoy your free software that we've invested millions of dollars into making.
I came here to say the "black-box" is only for the developer experience and how to treat this package in your infrastructure, but @matt beat me to it
Every single piece needed to run Sentry is open-source and you are free to use our recommended setup or use that as a blueprint to create your own.
Yet they don't.
And because of this, there is no way for me, nor for you, to help me debug it and find the issue, because the only tool you have is the logs, and the logs don't help because there aren't enough of them, nor enough details…