Sentry stops processing events after upgrade 10.0 => 20.8.0.dev0ba2aa70

Sadly, same problems here since 20.8. I’m going to go with the worker-restart approach and hope that the Sentry folks find the culprit :frowning:
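For reference, the restart workaround is just cycling the worker container on a schedule until a real fix lands. A rough sketch, assuming the worker service is called worker in the onpremise docker-compose.yml and that the checkout lives in /opt/onpremise (both are assumptions, adjust to your setup):

    # Hypothetical cron entry: restart the Sentry worker every 4 hours so it
    # never reaches the point where it silently stops consuming events.
    # The service name "worker" and the /opt/onpremise path are assumptions.
    0 */4 * * * cd /opt/onpremise && docker-compose restart worker >> /var/log/sentry-worker-restart.log 2>&1

Blunt, but it keeps events flowing while the root cause is being tracked down.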

For the next upgrades I’m planning to wait two months. I don’t think they test their releases. They just get automatically pushed out.

I don’t think they test their releases. They just get automatically pushed out.

Yeah, we don’t test particular releases; master is always supposed to work because that’s what we deploy to sentry.io.

Sadly, I can’t provide any logs because there is nothing really suspicious in there, just that 20.7 worked and 20.8 doesn’t anymore. Like others said above, the worker simply stops processing events after a few hours.

Ok, this thread has become really large, and each of the errors posted here can have multiple root causes. But since you have a last known good version, perhaps it makes sense for you to start bisecting commits. Specifying a commit hash in onpremise is documented here:

Downgrading to older versions is generally not a great idea as things can break, but it seems you have nothing to lose at this point?
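If anyone else wants to try the bisect route, here is a rough sketch of one iteration. It assumes the onpremise setup reads the Sentry image from a SENTRY_IMAGE variable in .env and that images tagged with commit SHAs are published to Docker Hub; check the linked docs and your docker-compose.yml before relying on the exact variable name:

    # One bisect step: pin Sentry to a commit between the last known good
    # release (20.7.0) and the first bad one (20.8.0), then redeploy.
    # <commit-sha> is a placeholder for the commit you want to test; edit the
    # existing SENTRY_IMAGE line in .env instead of appending if it is already set.
    cd onpremise
    echo 'SENTRY_IMAGE=getsentry/sentry:<commit-sha>' >> .env
    ./install.sh
    docker-compose up -d
    # Let it run long enough for the workers to either stall or stay healthy,
    # then move halfway toward the other end of the commit range, bisect-style.

Halving the commit range each step keeps the number of (slow) iterations down.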

I’ve done this in the dev environment, not in prod. Even if I were to deploy the commits one after the other, it takes hours for the error to show up each time.

To everyone experiencing worker issues with 20.8.0, we have a potential fix here that needs some testing: https://github.com/getsentry/onpremise/issues/629#issuecomment-686714716

We are hoping to have a fix by our 20.9.0 release on September 15th.

I have the same problem.

The new release, Sentry 20.9.0.dev00d0cb1f, seems to be a promising one. After upgrading to it, our Sentry is as stable as before, even under heavy load.

Fixed for me as well, on Sentry 20.9.0.dev00740696.
It started working a few hours after upgrading (give it time).

Trying to run ./install.sh for the latest version ends with this error:

12:34:01 [INFO] sentry.plugins.github: apps-not-configured
Traceback (most recent call last):
  File "/usr/local/bin/sentry", line 8, in <module>
sys.exit(main())
  File "/usr/local/lib/python2.7/site-packages/sentry/runner/__init__.py", line 166, in main
cli(prog_name=get_prog(), obj={}, max_content_width=100)
  File "/usr/local/lib/python2.7/site-packages/click/core.py", line 722, in __call__
return self.main(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/click/core.py", line 697, in main
rv = self.invoke(ctx)
  File "/usr/local/lib/python2.7/site-packages/click/core.py", line 1066, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python2.7/site-packages/click/core.py", line 895, in invoke
return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python2.7/site-packages/click/core.py", line 535, in invoke
return callback(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/sentry/runner/decorators.py", line 29, in inner
configure()
  File "/usr/local/lib/python2.7/site-packages/sentry/runner/__init__.py", line 129, in configure
configure(ctx, py, yaml, skip_service_validation)
  File "/usr/local/lib/python2.7/site-packages/sentry/runner/settings.py", line 158, in configure
skip_service_validation=skip_service_validation,
  File "/usr/local/lib/python2.7/site-packages/sentry/runner/initializer.py", line 328, in initialize_app
setup_services(validate=not skip_service_validation)
  File "/usr/local/lib/python2.7/site-packages/sentry/runner/initializer.py", line 370, in setup_services
service.validate()
  File "/usr/local/lib/python2.7/site-packages/sentry/utils/services.py", line 105, in <lambda>
context[key] = (lambda f: lambda *a, **k: getattr(self, f)(*a, **k))(key)
  File "/usr/local/lib/python2.7/site-packages/sentry/buffer/redis.py", line 69, in validate
raise InvalidConfiguration(six.text_type(e))
sentry.exceptions.InvalidConfiguration: Redis is loading the dataset in memory
An error occurred, caught SIGERR on line 280
Cleaning up...

@fabriciols - interesting. That means Redis is taking its time to get up and running, longer than the Sentry instance takes to come up. This is definitely not ideal, but can you try adding something like sleep 10 before that line 280 and see if it helps?
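If the fixed sleep 10 turns out to be flaky, a slightly more robust variant is to poll Redis until it has finished loading its dataset. This is only a sketch; it assumes the Redis service is named redis in the onpremise docker-compose.yml and that the snippet goes into install.sh right before the step that currently fails:

    # Wait until Redis has finished loading its dataset and answers PONG.
    # While it is still loading, redis-cli ping returns a LOADING error instead,
    # which is exactly the condition the traceback above complains about.
    echo "Waiting for Redis to become ready..."
    until docker-compose exec -T redis redis-cli ping 2>/dev/null | grep -q PONG; do
      sleep 1
    done

That way the install waits exactly as long as Redis needs instead of a hard-coded ten seconds.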

It worked! Thanks :slight_smile:

This topic was automatically closed 15 days after the last reply. New replies are no longer allowed.