[ERROR] celery.app.trace

Has anybody come across this error? Please provide guidance on how to resolve it.

worker_1                       | Exception: Group matching query does not exist.
worker_1                       | 17:57:14 [ERROR] celery.app.trace: Task sentry.tasks.post_process.post_process_group[51fac9e3-d8f3-4404-a8f0-6def9598e3c5] raised unexpected: DoesNotExist('Group matching query does not exist.',) (data={u'internal': False, u'traceback': u'Traceback (most recent call last):\n  File "/usr/local/lib/python2.7/site-packages/celery/app/trace.py", line 375, in trace_task\n    R = retval = fun(*args, **kwargs)\n  File "/usr/local/lib/python2.7/site-packages/celery/app/trace.py", line 632, in __protected_call__\n    return self.run(*args, **kwargs)\n  File "/usr/local/lib/python2.7/site-packages/sentry_sdk/integrations/celery.py", line 171, in _inner\n    reraise(*exc_info)\n  File "/usr/local/lib/python2.7/site-packages/sentry_sdk/integrations/celery.py", line 166, in _inner\n    return f(*args, **kwargs)\n  File "/usr/local/lib/python2.7/site-packages/sentry/tasks/base.py", line 48, in _wrapped\n    result = func(*args, **kwargs)\n  File "/usr/local/lib/python2.7/site-packages/sentry/tasks/post_process.py", line 166, in post_process_group\n    event.group, _ = get_group_with_redirect(event.group_id)\n  File "/usr/local/lib/python2.7/site-packages/sentry/models/group.py", line 116, in get_group_with_redirect\n    raise error  # raise original `DoesNotExist`\nDoesNotExist: Group matching query does not exist.\n', u'name': 'sentry.tasks.post_process.post_process_group', u'args': '()', u'kwargs': "{'event': <sentry.eventstore.models.Event object at 0x7fcd6c2d7b50>, 'is_new_group_environment': False, 'is_new': False, 'primary_hash': '655113b1c1db6814a075734ea5049a14', 'is_regression': None}", u'description': u'raised unexpected', u'hostname': u'celery@23c5cb445c61', u'id': '51fac9e3-d8f3-4404-a8f0-6def9598e3c5', u'exc': "DoesNotExist('Group matching query does not exist.',)"})
worker_1                       | Traceback (most recent call last):
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/sentry_sdk/transport.py", line 311, in send_event_wrapper
worker_1                       |     self._send_event(event)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/sentry_sdk/transport.py", line 231, in _send_event
worker_1                       |     headers={"Content-Type": "application/json", "Content-Encoding": "gzip"},
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/sentry_sdk/transport.py", line 178, in _send_request
worker_1                       |     headers=headers,
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/urllib3/request.py", line 72, in request
worker_1                       |     **urlopen_kw)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/urllib3/request.py", line 150, in request_encode_body
worker_1                       |     return self.urlopen(method, url, **extra_kw)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/urllib3/poolmanager.py", line 324, in urlopen
worker_1                       |     response = conn.urlopen(method, u.request_uri, **kw)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/urllib3/connectionpool.py", line 667, in urlopen
worker_1                       |     **response_kw)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/urllib3/connectionpool.py", line 667, in urlopen
worker_1                       |     **response_kw)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/urllib3/connectionpool.py", line 667, in urlopen
worker_1                       |     **response_kw)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/urllib3/connectionpool.py", line 638, in urlopen
worker_1                       |     _stacktrace=sys.exc_info()[2])
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/urllib3/util/retry.py", line 399, in increment
worker_1                       |     raise MaxRetryError(_pool, url, error or ResponseError(cause))
worker_1                       | MaxRetryError: HTTPSConnectionPool(host='retail-sentry.apple.com', port=443): Max retries exceeded with url: /api/1/store/ (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",),))
worker_1                       | 17:57:14 [ERROR] sentry_sdk.errors: Internal error in sentry_sdk
worker_1                       | Traceback (most recent call last):
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/celery/app/trace.py", line 375, in trace_task
worker_1                       |     R = retval = fun(*args, **kwargs)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/celery/app/trace.py", line 632, in __protected_call__
worker_1                       |     return self.run(*args, **kwargs)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/sentry_sdk/integrations/celery.py", line 171, in _inner
worker_1                       |     reraise(*exc_info)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/sentry_sdk/integrations/celery.py", line 166, in _inner
worker_1                       |     return f(*args, **kwargs)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/sentry/tasks/base.py", line 48, in _wrapped
worker_1                       |     result = func(*args, **kwargs)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/sentry/tasks/post_process.py", line 166, in post_process_group
worker_1                       |     event.group, _ = get_group_with_redirect(event.group_id)
worker_1                       |   File "/usr/local/lib/python2.7/site-packages/sentry/models/group.py", line 116, in get_group_with_redirect
worker_1                       |     raise error  # raise original `DoesNotExist`

@BYK, can you provide some info on this error? Ingestion is fine, but data forwarding to Splunk has stopped. Not sure if it is related.

Looks like your internal TLS certificates have expired?
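The failing call is the worker's sentry_sdk rejecting the certificate served by the upstream host in the MaxRetryError. If it helps, here is a minimal check you could run from inside the worker container, using urllib3 as in the traceback (the host name is taken from the error above; certifi's bundle is assumed as the trust store, which is one common default and may not be exactly what the SDK uses):

```python
# Minimal verification sketch (assumes certifi is installed in the container).
# A "certificate verify failed" error here means the container's own trust
# store -- not the certs in the nginx mount -- is missing your CA.
import certifi
import urllib3

# Require certificate verification against the certifi CA bundle.
http = urllib3.PoolManager(cert_reqs="CERT_REQUIRED", ca_certs=certifi.where())
resp = http.request("GET", "https://retail-sentry.apple.com/")
print(resp.status)
```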

@BYK, the only certs I have are the custom certs in the /nginx mount for SSL, and those I checked were valid. Is there any other custom location for certs which this service looks at?

Yeah, I think you need to modify the sentry/Dockerfile to embed your custom certificates (well, the custom CA) into the Sentry image so they are trusted.

Any pointers on how to reference the CA cert in the Dockerfile for Sentry, so that all instances pick it up?

@amit1 - Thinking about this now, I think you may have 2 options:

  1. Add some COPY and RUN lines to the Dockerfile to bake the certs into the system (a minimal sketch follows after this list).
  2. Use a volume mount. The sentry folder is already mounted so you may just need some config changes for the files to be picked up from there. Alternatively, you can add a new volume mount for the directory you store the certificates.
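
For option 1, a minimal sketch of lines to append to sentry/Dockerfile. The file name is hypothetical, and this assumes a Debian-based image with the ca-certificates package available and the CA file placed next to the Dockerfile:

```dockerfile
# Hypothetical file name -- replace with your actual internal root CA (.crt, PEM format).
COPY my-internal-root-ca.crt /usr/local/share/ca-certificates/my-internal-root-ca.crt

# Rebuild the system trust store so the copied CA is trusted inside the image.
RUN update-ca-certificates
```

After rebuilding the image and recreating the services, the workers should trust certificates signed by that CA. Option 2 follows the same idea, except the CA file is mounted into the containers at runtime instead of being baked into the image, and the configuration is pointed at the mounted path.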
