Hello Sentry community!
We’ve recently started using Sentry in our organization (on-premise), and so far so good! However, we’ve stumbled upon our first issue that we cannot seem to overcome.
We’ve attempted to push JavaScript source maps via the sentry-cli tool. Everything works well from the CLI’s point of view, but the Sentry instance returns an IOError when it tries to access the uploaded files. Upon investigation, we realized that our containers were not sharing volumes, which resulted in uploads only being present on one of the containers and not the others. We’ve since connected the volumes (all mapped to the same path on the host; we’ve also tried the “volumes_from” option). No matter which setup we go for (volumes_from or a host path), we are able to write and read data on the shared volume across containers. The Sentry instance, however, seems to fail to move data from /tmp/ to /var/lib/sentry/files. Attachments are below (docker-compose + logs, with sensitive information obfuscated).
Python: 2.7.14
Sentry: 8.22.0
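For reference, this is the filestore configuration we believe our instance is running with: a sketch of the relevant sentry.conf.py options, assuming the stock filesystem backend and the default location (we have not overridden these ourselves, so the exact values are our assumption).

# sentry.conf.py (a sketch of the filestore options as we understand them;
# option names as in the standard on-premise configuration, nothing custom)
from sentry.conf.server import *  # noqa

# Store uploaded files (source maps, avatars, ...) on the local filesystem
SENTRY_OPTIONS['filestore.backend'] = 'filesystem'
SENTRY_OPTIONS['filestore.options'] = {
    'location': '/var/lib/sentry/files',
}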
Avatar error
IOError: cannot identify image file <File: 1.png>
sentry/models/avatar.py in get_cached_photo at line 56
size = min(self.ALLOWED_SIZES, key=lambda x: abs(x - size))
cache_key = self.get_cache_key(size)
photo = cache.get(cache_key)
if photo is None:
    photo_file = self.file.getfile()
    with Image.open(photo_file) as image:
        image = image.resize((size, size))
        image_file = BytesIO()
        image.save(image_file, 'PNG')
        photo = image_file.getvalue()
        cache.set(cache_key, photo)
Local variables:
  cache_key: 'avatar:1:20'
  photo: None
  photo_file: <File: 1.png>
  self: <UserAvatar at 0x7f4ff4e8fe50: id=5L>
  size: 20
sentry/web/frontend/user_avatar.py in get at line 33
Called from: django/views/generic/base.py in dispatch
Source map errors
[Errno 2] No such file or directory: u'/var/lib/sentry/files/17696/37791/8e2d25805a464887bbaac8c5b460b8c5'
sentry/models/file.py in getfile at line 142
    >>> dst.write(chunk)
    """
    assert self.path
    storage = get_storage()
    return storage.open(self.path)

class File(Model):
    __core__ = False
Local variables:
  self: <FileBlob at 0x7f4ff4f0bd90: id=4L>
  storage: <django.core.files.storage.FileSystemStorage object at 0x7f4ff4f20ed0>
sentry/models/file.py in _nextidx at line 307
sentry/models/file.py in seek at line 375
sentry/models/file.py in open at line 321
sentry/models/file.py in __init__ at line 284
sentry/models/file.py in _get_chunked_blob at line 176
sentry/models/file.py in getfile at line 190
sentry/api/endpoints/project_release_file_details.py in download at line 75
sentry/api/endpoints/project_release_file_details.py in get at line 123
sentry/api/base.py in dispatch at line 188
sentry/api/base.py in handle_exception at line 83
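As a quick sanity check (a sketch of our own, not anything from the Sentry codebase), the blob path from the traceback can be checked directly inside each container to see which of them actually has the file and whether the filestore directory is really mounted:

# Run with plain Python (or `sentry shell`) inside each container.
import os

blob_path = '/var/lib/sentry/files/17696/37791/8e2d25805a464887bbaac8c5b460b8c5'
print(os.path.exists(blob_path))                 # is the blob visible in this container?
print(os.path.ismount('/var/lib/sentry/files'))  # is the shared volume actually mounted here?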
Docker compose
version: '2'
services:
  base:
    environment:
      SENTRY_POSTGRES_HOST: googlesql_pg_host
      SENTRY_DB_USER: googlesql_pg_user
      SENTRY_DB_PASSWORD: googlesql_pg_pass
      SENTRY_EMAIL_HOST: smtp.sentry.host.com
      SENTRY_EMAIL_PORT: 2525
      SENTRY_EMAIL_USER: smtp_user
      SENTRY_EMAIL_PASSWORD: smtp_password
      SENTRY_EMAIL_USE_TLS: "True"
      SENTRY_SERVER_EMAIL: root@sentry.host.com
      SENTRY_SECRET_KEY: shared_secret
      SENTRY_REDIS_HOST: redis
      FILE_UPLOAD_MAX_MEMORY_SIZE: 26214400
    volumes:
      - /var/lib/sentry/files:/var/lib/sentry/files
      - /var/lib/sentry/tmp:/tmp
    image: sentry
    restart: unless-stopped
  sentry:
    links:
      - redis
      - memcached
    depends_on:
      - redis
      - memcached
    extends: base
    ports:
      - 9000:9000
  cron:
    links:
      - redis
      - memcached
    depends_on:
      - redis
      - memcached
    extends: base
    command: "sentry run cron"
  worker:
    links:
      - redis
      - memcached
    depends_on:
      - redis
      - memcached
    extends: base
    command: "sentry run worker"
  redis:
    image: redis
    restart: unless-stopped
  memcached:
    image: memcached:1.4
    restart: unless-stopped
The source maps we are trying to upload are below 40 MB in size (the smallest one was a few hundred KB). There is an nginx proxy in front of Sentry; as far as we can tell it is configured properly (we already use nginx as a proxy for a heavy-traffic Elasticsearch cluster), but any hints on that front are welcome too.
We confirmed that the volumes are writable by the user with the Sentry UID by running touch/mkdir/etc. as that user.
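To mirror what Sentry itself does (rather than only touch/mkdir), we could also run a small sketch like the one below from `sentry shell` on one container and then try to read the resulting path from the others; the file name and content here are made up for the test.

# Sketch: mimic Sentry's filestore writes with Django's FileSystemStorage.
# Run from `sentry shell` on one container (e.g. the worker), note the path,
# then open the same name from `sentry shell` on the web container.
from django.core.files.base import ContentFile
from django.core.files.storage import FileSystemStorage

storage = FileSystemStorage(location='/var/lib/sentry/files')
saved_name = storage.save('volume-check/blob', ContentFile(b'shared-volume check'))
print(storage.path(saved_name))  # absolute path under /var/lib/sentry/files
# On the other containers: FileSystemStorage(location='/var/lib/sentry/files').open(saved_name)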
We are thankful for any help that comes our way!