35 👍
S3BotoStorage takes the bucket name as a parameter. If it is not given, it will use the AWS_STORAGE_BUCKET_NAME setting. That means if you want to make S3BotoStorage the default storage backend with DEFAULT_FILE_STORAGE, it must use the default bucket. However, you can also assign a storage at the field level:
from django.db import models
from storages.backends.s3boto import S3BotoStorage

class MyModel(models.Model):
    file_1 = models.FileField()  # Uses the default storage
    file_2 = models.FileField(storage=S3BotoStorage(bucket='other-bucket'))
Edit:
Comments are getting out of hand, so I'll update my answer. Changing the parameters of the storage backend on a per-instance basis is not something the Django storage API was designed to do. The storage backend has no knowledge of the model instance, because storages can be used outside the context of a model, such as with static files. It's not a completely unreasonable need, but it's not a use case that Django or django-storages was intended to solve. I don't expect you to find a drop-in storage backend that will handle this for you.
The docs describe how you can manage files manually: https://docs.djangoproject.com/en/1.9/topics/files/#storage-objects At a minimum, you would need to store the bucket where you saved the file somewhere, so that you can find it later when you query the model.
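A rough sketch of that manual approach: persist the bucket name on the model and rebuild the matching storage when you need the file again. The bucket_name field, the resolve_storage() helper, and the file name/content below are illustrative only, not anything provided by django-storages.

from django.core.files.base import ContentFile
from django.db import models
from storages.backends.s3boto import S3BotoStorage

class StoredFile(models.Model):
    file = models.FileField()
    # Remember which bucket the file was written to.
    bucket_name = models.CharField(max_length=255)

    def resolve_storage(self):
        # Rebuild a storage pointing at the bucket recorded for this row.
        return S3BotoStorage(bucket=self.bucket_name)

# Saving: pick the bucket, attach the matching storage, then save the file.
instance = StoredFile(bucket_name='other-bucket')
instance.file.storage = instance.resolve_storage()
instance.file.save('report.txt', ContentFile(b'hello'))

# Loading later: attach the same storage again before reading or building URLs.
instance = StoredFile.objects.get(pk=instance.pk)
instance.file.storage = instance.resolve_storage()
url = instance.file.url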
14 👍
Another solution: if you want to specify the bucket at runtime, you can do so before invoking the save() method on the model.
Following the above example:
from django.db import models
from storages.backends.s3boto import S3BotoStorage

class MyModel(models.Model):
    file_1 = models.FileField()  # Uses the default storage
    file_2 = models.FileField()
In your views, when saving the model, you can specify the storage on that field:
my_file_model = MyModel()
my_file_model.file_2.storage = S3BotoStorage(bucket="your-bucket-name")
my_file_model.save()
This way, file_2 will be saved to the bucket you specify, while file_1 will use your default bucket.
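One caveat worth noting (not from the original answer): the storage you assign at save time is not recorded in the database, so when you load the instance again later you have to attach the matching storage once more before reading file_2 or generating its URL. A minimal sketch, where some_id is just a placeholder:

my_file_model = MyModel.objects.get(pk=some_id)
my_file_model.file_2.storage = S3BotoStorage(bucket="your-bucket-name")
url = my_file_model.file_2.url  # now resolved against the non-default bucket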
9 👍
Add another bucket name in settings.py, e.g. PRIVATE_BUCKET_NAME = 'bucket-name'.
Create a custom class that overrides S3BotoStorage and can be serialized into migration files.
Create an instance of that class, s3_storage = S3MediaStorage(), and pass it as the storage for the file1 field in MyModel:
from django.conf import settings
from django.db import models
from django.utils.deconstruct import deconstructible
from storages.backends.s3boto import S3BotoStorage

@deconstructible
class S3MediaStorage(S3BotoStorage):
    def __init__(self, *args, **kwargs):
        kwargs['bucket'] = getattr(settings, 'PRIVATE_BUCKET_NAME')
        super(S3MediaStorage, self).__init__(*args, **kwargs)

s3_storage = S3MediaStorage()

class MyModel(models.Model):
    file = models.FileField()                      # Uses the default storage/bucket
    file1 = models.FileField(storage=s3_storage)   # Uses PRIVATE_BUCKET_NAME
7 👍
For S3Boto3Storage use this:
media_file = models.FileField('media file', storage=S3Boto3Storage(bucket_name='media_bucket'), upload_to='media', blank=True)