Django FileField with custom storage still uploads to default storage (GoogleCloudStorage)
I want to upload files to a custom storage bucket, different from the one used for static files. But each time I try to declare a custom Google Cloud Storage bucket, uploads still go to the bucket used for statics (probably the default storage).
I am declaring a new class derived from Wagtail's AbstractDocument class:
class VirtualVisitDocs(AbstractDocument):
    file = models.FileField(storage=CustomGoogleCloudStorage(), verbose_name=_('file'))
Then I use it as a foreign key on another Wagtail page, with a DocumentChooserPanel, as follows:
virtual_visit = models.ForeignKey(
    VirtualVisitDocs,
    on_delete=models.SET_NULL,
    verbose_name=_('Virtual Visit Zip File'),
    related_name='promotion_pages',
    null=True,
    blank=True,
)
MultiFieldPanel(
    [
        DocumentChooserPanel("virtual_visit"),
    ],
    heading=_("Virtual Visit"),
    classname="collapsible collapsed",
),
The CustomGoogleCloudStorage instance used above is declared as follows:
@deconstructible
class CustomGoogleCloudStorage(GoogleCloudStorage):
    def get_default_settings(self):
        default_settings = super().get_default_settings()
        default_settings['bucket_name'] = settings.GS_VIRTUAL_TOUR_BUCKET_NAME
        default_settings['project_id'] = settings.GS_VIRTUAL_PROJECT_ID
        return default_settings
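For reference, the corresponding settings are assumed to look roughly like this; the setting names match the ones read above, the values are placeholders, and GS_BUCKET_NAME is the django-storages setting that the default GoogleCloudStorage backend uses:

# settings.py -- values are placeholders
GS_BUCKET_NAME = 'my-static-bucket'                     # default bucket (statics)
GS_VIRTUAL_TOUR_BUCKET_NAME = 'my-virtual-tour-bucket'  # bucket the custom storage should write to
GS_VIRTUAL_PROJECT_ID = 'my-virtual-tour-project'       # project that owns the second bucket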
Each time I try to upload documents this way, I get HTTP 403, which makes sense because the service account I am using is not allowed to make changes to the default bucket. Instead, I want the files to be uploaded to the second bucket, where my service account is allowed to create objects, but I just can't point the upload at that other bucket. Whenever I print the CustomGoogleCloudStorage instance as a dict, I get the correct project ID, bucket name and credentials file.
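One way to double-check which storage backend the field actually resolves to at runtime is to inspect it in a Django shell; a minimal sketch, assuming the model lives in an app called "promotions" (the app label is a placeholder):

# Run inside `python manage.py shell` to inspect the storage bound to the field
from promotions.models import VirtualVisitDocs  # "promotions" is a placeholder app label

field = VirtualVisitDocs._meta.get_field('file')
print(type(field.storage))        # expected: CustomGoogleCloudStorage
print(field.storage.bucket_name)  # expected: the GS_VIRTUAL_TOUR_BUCKET_NAME value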
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
