https://django-storages.readthedocs.io/

Overview

Django-Storages

PyPI Version Build Status

Installation

Installing from PyPI is as easy as doing:

pip install django-storages

If you'd prefer to install from source (maybe there is a bugfix in master that hasn't been released yet) then the magic incantation you are looking for is:

pip install -e 'git+https://github.com/jschneier/django-storages.git#egg=django-storages'

Once that is done, set DEFAULT_FILE_STORAGE to the backend of your choice. If, for example, you want to use the boto3 backend, you would set:

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'

If you are using FileSystemStorage as the storage class on your models' FileField fields, remove it and don't specify any storage parameter; that way the DEFAULT_FILE_STORAGE class will be used by default for the field. For example, if you have a photo field defined as:

photo = models.FileField(
    storage=FileSystemStorage(location=settings.MEDIA_ROOT),
    upload_to='photos',
)

Change it to just:

photo = models.FileField(
    upload_to='photos',
)

There are also a number of settings available to control how each storage backend functions; please consult the documentation for a comprehensive list.
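As a sketch only, a minimal S3 configuration might look like the following; the bucket name and region values are placeholders, and the full set of available settings is in the documentation:

```python
# settings.py -- illustrative values only
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'

AWS_STORAGE_BUCKET_NAME = 'my-bucket'   # placeholder bucket name
AWS_S3_REGION_NAME = 'us-east-1'        # placeholder region
AWS_QUERYSTRING_AUTH = True             # sign generated URLs (the default)
```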

About

django-storages is a project to provide a variety of storage backends in a single library.

This library is usually compatible with the currently supported versions of Django. Check the Trove classifiers in setup.py to be sure.

django-storages is backed in part by Tidelift. Check them out for all of your enterprise open source software commercial support needs.

Security

To report a security vulnerability, please use the Tidelift security contact. Tidelift will coordinate the fix and disclosure. Please do not post a public issue on the tracker.

History

This repo began as a fork of the original library under the package name of django-storages-redux and became the official successor (releasing under django-storages on PyPI) in February of 2016.

Found a Bug? Something Unsupported?

I suspect that a few of the storage engines in backends/ have been unsupported for quite a long time. I personally only really need the S3Storage backend, but welcome bug reports, and especially patches and tests, for some of the other backends.

Issues are tracked via GitHub issues at the project issue page.

Documentation

Documentation for django-storages is located at https://django-storages.readthedocs.io/.

Contributing

  1. Check for open issues at the project issue page or open a new issue to start a discussion about a feature or bug.
  2. Fork the django-storages repository on GitHub to start making changes.
  3. Add a test case to show that the bug is fixed or the feature is implemented correctly.
  4. Bug me until I can merge your pull request. Also, don't forget to add yourself to AUTHORS.
Comments
  • S3Boto3Storage raises ValueError: I/O operation on closed file.

    When running python manage.py collectstatic we get the following exception:

    Traceback (most recent call last):
      File "manage.py", line 10, in <module>
        execute_from_command_line(sys.argv)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/core/management/__init__.py", line 363, in execute_from_command_line
        utility.execute()
  File "~/venvs/app_root/lib/python3.5/site-packages/django/core/management/__init__.py", line 355, in execute
        self.fetch_command(subcommand).run_from_argv(self.argv)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/core/management/base.py", line 283, in run_from_argv
        self.execute(*args, **cmd_options)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/core/management/base.py", line 330, in execute
        output = self.handle(*args, **options)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 199, in handle
        collected = self.collect()
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 139, in collect
        for original_path, processed_path, processed in processor:
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/storage.py", line 246, in post_process
        for name, hashed_name, processed, _ in self._post_process(paths, adjustable_paths, hashed_files):
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/storage.py", line 312, in _post_process
        hashed_name = self.hashed_name(name, content_file)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/storage.py", line 109, in hashed_name
        file_hash = self.file_hash(clean_name, content)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/storage.py", line 86, in file_hash
        for chunk in content.chunks():
      File "~/venvs/app_root/lib/python3.5/site-packages/django/core/files/base.py", line 76, in chunks
        self.seek(0)
    ValueError: I/O operation on closed file.
    

    This only happens when using django-storages 1.6.4 or above. Versions 1.6.3 and lower work fine.

We're using Django 1.11.4, Python 3.5.2, and boto3 1.4.6.
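The ValueError itself comes from Django calling seek(0) on a file object whose buffer has already been closed (the last two frames of the traceback). A minimal stdlib reproduction of the same error, independent of S3:

```python
import io

buf = io.BytesIO(b"static file contents")
buf.close()  # simulate a file handle that was closed before Django re-reads it

try:
    buf.seek(0)  # what File.chunks() does before re-reading the content
except ValueError as exc:
    print(exc)  # I/O operation on closed file.
```

So the regression is about *when* the underlying file gets closed relative to collectstatic's post-processing re-reads, not about the S3 API itself.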

    bug s3boto 
    opened by tsifrer 73
  • Add Google Cloud Storage backend using the gcloud-python library

    Based on https://github.com/jschneier/django-storages/pull/146

    This uses the native gcloud-python library, which is recommended by Google for production setups and allows finer-grained authentication than current solutions (which only work when Google Cloud Storage is used in S3 compatibility mode).

    • [x] Review code (see: https://github.com/jschneier/django-storages/pull/146#issuecomment-266873319)
    • [x] Add tests
    • [x] Add documentation
    opened by scjody 32
  • Update Azure storage version

    Attempt to close #784. From what I can tell the updated version of azure-storage-blob doesn't have the same options for URL generation with is_emulated and custom_domain, so I've removed options that won't work anymore.

    There's a lot in here, so any feedback is appreciated, thanks!

    opened by pjsier 31
  • [s3] Add support for signing CloudFront URLs

    Fixes #456. Similar to #484 but uses AWS CloudFront signed URLS (See AWS docs)

    This is probably the right way to approach the problem in #484, but if that solution works it might be an alternative (not sure how @mattjegan configured S3+CloudFront)

    s3boto 
    opened by terencehonles 31
  • AWS S3 Frankfurt region not working

    COPY OF BITBUCKET ISSUE #214 - https://bitbucket.org/david/django-storages/issue/214/aws-s3-frankfurt-region-not-working

    "Andreas Schilling created an issue 2015-01-04

    Using the Frankfurt region (Germany) with django-storages produces an HTTP 400 error. S3 in the new region supports only Signature Version 4. In all other regions, Amazon S3 supports both Signature Version 4 and Signature Version 2.

    I assume django-storages only supports Signature Version 2. Is there any chance to support Version 4?"

    Thanks @jschneier for the fork! Is there a chance for django-storages-redux to support the eu-central-1 region?

    s3boto 
    opened by FPurchess 28
  • New S3 Boto3 backend (closes #57)

    This pull request implements https://github.com/jschneier/django-storages/issues/57 by adding a Boto 3 backend that tries to be a close-to-drop-in replacement for Boto 2. Due to the slight differences, I have kept it a separate backend, but with a lot of copy and paste code. Given that Boto 2 is heading towards maintenance mode according to https://github.com/boto/boto/commit/e3dd99695e8976fad88c1c55d69914199f1878db, I don't think it's worth trying to have 2 backends sharing code when the Boto 2 implementation looks on the way to being deprecated.

    Note that this isn't just me blithely throwing away the Boto 2 implementation; the fundamental underlying operations are VERY different and worthy of a separate backend. Boto 2 operates on the assumption that you can set arbitrary headers by passing in a dictionary; Boto 3 restricts you to specific named parameters and as such the 2 approaches are very incompatible with one another. You can try to do minor mappings here and there to try to map some of the headers in the AWS_HEADERS setting, but trying to map every possible header value to the right argument name in the right method is pretty tedious and error prone. Instead, this pull request embraces Boto 3's use of parameters as its way of taking in these extra arguments, leaving the remapping up to the django-storages user who wants to switch backends. For the limited number of extra headers and parameters they'll use, this mapping is easier to do, and looking up in Boto 3's documentation for the parameter name is straightforward.

    This pull request replaces https://github.com/jschneier/django-storages/issues/66, adding unit tests and incorporating changes due to pull requests accepted into Boto3/botocore. A substantially similar version of this (minus the recent merges from the past few days) has been run in production with Django 1.6.11 for several months without problems.

    Also note that while I was at it, I made the necessary change to support #95 if you switch to this backend.

    Because this is based on s3boto.py, you should be able to manually perform a diff of s3boto.py and the new s3boto3.py to understand the changes.

    Changes:

    • The AWS_HEADERS/storage.headers setting is replaced with AWS_S3_OBJECT_PARAMETERS/storage.object_parameters. The keys/values of this dict are intended to be the arguments to boto3.Resource('s3').Object.put() (http://boto3.readthedocs.org/en/latest/reference/services/s3.html#S3.Object.put), which takes all the arguments that would be used to generate the correct request headers. Note that this gives a little more access than just headers, since a user could also provide ACL and Content arguments in this dict.
    • Unlike the s3boto implementation, the s3boto3 backend does not currently support proxies (https://github.com/boto/botocore/issues/541) or alternate hosts/ports (https://github.com/boto/botocore/issues/601), because the underlying Boto3/botocore library does not currently support them. It has only partial support for overriding the endpoint URL.
    • If using s3v4, since botocore does not automatically redirect you to the correct region's endpoint nor sign properly unless it knows the region, there is an AWS_S3_REGION_NAME/storage.region_name setting to force the region.
    • There is some behavior the boto2 library performed that boto3 does not, so equivalent code has been added to s3boto3. This includes things like rewinding the file pointer, automatically checking for bucket and object existence, and directly writing the S3 contents to a file/file pointer. There are also cases where Boto3 does not allow previous operations, like locally updating the last_modified attribute, which this new version handles by simply reloading the object from the server.
    • No need to parse timestamps, since boto3 already performs this conversion for you.
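To make the first change concrete, the new setting would be used along these lines; the keys are boto3 Object.put() argument names, and the values here are purely illustrative:

```python
# settings.py -- replaces the old AWS_HEADERS dict
AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=86400',     # maps to the Cache-Control header
    'ContentDisposition': 'attachment',  # maps to Content-Disposition
    'ServerSideEncryption': 'aws:kms',   # an example of a non-header parameter
}
```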

    Known issues:

    • Boto3 does not support unsigned URLs (e.g. querystring_auth=False) in its API (https://github.com/boto/boto3/issues/169). This implements compatibility by parsing the querystring and stripping off any signature parameters. Note that these parameter names differ between s3v1/s3v4 URLs, but this implementation strips them all.
    • To support s3v4 URLs for endpoints that aren't v4 by default (most of them), there are 2 ways. The one I've used in production is a ~/.aws/config file as generated by running "aws configure set default.s3.signature_version s3v4". If deploying in a configuration where there is no user home directory, the location of this config can be set with the AWS_CONFIG_FILE environment variable. The other way involves setting the S3Boto3Storage.config variable to a Boto3 client config.

    This has been tested using s3v4 signatures with both signed and unsigned URLs, along with response content disposition and KMS server-side encryption key arguments.

    opened by mbarrien 27
  • Calculate settings when storage is instantiated; not imported

    Makes the storage classes usable with Django's override_settings. For example, projects can now change the S3 location & bucket when running their tests. Affects the following backends:

    • azure_storage
    • gcloud
    • gs
    • s3boto
    • s3boto3

    The remaining storage backends do not need to be altered, as they do not calculate settings at import time.

    The class BaseStorage was created to use a common pattern to initialize settings for various backends. Users of these backends can expect them to work consistently. This class is fully backwards compatible. Settings can now be set:

    • In settings.py
    • Using override_settings
    • As a class variable
    • As an argument to __init__

    Closes #498

    Completes all necessary backends and adds tests.
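The pattern can be illustrated with a dependency-free sketch (not the library's actual code): configuration is resolved when the instance is created, so anything that patches settings before instantiation, such as Django's override_settings, takes effect.

```python
settings = {"BUCKET": "prod-bucket"}  # stand-in for django.conf.settings

class Storage:
    def __init__(self, bucket=None):
        # Resolved at instantiation time, not at module import time,
        # so later changes to `settings` are picked up.
        self.bucket = bucket if bucket is not None else settings["BUCKET"]

settings["BUCKET"] = "test-bucket"  # e.g. applied by override_settings
print(Storage().bucket)                    # test-bucket
print(Storage(bucket="explicit").bucket)   # explicit
```

The same lookup order also explains the four configuration methods listed above: an explicit argument wins, then a class variable, then whatever the (possibly overridden) settings hold at instantiation.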

    opened by jdufresne 23
  • Allow the use of AWS profiles for S3 access

    This uses the setting AWS_S3_SESSION_PROFILE as an alternative to using static credentials in AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
    This addresses the spirit of https://github.com/jschneier/django-storages/issues/774 and implements the suggestion in https://github.com/jschneier/django-storages/issues/895
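If merged, usage would presumably look something like the following; the profile name is a placeholder and refers to an entry in ~/.aws/credentials:

```python
# settings.py -- use a named AWS profile instead of static credentials
AWS_S3_SESSION_PROFILE = 'my-profile'  # placeholder profile name
# AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are then omitted entirely
```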

    opened by dan-hook 19
  • Backend boto3 and signature v4

    I'm trying to use S3 with django-storages==1.5.0 and boto3==1.4.0, but I still have the same issue as with boto. I get errors regarding the signature:

    Missing required header for this request: x-amz-content-sha256

    My conf :

    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
    AWS_S3_REGION_NAME = 'eu-west-1'
    AWS_S3_HOST = 's3.eu-west-1.amazonaws.com'
    AWS_S3_SIGNATURE_VERSION = 'v4'
    

    If someone has the solution, it would be a pleasure to create a merge request updating the documentation on this point.
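One hedged observation about the configuration above: botocore spells Signature Version 4 as 's3v4', not 'v4', and with the boto3 backend the region is given via AWS_S3_REGION_NAME rather than AWS_S3_HOST. A sketch of a configuration along those lines:

```python
# settings.py -- sketch; values are illustrative
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
AWS_S3_REGION_NAME = 'eu-west-1'
AWS_S3_SIGNATURE_VERSION = 's3v4'  # botocore's name for Signature Version 4
```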

    s3boto 
    opened by jibaku 17
  • SFTP problem

    I get this error:

    File "/home/vagrant/.virtualenvs/gazex/lib/python3.5/site-packages/paramiko/client.py", line 70, in __init__
        self._system_host_keys = HostKeys()
    RecursionError: maximum recursion depth exceeded
    

    which really doesn't say much about the real problem. I know that there is a mkdir loop, but what causes it?

    bug 
    opened by adi- 17
  • Azure - Authentication error when `AZURE_CUSTOM_DOMAIN` set to Azure CDN

    Ever since the Azure backend was updated to the new azure-storage-blob library in v1.12, using django-storages with AZURE_CUSTOM_DOMAIN set results in Authentication errors when uploading files. For me this only happens with Akamai CDNs, but as reported by others below it affects other CDN types as well.

    v1.12 changed how AZURE_CUSTOM_DOMAIN is used with BlobServiceClient. In v1.11 and earlier, the custom domain was only used to get blob URLs. All other operations like uploading, streaming, getting metadata were being done by making requests to the actual storage account endpoint (https://<accountname>.blob.core.windows.net) even if a custom domain was specified.

    In v1.12, the behaviour changed so that the custom domain endpoint is used for all storage operations. This uncovered several different upstream issues causing various storage requests to fail with auth errors.

    Issues

    1. Auth error due to MAC signature mismatch when AZURE_CUSTOM_DOMAIN set to Akamai CDN

    Upstream issue: https://github.com/Azure/azure-sdk-for-python/issues/26381

    Uploads fail with the following error:

    ClientAuthenticationError: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
    ...
    authenticationerrordetail:The MAC signature found in the HTTP request '5dC3N7RcRW9V...' is not the same as any computed signature. Server used following string to sign: 'PUT
    
    
    1
    
    application/octet-stream
    
    
    
    
    
    
    x-ms-blob-type:BlockBlob
    x-ms-client-request-id:xxxx
    x-ms-date:Mon, 21 Feb 2022 07:34:35 GMT
    x-ms-version:2020-10-02
    Content: <?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed 
    correctly including the signature.
    RequestId:xxx
    Time:2022-02-21T07:34:35.1058206Z</Message><AuthenticationErrorDetail>The MAC signature found in the HTTP request '5dC3N7RcRW9V...' is not the same as any computed 
    signature. Server used following string to sign: 'PUT
    

    2. Forbidden ClientAuthenticationError when AZURE_CUSTOM_DOMAIN set to Microsoft CDN

    Upstream issue: https://github.com/Azure/azure-sdk-for-python/issues/23640

    Uploads fail with the following error (different than above):

    azure.core.exceptions.ClientAuthenticationError: Operation returned an invalid status 'Forbidden'
    ErrorCode:AuthenticationFailed
    
    opened by JeffreyCA 15
  • Allow multiple S3 buckets to be defined

    Use case: when dealing with multi-tenant services, it'd be ideal if we could define multiple S3 buckets, one for each client, and dynamically set the bucket to use with django-storages.

    Moreover, we do not have to look far for inspiration: we can reuse Django's DATABASES logic to handle and set the S3 bucket config. Users can then dynamically select the bucket with thread-local state or their preferred setup.

    ## settings.py
    STORAGES = {
        "S3_1": {
            "AWS_ACCESS_KEY_ID": "",
            "AWS_SECRET_ACCESS_KEY": "",
            "AWS_STORAGE_BUCKET_NAME": "",
            "AWS_CLOUDFRONT_URL": "",
            "AWS_S3_CUSTOM_DOMAIN": "",
        },
        "S3_2": {
            "AWS_ACCESS_KEY_ID": "",
            "AWS_SECRET_ACCESS_KEY": "",
            "AWS_STORAGE_BUCKET_NAME": "",
            "AWS_CLOUDFRONT_URL": "",
            "AWS_S3_CUSTOM_DOMAIN": "",
        },
    }

    ## S3Router.py
    class OpenStorage(S3Boto3Storage):
        storage_settings = settings.STORAGES["S3_1"]

    ## or
    dynamic_storage = S3Boto3Storage("S3_1")
    
    opened by VaZark 3
  • wrong usage of LibCloudStorage._get_object in LibCloudStorage._read

    I bumped into this bug when trying to make LibCloudStorage work with Django's ManifestFilesMixin, but that doesn't matter: it should be fixed regardless.

    Here is the current version, verbatim:

    def _get_object(self, name):
        """Get object by its name. [Return None if object not found]"""
        clean_name = self._clean_name(name)
        try:
            return self.driver.get_object(self.bucket, clean_name)
        except ObjectDoesNotExistError:
            return None
    
    def _read(self, name):
        obj = self._get_object(name)
        # TOFIX : we should be able to read chunk by chunk
        return next(self.driver.download_object_as_stream(obj, obj.size))
    

    my recommendation:

    def _read(self, name):
        obj = self._get_object(name)
        if obj is None:
            raise FileNotFoundError(f"{name} does not exist.")
        # TOFIX : we should be able to read chunk by chunk
        return next(self.driver.download_object_as_stream(obj, obj.size))
    

    and if you are curious about the exact trigger of the bug:

    class ManifestFilesMixin(HashedFilesMixin):
        def read_manifest(self):
            try:
                with self.manifest_storage.open(self.manifest_name) as manifest:
                    return manifest.read().decode()
            except FileNotFoundError:
                return None
    
    opened by engAmirEng 0
  • No rename method on GoogleCloudStorage

    I'm using the following to allow renaming:

    @deconstruct.deconstructible
    class GoogleCloudStorage(gcloud.GoogleCloudStorage):
        def path(self, name) -> typing.AnyStr:
            raise NotImplementedError()
    
        def get_accessed_time(self, name) -> datetime.datetime:
            raise NotImplementedError()
    
        def rename(self, old_name: str, new_name: str) -> None:
            blob = self.bucket.blob(old_name)
            self.bucket.rename_blob(blob, new_name)
    

    Requesting feature.

    opened by wieczorek1990 0
  • An RSA backend is required for signing cloudfront URLs?

    Hello,

    I am trying to use django-storages with AWS CloudFront. I created an RSA public and private key pair with 'openssl genrsa -out private_key.pem 2048' and 'openssl rsa -pubout -in private_key.pem -out public_key.pem' in a Cygwin window. I am using python-decouple to handle separation of settings from code. In my .ini file I put both of my RSA keys in with this syntax:

    AWS_CLOUDFRONT_KEY=-----BEGIN RSA PRIVATE KEY----- ____MIIEogIBAfKCAQEAn05f+B/dcarBHa4hTyPCSYgP9x39qnN74yLDmy4QGw8MaRaB ____lftda77PNuwj/DTjUc59YPlMM8HIS9D436I3ngPEnhn5B3ojfu80xr5zCIZXIynW ... ____bxnZAoGAG/sVbReKOpcHQo4bxJ+F4SzjyN+TQF7rNI3qgUe6S7xlHVsuzPgwPPrv ____q2Ax8mi24feX4VdxmXupT1rZ+DpNQN2K6YPtn/kaP93oh7dPpoMiC3jmNKUO3Zkr ____jbj6BO/UbcvI7noxgMTTCjSCHs2/VE6tuOkS635AH6HjO1Ag6i4= ____-----END RSA PRIVATE KEY----- AWS_CLOUDFRONT_KEY_ID=-----BEGIN PUBLIC KEY----- ____MIIBIjANBfkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAn05f+B/dcarBHa4hTyPC ____SYgP9x39qnN74yLDmy4QGw8MaRaBlftda77PNuwj/DTjUc59YPlMM8HIS9D436I3 ... ____JlEKvnt+sfI5aBI0o9ylSvIHqpnYeN8vsRswRbLUYti9k5wCjrhmKZTH5PudPruw ____MQIDAQAB ____-----END PUBLIC KEY-----

    In order to satisfy python-decouple's config requirements, the multiline keys have four spaces prepended to ensure that the entirety is taken as a single setting (Markdown keeps mucking this up, so I'm showing four underscores in place of the spaces). I have also tried prepending a tab instead of four spaces, without any luck. When I run manage.py runserver, my application launches, but I get this error:

    ImproperlyConfigured at / An RSA backend is required for signing cloudfront URLs. Supported backends are packages: cryptography and rsa.

    Looking at django-storages code, I see that this is the result of my key failing the check to be an RSA key in backends/s3boto3.py. Here's a code trace: `File "C:\Users\Al\PycharmProjects\Django-Dashboard-layout\env\lib\site-packages\storages\base.py", line 7, in __init__ default_settings = self.get_default_settings() │ └ <function S3Boto3Storage.get_default_settings at 0x05668E88> └ <core.storage_backends.StaticStorage object at 0x04D98B10>

    File "C:\Users\Al\PycharmProjects\Django-Dashboard-layout\env\lib\site-packages\storages\backends\s3boto3.py", line 291, in get_default_settings cloudfront_signer = self.get_cloudfront_signer(cloudfront_key_id, cloudfront_key) │ │ │ └ '-----BEGIN RSA PRIVATE KEY-----\nMIIEogIBAAKCAQEAn05f+B/dcarBHa4hTyPCSYgP9x39qnN74yLDmy4QGw8MaRaB\nlftda77PNuwj/DTjUc59YPlMM... │ │ └ '-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAn05f+B/dcarBHa4hTyPC\nSYgP9x39qnN74yLDmy4QGw8MaRaBlf... │ └ <function S3Boto3Storage.get_cloudfront_signer at 0x05668E40> └ <core.storage_backends.StaticStorage object at 0x04D98B10>

    File "C:\Users\Al\PycharmProjects\Django-Dashboard-layout\env\lib\site-packages\storages\backends\s3boto3.py", line 279, in get_cloudfront_signer return _cloud_front_signer_from_pem(key_id, key) │ │ └ '-----BEGIN RSA PRIVATE KEY-----\nMIIEogIBAAKCAQEAn05f+B/dcarBHa4hTyPCSYgP9x39qnN74yLDmy4QGw8MaRaB\nlftda77PNuwj/DTjUc59YPlMM... │ └ '-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAn05f+B/dcarBHa4hTyPC\nSYgP9x39qnN74yLDmy4QGw8MaRaBlf... └ <function _cloud_front_signer_from_pem at 0x056689C0>

    File "C:\Users\Al\PycharmProjects\Django-Dashboard-layout\env\lib\site-packages\storages\backends\s3boto3.py", line 84, in _cloud_front_signer_from_pem 'An RSA backend is required for signing cloudfront URLs.\n' `

    How can I get my RSA keys to load properly? Do I have to use an environment variable rather than python-decouple in this instance?

    Thanks--

    Al

    opened by algaspar 1
  • Django project deployed to GCP App Engine returns [Errno 30] Read-only file system when uploading an image

    Tried to look for solutions but I don't seem to have found any, so I'm opening a new issue: I have a Django project which uses a GCP bucket for image uploads. Locally it works fine, but when deployed on GCP App Engine, trying to upload an image raises this exception:

    [Errno 30] Read-only file system: '/workspace/media'
    

    I am confident my env vars are correct and the django-storages library gets set up correctly at startup, but I really can't understand why the backend is trying to access the local filesystem instead of uploading the image. The bucket setup code is below:

    if env("USE_BUCKET_AS_MEDIA_FOLDER"):
        # Set "media" folder
        DEFAULT_FILE_STORAGE = "myproject.gcsutils.Media"
    
        GS_BUCKET_NAME = env("MEDIA_FOLDER_BUCKET_NAME")
    
        # Add a unique ID to the file name if the same file name exists
        GS_FILE_OVERWRITE = False
    
        GCP_CREDENTIALS = env("GCP_CREDENTIALS")
    
        GS_CREDENTIALS = service_account.Credentials.from_service_account_info(
            json.loads(GCP_CREDENTIALS)
        )
    

    And here are the contents of myproject/gcsutils.py:

    from storages.backends.gcloud import GoogleCloudStorage
    
    def Media():
        return GoogleCloudStorage(location="media")
    

    Happy to add any useful details, any help would be very appreciated

    opened by HitLuca 0