https://django-storages.readthedocs.io/

Overview

Django-Storages

Installation

Installing from PyPI is as easy as doing:

pip install django-storages

If you'd prefer to install from source (maybe there is a bugfix in master that hasn't been released yet) then the magic incantation you are looking for is:

pip install -e 'git+https://github.com/jschneier/django-storages.git#egg=django-storages'

Once that is done set DEFAULT_FILE_STORAGE to the backend of your choice. If, for example, you want to use the boto3 backend you would set:

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'

If you are using FileSystemStorage as the storage class on your models' FileField fields, remove it and don't specify a storage parameter; that way the DEFAULT_FILE_STORAGE class will be used by default for the field. For example, if you have a photo field defined as:

photo = models.FileField(
    storage=FileSystemStorage(location=settings.MEDIA_ROOT),
    upload_to='photos',
)

Set it to just:

photo = models.FileField(
    upload_to='photos',
)

There are also a number of settings available to control how each storage backend functions; please consult the documentation for a comprehensive list.
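For example, a typical S3 configuration might combine a handful of these settings. The following is an illustrative sketch, not a complete reference; the bucket name and values are placeholders:

AWS_STORAGE_BUCKET_NAME = 'my-bucket'       # bucket to read from and write to
AWS_S3_REGION_NAME = 'eu-central-1'         # region the bucket lives in
AWS_QUERYSTRING_AUTH = False                # serve plain URLs instead of signed ones
AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=86400',        # extra parameters passed to S3 on upload
}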

About

django-storages is a project to provide a variety of storage backends in a single library.

This library is usually compatible with the currently supported versions of Django. Check the Trove classifiers in setup.py to be sure.

django-storages is backed in part by Tidelift. Check them out for all of your enterprise open source software commercial support needs.

Security

To report a security vulnerability, please use the Tidelift security contact. Tidelift will coordinate the fix and disclosure. Please do not post a public issue on the tracker.

History

This repo began as a fork of the original library under the package name of django-storages-redux and became the official successor (releasing under django-storages on PyPI) in February of 2016.

Found a Bug? Something Unsupported?

I suspect that a few of the storage engines in backends/ have been unsupported for quite a long time. I personally only really need the S3Storage backend, but welcome bug reports and (especially) patches and tests for the other backends.

Issues are tracked via GitHub issues at the project issue page.

Documentation

Documentation for django-storages is located at https://django-storages.readthedocs.io/.

Contributing

  1. Check for open issues at the project issue page or open a new issue to start a discussion about a feature or bug.
  2. Fork the django-storages repository on GitHub to start making changes.
  3. Add a test case to show that the bug is fixed or the feature is implemented correctly.
  4. Bug me until I can merge your pull request. Also, don't forget to add yourself to AUTHORS.
Issues
  • S3Boto3Storage raises ValueError: I/O operation on closed file.

    When running python manage.py collectstatic we get the following exception:

    Traceback (most recent call last):
      File "manage.py", line 10, in <module>
        execute_from_command_line(sys.argv)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/core/management/__init__.py", line 363, in execute_from_command_line
        utility.execute()
      File "items()venvs/app_root/lib/python3.5/site-packages/django/core/management/__init__.py", line 355, in execute
        self.fetch_command(subcommand).run_from_argv(self.argv)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/core/management/base.py", line 283, in run_from_argv
        self.execute(*args, **cmd_options)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/core/management/base.py", line 330, in execute
        output = self.handle(*args, **options)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 199, in handle
        collected = self.collect()
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 139, in collect
        for original_path, processed_path, processed in processor:
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/storage.py", line 246, in post_process
        for name, hashed_name, processed, _ in self._post_process(paths, adjustable_paths, hashed_files):
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/storage.py", line 312, in _post_process
        hashed_name = self.hashed_name(name, content_file)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/storage.py", line 109, in hashed_name
        file_hash = self.file_hash(clean_name, content)
      File "~/venvs/app_root/lib/python3.5/site-packages/django/contrib/staticfiles/storage.py", line 86, in file_hash
        for chunk in content.chunks():
      File "~/venvs/app_root/lib/python3.5/site-packages/django/core/files/base.py", line 76, in chunks
        self.seek(0)
    ValueError: I/O operation on closed file.
    

    This only happens when using django-storages 1.6.4 or above. Versions 1.6.3 and lower work fine.

    We're using Django 1.11.4, Python 3.5.2, and boto3 1.4.6.
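
    A workaround that circulated for this issue buffers the upload into a throwaway temporary file, so that boto3 closing it does not close the original file handle. This is a community sketch against the internal _save_content hook of the affected 1.6.x releases, not an official fix:

    import os
    from tempfile import SpooledTemporaryFile

    from storages.backends.s3boto3 import S3Boto3Storage

    class PatchedS3Storage(S3Boto3Storage):
        def _save_content(self, obj, content, parameters):
            # Copy the content into a fresh file object; boto3 may close the
            # file it uploads, and collectstatic reuses the original handle.
            content.seek(0, os.SEEK_SET)
            buffered = SpooledTemporaryFile()
            buffered.write(content.read())
            super(PatchedS3Storage, self)._save_content(obj, buffered, parameters)
            if not buffered.closed:
                buffered.close()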

    bug s3boto 
    opened by tsifrer 73
  • Add Google Cloud Storage backend using the gcloud-python library

    Based on https://github.com/jschneier/django-storages/pull/146

    This uses the native gcloud-python library, which is recommended by Google for production setups and allows finer-grained authentication than current solutions (which only work when Google Cloud Storage is used in S3 compatibility mode).

    • [x] Review code (see: https://github.com/jschneier/django-storages/pull/146#issuecomment-266873319)
    • [x] Add tests
    • [x] Add documentation
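
    For reference, wiring up the backend this PR adds looks roughly like the following sketch (the bucket name and credentials path are placeholders; GS_CREDENTIALS can be omitted if GOOGLE_APPLICATION_CREDENTIALS is set in the environment):

    # settings.py
    from google.oauth2 import service_account

    DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
    GS_BUCKET_NAME = 'my-bucket'
    GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
        '/path/to/service-account.json')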
    opened by scjody 32
  • Update Azure storage version

    Attempt to close #784. From what I can tell the updated version of azure-storage-blob doesn't have the same options for URL generation with is_emulated and custom_domain, so I've removed options that won't work anymore.

    There's a lot in here, so any feedback is appreciated, thanks!

    opened by pjsier 31
  • [s3] Add support for signing CloudFront URLs

    Fixes #456. Similar to #484 but uses AWS CloudFront signed URLS (See AWS docs)

    This is probably the right way to approach the problem in #484, but if that solution works it might be an alternative (not sure how @mattjegan configured S3+CloudFront).
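
    As the feature eventually landed, it is driven by a pair of settings alongside the distribution's custom domain; a minimal sketch with placeholder values:

    # settings.py
    AWS_S3_CUSTOM_DOMAIN = 'cdn.example.com'    # CloudFront distribution domain
    AWS_CLOUDFRONT_KEY_ID = 'APKAEXAMPLE'       # CloudFront key pair ID
    with open('/path/to/cloudfront_private_key.pem', 'rb') as f:
        AWS_CLOUDFRONT_KEY = f.read()           # contents of the private key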

    s3boto 
    opened by terencehonles 29
  • AWS S3 Frankfurt region not working

    COPY OF BITBUCKET ISSUE #214 - https://bitbucket.org/david/django-storages/issue/214/aws-s3-frankfurt-region-not-working

    "Andreas Schilling created an issue 2015-01-04

    Using the Frankfurt region (Germany) with django-storages produces an HTTP 400 error. S3 in the new region supports only Signature Version 4. In all other regions, Amazon S3 supports both Signature Version 4 and Signature Version 2.

    I assume django-storages only supports Signature Version 2. Is there any chance to support Version 4?"

    Thanks @jschneier for the fork! Is there a chance for django-storages-redux to support the eu-central-1 region?
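
    For anyone landing here later: current django-storages exposes settings that force Signature Version 4 and the region, which resolves this for eu-central-1. A minimal sketch:

    # settings.py
    AWS_S3_REGION_NAME = 'eu-central-1'
    AWS_S3_SIGNATURE_VERSION = 's3v4'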

    s3boto 
    opened by FPurchess 28
  • New S3 Boto3 backend (closes #57)

    This pull request implements https://github.com/jschneier/django-storages/issues/57 by adding a Boto 3 backend that tries to be a close-to-drop-in replacement for Boto 2. Due to the slight differences, I have kept it a separate backend, but with a lot of copy and paste code. Given that Boto 2 is heading towards maintenance mode according to https://github.com/boto/boto/commit/e3dd99695e8976fad88c1c55d69914199f1878db, I don't think it's worth trying to have 2 backends sharing code when the Boto 2 implementation looks on the way to being deprecated.

    Note that this isn't just me blithely throwing away the Boto 2 implementation; the fundamental underlying operations are VERY different and worthy of a separate backend. Boto 2 operates on the assumption that you can set arbitrary headers by passing in a dictionary; Boto 3 restricts you to specific named parameters and as such the 2 approaches are very incompatible with one another. You can try to do minor mappings here and there to try to map some of the headers in the AWS_HEADERS setting, but trying to map every possible header value to the right argument name in the right method is pretty tedious and error prone. Instead, this pull request embraces Boto 3's use of parameters as its way of taking in these extra arguments, leaving the remapping up to the django-storages user who wants to switch backends. For the limited number of extra headers and parameters they'll use, this mapping is easier to do, and looking up in Boto 3's documentation for the parameter name is straightforward.

    This pull request replaces https://github.com/jschneier/django-storages/issues/66, adding unit tests and incorporating changes due to pull requests accepted into Boto3/botocore. A substantially similar version of this (minus the recent merges from the past few days) has been run in production with Django 1.6.11 for several months without problems.

    Also note that while I was at it, I made the necessary change to support #95 if you switch to this backend.

    Because this is based on s3boto.py, you should be able to manually perform a diff of s3boto.py and the new s3boto3.py to understand the changes.

    Changes:

    • The AWS_HEADERS/storage.headers setting is replaced with AWS_S3_OBJECT_PARAMETERS/storage.object_parameters. The keys/values to this are intended to be the arguments to the http://boto3.readthedocs.org/en/latest/reference/services/s3.html#S3.Object.put boto3.Resource('s3').Object.put() method, which has all the arguments that would be used to generate the correct request headers. Note that this gives a little more access than just headers, since a user could also provide ACL and Content arguments in this dict.
    • Unlike the s3boto implementation, the s3boto3 backend does not currently support proxies (https://github.com/boto/botocore/issues/541) or alternate hosts/ports (https://github.com/boto/botocore/issues/601), because the underlying Boto3/botocore library does not support them; it only partially supports the endpoint URL.
    • If using s3v4, since botocore does not automatically redirect you to the correct region's endpoint nor sign properly unless it knows the region, there is an AWS_S3_REGION_NAME/storage.region_name setting to force the region.
    • Some behavior that the boto2 library performed is not done by boto3, so equivalent code has been added to s3boto3. This includes things like rewinding the file pointer, automatically checking for bucket and object existence, and directly writing the S3 contents to a file/file pointer. There are also cases where Boto3 does not allow previously possible operations, like locally updating the last_modified attribute, which this new version performs by simply reloading the object from the server.
    • No need to parse timestamps, since boto3 already performs this conversion for you.

    Known issues:

    • Boto3 does not support unsigned URLs (e.g. querystring_auth=False) in its API (https://github.com/boto/boto3/issues/169). This implements compatibility by parsing the querystring and stripping off any signature parameters. Note that these parameter names differ between s3v1/s3v4 URLs, but this implementation strips them all away.
    • To support s3v4 URLs for endpoints that aren't v4 by default (most of them), there are 2 ways. The one I've used in production is to have a ~/.aws/config file as generated by running "aws configure set default.s3.signature_version s3v4". If deploying in a configuration where there is no user home directory, the location of this config can be set with AWS_CONFIG_FILE environment variable. Another way involves setting the S3Boto3Storage.config variable with a Boto3 client config.

    This has been tested using s3v4 signatures with both signed and unsigned URLs, along with response content disposition and KMS server-side encryption key arguments.
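
    To make the first change above concrete, a minimal sketch of the new setting (values are placeholders; the keys are the keyword arguments of boto3's Object.put()):

    # settings.py
    AWS_S3_OBJECT_PARAMETERS = {
        'CacheControl': 'max-age=86400',
        'ServerSideEncryption': 'aws:kms',
    }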

    opened by mbarrien 27
  • Calculate settings when storage is instantiated; not imported

    Makes the storage classes usable with Django's override_settings. For example, projects can now change the S3 location & bucket when running their tests. Affects the following backends:

    • azure_storage
    • gcloud
    • gs
    • s3boto
    • s3boto3

    The remaining storage backends do not need to be altered, as they do not calculate settings at import time.

    The class BaseStorage was created to use a common pattern to initialize settings for various backends. Users of these backends can expect them to work consistently. This class is fully backwards compatible. Settings can now be set:

    • In settings.py
    • Using override_settings
    • As a class variable
    • As an argument to __init__

    Closes #498

    Completes all necessary backends and adds tests.
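
    A sketch of what this enables in a test suite (the bucket names are placeholders):

    from django.test import TestCase, override_settings
    from storages.backends.s3boto3 import S3Boto3Storage

    class StorageSettingsTests(TestCase):
        @override_settings(AWS_STORAGE_BUCKET_NAME='test-bucket')
        def test_overridden_bucket(self):
            # Settings are read at instantiation, so the override applies.
            storage = S3Boto3Storage()
            self.assertEqual(storage.bucket_name, 'test-bucket')

        def test_init_argument(self):
            # Settings can also be passed directly to __init__.
            storage = S3Boto3Storage(bucket_name='other-bucket')
            self.assertEqual(storage.bucket_name, 'other-bucket')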

    opened by jdufresne 23
  • Allow the use of AWS profiles for S3 access

    This uses the setting AWS_S3_SESSION_PROFILE as an alternative to using static credentials in AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
    This addresses the spirit of https://github.com/jschneier/django-storages/issues/774 and implements the suggestion in https://github.com/jschneier/django-storages/issues/895
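
    A minimal sketch of the setting as proposed (the profile name is a placeholder and must exist in ~/.aws/credentials or ~/.aws/config):

    # settings.py
    AWS_S3_SESSION_PROFILE = 'staging'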

    opened by dan-hook 19
  • Detect Content-Type when content_type property is null

    Fall back on other content-type detection methods when the content_type property is None. Fixes #406.

    opened by bxm156 19
  • SFTP problem

    I get this error:

    File "/home/vagrant/.virtualenvs/gazex/lib/python3.5/site-packages/paramiko/client.py", line 70, in __init__
        self._system_host_keys = HostKeys()
    RecursionError: maximum recursion depth exceeded
    

    which really doesn't say much about the real problem. I know that there is a mkdir loop, but what causes it?

    bug 
    opened by adi- 17
  • Azure Blob - collectstatic not working

    Hi,

    my running project is not working:

    • Django 4.0rc1
    • Django-storages 1.12.3

    Requirements:

        django >= 4.0rc1
        mysqlclient
        django-froala-editor
        pillow
        stripe
        django-storages[azure]
    
    
    custom_azure.py:
    
    from storages.backends.azure_storage import AzureStorage
    
    class AzureMediaStorage(AzureStorage):
        account_name = 'foo'  # Must be replaced by your <storage_account_name>
        account_key = 'zzzz=' # Must be replaced by your <storage_account_key>
        azure_container = 'media'
        expiration_secs = None
        overwrite_files = True
    
    class AzureStaticStorage(AzureStorage):
        account_name = 'foo'  # Must be replaced by your storage_account_name
        account_key = 'zzzz=' # Must be replaced by your <storage_account_key>
        azure_container = 'static'
        expiration_secs = None
        overwrite_files = True
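
    Presumably the two classes are wired up in settings.py along these lines (a reconstruction; the module path is assumed):

    # settings.py
    DEFAULT_FILE_STORAGE = 'custom_azure.AzureMediaStorage'
    STATICFILES_STORAGE = 'custom_azure.AzureStaticStorage'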
    

    Error output:

    Traceback (most recent call last):
      File "/mnt/c/Program Files/JetBrains/PyCharm 2021.2.3/plugins/python/helpers/pycharm/django_manage.py", line 52, in <module>
        run_command()
      File "/mnt/c/Program Files/JetBrains/PyCharm 2021.2.3/plugins/python/helpers/pycharm/django_manage.py", line 46, in run_command
        run_module(manage_file, None, '__main__', True)
      File "/usr/lib64/python3.9/runpy.py", line 210, in run_module
        return _run_module_code(code, init_globals, run_name, mod_spec)
      File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code
        _run_code(code, mod_globals, init_globals,
      File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code
        exec(code, run_globals)
      File "/home/westcoast-dk/PycharmProjects/WestcoastShop/WestcoastShop/manage.py", line 24, in <module>
        execute_from_command_line(sys.argv)
      File "/usr/local/lib/python3.9/site-packages/django/core/management/__init__.py", line 425, in execute_from_command_line
        utility.execute()
      File "/usr/local/lib/python3.9/site-packages/django/core/management/__init__.py", line 419, in execute
        self.fetch_command(subcommand).run_from_argv(self.argv)
      File "/usr/local/lib/python3.9/site-packages/django/core/management/base.py", line 373, in run_from_argv
        self.execute(*args, **cmd_options)
      File "/usr/local/lib/python3.9/site-packages/django/core/management/base.py", line 417, in execute
        output = self.handle(*args, **options)
      File "/usr/local/lib/python3.9/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 187, in handle
        collected = self.collect()
      File "/usr/local/lib/python3.9/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 114, in collect
        handler(path, prefixed_path, storage)
      File "/usr/local/lib/python3.9/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 334, in copy_file
        if not self.delete_file(path, prefixed_path, source_storage):
      File "/usr/local/lib/python3.9/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 248, in delete_file
        if self.storage.exists(prefixed_path):
      File "/usr/local/lib/python3.9/site-packages/storages/backends/azure_storage.py", line 241, in exists
        blob_client.get_blob_properties()
      File "/usr/local/lib/python3.9/site-packages/azure/core/tracing/decorator.py", line 83, in wrapper_use_tracer
        return func(*args, **kwargs)
      File "/usr/local/lib/python3.9/site-packages/azure/storage/blob/_blob_client.py", line 1242, in get_blob_properties
        process_storage_error(error)
      File "/usr/local/lib/python3.9/site-packages/azure/storage/blob/_shared/response_handlers.py", line 177, in process_storage_error
        exec("raise error from None")   # pylint: disable=exec-used # nosec
      File "<string>", line 1, in <module>
    azure.core.exceptions.ClientAuthenticationError: Operation returned an invalid status 'Forbidden'
    ErrorCode:AuthenticationFailed
    
    Process finished with exit code 1
    
    

    I haven't changed anything on the storage account or in settings.py.

    regards Christopher

    opened by raucodes 8
  • Update supported Python and Django versions

    • drop unsupported versions (e.g. Python 3.5 or Django 3.0)
    • add latest versions (e.g. Python 3.10 or Django 4.0)
    opened by pauloxnet 0
  • Add ability to get object/file attributes with listdir variant

    I sometimes want to get the attributes (size, modtime) for my S3 and local filesystem objects when using listdir. I could not see how to do that, so I created custom storage classes like this. Can this capability be added to the base implementations?

    class CustomS3Boto3Storage(S3Boto3Storage):
        def listdirAttributes(self, name):
            path = self._normalize_name(self._clean_name(name))
            # The path needs to end with a slash, but if the root is empty, leave it.
            if path and not path.endswith('/'):
                path += '/'

            directories = []
            files = []
            paginator = self.connection.meta.client.get_paginator('list_objects')
            pages = paginator.paginate(Bucket=self.bucket_name, Delimiter='/', Prefix=path)
            for page in pages:
                for entry in page.get('CommonPrefixes', ()):
                    directories.append(posixpath.relpath(entry['Prefix'], path))
                for entry in page.get('Contents', ()):
                    key = entry['Key']
                    if key != path:
                        attrs = {}
                        attrs['name'] = posixpath.relpath(key, path)
                        attrs['size'] = entry['Size']
                        attrs['modified'] = str(entry['LastModified'])
                        files.append(attrs)
            return directories, files
    

    class CustomFileSystemStorage(FileSystemStorage):
        def listdirAttributes(self, path):
            path = self.path(path)
            dtz = timezone.get_default_timezone()
            directories, files = [], []

            for entry in os.listdir(path):
                if os.path.isdir(os.path.join(path, entry)):
                    directories.append(entry)
                else:
                    attrs = {}
                    attrs['name'] = entry
                    file = os.stat(os.path.join(path, entry))
                    attrs['size'] = file.st_size
                    attrs['modified'] = str(datetime.fromtimestamp(file.st_mtime, dtz))
                    files.append(attrs)
            return directories, files
    
    opened by johnbyrne7 0
  • Oracle cloud usage docs

    opened by henriqueccapozzi 0
  • Add ability to make some content types public even if by default it requires a query parameter authentication

    Even if the default bucket ACL is private (AWS, GCP, Azure, etc.), there are cases where some content types need to be public.

    Here is an example using the GoogleCloudStorage

    settings.py

    PUBLIC_CONTENT_TYPES = [
        'text/css',
        'text/javascript',
        'application/javascript',
        'application/x-javascript', 
    ]
    

    backends/gcloud.py

    class GoogleCloudStorage(BaseStorage):
        ...
        
        def _is_public(self, content_type):
            return content_type in settings.PUBLIC_CONTENT_TYPES
                
        def _save(self, name, content):
            cleaned_name = clean_name(name)
            name = self._normalize_name(cleaned_name)
    
            content.name = cleaned_name
            file = GoogleCloudFile(name, 'rw', self)
            file.blob.cache_control = self.cache_control
            file.blob.upload_from_file(
                content, rewind=True, size=content.size,
                content_type=file.mime_type, predefined_acl=self.default_acl)
                
            if self._is_public(file.blob.content_type):
                file.blob.make_public()
                
            return cleaned_name
                
        def url(self, name):
            ...
            
            no_signed_url = (
                self.default_acl == 'publicRead' or not self.querystring_auth or self._is_public(blob.content_type))
                
            ...
    
    opened by jkevingutierrez 0
  • Docs for multiple buckets with varying properties, Digital Ocean signed URLs with CDN domain

    from guide posted in #944, also referenced in #395

    I double checked and cleaned up the example template filter, but I don't use DO Spaces and would rather not pay 5 bucks to test this if possible, perhaps @lifenautjoe can verify that this all works on his DO account.

    opened by awhileback 0
  • Add content_type to the object param function

    I want to change the object parameters on a per-object basis. But I don't want the same params for every type of object. For example, I want ContentDisposition: attachment for images, but not for PDF files.

    My current workaround is this:

    import mimetypes

    from storages.backends.s3boto3 import S3Boto3Storage

    class MediaRootS3Boto3Storage(S3Boto3Storage):
        location = "media"
        file_overwrite = False
    
        def _get_write_parameters(self, name, content=None):
            params = {}
    
            _type, encoding = mimetypes.guess_type(name)
            content_type = getattr(content, 'content_type', None)
            content_type = content_type or _type or self.default_content_type
    
            params['ContentType'] = content_type
            if encoding:
                params['ContentEncoding'] = encoding
    
            params.update(self.get_object_parameters_for_type(name, content_type))
    
            if 'ACL' not in params and self.default_acl:
                params['ACL'] = self.default_acl
    
            return params
    
        def get_object_parameters_for_type(self, name, content_type):
            object_params = self.object_parameters.copy()
            if 'image' in content_type:
                object_params.update(
                    ContentDisposition='attachment'
                )
            return object_params
    

    In this PR I update get_object_parameters so that it also receives content_type as a parameter.

    opened by aljazkosir 2
  • Renaming files using S3Boto3Storage deletes the object

    Using Django-storages 1.11.1 and Python 3.8.

    If I rename a FileField object (i.e. change its name attribute), the original key is deleted from S3, but the file field is stored and renamed correctly in the database.

    Is this a bug or expected behavior? If expected, can this be documented?

    opened by piraka9011 0
  • Upload File Using Signed URL - Google Cloud Storage

    For a heavy media-dependent application, we are using django-storages to store files in Google Cloud Storage. However, we don't want to keep our server engaged processing the entire upload, so we are considering using a signed URL to upload files directly to Google Cloud. We can achieve this in Python, but I want to do it in a more Django-like way. I am looking for suggestions on how to do it... Thanks
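
    For reference, generating a V4 signed upload URL with the google-cloud-storage client looks roughly like this sketch (bucket and object names are placeholders):

    from datetime import timedelta
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket('my-bucket').blob('uploads/video.mp4')
    upload_url = blob.generate_signed_url(
        version='v4',
        expiration=timedelta(minutes=15),
        method='PUT',                    # the client PUTs the file directly to GCS
        content_type='video/mp4',        # client must send the same Content-Type
    )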

    opened by maahad767 0