Microsoft Azure Storage Library for Python

Overview

Microsoft Azure Storage SDK for Python

NEWS!! azure-storage-blob version 12.0.0 is GA now!

Here is the link to the v12.0.0 repo.

Note: the current repo is for azure-storage-blob<=2.1.0; upgrading to v12.0.0 could break your current code. Follow the link for breaking change details.

This project provides a client library in Python that makes it easy to consume Microsoft Azure Storage services. For documentation please see the Microsoft Azure Python Developer Center and our API Reference (also available on readthedocs).

If you are looking for the Service Bus or Azure Management libraries, please visit https://github.com/Azure/azure-sdk-for-python.

Compatibility

IMPORTANT: If you have an earlier version of the azure package (version < 1.0), you should uninstall it before installing this package.

You can check the version using pip:

pip freeze

If you see azure==0.11.0 (or any version below 1.0), uninstall it first, then reinstall it:

pip uninstall azure
pip install azure

If you are upgrading from a version older than 0.30.0, see the upgrade doc, the usage samples in the samples directory, and the ChangeLog and BreakingChanges.

If you are encountering problems installing azure storage on Azure Web Apps, upgrading pip might help.

IMPORTANT: If you have an earlier version of the azure-storage package (version <= 0.36.0), you should uninstall it before installing the new split packages.

You can check the version using pip:

pip freeze

If you see azure-storage==0.36.0 (or any earlier version), uninstall it first:

pip uninstall azure-storage

Features

  • Blob
    • Create/Read/Update/Delete Containers
    • Create/Read/Update/Delete Blobs
    • Advanced Blob Operations
  • Queue
    • Create/Delete Queues
    • Insert/Peek Queue Messages
    • Advanced Queue Operations
  • Files
    • Create/Update/Delete Shares
    • Create/Update/Delete Directories
    • Create/Read/Update/Delete Files
    • Advanced File Operations

Getting Started

Download

The Azure Storage SDK for Python is composed of 5 packages:

  • azure-storage-blob
    • Contains the blob service APIs.
  • azure-storage-file
    • Contains the file service APIs.
  • azure-storage-queue
    • Contains the queue service APIs.
  • azure-storage-common
    • Contains common code shared by blob, file and queue.
  • azure-storage-nspkg
    • Owns the azure.storage namespace; users should not use this package directly.

Note: prior to and including version 0.36.0, there was a single package (azure-storage) containing all services. It is no longer supported, and users should instead install the three service packages listed above individually, as needed. In addition, the table package is no longer released under the azure-storage namespace; please refer to azure-cosmosdb-table.

Option 1: Via PyPI

To install via the Python Package Index (PyPI), type:

pip install azure-storage-blob
pip install azure-storage-file
pip install azure-storage-queue

Option 2: Source Via Git

To get the source code of the SDK via git, type:

git clone git://github.com/Azure/azure-storage-python.git

cd ./azure-storage-python/azure-storage-nspkg
python setup.py install

cd ../azure-storage-common
python setup.py install

cd ../azure-storage-blob
python setup.py install

Replace azure-storage-blob with azure-storage-file or azure-storage-queue to install the other services.

Option 3: Source Zip

Download a zip of the code via GitHub or PyPI. Then follow the same instructions as in Option 2.

Minimum Requirements

  • Python 2.7, 3.3-3.7.
  • See setup.py for dependencies

Usage

To use this SDK to call Microsoft Azure Storage services, you first need to create a storage account.
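
Once you have an account, you create a service object for the service you want to call and authenticate it with the account credentials. Below is a minimal sketch for the blob service; the account name, key, and container name are placeholders, and a sas_token or connection_string can be supplied instead of an account key.

from azure.storage.blob import BlockBlobService

# Authenticate with the storage account name and one of its access keys.
block_blob_service = BlockBlobService(account_name='myaccount', account_key='mykey')

# Create a container to hold blobs (returns False if it already exists).
block_blob_service.create_container('mycontainer')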

Logging

To make debugging easier, it is recommended to turn on logging for the logger named 'azure.storage'. Here are two example configurations:

import logging

# Basic configuration: configure the root logger, which includes 'azure.storage'
logging.basicConfig(format='%(asctime)s %(name)-20s %(levelname)-5s %(message)s', level=logging.INFO)

# More advanced configuration allowing finer control over the 'azure.storage' logger
logger = logging.getLogger('azure.storage')
handler = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s %(name)-20s %(levelname)-5s %(message)s')
handler.setFormatter(formatter)
logger.addHandler(handler)
logger.setLevel(logging.INFO)

Here is how the logging levels are used; INFO is the recommended level:

  • DEBUG: log strings to sign
  • INFO: log outgoing requests and responses, as well as retry attempts
  • WARNING: not used
  • ERROR: log calls that still failed after all the retries

Code Sample

See the samples directory for blob, queue, and file usage samples.
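
As a quick orientation before browsing the samples, here is a hedged sketch of a few common calls for each service; the account credentials and the container, queue, and share names below are placeholders.

from azure.storage.blob import BlockBlobService
from azure.storage.queue import QueueService
from azure.storage.file import FileService

ACCOUNT_NAME = 'myaccount'
ACCOUNT_KEY = 'mykey'

# Blob: upload a local file and download it again.
blob_service = BlockBlobService(account_name=ACCOUNT_NAME, account_key=ACCOUNT_KEY)
blob_service.create_container('mycontainer')
blob_service.create_blob_from_path('mycontainer', 'myblob.txt', 'local_file.txt')
blob_service.get_blob_to_path('mycontainer', 'myblob.txt', 'downloaded_file.txt')

# Queue: enqueue a message, then read and delete it.
queue_service = QueueService(account_name=ACCOUNT_NAME, account_key=ACCOUNT_KEY)
queue_service.create_queue('myqueue')
queue_service.put_message('myqueue', u'Hello World')
for message in queue_service.get_messages('myqueue'):
    print(message.content)
    queue_service.delete_message('myqueue', message.id, message.pop_receipt)

# File: create a share and upload a file into its root directory.
file_service = FileService(account_name=ACCOUNT_NAME, account_key=ACCOUNT_KEY)
file_service.create_share('myshare')
file_service.create_file_from_path('myshare', None, 'myfile.txt', 'local_file.txt')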

Need Help?

Be sure to check out the Microsoft Azure Developer Forums on MSDN or the Developer Forums on Stack Overflow if you have trouble with the provided code.

Contribute Code or Provide Feedback

If you would like to become an active contributor to this project, please follow the instructions provided in Azure Projects Contribution Guidelines. You can find more details for contributing in the CONTRIBUTING.md doc.

If you encounter any bugs with the library, please file an issue in the Issues section of the project.

Learn More

Comments
  • cannot import name 'BlockBlobService'

    Hi,

    I have azure-storage-blob version 0.37.1 and I still get an error when I import BlockBlobService.

    from azure.storage.blob import BlockBlobService # import azure sdk packages

    Error message: ImportError: cannot import name 'BlockBlobService'

    Here is a list of existing azure storage packages in my current virtual environment:

    azure-storage-blob==0.37.1
    azure-storage-common==0.37.1
    azure-storage-nspkg==2.0.0

    How can I fix this?

    Thanks.

    opened by melzoghbi 41
  • Random authentication error when using account SAS to upload large file to blob storage

    Hi guys,

    I am seeing random authentication issues when using account SAS to upload large files ( >20 GB) to blob storage.

    azure.common.AzureHttpError: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
    <?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
    RequestId:aa7f238b-0001-0006-3d71-cf3f37000000
    Time:2017-05-18T00:54:58.0436290Z</Message><AuthenticationErrorDetail>sr is mandatory. Cannot be empty</AuthenticationErrorDetail></Error>
    

    The authentication error detail showed sr is mandatory. Cannot be empty. Per the Account SAS documentation, an account SAS doesn't have the sr parameter while a service SAS has it. Using the same token, I was able to successfully upload 2 large files at different times, but it also failed many times due to the authentication issue. The SAS token was given to me and I do not have direct access to the account, so I may not be able to change it or generate a service SAS. Could you please take a look and check whether this is a bug, or whether I am simply missing something?

    Thanks

    bug 
    opened by thanhnhut90 16
  • ModuleNotFoundError: No module named 'azure.storage' not always reproducable

    Which service(blob, file, queue) does this issue concern?

    queue (possibly others?)

    What problem was encountered?

    Installing azure-storage-queue (in a clean virtualenv) frequently (so not always) leads to a module installation which cannot be imported. The installation steps followed are the ones documented in https://docs.microsoft.com/en-us/azure/storage/queues/storage-python-how-to-use-queue-storage

    In a fresh Python-3.6.3 virtualenv

    $ pip install azure-storage-queue
    ... installation happens successfully, no errors ...
    $  python -c 'from azure.storage.queue import QueueService'
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
    ModuleNotFoundError: No module named 'azure.storage'
    
    

    Sometimes the installation does result in a module which can be imported without errors, but most often it does not.

    Have you found a mitigation/solution?

    A mitigation, yes. Running

    $ pip install azure-storage-queue --upgrade --force-reinstall

    fixes the problem. Keep in mind that the installation is done on a clean virtualenv.

    I'd expect the import to succeed without any further ado after installing azure-storage-queue. If specific details are required, please let me know.

    Tnx,

    Jelle

    opened by smetj 15
  • ImportError: No module named storage.blob

    from azure.storage.blob import BlockBlobService

    block_blob_service = BlockBlobService(account_name='myaccount', account_key='mykey')

    block_blob_service.create_container('backups')

    Part of my pip freeze:

    $ pip freeze
    appdirs==1.4.0
    azure==1.0.3
    azure-common==1.1.4
    azure-mgmt==0.20.2
    azure-mgmt-common==0.20.0
    azure-mgmt-compute==0.20.1
    azure-mgmt-network==0.20.1
    azure-mgmt-nspkg==1.0.0
    azure-mgmt-resource==0.20.1
    azure-mgmt-storage==0.20.0
    azure-nspkg==1.0.0
    azure-servicebus==0.20.1
    azure-servicemanagement-legacy==0.20.2
    azure-storage==0.33.0

    The 0.33.0 version is clearly there, so why does it keep saying there is no module named storage.blob?

    opened by michaelseto 15
  • azure.storage.blob.baseblobservice.BaseBlobService.exists logs error if blob does not exist

    Which service(blob, file, queue) does this issue concern?

    azure.storage.blob.baseblobservice.BaseBlobService.exists

    What problem was encountered?

    When checking whether a blob exists in a container with azure.storage.blob.baseblobservice.BaseBlobService.exists, an error is logged when the blob does not exist in the container. The expected behavior would be that no error is logged when the blob does not exist.

    Have you found a mitigation/solution?

    Hint: azure.storage.blob.baseblobservice.BaseBlobService.get_blob_properties sends a request via _perform_request that raises an exception which is logged due to https://github.com/Azure/azure-storage-python/blob/master/azure-storage-common/azure/storage/common/storageclient.py#L348

    fixed/waiting for release 
    opened by tim-werner 14
  • Upload to blob fails with timeout error much sooner than timeout expires

    blob_service.create_blob_from_path(container, blob_path, file, content_settings=content_settings, timeout=300)
    
    $ time make upload-big-files
    ./src/7.upload-2-azure.sh    16041-big-files1   az--repoX
    Init of upload_pack.py
    Latest local pack found: 1.5.3.0203.161011-112303Z
    Adding additional mime content types
    INFO:requests.packages.urllib3.connectionpool:Starting new HTTPS connection (1): repoX.blob.core.windows.net
    DEBUG:requests.packages.urllib3.connectionpool:"GET /repoX-pub?restype=container HTTP/1.1" 200 None
    DEBUG:requests.packages.urllib3.connectionpool:"HEAD /repoX-pub/repoX/games/big-files/files-v1/latest-build.cfg HTTP/1.1" 200 0
    blob: repoX/games/big-files/files-v1/latest-build.cfg EXISTS
    DEBUG:requests.packages.urllib3.connectionpool:"GET /repoX-pub/repoX/games/big-files/files-v1/latest-build.cfg HTTP/1.1" 206 73
    DEBUG:requests.packages.urllib3.connectionpool:"HEAD /repoX-pub/repoX/games/big-files/files-v1/1.5.3.0203-161010-165722/.ok HTTP/1.1" 404 0
    Local version (1.5.3.0203.161011-112303Z) is equal or newer than Azure version (1.5.3.0203-161010-165722)
    uploading webgl.memgz
    INFO:requests.packages.urllib3.connectionpool:Starting new HTTPS connection (2): repoX.blob.core.windows.net
    DEBUG:requests.packages.urllib3.connectionpool:"PUT /repoX-pub/repoX/games/big-files/files-v1/1.5.3.0203.161011-112303Z/bin.webgl/webgl.mem?timeout=300 HTTP/1.1" 201 None
    uploading webgl.datagz
    INFO:requests.packages.urllib3.connectionpool:Starting new HTTPS connection (3): repoX.blob.core.windows.net
    INFO:requests.packages.urllib3.connectionpool:Starting new HTTPS connection (4): repoX.blob.core.windows.net
    INFO:requests.packages.urllib3.connectionpool:Starting new HTTPS connection (5): repoX.blob.core.windows.net
    ########################################################
    Traceback (most recent call last):
      File "/usr/local/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 544, in urlopen
        body=body, headers=headers)
      File "/usr/local/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 349, in _make_request
        conn.request(method, url, **httplib_request_kw)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 1106, in request
        self._send_request(method, url, body, headers)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 1151, in _send_request
        self.endheaders(body)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 1102, in endheaders
        self._send_output(message_body)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 936, in _send_output
        self.send(message_body)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 908, in send
        self.sock.sendall(data)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 891, in sendall
        v = self.send(data[count:])
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 861, in send
        return self._sslobj.write(data)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 586, in write
        return self._sslobj.write(data)
    socket.timeout: The write operation timed out
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/lib/python3.5/site-packages/requests/adapters.py", line 370, in send
        timeout=timeout
      File "/usr/local/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 597, in urlopen
        _stacktrace=sys.exc_info()[2])
      File "/usr/local/lib/python3.5/site-packages/requests/packages/urllib3/util/retry.py", line 245, in increment
        raise six.reraise(type(error), error, _stacktrace)
      File "/usr/local/lib/python3.5/site-packages/requests/packages/urllib3/packages/six.py", line 309, in reraise
        raise value.with_traceback(tb)
      File "/usr/local/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 544, in urlopen
        body=body, headers=headers)
      File "/usr/local/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 349, in _make_request
        conn.request(method, url, **httplib_request_kw)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 1106, in request
        self._send_request(method, url, body, headers)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 1151, in _send_request
        self.endheaders(body)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 1102, in endheaders
        self._send_output(message_body)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 936, in _send_output
        self.send(message_body)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 908, in send
        self.sock.sendall(data)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 891, in sendall
        v = self.send(data[count:])
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 861, in send
        return self._sslobj.write(data)
      File "/usr/local/Cellar/python3/3.5.2_2/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 586, in write
        return self._sslobj.write(data)
    requests.packages.urllib3.exceptions.ProtocolError: ('Connection aborted.', timeout('The write operation timed out',))
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/lib/python3.5/site-packages/azure/storage/storageclient.py", line 212, in _perform_request
        response = self._httpclient.perform_request(request)
      File "/usr/local/lib/python3.5/site-packages/azure/storage/_http/httpclient.py", line 114, in perform_request
        proxies=self.proxies)
      File "/usr/local/lib/python3.5/site-packages/requests/sessions.py", line 465, in request
        resp = self.send(prep, **send_kwargs)
      File "/usr/local/lib/python3.5/site-packages/requests/sessions.py", line 573, in send
        r = adapter.send(request, **kwargs)
      File "/usr/local/lib/python3.5/site-packages/requests/adapters.py", line 415, in send
        raise ConnectionError(err, request=request)
    requests.exceptions.ConnectionError: ('Connection aborted.', timeout('The write operation timed out',))
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/myroot/src/azure_module.py", line 92, in upload_azure_blob
        blob_service.create_blob_from_path(container, blob_path, file, content_settings=content_settings, timeout=300)
      File "/usr/local/lib/python3.5/site-packages/azure/storage/blob/blockblobservice.py", line 393, in create_blob_from_path
        timeout=timeout)
      File "/usr/local/lib/python3.5/site-packages/azure/storage/blob/blockblobservice.py", line 490, in create_blob_from_stream
        timeout=timeout)
      File "/usr/local/lib/python3.5/site-packages/azure/storage/blob/blockblobservice.py", line 814, in _put_blob
        return self._perform_request(request, _parse_base_properties)
      File "/usr/local/lib/python3.5/site-packages/azure/storage/storageclient.py", line 266, in _perform_request
        raise ex
      File "/usr/local/lib/python3.5/site-packages/azure/storage/storageclient.py", line 242, in _perform_request
        raise AzureException(ex.args[0])
    azure.common.AzureException: ('Connection aborted.', timeout('The write operation timed out',))
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/myroot/src/upload_pack.py", line 158, in <module>
        handle_deploy_target(deploy_yaml, uploader_yaml)
      File "/myroot/src/upload_pack.py", line 130, in handle_deploy_target
        upload_pack_files_to_azure(pack_upload_patterns, container, game_azure_base_dir, blob_service)
      File "/myroot/src/upload_pack.py", line 63, in upload_pack_files_to_azure
        azure_module.upload_azure_blob(file, container, dest_path, blob_service, content_encoding, cache_control)
      File "/myroot/src/azure_module.py", line 95, in upload_azure_blob
        logger.exception(error)
    NameError: name 'error' is not defined
    make: *** [upload-big-files] Error 1
    
    real  2m10.694s
    user  0m1.964s
    sys 0m1.381s
    
    feature request/enhancement 
    opened by joaocc 14
  • Azure Web Apps fail to install Azure-Storage 0.33.0

    Hello, the issue is easily reproducible: create a Flask web app, add azure-storage to requirements.txt, and try to push it to the Web App... It will fail with Python 2.7 and 3.4. See the attached log file.

    Running setup.py install for cryptography
    BLA-BLA-BLA
    remote: distutils.errors.DistutilsError: Setup script exited with error: Unable to find vcvarsall.bat
    remote:
    remote: ----------------------------------------
    remote: Cleaning up...
    remote: .........................
    remote: Command D:\home\site\wwwroot\env\Scripts\python.exe -c "import setuptools, tokenize;__file__='D:\\home\\site\\wwwroot\\env\\build\\cryptography\\setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record D:\local\Temp\pip-1tk19zep-record\install-record.txt --single-version-externally-managed --compile --install-headers D:\home\site\wwwroot\env\include\site\python3.4 failed with error code 1 in D:\home\site\wwwroot\env\build\cryptography
    remote: Storing debug log for failure in D:\home\pip\pip.log
    remote: An error has occurred during web site deployment.
    

    You owe me a couple of days of my life :( So it works for any version without that stupid broken cryptography.

    fail.txt

    opened by 4c74356b41 14
  • Add a BaseBlobService.list_blob_names method

    This can currently also be achieved via [b.name for b in bbs.list_blobs()], but that parses the full XML and discards most of the parsed information again. With this change, listing the blob names is no longer CPU-bound for us.

    opened by xhochy 13
  • listing a "folder" under container

    How can I list files matching a specific regular expression? If that is not possible, how can I at least get just one level of names (e.g. if the paths are CONTAINER/top1/bottom and CONTAINER/top2/bottom, I would like to get only top1 and top2 rather than listing all the blobs under the container)? I know I can give a prefix to list_blobs, but that won't do for what I need above, will it?
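
    A minimal sketch of the usual workaround, assuming the legacy list_blobs API with its delimiter parameter; the account and container names are placeholders. With a delimiter, only one level is returned: BlobPrefix entries for the virtual folders plus any blobs at that level.

    from azure.storage.blob import BlockBlobService
    from azure.storage.blob.models import BlobPrefix

    blob_service = BlockBlobService(account_name='myaccount', account_key='mykey')

    # With delimiter='/', listing stops at the first '/' after the prefix,
    # so 'top1/' and 'top2/' come back as BlobPrefix items instead of every blob.
    for item in blob_service.list_blobs('CONTAINER', delimiter='/'):
        if isinstance(item, BlobPrefix):
            print(item.name)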

    opened by chanansh 13
  • Consider separating container public access from get/set_container_acl

    Currently if you want to know the public access level of a container, you have to use the get_container_acl call. One would reasonably expect that get_container_acl would return the ACL only, not the ACL and maybe some other properties too. Public access seems like it would logically be returned by the get_container_properties call since it is a property of the container itself, whereas the ACL is a list of stored settings for use in generating SAS tokens.

    Another scenario in which this conflation of the two concepts is painful is when you just want to update the public access policy for a container. This can easily be done in the portal, but if you try:

    set_container_acl(container, access_policy='blob')
    

    This will result in wiping out your container's ACL, which almost certainly wasn't your intent. Instead, you have to do a call to get_container_acl (even though you don't want to change anything) and then pass the signed identifiers back in to the call to set_container_acl to preserve them.

    There isn't a call for set_container_properties but it might be good from a usability standpoint to consider creating one and splitting out that functionality.
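
    A hedged sketch of the get-then-set round trip described above, assuming the signed_identifiers and public_access keyword arguments of the current API; the account and container names are placeholders.

    from azure.storage.blob import BlockBlobService
    from azure.storage.blob.models import PublicAccess

    blob_service = BlockBlobService(account_name='myaccount', account_key='mykey')

    # Read the stored access policies first and echo them back, so that
    # changing only the public access level does not wipe the ACL.
    identifiers = blob_service.get_container_acl('mycontainer')
    blob_service.set_container_acl('mycontainer',
                                   signed_identifiers=identifiers,
                                   public_access=PublicAccess.Blob)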

    opened by tjprescott 12
  • Irrespective of an assigned role, a service principal has the same privileges as an owner

    Which service(blob, file, queue) does this issue concern?

    azure-storage-blob v1.2.0rc1

    What problem was encountered?

    I have a web server, which I want to give it read access to my Azure blob storage. Additionally, I want the web server to have independent permissions from mine (i.e., my user account). I think using the new features added to Azure (e.g., Added support for OAuth authentication for HTTPS requests) I can leverage OAuth2.0 access tokens and Azure AD roles for this purpose. Therefore, I followed this documentation, and created a service principal and assigned Azure Storage permission to it. Then in the Azure portal, under Storage -> Access Control (IAM), I defined Storage Blob Data Reader (Preview) role and assigned it to my service principal.

    So, I implemented a logic similar to this to acquire access token, then I use the token to read the blob. However, I get an error saying that I am not allowed to access the blob. Then I assign myself the Storage Blob Data Reader (Preview) role, and then I'm able to read the blob. So, the service principal's role is not effective, i.e., I cannot assign myself a Contributor (read/write) and give the service principal the Reader role.

    Am I missing some important setting here?

    Have you found a mitigation/solution?

    No.

    Update 1:

    Following is a summary of the different combinations I tried and their resulted outcome:

    | [email protected] Role | Service Principal Role** | Read/Write Result | Expected Outcome |
    | :--- | :--- | :--- | :--- |
    | Contributor* | Contributor* | Successful (i.e., can read and write) | Successful |
    | Contributor* | None | Successful (i.e., can read and write) | Fail |
    | None | Contributor* | AuthorizationFailure: This request is not authorized to perform this operation. RequestId:509428ef-901e-00f3-4b93-051b26000000 | Successful |

    *Contributor = STORAGE BLOB DATA CONTRIBUTOR (PREVIEW)
    **Service Principal ID: 58ad99f1-19e9-4b08-8121-c372e1f14653

    Update 2:

    In the request for an OIDC/OAuth2.0 access token, I set the client_id attribute to the service principal ID.

    investigating 
    opened by VJalili 11
  • This repo is missing important files

    There are important files that Microsoft projects should all have that are not present in this repository. A pull request has been opened to add the missing file(s). When the pr is merged this issue will be closed automatically.

    Microsoft teams can learn more about this effort and share feedback within the open source guidance available internally.

    Merge this pull request

    opened by microsoft-github-policy-service[bot] 0
  • Adding Microsoft SECURITY.MD

    Please accept this contribution adding the standard Microsoft SECURITY.MD :lock: file to help the community understand the security policy and how to safely report security issues. GitHub uses the presence of this file to light-up security reminders and a link to the file. This pull request commits the latest official SECURITY.MD file from https://github.com/microsoft/repo-templates/blob/main/shared/SECURITY.md.

    Microsoft teams can learn more about this effort and share feedback within the open source guidance available internally.

    opened by microsoft-github-policy-service[bot] 0
  • Include millisecond component (%f) in datetime serialization

    https://github.com/Azure/azure-storage-python/blob/4306898850dd21617644fc537a57d025e833db74/azure-storage-common/azure/storage/common/_serialization.py#L46

    Milliseconds are being stripped from datetime values when serializing:

    return value.strftime('%Y-%m-%dT%H:%M:%SZ') --> return value.strftime('%Y-%m-%dT%H:%M:%S.%fZ')

    opened by gr-rob 0
  • Streaming blob to client side ?

    Hi, not sure if it's even possible, but can I stream, for example, a video file via Flask?

    If yes, is there some example? I can't find anything on the internet. (I can't create a SAS redirect in the browser because three.js has a problem with it via CORS, ...)

    Something like this?

    
    @app.get("/myvideo")
    def video():
          return myBlob.stream()
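
    One possible sketch, assuming the legacy BlockBlobService API together with Flask's send_file; the account, container, and blob names are placeholders. Note this buffers the whole blob in memory; large videos would need range reads via the start_range/end_range parameters.

    from io import BytesIO

    from flask import Flask, send_file
    from azure.storage.blob import BlockBlobService

    app = Flask(__name__)
    blob_service = BlockBlobService(account_name='myaccount', account_key='mykey')

    @app.route('/myvideo')
    def video():
        stream = BytesIO()
        # Download the blob into an in-memory stream, then serve it inline.
        blob_service.get_blob_to_stream('mycontainer', 'myvideo.mp4', stream)
        stream.seek(0)
        return send_file(stream, mimetype='video/mp4')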
    
    
    
    opened by erikpa1 0
  • new version cryptography dependency crashes app

    Hi, two days ago cryptography==38.x.x was released,

    which makes an Azure Python web app crash (because cryptography fails to load). It would be nice to freeze the cryptography dependency as it is now (cryptography>=2.x.x, for storage and tables), because this will cause a lot of problems :)

    Downgrading cryptography to cryptography==37.0.4 can solve this problem.

    The new version of the library is probably built for the new Ubuntu 22, and Azure runs on version 20.

    2022_09_08_ln0sdlwk00013K_default_docker.log

    Just adding the log keys here for people who are also searching for this problem:

    ImportError: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.28' not found (required by /home/site/wwwroot/antenv/lib/python3.7/site-packages/cryptography/hazmat/bindings/_rust.abi3.so)

    opened by erikpa1 0
  • Cannot import name 'RecoveryServicesBackupClient' from 'azure.mgmt.recoveryservicesbackup'

    Which service(blob, file, queue) does this issue concern?

    Note: for package version >= 12.0.0 please post the issue here instead: https://github.com/Azure/azure-sdk-for-python/issues
          for table service, please post the issue here instead: https://github.com/Azure/azure-cosmosdb-python.
    

    azure.mgmt.recoveryservicesbackup

    Which version of the SDK was used? Please provide the output of pip freeze.

    (venv) zachgleason@aerospike-1-0:~$ pip freeze
    azure-common==1.1.28
    azure-core==1.24.1
    azure-mgmt-core==1.3.1
    azure-mgmt-recoveryservicesbackup==4.0.0
    certifi==2022.6.15
    charset-normalizer==2.1.0
    idna==3.3
    isodate==0.6.1
    msrest==0.7.1
    oauthlib==3.2.0
    requests==2.28.1
    requests-oauthlib==1.3.1
    six==1.16.0
    typing_extensions==4.2.0
    urllib3==1.26.9
    

    What problem was encountered?

    (venv) zachgleason@aerospike-1-0:~$ python
    Python 3.7.3 (default, Jan 22 2021, 20:04:44) 
    [GCC 8.3.0] on linux
    Type "help", "copyright", "credits" or "license" for more information.
    >>> from azure.mgmt.recoveryservicesbackup import RecoveryServicesBackupClient
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    ImportError: cannot import name 'RecoveryServicesBackupClient' from 'azure.mgmt.recoveryservicesbackup' (/home/zachgleason/venv/lib/python3.7/site-packages/azure/mgmt/recoveryservicesbackup/__init__.py)
    

    Have you found a mitigation/solution?

    No

    opened by Zman94 3
Releases(v2.1.0-queue)
  • v2.1.0-queue(Aug 2, 2019)

  • v2.1.0-file(Aug 2, 2019)

    • Support for 2019-02-02 REST version. Please see our REST API documentation and blog for information about the related added features.
    • Added the update_range_from_file_url API, which writes the bytes from one Azure File endpoint into the specified range of another Azure File endpoint.
    • Added set_directory_properties, create_permission_for_share and get_permission_for_share APIs
    • Added optional parameters (file_permission, smb_properties) for create_file*, create_directory* related APIs and the set_file_properties API
    • Updated get_file_properties, get_directory_properties so that the response has SMB related properties
    Source code(tar.gz)
    Source code(zip)
  • v2.1.0-common(Aug 2, 2019)

    • Support for 2019-02-02 REST version. Please see our REST API documentation and blog for information about the related added features.
    • Validate that the echoed client request ID from the service matches the sent one.
    Source code(tar.gz)
    Source code(zip)
  • v2.1.0-blob(Aug 2, 2019)

    • Support for 2019-02-02 REST version. Please see our REST API documentation and blog for information about the related added features.
    • Added Batch Delete Blob API.
    • Added Batch Set Standard Blob Tier API (for BlockBlob).
    • Added support to set rehydrate blob priority for Set Standard Blob Tier API
    • Added Blob Tier support for PutBlob/PutBlockList/CopyBlob APIs.
    • Added support for client provided encryption key to numerous APIs.
    Source code(tar.gz)
    Source code(zip)
  • v2.0.1-queue(May 9, 2019)

  • v2.0.1-file(May 9, 2019)

  • v2.0.1-blob(May 9, 2019)

  • v2.0.0-queue(May 9, 2019)

  • v2.0.0-file(May 9, 2019)

    • Support for 2018-11-09 REST version. Please see our REST API documentation and blogs for information about the related added features.
    • Added an option to get share stats in bytes.
    • Added support for listing and closing file handles.
    Source code(tar.gz)
    Source code(zip)
  • v2.0.0-common(May 9, 2019)

  • v2.0.0-blob(May 9, 2019)

    • Support for 2018-11-09 REST version. Please see our REST API documentation and blog for information about the related added features.
    • Added support for append block from URL (synchronously) for append blobs.
    • Added support for update page from URL (synchronously) for page blobs.
    • Added support for generating and using blob snapshot SAS tokens.
    • Added support for generating user delegation SAS tokens.
    • Added missing permissions a (add) + c (create) for Container SAS
    • Throws exception when all parameters are None for set_blob_service_properties
    Source code(tar.gz)
    Source code(zip)
  • v1.4.2-common(May 9, 2019)

  • v1.4.1-common(May 9, 2019)

  • v1.5.0-blob(Feb 16, 2019)

  • v1.4.0-queue(Nov 10, 2018)

  • v1.4.0-file(Nov 10, 2018)

  • v1.4.0-common(Nov 10, 2018)

    • When unable to sign request, avoid wasting time on retries by failing faster.
    • Allow the use of custom domain when creating service object targeting emulators.
    • azure-storage-nspkg is not installed anymore on Python 3 (PEP420-based namespace package).
    • Scrub off sensitive information on requests when logging them.
    Source code(tar.gz)
    Source code(zip)
  • v1.4.0-blob(Nov 10, 2018)

    • azure-storage-nspkg is not installed anymore on Python 3 (PEP420-based namespace package)
    • copy_blob method added to BlockBlobService to enable support for deep sync copy.
    Source code(tar.gz)
    Source code(zip)
  • v1.3.1-file(Jul 16, 2018)

  • v1.3.1-blob(Jul 16, 2018)

    • Fixed design flaw where get_blob_to_* methods buffer entire blob when max_connections is set to 1.
    • Added support for access conditions on append_blob_from_* methods.
    Source code(tar.gz)
    Source code(zip)
  • v1.3.0-queue(Jun 27, 2018)

  • v1.3.0-file(Jun 27, 2018)

  • v1.3.0-common(Jun 27, 2018)

  • v1.3.0-blob(Jun 27, 2018)

    • Support for 2018-03-28 REST version. Please see our REST API documentation and blog for information about the related added features.
    • Added support for setting static website service properties.
    • Added support for getting account information, such as SKU name and account kind.
    • Added support for put block from URL (synchronously).
    Source code(tar.gz)
    Source code(zip)
  • v1.2.0rc1-queue(May 23, 2018)

    • Support for 2017-11-09 REST version. Please see our REST API documentation and blog for information about the related added features.
    • Added support for OAuth authentication for HTTPS requests (please note that this feature is available in preview).
    Source code(tar.gz)
    Source code(zip)
  • v1.2.0rc1-file(May 23, 2018)

  • v1.2.0rc1-common(May 23, 2018)

    • Increased default socket timeout to a more reasonable number for Python 3.5+.
    • Fixed bug where seekable streams (request body) were not being reset for retries.
    Source code(tar.gz)
    Source code(zip)
  • v1.2.0rc1-blob(May 23, 2018)

    • Support for 2017-11-09 REST version. Please see our REST API documentation and blog for information about the related added features.
    • Support for write-once read-many containers.
    • Added support for OAuth authentication for HTTPS requests (please note that this feature is available in preview).
    Source code(tar.gz)
    Source code(zip)
  • v1.1.0-queue(Feb 6, 2018)

    • Support for 2017-07-29 REST version. Please see our REST API documentation and blogs for information about the related added features.
    • Queue messages can now have an arbitrarily large or infinite time to live.
    • [Breaking] Error message now contains the ErrorCode from the x-ms-error-code header value.
    Source code(tar.gz)
    Source code(zip)
  • v1.1.0-file(Feb 6, 2018)

    • Support for 2017-07-29 REST version. Please see our REST API documentation and blogs for information about the related added features.
    • [Breaking] Error message now contains the ErrorCode from the x-ms-error-code header value.
    Source code(tar.gz)
    Source code(zip)
Owner
Microsoft Azure
APIs, SDKs and open source projects from Microsoft Azure