File support for asyncio

Overview

aiofiles: file support for asyncio


aiofiles is an Apache2 licensed library, written in Python, for handling local disk files in asyncio applications.

Ordinary local file IO is blocking, and cannot easily and portably be made asynchronous. This means doing file IO may interfere with asyncio applications, which shouldn't block the executing thread. aiofiles helps with this by introducing asynchronous versions of files that support delegating operations to a separate thread pool.

async with aiofiles.open('filename', mode='r') as f:
    contents = await f.read()
print(contents)
# 'My file contents'

Asynchronous iteration is also supported.

async with aiofiles.open('filename') as f:
    async for line in f:
        ...

Features

  • a file API very similar to Python's standard, blocking API
  • support for buffered and unbuffered binary files, and buffered text files
  • support for async/await (PEP 492) constructs

Installation

To install aiofiles, simply:

$ pip install aiofiles

Usage

Files are opened using the aiofiles.open() coroutine, which, in addition to mirroring the built-in open(), accepts optional loop and executor arguments. If loop is absent, the default loop will be used, as per the set asyncio policy. If executor is not specified, the default event loop executor will be used.

In case of success, an asynchronous file object is returned with an API identical to an ordinary file, except the following methods are coroutines and delegate to an executor:

  • close
  • flush
  • isatty
  • read
  • readall
  • read1
  • readinto
  • readline
  • readlines
  • seek
  • seekable
  • tell
  • truncate
  • writable
  • write
  • writelines

In case of failure, one of the usual exceptions will be raised.

The aiofiles.os module contains executor-enabled coroutine versions of several useful os functions that deal with files:

  • stat
  • sendfile
  • rename
  • remove
  • mkdir
  • rmdir

Writing tests for aiofiles

Real file IO can be mocked by patching aiofiles.threadpool.sync_open as desired. The return type also needs to be registered with the aiofiles.threadpool.wrap dispatcher:

from unittest import mock

import aiofiles
from aiofiles import threadpool

aiofiles.threadpool.wrap.register(mock.MagicMock)(
    lambda *args, **kwargs: threadpool.AsyncBufferedIOBase(*args, **kwargs))

async def test_stuff():
    data = 'data'
    mock_file = mock.MagicMock()

    with mock.patch('aiofiles.threadpool.sync_open', return_value=mock_file) as mock_open:
        async with aiofiles.open('filename', 'w') as f:
            await f.write(data)

        mock_file.write.assert_called_once_with(data)

History

0.6.0 (2020-10-27)

  • aiofiles is now tested on ppc64le.
  • Added name and mode properties to async file objects. #82
  • Fixed a DeprecationWarning internally. #75
  • Python 3.9 support and tests.

0.5.0 (2020-04-12)

  • Python 3.8 support. Code base modernization (using async/await instead of asyncio.coroutine/yield from).
  • Added aiofiles.os.remove, aiofiles.os.rename, aiofiles.os.mkdir, aiofiles.os.rmdir. #62

0.4.0 (2018-08-11)

  • Python 3.7 support.
  • Removed Python 3.3/3.4 support. If you use these versions, stick to aiofiles 0.3.x.

0.3.2 (2017-09-23)

  • The LICENSE is now included in the sdist. #31

0.3.1 (2017-03-10)

  • Introduced a changelog.
  • aiofiles.os.sendfile will now work if the standard os module contains a sendfile function.

Contributing

Contributions are very welcome. Tests can be run with tox; please ensure coverage at least stays the same before you submit a pull request.

Comments
  • Another stab at tempfile

    Async tempfile implementation following the existing structure, with a bit extra for SpooledTemporaryFile where delegation is not required unless the in-memory stream is rolled to disk.

    opened by alemigo 15
  • Project restructurisation 2

    Based on [cancelled] pull request 137. Changes:

    1. Pythons: Python 3.6 removed, PyPy-3.8 over PyPy-3.7;
    2. Documentation: Sphinx has not been added yet -> the docs need more time;
    3. GitHub Actions workflow: a separate coverage job added, action versions updated;
    4. Coverage: 100% reached; coverage under 90% is considered a failure;
    5. Functionality:
    • the asyncify (former wrap) function is more similar to asyncio.to_thread (Py3.9);
    • a new isasync function added to check if the function parameter is already an "asynchronous" object.

    This PR may address/solve the following issues:

    • https://github.com/Tinche/aiofiles/issues/136 - asyncio_mode=strict in pytest.ini;
    • https://github.com/Tinche/aiofiles/issues/132 - pyproject.toml enriched, republishing to PyPI should reveal the situation;
    • https://github.com/Tinche/aiofiles/issues/129 - now get_running_loop() is used;

    Backward compatibility breaks for anyone who used the wrap function; therefore, the next version may be 1.0.0, not 0.9.0.

    opened by stankudrow 9
  • Add anyio support

    This would fix #96

    Adds support for other async frameworks using anyio. The only thing to wait on is the release of anyio version 3, but I'd like to know if there's interest in this!

    Cheers.

    opened by uSpike 9
  • not found function

    aiofiles.os.sendfile not found. Error: AttributeError: module 'aiofiles.os' has no attribute 'sendfile'

    pip freeze result:

    aiofiles==22.1.0
    asyncio==3.4.3
    types-aiofiles==22.1.0
    

    Or is there another method to copy a file to a directory, like a link? I'm a Python newbie.

    opened by Chasikanaft 8
  • Why jumping from v0.8 to v22.0

    My dependency bot just detected the newly released version 22.0.1. Is there any reason for the jump from 0.8 to 22.0, and why is the version marked as TBC? What does TBC mean?

    opened by asgarciap 8
  • DeprecationWarning at aiofiles.os:8

    Hi there! Pytest is correctly complaining about the use of @coroutine decorator at aiofiles.os line 8. This is for the current latest 0.5.0 version.

    https://github.com/Tinche/aiofiles/blob/258e95640d6242c53a1c0d84e68343d79d80d781/aiofiles/os.py#L7-L16

    The fix would be to use async def directly:

    import asyncio
    from functools import partial, wraps

    def wrap(func):
        @wraps(func)
        async def run(*args, loop=None, executor=None, **kwargs):
            if loop is None:
                loop = asyncio.get_event_loop()
            pfunc = partial(func, *args, **kwargs)
            return await loop.run_in_executor(executor, pfunc)

        return run
    
    opened by HacKanCuBa 8
  • Move SpooledTemporaryFile.newlines from delegation to property

    Hi!

    I found a minor issue in SpooledTemporaryFile and I fixed it.

    Fix https://github.com/Tinche/aiofiles/issues/118

    ref) https://github.com/python/cpython/blob/3e43fac2503afe219336742b150b3ef6e470686f/Lib/tempfile.py#L747

    opened by cake-monotone 7
  • Python 3.8 deprecation warnings

    e.g.

    /home/daves/github/aiofiles/.pybuild/cpython3_3.8/build/aiofiles/threadpool/utils.py:33: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
      def method(self, *args, **kwargs):

    .pybuild/cpython3_3.8/build/tests/test_simple.py::test_serve_small_bin_file_sync
      /home/daves/github/aiofiles/.pybuild/cpython3_3.8/build/tests/test_simple.py:28: DeprecationWarning: The loop argument is deprecated since Python 3.8, and scheduled for removal in Python 3.10.
        server = yield from asyncio.start_server(serve_file, port=unused_tcp_port,

    opened by davesteele 7
  • pickle support

    Similar to issue #20, how can the standard pickle interface be used along with aiofiles? How could it be done? By mimicking the pickle source code in aiofiles? Could such support be a feature of aiofiles?

    opened by ayharano 7
  • Asynchronous context managers and iterators

    README.md says :

    The closing of a file may block, and yielding from a coroutine while exiting from a context manager isn't possible, so aiofiles file objects can't be used as context managers. Use the try/finally construct from the introductory section to ensure files are closed.

    Iteration is also unsupported. To iterate over a file, call readline repeatedly until an empty result is returned. Keep in mind readline doesn't strip newline characters.

    Python 3.5 provides asynchronous context managers and iterators, using __aenter__, __aexit__, __aiter__ and __anext__. I think adding support for these could provide a nice API for aiofiles, and it wouldn't be harder than something along the lines of:

    class IterableContextManagerFileWrapper:  # yes it needs a new name
        def __init__(self, *args, **kwargs):
            # store arguments so that the file can be created in __aenter__
            self.__args, self.__kwargs = args, kwargs
            self.__file = None  # set in __aenter__

        def __getattr__(self, name):
            assert self.__file is not None, "file context manager not entered"
            return getattr(self.__file, name)  # wrap the async file object

        async def __aenter__(self):
            self.__file = await open(*self.__args, **self.__kwargs)
            return self  # return self, not the file, so the value is iterable

        async def __aexit__(self, exc_type, exc_value, exc_tb):
            await self.close()
            return False  # no reason to intercept exceptions

        def __aiter__(self):
            return self  # an iterator just returns itself to be an iterable

        async def __anext__(self):
            line = await self.readline()
            if not line:  # EOF
                raise StopAsyncIteration  # not StopIteration!
            return line
    

    The resulting wrapper could then be used as:

    async with IterableContextManagerFileWrapper("/tmp/lol") as aiofile:
        async for line in aiofile:
            line = line.rstrip()
            ...
    

    My proposed POC has a weird name because I don't really know how to integrate it into the class hierarchy, so I made a wrapper that provides the proposed features on top of the objects returned by the aiofiles coroutines.

    This is also why I submit this as an issue and not a PR: I would like your opinion on the best way to implement it.

    opened by pstch 7
  • Changed the sendfile test to be more accepting

    Make the sendfile test more lenient for POSIX systems that don't have sendfile in the os module. This should still achieve the same as the original line on platforms such as Windows, because Windows will not have sendfile as an attribute of os.

    opened by jkbbwr 6
  • List/delete content of a directory

    What would be the recommended way to list all the files in a directory and then delete them? Using the Python os module you can do it this way:

        def clear_directory(self, dir: str):
            for f in os.listdir(dir):
                os.remove(os.path.join(dir, f))

    But I can't seem to find a way to do this with aiofiles.os
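One possible workaround, not part of aiofiles, is to delegate os.listdir to an executor manually, mirroring how aiofiles delegates its own wrapped os functions (clear_directory here is a hypothetical helper):

```python
import asyncio
import os
from functools import partial

async def clear_directory(path, loop=None, executor=None):
    # os.listdir isn't wrapped by aiofiles here, so hand it to the
    # executor ourselves, then remove each entry the same way.
    if loop is None:
        loop = asyncio.get_event_loop()
    names = await loop.run_in_executor(executor, os.listdir, path)
    for name in names:
        remove = partial(os.remove, os.path.join(path, name))
        await loop.run_in_executor(executor, remove)
```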

    opened by santigandolfo 2
  • Use of open without a context manager?

    While I love context managers, the stack isn't always the place to perform cleanup. open can be used without async with. How can I use aiofiles.open without it?

    import asyncio
    
    import aiofiles
    
    async def test():
        f = aiofiles.open('filename.txt', mode='w')
        try:
            #await f.write('123') #  'AiofilesContextManager' object has no attribute 'write'
            await f.send('123') # TypeError: can't send non-None value to a just-started generator
        finally:
            f.close()
    asyncio.run(test())
    
    opened by stuz5000 2
  • Pytest-asyncio deprecation warning on asyncio_mode

    Python 3.10 is issuing a deprecation warning on pytest asyncio_mode:

    ../../../../usr/lib/python3/dist-packages/pytest_asyncio/plugin.py:191
      /usr/lib/python3/dist-packages/pytest_asyncio/plugin.py:191: DeprecationWarning: The 'asyncio_mode' default value will change to 
    'strict' in future, please explicitly use 'asyncio_mode=strict' or 'asyncio_mode=auto' in pytest configuration file.
        config.issue_config_time_warning(LEGACY_MODE, stacklevel=2)
    
    -- Docs: https://docs.pytest.org/en/stable/warnings.html
    

    The stderr output can lead to autopkgtest failures.

    Addressed by this patch.

    opened by davesteele 0
  • Deprecation on get_event_loop()

    I have a report on a test failure due to the deprecation of a get_event_loop() call (https://github.com/davesteele/aiofiles/pull/2). This patch changes the other occurrences of that call.

    opened by davesteele 3
  • Test failure on slow processors?

    Unit tests have failed on ppc64el hardware:

    https://ci.debian.net/data/autopkgtest/unstable/ppc64el/a/aiofiles/18419050/log.gz

    autopkgtest [08:17:49]: test unittests.py: [-----------------------
    ============================= test session starts ==============================
    platform linux -- Python 3.9.10, pytest-6.2.5, py-1.10.0, pluggy-0.13.0
    rootdir: /tmp/autopkgtest-lxc.9mzzz2i1/downtmp/build.Jix/src
    plugins: asyncio-0.16.0
    collected 155 items
    
    tests/test_os.py ......                                                  [  3%]
    tests/test_simple.py ..                                                  [  5%]
    tests/threadpool/test_binary.py ........................................ [ 30%]
    ...........................................................              [ 69%]
    tests/threadpool/test_concurrency.py .                                   [ 69%]
    tests/threadpool/test_open.py ...                                        [ 71%]
    tests/threadpool/test_text.py .......................................... [ 98%]
    ..                                                                       [100%]
    
    ============================= 155 passed in 1.76s ==============================
    Task was destroyed but it is pending!
    task: <Task pending name='Task-123' coro=<test_slow_file.<locals>.serve_file() running at /tmp/autopkgtest-lxc.9mzzz2i1/downtmp    /build.Jix/src/tests/threadpool/test_concurrency.py:37> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at     /usr/lib/python3.9/asyncio/futures.py:384, <TaskWakeupMethWrapper object at 0x7fff84d62070>()]>>
    autopkgtest [08:17:51]: test unittests.py: -----------------------]
    autopkgtest [08:17:52]: test unittests.py:  - - - - - - - - - - results - - - - - - - - - -
    unittests.py         FAIL stderr: Task was destroyed but it is pending!
    autopkgtest [08:17:52]: test unittests.py:  - - - - - - - - - - stderr - - - - - - - - - -
    Task was destroyed but it is pending!
    task: <Task pending name='Task-123' coro=<test_slow_file.<locals>.serve_file() running at /tmp/autopkgtest-lxc.9mzzz2i1/downtmp    /build.Jix/src/tests/threadpool/test_concurrency.py:37> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib/python3.9/asyncio/futures.py:384, <TaskWakeupMethWrapper object at 0x7fff84d62070>()]>>
    

    Does test_slow_file() need to verify serve_file() is drained before cleaning up?

    opened by davesteele 3