CouchDB client built on top of aiohttp (asyncio)



license: BSD

CouchDB client built on top of aiohttp and made for asyncio.

Current status: beta. aiocouchdb implements the complete CouchDB API up to the 1.6.1 release. However, it may still lack some usability and stability bits, but work is in progress. Feel free to send a pull request or open an issue if you find something that should be fixed.


  • Modern CouchDB client for Python 3.3+ based on aiohttp
  • Complete CouchDB API support (JSON and Multipart) up to version 1.6.1
  • Multiuser workflow with Basic Auth, Cookie, Proxy and OAuth support
  • Stateless behavior
  • Stream-like handling of views, changes feeds and bulk docs uploads

Roadmap (not exactly in that order):

  • Cloudant support
  • CouchDB 2.0 support
  • ElasticSearch CouchDB river support
  • GeoCouch support
  • Microframework for OS daemons and external handlers
  • Native integration with Python Query Server
  • Replicator-as-a-Library / Replicator-as-a-Service
  • Stateful API


  • Add reques_options argument to each http request



    I needed to pass some arguments to the aiohttp request (e.g. HTTP basic auth), which is supported at the Resource level but not at the Attachment/Document/... level. This patch adds that on each public method (except the ones I missed, I guess ;))

    I think it would be best to be able to set this on the Server object and have it propagate to the objects it returns, but I don't really have the time to implement that, so here's an intermediate solution.

    What do you think?
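The Server-level propagation sketched above can be modeled in a few lines. A toy sketch, not aiocouchdb's actual API (class and parameter names here are hypothetical):

```python
class Resource:
    """Toy model of propagating per-request options down from a parent
    object to each call, with per-call overrides."""

    def __init__(self, default_options=None):
        self.default_options = dict(default_options or {})

    def request(self, method, path, request_options=None):
        # Per-call options override the defaults set at construction time.
        opts = dict(self.default_options)
        opts.update(request_options or {})
        return (method, path, opts)  # stand-in for the real HTTP call

res = Resource({'auth': ('user', 'pass')})
method, path, opts = res.request('GET', '/db/doc', request_options={'timeout': 30})
assert opts == {'auth': ('user', 'pass'), 'timeout': 30}
```

Objects created by `Resource` would pass `self.default_options` along to their children, giving the Server-level behavior the issue asks for.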

    opened by iCart 7
  • Feeds broken against couchbase sync gateway


    First off, thank you for this project. I'm building an app that should scale quite large against CouchDB, and I'm looking forward to using the power of async :)

    I had prototype code working perfectly against CouchDB (1.4), and recently we are trying to transition to Couchbase Sync Gateway for scalability reasons (we expect thousands of concurrent feeds). After some setup pains, my script to write test data worked, and the web interface returns the list of documents, but all queries with aiocouchdb returned nothing.

    I spent quite some time debugging this, and it seems to be a bug in the library, which may well be related to the problem in issue #8. curl http://localhost:5984/default/_all_docs?include_docs=true returns the proper information. In this case:


    My query code looks something like this:

        db = await ensure_db(server, db_name)
        docs = await db.all_docs(include_docs=True)
        with docs:
            rec = await docs.next()
            while rec:
                rec = await docs.next()

    As I said, this worked against CouchDB, but the first call to next() now returns None...

    Digging into the aiocouchdb code, I put a breakpoint in Feed._loop and found that chunk was the entire body of the response. Apparently Sync Gateway buffers the text, making one network IO call, while CouchDB sends each chunk in its own network IO call. Quoting from the code:

            elif chunk.startswith(('{"rows"', ']}')):
                return (yield from

    Thus, the chunk is skipped and the parser just moves on to the next one... which doesn't exist, so the result is None, an empty iterator :(

    For me, the solution is to no longer depend on the network IO behavior of the server, but to use a more robust parser that returns one line at a time. (I think ijson is a great solution for parsing streaming JSON, and with its optional C backends, very fast.)

    My work-around looks something like this:

    In one module, add to the top: from .client import HttpStreamResponse, and on line 50 change: resp = yield from request(auth=auth, data=data, params=params) to: resp = yield from request(auth=auth, data=data, params=params, response_class=HttpStreamResponse)

    In Feed._loop, change the chunk = yield from ... read to: chunk = yield from self._resp.content.readline()

    This mostly works and parses the rows correctly now, but it raises an exception on parsing the remainder. We get an empty line with ], but the code only checks for ]}. Also, we no longer have {"total_rows" with no trailing }, but rather "total_rows" with no leading {.
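The line-based approach described above can be demonstrated without either server. A self-contained sketch using the standard json module in place of ijson (the helper name and sample body are illustrative, not aiocouchdb code):

```python
import json

def parse_all_docs(lines):
    """Collect row objects from an _all_docs body, one line at a time.

    CouchDB 1.x happens to deliver one row per network read; Sync Gateway
    buffers the whole body into a single read. Splitting on lines makes
    the parser independent of that server behavior.
    """
    rows = []
    for line in lines:
        line = line.strip().rstrip(',')
        if not line or line.startswith('{"total_rows"') or line in (']}', ']'):
            continue  # framing, not a row
        rows.append(json.loads(line))
    return rows

body = ('{"total_rows": 2, "offset": 0, "rows": [\n'
        '{"id": "a", "key": "a", "value": {"rev": "1-x"}},\n'
        '{"id": "b", "key": "b", "value": {"rev": "1-y"}}\n'
        ']}')
assert [row['id'] for row in parse_all_docs(body.splitlines())] == ['a', 'b']
```

Whether the body arrives as one buffered chunk or many small ones, the result is the same once the parser keys on lines.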

    I could work on a patch here, but I would like some direction from the project maintainer.

    • There don't seem to be tests for the feeds in the unit tests (understandably, as they usually require a server). Shall I try to mock something there, or is there a way to run the tests against various servers?
    • Which variants need to be supported (CouchDB 1.4? CouchDB 2.0? Sync Gateway?)
    • Is it acceptable to add requirements that provide more robust streaming JSON parsing?

    Thanks again for the great package, and please let me know how I can contribute.

    opened by ethanfrey 6
  • Unable to use aiocouchdb: problem between keyboard and chair



    This is not an issue; I'm abusing your issue tracker to ask a newbie question in the absence of a mailing list or IRC channel. Taken positively, maybe it's a hint that the doc could be more helpful to total asyncio newcomers; I'll be glad to propose such a doc patch if we end up feeling the need for it.

    My humble objective here is to succeed at fetching and printing the 'Welcome!' CouchDB message using your library.

    So after reading the doc, I run my localhost:5984 couch instance, create a python3.4 virtualenv, pip install aiocouchdb, touch a file, and here's what I try to run:

    #! /usr/bin/env python3
    import aiocouchdb
    server = aiocouchdb.Server()
    print('server:', server, '\n')

    That works (and I'll omit the imports and server creation from now on): it prints server: <aiocouchdb.v1.server.Server(http://localhost:5984) object at 0x7fcc064c0be0>. Now, let's try getting my server info. I see the doc mentioning that Python doesn't support yield from in the shell, but I don't know better, so I try it anyway:

    yield from server.info()

    Barf, SyntaxError: 'yield' outside function. Okay, I can fix that:

    def sinfo():
        yield from server.info()
    something = sinfo()

    Great, I have a <generator object sinfo at 0x7fe8e59c13f0>. Let's next() it:

    def sinfo():
        yield from server.info()
    gen = sinfo()
    box = next(gen)

    Yay, a <Future pending>. How do I attach a callback to this thing? Thank you, Python 3.4 doc on add_done_callback:

    def sinfo():
        yield from server.info()
    gen = sinfo()
    fut = next(gen)
    fut.add_done_callback(print)

    And hey, there's even a nice __repr__ here to help me see this in my final print: <Future pending cb=[print()]>.

    But sadly, nothing is printed.

    • Am I exiting before receiving anything? Well, no: sleep()ing for a while doesn't help, and more importantly...
    • ...my couchdb log shows no incoming request :-/
    • Am I failing to initialize something? I didn't see more required initialization in the doc. Also, I don't see any mention of needing to do event loop management, and since I see an asyncio.get_event_loop() call in the library code, I assume it's none of my business.

    What am I doing terribly wrong? Sorry again for hijacking your issue tracker, and thanks for your work :)
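The missing piece in the snippets above is something to drive the coroutine: a Future only makes progress when an event loop runs it. A minimal stand-alone sketch, with a stub coroutine in place of the real server.info() call (on Python 3.4 the equivalent of asyncio.run() is loop.run_until_complete()):

```python
import asyncio

async def sinfo():
    # Stand-in for `yield from server.info()`: simulate an I/O round-trip
    # and return a CouchDB-style welcome document.
    await asyncio.sleep(0)
    return {'couchdb': 'Welcome!', 'version': '1.6.1'}

# asyncio.run() creates an event loop, runs the coroutine to completion,
# and returns its result -- no manual next() or callbacks required.
info = asyncio.run(sinfo())
print(info['couchdb'])
```

With a real server, the 3.4-era spelling would be a generator coroutine containing `yield from server.info()`, driven by `asyncio.get_event_loop().run_until_complete(sinfo())`.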

    opened by ronjouch 4
  • JSONDecodeError on large views (chunk is split on the middle)


    I'm using this code:

        def go():
            db = yield from couch_server.db('db')
            ddoc = yield from db.ddoc('ddoc')
            viewdoc = yield from ddoc.view('viewdoc')
            while True:
                res = yield from viewdoc.next()
    It would raise a JSONDecodeError exception on some rows. By adding print(chunk) in the feed code, I found that on some rows the chunk ends in the middle of the JSON object.
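The mid-object split described here can be handled by buffering until a full line is available, independent of where the network splits the body. A self-contained sketch (helper name and sample body are illustrative, not aiocouchdb code):

```python
import json

def rows_from_chunks(chunks):
    """Yield row objects even when a network chunk ends in the middle of
    a JSON object: buffer input and only parse complete lines."""
    buf = ''
    for chunk in chunks:
        buf += chunk
        while '\n' in buf:
            line, buf = buf.split('\n', 1)
            line = line.strip().rstrip(',')
            if line.startswith('{"id"'):
                yield json.loads(line)

body = ('{"total_rows": 2, "offset": 0, "rows": [\n'
        '{"id": "a", "key": "a", "value": {"rev": "1-x"}},\n'
        '{"id": "b", "key": "b", "value": {"rev": "1-y"}}\n'
        ']}\n')
# Split the body at an arbitrary, awkward spot -- mid-object:
rows = list(rows_from_chunks([body[:60], body[60:]]))
assert [r['id'] for r in rows] == ['a', 'b']
```

Because the buffer is reassembled before parsing, any split point, including one inside a JSON object, yields the same rows.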

    opened by shanipribadi 4
  • Fix parsing for sync gateway


    The new code works well line per line, and probably for CouchDB 2.0. However, Couchbase Sync Gateway has another peculiarity: the "total_rows": X, "offset": Y fields come AFTER the rows, with a slightly different sequence of closing brackets :(

    I made a small adjustment to the parsing code so it can handle both variants, and tested it on Sync Gateway and CouchDB 1.4.

    Still working on a more general solution at ...
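The two framings in question can be handled by skipping framing lines wherever they occur instead of assuming a fixed order. A sketch of the idea (not the actual patch; names and sample bodies are illustrative):

```python
import json

FRAMING = ('{"rows"', '{"total_rows"', '"total_rows"', ']')

def rows_from_lines(lines):
    """Collect row objects, skipping framing lines wherever they appear,
    so header-first (CouchDB 1.x) and trailer-last (Sync Gateway-style)
    bodies parse identically."""
    rows = []
    for line in lines:
        line = line.strip().rstrip(',')
        if not line or line.startswith(FRAMING):
            continue
        rows.append(json.loads(line))
    return rows

couchdb_style = [
    '{"total_rows": 1, "offset": 0, "rows": [',
    '{"id": "a", "key": "a", "value": {"rev": "1-x"}}',
    ']}',
]
gateway_style = [
    '{"rows": [',
    '{"id": "a", "key": "a", "value": {"rev": "1-x"}}',
    '],',
    '"total_rows": 1, "offset": 0}',
]
assert rows_from_lines(couchdb_style) == rows_from_lines(gateway_style)
```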

    opened by ethanfrey 3
  • Is this project still alive?


    It seems that further development is frozen for now (the latest commit was on Sep 12, 2016).

    Is this project still maintained?

    There are a few open PRs, and I'd be glad to help with development in the near future. Can I count on code review from the maintainers, or should I fork it for my own use?

    Thanks in advance!

    opened by nlyubchich 2
  • update_body_from_data() function arguments change


    An aiohttp commit adds skip_auto_headers as another argument to update_body_from_data.

    aiocouchdb currently fails with: TypeError: update_body_from_data() takes 2 positional arguments but 3 were given

    The function calls in the affected modules need to be changed accordingly.
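One version-tolerant way to adapt such a call site is to inspect the function's arity before calling. A hedged sketch (the adapter and stand-in functions below are hypothetical, not aiocouchdb or aiohttp code):

```python
import inspect

def call_update_body(update_body_from_data, req, data):
    """Call an update_body_from_data-style function whether or not it
    grew a third (skip_auto_headers-style) parameter."""
    if len(inspect.signature(update_body_from_data).parameters) >= 3:
        return update_body_from_data(req, data, frozenset())
    return update_body_from_data(req, data)

# Toy stand-ins for the two aiohttp variants:
old_style = lambda req, data: ('old', req, data)
new_style = lambda req, data, skip_auto_headers: ('new', req, data, skip_auto_headers)

assert call_update_body(old_style, 'req', b'{}')[0] == 'old'
assert call_update_body(new_style, 'req', b'{}')[0] == 'new'
```

In practice, pinning a compatible aiohttp version is the simpler fix; the adapter only helps when supporting a range of versions.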

    opened by shanipribadi 2
  • force _content to be bytes, not bytearray


    Having _content as a bytearray causes chardet to throw a ValueError. Here is an example from requesting a non-existent design doc:

    Traceback (most recent call last):
      File "/home/vukasin/.local/lib64/python3.4/site-packages/aiohttp/", line 272, in start
        yield from self.handle_request(message, payload)
      File "/home/vukasin/.local/lib64/python3.4/site-packages/aiohttp/", line 85, in handle_request
        resp = yield from handler(request)
      File "/usr/lib64/python3.4/asyncio/", line 105, in coro
        res = yield from res
      File "/home/vukasin/peach-2/peach/queue/", line 43, in get
        q = yield from ddoc.view("queue", name)
      File "/home/vukasin/.local/lib64/python3.4/site-packages/aiocouchdb/v1/", line 332, in view
      File "/home/vukasin/.local/lib64/python3.4/site-packages/aiocouchdb/", line 52, in request
        yield from resp.maybe_raise_error()
      File "/home/vukasin/.local/lib64/python3.4/site-packages/aiocouchdb/", line 140, in maybe_raise_error
        data = yield from resp.json()
      File "/home/vukasin/.local/lib64/python3.4/site-packages/aiohttp/", line 703, in json
        encoding = self._get_encoding()
      File "/home/vukasin/.local/lib64/python3.4/site-packages/aiohttp/", line 671, in _get_encoding
        encoding = chardet.detect(self._content)['encoding']
      File "/home/vukasin/.local/lib64/python3.4/site-packages/chardet/", line 26, in detect
        raise ValueError('Expected a bytes object, not a unicode object')
    ValueError: Expected a bytes object, not a unicode object
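The fix the title describes boils down to normalizing the accumulated body to bytes before it reaches chardet.detect(). A minimal sketch (helper name hypothetical):

```python
def finalize_content(buf):
    """Normalize an accumulated response body to bytes; the chardet
    version in the traceback raises ValueError for anything that is
    not exactly bytes."""
    return bytes(buf) if isinstance(buf, bytearray) else buf

# Bodies are often accumulated in a bytearray for cheap appends:
body = bytearray()
body += b'{"error": "not_found", "reason": "missing"}'
content = finalize_content(body)
assert type(content) is bytes
```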
    opened by vukasin 2
  • Fix #13 - Doc: add basic asyncio howto


    Fixes #13, and follows up on the discussion there. I tried to keep it minimal, and to insist on reading the doc and getting help if needed.

    Feel free to modify as desired. Thanks for aiocouchdb.

    opened by ronjouch 0
  • aiocouchdb pip install is downgrading my aiohttp version


    When I try to install aiocouchdb, it uninstalls my aiohttp 2.2.5 and installs a lower version.

    Here is what happened:

        pip install aiocouchdb
        Collecting aiocouchdb
          Downloading aiocouchdb-0.9.1.tar.gz (63kB)
            100% |████████████████████████████████| 71kB 523kB/s
        Collecting aiohttp==0.17.4 (from aiocouchdb)
          Downloading aiohttp-0.17.4.tar.gz (475kB)
            100% |████████████████████████████████| 481kB 968kB/s
        Requirement already satisfied: chardet in /Users/veto/webs/merkuro/env/lib/python3.6/site-packages (from aiohttp==0.17.4->aiocouchdb)
        Installing collected packages: aiohttp, aiocouchdb
          Found existing installation: aiohttp 2.2.5
            Uninstalling aiohttp-2.2.5:
              Successfully uninstalled aiohttp-2.2.5
          Running setup.py install for aiohttp ... done
          Running setup.py install for aiocouchdb ... done
        Successfully installed aiocouchdb-0.9.1 aiohttp-0.17.4

        (env)$ pip freeze
        aiocouchdb==0.9.1
        aiodns==1.1.1
        aiohttp==0.17.4

    opened by ghost 0
  • Improve test suite


    Currently, the test suite is overly complicated. It has two modes to run in: "mock" and "couchdb". In "mock" mode it mocks (obviously!) the HTTP request method and only tests the logic against some magically expected response. In "couchdb" mode it makes real requests to a CouchDB instance, making sure that our idea of how things should work is close to reality.

    Initially, that was a good idea: mock tests run faster, require no service, and, hell, why not? Since then, however, it has turned into a bad idea, because mocks hide real issues, and keeping both modes running the same test cases makes life harder.

    The current plan is to move away from mock tests completely, to integration-only ones. Despite all the problems, this will make sure that aiocouchdb really works against a real service.
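One common shape for integration-only suites is to gate test cases on a live server URL and skip otherwise. A sketch (environment variable and class names are hypothetical):

```python
import os
import unittest

COUCHDB_URL = os.environ.get('COUCHDB_URL')  # e.g. http://localhost:5984

@unittest.skipUnless(COUCHDB_URL, 'integration test: needs a live CouchDB')
class ServerInfoTestCase(unittest.TestCase):
    def test_info(self):
        # A real test would create a client against COUCHDB_URL here and
        # assert on the welcome response; this stub only checks the URL.
        self.assertTrue(COUCHDB_URL.startswith('http'))

# Run with `python -m unittest <module>`; the case is reported as
# skipped unless COUCHDB_URL points at a running server.
```

This keeps a single set of test cases, runs them only against a real service, and makes the missing-server case explicit instead of hiding it behind mocks.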

    opened by kxepal 0