Async database support for Python. 🗄

Overview

Databases

Databases gives you simple asyncio support for a range of databases.

It allows you to make queries using the powerful SQLAlchemy Core expression language, and provides support for PostgreSQL, MySQL, and SQLite.

Databases is suitable for integrating against any async Web framework, such as Starlette, Sanic, Responder, Quart, aiohttp, Tornado, or FastAPI.

Documentation: https://www.encode.io/databases/

Community: https://discuss.encode.io/c/databases

Requirements: Python 3.7+


Installation

$ pip install databases

You can install the required database drivers with:

$ pip install databases[postgresql]
$ pip install databases[mysql]
$ pip install databases[sqlite]

Driver support is provided using one of asyncpg, aiomysql, or aiosqlite. Note that if you are using any synchronous SQLAlchemy functions such as engine.create_all() or alembic migrations then you still have to install a synchronous DB driver: psycopg2 for PostgreSQL and pymysql for MySQL.
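For example, a minimal sketch of that pattern (not taken from the README; the notes table and connection URL are illustrative assumptions) creates the schema through a synchronous SQLAlchemy engine while databases handles the async queries:

# Hedged sketch: schema creation via a synchronous SQLAlchemy engine,
# with `databases` used for the async queries at runtime.
import sqlalchemy

metadata = sqlalchemy.MetaData()

notes = sqlalchemy.Table(
    "notes",
    metadata,
    sqlalchemy.Column("id", sqlalchemy.Integer, primary_key=True),
    sqlalchemy.Column("text", sqlalchemy.String(length=100)),
)

# Synchronous engine (psycopg2 driver), used only to create the tables.
engine = sqlalchemy.create_engine("postgresql://localhost/example")
metadata.create_all(engine)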


Quickstart

For this example we'll create a very simple SQLite database to run some queries against.

$ pip install databases[sqlite]
$ pip install ipython

We can now run a simple example from the console.

Note that we want to use ipython here, because it supports using await expressions directly from the console.

# Create a database instance, and connect to it.
from databases import Database
database = Database('sqlite:///example.db')
await database.connect()

# Create a table.
query = """CREATE TABLE HighScores (id INTEGER PRIMARY KEY, name VARCHAR(100), score INTEGER)"""
await database.execute(query=query)

# Insert some data.
query = "INSERT INTO HighScores(name, score) VALUES (:name, :score)"
values = [
    {"name": "Daisy", "score": 92},
    {"name": "Neil", "score": 87},
    {"name": "Carol", "score": 43},
]
await database.execute_many(query=query, values=values)

# Run a database query.
query = "SELECT * FROM HighScores"
rows = await database.fetch_all(query=query)
print('High Scores:', rows)

Check out the documentation on making database queries for examples of how to start using databases together with SQLAlchemy core expressions.
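As a hedged sketch of what that looks like, continuing the ipython session above (the table definition below is an assumption that mirrors the raw CREATE TABLE statement from the quickstart):

# Define the table with SQLAlchemy Core, assumed to match the CREATE TABLE above.
import sqlalchemy

metadata = sqlalchemy.MetaData()
high_scores = sqlalchemy.Table(
    "HighScores",
    metadata,
    sqlalchemy.Column("id", sqlalchemy.Integer, primary_key=True),
    sqlalchemy.Column("name", sqlalchemy.String(length=100)),
    sqlalchemy.Column("score", sqlalchemy.Integer),
)

# Insert and select using Core expressions instead of raw SQL.
await database.execute(query=high_scores.insert(), values={"name": "Mia", "score": 78})
rows = await database.fetch_all(query=high_scores.select().where(high_scores.c.score > 80))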

⭐️

Databases is BSD licensed code. Designed & built in Brighton, England.

Comments
  • in-memory sqlite odd behavior

    Hi,

    Thanks for the awesome package (happy to use it with starlette). For rapid development purposes, I'm relying on the SQLite backend, and I'm seeing some odd behavior.

    When I create tables with an in-memory db, those tables can never be found in sqlite_master afterwards. But when I rely on a file database, then that works fine.
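    A minimal repro sketch of the behavior described (the URL and table name here are illustrative assumptions):

    # Hedged repro sketch of the reported behavior.
    from databases import Database

    database = Database("sqlite:///:memory:")
    await database.connect()
    await database.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, text TEXT)")

    # Reportedly empty for the in-memory database, but non-empty when using
    # a file-backed URL such as sqlite:///example.db.
    rows = await database.fetch_all("SELECT name FROM sqlite_master WHERE type = 'table'")
    print(rows)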

    Is there any gotcha for in-memory SQLite I should be aware of?

    Cheers,

    bug 
    opened by Lawouach 25
  • build(deps): switch to sqlalchemy 1.4

    SQLAlchemy 1.4 is out with some breaking changes (e.g. RowProxy was removed in favor of Row, and row.values() was removed in favor of row._mapping.values()).

    closes #298 and closes #348

    opened by PrettyWood 20
  • Returned rows not properly exposing a mapping interface.

    databases version: 0.2.2; asyncpg version: 0.18.3; functions: database.fetch_one, database.fetch_all

    dict(user_row) when user_row is a databases.backends.postgres.Record object returns an empty dict for me when using postgres through asyncpg.

    dict(user_row.__dict__['_row']) works, so I know that at least in my case the record has successfully loaded and there is a dict somewhere. Dict-style access like user_row['id'] works just fine.
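    In code, the behaviors described above are roughly the following (the query and column names are illustrative):

    # Hedged sketch of the described behavior on the asyncpg backend.
    user_row = await database.fetch_one("SELECT id, name FROM users LIMIT 1")

    dict(user_row)                    # reportedly returns {} here
    dict(user_row.__dict__["_row"])   # works, so the data was loaded
    user_row["id"]                    # dict-style access works fine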

    bug 
    opened by rlittlefield 20
  • Concurrent queries on single connection

    When using asyncpg, I am getting InterfaceError: cannot perform operation: another operation is in progress when running multiple db calls concurrently.

    I am able to replicate and isolate the issue with force_rollback=True, although in my project where I encountered this bug, it is happening regardless.

    Here is the isolated issue:

    import asyncio
    import databases
    from starlette.applications import Starlette
    from starlette.config import Config
    from starlette.responses import JSONResponse
    
    config = Config('.env')
    DATABASE_URL = config('DATABASE_URL', default='postgresql://postgres@localhost:5432/postgres')
    
    database = databases.Database(DATABASE_URL, force_rollback=True)
    app = Starlette()
    
    
    @app.on_event("startup")
    async def startup():
        await database.connect()
    
    
    @app.on_event("shutdown")
    async def shutdown():
        await database.disconnect()
    
    
    @app.route("/test", methods=["GET"])
    async def test_view(request):
        await asyncio.gather(
            get_from_db(),
            get_from_db(),
        )
        return JSONResponse({"success": True})
    
    
    async def get_from_db():
        return await database.fetch_all("SELECT pg_sleep(1)")
    
    
    if __name__ == '__main__':
        from starlette.testclient import TestClient
        with TestClient(app) as test_client:
            test_client.get('/test')
    
    

    Result

    $ python app.py
    Traceback (most recent call last):
      File "app.py", line 40, in <module>
        test_client.get('/test')
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/requests/sessions.py", line 546, in get
        return self.request('GET', url, **kwargs)
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/testclient.py", line 382, in request
        json=json,
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/requests/sessions.py", line 533, in request
        resp = self.send(prep, **send_kwargs)
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/requests/sessions.py", line 646, in send
        r = adapter.send(request, **kwargs)
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/testclient.py", line 211, in send
        raise exc from None
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/testclient.py", line 208, in send
        loop.run_until_complete(connection(receive, send))
      File "/Users/ryan/.pyenv/versions/3.7.2/lib/python3.7/asyncio/base_events.py", line 584, in run_until_complete
        return future.result()
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/middleware/errors.py", line 125, in asgi
        raise exc from None
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/middleware/errors.py", line 103, in asgi
        await asgi(receive, _send)
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/exceptions.py", line 74, in app
        raise exc from None
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/exceptions.py", line 63, in app
        await instance(receive, sender)
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/routing.py", line 41, in awaitable
        response = await func(request)
      File "app.py", line 28, in test_view
        get_from_db(),
      File "app.py", line 34, in get_from_db
        return await database.fetch_all("SELECT pg_sleep(1)")
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/databases/core.py", line 95, in fetch_all
        return await connection.fetch_all(query, values)
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/databases/core.py", line 179, in fetch_all
        return await self._connection.fetch_all(self._build_query(query, values))
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/databases/backends/postgres.py", line 137, in fetch_all
        rows = await self._connection.fetch(query, *args)
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/asyncpg/connection.py", line 421, in fetch
        return await self._execute(query, args, 0, timeout)
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/asyncpg/connection.py", line 1412, in _execute
        with self._stmt_exclusive_section:
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/asyncpg/connection.py", line 1847, in __enter__
        'cannot perform operation: another operation is in progress')
    asyncpg.exceptions._base.InterfaceError: cannot perform operation: another operation is in progress
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "app.py", line 40, in <module>
        test_client.get('/test')
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/testclient.py", line 415, in __exit__
        loop.run_until_complete(self.wait_shutdown())
      File "/Users/ryan/.pyenv/versions/3.7.2/lib/python3.7/asyncio/base_events.py", line 584, in run_until_complete
        return future.result()
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/testclient.py", line 435, in wait_shutdown
        self.task.result()
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/testclient.py", line 420, in lifespan
        await inner(self.receive_queue.get, self.send_queue.put)
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/routing.py", line 483, in asgi
        await self.shutdown()
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/starlette/routing.py", line 468, in shutdown
        await handler()
      File "app.py", line 21, in shutdown
        await database.disconnect()
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/databases/core.py", line 74, in disconnect
        await self._global_transaction.__aexit__()
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/databases/core.py", line 256, in __aexit__
        await self.rollback()
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/databases/core.py", line 297, in rollback
        await self._transaction.rollback()
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/databases/backends/postgres.py", line 215, in rollback
        await self._transaction.rollback()
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/asyncpg/transaction.py", line 219, in rollback
        await self.__rollback()
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/asyncpg/transaction.py", line 198, in __rollback
        await self._connection.execute(query)
      File "/Users/ryan/.local/share/virtualenvs/postgres-bug-tR_IhxOx/lib/python3.7/site-packages/asyncpg/connection.py", line 273, in execute
        return await self._protocol.query(query, timeout)
      File "asyncpg/protocol/protocol.pyx", line 301, in query
      File "asyncpg/protocol/protocol.pyx", line 659, in asyncpg.protocol.protocol.BaseProtocol._check_state
    asyncpg.exceptions._base.InterfaceError: cannot perform operation: another operation is in progress
    

    When running into this issue with force_rollback=False, since there isn't supposed to be a global connection and transaction, I was able to get around the issue by changing the following function:

    async def get_from_db():
        return await database.fetch_all("SELECT pg_sleep(1)")
    

    to:

    async def get_from_db():
        async with Connection(database._backend) as conn:
            return await conn.fetch_all("SELECT pg_sleep(1)")
    

    I tried using async with database.connection() as conn: but it would return the same connection that is currently being used. I guess I was just trying to understand the logic of sharing the pool connection, instead of letting the pool allocate connections (when available), since an asyncpg Connection only allows one statement at a time.

    feature 
    opened by ryananguiano 18
  • Add _mapping property to the result set interface.

    Closes #445

    As discussed in the main issue, I've added the _mapping property to the result set interface. Both asyncpg's Record and SQLAlchemy's Row have this property, so it's safe to add.
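    A short usage sketch of the added property (the query and column names are illustrative):

    # Hedged sketch: expose a result row as a mapping via the new _mapping property.
    row = await database.fetch_one("SELECT id, name FROM HighScores LIMIT 1")
    as_dict = dict(row._mapping)            # e.g. {"id": 1, "name": "Daisy"}
    values = list(row._mapping.values())    # replacement for the removed row.values()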

    opened by laukhin 16
  • Issues running databases with SQLAlchemy 1.4.42

    Maybe I am being an idiot and have forgotten to install a package, but I am having the following issue:

    I have created an API with FastAPI and databases. When I run my files in a Conda environment with SQLAlchemy 1.4.41 installed there are no issues, but when I switch to a Python virtual environment with version 1.4.42, I get the following error:

    Traceback (most recent call last):
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/uvicorn/protocols/http/httptools_impl.py", line 404, in run_asgi
        result = await app(  # type: ignore[func-returns-value]
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
        return await self.app(scope, receive, send)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/fastapi/applications.py", line 270, in __call__
        await super().__call__(scope, receive, send)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/starlette/applications.py", line 124, in __call__
        await self.middleware_stack(scope, receive, send)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/starlette/middleware/errors.py", line 184, in __call__
        raise exc
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/starlette/middleware/errors.py", line 162, in __call__
        await self.app(scope, receive, _send)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 75, in __call__
        raise exc
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 64, in __call__
        await self.app(scope, receive, sender)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
        raise e
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
        await self.app(scope, receive, send)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/starlette/routing.py", line 680, in __call__
        await route.handle(scope, receive, send)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/starlette/routing.py", line 275, in handle
        await self.app(scope, receive, send)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/starlette/routing.py", line 65, in app
        response = await func(request)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/fastapi/routing.py", line 231, in app
        raw_response = await run_endpoint_function(
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/fastapi/routing.py", line 160, in run_endpoint_function
        return await dependant.call(**values)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/./app/endpoints/ingredient.py", line 33, in get_ingredient
        for id_, name in await db.fetch_all(query=SEARCH_INGREDIENT_QUERY, values={'search1': f"{query}%", 'search2': f"% {query}%"})
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/databases/core.py", line 142, in fetch_all
        return await connection.fetch_all(query, values)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/databases/core.py", line 250, in fetch_all
        return await self._connection.fetch_all(built_query)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/databases/backends/mysql.py", line 115, in fetch_all
        metadata = CursorResultMetaData(context, cursor.description)
      File "/Users/jacob/Documents/python/applications/gumbo/fastapi/venv/lib/python3.9/site-packages/sqlalchemy/engine/cursor.py", line 164, in __init__
        (
    ValueError: not enough values to unpack (expected 5, got 4)
    

    Looking at the file venv/lib/python3.9/site-packages/sqlalchemy/engine/cursor.py on line 164 I can see that:

        def __init__(self, parent, cursor_description):
            context = parent.context
            dialect = context.dialect
            self._tuplefilter = None
            self._translated_indexes = None
            self.case_sensitive = dialect.case_sensitive
            self._safe_for_cache = False
    
            if context.result_column_struct:   
                (
                    result_columns,
                    cols_are_ordered,
                    textual_ordered,
                    ad_hoc_textual,
                    loose_column_name_matching,
                ) = context.result_column_struct      # <---- ERROR IS HERE
    

    I decided to print context.result_column_struct and found that it returned ([], True, False, False).

    I am not really sure how to go about fixing this. If it helps, the modules I have installed in the venv are listed below:

    aiomysql==0.1.1
    anyio==3.6.1
    click==8.1.3
    databases==0.6.1
    fastapi==0.85.1
    h11==0.14.0
    httptools==0.5.0
    idna==3.4
    protobuf==3.20.1
    pydantic==1.10.2
    PyMySQL==1.0.2
    python-dotenv==0.21.0
    PyYAML==6.0
    sniffio==1.3.0
    SQLAlchemy==1.4.42
    starlette==0.20.4
    typing_extensions==4.4.0
    uvicorn==0.18.3
    uvloop==0.17.0
    watchfiles==0.17.0
    websockets==10.3
    
    opened by jda5 13
  • Version 0.5.4

    https://github.com/encode/databases/releases/edit/untagged-d723639e4d24f39254b7

    Changelog:

    Added

    • Support for Unix domain in connections (#423)
    • Added asyncmy MySQL driver (#382)

    Fixed

    • Fix SQLite fetch queries with multiple parameters (#435)
    • Changed Record type to Sequence (#408)
    opened by aminalaee 13
  • DatabaseUrl bug when using Unix domain socket

    I'm deploying a FastAPI application on Google Cloud Run which connects to a Cloud SQL instance using this package. The crux of the issue is that connecting with:

    db = databases.Database(url)
    await db.connect()
    

    fails whereas connecting through sqlalchemy's create_engine with

    engine = create_engine(url)
    engine.connect()
    

    works.

    The connection URL uses the unix_sock structure (docs here) rather than the regular SQLAlchemy connection URL, something like this:

    #  all these urls work fine when connecting with sqlalchemy create_engine
    "postgresql://user:pass@/db_name?host=/path/to/sock"
    "postgresql+psycopg2://user:pass@/db_name?host=/path/to/sock"
    "postgresql+pg8000://user:pass@/db_name?unix_sock=/path/to/sock/.s.PGSQL.5432"
    

    I'm unsure whether this is an issue with using async in the Google Cloud environment, or something about how connection URLs like the ones above get translated by this package to work with SQLAlchemy. I've posted on Stack Overflow about it here, but thought I'd raise an issue here as well in case it is the latter.

    bug 
    opened by dbatten5 13
  • mysql asyncmy driver

    Fixes #244. Adds asyncmy as another MySQL driver, which is more up to date and faster in some areas. A hedged usage sketch follows the checklist below.

    • [x] Adding asyncmy
    • [x] Fixing all tests (raw queries are failing)
    • [x] Update the docs
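    A hedged usage sketch, not from the PR itself (the host and credentials are placeholders), showing how the driver would be selected through a dialect+driver URL:

    from databases import Database

    # Select the asyncmy driver explicitly with a dialect+driver URL.
    database = Database("mysql+asyncmy://user:password@localhost:3306/example")
    await database.connect()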
    opened by aminalaee 12
  • "Connection is already acquired" error

    Hi. I have a daemon with several asyncio tasks, each of which executes some requests. All of the tasks share the same connection.

    After some time one of them begins to produce this error.

    The failing code is simple :)

    In main.py I have:

    self.engine = Database(config.Config.SQLALCHEMY_DATABASE_URI)
    await self.engine.connect()

    And in the broken task:

    await wait_for(self.engine.execute(query=statement), 10)

    What could be wrong?

    opened by bokolob 11
  • Best practices on handling DB errors independent of DB used

    I've been prototyping an API using starlette with SQLAlchemy and had a question about how to handle DB errors with the databases module.

    # SQLAlchemy table definition
    class Users(Base):
        __tablename__ = "users"
        id = Column(Integer, primary_key=True)
        username = Column(String(length=20), unique=True)
        password_hash = Column(String(length=80))
    
    # Endpoint
    @validate_schema(CREATE_USER_SCHEMA)
    async def create_user(request):
        data = await request.json()
        query = Users.__table__.insert().values(username=data["username"], password_hash=hash_password(data["password"]))
        try:
            await database.execute(query)
        except exc.IntegrityError:
            return JSONResponse({"message": "User already exists."})
        return JSONResponse({"message": "Created user."})
    

    When I try to create a user with a username that already exists, violating the UNIQUE constraint specified in the table definition, I receive a sqlite3.IntegrityError: UNIQUE constraint failed: users.username (venv/lib/python3.8/site-packages/aiosqlite/core.py:153: IntegrityError). This is not caught by the code above, which uses sqlalchemy.exc.IntegrityError.

    I like to run tests in SQLite, and use Postgres in production; in other APIs I've built (with plain SQLAlchemy), sqlalchemy.exc generally does the trick regardless of the DB backend used.

    Q: Is there a good way to catch errors independent of the DB used? For now, I'm catching both the SQLite and Postgres exceptions manually, but I was wondering what best practices for this case would look like using the databases module.
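    A hedged sketch of that manual approach (assuming aiosqlite surfaces sqlite3.IntegrityError and asyncpg raises its own integrity errors; adjust for the drivers actually in use):

    import sqlite3

    # Collect the backend-specific integrity errors into one tuple.
    try:
        import asyncpg.exceptions
        INTEGRITY_ERRORS = (
            sqlite3.IntegrityError,
            asyncpg.exceptions.IntegrityConstraintViolationError,
        )
    except ImportError:  # asyncpg not installed, e.g. SQLite-only test runs
        INTEGRITY_ERRORS = (sqlite3.IntegrityError,)

    async def create_user(request):
        data = await request.json()
        query = Users.__table__.insert().values(
            username=data["username"],
            password_hash=hash_password(data["password"]),
        )
        try:
            await database.execute(query)
        except INTEGRITY_ERRORS:
            return JSONResponse({"message": "User already exists."})
        return JSONResponse({"message": "Created user."})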

    Thanks for creating starlette and databases, and sorry in advance if this is a duplicate.

    opened by rahulsabbineni 11
  • Retrieve last inserted id from `database.execute`

    Even though #495 mentions that the execute query returns the last inserted id, I am receiving None.

    # from databases import Database
    # here `db: Database`
    res = await self.db.execute(
                'INSERT INTO sensor_data_filter (label, user_id, client_id, filters, "order") VALUES (:label, :user_id, :client_id, :filters, :order)',
                values=new_filter,
            )
    print(res)  # returns `None`
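    A hedged workaround sketch, assuming a PostgreSQL backend where RETURNING is available and the table has an auto-generated id column:

    # Ask the database for the new id explicitly instead of relying on the
    # return value of execute(); fetch_val returns the first column of the row.
    new_id = await self.db.fetch_val(
        'INSERT INTO sensor_data_filter (label, user_id, client_id, filters, "order") '
        'VALUES (:label, :user_id, :client_id, :filters, :order) RETURNING id',
        values=new_filter,
    )
    print(new_id)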
    
    opened by Pk13055 0
  • Bump wheel from 0.37.1 to 0.38.1

    Bumps wheel from 0.37.1 to 0.38.1.

    Changelog

    Sourced from wheel's changelog.

    Release Notes

    UNRELEASED

    • Updated vendored packaging to 22.0

    0.38.4 (2022-11-09)

    • Fixed PKG-INFO conversion in bdist_wheel mangling UTF-8 header values in METADATA (PR by Anderson Bravalheri)

    0.38.3 (2022-11-08)

    • Fixed install failure when used with --no-binary, reported on Ubuntu 20.04, by removing setup_requires from setup.cfg

    0.38.2 (2022-11-05)

    • Fixed regression introduced in v0.38.1 which broke parsing of wheel file names with multiple platform tags

    0.38.1 (2022-11-04)

    • Removed install dependency on setuptools
    • The future-proof fix in 0.36.0 for converting PyPy's SOABI into a abi tag was faulty. Fixed so that future changes in the SOABI will not change the tag.

    0.38.0 (2022-10-21)

    • Dropped support for Python < 3.7
    • Updated vendored packaging to 21.3
    • Replaced all uses of distutils with setuptools
    • The handling of license_files (including glob patterns and default values) is now delegated to setuptools>=57.0.0 (#466). The package dependencies were updated to reflect this change.
    • Fixed potential DoS attack via the WHEEL_INFO_RE regular expression
    • Fixed ValueError: ZIP does not support timestamps before 1980 when using SOURCE_DATE_EPOCH=0 or when on-disk timestamps are earlier than 1980-01-01. Such timestamps are now changed to the minimum value before packaging.

    0.37.1 (2021-12-22)

    • Fixed wheel pack duplicating the WHEEL contents when the build number has changed (#415)
    • Fixed parsing of file names containing commas in RECORD (PR by Hood Chatham)

    0.37.0 (2021-08-09)

    • Added official Python 3.10 support
    • Updated vendored packaging library to v20.9

    ... (truncated)

    Commits
    • 6f1608d Created a new release
    • cf8f5ef Moved news item from PR #484 to its proper place
    • 9ec2016 Removed install dependency on setuptools (#483)
    • 747e1f6 Fixed PyPy SOABI parsing (#484)
    • 7627548 [pre-commit.ci] pre-commit autoupdate (#480)
    • 7b9e8e1 Test on Python 3.11 final
    • a04dfef Updated the pypi-publish action
    • 94bb62c Fixed docs not building due to code style changes
    • d635664 Updated the codecov action to the latest version
    • fcb94cd Updated version to match the release
    • Additional commits viewable in compare view


    dependencies 
    opened by dependabot[bot] 0
  • How to execute multiple queries into multiple tables?

    I was trying to do something like this, where the table name is part of the values passed to execute_many, but it doesn't work (a workaround sketch follows the snippet below)...

            queries = []
            for name in unformatted:
                value = user_data[name]['value']
                leaderboard = name.replace("-",'_')
                queries.append(
                    {
                        "table":leaderboard,
                        "uuid":uuid,
                        "user":user,
                        "prefix":prefix,
                        "suffix":suffix,
                        "value":value
                    }
                )
                
            await lb_db.execute_many(query="""
                INSERT INTO :table VALUES 
                (:uuid,:user,:prefix,:suffix, :value) 
                    ON CONFLICT(uuid) DO UPDATE SET 
                user = :user, prefix = :prefix, suffix = :suffix, value = :value
                """, values=queries)
    
    opened by tcrch 0
  • Issue with the Advisory Lock

    asyncpg version: 0.25.0; PostgreSQL version: 12.11

    We have been frequently receiving this Timeout Error

    asyncio.exceptions.TimeoutError
    
    Traceback (most recent call last):
      File /usr/local/lib/python3.9/site-packages/ddtrace/contrib/asyncpg/patch.py, line 89, in _traced_query
        return await method(*args, **kwargs)
    File asyncpg/protocol/protocol.pyx, line 338, in query
    asyncio.exceptions.TimeoutError
    

    Upon checking the logs, it's making an internal query SELECT pg_advisory_unlock_all ( ) CLOSE ALL UNLISTEN * RESET ALL on the PostgreSQL database.

    Not sure why the tables are locking when we are only doing read operations from the db.

    opened by jageshmaharjan 0
  • Getting `DatabaseBackend` is not running error

    Hey everyone. After deploying my FastAPI app on Deta and/or Vercel, I got an error complaining about DatabaseBackend not running. However, the connect method is being called during the startup event of the FastAPI instance:

    @chat_app.on_event("startup")
    async def startup():
        await init_models(SQLALCHEMY_DATABASE_URL)
        await database.connect()
    

    The weird thing is that the app is running as expected on Heroku; you can interact with the endpoints to test things out.

    Here are the full logs on vercel:

    logs
    [ERROR] AssertionError: DatabaseBackend is not running
    Traceback (most recent call last):
      File "/var/task/vc__handler__python.py", line 293, in vc_handler
        response = asgi_cycle(__vc_module.app, body)
      File "/var/task/vc__handler__python.py", line 204, in __call__
        loop.run_until_complete(asgi_task)
      File "/var/lang/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
        return future.result()
      File "/var/task/fastapi/applications.py", line 269, in __call__
        await super().__call__(scope, receive, send)
      File "/var/task/starlette/applications.py", line 124, in __call__
        await self.middleware_stack(scope, receive, send)
      File "/var/task/starlette/middleware/errors.py", line 184, in __call__
        raise exc
      File "/var/task/starlette/middleware/errors.py", line 162, in __call__
        await self.app(scope, receive, _send)
      File "/var/task/starlette/middleware/base.py", line 68, in __call__
        response = await self.dispatch_func(request, call_next)
      File "/var/task/app/main.py", line 75, in add_process_time_header
        response = await call_next(request)
      File "/var/task/starlette/middleware/base.py", line 46, in call_next
        raise app_exc
      File "/var/task/starlette/middleware/base.py", line 36, in coro
        await self.app(scope, request.receive, send_stream.send)
      File "/var/task/starlette/middleware/cors.py", line 92, in __call__
        await self.simple_response(scope, receive, send, request_headers=headers)
      File "/var/task/starlette/middleware/cors.py", line 147, in simple_response
        await self.app(scope, receive, send)
      File "/var/task/starlette/exceptions.py", line 93, in __call__
        raise exc
      File "/var/task/starlette/exceptions.py", line 82, in __call__
        await self.app(scope, receive, sender)
      File "/var/task/fastapi/middleware/asyncexitstack.py", line 21, in __call__
        raise e
      File "/var/task/fastapi/middleware/asyncexitstack.py", line 18, in __call__
        await self.app(scope, receive, send)
      File "/var/task/starlette/routing.py", line 670, in __call__
        await route.handle(scope, receive, send)
      File "/var/task/starlette/routing.py", line 266, in handle
        await self.app(scope, receive, send)
      File "/var/task/starlette/routing.py", line 65, in app
        response = await func(request)
      File "/var/task/fastapi/routing.py", line 231, in app
        raw_response = await run_endpoint_function(
      File "/var/task/fastapi/routing.py", line 160, in run_endpoint_function
        return await dependant.call(**values)
      File "/var/task/app/auth/router.py", line 62, in register
        results = await register_user(user)
      File "/var/task/app/auth/crud.py", line 131, in register_user
        fetched_user = await find_existed_user(user.email)
      File "/var/task/app/auth/crud.py", line 64, in find_existed_user
        return await database.fetch_one(query, values=values)
      File "/var/task/databases/core.py", line 147, in fetch_one
        async with self.connection() as connection:
      File "/var/task/databases/core.py", line 230, in __aenter__
        raise e
      File "/var/task/databases/core.py", line 227, in __aenter__
        await self._connection.acquire()
      File "/var/task/databases/backends/mysql.py", line 99, in acquire
        assert self._database._pool is not None, "DatabaseBackend is not running"
    

    Digging a bit deeper, I realized that the query calls were not working even though database.connect is being called at startup. It seems like some sort of race condition. Not sure.

    opened by wiseaidev 0
Releases (0.7.0)
  • 0.7.0(Dec 18, 2022)

    Fixed

    • Fixed breaking changes in SQLAlchemy cursor; supports >=1.4.42,<1.5 (#513).
    • Wrapped types in typing.Optional where applicable (#510).
  • 0.6.2(Nov 7, 2022)

  • 0.6.1(Aug 9, 2022)

  • 0.6.0(May 30, 2022)

    0.6.0 (May 30th, 2022)

    • Dropped Python 3.6 support (#458)

    Added

    • Add _mapping property to the result set interface (#447)
    • Add contributing docs (#453)

    Fixed

    • Fix query result named access (#448)
    • Fix connections getting into a bad state when a task is cancelled (#457)
    • Revert #328 parallel transactions (#472)
    • Change extra installations to specific drivers (#436)
  • 0.5.5(Jan 20, 2022)

  • 0.5.4(Jan 14, 2022)

    0.5.4 (January 14th, 2022)

    Added

    • Support for Unix domain in connections (#423)
    • asyncmy MySQL driver (#382)

    Fixed

    • Fix SQLite fetch queries with multiple parameters (#435)
    • Change Record type to Sequence (#408)
  • 0.5.3(Oct 10, 2021)

    0.5.3 (October 10th, 2021)

    Added

    • Support dialect+driver for default database drivers like postgresql+asyncpg (#396)

    Fixed

    • Documentation of low-level transaction (#390)
  • 0.5.2(Sep 10, 2021)

    0.5.2 (September 10th, 2021)

    Fixed

    • Reset counter for failed connections (#385)
    • Avoid dangling task-local connections after Database.disconnect() (#211)
  • 0.5.1(Sep 2, 2021)

    0.5.1 (September 2nd, 2021)

    Added

    • Make database connect and disconnect calls idempotent (#379)

    Fixed

    • Fix in_ and notin_ queries in SQLAlchemy 1.4 (#378)
  • 0.5.0(Aug 26, 2021)

  • 0.4.3(Mar 26, 2021)

  • 0.4.2(Mar 14, 2021)

  • 0.4.1(Nov 16, 2020)

    Changelog:

    • Removed psycopg2 and pymysql from the direct package dependencies to avoid non-binary builds (https://github.com/encode/databases/pull/256)
  • 0.4.0(Oct 20, 2020)

    Changelog:

    • Use backend native fetch_val() implementation when available (#132)
    • Replace psycopg2-binary with psycopg2 (#204)
    • Speed up PostgresConnection fetch() and iterate() (#193)
    • Access asyncpg Record field by key on raw query (#207)
    • Fix type hinting for sqlite backend (#227)
    • Allow setting min_size and max_size in postgres DSN (#210)
    • Add option pool_recycle in postgres DSN (#233)
    • Fix SQLAlchemy DDL statements (#226)
    • Make fetch_val call fetch_one for type conversion (#246)
    • Allow extra transaction options (#242)
    • Unquote username and password in DatabaseURL (#248)
  • 0.3.2(Apr 30, 2020)

Owner: Encode (collaboratively funded software development).