New generation PostgreSQL database adapter for the Python programming language

Overview

Psycopg 3 -- PostgreSQL database adapter for Python

Psycopg 3 is a modern implementation of a PostgreSQL adapter for Python.

Installation

Quick version:

pip install --upgrade pip               # upgrade pip to at least 20.3
pip install "psycopg[binary,pool]"      # install psycopg with the binary and pool extras

For further information about installation, please check the documentation.
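
A minimal "hello world", assuming a reachable server (adjust the connection string for your setup):

import psycopg

# connect; the context manager commits on success and closes the connection
with psycopg.connect("dbname=test user=postgres") as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT version()")
        print(cur.fetchone()[0])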

Hacking

In order to work on the Psycopg source code you need to have the libpq PostgreSQL client library installed in the system. For instance, on Debian systems, you can obtain it by running:

sudo apt install libpq5

After which you can clone this repository:

git clone https://github.com/psycopg/psycopg.git
cd psycopg

Please note that the repository contains the source code of several Python packages: that's why you don't see a setup.py here. The packages may have different requirements:

  • The psycopg directory contains the pure Python implementation of psycopg. The package has only a runtime dependency on the libpq, the PostgreSQL client library, which should be installed in your system.
  • The psycopg_c directory contains an optimization module written in C/Cython. In order to build it you will need a few development tools: please look at Local installation in the docs for the details.
  • The psycopg_pool directory contains the connection pool implementations. This is kept as a separate package to allow a different release cycle.

You can create a local virtualenv and install the packages there in development mode, together with their development and testing requirements:

python -m venv .venv
source .venv/bin/activate
pip install -e ./psycopg[dev,test]      # for the base Python package
pip install -e ./psycopg_c              # for the C extension module
pip install -e ./psycopg_pool           # for the connection pool

Now hack away! You can use tox to validate the code:

pip install tox
tox -p4

and to run the tests:

psql -c 'create database psycopg_test'
export PSYCOPG_TEST_DSN="dbname=psycopg_test"
tox -c psycopg -s
tox -c psycopg_c -s

Please look at the commands definitions in the tox.ini files if you want to run some of them interactively: the dependencies should already be in your virtualenv. Feel free to adapt these recipes if you follow a different development pattern.

Comments
  • macOS: kqueue file descriptor leak

    I was trying to pipe data from one server to another like this:

    with src_cur.copy("""COPY public.src_table TO STDOUT (FORMAT BINARY);""") as copy_src:
        with dst_cur.copy("""COPY public.dst_table FROM STDIN (FORMAT BINARY);""") as copy_dst:
            while data_row := copy_src.read():
                copy_dst.write(bytes(data_row))
    

    and got the following error on bigger tables:

    sending query failed: another command is already in progress
    Traceback (most recent call last):
      File ".env/lib/python3.9/site-packages/psycopg/cursor.py", line 681, in copy
        yield copy
      File "src/run_psycopg3.py", line 27, in transfer_table_data
        while data_row := copy_src.read():
      File ".env/lib/python3.9/site-packages/psycopg/copy.py", line 204, in read
        return self.connection.wait(self._read_gen())
      File ".env/lib/python3.9/site-packages/psycopg/connection.py", line 767, in wait
        return waiting.wait(gen, self.pgconn.socket, timeout=timeout)
      File ".env/lib/python3.9/site-packages/psycopg/waiting.py", line 53, in wait_selector
        sel = DefaultSelector()
      File "/usr/local/Cellar/[email protected]/3.9.8/Frameworks/Python.framework/Versions/3.9/lib/python3.9/selectors.py", line 512, in __init__
        self._selector = select.kqueue()
    OSError: [Errno 24] Too many open files
    

    macOS 12.0.1 (21A559), Python 3.9.8, psycopg-binary 3.0.4

    I was adapting my code for psycopg3; previously I used the following pattern for piping data between servers. I'm not sure if I'm using the v3 interface wrong ...

    dst_fd, src_fd = os.pipe()
    
    def copy_dst():
        with os.fdopen(dst_fd) as read_f:
            dst_cur.copy_expert("""COPY public.dst_table FROM STDIN;""", read_f)
    
    dst_thread = threading.Thread(target=copy_dst)
    dst_thread.start()
    
    with os.fdopen(src_fd, 'w') as write_f:
        src_cur.copy_expert("""COPY public.src_table TO STDOUT;""", write_f)
    
    dst_thread.join()
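
    For reference, in psycopg 3 a Copy object is also iterable, so the same piping can be written without explicit read() calls. A minimal sketch of that idiom, reusing the table names above (note this alone would not fix the descriptor leak reported here):

    with src_cur.copy("COPY public.src_table TO STDOUT (FORMAT BINARY)") as copy_src:
        with dst_cur.copy("COPY public.dst_table FROM STDIN (FORMAT BINARY)") as copy_dst:
            # iterating the source Copy yields data buffers until the copy ends
            for data in copy_src:
                copy_dst.write(data)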
    
    bug macOS 
    opened by qwesda 30
  • Cursor and Connection generic on Row

    This adds support for having Cursor's fetch*() return type match whatever the user-supplied row factory produces. For instance:

    class R:
        ...
    
    def my_row_factory(cursor: BaseCursor[Any, R]) -> Callable[[Sequence[Any]], R]:
        ...
    
    with conn.cursor(row_factory=my_row_factory) as cur:
        cur.execute("select 1")
        r = cur.fetchone()
        reveal_type(r)
        # Revealed type is 'Union[R`-1, None]'
    
    • the big commits are "Make Cursor generic on Row" and "Make Connection generic on Row"
    • other commits are either refactoring to ease the latter one or minor improvements
    • an example file tests/typing_example.py is added and used as a mypy test in tox and ci
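
    For illustration, a sketch of how this typing surfaces with the class_row factory shipped in psycopg.rows (the Person class and the query are just examples):

    from dataclasses import dataclass

    import psycopg
    from psycopg.rows import class_row

    @dataclass
    class Person:
        id: int
        name: str

    with psycopg.connect("dbname=test") as conn:
        with conn.cursor(row_factory=class_row(Person)) as cur:
            cur.execute("select 1 as id, 'Ada' as name")
            p = cur.fetchone()  # typed as Optional[Person]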
    opened by dlax 29
  • `write_row` busy loop

    Found some weird behavior in version 3.0.10 (in both the C and Python variants, installed from pip).

    This seems never to return; it sticks at 100% Python CPU with no Postgres activity:

    def provoke(conn):
        txt = '\\' * 536870883
        c = conn.cursor()
        c.execute('create temporary table ttt(t1 text, t2 text, t3 text, t4 text, t5 text)')
        with c.copy('COPY ttt (t1,t2) FROM STDIN') as copy:
            copy.write_row([txt] * 2)
        c.execute('select length(t1), length(t2), length(t3) from ttt')
        print(c.fetchall())
        c.execute('drop table ttt')
    provoke(connection)
    

    I have not debugged this further; maybe there is some sort of busy loop in the text escaping, solely on the Python side. The value size was chosen to hit typical Postgres limits, which should raise an error here, but it never gets to the Postgres side of things.

    opened by jerch 25
  • Initial Connect SSL negotiation packet Error

    When using the binary version, i.e. psycopg[binary], which calls into libpq, the following initial setup (with *** standing in for user/password)

    await AsyncConnectionPool("postgresql://*****:******@localhost:26257/passport", max_size=50).wait()
    

    causes a

    WARNING:psycopg.pool:error connecting in 'pool-1': connection failed: Connection refused
    could not send SSL negotiation packet: Connection refused
    

    I use this URL in other libraries (Java, Rust sqlx). Is it expecting a different URL? Typically, SSL takes care of itself.
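
    Since "Connection refused" is a TCP-level error, a first diagnostic step might be a single direct connection outside the pool, optionally turning SSL negotiation off via the standard libpq sslmode parameter. A sketch, with hypothetical credentials (port 26257 suggests CockroachDB):

    import psycopg

    conninfo = "postgresql://user:password@localhost:26257/passport?sslmode=disable"
    with psycopg.connect(conninfo) as conn:
        print(conn.execute("select 1").fetchone())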

    opened by lacasaprivata2 25
  • Segfault in the psycopg Python-only variant.

    Hi there, I am trying to get psycopg working on Django (https://github.com/django/django/pull/15687/files), thank you for your work so far on this -- I'll try to get it over the finish line.

    One thing I did note though is that if I run the testsuite like this (this is Django's testsuite from my branch with psycopg3 installed):

    ./runtests.py --settings=test_postgresql --parallel=8
    cat test_postgresql.py 
    # This is an example test settings file for use with the Django test suite.
    #
    # The 'sqlite3' backend requires only the ENGINE setting (an in-
    # memory database will be used). All other backends will require a
    # NAME and potentially authentication information. See the
    # following section in the docs for more information:
    #
    # https://docs.djangoproject.com/en/dev/internals/contributing/writing-code/unit-tests/
    #
    # The different databases that Django supports behave differently in certain
    # situations, so it is recommended to run the test suite against as many
    # database backends as possible.  You may want to create a separate settings
    # file for each of the backends you test against.
    
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "django",
        },
        "other": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "django2",
        },
    }
    
    SECRET_KEY = "django_tests_secret_key"
    
    # Use a fast hasher to speed up tests.
    PASSWORD_HASHERS = [
        "django.contrib.auth.hashers.MD5PasswordHasher",
    ]
    
    DEFAULT_AUTO_FIELD = "django.db.models.AutoField"
    
    USE_TZ = False
    

    I do get segfaults after a few tests:

    ./runtests.py --settings=test_postgresql --parallel=8                                                                                                                                            
    Testing against Django installed in '/home/florian/sources/django.git/django' with up to 8 processes
    Found 15782 test(s).
    Creating test database for alias 'default'...
    Cloning test database for alias 'default'...
    Cloning test database for alias 'default'...
    Cloning test database for alias 'default'...
    Cloning test database for alias 'default'...
    Cloning test database for alias 'default'...
    Cloning test database for alias 'default'...
    Cloning test database for alias 'default'...
    Cloning test database for alias 'default'...
    Creating test database for alias 'other'...
    Cloning test database for alias 'other'...
    Cloning test database for alias 'other'...
    Cloning test database for alias 'other'...
    Cloning test database for alias 'other'...
    Cloning test database for alias 'other'...
    Cloning test database for alias 'other'...
    Cloning test database for alias 'other'...
    Cloning test database for alias 'other'...
    System check identified no issues (17 silenced).
    ....Fatal Python error: Segmentation fault
    
    Current thread 0x00007ff582c5b740 (most recent call first):
      File "/home/florian/sources/django.git/django/db/backends/base/base.py", line 364 in close
      File "/home/florian/sources/django.git/django/utils/asyncio.py", line 26 in inner
      File "/home/florian/sources/django.git/django/test/testcases.py", line 1461 in tearDownClass
      File "/usr/lib64/python3.10/unittest/suite.py", line 306 in _tearDownPreviousClass
      File "/usr/lib64/python3.10/unittest/suite.py", line 130 in run
      File "/usr/lib64/python3.10/unittest/suite.py", line 84 in __call__
      File "/home/florian/sources/django.git/django/test/runner.py", line 366 in run
      File "/home/florian/sources/django.git/django/test/runner.py", line 446 in _run_subsuite
      File "/usr/lib64/python3.10/multiprocessing/pool.py", line 125 in worker
      File "/usr/lib64/python3.10/multiprocessing/process.py", line 108 in run
      File "/usr/lib64/python3.10/multiprocessing/process.py", line 315 in _bootstrap
      File "/usr/lib64/python3.10/multiprocessing/popen_fork.py", line 71 in _launch
      File "/usr/lib64/python3.10/multiprocessing/popen_fork.py", line 19 in __init__
      File "/usr/lib64/python3.10/multiprocessing/context.py", line 277 in _Popen
      File "/usr/lib64/python3.10/multiprocessing/process.py", line 121 in start
      File "/usr/lib64/python3.10/multiprocessing/pool.py", line 326 in _repopulate_pool_static
      File "/usr/lib64/python3.10/multiprocessing/pool.py", line 303 in _repopulate_pool
      File "/usr/lib64/python3.10/multiprocessing/pool.py", line 212 in __init__
      File "/usr/lib64/python3.10/multiprocessing/context.py", line 119 in Pool
      File "/home/florian/sources/django.git/django/test/runner.py", line 497 in run
      File "/usr/lib64/python3.10/unittest/suite.py", line 84 in __call__
      File "/usr/lib64/python3.10/unittest/runner.py", line 184 in run
      File "/home/florian/sources/django.git/django/test/runner.py", line 967 in run_suite
      File "/home/florian/sources/django.git/django/test/runner.py", line 1045 in run_tests
      File "/home/florian/sources/django.git/tests/./runtests.py", line 427 in django_tests
      File "/home/florian/sources/django.git/tests/./runtests.py", line 768 in <module>
    
    Extension modules: pywatchman.bser, markupsafe._speedups, PIL._imaging, psycopg2._psycopg, PIL._webp, yaml._yaml, numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, _cffi_backend (total: 20)
    

    Now I am not 100% sure yet whether this is due to my code changes or not, but Python should not crash. Then again, since psycopg uses ctypes, there is probably a chance of crashes? The other option is that this is a Python bug, but I am opening it here first and maybe we can figure out whether it is psycopg or Python. I can also gather core dumps, but it would be great if someone could try verifying it on another machine so I am not chasing heisenbugs.

    I also do not observe this issue if I use 'psycopg[binary]', which points even more towards a psycopg ctypes or Python bug. I will try to gather more details tomorrow, but if you have any ideas let me know.

    opened by apollo13 23
  • Update cibuildwheel to v2.2.2 to build `musllinux` wheels

    The v2.2.x version of cibuildwheel adds support for building musllinux wheels, meaning that Alpine users can finally take advantage of binary installations.

    opened by lithammer 22
  • Error deleting closed connection on exit

    psycopg 3.0.7, Python 3.10.1, Postgres 14.1, macOS 12.1 (Monterey), Mac mini (M1, 2020)

    Applications run to completion and then, when exiting, throw the following exception:

    Exception ignored in: <function BaseConnection.__del__ at 0x102356dd0>
    Traceback (most recent call last):
      File "/opt/homebrew/lib/python3.10/site-packages/psycopg/connection.py", line 140, in __del__
      File "/opt/homebrew/lib/python3.10/site-packages/psycopg/connection.py", line 161, in closed
      File "/opt/homebrew/lib/python3.10/site-packages/psycopg/pq/pq_ctypes.py", line 205, in status
    TypeError: 'NoneType' object is not callable
    Exception ignored in: <function PGconn.__del__ at 0x1021bbb50>
    Traceback (most recent call last):
      File "/opt/homebrew/lib/python3.10/site-packages/psycopg/pq/pq_ctypes.py", line 91, in __del__
    TypeError: 'NoneType' object is not callable
    Exception ignored in: <function PGresult.__del__ at 0x102232320>
    Traceback (most recent call last):
      File "/opt/homebrew/lib/python3.10/site-packages/psycopg/pq/pq_ctypes.py", line 723, in __del__
      File "/opt/homebrew/lib/python3.10/site-packages/psycopg/pq/pq_ctypes.py", line 733, in clear
    TypeError: 'NoneType' object is not callable
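
    A workaround sketch, assuming the errors come from __del__ running during interpreter shutdown, when module globals may already be cleared: close connections explicitly, for instance via the context manager, so teardown happens while the modules are still intact:

    import psycopg

    with psycopg.connect("dbname=test") as conn:  # hypothetical DSN
        ...  # do the work here
    # the connection is closed here, before interpreter shutdown begins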
    
    opened by cvickery 21
  • Python 3.11 compatibility

    At the moment, the psycopg-c build is not compatible with Python 3.11. We probably need to wait for the release of Cython 3.0.0a11, because the errors during the build seem to be already addressed and merged to master:

    • https://github.com/cython/cython/issues/4500
    • https://github.com/cython/cython/issues/4721
    opened by dvarrazzo 18
  • row factory proposal

    Following up on the discussion on the ML, here's a proposal to add a row_factory argument to connection.cursor(), allowing custom values to be returned for the rows of the result set.


    We add a row_factory keyword argument in connection.cursor() and cursor classes that will be used to produce individual rows of the result set.

    A RowFactory can be implemented as a class with a __call__ method accepting raw values and initialized with a cursor instance; the RowFactory instance is created when results are available. Type definitions for RowFactory (and its respective RowMaker) are defined as callback protocols, so as to allow users to define a row factory without needing to write a class.

    The default row factory returns values unchanged.
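
    A sketch of what a factory could look like under this proposal; dict_row_factory is a hypothetical example, written as a plain function thanks to the callback protocols:

    from typing import Any, Callable, Dict, Sequence

    def dict_row_factory(cursor) -> Callable[[Sequence[Any]], Dict[str, Any]]:
        # invoked when results are available, so the description is populated
        names = [c.name for c in cursor.description]

        def make_row(values: Sequence[Any]) -> Dict[str, Any]:
            return dict(zip(names, values))

        return make_row

    # usage sketch: cur = conn.cursor(row_factory=dict_row_factory)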

    opened by dlax 16
  • Provide pre-compiled wheels for Mac OS 12 (Monterey)

    Hello, as the title suggests, this issue is about providing precompiled wheels for the newest macOS, Monterey (12.0). The main psycopg package installs fine (via poetry or pip), but the provider (psycopg-binary) can't be installed. Pip currently doesn't find a matching wheel/distribution for the latest macOS; I would guess the OS selector does not match the existing uploaded wheels. Tested on py3.9 with poetry & pip, macOS 12.0.1. I've also checked the wheels on PyPI and it appears this is the case for all other Python versions for which psycopg-binary is released as well.

    macOS binary 
    opened by RootLUG 15
  • WIP: Add PostGIS Geometry adapter based on Shapely

    As mentioned in https://github.com/rustprooflabs/pgosm-flex/pull/165 I tried to write an adapter for PostGIS geometries based on Shapely.

    The usage looks like this:

    Example usage
    
    import psycopg
    
    
    from psycopg.types.geometry import register_shapely_adapters
    
    CONN_STR = "postgres://postgres:testpassword@localhost:15432/osm_data"
    
    # get the Reichstag building geometry from a pgosm-flex import of Berlin
    # it's a multipolygon with holes in it
    # https://www.openstreetmap.org/relation/2201742
    READ_QUERY = """
        select osm_id, name, st_area(geom), geom
        from osm.building_polygon
        where osm_id = -2201742"""
    
    with psycopg.connect(CONN_STR) as conn:
        register_shapely_adapters(conn)
    
        with conn.cursor(binary=False) as cur:
            cur.execute(READ_QUERY)
            row = cur.fetchone()
        print('Text protocol:')
            print(row)
            print('area from shape:', row[-1].area)
            print('area from DB:', row[-2])
    
    
        with conn.cursor(binary=True) as cur:
            cur.execute(READ_QUERY)
            row = cur.fetchone()
        print('Binary protocol:')
            print(row)
            print('area from shape:', row[-1].area)
            print('area from DB:', row[-2])
            from shapely import affinity
            cur.execute("insert into osm.building_polygon(osm_id, osm_type, address, geom) VALUES(999999, 'fake', 'fake', %s)", (affinity.rotate(row[-1], 30),))
            conn.commit()
    
    
    with psycopg.connect(CONN_STR) as conn:
        # this connection has no adapters
        with conn.cursor(binary=False) as cur:
            cur.execute(READ_QUERY)
            row = cur.fetchone()
            print(str(row)[:400] + '...(truncated)')
    
    It works both in binary and text mode, and I can see the rotated shape in QGIS.

    [Screenshot: a map showing the polygon and the rotated copy inserted using the adapter, viewed in QGIS]

    There are a few things missing:

    • the geometry is a type from PostGIS, so I think it can only be retrieved at runtime (see the sketch after this list)
    • how to write a test for this? I ran it against a test PostGIS DB
    • I'm not sure about the import for shapely; right now it's simply attempted and there's no error handling on it
    • documentation, once the usage is defined
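
    On the runtime-lookup point above, a hedged sketch of how registration could work, using psycopg's TypeInfo to fetch the PostGIS type at connection time; GeometryLoader and the hex-EWKB parsing are illustrative assumptions, not this PR's actual code:

    import psycopg
    from psycopg.adapt import Loader
    from psycopg.types import TypeInfo
    from shapely import wkb

    class GeometryLoader(Loader):
        def load(self, data):
            # assumption: text-mode PostGIS output is hex-encoded EWKB
            return wkb.loads(bytes(data).decode(), hex=True)

    with psycopg.connect("dbname=osm_data") as conn:  # hypothetical DSN
        info = TypeInfo.fetch(conn, "geometry")  # look the type up at runtime
        if info is not None:
            info.register(conn)
            conn.adapters.register_loader(info.oid, GeometryLoader)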
    opened by jacopofar 15
  • Using a prerelease version of cython is problematic for distribution packaging

    Hi! I'm currently working on removing the use of pip from our packaging process on Arch Linux (as it allows packages to download arbitrary dependencies from unknown locations and execute them).

    I noticed that psycopg makes use of a prerelease version of cython (3.0.0a11):

    /usr/lib/python3.10/site-packages/setuptools/installer.py:27: SetuptoolsDeprecationWarning: setuptools.installer is deprecated. Requirements should be satisfied by a PEP 517 installer.
      warnings.warn(
    /usr/bin/python: No module named pip
    Traceback (most recent call last):
      File "/usr/lib/python3.10/site-packages/setuptools/installer.py", line 82, in fetch_build_egg
        subprocess.check_call(cmd)
      File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
        raise CalledProcessError(retcode, cmd)
    subprocess.CalledProcessError: Command '['/usr/bin/python', '-m', 'pip', '--disable-pip-version-check', 'wheel', '--no-deps', '-w', '/tmp/tmp63nki98q', '--quiet', 'Cython>=3.0.0a11']' returned non-zero exit status 1.
    
    The above exception was the direct cause of the following exception:
    
    Traceback (most recent call last):
      File "/build/python-psycopg/src/python-psycopg/psycopg_c/setup.py", line 106, in <module>
        setup(
      File "/usr/lib/python3.10/site-packages/setuptools/__init__.py", line 86, in setup
        _install_setup_requires(attrs)
      File "/usr/lib/python3.10/site-packages/setuptools/__init__.py", line 80, in _install_setup_requires
        dist.fetch_build_eggs(dist.setup_requires)
      File "/usr/lib/python3.10/site-packages/setuptools/dist.py", line 874, in fetch_build_eggs
        resolved_dists = pkg_resources.working_set.resolve(
      File "/usr/lib/python3.10/site-packages/pkg_resources/__init__.py", line 789, in resolve
        dist = best[req.key] = env.best_match(
      File "/usr/lib/python3.10/site-packages/pkg_resources/__init__.py", line 1075, in best_match
        return self.obtain(req, installer)
      File "/usr/lib/python3.10/site-packages/pkg_resources/__init__.py", line 1087, in obtain
        return installer(requirement)
      File "/usr/lib/python3.10/site-packages/setuptools/dist.py", line 944, in fetch_build_egg
        return fetch_build_egg(self, req)
      File "/usr/lib/python3.10/site-packages/setuptools/installer.py", line 84, in fetch_build_egg
        raise DistutilsError(str(e)) from e
    distutils.errors.DistutilsError: Command '['/usr/bin/python', '-m', 'pip', '--disable-pip-version-check', 'wheel', '--no-deps', '-w', '/tmp/tmp63nki98q', '--quiet', 'Cython>=3.0.0a11']' returned non-zero exit status 1.
    

    This effectively means that downstream distributions would install cython in an arbitrary version >= 3.0.0a11 in their packaging environments, which means downloading a package without version pinning in the context of the build system, leading to unreproducible packages.

    Since cython 3.0.0 has not been released as a stable version, it is also not packaged on downstream distributions (apart from some bizarre edge cases): https://repology.org/project/python:cython/versions. Therefore the optimized version of psycopg cannot be packaged cleanly (and reproducibly).

    Can you please make the build compatible with the current stable version of cython?

    opened by dvzrv 10
  • No generators

    Initial attempt to refactor the generators + wait-consumer approach into processing functions that take a wait function used to block.

    There is some speedup to gain from this approach because of reduced use of generators/exceptions. But generators also require reacquiring the GIL at every yield, so the idea, unless I'm terribly mistaken, is that not using them, when we can, should allow longer dives into C code without resurfacing back to Python (and re-acquiring the GIL), addressing the concurrency problem exposed in #448.
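
    A toy contrast of the two styles (pure illustration, not psycopg's actual internals): the generator resurfaces to Python at every I/O point, while the refactored shape blocks inside one call through a wait function passed in:

    # generator style: every I/O point yields a wait state to a consumer loop
    def recv_gen(sock):
        yield "R"  # ask the consumer to wait for readability
        return sock.recv(4096)  # delivered via StopIteration.value

    # refactored style: block directly, no yield, fewer GIL round trips
    def recv_blocking(sock, wait):
        wait(sock, "R")  # block until readable
        return sock.recv(4096)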

    opened by dvarrazzo 1
  • High level generators

    This MR is a refactoring of the generators module, with the idea of having generators do more high-level work. For instance, a public generator would take care of a whole execute-fetch cycle (similar to what e.g. PQexecParams() does), whereas, until 3.1, generators took care of flushing and receiving data separately.

    The idea is that, using this organization, we can, at a later stage, release the GIL for the whole duration of the send/fetch cycle and avoid the GIL thrashing that might be causing #448. This will also require actually dropping the use of generators and using a blocking function to wait, so we will have to build #415 on top of this in order to really address #448.

    opened by dvarrazzo 0
  • Optimize query conversion in C

    Profiling the current state of Psycopg (3.1.5) in a tight loop looks like this:

    [profiling graph omitted]

    The ragged hill is the machinery to convert parameters from Python to Postgres. It is probably the best area to attack in order to improve performance; probably even more valuable, and less intrusive, than #415.

    Rewriting this part in Cython would also allow the query and the Transformer to communicate in C, so there is probably no need to resurface to Python in a PostgresQuery.convert() call (except to call Python dumpers, of course).

    https://github.com/psycopg/psycopg/blob/1edf0eb47d44b985e82a60cf1ce4fd2a0c1fdafd/psycopg/psycopg/cursor.py#L469-L470

    performance 
    opened by dvarrazzo 6
  • Refactor: move modifier parsing functions to TypeInfo

    Currently, the Column object contains convoluted ad-hoc code to be able to parse the type modifier value into something to display. See #449 for some details.

    Create equivalent typmodout functions and attach them to builtin data types, giving extension types the possibility to use theirs as well.

    enhancement 
    opened by dvarrazzo 0
  • Performance degradation under concurrency

    @dvarrazzo Something bad is happening with concurrency. Increasing the number of concurrent connections makes psycopg performance worse (!). I am talking about both psycopg and psycopg_async.

    Here is the modified script I used. In the case of the sync driver, adding concurrency means multiple threads; in the async case, multiple coroutines.

    Both psycopg2 and asyncpg behave as expected, so using concurrency roughly doubles their performance.
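
    The modified script is linked below rather than reproduced; a minimal sketch of the kind of threaded benchmark described, with illustrative names and query:

    import threading
    import time

    import psycopg

    def worker(dsn: str, n: int) -> None:
        with psycopg.connect(dsn) as conn:
            cur = conn.cursor()
            for _ in range(n):
                cur.execute("select 1")
                cur.fetchall()

    def bench(dsn: str, threads: int = 8, n: int = 1000) -> float:
        ts = [threading.Thread(target=worker, args=(dsn, n)) for _ in range(threads)]
        t0 = time.monotonic()
        for t in ts:
            t.start()
        for t in ts:
            t.join()
        return threads * n / (time.monotonic() - t0)  # queries per second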

    Originally posted by @pwtail in https://github.com/psycopg/psycopg/discussions/411#discussioncomment-4288249

    performance 
    opened by dvarrazzo 8
Owner
The Psycopg Team
We make reptiles and pachyderms talk to each other.