A PostgreSQL or SQLite ORM for Python

Overview

Prom

An opinionated, lightweight ORM for PostgreSQL or SQLite.

Prom has been used in both single-threaded and multi-threaded environments, including environments using green threads.

1 Minute Getting Started with SQLite

First, install prom:

$ pip install prom

Set an environment variable:

$ export PROM_DSN=sqlite://:memory:

Start python:

$ python

Create a prom Orm:

>>">
>>> import prom
>>>
>>> class Foo(prom.Orm):
...     table_name = "foo_table_name"
...     bar = prom.Field(int)
...
>>>

Now go wild and create some Foo objects:

>>> for x in range(10):
...     f = Foo.create(bar=x)
...
>>>

Now query them:

>>> f = Foo.query.one()
>>> f.bar
0
>>> f.pk
1
>>>
>>> for f in Foo.query.in_bar([3, 4, 5]):
...     f.pk
...
4
5
6
>>>

Update them:

>>> for f in Foo.query:
...     f.bar += 100
...     f.save()
...
>>>

and get rid of them:

>>> for f in Foo.query:
...     f.delete()
...
>>>

Congratulations, you have now created, retrieved, updated, and deleted rows from your database.


Configuration

Prom can be automatically configured on import by setting the environment variable PROM_DSN.

The PROM_DSN should define a dsn url:

<full.python.path.InterfaceClass>://<username>:<password>@<host>:<port>/<database>?<option=val>#<connection-name>
The built-in interface classes don't need their full python paths; you can just use sqlite and postgres.

So to use the built-in Postgres interface with the testdb database on host localhost, username testuser, and password testpw:

postgres://testuser:testpw@localhost/testdb

And to set it in your environment:

export PROM_DSN=postgres://testuser:testpw@localhost/testdb

After you've set the environment variable, you just need to import Prom in your code:

import prom

and Prom will take care of parsing the dsn url(s) and creating the connection(s) automatically.

Multiple db interfaces or connections

If you have multiple connections, you can set multiple environment variables:

export PROM_DSN_1=postgres://testuser:testpw@localhost/testdb1#conn_1
export PROM_DSN_2=sqlite://testuser:testpw@localhost/testdb2#conn_2

It's easy to have one set of prom.Orm children use one connection and another set use a different connection, since the fragment part of a Prom dsn url sets the connection name:

import prom

prom.configure("Interface://testuser:testpw@localhost/testdb#connection_1")
prom.configure("Interface://testuser:testpw@localhost/testdb#connection_2")

class Orm1(prom.Orm):
    connection_name = "connection_1"
  
class Orm2(prom.Orm):
    connection_name = "connection_2"

Now, any child class that extends Orm1 will use connection_1 and any child class that extends Orm2 will use connection_2.

Creating Models

Check out the README to see how to define the db schema and create models your python code can use.

Querying Rows

Check out the README to see how to perform queries on the db.

Versions

While Prom will most likely work on other versions, it is tested on Python 2.7+ and 3.8.

Installation

Postgres

If you want to use Prom with Postgres, you need psycopg2:

$ apt-get install libpq-dev python-dev
$ pip install psycopg2

Green Threads

If you want to use Prom with gevent, you'll need gevent and psycogreen:

$ pip install gevent
$ pip install psycogreen

These are the versions we're using:

$ pip install "gevent==20.6.2"
$ pip install "psycogreen==1.0"

Then you can set up Prom like this:

from prom.interface.postgres import gevent
gevent.patch_all()

Now you can use Prom the same way you always have. If you would like to configure the pooling and threading behavior, you can pass in some configuration options using the dsn; the parameters are async, pool_maxconn, pool_minconn, and pool_class. The only one you'll really care about is pool_maxconn, which sets how many connections should be created.
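For example, a dsn that caps the pool at ten connections might look something like this (a sketch; the option names come from the list above, and the host and credentials are the placeholder values used earlier):

export PROM_DSN="postgres://testuser:testpw@localhost/testdb?async=1&pool_maxconn=10"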

Prom

Prom installs using pip:

$ pip install prom

and to install the latest and greatest:

$ pip install --upgrade "git+https://github.com/Jaymon/prom#egg=prom"

Using for the first time

Prom takes the approach that you don't want to be hassled with table installation while developing, so when it tries to do something and sees that the table doesn't yet exist, it will use the fields defined on your prom.model.Orm child and create the table for you. That way you don't have to remember to run a script or craft some custom db query to add your tables; Prom takes care of that automatically.
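To make that concrete, here's a minimal sketch (the table and field names are made up) where the first create() call is what triggers the table creation:

import prom

class Cat(prom.Orm):
    table_name = "cat_table_name"
    name = prom.Field(str)

# no install() call needed, the table is created on first use
c = Cat.create(name="whiskers")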

Likewise, if you add a field (and the field is not required), prom will go ahead and add that field to your table, so you don't have to bother crafting ALTER queries while developing.
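Continuing the sketch above, adding a new optional field is just a matter of adding it to the class (I'm assuming here that Field's second argument is the required flag, so False makes the field optional):

class Cat(prom.Orm):
    table_name = "cat_table_name"
    name = prom.Field(str)
    age = prom.Field(int, False)  # new non-required field, prom can add the column for you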

If you want to install the tables manually, you can create a script or something and use the Orm's install() method:

SomeOrm.install()

Comments

  • Cleanup Query api

    Cleanup Query api

    I would like to explore the following interface:

    1. If you add select fields, then get() or all() would return just the selected fields, not Orm instances. Likewise, get_one() would return just the value, not an Orm instance. Basically, we never want half-populated Orm objects to be the default, so if you did want a half-populated Orm object you could just create it and pass in the returned values. We almost exclusively want lists when we use the select() method.

    2. Change get_one() to one() so that it is more in line with first() and last(). I know why I did it the original way: to make it similar to get() and get_pk().

    3. Remove Query's value() and values(). Because of number 1 above, we wouldn't need these any longer, and if you ever needed them with the full fields, they would still exist on the returned iterator.

    question 
    opened by Jaymon 5
  • break up isetter to iupdate and icreate

    break up isetter to iupdate and icreate

    It would just be easier if all the methods had the very same interface: you get passed the value and you return a value. Right now I have to look up isetter every time I want to use it because this is the definition:

    @foo.isetter
    def foo(cls, val, is_update, is_modified):
    

    I would also just like to make this whole interface better and easier to use. I ran into an issue the other day with this method being called with val=None when the field wasn't actually being set, which broke the principle of expectation. So I just think the whole interface can be cleaned up a bit, maybe also with an inochange method or something like that. A sketch of the proposed split is below.
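    Something like this is what I have in mind (a sketch of the proposed interface; none of these decorators exist yet):

    @foo.icreate
    def foo(cls, val):
        return val  # only called when inserting

    @foo.iupdate
    def foo(cls, val):
        return val  # only called when updating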

    opened by Jaymon 3
  • add unique select

    add unique select

    Foo.unique("bar_id").gt__created(now - ago).lt__created(now).values()
    

    Would result in:

    SELECT DISTINCT bar_id FROM foo WHERE _created > ... AND _created < ...
    
    opened by Jaymon 3
  • add Query.one_field_name and get_field_name

    add Query.one_field_name and get_field_name

    So we currently have get_pk(), which acts as an alias for Query.is_pk(pk).one(), but we should make that generic so you can do it for any field: the get_* version would return an iterator, and the one_* version would return the first match. A sketch is below.
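    Something like this (hypothetical; the generic get_*/one_* methods don't exist yet):

    # proposed equivalents of Foo.query.is_bar(5).all() / .one()
    Foo.query.get_bar(5)  # would return an iterator of matching rows
    Foo.query.one_bar(5)  # would return the first match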

    opened by Jaymon 2
  • Remove usage of `int` method within SQLite interface TimestampType

    Remove usage of `int` method within SQLite interface TimestampType

    int(...) is used several times within the convert method of TimestampType. This, however, leads to the unintended consequence of trimming off leading 0s from any string segment. All usages of int(...) should be replaced with alternative solutions.
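    A quick illustration of the problem (plain Python, nothing prom-specific):

    # int() drops leading zeros, so rebuilding a timestamp string
    # from int'd segments corrupts any zero-padded segment
    seconds = "05"
    print(int(seconds))       # 5
    print(str(int(seconds)))  # "5" instead of "05"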

    opened by jaridmargolin 2
  • Could query.Iterator extend list?

    Could query.Iterator extend list?

    It would be really handy if it could almost completely mimic list while being an iterator on the backend. It already does most of this, but because it doesn't extend list you lose out on some functionality.

    opened by Jaymon 2
  • Query should run values through isetter also

    Query should run values through isetter also

    I'm not sure if it does this or not, but Query should also run values through the isetter method before it runs the query. That way, if you have a mapping from string to int in the isetter methods, say something like:

    mapping = {
        "foo": 1,
    }
    

    Then doing something like:

    MyOrm.query.is_name("foo")
    

    the "foo" value would be converted to 1 before running the query.

    opened by Jaymon 2
  • get_modified sets container fields as modified but depopulate doesn't

    get_modified sets container fields as modified but depopulate doesn't

    Similar to the problem with JsonField: let's say you have a custom container that is stored in the db as a string, but when pulled out of the db it becomes a set() or something similar. Well, if you did something like:

    f = FooOrm()
    f.myset.add("val")
    

    Then the modified fields wouldn't identify that the field has been changed, so there needs to be a way to account for this.

    possible solutions:

    1. Pass a value into the Field() constructor that basically says to always mark this field as modified, something like: myset = Field(str, dirty=True)

    2. When pulling values from the db, take hashes of the values and compare those hashes with the current values to determine what is modified. This would be a wholesale rewrite of the current system and would make pulling rows from the db way slower, but it would be generic and wouldn't require the end user to be aware of anything.

    3. When pulling out of the db, if after calling Field.isetter() the value is no longer the same type as what the field says (i.e., myset is stored in the db as a string, but after calling myset.isetter() it is now a set), then go ahead and automatically mark that field as modified. This would replace the current checks for ObjectField, JsonField, and the like. I think this system would be pretty generic, relatively fast (only one extra isinstance() check), and wouldn't involve the end user having to do anything.

    wontfix 
    opened by Jaymon 2
  • Orm.read_only

    Orm.read_only

    If you set this to True on a prom.Orm instance, it will cause insert() and update() to fail.

    I was in a situation where I needed to mimic some production data in the test environment, and it would've been nice to just have an added level of WTF protection in case I accidentally triggered a change.
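    A sketch of the proposed usage (read_only doesn't exist yet):

    f = FooOrm(bar=1)
    f.read_only = True
    f.save()  # under this proposal, this would raise instead of writing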

    enhancement 
    opened by Jaymon 2
  • Recovering from connection error

    Recovering from connection error

    File "/usr/local/lib/python2.7/dist-packages/prom/query.py", line 573, in get_one
        d = self._query('get_one')
      File "/usr/local/lib/python2.7/dist-packages/prom/query.py", line 682, in _query
        return getattr(i, method_name)(s, self)
      File "/usr/local/lib/python2.7/dist-packages/prom/interface/base.py", line 479, in get_one
        ret = self._get_query(self._get_one, schema, query, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/prom/interface/base.py", line 463, in _get_query
        ret = callback(schema, query, *args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/prom/interface/base.py", line 851, in _get_one
        return self.query(query_str, *query_args, fetchone=True, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/prom/interface/base.py", line 194, in query
        return self._query(query_str, query_args, **query_options)
      File "/usr/local/lib/python2.7/dist-packages/prom/interface/base.py", line 625, in _query
        return ret
      File "/usr/lib/python2.7/contextlib.py", line 35, in __exit__
        self.gen.throw(type, value, traceback)
      File "/usr/local/lib/python2.7/dist-packages/prom/interface/base.py", line 170, in connection
        self.raise_error(e)
      File "/usr/local/lib/python2.7/dist-packages/prom/interface/base.py", line 155, in connection
        yield connection
      File "/usr/local/lib/python2.7/dist-packages/prom/interface/base.py", line 595, in _query
        cur = connection.cursor()
    InterfaceError: connection already closed
    

    Prom should be able to recover from this error, and only propagate it if it fails to reconnect N times or something. A rough sketch of that behavior is below.
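    This is generic Python, not prom's actual internals; run_query and reconnect are stand-ins for whatever executes the query and re-establishes the connection:

    import time
    from psycopg2 import InterfaceError

    def query_with_retry(run_query, reconnect, attempts=3):
        # retry the query, reconnecting between attempts, and only
        # propagate the error once every attempt has failed
        for attempt in range(1, attempts + 1):
            try:
                return run_query()
            except InterfaceError:
                if attempt == attempts:
                    raise
                reconnect()
                time.sleep(0.5 * attempt)  # simple backoff between attempts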

    enhancement 
    opened by Jaymon 2
  • Model.__getattr__ handling of pk alias

    Model.__getattr__ handling of pk alias

    we have a line in __getattr__ like this:

    if k != "pk":
    

    It would be better if that line was updated to check all the primary key aliases:

    if k in schema.fields[schema.pk_name].names:
    
    opened by Jaymon 1
  • Query filter syntax

    Query filter syntax

    https://www.gatsbyjs.com/docs/query-filters/

    This has come up a few times, but Prom's query class was based on MongoDB's query syntax, and that syntax seems to have become pretty standard over the last decade or so. The same syntax is also used in GraphQL.

    I should maybe add some of these sources to the query docs. But right now I just need a place to put these that I can remember when the subject comes up in discussion; this will be that place for a while.

    opened by Jaymon 0
  • Remove Postgres VARCHAR in favor of TEXT with check constraint

    Remove Postgres VARCHAR in favor of TEXT with check constraint

    So instead of VARCHAR(256) I would do:

    <COLNAME> TEXT NOT NULL CHECK (length(<COLNAME>) <= 256)
    

    You can also easily modify the checks:

    ALTER TABLE <TABLE NAME> ADD CONSTRAINT <CONSTRAINT-NAME> CHECK (length(<COLNAME>) <= 128)
    

    This should work in both SQLite and Postgres. CHECK constraints can also be more involved, like regexes and stuff.

    Links

    Search

    • sqlite add constraint check
    opened by Jaymon 0
  • Switch timestamp to timestamptz

    Switch timestamp to timestamptz

    I'd want to look into how SQLite handles these also. But I'm thinking I should change to this for all timestamps:

    timestamp with time zone
    

    References

    The SQL standard requires that writing just timestamp be equivalent to timestamp without time zone, and PostgreSQL honors that behavior. timestamptz is accepted as an abbreviation for timestamp with time zone; this is a PostgreSQL extension.

    opened by Jaymon 0
  • Add support for intersect

    Add support for intersect

    An intersect query:

    SELECT
        "_id"
    FROM
        "foo"
    WHERE
        ("type" = 'a' AND "value" = '1')
    INTERSECT
    SELECT
        "_id"
    FROM
        "foo"
    WHERE
        ("type" = 'b' AND "value" = '2')
    

    and a count query:

    SELECT count(*) FROM (
        SELECT
            "_id"
        FROM
            "foo"
        WHERE
            ("type" = 'a' AND "value" = '1')
        INTERSECT
        SELECT
            "_id"
        FROM
            "foo"
        WHERE
            ("type" = 'b' AND "value" = '2')
    ) I
    

    This could be done in prom:

    q1 = Foo.query.select_pk().eq_type("a").eq_value("1")
    q2 = Foo.query.select_pk().eq_type("b").eq_value("2")
    
    # get the results
    Foo.query.intersect(q1, q2).all()
    
    # count the results
    Foo.query.intersect(q1, q2).count()
    

    References

    Search

    • sql postgres intersect count query
    opened by Jaymon 3
  • A better way to do Postgres case-insensitive searching?

    A better way to do Postgres case-insensitive searching?

    Currently we basically do this, but it is probably worth revisiting and taking a good look at citext or collation (which is how we do it on SQLite).

    Jarid was researching this in July 2022 for something on Bodega and is the one that showed me citext.

    opened by Jaymon 0
  • Add support for Upsert

    Add support for Upsert

    • https://www.sqlite.org/lang_UPSERT.html
    • https://stackoverflow.com/questions/418898/sqlite-upsert-not-insert-or-replace

    Both Postgres and SQLite have roughly the same syntax, and it would be nice to bake in support for it. A sketch of the shared form is below.
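    Here's the shared ON CONFLICT form exercised through the stdlib sqlite3 module (a sketch; the table and column names are made up, and it needs a SQLite new enough to support UPSERT, 3.24+):

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE foo (_id INTEGER PRIMARY KEY, bar INTEGER)")
    # insert, or update bar if a row with that _id already exists
    con.execute(
        "INSERT INTO foo (_id, bar) VALUES (?, ?) "
        "ON CONFLICT (_id) DO UPDATE SET bar = excluded.bar",
        (1, 2),
    )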

    opened by Jaymon 0