Monty, Mongo tinified. MongoDB implemented in Python !

Overview


Inspired by TinyDB and its extension TinyMongo.

MontyDB is:

  • A tiny version of MongoDB, targeting MongoDB 4.0.11
  • Written in pure Python, tested on Python 2.7, 3.6, 3.7, 3.8, PyPy*
  • Literally serverless.
  • Similar to mongomock, but a bit more than that.

All implemented functions and operators should behave just as they do in MongoDB, even raising errors for the same causes.

Install

pip install montydb

Optional Requirements
  • lmdb (for the LMDB "lightning" storage engine)

  • pymongo (for bson)

    bson is disabled by default even when installed; set the environment variable MONTY_ENABLE_BSON=1 to enable it.
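The environment-variable gate can be illustrated with a stdlib-only sketch. Note the check shown is an assumption about how such a flag is typically consumed, not montydb's actual internals:

```python
import os

# The variable must be set before montydb is imported for it to take effect.
os.environ["MONTY_ENABLE_BSON"] = "1"

# A sketch of the kind of check performed at import time
# (illustrative only, not montydb's real code):
use_bson = os.environ.get("MONTY_ENABLE_BSON", "0") == "1"
print(use_bson)  # True
```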

Example Code

>>> from montydb import MontyClient
>>> col = MontyClient(":memory:").db.test
>>> col.insert_many([{"stock": "A", "qty": 6}, {"stock": "A", "qty": 2}])

>>> cur = col.find({"stock": "A", "qty": {"$gt": 4}})
>>> next(cur)
{'_id': ObjectId('5ad34e537e8dd45d9c61a456'), 'stock': 'A', 'qty': 6}

Development

  • Visit the project's TODO board to see what's going on.
  • Visit this issue to see what has been implemented and what has not.

Storage Engine Configurations

Configuration is only required when creating or modifying a repository.

Currently, each repository can be assigned only one storage engine.

  • Memory

Memory storage requires no configuration; nothing is saved to disk.

>>> from montydb import MontyClient
>>> client = MontyClient(":memory:")
  • FlatFile

FlatFile is the default on-disk storage engine.

>>> from montydb import MontyClient
>>> client = MontyClient("/db/repo")

FlatFile config:

[flatfile]
cache_modified: 0  # how many modified documents are cached before flushing to disk
  • LMDB (Lightning Memory-Mapped Database)

LMDB is NOT the default on-disk storage engine; it must be configured before getting a client.

Newly implemented.

>>> from montydb import set_storage, MontyClient
>>> set_storage("/db/repo", storage="lightning")
>>> client = MontyClient("/db/repo")

LMDB config:

[lightning]
map_size: 10485760  # maximum size the database may grow to
  • SQLite

SQLite is NOT the default on-disk storage engine; it must be configured before getting a client.

Pre-existing SQLite storage files saved by montydb<=1.3.0 are not readable or writable with montydb>=2.0.0.

>>> from montydb import set_storage, MontyClient
>>> set_storage("/db/repo", storage="sqlite")
>>> client = MontyClient("/db/repo")

SQLite config:

[sqlite]
journal_mode: WAL

SQLite write concern:

>>> client = MontyClient("/db/repo",
>>>                      synchronous=1,
>>>                      automatic_index=False,
>>>                      busy_timeout=5000)
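The storage config snippets above are INI-style sections. How montydb persists them on disk is an internal detail, but the format itself can be parsed with the standard library; a sketch reading from a string for the sake of the example:

```python
import configparser

# INI-style content as shown in the SQLite config above.
cfg_text = """
[sqlite]
journal_mode: WAL
"""

cfg = configparser.ConfigParser()
cfg.read_string(cfg_text)
print(cfg.get("sqlite", "journal_mode"))  # WAL
```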

MontyDB URI

You can prefix the repository path with the montydb URI scheme.

  >>> client = MontyClient("montydb:///db/repo")
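Conceptually, the scheme prefix just wraps a plain repository path. A stdlib sketch of how such a URI can be reduced back to a path (illustrative; montydb handles this internally and may differ):

```python
from urllib.parse import urlparse

def repo_path(uri):
    # "montydb:///db/repo" -> "/db/repo"; plain paths pass through unchanged.
    parsed = urlparse(uri)
    if parsed.scheme == "montydb":
        return parsed.path
    return uri

print(repo_path("montydb:///db/repo"))  # /db/repo
print(repo_path("/db/repo"))            # /db/repo
```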

Utilities

PyMongo's bson package may be required.

  • montyimport

    Imports content from an Extended JSON file into a MontyCollection instance. The JSON file could be generated from montyexport or mongoexport.

    >>> from montydb import open_repo, utils
    >>> with open_repo("foo/bar"):
    >>>     utils.montyimport("db", "col", "/path/dump.json")
  • montyexport

    Produces a JSON export of data stored in a MontyCollection instance. The JSON file could be loaded by montyimport or mongoimport.

    >>> from montydb import open_repo, utils
    >>> with open_repo("foo/bar"):
    >>>     utils.montyexport("db", "col", "/data/dump.json")
  • montyrestore

    Loads a binary database dump into a MontyCollection instance. The BSON file could be generated from montydump or mongodump.

    >>> from montydb import open_repo, utils
    >>> with open_repo("foo/bar"):
    >>>     utils.montyrestore("db", "col", "/path/dump.bson")
  • montydump

    Creates a binary export from a MontyCollection instance. The BSON file could be loaded by montyrestore or mongorestore.

    >>> from montydb import open_repo, utils
    >>> with open_repo("foo/bar"):
    >>>     utils.montydump("db", "col", "/data/dump.bson")
  • MongoQueryRecorder

    Records MongoDB query results over a period of time. Requires access to the database profiler.

    This works by filtering the database profiling data and reproducing the queries of the find and distinct commands.

    >>> from pymongo import MongoClient
    >>> from montydb.utils import MongoQueryRecorder
    >>> client = MongoClient()
    >>> recorder = MongoQueryRecorder(client["mydb"])
    >>> recorder.start()
    >>> # Make some queries or run the App...
    >>> recorder.stop()
    >>> recorder.extract()
    {<collection_1>: [<doc_1>, <doc_2>, ...], ...}
  • MontyList

    Experimental. A subclass of list that combines the common CRUD methods of Mongo's Collection and Cursor.

    >>> from montydb.utils import MontyList
    >>> mtl = MontyList([1, 2, {"a": 1}, {"a": 5}, {"a": 8}])
    >>> mtl.find({"a": {"$gt": 3}})
    MontyList([{'a': 5}, {'a': 8}])
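The query above can be approximated with a rough pure-Python sketch, for illustration only; MontyList implements far more of the Mongo query language than this:

```python
def find_gt(docs, field, value):
    # Keep only dict elements whose `field` compares greater than `value`,
    # mimicking the {"field": {"$gt": value}} query on a mixed list.
    return [d for d in docs
            if isinstance(d, dict) and field in d and d[field] > value]

print(find_gt([1, 2, {"a": 1}, {"a": 5}, {"a": 8}], "a", 3))
# [{'a': 5}, {'a': 8}]
```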

Why did I do this?

Mainly for personal skill practice and fun. I work in the VFX industry, and some of my production needs (mostly edge cases) require running in limited environments (e.g. outsourced render farms) that may have trouble running or connecting to a MongoDB instance. I found this project really helps.



Comments
  • uses ast.literal_eval() over eval()


    Static security checking of the codebase using Bandit revealed use of the insecure eval() function.

    >> Issue: [B307:blacklist] Use of possibly insecure function - consider using safer ast.literal_eval.
       Severity: Medium   Confidence: High
       Location: montydb/types/_nobson.py:163
       More Info: https://bandit.readthedocs.io/en/latest/blacklists/blacklist_calls.html#b307-eval
    162                     if not _encoder.key_is_keyword:
    163                         key = eval(candidate)
    164                         if not isinstance(key, cls._string_types):
    

    ast.literal_eval() has been implemented in its place.

    opened by madeinoz67 3
  • Dropping Python 3.4, 3.5 and adding 3.7 to CI


    Dropping Python 3.4, 3.5 tests

    In some test cases, for example:

    • test/test_engine/test_find.py

      • test_find_2
      • test_find_3
    • tests/test_engine/test_update/test_update.py

      • test_update_positional_filtered_near_conflict
      • test_update_positional_filtered_has_conflict_1
    • tests/test_engine/test_update/test_update_pull.py

      • test_update_pull_6
      • test_update_pull_7

    They often failed randomly due to runtime dict key order. Unless those test-case documents are changed into OrderedDicts, we cannot ensure that the key order fed into monty and mongo is the same (which may cause different output).

    Since this is not an issue with montydb's functionality, dropping them for good.

    Involving Python 3.7

    Well, it's 2019.

    opened by davidlatwe 3
  • Update base.py


    MutableMapping should be imported from collections.abc, as stated in the documentation: https://docs.python.org/3.9/library/collections.abc.html#collections.abc.MutableMapping

    Mainly we need it because it fixes #65 (Python 3.10 compatibility).

    opened by bobuk 2
  • Positional operator issue


    Positional operators don't work. The same update_one succeeds with pymongo. Not sure whether monty has this feature yet, but as far as I can see it raises an error.

    update_one({'users': {'$elemMatch': {'_id': id_}}}, {'$set': {'invoices.$.name': name}})

    montydb.errors.WriteError: The positional operator did not find the match needed from the query.

    bug 
    opened by rewiaca 2
  • bson.errors.InvalidBSON: objsize too large after update_one


    Getting this error when performing any operation after editing the database. Using lmdb. Guessed it came from a bad update_one, but not sure. Anyway, here is the original code that edits the db:

        record = {'free': 12313232, 'path': '/media/mnt/'}
        col = getattr(db, 'storages')
    
        record = {**models['storage'], **record}
        if col.count_documents({'path': record['path']}) > 0:
            col.update_one({'path': record['path']}, {'$set': {**record}})
        else:
            col.insert_one(record)
    

    Error text:

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/user/.local/lib/python3.8/site-packages/montydb/cursor.py", line 365, in next
        if len(self._data) or self._refresh():
      File "/home/user/.local/lib/python3.8/site-packages/montydb/cursor.py", line 354, in _refresh
        self.__query()
      File "/home/user/.local/lib/python3.8/site-packages/montydb/cursor.py", line 311, in __query
        for doc in documents:
      File "/home/user/.local/lib/python3.8/site-packages/montydb/storage/lightning.py", line 253, in <genexpr>
        docs = (self._decode_doc(doc) for doc in self._conn.iter_docs())
      File "/home/user/.local/lib/python3.8/site-packages/montydb/storage/__init__.py", line 227, in _decode_doc
        return bson.document_decode(
      File "/home/user/.local/lib/python3.8/site-packages/montydb/types/_bson.py", line 64, in document_decode
        return cls.BSON(doc).decode(codec_options)
      File "/home/user/.local/lib/python3.8/site-packages/bson/__init__.py", line 1258, in decode
        return decode(self, codec_options)
      File "/home/user/.local/lib/python3.8/site-packages/bson/__init__.py", line 970, in decode
        return _bson_to_dict(data, codec_options)
    bson.errors.InvalidBSON: objsize too large

    That's how db looks like in plain:

    $ cat db/storages.mdb @ @ ~60c9a9cf368c720edc2668a6{"path": "/mnt/hdd4", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a6"}}~60c9a9cf368c720edc2668a5{"path": "/boot/efi", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a5"}}y60c9a9cf368c720edc2668a4{"path": "/run", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a4"}}v60c9a9cf368c720edc2668a3{"path": "/", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a3"}} f*~s60c9a9cf368c720edc2668a3{"path": "/", "total": 9999, "used": 99, "free": 0, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a3"}}2~60c9a9cf368c720edc2668a6{"path": "/mnt/hdd4", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a6"}}~60c9a9cf368c720edc2668a5{"path": "/boot/efi", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a5"}}y60c9a9cf368c720edc2668a4{"path": "/run", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a4"} f*~sr60c9a9cf368c720edc2668a3{"path": "/", "total": 9999, "used": 99, "free": 0, "status": "busy", "_id": {"$oid": "60c9a9cf368c720edc2668a3"}}~60c9a9cf368c720edc2668a6{"path": "/mnt/hdd4", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a6"}}~60c9a9cf368c720edc2668a5{"path": "/boot/efi", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a5"}}y60c9a9cf368c720edc2668a4{"path": "/run", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a4"}}

    Or this:

    $ cat db/storages.mdb @ @ 0 documents 0 document 0̝φħ^sTb_id̝φħ^sTpath/media/user/ssd1totalusedfreestatusreadyr60c9a9cf368c720edc2668a3{"path": "/", "total": 9999, "used": 99, "free": 0, "status": "busy", "_id": {"$oid": "60c9a9cf368c720edc2668a3"}}~60c9a9cf368c720edc2668a6{"path": "/mnt/hdd4", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a6"}}~60c9a9cf368c720edc2668a5{"path": "/boot/efi", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a5"}}y60c9a9cf368c720edc2668a4{"path": "/run", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a4"}} 0 documents̝φħ^sTb_id̝φħ^sTpath/media/user/ssd1totalusedfreestatusreadyr60c9a9cf368c720edc2668a3{"path": "/", "total": 9999, "used": 99, "free": 0, "status": "busy", "_id": {"$oid": "60c9a9cf368c720edc2668a3"}}~60c9a9cf368c720edc2668a6{"path": "/mnt/hdd4", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a6"}}~60c9a9cf368c720edc2668a5{"path": "/boot/efi", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a5"}}y60c9a9cf368c720edc2668a4{"path": "/run", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a4"} 0 0̝φħ^sTb_id̝φħ^sTpath/media/user/ssd1totalusedfreestatusreadyr60c9a9cf368c720edc2668a3{"path": "/", "total": 9999, "used": 99, "free": 0, "status": "busy", "_id": {"$oid": "60c9a9cf368c720edc2668a3"}}~60c9a9cf368c720edc2668a6{"path": "/mnt/hdd4", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a6"}}~60c9a9cf368c720edc2668a5{"path": "/boot/efi", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": "60c9a9cf368c720edc2668a5"}}y60c9a9cf368c720edc2668a4{"path": "/run", "total": 9999, "used": 99, "free": 9900, "status": "ready", "_id": {"$oid": 
"60c9a9cf368c720edc2668a4"}}

    Does it store changes after updating?

    bug 
    opened by rewiaca 2
  • update_one and update_many creating extra records for flatfile


    Performing updates with the flat-file storage gives me duplicate documents. I'm running Python 3.8.9 and have tried both pip install montydb and pip install montydb[bson]. The problem does not occur in sqlite mode.

    Subsequent program runs after inserting a record are causing a duplicate document to be added with the same _id.

    It looks like the OrderedDict cache update at https://github.com/davidlatwe/montydb/blob/master/montydb/storage/flatfile.py#L79 is where the extra document is being added. Debugging the process shows that Python is adding a duplicate document because the keys in the ordered dict are actually different. One is an ObjectId object and the other is the binary serialized representation of that id. Here is a screenshot of the debugging output: debug output

    Here is the source code to reproduce. Note that you will have to run it twice because this only occurs on subsequent runs.

    from montydb import MontyClient, set_storage

    set_storage("./db/repo", cache_modified=0)
    client = MontyClient("./db/repo")
    coll = client.petsDB.pets

    if len([x for x in coll.find({"pet": "cat"})]) == 0:
        coll.insert_one({"pet": "cat", "domestic?": True, "climate": ["polar", "equatorial", "mountain"]})

    coll.update_one({"pet": "cat"}, {"$push": {"climate": "continental"}})
    # This should only ever print 1 on subsequent runs.
    print(len([x for x in coll.find({"pet": "cat"})]))
    
    bug 
    opened by SEary342 2
  • Find by ObjectId


    Could not find by ObjectId. Code:

    from bson.objectid import ObjectId
    
    x = collection.find()
    i = list(x)[0]['_id']
    
    y = collection.find({'_id': ObjectId(i)})
    print(list(y))
    

    I see that the database in mdb format has "_id": {"$oid": "id"}, and I tried to find the id as a string with {"_id.$oid": "string id"}, but that would not work either. Any suggestions? Thanks!

    bug 
    opened by rewiaca 2
  • `bytes` type unsupported


    Issue

    Cannot store bytes as values.

    Env

    Windows 10, Python 3.8.1, MontyDB 2.3.6

    Actual error

    >>> from montydb import MontyClient
    >>> col = MontyClient(":memory:").db.test
    >>> col.insert_one({'data': b'some bytes'})
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "C:\Users\user\.virtualenvs\name\lib\site-packages\montydb\collection.py", line 139, in insert_one
        result = self._storage.write_one(self, document)
      File "C:\Users\user\.virtualenvs\name\lib\site-packages\montydb\storage\__init__.py", line 45, in delegate
        return getattr(delegator, attr)(*args, **kwargs)
      File "C:\Users\user\.virtualenvs\name\lib\site-packages\montydb\storage\memory.py", line 120, in write_one
        self._col[b_id] = self._encode_doc(doc, check_keys)
      File "C:\Users\user\.virtualenvs\name\lib\site-packages\montydb\storage\__init__.py", line 183, in _encode_doc
        return bson.document_encode(
      File "C:\Users\user\.virtualenvs\name\lib\site-packages\montydb\types\_bson.py", line 236, in document_encode
        for s in _encoder.iterencode(doc):
      File "C:\Program Files\Python38\lib\json\encoder.py", line 431, in _iterencode
        yield from _iterencode_dict(o, _current_indent_level)
      File "C:\Program Files\Python38\lib\json\encoder.py", line 405, in _iterencode_dict
        yield from chunks
      File "C:\Program Files\Python38\lib\json\encoder.py", line 438, in _iterencode
        o = _default(o)
      File "C:\Users\user\.virtualenvs\name\lib\site-packages\montydb\types\_bson.py", line 222, in default
        return NoBSON.JSONEncoder.default(self, obj)
      File "C:\Program Files\Python38\lib\json\encoder.py", line 179, in default
        raise TypeError(f'Object of type {o.__class__.__name__} '
    TypeError: Object of type bytes is not JSON serializable
    

    The same operation with PyMongo:

    >>> from pymongo import MongoClient
    >>> col = MongoClient('127.0.0.1').tests.test1
    >>> col.insert_one({'data': b'some bytes'})
    <pymongo.results.InsertOneResult object at 0x000002BBB52BC7C0>
    >>> next(col.find())
    {'_id': ObjectId('60bdaa528ff3727b58f514f7'), 'data': b'some bytes'}
    
    bug 
    opened by strayge 2
  • GitHub Actions: Add more flake8 tests


    Instead of selecting a handful of vital tests to run, let’s run all flake8 tests ignoring only a handful of tests.

    flake8 . --ignore=E302,F401,F841,W605

    opened by cclauss 2
  • Inactive project?


    Hello guys, I find your project very interesting, but there has not been any development for quite a while. Is this project no longer under development? Kind regards

    opened by flome 2
  • $elemMatch in $elemMatch find nothing


    Hi @davidlatwe, Just discovered that monty (montydb-2.3.10) can't handle a query like:

    x = col.find({"mapping": {'$elemMatch': {'$elemMatch': {'$in': ['https://accounts.google.com/o/oauth2/aut']}}}})

    for an array-in-array element:

    "mapping": [ ["https://accounts.google.com/o/oauth2/auth", "client_id", "redirect_uri", "scope", "response_type"] ]

    It just doesn't find anything, unlike pymongo.

    bug 
    opened by rewiaca 1
  • Updating a document in an array leads to an error


    Updating a document in an array leads to an error: https://docs.mongodb.com/manual/reference/operator/update/positional/#update-documents-in-an-array

      collection.update_one(
          filter={
              "order": order_number,
              "products.product_id": product_id,
          },
          update={
              "$set": {
                  "products.$.quantity": quantity
              }
          }
      )
    

    leads to an exception:

                else:
                    # Replace "$" into matched array element index
                    matched = fieldwalker.get_matched()
    >               position = matched.split(".")[0]
    E               AttributeError: 'NoneType' object has no attribute 'split'
    \field_walker.py:586: AttributeError
    
    bug 
    opened by thasler 0
  • $slice projection does not return other fields


    When using the $slice projection together with an exclusion projection, the operation should return all the other fields in the document. https://docs.mongodb.com/manual/reference/operator/projection/slice/#behavior

    Monty DB is only returning the array that is sliced.

    bug 
    opened by thasler 1
  • Implement MongoDB aggregate


    NotImplementedError: 'MontyCollection.aggregate' is NOT implemented ! It would be awesome to have aggregate; this project is very cool nonetheless!

    epic feature 
    opened by hedrickw 0
  • Support for Mongoengine


    MontyDB with the sqlite backend provides multi-process operation, at least in my initial trials with just 2 processes writing to the database simultaneously. This is clearly an advantage over mongita, which also provides a file/memory clone of PyMongo but doesn't provide multi-process support. Mongita does support MongoEngine, which was achieved recently.

    Has anyone been able to use Montydb with Mongoengine?

    feature 
    opened by yamsu 2