Full-text search for Flask.

Overview

flask-msearch

PyPI: v0.2.9 | Python: 2/3 | License: BSD

Installation

To install flask-msearch:

pip install flask-msearch
# when MSEARCH_BACKEND = "whoosh"
pip install whoosh blinker
# when MSEARCH_BACKEND = "elasticsearch", only for 6.x.x
pip install elasticsearch==6.3.1

Alternatively, you can clone the repository and install it manually:

git clone https://github.com/honmaple/flask-msearch
cd flask-msearch
python setup.py install

Quickstart

from flask_msearch import Search
[...]
search = Search()
search.init_app(app)

# models.py
class Post(db.Model):
    __tablename__ = 'post'
    __searchable__ = ['title', 'content']

# views.py
@app.route("/search")
def w_search():
    keyword = request.args.get('keyword')
    results = Post.query.msearch(keyword,fields=['title'],limit=20).filter(...)
    # or
    results = Post.query.filter(...).msearch(keyword,fields=['title'],limit=20).filter(...)
    # elasticsearch
    keyword = "title:book AND content:read"
    # more syntax please visit https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-query-string-query.html
    results = Post.query.msearch(keyword,limit=20).filter(...)
    return ''
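
For reference, a self-contained sketch that ties the quickstart pieces together (the module layout and names here are illustrative, not prescribed by flask-msearch; it follows the Usage and Config sections below):

# app.py -- minimal runnable example; assumes flask, flask-sqlalchemy,
# whoosh and blinker are installed and the whoosh backend is used
from flask import Flask, request
from flask_sqlalchemy import SQLAlchemy
from flask_msearch import Search

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///test.db'
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = True   # required for auto indexing
app.config['MSEARCH_BACKEND'] = 'whoosh'
app.config['MSEARCH_ENABLE'] = True

db = SQLAlchemy(app)
search = Search(db=db)
search.init_app(app)

class Post(db.Model):
    __tablename__ = 'post'
    __searchable__ = ['title', 'content']

    id = db.Column(db.Integer, primary_key=True)
    title = db.Column(db.String(49))
    content = db.Column(db.Text)

with app.app_context():
    db.create_all()

@app.route("/search")
def w_search():
    keyword = request.args.get('keyword', '')
    results = Post.query.msearch(keyword, fields=['title'], limit=20).all()
    return {'ids': [p.id for p in results]}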

Config

# when backend is elasticsearch, MSEARCH_INDEX_NAME is unused
# flask-msearch uses the table name as the elasticsearch index name unless __msearch_index__ is set
MSEARCH_INDEX_NAME = 'msearch'
# simple, whoosh or elasticsearch; default is simple
MSEARCH_BACKEND = 'whoosh'
# the table's primary key if you don't want to use id; or set __msearch_primary_key__ on a specific model
MSEARCH_PRIMARY_KEY = 'id'
# auto create or update index
MSEARCH_ENABLE = True
# logger level, default is logging.WARNING
MSEARCH_LOGGER = logging.DEBUG
# SQLALCHEMY_TRACK_MODIFICATIONS must be set to True when msearch auto index is enabled
SQLALCHEMY_TRACK_MODIFICATIONS = True
# when backend is elasticsearch
ELASTICSEARCH = {"hosts": ["127.0.0.1:9200"]}
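
The per-model overrides mentioned in the comments above can be set as class attributes; a small sketch (the Article model is illustrative, not part of the library):

class Article(db.Model):
    __tablename__ = 'article'
    __searchable__ = ['title', 'body']
    # use this model's slug column instead of the global MSEARCH_PRIMARY_KEY
    __msearch_primary_key__ = 'slug'
    # custom index name (also used as the elasticsearch index name)
    __msearch_index__ = 'articles'

    slug = db.Column(db.String(64), primary_key=True)
    title = db.Column(db.String(120))
    body = db.Column(db.Text)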

Usage

from flask_msearch import Search
[...]
search = Search()
search.init_app(app)

class Post(db.Model):
    __tablename__ = 'basic_posts'
    __searchable__ = ['title', 'content']

    id = db.Column(db.Integer, primary_key=True)
    title = db.Column(db.String(49))
    content = db.Column(db.Text)

    def __repr__(self):
        return '<Post:{}>'.format(self.title)

If SQLAlchemy raises a ValueError, pass the db parameter to Search:

db = SQLAlchemy()
search = Search(db=db)

Create_index

search.create_index()
search.create_index(Post)

Update_index

search.update_index()
search.update_index(Post)
# or
search.create_index(update=True)
search.create_index(Post, update=True)

Delete_index

search.delete_index()
search.delete_index(Post)
# or
search.create_index(delete=True)
search.create_index(Post, delete=True)
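
These calls are typically wired into a one-off management command rather than run on every request; a sketch using a Flask CLI command (the command name is arbitrary, and app/search are the objects from the Quickstart):

import click

@app.cli.command('msearch-rebuild')
def msearch_rebuild():
    """Rebuild the full-text index for all models (run with: flask msearch-rebuild)."""
    search.create_index(update=True)
    click.echo('msearch index rebuilt')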

Custom Analyzer

Only for the whoosh backend.

from jieba.analyse import ChineseAnalyzer
search = Search(analyzer=ChineseAnalyzer())
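
ChineseAnalyzer here comes from the jieba package, which flask-msearch does not install for you:

pip install jieba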

Or use __msearch_analyzer__ for a specific model:

class Post(db.Model):
    __tablename__ = 'post'
    __searchable__ = ['title', 'content', 'tag.name']
    __msearch_analyzer__ = ChineseAnalyzer()

Custom index name

If you want to set a custom index name for a model:

class Post(db.Model):
    __tablename__ = 'post'
    __searchable__ = ['title', 'content', 'tag.name']
    __msearch_index__ = "post111"

Custom schema

from whoosh.fields import ID

class Post(db.Model):
    __tablename__ = 'post'
    __searchable__ = ['title', 'content', 'tag.name']
    __msearch_schema__ = {'title': ID(stored=True, unique=True), 'content': 'text'}

Note: if you use a hybrid_property, the default field type is Text unless you set a custom __msearch_schema__.
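
A sketch of the hybrid_property case the note describes (the User model is illustrative; the schema entry is only needed if you want non-default field options):

from sqlalchemy.ext.hybrid import hybrid_property
from whoosh.fields import TEXT

class User(db.Model):
    __tablename__ = 'user'
    __searchable__ = ['fullname']
    # a computed property is indexed as plain Text by default;
    # override it here only if you need different field options
    __msearch_schema__ = {'fullname': TEXT(stored=True)}

    id = db.Column(db.Integer, primary_key=True)
    first_name = db.Column(db.String(49))
    last_name = db.Column(db.String(49))

    @hybrid_property
    def fullname(self):
        return '{} {}'.format(self.first_name, self.last_name)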

Custom parser

from whoosh.qparser import MultifieldParser

class Post(db.Model):
    __tablename__ = 'post'
    __searchable__ = ['title', 'content']

    def _parser(fieldnames, schema, group, **kwargs):
        return MultifieldParser(fieldnames, schema, group=group, **kwargs)

    __msearch_parser__ = _parser

Note: only available when MSEARCH_BACKEND is whoosh.

Custom index signal

flask-msearch uses a Flask signal to update the index by default. If you want to use an asynchronous tool such as Celery to update the index, set MSEARCH_INDEX_SIGNAL:

# app.py
app.config["MSEARCH_INDEX_SIGNAL"] = celery_signal
# or use string as variable
app.config["MSEARCH_INDEX_SIGNAL"] = "modulename.tasks.celery_signal"
search = Search(app)

# tasks.py
from flask_msearch.signal import default_signal

@celery.task(bind=True)
def celery_signal_task(self, backend, sender, changes):
    default_signal(backend, sender, changes)
    return str(self.request.id)

def celery_signal(backend, sender, changes):
    return celery_signal_task.delay(backend, sender, changes)

Related index (Experimental)

For example:

class Tag(db.Model):
    __tablename__ = 'tag'

    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(49))

class Post(db.Model):
    __tablename__ = 'post'
    __searchable__ = ['title', 'content', 'tag.name']

    id = db.Column(db.Integer, primary_key=True)
    title = db.Column(db.String(49))
    content = db.Column(db.Text)

    # one to one
    tag_id = db.Column(db.Integer, db.ForeignKey('tag.id'))
    tag = db.relationship(
        Tag, backref=db.backref(
            'post', uselist=False), uselist=False)

    def __repr__(self):
        return '<Post:{}>'.format(self.title)

You must add a msearch_FUN method to the Tag model, or tag.name can't be updated automatically.

class Tag(db.Model):
    ......

    def msearch_post_tag(self, delete=False):
        from sqlalchemy import text
        sql = text('select id from post where tag_id=:tag_id')
        return {
            'attrs': [{
                'id': str(i[0]),
                'tag.name': self.name
            } for i in db.engine.execute(sql, tag_id=self.id)],
            '_index': Post
        }
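
A small usage sketch for the models above, assuming MSEARCH_ENABLE = True so the index is updated on commit:

tag = Tag(name='python')
db.session.add(Post(title='hello', content='world', tag=tag))
db.session.commit()

# the related field is indexed with the Post documents
print(Post.query.msearch('python').all())

# renaming the tag calls msearch_post_tag(), which keeps the
# 'tag.name' field of the affected Post documents in sync
tag.name = 'flask'
db.session.commit()
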
Comments
  • returns nothing no matter what keyword

    Hi, I just modified my code following the instructions in Quickstart and Config, like this:

    @course.route("/search")
    def w_search():
        keyword = request.args.get('q')
        results = Course.query.msearch(keyword,fields=['title']).first()
        return redirect(url_for('course.classes', id=results.id))
    

    However, it returns nothing no matter what keyword I use. Did I miss some other configuration? Actually, I know nothing about the role of Create_index.

    Thanks a Lot For Your Help!!!

    >>> Course.query.msearch('asdf',fields=['title']).all()
    []
    >>> Course.query.filter_by(title='asdf').all()
    [<Course u'asdf'>]
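
    Per the Create_index section above, rows that already existed before the index was built are not searchable until the index is created once; a hedged sketch of the usual fix (assuming the whoosh backend):

    search.create_index()   # index the existing Course rows once
    Course.query.msearch('asdf', fields=['title']).all()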
    
    opened by zjyfdu 9
  • Creating indexes is very slow

    My tests create new DB rows. Before I implemented the library, the tests took about 20 seconds; with the library they take about 10 minutes, probably because of the index creation. Is there a solution for this?

    opened by AdamGold 7
  • MySQL timeout with big db

    I have a big database using Flask and MySQL. It has about 1,060,000 items to search and index. I'm trying to update the search index for all the items at once using search.update_index(); however, MySQL keeps timing out halfway through. Any ideas on how to fix this?

    opened by johnroper100 6
  • Searching for keywords that exist in the database returns no data; not sure what is misconfigured

    Most of the code follows this project's README. Searches in both English and Chinese return nothing. Printing the query shows:

    results = Post.query.msearch(q, fields=['title', 'content'], limit=20)
    print(results)
    
    ==> SELECT post.id AS post_id, post.title AS post_title, post.content AS post_content FROM post WHERE null                                      
    

    Here is the full configuration:

    app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:////' + base_path +'/test.db'
    app.config.update(
        DEBUG=True,
        MSEARCH_INDEX_NAME='whoosh_index',
        MSEARCH_BACKEND='whoosh',
        MSEARCH_ENABLE=True
    )
    db = SQLAlchemy(app)
    search = Search(db=db, analyzer=ChineseAnalyzer())
    search.init_app(app)
    
    opened by seale 6
  • Autoindexing doesn't work properly

    I initialize flask_msearch using:

    search = Search()
    
    search.init_app(app)
    

    The models are defined like this:

    class Page(db.Model):
        __tablename__ = "Page"
        __searchable__ = ["title", "content"]
    
        id = db.Column(db.Integer, primary_key=True, unique=True)
        title = db.Column(db.String(120), nullable=False, unique=True)
        content = db.Column(db.Text())
       # ...
    
    class Post(db.Model):
        __tablename__ = "Post"
        __searchable__ = ["title", "content"]
    
        id = db.Column(db.Integer, primary_key=True, unique=True)
        title = db.Column(db.String(120), nullable=False)
        content = db.Column(db.Text())
    

    The search is performed like this:

            result = pages = Page.query.msearch(query, fields=["title", "content"]).all()
            result += Post.query.msearch(query, fields=["title", "content"]).all()
    

    I'm using the following configuration:

        MSEARCH_ENABLE = True
        MSEARCH_INDEX_NAME = "msearch"
        MSEARCH_BACKEND = "whoosh"
        MSEARCH_PRIMARY_KEY = "id"
    

    But unfortunately new pages or posts are only indexed after calling search.create_index(update=True). What did I do wrong?

    My packages have the following versions

    $ pip list | grep  "msearch\|blinker\|whoosh\|SQL" -i                                                                                   
    blinker             1.4      
    flask-msearch       0.2.2    
    Flask-SQLAlchemy    2.3.2    
    PyMySQL             0.9.2    
    SQLAlchemy          1.3.6    
    Whoosh              2.7.4
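
    One thing to check: the configuration shown here never sets SQLALCHEMY_TRACK_MODIFICATIONS = True, which the Config section above lists as required for auto indexing (flask-msearch updates the index from the models_committed signal, which is only sent when tracking is enabled). A sketch of the missing line:

    SQLALCHEMY_TRACK_MODIFICATIONS = True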
    
    opened by jnnkB 4
  • incorrect header check

    Hi,

    I'm sometimes getting this error with no other information:

    Error -3 while decompressing data: incorrect header check

    the code that is generating this error is:

    search.update_one_index(user)

    opened by johnroper100 4
  • [Rel #1] index SQLA hybrid properties

    This allows specifying a SQLA hybrid_property (a proxied, computed field) in the __searchable__ list so it can be indexed. Because the property is computed, the underlying schema field is limited to TEXT, and sorting by it makes no sense. See the tests for an example.

    opened by zgoda 4
  • RecursionError: maximum recursion depth exceeded

    When I initialize the module with the Search() class, I get the following error:

    Traceback (most recent call last):
      File "c:\Users\ghub4\Projects_Website\ProjectsWebsite\__init__.py", line 14, in <module>
        from ProjectsWebsite.admin import admin
      File ".\ProjectsWebsite\__init__.py", line 14, in <module>
        from ProjectsWebsite.admin import admin
      File ".\ProjectsWebsite\admin.py", line 8, in <module>
        from ProjectsWebsite.database.models import User, Article, user_datastore
      File ".\ProjectsWebsite\database\models\__init__.py", line 5, in <module>
        from ProjectsWebsite.database.models._models import *
      File ".\ProjectsWebsite\database\models\_models.py", line 143, in <module>
        search.create_index(Article)
      File "C:\Users\ghub4\Projects_Website\ProjectEnv\lib\site-packages\flask_msearch\__init__.py", line 46, in __getattr__
        return getattr(self._backend, name)
      File "C:\Users\ghub4\Projects_Website\ProjectEnv\lib\site-packages\flask_msearch\__init__.py", line 46, in __getattr__
        return getattr(self._backend, name)
      File "C:\Users\ghub4\Projects_Website\ProjectEnv\lib\site-packages\flask_msearch\__init__.py", line 46, in __getattr__
        return getattr(self._backend, name)
      [Previous line repeated 964 more times]
    RecursionError: maximum recursion depth exceeded
    

    It seems __getattr__ somehow exceeded the recursion limit. Below is my config, in case something there is causing this:

    app.config["MSEARCH_INDEX_NAME"] = mkdtemp()
    app.config["MSEARCH_PRIMARY_KEY"] = 'id'
    app.config["MSEARCH_BACKEND"] = 'whoosh'
    app.config["MSEARCH_LOGGER"] = logging.DEBUG
    
    opened by SLey3 3
  • Whoosh lock error with big site

    Hi, I am running a fairly big site with many users per day; however, this causes an issue when updating the search index because I am getting whoosh.index.LockError a lot. Obviously this is due to the high usage. How can I fix this (other than just suppressing the error, i.e. try/except)?

    opened by johnroper100 3
  • Allow indexing fields from related models

    I have two models related by a foreign key, and I would like to search data from the related model, e.g. Author and Book, so I could search by both author name and book title. I could create an artificial field that concatenates all the fields I want to be searchable, but that is inconvenient.

    This could look like, e.g.:

    class Book(db.Model):
        __searchable__ = ['title', 'summary', 'author.full_name']
    

    where author is a db.relationship based on the foreign key.

    opened by zgoda 3
  • fixed RunTime Error caused by create_index in BaseBackend

    This fixes a RuntimeError caused by the create_index() function in the BaseBackend class. When instances = model.query.enable_eagerloads(False).yield_per(yield_per) was called, there was no app context, which SQLAlchemy requires, causing a RuntimeError. Traceback:

    Traceback (most recent call last):
      File "C:\Users\ghub4\Projects_Website\ProjectEnv\lib\site-packages\sqlalchemy\util\_collections.py", line 1020, in __call__
        return self.registry[key]
    KeyError: 20444
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "c:\Users\ghub4\Projects_Website\ProjectsWebsite\__init__.py", line 14, in <module>
        from ProjectsWebsite.forms import loginForm
      File ".\ProjectsWebsite\__init__.py", line 83, in <module>
        from ProjectsWebsite.database.models import User, user_datastore
      File ".\ProjectsWebsite\database\models\__init__.py", line 5, in <module>
        from ProjectsWebsite.database.models._models import *
      File ".\ProjectsWebsite\database\models\_models.py", line 175, in <module>
        search.create_index(Article)
      File "C:\Users\ghub4\Projects_Website\ProjectEnv\lib\site-packages\flask_msearch\backends.py", line 161, in create_index
        instances = model.query.enable_eagerloads(False).yield_per(yield_per)
      File "C:\Users\ghub4\Projects_Website\ProjectEnv\lib\site-packages\flask_sqlalchemy\__init__.py", line 514, in __get__
        return type.query_class(mapper, session=self.sa.session())
      File "C:\Users\ghub4\Projects_Website\ProjectEnv\lib\site-packages\sqlalchemy\orm\scoping.py", line 78, in __call__
        return self.registry()
      File "C:\Users\ghub4\Projects_Website\ProjectEnv\lib\site-packages\sqlalchemy\util\_collections.py", line 1022, in __call__
        return self.registry.setdefault(key, self.createfunc())
      File "C:\Users\ghub4\Projects_Website\ProjectEnv\lib\site-packages\sqlalchemy\orm\session.py", line 3309, in __call__
        return self.class_(**local_kw)
      File "C:\Users\ghub4\Projects_Website\ProjectEnv\lib\site-packages\flask_sqlalchemy\__init__.py", line 136, in __init__
        self.app = app = db.get_app()
      File "C:\Users\ghub4\Projects_Website\ProjectEnv\lib\site-packages\flask_sqlalchemy\__init__.py", line 987, in get_app
        raise RuntimeError(
    RuntimeError: No application found. Either work inside a view function or push an application context. See http://flask-sqlalchemy.pocoo.org/contexts/.
    

    So I wrapped the call in with self.app.app_context():... and the module now runs as expected.

    opened by SLey3 2
  • AttributeError: 'Query' object has no attribute 'msearch'

    Hi Team, I am following the example given in the README for this repository. I am getting the following error message:

    AttributeError: 'Query' object has no attribute 'msearch'
    

    I am getting this on the following line:

    results = Post.query.msearch(keyword,fields=['title'],limit=20).filter(...)
    

    Thanks for any help on this.
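
    The usual cause of this error is issuing queries before Search has been initialized against the db; a minimal sketch following the Usage section above:

    db = SQLAlchemy(app)
    search = Search(db=db)   # pass db explicitly if init raises a ValueError
    search.init_app(app)
    # only after this does Post.query gain the msearch() method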

    opened by thisisashukla 1
  • AND and OR searches not giving any result

    keyword = "first_name:Akhil AND country:london"
    
    results = Post.query.msearch(keyword,limit=20)
    
    There is a row in my db with first_name=Akhil and country=london. It gives a rollback and does not show any result. Please help me with this; thanks in advance.
    
    opened by chaudharynirupma 1
  • Adding the backend as a property

    It would be nice to be able to access all the features of the backend in use by adding the backend property to the model so that you have these two possibilities:

    Model.query.msearch(...)                                    # Already implemented
    Model.backend.<all methods that the current backend offers> # To implement
    
    opened by BnGx 0
  • Using rank_order=True argument can be slow and can cause SQLAlchemy Operational Error

    I tried using the rank_order=True argument on my SQLite3 database index (~400k rows), and for queries that return a small number of results it works great. But if tens of thousands of rows are returned, it runs very slowly and can even fail completely at the SQLAlchemy/database level with the following error:

    sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) too many SQL variables
    

    It seems to create a huge query that looks like the following:

    [SQL: SELECT food.fdc_id AS food_fdc_id, food.data_type AS food_data_type, food.description AS food_description, food.food_category_id AS food_food_category_id, food.publication_date AS food_publication_date 
    FROM food 
    WHERE food.data_type = ? AND food.fdc_id IN (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?,...'1053481', 126716, '350165', 126717, '350173', 126718, '397064', 126719, '408685', 126720, '456184', 126721, '476338', 126722, '476496', 126723, '529344', 126724, '591595', 126725, '600635', 126726, '604648', 126727, '611206', 126728, '611218', 126729, '626396', 126730, '715414', 126731, '760876', 126732, '760892', 126733, '888874', 126734, '900938', 126735, '958545', 126736, '998071', 126737, '349959', 126738, '404281', 126739, '610910', 126740, '760506', 126741, '826782', 126742, '406192', 126743, '439251', 126744, '439253', 126745, '942754', 126746, '471345', 126747, '541084', 126748, '600501', 126749, '717922', 126750, '888832', 126751, '505267', 126752, 10, 0)]
    (Background on this error at: http://sqlalche.me/e/13/e3q8)
    

    (I snipped tens of thousands of values out of the middle of the query)

    opened by kellyjonbrazil 0
  • auto update index may cause error

    When I set MSEARCH_ENABLE = True and try to add a new record to the table, it raises an error like:

    Traceback (most recent call last):
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask/app.py", line 2463, in __call__
        return self.wsgi_app(environ, start_response)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask/app.py", line 2449, in wsgi_app
        response = self.handle_exception(e)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask/app.py", line 1866, in handle_exception
        reraise(exc_type, exc_value, tb)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
        raise value
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask/app.py", line 2446, in wsgi_app
        response = self.full_dispatch_request()
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask/app.py", line 1951, in full_dispatch_request
        rv = self.handle_user_exception(e)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask/app.py", line 1820, in handle_user_exception
        reraise(exc_type, exc_value, tb)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
        raise value
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask/app.py", line 1949, in full_dispatch_request
        rv = self.dispatch_request()
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask/app.py", line 1935, in dispatch_request
        return self.view_functions[rule.endpoint](**req.view_args)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask_login/utils.py", line 272, in decorated_view
        return func(*args, **kwargs)
      File "/Users/sucan/compNet/app/routes.py", line 287, in record_add
        db.session.commit()
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/scoping.py", line 162, in do
        return getattr(self.registry(), name)(*args, **kwargs)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1036, in commit
        self.transaction.commit()
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 510, in commit
        self.session.dispatch.after_commit(self.session)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/event/attr.py", line 322, in __call__
        fn(*args, **kw)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask_sqlalchemy/__init__.py", line 224, in after_commit
        models_committed.send(session.app, changes=list(d.values()))
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/blinker/base.py", line 267, in send
        for receiver in self.receivers_for(sender)]
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/blinker/base.py", line 267, in <listcomp>
        for receiver in self.receivers_for(sender)]
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask_msearch/backends.py", line 121, in index_signal
        return self._signal(self, sender, changes)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask_msearch/signal.py", line 42, in default_signal
        backend.create_one_index(instance)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask_msearch/whoosh_backend.py", line 190, in create_one_index
        attrs[field] = str(relation_column(instance, field.split('.')))
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/flask_msearch/backends.py", line 30, in relation_column
        _field = getattr(instance, fields[0])
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/attributes.py", line 282, in __get__
        return self.impl.get(instance_state(instance), dict_)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/attributes.py", line 710, in get
        value = self.callable_(state, passive)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/strategies.py", line 747, in _load_for_state
        session, state, primary_key_identity, passive
      File "<string>", line 1, in <lambda>
        
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/strategies.py", line 837, in _emit_lazyload
        session.query(self.mapper), primary_key_identity
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/ext/baked.py", line 612, in _load_on_pk_identity
        result = list(bq.for_session(self.session).params(**params))
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/ext/baked.py", line 444, in __iter__
        return q._execute_and_instances(context)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3411, in _execute_and_instances
        querycontext, self._connection_from_session, close_with_result=True
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3426, in _get_bind_args
        mapper=self._bind_mapper(), clause=querycontext.statement, **kw
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3404, in _connection_from_session
        conn = self.session.connection(**kw)
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1133, in connection
        execution_options=execution_options,
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1139, in _connection_for_bind
        engine, execution_options
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 408, in _connection_for_bind
        self._assert_active()
      File "/Users/sucan/compNet/new_venv/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 276, in _assert_active
        "This session is in 'committed' state; no further "
    sqlalchemy.exc.InvalidRequestError: This session is in 'committed' state; no further SQL can be emitted within this transaction.
    

    When MSEARCH_ENABLE = False is set, the error disappears, so I guess something in the index update may be causing the error?

    opened by easypickings 1