Find big moving stocks before they move using machine learning and anomaly detection

Overview

Surpriver - Find High Moving Stocks before they Move

Find high moving stocks before they move using anomaly detection and machine learning. Surpriver uses machine learning to look at volume + price action and infer unusual patterns which can result in big moves in stocks.

Files Description

Path Description
surpriver Main folder.
└  dictionaries Folder to save data dictionaries for later use.
└  figures Figures for this GitHub repository.
└  stocks List of all the stocks that you want to analyze.
data_loader.py Module for loading data from Yahoo Finance.
detection_engine.py Main module for running anomaly detection on the data and finding the stocks with the most unusual price and volume patterns.
feature_generator.py Generates price and volume return features as well as plenty of technical indicators.
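
The features described above combine simple price and volume returns with technical indicators. A minimal sketch of that style of feature construction, using the ta library (which appears among the project's imports) and a hypothetical ticker; this is not the repository's exact feature set:

import pandas as pd
import yfinance as yf
import ta

# Hypothetical example: hourly candles for a single ticker.
prices = yf.Ticker('AAPL').history(period='1mo', interval='60m')

features = pd.DataFrame(index=prices.index)
features['return_1'] = prices['Close'].pct_change()        # price return feature
features['volume_change'] = prices['Volume'].pct_change()  # volume return feature
features['rsi'] = ta.momentum.RSIIndicator(prices['Close']).rsi()
features['macd_diff'] = ta.trend.MACD(prices['Close']).macd_diff()
features = features.dropna()
print(features.tail())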

Usage

Packages

You will need to install the following packages to train and test the models.

You can install all packages using the following command. Please note that the script was written using Python 3.

pip install -r requirements.txt

Running with Docker

You can also use Docker if you are familiar with it. Here are the steps to run the tool with Docker.

  • First, build the image: docker build . -t surpriver
  • Then copy the contents of docker-compose.yml.template to a new file called docker-compose.yml
  • Replace <C:\\path\\to\\this\\dir> with the directory you are working in.
  • Run the container by executing docker-compose up -d
  • Execute any of the commands below by prepending docker exec -it surpriver to your command line.
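
For example, to run the first prediction command from the next section inside the container, the full command looks like this:

docker exec -it surpriver python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 1 --is_test 0 --future_bars 0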

Predictions for Today

If you want to go ahead and directly get the most anomalous stocks for today, you can simply run the following command to get the stocks with the most unusual patterns. We will dive deeper into the command in the following sections.

Get Most Anomalous Stocks for Today

When you do not have the data dictionary saved and you are running the tool for the first time, use:
python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 1 --is_test 0 --future_bars 0

This command will give you the top 25 stocks that had the highest anomaly score in the last 14 bars of 60-minute candles. It will also store all the data that it used to make predictions in the dictionaries/data_dict.npy file. Below is a more detailed explanation of each parameter.

  • top_n: The total number of most anomalous stocks you want to see.
  • min_volume: Filter for volume. Any stock whose average volume is lower than this value will be ignored.
  • data_granularity_minutes: Data granularity to use for analysis. The available options are 1min, 5min, 15min, 30min, 60min.
  • history_to_use: Number of historical bars to use when analyzing unusual and anomalous patterns.
  • is_save_dictionary: Whether to save the stock data that is used for analysis in a dictionary. Enabling this saves you time if you want to do further analysis on the data (see the sketch after this list).
  • data_dictionary_path: Path where the data dictionary will be stored.
  • is_load_from_dictionary: Whether to load the data from a saved dictionary or download it directly from Yahoo Finance. You can reuse the dictionary saved above across multiple runs.
  • is_test: You can test the predictions by holding out some of the most recent data as future data and checking whether the most anomalous stocks moved the most after their predictions. If this value is 1, the value of future_bars should be greater than 5.
  • future_bars: This number of bars will be held out from the recent history for testing purposes.
  • output_format: The format for results. If you pass CLI, the results will be printed to the console. If you pass JSON, a JSON file will be created with results for today's date. The default is CLI.
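
The saved dictionary is a pickled Python object inside a .npy file, so you can load it later for further analysis. A minimal sketch, assuming the file was written with NumPy's np.save at the default path used above; the exact layout of the dictionary is defined in data_loader.py:

import numpy as np

# allow_pickle=True is required because the file stores Python objects,
# not a plain numeric array.
data_dict = np.load('dictionaries/data_dict.npy', allow_pickle=True).item()

# Inspect the top-level layout before doing any further analysis.
print(type(data_dict))
print(list(data_dict)[:5])
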
When you have the data dictionary saved, you can just run the following command.
python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 1 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 0 --is_test 0 --future_bars 0 --output_format 'CLI'

Notice the change in is_save_dictionary and is_load_from_dictionary.

Here is what the output for a single prediction looks like. Please note that negative scores indicate more anomalous and unusual patterns, while positive scores indicate normal patterns. The lower the score, the better.

Last Bar Time: 2020-08-25 11:30:00-04:00
Symbol: SPI
Anomaly Score: -0.029
Today Volume (Today = Date Above): 313.94K
Average Volume 5d: 206.53K
Average Volume 20d: 334.14K
Volatility 5bars: 0.013
Volatility 20bars: 0.038
Future Absolute Sum Price Changes: 72.87
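
This sign convention matches scikit-learn's Isolation Forest, which detection_engine.py imports: its decision function returns negative values for outliers. A minimal sketch of that convention on toy data, not Surpriver's actual features or model configuration:

import numpy as np
from sklearn.ensemble import IsolationForest

# Toy feature matrix: mostly "normal" rows plus a few obvious outliers.
rng = np.random.RandomState(0)
features = np.vstack([rng.normal(0, 1, size=(100, 4)),
                      rng.normal(6, 1, size=(3, 4))])

model = IsolationForest(random_state=0).fit(features)
scores = model.decision_function(features)

# More negative scores mean more anomalous rows, matching "the lower, the better".
print(scores[:3])   # close to zero or positive: normal
print(scores[-3:])  # negative: anomalous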

Test on Historical Data

If you are skeptical about the use of machine learning and artificial intelligence in trading, you can test the predictions from this tool on historical data. The two most important command line arguments for testing are is_test and future_bars. If the former is set to 1 and the latter is set to anything more than 5, the tool will hold out that many recent bars as future data, use the data prior to them for the anomaly predictions, and then look at the held-out data to see how well the predictions did. Here is an example of a scatter plot from the following command.

Find Anomalous Stocks and Test them on Historical Data

python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 1 --is_test 1 --future_bars 25

If you have already generated the data dictionary, you can use the following command where we set is_load_from_dictionary to 1 and is_save_dictionary to 0.

python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 1 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 0 --is_test 1 --future_bars 25
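
If you want to reproduce such a scatter plot from the per-stock results yourself, here is a minimal sketch; the two arrays are placeholders standing in for the tool's output, and matplotlib is already among the project's imports:

import numpy as np
import matplotlib.pyplot as plt

# Placeholder arrays: one anomaly score and one future absolute price change
# per stock, as produced by a --is_test 1 run.
anomaly_scores = np.random.normal(0.0, 0.05, size=100)
future_abs_changes = np.abs(np.random.normal(0.0, 2.0, size=100))

plt.scatter(anomaly_scores, future_abs_changes, s=12)
plt.axvline(0, linestyle='--', color='grey')  # stocks left of this line are the anomalous ones
plt.xlabel('Anomaly score (lower = more anomalous)')
plt.ylabel('Future absolute price change')
plt.show()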

As you can see in the scatter plot produced by the tool, the anomalous stocks (score < 0) usually have a higher absolute change in the future on average. This shows that the predictions are concentrated on stocks that moved more than average in the next few hours/days. One question arises here: what if the tool is just picking the highest-volatility stocks, since those would naturally yield a high future absolute change? To show that this is not the case, here is a more detailed description of the stats you get from the above command.

--> Future Performance
Correlation between future absolute change vs anomalous score (lower is better, range = (-1, 1)): **-0.23**
Total absolute change in future for Anomalous Stocks: **89.660**
Total absolute change in future for Normal Stocks: **43.000**
Average future volatility of Anomalous Stocks: **0.332**
Average future volatility of Normal Stocks: **0.585**
Historical volatility for Anomalous Stocks: **2.528**
Historical volatility for Normal Stocks: **2.076**

You can see that the historical volatility of normal and anomalous stocks is not that different. However, the total absolute future change for anomalous stocks is roughly double that of normal stocks.
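
The headline correlation relates each stock's anomaly score to its future absolute change. A minimal sketch of computing such a statistic with NumPy (a Pearson correlation here, which may differ from the exact statistic the tool reports), with hypothetical numbers in place of the per-stock results:

import numpy as np

# Hypothetical per-stock results: anomaly score (lower = more anomalous)
# and the absolute sum of future price changes over the held-out bars.
anomaly_scores = np.array([-0.03, -0.01, 0.02, 0.05, -0.02])
future_abs_changes = np.array([4.1, 2.7, 1.2, 0.8, 3.5])

# A negative correlation means the more anomalous stocks (more negative scores)
# tended to move more afterwards.
correlation = np.corrcoef(anomaly_scores, future_abs_changes)[0, 1]
print(round(correlation, 2))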

Support for Crypto Currencies

You can now specify which data source you would like to use along with which stock list you would like to use.

python detection_engine.py --top_n 25 --min_volume 500 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/feature_dict.npy' --is_save_dictionary 1 --is_test 0 --future_bars 0  --data_source binance --stock_list cryptos.txt
  • data_source: Specifies where to get data from. The currently supported options are binance and yahoo_finance (default).
  • stock_list: The file in the stocks directory that contains the list of tickers to analyze. The default is stocks.txt.
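
The binance data source relies on the python-binance package (added to requirements.txt in the pull request described in the comments below). A minimal sketch of the kind of klines request involved, using a hypothetical symbol rather than the exact code in data_loader.py:

from binance.client import Client

# Public market data such as klines does not require API keys.
client = Client()

# Fetch 14 one-hour candles, matching the 60-minute granularity used above.
candles = client.get_klines(symbol='BTCUSDT',
                            interval=Client.KLINE_INTERVAL_1HOUR,
                            limit=14)

# Each kline is a list: open time, open, high, low, close, volume, ...
for open_time, open_, high, low, close, volume, *rest in candles:
    print(open_time, close, volume)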

Results

We will try to post the top 25 results for a single set of parameters every week.

August 31, 2020 to September 05, 2020: https://pastebin.com/L5T2BYUx

Limitations

The tool only finds stocks that show some unusual behavior in their combined price and volume action. It does not predict which direction the stock is going to move. That might be a feature I'll implement in the future, but for right now you'll need to look at the charts and do your due diligence to figure that out.

License

License: GPL v3

A product by Tradytics

Copyright (c) 2020-present, Tradytics.com

Comments
  • Remove pkg-resources

    Due to https://github.com/tradytics/surpriver/issues/15 and https://stackoverflow.com/questions/39577984/what-is-pkg-resources-0-0-0-in-output-of-pip-freeze-command

    opened by MohanVashist1 5
  • Python requirements are broken

    https://github.com/tradytics/surpriver/blob/cbcc7a372a5dd21e68fd12058863d1ba8e734db5/requirements.txt#L21

    Related stack overflow question: https://stackoverflow.com/questions/39577984/what-is-pkg-resources-0-0-0-in-output-of-pip-freeze-command

    opened by clschnei 5
  • Added Support for Crypto-Currencies list on Binance

    This PR is to support Crypto-Currencies listed on Binance. With some simple modifications you can add support for more data sources/exchanges with ease.

    List of changes:

    • requirements.txt: Included python-binance as well as the supported packages required by python-binance
    • Readme.md: Added section about how to use Binance as a data source and how to use an alternative ticker list
    • crypto_list.txt: Sample list of crypto currencies from Binance
    • detection_engine.py: Added two new arguments, the data source and the stocks list. The data source tells the program where to fetch data from (currently either yahoo_finance, the default, or binance). The stocks list is the file in the stocks folder that contains the list of tickers to use. Also added argument checking: the data source must be either 'yahoo_finance' or 'binance', and the stocks list file must exist in the stocks folder.
    • data_loader.py: Added the two new input parameters, changed how the stocks list is read so that it uses the file indicated by the stocks_list parameter, and added the ability to get data from the Binance client. Lines 188-190 fix a bug with saving the dictionary every 100 iterations: because of the continue statements, the dictionary could grow without the save check ever running, so the check was moved to immediately after an entry is added to the dictionary.
    opened by MohanVashist1 4
  • How do I fix my JSON error when it doesn't show loading data for all stocks?

    Hey everyone,

    I've been using surpriver for a little while and this was the first time that this has happened. Thanks!

    Surpriver has been initialized...
    Data engine has been initialized...
    Loading all stocks from file...
    Total number of stocks: 5693
    Technical Indicator Engine has been initialized
    Loading data for all stocks...
      0%|                                                  | 0/5693 [00:00<?, ?it/s]Exception in thread Thread-2:
    Traceback (most recent call last):
      File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
        self.run()
      File "/usr/lib/python3.8/threading.py", line 870, in run
        self._target(*self._args, **self._kwargs)
      File "/usr/local/lib/python3.8/dist-packages/multitasking/__init__.py", line 102, in _run_via_pool
        return callee(*args, **kwargs)
      File "/usr/local/lib/python3.8/dist-packages/yfinance/multi.py", line 166, in _download_one_threaded
        data = _download_one(ticker, start, end, auto_adjust, back_adjust,
      File "/usr/local/lib/python3.8/dist-packages/yfinance/multi.py", line 178, in _download_one
        return Ticker(ticker).history(period=period, interval=interval,
      File "/usr/local/lib/python3.8/dist-packages/yfinance/base.py", line 155, in history
        data = data.json()
      File "/usr/local/lib/python3.8/dist-packages/requests/models.py", line 898, in json
        return complexjson.loads(self.text, **kwargs)
      File "/usr/lib/python3/dist-packages/simplejson/__init__.py", line 518, in loads
        return _default_decoder.decode(s)
      File "/usr/lib/python3/dist-packages/simplejson/decoder.py", line 370, in decode
        obj, end = self.raw_decode(s)
      File "/usr/lib/python3/dist-packages/simplejson/decoder.py", line 400, in raw_decode
        return self.scan_once(s, idx=_w(s, idx).end())
    simplejson.errors.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
    
    
    opened by MasterHiddenTank 3
  • exec: "/usr/src/app/entry_point.sh": permission denied

    Hi,

    I am trying to run surpriver using docker. I followed all the steps in README but got stuck on the last steps. I am getting the following error when I run docker-compose up -d:

    Starting surpriver ... error                                                                                                         
                                                                                                                                         
    ERROR: for surpriver  Cannot start service surpriver: OCI runtime create failed: container_linux.go:367: starting container process caused: exec: "/usr/src/app/entry_point
    .sh": permission denied: unknown                                                                                                     
                                                                                                                                         
    ERROR: for surpriver  Cannot start service surpriver: OCI runtime create failed: container_linux.go:367: starting container process caused: exec: "/usr/src/app/entry_point
    .sh": permission denied: unknown                                                                                                     
    ERROR: Encountered errors while bringing up the project.
    

    Did anyone face the same problem and manage to get past it? Please share if so :)

    Information: Using the latest version (commit 5241766) as of today. System: Ubuntu 20.10 (codename groovy); no LSB modules are available.

    opened by ibuda 1
  • No file or directory: dictionaries/data_dict.npy

    Those who encounter this error: change the first command from this (with quotes around the dictionary path): python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 1 --is_test 0 --future_bars 0

    to this (without the quotes): python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path dictionaries/data_dict.npy --is_save_dictionary 1 --is_test 0 --future_bars 0

    opened by DoubleA11 1
  • cleaned up the requirements.txt file so that it only contains what is…

    The original requirements.txt seemed to have a few extra packages and needed some cleaning up. I created a venv and made sure to create a requirements file that contains only what is needed.

    opened by AndrewCGreen 1
  • new

    My name is Luis, I'm a big-data machine-learning developer, I'm a fan of your work, and I usually check your updates.

    I was afraid that my savings would be eaten by inflation, so I have created a powerful tool based on past technical patterns (volatility, moving averages, statistics, trends, candlesticks, support and resistance, stock index indicators): all the ones you know (RSI, MACD, STOCH, Bollinger Bands, SMA, DEMARK, Japanese candlesticks, Ichimoku, Fibonacci, Williams %R, balance of power, Murrey math, etc.) and more than 200 others.

    The tool creates prediction models of correct trading points (buy signal and sell signal; every stock is traded correctly in time and direction). For this I have used big-data tools like pandas, stock-market libraries such as tablib, TAcharts, and pandas_ta for data collection and calculation, and powerful machine-learning libraries such as sklearn RandomForest, sklearn GradientBoosting, XGBoost, Google TensorFlow, and TensorFlow LSTM.

    With models trained on a selection of the best technical indicators, the tool is able to predict trading points (where to buy, where to sell) and send real-time alerts to Telegram or mail. The points are calculated based on learning the correct trading points of the last 2 years (including the change to a bear market after the rate hike).

    I think it could be useful to you. I would like to share it with you, and if you are interested in improving it and collaborating, I am willing; if not, feel free to set it aside.

    If you want, please read the readme, and in case of any problem you can contact me. If you are convinced, try to install it following the documentation: https://github.com/Leci37/LecTrade/tree/develop. I appreciate any feedback.

    opened by Leci37 0
  • Process stopped with an error after fetching

    Same error if I run from docker and from sources

    100%|█████████████████████████████████████████████████████████████████████████████████████| 5693/5693 [10:14<00:00, 9.27it/s]
    Traceback (most recent call last):
      File "detection_engine.py", line 357, in <module>
        supriver.find_anomalies()
      File "detection_engine.py", line 196, in find_anomalies
        features, historical_price_info, future_prices, symbol_names = self.dataEngine.collect_data_for_all_tickers()
      File "/usr/src/app/data_loader.py", line 211, in collect_data_for_all_tickers
        features, historical_price_info, future_price_info, symbol_names = self.remove_bad_data(features, historical_price_info, future_price_info, symbol_names)
      File "/usr/src/app/data_loader.py", line 249, in remove_bad_data
        most_common_length = length_dictionary[0]
    IndexError: list index out of range

    opened by pavlosidelov 0
  • requests.exceptions.ConnectionError

    Hi,

    I am going through the readme section to set up and run the package with docker. I am getting the following error when I run docker-compose up -d:

    Traceback (most recent call last):
      File "urllib3/connectionpool.py", line 677, in urlopen
      File "urllib3/connectionpool.py", line 392, in _make_request
      File "http/client.py", line 1277, in request
      File "http/client.py", line 1323, in _send_request
      File "http/client.py", line 1272, in endheaders
      File "http/client.py", line 1032, in _send_output
      File "http/client.py", line 972, in send
      File "docker/transport/unixconn.py", line 43, in connect
    PermissionError: [Errno 13] Permission denied
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "requests/adapters.py", line 449, in send
      File "urllib3/connectionpool.py", line 727, in urlopen
      File "urllib3/util/retry.py", line 410, in increment
      File "urllib3/packages/six.py", line 734, in reraise
      File "urllib3/connectionpool.py", line 677, in urlopen
      File "urllib3/connectionpool.py", line 392, in _make_request
      File "http/client.py", line 1277, in request
      File "http/client.py", line 1323, in _send_request
      File "http/client.py", line 1272, in endheaders
      File "http/client.py", line 1032, in _send_output
      File "http/client.py", line 972, in send
      File "docker/transport/unixconn.py", line 43, in connect
    urllib3.exceptions.ProtocolError: ('Connection aborted.', PermissionError(13, 'Permission denied'))
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "docker/api/client.py", line 214, in _retrieve_server_version
      File "docker/api/daemon.py", line 181, in version
      File "docker/utils/decorators.py", line 46, in inner
      File "docker/api/client.py", line 237, in _get
      File "requests/sessions.py", line 543, in get
      File "requests/sessions.py", line 530, in request
      File "requests/sessions.py", line 643, in send
      File "requests/adapters.py", line 498, in send
    requests.exceptions.ConnectionError: ('Connection aborted.', PermissionError(13, 'Permission denied'))
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "docker-compose", line 3, in <module>
      File "compose/cli/main.py", line 80, in main
      File "compose/cli/main.py", line 189, in perform_command
      File "compose/cli/command.py", line 70, in project_from_options
      File "compose/cli/command.py", line 153, in get_project
      File "compose/cli/docker_client.py", line 43, in get_client
      File "compose/cli/docker_client.py", line 170, in docker_client
      File "docker/api/client.py", line 197, in __init__
      File "docker/api/client.py", line 222, in _retrieve_server_version
    docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', PermissionError(13, 'Permission denied'))
    [194412] Failed to execute script docker-compose
    

    My system is Ubuntu 20.04.2 LTS. Any recommendations/advice on how to make it run? Thank you.

    opened by ibuda 1
  • list index out of range

    After running the below:

    python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 1 --is_test 0 --future_bars 0

    Here is the error message:

    Traceback (most recent call last):
      File "detection_engine.py", line 357, in <module>
        supriver.find_anomalies()
      File "detection_engine.py", line 196, in find_anomalies
        features, historical_price_info, future_prices, symbol_names = self.dataEngine.collect_data_for_all_tickers()
      File "/Users/Home/Documents/Coding/surpriver-master/data_loader.py", line 211, in collect_data_for_all_tickers
        features, historical_price_info, future_price_info, symbol_names = self.remove_bad_data(features, historical_price_info, future_price_info, symbol_names)
      File "/Users/Home/Documents/Coding/surpriver-master/data_loader.py", line 249, in remove_bad_data
        most_common_length = length_dictionary[0]
    IndexError: list index out of range

    opened by guijames 2
  • Error while processing

    Hello, I keep getting the following error while the script is running.

    Exception __init__() got an unexpected keyword argument 'n'

    Then, when finished, it gives this error:

    Exception in thread Thread-2123:
    Traceback (most recent call last):
      File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/threading.py", line 926, in _bootstrap_inner
        self.run()
      File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/threading.py", line 870, in run
        self._target(*self._args, **self._kwargs)
      File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/site-packages/multitasking/__init__.py", line 102, in _run_via_pool
        return callee(*args, **kwargs)
      File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/site-packages/yfinance/multi.py", line 167, in _download_one_threaded
        actions, period, interval, prepost, proxy, rounding)
      File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/site-packages/yfinance/multi.py", line 182, in _download_one
        rounding=rounding, many=True)
      File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/site-packages/yfinance/base.py", line 156, in history
        data = data.json()
      File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/site-packages/requests/models.py", line 900, in json
        return complexjson.loads(self.text, **kwargs)
      File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/json/__init__.py", line 348, in loads
        return _default_decoder.decode(s)
      File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/json/decoder.py", line 337, in decode
        obj, end = self.raw_decode(s, idx=_w(s, 0).end())
      File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/json/decoder.py", line 355, in raw_decode
        raise JSONDecodeError("Expecting value", s, err.value) from None
    json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

    I'm running it on Anaconda and all of the requirements are installed. I've also tried this, in case it was a Python version issue.

    python3 detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 1 --is_test 1 --future_bars 25

    I also found these in detection_engine.py:

    # Basic libraries
    import os
    import ta  # Not working
    import sys  # Not working
    import json
    import math  # Not working
    import pickle  # Not working
    import random  # Not working
    import requests  # Not working
    import collections
    import numpy as np
    from os import walk, path  # walk: Not working
    import pandas as pd  # Not working
    import yfinance as yf  # Not working
    import datetime as dt
    from scipy.stats import linregress  # Not working
    from datetime import datetime, timedelta  # Not working
    import matplotlib.pyplot as plt
    from sklearn.ensemble import IsolationForest
    from data_loader import DataEngine
    import warnings

    opened by octoma 3