Fastquant - Backtest and optimize your trading strategies with only 3 lines of code!

Overview

fastquant πŸ€“


Bringing backtesting to the mainstream

fastquant allows you to easily backtest investment strategies with as few as 3 lines of Python code. Its goal is to promote data-driven investments by making quantitative analysis in finance accessible to everyone.

To do this type of analysis without coding, you can also try out Hawksight, which was just recently launched! πŸ˜„

If you want to interact with us directly, you can also reach us on the Hawksight discord. Feel free to ask about fastquant in the #feedback-suggestions and #bug-report channels.

Features

  1. Easily access historical stock data
  2. Backtest and optimize trading strategies with only 3 lines of code

* Both Yahoo Finance and Philippine stock data are accessible straight from fastquant

Check out our blog posts on the fastquant website and this intro article on Medium!

Installation

Python

pip install fastquant
or
python -m pip install fastquant

Get stock data

All symbols from Yahoo Finance and Philippine Stock Exchange (PSE) are accessible via get_stock_data.

Python

from fastquant import get_stock_data
df = get_stock_data("JFC", "2018-01-01", "2019-01-01")
print(df.head())

#           dt  close
#   2019-01-01  293.0
#   2019-01-02  292.0
#   2019-01-03  309.0
#   2019-01-06  323.0
#   2019-01-07  321.0

Get crypto data

The data is pulled from Binance, and all the available tickers are found here.

Python

from fastquant import get_crypto_data
crypto = get_crypto_data("BTC/USDT", "2018-12-01", "2019-12-31")
crypto.head()

#             open    high     low     close    volume
# dt                                                          
# 2018-12-01  4041.27  4299.99  3963.01  4190.02  44840.073481
# 2018-12-02  4190.98  4312.99  4103.04  4161.01  38912.154790
# 2018-12-03  4160.55  4179.00  3827.00  3884.01  49094.369163
# 2018-12-04  3884.76  4085.00  3781.00  3951.64  48489.551613
# 2018-12-05  3950.98  3970.00  3745.00  3769.84  44004.799448

Backtest trading strategies

Simple Moving Average Crossover (15 day MA vs 40 day MA)

Daily Jollibee prices from 2018-01-01 to 2019-01-01

from fastquant import backtest
backtest('smac', df, fast_period=15, slow_period=40)

# Starting Portfolio Value: 100000.00
# Final Portfolio Value: 102272.90
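Under the hood, an SMA crossover compares a fast and a slow rolling mean and goes long when the fast line crosses above the slow one. A minimal, self-contained sketch of that signal logic on synthetic prices (fastquant's actual implementation runs on backtrader; the variable names here are illustrative):

```python
import pandas as pd

# Synthetic closing prices: a flat stretch, a dip, then a rally
close = pd.Series([10.0] * 5 + [9.0] * 5 + [12.0] * 10)

fast = close.rolling(3).mean()   # fast-period moving average
slow = close.rolling(6).mean()   # slow-period moving average

# +1 while the fast MA is above the slow MA, -1 otherwise
position = (fast > slow).astype(int).replace(0, -1)

# A buy signal fires where the position flips from -1 to +1
buy_signals = position.diff() == 2
print(int(buy_signals.sum()))  # number of upside crossovers
```

With `fast_period=15, slow_period=40`, fastquant applies the same crossover idea over longer windows and simulates the resulting trades.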

Want to do this without coding at all?

If you want to make this kind of analysis even simpler without having to code at all (or want to avoid the pain of doing all of the required setup), you can sign up for free and try out Hawksight - a new no-code tool I'm building to democratize data-driven investments.

Hoping to make these kinds of powerful analyses accessible to more people!

Optimize trading strategies with automated grid search

fastquant allows you to automatically measure the performance of your trading strategy on multiple combinations of parameters. All you need to do is pass the values as iterables (such as a list or range).

Simple Moving Average Crossover (15 to 30 day MA vs 40 to 55 day MA)

Daily Jollibee prices from 2018-01-01 to 2019-01-01

from fastquant import backtest
res = backtest("smac", df, fast_period=range(15, 30, 3), slow_period=range(40, 55, 3), verbose=False)

# Optimal parameters: {'init_cash': 100000, 'buy_prop': 1, 'sell_prop': 1, 'execution_type': 'close', 'fast_period': 15, 'slow_period': 40}
# Optimal metrics: {'rtot': 0.022, 'ravg': 9.25e-05, 'rnorm': 0.024, 'rnorm100': 2.36, 'sharperatio': None, 'pnl': 2272.9, 'final_value': 102272.90}

print(res[['fast_period', 'slow_period', 'final_value']].head())

#    fast_period  slow_period  final_value
# 0           15           40    102272.90
# 1           21           40     98847.00
# 2           21           52     98796.09
# 3           24           46     98008.79
# 4           15           46     97452.92
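Conceptually, the grid search is just a cartesian product of the parameter iterables, with one backtest per combination. A hypothetical sketch of that expansion, using a stand-in scoring function instead of a real backtest:

```python
from itertools import product

# Parameter grids, as you would pass them to backtest()
fast_periods = range(15, 30, 3)   # 15, 18, 21, 24, 27
slow_periods = range(40, 55, 3)   # 40, 43, 46, 49, 52

def dummy_final_value(fast, slow):
    # Purely illustrative scoring function, not a real backtest
    return 100000 - (fast - 15) * 100 - (slow - 40) * 10

# One "backtest" per (fast, slow) combination: 5 x 5 = 25 runs
results = [
    {"fast_period": f, "slow_period": s, "final_value": dummy_final_value(f, s)}
    for f, s in product(fast_periods, slow_periods)
]
best = max(results, key=lambda r: r["final_value"])
print(len(results), best["fast_period"], best["slow_period"])
```

fastquant does the equivalent expansion internally and returns the runs sorted by performance, which is why the optimal parameters above land in the first row.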

Library of trading strategies

Strategy                                     | Alias     | Parameters
Relative Strength Index (RSI)                | rsi       | rsi_period, rsi_upper, rsi_lower
Simple moving average crossover (SMAC)       | smac      | fast_period, slow_period
Exponential moving average crossover (EMAC)  | emac      | fast_period, slow_period
Moving Average Convergence Divergence (MACD) | macd      | fast_period, slow_period, signal_period, sma_period, dir_period
Bollinger Bands                              | bbands    | period, devfactor
Buy and Hold                                 | buynhold  | N/A
Sentiment Strategy                           | sentiment | keyword, page_nums, senti
Custom Prediction Strategy                   | custom    | upper_limit, lower_limit, custom_column
Custom Ternary Strategy                      | ternary   | buy_int, sell_int, custom_column

Relative Strength Index (RSI) Strategy

backtest('rsi', df, rsi_period=14, rsi_upper=70, rsi_lower=30)

# Starting Portfolio Value: 100000.00
# Final Portfolio Value: 132967.87
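For reference, the RSI scales the ratio of average gains to average losses over `rsi_period` bars into a 0-100 range; the strategy buys below `rsi_lower` and sells above `rsi_upper`. A minimal sketch of the standard formula (using a plain rolling mean rather than Wilder's smoothing, which backtrader-based implementations typically use):

```python
import pandas as pd

close = pd.Series([44.0, 44.5, 44.2, 44.9, 45.4, 45.1, 45.8, 46.2,
                   45.9, 46.5, 46.8, 46.4, 47.0, 47.5, 47.2, 47.9])

delta = close.diff()
gain = delta.clip(lower=0).rolling(14).mean()    # average gain over 14 bars
loss = (-delta.clip(upper=0)).rolling(14).mean() # average loss over 14 bars

rs = gain / loss
rsi = 100 - 100 / (1 + rs)
print(round(rsi.iloc[-1], 1))
```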

Simple moving average crossover (SMAC) Strategy

backtest('smac', df, fast_period=10, slow_period=30)

# Starting Portfolio Value: 100000.00
# Final Portfolio Value: 95902.74

Exponential moving average crossover (EMAC) Strategy

backtest('emac', df, fast_period=10, slow_period=30)

# Starting Portfolio Value: 100000.00
# Final Portfolio Value: 90976.00

Moving Average Convergence Divergence (MACD) Strategy

backtest('macd', df, fast_period=12, slow_period=26, signal_period=9, sma_period=30, dir_period=10)

# Starting Portfolio Value: 100000.00
# Final Portfolio Value: 96229.58
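The MACD line is the difference between a fast and a slow exponential moving average, and the signal line is an EMA of the MACD itself; crossovers between the two drive the trades. A minimal sketch of those three series with pandas (synthetic data; fastquant's own version adds the sma_period/dir_period trend filter on top):

```python
import pandas as pd

close = pd.Series(range(1, 61), dtype=float)  # synthetic rising prices

# Standard MACD construction: fast EMA minus slow EMA
fast = close.ewm(span=12, adjust=False).mean()
slow = close.ewm(span=26, adjust=False).mean()
macd = fast - slow

# Signal line: EMA of the MACD line
signal = macd.ewm(span=9, adjust=False).mean()

# Bullish crossover: MACD crosses above its signal line
cross_up = (macd > signal) & (macd.shift() <= signal.shift())
print(round(macd.iloc[-1], 3))
```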

Bollinger Bands Strategy

backtest('bbands', df, period=20, devfactor=2.0)

# Starting Portfolio Value: 100000.00
# Final Portfolio Value: 97060.30
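Bollinger Bands are a rolling mean plus or minus `devfactor` rolling standard deviations; the strategy buys when price closes below the lower band and sells when it closes above the upper band. A minimal sketch of the band computation on synthetic data:

```python
import pandas as pd

close = pd.Series(range(1, 31), dtype=float)  # synthetic trending prices

period, devfactor = 20, 2.0
mid = close.rolling(period).mean()   # middle band: simple moving average
std = close.rolling(period).std()    # rolling standard deviation

upper = mid + devfactor * std        # upper band
lower = mid - devfactor * std        # lower band

# Price below the lower band -> "oversold" buy signal
buy = close < lower
print(bool(buy.any()))
```

On this smoothly trending series no close ever drops below the lower band, so no buy signal fires; real price series with sharper swings do pierce the bands.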

News Sentiment Strategy

Uses Tesla (TSLA) stock data from Yahoo Finance and news articles from Business Times

from fastquant import get_yahoo_data, get_bt_news_sentiment
data = get_yahoo_data("TSLA", "2020-01-01", "2020-07-04")
sentiments = get_bt_news_sentiment(keyword="tesla", page_nums=3)
backtest("sentiment", data, sentiments=sentiments, senti=0.2)

# Starting Portfolio Value: 100000.00
# Final Portfolio Value: 313198.37
# Note: Unfortunately, you can't reproduce this exact scenario due to inconsistencies between the dates and the sentiments scraped by get_bt_news_sentiment. To get a quick start with the News Sentiment Strategy, make the dates consistent with the sentiments you are scraping:

from fastquant import get_yahoo_data, get_bt_news_sentiment
from datetime import datetime, timedelta

# Get the current date and the date 30 days before it
current_date = datetime.now().strftime("%Y-%m-%d")
delta_date = (datetime.now() - timedelta(30)).strftime("%Y-%m-%d")
data = get_yahoo_data("TSLA", delta_date, current_date)
sentiments = get_bt_news_sentiment(keyword="tesla", page_nums=3)
backtest("sentiment", data, sentiments=sentiments, senti=0.2)

Multi Strategy

Multiple registered strategies can be combined in an OR fashion: a buy or sell signal is applied when at least one of the strategies triggers it.

df = get_stock_data("JFC", "2018-01-01", "2019-01-01")

# Utilize single set of parameters
strats = { 
    "smac": {"fast_period": 35, "slow_period": 50}, 
    "rsi": {"rsi_lower": 30, "rsi_upper": 70} 
} 
res = backtest("multi", df, strats=strats)
res.shape
# (1, 16)


# Utilize auto grid search
strats_opt = { 
    "smac": {"fast_period": 35, "slow_period": [40, 50]}, 
    "rsi": {"rsi_lower": [15, 30], "rsi_upper": 70} 
} 

res_opt = backtest("multi", df, strats=strats_opt)
res_opt.shape
# (4, 16)
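The OR combination can be sketched on plain boolean signal series: the composite buy (or sell) fires whenever at least one constituent strategy fires. A hypothetical illustration, independent of fastquant's internals:

```python
import pandas as pd

# Hypothetical per-strategy buy signals over 5 bars
smac_buy = pd.Series([False, True, False, False, False])
rsi_buy = pd.Series([False, False, False, True, False])

# Multi-strategy OR: buy when at least one strategy signals a buy
multi_buy = smac_buy | rsi_buy
print(multi_buy.tolist())  # [False, True, False, True, False]
```

The grid variant above follows the same expansion logic as the single-strategy grid search: two slow_period values times two rsi_lower values give the four rows in res_opt.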

Custom Strategy for Backtesting Machine Learning & Statistics Based Predictions

This powerful strategy allows you to backtest your own trading strategies using any type of model, with as few as 3 lines of code after the forecast!

Predictions based on any model can be used as a custom indicator to be backtested using fastquant. You just need to add a custom column to the input dataframe, and set values for upper_limit and lower_limit.

The strategy is structured similarly to RSIStrategy, where you can set an upper_limit, above which the asset is sold (considered "overbought"), and a lower_limit, below which the asset is bought (considered "oversold"). upper_limit is set to 95 by default, while lower_limit is set to 5 by default.

In the example below, we show how to use the custom strategy to backtest a custom indicator based on out-of-sample time series forecasts. The forecasts were generated using Facebook's Prophet package on Bitcoin prices.

from fastquant import get_crypto_data, backtest
from fbprophet import Prophet
import pandas as pd
from matplotlib import pyplot as plt

# Pull crypto data
df = get_crypto_data("BTC/USDT", "2019-01-01", "2020-05-31")

# Fit model on closing prices
ts = df.reset_index()[["dt", "close"]]
ts.columns = ['ds', 'y']
m = Prophet(daily_seasonality=True, yearly_seasonality=True).fit(ts)
forecast = m.make_future_dataframe(periods=0, freq='D')

# Predict and plot
pred = m.predict(forecast)
fig1 = m.plot(pred)
plt.title('BTC/USDT: Forecasted Daily Closing Price', fontsize=25)

# Convert predictions to expected 1 day returns
expected_1day_return = pred.set_index("ds").yhat.pct_change().shift(-1).multiply(100)

# Backtest the predictions, given that we buy bitcoin when the predicted next day return is > +1.5%, and sell when it's < -1.5%.
df["custom"] = expected_1day_return.multiply(-1)
backtest("custom", df.dropna(), upper_limit=1.5, lower_limit=-1.5)
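Stripped of the Prophet model, the custom strategy reduces to thresholding an indicator column: buy when the custom value falls below lower_limit, sell when it rises above upper_limit. A self-contained sketch of that thresholding with hypothetical forecast values (note the multiply(-1) flip from the example above, which makes a large positive expected return land below lower_limit):

```python
import pandas as pd

# Hypothetical expected next-day returns (%), e.g. from any forecasting model
expected_return = pd.Series([0.3, 2.0, -0.5, -2.1, 1.8, 0.1])

upper_limit, lower_limit = 1.5, -1.5
custom = expected_return.multiply(-1)  # flip sign, as in the example above

buy = custom < lower_limit    # fires when expected return > +1.5%
sell = custom > upper_limit   # fires when expected return < -1.5%
print(buy.tolist())
print(sell.tolist())
```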

See more examples here.

fastquant API

View the full list of the fastquant API here

Be part of the growing fastquant community

Want to discuss more about fastquant with other users, and our team of developers?

You can reach us on the Hawksight discord. Feel free to ask about fastquant in the #feedback-suggestions and #bug-report channels.

Run fastquant in a Docker Container

# Build the image
docker build -t myimage .

# Run the container
docker run -t -d -p 5000:5000 myimage

# Get the container id
docker ps

# SSH into the fastquant container
docker exec -it <container-id> /bin/bash

# Run python and use fastquant
python

>>> from fastquant import get_stock_data
>>> df = get_stock_data("TSLA", "2019-01-01", "2020-01-01")
>>> df.head()

   
Comments
  • [FEATURE] add multi-strategy

    [FEATURE] add multi-strategy

    Feature Suggestion

    • A backtest/trade strategy which incorporates multiple indicators

    Example:

    • Buy when Price > 20 ema > 50 ema > 100 ema AND StochasticRSI is oversold
    • Sell when close is below 20 ema

    Position Size/Risk Mgt:

    • Risk 2% of portfolio every trade
    enhancement 
    opened by ChingSlayer 16
  • Add risk management analyzers

    Add risk management analyzers

    Resolves #254

Add risk metrics to the results of backtests on trading strategies. This is an ongoing pull request for the following metrics:

1. Sharpe ratio - Measure of the statistical significance of the return on investment given a risk-free rate (package default is 1%)
    2. Drawdown - Measure of how low the asset price went before recovery relative to the overall price
    3. Drawdown for a specified time period - Measure of the above within a time period specified by the user
    opened by ghost 13
  • Backtesting functionality in R support

    Backtesting functionality in R support

    Resolve #141

Attempts in this PR aim to import all trading strategies and the main backtest function so that both R and Python have full support for them. Two approaches are top of mind; No. 1 will apply in this pull request:

1. Use the {reticulate} package to port Python functionality into R. Advantage: relies on the backtrader framework. Disadvantage: dependency on the latest PyPI release of fastquant.

2. Write native R code. Advantage: full control of the R functionality. Disadvantage: rewriting something that can replace backtrader from scratch.

    It is possible that we could implement No. 2 in a future PR.

    opened by ghost 13
  • How to pass Highs/Lows from DF to a Strategy?

    How to pass Highs/Lows from DF to a Strategy?

    Hey!

Love the tool, just started using it. I see a lot of examples of how to apply indicators to the closing price in the strategies.py file, but I want to build a strategy using the PSAR, which requires the High and Low from the passed-in DF as well. In the strategies.py file, there are a couple of format mapping examples, "c" and "cv", for (I'm assuming) the "close" price, or whatever price is in the DF being passed in. But how can one also include the highs and lows, or basically the entire df?

    I'm assuming we'd have to create our own format mapping for "ohlcv" in the strategies.py file, and then also add to the:

        self.dataclose = self.datas[0].close
        self.dataopen = self.datas[0].open
    

    line to include highs and lows and volumes? But how would we go about doing this?

    Thanks!

    enhancement 
    opened by windowshopr 13
  • Cache

    Cache

@enzoampil I added a cache of all PSE companies from Jan 2010 to now, which lives in /data/merged_stock_data.zip (size ~4MB) and is based on the newly added function fastquant.update_pse_data_cache, which in turn uses get_pse_data_old. get_pse_data now loads the cache, then downloads and appends only data newer than the cache. My problem is that the cache and every new phisix query have columns that match on dt and close only:

    data = get_stock_data(symbol="JFC", 
                        start_date="2018-01-01", 
                        end_date="2020-04-16", 
                        source="phisix",
                        format="dohlcv",
                       )
    data.tail()
    

    dt         | open  | high  | low   | close | volume
    2020-04-08 | 111.0 | 120.0 | 108.1 | 120.0 | NaN
    2020-04-13 | 121.0 | 136.1 | 121.0 | 135.0 | NaN
    2020-04-14 | 139.9 | 148.0 | 139.8 | 146.5 | NaN
    2020-04-15 | NaN   | NaN   | NaN   | 148.6 | 3944620.0
    2020-04-16 | NaN   | NaN   | NaN   | 141.5 | 2318510.0

    where 2020-04-14 is the last entry in cache.

Should we set format="dc" as the new default, instead of the previous format="dcv"? The advantage of having a cache is an enormous speedup (the query above took ~2 seconds), though we have to update the cache (takes ~3 minutes), say, once a month so that new queries won't run longer than 30 iterations. Do we lose a lot if we ignore volume?

    EDIT: I remember strategies require data format=DCV. :( Is volume here necessary or optional?

    opened by jpdeleon 13
  • [FEATURE] Dividends

    [FEATURE] Dividends

    I am not quite sure what happens to the dividends that one would get while holding the share in backtesting.

    Maybe there should be an option for reinvesting or adding it to the account balance.

    If this feature is already present, I am sorry.

    enhancement 
    opened by timweissenfels 12
  • [BUG] Error when running the get_pse_data from lesson 1 (fresh installation)

    [BUG] Error when running the get_pse_data from lesson 1 (fresh installation)

    Problem description

    I'm unable to run the "three line code" from lesson 1

    Initially, installed fastquant in Anaconda terminal using

    pip install git+git://github.com/enzoampil/fastquant.git
    

    Proceeded to open my Jupyter notebook to run the following codes

    from fastquant import get_pse_data
    df = get_pse_data('JFC', '2018-01-01', '2019-01-01')
    df.head()
    

    Error showed


    AssertionError                            Traceback (most recent call last)
          1 from fastquant import get_pse_data
    ----> 2 df = get_pse_data('JFC', '2018-01-01', '2019-01-01')
          3 df.head()

    C:\ProgramData\Anaconda3\lib\site-packages\fastquant\fastquant.py in get_pse_data(symbol, start_date, end_date, save, max_straight_nones, format)
        404         )
        405     else:
    --> 406         cache = get_pse_data_cache(symbol=symbol)
        407         cache = cache.reset_index()
        408         # oldest_date = cache["dt"].iloc[0]

    C:\ProgramData\Anaconda3\lib\site-packages\fastquant\fastquant.py in get_pse_data_cache(symbol, cache_fp, update, verbose)
        303             print("Loaded: ", cache_fp)
        304     errmsg = "Cache does not exist! Try update=True"
    --> 305     assert cache_fp.exists(), errmsg
        306     df = pd.read_csv(cache_fp, index_col=0, header=[0, 1])
        307     df.index = pd.to_datetime(df.index)

    AssertionError: Cache does not exist! Try update=True


    Downloaded the "data" folder from fastquant repo and pasted in my installation directory

    C:\ProgramData\Anaconda3\Lib\site-packages\fastquant

    Re-ran my Jupyter notebook and re-ran

    from fastquant import get_pse_data
    df = get_pse_data('JFC', '2018-01-01', '2019-01-01')
    df.head()
    

    Same error message.

    bug 
    opened by alexismanalo 12
  • [BUG] Cannot install fastquant using

    [BUG] Cannot install fastquant using "pip install"

    Problem description:

I tried to install fastquant using pip install in a virtual environment and it resulted in an error message

    Example

    C:\Users\User>fastquant\Scripts\activate.bat

    (fastquant) C:\Users\User>pip install fastquant Collecting fastquant Using cached fastquant-0.1.3.23-py3-none-any.whl (5.3 MB) Collecting certifi==2019.11.28 Using cached certifi-2019.11.28-py2.py3-none-any.whl (156 kB) Collecting requests==2.22.0 Downloading requests-2.22.0-py2.py3-none-any.whl (57 kB) |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 57 kB 270 kB/s Collecting pandas==1.0.3 Downloading pandas-1.0.3.tar.gz (5.0 MB) |β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 5.0 MB 1.1 MB/s Installing build dependencies ... error ERROR: Command errored out with exit status 1: command: 'c:\users\user\fastquant\scripts\python.exe' 'c:\users\user\fastquant\lib\site-packages\pip' install --ignore-installed --no-user --prefix 'C:\Users\User\AppData\Local\Temp\pip-build-env-s9ese2uo\overlay' --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- setuptools wheel 'Cython>=0.29.13' 'numpy==1.13.3; python_version=='"'"'3.6'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.14.5; python_version>='"'"'3.7'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.16.0; python_version=='"'"'3.6'"'"' and platform_system=='"'"'AIX'"'"'' 'numpy==1.16.0; python_version>='"'"'3.7'"'"' and platform_system=='"'"'AIX'"'"'' cwd: None Complete output (270 lines): Ignoring numpy: markers 'python_version == "3.6" and platform_system != "AIX"' don't match your environment Ignoring numpy: markers 'python_version == "3.6" and platform_system == "AIX"' don't match your environment Ignoring numpy: markers 'python_version >= "3.7" and platform_system == "AIX"' don't match your environment Collecting setuptools Using cached setuptools-50.3.0-py3-none-any.whl (785 kB) Collecting wheel Using cached wheel-0.35.1-py2.py3-none-any.whl (33 kB) Collecting Cython>=0.29.13 Using cached Cython-0.29.21-py2.py3-none-any.whl (974 kB) Collecting numpy==1.14.5 Downloading 
numpy-1.14.5.zip (4.9 MB) Using legacy 'setup.py install' for numpy, since package 'wheel' is not installed. Installing collected packages: setuptools, wheel, Cython, numpy Running setup.py install for numpy: started Running setup.py install for numpy: finished with status 'error' ERROR: Command errored out with exit status 1: command: 'c:\users\user\fastquant\scripts\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\Users\User\AppData\Local\Temp\pip-install-r5y02nek\numpy\setup.py'"'"'; file='"'"'C:\Users\User\AppData\Local\Temp\pip-install-r5y02nek\numpy\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(file);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' install --record 'C:\Users\User\AppData\Local\Temp\pip-record-blet8ru4\install-record.txt' --single-version-externally-managed --prefix 'C:\Users\User\AppData\Local\Temp\pip-build-env-s9ese2uo\overlay' --compile --install-headers 'C:\Users\User\AppData\Local\Temp\pip-build-env-s9ese2uo\overlay\include\site\python3.9\numpy' cwd: C:\Users\User\AppData\Local\Temp\pip-install-r5y02nek\numpy
    Complete output (249 lines): Running from numpy source directory.

      Note: if you need reliable uninstall behavior, then install
      with pip instead of using `setup.py install`:
    
        - `pip install .`       (from a git repo or downloaded source
                                 release)
        - `pip install numpy`   (last NumPy release on PyPi)
    
    
      C:\Users\User\AppData\Local\Temp\pip-install-r5y02nek\numpy\numpy\distutils\misc_util.py:464: SyntaxWarning: "is" with a literal. Did you mean "=="?
        return is_string(s) and ('*' in s or '?' is s)
      blas_opt_info:
      blas_mkl_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries mkl_rt not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
        NOT AVAILABLE
    
      blis_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries blis not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
        NOT AVAILABLE
    
      openblas_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries openblas not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
      get_default_fcompiler: matching types: '['gnu', 'intelv', 'absoft', 'compaqv', 'intelev', 'gnu95', 'g95', 'intelvem', 'intelem', 'flang']'
      customize GnuFCompiler
      Could not locate executable g77
      Could not locate executable f77
      customize IntelVisualFCompiler
      Could not locate executable ifort
      Could not locate executable ifl
      customize AbsoftFCompiler
      Could not locate executable f90
      customize CompaqVisualFCompiler
      Could not locate executable DF
      customize IntelItaniumVisualFCompiler
      Could not locate executable efl
      customize Gnu95FCompiler
      Could not locate executable gfortran
      Could not locate executable f95
      customize G95FCompiler
      Could not locate executable g95
      customize IntelEM64VisualFCompiler
      customize IntelEM64TFCompiler
      Could not locate executable efort
      Could not locate executable efc
      customize PGroupFlangCompiler
      Could not locate executable flang
      don't know how to compile Fortran code on platform 'nt'
        NOT AVAILABLE
    
      atlas_3_10_blas_threads_info:
      Setting PTATLAS=ATLAS
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries tatlas not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
        NOT AVAILABLE
    
      atlas_3_10_blas_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries satlas not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
        NOT AVAILABLE
    
      atlas_blas_threads_info:
      Setting PTATLAS=ATLAS
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries ptf77blas,ptcblas,atlas not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
        NOT AVAILABLE
    
      atlas_blas_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries f77blas,cblas,atlas not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
        NOT AVAILABLE
    
      C:\Users\User\AppData\Local\Temp\pip-install-r5y02nek\numpy\numpy\distutils\system_info.py:624: UserWarning:
          Atlas (http://math-atlas.sourceforge.net/) libraries not found.
          Directories to search for the libraries can be specified in the
          numpy/distutils/site.cfg file (section [atlas]) or by setting
          the ATLAS environment variable.
        self.calc_info()
      blas_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries blas not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
        NOT AVAILABLE
    
      C:\Users\User\AppData\Local\Temp\pip-install-r5y02nek\numpy\numpy\distutils\system_info.py:624: UserWarning:
          Blas (http://www.netlib.org/blas/) libraries not found.
          Directories to search for the libraries can be specified in the
          numpy/distutils/site.cfg file (section [blas]) or by setting
          the BLAS environment variable.
        self.calc_info()
      blas_src_info:
        NOT AVAILABLE
    
      C:\Users\User\AppData\Local\Temp\pip-install-r5y02nek\numpy\numpy\distutils\system_info.py:624: UserWarning:
          Blas (http://www.netlib.org/blas/) sources not found.
          Directories to search for the sources can be specified in the
          numpy/distutils/site.cfg file (section [blas_src]) or by setting
          the BLAS_SRC environment variable.
        self.calc_info()
        NOT AVAILABLE
    
      'svnversion' is not recognized as an internal or external command,
      operable program or batch file.
      'svnversion' is not recognized as an internal or external command,
      operable program or batch file.
      non-existing path in 'numpy\\distutils': 'site.cfg'
      F2PY Version 2
      lapack_opt_info:
      lapack_mkl_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries mkl_rt not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
        NOT AVAILABLE
    
      openblas_lapack_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries openblas not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
        NOT AVAILABLE
    
      openblas_clapack_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries openblas,lapack not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
        NOT AVAILABLE
    
      atlas_3_10_threads_info:
      Setting PTATLAS=ATLAS
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries tatlas,tatlas not found in c:\users\user\fastquant\lib
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries lapack_atlas not found in c:\users\user\fastquant\lib
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries tatlas,tatlas not found in C:\
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries lapack_atlas not found in C:\
      <class 'numpy.distutils.system_info.atlas_3_10_threads_info'>
        NOT AVAILABLE
    
      atlas_3_10_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries satlas,satlas not found in c:\users\user\fastquant\lib
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries lapack_atlas not found in c:\users\user\fastquant\lib
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries satlas,satlas not found in C:\
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries lapack_atlas not found in C:\
      <class 'numpy.distutils.system_info.atlas_3_10_info'>
        NOT AVAILABLE
    
      atlas_threads_info:
      Setting PTATLAS=ATLAS
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries ptf77blas,ptcblas,atlas not found in c:\users\user\fastquant\lib
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries lapack_atlas not found in c:\users\user\fastquant\lib
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries ptf77blas,ptcblas,atlas not found in C:\
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries lapack_atlas not found in C:\
      <class 'numpy.distutils.system_info.atlas_threads_info'>
        NOT AVAILABLE
    
      atlas_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries f77blas,cblas,atlas not found in c:\users\user\fastquant\lib
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries lapack_atlas not found in c:\users\user\fastquant\lib
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries f77blas,cblas,atlas not found in C:\
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries lapack_atlas not found in C:\
      <class 'numpy.distutils.system_info.atlas_info'>
        NOT AVAILABLE
    
      lapack_info:
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      customize MSVCCompiler
        libraries lapack not found in ['c:\\users\\user\\fastquant\\lib', 'C:\\']
        NOT AVAILABLE
    
      C:\Users\User\AppData\Local\Temp\pip-install-r5y02nek\numpy\numpy\distutils\system_info.py:624: UserWarning:
          Lapack (http://www.netlib.org/lapack/) libraries not found.
          Directories to search for the libraries can be specified in the
          numpy/distutils/site.cfg file (section [lapack]) or by setting
          the LAPACK environment variable.
        self.calc_info()
      lapack_src_info:
        NOT AVAILABLE
    
      C:\Users\User\AppData\Local\Temp\pip-install-r5y02nek\numpy\numpy\distutils\system_info.py:624: UserWarning:
          Lapack (http://www.netlib.org/lapack/) sources not found.
          Directories to search for the sources can be specified in the
          numpy/distutils/site.cfg file (section [lapack_src]) or by setting
          the LAPACK_SRC environment variable.
        self.calc_info()
        NOT AVAILABLE
    
      C:\Users\User\AppData\Local\Programs\Python\Python39\lib\distutils\dist.py:274: UserWarning: Unknown distribution option: 'define_macros'
        warnings.warn(msg)
      running install
      running build
      running config_cc
      unifing config_cc, config, build_clib, build_ext, build commands --compiler options
      running config_fc
      unifing config_fc, config, build_clib, build_ext, build commands --fcompiler options
      running build_src
      build_src
      building py_modules sources
      creating build
      creating build\src.win-amd64-3.9
      creating build\src.win-amd64-3.9\numpy
      creating build\src.win-amd64-3.9\numpy\distutils
      building library "npymath" sources
      No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils
      error: Microsoft Visual C++ 14.0 is required. Get it with "Build Tools for Visual Studio": https://visualstudio.microsoft.com/downloads/
      ----------------------------------------
    

    ERROR: Command errored out with exit status 1: 'c:\users\user\fastquant\scripts\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\Users\User\AppData\Local\Temp\pip-install-r5y02nek\numpy\setup.py'"'"'; file='"'"'C:\Users\User\AppData\Local\Temp\pip-install-r5y02nek\numpy\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(file);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' install --record 'C:\Users\User\AppData\Local\Temp\pip-record-blet8ru4\install-record.txt' --single-version-externally-managed --prefix 'C:\Users\User\AppData\Local\Temp\pip-build-env-s9ese2uo\overlay' --compile --install-headers 'C:\Users\User\AppData\Local\Temp\pip-build-env-s9ese2uo\overlay\include\site\python3.9\numpy' Check the logs for full command output.

    ERROR: Command errored out with exit status 1: 'c:\users\user\fastquant\scripts\python.exe' 'c:\users\user\fastquant\lib\site-packages\pip' install --ignore-installed --no-user --prefix 'C:\Users\User\AppData\Local\Temp\pip-build-env-s9ese2uo\overlay' --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- setuptools wheel 'Cython>=0.29.13' 'numpy==1.13.3; python_version=='"'"'3.6'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.14.5; python_version>='"'"'3.7'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.16.0; python_version=='"'"'3.6'"'"' and platform_system=='"'"'AIX'"'"'' 'numpy==1.16.0; python_version>='"'"'3.7'"'"' and platform_system=='"'"'AIX'"'"'' Check the logs for full command output.

    Environment

    • platform: Windows 10
    • fastquant version:
    • installation method: pip
    bug 
    opened by ednicole12 10
  • [BUG] Unable to pull data for a longer date range

    [BUG] Unable to pull data for a longer date range

    Tried downloading 2010 - 2020 data for JFC using the code below.

    from fastquant import get_stock_data
    data = get_stock_data("JFC", "2010-01-01", "2020-01-01")
    

    Error Message:

    9it [00:08,  1.06s/it]
    
    Symbol {} not found in phisix after the first {} date iterations!
    
    
    [*********************100%***********************]  1 of 1 completed
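    Until this is fixed, a possible workaround (a sketch, assuming the failure comes from requesting too many dates in one phisix query) is to split the range into smaller chunks and fetch them separately. The `split_date_range` helper below is hypothetical, not part of fastquant:

```python
from datetime import date, timedelta

def split_date_range(start, end, max_days=365):
    """Split [start, end) into consecutive chunks of at most max_days days."""
    chunks = []
    cur = start
    while cur < end:
        nxt = min(cur + timedelta(days=max_days), end)
        chunks.append((cur, nxt))
        cur = nxt
    return chunks

# A decade split into roughly one-year requests; each chunk could then be
# fetched with get_stock_data and the results concatenated with pd.concat.
chunks = split_date_range(date(2010, 1, 1), date(2020, 1, 1))
```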
    
    
    
    
    bug 
    opened by benjcabalona1029 10
  • [FEATURE] Add stop loss and take profit as general `backtest` parameters

    [FEATURE] Add stop loss and take profit as general `backtest` parameters

    Currently, backtest does not allow users to specify the target profit / stop loss at which positions are exited. This can also be done at the strategy level, but I also realized (from feedback from one of our users) that this is a parameter that is likely generalizable across strategies.
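    As a sketch of the generalizable exit logic (the parameter names below are hypothetical, not fastquant's actual API), a position would be closed once the price moves past either threshold, both expressed as fractions of the entry price:

```python
def should_exit(entry_price, current_price, stop_loss=0.05, take_profit=0.10):
    """Return True once the price has moved past the stop-loss or
    take-profit threshold, both expressed as fractions of entry price."""
    change = (current_price - entry_price) / entry_price
    return change <= -stop_loss or change >= take_profit
```

    In the proposed design these two numbers would be passed once to `backtest` and applied uniformly, regardless of which strategy generated the entry signal.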

    enhancement 
    opened by enzoampil 9
  • [FEATURE] implement smart caching for stock data

    [FEATURE] implement smart caching for stock data

    We need to implement smart caching for get_pse_data, similar to load_disclosures. The former only checks for an exact match on the filename of saved stock data before loading; otherwise it re-downloads everything from scratch, even if there is only a 1-day difference between the old and new query. The latter finds any saved disclosures data for the company and appends older and/or newer data depending on the query, so no data is downloaded twice.
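    A sketch of the intended behavior (the function names are hypothetical; the real logic would live inside get_pse_data): load whatever is cached, download only the missing tail, and append.

```python
import pandas as pd

def smart_fetch(fetch, cache, start, end):
    """Return data for [start, end], downloading only the dates missing
    from `cache`. `fetch(start, end)` is any downloader returning a
    DataFrame indexed by date; `cache` may be None or a prior result."""
    if cache is None or cache.empty:
        return fetch(start, end)
    cached = cache.loc[start:end]
    last = cached.index.max()
    if pd.Timestamp(end) <= last:
        return cached  # query fully covered by the cache
    # download only the gap after the cached range, then append
    new = fetch(last + pd.Timedelta(days=1), end)
    return pd.concat([cached, new]).sort_index()
```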

    enhancement r-dev 
    opened by jpdeleon 8
  • [BUG] Error when installing the package

    [BUG] Error when installing the package

    Problem description

    Can't install the fastquant package

    Example

    pip install fastquant 
    

    The error is shown below:


    note: This error originates from a subprocess, and is likely not a problem with pip.
    ERROR: Failed building wheel for pandas
    Failed to build pandas
    ERROR: Could not build wheels for pandas, which is required to install pyproject.toml-based projects

    Environment

    • platform: Windows
    • installation method: pip
    • python version: 3.11
    bug 
    opened by psasin36156 2
  • new

    new

    My name is Luis. I'm a big-data machine-learning developer, a fan of your work, and I usually check your updates.

    I was afraid that my savings would be eaten by inflation, so I created a tool based on past technical patterns (volatility, moving averages, statistics, trends, candlesticks, support and resistance, stock-index indicators): all the ones you know (RSI, MACD, STOCH, Bollinger Bands, SMA, DeMark, Japanese candlesticks, Ichimoku, Fibonacci, Williams %R, balance of power, Murrey math, etc.) and more than 200 others.

    The tool builds prediction models of correct trading points (buy and sell signals, with every stock traded at the right time and in the right direction). For data collection and calculation I used big-data tools such as pandas and stock-market libraries like tablib, TAcharts, and pandas_ta, together with machine-learning libraries such as sklearn RandomForest, sklearn GradientBoosting, XGBoost, and Google TensorFlow (including LSTM models).

    With the models trained on a selection of the best technical indicators, the tool can predict trading points (where to buy, where to sell) and send real-time alerts to Telegram or email. The points are learned from the correct trading points of the last 2 years (including the switch to a bear market after the rate hike).

    I think it could be useful to you. I would like to share it with you so it can improve, and if you are interested in collaborating I am also willing; if not, feel free to file it away.

    If you want, please read the README, and contact me in case of any problem. If you are convinced, try installing it following the documentation: https://github.com/Leci37/stocks-Machine-learning-RealTime-telegram/tree/develop. I appreciate the feedback.

    enhancement 
    opened by Leci37 0
  • Error when get stock data from Thai stock

    Error when get stock data from Thai stock

    I get the error "ValueError: You are trying to merge on datetime64[ns] and datetime64[ns, Asia/Bangkok] columns. If you wish to proceed you should use pd.concat" only when I try to get stock data from the Thailand stock exchange, for example 'PTT.BK'.
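    The error happens because pandas refuses to combine a tz-naive datetime column with a tz-aware one; Yahoo Finance returns Asia/Bangkok-localized timestamps for .BK tickers. Until fastquant normalizes this internally, a workaround sketch is to strip the timezone before combining:

```python
import pandas as pd

# Two frames mirroring the error: one tz-naive, one localized to Asia/Bangkok
naive = pd.DataFrame({"a": [1, 2]},
                     index=pd.date_range("2023-01-02", periods=2))
aware = pd.DataFrame({"b": [3, 4]},
                     index=pd.date_range("2023-01-02", periods=2,
                                         tz="Asia/Bangkok"))

# Dropping the timezone makes both sides plain datetime64[ns] again
aware.index = aware.index.tz_localize(None)
merged = naive.join(aware)
```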

    opened by skproptrade 0
  • Testing with machine learning model and a data frame with pre-generated signals

    Testing with machine learning model and a data frame with pre-generated signals

    Hi all,

    I have a machine learning model that takes several technical indicators as independent variables. I have already built the model and made predictions. Can I backtest using this data via fastquant? All the data is in a pandas DataFrame.

    By the way, I am asking this question here because I am unable to get to the discord server. I am blind and the robot challenges are inaccessible.
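    In principle yes: fastquant offers a custom-strategy mode that consumes a user-supplied signal column (check the docs for the exact column name it expects). As a framework-free sketch of the idea, assuming a `signal` column holding 1 (long) / 0 (flat), the equity curve can be computed directly from the DataFrame:

```python
import pandas as pd

def signal_backtest(df, fee=0.0):
    """Equity curve for a 1/0 long/flat `signal` column, acting on the
    previous bar's signal. A sketch, not fastquant itself."""
    rets = df["close"].pct_change().fillna(0)
    pos = df["signal"].shift(1).fillna(0)   # trade on yesterday's signal
    trades = pos.diff().abs().fillna(0)     # position changes incur fees
    strat = pos * rets - trades * fee
    return (1 + strat).cumprod()

df = pd.DataFrame({"close": [100, 110, 121, 121, 133.1],
                   "signal": [1, 1, 0, 1, 1]})
equity = signal_backtest(df)
```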

    opened by pranavlal 0
  • Add token permissions for py-cli.yml

    Add token permissions for py-cli.yml

    GitHub asks users to define workflow permissions, see https://github.blog/changelog/2021-04-20-github-actions-control-permissions-for-github_token/ and https://docs.github.com/en/actions/security-guides/automatic-token-authentication#modifying-the-permissions-for-the-github_token for securing GitHub workflows against supply-chain attacks.

    The Open Source Security Foundation (OpenSSF) Scorecards also treats not setting token permissions as a high-risk issue.

    This repository has a Scorecards score of 5.1/10 with 10 being the most secure. The Token-Permissions category has a score of 0/10.

    This file was fixed automatically using the open-source tool https://github.com/step-security/secure-workflows. If you like the changes and merge them, please consider starring the repo.
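    For context, the fix amounts to declaring a minimal top-level permissions block in the workflow file; the exact scopes depend on what py-cli.yml actually needs, so the snippet below is only a sketch:

```yaml
# .github/workflows/py-cli.yml: restrict the default GITHUB_TOKEN
permissions:
  contents: read
```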

    opened by arjundashrath 0
  • [Indicator import from Pandas-ta ]

    [Indicator import from Pandas-ta ]

    In case a strategy needs to be developed with indicators that are out of scope of the Backtrader indicators, it is difficult to create a custom indicator to fulfill the criteria. Luckily the pandas-ta library covers almost all of them, and it can combine techno-funda analysis. Another request is to backtest multiple strategies on the same data concurrently and compare their performance on a plot.

    Thanks a lot for fastquant.
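    For reference, many pandas-ta style indicators can also be computed with plain pandas and attached to the price DataFrame as a custom signal column. A sketch of RSI follows (it uses a simple rolling mean, whereas pandas-ta defaults to Wilder smoothing, so values will differ slightly):

```python
import pandas as pd

def rsi(close, period=14):
    """Relative Strength Index from a close-price Series,
    using a simple rolling mean of gains and losses."""
    delta = close.diff()
    gain = delta.clip(lower=0).rolling(period).mean()
    loss = (-delta.clip(upper=0)).rolling(period).mean()
    rs = gain / loss
    return 100 - 100 / (1 + rs)
```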

    enhancement 
    opened by algopy 1
Owner
Lorenzo Ampil
co-founder & dev @ Hawksight.co | democratizing smart defi | creator of fastquant | top contributor @flipsidecrypto | πŸ‡΅πŸ‡­ based in πŸ‡ΈπŸ‡¬
Lorenzo Ampil
This is a simple backtesting framework to help you test your cryptocurrency trading. It includes a way to download and store historical crypto data and to execute a trading strategy.

You can use this simple crypto backtesting script to ensure your trading strategy is successful Minimal setup required and works well with static TP a

Andrei 154 Sep 12, 2022
Trading Strategies for Freqtrade

Freqtrade Strategies Strategies for Freqtrade, developed primarily in a partnership between @werkkrew and @JimmyNixx from the Freqtrade Discord. Use t

Bryan Chain 242 Jan 7, 2023
Providing the solutions for high-frequency trading (HFT) strategies using data science approaches (Machine Learning) on Full Orderbook Tick Data.

Modeling High-Frequency Limit Order Book Dynamics Using Machine Learning Framework to capture the dynamics of high-frequency limit order books. Overvi

Chang-Shu Chung 1.3k Jan 7, 2023
Using deep actor-critic model to learn best strategies in pair trading

Deep-Reinforcement-Learning-in-Stock-Trading Using deep actor-critic model to learn best strategies in pair trading Abstract Partially observed Markov

null 281 Dec 9, 2022
A general-purpose, flexible, and easy-to-use simulator alongside an OpenAI Gym trading environment for MetaTrader 5 trading platform (Approved by OpenAI Gym)

gym-mtsim: OpenAI Gym - MetaTrader 5 Simulator MtSim is a simulator for the MetaTrader 5 trading platform alongside an OpenAI Gym environment for rein

Mohammad Amin Haghpanah 184 Dec 31, 2022
Trading Gym is an open source project for the development of reinforcement learning algorithms in the context of trading.

Trading Gym Trading Gym is an open-source project for the development of reinforcement learning algorithms in the context of trading. It is currently

Dimitry Foures 535 Nov 15, 2022
Let Python optimize the best stop loss and take profits for your TradingView strategy.

TradingView Machine Learning TradeView is a free and open source Trading View bot written in Python. It is designed to support all major exchanges. It

Robert Roman 473 Jan 9, 2023
Technical Indicators implemented in Python only using Numpy-Pandas as Magic - Very Very Fast! Very tiny! Stock Market Financial Technical Analysis Python library . Quant Trading automation or cryptocoin exchange

MyTT Technical Indicators implemented in Python only using Numpy-Pandas as Magic - Very Very Fast! to Stock Market Financial Technical Analysis Python

dev 34 Dec 27, 2022
Theano is a Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. It can use GPUs and perform efficient symbolic differentiation.

============================================================================================================ `MILA will stop developing Theano <https:

null 9.6k Dec 31, 2022
TensorFlow implementation for Bayesian Modeling and Uncertainty Quantification for Learning to Optimize: What, Why, and How

Bayesian Modeling and Uncertainty Quantification for Learning to Optimize: What, Why, and How TensorFlow implementation for Bayesian Modeling and Unce

Shen Lab at Texas A&M University 8 Sep 2, 2022
Open-L2O: A Comprehensive and Reproducible Benchmark for Learning to Optimize Algorithms

Open-L2O This repository establishes the first comprehensive benchmark efforts of existing learning to optimize (L2O) approaches on a number of proble

VITA 161 Jan 2, 2023
OCTIS: Comparing Topic Models is Simple! A python package to optimize and evaluate topic models (accepted at EACL2021 demo track)

OCTIS : Optimizing and Comparing Topic Models is Simple! OCTIS (Optimizing and Comparing Topic models Is Simple) aims at training, analyzing and compa

MIND 478 Jan 1, 2023
Official implementation for "Symbolic Learning to Optimize: Towards Interpretability and Scalability"

Symbolic Learning to Optimize This is the official implementation for ICLR-2022 paper "Symbolic Learning to Optimize: Towards Interpretability and Sca

VITA 8 Dec 19, 2022
Sequential model-based optimization with a `scipy.optimize` interface

Scikit-Optimize Scikit-Optimize, or skopt, is a simple and efficient library to minimize (very) expensive and noisy black-box functions. It implements

Scikit-Optimize 2.5k Jan 4, 2023
An end-to-end machine learning library to directly optimize AUC loss

LibAUC An end-to-end machine learning library for AUC optimization. Why LibAUC? Deep AUC Maximization (DAM) is a paradigm for learning a deep neural n

Andrew 75 Dec 12, 2022
SparseML is a library for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models

SparseML is a toolkit that includes APIs, CLIs, scripts and libraries that apply state-of-the-art sparsification algorithms such as pruning and quantization to any neural network. General, recipe-driven approaches built around these algorithms enable the simplification of creating faster and smaller models for the ML performance community at large.

Neural Magic 1.5k Dec 30, 2022
sequitur is a library that lets you create and train an autoencoder for sequential data in just two lines of code

sequitur sequitur is a library that lets you create and train an autoencoder for sequential data in just two lines of code. It implements three differ

Jonathan Shobrook 305 Dec 21, 2022