Performance analysis of predictive (alpha) stock factors

Overview


Alphalens is a Python library for performance analysis of predictive (alpha) stock factors. Alphalens works great with the Zipline open-source backtesting library and with Pyfolio, which provides performance and risk analysis of financial portfolios. You can try Alphalens at Quantopian, a free, community-centered, hosted platform for researching and testing alpha ideas. Quantopian also offers a fully managed service for professionals that includes Zipline, Alphalens, Pyfolio, FactSet data, and more.

The main function of Alphalens is to surface the most relevant statistics and plots about an alpha factor, including:

  • Returns Analysis
  • Information Coefficient Analysis
  • Turnover Analysis
  • Grouped Analysis
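
Each of these analyses can also be run on its own. Below is a minimal sketch, assuming factor_data has already been produced by utils.get_clean_factor_and_forward_returns as shown in the Getting started section:

import alphalens

# Assumes `factor_data` was produced by
# alphalens.utils.get_clean_factor_and_forward_returns (see "Getting started").
alphalens.tears.create_returns_tear_sheet(factor_data)      # returns analysis
alphalens.tears.create_information_tear_sheet(factor_data)  # IC analysis
alphalens.tears.create_turnover_tear_sheet(factor_data)     # turnover analysis

# Grouped analysis: pass by_group=True where supported, e.g.
alphalens.tears.create_returns_tear_sheet(factor_data, by_group=True)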

Getting started

With a signal and pricing data, creating a factor "tear sheet" is a two-step process:

import alphalens

# Ingest and format data
factor_data = alphalens.utils.get_clean_factor_and_forward_returns(my_factor,
                                                                   pricing,
                                                                   quantiles=5,
                                                                   groupby=ticker_sector,
                                                                   groupby_labels=sector_names)

# Run analysis
alphalens.tears.create_full_tear_sheet(factor_data)
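
The variable names above (my_factor, pricing, ticker_sector, sector_names) are placeholders. As a rough sketch of the expected input shapes, the factor is a Series indexed by (date, asset) and the prices are a wide DataFrame of dates by assets, with price history extending beyond the last factor date so forward returns can be computed. Toy data could be built like this:

import numpy as np
import pandas as pd

# Toy data for illustration only.
dates = pd.date_range("2020-01-01", periods=60, freq="B", tz="UTC")
assets = ["AAA", "BBB", "CCC", "DDD", "EEE"]

# Wide price DataFrame: one row per date, one column per asset.
rng = np.random.default_rng(0)
pricing = pd.DataFrame(
    100 * np.cumprod(1 + 0.01 * rng.standard_normal((len(dates), len(assets))), axis=0),
    index=dates,
    columns=assets,
)

# Factor values: a Series with a (date, asset) MultiIndex. Stop a couple of
# weeks before the last price so the longest forward-return period fits.
factor_dates = dates[:-15]
factor_index = pd.MultiIndex.from_product([factor_dates, assets], names=["date", "asset"])
my_factor = pd.Series(rng.standard_normal(len(factor_index)), index=factor_index)

# Optional group mapping and labels for the grouped analysis.
ticker_sector = {"AAA": 0, "BBB": 0, "CCC": 1, "DDD": 1, "EEE": 1}
sector_names = {0: "Tech", 1: "Energy"}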

Learn more

Check out the example notebooks for more on how to read and use the factor tear sheet; the tutorial notebook in the examples directory is a good starting point.

Installation

Install with pip:

pip install alphalens

Install with conda:

conda install -c conda-forge alphalens

Install from the master branch of Alphalens repository (development code):

pip install git+https://github.com/quantopian/alphalens

Alphalens depends on:

  • matplotlib
  • numpy
  • pandas
  • scipy
  • seaborn
  • statsmodels

Usage

A good way to get started is to run the examples in a Jupyter notebook.

To get set up with an example, you can:

Run a Jupyter notebook server via:

jupyter notebook

From the notebook list page (usually found at http://localhost:8888/), navigate to the examples directory and open any file with a .ipynb extension.

Execute the code in a notebook cell by clicking on it and hitting Shift+Enter.

Questions?

If you find a bug, feel free to open an issue on our GitHub issue tracker.

Contribute

If you want to contribute, a great place to start would be the help-wanted issues.

Credits

For a full list of contributors see the contributors page.

Example Tear Sheet

Example factor courtesy of ExtractAlpha

https://github.com/quantopian/alphalens/raw/master/alphalens/examples/table_tear.png

https://github.com/quantopian/alphalens/raw/master/alphalens/examples/returns_tear.png

https://github.com/quantopian/alphalens/raw/master/alphalens/examples/ic_tear.png

Comments
  • Feature request: Use alphalens with returns instead of prices


    Is there a way to run Alphalens using my own custom input matrix of returns (i.e., custom factor-adjusted returns) rather than inputting prices (i.e., the "pricing" table in the examples) for each period?

    enhancement help wanted 
    opened by neman018 57
  • Integration with pyfolio and Quantopian's new Risk Model


    I have been thinking about the nice "Alpha decomposition" #180 feature. The information it provides is actually a small part of what is already available in pyfolio and Quantopian's new Risk Model. On one side we cannot replicate all the information provided by those two tools, but on the other side it would be great to have all that analysis without having to build an algorithm and run a backtest, something that could be integrated into Alphalens.

    So, why don't we create a function in Alphalens that builds the input required by pyfolio and Quantopian's new Risk Model? Alphalens already simulates the cumulative returns of a portfolio weighted by factor values, so we only need to format that information in a way that is compatible with the other two tools. That would be a purely theoretical analysis, but except for commissions and slippage the results would be realistic, and it would also serve as a benchmark for the algorithm (users can compare the algorithm results, after setting commission and slippage to 0, with these theoretical results and check whether they implemented the algorithm correctly).

    I haven't looked at pyfolio in detail so I don't know the specifics of the input, but if @twiecki can help me with those details I can work on this feature, and the same goes for Quantopian's new Risk Model (I don't know if that is part of pyfolio or a separate project).

    enhancement api 
    opened by luca-s 32
  •  ENH: added positions computation in 'performance.create_pyfolio_input'


    'performance.create_pyfolio_input' now computes positions too. It is also now possible to select the 'period' to be used in the benchmark computation and, for factor returns/positions, to select equal weighting instead of factor weighting.
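
    As a hedged sketch of how this looks in use (the exact signature follows the shape the feature took in later releases and may differ slightly):

    import alphalens
    import pyfolio

    # Sketch only: returns, positions and benchmark returns for pyfolio,
    # using equal weighting within the selected quantiles.
    pf_returns, pf_positions, pf_benchmark = \
        alphalens.performance.create_pyfolio_input(factor_data,
                                                   period='1D',
                                                   capital=100000,
                                                   long_short=True,
                                                   group_neutral=False,
                                                   equal_weight=True,
                                                   quantiles=[1, 5],
                                                   groups=None,
                                                   benchmark_period='1D')

    pyfolio.tears.create_full_tear_sheet(pf_returns,
                                         positions=pf_positions,
                                         benchmark_rets=pf_benchmark)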

    opened by luca-s 30
  • new api


    Here is where I'm headed with the new API, following the thoughts from https://github.com/quantopian/alphalens/pull/110. To review: creating a tear sheet would be a two-step process.

    1. format the data
    2. call for the tear sheet

    All of the tear sheets would take a factor_data dataframe that looks something like the screenshot attached to the original discussion.

    Right now the main blocking issue is the event-study-esque plots as that function requires raw prices. I think that the number of plots and the uniqueness of what they are saying probably merits them getting a separate tear sheet (which would be able to take raw prices).

    opened by jameschristopher 23
  • API: run Alphalens with returns instead of prices (utils.get_clean_factor)


    For issue #253: refactor compute_forward_returns, add a get_clean_factor API, and refactor get_clean_factor_and_forward_returns as a composition of compute_forward_returns and get_clean_factor.
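
    Under that refactor, a hedged sketch of the composition, using the my_factor and pricing placeholders from the Getting started section (argument names follow the public docs and may differ slightly between versions):

    import alphalens

    # Equivalent, after the refactor, to calling
    # get_clean_factor_and_forward_returns in one step.
    forward_returns = alphalens.utils.compute_forward_returns(my_factor,
                                                              pricing,
                                                              periods=(1, 5, 10))
    factor_data = alphalens.utils.get_clean_factor(my_factor,
                                                   forward_returns,
                                                   quantiles=5)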

    opened by HereticSK 22
  • Create sub tearsheets


    This breaks down the full tear sheet into multiple smaller ones covering: returns, information, and turnover analysis.

    This PR is aimed mainly at issue #106, so Thomas, your thoughts would be great!

    This is a first pass, so things are pretty rough, and it definitely won't pass tests yet, but I think there will be a lot of discussion, so I'm not too worried.

    opened by jameschristopher 21
  • Two Factor Interaction Development: Initial Data Structure, Test Module, and Plot


    This begins the development of the "factors_interaction_tearsheet" from issue #219. The goal of this pull request is to get feedback on whether this branch seems to be going in the right direction.

    Description of Changes

    1. Create join_factor_with_factor_and_forward_returns function
      • Creates a function complementary to get_clean_factor_and_forward_returns that joins an additional factor to the factor_data dataframe returned by get_clean_factor_and_forward_returns.
      • This new dataframe returned, call it "multi_factor_data", will be the core source/data structure providing the necessary data for the factors_interaction_tearsheet computations.
    2. Create an associated test module.
    3. Modify perf.mean_return_by_quantile to take an additional parameter so that it can group by multiple factor quantiles.
    4. Add first plotting function, plot_multi_factor_quantile_returns, to create an annotated heatmap of mean returns by two-factor quantile bins.
    5. Create the tears.create_factors_interaction_tear_sheet as the entry point to the multi-factor tearsheet.

    Requesting Feedback

    1. Comments and suggestions on the utils.join_factor_with_factor_and_forward_returns function
      1. Should there be a wrapper that builds the multi_factor_data dataframe in one step (i.e., wrap this function with get_clean_factor_and_forward_returns)?
    2. I'm not too familiar with creating effective unit tests, so any feedback on this module is appreciated.
    3. Regarding Change 3 above:
      1. My first thought, following suggestion of @luca-s, was to create a separate performance module which would contain all functions for this sort of computation.
      2. Since the existing performance module already contains a lot of the needed functionality, I thought maybe I would create a wrapper function in this new module that added the necessary functionality.
      3. However, in perf.mean_return_by_quantile, I needed to add a parameter to this function to make it work in a clean manner. Not sure how I could have done that with a wrapper.
      4. So I guess my question is, what are the community's thoughts on how I dealt with this particular issue, and also what are thoughts on related considerations going forward?
    4. Any other comments/guidance on path of development going forward is greatly appreciated.
    5. Also, let me know if there are too many changes in this pull request for efficient/easy review.
    opened by MichaelJMath 19
  • Added support for group neutral factor analysis


    I am working on a sector-neutral factor and I discovered that Alphalens analysis of a group-neutral factor is currently limited. By group-neutral factor I mean a factor intended to rank stocks within the same group, so that it makes sense to compare the performance of top vs. bottom stocks in the same group, but it doesn't make sense to compare the performance of a stock in one group with that of a stock in another group.

    The main shortcoming is the return analysis: as there is no way to demean the returns by group, the statistics shown are of little use. Also, when the plots are broken down by group, the results are not useful either, as there are two bugs:

    • The API documentation claims that perf.mean_return_by_quantile performs the demeaning at group level when splitting the plots by group, but that is not the case
    • The same goes for perf.mean_information_coefficient

    Also, the API argument names were changed in a consistent way:

    • the tears.* functions use 'long_short' and 'group_neutral' throughout, as those functions are intended to be the top-level ones. 'by_group' is used if the function supports multiple plot outputs, one for each group (see the sketch below)
    • the remaining API (mostly performance.*) keeps using 'demeaned' and 'group_adjust', as before
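
    A hedged sketch of the resulting top-level usage, assuming factor_data was built with a groupby mapping (e.g. sectors):

    import alphalens

    # Sketch only: long/short, group-neutral analysis, with plots also
    # broken down by group.
    alphalens.tears.create_full_tear_sheet(factor_data,
                                           long_short=True,
                                           group_neutral=True,
                                           by_group=True)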
    enhancement 
    opened by luca-s 19
  • ENH Pyfolio integration


    Issue #225

    Added performance.create_pyfolio_input, which creates input suitable for pyfolio. Intended use:

    factor_data = alphalens.utils.get_clean_factor_and_forward_returns(factor, prices)
    
    pf_returns = alphalens.performance.create_pyfolio_input(factor_data, '1D',
                         long_short=True,
                         group_neutral=False,
                         quantiles=None,
                         groups=None)
    
    pyfolio.tears.create_returns_tear_sheet(pf_returns)
    

    Also, I greatly improved the function that computes the cumulative returns, as that is the base on which the pyfolio integration feature is built. Mainly, I removed the legacy assumptions which required the factor to be computed at a specific frequency (daily, weekly, etc.) and the periods to be multiples of that frequency (2 days, 4 days, etc.). The function is now able to properly compute returns when there are gaps in the factor data or when we analyze an intraday factor.

    opened by luca-s 18
  • BUG: Calculate mean returns by date before mean return by quantile


    Compute the mean return by date. If the by_date flag is false, then compute and return the average of the daily mean returns (and also the standard error of these daily mean returns).

    Resolves: Issue #309
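
    A toy pandas illustration (not Alphalens code) of why the order matters when quantile sizes differ across days: averaging the per-day means first prevents days with more observations from dominating.

    import pandas as pd

    df = pd.DataFrame({
        "date": ["d1", "d1", "d1", "d2"],
        "factor_quantile": [1, 1, 2, 1],
        "1D": [0.01, 0.03, 0.02, -0.02],
    })

    # Mean return by (quantile, date) first ...
    by_date = df.groupby(["factor_quantile", "date"])["1D"].mean()
    # ... then the average of the daily means, with their standard error.
    mean_ret = by_date.groupby(level="factor_quantile").mean()
    std_err = by_date.groupby(level="factor_quantile").sem()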

    opened by MichaelJMath 17
  • Will alphalens support multi-factor models in the future?


    Alphalens is awesome! I have been using Alphalens for some time to filter several effective factors out of many candidate factors in the stock market. However, that step only gives me several effective factors independently. In practice, the more common scenario is to build a multi-factor linear model (for example the Fama-French 3-factor model), run the regression on it, and do hypothesis testing on that multi-factor model, rather than on single-factor models. Will such multi-factor models be considered for addition to Alphalens in the future?

    enhancement question 
    opened by huaiweicheng 17
  • fix 'Index' object has no attribute 'get_values' bug


    Invoking alphalens.tears.create_turnover_tear_sheet() without passing the turnover_period parameter may raise AttributeError: 'Index' object has no attribute 'get_values'. This is because utils.get_forward_returns_columns() returns the columns as an Index object instead of a pd.Series, and on recent pandas versions the Index object no longer has a get_values() method.
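
    A minimal illustration (not the Alphalens fix itself) of a version-agnostic replacement:

    import numpy as np
    import pandas as pd

    cols = pd.Index(["1D", "5D", "10D"])
    # cols.get_values()        # AttributeError on pandas >= 1.0
    values = np.asarray(cols)  # works across pandas versions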

    opened by MiaoMiaoKuangFei 0
  • importing alphalens 0.3.6 gets error "No module named 'pandas.util._decorators'"

    importing alphalens 0.3.6 gets error "No module named 'pandas.util._decorators'"

    Problem Description

    I've tried importing alphalens but encountered the error "No module named 'pandas.util._decorators'" right at the first command, import alphalens.

    Please provide a minimal, self-contained, and reproducible example:

    import alphalens
    

    Please provide the full traceback:

    ModuleNotFoundError                       Traceback (most recent call last)
    <ipython-input-21-6e4fa055c088> in <module>
    ----> 1 import alphalens
    
    e:\temp\Python36\lib\site-packages\alphalens\__init__.py in <module>
    ----> 1 from . import performance
          2 from . import plotting
          3 from . import tears
          4 from . import utils
          5 
    
    e:\temp\Python36\lib\site-packages\alphalens\performance.py in <module>
         20 from pandas.tseries.offsets import BDay
         21 from scipy import stats
    ---> 22 from statsmodels.regression.linear_model import OLS
         23 from statsmodels.tools.tools import add_constant
         24 from . import utils
    
    e:\temp\Python36\lib\site-packages\statsmodels\regression\__init__.py in <module>
    ----> 1 from .linear_model import yule_walker
          2 
          3 from statsmodels.tools._testing import PytestTester
          4 
          5 __all__ = ['yule_walker', 'test']
    
    e:\temp\Python36\lib\site-packages\statsmodels\regression\linear_model.py in <module>
         34 
         35 from statsmodels.compat.python import lrange, lzip
    ---> 36 from statsmodels.compat.pandas import Appender
         37 
         38 import numpy as np
    
    e:\temp\Python36\lib\site-packages\statsmodels\compat\__init__.py in <module>
    ----> 1 from statsmodels.tools._testing import PytestTester
          2 
          3 from .python import (
          4     PY37,
          5     asunicode, asbytes, asstr,
    
    e:\temp\Python36\lib\site-packages\statsmodels\tools\__init__.py in <module>
          1 from .tools import add_constant, categorical
    ----> 2 from statsmodels.tools._testing import PytestTester
          3 
          4 __all__ = ['test', 'add_constant', 'categorical']
          5 
    
    e:\temp\Python36\lib\site-packages\statsmodels\tools\_testing.py in <module>
          9 
         10 """
    ---> 11 from statsmodels.compat.pandas import assert_equal
         12 
         13 import os
    
    e:\temp\Python36\lib\site-packages\statsmodels\compat\pandas.py in <module>
          3 import numpy as np
          4 import pandas as pd
    ----> 5 from pandas.util._decorators import deprecate_kwarg, Appender, Substitution
          6 
          7 __all__ = ['assert_frame_equal', 'assert_index_equal', 'assert_series_equal',
    
    ModuleNotFoundError: No module named 'pandas.util._decorators'
    

    Please provide any additional information below:

    I'm using VS Code on Windows 10. I tried updating pip, pandas, numpy, and alphalens to newer versions, but still no luck. Please give me a hand with this; thanks in advance.

    Versions

    • Alphalens version: 0.3.6
    • Python version: 3.6.8
    • Pandas version: 0.18.1
    • Matplotlib version: 3.3.4
    • Numpy: 1.17.0
    • Scipy: 1.0.0
    • Statsmodels: 0.12.2
    • Zipline: 1.2.0
    opened by BinhKieu82 1
  • MissingDataError: exog contains inf or nans


    I am getting MissingDataError: exog contains inf or nans when trying to generate the returns tear sheet. The input data from get_clean_factor_and_forward_returns does not have any NaNs or infs in it. I saw mentions of this error on the Quantopian forum (https://quantopian-archive.netlify.app/forum/threads/alphalens-giving-exog-contains-inf-or-nans.html) and in a couple of other places, but no solution. Any thoughts? Thank you!
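
    A quick sanity check (not an Alphalens API), assuming factor_data came from get_clean_factor_and_forward_returns, since the OLS regression behind the alpha/beta computation raises this error when it sees non-finite values:

    import numpy as np

    numeric = factor_data.select_dtypes(include=[np.number])
    bad_rows = numeric[~np.isfinite(numeric).all(axis=1)]
    print(len(bad_rows), "rows contain NaN or inf")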

    opened by sokol11 1
  • Alphalens


    Problem Description

    UnboundLocalError: local variable 'period_len' referenced before assignment, line 319

    days_diffs = []
    for i in range(30):
        if i >= len(forward_returns.index):
            break
        p_idx = prices.index.get_loc(forward_returns.index[i])
        if p_idx is None or p_idx < 0 or (
                p_idx + period) >= len(prices.index):
            continue
        start = prices.index[p_idx]
        end = prices.index[p_idx + period]
        period_len = diff_custom_calendar_timedeltas(start, end, freq)
        days_diffs.append(period_len.components.days)

    delta_days = period_len.components.days - mode(days_diffs).mode[0]
    period_len -= pd.Timedelta(days=delta_days)
    label = timedelta_to_string(period_len)

    Versions

    • Alphalens version:
    • Python version: 3.7
    • Pandas version:
    • Matplotlib version:
    opened by Swordsman-T 0
  • Problem: create_event_returns_tear_sheet


    Problem Description

    When I use alphalens.tears.create_event_returns_tear_sheet, it shows an error: unsupported operand type(s) for -: 'slice' and 'int', while the other functions work well. I'm looking forward to someone helping me with this. Thank you.

    Please provide a minimal, self-contained, and reproducible example:

    alphalens.tears.create_event_returns_tear_sheet(data,
                                                    returns,
                                                    avgretplot=(5, 15),
                                                    long_short=True,
                                                    group_neutral=False,
                                                    std_bar=True,
                                                    by_group=False)

    Please provide the full traceback:

    ---------------------------------------------------------------------------
    TypeError                                 Traceback (most recent call last)
    D:\Anaconda\lib\site-packages\pandas\core\groupby\groupby.py in apply(self, func, *args, **kwargs)
        734             try:
    --> 735                 result = self._python_apply_general(f)
        736             except TypeError:
    
    D:\Anaconda\lib\site-packages\pandas\core\groupby\groupby.py in _python_apply_general(self, f)
        750     def _python_apply_general(self, f):
    --> 751         keys, values, mutated = self.grouper.apply(f, self._selected_obj, self.axis)
        752 
    
    D:\Anaconda\lib\site-packages\pandas\core\groupby\ops.py in apply(self, f, data, axis)
        205             group_axes = group.axes
    --> 206             res = f(group)
        207             if not _is_indexed_like(res, group_axes):
    
    D:\Anaconda\lib\site-packages\pandas\core\groupby\groupby.py in f(g)
        718                     with np.errstate(all="ignore"):
    --> 719                         return func(g, *args, **kwargs)
        720 
    
    D:\Anaconda\lib\site-packages\alphalens\performance.py in average_cumulative_return(q_fact, demean_by)
        802     def average_cumulative_return(q_fact, demean_by):
    --> 803         q_returns = cumulative_return_around_event(q_fact, demean_by)
        804         q_returns.replace([np.inf, -np.inf], np.nan, inplace=True)
    
    D:\Anaconda\lib\site-packages\alphalens\performance.py in cumulative_return_around_event(q_fact, demean_by)
        798             mean_by_date=True,
    --> 799             demean_by=demean_by,
        800         )
    
    D:\Anaconda\lib\site-packages\alphalens\performance.py in common_start_returns(factor, returns, before, after, cumulative, mean_by_date, demean_by)
        701 
    --> 702         starting_index = max(day_zero_index - before, 0)
        703         ending_index = min(day_zero_index + after + 1,
    
    TypeError: unsupported operand type(s) for -: 'slice' and 'int'
    
    During handling of the above exception, another exception occurred:
    
    TypeError                                 Traceback (most recent call last)
    <ipython-input-46-6fc348201897> in <module>
          5                                                 group_neutral=False,
          6                                                 std_bar=True,
    ----> 7                                                 by_group=False)
    
    D:\Anaconda\lib\site-packages\alphalens\plotting.py in call_w_context(*args, **kwargs)
         43             with plotting_context(), axes_style(), color_palette:
         44                 sns.despine(left=True)
    ---> 45                 return func(*args, **kwargs)
         46         else:
         47             return func(*args, **kwargs)
    
    D:\Anaconda\lib\site-packages\alphalens\tears.py in create_event_returns_tear_sheet(factor_data, returns, avgretplot, long_short, group_neutral, std_bar, by_group)
        573         periods_after=after,
        574         demeaned=long_short,
    --> 575         group_adjust=group_neutral,
        576     )
        577 
    
    D:\Anaconda\lib\site-packages\alphalens\performance.py in average_cumulative_return_by_quantile(factor_data, returns, periods_before, periods_after, demeaned, group_adjust, by_group)
        858         elif demeaned:
        859             fq = factor_data['factor_quantile']
    --> 860             return fq.groupby(fq).apply(average_cumulative_return, fq)
        861         else:
        862             fq = factor_data['factor_quantile']
    
    D:\Anaconda\lib\site-packages\pandas\core\groupby\generic.py in apply(self, func, *args, **kwargs)
        222     )
        223     def apply(self, func, *args, **kwargs):
    --> 224         return super().apply(func, *args, **kwargs)
        225 
        226     @Substitution(
    
    D:\Anaconda\lib\site-packages\pandas\core\groupby\groupby.py in apply(self, func, *args, **kwargs)
        744 
        745                 with _group_selection_context(self):
    --> 746                     return self._python_apply_general(f)
        747 
        748         return result
    
    D:\Anaconda\lib\site-packages\pandas\core\groupby\groupby.py in _python_apply_general(self, f)
        749 
        750     def _python_apply_general(self, f):
    --> 751         keys, values, mutated = self.grouper.apply(f, self._selected_obj, self.axis)
        752 
        753         return self._wrap_applied_output(
    
    D:\Anaconda\lib\site-packages\pandas\core\groupby\ops.py in apply(self, f, data, axis)
        204             # group might be modified
        205             group_axes = group.axes
    --> 206             res = f(group)
        207             if not _is_indexed_like(res, group_axes):
        208                 mutated = True
    
    D:\Anaconda\lib\site-packages\pandas\core\groupby\groupby.py in f(g)
        717                 def f(g):
        718                     with np.errstate(all="ignore"):
    --> 719                         return func(g, *args, **kwargs)
        720 
        721             elif hasattr(nanops, "nan" + func):
    
    D:\Anaconda\lib\site-packages\alphalens\performance.py in average_cumulative_return(q_fact, demean_by)
        801 
        802     def average_cumulative_return(q_fact, demean_by):
    --> 803         q_returns = cumulative_return_around_event(q_fact, demean_by)
        804         q_returns.replace([np.inf, -np.inf], np.nan, inplace=True)
        805 
    
    D:\Anaconda\lib\site-packages\alphalens\performance.py in cumulative_return_around_event(q_fact, demean_by)
        797             cumulative=True,
        798             mean_by_date=True,
    --> 799             demean_by=demean_by,
        800         )
        801 
    
    D:\Anaconda\lib\site-packages\alphalens\performance.py in common_start_returns(factor, returns, before, after, cumulative, mean_by_date, demean_by)
        700             continue
        701 
    --> 702         starting_index = max(day_zero_index - before, 0)
        703         ending_index = min(day_zero_index + after + 1,
        704                            len(returns.index))
    
    TypeError: unsupported operand type(s) for -: 'slice' and 'int'
    
    <Figure size 432x288 with 0 Axes>

    Please provide any additional information below:

    Versions

    • Alphalens version: 0.4.0
    • Python version: 3.7.6
    • Pandas version: 1.0.1
    • Matplotlib version: 3.1.3

    opened by RemnantLi 0
Releases(v0.4.0)
  • v0.4.0(Apr 30, 2020)

    This is a minor release from 0.3.6 that includes bugfixes, performance improvements, and build changes.

    Bug Fixes

    • https://github.com/quantopian/alphalens/pull/334
    • Pandas 1.0 fix: https://github.com/quantopian/alphalens/pull/364

    New Features

    • Turnover tearsheet improvement: https://github.com/quantopian/alphalens/pull/354
    • CI builds now run on GitHub Actions: https://github.com/quantopian/alphalens/pull/363

    Performance + API Change

    https://github.com/quantopian/alphalens/pull/361 simplified the cumulative returns calculation, making it much faster. Other function signatures and behaviors were modified as well.

    Docs + Miscellaneous

    • https://github.com/quantopian/alphalens/pull/332
    • README updates: https://github.com/quantopian/alphalens/pull/345, https://github.com/quantopian/alphalens/pull/337
    • A new tutorial notebook.

    Credits

    The following people contributed to this release: @eigenfoo, @luca-s, @twiecki, @ivigamberdiev, @fawce, @jbredeche, @jimportico, @gerrymanoim, @jmccorriston, @altquant

  • v0.3.6(Jan 7, 2019)

  • v0.3.5(Dec 17, 2018)

    This is a minor release from 0.3.4 that includes bugfixes, speed enhancements, and compatibility with more recent pandas versions. We recommend that all users upgrade to this version.

    Bugfixes

    • Issue 323 factor_rank_autocorrelation infers turnover period in calendar space while periods could have different time space
    • PR 324 avoid crashing Alphalens when autocorrelation or turnover data contains only NaNs

    Performance

    • PR 327 Speed up compute_forward_returns and get_clean_factor

    Compatibility with new pandas versions

    • PR 328 improved compatibility with pandas 0.23.4
  • v0.3.4(Oct 11, 2018)

    This is a minor release from 0.3.3 that includes bugfixes, small enhancements and backward compatibility breakages. We recommend that all users upgrade to this version.

    Bugfixes

    • PR 317 Fix date conversion in newer versions of pandas
    • Issue 309 Biased Mean Quantile Returns for Non-Equal Bins

    New features

    • PR 306 added zero aware quantiles option

    API change

    • PR 268 All functions deprecated in version v0.2.0 are no longer available

    Credits

    The following people contributed to this release:

    @eigenfoo (George Ho), @MichaelJMath (Mike Matthews), @freddiev4 (Freddie Vargus), @vikram-narayan (Vikram Narayan), @twiecki (Thomas Wiecki), @luca-s (Luca Scarabello)

  • v0.3.2(May 14, 2018)

    This is a minor release from 0.3.1 that includes bugfixes and small enhancements. We recommend that all users upgrade to this version.

    Bugfixes

    • PR297 BUG: create_pyfolio_input doesn't work with frequency higher than 1 day
    • PR302 BUG: compute_mean_return_spread returns error if no std_err argument

    New features

    • PR298 ENH: added rate of return option in 'create_event_study_tear_sheet'
    • PR300 ENH: added n_bars option in 'create_event_study_tear_sheet'
  • v0.3.1(Apr 24, 2018)

    This is a minor release from 0.3.0 that includes bugfixes and performance improvement. We recommend that all users upgrade to this version.

    Bugfixes

    • PR 287 utils.get_clean_factor crashes with malformed 'groupby' data
    • PR 287 perf.average_cumulative_return_by_quantile crashes in certain scenarios
    • PR 288 monthly IC heatmap plot has inverted colors (red for positive and blue for negative IC)
    • PR 295 Issue 292 utils.compute_forward_returns fails to detect the correct period length

    Performance

    • PR 294 computation of cumulative returns is very slow
  • v0.3.0(Mar 14, 2018)

    This is a major release from 0.2.1; we recommend that all users upgrade to this version.

    New features

    Bugfixes

    • Alphalens now works with both tz-aware and tz-naive data (but not mixed)
    • "Cumulative Returns by Quantile" plot used a different color scheme for quantiles than "Average Cumulative Returns by Quantile" plot
    • Many small but useful bug fixes that avoid sporadic crashes and memory leaks. Please see the git history for more details

    Documentation

    Maintenance

    • Removed deprecated pandas.TimeGrouper
    • Migrated tests from deprecated nose-parameterized (#251)
    • Fixed compatibility with matplotlib 2.2.0
    • Alphalens is now available via conda-forge. Install via conda install -c conda-forge alphalens

    Credits

    The following people contributed to this release:

    @luca-s (Luca Scarabello), @twiecki (Thomas Wiecki), @mmargenot (Max Margenot), @MichaelJMath, @HereticSK, @TimShawver (Tim Shawver), @alen12345 (Alessio Nava)

  • v0.2.1(Nov 18, 2017)

    This is a bugfix release from v0.2.0. All users are recommended to upgrade.

    Bugfixes

    • tears.create_information_tear_sheet: the argument group_adjust was erroneously removed without a replacement. From this release the argument group_adjust is still deprecated, but group_neutral can be used instead
  • v0.2.0(Nov 17, 2017)

    This is a major new release since v0.1.0. It contains a small API breakage, several new features, and many bug fixes. All users are recommended to upgrade.

    New since v0.1.0

    New features

    • Added event study analysis: an event study is a statistical method to assess the impact of a particular event on the value of equities, and it is now possible to perform this analysis through the API alphalens.tears.create_event_study_tear_sheet. Check out the related notebook in the examples folder.

    • Added support for group neutral factor analysis (group_neutral argument): this affects the return analysis that is now able to compute returns statistics for each group independently and aggregate them together assuming a portfolio where each group has equal weight.

    • utils.get_clean_factor_and_forward_returns has a new parameter max_loss that controls how much data the function is allowed to drop due to not having enough price data or due to binning errors (pandas.qcut). This gives users more control over what is happening and also prevents the function from raising an exception if the binning doesn't go well on some values (see the sketch after this list).

    • Greatly improved API documentation
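
    A hedged sketch of the new parameter, assuming my_factor and pricing as in the Getting started example:

    import alphalens

    # Allow at most 10% of the factor values to be dropped (for missing
    # prices or binning errors) before raising, instead of the default.
    factor_data = alphalens.utils.get_clean_factor_and_forward_returns(my_factor,
                                                                       pricing,
                                                                       quantiles=5,
                                                                       max_loss=0.10)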

    Bugfixes

    API change

    • Removed deprecated alphalens.tears.create_factor_tear_sheet
    • tears.create_summary_tear_sheet: added argument group_neutral.
    • tears.create_returns_tear_sheet: added argument group_neutral. Please consider using keyword arguments to avoid API breakage
    • tears.create_information_tear_sheet: group_adjust is now deprecated and group_neutral should be used instead
    • tears.create_full_tear_sheet: group_adjust is now deprecated and group_neutral should be used instead
    • tears.create_event_returns_tear_sheet: added argument group_neutral. Please consider using keyword arguments to avoid API breakage
    • Several small changes to lower level API (alphalens.performance)

    Maintenance

    • Depends on pandas>=0.18.0
    • Changed deprecated pd.rolling_mean() to use the new *.rolling().mean() API
    • Changed deprecated pd.rolling_apply() to use the new *.rolling().apply() API
    • Use versioneer to pull version from git tag
  • v0.1.2(Oct 3, 2017)

    New release v0.1.2

    • Removed deprecated API 'alphalens.tears.create_factor_tear_sheet'

    • Added event study API 'alphalens.tears.create_event_study_tear_sheet' and a related example notebook

    • Added a long-only option to 'alphalens.performance.factor_alpha_beta'

    • Improved docstrings all around

    • Small bug fixes
