Financial portfolio optimisation in Python, including classical efficient frontier techniques, Black-Litterman allocation, and Hierarchical Risk Parity

Overview


PyPortfolioOpt has recently been published in the Journal of Open Source Software 🎉

PyPortfolioOpt is a library that implements portfolio optimization methods, including classical mean-variance optimization techniques and Black-Litterman allocation, as well as more recent developments in the field like shrinkage and Hierarchical Risk Parity, along with some novel experimental features like exponentially-weighted covariance matrices.

It is extensive yet easily extensible, and can be useful for both the casual investor and the serious practitioner. Whether you are a fundamentals-oriented investor who has identified a handful of undervalued picks, or an algorithmic trader who has a basket of strategies, PyPortfolioOpt can help you combine your alpha sources in a risk-efficient way.

Head over to the documentation on ReadTheDocs to get an in-depth look at the project, or check out the cookbook to see some examples showing the full process from downloading data to building a portfolio.

Getting started

If you would like to play with PyPortfolioOpt interactively in your browser, you may launch Binder here. It takes a while to set up, but it lets you try out the cookbook recipes without having to deal with all of the requirements.

Note: macOS users will need to install Command Line Tools.

Note: if you are on Windows, you first need to install C++ (download, install instructions).

This project is available on PyPI, meaning that you can just:

pip install PyPortfolioOpt

However, it is best practice to use a dependency manager within a virtual environment. My current recommendation is to get yourself set up with poetry, then just run:

poetry add PyPortfolioOpt

Otherwise, clone/download the project and in the project directory run:

python setup.py install

PyPortfolioOpt supports Docker. Build your first container with docker build -f docker/Dockerfile . -t pypfopt. You can use the image to run tests or even launch a Jupyter server.

# iPython interpreter:
docker run -it pypfopt poetry run ipython

# Jupyter notebook server:
docker run -it -p 8888:8888 pypfopt poetry run jupyter notebook --allow-root --no-browser --ip 0.0.0.0
# click on http://127.0.0.1:8888/?token=xxx

# Pytest
docker run -t pypfopt poetry run pytest

# Bash
docker run -it pypfopt bash

For more information, please read this guide.

For development

If you would like to make major changes to integrate this with your proprietary system, it probably makes sense to clone this repository and to just use the source code.

git clone https://github.com/robertmartin8/PyPortfolioOpt

Alternatively, you could try:

pip install -e git+https://github.com/robertmartin8/PyPortfolioOpt.git

A quick example

Here is an example using real-life stock data, demonstrating how easy it is to find the long-only portfolio that maximises the Sharpe ratio (a measure of risk-adjusted returns).

import pandas as pd
from pypfopt import EfficientFrontier
from pypfopt import risk_models
from pypfopt import expected_returns

# Read in price data
df = pd.read_csv("tests/resources/stock_prices.csv", parse_dates=True, index_col="date")

# Calculate expected returns and sample covariance
mu = expected_returns.mean_historical_return(df)
S = risk_models.sample_cov(df)

# Optimize for maximal Sharpe ratio
ef = EfficientFrontier(mu, S)
raw_weights = ef.max_sharpe()
cleaned_weights = ef.clean_weights()
ef.save_weights_to_file("weights.csv")  # saves to file
print(cleaned_weights)
ef.portfolio_performance(verbose=True)

This outputs the following weights:

{'GOOG': 0.03835,
 'AAPL': 0.0689,
 'FB': 0.20603,
 'BABA': 0.07315,
 'AMZN': 0.04033,
 'GE': 0.0,
 'AMD': 0.0,
 'WMT': 0.0,
 'BAC': 0.0,
 'GM': 0.0,
 'T': 0.0,
 'UAA': 0.0,
 'SHLD': 0.0,
 'XOM': 0.0,
 'RRC': 0.0,
 'BBY': 0.01324,
 'MA': 0.35349,
 'PFE': 0.1957,
 'JPM': 0.0,
 'SBUX': 0.01082}

Expected annual return: 30.5%
Annual volatility: 22.2%
Sharpe Ratio: 1.28

This is interesting but not useful in itself. However, PyPortfolioOpt provides a method which allows you to convert the above continuous weights to an actual allocation that you could buy. Just enter the most recent prices, and the desired portfolio size ($10,000 in this example):

from pypfopt.discrete_allocation import DiscreteAllocation, get_latest_prices


latest_prices = get_latest_prices(df)

da = DiscreteAllocation(cleaned_weights, latest_prices, total_portfolio_value=10000)
allocation, leftover = da.greedy_portfolio()
print("Discrete allocation:", allocation)
print("Funds remaining: ${:.2f}".format(leftover))
12 out of 20 tickers were removed
Discrete allocation: {'GOOG': 1, 'AAPL': 4, 'FB': 12, 'BABA': 4, 'BBY': 2,
                      'MA': 20, 'PFE': 54, 'SBUX': 1}
Funds remaining: $11.89

Disclaimer: nothing about this project constitutes investment advice, and the author bears no responsibility for your subsequent investment decisions. Please refer to the license for more information.

An overview of classical portfolio optimization methods

Harry Markowitz's 1952 paper is the undeniable classic, which turned portfolio optimization from an art into a science. The key insight is that by combining assets with different expected returns and volatilities, one can decide on a mathematically optimal allocation which minimises the risk for a target return – the set of all such optimal portfolios is referred to as the efficient frontier.
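
For concreteness, this can be written in the standard textbook form: with $w$ the weight vector, $\mu$ the expected returns, $\Sigma$ the covariance matrix and $R$ the target return, the Markowitz problem is

$$\min_{w} \; w^\top \Sigma w \quad \text{subject to} \quad w^\top \mu \geq R, \quad \mathbf{1}^\top w = 1,$$

optionally with $w \geq 0$ for a long-only portfolio; sweeping $R$ traces out the efficient frontier.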

Although the field has developed considerably in the half-century since, Markowitz's core ideas remain fundamentally important and see daily use in many portfolio management firms. The main drawback of mean-variance optimization is that the theoretical treatment requires knowledge of the expected returns and the future risk characteristics (covariance) of the assets. Obviously, if we knew the expected returns of a stock, life would be much easier, but the whole game is that stock returns are notoriously hard to forecast. As a substitute, we can derive estimates of the expected return and covariance from historical data – though we do lose the theoretical guarantees provided by Markowitz, the closer our estimates are to the real values, the better our portfolio will be.

Thus, this project provides four major sets of functionality (though of course they are intimately related):

  • Estimates of expected returns
  • Estimates of risk (i.e. the covariance of asset returns)
  • Objective functions to be optimized
  • Optimizers

A key design goal of PyPortfolioOpt is modularity – the user should be able to swap in their own components while still making use of the framework that PyPortfolioOpt provides.

Features

In this section, we detail some of PyPortfolioOpt's available functionality. More examples are offered in the Jupyter notebooks here. Another good resource is the tests.

A far more comprehensive version of this can be found on ReadTheDocs, as well as possible extensions for more advanced users.

Expected returns

  • Mean historical returns:
    • the simplest and most common approach, which states that the expected return of each asset is equal to the mean of its historical returns.
    • easily interpretable and very intuitive
  • Exponentially weighted mean historical returns:
    • similar to mean historical returns, except it gives exponentially more weight to recent prices
    • it is likely the case that an asset's most recent returns hold more weight than returns from 10 years ago when it comes to estimating future returns.
  • Capital Asset Pricing Model (CAPM):
    • a simple model to predict returns based on the beta to the market
    • this is used all over finance!
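
As a rough sketch (assuming a daily price DataFrame df like the one used in the quick example above), each of these estimators is a one-liner:

from pypfopt import expected_returns

# Annualised mean of historical returns
mu_mean = expected_returns.mean_historical_return(df)
# Exponentially weighted mean of historical returns (more weight on recent data)
mu_ema = expected_returns.ema_historical_return(df, span=500)
# CAPM estimate of returns, based on each asset's beta to the market
mu_capm = expected_returns.capm_return(df)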

Risk models (covariance)

The covariance matrix encodes not just the volatility of an asset, but also how it is correlated with other assets. This is important because in order to reap the benefits of diversification (and thus increase return per unit risk), the assets in the portfolio should be as uncorrelated as possible.

  • Sample covariance matrix:
    • an unbiased estimate of the covariance matrix
    • relatively easy to compute
    • the de facto standard for many years
    • however, it has a high estimation error, which is particularly dangerous in mean-variance optimization because the optimizer is likely to give excess weight to these erroneous estimates.
  • Semicovariance: a measure of risk that focuses on downside variation.
  • Exponential covariance: an improvement over sample covariance that gives more weight to recent data
  • Covariance shrinkage: techniques that involve combining the sample covariance matrix with a structured estimator, to reduce the effect of erroneous weights. PyPortfolioOpt provides wrappers around the efficient vectorised implementations provided by sklearn.covariance.
    • manual shrinkage
    • Ledoit Wolf shrinkage, which chooses an optimal shrinkage parameter. We offer three shrinkage targets: constant_variance, single_factor, and constant_correlation.
    • Oracle Approximating Shrinkage
  • Minimum Covariance Determinant:
    • a robust estimate of the covariance
    • implemented in sklearn.covariance

(This plot was generated using plotting.plot_covariance)
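
To illustrate (again assuming a daily price DataFrame df), some of these risk models can be invoked as follows; this is only a sketch of the calls, not an exhaustive list of their parameters:

from pypfopt import risk_models

S_sample = risk_models.sample_cov(df)                         # sample covariance
S_semi = risk_models.semicovariance(df, benchmark=0)          # downside (semi)covariance
S_exp = risk_models.exp_cov(df, span=180)                     # exponentially weighted covariance
S_shrunk = risk_models.CovarianceShrinkage(df).ledoit_wolf()  # Ledoit-Wolf shrinkage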

Objective functions

  • Maximum Sharpe ratio: this results in a tangency portfolio because on a graph of returns vs risk, this portfolio corresponds to the tangent of the efficient frontier that has a y-intercept equal to the risk-free rate. This is the default option because it finds the optimal return per unit risk.
  • Minimum volatility. This may be useful if you're trying to get an idea of how low the volatility could be, but in practice it makes a lot more sense to me to use the portfolio that maximises the Sharpe ratio.
  • Efficient return, a.k.a. the Markowitz portfolio, which minimises risk for a given target return – this was the main focus of Markowitz 1952
  • Efficient risk: the Sharpe-maximising portfolio for a given target risk.
  • Maximum quadratic utility. You can provide your own risk-aversion level and compute the appropriate portfolio.
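
As a sketch (reusing the mu and S from the quick example above; note that each optimization below should be run on a fresh EfficientFrontier instance):

from pypfopt import EfficientFrontier

ef = EfficientFrontier(mu, S)
ef.min_volatility()  # lowest-volatility portfolio

ef = EfficientFrontier(mu, S)
ef.efficient_risk(target_volatility=0.20)  # maximise return for a 20% volatility target

ef = EfficientFrontier(mu, S)
ef.efficient_return(target_return=0.25)  # minimise risk for a 25% target return

ef = EfficientFrontier(mu, S)
ef.max_quadratic_utility(risk_aversion=2)  # maximise quadratic utility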

Adding constraints or different objectives

  • Long/short: by default all of the mean-variance optimization methods in PyPortfolioOpt are long-only, but they can be initialised to allow for short positions by changing the weight bounds:
ef = EfficientFrontier(mu, S, weight_bounds=(-1, 1))
  • Market neutrality: for the efficient_risk and efficient_return methods, PyPortfolioOpt provides an option to form a market-neutral portfolio (i.e. weights sum to zero). This is not possible for the max Sharpe portfolio or the min volatility portfolio because they are not invariant with respect to leverage. Market neutrality requires negative weights:
ef = EfficientFrontier(mu, S, weight_bounds=(-1, 1))
ef.efficient_return(target_return=0.2, market_neutral=True)
  • Minimum/maximum position size: it may be the case that you want no security to form more than 10% of your portfolio. This is easy to encode:
ef = EfficientFrontier(mu, S, weight_bounds=(0, 0.1))

One issue with mean-variance optimization is that it leads to many zero weights. While these are "optimal" in-sample, there is a large body of research showing that this characteristic leads mean-variance portfolios to underperform out-of-sample. To that end, I have introduced an objective term that can reduce the number of negligible weights for any of the objective functions. Essentially, it adds a penalty (parameterised by gamma) on small weights, with a term that looks just like L2 regularisation in machine learning. It may be necessary to try several gamma values to achieve the desired number of non-negligible weights. For the test portfolio of 20 securities, gamma ~ 1 is sufficient.

ef = EfficientFrontier(mu, S)
ef.add_objective(objective_functions.L2_reg, gamma=1)
ef.max_sharpe()

Black-Litterman allocation

As of v0.5.0, we now support Black-Litterman asset allocation, which allows you to combine a prior estimate of returns (e.g. the market-implied returns) with your own views to form a posterior estimate. This results in much better estimates of expected returns than just using the mean historical return. Check out the docs for a discussion of the theory, as well as advice on formatting inputs.

from pypfopt import BlackLittermanModel, EfficientFrontier, risk_models

S = risk_models.sample_cov(df)
viewdict = {"AAPL": 0.20, "BBY": -0.30, "BAC": 0, "SBUX": -0.2, "T": 0.131321}
bl = BlackLittermanModel(S, pi="equal", absolute_views=viewdict, omega="default")
rets = bl.bl_returns()

ef = EfficientFrontier(rets, S)
ef.max_sharpe()
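
If you would rather use the market-implied prior mentioned above instead of pi="equal", a rough sketch would look like the following (mcaps, a Series of market caps, and market_prices, a price series for a market proxy, are placeholders you would need to supply):

from pypfopt import black_litterman, BlackLittermanModel

# Risk aversion implied by the market proxy, then the market-implied prior returns
delta = black_litterman.market_implied_risk_aversion(market_prices)
prior = black_litterman.market_implied_prior_returns(mcaps, delta, S)

bl = BlackLittermanModel(S, pi=prior, absolute_views=viewdict, omega="default")
rets = bl.bl_returns()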

Other optimizers

The features above mostly pertain to solving mean-variance optimization problems via quadratic programming (though this is taken care of by cvxpy). However, we offer different optimizers as well:

  • Mean-semivariance optimization
  • Mean-CVaR optimization
  • Hierarchical Risk Parity, using clustering algorithms to choose uncorrelated assets
  • Markowitz's critical line algorithm (CLA)
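
As a minimal sketch of one of these, Hierarchical Risk Parity works on a returns DataFrame rather than on (mu, S) (again assuming the price DataFrame df from the quick example):

from pypfopt import HRPOpt, expected_returns

returns = expected_returns.returns_from_prices(df)
hrp = HRPOpt(returns)
hrp_weights = hrp.optimize()  # hierarchical clustering + recursive bisection
hrp.portfolio_performance(verbose=True)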

Please refer to the documentation for more.

Advantages over existing implementations

  • Includes both classical methods (Markowitz 1952 and Black-Litterman) and suggested best practices (e.g. covariance shrinkage), along with many recent developments and novel features like L2 regularisation, shrunk covariance, and Hierarchical Risk Parity.
  • Native support for pandas dataframes: easily input your daily prices data.
  • Extensive practical tests, which use real-life data.
  • Easy to combine with your proprietary strategies and models.
  • Robust to missing data, and price-series of different lengths (e.g. FB data only goes back to 2012 whereas AAPL data goes back to 1980).

Project principles and design decisions

  • It should be easy to swap out individual components of the optimization process with the user's proprietary improvements.
  • Usability is everything: it is better to be self-explanatory than consistent.
  • There is no point in portfolio optimization unless it can be practically applied to real asset prices.
  • Everything that has been implemented should be tested.
  • Inline documentation is good: dedicated (separate) documentation is better. The two are not mutually exclusive.
  • Formatting should never get in the way of coding: because of this, I have deferred all formatting decisions to Black.

Testing

Tests are written in pytest (much more intuitive than unittest and the variants in my opinion), and I have tried to ensure close to 100% coverage. Run the tests by navigating to the package directory and simply running pytest on the command line.

PyPortfolioOpt provides a test dataset of daily returns for 20 tickers:

['GOOG', 'AAPL', 'FB', 'BABA', 'AMZN', 'GE', 'AMD', 'WMT', 'BAC', 'GM',
'T', 'UAA', 'SHLD', 'XOM', 'RRC', 'BBY', 'MA', 'PFE', 'JPM', 'SBUX']

These tickers have been informally selected to meet several criteria:

  • reasonably liquid
  • different performances and volatilities
  • different amounts of data to test robustness

Currently, the tests have not explored all of the edge cases and combinations of objective functions and parameters. However, each method and parameter has been tested to work as intended.

Citing PyPortfolioOpt

If you use PyPortfolioOpt for published work, please cite the JOSS paper.

Citation string:

Martin, R. A., (2021). PyPortfolioOpt: portfolio optimization in Python. Journal of Open Source Software, 6(61), 3066, https://doi.org/10.21105/joss.03066

BibTeX:

@article{Martin2021,
  doi = {10.21105/joss.03066},
  url = {https://doi.org/10.21105/joss.03066},
  year = {2021},
  publisher = {The Open Journal},
  volume = {6},
  number = {61},
  pages = {3066},
  author = {Robert Andrew Martin},
  title = {PyPortfolioOpt: portfolio optimization in Python},
  journal = {Journal of Open Source Software}
}

Contributing

Contributions are most welcome. Have a look at the Contribution Guide for more.

I'd like to thank all of the people who have contributed to PyPortfolioOpt since its release in 2018. Special shout-outs to:

  • Philipp Schiele
  • Carl Peasnell
  • Felipe Schneider
  • Dingyuan Wang
  • Pat Newell
  • Aditya Bhutra
  • Thomas Schmelzer
  • Rich Caputo
  • Nicolas Knudde

Getting in touch

If you are having a problem with PyPortfolioOpt, please raise a GitHub issue. For anything else, you can reach me at:

Comments
  • Constraint on number of assets?


    What are you trying to do? Setting a max number of assets to allocate weights to. For example, observing a universe of 500 stocks, I only want weights for at most 10 stocks, and the sum should of course be 100%.

    What data are you using? SP500 data universe.

    question 
    opened by sword134 28
  • max.sharpe() OptimizationError


    Hey,

    Trying the basic tutorial using the test prices seems to work fine, but inputting crypto prices leads to errors when calculating the max Sharpe value:

    df_merged.head()

    | Date       |     BTC |     ETH |     LTC |
    |:-----------|--------:|--------:|--------:|
    | 2018-01-01 | 13657.2 | 772.641 | 229.033 |
    | 2018-01-02 | 14982.1 | 884.444 | 255.684 |
    | 2018-01-03 | 15201   | 962.72  | 245.368 |
    | 2018-01-04 | 15599.2 | 980.922 | 241.37  |
    | 2018-01-05 | 17429.5 | 997.72  | 249.271 |

    mu = expected_returns.mean_historical_return(df_merged)

    BTC   -0.654099
    ETH   -1.011251
    LTC   -1.016013
    dtype: float64
    

    S = risk_models.sample_cov(df_merged)

    |     |      BTC |      ETH |      LTC |
    |:----|---------:|---------:|---------:|
    | BTC | 0.447307 | 0.467077 | 0.487518 |
    | ETH | 0.467077 | 0.756588 | 0.623595 |
    | LTC | 0.487518 | 0.623595 | 0.803425 |

    ef = EfficientFrontier(mu, S)
    raw_weights = ef.max_sharpe()
    
    OptimizationError                         Traceback (most recent call last)
    <ipython-input-96-c11009779024> in <module>()
          1 ef = EfficientFrontier(mu, S)
    ----> 2 raw_weights = ef.max_sharpe()
    
    1 frames
    /usr/local/lib/python3.6/dist-packages/pypfopt/base_optimizer.py in _solve_cvxpy_opt_problem(self)
        198             raise exceptions.OptimizationError
        199         if opt.status != "optimal":
    --> 200             raise exceptions.OptimizationError
        201         self.weights = self._w.value.round(16) + 0.0  # +0.0 removes signed zero
        202 
    
    OptimizationError: Please check your objectives/constraints or use a different solver.
    

    Weirdly enough, if I limit the price interval to just a couple of months, max_sharpe works fine, but then min_volatility breaks down with the same error.

    question 
    opened by alienss 26
  • ModuleNotFoundError: No module named 'pulp'


    Dear Robert, as you can see in the attached image, the import of modules returns the error mentioned in the title. Perhaps modifications to the API have affected some package names. What should the new names of DiscreteAllocation or get_latest_prices be in the API? I will appreciate your help. Best regards

    packaging 
    opened by akitxu 24
  • How to get the entire set of weights for the efficient frontier portfolio


    I'm trying to obtain the entire set of weights along the efficient frontier, but I'm only obtaining the max Sharpe weights (ef.max_sharpe()) and need the whole set.

    Does someone have the same issue, or has someone already figured it out?

    Thanks, I'm looking forward to it!

    question 
    opened by papisagre 23
  • Optimise portfolio with target volatility


    Hi! Thanks for all the great work. I have been following this project since 0.5.4 and I am a big fan, hence I decided to contribute.

    Since the cvxpy update, efficient_risk either does not work as intended, or the intended behaviour is stated wrongly.

    It does not give you an efficient frontier portfolio at the target risk, because of line 271: self._constraints.append(variance <= target_volatility ** 2), so the solver will not enforce risk equal to the target, but (obviously) lower or equal.

    On the other hand, the docs give us: :raises ValueError: if no portfolio can be found with volatility equal to ``target_volatility`` which is not true anymore I believe, but lead the user to expect the target volatility will be reached, or an error will be shown.

    This does not happen if you request any volatility that is higher than that of the max Sharpe portfolio. And it is useful to have it for all the people that don't have access to leverage but want riskier, yet efficient, portfolios.

    Now, I know going around it with cvxpy will be a hassle (I tried and unfortunately it is not DCP-enough for it...) but maybe you would consider either of:

    • reverting back to SLSQP for it (works like a charm!)
    • spending some time on DCPing it (I've spent a day and I am fed up but I believe there could be some magicians that could figure something out)
    • finally, specifying the docstring correctly (i.e. 'Maximise return for a target risk or lower.')

    Anyway, happy to hear your thoughts if you can spare a moment.

    Cheers, M

    bug help wanted 
    opened by mkeds 19
  • Idzorek's method for Black-Litterman


    Idzorek's method is a way of translating percentage confidence into the uncertainty matrix required by Black-Litterman. This can be seen in section 3.2 (pg 23) of Idzorek's Step-by-Step Guide to Black-Litterman.

    This has been requested in #70, and selfishly, I now think it would be useful for my own investing :)

    That being said, Idzorek's method is really quite involved, and I would thus like to open the floor to any suggestions regarding implementation/architecture.

    A brief overview of the steps are:

    1. Construct a BL returns vector for each view separately (assume 100% confidence)
    2. Convert these return vectors to implied weights for each view, w_k
    3. Determine the deviation between each view's weight vector and the market weights
    4. Multiply this deviation by the user's confidence to get the tilt_k, and compute target weights for each view using w_k% = w_mkt + tilt_k
    5. The kth uncertainty is the uncertainty that minimises the sum of squared differences between w_k and w_k%.
    6. Put these k uncertainties on the diagonal of a matrix.

    I think I've figured out how to do it on the backend. Have a static method within the BlackLitterman class (it must be static so that we can instantiate BlackLitterman objects) that computes the Idzorek matrix.

    I'm not quite sure what the API should be. Some ideas:

    # 1. Recognise omega parameter as a string then automatically construct
    bl = BlackLittermanModel(S, absolute_views=viewdict, omega="idzorek")
    ret = bl.bl_returns()
    
    # 2. Method to replace current omega with Idzorek omega
    bl = BlackLittermanModel(S, absolute_views=viewdict)
    bl.omega = bl.idzorek_omega() 
    # or
    bl.set_idzorek_omega()
    ret = bl.bl_returns()
    
    # 3. Put function in module rather than class
    idzorek_omega = black_litterman.idzorek_uncertainty(absolute_views=viewdict)
    bl = BlackLittermanModel(S, absolute_views=viewdict, omega=idzorek_omega)
    

    I think option 1 is the nicest API, but I don't know if it's too confusing to allow the omega parameter in the constructor to accept a string, in addition to the pd.DataFrame and None options currently provided.

    @schneiderfelipe any thoughts?

    enhancement help wanted 
    opened by robertmartin8 16
  • CLA fails with TypeError: '<' not supported between instances of 'NoneType' and 'float'


    When I do the following, I can get the target weights.

    mu = returns_series
    S = risk_models.sample_cov(df)
    
    ef = EfficientFrontier(mu, S)
    raw_weights = ef.max_sharpe(risk_free_rate=0.0004)
    max_sharpe_pf = dict(ef.clean_weights())
    print(cleaned_weights)
    ef.portfolio_performance(verbose=True)
    

    I get this,

    OrderedDict([('asset1', 0.01464), ('asset2', 0.00269), ('asset3', 0.00481), ('asset4', 0.00314), ('asset5', 0.01349), ('asset6', 0.01292), ('asset7', 0.00397), ('asset8', 0.0014), ('asset9', 0.005), ('asset10', 0.00197), ('asset11', 0.00021), ('asset12', 0.00548), ('asset13', 0.00808), ('asset14', 0.00071), ('asset15', 0.00569), ('asset16', 0.00128), ('asset17', 0.00046), ('asset18', 0.06866), ('asset19', 0.0007), ('asset20', 0.00213), ('asset21', 0.00172), ('asset22', 0.00522), ('asset23', 0.33997), ('asset24', 0.00078), ('asset25', 0.00079), ('asset26', 0.00106), ('asset27', 0.0055), ('asset28', 0.25442), ('asset29', 0.00689), ('asset30', 0.00132), ('asset31', 0.21176), ('asset32', 0.00518), ('asset33', 0.00796)])
    Expected annual return: 0.3%
    Annual volatility: 0.0%
    Sharpe Ratio: 4901822.92
    /Users/xxx/miniconda3/envs/ai/lib/python3.8/site-packages/pypfopt/efficient_frontier/efficient_frontier.py:404: UserWarning: The risk_free_rate provided to portfolio_performance is different to the one used by max_sharpe. Using the previous value.
      warnings.warn(
    (0.0025311989428367086, 4.3477681211645237e-10, 4901822.920275426)
    

    However when I try,

    cla = CLA(mu, S)
    print(cla.max_sharpe())
    cla.portfolio_performance(verbose=True)
    

    I get the following error

    ---------------------------------------------------------------------------
    TypeError                                 Traceback (most recent call last)
    <ipython-input-731-4dfc5af627ff> in <module>
          1 cla = CLA(mu, S)
    ----> 2 print(cla.max_sharpe())
          3 cla.portfolio_performance(verbose=True)
    
    ~/miniconda3/envs/ai/lib/python3.8/site-packages/pypfopt/cla.py in max_sharpe(self)
        378         """
        379         if not self.w:
    --> 380             self._solve()
        381         # 1) Compute the local max SR portfolio between any two neighbor turning points
        382         w_sr, sr = [], []
    
    ~/miniconda3/envs/ai/lib/python3.8/site-packages/pypfopt/cla.py in _solve(self)
        336                         self.w[-1][i],
        337                     )
    --> 338                     if (self.ls[-1] is None or l < self.ls[-1]) and l > CLA._infnone(
        339                         l_out
        340                     ):
    
    TypeError: '<' not supported between instances of 'NoneType' and 'float'
    

    Any idea why this happens?

    Here is a sample of my mu and S

    mu is

    asset1      0.001109
    asset2    0.001651
    asset3       0.006850
    asset4      0.008584
    asset5      0.014223
    asset6     0.000375
    asset7      0.009315
    asset8      0.006099
    asset9      0.000802
    asset10      0.000526
    asset11      0.004706
    asset12      0.012302
    asset13    0.004301
    asset14      0.001811
    asset15     0.004071
    asset16      0.006880
    asset17       0.001191
    asset18      0.000108
    asset19       0.004835
    asset20      0.000477
    asset21     0.001538
    asset22     0.003291
    asset23       0.003846
    asset24      0.003557
    asset25     0.005364
    asset26     0.000073
    asset27       0.011400
    asset28      0.001145
    asset29      0.000955
    asset30        0.003946
    asset31       0.000931
    asset32      0.011024
    asset33      0.012595
    

    Here is a preview of S; I can't paste the whole thing because it's 33 by 33.

    0.025999 | 0.062508 | -0.058524 | -0.032703 | -0.009928 | 0.009853 | -0.016384 | 0.005907 | -0.010904 | 0.007083 | ... | -0.008924 | 0.002112 | 0.068362 | -0.016891 | 0.034906 | -0.000746 | 0.003678 | -0.031238 | -0.054474 | -0.041703
    0.062508 | 0.159056 | -0.184588 | -0.089246 | -0.052268 | 0.020778 | -0.055927 | 0.002663 | -0.028131 | 0.009429 | ... | -0.031285 | -0.041631 | 0.134849 | -0.068001 | 0.077892 | -0.011617 | -0.003193 | -0.068701 | -0.139866 | -0.122891
    -0.058524 | -0.184588 | 0.527175 | 0.156472 | 0.170847 | -0.004327 | 0.156618 | 0.090105 | 0.067120 | 0.036571 | ... | 0.144244 | 0.238333 | 0.068252 | 0.237680 | -0.082467 | 0.051147 | 0.016966 | 0.031354 | 0.218510 | 0.236453
    -0.032703 | -0.089246 | 0.156472 | 0.063347 | 0.055401 | -0.004898 | 0.051392 | 0.015272 | 0.030596 | 0.004461 | ... | 0.043299 | 0.052218 | -0.035347 | 0.064125 | -0.048398 | 0.012993 | 0.004915 | 0.039782 | 0.095891 | 0.088705
    -0.009928 | -0.052268 | 0.170847 | 0.055401 | 0.108775 | 0.011654 | 0.068900 | 0.038511 | 0.027025 | 0.025390 | ... | 0.050873 | 0.145137 | 0.076153 | 0.095925 | -0.005434 | 0.032286 | 0.037794 | 0.007141 | 0.065100 | 0.096941
    0.009853 | 0.020778 | -0.004327 | -0.004898 | 0.011654 | 0.007455 | 0.003526 | 0.007720 | 0.004208 | 0.006834 | ... | 0.007199 | 0.013899 | 0.038925 | 0.003175 | 0.009831 | 0.003068 | 0.005427 | -0.006667 | -0.010487 | -0.004679
    -0.016384 | -0.055927 | 0.156618 | 0.051392 | 0.068900 | 0.003526 | 0.053981 | 0.028695 | 0.026781 | 0.014705 | ... | 0.047741 | 0.085304 | 0.030648 | 0.074555 | -0.024080 | 0.019169 | 0.013931 | 0.015959 | 0.070154 | 0.079168
    0.005907 | 0.002663 | 0.090105 | 0.015272 | 0.038511 | 0.007720 | 0.028695 | 0.028618 | 0.010712 | 0.015772 | ... | 0.032128 | 0.063601 | 0.074239 | 0.048253 | 0.005605 | 0.012862 | 0.007793 | -0.015116 | 0.014521 | 0.028828
    -0.010904 | -0.028131 | 0.067120 | 0.030596 | 0.027025 | 0.004208 | 0.026781 | 0.010712 | 0.029829 | 0.004976 | ... | 0.035519 | 0.004307 | -0.003429 | 0.023013 | -0.032236 | 0.002741 | -0.003645 | 0.029937 | 0.050853 | 0.036014
    0.007083 | 0.009429 | 0.036571 | 0.004461 | 0.025390 | 0.006834 | 0.014705 | 0.015772 | 0.004976 | 0.010393 | ... | 0.015242 | 0.040526 | 0.051256 | 0.023947 | 0.009532 | 0.008378 | 0.008900 | -0.010890 | 0.000245 | 0.012209
    -0.018393 | -0.069133 | 0.227598 | 0.067385 | 0.096261 | 0.005353 | 0.075344 | 0.045440 | 0.032347 | 0.023110 | ... | 0.066856 | 0.131673 | 0.063500 | 0.110922 | -0.024627 | 0.028631 | 0.020088 | 0.009501 | 0.088525 | 0.107356
    0.057139 | 0.073907 | 0.274531 | 0.011136 | 0.172500 | 0.037764 | 0.093012 | 0.116761 | -0.011275 | 0.074317 | ... | 0.073633 | 0.353416 | 0.395560 | 0.193212 | 0.115323 | 0.069389 | 0.076442 | -0.136390 | -0.045501 | 0.078905
    0.038337 | 0.081170 | -0.103536 | -0.049892 | 0.013561 | 0.014638 | -0.021515 | 0.003763 | -0.032985 | 0.012921 | ... | -0.036439 | 0.059895 | 0.105703 | -0.015737 | 0.076885 | 0.011017 | 0.034312 | -0.057262 | -0.095340 | -0.047683
    -0.012954 | -0.042376 | 0.141476 | 0.050583 | 0.062633 | 0.008706 | 0.052814 | 0.029027 | 0.041668 | 0.015277 | ... | 0.060369 | 0.053314 | 0.033513 | 0.061228 | -0.036157 | 0.013331 | 0.004701 | 0.029514 | 0.075413 | 0.069192
    0.026521 | 0.067901 | -0.110517 | -0.045554 | -0.028227 | 0.006462 | -0.033031 | -0.007728 | -0.022021 | 0.000327 | ... | -0.030205 | -0.019917 | 0.041933 | -0.040456 | 0.041511 | -0.005503 | 0.003633 | -0.032209 | -0.072389 | -0.060127
    -0.024764 | -0.069288 | 0.136418 | 0.053271 | 0.050663 | -0.001666 | 0.045828 | 0.016635 | 0.028722 | 0.006287 | ... | 0.041557 | 0.047903 | -0.016156 | 0.056905 | -0.039587 | 0.011811 | 0.004537 | 0.032090 | 0.080098 | 0.074627
    -0.007158 | -0.009839 | -0.038815 | 0.005426 | -0.007343 | 0.001128 | -0.004438 | -0.014045 | 0.015434 | -0.006488 | ... | 0.001048 | -0.047118 | -0.046753 | -0.026473 | -0.022883 | -0.007873 | -0.005826 | 0.032945 | 0.017598 | -0.003002
    0.002297 | 0.009139 | -0.006640 | 0.004161 | 0.002845 | 0.006780 | 0.004207 | 0.002098 | 0.019252 | 0.002285 | ... | 0.017175 | -0.024676 | 0.005541 | -0.008593 | -0.014634 | -0.003872 | -0.006188 | 0.017960 | 0.012130 | -0.002494
    -0.000514 | -0.013022 | 0.057128 | 0.018267 | 0.044876 | 0.006702 | 0.025929 | 0.015512 | 0.010235 | 0.011358 | ... | 0.018793 | 0.059641 | 0.039099 | 0.035298 | 0.002611 | 0.013303 | 0.017474 | 0.000522 | 0.019212 | 0.034190
    0.006165 | 0.011249 | 0.055896 | 0.019584 | 0.035611 | 0.015727 | 0.028602 | 0.022910 | 0.037483 | 0.014709 | ... | 0.048679 | 0.011585 | 0.057865 | 0.022782 | -0.019161 | 0.004276 | -0.001639 | 0.018895 | 0.031898 | 0.020793
    0.056202 | 0.126471 | -0.078586 | -0.059277 | 0.006989 | 0.024380 | -0.017966 | 0.025391 | -0.020483 | 0.023254 | ... | -0.007343 | 0.050759 | 0.178932 | -0.007877 | 0.080281 | 0.008089 | 0.018929 | -0.073678 | -0.107334 | -0.066915
    -0.034814 | -0.055969 | -0.021147 | 0.033343 | -0.046674 | -0.008647 | -0.004908 | -0.030342 | 0.050138 | -0.023730 | ... | 0.026168 | -0.160807 | -0.159354 | -0.053312 | -0.096384 | -0.029575 | -0.047406 | 0.096377 | 0.086423 | 0.005683
    -0.007345 | -0.013142 | 0.012920 | 0.002835 | -0.018253 | -0.007195 | -0.004208 | -0.003650 | -0.002945 | -0.005997 | ... | -0.001627 | -0.020508 | -0.028693 | -0.001744 | -0.010866 | -0.004916 | -0.011852 | 0.002364 | 0.008777 | -0.000421
    -0.008924 | -0.031285 | 0.144244 | 0.043299 | 0.050873 | 0.007199 | 0.047741 | 0.032128 | 0.035519 | 0.015242 | ... | 0.059033 | 0.049980 | 0.045391 | 0.061687 | -0.030197 | 0.011589 | -0.001153 | 0.016897 | 0.064281 | 0.058934
    0.002112 | -0.041631 | 0.238333 | 0.052218 | 0.145137 | 0.013899 | 0.085304 | 0.063601 | 0.004307 | 0.040526 | ... | 0.049980 | 0.251829 | 0.164916 | 0.148482 | 0.037661 | 0.052183 | 0.061840 | -0.044085 | 0.039742 | 0.115733
    0.068362 | 0.134849 | 0.068252 | -0.035347 | 0.076153 | 0.038925 | 0.030648 | 0.074239 | -0.003429 | 0.051256 | ... | 0.045391 | 0.164916 | 0.311828 | 0.073794 | 0.094396 | 0.031281 | 0.035819 | -0.101553 | -0.087288 | -0.019001
    -0.016891 | -0.068001 | 0.237680 | 0.064125 | 0.095925 | 0.003175 | 0.074555 | 0.048253 | 0.023013 | 0.023947 | ... | 0.061687 | 0.148482 | 0.073794 | 0.118977 | -0.014769 | 0.031254 | 0.022565 | -0.004048 | 0.079883 | 0.107430
    0.034906 | 0.077892 | -0.082467 | -0.048398 | -0.005434 | 0.009831 | -0.024080 | 0.005605 | -0.032236 | 0.009532 | ... | -0.030197 | 0.037661 | 0.094396 | -0.014769 | 0.066013 | 0.005539 | 0.018751 | -0.058114 | -0.087998 | -0.051547
    -0.000746 | -0.011617 | 0.051147 | 0.012993 | 0.032286 | 0.003068 | 0.019169 | 0.012862 | 0.002741 | 0.008378 | ... | 0.011589 | 0.052183 | 0.031281 | 0.031254 | 0.005539 | 0.011027 | 0.013330 | -0.006052 | 0.011819 | 0.026687
    0.003678 | -0.003193 | 0.016966 | 0.004915 | 0.037794 | 0.005427 | 0.013931 | 0.007793 | -0.003645 | 0.008900 | ... | -0.001153 | 0.061840 | 0.035819 | 0.022565 | 0.018751 | 0.013330 | 0.024158 | -0.009907 | -0.004099 | 0.020094
    -0.031238 | -0.068701 | 0.031354 | 0.039782 | 0.007141 | -0.006667 | 0.015959 | -0.015116 | 0.029937 | -0.010890 | ... | 0.016897 | -0.044085 | -0.101553 | -0.004048 | -0.058114 | -0.006052 | -0.009907 | 0.062937 | 0.074137 | 0.040891
    -0.054474 | -0.139866 | 0.218510 | 0.095891 | 0.065100 | -0.010487 | 0.070154 | 0.014521 | 0.050853 | 0.000245 | ... | 0.064281 | 0.039742 | -0.087288 | 0.079883 | -0.087998 | 0.011819 | -0.004099 | 0.074137 | 0.152636 | 0.126022
    -0.041703 | -0.122891 | 0.236453 | 0.088705 | 0.096941 | -0.004679 | 0.079168 | 0.028828 | 0.036014 | 0.012209 | ... | 0.058934 | 0.115733 | -0.019001 | 0.107430 | -0.051547 | 0.026687 | 0.020094 | 0.040891 | 0.126022 | 0.133659
    

    Any idea why the CLA fails?

    bug 
    opened by anarchy89 15
  • Add constraint to reduce option strategy cost


    What are you trying to do? I am interested in a married put option strategy for my portfolio. However, I would like to keep the option premium at a minimum for my portfolio. How can I add a constraint so that the existing efficient frontier portfolio selection is also taking into account the corresponding put option costs for each ticker?

    For example, for my pre-selected strikes, I have downloaded the corresponding put option prices separately for each ticker into a put_option_prices pandas Series.

    Would something like this work? ef.add_constraint(lambda w: w @ put_option_prices <= 0.5)

    What data are you using? S&P 500 universe.

    Example:

    put_option_prices
    Out[1]: 
    AAPL    4.20
    GE      0.31
    MSFT    5.10
    dtype: float64
    
    question 
    opened by linda390 15
  • Efficient Semivariance


    Hi @robertmartin8, in this PR, the semivariance optimization mentioned in issue 202 is being implemented. It would be nice to get some early feedback on the overall implementation, while I am working on some remaining todos.

    Progress:

    • [x] Initial draft
    • [x] More convenience functions (min semivariance, efficient risk, ...)
    • [x] More tests
    • [x] Documentation
    opened by phschiele 15
  • The objective is not DCP.


    I've been using PyPortfolioOpt for a couple of days now and I'm finding it very interesting. However, it seems that for some combinations of assets in my dataset, the max_sharpe method will not work. The error comes from cvxpy:

    DCPError: Problem does not follow DCP rules. Specifically:
    The objective is not DCP. Its following subexpressions are not:
    QuadForm(var20890, [[1.28992571 0.05675357 0.23213086 0.04030597]
     [0.05675357 0.39647776 0.3670981  0.09351321]
     [0.23213086 0.3670981  0.37204403 0.19053016]
     [0.04030597 0.09351321 0.19053016 0.17553035]])
    

    Any ideas on why this error would occur?

    bug 
    opened by filipeteotonio 15
  • Black-Litterman model of expected returns


    I'm interested in implementing the Black-Litterman model for expected returns (equation (1) in here). The model has a lot of moving parts but I think that it can all be contained in a simple function such as black_litterman_return(). If it's useful/interesting/desired, I can make a PR.

    enhancement 
    opened by schneiderfelipe 15
  • Calculating CVaR - can't replicate the results I see in the portfolio_performance


    What are you trying to do? I am trying to understand the calculation of CVaR after optimization.

    What have you tried? Multiplying the optimal weights with the returns dataframe, sum(axis=1), to get "returns_distribution", and then:

    var = returns_distribution.quantile(0.05)
    cvar = returns_distribution[returns_distribution <= var].mean()

    Or, using a formula similar to the optimization code:

    beta = 0.95
    alpha = returns_distribution.quantile(1 - beta)
    cvar = alpha + 1 / (len(returns_distribution) * (1 - beta)) * np.sum((-returns_distribution + alpha).clip(lower=0))

    Whatever I do, I just can't get the same number I see in portfolio_performance. I get the same number when I calculate the expected return, but not the CVaR.

    What am I missing here?

    bug question 
    opened by OriKatz1 7
  • Feature request: Replace cvxpy by cvxpy-base


    Hi Robert, I can work on this feature during my holiday. Most professional shops (they all use your package) use professional solvers such as Mosek or Gurobi. The solvers that come with cvxpy are somewhat of a pain. See the discussion here: https://github.com/cvxpy/cvxpy/issues/1478 I will make "solvers" an extra in your pyproject.toml file, and they will only be installed if explicitly specified. Maybe @phschiele has a better idea? Best wishes for 2023 Thomas

    enhancement packaging 
    opened by tschm 4
  • Github CI does not run on MacOS and Windows


    Describe the bug: This is not a code issue, but a CI issue. I inserted the line below and got 'linux' from the pytest (macos-latest) and pytest (windows-latest) runs. This means all CI tests ran on Ubuntu.

    python -c "import sys; print(sys.platform)" 
    

    Collapse the 'Get full python version' tab and check line 8: https://github.com/yosukesan/PyPortfolioOpt/actions/runs/3775428942/jobs/6418111547 https://github.com/yosukesan/PyPortfolioOpt/actions/runs/3775428942/jobs/6418111498

    Expected behavior: windows-latest and macos-latest tests should run on the specified OS respectively.

    Code sample: I think the change below fixes this problem.

    - runs-on : ubuntu-latest
    + runs-on : ${{ matrix.os }}
    
    bug 
    opened by yosukesan 2
  • Remove hardcoded versions from dockerfile #507


    Host env

    $ grep VERSION /etc/os-release && uname -a && docker -v
    VERSION_ID="11"
    VERSION="11 (bullseye)"
    VERSION_CODENAME=bullseye
    Linux localhost 5.10.0-20-amd64 #1 SMP Debian 5.10.158-2 (2022-12-13) x86_64 GNU/Linux
    Docker version 20.10.22, build 3a2c30b
    

    Test command

    $ cat run_docker_test.sh 
    docker build -f docker/Dockerfile . -t pypfopt && \
    docker run --name tests_pypfopt -t pypfopt bash -c 'cat /etc/os-release && python -V && poetry run pytest'
    

    Test log

    $ bash run_docker_test.sh 
    Sending build context to Docker daemon  18.04MB
    Step 1/6 : FROM python:3.9-slim
    3.9-slim: Pulling from library/python
    3f4ca61aafcd: Pull complete 
    3f487a3359db: Pull complete 
    ae22731824be: Pull complete 
    3583ac268677: Pull complete 
    de224c04316a: Pull complete 
    Digest: sha256:9e0b4391fc41bc35c16caef4740736b6b349f6626fd14eba32793ae3c7b01908
    Status: Downloaded newer image for python:3.9-slim
     ---> e2f464551004
    Step 2/6 : WORKDIR pypfopt
     ---> Running in dd791859addc
    Removing intermediate container dd791859addc
     ---> 13e595ab1a9c
    Step 3/6 : COPY pyproject.toml poetry.lock ./
     ---> 83fb9a7c316d
    Step 4/6 : RUN pip install --upgrade pip     pip install poetry yfinance &&     poetry install -E optionals --no-root
     ---> Running in 000b65f1574a
    Requirement already satisfied: pip in /usr/local/lib/python3.9/site-packages (22.0.4)
    Collecting pip
      Downloading pip-22.3.1-py3-none-any.whl (2.1 MB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.1/2.1 MB 5.9 MB/s eta 0:00:00
    Collecting install
      Downloading install-1.3.5-py3-none-any.whl (3.2 kB)
    Collecting poetry
      Downloading poetry-1.3.1-py3-none-any.whl (218 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 218.9/218.9 KB 2.8 MB/s eta 0:00:00
    Collecting yfinance
      Downloading yfinance-0.2.3-py2.py3-none-any.whl (50 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 50.4/50.4 KB 591.1 kB/s eta 0:00:00
    Collecting keyring<24.0.0,>=23.9.0
      Downloading keyring-23.13.1-py3-none-any.whl (37 kB)
    Collecting packaging>=20.4
      Downloading packaging-22.0-py3-none-any.whl (42 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 42.6/42.6 KB 581.3 kB/s eta 0:00:00
    Collecting cleo<3.0.0,>=2.0.0
      Downloading cleo-2.0.1-py3-none-any.whl (77 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 77.3/77.3 KB 1.1 MB/s eta 0:00:00
    Collecting filelock<4.0.0,>=3.8.0
      Downloading filelock-3.8.2-py3-none-any.whl (10 kB)
    Collecting urllib3<2.0.0,>=1.26.0
      Downloading urllib3-1.26.13-py2.py3-none-any.whl (140 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 140.6/140.6 KB 2.2 MB/s eta 0:00:00
    Collecting cachecontrol[filecache]<0.13.0,>=0.12.9
      Downloading CacheControl-0.12.11-py2.py3-none-any.whl (21 kB)
    Collecting trove-classifiers>=2022.5.19
      Downloading trove_classifiers-2022.12.22-py3-none-any.whl (13 kB)
    Collecting tomli<3.0.0,>=2.0.1
      Downloading tomli-2.0.1-py3-none-any.whl (12 kB)
    Collecting dulwich<0.21.0,>=0.20.46
      Downloading dulwich-0.20.50-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (499 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 499.4/499.4 KB 4.1 MB/s eta 0:00:00
    Collecting poetry-plugin-export<2.0.0,>=1.2.0
      Downloading poetry_plugin_export-1.2.0-py3-none-any.whl (10 kB)
    Collecting virtualenv!=20.4.5,!=20.4.6,<21.0.0,>=20.4.3
      Downloading virtualenv-20.17.1-py3-none-any.whl (8.8 MB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.8/8.8 MB 7.0 MB/s eta 0:00:00
    Collecting requests-toolbelt<0.11.0,>=0.9.1
      Downloading requests_toolbelt-0.10.1-py2.py3-none-any.whl (54 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 54.5/54.5 KB 798.6 kB/s eta 0:00:00
    Collecting jsonschema<5.0.0,>=4.10.0
      Downloading jsonschema-4.17.3-py3-none-any.whl (90 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 90.4/90.4 KB 1.4 MB/s eta 0:00:00
    Collecting poetry-core==1.4.0
      Downloading poetry_core-1.4.0-py3-none-any.whl (546 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 546.4/546.4 KB 4.4 MB/s eta 0:00:00
    Collecting pkginfo<2.0,>=1.5
      Downloading pkginfo-1.9.2-py3-none-any.whl (26 kB)
    Collecting platformdirs<3.0.0,>=2.5.2
      Downloading platformdirs-2.6.0-py3-none-any.whl (14 kB)
    Collecting shellingham<2.0,>=1.5
      Downloading shellingham-1.5.0-py2.py3-none-any.whl (9.3 kB)
    Collecting html5lib<2.0,>=1.0
      Downloading html5lib-1.1-py2.py3-none-any.whl (112 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 112.2/112.2 KB 1.8 MB/s eta 0:00:00
    Collecting requests<3.0,>=2.18
      Downloading requests-2.28.1-py3-none-any.whl (62 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.8/62.8 KB 946.8 kB/s eta 0:00:00
    Collecting tomlkit!=0.11.2,!=0.11.3,<1.0.0,>=0.11.1
      Downloading tomlkit-0.11.6-py3-none-any.whl (35 kB)
    Collecting crashtest<0.5.0,>=0.4.1
      Downloading crashtest-0.4.1-py3-none-any.whl (7.6 kB)
    Collecting lockfile<0.13.0,>=0.12.2
      Downloading lockfile-0.12.2-py2.py3-none-any.whl (13 kB)
    Collecting pexpect<5.0.0,>=4.7.0
      Downloading pexpect-4.8.0-py2.py3-none-any.whl (59 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 59.0/59.0 KB 772.7 kB/s eta 0:00:00
    Collecting importlib-metadata<5.0,>=4.4
      Downloading importlib_metadata-4.13.0-py3-none-any.whl (23 kB)
    Collecting lxml>=4.9.1
      Downloading lxml-4.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (7.1 MB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.1/7.1 MB 7.5 MB/s eta 0:00:00
    Collecting multitasking>=0.0.7
      Downloading multitasking-0.0.11-py3-none-any.whl (8.5 kB)
    Collecting numpy>=1.16.5
      Downloading numpy-1.24.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.3 MB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 17.3/17.3 MB 6.7 MB/s eta 0:00:00
    Collecting appdirs>=1.4.4
      Downloading appdirs-1.4.4-py2.py3-none-any.whl (9.6 kB)
    Collecting beautifulsoup4>=4.11.1
      Downloading beautifulsoup4-4.11.1-py3-none-any.whl (128 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 128.2/128.2 KB 2.0 MB/s eta 0:00:00
    Collecting cryptography>=3.3.2
      Downloading cryptography-38.0.4-cp36-abi3-manylinux_2_28_x86_64.whl (4.2 MB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.2/4.2 MB 7.6 MB/s eta 0:00:00
    Collecting frozendict>=2.3.4
      Downloading frozendict-2.3.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (112 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 112.6/112.6 KB 1.8 MB/s eta 0:00:00
    Collecting pandas>=1.3.0
      Downloading pandas-1.5.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.2 MB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 12.2/12.2 MB 7.2 MB/s eta 0:00:00
    Collecting pytz>=2022.5
      Downloading pytz-2022.7-py2.py3-none-any.whl (499 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 499.4/499.4 KB 4.3 MB/s eta 0:00:00
    Collecting soupsieve>1.2
      Downloading soupsieve-2.3.2.post1-py3-none-any.whl (37 kB)
    Collecting msgpack>=0.5.2
      Downloading msgpack-1.0.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (322 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 322.4/322.4 KB 3.5 MB/s eta 0:00:00
    Collecting rapidfuzz<3.0.0,>=2.2.0
      Downloading rapidfuzz-2.13.7-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.2 MB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.2/2.2 MB 6.8 MB/s eta 0:00:00
    Collecting cffi>=1.12
      Downloading cffi-1.15.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (441 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 441.2/441.2 KB 4.1 MB/s eta 0:00:00
    Collecting six>=1.9
      Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
    Collecting webencodings
      Downloading webencodings-0.5.1-py2.py3-none-any.whl (11 kB)
    Collecting zipp>=0.5
      Downloading zipp-3.11.0-py3-none-any.whl (6.6 kB)
    Collecting pyrsistent!=0.17.0,!=0.17.1,!=0.17.2,>=0.14.0
      Downloading pyrsistent-0.19.2-py3-none-any.whl (57 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.5/57.5 KB 882.0 kB/s eta 0:00:00
    Collecting attrs>=17.4.0
      Downloading attrs-22.2.0-py3-none-any.whl (60 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 60.0/60.0 KB 880.8 kB/s eta 0:00:00
    Collecting SecretStorage>=3.2
      Downloading SecretStorage-3.3.3-py3-none-any.whl (15 kB)
    Collecting jaraco.classes
      Downloading jaraco.classes-3.2.3-py3-none-any.whl (6.0 kB)
    Collecting jeepney>=0.4.2
      Downloading jeepney-0.8.0-py3-none-any.whl (48 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 48.4/48.4 KB 662.4 kB/s eta 0:00:00
    Collecting python-dateutil>=2.8.1
      Downloading python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 247.7/247.7 KB 3.2 MB/s eta 0:00:00
    Collecting ptyprocess>=0.5
      Downloading ptyprocess-0.7.0-py2.py3-none-any.whl (13 kB)
    Collecting idna<4,>=2.5
      Downloading idna-3.4-py3-none-any.whl (61 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.5/61.5 KB 789.2 kB/s eta 0:00:00
    Collecting charset-normalizer<3,>=2
      Downloading charset_normalizer-2.1.1-py3-none-any.whl (39 kB)
    Collecting certifi>=2017.4.17
      Downloading certifi-2022.12.7-py3-none-any.whl (155 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 155.3/155.3 KB 2.5 MB/s eta 0:00:00
    Collecting distlib<1,>=0.3.6
      Downloading distlib-0.3.6-py2.py3-none-any.whl (468 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 468.5/468.5 KB 4.5 MB/s eta 0:00:00
    Collecting pycparser
      Downloading pycparser-2.21-py2.py3-none-any.whl (118 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 118.7/118.7 KB 1.7 MB/s eta 0:00:00
    Collecting more-itertools
      Downloading more_itertools-9.0.0-py3-none-any.whl (52 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 52.8/52.8 KB 603.7 kB/s eta 0:00:00
    Installing collected packages: webencodings, trove-classifiers, pytz, ptyprocess, multitasking, msgpack, lockfile, distlib, appdirs, zipp, urllib3, tomlkit, tomli, soupsieve, six, shellingham, rapidfuzz, pyrsistent, pycparser, poetry-core, platformdirs, pkginfo, pip, pexpect, packaging, numpy, more-itertools, lxml, jeepney, install, idna, frozendict, filelock, crashtest, charset-normalizer, certifi, attrs, virtualenv, requests, python-dateutil, jsonschema, jaraco.classes, importlib-metadata, html5lib, dulwich, cleo, cffi, beautifulsoup4, requests-toolbelt, pandas, cryptography, cachecontrol, yfinance, SecretStorage, keyring, poetry-plugin-export, poetry
      Attempting uninstall: pip
        Found existing installation: pip 22.0.4
        Uninstalling pip-22.0.4:
          Successfully uninstalled pip-22.0.4
    Successfully installed SecretStorage-3.3.3 appdirs-1.4.4 attrs-22.2.0 beautifulsoup4-4.11.1 cachecontrol-0.12.11 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-2.1.1 cleo-2.0.1 crashtest-0.4.1 cryptography-38.0.4 distlib-0.3.6 dulwich-0.20.50 filelock-3.8.2 frozendict-2.3.4 html5lib-1.1 idna-3.4 importlib-metadata-4.13.0 install-1.3.5 jaraco.classes-3.2.3 jeepney-0.8.0 jsonschema-4.17.3 keyring-23.13.1 lockfile-0.12.2 lxml-4.9.2 more-itertools-9.0.0 msgpack-1.0.4 multitasking-0.0.11 numpy-1.24.0 packaging-22.0 pandas-1.5.2 pexpect-4.8.0 pip-22.3.1 pkginfo-1.9.2 platformdirs-2.6.0 poetry-1.3.1 poetry-core-1.4.0 poetry-plugin-export-1.2.0 ptyprocess-0.7.0 pycparser-2.21 pyrsistent-0.19.2 python-dateutil-2.8.2 pytz-2022.7 rapidfuzz-2.13.7 requests-2.28.1 requests-toolbelt-0.10.1 shellingham-1.5.0 six-1.16.0 soupsieve-2.3.2.post1 tomli-2.0.1 tomlkit-0.11.6 trove-classifiers-2022.12.22 urllib3-1.26.13 virtualenv-20.17.1 webencodings-0.5.1 yfinance-0.2.3 zipp-3.11.0
    WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
    Creating virtualenv pyportfolioopt-SZ_KFjxI-py3.9 in /root/.cache/pypoetry/virtualenvs
    Installing dependencies from lock file
    
    Package operations: 115 installs, 0 updates, 0 removals
    
      • Installing attrs (22.1.0)
      • Installing platformdirs (2.5.4)
      • Installing pyrsistent (0.19.2)
      • Installing six (1.16.0)
      • Installing traitlets (5.5.0)
      • Installing entrypoints (0.4)
      • Installing fastjsonschema (2.16.2)
      • Installing jsonschema (4.17.1)
      • Installing jupyter-core (5.0.0)
      • Installing nest-asyncio (1.5.6)
      • Installing pycparser (2.21)
      • Installing python-dateutil (2.8.2)
      • Installing pyzmq (24.0.1)
      • Installing tornado (6.2)
      • Installing cffi (1.15.1)
      • Installing jupyter-client (7.4.7)
      • Installing markupsafe (2.1.1)
      • Installing nbformat (5.7.0)
      • Installing pyparsing (3.0.9)
      • Installing webencodings (0.5.1)
      • Installing zipp (3.11.0)
      • Installing soupsieve (2.3.2.post1)
      • Installing argon2-cffi-bindings (21.2.0)
      • Installing asttokens (2.1.0)
      • Installing beautifulsoup4 (4.11.1)
      • Installing bleach (5.0.1)
      • Installing executing (1.2.0)
      • Installing idna (3.4)
      • Installing defusedxml (0.7.1)
      • Installing importlib-metadata (5.1.0)
      • Installing jinja2 (3.1.2)
      • Installing jupyterlab-pygments (0.2.2)
      • Installing mistune (2.0.4)
      • Installing nbclient (0.7.0)
      • Installing packaging (21.3)
      • Installing pandocfilters (1.5.0)
      • Installing parso (0.8.3)
      • Installing ptyprocess (0.7.0)
      • Installing pure-eval (0.2.2)
      • Installing pygments (2.13.0)
      • Installing sniffio (1.3.0)
      • Installing tinycss2 (1.2.1)
      • Installing wcwidth (0.2.5)
      • Installing anyio (3.6.2)
      • Installing argon2-cffi (21.3.0)
      • Installing backcall (0.2.0)
      • Installing decorator (5.1.1)
      • Installing jedi (0.18.2)
      • Installing matplotlib-inline (0.1.6)
      • Installing pexpect (4.8.0)
      • Installing nbconvert (7.2.5)
      • Installing pickleshare (0.7.5)
      • Installing prometheus-client (0.15.0)
      • Installing prompt-toolkit (3.0.33)
      • Installing send2trash (1.8.0)
      • Installing stack-data (0.6.1)
      • Installing terminado (0.17.0)
      • Installing websocket-client (1.4.2)
      • Installing comm (0.1.1)
      • Installing debugpy (1.6.3)
      • Installing ipython (8.6.0)
      • Installing jupyter-server (1.23.3)
      • Installing numpy (1.23.5)
      • Installing psutil (5.9.4)
      • Installing certifi (2022.9.24)
      • Installing charset-normalizer (2.1.1)
      • Installing ipykernel (6.18.0)
      • Installing ipython-genutils (0.2.0)
      • Installing notebook-shim (0.2.2)
      • Installing pytz (2022.6)
      • Installing scipy (1.9.3)
      • Installing urllib3 (1.26.13)
      • Installing babel (2.11.0)
      • Installing exceptiongroup (1.0.4)
      • Installing iniconfig (1.1.1)
      • Installing json5 (0.9.10)
      • Installing nbclassic (0.4.8)
      • Installing qdldl (0.1.5.post2)
      • Installing requests (2.28.1)
      • Installing pluggy (1.0.0)
      • Installing tomli (2.0.1)
      • Installing typing-extensions (4.4.0)
      • Installing appdirs (1.4.4)
      • Installing click (8.1.3)
      • Installing contourpy (1.0.6)
      • Installing coverage (6.5.0)
      • Installing ecos (2.0.10)
      • Installing joblib (1.2.0)
      • Installing fonttools (4.38.0)
      • Installing cycler (0.11.0)
      • Installing jupyterlab-server (2.16.3)
      • Installing kiwisolver (1.4.4)
      • Installing lxml (4.9.1)
      • Installing mccabe (0.6.1)
      • Installing multitasking (0.0.11)
      • Installing mypy-extensions (0.4.3)
      • Installing notebook (6.5.2)
      • Installing osqp (0.6.2.post8)
      • Installing pandas (1.5.2)
      • Installing pathspec (0.10.2)
      • Installing pillow (9.3.0)
      • Installing pycodestyle (2.8.0)
      • Installing pyflakes (2.4.0)
      • Installing pytest (7.2.0)
      • Installing scs (3.2.2)
      • Installing setuptools-scm (7.0.5)
      • Installing threadpoolctl (3.1.0)
      • Installing black (22.10.0)
      • Installing cvxpy (1.2.2)
      • Installing flake8 (4.0.1)
      • Installing jupyterlab (3.5.0)
      • Installing matplotlib (3.6.2)
      • Installing pytest-cov (3.0.0)
      • Installing yfinance (0.1.87)
      • Installing scikit-learn (1.1.3)
    Warning: The file chosen for install of ipykernel 6.18.0 (ipykernel-6.18.0-py3-none-any.whl) is yanked. Reason for being yanked: Breaking change in Comm
    Removing intermediate container 000b65f1574a
     ---> f6b7423182ab
    Step 5/6 : COPY . .
     ---> cd78218c00d6
    Step 6/6 : RUN cd cookbook
     ---> Running in c17bbd72084c
    Removing intermediate container c17bbd72084c
     ---> 25085e46ebea
    Successfully built 25085e46ebea
    Successfully tagged pypfopt:latest
    PRETTY_NAME="Debian GNU/Linux 11 (bullseye)"
    NAME="Debian GNU/Linux"
    VERSION_ID="11"
    VERSION="11 (bullseye)"
    VERSION_CODENAME=bullseye
    ID=debian
    HOME_URL="https://www.debian.org/"
    SUPPORT_URL="https://www.debian.org/support"
    BUG_REPORT_URL="https://bugs.debian.org/"
    Python 3.9.16
    ========================================================================== test session starts ==========================================================================
    platform linux -- Python 3.9.16, pytest-7.2.0, pluggy-1.0.0
    rootdir: /pypfopt
    plugins: anyio-3.6.2, cov-3.0.0
    collected 312 items                                                                                                                                                     
    
    tests/test_base_optimizer.py ...................                                                                                                                  [  6%]
    tests/test_black_litterman.py .....................                                                                                                               [ 12%]
    tests/test_cla.py ............                                                                                                                                    [ 16%]
    tests/test_custom_objectives.py ..................                                                                                                                [ 22%]
    tests/test_discrete_allocation.py .................                                                                                                               [ 27%]
    tests/test_efficient_cdar.py ....................                                                                                                                 [ 34%]
    tests/test_efficient_cvar.py .....................                                                                                                                [ 41%]
    tests/test_efficient_frontier.py ..........................................................................                                                       [ 64%]
    tests/test_efficient_semivariance.py ............................                                                                                                 [ 73%]
    tests/test_expected_returns.py ...................                                                                                                                [ 79%]
    tests/test_hrp.py ......                                                                                                                                          [ 81%]
    tests/test_imports.py ...                                                                                                                                         [ 82%]
    tests/test_objective_functions.py ...........                                                                                                                     [ 86%]
    tests/test_plotting.py .................                                                                                                                          [ 91%]
    tests/test_risk_models.py ..........................                                                                                                              [100%]
    
    =========================================================================== warnings summary ============================================================================
    pypfopt/plotting.py:21
      /pypfopt/pypfopt/plotting.py:21: MatplotlibDeprecationWarning: The seaborn styles shipped by Matplotlib are deprecated since 3.6, as they no longer correspond to the styles shipped by seaborn. However, they will remain available as 'seaborn-v0_8-<style>'. Alternatively, directly use the seaborn API instead.
        plt.style.use("seaborn-deep")
    
    tests/test_base_optimizer.py::test_exception_immutability
      /pypfopt/pypfopt/efficient_frontier/efficient_frontier.py:175: RuntimeWarning: Market neutrality requires shorting - bounds have been amended
        warnings.warn(
    
    tests/test_black_litterman.py: 12 warnings
      /pypfopt/pypfopt/black_litterman.py:257: UserWarning: Running Black-Litterman with no prior.
        warnings.warn("Running Black-Litterman with no prior.")
    
    tests/test_discrete_allocation.py: 9 warnings
    tests/test_efficient_cvar.py: 1 warning
    tests/test_efficient_semivariance.py: 2 warnings
    tests/test_plotting.py: 4 warnings
      /root/.cache/pypoetry/virtualenvs/pyportfolioopt-SZ_KFjxI-py3.9/lib/python3.9/site-packages/cvxpy/problems/problem.py:1337: UserWarning: Solution may be inaccurate. Try another solver, adjusting the solver settings, or solve with verbose=True for more information.
        warnings.warn(
    
    tests/test_efficient_frontier.py::test_min_volatility_sector_constraints
      /pypfopt/pypfopt/base_optimizer.py:397: UserWarning: Sector constraints may not produce reasonable results if shorts are allowed.
        warnings.warn(
    
    tests/test_efficient_frontier.py::test_max_sharpe_L2_reg_different_gamma
    tests/test_efficient_frontier.py::test_max_sharpe_L2_reg_different_gamma
    tests/test_efficient_frontier.py::test_max_sharpe_L2_reg_reduces_sharpe
    tests/test_efficient_frontier.py::test_max_sharpe_L2_reg_with_shorts
      /pypfopt/pypfopt/efficient_frontier/efficient_frontier.py:262: UserWarning: max_sharpe transforms the optimization problem so additional objectives may not work as expected.
        warnings.warn(
    
    -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
    ============================================================== 312 passed, 35 warnings in 87.43s (0:01:27) ==============================================================
    
    opened by yosukesan 0
  • Docker build failed because obsolete pip and poetry versions were hard-coded in docker/Dockerfile

    Docker build failed because obsolete pip and poetry versions were hard-coded in docker/Dockerfile

    Describe the bug Follow-up to issue https://github.com/robertmartin8/PyPortfolioOpt/issues/476 and PR https://github.com/robertmartin8/PyPortfolioOpt/pull/498. Issue https://github.com/robertmartin8/PyPortfolioOpt/issues/476 was fixed by PR https://github.com/robertmartin8/PyPortfolioOpt/pull/498, but we found that the pip and poetry versions were hard-coded in docker/Dockerfile. They were too old to resolve the dependencies, so the setup (download) step failed and the Docker build terminated with an error. See the pull request comment for the error details.

    Expected behavior After removing the hard-coded versions, all unit tests terminate successfully. See https://github.com/robertmartin8/PyPortfolioOpt/pull/498.

    Fix plan

    1. Remove the hard-coded versions. pyproject.toml and poetry.lock are used by Docker during the build; the pip version was specified (>=18.1) in pyproject.toml, but poetry's was left unspecified.

    2. Update the Debian container image. Buster will be unsupported in 2023, so the image is updated to 'latest', which currently points to the stable release, Bullseye. A hedged sketch of the amended Dockerfile is shown below.
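
    For illustration, here is a minimal sketch of what an unpinned docker/Dockerfile along these lines might look like. The base image tag, paths, and poetry invocation are assumptions for the sketch and may differ from the actual file in the repository:

    # Hypothetical sketch only -- not the repository's actual docker/Dockerfile
    # Bullseye-based Python image instead of the end-of-life Buster one (assumed tag)
    FROM python:3.9-bullseye

    WORKDIR /pypfopt

    # Install current pip and poetry rather than pinning obsolete versions
    RUN pip install --upgrade pip && pip install poetry

    # Let pyproject.toml and poetry.lock drive dependency resolution
    COPY pyproject.toml poetry.lock ./
    RUN poetry install

    COPY . .

    With the versions unpinned, rebuilding the image should pull in current pip and poetry releases, so the dependency download no longer fails on versions that are too old.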

    bug 
    opened by yosukesan 2
Releases (v1.4.1)
Owner
Robert Martin
Astrophysics at the University of Cambridge. Python <3