The purpose of this project is to share knowledge on how awesome Streamlit is and can be

Overview

Awesome Streamlit

The fastest way to build Awesome Tools and Apps! Powered by Python!

The purpose of this project is to share knowledge on how awesome Streamlit is and can become. Pull requests are very welcome!

Streamlit was only just announced (Oct 2019), but I see its potential to become the iPhone of Data Science Apps. Maybe it can even become the iPhone of Technical Writing, Code, Micro Apps and Python.

This project provides

  • A curated list of Awesome Streamlit resources. See below.
  • An awesome Streamlit application with a gallery of Awesome Streamlit Apps.
    • Feel free to add your awesome app to the gallery via a Pull request. It's easy (see below).
  • A vision on how awesome Streamlit is and can become.
  • A best practices example and starter template of an awesome, multipage app with an automated CI/CD pipeline, deployed to the cloud and running in a Docker container.

Visit the app at awesome-streamlit.org!

Awesome Streamlit Org Animation

The Magic of Streamlit

The only way to truly understand how magical Streamlit is, is to play around with it. But if you need to be convinced first, here is the 4-minute introduction to Streamlit!

Afterwards you can go to the Streamlit docs to get started. You might also visit Awesome Streamlit docs.
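
If you want a taste of how little code an app takes before watching, below is a minimal, self-contained sketch (not part of this repository) that you could save as hello.py and start with streamlit run hello.py.

"""Minimal Streamlit example: a slider driving a line chart."""
import numpy as np
import pandas as pd
import streamlit as st

st.title("Hello Streamlit")

# A widget: Streamlit reruns the whole script on every interaction.
points = st.slider("Number of points", min_value=10, max_value=1000, value=100)

# Some data to plot.
data = pd.DataFrame({
    "x": np.arange(points),
    "y": np.random.randn(points).cumsum(),
})

st.line_chart(data.set_index("x"))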

Introduction to Streamlit

Awesome Resources

A curated list of awesome Streamlit resources, inspired by awesome-python and awesome-pandas.

The resources are organized into the following categories:

  • Alternative
  • App
  • Article
  • Awesome-Streamlit.org
  • Code
  • Guide
  • Sister Sites
  • Social
  • Streamlit.io
  • Technical
  • Tutorial

Governance

This repo is maintained by me :-)

I'm Marc Skov Madsen, PhD, CFA®, Lead Data Scientist Developer at Ørsted.

You can learn more about me at datamodelsanalytics.com

I try my best to govern and maintain this project in the spirit of the Zen of Python.

But I'm not an experienced open source maintainer, so helpful suggestions are appreciated.

Thanks

Contribute

GitHub Issues and Pull requests are very welcome!

If you believe Awesome Streamlit is awesome and would like to join as a Core Developer feel free to reach out via datamodelsanalytics.com

How to contribute awesome links

The best way to contribute an awesome link is via a Pull request.

In the pull request you should describe why your contribution is awesome and should be included.

Thanks.

How to contribute awesome apps

The best way to contribute an awesome app is via a Pull request.

In the pull request you should

  • describe why your contribution is awesome and should be included.
  • create a new folder gallery/<your_app_name> and an app file gallery/<your_app_name>/<your_app_name>.py.
  • add your app code conforming to the template below
"""
## APP NAME

DESCRIPTION

Author: [YOUR NAME](https://URL_TO_YOU)\n
Source: [Github](https://github.com/URL_TO_CODE)
"""
import streamlit as st

# Your imports go below

def main():
    st.title("APP NAME")
    st.markdown("DESCRIPTION")

    # Your code goes below

if __name__ == "__main__":
    main()
  • Please note that Streamlit magic does not work in sub pages, so don't use magic (see the sketch after this list).
  • add your_app_name to the list of apps in the gallery.
  • update the requirements_base.txt file. Please specify the required versions.
  • Run the automated tests using invoke test.all and fix all errors from your app
  • Run the full app via streamlit run app.py and manually test your contribution.
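
The note above about magic deserves a tiny illustration. The sketch below is hypothetical (the names are made up) and simply shows the explicit st.* calls to use in a gallery page instead of relying on magic, which only works in the main script.

"""Hypothetical gallery page using explicit calls instead of Streamlit magic."""
import streamlit as st


def main():
    st.title("MY APP")
    # Don't rely on magic such as a bare string or expression on its own line;
    # in a sub page it will not be rendered. Call the API explicitly instead:
    st.markdown("## A section header")
    st.write("The answer is", 42)


if __name__ == "__main__":
    main()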

Please note that your app should not require high compute power as we are running on one of the cheapest tiers available on Azure.

Feel free to reach out if you have comments, questions or need help.

Thanks.

How to contribute to the Streamlit Community

Please sign up for and participate in the community at discuss.streamlit.io.

How to contribute to the Streamlit Package

Please contribute to improving the Streamlit package on GitHub at streamlit/streamlit.

How to contribute to Streamlit.io

Streamlit.io is in the position of trying to balance building an awesome, successful business with providing an awesome product to the open source community.

If you are part of a team, please consider signing up for the beta of Streamlit for Teams.

How to sponsor the Awesome Streamlit project

If you would like to sponsor my time or the infrastructure the platform is running on, feel free to reach out via datamodelsanalytics.com.

You can also show your appreciation for the work I have already done:

Buy me a coffee

Thanks

Marc

LICENSE

Attribution-ShareAlike 4.0 International

Getting Started with the Awesome Streamlit Repository

Prerequisites

  • An operating system like Windows, macOS or Linux
  • A working Python installation.
    • We recommend using 64-bit Python 3.7.4.
  • A shell
    • We recommend Git Bash for Windows 8.1
    • We recommend WSL for Windows 10
  • An editor
  • The Git CLI

Installation

Clone the repo

git clone https://github.com/MarcSkovMadsen/awesome-streamlit.git

cd into the project root folder

cd awesome-streamlit

Create virtual environment

via python

Then you should create a virtual environment named .venv

python -m venv .venv

and activate the environment.

On Linux or macOS it's

source .venv/bin/activate

and in a Windows Git Bash terminal it's

source .venv/Scripts/activate

In a Windows command prompt it's

.venv\Scripts\activate.bat

or via anaconda

Create virtual environment named awesome-streamlit

conda create -n awesome-streamlit python=3.7.4

and activate environment.

activate awesome-streamlit

If you are on Windows, you need to install some things required by GeoPandas by following these instructions.

Then you should install the local requirements

pip install -r requirements_local.txt

Finally, you need to download some spaCy language models

python -m spacy download en_core_web_sm
python -m spacy download en_core_web_md
python -m spacy download de_core_news_sm

Build and run the Application Locally

streamlit run app.py

or as a Docker container via

invoke docker.build --rebuild
invoke docker.run-server

Run the Application using the image on Dockerhub

If you don't want to clone the repo and build the Docker container, you can just use docker run to run the image from Dockerhub

To run bash interactively

docker run -it -p 80:80 --entrypoint "/bin/bash" marcskovmadsen/awesome-streamlit:latest

To run the Streamlit app interactively on port 80

docker run -it -p 80:80 --entrypoint "streamlit" marcskovmadsen/awesome-streamlit:latest run app.py

Code quality and Tests

We use

  • isort for sorting import statements
  • autoflake to remove unused imports and unused variables
  • black the opinionated code formatter
  • pylint for static analysis
  • mypy for static type checking
  • pytest for everything from unit to functional tests

to ensure a high quality of our code and application.

You can run all tests using

invoke test.all

Streamlit Tests

I've created a first version of an awesome Streamlit test runner. You can run it via

streamlit run test_runner_app.py

or in Docker

docker run -it -p 80:80 --entrypoint "streamlit" marcskovmadsen/awesome-streamlit:latest run test_runner_app.py

Awesome Streamlit Test Runner

Workflow

We use the power of Invoke to semi-automate the local workflow. You can see the list of available commands using

$ invoke --list
Available tasks:

  docker.build                            Build Docker image
  docker.push                             Push the Docker container
  docker.run                              Run the Docker container interactively.
  docker.run-server                       Run the Docker container interactively
  docker.system-prune                     The docker system prune command will free up space
  test.all (test.pre-commit, test.test)   Runs isort, autoflake, black, pylint, mypy and pytest
  test.autoflake                          Runs autoflake to remove unused imports on all .py files recursively
  test.bandit                             Runs Bandit the security linter from PyCQA.
  test.black                              Runs black (autoformatter) on all .py files recursively
  test.isort                              Runs isort (import sorter) on all .py files recursively
  test.mypy                               Runs mypy (static type checker) on all .py files recursively
  test.pylint                             Runs pylint (linter) on all .py files recursively to identify coding errors
  test.pytest                             Runs pytest to identify failing tests
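
For orientation, here is a minimal, hypothetical sketch of how a few of these tasks could be wired up with Invoke. The commands and structure are illustrative assumptions, not the repository's actual tasks.py.

"""Illustrative tasks.py sketch using Invoke (not the repo's actual file)."""
from invoke import Collection, task


@task
def isort(command):
    """Sort import statements in all .py files."""
    command.run("isort .")


@task
def black(command):
    """Run the black autoformatter on all .py files."""
    command.run("black .")


@task(pre=[isort, black])
def all(command):  # pylint: disable=redefined-builtin
    """Run the whole local quality pipeline."""
    command.run("pylint src")
    command.run("pytest test")


# Expose the tasks under the `test` namespace, e.g. `invoke test.all`.
namespace = Collection()
namespace.add_collection(Collection("test", isort, black, all))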

Configuration

You can configure the app in the config.py file.

Please note that Streamlit has its own config files in the ~/.streamlit folder.

CI/CD and Hosting

The application is

  • built as a Docker image and tested via Azure Pipelines builds
    • You find the Dockerfiles here and the Azure Pipelines yml files here
  • released to Dockerhub via Azure Pipelines
  • deployed via Azure Pipelines to a Web App for Containers service on Azure on the cheapest non-free pricing tier

The Awesome-Streamlit Package

You can build the package using

cd package
python setup.py sdist bdist_wheel

If you want to publish the package to PyPI you should first

update the version number in the setup.py file. The format is YYYYmmdd.version. For example 20191014.2

Then you run

twine upload dist/awesome-streamlit-YYYYmmdd.version.tar.gz -u <the-pypi-username> -p <the-pypi-password>

For more info see the package README.md

Project Layout

The basic layout of an application is as simple as

.
└── app.py

As our application grows, we would refactor our app.py file into multiple folders and files.

  • assets - Here we keep our CSS and image assets.
  • models - Defines the layout of our data in the form of
    • Classes: Name, attribute names, types
    • DataFrame Schemas: column and index names, dtypes
    • SQLAlchemy Tables: columns names, types
  • pages - Defines the different pages of the Streamlit app
  • services - Organizes and shares business logic, models, data and functions with different pages of the Streamlit App.
    • Database interactions: Select, Insert, Update, Delete
    • REST API interactions, get, post, put, delete
    • Pandas transformations

and end up with a project structure like

.
├── app.py
└── src
    ├── assets
    |   ├── css
    |   |   ├── app.css
    |   |   ├── component1.css
    |   |   ├── component2.css
    |   |   ├── page1.css
    |   |   └── page2.css
    |   └── images
    |       ├── image1.png
    |       └── image2.png
    ├── core
    |   └── services
    |       ├── service1.py
    |       └── service2.py
    ├── pages
    |   └── pages
    |       ├── page1.py
    |       └── page2.py
    └── shared
        ├── models
        |   ├── model1.py
        |   └── model2.py
        └── components
            ├── component1.py
            └── component2.py

Further refactoring is guided by this blog post and the Angular Style Guide.
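
To make the pages/services split concrete, here is a minimal, hypothetical sketch of how app.py can dispatch to page modules. The module names and the write() convention are illustrative assumptions and may not match the repository's actual code.

"""Illustrative multipage dispatcher for app.py (names are hypothetical)."""
import streamlit as st

from src.pages import page1, page2  # each page module exposes a write() function

PAGES = {
    "Page 1": page1,
    "Page 2": page2,
}


def main():
    """Render the sidebar navigation and the selected page."""
    selection = st.sidebar.radio("Go to", list(PAGES.keys()))
    page = PAGES[selection]
    page.write()  # each page draws itself with its own st.* calls


if __name__ == "__main__":
    main()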

We place our tests in a test folder in the root folder, organized into folders that mirror the app folders, with file names prefixed by test_.

.
└── test
    ├── test_app.py
    ├── core
    |   └── services
    |       ├── test_service1.py
    |       └── test_service2.py
    ├── pages
    |   └── pages
    |       ├── page1
    |       |   └── test_page1.py
    |       └── page2
    └── shared
        ├── models
        |   ├── test_model1.py
        |   └── test_model2.py
        └── components
            ├── test_component1.py
            └── test_component2.py
Comments
  • Add yahooquery streamlit tutorial

    The streamlit app I am adding is a demo to a python package I wrote called yahooquery. It uses an unofficial Yahoo Finance API to grab pretty much any of the data you can see on Yahoo Finance. I have a demo of the app on heroku, which you can view at yahooquery-streamlit.herokuapp.com. I realize you already have a yahoo finance app, so if this would be too much of a duplicate, I understand.

    Thanks for making this awesome collection of streamlit apps / resources; it truly is an incredible tool.

    opened by dpguthrie 19
  • Question: Why is importlib necessary?

    I was looking at this line and was confused why you had to do that.

    I feel like whatever you're trying to accomplish is something which Streamlit should fix automatically. Could you please share some insight?

    Thanks! Adrien

    opened by treuille 19
  • Gallery apps not working fully

    First off, thanks for the great resource!

    I tried containerizing a simple demo myself, and can't get it working on a CentOS (though it works on Windows); however, your image works 👍

But there is an issue with the gallery piece (only the spacy demo is working), as shown below:

    streamlit_error

    Exact error message is:

    URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1076)>
    File "/usr/local/lib/python3.7/site-packages/streamlit/ScriptRunner.py", line 306, in _run_script exec(code, module.__dict__)
    File "/app/app.py", line 40, in <module> src.st_extensions.write_page(page)
    File "/app/src/st_extensions.py", line 32, in write_page page.write()
    File "/app/src/pages/gallery/index.py", line 51, in write python_code = get_file_content_as_string(run_app.url)
    File "/usr/local/lib/python3.7/site-packages/streamlit/caching.py", line 544, in wrapped_func return get_or_set_cache()
    File "/usr/local/lib/python3.7/site-packages/streamlit/caching.py", line 526, in get_or_set_cache return_value = func(*args, **kwargs)
    File "/app/src/pages/gallery/index.py", line 122, in get_file_content_as_string data = urllib.request.urlopen(url).read()
    File "/usr/local/lib/python3.7/urllib/request.py", line 222, in urlopen return opener.open(url, data, timeout)
    File "/usr/local/lib/python3.7/urllib/request.py", line 525, in open response = self._open(req, data)
    File "/usr/local/lib/python3.7/urllib/request.py", line 543, in _open '_open', req)
    File "/usr/local/lib/python3.7/urllib/request.py", line 503, in _call_chain result = func(*args)
    File "/usr/local/lib/python3.7/urllib/request.py", line 1360, in https_open context=self._context, check_hostname=self._check_hostname)
    File "/usr/local/lib/python3.7/urllib/request.py", line 1319, in do_open raise URLError(err)
    
    opened by leungi 5
  • really slow/non working website

    Hi!

I just found out about Streamlit and this project, and I'm evaluating using it for my next project.

Before diving in, I was browsing projects made with Streamlit, starting from awesome-streamlit.org.

    Sadly, I encounter many problems in doing that:

    • The first time I opened the link, nothing showed (and no loading sign) - but that could be a temporary problem in my network
    • Then, I could enter the website, but it took a very long time to pass the "Please wait..." message;
    • After that, the site responded for some clicks, but gave up when I tried to filter the list;
    • Refreshing the page a few times brought me back to the "please wait..." message, but nothing else came up; in the Chrome devtools I saw an error saying the websocket connection was closed before it was established.

    Not a great advertisement of the product, I must say!

    ~Is this a problem with streamlit or is it that the hosting is crippled and doesn't handle the traffic well?~ EDIT: Just tried cloning the repo and running locally, it works fine! so it's something related to the network (not on my side, I tried with both ADSL and LTE connections, same result) or hosting

    opened by sanzoghenzo 3
  • First bokeh demo is not displaying

    Hi, I tried running the first bokeh demo from the gallery. I was able to launch the app, but the bokeh plot was not displayed.

    Code and expected result: Capture

    The outcome was just a blank streamlit app.

    I'll follow-up if I find a fix

    opened by srcoulombe 3
  • ModuleNotFoundError: No module named 'awesome_streamlit.database'

    I'm using a Python3.7 conda environment with streamlit installed:

    $ streamlit --version
    Streamlit, version 0.57.3
    $ python --version
    Python 3.7.7
    $ pip freeze | grep awesome-streamlit
    awesome-streamlit==20191018.1
    

    However, if I try to import it in the Python repl:

    import awesome_streamlit as ast
    

    I get

    ModuleNotFoundError: No module named 'awesome_streamlit.database'
    
    opened by MarcoGorelli 3
  • Added Google Playstore Analytics app

    Following the issue #26,

    • Why this app is awesome: I built this app in place of the standard ppt presentation of data science projects for a school project 😄 and encouraged the Prof to try out streamlit too.
    • Tests: I've run invoke test.all and none have failed.
    • Any errors: However, I'm not able to run the full app to see the website due to the error No module named 'awesome_streamlit'. I tried compiling the package via python package/setup.py; the commands ran, but it was still unable to find the package. Please assist!
    opened by lyqht 3
  • Add NBA roster turnover app

    Hi @MarcSkovMadsen great work with this repo, truly awesome 😉

    I'd like to add an app that explores historical statistics for the National Basketball Association. I think it's a nice example of interacting with a slider and using both a plotly scatter with a pandas DataFrame. The app is also deployed on Heroku.

    I feel bad making a PR with broken code, but it's late here and was hoping to get some feedback at some point to see if I'm on the right track. Thank you!

    opened by arvkevi 3
  • awesome-streamlit application in docker container gets unresponsive

    When I run my awesome-streamlit docker container locally or on Azure it's snappy. But over time it gets unresponsive.

    When the docker container is just run

    docker run -it -p 80:80 --entrypoint "streamlit" marcskovmadsen/awesome-streamlit:latest run app.py
    

    I can navigate from the Home to the Vision page in much less than a second.

    image

    It's the same when the container is deployed to https://awesome-streamlit.azurewebsites.net

But after some time the response time increases on Azure but not locally. For example, after running for 30 minutes in Azure the response time is around 8 seconds. In the end it gets so unresponsive that nobody would ever wait for it.

    opened by MarcSkovMadsen 3
  • Error installing awesome-streamlit in anaconda navigator

I installed Anaconda Navigator and created a new environment with Python version 3.8.2.

    I launched the anaconda navigator prompt and did:

    conda activate awesome-streamlit

    Once the environment was activated, I changed to the directory to where awesome-streamlit is present and did:

    (awesome-streamlit) C:\Users\kashy\OneDrive\Desktop\ediapp>pip install -r requirements.txt

    Following which, I start getting the error:

    (awesome-streamlit) C:\Users\kashy\OneDrive\Desktop\ediapp>pip install -r requirements.txt
    Collecting streamlit==0.49.0
      Using cached streamlit-0.49.0-py2.py3-none-any.whl (5.1 MB)
    Collecting invoke==1.3.0
      Using cached invoke-1.3.0-py3-none-any.whl (207 kB)
    Collecting pandas==0.24.2
      Downloading pandas-0.24.2.tar.gz (11.8 MB)
         |████████████████████████████████| 11.8 MB 2.0 kB/s
    Collecting xlrd==1.2.0
      Using cached xlrd-1.2.0-py2.py3-none-any.whl (103 kB)
    Collecting pytest
      Using cached pytest-5.4.2-py3-none-any.whl (247 kB)
    Collecting pytest-sugar
      Using cached pytest-sugar-0.9.3.tar.gz (12 kB)
    Collecting pytest-mock
      Using cached pytest_mock-3.1.0-py2.py3-none-any.whl (10 kB)
    Collecting pytest-cov
      Using cached pytest_cov-2.8.1-py2.py3-none-any.whl (18 kB)
    Requirement already satisfied: isort>=4.3.15 in c:\users\kashy\anaconda3\envs\awesome-streamlit\lib\site-packages (from -r ./requirements_base.txt (line 22)) (4.3.21)
    Requirement already satisfied: pylint in c:\users\kashy\anaconda3\envs\awesome-streamlit\lib\site-packages (from -r ./requirements_base.txt (line 23)) (2.5.0)
    Collecting pylint2junit
      Using cached pylint2junit-1.0.1-py3-none-any.whl (6.5 kB)
    Collecting black
      Using cached black-19.10b0-py36-none-any.whl (97 kB)
    Collecting autoflake
      Using cached autoflake-1.3.1.tar.gz (17 kB)
    Collecting coverage
      Downloading coverage-5.1-cp38-cp38-win_amd64.whl (206 kB)
         |████████████████████████████████| 206 kB 2.2 MB/s
    Collecting mypy
      Downloading mypy-0.770-cp38-cp38-win_amd64.whl (7.8 MB)
         |████████████████████████████████| 7.8 MB 3.3 MB/s
    Collecting bandit
      Using cached bandit-1.6.2-py2.py3-none-any.whl (122 kB)
    Collecting Sphinx==1.8.4
      Using cached Sphinx-1.8.4-py2.py3-none-any.whl (3.1 MB)
    Collecting sphinx_rtd_theme
      Using cached sphinx_rtd_theme-0.4.3-py2.py3-none-any.whl (6.4 MB)
    Collecting recommonmark
      Using cached recommonmark-0.6.0-py2.py3-none-any.whl (10 kB)
    Collecting sphinx-autobuild
      Using cached sphinx-autobuild-0.7.1.tar.gz (14 kB)
    Collecting doc8
      Using cached doc8-0.8.0-py2.py3-none-any.whl (19 kB)
    Collecting spacy==2.2.1
      Downloading spacy-2.2.1.tar.gz (5.8 MB)
         |████████████████████████████████| 5.8 MB 218 kB/s
      Installing build dependencies ... done
      Getting requirements to build wheel ... done
      Installing backend dependencies ... error
      ERROR: Command errored out with exit status 1:
       command: 'C:\Users\kashy\anaconda3\envs\awesome-streamlit\python.exe' 'C:\Users\kashy\anaconda3\envs\awesome-streamlit\lib\site-packages\pip' install --ignore-installed --no-user --prefix 'C:\Users\kashy\AppData\Local\Temp\pip-build-env-q38svyds\normal' --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- wheel 'cymem<2.1.0,>=2.0.2' 'preshed<3.1.0,>=3.0.2' 'thinc<7.2.0,>=7.1.1' 'cython>=0.25' 'murmurhash<1.1.0,>=0.28.0'
           cwd: None
      Complete output (215 lines):
      Collecting wheel
        Using cached wheel-0.34.2-py2.py3-none-any.whl (26 kB)
      Collecting cymem<2.1.0,>=2.0.2
        Downloading cymem-2.0.3-cp38-cp38-win_amd64.whl (33 kB)
      Collecting preshed<3.1.0,>=3.0.2
        Downloading preshed-3.0.2-cp38-cp38-win_amd64.whl (115 kB)
      Collecting thinc<7.2.0,>=7.1.1
        Downloading thinc-7.1.1.tar.gz (1.9 MB)
      Collecting cython>=0.25
        Downloading Cython-0.29.17-cp38-cp38-win_amd64.whl (1.7 MB)
      Collecting murmurhash<1.1.0,>=0.28.0
        Downloading murmurhash-1.0.2-cp38-cp38-win_amd64.whl (20 kB)
      Collecting blis<0.5.0,>=0.4.0
        Downloading blis-0.4.1-cp38-cp38-win_amd64.whl (5.0 MB)
      Collecting wasabi<1.1.0,>=0.0.9
        Using cached wasabi-0.6.0-py3-none-any.whl (20 kB)
      Collecting srsly<1.1.0,>=0.0.6
        Downloading srsly-1.0.2-cp38-cp38-win_amd64.whl (181 kB)
      Collecting numpy>=1.7.0
        Downloading numpy-1.18.4-cp38-cp38-win_amd64.whl (12.8 MB)
      Collecting plac<1.0.0,>=0.9.6
        Using cached plac-0.9.6-py2.py3-none-any.whl (20 kB)
      Collecting tqdm<5.0.0,>=4.10.0
        Using cached tqdm-4.46.0-py2.py3-none-any.whl (63 kB)
      Building wheels for collected packages: thinc
        Building wheel for thinc (setup.py): started
        Building wheel for thinc (setup.py): finished with status 'error'
        ERROR: Command errored out with exit status 1:
         command: 'C:\Users\kashy\anaconda3\envs\awesome-streamlit\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\kashy\\AppData\\Local\\Temp\\pip-install-062p5m4u\\thinc\\setup.py'"'"'; __file__='"'"'C:\\Users\\kashy\\AppData\\Local\\Temp\\pip-install-062p5m4u\\thinc\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d 'C:\Users\kashy\AppData\Local\Temp\pip-wheel-shptm3u2'
             cwd: C:\Users\kashy\AppData\Local\Temp\pip-install-062p5m4u\thinc\
        Complete output (166 lines):
        running bdist_wheel
        running build
        running build_py
        creating build
        creating build\lib.win-amd64-3.8
        creating build\lib.win-amd64-3.8\thinc
        copying thinc\about.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\api.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\check.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\compat.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\describe.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\exceptions.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\i2v.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\loss.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\misc.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\rates.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\t2t.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\t2v.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\v2v.py -> build\lib.win-amd64-3.8\thinc
        copying thinc\__init__.py -> build\lib.win-amd64-3.8\thinc
        creating build\lib.win-amd64-3.8\thinc\tests
        copying thinc\tests\conftest.py -> build\lib.win-amd64-3.8\thinc\tests
        copying thinc\tests\strategies.py -> build\lib.win-amd64-3.8\thinc\tests
        copying thinc\tests\test_api_funcs.py -> build\lib.win-amd64-3.8\thinc\tests
        copying thinc\tests\test_util.py -> build\lib.win-amd64-3.8\thinc\tests
        copying thinc\tests\util.py -> build\lib.win-amd64-3.8\thinc\tests
        copying thinc\tests\__init__.py -> build\lib.win-amd64-3.8\thinc\tests
        creating build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_about.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_affine.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_beam_search.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_check_exceptions.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_difference.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_feature_extracter.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_hash_embed.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_imports.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_linear.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_loss.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_mem.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_model.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_ops.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_pickle.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_pooling.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_pytorch_wrapper.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_rates.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\test_rnn.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        copying thinc\tests\unit\__init__.py -> build\lib.win-amd64-3.8\thinc\tests\unit
        creating build\lib.win-amd64-3.8\thinc\tests\integration
        copying thinc\tests\integration\test_affine_learns.py -> build\lib.win-amd64-3.8\thinc\tests\integration
        copying thinc\tests\integration\test_basic_tagger.py -> build\lib.win-amd64-3.8\thinc\tests\integration
        copying thinc\tests\integration\test_batch_norm.py -> build\lib.win-amd64-3.8\thinc\tests\integration
        copying thinc\tests\integration\test_feed_forward.py -> build\lib.win-amd64-3.8\thinc\tests\integration
        copying thinc\tests\integration\test_mnist.py -> build\lib.win-amd64-3.8\thinc\tests\integration
        copying thinc\tests\integration\test_pickle.py -> build\lib.win-amd64-3.8\thinc\tests\integration
        copying thinc\tests\integration\test_roundtrip_bytes.py -> build\lib.win-amd64-3.8\thinc\tests\integration
        copying thinc\tests\integration\test_shape_check.py -> build\lib.win-amd64-3.8\thinc\tests\integration
        copying thinc\tests\integration\__init__.py -> build\lib.win-amd64-3.8\thinc\tests\integration
        creating build\lib.win-amd64-3.8\thinc\tests\linear
        copying thinc\tests\linear\test_avgtron.py -> build\lib.win-amd64-3.8\thinc\tests\linear
        copying thinc\tests\linear\test_linear.py -> build\lib.win-amd64-3.8\thinc\tests\linear
        copying thinc\tests\linear\test_sparse_array.py -> build\lib.win-amd64-3.8\thinc\tests\linear
        copying thinc\tests\linear\__init__.py -> build\lib.win-amd64-3.8\thinc\tests\linear
        creating build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\__init__.py -> build\lib.win-amd64-3.8\thinc\linear
        creating build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\mem.py -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\pooling.py -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\train.py -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\util.py -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\vec2vec.py -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\vecs2vec.py -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\vecs2vecs.py -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\_lsuv.py -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\__init__.py -> build\lib.win-amd64-3.8\thinc\neural
        creating build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\datasets.py -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\hpbff.py -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\load_nlp.py -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\visualizer.py -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\wrappers.py -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\__init__.py -> build\lib.win-amd64-3.8\thinc\extra
        creating build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\affine.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\attention.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\batchnorm.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\convolution.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\difference.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\elu.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\embed.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\encoder_decoder.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\feature_extracter.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\feed_forward.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\function_layer.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\hash_embed.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\layernorm.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\maxout.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\model.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\multiheaded_attention.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\relu.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\resnet.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\rnn.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\selu.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\softmax.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\static_vectors.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        copying thinc\neural\_classes\__init__.py -> build\lib.win-amd64-3.8\thinc\neural\_classes
        creating build\lib.win-amd64-3.8\thinc\extra\_vendorized
        copying thinc\extra\_vendorized\keras_datasets.py -> build\lib.win-amd64-3.8\thinc\extra\_vendorized
        copying thinc\extra\_vendorized\keras_data_utils.py -> build\lib.win-amd64-3.8\thinc\extra\_vendorized
        copying thinc\extra\_vendorized\keras_generic_utils.py -> build\lib.win-amd64-3.8\thinc\extra\_vendorized
        copying thinc\extra\_vendorized\__init__.py -> build\lib.win-amd64-3.8\thinc\extra\_vendorized
        creating build\lib.win-amd64-3.8\thinc\extra\wrapt
        copying thinc\extra\wrapt\decorators.py -> build\lib.win-amd64-3.8\thinc\extra\wrapt
        copying thinc\extra\wrapt\importer.py -> build\lib.win-amd64-3.8\thinc\extra\wrapt
        copying thinc\extra\wrapt\wrappers.py -> build\lib.win-amd64-3.8\thinc\extra\wrapt
        copying thinc\extra\wrapt\__init__.py -> build\lib.win-amd64-3.8\thinc\extra\wrapt
        copying thinc\linalg.pyx -> build\lib.win-amd64-3.8\thinc
        copying thinc\structs.pyx -> build\lib.win-amd64-3.8\thinc
        copying thinc\typedefs.pyx -> build\lib.win-amd64-3.8\thinc
        copying thinc\cpu.pxd -> build\lib.win-amd64-3.8\thinc
        copying thinc\linalg.pxd -> build\lib.win-amd64-3.8\thinc
        copying thinc\structs.pxd -> build\lib.win-amd64-3.8\thinc
        copying thinc\typedefs.pxd -> build\lib.win-amd64-3.8\thinc
        copying thinc\__init__.pxd -> build\lib.win-amd64-3.8\thinc
        copying thinc\compile_time_constants.pxi -> build\lib.win-amd64-3.8\thinc
        copying thinc\linalg.cpp -> build\lib.win-amd64-3.8\thinc
        copying thinc\structs.cpp -> build\lib.win-amd64-3.8\thinc
        copying thinc\typedefs.cpp -> build\lib.win-amd64-3.8\thinc
        copying thinc\linear\avgtron.pyx -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\features.pyx -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\linear.pyx -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\serialize.pyx -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\sparse.pyx -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\avgtron.pxd -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\features.pxd -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\serialize.pxd -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\sparse.pxd -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\__init__.pxd -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\avgtron.cpp -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\features.cpp -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\linear.cpp -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\serialize.cpp -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\linear\sparse.cpp -> build\lib.win-amd64-3.8\thinc\linear
        copying thinc\neural\ops.pyx -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\optimizers.pyx -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\_aligned_alloc.pyx -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\cpu.pxd -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\ops.pxd -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\__init__.pxd -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\ops.cpp -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\optimizers.cpp -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\neural\_aligned_alloc.cpp -> build\lib.win-amd64-3.8\thinc\neural
        copying thinc\extra\cache.pyx -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\eg.pyx -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\mb.pyx -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\search.pyx -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\cache.pxd -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\eg.pxd -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\mb.pxd -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\search.pxd -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\__init__.pxd -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\cache.cpp -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\eg.cpp -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\mb.cpp -> build\lib.win-amd64-3.8\thinc\extra
        copying thinc\extra\search.cpp -> build\lib.win-amd64-3.8\thinc\extra
        running build_ext
        error: Microsoft Visual C++ 14.0 is required. Get it with "Build Tools for Visual Studio": https://visualstudio.microsoft.com/downloads/
        ----------------------------------------
        ERROR: Failed building wheel for thinc
        Running setup.py clean for thinc
      Failed to build thinc
      Installing collected packages: wheel, cymem, murmurhash, preshed, numpy, blis, wasabi, srsly, plac, tqdm, thinc, cython
          Running setup.py install for thinc: started
          Running setup.py install for thinc: finished with status 'error'
          ERROR: Command errored out with exit status 1:
           command: 'C:\Users\kashy\anaconda3\envs\awesome-streamlit\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\kashy\\AppData\\Local\\Temp\\pip-install-062p5m4u\\thinc\\setup.py'"'"'; __file__='"'"'C:\\Users\\kashy\\AppData\\Local\\Temp\\pip-install-062p5m4u\\thinc\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record 'C:\Users\kashy\AppData\Local\Temp\pip-record-du2vb3ti\install-record.txt' --single-version-externally-managed --prefix 'C:\Users\kashy\AppData\Local\Temp\pip-build-env-q38svyds\normal' --compile --install-headers 'C:\Users\kashy\AppData\Local\Temp\pip-build-env-q38svyds\normal\Include\thinc'
               cwd: C:\Users\kashy\AppData\Local\Temp\pip-install-062p5m4u\thinc\
          Complete output (5 lines):
          running install
          running build
          running build_py
          running build_ext
          error: Microsoft Visual C++ 14.0 is required. Get it with "Build Tools for Visual Studio": https://visualstudio.microsoft.com/downloads/
          ----------------------------------------
      ERROR: Command errored out with exit status 1: 'C:\Users\kashy\anaconda3\envs\awesome-streamlit\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\kashy\\AppData\\Local\\Temp\\pip-install-062p5m4u\\thinc\\setup.py'"'"'; __file__='"'"'C:\\Users\\kashy\\AppData\\Local\\Temp\\pip-install-062p5m4u\\thinc\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record 'C:\Users\kashy\AppData\Local\Temp\pip-record-du2vb3ti\install-record.txt' --single-version-externally-managed --prefix 'C:\Users\kashy\AppData\Local\Temp\pip-build-env-q38svyds\normal' --compile --install-headers 'C:\Users\kashy\AppData\Local\Temp\pip-build-env-q38svyds\normal\Include\thinc' Check the logs for full command output.
      ----------------------------------------
    ERROR: Command errored out with exit status 1: 'C:\Users\kashy\anaconda3\envs\awesome-streamlit\python.exe' 'C:\Users\kashy\anaconda3\envs\awesome-streamlit\lib\site-packages\pip' install --ignore-installed --no-user --prefix 'C:\Users\kashy\AppData\Local\Temp\pip-build-env-q38svyds\normal' --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- wheel 'cymem<2.1.0,>=2.0.2' 'preshed<3.1.0,>=3.0.2' 'thinc<7.2.0,>=7.1.1' 'cython>=0.25' 'murmurhash<1.1.0,>=0.28.0' Check the logs for full command output.
    
    opened by kashyapm94 2
  • Display pyLDAvis

A question regarding the bokeh.models.Div workaround for running custom HTML and JS. I am trying to display a pyLDAvis output, and I am aware that Streamlit does not support this currently. I have been attempting to use the workaround in the Custom Widgets Hack, but it does not seem to work when passing in the JS and HTML outputs from pyLDAvis.save_html(). Do you know of any workaround that could get the display running? I have attached a sample output from pyLDAvis.save_html().

    Any guidance is greatly appreciated. lda_vis.zip

    opened by russell-phe 2
  • Bump pillow from 6.2.0 to 9.3.0 in /gallery/image_classifier

    Bumps pillow from 6.2.0 to 9.3.0.

    Release notes

    Sourced from pillow's releases.

    9.3.0

    https://pillow.readthedocs.io/en/stable/releasenotes/9.3.0.html

    Changes

    ... (truncated)

    Changelog

    Sourced from pillow's changelog.

    9.3.0 (2022-10-29)

    • Limit SAMPLESPERPIXEL to avoid runtime DOS #6700 [wiredfool]

    • Initialize libtiff buffer when saving #6699 [radarhere]

    • Inline fname2char to fix memory leak #6329 [nulano]

    • Fix memory leaks related to text features #6330 [nulano]

    • Use double quotes for version check on old CPython on Windows #6695 [hugovk]

    • Remove backup implementation of Round for Windows platforms #6693 [cgohlke]

    • Fixed set_variation_by_name offset #6445 [radarhere]

    • Fix malloc in _imagingft.c:font_setvaraxes #6690 [cgohlke]

    • Release Python GIL when converting images using matrix operations #6418 [hmaarrfk]

    • Added ExifTags enums #6630 [radarhere]

    • Do not modify previous frame when calculating delta in PNG #6683 [radarhere]

    • Added support for reading BMP images with RLE4 compression #6674 [npjg, radarhere]

    • Decode JPEG compressed BLP1 data in original mode #6678 [radarhere]

    • Added GPS TIFF tag info #6661 [radarhere]

    • Added conversion between RGB/RGBA/RGBX and LAB #6647 [radarhere]

    • Do not attempt normalization if mode is already normal #6644 [radarhere]

    ... (truncated)

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 0
  • Bump tensorflow from 2.0.1 to 2.9.3 in /gallery/image_classifier

    Bumps tensorflow from 2.0.1 to 2.9.3.

    Release notes

    Sourced from tensorflow's releases.

    TensorFlow 2.9.3

    Release 2.9.3

    This release introduces several vulnerability fixes:

    TensorFlow 2.9.2

    Release 2.9.2

    This releases introduces several vulnerability fixes:

    ... (truncated)

    Changelog

    Sourced from tensorflow's changelog.

    Release 2.9.3

    This release introduces several vulnerability fixes:

    Release 2.8.4

    This release introduces several vulnerability fixes:

    ... (truncated)

    Commits
    • a5ed5f3 Merge pull request #58584 from tensorflow/vinila21-patch-2
    • 258f9a1 Update py_func.cc
    • cd27cfb Merge pull request #58580 from tensorflow-jenkins/version-numbers-2.9.3-24474
    • 3e75385 Update version numbers to 2.9.3
    • bc72c39 Merge pull request #58482 from tensorflow-jenkins/relnotes-2.9.3-25695
    • 3506c90 Update RELEASE.md
    • 8dcb48e Update RELEASE.md
    • 4f34ec8 Merge pull request #58576 from pak-laura/c2.99f03a9d3bafe902c1e6beb105b2f2417...
    • 6fc67e4 Replace CHECK with returning an InternalError on failing to create python tuple
    • 5dbe90a Merge pull request #58570 from tensorflow/r2.9-7b174a0f2e4
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 0
  • Bump tensorflow from 2.0.1 to 2.9.3

    Bumps tensorflow from 2.0.1 to 2.9.3.

    Release notes

    Sourced from tensorflow's releases.

    TensorFlow 2.9.3

    Release 2.9.3

    This release introduces several vulnerability fixes:

    TensorFlow 2.9.2

    Release 2.9.2

    This releases introduces several vulnerability fixes:

    ... (truncated)

    Changelog

    Sourced from tensorflow's changelog.

    Release 2.9.3

    This release introduces several vulnerability fixes:

    Release 2.8.4

    This release introduces several vulnerability fixes:

    ... (truncated)

    Commits
    • a5ed5f3 Merge pull request #58584 from tensorflow/vinila21-patch-2
    • 258f9a1 Update py_func.cc
    • cd27cfb Merge pull request #58580 from tensorflow-jenkins/version-numbers-2.9.3-24474
    • 3e75385 Update version numbers to 2.9.3
    • bc72c39 Merge pull request #58482 from tensorflow-jenkins/relnotes-2.9.3-25695
    • 3506c90 Update RELEASE.md
    • 8dcb48e Update RELEASE.md
    • 4f34ec8 Merge pull request #58576 from pak-laura/c2.99f03a9d3bafe902c1e6beb105b2f2417...
    • 6fc67e4 Replace CHECK with returning an InternalError on failing to create python tuple
    • 5dbe90a Merge pull request #58570 from tensorflow/r2.9-7b174a0f2e4
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 0
  • Bump lxml from 4.4.1 to 4.9.1

    Bumps lxml from 4.4.1 to 4.9.1.

    Changelog

    Sourced from lxml's changelog.

    4.9.1 (2022-07-01)

    Bugs fixed

    • A crash was resolved when using iterwalk() (or canonicalize()) after parsing certain incorrect input. Note that iterwalk() can crash on valid input parsed with the same parser after failing to parse the incorrect input.

    4.9.0 (2022-06-01)

    Bugs fixed

    • GH#341: The mixin inheritance order in lxml.html was corrected. Patch by xmo-odoo.

    Other changes

    • Built with Cython 0.29.30 to adapt to changes in Python 3.11 and 3.12.

    • Wheels include zlib 1.2.12, libxml2 2.9.14 and libxslt 1.1.35 (libxml2 2.9.12+ and libxslt 1.1.34 on Windows).

    • GH#343: Windows-AArch64 build support in Visual Studio. Patch by Steve Dower.

    4.8.0 (2022-02-17)

    Features added

    • GH#337: Path-like objects are now supported throughout the API instead of just strings. Patch by Henning Janssen.

    • The ElementMaker now supports QName values as tags, which always override the default namespace of the factory.

    Bugs fixed

    • GH#338: In lxml.objectify, the XSI float annotation "nan" and "inf" were spelled in lower case, whereas XML Schema datatypes define them as "NaN" and "INF" respectively.

    ... (truncated)

    Commits
    • d01872c Prevent parse failure in new test from leaking into later test runs.
    • d65e632 Prepare release of lxml 4.9.1.
    • 86368e9 Fix a crash when incorrect parser input occurs together with usages of iterwa...
    • 50c2764 Delete unused Travis CI config and reference in docs (GH-345)
    • 8f0bf2d Try to speed up the musllinux AArch64 build by splitting the different CPytho...
    • b9f7074 Remove debug print from test.
    • b224e0f Try to install 'xz' in wheel builds, if available, since it's now needed to e...
    • 897ebfa Update macOS deployment target version from 10.14 to 10.15 since 10.14 starts...
    • 853c9e9 Prepare release of 4.9.0.
    • d3f77e6 Add a test for https://bugs.launchpad.net/lxml/+bug/1965070 leaving out the a...
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 0
  • Bump numpy from 1.17.2 to 1.22.0 in /gallery/image_classifier

    Bumps numpy from 1.17.2 to 1.22.0.

    Release notes

    Sourced from numpy's releases.

    v1.22.0

    NumPy 1.22.0 Release Notes

    NumPy 1.22.0 is a big release featuring the work of 153 contributors spread over 609 pull requests. There have been many improvements; the highlights are:

    • Annotations of the main namespace are essentially complete. Upstream is a moving target, so there will likely be further improvements, but the major work is done. This is probably the most user visible enhancement in this release.
    • A preliminary version of the proposed Array-API is provided. This is a step in creating a standard collection of functions that can be used across applications such as CuPy and JAX.
    • NumPy now has a DLPack backend. DLPack provides a common interchange format for array (tensor) data.
    • New methods for quantile, percentile, and related functions. The new methods provide a complete set of the methods commonly found in the literature (a short sketch follows after these highlights).
    • A new configurable allocator for use by downstream projects.

    These are in addition to the ongoing work to provide SIMD support for commonly used functions, improvements to F2PY, and better documentation.

    The Python versions supported in this release are 3.8-3.10; Python 3.7 has been dropped. Note that 32-bit wheels are only provided for Python 3.8 and 3.9 on Windows; all other wheels are 64-bit on account of Ubuntu, Fedora, and other Linux distributions dropping 32-bit support. All 64-bit wheels are also linked with 64-bit integer OpenBLAS, which should fix the occasional problems encountered by folks using truly huge arrays.
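
    A minimal sketch of the expanded quantile/percentile methods mentioned above (not taken from the release notes); the sample data is made up and NumPy >= 1.22 is assumed:

    # Hypothetical sketch: the former `interpolation` keyword is now `method`,
    # and the estimators from the statistics literature are available by name.
    import numpy as np

    a = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0])

    print(np.quantile(a, 0.25, method="linear"))            # default, unchanged behaviour
    print(np.quantile(a, 0.25, method="median_unbiased"))   # one of the new methods
    print(np.percentile(a, 25, method="closest_observation"))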

    Expired deprecations

    Deprecated numeric style dtype strings have been removed

    Using the strings "Bytes0", "Datetime64", "Str0", "Uint32", and "Uint64" as a dtype will now raise a TypeError.

    (gh-19539)
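
    A minimal sketch of the behaviour described above, assuming NumPy >= 1.22 (not taken from the release notes):

    # Hypothetical sketch: the numeric-style dtype aliases now raise TypeError.
    import numpy as np

    print(np.dtype("uint64"))   # canonical lower-case name still works
    try:
        np.dtype("Uint64")      # removed alias: raises TypeError in >= 1.22
    except TypeError as exc:
        print("TypeError:", exc)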

    Expired deprecations for loads, ndfromtxt, and mafromtxt in npyio

    numpy.loads was deprecated in v1.15, with the recommendation that users use pickle.loads instead. ndfromtxt and mafromtxt were both deprecated in v1.17 - users should use numpy.genfromtxt instead with the appropriate value for the usemask parameter.

    (gh-19615)
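
    A minimal sketch of the suggested replacements, assuming NumPy >= 1.22; the CSV file name and contents are made up:

    # Hypothetical sketch: replacements for numpy.loads, ndfromtxt and mafromtxt.
    import pickle
    from pathlib import Path

    import numpy as np

    # numpy.loads is gone: unpickle with the standard library instead.
    data = pickle.loads(pickle.dumps(np.arange(3)))

    # Write a tiny CSV (with one missing value) so the sketch is runnable.
    Path("values.csv").write_text("1.0,2.0\n3.0,\n")

    # ndfromtxt / mafromtxt are gone: genfromtxt with usemask covers both.
    plain = np.genfromtxt("values.csv", delimiter=",", usemask=False)   # was ndfromtxt
    masked = np.genfromtxt("values.csv", delimiter=",", usemask=True)   # was mafromtxt
    print(plain, masked, data)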

    ... (truncated)

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    (Identical to the Dependabot commands, options, and security-alerts note listed for the lxml PR above.)

    dependencies · opened by dependabot[bot] · 0
Owner
Marc Skov Madsen
Data, Models and Analytics Ninja. PhD, CFA® and Lead Data Scientist Developer at Ørsted. Developer of awesome-panel.org and awesome-streamlit.org