
Overview

fastinference

A collection of inference modules for fastai, including inference speedups and interpretability tools

Install

pip install fastinference

Optional submodules are available as well:

  • pip install fastinference[interp] - Interpretability modules such as SHAP and Feature Importance
  • pip install fastinference[onnx-cpu] - ONNX for a CPU environment
  • pip install fastinference[onnx-gpu] - ONNX for a GPU environment
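
A minimal usage sketch: importing the inference module patches fastai's Learner in place. This assumes a trained Learner named learn with its DataLoaders still attached (see the _ConstantFunc issue below for a caveat when using load_learner alone), and a placeholder image path:

from fastai.vision.all import *
from fastinference.inference import *  # patches Learner.predict/get_preds in place

preds = learn.get_preds(fully_decoded=True)  # extra decoding options added by the patch
learn.predict('image.jpg')                   # placeholder image path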

Wonderful Contributors:

(fastai forum handle first, with the GitHub handle in parentheses where available):

  • Pavel (Pak)
Issues
  • error in fully_decoded: ValueError: only one element tensors can be converted to Python scalars

    This happens when enabling fully_decoded in get_preds. I don't know where this comes from; any ideas on where to start looking?

    ---------------------------------------------------------------------------
    ValueError                                Traceback (most recent call last)
    <ipython-input-62-6fffcf6bb38e> in <module>
    ----> 1 foo = learn.get_preds(fully_decoded=True)
    
    ~/.local/lib/python3.6/site-packages/fastinference/inference/inference.py in get_preds(x, ds_idx, dl, raw_outs, decoded_loss, fully_decoded, **kwargs)
         70     else:
         71         outs.insert(0, raw)
    ---> 72     if fully_decoded: outs = _fully_decode(x.dls, inps, outs, dec_out, is_multi)
         73     if decoded_loss: outs = _decode_loss(x.dls.vocab, dec_out, outs)
         74     return outs
    
    ~/.local/lib/python3.6/site-packages/fastinference/inference/inference.py in _fully_decode(dl, inps, outs, dec_out, is_multi)
         14             inps[i] = torch.cat(inps[i], dim=0)
         15     else:
    ---> 16         inps = tensor(*inps[0])
         17     b = (*tuplify(inps), *tuplify(dec_out))
         18     try:
    
    /usr/local/lib/python3.6/dist-packages/fastai2/torch_core.py in tensor(x, *rest, **kwargs)
        108     # if isinstance(x, (tuple,list)) and len(x)==0: return tensor(0)
        109     res = (x if isinstance(x, Tensor)
    --> 110            else torch.tensor(x, **kwargs) if isinstance(x, (tuple,list))
        111            else _array2tensor(x) if isinstance(x, ndarray)
        112            else as_tensor(x.values, **kwargs) if isinstance(x, (pd.Series, pd.DataFrame))
    
    ValueError: only one element tensors can be converted to Python scalars
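
    For context, the failing call is tensor(*inps[0]); plain PyTorch raises the identical error when torch.tensor() is handed multi-element tensors, since it only accepts scalars or nested lists of numbers. A standalone reproduction (not fastinference code):

    import torch

    xs = [torch.rand(3, 224, 224), torch.rand(3, 224, 224)]
    # torch.tensor(xs)       # ValueError: only one element tensors can be converted to Python scalars
    batch = torch.stack(xs)  # combining same-shape tensors needs stack/cat instead
    print(batch.shape)       # torch.Size([2, 3, 224, 224])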
    
    opened by vrodriguezf 21
  • TypeError: requires_grad_() takes 1 positional argument but 2 were given

    Here's the code I'm using:

    from fastai.text.all import *
    from fastinference.inference import *

    learn_class = load_learner('/content/gdrive/MyDrive/TuesdayChatbot/classifier_model-'+ver+'.pkl')
    test = ["In the article there were several things left out."]
    learn_class.predict(test)
    learn_class.intrinsic_attention(test[0])
    
    TypeError                                 Traceback (most recent call last)
    <ipython-input-76-ba2c19ef8949> in <module>()
    ----> 1 learn_class.intrinsic_attention(test[0])
    
    1 frames
    /usr/local/lib/python3.6/dist-packages/fastinference/inference/text.py in _intrinsic_attention(learn, text, class_id)
        149     dl = learn.dls.test_dl([text])
        150     batch = next(iter(dl))[0]
    --> 151     emb = learn.model[0].module.encoder(batch).detach().requires_grad_(True)
        152     lstm = learn.model[0].module(emb, True)
        153     learn.model.eval()
    
    TypeError: requires_grad_() takes 1 positional argument but 2 were given
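
    For reference, detach().requires_grad_(True) is valid on a plain torch.Tensor, so the failure presumably comes from whatever fastai tensor subclass wraps the encoder output here (an assumption; the thread doesn't pin it down). A standalone check:

    import torch

    emb = torch.rand(2, 4, 8)
    emb = emb.detach().requires_grad_(True)  # Tensor.requires_grad_(bool) accepts an argument
    print(emb.requires_grad)                 # True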
    
    opened by randywreed 11
  • Fastinference has no tabular attribute

    Hi. The recent version of fastai (2.1.17) started throwing a lot of errors. One of them is related to fastinference: when I call the SHAP interpretation, it throws:

    AttributeError: module 'fastinference' has no attribute 'tabular'
    
    bug 
    opened by turgut090 8
  • "NameError: name '_ConstantFunc' is not defined"

    from fastinference.inference import * (worked fine)

    %%time
    for i in range(40): learn.predict(img)

    This code returned:

    ~/miniconda3/envs/fastai2/lib/python3.6/site-packages/fastinference/inference/inference.py in predict(x, item, with_input, rm_type_tfms)
         79 def predict(x:Learner, item, with_input=False, rm_type_tfms=None):
         80         dl = x.dls.test_dl([item], rm_type_tfms=rm_type_tfms, num_workers=0)
    ---> 81         res = x.get_preds(dl=dl, with_input=with_input, with_decoded=True)
         82         return res
    
    ~/miniconda3/envs/fastai2/lib/python3.6/site-packages/fastinference/inference/inference.py in get_preds(self, ds_idx, dl, with_input, with_decoded, with_loss, raw, act, inner, reorder, cbs, **kwargs)
         55     if reorder and hasattr(dl, 'get_idxs'):
         56         idxs = dl.get_idxs()
    ---> 57         dl = dl.new(get_idxs = _ConstantFunc(idxs))
         58     cb = GatherPredsCallback(with_input=with_input, with_loss=with_loss, **kwargs)
         59     ctx_mgrs = self.validation_context(cbs=L(cbs)+[cb], inner=inner)
    
    NameError: name '_ConstantFunc' is not defined
    

    It seems that to run a prediction with fastinference, I need to have the DataLoaders loaded with the learner itself.

    The steps I followed:

    1. Import fastai.
    2. Load a pre-trained model (exported with .export()) with learn = load_learner('PATH_OF_LEARNER', cpu=False).
    3. Predict with an image loaded with OpenCV (cv2.imread(image_path)).
    4. Import fastinference with from fastinference.inference import * (no error returned).
    5. Repeat step 3 and get the error.
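
    For reference, fastai's own get_preds uses a small private helper for the get_idxs freeze, roughly like the sketch below (reconstructed from the usage in the traceback, not copied from fastai); the NameError suggests fastinference's patched get_preds references it without importing it from fastai's learner module:

    class _ConstantFunc:
        "A callable that ignores its arguments and always returns `o`"
        def __init__(self, o): self.o = o
        def __call__(self, *args, **kwargs): return self.o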

    opened by lucasgabrielce 8
  • 02_shap.interp.ipynb error?

    Hi, I tried running this notebook and got an error at this line:

    exp = ShapInterpretation(learn)
    exp.decision_plot(class_id=0, row_idx=10)
    

    TypeError                                 Traceback (most recent call last)
    <ipython-input-12-907d69fd87c2> in <module>
    ----> 1 exp.decision_plot(class_id=0, row_idx=10)
    
    <ipython-input-8-27630e1fb799> in decision_plot(self, class_id, row_idx, **kwargs)
         16     def decision_plot(self, class_id=0, row_idx=-1, **kwargs):
         17         "Visualize model decision using cumulative `SHAP` values."
    ---> 18         shap_vals, exp_val = _get_values(self, class_id)
         19         n_rows = shap_vals.shape[0]
         20         if row_idx == -1:
    
    <ipython-input-10-91e394550211> in _get_values(interp, class_id)
          5     exp_vals = interp.explainer.expected_value
          6     if interp.is_multi_output:
    ----> 7         (class_name, class_idx) = _get_class_info(interp, class_id)
          8         print(f"Classification model detected, displaying score for the class {class_name}.")
          9         print("(use `class_id` to specify another class)")
    
    <ipython-input-9-96fbf3aee7cf> in _get_class_info(interp, class_id)
          2 def _get_class_info(interp:ShapInterpretation, class_id):
          3     "Returns class name associated with index, or vice-versa"
    ----> 4     if isinstance(class_id, int): class_idx, class_name = class_id, interp.class_names[class_id]
          5     else: class_idx, class_name = interp.class_names.o2i[class_id], class_id
          6     return (class_name, class_idx)
    
    TypeError: 'NoneType' object is not subscriptable
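
    The subscript failure means interp.class_names is None at this point. A hedged, hypothetical guard (not the library's code), assuming class_names comes from the learner's vocab and can be absent for regression models:

    def _get_class_info(interp, class_id):
        "Returns class name associated with index, or vice versa"
        if interp.class_names is None:
            raise ValueError("No class names found; was the Learner built with a classification vocab?")
        if isinstance(class_id, int): return (interp.class_names[class_id], class_id)
        return (class_id, interp.class_names.o2i[class_id])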
    
    opened by ncduy0303 7
  • SHAP won't install: "from fastinference.tabular import *" fails

    Trying to install and use SHAP.

    !pip install fastinference fastai -q works fine, but from fastinference.tabular import * produces:

    ModuleNotFoundError                       Traceback (most recent call last)
    in <module>()
    ----> 1 from fastinference.tabular import *

    3 frames
    /usr/local/lib/python3.6/dist-packages/fastinference/tabular/__init__.py in <module>()
          3     raise ImportError("The interp module is not installed.")
          4
    ----> 5 from .shap import *
          6 from .interpretation import *

    /usr/local/lib/python3.6/dist-packages/fastinference/tabular/shap/__init__.py in <module>()
    ----> 1 from .interp import *

    /usr/local/lib/python3.6/dist-packages/fastinference/tabular/shap/interp.py in <module>()
          4
          5 # Cell
    ----> 6 from .core import _prepare_data, _predict
          7 import shap
          8 from fastai.tabular.all import *

    /usr/local/lib/python3.6/dist-packages/fastinference/tabular/shap/core.py in <module>()
          4
          5 # Cell
    ----> 6 from fastai.tabular.all import *
          7
          8 # Cell

    ModuleNotFoundError: No module named 'fastai.tabular.all'



    I have also tried !pip install fastinference[interp] -q, with no luck. Thanks!

    opened by md598 5
  • can't import fastinference.onnx

    Hi @muellerzr

    I am running fastinference==0.0.32, fastai==2.1.9 and fastcore==1.3.12.

    When I run from fastinference.onnx import * to use the fastONNX class, I get this error:

    ---------------------------------------------------------------------------
    TypeError                                 Traceback (most recent call last)
    <ipython-input-10-52ff53f82e02> in <module>
    ----> 1 from fastinference.onnx import *
    
    ~/.virtualenvs/challenge/lib/python3.6/site-packages/fastinference/onnx.py in <module>
         18 # Cell
         19 #export
    ---> 20 from .inference.inference import _decode_loss
         21 
         22 # Cell
    
    ~/.virtualenvs/challenge/lib/python3.6/site-packages/fastinference/inference/__init__.py in <module>
    ----> 1 from .inference import *
          2 from .text import *
    
    ~/.virtualenvs/challenge/lib/python3.6/site-packages/fastinference/inference/inference.py in <module>
         52 @patch
         53 def get_preds(self:Learner, ds_idx=1, dl=None, with_input=False, with_decoded=False, with_loss=False, raw=False, act=None,
    ---> 54                 inner=False, reorder=True, cbs=None, **kwargs):
         55     if dl is None: dl = self.dls[ds_idx].new(shuffled=False, drop_last=False)
         56     if reorder and hasattr(dl, 'get_idxs'):
    
    ~/.virtualenvs/challenge/lib/python3.6/site-packages/fastcore/meta.py in _f(f)
        114         to_f = getattr(to_f,'__func__',to_f)
        115         if hasattr(from_f,'__delwrap__'): return f
    --> 116         sig = inspect.signature(from_f)
        117         sigd = dict(sig.parameters)
        118         k = sigd.pop('kwargs')
    
    /usr/lib/python3.6/inspect.py in signature(obj, follow_wrapped)
       3063 def signature(obj, *, follow_wrapped=True):
       3064     """Get a signature object for the passed callable."""
    -> 3065     return Signature.from_callable(obj, follow_wrapped=follow_wrapped)
       3066 
       3067 
    
    /usr/lib/python3.6/inspect.py in from_callable(cls, obj, follow_wrapped)
       2813         """Constructs Signature for the given callable object."""
       2814         return _signature_from_callable(obj, sigcls=cls,
    -> 2815                                         follow_wrapper_chains=follow_wrapped)
       2816 
       2817     @property
    
    /usr/lib/python3.6/inspect.py in _signature_from_callable(obj, follow_wrapper_chains, skip_bound_arg, sigcls)
       2191 
       2192     if not callable(obj):
    -> 2193         raise TypeError('{!r} is not a callable object'.format(obj))
       2194 
       2195     if isinstance(obj, types.MethodType):
    
    TypeError: None is not a callable object
    

    When I use fastai==2.1.5 and fastcore==1.3.2 this problem doesn't exist, but not using the updated fastai causes other problems when running fastai interpretations, e.g. interp.plot_top_losses.
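
    Based on that report, one stopgap (at the cost of the newer-fastai fixes just mentioned) is pinning the last known-good combination:

    pip install fastai==2.1.5 fastcore==1.3.2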

    opened by hududed 4
  • CUDA Out of Memory When Running Inference on a Large Test Set (And Proposed Fixes)

    CUDA Out of Memory When Running Inference on a Large Test Set (And Proposed Fixes)

    In inference.py's get_preds function, the inputs are stored regardless of whether fully_decoded is True or False. They are stored on the GPU and only released from memory after the inference loop finishes.

    As the size of the test set increases, the GPU will at some point run out of memory if the test set is large enough (In my test case, I ran out of memory with <50,000 items with an EfficientNet-B3A and image size of 224x224).
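
    A back-of-envelope check of why this happens (assuming raw float32 inputs at that resolution):

    bytes_per_item = 3 * 224 * 224 * 4           # ≈ 0.6 MB per 224x224 RGB image
    total_gb = 50_000 * bytes_per_item / 2**30
    print(f"{total_gb:.1f} GB")                  # ≈ 28.0 GB of stored inputs, beyond most GPUs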

    There are two ways to fix this:

    1. Store the inputs on the CPU (I haven't tested this):

    with torch.no_grad():
        if is_multi:
            for i in range(x.dls.n_inp):
                #inps[i].append(batch[i])
                inps[i].append(batch[i].cpu())
        else:
            #inps.append(batch[:x.dls.n_inp])
            inps.append(batch[:x.dls.n_inp].cpu())
        # rest of the loop
    

    2. Skip storing the inputs altogether if fully_decoded is False, as storing them is redundant. This should provide some speedup too (tested):

    with torch.no_grad():
        if fully_decoded:
            if is_multi:
                for i in range(x.dls.n_inp):
                    inps[i].append(batch[i])
            else:
                inps.append(batch[:x.dls.n_inp])
        # rest of the loop
    

    I've made this modification in my code and can vouch that it works: I can pass in arbitrarily long test sets with no issues.


    I think (2) should be done regardless. I'm not sure about (1), as converting with .cpu() adds overhead, but perhaps we should leave that to the user (if they want to prioritise inference speed, they'd probably skip storing inputs anyway).

    opened by rsomani95 4
  • New fastai inference API

    Hey Zach, let's work here on prototyping the inference API. What I would like to have (Santa's wish list):

    • Streamlined TorchScript support for all fastai models: simple models should be compatible with jit.trace, and more complex ones (with control-flow decisions) with jit.script. The folks at Facebook may be able to help here; they are very interested in this right now. The user should get simple image pre/post-processing so that inference works in plain PyTorch once the model is exported. If Jeremy splits the fastai lib into core/vision/etc., we could depend on just fastai core.
    • ONNX: export for all models. Image encoders should work out of the box; some layers are still missing for U-Nets (PixelShuffle). Tabular should work as well. Without being an expert, I would expect TorchScript to replace the ONNX pipeline in the future: one less layer.
    • TRTorch: we should probably start a discussion with them, as the TensorRT framework is super fast for GPU inference. This could be done later, once we have ONNX exports. I have a contact at NVIDIA who could help us export to TensorRT.
    • DeepStream? Stas Bekman is a guru on this topic; we could ask him what he thinks about it.

    We should have tests that periodically verify that this functionality is not broken and that performance is maintained. This is something fastai does not have right now and needs; for example, fastai's unet is slower than it used to be, which I noticed the other day.

    Another cool thing would be to serve the model with TorchServe directly from fastai, like:

    learn.serve(port=5151)
    

    and get a service running to make inference over HTTP.
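
    As a starting point for the TorchScript and ONNX items, a minimal export sketch (assuming learn is a trained fastai vision Learner with a fixed 3x224x224 input; pre/post-processing is deliberately left out, which is exactly the gap described above):

    import torch

    model = learn.model.eval().cpu()
    example = torch.rand(1, 3, 224, 224)

    traced = torch.jit.trace(model, example)   # simple models: trace
    traced.save("model_traced.pt")

    torch.onnx.export(model, example, "model.onnx",
                      input_names=["input"], output_names=["output"])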

    opened by tcapelle 3
  • Returning N top predictions

    Hi, I have a use case where I need multiple predictions and their probabilities: if the model is not confident enough in its prediction, the user gets the top N predictions to choose the correct one themselves.

    I've modified the fastinference code to implement this functionality (at the moment I just return a sorted list of all classes and their probabilities). Would you be interested in having it as a pull request? I haven't measured the speed, and the code is a bit hacky at the moment, so I would need to clean it up first and integrate it with the original functionality.
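
    A rough sketch of the requested behavior (hypothetical helper, assuming a classification Learner whose predict returns (decoded, index, probabilities)):

    def predict_topk(learn, item, k=3):
        "Return the k most probable classes with their probabilities"
        _, _, probs = learn.predict(item)
        top_p, top_i = probs.topk(k)
        return [(learn.dls.vocab[int(i)], float(p)) for p, i in zip(top_p, top_i)]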

    opened by andresti 2
  • pip install fastinference[interp] doesn't work

    Running pip install fastinference[interp] is not working. I already have the latest version of SHAP (0.39) installed. It seems to crash when running setup.py for SHAP; fastinference pins shap<0.36.0, so pip tries to build shap 0.35.0 from source.

    (fastai) PS C:\work> pip install fastinference[interp]
    Requirement already satisfied: fastinference[interp] in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (0.0.36)
    Requirement already satisfied: fastai>=2.0.0 in c:\work\ml\fastai (from fastinference[interp]) (2.3.1)
    Collecting shap<0.36.0
      Using cached shap-0.35.0.tar.gz (273 kB)
    Collecting plotly
      Using cached plotly-4.14.3-py2.py3-none-any.whl (13.2 MB)
    Collecting plotnine
      Using cached plotnine-0.8.0-py3-none-any.whl (4.7 MB)
    Requirement already satisfied: pip in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (21.0.1)
    Requirement already satisfied: packaging in c:\users\pc\appdata\roaming\python\python38\site-packages (from fastai>=2.0.0->fastinference[interp]) (20.4)
    Requirement already satisfied: fastcore<1.4,>=1.3.8 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (1.3.20)
    Requirement already satisfied: torchvision>=0.8.2 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (0.9.1)
    Requirement already satisfied: matplotlib in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (3.3.4)
    Requirement already satisfied: pandas in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (1.2.4)
    Requirement already satisfied: requests in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (2.25.1)
    Requirement already satisfied: pyyaml in c:\users\pc\appdata\roaming\python\python38\site-packages (from fastai>=2.0.0->fastinference[interp]) (5.3.1)
    Requirement already satisfied: fastprogress>=0.2.4 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (1.0.0)
    Requirement already satisfied: pillow>6.0.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (8.1.2)
    Requirement already satisfied: scikit-learn in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (0.24.1)
    Requirement already satisfied: scipy in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (1.6.1)
    Requirement already satisfied: spacy<4 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (3.0.5)
    Requirement already satisfied: torch<1.9,>=1.7.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastai>=2.0.0->fastinference[interp]) (1.8.1)
    Requirement already satisfied: numpy in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from fastprogress>=0.2.4->fastai>=2.0.0->fastinference[interp]) (1.20.1)
    Requirement already satisfied: tqdm>4.25.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from shap<0.36.0->fastinference[interp]) (4.59.0)
    Requirement already satisfied: typer<0.4.0,>=0.3.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (0.3.2)
    Requirement already satisfied: preshed<3.1.0,>=3.0.2 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (3.0.5)
    Requirement already satisfied: jinja2 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (3.0.1)
    Requirement already satisfied: spacy-legacy<3.1.0,>=3.0.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (3.0.1)
    Requirement already satisfied: catalogue<2.1.0,>=2.0.1 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (2.0.1)
    Requirement already satisfied: srsly<3.0.0,>=2.4.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (2.4.0)
    Requirement already satisfied: setuptools in c:\users\pc\appdata\roaming\python\python38\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (50.3.0)
    Requirement already satisfied: blis<0.8.0,>=0.4.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (0.7.4)
    Requirement already satisfied: wasabi<1.1.0,>=0.8.1 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (0.8.2)
    Requirement already satisfied: pathy>=0.3.5 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (0.4.0)
    Requirement already satisfied: cymem<2.1.0,>=2.0.2 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (2.0.5)
    Requirement already satisfied: pydantic<1.8.0,>=1.7.1 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (1.7.3)
    Requirement already satisfied: murmurhash<1.1.0,>=0.28.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (1.0.5)
    Requirement already satisfied: thinc<8.1.0,>=8.0.2 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from spacy<4->fastai>=2.0.0->fastinference[interp]) (8.0.2)
    Requirement already satisfied: six in c:\users\pc\appdata\roaming\python\python38\site-packages (from packaging->fastai>=2.0.0->fastinference[interp]) (1.15.0)
    Requirement already satisfied: pyparsing>=2.0.2 in c:\users\pc\appdata\roaming\python\python38\site-packages (from packaging->fastai>=2.0.0->fastinference[interp]) (2.4.7)
    Requirement already satisfied: smart-open<4.0.0,>=2.2.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from pathy>=0.3.5->spacy<4->fastai>=2.0.0->fastinference[interp]) (2.2.1)
    Requirement already satisfied: urllib3<1.27,>=1.21.1 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from requests->fastai>=2.0.0->fastinference[interp]) (1.26.4)
    Requirement already satisfied: certifi>=2017.4.17 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from requests->fastai>=2.0.0->fastinference[interp]) (2020.12.5)
    Requirement already satisfied: chardet<5,>=3.0.2 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from requests->fastai>=2.0.0->fastinference[interp]) (4.0.0)
    Requirement already satisfied: idna<3,>=2.5 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from requests->fastai>=2.0.0->fastinference[interp]) (2.10)
    Requirement already satisfied: boto3 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from smart-open<4.0.0,>=2.2.0->pathy>=0.3.5->spacy<4->fastai>=2.0.0->fastinference[interp]) (1.17.33)
    Requirement already satisfied: typing-extensions in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from torch<1.9,>=1.7.0->fastai>=2.0.0->fastinference[interp]) (3.7.4.3)
    Requirement already satisfied: click<7.2.0,>=7.1.1 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from typer<0.4.0,>=0.3.0->spacy<4->fastai>=2.0.0->fastinference[interp]) (7.1.2)
    Requirement already satisfied: jmespath<1.0.0,>=0.7.1 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from boto3->smart-open<4.0.0,>=2.2.0->pathy>=0.3.5->spacy<4->fastai>=2.0.0->fastinference[interp]) (0.10.0)
    Requirement already satisfied: botocore<1.21.0,>=1.20.33 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from boto3->smart-open<4.0.0,>=2.2.0->pathy>=0.3.5->spacy<4->fastai>=2.0.0->fastinference[interp]) (1.20.33)
    Requirement already satisfied: s3transfer<0.4.0,>=0.3.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from boto3->smart-open<4.0.0,>=2.2.0->pathy>=0.3.5->spacy<4->fastai>=2.0.0->fastinference[interp]) (0.3.6)
    Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in c:\users\pc\appdata\roaming\python\python38\site-packages (from botocore<1.21.0,>=1.20.33->boto3->smart-open<4.0.0,>=2.2.0->pathy>=0.3.5->spacy<4->fastai>=2.0.0->fastinference[interp]) (2.8.1)
    Requirement already satisfied: MarkupSafe>=2.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from jinja2->spacy<4->fastai>=2.0.0->fastinference[interp]) (2.0.1)
    Requirement already satisfied: cycler>=0.10 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from matplotlib->fastai>=2.0.0->fastinference[interp]) (0.10.0)
    Requirement already satisfied: kiwisolver>=1.0.1 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from matplotlib->fastai>=2.0.0->fastinference[interp]) (1.3.1)
    Requirement already satisfied: pytz>=2017.3 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from pandas->fastai>=2.0.0->fastinference[interp]) (2021.1)
    Requirement already satisfied: retrying>=1.3.3 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from plotly->fastinference[interp]) (1.3.3)
    Requirement already satisfied: patsy>=0.5.1 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from plotnine->fastinference[interp]) (0.5.1)
    Requirement already satisfied: mizani>=0.7.3 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from plotnine->fastinference[interp]) (0.7.3)
    Requirement already satisfied: statsmodels>=0.12.1 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from plotnine->fastinference[interp]) (0.12.2)
    Requirement already satisfied: descartes>=1.1.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from plotnine->fastinference[interp]) (1.1.0)
    Requirement already satisfied: palettable in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from mizani>=0.7.3->plotnine->fastinference[interp]) (3.3.0)
    Requirement already satisfied: joblib>=0.11 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from scikit-learn->fastai>=2.0.0->fastinference[interp]) (1.0.1)
    Requirement already satisfied: threadpoolctl>=2.0.0 in c:\users\pc\miniconda3\envs\fastai\lib\site-packages (from scikit-learn->fastai>=2.0.0->fastinference[interp]) (2.1.0)
    Building wheels for collected packages: shap
      Building wheel for shap (setup.py) ... error
      ERROR: Command errored out with exit status 1:
       command: 'c:\users\pc\miniconda3\envs\fastai\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\pc\\AppData\\Local\\Temp\\pip-install-hejpe705\\shap_2eff441acfc04fe3bd0ac670c603bf74\\setup.py'"'"'; __file__='"'"'C:\\Users\\pc\\AppData\\Local\\Temp\\pip-install-hejpe705\\shap_2eff441acfc04fe3bd0ac670c603bf74\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d 'c:\users\pc\AppData\Local\Temp\pip-wheel-gcnda71l'
           cwd: c:\users\pc\AppData\Local\Temp\pip-install-hejpe705\shap_2eff441acfc04fe3bd0ac670c603bf74\
      Complete output (67 lines):
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build\lib.win-amd64-3.8
      creating build\lib.win-amd64-3.8\shap
      copying shap\common.py -> build\lib.win-amd64-3.8\shap
      copying shap\datasets.py -> build\lib.win-amd64-3.8\shap
      copying shap\__init__.py -> build\lib.win-amd64-3.8\shap
      creating build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\additive.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\bruteforce.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\explainer.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\gradient.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\kernel.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\linear.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\mimic.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\partition.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\permutation.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\pytree.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\sampling.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\tf_utils.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\tree.py -> build\lib.win-amd64-3.8\shap\explainers
      copying shap\explainers\__init__.py -> build\lib.win-amd64-3.8\shap\explainers
      creating build\lib.win-amd64-3.8\shap\explainers\other
      copying shap\explainers\other\coefficent.py -> build\lib.win-amd64-3.8\shap\explainers\other
      copying shap\explainers\other\lime.py -> build\lib.win-amd64-3.8\shap\explainers\other
      copying shap\explainers\other\maple.py -> build\lib.win-amd64-3.8\shap\explainers\other
      copying shap\explainers\other\random.py -> build\lib.win-amd64-3.8\shap\explainers\other
      copying shap\explainers\other\treegain.py -> build\lib.win-amd64-3.8\shap\explainers\other
      copying shap\explainers\other\__init__.py -> build\lib.win-amd64-3.8\shap\explainers\other
      creating build\lib.win-amd64-3.8\shap\explainers\deep
      copying shap\explainers\deep\deep_pytorch.py -> build\lib.win-amd64-3.8\shap\explainers\deep
      copying shap\explainers\deep\deep_tf.py -> build\lib.win-amd64-3.8\shap\explainers\deep
      copying shap\explainers\deep\__init__.py -> build\lib.win-amd64-3.8\shap\explainers\deep
      creating build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\bar.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\colorconv.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\colors.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\decision.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\dependence.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\embedding.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\force.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\force_matplotlib.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\image.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\monitoring.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\partial_dependence.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\summary.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\text.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\waterfall.py -> build\lib.win-amd64-3.8\shap\plots
      copying shap\plots\__init__.py -> build\lib.win-amd64-3.8\shap\plots
      creating build\lib.win-amd64-3.8\shap\benchmark
      copying shap\benchmark\experiments.py -> build\lib.win-amd64-3.8\shap\benchmark
      copying shap\benchmark\measures.py -> build\lib.win-amd64-3.8\shap\benchmark
      copying shap\benchmark\methods.py -> build\lib.win-amd64-3.8\shap\benchmark
      copying shap\benchmark\metrics.py -> build\lib.win-amd64-3.8\shap\benchmark
      copying shap\benchmark\models.py -> build\lib.win-amd64-3.8\shap\benchmark
      copying shap\benchmark\plots.py -> build\lib.win-amd64-3.8\shap\benchmark
      copying shap\benchmark\__init__.py -> build\lib.win-amd64-3.8\shap\benchmark
      creating build\lib.win-amd64-3.8\shap\plots\resources
      copying shap\plots\resources\bundle.js -> build\lib.win-amd64-3.8\shap\plots\resources
      copying shap\plots\resources\logoSmallGray.png -> build\lib.win-amd64-3.8\shap\plots\resources
      copying shap\tree_shap.h -> build\lib.win-amd64-3.8\shap
      running build_ext
      numpy.get_include() c:\users\pc\miniconda3\envs\fastai\lib\site-packages\numpy\core\include
      building 'shap._cext' extension
      error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/
      ----------------------------------------
      ERROR: Failed building wheel for shap
      Running setup.py clean for shap
    Failed to build shap
    Installing collected packages: shap, plotnine, plotly
      Attempting uninstall: shap
        Found existing installation: shap 0.39.0
        Uninstalling shap-0.39.0:
          Successfully uninstalled shap-0.39.0
        Running setup.py install for shap ... error
        ERROR: Command errored out with exit status 1:
         command: 'c:\users\pc\miniconda3\envs\fastai\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\pc\\AppData\\Local\\Temp\\pip-install-hejpe705\\shap_2eff441acfc04fe3bd0ac670c603bf74\\setup.py'"'"'; __file__='"'"'C:\\Users\\pc\\AppData\\Local\\Temp\\pip-install-hejpe705\\shap_2eff441acfc04fe3bd0ac670c603bf74\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record 'c:\users\pc\AppData\Local\Temp\pip-record-uor8hprs\install-record.txt' --single-version-externally-managed --compile --install-headers 'c:\users\pc\miniconda3\envs\fastai\Include\shap'
             cwd: c:\users\pc\AppData\Local\Temp\pip-install-hejpe705\shap_2eff441acfc04fe3bd0ac670c603bf74\
        Complete output (67 lines):
        running install
        running build
        running build_py
        creating build
        creating build\lib.win-amd64-3.8
        creating build\lib.win-amd64-3.8\shap
        copying shap\common.py -> build\lib.win-amd64-3.8\shap
        copying shap\datasets.py -> build\lib.win-amd64-3.8\shap
        copying shap\__init__.py -> build\lib.win-amd64-3.8\shap
        creating build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\additive.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\bruteforce.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\explainer.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\gradient.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\kernel.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\linear.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\mimic.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\partition.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\permutation.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\pytree.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\sampling.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\tf_utils.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\tree.py -> build\lib.win-amd64-3.8\shap\explainers
        copying shap\explainers\__init__.py -> build\lib.win-amd64-3.8\shap\explainers
        creating build\lib.win-amd64-3.8\shap\explainers\other
        copying shap\explainers\other\coefficent.py -> build\lib.win-amd64-3.8\shap\explainers\other
        copying shap\explainers\other\lime.py -> build\lib.win-amd64-3.8\shap\explainers\other
        copying shap\explainers\other\maple.py -> build\lib.win-amd64-3.8\shap\explainers\other
        copying shap\explainers\other\random.py -> build\lib.win-amd64-3.8\shap\explainers\other
        copying shap\explainers\other\treegain.py -> build\lib.win-amd64-3.8\shap\explainers\other
        copying shap\explainers\other\__init__.py -> build\lib.win-amd64-3.8\shap\explainers\other
        creating build\lib.win-amd64-3.8\shap\explainers\deep
        copying shap\explainers\deep\deep_pytorch.py -> build\lib.win-amd64-3.8\shap\explainers\deep
        copying shap\explainers\deep\deep_tf.py -> build\lib.win-amd64-3.8\shap\explainers\deep
        copying shap\explainers\deep\__init__.py -> build\lib.win-amd64-3.8\shap\explainers\deep
        creating build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\bar.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\colorconv.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\colors.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\decision.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\dependence.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\embedding.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\force.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\force_matplotlib.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\image.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\monitoring.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\partial_dependence.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\summary.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\text.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\waterfall.py -> build\lib.win-amd64-3.8\shap\plots
        copying shap\plots\__init__.py -> build\lib.win-amd64-3.8\shap\plots
        creating build\lib.win-amd64-3.8\shap\benchmark
        copying shap\benchmark\experiments.py -> build\lib.win-amd64-3.8\shap\benchmark
        copying shap\benchmark\measures.py -> build\lib.win-amd64-3.8\shap\benchmark
        copying shap\benchmark\methods.py -> build\lib.win-amd64-3.8\shap\benchmark
        copying shap\benchmark\metrics.py -> build\lib.win-amd64-3.8\shap\benchmark
        copying shap\benchmark\models.py -> build\lib.win-amd64-3.8\shap\benchmark
        copying shap\benchmark\plots.py -> build\lib.win-amd64-3.8\shap\benchmark
        copying shap\benchmark\__init__.py -> build\lib.win-amd64-3.8\shap\benchmark
        creating build\lib.win-amd64-3.8\shap\plots\resources
        copying shap\plots\resources\bundle.js -> build\lib.win-amd64-3.8\shap\plots\resources
        copying shap\plots\resources\logoSmallGray.png -> build\lib.win-amd64-3.8\shap\plots\resources
        copying shap\tree_shap.h -> build\lib.win-amd64-3.8\shap
        running build_ext
        numpy.get_include() c:\users\pc\miniconda3\envs\fastai\lib\site-packages\numpy\core\include
        building 'shap._cext' extension
        error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/
        ----------------------------------------
      Rolling back uninstall of shap
      Moving to c:\users\pc\miniconda3\envs\fastai\lib\site-packages\shap-0.39.0.dist-info\
       from c:\users\pc\miniconda3\envs\fastai\Lib\site-packages\~hap-0.39.0.dist-info
      Moving to c:\users\pc\miniconda3\envs\fastai\lib\site-packages\shap\
       from c:\users\pc\miniconda3\envs\fastai\Lib\site-packages\~hap
    ERROR: Command errored out with exit status 1: 'c:\users\pc\miniconda3\envs\fastai\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\pc\\AppData\\Local\\Temp\\pip-install-hejpe705\\shap_2eff441acfc04fe3bd0ac670c603bf74\\setup.py'"'"'; __file__='"'"'C:\\Users\\pc\\AppData\\Local\\Temp\\pip-install-hejpe705\\shap_2eff441acfc04fe3bd0ac670c603bf74\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record 'c:\users\pc\AppData\Local\Temp\pip-record-uor8hprs\install-record.txt' --single-version-externally-managed --compile --install-headers 'c:\users\pc\miniconda3\envs\fastai\Include\shap' Check the logs for full command output.
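
    The root cause is visible in the log: pip builds shap 0.35.0 from source on Windows, which requires the MSVC toolchain. Two possible workarounds (untested here): install the "Microsoft C++ Build Tools" linked in the error, or keep the already-installed prebuilt shap and skip the pinned extra:

    pip install fastinference
    pip install shap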
    
    opened by EtienneT 2
  • Bump addressable from 2.7.0 to 2.8.1 in /docs

    Bumps addressable from 2.7.0 to 2.8.1.

    Changelog

    Sourced from addressable's changelog.

    Addressable 2.8.1

    • refactor Addressable::URI.normalize_path to address linter offenses (#430)
    • remove redundant colon in Addressable::URI::CharacterClasses::AUTHORITY regex (#438)
    • update gemspec to reflect supported Ruby versions (#466, #464, #463)
    • compatibility w/ public_suffix 5.x (#466, #465, #460)
    • fixes "invalid byte sequence in UTF-8" exception when unencoding URLs containing non UTF-8 characters (#459)
    • Ractor compatibility (#449)
    • use the whole string instead of a single line for template match (#431)
    • force UTF-8 encoding only if needed (#341)

    #460: sporkmonger/addressable#460 #463: sporkmonger/addressable#463 #464: sporkmonger/addressable#464 #465: sporkmonger/addressable#465 #466: sporkmonger/addressable#466

    Addressable 2.8.0

    • fixes ReDoS vulnerability in Addressable::Template#match
    • no longer replaces + with spaces in queries for non-http(s) schemes
    • fixed encoding ipv6 literals
    • the :compacted flag for normalized_query now dedupes parameters
    • fix broken escape_component alias
    • dropping support for Ruby 2.0 and 2.1
    • adding Ruby 3.0 compatibility for development tasks
    • drop support for rack-mount and remove Addressable::Template#generate
    • performance improvements
    • switch CI/CD to GitHub Actions
    Commits
    • 8657465 Update version, gemspec, and CHANGELOG for 2.8.1 (#474)
    • 4fc5bb6 CI: remove Ubuntu 18.04 job (#473)
    • 860fede Force UTF-8 encoding only if needed (#341)
    • 99810af Merge pull request #431 from ojab/ct-_do_not_parse_multiline_strings
    • 7ce0f48 Merge branch 'main' into ct-_do_not_parse_multiline_strings
    • 7ecf751 Merge pull request #449 from okeeblow/freeze_concatenated_strings
    • 41f12dd Merge branch 'main' into freeze_concatenated_strings
    • 068f673 Merge pull request #459 from jarthod/iso-encoding-problem
    • b4c9882 Merge branch 'main' into iso-encoding-problem
    • 08d27e8 Merge pull request #471 from sporkmonger/sporkmonger-enable-codeql
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 0
  • Bump tzinfo from 1.2.5 to 1.2.10 in /docs

    Bumps tzinfo from 1.2.5 to 1.2.10.

    Release notes

    Sourced from tzinfo's releases.

    v1.2.10

    TZInfo v1.2.10 on RubyGems.org

    v1.2.9

    • Fixed an incorrect InvalidTimezoneIdentifier exception raised when loading a zoneinfo file that includes rules specifying an additional transition to the final defined offset (for example, Africa/Casablanca in version 2018e of the Time Zone Database). #123.

    TZInfo v1.2.9 on RubyGems.org

    v1.2.8

    • Added support for handling "slim" format zoneinfo files that are produced by default by zic version 2020b and later. The POSIX-style TZ string is now used to calculate DST transition times after the final defined transition in the file. The 64-bit section is now always used regardless of whether Time has support for 64-bit times. #120.
    • Rubinius is no longer supported.

    TZInfo v1.2.8 on RubyGems.org

    v1.2.7

    • Fixed 'wrong number of arguments' errors when running on JRuby 9.0. #114.
    • Fixed warnings when running on Ruby 2.8. #112.

    TZInfo v1.2.7 on RubyGems.org

    v1.2.6

    • Timezone#strftime('%s', time) will now return the correct number of seconds since the epoch. #91.
    • Removed the unused TZInfo::RubyDataSource::REQUIRE_PATH constant.
    • Fixed "SecurityError: Insecure operation - require" exceptions when loading data with recent Ruby releases in safe mode.
    • Fixed warnings when running on Ruby 2.7. #106 and #111.

    TZInfo v1.2.6 on RubyGems.org

    Changelog

    Sourced from tzinfo's changelog.

    Version 1.2.10 - 19-Jul-2022

    Version 1.2.9 - 16-Dec-2020

    • Fixed an incorrect InvalidTimezoneIdentifier exception raised when loading a zoneinfo file that includes rules specifying an additional transition to the final defined offset (for example, Africa/Casablanca in version 2018e of the Time Zone Database). #123.

    Version 1.2.8 - 8-Nov-2020

    • Added support for handling "slim" format zoneinfo files that are produced by default by zic version 2020b and later. The POSIX-style TZ string is now used to calculate DST transition times after the final defined transition in the file. The 64-bit section is now always used regardless of whether Time has support for 64-bit times. #120.
    • Rubinius is no longer supported.

    Version 1.2.7 - 2-Apr-2020

    • Fixed 'wrong number of arguments' errors when running on JRuby 9.0. #114.
    • Fixed warnings when running on Ruby 2.8. #112.

    Version 1.2.6 - 24-Dec-2019

    • Timezone#strftime('%s', time) will now return the correct number of seconds since the epoch. #91.
    • Removed the unused TZInfo::RubyDataSource::REQUIRE_PATH constant.
    • Fixed "SecurityError: Insecure operation - require" exceptions when loading data with recent Ruby releases in safe mode.
    • Fixed warnings when running on Ruby 2.7. #106 and #111.
    Commits
    • 0814dcd Fix the release date.
    • fd05e2a Preparing v1.2.10.
    • b98c32e Merge branch 'fix-directory-traversal-1.2' into 1.2
    • ac3ee68 Remove unnecessary escaping of + within regex character classes.
    • 9d49bf9 Fix relative path loading tests.
    • 394c381 Remove private_constant for consistency and compatibility.
    • 5e9f990 Exclude Arch Linux's SECURITY file from the time zone index.
    • 17fc9e1 Workaround for 'Permission denied - NUL' errors with JRuby on Windows.
    • 6bd7a51 Update copyright years.
    • 9905ca9 Fix directory traversal in Timezone.get when using Ruby data source
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 0
  • Bump nokogiri from 1.12.5 to 1.13.6 in /docs

    Bumps nokogiri from 1.12.5 to 1.13.6.

    Release notes

    Sourced from nokogiri's releases.

    1.13.6 / 2022-05-08

    Security

    • [CRuby] Address CVE-2022-29181, improper handling of unexpected data types, related to untrusted inputs to the SAX parsers. See GHSA-xh29-r2w5-wx8m for more information.

    Improvements

    • {HTML4,XML}::SAX::{Parser,ParserContext} constructor methods now raise TypeError instead of segfaulting when an incorrect type is passed.

    sha256:

    58417c7c10f78cd1c0e1984f81538300d4ea98962cfd3f46f725efee48f9757a  nokogiri-1.13.6-aarch64-linux.gem
    a2b04ec3b1b73ecc6fac619b41e9fdc70808b7a653b96ec97d04b7a23f158dbc  nokogiri-1.13.6-arm64-darwin.gem
    4437f2d03bc7da8854f4aaae89e24a98cf5c8b0212ae2bc003af7e65c7ee8e27  nokogiri-1.13.6-java.gem
    99d3e212bbd5e80aa602a1f52d583e4f6e917ec594e6aa580f6aacc253eff984  nokogiri-1.13.6-x64-mingw-ucrt.gem
    a04f6154a75b6ed4fe2d0d0ff3ac02f094b54e150b50330448f834fa5726fbba  nokogiri-1.13.6-x64-mingw32.gem
    a13f30c2863ef9e5e11240dd6d69ef114229d471018b44f2ff60bab28327de4d  nokogiri-1.13.6-x86-linux.gem
    63a2ca2f7a4f6bd9126e1695037f66c8eb72ed1e1740ef162b4480c57cc17dc6  nokogiri-1.13.6-x86-mingw32.gem
    2b266e0eb18030763277b30dc3d64337f440191e2bd157027441ac56a59d9dfe  nokogiri-1.13.6-x86_64-darwin.gem
    3fa37b0c3b5744af45f9da3e4ae9cbd89480b35e12ae36b5e87a0452e0b38335  nokogiri-1.13.6-x86_64-linux.gem
    b1512fdc0aba446e1ee30de3e0671518eb363e75fab53486e99e8891d44b8587  nokogiri-1.13.6.gem
    

    1.13.5 / 2022-05-04

    Security

    Dependencies

    • [CRuby] Vendored libxml2 is updated from v2.9.13 to v2.9.14.

    Improvements

    • [CRuby] The libxml2 HTML4 parser no longer exhibits quadratic behavior when recovering some broken markup related to start-of-tag and bare < characters.

    Changed

    • [CRuby] The libxml2 HTML4 parser in v2.9.14 recovers from some broken markup differently. Notably, the XML CDATA escape sequence <![CDATA[ and incorrectly-opened comments will result in HTML text nodes starting with &lt;! instead of skipping the invalid tag. This behavior is a direct result of the quadratic-behavior fix noted above. The behavior of downstream sanitizers relying on this behavior will also change. Some tests describing the changed behavior are in test/html4/test_comments.rb.

    ... (truncated)

    Changelog

    Sourced from nokogiri's changelog.

    1.13.6 / 2022-05-08

    Security

    • [CRuby] Address CVE-2022-29181, improper handling of unexpected data types, related to untrusted inputs to the SAX parsers. See GHSA-xh29-r2w5-wx8m for more information.

    Improvements

    • {HTML4,XML}::SAX::{Parser,ParserContext} constructor methods now raise TypeError instead of segfaulting when an incorrect type is passed.

    1.13.5 / 2022-05-04

    Security

    Dependencies

    • [CRuby] Vendored libxml2 is updated from v2.9.13 to v2.9.14.

    Improvements

    • [CRuby] The libxml2 HTML parser no longer exhibits quadratic behavior when recovering some broken markup related to start-of-tag and bare < characters.

    Changed

    • [CRuby] The libxml2 HTML parser in v2.9.14 recovers from some broken markup differently. Notably, the XML CDATA escape sequence <![CDATA[ and incorrectly-opened comments will result in HTML text nodes starting with &lt;! instead of skipping the invalid tag. This behavior is a direct result of the quadratic-behavior fix noted above. The behavior of downstream sanitizers relying on this behavior will also change. Some tests describing the changed behavior are in test/html4/test_comments.rb.

    1.13.4 / 2022-04-11

    Security

    Dependencies

    • [CRuby] Vendored zlib is updated from 1.2.11 to 1.2.12. (See LICENSE-DEPENDENCIES.md for details on which packages redistribute this library.)
    • [JRuby] Vendored Xerces-J (xerces:xercesImpl) is updated from 2.12.0 to 2.12.2.
    • [JRuby] Vendored nekohtml (org.cyberneko.html) is updated from a fork of 1.9.21 to 1.9.22.noko2. This fork is now publicly developed at https://github.com/sparklemotion/nekohtml

    ... (truncated)

    Commits
    • b7817b6 version bump to v1.13.6
    • 61b1a39 Merge pull request #2530 from sparklemotion/flavorjones-check-parse-memory-ty...
    • 83cc451 fix: {HTML4,XML}::SAX::{Parser,ParserContext} check arg types
    • 22c9e5b version bump to v1.13.5
    • 6155881 doc: update CHANGELOG for v1.13.5
    • c519a47 Merge pull request #2527 from sparklemotion/2525-update-libxml-2_9_14-v1_13_x
    • 66c2886 dep: update libxml2 to v2.9.14
    • b7c4cc3 test: unpend the LIBXML_LOADED_VERSION test on freebsd
    • eac7934 dev: require yaml
    • f3521ba style(rubocop): pend Style/FetchEnvVar for now
    • Additional commits viewable in compare view


    dependencies 
    opened by dependabot[bot] 0
  • Remove package from results, alternative option

    Remove package from results, alternative option

    Here is an alternative implementation that works better:

        # Note: `get_dependencies` is an assumed name here; the snippet's
        # original `def` line and imports were not included in the issue.
        import ast
        import pipdeptree

        def get_dependencies(
            package_name:str, # The name of a python package
            depth_limit:int=1, # How deep to follow nested dependencies
        ) -> dict: # A dictionary of {package:version}
            "Recursively grabs dependencies of python package"
            pkgs = pipdeptree.get_installed_distributions(local_only=False, user_only=False)
            tree = pipdeptree.PackageDAG.from_pkgs(pkgs)
            tree = tree.filter([package_name], None)
            def _get_deps(j, dep_dict, curr_depth=0):
                # Stop once `depth_limit` levels of nested dependencies have been walked
                if curr_depth > depth_limit: return dep_dict
                if isinstance(j, list):
                    for a in j:
                        _get_deps(a, dep_dict, curr_depth)
                elif isinstance(j, dict):
                    if 'package_name' in j.keys():
                        if j['package_name'] not in dep_dict.keys() and j['package_name'] != package_name:
                            dep_dict[j['package_name']] = j['installed_version']
                    if 'dependencies' in j.keys():
                        curr_depth += 1
                        return _get_deps(j['dependencies'], dep_dict, curr_depth)
                return dep_dict
            # pipdeptree renders the filtered tree as JSON text; parse it back
            # into Python objects before walking it
        return _get_deps(ast.literal_eval(pipdeptree.render_json_tree(tree, 4)), {})
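
    A hypothetical call, assuming a pipdeptree release that exposes these module-level helpers (e.g. the 2.0.x single-module versions):

        deps = get_dependencies('fastai', depth_limit=2)
        # deps maps each dependency reachable within two levels of
        # fastai's dependency tree to its installed version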
    
    opened by muellerzr 0
  • Error importing interpretation module

    Error importing interpretation module

    It appears this is triggered where get_features_core is passed to the delegates decorator. I switched get_features_core out for TabularLearner.get_features_core, which resolved the immediate error but seemed to cause issues further downstream.

    I don't know enough about the fastcore or fastinference libraries to understand the desired behavior here.

    >>> from fastinference.tabular.interpretation import *
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/lib/python3.8/site-packages/fastinference/tabular/interpretation.py", line 138, in <module>
        def get_top_features_corr(x:TabularLearner, df:Optional[pd.DataFrame]=None, thresh:float=0.8, **kwargs):
      File "/lib/python3.8/site-packages/fastcore/meta.py", line 111, in _f
        if to is None: to_f,from_f = f.__base__.__init__,f.__init__
    AttributeError: 'function' object has no attribute '__base__'
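
    A minimal sketch of the failure mode as I read it (stand-in functions, not the real ones from interpretation.py): fastcore's delegates() only supports a bare, target-less call on classes, where it can fall back to f.__base__.__init__; a plain function has no __base__, so the decorator needs an explicit target such as get_features_core.

        from fastcore.meta import delegates

        # Hypothetical stand-in for the real helper in interpretation.py
        def get_features_core(df, n_samples:int=100, normalize:bool=True): pass

        @delegates(get_features_core)  # OK: explicit target to copy kwargs from
        def get_top_features_corr(x, df=None, thresh:float=0.8, **kwargs): pass

        try:
            @delegates()  # no target: falls back to f.__base__, which only classes have
            def broken(x, **kwargs): pass
        except AttributeError as e:
            print(e)  # 'function' object has no attribute '__base__'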
    
    opened by ecatkins 0
Owner
Zachary Mueller
Software Design and Development Major at University of West Florida