
Overview


SimpleML

Machine learning that just works, for effortless production applications

Documentation: simpleml.readthedocs.io

Installation: pip install simpleml

History

SimpleML started as a persistence solution to simplify some of the most common pain points in new modeling projects. It offered an abstraction layer to implicitly version, persist, and load training iterations, making productionalizing a project an effortless process. Extensibility is central to the design, so it is compatible out of the box with modeling libraries (Scikit-Learn, Tensorflow/Keras, etc) and algorithms, making it a low-overhead, drop-in complement to existing workflows.

As the ML ops space has grown, more solutions are being offered to manage particular pain points or conform to opinionated development and release patterns. These patterns, while immensely powerful, are rigid and not always the ideal implementation for projects (and definitely not amenable to blending multiple frameworks in a project). SimpleML is also growing to address this gap by evolving from a persistence framework to an ML management framework. The goal is to unify existing and new solutions into a standardized ecosystem, giving developers the ease and flexibility to choose the right fit(s) for their projects. As before, SimpleML does not and will not define modeling algorithms; instead, it focuses on perfecting the glue that allows those algorithms, and now those solutions, to be used in real workflows that can be effortlessly deployed into real applications.

Architecture

Architecturally, SimpleML has a core set of components that map to the areas of ML management. Each of those in turn is extended and refined to support external libraries, tools, and infrastructure. Extensibility is the cornerstone for SimpleML and support for new extensions should be a simple, straightforward process without ever requiring monkey-patching.

Components

  • Persistables: Standardization
  • Executors: Portability, Scale
  • Adapters: Interoperability
  • ORM: Versioning, Lineage, Metadata Tracking, Reusability
  • Save Patterns: Persistence
  • Registries: Extensibility

Persistables

Persistables are the wrappers around artifacts (the actual objects generated by training that need to be deployed into production). They provide a standardized interface to manage and use artifacts, making it easy to work with artifacts from different libraries inside the same processing environment. Additionally, they map the idiosyncrasies of each framework onto a single access pattern so developers and scripts always use the same calls (eg always call "fit" instead of switching between fit, train, etc based on the library). See the source code for the inheritance pattern and examples of extending around any external library.
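
To make the idea concrete, here is a minimal sketch of the wrapping pattern, assuming hypothetical class and method names (illustrative only, not the actual SimpleML inheritance hierarchy):

from abc import ABC, abstractmethod

class PersistableModelSketch(ABC):
    """Hypothetical wrapper exposing a single, unified access pattern."""

    @abstractmethod
    def fit(self, X, y=None):
        """Always called "fit", regardless of the wrapped library's naming."""

class KerasModelSketch(PersistableModelSketch):
    def __init__(self, keras_model):
        self.external_model = keras_model

    def fit(self, X, y=None):
        # Map the unified call onto the library-specific training method
        return self.external_model.fit(X, y)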

Executors

Executors are the persistable-agnostic components that provide portability and scale. They handle execution so the same functions can be run on various backends without affecting the artifacts produced (examples: single process execution, multiprocessing, threading, containers, kubernetes, dask, ray, apache-beam, spark, etc). This intentional decoupling is a large part of what powers the diverse support for flexible productionalization (train once, deploy anywhere). Note that not every execution pattern is guaranteed to work natively with every persistable (exceptions will be noted as needed).
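
As an illustration of the decoupling (the function and backend names below are assumptions, not the SimpleML executor API), the same work can be dispatched to different backends without changing what gets produced:

from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def _call(task):
    # Module-level helper so tasks can be dispatched to process pools
    return task()

def run_tasks(tasks, backend='sequential'):
    # Same tasks, different execution backend; the artifacts produced are identical
    if backend == 'sequential':
        return [task() for task in tasks]
    if backend == 'threads':
        with ThreadPoolExecutor() as pool:
            return list(pool.map(_call, tasks))
    if backend == 'processes':
        # Tasks must be picklable for process-based execution
        with ProcessPoolExecutor() as pool:
            return list(pool.map(_call, tasks))
    raise ValueError(f'Unsupported backend: {backend}')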

Adapters

Adapters are the complements to persistables and executors. They are optional wrappers to align input requirements to operations. By definition adapters are stateless wrappers that have no functional impact on processing so they can be specified at runtime as needed. Additionally the output across different executors for the same operation is guaranteed to be identical. (eg creating a docker container for a persistable to run in kubernetes or wrapping a persistable in a ParDo to execute in apache-beam)
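
For intuition, a stateless adapter might look like the following sketch (the class is hypothetical; the real adapters target containers, ParDos, etc):

class BatchingAdapterSketch:
    """Hypothetical stateless adapter: aligns record-at-a-time input to a
    persistable that expects batches. Holds no state and does not change results."""

    def __init__(self, persistable):
        self.persistable = persistable

    def transform(self, record):
        # Wrap the single record in a batch, delegate, and unwrap the result
        return self.persistable.transform([record])[0]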

ORM

The ORM layer is the heart of metadata management. All persistables are integrated with the database to record specifications for reproducibility, lineage, and versioning. Depending on the workflow, that metadata can also be leveraged for reusability, accelerating development iterations by creating persistables only for new experiments and reusing existing ones otherwise.
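
As a rough illustration of the kind of metadata recorded per persistable (the field names here are representative only, not the exact schema):

example_persistable_record = {
    'name': 'titanic',              # logical identifier shared across versions
    'version': 3,                   # auto-incremented per name
    'registered_name': 'TitanicDataset',
    'hashed_params': 'a1b2c3...',   # fingerprint used for lineage and reuse lookups
    'dependencies': ['pipeline: titanic, version: 2'],
    'filepaths': {'disk_pickled': ['path/to/artifact.pkl']},
}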

Save Patterns

Save and load patterns are the mechanism that manages persistence. Artifacts vary in how they serialize, some with native handling and some requiring special treatment, but all need their training state saved so it can be loaded into a production environment. Save patterns allow for that customization by registering any serialization/deserialization technique, which is then automatically applied by the persistables (examples: pickle, hickle, hdf5, json, library native, database tables, etc).
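
A hedged sketch of the pattern (the class below is illustrative; consult the save pattern docs for the real registration hooks):

import pickle

class DiskPickleSavePatternSketch:
    """Hypothetical save pattern pairing a serializer with its deserializer."""

    @staticmethod
    def save(artifact, filepath):
        with open(filepath, 'wb') as f:
            pickle.dump(artifact, f)

    @staticmethod
    def load(filepath):
        with open(filepath, 'rb') as f:
            return pickle.load(f)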

Registries

Registries are the communication backend that allows users to change internal behavior or extend support at runtime. Registration can happen implicitly on import or explicitly as part of a script (eg register a serialization class for a save pattern or map an executor class to a particular backend parameter).
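
Conceptually, registration is just mapping a key to an implementation at runtime (an illustrative sketch, not the actual SimpleML registry objects):

SAVE_PATTERN_REGISTRY = {}

def register_save_pattern(key, cls):
    # Explicit registration; the same mapping could also happen implicitly on import
    SAVE_PATTERN_REGISTRY[key] = cls

class JsonSavePatternSketch:
    """Hypothetical serializer/deserializer pair."""

register_save_pattern('disk_json', JsonSavePatternSketch)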

Workflows

Workflows are largely up to individual developers, but there are some assumptions made about the process:

The primary assumption is that the ML lifecycle follows a DAG, which creates a forward-propagating dependency chain without altering previous pieces of the chain. There is considerable flexibility in what each of the steps can be, but they are generally assumed to flow modularly and mimic a data science project.

Thematic steps, in sequence, start with data management, move through transformation, model creation, and finally evaluation. These are further broken down in the following ways:

Data Management

  • Raw Datasets: The basic data block of (potentially) unformatted datasets. These datasets can be sourced from anywhere
  • Dataset Pipelines: The required transformation to turn unformatted data into what is expected to be seen in production -- These pipelines are completely optional and only used in derived datasets
  • Datasets: The "production formatted" datasets

Transformation

  • Pipelines: Transformation sequences to extract and process the dataset

Modeling

  • Models: The machine learning models

Evaluation

  • Metrics: Evaluation objects computed over the models and datasets

Examples

Examples are posted in response to requests under the Examples section. Please open an issue on GitHub to request more examples and title it "[Example Request] How to..."

Usage

Starting a project is as simple as defining the raw data and guiding the transformations. A minimal example using the Kaggle Titanic dataset is demonstrated below:

The first step in every project is to establish a database connection to manage metadata. Technically this step is only necessary if a persistable is saved or loaded, so ephemeral setups can skip this.

from simpleml.utils import Database

# Initialize Database Connection and upgrade the schema to the latest
# By default this will create a new local sqlite database
# The upgrade parameter is only necessary if the db is outdated or new
db = Database().initialize(upgrade=True)

The most direct way to use SimpleML is to treat it like other modeling frameworks with forward moving imperative actions (eg initialize, run methods, save). Notice how this workflow is identical to using the underlying libraries directly with a few additional parameters. That is because SimpleML wraps the underlying libraries and standardizes the interfaces.

This block (or any subset) can be executed as many times as desired and will create a new object each time with an autoincrementing version (for each "name").

from simpleml.datasets import SingleLabelPandasDataset
from simpleml.pipelines import RandomSplitPipeline
from simpleml.transformers import SklearnDictVectorizer, DataframeToRecords, FillWithValue
from simpleml.models import SklearnLogisticRegression
from simpleml.metrics import AccuracyMetric
from simpleml.constants import TEST_SPLIT


# Define Dataset and point to loading file -- Creates a pandas.DataFrame artifact
class TitanicDataset(SingleLabelPandasDataset):
    def build_dataframe(self):
        self.dataframe = self.load_csv('filepath/to/train.csv')

# Create Dataset and save it
dataset = TitanicDataset(name='titanic', label_columns=['Survived'])
dataset.build_dataframe()
dataset.save()  # this defaults to a pickle serialization

# Define the minimal transformers to fill nulls and one-hot encode text columns
transformers = [
    ('fill_zeros', FillWithValue(values=0.)),
    ('record_converter', DataframeToRecords()),
    ('vectorizer', SklearnDictVectorizer())
]

# Create Pipeline and save it - Use basic 80-20 test split
# Creates an sklearn.pipelines.Pipeline artifact
pipeline = RandomSplitPipeline(name='titanic', external_pipeline_class='sklearn', transformers=transformers,
                               train_size=0.8, validation_size=0.0, test_size=0.2)
pipeline.add_dataset(dataset)
pipeline.fit()
pipeline.save()  # this defaults to a pickle serialization

# Create Model and save it -- Creates an sklearn.linear_model.LogisticRegression artifact
model = SklearnLogisticRegression(name='titanic')
model.add_pipeline(pipeline)
model.fit()
model.save()  # this defaults to a pickle serialization

# Create Metric and save it
metric = AccuracyMetric(dataset_split=TEST_SPLIT)
metric.add_model(model)
metric.add_dataset(dataset)
metric.score()
metric.save()

The same operations can also be defined in a declarative way using wrapper utilities so only the parameters need to be specified. Additionally, if using a deterministic persistable wrapper (the object is fully initialized at construction and not subject to user changes), the automatically generated metadata can be used to identify existing artifacts without having to recreate them.

from simpleml.utils import DatasetCreator, PipelineCreator, ModelCreator, MetricCreator

# ---------------------------------------------------------------------------- #
# Option 1: Explicit object creation (pass in dependencies)
# ---------------------------------------------------------------------------- #
# Object defining parameters
dataset_kwargs = {'name': 'titanic', 'registered_name': 'TitanicDataset', 'label_columns': ['Survived']}
pipeline_kwargs = {'name': 'titanic', 'registered_name': 'RandomSplitPipeline', 'transformers': transformers, 'train_size': 0.8, 'validation_size': 0.0, 'test_size': 0.2}
model_kwargs = {'name': 'titanic', 'registered_name': 'SklearnLogisticRegression'}
metric_kwargs = {'registered_name': 'AccuracyMetric', 'dataset_split': TEST_SPLIT}

# each creator has two methods - `retrieve_or_create` and `create`. using create will
# create a new persistable each time while retrieve_or_create will first look for a matching persistable
dataset = DatasetCreator.retrieve_or_create(**dataset_kwargs)
pipeline = PipelineCreator.retrieve_or_create(dataset=dataset, **pipeline_kwargs)
model = ModelCreator.retrieve_or_create(pipeline=pipeline, **model_kwargs)
metric = MetricCreator.retrieve_or_create(model=model, dataset=dataset, **metric_kwargs)     

# ---------------------------------------------------------------------------- #
# Option 2: Implicit object creation (pass in dependency references - nested)
# Does not require dependency existence at this time, good for compiling job definitions and executing on remote, distributed nodes
# ---------------------------------------------------------------------------- #
# Nested dependencies
pipeline_kwargs['dataset_kwargs'] = dataset_kwargs
model_kwargs['pipeline_kwargs'] = pipeline_kwargs
metric_kwargs['model_kwargs'] = model_kwargs

# each creator has two methods - `retrieve_or_create` and `create`. using create will
# create a new persistable each time while retrieve_or_create will first look for a matching persistable
dataset = DatasetCreator.retrieve_or_create(**dataset_kwargs)
pipeline = PipelineCreator.retrieve_or_create(dataset_kwargs=dataset_kwargs, **pipeline_kwargs)
model = ModelCreator.retrieve_or_create(pipeline_kwargs=pipeline_kwargs, **model_kwargs)
metric = MetricCreator.retrieve_or_create(model_kwargs=model_kwargs, dataset_kwargs=dataset_kwargs, **metric_kwargs)     

This workflow is modeled as a DAG, which means there is room for parallelization, but dependencies are assumed to exist upon execution. Persistable creators are intentionally designed to take either a dependent object as input or a reference to one. This allows jobs to be defined before their dependencies exist, with lazy loading when they are required. Note that this comes at the cost of additional computation: to match a reference to its dependency, a dummy persistable must be created and compared, unless the reference is unique (like name and version), in which case the dependency is known to exist and can be efficiently loaded later. This form also enables config files to parameterize training instead of requiring an active shell to interactively define the objects.
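
A minimal sketch of that config-driven form, assuming a YAML file that mirrors the nested kwargs above (the filename and file contents are illustrative):

from simpleml.utils import ModelCreator
import yaml

# train_config.yaml (hypothetical) mirrors the nested kwargs, eg:
# model:
#   name: titanic
#   registered_name: SklearnLogisticRegression
#   pipeline_kwargs:
#     name: titanic
#     registered_name: RandomSplitPipeline
#     ...

with open('train_config.yaml') as f:
    config = yaml.safe_load(f)

model = ModelCreator.retrieve_or_create(**config['model'])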

Once artifacts have been created, they can be easily retrieved by their name attribute (or any other identifying metadata). By default the latest version for the supplied parameters will be returned, but this can be overridden by explicitly passing a version number. This makes productionalization as simple as defining a deployment harness to process new requests.

from simpleml.utils import PersistableLoader

# Notice versions are not shared between persistable types and can increment differently depending on iterations
dataset = PersistableLoader.load_dataset(name='titanic', version=7)
pipeline = PersistableLoader.load_pipeline(name='titanic', version=6)
model = PersistableLoader.load_model(name='titanic', version=8)
metric = PersistableLoader.load_metric(name='classification_accuracy', model_id=model.id)

When it comes to production, the training data is typically no longer needed, so this mechanism becomes as simple as loading the feature pipeline and model:

desired_model = PersistableLoader.load_model(name='titanic', version=10)
# Implicitly pass new data through linked pipeline via transform param
desired_model.predict_proba(new_dataframe, transform=True)

or explicitly load a pipeline to use (by default, the pipeline the model was trained on will be used):

desired_pipeline = PersistableLoader.load_pipeline(name='titanic', version=11)
desired_model = PersistableLoader.load_model(name='titanic', version=10)
desired_model.predict_proba(desired_pipeline.transform(new_dataframe), transform=False)
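
As a sketch of such a deployment harness (the web framework and endpoint are assumptions, not part of SimpleML), the loaded model can be wired into a simple prediction service:

import pandas as pd
from flask import Flask, jsonify, request

from simpleml.utils import PersistableLoader

app = Flask(__name__)
model = PersistableLoader.load_model(name='titanic', version=10)

@app.route('/predict', methods=['POST'])
def predict():
    # Expect a JSON list of feature records; transform=True runs the linked pipeline
    new_dataframe = pd.DataFrame(request.get_json())
    probabilities = model.predict_proba(new_dataframe, transform=True)
    return jsonify(probabilities.tolist())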

The Vision

Ultimately SimpleML should fill the gap many data scientists currently face with a simple and painless management layer. Furthermore it will be extended in a way that lowers the technical barrier for all developers to use machine learning in their projects. If it resonates with you, consider opening a PR and contributing!

Future features I would like to introduce:

  • Browser GUI with drag-n-drop components for each step in the process (click on a dataset, pile transformers as blocks, click on a model type...)
  • App-Store style tabs for community shared persistables (datasets, transformers...)
  • Automatic API hosting for models (click "deploy" for REST API)

Support

SimpleML is a community project, developed on the side, to address a lot of the pain points I have felt creating ML applications. If you find it helpful and would like to support further development, please consider becoming a Sponsor ❤️ or opening a PR.

Contract & Technical Support

For support implementing and extending SimpleML or architecting a machine learning tech stack, contact the author Elisha Yadgaran 📧 for rates.

Enterprise Support

There is a vision to eventually offer a managed enterprise version, but that is not being pursued at the moment. Regardless of that outcome, SimpleML will always stay an open source framework and offer a self-hosted version.

Comments
  • Bump paramiko from 2.9.2 to 2.10.1 in /docs

    Bump paramiko from 2.9.2 to 2.10.1 in /docs

    Bumps paramiko from 2.9.2 to 2.10.1.

    Commits
    • 286bd9f Cut 2.10.1
    • 4c491e2 Fix CVE re: PKey.write_private_key chmod race
    • aa3cc6f Cut 2.10.0
    • e50e19f Fix up changelog entry with real links
    • 02ad67e Helps to actually leverage your mocked system calls
    • 29d7bf4 Clearly our agent stuff is not fully tested yet...
    • 5fcb8da OpenSSH docs state %C should also work in IdentityFile and Match exec
    • 1bf3dce Changelog enhancement
    • f6342fc Prettify, add %C as acceptable controlpath token, mock gethostname
    • 3f3451f Add to changelog
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 2
  • Bump pillow from 9.0.0 to 9.0.1 in /docs

    Bump pillow from 9.0.0 to 9.0.1 in /docs

    Bumps pillow from 9.0.0 to 9.0.1.

    Release notes

    Sourced from pillow's releases.

    9.0.1

    https://pillow.readthedocs.io/en/stable/releasenotes/9.0.1.html

    Changes

    • In show_file, use os.remove to remove temporary images. CVE-2022-24303 #6010 [@​radarhere, @​hugovk]
    • Restrict builtins within lambdas for ImageMath.eval. CVE-2022-22817 #6009 [radarhere]
    Changelog

    Sourced from pillow's changelog.

    9.0.1 (2022-02-03)

    • In show_file, use os.remove to remove temporary images. CVE-2022-24303 #6010 [radarhere, hugovk]

    • Restrict builtins within lambdas for ImageMath.eval. CVE-2022-22817 #6009 [radarhere]

    Commits
    • 6deac9e 9.0.1 version bump
    • c04d812 Update CHANGES.rst [ci skip]
    • 4fabec3 Added release notes for 9.0.1
    • 02affaa Added delay after opening image with xdg-open
    • ca0b585 Updated formatting
    • 427221e In show_file, use os.remove to remove temporary images
    • c930be0 Restrict builtins within lambdas for ImageMath.eval
    • 75b69dd Dont need to pin for GHA
    • cd938a7 Autolink CWE numbers with sphinx-issues
    • 2e9c461 Add CVE IDs
    • See full diff in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 2
  • Bump tensorflow from 2.7.0 to 2.8.0 in /docs

    Bump tensorflow from 2.7.0 to 2.8.0 in /docs

    Bumps tensorflow from 2.7.0 to 2.8.0.

    Release notes

    Sourced from tensorflow's releases.

    TensorFlow 2.8.0

    Release 2.8.0

    Major Features and Improvements

    • tf.lite:

      • Added TFLite builtin op support for the following TF ops:
        • tf.raw_ops.Bucketize op on CPU.
        • tf.where op for data types tf.int32/tf.uint32/tf.int8/tf.uint8/tf.int64.
        • tf.random.normal op for output data type tf.float32 on CPU.
        • tf.random.uniform op for output data type tf.float32 on CPU.
        • tf.random.categorical op for output data type tf.int64 on CPU.
    • tensorflow.experimental.tensorrt:

      • conversion_params is now deprecated inside TrtGraphConverterV2 in favor of direct arguments: max_workspace_size_bytes, precision_mode, minimum_segment_size, maximum_cached_engines, use_calibration and allow_build_at_runtime.
      • Added a new parameter called save_gpu_specific_engines to the .save() function inside TrtGraphConverterV2. When False, the .save() function won't save any TRT engines that have been built. When True (default), the original behavior is preserved.
      • TrtGraphConverterV2 provides a new API called .summary() which outputs a summary of the inference converted by TF-TRT. It namely shows each TRTEngineOp with their input(s)' and output(s)' shape and dtype. A detailed version of the summary is available which prints additionally all the TensorFlow OPs included in each of the TRTEngineOps.
    • tf.tpu.experimental.embedding:

      • tf.tpu.experimental.embedding.FeatureConfig now takes an additional argument output_shape which can specify the shape of the output activation for the feature.
      • tf.tpu.experimental.embedding.TPUEmbedding now has the same behavior as tf.tpu.experimental.embedding.serving_embedding_lookup which can take arbitrary rank of dense and sparse tensor. For ragged tensor, though the input tensor remains to be rank 2, the activations now can be rank 2 or above by specifying the output shape in the feature config or via the build method.
    • Add tf.config.experimental.enable_op_determinism, which makes TensorFlow ops run deterministically at the cost of performance. Replaces the TF_DETERMINISTIC_OPS environmental variable, which is now deprecated. The "Bug Fixes and Other Changes" section lists more determinism-related changes.

    • (Since TF 2.7) Add PluggableDevice support to TensorFlow Profiler.

    Bug Fixes and Other Changes

    • tf.data:

      • The optimization parallel_batch now becomes default if not disabled by users, which will parallelize copying of batch elements.
      • Added the ability for TensorSliceDataset to identify and handle inputs that are files. This enables creating hermetic SavedModels when using datasets created from files.
        • The optimization parallel_batch now becomes default if not disabled by users, which will parallelize copying of batch elements.
        • Added the ability for TensorSliceDataset to identify and handle inputs that are files. This enables creating hermetic SavedModels when using datasets created from files.
    • tf.lite:

      • Adds GPU Delegation support for serialization to Java API. This boosts initialization time up to 90% when OpenCL is available.
      • Deprecated Interpreter::SetNumThreads, in favor of InterpreterBuilder::SetNumThreads.
    • tf.keras:

      • Adds tf.compat.v1.keras.utils.get_or_create_layer to aid migration to TF2 by enabling tracking of nested keras models created in TF1-style, when used with the tf.compat.v1.keras.utils.track_tf1_style_variables decorator.
      • Added a tf.keras.layers.experimental.preprocessing.HashedCrossing layer which applies the hashing trick to the concatenation of crossed scalar inputs. This provides a stateless way to try adding feature crosses of integer or string data to a model.
      • Removed keras.layers.experimental.preprocessing.CategoryCrossing. Users should migrate to the HashedCrossing layer or use tf.sparse.cross/tf.ragged.cross directly.
      • Added additional standardize and split modes to TextVectorization:
        • standardize="lower" will lowercase inputs.
        • standardize="string_punctuation" will remove all puncuation.
        • split="character" will split on every unicode character.
      • Added an output_mode argument to the Discretization and Hashing layers with the same semantics as other preprocessing layers. All categorical preprocessing layers now support output_mode.
      • All preprocessing layer output will follow the compute dtype of a tf.keras.mixed_precision.Policy, unless constructed with output_mode="int" in which case output will be tf.int64. The output type of any preprocessing layer can be controlled individually by passing a dtype argument to the layer.
      • tf.random.Generator for keras initializers and all RNG code.
      • Added 3 new APIs for enable/disable/check the usage of tf.random.Generator in keras backend, which will be the new backend for all the RNG in Keras. We plan to switch on the new code path by default in tf 2.8, and the behavior change will likely to cause some breakage on user side (eg if the test is checking against some golden nubmer). These 3 APIs will allow user to disable and switch back to legacy behavior if they prefer. In future (eg TF 2.10), we expect to totally remove the legacy code path (stateful random Ops), and these 3 APIs will be removed as well.

    ... (truncated)

    Changelog

    Sourced from tensorflow's changelog.

    Release 2.8.0

    Major Features and Improvements

    • tf.lite:

      • Added TFLite builtin op support for the following TF ops:
        • tf.raw_ops.Bucketize op on CPU.
        • tf.where op for data types tf.int32/tf.uint32/tf.int8/tf.uint8/tf.int64.
        • tf.random.normal op for output data type tf.float32 on CPU.
        • tf.random.uniform op for output data type tf.float32 on CPU.
        • tf.random.categorical op for output data type tf.int64 on CPU.
    • tensorflow.experimental.tensorrt:

      • conversion_params is now deprecated inside TrtGraphConverterV2 in favor of direct arguments: max_workspace_size_bytes, precision_mode, minimum_segment_size, maximum_cached_engines, use_calibration and allow_build_at_runtime.
      • Added a new parameter called save_gpu_specific_engines to the .save() function inside TrtGraphConverterV2. When False, the .save() function won't save any TRT engines that have been built. When True (default), the original behavior is preserved.
      • TrtGraphConverterV2 provides a new API called .summary() which outputs a summary of the inference converted by TF-TRT. It namely shows each TRTEngineOp with their input(s)' and output(s)' shape and dtype. A detailed version of the summary is available which prints additionally all the TensorFlow OPs included in each of the TRTEngineOps.
    • tf.tpu.experimental.embedding:

      • tf.tpu.experimental.embedding.FeatureConfig now takes an additional argument output_shape which can specify the shape of the output activation for the feature.
      • tf.tpu.experimental.embedding.TPUEmbedding now has the same behavior as tf.tpu.experimental.embedding.serving_embedding_lookup which can take arbitrary rank of dense and sparse tensor. For ragged tensor, though the input tensor remains to be rank 2, the activations now can be rank 2 or above by specifying the output shape in the feature config or via the build method.
    • Add tf.config.experimental.enable_op_determinism, which makes TensorFlow ops run deterministically at the cost of performance. Replaces the TF_DETERMINISTIC_OPS environmental variable, which is now deprecated. The "Bug Fixes and Other Changes" section lists more determinism-related changes.

    • (Since TF 2.7) Add

    ... (truncated)

    Commits
    • 3f878cf Merge pull request #54226 from tensorflow-jenkins/version-numbers-2.8.0-22199
    • 54307e6 Update version numbers to 2.8.0
    • 2f2bdd2 Merge pull request #54193 from tensorflow/update-release-notes
    • 97e2f16 Update release notes with security advisories/updates
    • 93e224e Merge pull request #54182 from tensorflow/cherrypick-93323537ac0581a88af827af...
    • 14defd0 Bump ICU to 69.1 to handle CVE-2020-10531.
    • 0a20763 Merge pull request #54159 from tensorflow/cherrypick-b1756cf206fc4db86f05c420...
    • b7ecb36 Bump the maximum threshold before erroring
    • e542736 Merge pull request #54123 from terryheo/windows-fix-r2.8
    • 8dd07bd lite: Update Windows tensorflowlite_flex.dll build
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 2
  • Bump tensorflow from 2.7.0 to 2.7.1 in /docs

    Bump tensorflow from 2.7.0 to 2.7.1 in /docs

    Bumps tensorflow from 2.7.0 to 2.7.1.

    Release notes

    Sourced from tensorflow's releases.

    TensorFlow 2.7.1

    Release 2.7.1

    This releases introduces several vulnerability fixes:

    • Fixes a floating point division by 0 when executing convolution operators (CVE-2022-21725)
    • Fixes a heap OOB read in shape inference for ReverseSequence (CVE-2022-21728)
    • Fixes a heap OOB access in Dequantize (CVE-2022-21726)
    • Fixes an integer overflow in shape inference for Dequantize (CVE-2022-21727)
    • Fixes a heap OOB access in FractionalAvgPoolGrad (CVE-2022-21730)
    • Fixes an overflow and divide by zero in UnravelIndex (CVE-2022-21729)
    • Fixes a type confusion in shape inference for ConcatV2 (CVE-2022-21731)
    • Fixes an OOM in ThreadPoolHandle (CVE-2022-21732)
    • Fixes an OOM due to integer overflow in StringNGrams (CVE-2022-21733)
    • Fixes more issues caused by incomplete validation in boosted trees code (CVE-2021-41208)
    • Fixes an integer overflows in most sparse component-wise ops (CVE-2022-23567)
    • Fixes an integer overflows in AddManySparseToTensorsMap (CVE-2022-23568)
    • Fixes a number of CHECK-failures in MapStage (CVE-2022-21734)
    • Fixes a division by zero in FractionalMaxPool (CVE-2022-21735)
    • Fixes a number of CHECK-fails when building invalid/overflowing tensor shapes (CVE-2022-23569)
    • Fixes an undefined behavior in SparseTensorSliceDataset (CVE-2022-21736)
    • Fixes an assertion failure based denial of service via faulty bin count operations (CVE-2022-21737)
    • Fixes a reference binding to null pointer in QuantizedMaxPool (CVE-2022-21739)
    • Fixes an integer overflow leading to crash in SparseCountSparseOutput (CVE-2022-21738)
    • Fixes a heap overflow in SparseCountSparseOutput (CVE-2022-21740)
    • Fixes an FPE in BiasAndClamp in TFLite (CVE-2022-23557)
    • Fixes an FPE in depthwise convolutions in TFLite (CVE-2022-21741)
    • Fixes an integer overflow in TFLite array creation (CVE-2022-23558)
    • Fixes an integer overflow in TFLite (CVE-2022-23559)
    • Fixes a dangerous OOB write in TFLite (CVE-2022-23561)
    • Fixes a vulnerability leading to read and write outside of bounds in TFLite (CVE-2022-23560)
    • Fixes a set of vulnerabilities caused by using insecure temporary files (CVE-2022-23563)
    • Fixes an integer overflow in Range resulting in undefined behavior and OOM (CVE-2022-23562)
    • Fixes a vulnerability where missing validation causes tf.sparse.split to crash when axis is a tuple (CVE-2021-41206)
    • Fixes a CHECK-fail when decoding resource handles from proto (CVE-2022-23564)
    • Fixes a CHECK-fail with repeated AttrDef (CVE-2022-23565)
    • Fixes a heap OOB write in Grappler (CVE-2022-23566)
    • Fixes a CHECK-fail when decoding invalid tensors from proto (CVE-2022-23571)
    • Fixes a null-dereference when specializing tensor type (CVE-2022-23570)
    • Fixes a crash when type cannot be specialized (CVE-2022-23572)
    • Fixes a heap OOB read/write in SpecializeType (CVE-2022-23574)
    • Fixes an unitialized variable access in AssignOp (CVE-2022-23573)
    • Fixes an integer overflow in OpLevelCostEstimator::CalculateTensorSize (CVE-2022-23575)
    • Fixes an integer overflow in OpLevelCostEstimator::CalculateOutputSize (CVE-2022-23576)
    • Fixes a null dereference in GetInitOp (CVE-2022-23577)
    • Fixes a memory leak when a graph node is invalid (CVE-2022-23578)
    • Fixes an abort caused by allocating a vector that is too large (CVE-2022-23580)
    • Fixes multiple CHECK-failures during Grappler's IsSimplifiableReshape (CVE-2022-23581)
    • Fixes multiple CHECK-failures during Grappler's SafeToRemoveIdentity (CVE-2022-23579)
    • Fixes multiple CHECK-failures in TensorByteSize (CVE-2022-23582)

    ... (truncated)

    Changelog

    Sourced from tensorflow's changelog.

    Release 2.7.1

    This releases introduces several vulnerability fixes:

    • Fixes a floating point division by 0 when executing convolution operators (CVE-2022-21725)
    • Fixes a heap OOB read in shape inference for ReverseSequence (CVE-2022-21728)
    • Fixes a heap OOB access in Dequantize (CVE-2022-21726)
    • Fixes an integer overflow in shape inference for Dequantize (CVE-2022-21727)
    • Fixes a heap OOB access in FractionalAvgPoolGrad (CVE-2022-21730)
    • Fixes an overflow and divide by zero in UnravelIndex (CVE-2022-21729)
    • Fixes a type confusion in shape inference for ConcatV2 (CVE-2022-21731)
    • Fixes an OOM in ThreadPoolHandle (CVE-2022-21732)
    • Fixes an OOM due to integer overflow in StringNGrams (CVE-2022-21733)
    • Fixes more issues caused by incomplete validation in boosted trees code (CVE-2021-41208)
    • Fixes an integer overflows in most sparse component-wise ops (CVE-2022-23567)
    • Fixes an integer overflows in AddManySparseToTensorsMap (CVE-2022-23568)
    • Fixes a number of CHECK-failures in MapStage (CVE-2022-21734)
    • Fixes a division by zero in FractionalMaxPool (CVE-2022-21735)
    • Fixes a number of CHECK-fails when building invalid/overflowing tensor shapes (CVE-2022-23569)
    • Fixes an undefined behavior in SparseTensorSliceDataset (CVE-2022-21736)
    • Fixes an assertion failure based denial of service via faulty bin count operations (CVE-2022-21737)
    • Fixes a reference binding to null pointer in QuantizedMaxPool (CVE-2022-21739)
    • Fixes an integer overflow leading to crash in SparseCountSparseOutput (CVE-2022-21738)
    • Fixes a heap overflow in SparseCountSparseOutput (CVE-2022-21740)
    • Fixes an FPE in BiasAndClamp in TFLite (CVE-2022-23557)
    • Fixes an FPE in depthwise convolutions in TFLite (CVE-2022-21741)

    ... (truncated)

    Commits
    • 2a0f59e Merge pull request #54212 from tensorflow/disable-flaky-tests-on-r2.7
    • 937f21f Disable flaky test
    • 9998338 Merge pull request #54202 from tensorflow/fix-sanity-on-r2.7
    • 2c50ffd Reorder tags to fix buildifier linting
    • f5f1bd7 Merge pull request #54199 from tensorflow/cherrypick-510ae18200d0a4fad797c0bf...
    • f6bb419 Set Env Variable to override Setuptools new behavior
    • 28d73d4 Merge pull request #54175 from tensorflow-jenkins/relnotes-2.7.1-14226
    • afffde4 Update RELEASE.md
    • daa1073 Merge pull request #54189 from tensorflow/revert-54188-revert-54183-cherrypic...
    • 3d53027 Update third_party/icu/workspace.bzl
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 2
  • Bump numpy from 1.19.5 to 1.21.0 in /docs

    Bump numpy from 1.19.5 to 1.21.0 in /docs

    Bumps numpy from 1.19.5 to 1.21.0.

    Release notes

    Sourced from numpy's releases.

    v1.21.0

    NumPy 1.21.0 Release Notes

    The NumPy 1.21.0 release highlights are

    • continued SIMD work covering more functions and platforms,
    • initial work on the new dtype infrastructure and casting,
    • universal2 wheels for Python 3.8 and Python 3.9 on Mac,
    • improved documentation,
    • improved annotations,
    • new PCG64DXSM bitgenerator for random numbers.

    In addition there are the usual large number of bug fixes and other improvements.

    The Python versions supported for this release are 3.7-3.9. Official support for Python 3.10 will be added when it is released.

    :warning: Warning: there are unresolved problems compiling NumPy 1.21.0 with gcc-11.1 .

    • Optimization level -O3 results in many wrong warnings when running the tests.
    • On some hardware NumPy will hang in an infinite loop.

    New functions

    Add PCG64DXSM BitGenerator

    Uses of the PCG64 BitGenerator in a massively-parallel context have been shown to have statistical weaknesses that were not apparent at the first release in numpy 1.17. Most users will never observe this weakness and are safe to continue to use PCG64. We have introduced a new PCG64DXSM BitGenerator that will eventually become the new default BitGenerator implementation used by default_rng in future releases. PCG64DXSM solves the statistical weakness while preserving the performance and the features of PCG64.

    See upgrading-pcg64 for more details.

    (gh-18906)

    Expired deprecations

    • The shape argument numpy.unravel_index cannot be passed as dims keyword argument anymore. (Was deprecated in NumPy 1.16.)

    ... (truncated)

    Commits
    • b235f9e Merge pull request #19283 from charris/prepare-1.21.0-release
    • 34aebc2 MAINT: Update 1.21.0-notes.rst
    • 493b64b MAINT: Update 1.21.0-changelog.rst
    • 07d7e72 MAINT: Remove accidentally created directory.
    • 032fca5 Merge pull request #19280 from charris/backport-19277
    • 7d25b81 BUG: Fix refcount leak in ResultType
    • fa5754e BUG: Add missing DECREF in new path
    • 61127bb Merge pull request #19268 from charris/backport-19264
    • 143d45f Merge pull request #19269 from charris/backport-19228
    • d80e473 BUG: Removed typing for == and != in dtypes
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 2
  • Bump tensorflow from 2.5.1 to 2.5.2 in /docs

    Bump tensorflow from 2.5.1 to 2.5.2 in /docs

    Bumps tensorflow from 2.5.1 to 2.5.2.

    Release notes

    Sourced from tensorflow's releases.

    TensorFlow 2.5.2

    Release 2.5.2

    This release introduces several vulnerability fixes:

    • Fixes a code injection issue in saved_model_cli (CVE-2021-41228)
    • Fixes a vulnerability due to use of uninitialized value in Tensorflow (CVE-2021-41225)
    • Fixes a heap OOB in FusedBatchNorm kernels (CVE-2021-41223)
    • Fixes an arbitrary memory read in ImmutableConst (CVE-2021-41227)
    • Fixes a heap OOB in SparseBinCount (CVE-2021-41226)
    • Fixes a heap OOB in SparseFillEmptyRows (CVE-2021-41224)
    • Fixes a segfault due to negative splits in SplitV (CVE-2021-41222)
    • Fixes segfaults and vulnerabilities caused by accesses to invalid memory during shape inference in Cudnn* ops (CVE-2021-41221)
    • Fixes a null pointer exception when Exit node is not preceded by Enter op (CVE-2021-41217)
    • Fixes an integer division by 0 in tf.raw_ops.AllToAll (CVE-2021-41218)
    • Fixes an undefined behavior via nullptr reference binding in sparse matrix multiplication (CVE-2021-41219)
    • Fixes a heap buffer overflow in Transpose (CVE-2021-41216)
    • Prevents deadlocks arising from mutually recursive tf.function objects (CVE-2021-41213)
    • Fixes a null pointer exception in DeserializeSparse (CVE-2021-41215)
    • Fixes an undefined behavior arising from reference binding to nullptr in tf.ragged.cross (CVE-2021-41214)
    • Fixes a heap OOB read in tf.ragged.cross (CVE-2021-41212)
    • Fixes a heap OOB read in all tf.raw_ops.QuantizeAndDequantizeV* ops (CVE-2021-41205)
    • Fixes an FPE in ParallelConcat (CVE-2021-41207)
    • Fixes FPE issues in convolutions with zero size filters (CVE-2021-41209)
    • Fixes a heap OOB read in tf.raw_ops.SparseCountSparseOutput (CVE-2021-41210)
    • Fixes vulnerabilities caused by incomplete validation in boosted trees code (CVE-2021-41208)
    • Fixes vulnerabilities caused by incomplete validation of shapes in multiple TF ops (CVE-2021-41206)
    • Fixes a segfault produced while copying constant resource tensor (CVE-2021-41204)
    • Fixes a vulnerability caused by unitialized access in EinsumHelper::ParseEquation (CVE-2021-41201)
    • Fixes several vulnerabilities and segfaults caused by missing validation during checkpoint loading (CVE-2021-41203)
    • Fixes an overflow producing a crash in tf.range (CVE-2021-41202)
    • Fixes an overflow producing a crash in tf.image.resize when size is large (CVE-2021-41199)
    • Fixes an overflow producing a crash in tf.tile when tiling tensor is large (CVE-2021-41198)
    • Fixes a vulnerability produced due to incomplete validation in tf.summary.create_file_writer (CVE-2021-41200)
    • Fixes multiple crashes due to overflow and CHECK-fail in ops with large tensor shapes (CVE-2021-41197)
    • Fixes a crash in max_pool3d when size argument is 0 or negative (CVE-2021-41196)
    • Fixes a crash in tf.math.segment_* operations (CVE-2021-41195)
    • Updates curl to 7.78.0 to handle CVE-2021-22922, CVE-2021-22923, CVE-2021-22924, CVE-2021-22925, and CVE-2021-22926.
    Changelog

    Sourced from tensorflow's changelog.

    Release 2.5.2

    This release introduces several vulnerability fixes:

    ... (truncated)

    Commits
    • 957590e Merge pull request #52873 from tensorflow-jenkins/relnotes-2.5.2-20787
    • 2e1d16d Update RELEASE.md
    • 2fa6dd9 Merge pull request #52877 from tensorflow-jenkins/version-numbers-2.5.2-192
    • 4807489 Merge pull request #52881 from tensorflow/fix-build-1-on-r2.5
    • d398bdf Disable failing test
    • 857ad5e Merge pull request #52878 from tensorflow/fix-build-1-on-r2.5
    • 6c2a215 Disable failing test
    • f5c57d4 Update version numbers to 2.5.2
    • e51f949 Insert release notes place-fill
    • 2620d2c Merge pull request #52863 from tensorflow/fix-build-3-on-r2.5
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 2
  • Bump babel from 2.9.0 to 2.9.1 in /docs

    Bump babel from 2.9.0 to 2.9.1 in /docs

    Bumps babel from 2.9.0 to 2.9.1.

    Release notes

    Sourced from babel's releases.

    Version 2.9.1

    Bugfixes

    • The internal locale-data loading functions now validate the name of the locale file to be loaded and only allow files within Babel's data directory. Thank you to Chris Lyne of Tenable, Inc. for discovering the issue!
    Changelog

    Sourced from babel's changelog.

    Version 2.9.1

    Bugfixes

    
    * The internal locale-data loading functions now validate the name of the locale file to be loaded and only
      allow files within Babel's data directory.  Thank you to Chris Lyne of Tenable, Inc. for discovering the issue!
    
    Commits
    • a99fa24 Use 2.9.0's setup.py for 2.9.1
    • 60b33e0 Become 2.9.1
    • 412015e Merge pull request #782 from python-babel/locale-basename
    • 5caf717 Disallow special filenames on Windows
    • 3a700b5 Run locale identifiers through os.path.basename()
    • 5afe2b2 Merge pull request #754 from python-babel/github-ci
    • 58de834 Replace Travis + Appveyor with GitHub Actions (WIP)
    • d1bbc08 import_cldr: use logging; add -q option
    • 156b7fb Quiesce CLDR download progress bar if requested (or not a TTY)
    • 613dc17 Make the import warnings about unsupported number systems less verbose
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 2
  • Bump tensorflow from 2.4.1 to 2.4.2 in /docs

    Bump tensorflow from 2.4.1 to 2.4.2 in /docs

    Bumps tensorflow from 2.4.1 to 2.4.2.

    Release notes

    Sourced from tensorflow's releases.

    TensorFlow 2.4.2

    Release 2.4.2

    This release introduces several vulnerability fixes:

    ... (truncated)

    Changelog

    Sourced from tensorflow's changelog.

    Release 2.4.2

    This release introduces several vulnerability fixes:

    • Fixes a heap buffer overflow in RaggedBinCount (CVE-2021-29512)
    • Fixes a heap out of bounds write in RaggedBinCount (CVE-2021-29514)
    • Fixes a type confusion during tensor casts which leads to dereferencing null pointers (CVE-2021-29513)
    • Fixes a reference binding to null pointer in MatrixDiag* ops (CVE-2021-29515)
    • Fixes a null pointer dereference via invalid Ragged Tensors (CVE-2021-29516)
    • Fixes a division by zero in Conv3D (CVE-2021-29517)
    • Fixes vulnerabilities where session operations in eager mode lead to null pointer dereferences (CVE-2021-29518)
    • Fixes a CHECK-fail in SparseCross caused by type confusion (CVE-2021-29519)
    • Fixes a segfault in SparseCountSparseOutput (CVE-2021-29521)
    • Fixes a heap buffer overflow in Conv3DBackprop* (CVE-2021-29520)
    • Fixes a division by 0 in Conv3DBackprop* (CVE-2021-29522)
    • Fixes a CHECK-fail in AddManySparseToTensorsMap (CVE-2021-29523)
    • Fixes a division by 0 in Conv2DBackpropFilter (CVE-2021-29524)
    • Fixes a division by 0 in Conv2DBackpropInput (CVE-2021-29525)
    • Fixes a division by 0 in Conv2D (CVE-2021-29526)
    • Fixes a division by 0 in QuantizedConv2D (CVE-2021-29527)
    • Fixes a division by 0 in QuantizedMul (CVE-2021-29528)
    • Fixes vulnerabilities caused by invalid validation in SparseMatrixSparseCholesky (CVE-2021-29530)
    • Fixes a heap buffer overflow caused by rounding (CVE-2021-29529)
    • Fixes a CHECK-fail in tf.raw_ops.EncodePng (CVE-2021-29531)
    • Fixes a heap out of bounds read in RaggedCross (CVE-2021-29532)
    • Fixes a CHECK-fail in DrawBoundingBoxes

    ... (truncated)

    Commits
    • 1923123 Merge pull request #50210 from tensorflow/geetachavan1-patch-1
    • a0c8093 Update BUILD
    • f1c8200 Merge pull request #50203 from tensorflow/mihaimaruseac-patch-1
    • 7cf45b5 Update common.sh
    • 4aaac2b Merge pull request #50185 from geetachavan1/cherrypicks_U90C1
    • 65afa4b Fix the nightly nonpip builds for MacOS.
    • 46c1821 Merge pull request #50184 from tensorflow/mihaimaruseac-patch-1
    • cf8d667 Update common_win.bat
    • b2ef8a6 Merge pull request #50061 from tensorflow/geetachavan1-patch-2
    • f9a1ba8 Update sparse_fill_empty_rows_op.cc
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot[bot] 2
  • Bump urllib3 from 1.26.4 to 1.26.5 in /docs

    Bump urllib3 from 1.26.4 to 1.26.5 in /docs

    Bumps urllib3 from 1.26.4 to 1.26.5.

    Release notes

    Sourced from urllib3's releases.

    1.26.5

    ⚠️ IMPORTANT: urllib3 v2.0 will drop support for Python 2: Read more in the v2.0 Roadmap

    • Fixed deprecation warnings emitted in Python 3.10.
    • Updated vendored six library to 1.16.0.
    • Improved performance of URL parser when splitting the authority component.

    If you or your organization rely on urllib3 consider supporting us via GitHub Sponsors

    Changelog

    Sourced from urllib3's changelog.

    1.26.5 (2021-05-26)

    • Fixed deprecation warnings emitted in Python 3.10.
    • Updated vendored six library to 1.16.0.
    • Improved performance of URL parser when splitting the authority component.
    Commits
    • d161647 Release 1.26.5
    • 2d4a3fe Improve performance of sub-authority splitting in URL
    • 2698537 Update vendored six to 1.16.0
    • 07bed79 Fix deprecation warnings for Python 3.10 ssl module
    • d725a9b Add Python 3.10 to GitHub Actions
    • 339ad34 Use pytest==6.2.4 on Python 3.10+
    • f271c9c Apply latest Black formatting
    • 1884878 [1.26] Properly proxy EOF on the SSLTransport test suite
    • See full diff in compare view

    dependencies 
    opened by dependabot[bot] 2
  • Bump tensorflow from 2.4.1 to 2.5.0 in /docs

    Bump tensorflow from 2.4.1 to 2.5.0 in /docs

    Bumps tensorflow from 2.4.1 to 2.5.0.

    Release notes

    Sourced from tensorflow's releases.

    TensorFlow 2.5.0

    Release 2.5.0

    Major Features and Improvements

    • Support for Python3.9 has been added.
    • tf.data:
      • tf.data service now supports strict round-robin reads, which is useful for synchronous training workloads where example sizes vary. With strict round robin reads, users can guarantee that consumers get similar-sized examples in the same step.
      • tf.data service now supports optional compression. Previously data would always be compressed, but now you can disable compression by passing compression=None to tf.data.experimental.service.distribute(...).
      • tf.data.Dataset.batch() now supports num_parallel_calls and deterministic arguments. num_parallel_calls is used to indicate that multiple input batches should be computed in parallel. With num_parallel_calls set, deterministic is used to indicate that outputs can be obtained in the non-deterministic order.
      • Options returned by tf.data.Dataset.options() are no longer mutable.
      • tf.data input pipelines can now be executed in debug mode, which disables any asynchrony, parallelism, or non-determinism and forces Python execution (as opposed to trace-compiled graph execution) of user-defined functions passed into transformations such as map. The debug mode can be enabled through tf.data.experimental.enable_debug_mode().
    • tf.lite
      • Enabled the new MLIR-based quantization backend by default
        • The new backend is used for 8 bits full integer post-training quantization
        • The new backend removes the redundant rescales and fixes some bugs (shared weight/bias, extremely small scales, etc)
        • Set experimental_new_quantizer in tf.lite.TFLiteConverter to False to disable this change
    • tf.keras
      • tf.keras.metrics.AUC now support logit predictions.
      • Enabled a new supported input type in Model.fit, tf.keras.utils.experimental.DatasetCreator, which takes a callable, dataset_fn. DatasetCreator is intended to work across all tf.distribute strategies, and is the only input type supported for Parameter Server strategy.
    • tf.distribute
      • tf.distribute.experimental.ParameterServerStrategy now supports training with Keras Model.fit when used with DatasetCreator.
      • Creating tf.random.Generator under tf.distribute.Strategy scopes is now allowed (except for tf.distribute.experimental.CentralStorageStrategy and tf.distribute.experimental.ParameterServerStrategy). Different replicas will get different random-number streams.
    • TPU embedding support
      • Added profile_data_directory to EmbeddingConfigSpec in _tpu_estimator_embedding.py. This allows embedding lookup statistics gathered at runtime to be used in embedding layer partitioning decisions.
    • PluggableDevice
    • oneAPI Deep Neural Network Library (oneDNN) CPU performance optimizations from Intel-optimized TensorFlow are now available in the official x86-64 Linux and Windows builds.
      • They are off by default. Enable them by setting the environment variable TF_ENABLE_ONEDNN_OPTS=1.
      • We do not recommend using them in GPU systems, as they have not been sufficiently tested with GPUs yet.
    • TensorFlow pip packages are now built with CUDA11.2 and cuDNN 8.1.0
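
    A minimal sketch of the tf.data features listed above (optional compression, parallel batching, and debug mode), assuming TensorFlow 2.5; the tf.data service dispatcher address is a placeholder:

    ```python
    import tensorflow as tf

    # Debug mode disables asynchrony/parallelism and runs user functions eagerly
    # (shown here for illustration; it would normally be used on its own).
    tf.data.experimental.enable_debug_mode()

    dataset = tf.data.Dataset.range(1_000).map(lambda x: x * 2)

    # batch() now accepts num_parallel_calls and deterministic.
    dataset = dataset.batch(32, num_parallel_calls=tf.data.AUTOTUNE, deterministic=False)

    # The tf.data service can skip compression by passing compression=None
    # ("grpc://dispatcher:5050" is a placeholder, not a real address).
    # dataset = dataset.apply(
    #     tf.data.experimental.service.distribute(
    #         processing_mode="parallel_epochs",
    #         service="grpc://dispatcher:5050",
    #         compression=None,
    #     )
    # )
    ```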

    Breaking Changes

    • The TF_CPP_MIN_VLOG_LEVEL environment variable has been renamed to TF_CPP_MAX_VLOG_LEVEL, which correctly describes its effect.

    Bug Fixes and Other Changes

    • tf.keras:
      • Preprocessing layers API consistency changes:
        • StringLookup added output_mode, sparse, and pad_to_max_tokens arguments with same semantics as TextVectorization.
        • IntegerLookup added output_mode, sparse, and pad_to_max_tokens arguments with same semantics as TextVectorization. Renamed max_values, oov_value and mask_value to max_tokens, oov_token and mask_token to align with StringLookup and TextVectorization.
        • TextVectorization default for pad_to_max_tokens switched to False.
        • CategoryEncoding no longer supports adapt, IntegerLookup now supports equivalent functionality. max_tokens argument renamed to num_tokens.
        • Discretization added num_bins argument for learning bins boundaries through calling adapt on a dataset. Renamed bins argument to bin_boundaries for specifying bins without adapt.
      • Improvements to model saving/loading:
        • model.load_weights now accepts paths to saved models.

    ... (truncated)

    Changelog

    Sourced from tensorflow's changelog.

    Release 2.5.0

    Breaking Changes

    • The TF_CPP_MIN_VLOG_LEVEL environment variable has been renamed to TF_CPP_MAX_VLOG_LEVEL, which correctly describes its effect.

    Known Caveats

    Major Features and Improvements

    • TPU embedding support

      • Added profile_data_directory to EmbeddingConfigSpec in _tpu_estimator_embedding.py. This allows embedding lookup statistics gathered at runtime to be used in embedding layer partitioning decisions.
    • tf.keras.metrics.AUC now support logit predictions.

    • Creating tf.random.Generator under tf.distribute.Strategy scopes is now allowed (except for tf.distribute.experimental.CentralStorageStrategy and tf.distribute.experimental.ParameterServerStrategy). Different replicas will get different random-number streams.

    • tf.data:

      • tf.data service now supports strict round-robin reads, which is useful for synchronous training workloads where example sizes vary. With strict round robin reads, users can guarantee that consumers get similar-sized examples in the same step.
      • tf.data service now supports optional compression. Previously data would always be compressed, but now you can disable compression by passing compression=None to tf.data.experimental.service.distribute(...).
      • tf.data.Dataset.batch() now supports num_parallel_calls and deterministic arguments. num_parallel_calls is used to indicate that multiple input batches should be computed in parallel. With num_parallel_calls set, deterministic is used to indicate that outputs can be obtained in the non-deterministic order.
      • Options returned by tf.data.Dataset.options() are no longer mutable.
      • tf.data input pipelines can now be executed in debug mode, which disables any asynchrony, parallelism, or non-determinism and forces Python execution (as opposed to trace-compiled graph execution) of user-defined functions passed into transformations such as map. The debug mode can be enabled through tf.data.experimental.enable_debug_mode().
    • tf.lite

      • Enabled the new MLIR-based quantization backend by default
        • The new backend is used for 8 bits full integer post-training quantization
        • The new backend removes the redundant rescales and fixes some bugs (shared weight/bias, extremely small scales, etc)

    ... (truncated)

    Commits
    • a4dfb8d Merge pull request #49124 from tensorflow/mm-cherrypick-tf-data-segfault-fix-...
    • 2107b1d Merge pull request #49116 from tensorflow-jenkins/version-numbers-2.5.0-17609
    • 16b8139 Update snapshot_dataset_op.cc
    • 86a0d86 Merge pull request #49126 from geetachavan1/cherrypicks_X9ZNY
    • 9436ae6 Merge pull request #49128 from geetachavan1/cherrypicks_D73J5
    • 6b2bf99 Validate that a and b are proper sparse tensors
    • c03ad1a Ensure validation sticks in banded_triangular_solve_op
    • 12a6ead Merge pull request #49120 from geetachavan1/cherrypicks_KJ5M9
    • b67f5b8 Merge pull request #49118 from geetachavan1/cherrypicks_BIDTR
    • a13c0ad [tf.data][cherrypick] Fix snapshot segfault when using repeat and prefecth
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot[bot] 2
  • Allow user to request a raised exception if hash(content) will be inconsistent

    Allow user to request a raised exception if hash(content) will be inconsistent

    Allows user to have custom_hasher raise ValueError if it cannot identify a consistent-across-instantiations hash for the object.

    This is valuable because we've found it very easy to create artifacts that can never be reused, because their config contains data that can't be represented as Python primitives. Currently you can't know you've created such an artifact until you rerun and notice it's taking a long time or that the reuse is not being picked up, or until you look at the db. This change helps users shorten the cycle time for identifying the error.
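
    A minimal sketch of the requested behavior follows; custom_hasher and its raise_on_inconsistent flag are illustrative names for the idea in this PR, not necessarily SimpleML's actual API:

    ```python
    # Illustrative only: refuse to silently produce a hash that would differ
    # across instantiations (e.g. for arbitrary objects in a config).
    import hashlib
    import json

    PRIMITIVES = (str, int, float, bool, type(None))

    def custom_hasher(content, raise_on_inconsistent=False):
        """Return a deterministic hash, or raise if one cannot be guaranteed."""
        def check(obj):
            if isinstance(obj, PRIMITIVES):
                return obj
            if isinstance(obj, (list, tuple)):
                return [check(item) for item in obj]
            if isinstance(obj, dict):
                return {str(k): check(v) for k, v in obj.items()}
            # Anything else hashes differently across interpreter sessions,
            # so reuse lookups would silently miss.
            if raise_on_inconsistent:
                raise ValueError(f"cannot produce a consistent hash for {type(obj)!r}")
            return repr(obj)  # best-effort fallback, not stable across runs

        canonical = json.dumps(check(content), sort_keys=True)
        return hashlib.md5(canonical.encode()).hexdigest()

    print(custom_hasher({"lr": 0.1, "layers": [32, 16]}))  # stable hash
    # custom_hasher({"model": object()}, raise_on_inconsistent=True)  # would raise ValueError
    ```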

    opened by ptoman-pa 1
  • Bump certifi from 2021.10.8 to 2022.12.7 in /docs

    Bump certifi from 2021.10.8 to 2022.12.7 in /docs

    Bumps certifi from 2021.10.8 to 2022.12.7.

    Commits

    dependencies 
    opened by dependabot[bot] 1
  • Bump pillow from 9.0.0 to 9.3.0 in /docs

    Bump pillow from 9.0.0 to 9.3.0 in /docs

    Bumps pillow from 9.0.0 to 9.3.0.

    Release notes

    Sourced from pillow's releases.

    9.3.0

    https://pillow.readthedocs.io/en/stable/releasenotes/9.3.0.html

    Changes

    ... (truncated)

    Changelog

    Sourced from pillow's changelog.

    9.3.0 (2022-10-29)

    • Limit SAMPLESPERPIXEL to avoid runtime DOS #6700 [wiredfool]

    • Initialize libtiff buffer when saving #6699 [radarhere]

    • Inline fname2char to fix memory leak #6329 [nulano]

    • Fix memory leaks related to text features #6330 [nulano]

    • Use double quotes for version check on old CPython on Windows #6695 [hugovk]

    • Remove backup implementation of Round for Windows platforms #6693 [cgohlke]

    • Fixed set_variation_by_name offset #6445 [radarhere]

    • Fix malloc in _imagingft.c:font_setvaraxes #6690 [cgohlke]

    • Release Python GIL when converting images using matrix operations #6418 [hmaarrfk]

    • Added ExifTags enums #6630 [radarhere]

    • Do not modify previous frame when calculating delta in PNG #6683 [radarhere]

    • Added support for reading BMP images with RLE4 compression #6674 [npjg, radarhere]

    • Decode JPEG compressed BLP1 data in original mode #6678 [radarhere]

    • Added GPS TIFF tag info #6661 [radarhere]

    • Added conversion between RGB/RGBA/RGBX and LAB #6647 [radarhere]

    • Do not attempt normalization if mode is already normal #6644 [radarhere]

    ... (truncated)

    Commits

    dependencies 
    opened by dependabot[bot] 0
  • Bump tensorflow from 2.7.0 to 2.9.3 in /docs

    Bump tensorflow from 2.7.0 to 2.9.3 in /docs

    Bumps tensorflow from 2.7.0 to 2.9.3.

    Release notes

    Sourced from tensorflow's releases.

    TensorFlow 2.9.3

    Release 2.9.3

    This release introduces several vulnerability fixes:

    TensorFlow 2.9.2

    Release 2.9.2

    This release introduces several vulnerability fixes:

    ... (truncated)

    Changelog

    Sourced from tensorflow's changelog.

    Release 2.9.3

    This release introduces several vulnerability fixes:

    Release 2.8.4

    This release introduces several vulnerability fixes:

    ... (truncated)

    Commits
    • a5ed5f3 Merge pull request #58584 from tensorflow/vinila21-patch-2
    • 258f9a1 Update py_func.cc
    • cd27cfb Merge pull request #58580 from tensorflow-jenkins/version-numbers-2.9.3-24474
    • 3e75385 Update version numbers to 2.9.3
    • bc72c39 Merge pull request #58482 from tensorflow-jenkins/relnotes-2.9.3-25695
    • 3506c90 Update RELEASE.md
    • 8dcb48e Update RELEASE.md
    • 4f34ec8 Merge pull request #58576 from pak-laura/c2.99f03a9d3bafe902c1e6beb105b2f2417...
    • 6fc67e4 Replace CHECK with returning an InternalError on failing to create python tuple
    • 5dbe90a Merge pull request #58570 from tensorflow/r2.9-7b174a0f2e4
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot[bot] 1
  • Bump joblib from 1.1.0 to 1.2.0 in /docs

    Bump joblib from 1.1.0 to 1.2.0 in /docs

    Bumps joblib from 1.1.0 to 1.2.0.

    Changelog

    Sourced from joblib's changelog.

    Release 1.2.0

    • Fix a security issue where eval(pre_dispatch) could potentially run arbitrary code. Now only basic numerics are supported. joblib/joblib#1327

    • Make sure that joblib works even when multiprocessing is not available, for instance with Pyodide joblib/joblib#1256

    • Avoid unnecessary warnings when workers and main process delete the temporary memmap folder contents concurrently. joblib/joblib#1263

    • Fix memory alignment bug for pickles containing numpy arrays. This is especially important when loading the pickle with mmap_mode != None as the resulting numpy.memmap object would not be able to correct the misalignment without performing a memory copy. This bug would cause invalid computation and segmentation faults with native code that would directly access the underlying data buffer of a numpy array, for instance C/C++/Cython code compiled with older GCC versions or some old OpenBLAS written in platform specific assembly. joblib/joblib#1254

    • Vendor cloudpickle 2.2.0 which adds support for PyPy 3.8+.

    • Vendor loky 3.3.0 which fixes several bugs including:

      • robustly forcibly terminating worker processes in case of a crash (joblib/joblib#1269);

      • avoiding leaking worker processes in case of nested loky parallel calls;

      • reliably spawning the correct number of reusable workers.
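
    The first bullet above concerns how pre_dispatch is interpreted; a minimal sketch of the still-supported usage (a simple numeric expression rather than arbitrary Python), assuming joblib >= 1.2:

    ```python
    from joblib import Parallel, delayed

    # pre_dispatch still accepts basic numeric expressions such as "2*n_jobs";
    # arbitrary expressions are no longer evaluated after the 1.2.0 fix.
    results = Parallel(n_jobs=2, pre_dispatch="2*n_jobs")(
        delayed(pow)(i, 2) for i in range(10)
    )
    print(results)  # [0, 1, 4, 9, ...]
    ```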

    Commits
    • 5991350 Release 1.2.0
    • 3fa2188 MAINT cleanup numpy warnings related to np.matrix in tests (#1340)
    • cea26ff CI test the future loky-3.3.0 branch (#1338)
    • 8aca6f4 MAINT: remove pytest.warns(None) warnings in pytest 7 (#1264)
    • 067ed4f XFAIL test_child_raises_parent_exits_cleanly with multiprocessing (#1339)
    • ac4ebd5 MAINT add back pytest warnings plugin (#1337)
    • a23427d Test child raises parent exits cleanly more reliable on macos (#1335)
    • ac09691 [MAINT] various test updates (#1334)
    • 4a314b1 Vendor loky 3.2.0 (#1333)
    • bdf47e9 Make test_parallel_with_interactively_defined_functions_default_backend timeo...
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot[bot] 0
  • Bump protobuf from 3.19.3 to 3.19.5 in /docs

    Bump protobuf from 3.19.3 to 3.19.5 in /docs

    Bumps protobuf from 3.19.3 to 3.19.5.

    Release notes

    Sourced from protobuf's releases.

    Protocol Buffers v3.19.5

    C++

    Protocol Buffers v3.19.4

    Python

    • Make libprotobuf symbols local on OSX to fix issue #9395 (#9435)

    Ruby

    • Fixed a data loss bug that could occur when the number of optional fields in a message is an exact multiple of 32. (#9440).

    PHP

    • Fixed a data loss bug that could occur when the number of optional fields in a message is an exact multiple of 32. (#9440).
    Commits

    dependencies 
    opened by dependabot[bot] 1
  • Bump oauthlib from 3.1.1 to 3.2.1 in /docs

    Bump oauthlib from 3.1.1 to 3.2.1 in /docs

    Bumps oauthlib from 3.1.1 to 3.2.1.

    Release notes

    Sourced from oauthlib's releases.

    3.2.1

    In short

    OAuth2.0 Provider:

    • #803 : Metadata endpoint support of non-HTTPS
    • CVE-2022-36087

    OAuth1.0:

    • #818 : Allow IPv6 being parsed by signature

    General:

    • Improved and fixed documentation warnings.
    • Cosmetic changes based on isort

    What's Changed

    New Contributors

    Full Changelog: https://github.com/oauthlib/oauthlib/compare/v3.2.0...v3.2.1

    3.2.0

    Changelog

    OAuth2.0 Client:

    • #795: Add Device Authorization Flow for Web Application
    • #786: Add PKCE support for Client
    • #783: Fallback to none in case of wrong expires_at format.

    OAuth2.0 Provider:

    • #790: Add support for CORS to metadata endpoint.
    • #791: Add support for CORS to token endpoint.
    • #787: Remove comma after Bearer in WWW-Authenticate

    OAuth2.0 Provider - OIDC:

    • #755: Call save_token in Hybrid code flow
    • #751: OIDC add support of refreshing ID Tokens with refresh_id_token
    • #751: The RefreshTokenGrant modifiers now take the same arguments as the AuthorizationCodeGrant modifiers (token, token_handler, request).

    ... (truncated)

    Changelog

    Sourced from oauthlib's changelog.

    3.2.1 (2022-09-09)

    OAuth2.0 Provider:

    • #803: Metadata endpoint support of non-HTTPS
    • CVE-2022-36087

    OAuth1.0:

    • #818: Allow IPv6 being parsed by signature

    General:

    • Improved and fixed documentation warnings.
    • Cosmetic changes based on isort

    3.2.0 (2022-01-29)

    OAuth2.0 Client:

    • #795: Add Device Authorization Flow for Web Application
    • #786: Add PKCE support for Client
    • #783: Fallback to none in case of wrong expires_at format.

    OAuth2.0 Provider:

    • #790: Add support for CORS to metadata endpoint.
    • #791: Add support for CORS to token endpoint.
    • #787: Remove comma after Bearer in WWW-Authenticate

    OAuth2.0 Provider - OIDC:

    • #755: Call save_token in Hybrid code flow
    • #751: OIDC add support of refreshing ID Tokens with refresh_id_token
    • #751: The RefreshTokenGrant modifiers now take the same arguments as the AuthorizationCodeGrant modifiers (token, token_handler, request).

    General:

    • Added Python 3.9, 3.10, 3.11
    • Improve Travis & Coverage
    Commits
    • 88bb156 Updated date and authors
    • 1a45d97 Prepare 3.2.1 release
    • 0adbbe1 docs: fix typos
    • 6569ec3 docs: Fix a few typos
    • bdc486e Fixed isort imports
    • 7db45bd Fix typo in server.rst
    • b14ad85 chore: s/bode_code_verifier/body_code_verifier/g
    • b123283 Allow non-HTTPS issuer when OAUTHLIB_INSECURE_TRANSPORT. (#803)
    • 2f887b5 Docs: fix Sphinx warnings for better ReadTheDocs generation (#807)
    • d4bafd9 Merge pull request #797 from cclauss/patch-2
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot[bot] 0
Releases(0.14.0)
  • 0.14.0(Jul 14, 2022)

    0.14.0 (2022-07-13)

    • Standardized formatting with Black
    • Split up ORM into a standalone swappable backend
    • Persistables maintain weakrefs for lineage
    • Persistables are normal python objects now
    • Hashing flag to reject non-serializable objects

    What's Changed

    • Black formatting by @eyadgaran in https://github.com/eyadgaran/SimpleML/pull/103
    • dask tweaks by @eyadgaran in https://github.com/eyadgaran/SimpleML/pull/104
    • Orm separation by @eyadgaran in https://github.com/eyadgaran/SimpleML/pull/99
    • Allow user to request a raised exception if hash(content) will be inconsistent by @ptoman-pa in https://github.com/eyadgaran/SimpleML/pull/107
    • version bump and changelog by @eyadgaran in https://github.com/eyadgaran/SimpleML/pull/109

    New Contributors

    • @ptoman-pa made their first contribution in https://github.com/eyadgaran/SimpleML/pull/107

    Full Changelog: https://github.com/eyadgaran/SimpleML/compare/0.13.0...0.14.0

    Source code(tar.gz)
    Source code(zip)
  • 0.13.0(Mar 29, 2022)

    • Path existence check for pandas serialization

    What's Changed

    • Path creation by @eyadgaran in https://github.com/eyadgaran/SimpleML/pull/101

    Full Changelog: https://github.com/eyadgaran/SimpleML/compare/0.12.0...0.13.0

    Source code(tar.gz)
    Source code(zip)
  • 0.12.0(Mar 3, 2022)

    • Changed internal dataset structure from mixins to direct inheritance
    • Condensed all pandas dataset types into a single base class
    • Adds support for dask datasets
    • Placeholders for additional dataset libraries
    • Adds hashing support for dask dataframes
    • Refactored persistence ("save_patterns") package into standalone extensible framework
    • Adds context manager support to registries for temporary overwrite
    • Refactor pipelines into library based subclasses
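
    The "context manager support to registries for temporary overwrite" item above can be pictured roughly as follows; every name in this sketch is hypothetical and does not reflect SimpleML's actual registry API:

    ```python
    # Hypothetical registry with a context-manager override; illustrative only.
    from contextlib import contextmanager

    class Registry:
        def __init__(self):
            self._items = {}

        def register(self, name, obj):
            self._items[name] = obj

        def get(self, name):
            return self._items[name]

        @contextmanager
        def temporary_overwrite(self, name, obj):
            """Swap in `obj` under `name` for the duration of the block, then restore."""
            missing = object()
            previous = self._items.get(name, missing)
            self._items[name] = obj
            try:
                yield self
            finally:
                if previous is missing:
                    del self._items[name]
                else:
                    self._items[name] = previous

    registry = Registry()
    registry.register("dataset", "real-dataset-class")
    with registry.temporary_overwrite("dataset", "stub-dataset-class"):
        assert registry.get("dataset") == "stub-dataset-class"
    assert registry.get("dataset") == "real-dataset-class"
    ```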

    BREAKING CHANGES

    • Pandas dataset will default param squeeze_return to False (classes expecting to return a series will need to be updated)
    • Numpy dataset is considered unstable and will be redesigned in a future release
    • Onedrive, Hickle, and database save patterns are removed (functionality is still available but a composed pattern is not predefined. these can be trivially added in user code if needed)
    • Changed pandas hash output to int from numpy.int64 (due to breaking change in NumpyHasher)
    • Changed primitive deterministic hash from pickle to md5
    • Extracted data iterators into utility wrappers. Pipelines no longer have flags to return iterators
    • Random split defaults are computed at runtime instead of precalculated (affects hash)

    What's Changed

    • Ml management structure by @eyadgaran in https://github.com/eyadgaran/SimpleML/pull/87
    • Python eol by @eyadgaran in https://github.com/eyadgaran/SimpleML/pull/92
    • Dataset libraries by @eyadgaran in https://github.com/eyadgaran/SimpleML/pull/90
    • Pipeline refactor by @eyadgaran in https://github.com/eyadgaran/SimpleML/pull/96
    • additional testing coverage by @eyadgaran in https://github.com/eyadgaran/SimpleML/pull/83
    • Adding Ensemble Model Histogram-based Gradient Boosting Classifier by @aolopez in https://github.com/eyadgaran/SimpleML/pull/91
    • version bump by @eyadgaran in https://github.com/eyadgaran/SimpleML/pull/98

    New Contributors

    • @aolopez made their first contribution in https://github.com/eyadgaran/SimpleML/pull/91

    Full Changelog: https://github.com/eyadgaran/SimpleML/compare/0.11.0...0.12.0

    Source code(tar.gz)
    Source code(zip)
  • 0.11.0(Oct 10, 2021)

    • Added support to hasher for initialized objects
    • Adds support for arbitrary dataset splits and sections
    • Dataset hooks to validate dataframe setting
    • Pipelines no longer cache dataset splits and proxy directly to dataset on every call
    • Introduces pipeline splits as reproducible projections over dataset splits
    • Database utility to recalculate hashes for existing persistables

    BREAKING CHANGES

    • Hash for an uninitialized class changed from repr(cls) to "cls.__module__.cls.__name__"
    • Database migrations no longer recalculate hashes. That has to be done manually via a utility
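
    For reference, the before/after hash inputs for an uninitialized class differ roughly as follows (plain Python, independent of SimpleML):

    ```python
    class ExampleModel:
        pass

    # Before 0.11.0: the hash input was the class repr
    print(repr(ExampleModel))                                     # <class '__main__.ExampleModel'>

    # From 0.11.0: the hash input is built from the module and class name
    print(f"{ExampleModel.__module__}.{ExampleModel.__name__}")   # __main__.ExampleModel
    ```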
    Source code(tar.gz)
    Source code(zip)
  • 0.10.0(Jul 10, 2021)

    • Dataset external file setter with validation hooks
    • Pandas changes to always return dataframe copies (does not extend to underlying python objects! eg lists, objects, etc)
    • Pandas Dataset Subclasses for Single and Multi label datasets
    • PersistableLoader methods do not require name as a parameter

    BREAKING CHANGES

    • PandasDataset is deprecated and will be dropped in a future release. Use SingleLabelPandasDataset or MultiLabelPandasDataset instead
    • Pandas Dataset Classes require dataframe objects of type pd.DataFrame and will validate input (containers of pd.DataFrames are no longer supported)
    Source code(tar.gz)
    Source code(zip)
  • 0.9.3(Apr 4, 2021)

  • 0.9.2(Jan 27, 2021)

  • 0.9.1(Dec 28, 2020)

  • 0.9.0(Nov 30, 2020)

    • Refactored save patterns. Supports multiple concurrent save locations and arbitrary artifact declaration
    • Registry centric model for easier extension and third party contrib
    • Support for in-memory sqlite db
    • Changed database JSON mapping class and dependency to support mutability tracking
    • New import wrapper class to manage optional dependencies
    • Added dataset_id as a Metric reference. Breaking workflow change! Will raise an error if a dataset is not added and the metric depends on it
    • Dropped default Train pipeline split. Will return an empty split for split pipelines and a singleton full dataset split for NoSplitPipelines
    • Explicitly migrated to tensorflow 2 and tf.keras
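
    The in-memory sqlite support can be exercised with a standard SQLAlchemy connection string; the snippet below is generic SQLAlchemy usage, not SimpleML-specific configuration:

    ```python
    from sqlalchemy import create_engine, text

    # ":memory:" creates a throwaway SQLite database scoped to this process.
    engine = create_engine("sqlite:///:memory:")
    with engine.connect() as conn:
        print(conn.execute(text("select 1")).scalar())  # 1
    ```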
    Source code(tar.gz)
    Source code(zip)
  • 0.8.1(May 12, 2020)

  • 0.8.0(Mar 16, 2020)

  • v0.7.2(Dec 2, 2019)

  • v0.7.1(Oct 13, 2019)

  • v0.7(Oct 8, 2019)

    • Thread-safe Keras Sequence dataset splits
    • Additional Seq2Seq models
    • Bastion tunneling support for SSH db connections
    • Explicit modules for constants and imports
    • Additional base classes for database connections (plain and alembic)
    • Database-independent SQLAlchemy types
    • Switched pickle library from dill to cloudpickle
    • SQLite support
    • Changed default DB connection to SQLite
    Source code(tar.gz)
    Source code(zip)
  • v0.6(Jun 26, 2019)

    • Full database initialization with alembic
    • DB schema validation on start
    • Main configuration file for all credentials
    • Drop official support for python 3.4
    • Automatic handling of no data operations
    • Remaining cloud provider support
    • Feature metadata for classification models
    • Runtime environment validation
    • Add Split and SplitContainer objects
    • Simplejson dependency
    • Pipeline generator support
    • Library specific model base classes
    • Generalized database connection classes
    Source code(tar.gz)
    Source code(zip)
  • v0.5(Feb 18, 2019)

    • Default identity pipeline
    • Alembic integration for database migration
    • Standardized model inheritance pattern
    • Condensed pandas split dataframes into single df
    • Remaining classification metrics
    • Updated schema with hash datatype
    • Updated hash to use joblib code, consistent across initializations
    • Generator pipeline and fitted kwarg
    • Dropped base prefixes
    • Moved composed subclasses to inits
    • Unified datasets and pipelines
    Source code(tar.gz)
    Source code(zip)
  • v0.4(Jan 5, 2019)

    • Keras Seq2Seq support
    • Keras model support
    • Minimized required installation dependencies
    • Abstract base classes
    • Complex object JSON serialization
    • Python 3 compatibility
    • Travis and tox for CI/testing
    Source code(tar.gz)
    Source code(zip)
  • v0.3(Dec 2, 2018)

  • v0.2(Sep 9, 2018)

Owner
Elisha Yadgaran