ML-powered analytics engine for outlier detection and root cause analysis.

Overview



What is Chaos Genius?

Chaos Genius is an open-source, ML-powered analytics engine for outlier detection and root cause analysis. It can be used to monitor and analyse high-dimensionality business, data and system metrics at scale.

Using Chaos Genius, users can segment large datasets by key performance metrics (e.g. Daily Active Users, Cloud Costs, Failure Rates) and the important dimensions (e.g. CountryID, DeviceID, ProductID, DayofWeek) across which they want to monitor and analyse those metrics.
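For a concrete (and purely illustrative) picture of what such segmentation looks like, here is a minimal pandas sketch; the dataset, column names and KPI below are invented for the example and are not part of Chaos Genius itself.

    import pandas as pd

    # Toy events table: one row per user action (illustrative columns only)
    events = pd.DataFrame({
        "date":       ["2022-01-01", "2022-01-01", "2022-01-02", "2022-01-02"],
        "country_id": ["IN", "US", "IN", "US"],
        "device_id":  ["mobile", "web", "mobile", "web"],
        "user_id":    [1, 2, 1, 3],
    })

    # KPI: Daily Active Users, segmented by the country_id dimension
    dau_by_country = (
        events.groupby(["date", "country_id"])["user_id"]
              .nunique()
              .rename("daily_active_users")
              .reset_index()
    )
    print(dau_by_country)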

Use Chaos Genius if you want:

  • Multidimensional Drill Downs & Insights
  • Anomaly Detection
  • Smart Alerting
  • Seasonality Detection*
  • Automated Root Cause Analysis*
  • Forecasting*
  • What-If Analysis*

*on the short- and medium-term roadmap

Demo

To try it out, check out our Demo or explore the live dashboards.

⚙️ Quick Start

git clone https://github.com/chaos-genius/chaos_genius

cd chaos_genius

docker-compose up

Visit http://localhost:8080

Follow this Quick Start guide or read our Documentation for more details.

💫 Key Features

1. Automated DeepDrills

Generate multidimensional drilldowns to identify the key drivers of change in defined metrics (e.g. Sales) across a large number of high-cardinality dimensions (e.g. CountryID, ProductID, BrandID, Device_type).

  • Techniques: Statistical Filtering, A*-like path-based search to deal with the combinatorial explosion
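As a rough sketch of the kind of search DeepDrills automates (naive enumeration over dimension combinations here; the actual engine uses statistical filtering and an A*-like pruned search, and the column names below are made up):

    import pandas as pd
    from itertools import combinations

    def drilldown(df, metric, dimensions, max_depth=2):
        # Rank subgroups by their contribution to the change in `metric`
        # between a "baseline" and a "current" period. This enumerates all
        # dimension combinations; DeepDrills prunes this search instead.
        rows = []
        for depth in range(1, max_depth + 1):
            for dims in combinations(dimensions, depth):
                g = (df.groupby(["period", *dims])[metric].sum()
                       .unstack("period").fillna(0))
                g["change"] = g.get("current", 0) - g.get("baseline", 0)
                for key, change in g["change"].items():
                    key = key if isinstance(key, tuple) else (key,)
                    rows.append({"subgroup": dict(zip(dims, key)), "change": change})
        return sorted(rows, key=lambda r: abs(r["change"]), reverse=True)

    sales = pd.DataFrame({
        "period":  ["baseline", "baseline", "current", "current"],
        "country": ["IN", "US", "IN", "US"],
        "sales":   [100, 200, 80, 210],
    })
    print(drilldown(sales, "sales", ["country"])[:3])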


2. Anomaly Detection

A modular anomaly detection toolkit for monitoring high-dimensional time series, with the ability to select from different models. It tackles variations caused by seasonality, trends and holidays in the time-series data.

  • Models: Prophet, EWMA, EWSTD, Neural Prophet, Greykite
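To make the idea concrete, here is a rough EWMA/EWSTD-style sketch of how one of those simpler models can flag outliers; the span and threshold are arbitrary and this is not the exact Chaos Genius implementation:

    import pandas as pd

    def ewma_anomalies(series: pd.Series, span: int = 14, num_std: float = 3.0) -> pd.DataFrame:
        # Expected value: exponentially weighted moving average of past points;
        # confidence band: expected +/- num_std * exponentially weighted std.
        expected = series.ewm(span=span, adjust=False).mean().shift(1)
        std = series.ewm(span=span, adjust=False).std().shift(1)
        upper, lower = expected + num_std * std, expected - num_std * std
        return pd.DataFrame({
            "value": series,
            "expected": expected,
            "lower": lower,
            "upper": upper,
            "is_anomaly": (series > upper) | (series < lower),
        })

    daily_users = pd.Series(
        [120, 118, 125, 122, 119, 300, 121],
        index=pd.date_range("2022-01-01", periods=7),
    )
    print(ewma_anomalies(daily_users))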


3. Smart Alerts

Actionable alerts with self-learning thresholds. Configurations to set up alert frequency & reporting to combat alert fatigue.

  • Channels: Email, Slack
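For context, delivering a Slack alert boils down to posting a message to an incoming-webhook URL; a minimal sketch (the webhook URL and message text are placeholders, and this is not Chaos Genius's actual alert payload):

    import json
    import urllib.request

    def send_slack_alert(webhook_url: str, kpi: str, severity: float, message: str) -> None:
        # Post a plain-text alert to a Slack incoming webhook.
        payload = {"text": f"Anomaly on KPI '{kpi}' (severity {severity:.1f}): {message}"}
        req = urllib.request.Request(
            webhook_url,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)

    # send_slack_alert("https://hooks.slack.com/services/XXX/YYY/ZZZ",
    #                  kpi="Daily Active Users", severity=8.2,
    #                  message="value outside the expected range")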


:octocat: Community

For any help, discussions and suggestions feel free to reach out to the Chaos Genius team and the community here:

  • GitHub (report bugs, contribute, follow roadmap)

  • Slack (discuss with the community and Chaos Genius team)

  • Book Office Hours (set up time with the Chaos Genius team for any questions or help with setup)

  • Blog (follow the latest trends in Data, Machine Learning, Open Source and more)

🚦 Roadmap

Our goal is to make Chaos Genius production-ready for all organisations, irrespective of their data infrastructure, data sources and scale requirements. With that in mind, we have created a roadmap for Chaos Genius. If you see something missing or wish to make suggestions, please drop us a line on our Community Slack or raise an issue.

🌱 Contributing

Want to contribute? Get started with:

  • Show us some love - Give us a 🌟 !

  • Submit an issue.

  • Share a part of the documentation that you find difficult to follow.

  • Translate our Readme.

  • Create a pull request. Here's a list of issues to start with. Please review our contribution guidelines before opening a pull request. Thank you for contributing!

❤️ Contributors

Thanks goes to these wonderful people (emoji key):


pshrimal21

📆 📖 🤔 🎨

Harshit Surana

💻 🔣 🔬 🐛

Manas Solanki

💻 👀 🔧 🐛

Kartikay Bagla

💻 🚧 🔬

Varun P

💻 🚧 🔬

Keshav Pradeep

💻 🔣 📖

Daj Katal

🔌 📖

Amatullah Sethjiwala

💻 🔣 ⚠️

juzarbhori

💻 🎨

Amogh Dhar Diwan

💻 🔣 🐛

Samyak Sarnayak

💻 📦 🐛

Aayush Naik

💻 🐛 📦

Kshitij Agarwal

💻 🔧 🐛

Bhargav S. Kumar

💻 📦 🐛

moghankumar06

💻 🎨

Santhoshkumar1023

💻 🎨

Mansi-Chauhan27

🔌

davidhayter-karhoo

🐛

Marijn van Aerle

🐛

gxu-kangaroo

🐛

RamneekKaur983

💻

arvind-27

🔣

Josh Taylor

🐛

ChartistDev

💻 🎨 🐛 👀

Rajdeep Sharma

💻 👀

balakumar9493

💻 🎨

Ikko Ashimine

💻

This project follows the all-contributors specification. Contributions of any kind welcome!

📜 License

Chaos Genius is licensed under the MIT license.

Comments
  • chore(deps): bump slack-sdk from 3.8.0 to 3.15.2

    chore(deps): bump slack-sdk from 3.8.0 to 3.15.2

    Bumps slack-sdk from 3.8.0 to 3.15.2.

    Release notes

    Sourced from slack-sdk's releases.

    version 3.15.2

    Changes


    version 3.15.1

    Changes


    version 3.15.0

    Changes


    version 3.14.1

    Changes

    • #1173 Fix a bug where some of the files.remote API parameters do not work since v3.10 - Thanks @seratch

    version 3.14.0

    Changes

    ... (truncated)

    Commits
    • 48661f8 version 3.15.2
    • 619c638 Upgrade pytype to the latest
    • fde7bfd Add new properties to Audit Logs API response type
    • d0db05a Add file_ids to chat.update parameters (#1187)
    • 53766ae version 3.15.1
    • 9d9a6e1 Fix #1184 Add exception handling for socket mode - BlockingIOError: Resource ...
    • eec08f4 Fix build failures due to itsdangerous package release
    • 814da3c Fix #1181 Add exception handling for socket mode - socket.timeout: the read o...
    • 9853614 Upgrade pytype version
    • e68017a Update validation command to format integration tests as well
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    dependencies 
    opened by dependabot[bot] 7
  • Druid configurable count column

    Druid configurable count column

    • [x] add "count_column" to KPI model https://github.com/chaos-genius/chaos_genius/pull/985/commits/80fd1d09fbe496a8430788bffbc7fbb11fb70925
    • [x] add "Update KPI" query to migration script for upgrade instance https://github.com/chaos-genius/chaos_genius/pull/985/commits/80fd1d09fbe496a8430788bffbc7fbb11fb70925
    • [x] read "count_column" from configured value in KPI definition instead of using hardcoded "count" value https://github.com/chaos-genius/chaos_genius/pull/985/commits/a318e318b59d0d3a93e70447476aae2a5a34b141
    • [x] add KPI validation to make sure selected count_column is of integer type https://github.com/chaos-genius/chaos_genius/pull/985/commits/40856b2da3908133087839385173e1090fe32773
    • [x] Fixes for the failing pytests because of this change https://github.com/chaos-genius/chaos_genius/pull/985/commits/c61ea1eb3047c91b9ecc83596aa385476159d884
    opened by rjdp 6
  • Configurable Datetime format and SQL strings in Data Loader

    Configurable Datetime format and SQL strings in Data Loader

    • Moved SQL identifiers to their respective connectors
    • Added Date and timestamp formatting strings in connectors
    • Updated data loader to work with these
    • Formatted all files used to be consistent with style guidelines.
    🛠️ backend 🔗 connectors 
    opened by kartikay-bagla 6
  • chore(deps): bump sqlalchemy-redshift from 0.8.6 to 0.8.9

    chore(deps): bump sqlalchemy-redshift from 0.8.6 to 0.8.9

    Bumps sqlalchemy-redshift from 0.8.6 to 0.8.9.

    Changelog

    Sourced from sqlalchemy-redshift's changelog.

    0.8.9 (2021-12-15)

    • Support inspection of Redshift datatypes (Pull [#242](https://github.com/sqlalchemy-redshift/sqlalchemy-redshift/issues/242) <https://github.com/sqlalchemy-redshift/sqlalchemy-redshift/pull/242>_)

    0.8.8 (2021-11-03)

    • Remove support for Python 2.7; now requires python >=3.4 (Pull [#234](https://github.com/sqlalchemy-redshift/sqlalchemy-redshift/issues/234) <https://github.com/sqlalchemy-redshift/sqlalchemy-redshift/pull/234>_)
    • Support GEOMETRY, SUPER Redshift datatypes (Pull [#235](https://github.com/sqlalchemy-redshift/sqlalchemy-redshift/issues/235) <https://github.com/sqlalchemy-redshift/sqlalchemy-redshift/pull/235>_)

    0.8.7 (2021-10-27)

    • Initial SQLAlchemy 2.0.x support (Pull [#237](https://github.com/sqlalchemy-redshift/sqlalchemy-redshift/issues/237) <https://github.com/sqlalchemy-redshift/sqlalchemy-redshift/pull/237>_)
    Commits
    • c6b3e59 Preparing release 0.8.9
    • 08eb502 Merge pull request #242 from Brooke-white/datatype-inspection
    • 0f13d48 Update CHANGES.rst
    • 6fae561 fix(dialect, custom-types): support compilation
    • b22cc98 fix(dialect, metadata): support inspection of Redshift datatypes
    • a3eff80 Back to development: 0.8.9
    • 2c0c4d3 Preparing release 0.8.8
    • af22877 Support GEOMETRY, SUPER Redshift datatypes (#235)
    • c830f7b Remove support and tests for Python 2.7 (#234)
    • d18b9b7 Back to development: 0.8.8
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot[bot] 6
  • chore(deps): bump pycryptodomex from 3.14.1 to 3.15.0

    chore(deps): bump pycryptodomex from 3.14.1 to 3.15.0

    Bumps pycryptodomex from 3.14.1 to 3.15.0.

    Changelog

    Sourced from pycryptodomex's changelog.

    3.15.0 (22 June 2022)

    New features

    • Add support for curves Ed25519 and Ed448, including export and import of keys.
    • Add support for EdDSA signatures.
    • Add support for Asymmetric Key Packages (RFC5958) to import private keys.

    Resolved issues

    • GH#620: for Crypto.Util.number.getPrime , do not sequentially scan numbers searching for a prime.
    Commits

    pip dependencies 
    opened by dependabot[bot] 5
  • ClickHouse support

    ClickHouse support

    Tell us about the problem you're trying to solve

    ClickHouse is a widely used OLAP database, so I am looking to add it as a data source. ClickHouse supports the MySQL driver, so we can connect to it as MySQL clients, but it gave me an error when testing the connection. After investigating the problem, I saw that a "ROLLBACK" query raises an exception because no transaction was found.

    Describe the solution you'd like

    So it would be helpful to support ClickHouse over the MySQL driver, or even over HTTP requests or the native client.
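    For anyone exploring the native-client route suggested above, a minimal sketch using the clickhouse-driver package (host, credentials and query are placeholders; this is only an illustration of the native protocol, not how the connector was ultimately implemented):

        from clickhouse_driver import Client  # pip install clickhouse-driver

        # Connect over ClickHouse's native TCP protocol (default port 9000)
        # instead of the MySQL compatibility layer that breaks on ROLLBACK.
        client = Client(host="localhost", port=9000, user="default", password="", database="default")
        rows = client.execute("SELECT name, engine FROM system.tables LIMIT 5")
        print(rows)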

    🔗 connectors 
    opened by MahmoudElhalwany 5
  • [BUG] Cannot connect to a postgres database hosted on Amazon AWS on the

    [BUG] Cannot connect to a postgres database hosted on Amazon AWS on the "Add Data Sources" page. "Test Connection" keeps running for a while but never changes to "Add Data Source".

    Bug

    [BUG] Cannot connect to a postgres database hosted on Amazon AWS on the "Add Data Sources" page. "Test Connection" keeps running for a while but never changes to "Add Data Source". I have tested with a publicly available postgres database, with the details given in the chaos-genius documentation (please refer to this). I was able to connect to the public dataset. However, I cannot connect to my own postgres database (hosted on Amazon AWS). I am certain I am entering the correct credentials for the database connection.

    Environment

    • Chaos Genius version: 0.5.1
    • OS Version / Instance: Ubuntu 20.04
    • Deployment type: Deployed locally by cloning the repository and using the docker-compose up command

    Current behavior

    After deploying chaos-genius locally and starting all services using the docker-compose up command, the home page for chaos-genius shows on localhost:8080. On clicking the "Add Data Source" button on the homepage, I am taken to the "Add Data Sources" page. After entering all details correctly for my own postgres database, I click the "Test Connection" button. A "Loading..." message appears, which never changes to "Add Data Source" and just changes back to "Test Connection".

    Steps to reproduce:

    1. Clone the repository: git clone https://github.com/chaos-genius/chaos_genius
    2. Change into the chaos_genius directory: cd chaos_genius
    3. Start services: docker-compose up
    4. Navigate to localhost:8080 in browser
    5. Click "Add Data Source" button on homepage
    6. Fill in all database connection details
    7. Click "Test Connection" to test

    Expected behavior

    On clicking the "Test Connection" button, the connection test should succeed (since the credentials are correct) and the button should change to "Add Data Source".

    opened by prathamSharma25 5
  • chore(deps): bump cryptography from 3.4.8 to 36.0.2

    chore(deps): bump cryptography from 3.4.8 to 36.0.2

    Bumps cryptography from 3.4.8 to 36.0.2.

    Changelog

    Sourced from cryptography's changelog.

    36.0.2 - 2022-03-15

    
    * Updated Windows, macOS, and Linux wheels to be compiled with OpenSSL 1.1.1n.
    

    .. _v36-0-1:

    36.0.1 - 2021-12-14

    • Updated Windows, macOS, and Linux wheels to be compiled with OpenSSL 1.1.1m.

    .. _v36-0-0:

    36.0.0 - 2021-11-21

    
    * **FINAL DEPRECATION** Support for ``verifier`` and ``signer`` on our
      asymmetric key classes was deprecated in version 2.0. These functions had an
      extended deprecation due to usage, however the next version of
      ``cryptography`` will drop support. Users should migrate to ``sign`` and
      ``verify``.
    * The entire :doc:`/x509/index` layer is now written in Rust. This allows
      alternate asymmetric key implementations that can support cloud key
      management services or hardware security modules provided they implement
      the necessary interface (for example:
      :class:`~cryptography.hazmat.primitives.asymmetric.ec.EllipticCurvePrivateKey`).
    * :ref:`Deprecated the backend argument<faq-missing-backend>` for all
      functions.
    * Added support for
      :class:`~cryptography.hazmat.primitives.ciphers.aead.AESOCB3`.
    * Added support for iterating over arbitrary request
      :attr:`~cryptography.x509.CertificateSigningRequest.attributes`.
    * Deprecated the ``get_attribute_for_oid`` method on
      :class:`~cryptography.x509.CertificateSigningRequest` in favor of
      :meth:`~cryptography.x509.Attributes.get_attribute_for_oid` on the new
      :class:`~cryptography.x509.Attributes` object.
    * Fixed handling of PEM files to allow loading when certificate and key are
      in the same file.
    * Fixed parsing of :class:`~cryptography.x509.CertificatePolicies` extensions
      containing legacy ``BMPString`` values in their ``explicitText``.
    * Allow parsing of negative serial numbers in certificates. Negative serial
      numbers are prohibited by :rfc:`5280` so a deprecation warning will be
      raised whenever they are encountered. A future version of ``cryptography``
      will drop support for parsing them.
    * Added support for parsing PKCS12 files with friendly names for all
      certificates with
      :func:`~cryptography.hazmat.primitives.serialization.pkcs12.load_pkcs12`,
      which will return an object of type
      :class:`~cryptography.hazmat.primitives.serialization.pkcs12.PKCS12KeyAndCertificates`.
    

    ... (truncated)

    Commits

    dependencies 
    opened by dependabot[bot] 5
  • Can not add alert from UI

    Can not add alert from UI

    On trying to add an alert through the UI, I get the following error:

    {"message":"(psycopg2.ProgrammingError) can't adapt type 'dict'\n[SQL: INSERT INTO alert (alert_name, alert_type, data_source, alert_query, alert_settings, kpi, kpi_alert_type, severity_cutoff_score, alert_message, alert_frequency, alert_channel, alert_channel_conf, active, created_at, last_alerted) VALUES (%(alert_name)s, %(alert_type)s, %(data_source)s, %(alert_query)s, %(alert_settings)s, %(kpi)s, %(kpi_alert_type)s, %(severity_cutoff_score)s, %(alert_message)s, %(alert_frequency)s, %(alert_channel)s, %(alert_channel_conf)s, %(active)s, %(created_at)s, %(last_alerted)s) RETURNING alert.id]\n[parameters: {'alert_name': 'abcd', 'alert_type': 'KPI Alert', 'data_source': {'active': True, 'connection_status': 'connected', 'connection_type': 'Postgres', 'created_at': 'Sat, 18 Sep 2021 08:28:25 GMT', 'id': 4, 'is_third_party': False, 'last_sync': None, 'name': 'Cloud Failure', 'sync_status': None}, 'alert_query': '', 'alert_settings': '', 'kpi': 15, 'kpi_alert_type': 'Anomaly', 'severity_cutoff_score': 7, 'alert_message': 'Test alert', 'alert_frequency': 'daily', 'alert_channel': 'slack', 'alert_channel_conf': '\"{}\"', 'active': True, 'created_at': datetime.datetime(2021, 10, 2, 11, 23, 44, 266857), 'last_alerted': None}]\n(Background on this error at: http://sqlalche.me/e/14/f405)","status":"failure"}

    🐛 bug 🛠️ backend 
    opened by Fletchersan 5
  • Anomaly charts do not highlight outliers & CI range in some edge cases

    Anomaly charts do not highlight outliers & CI range in some edge cases

    It looks like this is only triggered in cases where the first data point is an outlier. It may be related to our charting/intercept code.


    🐛 bug 🖥️ frontend 
    opened by suranah 5
  • chore(deps): bump toolz from 0.11.2 to 0.12.0

    chore(deps): bump toolz from 0.11.2 to 0.12.0

    Bumps toolz from 0.11.2 to 0.12.0.

    Release notes

    Sourced from toolz's releases.

    Release 0.12.0

    • Add apply (#411)
    • Support newer Python versions--up to Python 3.11-alpha (#525, #527, #533)
    • Improve warning when using toolz.compatibility (#485)
    • Improve documentation (#507, #524, #526, #530)
    • Improve performance of merge_with (#532)
    • Improve import times (#534)
    • Auto-upload new releases to PyPI (#536, #537)
    Commits
    • 245b78e Merge pull request #537 from eriknw/auto_pypi
    • e50ab10 Tag glob pattern for auto-upload to PyPI
    • 07d20d2 Merge pull request #536 from eriknw/auto_pypi
    • 7640dc1 Add Github Action to upload to PyPI
    • 22dc024 Merge pull request #534 from eriknw/faster_import
    • 7880c57 Add test to ensure toolz.curried.operator is properly curried
    • 77c044e Make import times significantly faster
    • 32135de Merge pull request #533 from eriknw/python3.11
    • dfbc3d6 Use importlib.machinery.ModuleSpec instead of TlzSpec
    • 6a8de2e Fix Tlz error in Python 3.11 (alpha)
    • Additional commits viewable in compare view

    pip dependencies 
    opened by dependabot[bot] 4
  • chore(deps): bump cryptography from 36.0.2 to 39.0.0

    chore(deps): bump cryptography from 36.0.2 to 39.0.0

    Bumps cryptography from 36.0.2 to 39.0.0.

    Changelog

    Sourced from cryptography's changelog.

    39.0.0 - 2023-01-01

    
    * **BACKWARDS INCOMPATIBLE:** Support for OpenSSL 1.1.0 has been removed.
      Users on older version of OpenSSL will need to upgrade.
    * **BACKWARDS INCOMPATIBLE:** Dropped support for LibreSSL < 3.5. The new
      minimum LibreSSL version is 3.5.0. Going forward our policy is to support
      versions of LibreSSL that are available in versions of OpenBSD that are
      still receiving security support.
    * **BACKWARDS INCOMPATIBLE:** Removed the ``encode_point`` and
      ``from_encoded_point`` methods on
      :class:`~cryptography.hazmat.primitives.asymmetric.ec.EllipticCurvePublicNumbers`,
      which had been deprecated for several years.
      :meth:`~cryptography.hazmat.primitives.asymmetric.ec.EllipticCurvePublicKey.public_bytes`
      and
      :meth:`~cryptography.hazmat.primitives.asymmetric.ec.EllipticCurvePublicKey.from_encoded_point`
      should be used instead.
    * **BACKWARDS INCOMPATIBLE:** Support for using MD5 or SHA1 in
      :class:`~cryptography.x509.CertificateBuilder`, other X.509 builders, and
      PKCS7 has been removed.
    * **BACKWARDS INCOMPATIBLE:** Dropped support for macOS 10.10 and 10.11, macOS
      users must upgrade to 10.12 or newer.
    * **ANNOUNCEMENT:** The next version of ``cryptography`` (40.0) will change
      the way we link OpenSSL. This will only impact users who build
      ``cryptography`` from source (i.e., not from a ``wheel``), and specify their
      own version of OpenSSL. For those users, the ``CFLAGS``, ``LDFLAGS``,
      ``INCLUDE``, ``LIB``, and ``CRYPTOGRAPHY_SUPPRESS_LINK_FLAGS`` environment
      variables will no longer be respected. Instead, users will need to
      configure their builds `as documented here`_.
    * Added support for
      :ref:`disabling the legacy provider in OpenSSL 3.0.x<legacy-provider>`.
    * Added support for disabling RSA key validation checks when loading RSA
      keys via
      :func:`~cryptography.hazmat.primitives.serialization.load_pem_private_key`,
      :func:`~cryptography.hazmat.primitives.serialization.load_der_private_key`,
      and
      :meth:`~cryptography.hazmat.primitives.asymmetric.rsa.RSAPrivateNumbers.private_key`.
      This speeds up key loading but is :term:`unsafe` if you are loading potentially
      attacker supplied keys.
    * Significantly improved performance for
      :class:`~cryptography.hazmat.primitives.ciphers.aead.ChaCha20Poly1305`
      when repeatedly calling ``encrypt`` or ``decrypt`` with the same key.
    * Added support for creating OCSP requests with precomputed hashes using
      :meth:`~cryptography.x509.ocsp.OCSPRequestBuilder.add_certificate_by_hash`.
    * Added support for loading multiple PEM-encoded X.509 certificates from
      a single input via :func:`~cryptography.x509.load_pem_x509_certificates`.
    

    .. _v38-0-4:

    38.0.4 - 2022-11-27

    ... (truncated)

    Commits

    pip dependencies 
    opened by dependabot[bot] 2
  • chore(deps): bump pyopenssl from 21.0.0 to 23.0.0

    chore(deps): bump pyopenssl from 21.0.0 to 23.0.0

    Bumps pyopenssl from 21.0.0 to 23.0.0.

    Changelog

    Sourced from pyopenssl's changelog.

    23.0.0 (2023-01-01)

    Backward-incompatible changes:

    Deprecations:

    Changes:

    • Add OpenSSL.SSL.X509StoreFlags.PARTIAL_CHAIN constant to allow for users to perform certificate verification on partial certificate chains. [#1166](https://github.com/pyca/pyopenssl/issues/1166) <https://github.com/pyca/pyopenssl/pull/1166>_
    • cryptography maximum version has been increased to 39.0.x.

    22.1.0 (2022-09-25)

    Backward-incompatible changes:

    • Remove support for SSLv2 and SSLv3.
    • The minimum cryptography version is now 38.0.x (and we now pin releases against cryptography major versions to prevent future breakage)
    • The OpenSSL.crypto.X509StoreContextError exception has been refactored, changing its internal attributes. [#1133](https://github.com/pyca/pyopenssl/issues/1133) <https://github.com/pyca/pyopenssl/pull/1133>_

    Deprecations:

    • OpenSSL.SSL.SSLeay_version is deprecated in favor of OpenSSL.SSL.OpenSSL_version. The constants OpenSSL.SSL.SSLEAY_* are deprecated in favor of OpenSSL.SSL.OPENSSL_*.

    Changes:

    • Add OpenSSL.SSL.Connection.set_verify and OpenSSL.SSL.Connection.get_verify_mode to override the context object's verification flags. [#1073](https://github.com/pyca/pyopenssl/issues/1073) <https://github.com/pyca/pyopenssl/pull/1073>_
    • Add OpenSSL.SSL.Connection.use_certificate and OpenSSL.SSL.Connection.use_privatekey to set a certificate per connection (and not just per context) [#1121](https://github.com/pyca/pyopenssl/issues/1121) <https://github.com/pyca/pyopenssl/pull/1121>_.

    22.0.0 (2022-01-29)

    Backward-incompatible changes:

    ... (truncated)

    Commits

    pip dependencies 
    opened by dependabot[bot] 2
  • [Snyk] Security upgrade pygments from 2.5.2 to 2.7.4

    [Snyk] Security upgrade pygments from 2.5.2 to 2.7.4

    This PR was automatically created by Snyk using the credentials of a real user.


    Snyk has created this PR to fix one or more vulnerable packages in the `pip` dependencies of this project.

    Changes included in this PR

    • Changes to the following files to upgrade the vulnerable dependencies to a fixed version:
      • requirements/dev.txt
    ⚠️ Warning
    pdbpp 0.10.3 requires pygments, which is not installed.
    flake8-isort 4.2.0 requires isort, which is not installed.
    
    

    Vulnerabilities that will be fixed

    By pinning:

    Severity | Priority Score (*) | Issue | Upgrade | Breaking Change | Exploit Maturity
    high severity | 696/1000 (Why? Proof of Concept exploit, Has a fix available, CVSS 7.5) | Regular Expression Denial of Service (ReDoS) SNYK-PYTHON-PYGMENTS-1086606 | pygments: 2.5.2 -> 2.7.4 | No | Proof of Concept
    high severity | 589/1000 (Why? Has a fix available, CVSS 7.5) | Denial of Service (DoS) SNYK-PYTHON-PYGMENTS-1088505 | pygments: 2.5.2 -> 2.7.4 | No | No Known Exploit

    (*) Note that the real score may have changed since the PR was raised.

    Some vulnerabilities couldn't be fully fixed and so Snyk will still find them when the project is tested again. This may be because the vulnerability existed within more than one direct dependency, but not all of the affected dependencies could be upgraded.

    Check the changes in this PR to ensure they won't cause issues with your project.


    Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.

    For more information: 🧐 View latest project report

    🛠 Adjust project settings

    📚 Read more about Snyk's upgrade and patch logic


    Learn how to fix vulnerabilities with free interactive lessons:

    🦉 Regular Expression Denial of Service (ReDoS) 🦉 Denial of Service (DoS)

    opened by varunp2k 2
  • [Snyk] Security upgrade setuptools from 39.0.1 to 65.5.1

    [Snyk] Security upgrade setuptools from 39.0.1 to 65.5.1

    This PR was automatically created by Snyk using the credentials of a real user.


    Snyk has created this PR to fix one or more vulnerable packages in the `pip` dependencies of this project.

    Changes included in this PR

    • Changes to the following files to upgrade the vulnerable dependencies to a fixed version:
      • requirements/prod.txt
    ⚠️ Warning
    pyOpenSSL 21.0.0 requires cryptography, which is not installed.
    
    

    Vulnerabilities that will be fixed

    By pinning:

    Severity | Priority Score (*) | Issue | Upgrade | Breaking Change | Exploit Maturity
    medium severity | 551/1000 (Why? Recently disclosed, Has a fix available, CVSS 5.3) | Regular Expression Denial of Service (ReDoS) SNYK-PYTHON-SETUPTOOLS-3180412 | setuptools: 39.0.1 -> 65.5.1 | No | No Known Exploit

    (*) Note that the real score may have changed since the PR was raised.

    Some vulnerabilities couldn't be fully fixed and so Snyk will still find them when the project is tested again. This may be because the vulnerability existed within more than one direct dependency, but not all of the affected dependencies could be upgraded.

    Check the changes in this PR to ensure they won't cause issues with your project.


    Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.

    For more information: 🧐 View latest project report

    🛠 Adjust project settings

    📚 Read more about Snyk's upgrade and patch logic


    Learn how to fix vulnerabilities with free interactive lessons:

    🦉 Regular Expression Denial of Service (ReDoS)

    opened by varunp2k 2
  • chore(deps): bump sqlalchemy from 1.4.27 to 1.4.45

    chore(deps): bump sqlalchemy from 1.4.27 to 1.4.45

    Bumps sqlalchemy from 1.4.27 to 1.4.45.

    Release notes

    Sourced from sqlalchemy's releases.

    1.4.45

    Released: December 10, 2022

    orm

    • [orm] [bug] Fixed bug where _orm.Session.merge() would fail to preserve the current loaded contents of relationship attributes that were indicated with the _orm.relationship.viewonly parameter, thus defeating strategies that use _orm.Session.merge() to pull fully loaded objects from caches and other similar techniques. In a related change, fixed issue where an object that contains a loaded relationship that was nonetheless configured as lazy='raise' on the mapping would fail when passed to _orm.Session.merge(); checks for "raise" are now suspended within the merge process assuming the _orm.Session.merge.load parameter remains at its default of True.

      Overall, this is a behavioral adjustment to a change introduced in the 1.4 series as of #4994, which took "merge" out of the set of cascades applied by default to "viewonly" relationships. As "viewonly" relationships aren't persisted under any circumstances, allowing their contents to transfer during "merge" does not impact the persistence behavior of the target object. This allows _orm.Session.merge() to correctly suit one of its use cases, that of adding objects to a Session that were loaded elsewhere, often for the purposes of restoring from a cache.

      References: #8862

    • [orm] [bug] Fixed issues in _orm.with_expression() where expressions that were composed of columns that were referenced from the enclosing SELECT would not render correct SQL in some contexts, in the case where the expression had a label name that matched the attribute which used _orm.query_expression(), even when _orm.query_expression() had no default expression. For the moment, if the _orm.query_expression() does have a default expression, that label name is still used for that default, and an additional label with the same name will continue to be ignored. Overall, this case is pretty thorny so further adjustments might be warranted.

      References: #8881

    engine

    • [engine] [bug] Fixed issue where _engine.Result.freeze() method would not work for textual SQL using either _sql.text() or _engine.Connection.exec_driver_sql().

      References: #8963

    ... (truncated)

    Commits

    pip dependencies 
    opened by dependabot[bot] 2
  • chore(deps): bump pycryptodomex from 3.14.1 to 3.16.0

    chore(deps): bump pycryptodomex from 3.14.1 to 3.16.0

    Bumps pycryptodomex from 3.14.1 to 3.16.0.

    Release notes

    Sourced from pycryptodomex's releases.

    v3.16.0 - Ravensburg

    New features

    • Build wheels for musl Linux. Thanks to Ben Raz.

    Resolved issues

    • GH#639: ARC4 now also works with 'keys' as short as 8 bits.
    • GH#669: fix segfaults when running in a manylinux2010 i686 image.

    v3.16.0 - Ravensburg (pycryptodomex)

    New features

    • Build wheels for musl Linux. Thanks to Ben Raz.

    Resolved issues

    • GH#639: ARC4 now also works with 'keys' as short as 8 bits.
    • GH#669: fix segfaults when running in a manylinux2010 i686 image.
    Changelog

    Sourced from pycryptodomex's changelog.

    3.16.0 (26 November 2022)

    New features

    • Build wheels for musl Linux. Thanks to Ben Raz.

    Resolved issues

    • GH#639: ARC4 now also works with 'keys' as short as 8 bits.
    • GH#669: fix segfaults when running in a manylinux2010 i686 image.

    3.15.0 (22 June 2022)

    New features

    • Add support for curves Ed25519 and Ed448, including export and import of keys.
    • Add support for EdDSA signatures.
    • Add support for Asymmetric Key Packages (RFC5958) to import private keys.

    Resolved issues

    • GH#620: for Crypto.Util.number.getPrime , do not sequentially scan numbers searching for a prime.
    Commits
    • f54108b Bump version
    • 3d065d6 Remove manylinux1 wheels only on Linux
    • 60898fe Build wheel for PyPy3.7, drop PyPy3.6
    • 428fd89 Delete manylinux1 wheels
    • 80d6640 Use -mstackrealign for 32-bits systems and SSE2
    • 32f64d5 Use most recent versions of python, ubuntu, upload-artifact
    • dcab3e7 Refactor wheel build process
    • 7e59254 Merge branch 'rc4_shorter_keys'
    • dfcc12b Bump version
    • ac3eab0 Allow RC4 keys to be as small as 8 bits
    • Additional commits viewable in compare view

    pip dependencies 
    opened by dependabot[bot] 2
Releases(v0.11.0)
  • v0.11.0(Oct 4, 2022)

    Hello everyone :wave:, we’re announcing the release of Chaos Genius v0.11.0 today! :tada::rocket: In this release we add to our Anomaly Detection and Alerts modules and resolve critical bugs.

    :rotating_light: Enhancement to our Anomaly Detection and Alerts Module

    To make Anomaly Detection more insightful, we now provide users the expected value of their time-series data in addition to the expected range. Furthermore, we have updated the alert format and dashboard to report percentage change based on expected values.

    • Add Expected Value to Anomaly and Alerts #1132 #1138 #1140 #1144
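    For clarity, the "percentage change based on expected values" is just the observed value's deviation expressed relative to the model's expected value; a minimal sketch of that arithmetic (not the exact code in the release):

        def percent_change_from_expected(actual: float, expected: float) -> float:
            # Deviation of the observed value from the expected value,
            # expressed as a percentage of the expected value.
            return (actual - expected) / expected * 100

        # e.g. expected 2000 daily active users but observed 1700 -> -15.0
        print(percent_change_from_expected(1700, 2000))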

    :bug: Bug fixes

    • fix(retrain): clear recent alerts data and set appropriate last_anomaly_timestamp #1128
    • chore(dev-ops): Upgrade dependencies #1131 #1127 #1126 #1124
    Source code(tar.gz)
    Source code(zip)
  • v0.10.2(Sep 9, 2022)

    Hello everyone :wave:,

    We’re announcing the release of Chaos Genius v0.10.2 today! :tada::rocket: In this release we focus on resolving critical bugs and adding a new connector.

    :chains: Adding Clickhouse as a Data Connector

    To make Chaos Genius operational for more environments, we’ve added a new data connector - Clickhouse - to our roster. Thank you @MahmoudElhalwany for your contribution!

    Feature/clickhouse connector #1090

    :bug: Bug fixes

    • fix(alert-report): ignore alerts of deleted/inactive alerts #1118
    • fix(datasource): use case-insensitive check for LIMIT #1116
    • fix(front-end): Bug event alert toast on test query #1105
    • fix(front-end):changed sidebar support text and icon #1092
    • chore(dev-ops): Upgrade dependencies #1109 #1108 #1107 #1096
    Source code(tar.gz)
    Source code(zip)
  • v0.10.1(Aug 15, 2022)

    Hey everyone :wave:, we’re back announcing the release of Chaos Genius v0.10.1! :tada::rocket: This is a minor hotfix release to resolve some critical bugs.

    :bug: Bug fixes

    • Dimension and subdimensional options should be searchable #1080
    • feat(subdim-filter): sort dimension and value lists #1069
    • feat: count column for druid supports float now #1075
    • fix: check dashboard/datasource active in filter options #1079
    • Removed xAxis legend of KPI summary chart #1067
      • Thank you Somya Jain and Ashish Sahoo for all of the above suggestions and bug reports
    • fix(docker): add analytics params to alerts worker #1078
      • Thank you Grant Xu for reporting this bug
    • fix(anomaly): ensure correct data types #1077
      • Thank you Alberto Azambuja for reporting this bug
    • fix(anomaly): convert slack into hourly value for hourly KPIs #1082
    • fix(front-end): called all alerts API when alerts are deleted #1081
    • Disable analytics endpoint #1073
    Source code(tar.gz)
    Source code(zip)
  • v0.10.0(Jul 28, 2022)

    Hey everyone :wave:, we’re announcing the release of Chaos Genius v0.10.0! :tada::rocket:

    This release focuses on clearing the bugs backlog and improving the quality of our root cause analysis for anomalies. Here are some brief notes on our new features and bug fixes:

    :zap: Enhancements to Root Cause/Drill Down Algorithm for Anomalies

    Until now, to find the reasons behind an anomalous data point, Drilldowns would mostly sort the underlying sub-populations by the severity score of their anomalies. This often led to high-severity but low-impact sub-populations being ranked higher in the Drilldowns hierarchy. To fix this, Drilldowns for an anomalous point in a KPI are now sorted by a new metric called Impact Score. This metric weighs both the severity of the data point and the contribution the subdimension makes to the overall KPI. Users will now have more relevant and clearer insights into their data.

    • Impact Score Metric #1033
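    As a back-of-the-envelope reading of that description (the real formula is in PR #1033; this sketch only combines the two ingredients mentioned above, severity and contribution, and the multiplicative weighting is an assumption):

        def impact_score(severity: float, subdim_value: float, overall_kpi_value: float) -> float:
            # Weigh the anomaly's severity by how much the sub-population
            # contributes to the overall KPI, so high-severity but tiny
            # subdimensions no longer dominate the drilldown ranking.
            contribution = subdim_value / overall_kpi_value if overall_kpi_value else 0.0
            return severity * contribution

        # A severity-9 anomaly in a subdim worth 2% of the KPI scores 0.18,
        # while a severity-6 anomaly in a subdim worth 40% scores 2.4.
        print(impact_score(9, 20, 1000), impact_score(6, 400, 1000))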

    :bug: Bug fixes

    • Fix anomaly subdim filter and data download for subdims with empty value #1057
    • removed check of hasdruiddropdown #1056
    • fix(drilldowns): reset count col before data loading #1055
    • fix(data-loader): fix bug when metric and count column are same #1053
    • fixed security vulnerabilites #1052
    • fix(drilldowns): use severity when impact is 0 #1051
    • fix(anomaly): when metric and count column are same for pre-aggregated data #1048
    • feat: add path safe KPI and subdim names for CSV download #1045
    • download csv name changes #1047 #1044
    • fix(anomaly-params): use .get instead of indexing, check rca_time exists #1043
    • added description and numerical range to slider values #1041
    • fix(migrations): handle case where value of subdim can be an empty value #1040
    • perf: optimize graph JSON creation #1039
    • Make anomaly CSV download work for subdims too #1038 #1036
    • feat(alerts): add CSV to email alert report, better CSV names #1035
    • Schedule Deepdrills/summary at same time as anomaly for daily running KPIs #1034
    • refactor anomaly data view for style, lints, type errors #1032
    • fix(anomaly): smaller time window, drop dups for hourly KPI drilldown #1029
    • fix: added guard condition to fix #1021 #1027
    Source code(tar.gz)
    Source code(zip)
  • v0.9.1(Jul 12, 2022)

    A minor hotfix release for some critical bugs.

    :bug: Fixes

    • Alert dashboard did not work when anomaly was manually disabled for some KPIs (#1020)
    • Alerts had an extra "Sub-dimensional anomalies" heading and "Reasons for change" text even when there were no sub-dimensional anomalies or relevant subdims (#1024)
    • CSS bug in dashboard filter (#1023)
    Source code(tar.gz)
    Source code(zip)
  • v0.9.0(Jul 1, 2022)

    Hey everyone :wave:, we’re back announcing the release of v0.9.0 of Chaos Genius! Here are some brief notes on our new features and bug fixes:

    :arrow_double_down: Sub Dimensional Filter for Anomaly Detection

    Users can now browse through anomalies for any sub-dimension (e.g. Country = UK) by simply choosing the filters at the top; this has been a much-requested feature and we’re happy to finally support it.

    • Sub-Dimension Anomaly Filter #999
    • Sub-Dim Filter Front End #1002 #1004

    Sub-dimensional filter in anomaly

    :rotating_light: Enhancements to our Alerts Module

    Sub-dimensional alerts have returned with a new and improved format! Users can choose to receive sub-dimensional alerts while adding a KPI alert. To make alerts more insightful, we now provide Reasons for Change for any anomaly as well!

    • Add relevant subdims to overall anomalies and toggle for subdim level anomalies #1000 #1010 #1012

    Reasons for change in anomaly alerts

    Sub-dimensional anomalies

    :chains: Adding DataBricks & Athena as Data Connectors

    To make Chaos Genius operational for more environments, we’ve added two new data connectors - AWS Athena & Databricks - to our roster.

    • Databricks Datasource Support #1001
    • feat(connectors): added AWS Athena Connector #964

    :bug: Bug fixes

    • Fix/pyarrow dep coflict #1008
    • Fixes the bug where having strings with quotes in your data would break it #1005
    • Added condition for comparing KPI search texts #1003
    • Fix empty event alerts tab in alerts dashboard #1000
    • Dashboard link in alert does not filter by the specific alert #1000
    • Alerts and KPIs are not linked in the alerts dashboard #1000
    • fix: error handling for fetching the table info in case of permission issue #998
    • fix(edit-kpi): kpi validation triggered on edit kpi #996
    • Fix breadcrumbs in KPI screen #993
    • Fixes #991 by checking for validation success before running checks for tz-aware data #992
    Source code(tar.gz)
    Source code(zip)
  • v0.8.0(Jun 13, 2022)

    Release Notes for Chaos Genius 0.8.0

    Hello everyone :wave:, we’re announcing the release of Chaos Genius v0.8.0 today :tada:

    We’ve focused on enriching user experience and fixing critical bugs. We briefly cover the main features & fixes here.

    :rotating_light: Enhancements to our Alerts Module - More Insights, Less Noise

    To make alerts more insightful, we now identify KPI behavior such as :arrow_up: Spike, :arrow_down_small: Drop and :record_button: Anomalous occurrences. Furthermore, we’ve reduced noise in alert messages by switching off sub-dimensional alerts by default; they will still be available via the Alerts Dashboard. We are currently working on making sub-dimensional alerts more relevant and insightful for users in upcoming releases, so stay tuned!

    • Alerts Revamp #974
    • Alert Fixes Post Revamp #982
    • Added Disclaimer for Consolidated Alerts Reports #971, #986

    :monocle_face: Increased focus on Anomaly Detection

    After listening to our users’ feedback, we have decided to shift our focus and deepen our offering on Anomaly Detection. To that end, we have made the DeepDrills section optional; this feature will now be disabled by default. Users have the option to enable it if they want. Please refer to our docs on how to enable DeepDrills.

    • DeepDrills Decoupling and Restructuring #967
    • Decouple DeepDrills Frontend #970
    • Added DEEPDRILLS_ENABLED in global config API endpoint #972

    :sparkle: Druid Features Addition

    The name of the count column for Druid KPIs is now configurable as part of the KPI definition. This column can also be a proxy for the count of the rolled-up data, i.e. not representing the actual raw-event count. We support Druid for rollup scenarios only.

    • Druid Configurable Count Column #985
    • Druid Feature Changes Frontend #981, #983, #984
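    To illustrate why a configurable count column matters for rolled-up data (the table and column names below are invented, and this is not the connector's actual query builder):

        # In a Druid rollup table each stored row already aggregates many raw
        # events, so COUNT(*) undercounts; the configured count column (for
        # example "event_count") has to be summed instead.
        def count_query(table: str, count_column: str = "count") -> str:
            return f'SELECT __time, SUM("{count_column}") AS row_count FROM "{table}" GROUP BY __time'

        print(count_query("rolled_up_events", count_column="event_count"))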

    :bug: Bug fixes

    • Cannot define KPI with tables which contain ‘-’ in the table name. #793, #980
    • Fix bug in edit KPI for NoneType #977
    • A Druid data source does not need to be named “Druid”. #985
    • Fix typo in controller.py #968
    • Editing a KPI gets it removed from every dashboard except ‘All’. #965, #966
    • Fix type errors and lint issues #979
    Source code(tar.gz)
    Source code(zip)
  • v0.7.0(May 5, 2022)

    Release Notes for Chaos Genius 0.7.0

    Hello everyone, our 0.7.0 release takes care of usability & experience issues our users found operating Chaos Genius at scale. We briefly cover the main features & fixes here.

    🎊 Pagination to manage 1000s of KPIs

    Some of you who are monitoring a large number of KPIs found degraded performance on the Home Screen, KPI Explorer, Data Sources & Alerts. We have now added pagination & server-side search so the Chaos Genius UI scales well for 1000s of KPIs. Thank you, Klub team, for bringing this to our attention!

    • added paginate param for data sources dropdown in event alert #953
    • added paginate params for the KPI list and changed font size in pagination #946
    • pagination frontend #935
    • added fixes for filtering of url #949

    🐛 Robust KPI Editing

    We now have more robust editing for KPIs, including support for removing all subdimensions during the edit process. Thanks, @GRANTOSMO & Athul!

    • [BUG] Task Failure Alert for KPIs with Empty Sub-dimension List After Editing #938
    • fix(editkpi): issue of subdim anomaly failure after removal of all subdims #939

    🛡️ Vulnerabilities Fixes

    As with other releases, we make sure that there are no vulnerability or security issues in any of the libraries or dependencies Chaos Genius uses. We have also disabled CORS by default, and it is configurable (thanks, @rsohlot).

    • vulnerabilities fixed on dependencies #944
    • bump cryptography from 3.4.8 to 37.0.1 #942
    • configurable cors #791
    Source code(tar.gz)
    Source code(zip)
  • v0.6.0(Apr 21, 2022)

    Release Notes for Chaos Genius 0.6.0

    Hey everyone, we have tackled a bunch of key issues in our 0.6.0 release based on your feedback. Here are brief notes on them.

    1. What's New
    2. Bug fixes

    :tada: What's New

    Optimized Metadata Loader

    Many of you encountered metadata-related issues when connecting to cloud data warehouses with a large number of data assets. This was likely because of the amount of metadata being fetched live. To tackle this better, we now have a system that asynchronously loads all the metadata from your datastore, making the experience much more seamless. We have battle-tested its performance with data warehouses of over 100K tables.

    Thank you, @danielefrigo, @ankneo, @sparshgupta & the klub team!

    Configurable KPI Settings

    You no longer need to delete & recreate a KPI to make major changes to it. We now support editing both the KPI configuration and the anomaly settings.

    Thank you, @fampay-tech, @KShivendu, @gxu-kangaroo!

    Expanded TZ Support

    If you are dealing with different timezones for your database & reporting, this feature is relevant. We now support timezone-aware columns and provide data transformations when your database timezone & reporting timezone differ. While creating a data source, we will now ask you to select your DB timezone; the default is UTC.

    cc: @fampay-tech, @KShivendu

    Analytics Download

    Many of our users have asked for the option to download our analytics data for downstream ad-hoc analysis. We now support data downloads for panel metrics, DeepDrills as well as anomaly detection. Just click the blue download button and the respective analytics report will be downloaded as a CSV.

    We thank the Klub team for raising this feature request.

    :construction_worker: Other enhancements

    We have also worked to improve some other features including:

    • Made hourly alerting more robust to missing data
    • Streamlined alert schedulers to be more fault tolerant
    • Fixed an issue in tabular KPI support for third party data.

    :bug: Bug fixes

    • fix: API response for anomaly when anomaly settings have been not configured by @Amatullah in https://github.com/chaos-genius/chaos_genius/pull/845
    • fixed node-forge security vulnerabilites by @ChartistDev in https://github.com/chaos-genius/chaos_genius/pull/857
    • fix(scheduler): remove microsecond component from scheduled_time by @Samyak2 in https://github.com/chaos-genius/chaos_genius/pull/835
    • Hotfix 864 by @rjdp in https://github.com/chaos-genius/chaos_genius/pull/866
    • fix(docker): fixes #853 added restart policy by @varunp2k in https://github.com/chaos-genius/chaos_genius/pull/854
    • fix(data-loader): used ISO8601 extended strings in queries by @kartikay-bagla in https://github.com/chaos-genius/chaos_genius/pull/919
  • v0.5.2(Apr 5, 2022)

    🎉 Release Notes for Chaos Genius 0.5.2

    We are doing a hotfix release to tackle an issue.

    Bug Fixes 🐛

    • KPI creation for 3rd party data sources was broken in v0.5.0 and onwards; check out PR https://github.com/chaos-genius/chaos_genius/pull/885, which fixes this, for more details

    To upgrade your CG instance, follow the commands here.

  • v0.5.1(Mar 22, 2022)

    🎉 Release Notes for Chaos Genius 0.5.1

    We are doing a minor release to tackle a few issues you have raised.

    Bug Fixes 🐛

    • Fix the routing in the onboarding flow
    • Correct the message in the onboarding flow 📝
    • Fix chart styling 📈 where y-axis labels were being cut off

    Improvement ✨

    • Added the sidebar link for Joining the Slack Community 👥
    • Added the Add KPI FAQs in the onboarding screen 📄

    To upgrade your CG instance, follow the commands here.

  • v0.5.0(Mar 15, 2022)

    Release Notes for Chaos Genius 0.5.0

    Hi everyone! We’re excited to announce the release of Chaos Genius 0.5.0 with some highly requested features since the start along with more bug squashing.

    Some of the key highlights for CG 0.5.0:

    1. Hourly Anomaly Detection and Alerting
    2. Event Alerts: understand and get alerted on changes in data.
    3. Druid support (Experimental).
    4. We have also squashed a bunch of bugs.

    To upgrade your CG instance, follow the commands here.

    A big thanks to @playsimple, @fampay-tech, @KShivendu, @coindcx-gh, @GRANTOSMO, @athul-osmo, @rsohlot

    :tada: New Features

    Hourly Anomaly Detection and Alerting

    • Since many users are running Chaos Genius on live data, they need faster anomaly detection and alerting, so we've added support to run anomaly detection and alerting at an hourly level.
    • Now users have the ability to run Anomaly Detection on KPIs every hour and observe real-time changes to their data.
    • Users can get hourly updates on slack or email by setting up hourly alerts for their KPIs.


    Event Alerts

    • You can now set up alerts to monitor various events in your data.
    • Alerts can be set up for the following events:
      • If you have a new entry added to your data.
      • If there is any change (addition or deletion) to your data.
      • If you have missing data
    • You can set the alert frequency to either daily or hourly.


    Druid support (Experimental)

    • Druid is an open-source data store that can run super fast queries and provides data ingestion and fast data aggregation.
    • We currently only support the sum and count aggregations with Druid, but we’ll be adding mean and unique soon, so keep on the lookout for that.
    • The supported authentication methods for Druid are anonymous (no username/password) and basic auth.

    Bug fixes

    • Back-filling of missing data not occurring at the edges for anomaly #730
    • Sometimes alerts are not triggered even though anomalies exist in sub-dimensions #743
    • Expected range in alerts can be confusing when negative numbers are involved #725
    • Fixed search bug across multiple pages #826
    • Fixed bug which didn't accept test query payload #823
  • v0.4.1(Feb 23, 2022)

    🐛 Release Notes for Chaos Genius 0.4.1

    We are shipping a small release to tackle the issues you raised. The issues tackled include:

    • Added a hyperlink for the troubleshooting URL in Add KPI, with an embedded video
    • Fixed %Inf to display as - in KPI Home
    • Removed the global loader from the Add KPI form and changed the dropdown text from “no options” to “loading”
    • Fixed incorrect y-axis label formatting when the value is below 1000 in waterfall charts
    • Fixed 1 Dependabot security vulnerability (the other cannot be fixed)
    • Fixed Slack alerts where the config is not set up
  • v0.4.0(Feb 17, 2022)

    Release Notes for Chaos Genius 0.4.0 (Public Release)

    1. What's New
    2. New Features
    3. Bug Fixes

    :sparkles: What's New?

    Hi everyone, we hope 2022 is off to a great start. With Chaos Genius 0.4.0, we have a BIG announcement to make. After spending weeks working with you all to make the product easy to use and stable, we're finally opening up our repos for public access!! Don't forget to star us 🌟

    A BIG thank you to each one of you for your excitement about the product and diligent feedback. We are grateful to @KShivendu, @danielefrigo, @coindcx-gh, @playsimple & @fampay-tech for their feedback on this release 🎶 🙌

    Some of the key highlights for CG 0.4.0:

    1. Simpler and faster install: 70% less storage, 50% faster installation
    2. DeepDrills: New time cuts addition (WoW, MoM, QoQ)
    3. Daily Alerts Report
    4. Alerts Dashboard
    5. Timezone support
    6. We also fixed a bunch of bugs.

    To upgrade your CG instance, follow the commands here.

    🎉 New Features

    Faster & Simpler Install

    • We are adding a default installation setup that is lighter, faster and works with fewer resources.
    • Decreased storage requirements by over 5.1 GB (about 70% of the earlier footprint)
    • Install time is, on average, halved compared to previous versions
    • (Optional) All 3rd party SaaS connectors have been made optional and can now be enabled via docker-compose.thirdparty.yml (see the example command after this list)
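
    As a rough sketch, bringing up the core stack together with the optional 3rd party connectors would look like the standard Docker Compose multi-file invocation below. Only docker-compose.thirdparty.yml is named above; the base file name and the exact invocation are assumptions, so check the docs for your version:

    docker-compose -f docker-compose.yml -f docker-compose.thirdparty.yml up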

    DeepDrills

    • We introduced new time cuts for DeepDrills like WoW, WTD, MoM, MTD, QoQ, QTD to reflect business needs; these can also be configured via env variables.
    • We also improved the UI for DeepDrills to make the data more actionable and intuitive.
    • Thanks for the feedback (Dushyant, Anmol, Shree)

    Daily Alerts Report

    • As many of our users are monitoring 1000s of KPIs (incl. sub-dimensions), this can lead to a separate email being triggered for each individual alert
    • For ease of use and to make these alerts more actionable, you can now opt to receive these alerts as a Daily Alerts Digest
    • The Alerts are now also presented in natural language (Thanks @KShivendu for all the amazing feedback on Alerts)

    Alerts Dashboard

    • As many of our users are monitoring 1000s of KPIs (incl. sub-dimensions) - this can lead to a large number of alerts and can cause alert fatigue
    • To manage a large number of alerts better, we've enabled an Alerts Dashboard where you can access all your alerts for the last 7 days. You can filter by Alert Configurations, Recipient Email, KPI and Dates
    • For more details you can jump into the Alerts Dashboard available at api/digest or via the Alerts screen by clicking the header "Alerts"

    Timezone Support

    • You can now set up a reporting timezone for metrics, in which the results are displayed.

    • Currently our analytics & alerts scheduler runs on server time - which can often create confusion if the server timezone is different from the reporting timezone. To tackle this issue, we now also display the timezone for all scheduler settings.

    • We are in the process of adding new features to enhance timezone support, including native handling of timezone-aware data.

    🐛 Bug fixes

    • DQ Anomaly Metrics should not be displayed when we do count aggregation on a categorical column #575

    • Fix the sorting logic in the KPI and Dashboard #590

    • Add a modal popup after successful addition of datasources #616

    • Add a Loader in Add KPI Screen while selecting dropdowns #612

    • Human Readable numbers for Chart axis labels and values, Chart tooltip format uniformity #582

    • Fix severity computation robustness for anomalies for ML & Stats models #535

    • Duplicate entry for the last date in db while running Anomaly Detection daily #451

    • Anomaly Charts highlight entire time series as anomaly #417

    • Subtitles for Time input field #696

    • TimeZone Changes for Deepdrills graph and Anomaly #695

    • Flickering in hierarchial charts(grid) #688

    • Inconsistent time series for panel & anomalies for hourly data #678

    • Alert digests & subdimensional anomaly do not have consistent anomalies #677

    • Alerts toasts getting Retriggered on navigating back to the alerts page #699

    • Third-party datasources do not sync when chaosgenius-db port is not exposed #720

    • Enable all time cuts by default in DeepDrills #721

    • KPI is not added to non-default dashboards that are selected in Create KPI Page. #729

    • NaN output for Mean aggregations in RCA when one of the groups is empty #731

  • v0.3.0(Jan 7, 2022)

    Release Notes for Chaos Genius 0.3.0

    • What's New
    • New Features
    • Bug Fixes

    ✨ What's New?

    A very happy new year to everyone in the Chaos Genius Community! We are bringing another big upgrade to the Chaos Genius experience based on the love and feedback we've received from you!! :heart:

    Thank you - @mvaerle, @omriAl, @KrishnaSistla, @gxu-kangaroo, @Adwate, @coindcx-gh, @davidhayter-karhoo, @csoni111!

    With Chaos Genius 0.3.0, you can now scale Chaos Genius across multiple teams in the organization with our Dashboards feature. You can get even more powerful insights with Anomaly Detection supported at a sub-population level. In the task manager, you can now also view the errors occurring in your analytics - so no more going through the logs :) We also fixed a bunch of bugs.

    To upgrade your CG instance, follow the commands here.

    We look forward to continuing to build with all the support from our Community! Thank you, and we wish you and your families a safe and stellar 2022! 🎈

    🎉 New Features

    Anomaly detection at a sub-group level

    So far we've only supported anomaly detection at the overall KPI level. Based on community requests (thanks @gxu-kangaroo, @coindcx-gh), we now run anomaly detection and alerting for the top 250 sub-population groups of each KPI (this is configurable). See below for a quick snapshot of the anomaly alert:

    • Subdimensional email and slack alerts #516
    • feat(anomaly): add subdim level anomaly #512

    Teams Dashboards (EE Feature)

    We launched our first EE feature by allowing users to create separate dashboards for different groups of KPIs. This can be used to segregate analytics across different teams, customers, or any other groups.

    With this, customers are already using Chaos Genius to live-monitor 1000s of KPIs daily, something that was not possible with traditional BI tools!

    This is an EE feature. So feel free to reach out to [email protected] for pricing & other details.

    • feat: dashboard functionality along with some fixes #520

    Detailed Error Reporting in Task Status - no more looking at Logs to debug.

    We have tried to reduce the need to go through the logs by allowing users to find the status of their analytics runs directly from the Chaos Genius portal. You can now also see the error that's occurring and either debug it yourself or share a screenshot with the Chaos Genius team.

    • Store task & subtask status and create a view for it for streamlined troubleshooting #459

    Analytics can now run on data for (t-1)

    By default, we run analytics at a t-2 offset from the current day to allow data pipelines to complete & the data to be backfilled for the day.

    We now allow this to be changed by setting DAYS_OFFSET_FOR_ANALTYICS to 1 (a sketch follows the list below). By default, it is set to 2.

    • feat: updating t-2 to t-(k-1) for analytics #523
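
    For example, to run analytics on yesterday's data (t-1), the environment config would carry a line like the one below, shown in the same plain env style as the other params in these notes. Where exactly you set it depends on your deployment (typically docker-compose.yml), so treat this as a sketch:

    DAYS_OFFSET_FOR_ANALTYICS=1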

    Define KPIs without any dimensions

    Earlier, it was not possible to define a KPI without selecting dimensions. We have now made that optional. For a KPI without defined dimensions, you will not get DeepDrills analysis; in Anomaly Detection you will get the overall KPI anomaly, but drill-downs will not be available.

    • feat(anomaly): added anomaly without subdim support #435

    Support for custom schema as a table

    Thanks, @mvaerle, for the feature suggestion. Earlier, the only way to set up a KPI on a custom schema was via a query. We now allow custom (non-public) schemas to be added as table KPIs for ease of KPI setup. We support Postgres, Snowflake & Redshift for this.

    • Selecting a custom schema when adding KPIs #359

    Better defaults for analytics params

    Based on user feedback from @gxu-kangaroo, @davidhayter-karhoo & @fampay-tech, we have modified the defaults for these analytics params.

    By default, we now consider any dimension with cardinality up to 1000. We also consider the top 250 sub-groups for anomaly detection at a sub-dimensional level. Learn more about these params here.

    MAX_SUBDIM_CARDINALITY=1000
    MAX_FILTER_SUBGROUPS_ANOMALY=250
    
    • chore: update config params #567

    🐛 Bug Fixes

    • Query for KPI validation & analytics is not of consistent format for time ranges #457
    • [BUG] Large panel values in DeepDrills overflow in to the graph #479
    • Anomaly screen not getting redirected to setting when anomaly is not setup #537
    • Handle fallback screen for 0 KPI #539
    • Drilldown collapse/expand not working for zero dimension #543
    • Form Validation is not working correctly in create dashboard screen #542
    • Subdimensional Anomaly tab shows incorrect fallback before valid data is fetched #544
    • [BUG] "All" dashboard should not be allowed to be deleted #553
    • Edit KPI screen should have the dashboard they are pinned to selected in dropdown menu #540
    • Flask API endpoints shouldn't have the trailing slash #576
  • v0.2.0(Dec 9, 2021)

    Release Notes for Chaos Genius 0.2.0

    • What's New
    • New Model(s)
    • New Features
    • Bug Fixes

    :sparkles: What's New?

    Thank you all for the feedback on Chaos Genius 0.1.3. We're releasing Chaos Genius 0.2.0 today.

    Our main focus for this upgrade was on covering edge cases so that DeepDrills and Anomaly Detection work on as varied datasets as possible, adding Task Status monitoring so users can detect if any analytics is failing, and other bug fixes.

    Key highlights being:

    • Detailed status tracking for analytics for faster detection & debugging (cc: @bouke-nederstigt , @gxu-kangaroo, @davidhayter-karhoo, @mvaerle)
    • Configurations for edge cases like older data sets, smaller data sets, enabling KPI definition w/o dimensions etc. (cc: @davidhayter-karhoo)
    • DeepDrills handling for missing data, NULL/NaN values (cc: @davidhayter-karhoo)
    • New Anomaly Detection model - EWMA
    • Error & Analytics - Config for enabling Sentry & PostHog for Error handling & Analytics (cc: @coindcx-gh)
    • Improved Alerting logic
    • Bug Fixes
      • Data Sources not showing on installation (cc: @omriAl, @nsankar)
      • Other Bug Fixes

    We're happy to inform you that we've reached a Community Size of 50 with teams from 10 different time zones in such a short period of time! We look forward to working closely with all of you to support your use cases before we open up to the Public.

    🧮 New Model(s)

    We added a new model for Anomaly Detection: EWMA. The Exponentially Weighted Moving Average (EWMA) is a statistic that averages the data while giving less and less weight to observations the further back in time they are (the standard recurrence is EWMA_t = a * x_t + (1 - a) * EWMA_(t-1), where the smoothing factor a controls how quickly older values are discounted). EWMA is better suited for cases where the data is largely static but can have sudden state changes.

    • feat(anomaly): add EWMA Model (#428)

    :tada: New Features

    Task and status observability on your Analytics

    There are various reasons that can sometimes lead to analytics failing, e.g. database access/authorization errors, network errors, or incomplete data. While we are covering as many edge cases as possible, adding a Task Status is our first step towards faster incident detection. We will be adding more features to it, including exact errors & diagnoses when analytics fails. On a local installation, the task status should be available at http://127.0.0.1:8080/api/status/
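
    For example, on a local installation you can check the task status from a browser or the command line, assuming the default port mapping from the quick start:

    curl http://127.0.0.1:8080/api/status/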


    • Store task & subtask status and create a view for it for streamlined troubleshooting (#459)
    • Observable tasks deepdrills (#446)

    Error handling and user analytics to give better support (Sentry, PostHog)

    In order to identify errors sooner, you can now configure your Sentry account by updating the SENTRY_DSN parameter in docker-compose.yml. We can also provide you with our Sentry token so we can closely monitor any issues you may be facing.
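
    As a minimal sketch, the corresponding entry in docker-compose.yml would be an environment setting of the form below (the placeholder value is illustrative, not a real DSN):

    SENTRY_DSN=<your-sentry-dsn>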

    We've also added PostHog, an open-source analytics tool, to capture user activity and help us better inform the product roadmap as we open up our repos for public access. We have enabled an option to anonymize the data before sharing, and it is also possible to disable PostHog entirely.

    • Init the sentry integration (#357)
    • Posthog user identification & redirection (#462)

    More dataset configurations/missing data support

    In previous versions, there were analytics failures in cases where there was no data for the past 5 days. We call this the 'slack length'. We've made this value configurable (MAX_DEEPDRILLS_SLACK_DAYS and MAX_ANOMALY_SLACK_DAYS) in docker-compose.yml and updated the default to 14 days. This parameter helps us perform anomaly detection on the latest data for the most accurate results.
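
    For instance, the environment section of docker-compose.yml would carry entries along these lines, mirroring the new 14-day defaults (adjust them to how delayed your data typically is; the exact placement is an assumption, so check the docs for your version):

    MAX_DEEPDRILLS_SLACK_DAYS=14
    MAX_ANOMALY_SLACK_DAYS=14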

    In our previous versions, we also required users to select dimensions as a mandatory field. We've now made this optional. You need to specify dimensions only if you need sub-dimensional insights.

    • Make slack configurable for DeepDrills and Anomaly (#434)
    • Remove the mandatory option for the dimension (#445)

    Robust DeepDrills for missing data & errors

    Our first implementation of DeepDrills required complete datasets with the last 60 days of data to run successfully. We've enhanced DeepDrills to be more granular so that it works with incomplete datasets & handles missing data.

    • Handle DeepDrills analytics failures gracefully with partial analytics in case of subtask errors (#458)
    • Account for NaN & NULL values in DeepDrill analysis (#437)

    Improved alerting logic

    We've enhanced our alert logic to instantly trigger an alert once an anomaly is detected. We've also made a few improvements to the alert format. We'll continue to build out the alerting functionality in our future releases.

    • Update the anomaly alert implementation (#467)
    • Change the email format for more clarity (#477)

    Improved analytics indexing

    We have optimized our indexes to provide faster drill-downs for large KPIs & dimensions.

    • Add the analytics data index (#461)

    :bug: Bug Fixes

    • Handle KPI queries with trailing semicolon for KPI validation & analytics (#429)
    • Validate the duplicate column in the result dataset of query defined KPI (#441)
    • Snowflake connector mentions setting up with a hostname, where the hostname is actually not required (#438) (cc: @joshuataylor)
    • Metric columns having NaN's in first 10 or higher rows fails KPI Validation (#444)
    • Validation for the dimension column in the add KPI screen (#450)
    • DeepDrills fails for KPI with no dimensions defined (#468)
    • Handle empty data in comparison dataframe for mean aggregation in DeepDrills (#494)
  • v0.1.3(Nov 18, 2021)

    Release Notes for Chaos Genius 0.1.3

    • What's New
    • New Connector(s)
    • New Features
    • Bug Fixes

    :sparkles: What's New?

    We're excited to announce the release of the new and improved Chaos Genius 0.1.3. We want to sincerely thank our early users for their feedback (shout-out to @gxu-kangaroo, @davidhayter-karhoo, @mvaerle, @nitsujri, @miike, @coindcx-gh) and all our contributors for their relentless effort towards improving Chaos Genius.

    Chaos Genius 0.1.3 is focused on improving the onboarding process and improving compatibility for large datasets and varied sub-population types.

    Key highlights being:

    • Amazon Redshift Integration
    • Global Configuration to support handling large datasets (aggregated views up to 10M rows) and varied sub-populations (1-250 subgroups)
    • Optimized data fetching for large datasets
    • Improved Anomaly Detection by handling missing data points in time series, a higher number of drill-downs, higher cardinality support (1000+) and enhancements in Anomaly Detector Configuration
    • DeepDrills bug fixes
    • Improved logging
    • Other bug fixes

    🔌 New Connector(s)

    With the 0.1.3 release, Chaos Genius now supports Amazon Redshift as a data source. With this, Chaos Genius works with the 3 major data warehouses: Snowflake, BigQuery and Amazon Redshift.

    Please find the documentation for Redshift here.

    We will soon release public data sets on Redshift for our community to test out!

    • Add the redshift connector (#348)

    :tada: New Features

    Global Configuration to support large datasets & varied sub-group characteristics

    Using a global configuration setting, Chaos Genius now supports aggregated views of up to 10M rows and varied sub-group characteristics (1-250+ subgroups). This enables config control over the statistical filtering calculations that are carried out while running both DeepDrills and Anomaly Detection at a sub-group level.

    The Chaos Genius team will be happy to help you set up the configuration.

    • Fine Grained control on Anomaly Detection for different series_type (#324)
    • Add support for subgroup calculation global config in anomaly detection core (#341)
    • Make population calculation & statistical filtering parameters globally configurable (#340)

    Anomaly Detection Enhancements

    Missing Data in Time Series

    Handling missing data points in time-series analysis is a hairy problem. Chaos Genius 0.1.3 now treats missing data points as zero while plotting the time-series graphs and identifying anomalies. We will continue to invest more deeply in this going forward, e.g. by adding alerts for missing data, which can otherwise go undetected by certain algorithms.

    • Handle completely missing data in time series as zero (#367)

    Higher Cardinality support for Dimensions definition

    We've further optimized subgroup time-series creation to handle higher-cardinality dimensions. We now support dimensions with 1000+ cardinality; earlier, large-cardinality dimensions were excluded from the analysis. We'll continue to optimize this further over upcoming releases.

    • Refactor anomaly detection subgroup detection to handle higher cardinality (#350)

    Higher Number of Drill-downs in Anomaly Detection

    While investigating anomalies via drill-downs, Chaos Genius now gives the 10 most relevant sub-groups sorted by relevance (a mix of anomaly severity & sub-group population); this number is also configurable. We also upgraded the algorithms used to create these sub-groups.

    Going forward, we will enable this via configuration and also enable multi-dimensional drill-downs so you can detect the top drivers causing anomalies in your time series.

    • Enable support for a higher number of drill downs (#319)
    • Create new algorithm for subgroup list generation (#351)

    Support for Multivariate Subdimensional groups

    Chaos Genius can now detect anomalies on multivariate subdimensional groups that are mutually exclusive. All possible permutations of the selected dimensions are generated and statistically filtered based on population characteristics; anomaly detection & drill-downs are now optionally available for these groups.

    We'll continue investing in subdimensional anomaly detection including clustering & grouping for subdimensions that behave alike.

    • Configurable sub-dimension settings for anomaly detection (#349)

    Improved UX for Anomaly Settings for hourly time-series

    Chaos Genius now offers improved UX for setting anomaly detection configuration for hourly time-series.

    You can now specify the historical training data as a number of days instead of in units of the time-series frequency, e.g. 7 days instead of 168 hours if you need to train on hourly time-series data for the last week :)

    • Set anomaly period's value in days, irrespective of frequency (#336)

    Optimized Data Fetching for Large Datasets

    In the current release, we've optimized fetching of large datasets by adding a chunk size specification. Data is fetched in chunks (the param is currently set to 50,000 rows) and then merged into a single dataframe.

    • Benchmark & enable chunk size for pandas data fetching (#332)

    Enhanced Logging

    We're working extensively to improve the logging for Chaos Genius. In the current release, we've centralized the logging, added an option for Fluentd logs for persistence and now also include data params in the logs in order to identify edge cases where the analytics might be failing to run.

    • Centralized logging and spawning of loggers throughout the flask app (#313)
    • Fluentd for persistence
    • Data params passed in logs for easier replication of edge case issues

    In subsequent releases, we plan to enable the status of all the tasks.

    Other enhancements

    • Added nginx based front-end deployment
      • Update docker-compose for 0.1.3 release (#419)
    • Global configuration for multidimensional drill-downs
      • Make multidimensional drill down to be configurable for DeepDrills (#369)
    • Improved Error Message copy in UI
      • fix: Update the error messages, disable event kpi alert fix, anomaly setting fixes (#321)
      • Error message & integer type changed (#284)

    :bug: Bug Fixes

    • DeepDrill UI fixes
      • Count & size columns in the DeepDrills table are swapped (#310)
    • Anomaly interface fixes
      • Anomaly drill down graphs only display integer values (#306)
      • Changes in the Edit Anomaly Settings (#311)
      • Make analytics charts more descriptive and consistent. (#372)
    • Snowflake metadata ingestion issue raised by Grant Xu
      • Using Snowflake timestamp when casted can create issues while adding KPI (#320)
    • Handle when KPI only has 1 subgroup
      • UnboundLocalError when a KPI has only one subgroup (#342)
    • Handle edge cases in data with multiple frequencies
    • Fix the edit functionality for data sources
      • Data Source isn't being updated properly (#308)
    • Other UI fixes
      • Modified timestamp isn't coming in the alert (#309)
    • Fixes to improve handling of missing or incomplete data
      • RCA Infinite Loop when KPI is query based (#344)
      • Data Padding causes issue with anomaly detection values (#353)
      • RCA saving fails if there is a NaN value present (#347)
    • Fixes to handle anomaly & DeepDrill edge cases
      • Incorrect confidence intervals for anomaly detection after the first training session (#388)
      • Inconsistent analytics occurs between hourly panel metrics, DeepDrill & anomaly data (#390)
      • Wrong Start Dates for Anomaly (#399)
      • Anomaly training not till expected end date (#400)
      • Missing data point in DeepDrills when we have missing data (#413)
      • Inconsistent Last Updated between Anomaly and Deepdrills (#411)
    • Validation sequence logic & platform update for KPI addition
      • Adding KPI with incorrect columns does not produce the correct errors (#391)
      • SQL error while adding snowflake KPI (#397)
      • Truncated error output for out of bounds error in KPI validation (#398)
      • KPI validation for datetime column does not work (#405)
  • v0.1.2(Oct 11, 2021)

    The first public release for the alpha version 🔮

    This release includes:

    • Data source connections for database, data warehouses & third party sources
    • KPI creation with multiple dimensions for analysis
    • DeepDrills across the data with multidimensional waterfalls
    • Anomaly detection along with multidimensional drill-downs & data quality checks
    • Alerting based on the severity threshold on email and Slack