Python Dialogflow CX Scripting API (SCRAPI)

Overview
Scrappy, the SCRAPI mascot!

A high level scripting API for bot builders, developers, and maintainers.

Table of Contents
  1. Introduction
  2. Getting Started
  3. Usage
  4. Roadmap
  5. Contributing
  6. License
  7. Contact
  8. Acknowledgements

Introduction

The Python Dialogflow CX Scripting API (DFCX SCRAPI) is a high level API that extends the official Google Python Client for Dialogflow CX, making CX easier, friendlier, and more pythonic to use for bot builders, developers, and maintainers.

SCRAPI --> Python Dialogflow CX
as
Keras --> TensorFlow

What Can I Do With DFCX SCRAPI?

With DFCX SCRAPI, you can perform many bot building and maintenance actions at scale including, but not limited to:

  • Create, Update, Delete, Get, List for all CX resource types (i.e. Intents, Entity Types, Pages, Flows, etc.)
  • Convert commonly accessed CX Resources to Pandas Dataframes for data manipulation and analysis
    • Ex: bulk_intents_to_dataframe provides all intents and training phrases in a Pandas DataFrame that can be manipulated and/or exported to CSV or pushed back to CX
  • Have fully automated conversations with a CX agent (powerful for regression testing!)
  • Extract Validation information to assist in tuning your agent NLU, routes, etc.
  • Extract Change History information to assist with Change Management and Accountability for your development team
  • Search Util functions to look across all Flows/Pages/Routes to find a specific parameter or utterance you need to locate
  • Copy Util functions that allow you to quickly move CX resources between agents!
    • Ex: copy_intent_to_agent allows you to choose source and destination Agent IDs and a human readable Intent Display Name and BAM! Intent is moved with all training phrases to the destination agent!
  • Maker/Builder Util functions that allow you to build the fundamental protobuf objects that CX uses for each resource type
    • Ex: if you want to build a new Intent (or hundreds!) with training phrases from a Pandas DataFrame, you can build them all offline/in memory using the build_intent method (see the sketch after this list)
  • ...and much, much more!
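
As a quick illustration of that last build_intent point, here is a minimal sketch of building Intent objects offline from a DataFrame. The build_intent method is named above, but its exact signature (including the phrases argument) is an assumption; check the library docs for the real interface.

import pandas as pd
from dfcx_scrapi.core.intents import Intents

i = Intents(creds_path='<PATH TO YOUR SERVICE ACCOUNT JSON FILE>')

# One row per (intent, training phrase) pair
df = pd.DataFrame({
    'display_name': ['head_intent.billing', 'head_intent.billing'],
    'training_phrase': ['I need to pay my bill', 'How do I pay my statement?'],
})

# Build Intent objects offline / in memory -- no Dialogflow CX API calls yet.
# NOTE: the build_intent signature below is an assumption for illustration.
new_intents = []
for display_name, group in df.groupby('display_name'):
    intent = i.build_intent(
        display_name=display_name,
        phrases=group['training_phrase'].tolist(),
    )
    new_intents.append(intent)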

Built With

  • Python 3.8+

Getting Started

Environment Setup

Set up Google Cloud Platform credentials and install dependencies.

gcloud auth login
gcloud auth application-default login
gcloud config set project <project name>
python3 -m venv venv
source ./venv/bin/activate
pip install -r requirements.txt

Authentication

In order to use the functions and API calls to Dialogflow CX, you will need a Service Account that has appropriate access to your GCP project.
For more information, view the official docs on Creating and Managing GCP Service Accounts.

Usage

To run a simple bit of code, you can do the following:

  • Import a Class from dfcx_scrapi.core
  • Assign your Service Account to a local variable
from dfcx_scrapi.core.intents import Intents

creds_path = '<PATH TO YOUR SERVICE ACCOUNT JSON FILE>'
agent_path = '<FULL DFCX AGENT ID PATH>'

# DFCX Agent ID paths are in this format:
# 'projects/<project_id>/locations/<location_id>/agents/<agent_id>'

# Instantiate your class object and pass in your credentials
i = Intents(creds_path, agent_id=agent_path)

# Retrieve all Intents and Training Phrases from an Agent and push to a Pandas DataFrame
df = i.bulk_intent_to_df()

For more examples, please refer to Examples or Tools.

Library Composition

A brief overview of the motivation behind the library structure

Core

The Core folder is synonymous with the core Resource types in the DFCX Agents like:

  • agents
  • intents
  • entity_types
  • etc.

The Core folder is meant to contain the fundamental building blocks for even higher level customized tools and applications that can be built with this library.

Tools

The Tools folder contains various customized toolkits that allow you to do more complex bot management tasks. These include things like:

  • Manipulating Agent Resource types into various DataFrame structures for data science tasks
  • Copying Agent Resources between Agents and GCP Projects on a resource by resource level
  • Moving data to and from DFCX and other GCP Services like BigQuery, Sheets, etc.
  • Creating customized search queries inside of your agent resources
    • i.e. find all parameters in all pages in the agent that contain the string dtmf (see the sketch after this list)
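
For instance, a rough sketch of that dtmf search built on the get_param_presets_df method mentioned in the release notes further down; the constructor arguments and the returned column name are assumptions:

from dfcx_scrapi.tools.search_util import SearchUtil

search = SearchUtil(creds_path=creds_path, agent_id=agent_id)

# Pull every parameter preset in the agent into a DataFrame, then filter
# for any row whose parameter name contains the string 'dtmf'.
# NOTE: the 'parameter' column name is an assumption about the returned schema.
presets = search.get_param_presets_df()
dtmf_rows = presets[presets['parameter'].str.contains('dtmf', case=False, na=False)]
print(dtmf_rows)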

Roadmap

TBD

Contributing

We welcome any contributions or feature requests you would like to submit!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the Apache 2.0 License. See LICENSE for more information.

Contact

Patrick Marlow - [email protected] - @kmaphoenix
David "DC" Collier - [email protected] - @DCsan
Henry Drescher - [email protected] - @Hgithubacct

Project Link: https://github.com/GoogleCloudPlatform/dfcx-scrapi

Acknowledgements

Dialogflow CX Python Client Library

Comments
  • Feature/copy util update

    Feature/copy util update

    Updating copy util because it was getting errors from missing flows_map/flow references when trying to copy contents of one flow into another empty flow. Also added to the copy_paste_pages notebook to show how to copy the start page of a flow, as it has to be done separately from the other pages.

    opened by SeanScripts 6
  • Generate new training phrases and test sets from existing tp

    Generate new training phrases and test sets from existing tp

    Use the utterance generator tool to:

    1. Create net-new training phrases for intents from existing intents
    2. Create test sets from existing training phrases, ie use the training phrases to generate new independent phrases to be used as a test set.
    opened by Hgithubacct 5
  • Added get_webhook, update_webhook, small cleanup

    Added get_webhook, update_webhook, small cleanup

    Added ability to get individual webhooks by ID and make updates to existing webhooks. Also cleaned up some comment stuff.

    Should work very similarly to environments/versions, so I don't expect any issues! (famous last words)

    enhancement Webhooks 
    opened by Omerside 4
  • [FIX] Replace Frame.append with pd.concat

    [FIX] Replace Frame.append with pd.concat

    Context:

    /layers/google.python.pip/pip/lib/python3.9/site-packages/dfcx_scrapi/core/conversation.py:117: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
    /layers/google.python.pip/pip/lib/python3.9/site-packages/dfcx_scrapi/core/conversation.py:117: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
    

    Reference: https://github.com/GoogleCloudPlatform/dfcx-scrapi/blob/ee744d3fef9b0f1d53f6508dd1d91941731f8dc5/src/dfcx_scrapi/core/conversation.py#L117

    dataframes 
    opened by kmaphoenix 4
  • [Pull Request] Agent Assist Class Addition

    [Pull Request] Agent Assist Class Addition

    Pull Request Template

    Description

    The change aims to add modules to the test_cases class. This change will help the DDs & CAs get detailed information on test coverage with respect to intents, flows, transitions & route groups.

    No new dependencies need to be added or changed.

    Fixes # (issue)

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue)
    • [x] New feature (non-breaking change which adds functionality)
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
    • [ ] This change requires a documentation update

    How Has This Been Tested?

    I extended the ScrapiBase class to create a new test_case_extend class. This was done to ensure that the module will integrate with the existing test cases class. Then I added the new modules as methods to this class. I created a dummy agent in a GCP project and manually generated a few test cases. I then used the agent ID and a service account key from the above GCP project to test my code. The results were validated by matching the numbers in the DFCX console.

    Requirements to run the above script-

    • Agent ID with some DFCX test cases configured
    • Service Account Key (SAK) with the Dialogflow Test Cases Admin role

    Please describe the tests that you ran to verify your changes. Provide instructions and code snippets so we can reproduce. Please also list any relevant details for your test configuration (i.e. new dependencies).

    Pseudo Code:

    Code Snippets:

    # Ex:
    # from dfcx_scrapi.core.intents import Intents
    # i = Intents()
    # intents = i.list_intents()
    # for intent in intents:
    #   print(intent.display_name)
    

    Checklist:

    • [x] My code follows the style guidelines of this project
    • [x] I have performed a self-review of my own code
    • [x] My code passes the linter as defined in the .pylintrc file
    • [x] I have commented my code, particularly in hard-to-understand areas
    • [ ] I have made corresponding changes to the documentation
    • [x] My changes generate no new warnings
    • [ ] I have added tests that prove my fix is effective or that my feature works
    • [ ] New and existing unit tests pass locally with my changes
    • [ ] Any dependent changes have been merged and published in downstream modules
    • [x] I have checked my code and corrected any misspellings
    opened by hkhaitan1 3
  • [FR] Examples for annotated TPs uploading

    [FR] Examples for annotated TPs uploading

    Is your proposal related to a problem?

    It relates to understanding how to upload annotated TPs, what the input file format is, and which SCRAPI resource I need to use to do that.

    Describe the solution you'd like

    A notebook with some examples.


    enhancement Intents fr dataframes 
    opened by migoogle 3
  • Advanced mode did not manage properly unannotated training phrases

    Advanced mode did not manage properly unannotated training phrases

    The bulk_create_intent_from_dataframe and bulk_update_intents_from_dataframe did not properly manage training phrases that had no annotations (parameter_id is NA). Added a condition that solves the issue in the src/dfcx_scrapi/tools/dataframe_functions.py file.

    Also added an example:

    • a notebook and .py files to help understand how to work with annotated training phrases
    • 2 CSV files for sample intent and params
    Intents dataframes 
    opened by lambertaurelle 3
  • [BUG] _page_level_handlers() doesn't return df with all pages

    [BUG] _page_level_handlers() doesn't return df with all pages

    Expected Behavior

    function returns dataframe for all pages in all flows within the agent

    Current Behavior

    function returns a fraction of the expected pages.

    I wrote my own code to pull page event handlers before I was aware of find_event_handlers() (1/3 of which is _page_level_handlers); my code returned EHs for 1379 pages for a certain agent. For the same agent, find_event_handlers() pulled EHs for only 77 pages.

    bug dataframes 
    opened by Greenford 3
  • [BUG] non-intuitive behavior in tools/search_util function

    [BUG] non-intuitive behavior in tools/search_util function

    Current Behavior

    search = search_util.SearchUtil(creds_path=creds)
    handlers = search.find_event_handlers()
    

    The 2nd line throws AttributeError: 'SearchUtil' object has no attribute 'agent_id'. agent_id isn't a parameter in find_event_handlers.

    Possible Solution

    1. agent_id should be a required constructor param, or
    2. agent_id should be a parameter for find_event_handlers().

    Recommend option 2.
    bug 
    opened by Greenford 3
  • New file exclusively for drive connectors, new sheets functions

    New file exclusively for drive connectors, new sheets functions

    New file for sheets functions; removed the existing ones from the dataframes class and added a new file to separate out the functionality. Added the ability to create new sheets and worksheets, share sheets, and a pipeline to create a new sheet, share it with a set of emails, create a new worksheet, and add data to it from a dataframe.

    Auth types: inherited from environment variables, creds file path, creds dictionary object

    opened by Hgithubacct 3
  • Route Transition function fix

    Route Transition function fix

    Fixed 2 issues:

    1. Fixed edge case error in the "route_groups_to_dataframe" function. Fix: When a Route has no intent under it the code will skip the record instead of displaying an error.
    2. Added "target_page" field to the output of "route_groups_to_dataframe" function
    opened by HEMANTH5439 3
  • Feature/search util update

    Feature/search util update

    In the SearchUtil class, adds a function get_param_presets_df to get a dataframe of all parameter presets in the agent, with optional filtering to a list of flows. This covers a common search request, to look for all places where a particular parameter is set, and what value it is set to. Thanks for your work on this content @skyyeat !

    opened by SeanScripts 1
  • Feature/agent checking

    Feature/agent checking

    Creates AgentCheckerUtil class, with functions for finding reachable/unreachable pages and intents. Some other checking functions could be added in the future. Also adds a function to TestCases to get a dataframe of test case results for an agent, rerunning tests without results, or optionally rerunning all test cases. I had put this in the agent checking class, but decided it made more sense to include it in TestCases directly, since we have similar functions to get dataframes in the other core classes.

    opened by SeanScripts 1
  • Access or Roles requirement given to service account key for integration with DFCX #PermissionDenied #Roles

    Access or Roles requirement given to service account key for integration with DFCX #PermissionDenied #Roles

    I got a service account key from my organization to make use of SCRAPI and automate some functionalities in DFCX, like creating bulk test cases and uploading intents with TPs in bulk. The organization gave some access to that service account, but when I use it to even read the agent using `get_agent()` it throws "PERMISSION DENIED" and basically says that my account doesn't have the right permission to perform this task, without specifying which roles exactly. I have little idea what roles/access the organization has given to the service account, and I found no way to find that out without access to IAM, which I don't have. I would request clarity on what roles a service account needs to perform basic functionalities like uploading intents, creating test cases, etc. I can see that in every example there is a prerequisite to have the "API admin role" assigned to the account, but my organization can't provide me that access. So is that the issue, or are there alternative roles that can compensate for it? Please make use of this link https://cloud.google.com/dialogflow/cx/docs/concept/access-control and state the exact roles needed in a service account to perform basic functionalities in DFCX.

    opened by tima946 1
  • dataframe_functions _make_schema concat dataframe with dictionary

    dataframe_functions _make_schema concat dataframe with dictionary

    Error in _make_schema when concatenating a dataframe with a dictionary.

    Possible Solution

    Implement a better type coercion method; use a dictionary to add to the dataframe.

    Context (Environment)

    Occurs when calling the bulk_create_intent_from_dataframe function, or when running this example: https://github.com/GoogleCloudPlatform/dfcx-scrapi/blob/main/examples/bot_building_series/bot_building_102-intents-with-annotated_tp.ipynb

    bug 
    opened by DtStry 0
  • [FR] Implement Automation to Support Test Driven Design (TDD) in Dialogflow CX Agents

    [FR] Implement Automation to Support Test Driven Design (TDD) in Dialogflow CX Agents

    Is your proposal related to a problem?

    As a user, I need a way to author Dialogflow CX Test Cases outside of the console/IDE.
    Preferably, this can be in a format that is easily human-readable like Google Sheets or YAML.

    Describe the solution you'd like

    • Implement a Test Case Builders class to support creating the Test Case Protos
    • Implement a Test Case Utils class that offers several methods including:
      • Converting a Google Sheet of Test Case data into a List of Test Case Protos ready for upload
      • Converting a YAML file of Test Case data into a List of Test Case Protos ready for upload
      • Methods for parsing inputs from all formats
      • Methods for validating input from all formats

    Additional context

    A primer on Test Driven Development and why it can be a powerful paradigm for designing Dialogflow CX Agents.

    enhancement fr Test Cases 
    opened by kmaphoenix 0
Releases (1.5.1)
  • 1.5.1(Sep 30, 2022)

    Bug Fixes

    • Fixed import typo in NLU Analysis notebook 😁

    Full Changelog: https://github.com/GoogleCloudPlatform/dfcx-scrapi/compare/1.5.0...1.5.1

  • 1.5.0(Sep 30, 2022)

    New Features

    Builder Classes

    In this release, we're introducing a new concept called Builders. 👷🏼‍♂️🏗 🛠

    Motivation: All Dialogflow CX resources are built as Protocol Buffers, or "protos" for short. When you call get_agent or get_intent, you're receiving that resource as an Agent Proto or Intent Proto that you can then manipulate, edit, and push back to Dialogflow CX. Protos can contain other protos, and they can be difficult to work with at times! We built these classes to simplify working with Protos!

    Each Builder class offers a set of CRUD functions that can be performed 100% offline, without any calls to the Dialogflow CX APIs.
    This allows you to programmatically build 100's or 1000's of resources offline, iterate, experiment, and perfect until you're ready to actually push them to your Dialogflow CX Agent via the SCRAPI core classes.

    We've even included a Builders 101 Colab Notebook to kickstart your Builders journey!

    Happy Building! 😄 🏗️
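
    Below is a rough sketch of the intended offline workflow. The IntentBuilder import path and every method name shown are assumptions for illustration; the Builders 101 notebook documents the real API.

    from dfcx_scrapi.builders.intents import IntentBuilder  # assumed import path
    from dfcx_scrapi.core.intents import Intents

    # Build the Intent proto entirely offline -- no Dialogflow CX API calls yet.
    # NOTE: these builder method names are illustrative assumptions.
    builder = IntentBuilder()
    builder.create_new_proto_obj(display_name='head_intent.billing')
    builder.add_training_phrase('I need to pay my bill')
    builder.add_training_phrase('How do I make a payment?')

    # When ready, push the finished proto to the agent via the core class.
    i = Intents(creds_path, agent_id=agent_id)
    i.create_intent(agent_id=agent_id, obj=builder.proto_obj)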


    NLU Utils - Semantic Similarity and Clustering

    We've introduced a new NaturalLanguageUnderstandingUtil Class that is packed full of some amazing NLU Analysis tools to supercharge your bot tuning workflows. Powered by TensorFlow, USEv4 [1] (https://tfhub.dev/google/universal-sentence-encoder/4), and ScaNN [2] (https://github.com/google-research/google-research/tree/master/scann), the class includes analysis methods to perform the following tasks:

    • Identifying Conflicting Training Phrases Across Intents
    • Finding the Most Similar Training Phrases for a Provided Set of Utterances
    • Clustering Utterances that Don't Match Training Phrases

    We've provided an NLU Analysis notebook that allows you to dive right in and start using these features with your Dialogflow CX Agents.

    [1] USEv4 provides us with high quality embeddings that we can utilize for performing Semantic Similarity and Classification tasks.

    [2] The Scalable Nearest Neighbors (ScaNN) method from the Google Research team provides us with SOTA clustering performance.


    Agent Assist

    We're super excited to announce the release of our Agent Assist class! This is our first non-DFCX class to make it into the library as well. After careful consideration, we realized that there are many CCAI Ecosystem APIs that a typical bot developer might need to access that are immediately adjacent to Dialogflow CX. The Agent Assist APIs are used heavily in conjunction with DFCX APIs, so it was a logical choice for us to include them.

    With the new Agent Assist class, you can perform several functions like:

    • CRUD operations for Conversation Profiles
    • CRUD operations on Participants
    • Creating and Completing Conversations

    We'll continue to build out this class to include other AA features like Smart Reply, FAQ Suggestion, and Article Suggestion over time. You can find the top level folder agent_assist adjacent to the dfcx_scrapi folder.


    Session Entity Types

    If you're familiar with System Entity Types and Custom Entity Types but haven't used Session Entity Types yet, then you're in for a treat! 🍬 Session Entity Types are a powerful way to enhance your bot's NLU capabilities at a very granular level - Session by Session. Unlike System and Custom Entity Types which can only be modified during Design Time, Session Entity Types can be modified dynamically during Runtime.

    Once you've instantiated a user Session, you can call the Session Entity Types API to create Session specific entity types that have a TTL for the length of the active Session.

    This allows you to handle use cases like:

    • Loading user-specific data into a session for NLU recognition
    • Performing Entity Expansion on existing Entity Types based on user feedback during the conversation
    • Overwriting existing Entity Types for a single user session
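
    A minimal sketch of the idea; the class name, import path, and method signature below are assumptions for illustration only:

    # Hypothetical sketch -- names and signatures are assumptions, not the documented API.
    from dfcx_scrapi.core.session_entity_types import SessionEntityTypes

    se = SessionEntityTypes(creds_path=creds_path)

    # A session-scoped entity type that only lives for the length of this session
    session_id = f'{agent_id}/sessions/my-unique-session-id'
    se.create_session_entity_type(
        session_id=session_id,
        display_name='account_nicknames',
        entities={'checking-001': ['my checking', 'primary account']},
    )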

    Enhancements

    Intents to DataFrame Transposed

    We've introduced a new Intents method that allows you to export your Intents / Training Phrases transposed from what the normal export achieves. Typically the data is exported row by row like this:

    | intent | training_phrase |
    |---|---|
    | head_intent.billing | I need to pay my bill |
    | head_intent.billing | How do I make a new payment for the statement I received? |
    | head_intent.billing | What's the minimum I can pay right now for my phone bill? |

    In the transposed setup, we provide the data with the Intent Name as the column header, and all of the training phrases below it:

    | head_intent.billing | confirmation.yes |
    |---|---|
    | I need to pay my bill | yes |
    | How do I make a new payment for the statement I received? | yeah |
    | What's the minimum I can pay right now for my phone bill? | for sure |

    The motivation behind this feature is to provide the exported data for your team's linguists, data labelers, and QA team to review from a different perspective. This transposed format can help with use cases like:

    • Identifying "weak" Intents that need more training phrases
    • Identifying "strong" Intents that perhaps have too many training phrases
    • Overall Intent and Training Phrase Balancing and Distribution
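
    To illustrate just the shape change with plain pandas (this is not the new Intents method itself, only a sketch of how a row-per-phrase DataFrame maps to the transposed layout):

    import pandas as pd

    # Row-per-phrase export (the "typical" layout shown above)
    df = pd.DataFrame({
        'intent': ['head_intent.billing', 'head_intent.billing', 'confirmation.yes'],
        'training_phrase': ['I need to pay my bill', 'How do I make a new payment?', 'yes'],
    })

    # Transposed layout: one column per intent, its training phrases listed underneath
    transposed = pd.concat(
        [grp['training_phrase'].reset_index(drop=True).rename(name)
         for name, grp in df.groupby('intent')],
        axis=1,
    )
    print(transposed)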

    Notebooks

    We've provided 5 new example notebooks in this release.

    1. SCRAPI Prebuilt Development Notebook: a great starter notebook if you're unsure where to begin with SCRAPI. It provides some foundational imports and ends with a blank canvas to allow you to explore, script, and build based on your needs.
    2. Builder 101 - Using Builder Classes to Build Bots Programmatically: a primer on the new Builder classes that will get you up to speed in no time!
    3. Exporting Custom Entities to Google Sheets: a quick how-to guide on getting Custom Entity Types and Synonyms out of your bot and into Google Sheets to share, analyze, and iterate with your team.
    4. NLU Analysis - Semantic Similarity and Clustering: using machine learning to improve machine learning! This notebook provides 3 great ML-based approaches to tuning your Intents and Training Phrases.
    5. Intent Analysis Using Levenshtein Ratios: another great analysis notebook that leverages Levenshtein Distance to identify conflicts between intents.

    Bug Fixes

    • Fixed a bug in the Experiments class that incorrectly set the client_options variable
    • Fixed a bug in entity_type_proto_to_dataframe that was incorrectly concatenating a Pandas DataFrame with a Dictionary, instead of 2 Pandas DataFrames
    • Fixed a bug in the Levenshtein class that referenced non-existent keys in an upstream DataFrame

    Misc

    • Lots of style updates across the library for consistency
    • Doc string updates across the library to provide better explanation of methods and functions

    What's Changed

    • Feature/intent to df transposed by @MRyderOC in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/73
    • Correctly set region using environment_path by @cbradgoog in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/77
    • entity to dataframe fix by @DtStry in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/76
    • Adding methods for extracting test coverage by @hkhaitan1 in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/75
    • Feature/intent builder by @MRyderOC in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/74
    • Feature/session entity types by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/84
    • Feature/agent and entity builder by @MRyderOC in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/80
    • 81 fr static analysis class and notebooks by @cgibson6279 in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/83
    • Feature/notebook examples by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/86
    • Feature/style cleanup by @SeanScripts in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/82

    New Contributors

    • @cbradgoog made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/77
    • @DtStry made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/76
    • @hkhaitan1 made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/75

    Full Changelog: https://github.com/GoogleCloudPlatform/dfcx-scrapi/compare/1.4.0...1.5.0

  • 1.4.0(Jul 12, 2022)

    New Features

    Import / Export Flow in Bytes

    A new method was added in flows.py allowing the user to capture the Flow object as an inline byte stream. This feature enables some CI/CD tooling that is being developed and will be released soon! Great addition ➕ from @Omerside !

    Enhancements

    NLU Regression Testing

    We've updated the functionality of the run_intent_detection method in conversation.py to provide more verbose information in the resulting DataFrame. Previously, the "Match Type" information had to be derived from the results. We can now access each of these signals directly, including PARAMETER_FILLING, which allows for more granular NLU Regression testing. Big thanks 🎉 to @Greenford for getting this put together! Full details here.

    Intents to DataFrame Refactor

    We've performed a refactor of bulk_intent_to_df to provide more consistency with the output results. Previously, there were 2 "modes" which provided different output types:

    • basic, which output a DataFrame
    • advanced, which output a Dictionary containing 2 DataFrames

    For output type consistency, we've consolidated all of the information from the advanced mode into a single DataFrame. All of the information that existed previously is available in the new, single DataFrame. Kudos 👏 to @MRyderOC for cleaning this up!

    Pandas Append Deprecation

    We've made various modifications to replace the deprecated DataFrame.append calls across the library with the recommended pd.concat syntax. This also provides us with some computation speed increases in some areas, which is nice! Thanks for this cleanup 🧹 @sidpagariya !

    Bug Fixes

    • Fixed a typo in transition_route_groups.py
    • Fixed an issue in dataframe_functions.py on the _update_intent_from_dataframe method where parameter_id was not appropriately populated when annotations were not provided. Thanks for the catch / fix Lambert!

    What's Changed

    • Comments. by @Omerside in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/62
    • added import/export flow using raw bytes by @Omerside in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/60
    • Fix update_transition_route_group by @SeanScripts in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/58
    • Update pandas append function for dataframes to pandas concat to suppress Pandas deprecation warning by @sidpagariya in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/63
    • Feature/detect param by @Greenford in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/59
    • Advanced mode did not manage properly unannotated training phrases by @lambertaurelle in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/45
    • Refactor/intents to df by @MRyderOC in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/67
    • Lint Fixes for Recent File Changes by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/68

    New Contributors

    • @SeanScripts made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/58
    • @sidpagariya made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/63
    • @lambertaurelle made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/45

    Full Changelog: https://github.com/GoogleCloudPlatform/dfcx-scrapi/compare/1.3.0...1.4.0

  • 1.3.0(Apr 5, 2022)

    New Features

    Pull All Fulfillments

    A new method called SearchUtil.get_agent_fulfillment_message_df has been implemented, which allows you to extract all Fulfillment Messages from an agent and pull them into a DataFrame for easy reference.

    There is also a secondary method, SearchUtil.get_raw_agent_fulfillment_df, that allows you to pull fulfillment messages with their original unpacked structure.
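
    A short usage sketch; the two method names come from this release, while passing agent_id to the constructor and the call signatures are assumptions:

    from dfcx_scrapi.tools.search_util import SearchUtil

    search = SearchUtil(creds_path=creds_path, agent_id=agent_id)

    # Flattened view of every Fulfillment Message in the agent
    fulfillment_df = search.get_agent_fulfillment_message_df(agent_id)

    # Same data, keeping the original unpacked structure
    raw_fulfillment_df = search.get_raw_agent_fulfillment_df(agent_id)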

    Entity Types to DataFrame

    You can now extract all Entity Types and metadata contents into a DataFrame!

    The new method has 2 modes, just like its Intent counterpart:

    • basic mode will provide a simple DataFrame of Entity Type and their corresponding synonyms
    • advanced mode will provide 2 DataFrames:
      • entity_types which includes everything from the basic mode, plus additional metadata flags
      • excluded_phrases which includes the mapping of all excluded phrases, if applicable

    Principal Auth for Google Sheets

    Methods in DataframeFunctions that interact with Google Sheets now accept GCP Principal account auth! When instantiating the class, pass the flag for principal=True to launch the OAuth process.

    The OAuth process requires your GCP project to have certain APIs enabled as well as a downloaded OAuth client credential.
    You can find more on setting up this process for your GCP Project here
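
    For example, a minimal sketch; the principal flag is described above, while the sheet-reading call and its arguments are assumptions:

    from dfcx_scrapi.tools.dataframe_functions import DataframeFunctions

    # principal=True launches the OAuth flow for your GCP principal account
    # instead of relying on a Service Account key file.
    dffx = DataframeFunctions(principal=True)

    # Read a Google Sheet into a DataFrame (method and argument names are assumptions)
    df = dffx.sheets_to_dataframe(sheet_name='my_intent_sheet', worksheet_name='phrases')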

    Enhancements

    • Conversation.reply now correctly unpacks all Proto structures (embedded Structs / Arrays) into more pythonic structures (Dicts / Lists)
    • You can now get a Flow by display name in the Flows class
    • UtteranceGeneratorUtils.create_test_dataset now generates a DataFrame that is formatted and in alignment for use with other methods across the library.

    Bug Fixes

    • Operations.get_lro now uses the correct region/location based on the input LRO ID
    • Fixed an issue in Conversation.reply where inputs were being truncated at 250 instead of the default 256 char limit

    What's Changed

    • Conversation Class Enhancements and Bug Fixes by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/52
    • Cleanup/search util by @Greenford in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/43
    • A way to pull and organize fulfillments in an agent for analyzing gra… by @Hgithubacct in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/22
    • added get_by_display_name, cleaned up other stuff by @Omerside in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/42
    • create_test_dataset fixed by @cgibson6279 in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/51
    • Feature/entity types to data frame by @MRyderOC in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/48
    • add inherited auth for GSheets by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/53
    • fixed regionalization for get_lro by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/54
    • lint fixes; pre-1.3.0 release by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/55

    New Contributors

    • @cgibson6279 made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/51
    • @MRyderOC made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/48

    Full Changelog: https://github.com/GoogleCloudPlatform/dfcx-scrapi/compare/1.2.2...1.3.0

  • 1.2.2(Mar 1, 2022)

    What's Changed

    • Added get_webhook, update_webhook, small cleanup to Webhooks class by @Omerside in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/38
    • Select Environment when Exporting Agent by @Omerside in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/37
    • minor doc update; moving method for vis by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/40
    • Patch/agents minor updates by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/41
    • Fixed bug in route_groups_to_dataframe that would cause method to fail if intent was missing from route by @Greenford in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/39

    New Contributors

    • @Greenford made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/39

    Full Changelog: https://github.com/GoogleCloudPlatform/dfcx-scrapi/compare/1.2.1...1.2.2

  • 1.2.1(Feb 18, 2022)

    What's Changed

    • Conversation Clean Up and Minor Feature Adds by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/36

    Full Changelog: https://github.com/GoogleCloudPlatform/dfcx-scrapi/compare/1.2.0...1.2.1

  • 1.2.0(Feb 8, 2022)

    Features

    Versions

    You can now manage DFCX Versions directly from SCRAPI which includes all basic CRUD functions.

    from dfcx_scrapi.core.versions import Versions
    from dfcx_scrapi.core.flows import Flows
    
    f = Flows(creds_path)
    v = Versions(creds_path)
    flows_map = f.get_flows_map(agent_id, reverse=True)
    
    # List Versions
    all_versions = v.list_versions(flows_map['Default Start Flow'])
    
    # Get Version by Display Name
    ver = v.get_version_by_display_name(display_name='dsf v2', flow_id=flows_map['Default Start Flow'])
    
    # Load Version
    lro = v.load_version(ver)
    
    # Create Version
    lro = v.create_version(flows_map['Default Start Flow'], 'patrick api ver', 'this came from SCRAPI')
    
    # Compare Versions (v0 and v1 are Version objects, e.g. from all_versions above)
    res = v.compare_versions(v0, v1, flows_map['Default Start Flow'])
    
    # Delete Version
    v.delete_version(v1)
    
    • Feature/versions by @Omerside in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/23

    Environments

    As a complement to Versions, the DFCX Environments functionality has also been added to SCRAPI. Automatically manage the promotion of Flow Versions into various Environments within DFCX with this Class.

    from dfcx_scrapi.core.environments import Environments
    from dfcx_scrapi.core.operations import Operations
    
    ops = Operations(creds_path)
    envs = Environments(creds_path)
    
    # List Envs
    all_envs = envs.list_environments(agent_id)
    prod_env = all_envs[0]
    
    # Get Env
    prod_env = envs.get_environment(prod_env.name)
    
    # Get Env by Display Name
    demo_env = envs.get_environment_by_display_name('PROD', agent_id)
    
    # Create Environment
    lro = envs.create_environment_by_display_name(
        display_name='DEMO2',
        version_configs=[
            ('Default Start Flow', 2),
            ('Confidence Demo',3),
            ('Sentiment Demo',1),
            ('Response Map Demo',1),
            ('Lists Demo',1),
            ('Multi Param Demo',1),
            ('Proper Names Demo',4),
            ('Entity Routing',1),
            ('[Redaction] Demo',1)
            ],
        description='This is my DEMO environment',
        agent_id=agent_id)
    
    # Update Environment
    lro = envs.update_environment(demo_env.name, display_name='DEMO_API_UPDATE')
    
    # Delete Environment
    envs.delete_environment(demo_env.name)
    
    # Deploy Flow to Environment
    lro = envs.deploy_flow_to_environment(demo_env.name, flow_ver_id)
    
    # Lookup Environment History
    history = envs.lookup_environment_history(demo_env.name)
    
    # List Continuous Test Results
    ct_res = envs.list_continuous_test_results(demo_env.name)
    
    • Feature/environments by @Omerside in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/26

    Security Settings

    You can now manage CCAI Security Settings directly from SCRAPI! The CCAI Security Settings feature allows you to apply Data Loss Prevention (DLP) templates, Redaction Strategies, and Retention Policies for conversations that are handled by Dialogflow CX.

    from dfcx_scrapi.core.security_settings import SecuritySettings
    
    ss = SecuritySettings(creds_path)
    
    # List Security Settings
    all_ss = ss.list_security_settings(location_id)
    
    # Get Security Settings
    my_ss = ss.get_security_settings(all_ss[0].name)
    
    # Create Security Settings from Obj
    my_new_ss = my_ss
    my_new_ss.display_name = "SCRAPI SS UPDATE"
    my_new_ss.retention_window_days = 10
    result = ss.create_security_settings(location_id, my_new_ss)
    
    # Update Security Settings
    result = ss.update_security_settings(my_ss.name, retention_window_days=30, display_name='SCRAPI SS API UPDATE')
    
    # Delete Security Settings
    result = ss.delete_security_settings(my_ss.name)
    
    • Feature/security settings by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/28

    Utterance Generator Utils

    This class is an extension of the UtteranceGenerator ML class introduced in Release v1.1.0.
    This class provides additional methods that wrap the base class and allow the user to perform tasks like:

    • Create New Training Phrases
    • Create Test Dataset
    • Create Synthetic Dataset

    Each of these are ML-powered, synthetically generated datasets meant to bootstrap your bot building experience or further automate the testing and QA of your bots.

    from dfcx_scrapi.tools.utterance_generator_util import UtteranceGeneratorUtils
    ug = UtteranceGeneratorUtils(creds_path=creds_path)
    
    # Create New Training Phrases
    df = ug.create_new_training_phrases(agent_id, ['context.people_names'], 10)
    df.head(2)
    
    # Create Test Dataset
    df = ug.create_test_dataset(agent_id, ['context.people_names'], 10)
    df.head(2)
    
    # Create Synthetic Dataset
    df = ug.create_synthetic_dataset(agent_id, ['context.people_names'], 10)
    df.head(2)
    
    • Feature/utterance_gen_util by @Hgithubacct in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/15

    Bug Fixes

    • adding src/dfcx_scrapi/core_async/__init__.py by @jmound in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/21

    Enhancements

    • Deprecated various methods across classes that were dependent on Python Requests library and implemented the methods based on the core Dialogflow CX library
    • Updated Dataframe outputs in Intents.bulk_intent_to_df() to provide a more consistent naming schema throughout the library

    Docs

    • Updated licensing throughout library
    • Various Doc strings and Styles updated throughout the library
    • Renamed tools/utterance_generator_utils.py -> tools/utterance_generator_util.py for name space consistency
    • Renamed tools/webhook_utils.py -> tools/webhook_util.py for name space consistency
    • Moved soop/project.py -> core/project.py
    • Deleted unused soop folder
    • Renamed tools/analysis_util -> tools/levenshtein.py for more appropriate visibility

    New Contributors

    • @jmound made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/21
    • @Hgithubacct made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/15
    • @Greenford made their first contribution in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/15
  • 1.1.1(Dec 30, 2021)

    Minor Patch - What's Changed

    • Adding bug report template by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/16
    • updating doc strings in create_intent by @kmaphoenix in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/17
    • add language_code to bulk create intent from df by @zya-codes in https://github.com/GoogleCloudPlatform/dfcx-scrapi/pull/14

    Full Changelog: https://github.com/GoogleCloudPlatform/dfcx-scrapi/compare/1.1.0...1.1.1

  • 1.1.0(Dec 21, 2021)

    Features

    Semantic Clustering

    We've introduced a new class that provides the ability to perform Semantic Clustering on Training Phrases that are currently in Dialogflow CX, or that you plan to put in Dialogflow CX. The motivation behind this class is to provide a base "engine" for building tools that give Bot Builders and Designers easier methods for determining where their Intents may be "heavy" or "light" with training data. By performing Semantic Clustering, you can quickly pinpoint clusters of Training Data to be groomed from your CX Agent that could be causing conflicts with other Intents.

    We'll be extending this class with more tools in the future.
    Try out the new features here: SemanticClustering

    Utterance Generator

    We're excited to introduce another new class that enables users to quickly generate synthetic Training Phrases based on a single input phrase. The motivation behind this class was to provide a way for Bot Builders and Designers to "bootstrap" their Dialogflow CX Agents with an appropriate amount of training data to move to a Production deployment. Once in Production, synthetic data could be swapped out with "real-world" data as it is collected and analyzed for a more fine tuned experience.

    In addition, this class can be used to "bolster" Intents that have very few Training Phrases. This is especially helpful when you have high variance in your Training Phrase distribution among Intents, in particular when the # of Training Phrases in your smallest Intents is not at least 10% of the # of Training Phrases in your largest intents.

    We'll be extending this class with more tools in the future.
    Try out the new features here: UtteranceGenerator

    Test Cases (Sync and Async)

    Test Cases are finally here! SCRAPI now has support for all Dialogflow CX Test Case features and functionality in both synchronous and asynchronous classes. The motivation behind including the Async class was primarily the large scale to which Test Cases can grow, which can cause running a large set of Test Cases to take quite some time. Allowing these to be run asynchronously provides the developer more flexibility to retrieve the results when they are ready, rather than waiting for a large batch of them to return all at once.

    You can find the new classes here:
    TestCases TestCasesAsync

    Bulk Update Entity From DataFrame

    Added the ability to provide bulk updates to Entities via DataFrame, similar to the existing bulk create functionality.

    Bug Fixes

    • Fixed a bug in core/conversations.py that forced the input of creds_path arg

    Enhancements

    • Added language_code support to TransitionRouteGroups.update_transition_route_group()
    • Added language_code support to EntityTypes.create_entity_type()
    • Added language_code support to DataframeFunctions.bulk_create_entity_from_dataframe()
    • Added language_code support to EntityTypes.update_entity_type()

    Docs

    • Updated docs for EntityType.create_entity_type()
    • Updated docs for DataframeFunctions.bulk_create_entity_from_dataframe()
    • Updated requirements.txt

    Misc.

    • README updates
    • Minor .gitignore updates
  • 1.0.7(Oct 26, 2021)

    Features

    Multi Threaded Intent Detection

    The DialogflowConversation Class now supports Multi Threaded Intent Detection via Python threading!
    This will enable hundreds or thousands of tests to be run in parallel, significantly expediting Intent Detection tasks for QA and Operation teams. See DialogflowConversation.run_intent_detection() for more information.
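
    A brief sketch of a parallel regression run; run_intent_detection, chunk_size, and rate_limit are named in these notes, while the constructor arguments and the test_set column layout are assumptions:

    import pandas as pd
    from dfcx_scrapi.core.conversation import DialogflowConversation

    # Constructor arguments are an assumption for illustration
    dc = DialogflowConversation(creds_path=creds_path, agent_id=agent_id)

    # One row per test utterance; the exact expected columns are an assumption
    test_set = pd.DataFrame({
        'flow_display_name': ['Default Start Flow', 'Default Start Flow'],
        'page_display_name': ['Start Page', 'Start Page'],
        'utterance': ['I need to pay my bill', 'talk to an agent'],
    })

    # Runs detect-intent calls in parallel threads; chunk_size and rate_limit
    # control how much data is sent to the Detect Intent API at once.
    results = dc.run_intent_detection(test_set=test_set, chunk_size=100, rate_limit=10)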

    Intents Language Code Support

    All methods in the Intents Class have been updated to include language_code as an optional arg (where applicable) to allow for Intent CRUD operations across multi-lingual CX Agents.

    Bug Fixes

    • Fixed an issue in StatsUtil.stats() that caused the agent_id to not be passed into the underlying functions appropriately
    • Fixed broken link in examples/template.ipynb

    Enhancements

    • Additional error handling in ChangeHistory.change_history_to_dataframe() to account for the scenario where no Change History results are present in the provided CX Agent
    • Added chunk_size and rate_limit args to DialogflowConversation.run_intent_detection() to allow for more granular control over the amount of data sent to the Detect Intent API in parallel
    • Added Boolean flag for checkpoints in DialogflowConversation.reply() to provide an option for users to obtain timestamp information for debugging purposes

    Docs

    • Updated docstrings for DataframeFunctions.bulk_update_intent_from_dataframe() which was missing language_code as an optional arg

    Misc.

    • Updated Docker image to be specific vs. latest due to issues in 3.10.x
    • Updated Makefile
  • 1.0.6(Oct 5, 2021)

    Features

    • Added AnalysisUtils, a Class for building methods and functions related to the in-depth comparison and analysis of Intents and Training Phrases in Dialogflow CX Agents
    • Added calc_tp_distances() function in AnalysisUtils which performs Levenshtein Distance and Ratio comparison between 2 intents

    Bug Fixes

    • Fixed an issue in SearchUtil.find_list_parameters() that was causing a failure when iterating through the Flows map

    Enhancements

    • Restructured the examples/ folder to contain a hierarchy based on the intended use of the Notebooks and scripts
    • Added an .ipynb template to use as a guideline and best practice for authoring new examples to include in the repo
    • Added several new Python Notebook examples with instructions in the examples/ folder
    • Updated all Python Notebooks to adhere to new template structure
  • 1.0.5(Sep 10, 2021)

    Features

    • Added 2 new methods to search_util.py
      • find_true_routes extracts all Page and Route data from a given agent and checks that there are valid "exit" paths from every page, ensuring that the user is able to navigate past the Page and not remain "stuck".
      • find_event_handlers extracts all of the Event Handlers from every Resource level throughout the agent (i.e. Flow, Page, Parameter, etc.) and displays metadata about their associated event. The motivation for this is to allow a user to identify patterns that may be detrimental to bot performance or user experience. (A usage sketch follows this list.)
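
    A quick usage sketch; constructing SearchUtil with agent_id and calling the two methods with no arguments are assumptions based on the snippets elsewhere on this page:

    from dfcx_scrapi.tools.search_util import SearchUtil

    search = SearchUtil(creds_path=creds_path, agent_id=agent_id)

    # All Page and Route data, with a check that every page has a valid "exit" path
    true_routes_df = search.find_true_routes()

    # Every Event Handler at the Flow / Page / Parameter level, with event metadata
    event_handlers_df = search.find_event_handlers()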

    Bug Fixes

    • Fixed a bug in operations.py that was preventing pre-defined credentials from being inherited from the base class

    Enhancements

    • reply method in conversation.py can now accept a current_page arg that allows the user to specify the Page they would like to start the conversation from, if not the Default Start Page.

    Misc.

    • Minor updates to CI config to modify when e2e tests should run
  • 1.0.4(Aug 31, 2021)

    Features

    • Added update_entity_type() method to entity_types.py
    • Added new functionality to bulk_intent_to_df in intents.py that allows a user to define a subset of intents to extract from an Agent instead of extracting all Intents and Training Phrases
    • Added a new method in intents.py called modify_training_phrase_df() that is meant to be used in tandem with the following functions:
      • bulk_intent_to_df(mode='advanced') located in intents.py
      • bulk_update_intents_from_dataframe() located in dataframe_functions.py

    The new modify_training_phrase_df() method allows you to provide a third Pandas DataFrame that defines "Actions" you want to take on a subset of the Intent and Training Phrase combinations in your ["phrases"] dataframe returned from bulk_intent_to_df(mode='advanced'). The two primary actions you can perform are add and delete. A move can be expressed as the combination of delete + add.

    For example, if I wanted to move a Training Phrase hi from IntentA to IntentB I would perform the following actions:

    • delete, hi, IntentA
    • add, hi, IntentB

    The combination of the above is essentially a move function.

    Bug Fixes

    • Fixed an issue in route_groups_to_dataframe that caused a failure when the Fulfillment message length was < 1

    Enhancements

    • Added support for all fulfillment option types in route_groups_to_dataframe in transition_route_groups.py including custom_payload, liveAgentHandoff, conversationSuccess, playAudio, outputAudioText
    • Improved support for fulfillment_message type so that it supports single item fulfillment or lists as supported in the UI. If the UI designer provides more than one fulfillment message, the dataframe will provide the messages back in list format

    Misc.

    • Fixed relative links in README

    Code Samples

    Easy Entity Update

    e_map = e.get_entities_map(agent_id=scratch_agent, reverse=True)
    e.update_entity_type(e_map['people'], display_name='people_updated')
    

    Complex MACDs for Intents and Training Phrases

    # Grab your Intents
    my_intents = i.bulk_intent_to_df(agent_id=scratch_agent, mode='advanced')
    
    # Define your Actions
    actions = pd.DataFrame(
        columns=['display_name', 'phrase', 'action'],
        data=[
            ['Default Welcome Intent', 'hi', 'delete'],
            ['Default Welcome Intent', 'what is up, how are you', 'add'],
            ['Default Welcome Intent', 'yes', 'delete'],
            ['support.yes', 'yes', 'add']
        ]
    )
    
    # Modify your Training Phrases
    myactions = i.modify_training_phrase_df(actions, my_intents['phrases'])
    
    # Push your Updates!
    dffx.bulk_update_intents_from_dataframe(
        agent_id=scratch_agent,
        tp_df=myactions['updated_training_phrases_df'],
        params_df=my_intents['parameters'],
        update_flag=True)
    
  • 1.0.3(Aug 24, 2021)

    Features

    • Updated output DataFrame in intents.bulk_intent_to_df() to match the same schema used in the DataframeFunctions class for continuity
    • Updated agents.list_agents() to optionally accept location_id or project_id. The former will provide expedited compute time and more control over the specific Region of agents that is returned. The latter is a "lazy" method and will iterate through all regions and gather any available agents into a larger List
    • Added agents.get_agent_by_display_name() that provides the user with several options for retrieving an Agent by Display name. See docstrings for details.

    Docs

    • Added improved docstrings for several methods that return Long Running Operations (LROs) to alert the user to use the method operations.get_lro() to retrieve the status of the LRO

    Tests

    • Added test cases in support of agents.get_agent_by_display_name()

    Misc.

    • Updates to CI yaml
    • Updates to .gitignore
    • Minor lint fixes
  • 1.0.2(Aug 19, 2021)

    Features

    • Added support for language_code inside of update_intent and bulk_update_intents_from_dataframe methods
    • Added data/ccai_service_kit artifacts

    This set of artifacts contains an example Dialogflow CX .blob file that can be imported directly into your project to test out various pre-built features of Dialogflow CX. The agent file is accompanied by an example Cloud Functions package that serves as the backend Webhook for one of the use cases in the sample agent.

    In order to use the pre-built Webhook:

    1. Download the webhook artifacts folder in data/ccai_service_kit/conf_score_cfx
    2. Choose one of the Cloud Functions Deployment Options to deploy the code to your GCP Project
    3. Copy the HTTP Trigger URL from the Cloud Function in GCP Console > Cloud Functions > Your Function Name > Trigger > Trigger URL
    4. Replace the Webhook URL in your Dialogflow CX Agent with your new Trigger URL by browsing to your Agent and going to Manage > Webhooks > Your Webhook > Webhook URL
  • 1.0.1(Aug 18, 2021)

    Bug Fixes

    • Fixed an issue in bulk_create_intent_from_dataframe where the meta object was being checked for basic mode even though the meta object is not needed

    Features

    • Added basic bot building examples for Jupyter notebooks and Python that show how to build a Dialogflow CX agent from scratch using simple text inputs and a CSV file.
    • added bot_building_101.ipynb
    • added bot_building_101.py
Owner: Google Cloud Platform