Kubernetes-native workflow automation platform for complex, mission-critical data and ML processes at scale. It has been battle-tested at Lyft, Spotify, Freenome, and others and is truly open-source.

Overview

Flyte and LF AI & Data Logo

Flyte

Flyte is a workflow automation platform for complex, mission-critical data and ML processes at scale

Current Release Sandbox Build End-to-End Tests License Commit Activity Commits since Last Release GitHub Milestones Completed GitHub Next Milestone Percentage Docs Twitter Follow Flyte Helm Chart Join Flyte Slack

Home Page · Quick Start · Documentation · Live Roadmap · RFCs & Proposals · Features · Community & Resources · Changelogs · Components

💥 Introduction

Flyte is a structured programming and distributed processing platform that enables highly concurrent, scalable, and maintainable workflows for Machine Learning and Data Processing. It is a fabric that connects disparate computation backends using a type-safe data dependency graph. It records all changes to a pipeline, making it possible to rewind time. It also stores a history of all executions and provides an intuitive UI, CLI, and REST/gRPC API to interact with the computation.

Flyte is more than a workflow engine -- it uses workflow as a core concept and task (a single unit of execution) as a top-level concept. Multiple tasks arranged in a data producer-consumer order create a workflow.

Workflows and Tasks can be written in any language, with out-of-the-box support for Python, Java, and Scala. Flyte was designed to manage the complexity that arises in data and ML teams and to help them maintain a high velocity of delivering business-impacting features. One way it achieves this is by separating the control plane from the user plane: every organization can offer Flyte as a service to its end users, with the service managed by infrastructure-focused engineers while users interact through the intuitive interface of flytekit.

Five Reasons to Use Flyte

  • Kubernetes-Native Workflow Automation Platform
  • Ergonomic SDKs in Python, Java & Scala
  • Versioned, Auditable & Reproducible Pipelines
  • Data-Aware and Strongly Typed
  • Resource-Aware and Deployed to Scale with Your Organization

🚀 Quick Start

With Docker and Flytectl installed, run the following command:

  flytectl sandbox start

This creates a local Flyte sandbox. Once the sandbox is ready, you should see the following message: Flyte is ready! Flyte UI is available at http://localhost:30081/console.

Visit http://localhost:30081/console to view the Flyte dashboard.

Here's a quick visual tour of the console:

Flyte console Example

To dig deeper into Flyte, refer to the Documentation.

⭐️ Current Deployments & Contributors

🛣️ Live Roadmap

The live roadmap for the project can be found at the GitHub Live Roadmap.

🔥 Features

  • Used at scale in production by 500+ users on a single deployment and in production at multiple firms; proven to scale to more than 1 million executions and 40+ million containers
  • Data-aware and resource-aware (allows organizations to separate concerns: users work through the API, while platform/infra teams manage deployments and scaling)
  • Enables collaboration across your organization by:
    • Executing distributed data pipelines/workflows
    • Making it easy to stitch together workflows from different teams and domain experts and share them across teams
    • Comparing results of training workflows over time and across pipelines
    • Simplifying the complexity of multi-step, multi-owner workflows
  • Get Started quickly -- start locally and scale to the cloud instantly
  • gRPC / REST interface to define and execute tasks and workflows
  • Type safe construction of pipelines -- each task has an interface characterized by its input and output, so illegal construction of pipelines fails during declaration, rather than at runtime
  • Supports multiple data types for machine learning and data processing pipelines, such as Blobs (images, arbitrary files), Directories, Schema (columnar structured data), collections, maps, etc.
  • Memoization and Lineage tracking
  • Provides logging and observability
  • Workflow features:
    • Start with one task, convert to a pipeline, attach multiple schedules, trigger using a programmatic API, or on-demand
    • Parallel step execution
    • Extensible backend to add customized plugin experience (with simplified user experience)
    • Branching
    • Workflow of workflows - subworkflows (a workflow can be embedded within one node of the top-level workflow)
    • Distributed remote child workflows (a remote workflow can be triggered and statically verified at compile time)
    • Array Tasks (map a function over a large dataset -- ensures controlled execution of thousands of containers)
    • Dynamic workflow creation and execution with runtime type safety
    • flytekit plugins with first-class support in Python
    • Arbitrary flytekit-less container tasks (RawContainer)
  • Guaranteed reproducibility of pipelines
  • Multi-cloud support (AWS, GCP, and others)
  • No single point of failure, and is resilient by design
  • Automated notifications to Slack, Email, and Pagerduty
  • Multi K8s cluster support
  • Out of the box support to run Spark jobs on K8s, Hive queries, etc.
  • Snappy Console & Golang CLI (flytectl)
  • Written in Golang and optimized for long-running jobs
  • Grafana templates (user/system observability)
  • Deploy with Helm and kustomize

🔌 Available Plugins

📦 Component Repos

| Repo | Language | Purpose | Status |
| --- | --- | --- | --- |
| flyte | Kustomize, RST | deployment, documentation, issues | Production-grade |
| flyteidl | Protobuf | gRPC/REST API, workflow language spec | Production-grade |
| flytepropeller | Go | execution engine | Production-grade |
| flyteadmin | Go | control plane | Production-grade |
| flytekit | Python | Python SDK and tools | Production-grade |
| flyteconsole | Typescript | Flyte UI | Production-grade |
| datacatalog | Go | manage input & output artifacts | Production-grade |
| flyteplugins | Go | Flyte backend plugins | Production-grade |
| flytecopilot | Go | sidecar to manage input/output for SDK-less tasks | Production-grade |
| flytestdlib | Go | standard library | Production-grade |
| flytesnacks | Python | examples, tips, and tricks | Maintained |
| flytekit-java | Java/Scala | Java & Scala SDK for authoring Flyte workflows | Incubating |
| flytectl | Go | standalone Flyte CLI | Production-grade |
| homebrew-tap | Ruby | tap for downloadable Flyte tools (CLI, etc.) | Production-grade |
| bazel-rules | Skylark/Py | use Bazel to build Flyte workflows and tasks | Incubating |

Functional Tests Matrix

We run a suite of tests (defined in https://github.com/flyteorg/flytesnacks/blob/master/cookbook/flyte_tests_manifest.json) to ensure that basic functionality and a subset of the integrations work across a variety of release versions. These tests run in a cluster where specific versions of the Flyte components, such as console, flyteadmin, datacatalog, and flytepropeller, are installed. The table below has the release versions as columns and the result of each test suite as rows.

| workflow group | nightly |
| --- | --- |
| core | core |
| integrations-hive | integration-hive |
| integrations-k8s-spark | integrations-k8s-spark |
| integrations-kfpytorch | integrations-kfpytorch |
| integrations-pod | integrations-pod |
| integrations-pandera_examples | integrations-pandera_examples |
| integrations-papermilltasks | integrations-papermilltasks |
| integrations-greatexpectations | integrations-greatexpectations |
| integrations-sagemaker-pytorch | integrations-sagemaker-pytorch |
| integrations-sagemaker-training | integrations-sagemaker-training |

🛣️ RFCs (Requests for Comments) & Proposals

Flyte is community-driven and community-owned software. It is managed by a steering committee and encourages collaboration. The community has a long roadmap for Flyte, but there may be other interesting ideas, extensions, or additions that you want to propose. This usually starts with one of the following:

  • GitHub Issue - we maintain issues for all repos in the main flyte repo.
  • Writing down your proposal using the documented RFC process.
  • For small changes, RFCs are not required; for larger changes, they are encouraged. Of course, drop into the Slack channel and talk to the community if you want to test the waters before proposing.

🤝 Community & Resources

Here are some resources to help you learn more about Flyte.

Communication Channels

Biweekly Community Sync

  • 📣 Flyte OSS Community Sync Every other Tuesday, 9am-10am PDT. Check out the calendar, and register to stay up-to-date with our meeting times. Or join us on Zoom.
  • Upcoming meeting agenda, previous meeting notes, and a backlog of topics are captured in this document.
  • If you'd like to revisit any previous community sync meetings, you can access the video recordings on Flyte's YouTube channel.

Blog Posts

Newsletter

Conference Talks & Podcasts

Conference

  • KubeCon 2019 - Flyte: Cloud Native Machine Learning and Data Processing Platform video | deck
  • KubeCon 2019 - Running Large-Scale Stateful Workloads on Kubernetes at Lyft video
  • re:Invent 2019 - Implementing ML workflows with Kubernetes and Amazon SageMaker video
  • Cloud-native machine learning at Lyft with AWS Batch and Amazon EKS video
  • OSS + ELC NA 2020 splash
  • Datacouncil video | splash
  • FB AI@Scale Making MLOps & DataOps a reality
  • GAIC 2020
  • OSPOCon 2021:
    • Building and Growing an Open Source Community for an Incubating Project video
    • Enforcing Data Quality in Data Processing and ML Pipelines with Flyte and Pandera video
    • Self-serve Feature Engineering Platform Using Flyte and Feast video
    • Efficient Data Parallel Distributed Training with Flyte, Spark & Horovod video
  • KubeCon+CloudNativeCon North America 2021 - How Spotify Leverages Flyte To Coordinate Financial Analytics Company-Wide session
  • PyData Global 2021 - Robust, End-to-end Online Machine Learning Applications with Flytekit, Pandera and Streamlit session
  • ODSC West Reconnect - Deep Dive Into Flyte workshop

Podcasts

💖 All Contributors

A big thank you to the community for making Flyte possible!


Comments
  • [Docs] Flytectl docs cleanup

    [Docs] Flytectl docs cleanup

    Describe the documentation update

    • Fix typos and grammar in the flytectl docs
    • Format code and content so the docs render properly
    • Add any missing command docs, plus more examples where possible (optional)
    • Make sure flag descriptions use consistent capitalization
    • Make the docs consistently use the flytectl command instead of bin/flytectl, which appears in some places
    • Provide output for the example commands

    On the whole, improve flytectl user experience through docs!

    GitHub repo(s) flytectl

    documentation flytectl hacktoberfest 
    opened by samhita-alla 27
  • [Flytectl Feature] Add field selector for all get commands

    [Flytectl Feature] Add field selector for all get commands

    Describe the feature/command for flytectl: Currently, the get commands for the various Flyte resources (projects, tasks, workflows, launch plans, executions, etc.) return the entire result set, or are bounded by a Limit query so as not to fetch the entire dataset for that object type.

    The ask of this feature is to implement a field-selector interface for flytectl that lets users narrow the result set down to entries matching the field-selector criteria.

    This is similar to kubectl field selector. https://kubernetes.io/docs/concepts/overview/working-with-objects/field-selectors/

    Provide a possible output or UX example: Once implemented, the user should be able to fetch projects, tasks, workflows, launch plans, and executions based on the field selector. An invalid field selector should throw an error or provide details on which fields are available for selection.

    e.g., this one returns all the executions in the domain and project:

    {
        "executions": [
            {
                "id": {
                    "project": "flytesnacks",
                    "domain": "development",
                    "name": "er7xaa2vxf"
                },
                "spec": {
                    "launch_plan": {
                        "resource_type": "LAUNCH_PLAN",
                        "project": "flytesnacks",
                        "domain": "development",
                        "name": "recipes.core.basic.lp.go_greet",
                        "version": "v103"
                    },
                    "metadata": {
                        "principal": "flyteconsole",
                        "system_metadata": {}
                    }
                },
                "closure": {
                    "outputs": {
                        "uri": "s3://my-s3-bucket/metadata/propeller/flytesnacks-development-er7xaa2vxf/end-node/data/0/outputs.pb"
                    },
                    "phase": "SUCCEEDED",
                    "started_at": "2021-02-17T07:10:09.271487Z",
                    "duration": "43.817508900s",
                    "created_at": "2021-02-17T07:10:09.234425400Z",
                    "updated_at": "2021-02-17T07:10:53.088995900Z",
                    "workflow_id": {
                        "resource_type": "WORKFLOW",
                        "project": "flytesnacks",
                        "domain": "development",
                        "name": "recipes.core.basic.lp.go_greet",
                        "version": "v103"
                    }
                }
            },
            {
                "id": {
                    "project": "flytesnacks",
                    "domain": "development",
                    "name": "oeh94k9r2r"
                },
                "spec": {
                    "launch_plan": {
                        "resource_type": "LAUNCH_PLAN",
                        "project": "flytesnacks",
                        "domain": "development",
                        "name": "recipes.core.07_juptyer_notebooks.whole_square_workflow.whole_square_wf",
                        "version": "v58"
                    },
                    "metadata": {
                        "principal": "flyteconsole",
                        "system_metadata": {}
                    }
                },
                "closure": {
                    "abort_metadata": {},
                    "phase": "FAILED",
                    "started_at": "2021-02-05T14:50:00.598630500Z",
                    "duration": "27.059172300s",
                    "created_at": "2021-02-05T14:50:00.470665800Z",
                    "updated_at": "2021-02-05T14:50:27.657802300Z",
                    "workflow_id": {
                        "resource_type": "WORKFLOW",
                        "project": "flytesnacks",
                        "domain": "development",
                        "name": "recipes.core.07_juptyer_notebooks.whole_square_workflow.whole_square_wf",
                        "version": "v58"
                    }
                }
            }
        ]
    }
    

    Now, with a field selector, I should be able to specify any field in the JSON and filter based on it, e.g. the phase field. With respect to the above data, this should return just the one execution matching the phase criterion: bin/flytectl get executions --field-selector=phase=SUCCEEDED

    {
        "executions": [
            {
                "id": {
                    "project": "flytesnacks",
                    "domain": "development",
                    "name": "er7xaa2vxf"
                },
                "spec": {
                    "launch_plan": {
                        "resource_type": "LAUNCH_PLAN",
                        "project": "flytesnacks",
                        "domain": "development",
                        "name": "recipes.core.basic.lp.go_greet",
                        "version": "v103"
                    },
                    "metadata": {
                        "principal": "flyteconsole",
                        "system_metadata": {}
                    }
                },
                "closure": {
                    "outputs": {
                        "uri": "s3://my-s3-bucket/metadata/propeller/flytesnacks-development-er7xaa2vxf/end-node/data/0/outputs.pb"
                    },
                    "phase": "SUCCEEDED",
                    "started_at": "2021-02-17T07:10:09.271487Z",
                    "duration": "43.817508900s",
                    "created_at": "2021-02-17T07:10:09.234425400Z",
                    "updated_at": "2021-02-17T07:10:53.088995900Z",
                    "workflow_id": {
                        "resource_type": "WORKFLOW",
                        "project": "flytesnacks",
                        "domain": "development",
                        "name": "recipes.core.basic.lp.go_greet",
                        "version": "v103"
                    }
                }
            }
        ]
    }
    

    The following doc explains how filters work in the admin service, which flytectl communicates with to perform its operations: https://flyte.readthedocs.io/en/latest/dive_deep/admin_service.html#using-the-admin-service
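    The proposed matching semantics can be sketched client-side against the JSON shown above. This is purely illustrative: `get_field` and `field_select` are hypothetical helpers, and real flytectl would push such filtering down to flyteadmin rather than filter locally.

    ```python
    from typing import Any, Dict, List


    def get_field(obj: Dict[str, Any], dotted: str) -> Any:
        """Resolve a dotted path like 'closure.phase' inside a nested dict."""
        for part in dotted.split("."):
            if not isinstance(obj, dict) or part not in obj:
                return None
            obj = obj[part]
        return obj


    def field_select(executions: List[Dict[str, Any]], selector: str) -> List[Dict[str, Any]]:
        """Keep executions whose field matches, e.g. 'closure.phase=SUCCEEDED'."""
        path, _, expected = selector.partition("=")
        return [e for e in executions if get_field(e, path) == expected]


    # Trimmed-down versions of the two executions from the example above.
    executions = [
        {"id": {"name": "er7xaa2vxf"}, "closure": {"phase": "SUCCEEDED"}},
        {"id": {"name": "oeh94k9r2r"}, "closure": {"phase": "FAILED"}},
    ]
    succeeded = field_select(executions, "closure.phase=SUCCEEDED")
    ```

    With the sample data, only the SUCCEEDED execution survives the filter, mirroring the second JSON block in the issue.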

    enhancement flytectl 
    opened by pmahindrakar-oss 27
  • [Core Feature] Wait on ongoing task when cache is enabled

    [Core Feature] Wait on ongoing task when cache is enabled

    Motivation: Why do you think this is important? This would allow all concurrent cached Flyte tasks with the same input to wait on the first one to complete instead of causing a race condition when they execute at the same time. This would also help save resources by preventing repeated work.

    Goal: What should the final outcome look like, ideally? When multiple cached tasks with the same inputs are running concurrently, only one should be executing; the rest should wait for it to finish.

    Describe alternatives you've considered Writing our own cache system by hashing inputs, storing them in s3, and checking them before executing the main logic.

    [Optional] Propose: Link/Inline OR Additional context Cached tasks already exist today, but caching only works for already-completed tasks. The proposal is to enable it for ongoing tasks as well.
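    The "alternative considered" above can be sketched as hashing the task's inputs into a deterministic cache key and consulting a store before executing. All names here (`cache_key`, `run_cached`) and the dict-backed store standing in for S3 are illustrative; this is not Flyte's datacatalog implementation.

    ```python
    import hashlib
    import json


    def cache_key(task_name: str, cache_version: str, inputs: dict) -> str:
        # Canonical serialization so the same inputs always hash identically.
        payload = json.dumps(inputs, sort_keys=True)
        return hashlib.sha256(f"{task_name}:{cache_version}:{payload}".encode()).hexdigest()


    _store: dict = {}  # stand-in for an S3 bucket of cached outputs


    def run_cached(task_name: str, cache_version: str, inputs: dict, fn):
        key = cache_key(task_name, cache_version, inputs)
        if key in _store:  # a completed run already published this result
            return _store[key]
        # The feature proposed in this issue would additionally record
        # "in-flight" keys so that concurrent runs with the same key block
        # on the first runner instead of all executing.
        result = fn(**inputs)
        _store[key] = result
        return result
    ```

    Running the same task name, cache version, and inputs twice executes the function only once; the second call is served from the store.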

    enhancement 
    opened by michaels-lyft 26
  • Migrate to go module

    Migrate to go module

    Why

    • all of Flyte is on Go 1.13, and Go modules have become mature enough for production use
    • it seems to be the future
    • one fewer tool (dep) to install and manage
    • no more vendor folder
    • I tried the migration and it was smooth https://github.com/lyft/datacatalog/pull/21
    good first issue 
    opened by honnix 24
  • [BUG] Propeller Panic On K8s Array Task (datacatalog implicated)

    [BUG] Propeller Panic On K8s Array Task (datacatalog implicated)

    Describe the bug

    We ran a map task over a Python task that takes in a FlyteFile as input, and returns an int:

    @task(
        requests=Resources(cpu="12", mem="25Gi", ephemeral_storage="325Gi"),
        limits=Resources(cpu="14", mem="25Gi"),
        retries=3,
        cache=True,
        cache_version="1.0",
    )
    def rf_germline_mappable(rf_germline_input_file: FlyteFile) -> int:
        ...  # body elided in the original report
    

    We have no special overrides on the map task, just:

        map_task(
            rf_germline_mappable, concurrency=constants.FLYTE_MAP_TASK_DEFAULT_CONCURRENCY
        )(rf_germline_input_file=comparison_sets).with_overrides(
            requests=Resources(cpu="12", mem="25Gi", ephemeral_storage="325Gi"),
            limits=Resources(cpu="14", mem="25Gi"),
        )
    

    When this runs (the first time), it fails: no pods are spun up, and it fails entirely in flytepropeller before the map task launches at all.

    It fails when we resume as well.

    None of the map tasks in the workflow before this seem to have this issue. The task that prepares the List[FlyteFiles] before this one seems to complete fine. We checked manually, and the FlyteFiles in question appear to all correctly exist in the S3 bucket that Flyte is using.

    When this happens, we see a bunch of warnings in data catalog pod logs that look like: Dataset does not exist key: {Project:relative-finder Name:flyte_task-relative_finder.workflows.rf_germline.mapper_rf_germline_mappable_2 Domain:development Version:1.0-85909EZ4-PmK8qF6C UUID:}, err missing entity of type Dataset with identifier project:"relative-finder" name:"flyte_task-relative_finder.workflows.rf_germline.mapper_rf_germline_mappable_2" domain:"development" version:"1.0-85909EZ4-PmK8qF6C"

    The stack trace is:

    Workflow[relative-finder:development:relative_finder.workflows.relative_finder.relative_finder_wf] failed. RuntimeExecutionError: max number of system retry attempts [51/50] exhausted. Last known status message: failed at Node[n1]. RuntimeExecutionError: failed during plugin execution, caused by: failed to execute handle for plugin [k8s-array]: panic when executing a plugin [k8s-array]. Stack: [goroutine 760 [running]:
    runtime/debug.Stack()
    	/usr/local/go/src/runtime/debug/stack.go:24 +0x65
    github.com/flyteorg/flytepropeller/pkg/controller/nodes/task.Handler.invokePlugin.func1.1()
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/task/handler.go:375 +0xfe
    panic({0x1f3c580, 0x3952540})
    	/usr/local/go/src/runtime/panic.go:838 +0x207
    github.com/flyteorg/flytestdlib/bitarray.(*BitSet).IsSet(...)
    	/go/pkg/mod/github.com/flyteorg/[email protected]/bitarray/bitset.go:33
    github.com/flyteorg/flyteplugins/go/tasks/plugins/array/core.InitializeExternalResources({0x2796db0, 0xc01870b290}, {0x27a2700?, 0xc00116c4d0?}, 0xc002130240, 0x23cf0a8)
    	/go/pkg/mod/github.com/flyteorg/[email protected]/go/tasks/plugins/array/core/metadata.go:33 +0x1e1
    github.com/flyteorg/flyteplugins/go/tasks/plugins/array/k8s.Executor.Handle({{0x7fe260ad1090, 0xc00098afc0}, {{0x2789d50, 0xc0018ca0b0}}, {{0x2789d50, 0xc0018ca160}}}, {0x2796db0, 0xc01870b290}, {0x27a2700, 0xc00116c4d0})
    	/go/pkg/mod/github.com/flyteorg/[email protected]/go/tasks/plugins/array/k8s/executor.go:94 +0x225
    github.com/flyteorg/flytepropeller/pkg/controller/nodes/task.Handler.invokePlugin.func1(0x0?, {0x2796db0, 0xc01870b050}, {0x2799298?, 0xc0007daf00?}, 0x0?)
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/task/handler.go:382 +0x178
    github.com/flyteorg/flytepropeller/pkg/controller/nodes/task.Handler.invokePlugin({{0x27970f8, 0xc0011919b0}, {0x27848f0, 0xc000d37da0}, 0xc00143ccc0, 0xc00143ccf0, 0xc00143cd20, {0x2798818, 0xc001710000}, 0xc0018522c0, ...}, ...)
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/task/handler.go:384 +0x9a
    github.com/flyteorg/flytepropeller/pkg/controller/nodes/task.Handler.Handle({{0x27970f8, 0xc0011919b0}, {0x27848f0, 0xc000d37da0}, 0xc00143ccc0, 0xc00143ccf0, 0xc00143cd20, {0x2798818, 0xc001710000}, 0xc0018522c0, ...}, ...)
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/task/handler.go:617 +0x182b
    github.com/flyteorg/flytepropeller/pkg/controller/nodes/dynamic.dynamicNodeTaskNodeHandler.handleParentNode({{0x279a148, 0xc000a5cdd0}, {{0xc000b258c0, {{...}, 0x0}, {0xc0009e8440, 0x4, 0x4}}, {0xc000b258e0, {{...}, ...}, ...}, ...}, ...}, ...)
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/dynamic/handler.go:70 +0xd8
    github.com/flyteorg/flytepropeller/pkg/controller/nodes/dynamic.dynamicNodeTaskNodeHandler.Handle({{0x279a148, 0xc000a5cdd0}, {{0xc000b258c0, {{...}, 0x0}, {0xc0009e8440, 0x4, 0x4}}, {0xc000b258e0, {{...}, ...}, ...}, ...}, ...}, ...)
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/dynamic/handler.go:220 +0x9d0
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).execute(0xc00131e240, {0x2796db0, 0xc01870aba0}, {0x2798698, 0xc00143e000}, 0xc001082600, {0x27ab1b8?, 0xc001e981a0?})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:382 +0x157
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).handleQueuedOrRunningNode(0xc00131e240, {0x2796db0, 0xc01870aba0}, 0xc001082600, {0x2798698?, 0xc00143e000?})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:512 +0x227
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).handleNode(0xc00131e240, {0x2796db0, 0xc01870aba0}, {0x7fe2606e34d0, 0xc0050f5490}, 0xc001082600, {0x2798698?, 0xc00143e000})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:736 +0x3c5
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).RecursiveNodeHandler(0xc00131e240, {0x2796db0, 0xc01870a420}, {0x27a63e8, 0xc01ac29400}, {0x7fe2606e34d0, 0xc0050f5490}, {0x2784a30?, 0xc018156b60?}, {0x27a3810, ...})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:934 +0x705
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).handleDownstream(0x22f4f2d?, {0x2796db0, 0xc01870a420}, {0x27a63e8, 0xc01ac29400}, {0x7fe2606e34d0, 0xc0050f5490?}, {0x2784a30?, 0xc018156b60}, {0x27a3810, ...})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:774 +0x3c5
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).RecursiveNodeHandler(0xc00131e240, {0x2796db0, 0xc01870a420}, {0x27a63e8, 0xc01ac29400}, {0x7fe2606e34d0, 0xc0050f5490}, {0x2784a30?, 0xc018156b60?}, {0x27a3810, ...})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:941 +0x935
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).handleDownstream(0x22f4f2d?, {0x2796db0, 0xc01870a420}, {0x27a63e8, 0xc01ac29400}, {0x7fe2606e34d0, 0xc0050f5490?}, {0x2784a30?, 0xc018156b60}, {0x27a3810, ...})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:774 +0x3c5
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).RecursiveNodeHandler(0xc00131e240, {0x2796db0, 0xc01870a420}, {0x27a63e8, 0xc01ac29400}, {0x7fe2606e34d0, 0xc0050f5490}, {0x2784a30?, 0xc018156b60?}, {0x27a3810, ...})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:941 +0x935
    github.com/flyteorg/flytepropeller/pkg/controller/nodes/subworkflow.(*subworkflowHandler).handleSubWorkflow(0xc000490c68, {0x2796db0, 0xc01870a420}, {0x27a4130, 0xc001082540}, {0x27a1b40, 0xc0050f5490}, {0x2784a30, 0xc018156b60})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/subworkflow/subworkflow.go:74 +0x334
    github.com/flyteorg/flytepropeller/pkg/controller/nodes/subworkflow.(*subworkflowHandler).CheckSubWorkflowStatus(0xc00af11c50?, {0x2796db0, 0xc01870a420}, {0x27a4130?, 0xc001082540?})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/subworkflow/subworkflow.go:226 +0x3f1
    github.com/flyteorg/flytepropeller/pkg/controller/nodes/subworkflow.(*workflowNodeHandler).Handle(0xc000490c40, {0x2796db0, 0xc01870a420}, {0x27a4130?, 0xc001082540?})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/subworkflow/handler.go:91 +0x1690
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).execute(0xc00131e240, {0x2796db0, 0xc01870a420}, {0x2798758, 0xc000490c40}, 0xc001082540, {0x27ab1b8?, 0xc01bb53d40?})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:382 +0x157
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).handleQueuedOrRunningNode(0xc00131e240, {0x2796db0, 0xc01870a420}, 0xc001082540, {0x2798758?, 0xc000490c40?})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:512 +0x227
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).handleNode(0xc00131e240, {0x2796db0, 0xc01870a420}, {0x277d258, 0xc002cae500}, 0xc001082540, {0x2798758?, 0xc000490c40})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:736 +0x3c5
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).RecursiveNodeHandler(0xc00131e240, {0x2796db0, 0xc01870a0c0}, {0x27a63e8, 0xc01ac29360}, {0x277d258, 0xc002cae500}, {0x277d280?, 0xc002cae500?}, {0x27a3810, ...})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:934 +0x705
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).handleDownstream(0x22f4f2d?, {0x2796db0, 0xc01870a0c0}, {0x27a63e8, 0xc01ac29360}, {0x277d258, 0xc002cae500?}, {0x277d280?, 0xc002cae500}, {0x27a3810, ...})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:774 +0x3c5
    github.com/flyteorg/flytepropeller/pkg/controller/nodes.(*nodeExecutor).RecursiveNodeHandler(0xc00131e240, {0x2796db0, 0xc01870a0c0}, {0x27a63e8, 0xc01ac29360}, {0x277d258, 0xc002cae500}, {0x277d280?, 0xc002cae500?}, {0x27a3810, ...})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/nodes/executor.go:941 +0x935
    github.com/flyteorg/flytepropeller/pkg/controller/workflow.(*workflowExecutor).handleRunningWorkflow(0xc000491e30, {0x2796db0, 0xc01870a0c0}, 0xc002cae500)
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/workflow/executor.go:147 +0x1b3
    github.com/flyteorg/flytepropeller/pkg/controller/workflow.(*workflowExecutor).HandleFlyteWorkflow(0xc000491e30, {0x2796db0, 0xc01870a0c0}, 0xc002cae500)
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/workflow/executor.go:393 +0x40f
    github.com/flyteorg/flytepropeller/pkg/controller.(*Propeller).TryMutateWorkflow.func2(0xc00145f0e0, {0x2796db0, 0xc01870a0c0}, 0xc010bf3848, 0x1e51040?)
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/handler.go:130 +0x18e
    github.com/flyteorg/flytepropeller/pkg/controller.(*Propeller).TryMutateWorkflow(0xc00145f0e0, {0x2796db0, 0xc0186b5230}, 0xc002156f00)
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/handler.go:131 +0x459
    github.com/flyteorg/flytepropeller/pkg/controller.(*Propeller).Handle(0xc00145f0e0, {0x2796db0, 0xc0186b5230}, {0xc004980330, 0x19}, {0xc00498034a, 0x8})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/handler.go:205 +0x86d
    github.com/flyteorg/flytepropeller/pkg/controller.(*WorkerPool).processNextWorkItem.func1(0xc00189ac60, 0xc010bf3f28, {0x1e51040?, 0xc003db8440})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/workers.go:88 +0x510
    github.com/flyteorg/flytepropeller/pkg/controller.(*WorkerPool).processNextWorkItem(0xc00189ac60, {0x2796db0, 0xc0186b5230})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/workers.go:99 +0xf1
    github.com/flyteorg/flytepropeller/pkg/controller.(*WorkerPool).runWorker(0x2796db0?, {0x2796db0, 0xc0007cda10})
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/workers.go:115 +0xbd
    github.com/flyteorg/flytepropeller/pkg/controller.(*WorkerPool).Run.func1()
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/workers.go:150 +0x59
    created by github.com/flyteorg/flytepropeller/pkg/controller.(*WorkerPool).Run
    	/go/src/github.com/flyteorg/flytepropeller/pkg/controller/workers.go:147 +0x285
    ]
    

    Expected behavior

    No panic: the code recognizes that the cache is empty and begins the task, caching the results.

    Additional context to reproduce

    We are on EKS, using managed node groups.

    K8s version v1.22.10-eks-84b4fe6

    • Datacatalog: cr.flyte.org/flyteorg/datacatalog-release:v1.1.0
    • Flyte propeller: cr.flyte.org/flyteorg/flytepropeller:v1.1.24 (we have this a bit higher than the others for some more recent bugfixes that aren't in a proper release yet)
    • Flyte admin: cr.flyte.org/flyteorg/flyteadmin-release:v1.1.0

    Screenshots

    (screenshot in the original issue)

    Unusual data catalog warning: (screenshot in the original issue)

    Are you sure this issue hasn't been raised already?

    • [X] Yes

    Have you read the Code of Conduct?

    • [X] Yes
    bug 
    opened by CalvinLeather 20
  • [Docs] `make` is resulting in an error when trying to build the flytesnacks docs

    [Docs] `make` is resulting in an error when trying to build the flytesnacks docs

    Description

    `make` should not produce any errors. We'll have to fix indentation, broken links, and RST styles.

    Are you sure this issue hasn't been raised already?

    • [X] Yes

    Have you read the Code of Conduct?

    • [X] Yes
    documentation hacktoberfest 
    opened by samhita-alla 16
  • RFC for Eviction of cached task outputs

    RFC for Eviction of cached task outputs

    https://hackmd.io/qOztkaj4Rb6ypodvGEowAg?view

    Comments already present on the HackMD doc are from our internal team and have been left in for clarification/further discussion.

    Initial discussion on Slack

    opened by MorpheusXAUT 16
  • [Feature] [FlyteKit] intra-task checkpointing and resumable task execution

    [Feature] [FlyteKit] intra-task checkpointing and resumable task execution

    Motivation: Why do you think this is important? ML training processes are usually iterative and long-running. The ML community developed checkpointing to preserve the partial progress of a training run. The benefit of this mechanism is two-fold. First, it lets users resume quickly after a crash or other unexpected failure. Second, some intermediate checkpoints may actually be higher-quality models: it is not uncommon for ML practitioners to keep track of the best checkpoints (e.g., by performance on a validation dataset) in addition to the final model.

    Checkpointing not only helps ML practitioners make the training more robust and efficient but also has an implication on cost optimization.

    Today's cloud providers usually supply some types of interruptible compute instances as a way to maximize their own resource utilization. The incentive these providers give to the users is that leveraging these interruptible instances, such as AWS Spot Instances, gains the users some steep discounts on the hourly rates and therefore has the potential to save significantly.

    Flyte's current way of leveraging spot instances works as follows: when a Flyte task is marked as interruptible, Flyte first tries to execute it on a spot instance, hoping the execution finishes before being interrupted. If the execution is interrupted by the spot service, it is aborted and restarted from the beginning on the next instance it is scheduled onto. If the execution is interrupted too many times, beyond a predetermined level of tolerance, Flyte then schedules it onto a normal, uninterruptible EC2 instance to ensure its completion.

    This approach, however, has its limitations. When an execution runs on a Spot instance and gets interrupted, the progress is lost completely no matter how close it is to its completion. This lack of "resumability" incurs a significant amount of compute overhead, which naturally translates into extra dollars.

    Checkpointing, luckily, provides the much-needed resumability. It lets users persist intermediate artifacts and data so that an interrupted execution can resume from the partial progress made earlier, without wasting time and money recomputing what was already computed.

    Goal: What should the final outcome look like, ideally? Flyte workflows are themselves a form of checkpointing: they give users a way to code a state machine with isolated failure domains, and DataCatalog then aids recovery in subsequent runs. With these two components working together, Flyte provides checkpointing and resumption at task boundaries.

    In this proposal, on the other hand, we aim to formulate the idea of explicit intra-task checkpointing. By intra-task checkpointing we mean a mechanism that (a) persists, on demand, intermediate artifacts that an execution of user code produces, and (b) exposes them later to the same or a different execution of the same piece of user code so that users can "resume".
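The proposed behavior can be sketched with a toy checkpoint store. The `Checkpoint` class, its `save`/`restore` methods, and the JSON-on-disk format below are illustrative assumptions for this issue, not flytekit's actual API:

```python
import json
import os
import tempfile
from typing import Optional

class Checkpoint:
    """Hypothetical minimal checkpoint store backed by a JSON file."""

    def __init__(self, path: str):
        self.path = path

    def save(self, state: dict) -> None:
        with open(self.path, "w") as f:
            json.dump(state, f)

    def restore(self) -> Optional[dict]:
        if not os.path.exists(self.path):
            return None
        with open(self.path) as f:
            return json.load(f)

def train(checkpoint: Checkpoint, total_epochs: int = 10) -> int:
    # Resume from the last persisted epoch if a checkpoint exists.
    state = checkpoint.restore() or {"epoch": 0}
    for epoch in range(state["epoch"], total_epochs):
        # ... one epoch of training would run here ...
        checkpoint.save({"epoch": epoch + 1})
    return state["epoch"]  # the epoch we resumed from

path = os.path.join(tempfile.mkdtemp(), "ckpt.json")
cp = Checkpoint(path)
cp.save({"epoch": 7})   # simulate progress made before an interruption
resumed_at = train(cp)  # resumes at epoch 7 instead of restarting at 0
```

An interrupted spot instance loses at most one epoch of work under this scheme, rather than the whole run.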

    Describe alternatives you've considered

    Flyte component

    • [ ] Overall
    • [ ] Flyte Setup and Installation scripts
    • [ ] Flyte Documentation
    • [ ] Flyte communication (slack/email etc)
    • [x] FlytePropeller
    • [ ] FlyteIDL (Flyte specification language)
    • [x] Flytekit (Python SDK)
    • [ ] FlyteAdmin (Control Plane service)
    • [x] FlytePlugins
    • [x] DataCatalog
    • [ ] FlyteStdlib (common libraries)
    • [x] FlyteConsole (UI)
    • [ ] Other

    [Optional] Propose: Link to the design doc: https://docs.google.com/document/d/1Ds8TlZpiHjDV48yWNITqrUkXNZNv6tA7VwcYEOYonf4/edit?ts=5fb7050a#

    Additional context

    Is this a blocker for you to adopt Flyte?

    enhancement flytekit 
    opened by bnsblue 16
  • Support run-to-completion flag

    Support run-to-completion flag

    Right now, if any of the tasks in a workflow fails, Flyte fails the entire workflow. We would like a flag that lets Flyte continue to run whatever tasks are not affected by the failed ones.

    In the following example, if task 3 fails, task 5 will be marked as won't-run, while tasks 1, 2, and 4 should continue to run.
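The requested semantics can be sketched against a hypothetical dependency graph (the task names and edges below are assumptions mirroring the example: t5 depends on t3, and t1/t2/t4 are independent):

```python
def run_to_completion(deps, fails):
    """Run-to-completion sketch: a failed task only blocks its downstream
    dependents, not unrelated branches.

    deps: task -> list of upstream tasks; fails: tasks that fail.
    """
    status = {}

    def resolve(t):
        if t in status:
            return status[t]
        # Skip a task if any upstream dependency did not succeed.
        if any(resolve(u) != "SUCCEEDED" for u in deps.get(t, [])):
            status[t] = "SKIPPED"
        elif t in fails:
            status[t] = "FAILED"
        else:
            status[t] = "SUCCEEDED"
        return status[t]

    for t in deps:
        resolve(t)
    return status

# t5 depends on failing t3; t1, t2, t4 are unaffected and still run.
deps = {"t1": [], "t2": [], "t3": [], "t4": [], "t5": ["t3"]}
status = run_to_completion(deps, fails={"t3"})
```

The workflow as a whole would still end in a failed state, but only after every unaffected branch has run.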

    (workflow diagram in the original issue)

    enhancement 
    opened by datability-io 16
  • Updated the Existing Issue Templates to Issue forms

    Updated the Existing Issue Templates to Issue forms

    Updated all the existing issue templates to issue forms to make it easier to traverse the multitude of issues and keep content relevant, avoiding spam. If there are any changes or edits, please do let me know!

    Issue: https://github.com/flyteorg/flyte/issues/1555

    Here is how each template looks now:

    • Bug report (screenshot)

    • Documentation (screenshot)

    • Core features (screenshot)

    • Flytectl feature (screenshot)

    • Housekeeping (screenshot)

    • UI feature (screenshot)

    hacktoberfest-accepted 
    opened by noobkid2411 15
  • [BUG] Task fails with error: `containers with unready status: [main]|failed to reserve container name`

    [BUG] Task fails with error: `containers with unready status: [main]|failed to reserve container name`

    Describe the bug We had a rare failure where a task (and consequently the workflow) failed with the following error:

    containers with unready status: [main]|failed to reserve container name "main_f76be7b00ec9b4348871-n1-0_pineapple-development_987b2aa9-b537-4038-9884-8d2bc413762a_0": name "main_f76be7b00ec9b4348871-n1-0_pineapple-development_987b2aa9-b537-4038-9884-8d2bc413762a_0" is reserved for "6678ab271c041304d4a91e8e219dca896388098ec8a258dc49f4f9942296f669"
    

    Expected behavior: FlytePropeller handles these likely transient issues gracefully.

    Relevant issue: https://github.com/containerd/containerd/issues/4604 See this comment:

    Eventually, kubelet is able to resolve this issue without manual intervention, however, it is significantly slowing the deployment of new images during the release (extra 2-3 minutes to resolve name conflicts).

    It is possible that Flyte might be catching and failing on the error too fast before kubelet can resolve this issue on its own. Flyte might need to wait and let this issue resolve, possibly with some reasonable timeout.
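The suggested wait-with-timeout handling can be sketched as follows; the function, the 300-second timeout, and the injected `check`/`sleep` hooks are illustrative assumptions, not FlytePropeller's actual code (which is Go):

```python
import time

def wait_for_container(check, timeout_s=300, poll_s=5,
                       clock=time.monotonic, sleep=time.sleep):
    """Treat a container-name reservation conflict as retryable within a
    deadline instead of failing the task immediately."""
    deadline = clock() + timeout_s
    while True:
        ok, err = check()
        if ok:
            return True
        if "failed to reserve container name" not in err:
            raise RuntimeError(err)  # a genuinely fatal error: fail fast
        if clock() >= deadline:
            return False             # conflict never cleared: give up
        sleep(poll_s)                # give kubelet time to resolve it

# Simulated kubelet: the name conflict clears on the third poll.
attempts = iter([
    (False, 'failed to reserve container name "main"'),
    (False, 'failed to reserve container name "main"'),
    (True, ""),
])
recovered = wait_for_container(lambda: next(attempts), sleep=lambda s: None)
```

With this shape, the task only fails if the conflict persists past the deadline, matching the "reasonable timeout" suggested above.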


    bug 
    opened by jeevb 15
  • [Core feature] Convert List[Any] to a single pickle file

    [Core feature] Convert List[Any] to a single pickle file

    Motivation: Why do you think this is important?

    Currently, Flyte creates N pickle files (one per list element) if the output type is List[Any]. This slows down serialization: it takes more than 15 minutes to upload the pickles to S3 when the list has 1000 elements.

    People don't care how we serialize List[Any]. We can convert the entire list into a single pickle file, which reduces the time required for serialization.
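The proposed change can be sketched with plain `pickle` (the sample data and variable names are illustrative):

```python
import pickle

# 1000-element list standing in for a List[Any] task output.
items = [{"id": i} for i in range(1000)]

# Current behaviour: one pickle blob per element -> N uploads.
per_element = [pickle.dumps(x) for x in items]

# Proposed behaviour: the whole list as a single pickle blob -> one upload.
single = pickle.dumps(items)

# The single blob round-trips to the original list.
restored = pickle.loads(single)
```

Uploading one object instead of 1000 removes the per-object request overhead that dominates the 15-minute upload.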

    Goal: What should the final outcome look like, ideally?

    Serialization will be faster.

    Describe alternatives you've considered

    • Raise an error when using large list
    • Add a detailed warning

    Propose: Link/Inline OR Additional context

    Slack Thread

    Are you sure this issue hasn't been raised already?

    • [X] Yes

    Have you read the Code of Conduct?

    • [X] Yes
    enhancement flytekit 
    opened by pingsutw 0
  • [Core feature] [flytekit] data plugins update

    [Core feature] [flytekit] data plugins update

    Motivation: Why do you think this is important?

    Currently we have our own data persistence layer, because we wanted to support the aws CLI and gsutil. I think we should drop support for the aws CLI and gsutil and completely migrate to the DataFusion / Arrow filesystem or fsspec.

    On reading, I see that the DataFusion project has matured and has a fantastic Rust object_store crate that works great for Flytekit: https://arrow.apache.org/docs/python/filesystems.html
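A minimal sketch of the scheme-keyed filesystem registry that fsspec and `pyarrow.fs` provide, using only a local backend; the `LocalFS` class and `REGISTRY` here are illustrative stand-ins, not either library's API:

```python
import os
import shutil
import tempfile

class LocalFS:
    """Local-disk stand-in for a pluggable filesystem backend."""

    def put(self, src: str, dst: str) -> None:
        os.makedirs(os.path.dirname(dst), exist_ok=True)
        shutil.copy(src, dst)

    def get(self, src: str, dst: str) -> None:
        shutil.copy(src, dst)

# scheme -> filesystem implementation; real plugins would register
# backends for s3://, gs://, abfs://, etc.
REGISTRY = {"file": LocalFS()}

def fs_for(url: str):
    scheme = url.split("://", 1)[0] if "://" in url else "file"
    return REGISTRY[scheme]

root = tempfile.mkdtemp()
src = os.path.join(root, "in.txt")
with open(src, "w") as f:
    f.write("payload")
remote = os.path.join(root, "store", "in.txt")
fs_for("file://" + remote).put(src, remote)
```

Delegating to fsspec or Arrow would replace this hand-rolled registry (and the aws CLI / gsutil shell-outs) with a maintained ecosystem of backends.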

    Goal: What should the final outcome look like, ideally?

    Flyte is enriched with the amazing integrations of arrow and the great growing ecosystem.

    Describe alternatives you've considered

    NA

    Propose: Link/Inline OR Additional context

    No response

    Are you sure this issue hasn't been raised already?

    • [X] Yes

    Have you read the Code of Conduct?

    • [X] Yes
    enhancement flytekit 
    opened by kumare3 0
  • [Core feature] [Pod plugin] UX and example improvements

    [Core feature] [Pod plugin] UX and example improvements

    Motivation: Why do you think this is important?

    The example for using Pod features is overly complicated.

    Also, the pod plugin does not make it simple to add pod-spec-related features like node selectors.

    It is possible to do this:

    from flytekit import Resources, task
    from flytekitplugins.pod import Pod
    from kubernetes.client.models import V1PodSpec

    @task(
        task_config=Pod(
            pod_spec=V1PodSpec(
                containers=[],
                node_selector={"node_group": "memory"},
            ),
        ),
        requests=Resources(mem="1G"),
    )
    def my_pod_task():
        ...
    

    Goal: What should the final outcome look like, ideally?

    Improve the example and point to the new pod plugin

    Describe alternatives you've considered

    NA

    Propose: Link/Inline OR Additional context

    No response

    Are you sure this issue hasn't been raised already?

    • [X] Yes

    Have you read the Code of Conduct?

    • [X] Yes
    enhancement flytekit 
    opened by kumare3 0
  • [Core feature] Add support Hex plugin

    [Core feature] Add support Hex plugin

    Motivation: Why do you think this is important?

    Hex is a modern Data Workspace. It makes it easy to connect to data, analyze it in collaborative SQL and Python-powered notebooks, and share work as interactive data apps and stories.

    Hex creates a chart cell that allows editors to interactively explore and aggregate a dataframe, creating rich visualizations to share with app users.

    We can add a Hex plugin to run the notebook in the Hex workspace and display the interactive chart on Flyte Deck.

    https://user-images.githubusercontent.com/37936015/209454302-500e6861-488a-4edb-9186-54fb306170d7.mp4


    Goal: What should the final outcome look like, ideally?

    • Have an interactive chart on Flyte Deck
    • Submit a job to Hex to run the notebook in Hex workspace.

    Describe alternatives you've considered

    NA

    Propose: Link/Inline OR Additional context

    • https://learn.hex.tech/docs/logic-cell-types/display-cells/chart-cells
    • https://github.com/hex-inc/airflow-provider-hex

    Are you sure this issue hasn't been raised already?

    • [X] Yes

    Have you read the Code of Conduct?

    • [X] Yes
    enhancement flytekit 
    opened by pingsutw 0
  • [BUG] Structured Dataset compatibility between plugins

    [BUG] Structured Dataset compatibility between plugins

    Describe the bug

    I was running some tasks in a notebook, passing the result of a Spark task as a StructuredDataset and then trying to load it into a polars dataframe and a Hugging Face dataset.

    It resulted in the following error for both plugins: No such file or directory: /var/folders/wq/3hjh3ms916b6dj56zx0f_x000000gq/T/flyte-69d2tww2/sandbox/local_flytekit/95bac8efeb64a8d10d34c73b66df7051/00000. However, it did work for pandas.

    It seems like the polars and huggingface transformers append 00000 to the path, while the spark transformer does not.

    • polars: https://github.com/flyteorg/flytekit/blob/master/plugins/flytekit-polars/flytekitplugins/polars/sd_transformers.py#L43
    • spark: https://github.com/flyteorg/flytekit/blob/master/plugins/flytekit-spark/flytekitplugins/spark/sd_transformers.py#L29
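The mismatch can be reproduced locally with plain files; every name below is illustrative. A Spark-style writer produces a directory of part files, while the polars/huggingface transformers read a hard-coded `00000` entry inside the URI:

```python
import os
import tempfile

def spark_style_write(uri: str) -> None:
    """Mimic Spark: write a directory of self-named part files."""
    os.makedirs(uri, exist_ok=True)
    open(os.path.join(uri, "part-00000-abc.snappy.parquet"), "w").close()

def polars_style_read_path(uri: str) -> str:
    """Mimic the polars/huggingface transformers: a fixed file name."""
    return os.path.join(uri, "00000")

uri = os.path.join(tempfile.mkdtemp(), "sd")
spark_style_write(uri)
# The reader's expected file does not exist -> "No such file or directory".
exists = os.path.exists(polars_style_read_path(uri))
```

Either the readers need to accept a directory of part files or the writers need to agree on the part naming; either change fixes the round trip.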

    Expected behavior

    I would expect to be able to use a StructuredDataset from spark with dataframe libraries from all plugins.

    Additional context to reproduce

    import datasets
    import flytekit
    import pandas as pd
    import polars as pl
    from flytekit import StructuredDataset, task
    from flytekitplugins.spark.task import Spark

    @task(task_config=Spark())
    def spark_task(path: str) -> StructuredDataset:
        sess = flytekit.current_context().spark_session
        df = sess.read.parquet(path)
        return StructuredDataset(dataframe=df)

    df = spark_task(path="./ratings_100k.parquet")

    try:
        df.open(pl.DataFrame).all().head()
    except Exception as e:
        print(e)

    try:
        df.open(datasets.Dataset).all().head()
    except Exception as e:
        print(e)

    df.open(pd.DataFrame).all().head()

    Screenshots

    (screenshot in the original issue)

    Are you sure this issue hasn't been raised already?

    • [X] Yes

    Have you read the Code of Conduct?

    • [X] Yes
    bug untriaged 
    opened by esadler-hbo 1
Releases (v1.3.0-b8)
Owner
Flyte
Organization that hosts the Flyte project with all of the core components. Flyte is an LF AI & Data incubating project.