Flows

Apolo pipelines

Apolo Flow is a pipeline engine for MLOps workflows on the Apolo platform. It orchestrates, automates, and executes machine learning pipelines defined in declarative configuration files.

Use Cases

  • Automating ML workflows, from data ingestion to model deployment.

  • Running batch and live workflows for continuous training and inference.

  • Managing dependencies and execution order across pipeline steps.

  • Standardizing and versioning workflows for reproducibility and collaboration.

Example Use Case

Imagine a data science team working on a fraud detection model. They can use Apolo Flow to:

  1. Ingest transaction data from multiple sources.

  2. Preprocess the data and extract relevant features.

  3. Train and validate multiple models in parallel.

  4. Deploy the best-performing model into production (a sketch of such a pipeline is shown below).
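
To make this concrete, the four steps above could be written as a single batch workflow. The sketch below is illustrative rather than authoritative: the image names, script names, presets, and storage paths are hypothetical placeholders, and the exact file location and schema details should be checked against the Apolo Flow reference guide; the keys used (kind, tasks, id, needs, image, preset, cmd, and the matrix strategy for parallel training) mirror those described there.

```yaml
kind: batch
title: fraud-detection-pipeline

tasks:
  - id: ingest
    image: ghcr.io/myorg/fraud-etl:latest        # hypothetical ETL image
    preset: cpu-medium
    volumes:
      - storage:fraud/data:/data:rw              # project storage, read-write
    cmd: python ingest.py --out /data/raw

  - id: preprocess
    needs: [ingest]                              # runs only after ingestion succeeds
    image: ghcr.io/myorg/fraud-etl:latest
    preset: cpu-medium
    volumes:
      - storage:fraud/data:/data:rw
    cmd: python featurize.py --in /data/raw --out /data/features

  - id: train
    needs: [preprocess]
    strategy:
      matrix:
        model: [xgboost, lightgbm, mlp]          # candidate models trained in parallel
    image: ghcr.io/myorg/fraud-train:latest      # hypothetical training image
    preset: gpu-small
    volumes:
      - storage:fraud/data:/data:ro
    cmd: python train.py --model ${{ matrix.model }} --data /data/features

  - id: deploy
    needs: [train]                               # runs after the training tasks complete
    image: ghcr.io/myorg/fraud-deploy:latest     # hypothetical deployment image
    preset: cpu-small
    cmd: python deploy.py --select-best
```

The needs keys give Apolo Flow the dependency graph it uses to order and parallelize execution; this is also the DAG visualized in the web console (see Web Console Capabilities below).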

Models of Operation

  • CLI Usage: Provides a command-line interface for managing pipelines, configuring workflows, and executing actions.

  • Configuration Files: Defines workflows, actions, and dependencies in structured YAML configuration files.

  • Workflow Syntax: Supports batch (pipeline) and live (interactive) workflows, allowing users to define execution logic and contexts (see the live-workflow sketch after this list).
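
As an illustration of the configuration-file model, a minimal live workflow might look like the sketch below. The file name, image, preset, and storage path are assumptions for illustration only; the authoritative schema is in the Apolo Flow reference guide.

```yaml
kind: live
title: fraud-detection-dev

defaults:
  preset: cpu-small                              # default resource preset for jobs

jobs:
  train:
    image: ghcr.io/myorg/fraud-train:latest      # hypothetical training image
    preset: gpu-small                            # override the default for this job
    volumes:
      - storage:fraud/data:/data:ro              # mount project storage read-only
    cmd: python train.py --data /data/features
```

Assuming the command layout documented in the Apolo Flow reference, a live job such as train is typically started with `apolo-flow run train`, while a batch workflow (like the fraud-detection pipeline above) is executed with `apolo-flow bake <name>`; the resulting jobs and bakes then appear in the console's Flows section described below.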

Web Console Capabilities

The Apolo Console includes a Flows section for monitoring and managing pipeline execution. Users can:

  • List workloads running as part of flows, including live jobs and bakes (batch executions).

  • Monitor and control the lifecycle of jobs, tasks, and entire pipelines.

  • Retrieve pipeline statuses, view the pipeline DAG with statuses highlighted, review step-by-step execution details, and inspect logs.

  • Kill jobs, individual tasks, or an entire pipeline if necessary.

  • Access detailed outputs for each pipeline step, enabling debugging and performance optimization.

For detailed documentation, refer to the dedicated Apolo Flow reference guide.

References


Apolo Flow documentation