
Getting Started with IOA Core

Welcome to IOA Core! This guide will help you get up and running with the IOA (Intelligent Orchestration Architecture) framework in minutes.

What is IOA Core?

IOA Core is the open-source framework for governed AI orchestration — bringing verifiable policy, evidence, and trust to every workflow.

Key capabilities:

  • Immutable Audit Chains: Cryptographically verified audit logging
  • System Laws Framework: Seven governing principles for AI orchestration
  • Six LLM Providers: OpenAI, Anthropic, Gemini, DeepSeek, XAI, and Ollama
  • Vendor-Neutral Quorum: Multi-agent consensus with graceful scaling
  • Memory Fabric: Multi-tier storage with hot and cold layers
  • Working Examples: Six complete examples to get you started
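The audit-chain idea can be illustrated with a minimal sketch. The `AuditChain` class below is hypothetical, not the ioa-core API: each entry records the SHA-256 of the previous entry, so tampering with any record invalidates every later hash.

```python
import hashlib
import json


class AuditChain:
    """Minimal hash-chained audit log (illustration only, not the ioa-core API)."""

    def __init__(self):
        self.entries = []

    def append(self, event: dict) -> str:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
        entry_hash = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append({"event": event, "prev": prev_hash, "hash": entry_hash})
        return entry_hash

    def verify(self) -> bool:
        prev_hash = "0" * 64
        for entry in self.entries:
            payload = json.dumps({"event": entry["event"], "prev": prev_hash}, sort_keys=True)
            expected = hashlib.sha256(payload.encode()).hexdigest()
            if entry["prev"] != prev_hash or entry["hash"] != expected:
                return False
            prev_hash = entry["hash"]
        return True


chain = AuditChain()
chain.append({"task": "run_workflow", "policy": "default"})
chain.append({"task": "quorum_vote", "result": "approved"})
print(chain.verify())  # True
chain.entries[0]["event"]["task"] = "tampered"
print(chain.verify())  # False
```

Because each hash covers the previous hash, verifying the chain end to end is enough to detect modification of any earlier record.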

System Requirements

  • Python 3.10 or higher
  • pip or poetry
  • (Optional) LLM provider API keys for live testing

Installation

Quick Install

Install IOA Core from PyPI:

pip install ioa-core

Development Install

For development, install from source:

git clone https://github.com/orchintel/ioa-core.git
cd ioa-core
pip install -e ".[dev]"

Verify Installation

Check that IOA Core is properly installed:

# Check Python environment
python --version # Should be 3.10+

# Run system health check
python examples/30_doctor/doctor_check.py

Expected output:

{
  "python_version_ok": true,
  "python_version": "3.10.0",
  "overall_health": "healthy"
}
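The shape of that report can be approximated in a few lines. This is a sketch, not the actual `doctor_check.py`; only the field names are taken from the expected output above.

```python
import json
import sys


def doctor_check() -> dict:
    """Tiny approximation of a health check: report Python version status."""
    ok = sys.version_info >= (3, 10)
    version = f"{sys.version_info.major}.{sys.version_info.minor}.{sys.version_info.micro}"
    return {
        "python_version_ok": ok,
        "python_version": version,
        "overall_health": "healthy" if ok else "unhealthy",
    }


print(json.dumps(doctor_check(), indent=2))
```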

Quick Start

1. Bootstrap a Project

Create a new IOA project:

python examples/00_bootstrap/boot_project.py my-ai-system

This creates a project directory with:

  • my-ai-system/ioa.yaml - Configuration file
  • my-ai-system/README.md - Project documentation
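What `boot_project.py` does can be sketched generically: create the project directory and write the two files. The file contents below are placeholders, not what ioa-core actually writes.

```python
import tempfile
from pathlib import Path


def bootstrap(name: str, parent: Path) -> Path:
    """Scaffold a project directory with a config file and a README (placeholder contents)."""
    root = parent / name
    root.mkdir(parents=True, exist_ok=True)
    (root / "ioa.yaml").write_text(f"project: {name}\n")
    (root / "README.md").write_text(f"# {name}\n")
    return root


# Scaffold into a temporary directory so the sketch has no side effects.
root = bootstrap("my-ai-system", Path(tempfile.mkdtemp()))
print(sorted(p.name for p in root.iterdir()))  # ['README.md', 'ioa.yaml']
```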

2. Run Your First Workflow

Execute a governed workflow:

python examples/10_workflows/run_workflow.py

Output shows:

  • Task execution
  • Governance policy applied
  • Audit evidence ID
  • System Laws enforced

3. Test Provider Connectivity

Check provider status:

# Mock provider (no API key needed)
IOA_PROVIDER=mock python examples/40_providers/provider_smoketest.py

# Real provider (requires API key)
IOA_LIVE=1 IOA_PROVIDER=openai python examples/40_providers/provider_smoketest.py
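The `IOA_PROVIDER`/`IOA_LIVE` pattern amounts to environment-driven dispatch. A minimal sketch, with a hypothetical `MockClient` standing in for the framework's providers:

```python
import os


class MockClient:
    """Stand-in provider used when live mode is off or no API key is available."""

    def complete(self, prompt: str) -> str:
        return f"[mock] {prompt}"


def get_client():
    """Pick a provider from the environment; default to the mock provider."""
    provider = os.environ.get("IOA_PROVIDER", "mock")
    live = os.environ.get("IOA_LIVE") == "1"
    if not live or provider == "mock":
        return MockClient()
    raise NotImplementedError(f"live client for {provider!r} not shown in this sketch")


os.environ["IOA_PROVIDER"] = "mock"
print(get_client().complete("ping"))  # [mock] ping
```

Defaulting to the mock provider keeps the smoke test runnable with no credentials, which is why the first command above needs no API key.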

4. Run Multi-Agent Quorum

Run the vendor-neutral roundtable demo:

python examples/20_roundtable/roundtable_quorum.py "Analyze this code (ok)"

Features:

  • 3-agent quorum voting
  • 2-of-3 approval threshold
  • Vendor-neutral consensus
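The 2-of-3 threshold reduces to counting approvals. A sketch of the voting rule (the `quorum_decision` helper is illustrative, not the ioa-core API):

```python
def quorum_decision(votes: list[bool], threshold: int = 2) -> str:
    """Approve when at least `threshold` of the agents vote yes."""
    approvals = sum(votes)
    return "approved" if approvals >= threshold else "rejected"


print(quorum_decision([True, True, False]))   # approved
print(quorum_decision([True, False, False]))  # rejected
```

With three agents and a threshold of two, no single vendor's vote can decide the outcome on its own, which is the point of a vendor-neutral quorum.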

Examples

IOA Core includes six working examples:

1. Bootstrap

File: examples/00_bootstrap/boot_project.py

Create a new IOA project with configuration and schemas.

python examples/00_bootstrap/boot_project.py my-ai-system

2. Workflows

File: examples/10_workflows/run_workflow.py

Run governed workflows with policy enforcement and audit logging.

python examples/10_workflows/run_workflow.py

3. Roundtable

File: examples/20_roundtable/roundtable_quorum.py

Multi-agent consensus with vendor-neutral quorum policy.

python examples/20_roundtable/roundtable_quorum.py "Your task (ok)"

4. Doctor

File: examples/30_doctor/doctor_check.py

System health check and environment validation.

python examples/30_doctor/doctor_check.py

5. Providers

File: examples/40_providers/provider_smoketest.py

Test LLM provider connectivity and configuration.

IOA_PROVIDER=mock python examples/40_providers/provider_smoketest.py

6. Ollama Turbo

File: examples/50_ollama/turbo_mode_demo.py

Local model optimization with turbo mode.

python examples/50_ollama/turbo_mode_demo.py turbo_cloud

API Keys Setup (Optional)

For live testing with real LLM providers, set up your API keys:

# Set API keys
export OPENAI_API_KEY="sk-your-openai-key"
export ANTHROPIC_API_KEY="sk-ant-your-anthropic-key"
export XAI_API_KEY="xai-your-xai-key"
export GOOGLE_API_KEY="your-google-key"
export DEEPSEEK_API_KEY="your-deepseek-key"

# Enable live mode
IOA_LIVE=1 python examples/10_workflows/run_workflow.py
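Before enabling live mode, it can help to see which keys are actually set. A small sketch using the environment variable names listed above (the `configured_providers` helper is not part of ioa-core):

```python
import os

# Environment variable names taken from the API keys section above.
PROVIDER_KEYS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "xai": "XAI_API_KEY",
    "gemini": "GOOGLE_API_KEY",
    "deepseek": "DEEPSEEK_API_KEY",
}


def configured_providers() -> list[str]:
    """Return the providers whose API key is present in the environment."""
    return sorted(name for name, var in PROVIDER_KEYS.items() if os.environ.get(var))


print(configured_providers())
```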

What You Get

IOA Core v2.5.2 provides:

  • ✅ Working governance framework
  • ✅ Immutable audit chains with cryptographic verification
  • ✅ System Laws enforcement at runtime
  • ✅ Support for six LLM providers through a unified interface
  • ✅ Memory fabric (hot & cold storage)
  • ✅ Vendor-neutral quorum policy for multi-agent consensus
  • ✅ Comprehensive examples and documentation
  • ✅ CLI interface for system management
  • ✅ Production-ready architecture
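The hot-and-cold idea behind the memory fabric can be sketched as a two-tier store: a small hot tier in front of a larger cold tier, with the oldest hot entries demoted on overflow. The `MemoryFabric` class here is a hypothetical illustration, not the ioa-core implementation.

```python
class MemoryFabric:
    """Two-tier store sketch: a bounded hot dict in front of a cold dict."""

    def __init__(self, hot_capacity: int = 2):
        self.hot: dict[str, str] = {}
        self.cold: dict[str, str] = {}
        self.hot_capacity = hot_capacity

    def put(self, key: str, value: str) -> None:
        if len(self.hot) >= self.hot_capacity:
            # Demote the oldest hot entry (dicts preserve insertion order).
            oldest = next(iter(self.hot))
            self.cold[oldest] = self.hot.pop(oldest)
        self.hot[key] = value

    def get(self, key: str):
        return self.hot.get(key) or self.cold.get(key)


fabric = MemoryFabric()
for i in range(3):
    fabric.put(f"k{i}", f"v{i}")
print(sorted(fabric.hot), sorted(fabric.cold))  # ['k1', 'k2'] ['k0']
```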

Next Steps



Ready to start building? Work through the Quick Start above, then dig into the six examples to see governance, quorum, and provider configuration in depth.