Requirements

  • Python 3.10 or higher
  • API key for at least one LLM provider

Install from PyPI

pip install synkro

API Keys

Synkro supports multiple LLM providers. Set the API key for your preferred provider:
export OPENAI_API_KEY="sk-..."
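The Anthropic and Google examples below need their own keys. The exact variable names Synkro reads are not shown here; the conventional SDK variables are sketched below as an assumption — confirm the names against your provider's documentation:

```shell
# Conventional variable names (assumed — verify for your provider)
export ANTHROPIC_API_KEY="sk-ant-..."
export GOOGLE_API_KEY="..."
```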

Verify Installation

import synkro
print(synkro.__version__)

Choose Your Models

from synkro import create_pipeline
from synkro.models import OpenAI, Anthropic, Google

# OpenAI
pipeline = create_pipeline(
    model=OpenAI.GPT_4O_MINI,
    grading_model=OpenAI.GPT_4O,
)

# Anthropic
pipeline = create_pipeline(
    model=Anthropic.CLAUDE_SONNET,
    grading_model=Anthropic.CLAUDE_OPUS,
)

# Google
pipeline = create_pipeline(
    model=Google.GEMINI_25_FLASH,
    grading_model=Google.GEMINI_25_PRO,
)

Local Models

Run with Ollama or any OpenAI-compatible endpoint:
from synkro import create_pipeline
from synkro.models import Local

# Ollama
pipeline = create_pipeline(model=Local.OLLAMA("llama3.2"))

# vLLM
pipeline = create_pipeline(model=Local.VLLM("mistral-7b"))

# Custom endpoint
pipeline = create_pipeline(
    model=Local.CUSTOM("my-model", endpoint="http://localhost:8080")
)