PromptRails

Local Emulator

Run a local PromptRails API emulator for development and testing — like LocalStack for AWS, but for PromptRails.

PromptRails Local is an in-memory API emulator that lets you develop and test against the PromptRails API without a real backend.

All data lives in memory and resets on restart. It comes pre-loaded with example agents, prompts, credentials, and LLM models so you can start testing immediately.

Quick Start

docker run -p 8080:8080 bahattincinic/promptrails-local

The emulator starts with seed data and is ready to use immediately.

Connect Your SDK

Point any PromptRails SDK to the emulator by changing the base URL:

Python

from promptrails import PromptRails
 
client = PromptRails(api_key="test-key", base_url="http://localhost:8080")
 
agents = client.agents.list()
for agent in agents.data:
    print(f"{agent.name} ({agent.type})")
 
# Execute an agent (returns simulated response)
result = client.agents.execute(
    "39wNZZu78VawB207IOPonkoP38J",
    input={"topic": "AI agents"}
)
print(result.status)  # "completed"

JavaScript / TypeScript

import { PromptRails } from '@promptrails/sdk'
 
const client = new PromptRails({
  apiKey: 'test-key',
  baseUrl: 'http://localhost:8080',
})
 
const agents = await client.agents.list()
const result = await client.agents.execute('39wNZZu78VawB207IOPonkoP38J', {
  input: { topic: 'AI agents' },
})

Go

client := promptrails.NewClient("test-key",
    promptrails.WithBaseURL("http://localhost:8080"))
 
agents, _ := client.Agents.List(ctx, nil)
result, _ := client.Agents.Execute(ctx, "39wNZZu78VawB207IOPonkoP38J",
    &promptrails.ExecuteAgentParams{
        Input: map[string]any{"topic": "AI agents"},
    })

Pre-loaded Seed Data

The emulator starts with example data so you can immediately test:

| Resource | Count | Details |
|---|---|---|
| Agents | 6 | `simple`, `chain`, `multi_agent`, `approval` types |
| Agent Versions | 15 | Multiple versions per agent |
| Prompts | 8 | Various templates with `{{ variable }}` syntax |
| Prompt Versions | 12 | With system/user prompts and input schemas |
| Data Sources | 2 | PostgreSQL examples |
| Credentials | 4 | OpenAI, Gemini, PostgreSQL, Linear |
| LLM Models | 42 | Full catalog (OpenAI, Anthropic, Gemini, DeepSeek, xAI, etc.) |
| MCP Tools | 6 | Builtin, datasource, remote, API types |
| Guardrails | 1 | PII redaction example |
| Memories | 4 | Fact, procedure, semantic types |

Supported Endpoints

| Resource | CRUD | Execute | Notes |
|---|---|---|---|
| Agents | Yes | Yes (simulated) | + versions, promote, preview |
| Prompts | Yes | Yes (simulated) | + versions, promote, preview, run |
| Executions | Read | Auto-created | Created by agent execute |
| Data Sources | Yes | Mock results | + versions |
| Credentials | Yes | | No real validation |
| Chat Sessions | Yes | Simulated replies | Multi-turn tracking |
| LLM Models | Read | | From seed catalog |
| Traces | Read | Auto-created | Created by executions |
| Scores | Yes | | + score configs |
| Approvals | Yes | | Approve/reject flow |
| Webhook Triggers | Yes | Yes | Token-based hook endpoint |
| MCP Tools | Yes | | + templates |
| Guardrails | Yes | | |
| Memories | Yes | | + search |

Custom Fixtures

Load your own test data from a directory:

# Default seed + your custom data
docker run -p 8080:8080 \
  -v ./my-fixtures:/fixtures \
  -e FIXTURES=/fixtures \
  bahattincinic/promptrails-local
 
# Only your data (no default seed)
docker run -p 8080:8080 \
  -v ./my-fixtures:/fixtures \
  -e SEED=false \
  -e FIXTURES=/fixtures \
  bahattincinic/promptrails-local

Place JSON files in your fixtures directory — all files are optional:

my-fixtures/
  agents.json
  agent_versions.json
  prompts.json
  prompt_versions.json
  credentials.json
  llm_models.json
  data_sources.json
  mcp_tools.json
  guardrails.json
  memories.json

See the fixtures documentation for the complete file format reference.
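As an illustrative sketch only (the authoritative schema is in the fixtures documentation), an `agents.json` fixture could look something like the following. The field names and top-level shape here are guesses based on the SDK examples above (`agent.name`, `agent.type`) and the seeded agent types; the `id` value is hypothetical.

```json
[
  {
    "id": "agent-local-1",
    "name": "My Test Agent",
    "type": "simple"
  }
]
```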

Configuration

| Flag | Env | Default | Description |
|---|---|---|---|
| `--port` | `PORT` | `8080` | Server port |
| `--seed` | `SEED` | `true` | Load built-in seed data |
| `--fixtures` | `FIXTURES` | | Load fixtures from a directory |
| `--log-level` | `LOG_LEVEL` | `info` | `debug`, `info`, `warn`, `error` |
| `--cors-origins` | `CORS_ORIGINS` | `*` | CORS allowed origins |

Admin Endpoints

Reset state between test runs:

# Reset all data and reload seed
curl -X POST http://localhost:8080/admin/reset
 
# View store statistics
curl http://localhost:8080/admin/store/stats
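The same endpoints can be wrapped in small test helpers. The sketch below assumes an emulator running on `localhost:8080`; the `/admin/reset` and `/admin/store/stats` paths come from the docs above, while the helper names are illustrative.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080"


def admin_request(path: str, method: str = "GET") -> urllib.request.Request:
    # Build a request against one of the emulator's admin endpoints.
    return urllib.request.Request(f"{BASE_URL}{path}", method=method)


def reset_emulator() -> None:
    # Wipe all in-memory state and reload the default seed data.
    urllib.request.urlopen(admin_request("/admin/reset", method="POST"))


def store_stats() -> dict:
    # Fetch current object counts from the in-memory store.
    with urllib.request.urlopen(admin_request("/admin/store/stats")) as resp:
        return json.load(resp)
```

Calling `reset_emulator()` from, say, a pytest `autouse` fixture gives every test a freshly seeded store.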

Use in CI/CD

# GitHub Actions example
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      promptrails:
        image: bahattincinic/promptrails-local
        ports:
          - 8080:8080
    steps:
      - uses: actions/checkout@v4
      - run: pytest tests/ -v
        env:
          PROMPTRAILS_BASE_URL: http://localhost:8080

Authentication

The emulator accepts any value for the X-API-Key header — no real authentication is performed. All data lives in a single flat namespace with no workspace isolation.

Installation Options

| Method | Command |
|---|---|
| Docker | `docker run -p 8080:8080 bahattincinic/promptrails-local` |
| Docker Compose | See `docker-compose.yml` |
| Go Install | `go install github.com/promptrails/promptrails-local@latest` |
| Binary | GitHub Releases |