# Local Emulator

PromptRails Local is an in-memory API emulator that lets you develop and test against the PromptRails API without a real backend. Think of it as LocalStack for AWS, but for PromptRails.

All data lives in memory and resets on restart. The emulator comes pre-loaded with example agents, prompts, credentials, and LLM models so you can start testing immediately.
## Quick Start

```shell
docker run -p 8080:8080 bahattincinic/promptrails-local
```

The emulator starts with seed data and is ready to use:
- API: http://localhost:8080/api/v1
- Interactive Docs: http://localhost:8080/docs (Scalar)
- Health Check: http://localhost:8080/health
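Scripts that start the container and immediately run tests can race the server's startup. A small readiness check against the `/health` endpoint avoids that; the sketch below uses only the Python standard library and assumes `/health` returns HTTP 200 once the emulator is ready:

```python
import time
import urllib.request
from urllib.error import URLError


def wait_for_emulator(base_url="http://localhost:8080", timeout=30.0):
    """Poll the emulator's /health endpoint until it answers, or give up."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(f"{base_url}/health", timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (URLError, OSError):
            time.sleep(0.25)  # not up yet; retry shortly
    return False
```

Call `wait_for_emulator()` at the top of your test suite (or CI step) before issuing any API requests.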
## Connect Your SDK
Point any PromptRails SDK to the emulator by changing the base URL:
### Python

```python
from promptrails import PromptRails

client = PromptRails(api_key="test-key", base_url="http://localhost:8080")

agents = client.agents.list()
for agent in agents.data:
    print(f"{agent.name} ({agent.type})")

# Execute an agent (returns a simulated response)
result = client.agents.execute(
    "39wNZZu78VawB207IOPonkoP38J",
    input={"topic": "AI agents"},
)
print(result.status)  # "completed"
```

### JavaScript / TypeScript
```typescript
import { PromptRails } from '@promptrails/sdk'

const client = new PromptRails({
  apiKey: 'test-key',
  baseUrl: 'http://localhost:8080',
})

const agents = await client.agents.list()

const result = await client.agents.execute('39wNZZu78VawB207IOPonkoP38J', {
  input: { topic: 'AI agents' },
})
```

### Go
```go
client := promptrails.NewClient("test-key",
    promptrails.WithBaseURL("http://localhost:8080"))

agents, _ := client.Agents.List(ctx, nil)

result, _ := client.Agents.Execute(ctx, "39wNZZu78VawB207IOPonkoP38J",
    &promptrails.ExecuteAgentParams{
        Input: map[string]any{"topic": "AI agents"},
    })
```

## Pre-loaded Seed Data
The emulator starts with example data so you can immediately test:
| Resource | Count | Details |
|---|---|---|
| Agents | 6 | simple, chain, multi_agent, approval types |
| Agent Versions | 15 | Multiple versions per agent |
| Prompts | 8 | Various templates with {{ variable }} syntax |
| Prompt Versions | 12 | With system/user prompts and input schemas |
| Data Sources | 2 | PostgreSQL examples |
| Credentials | 4 | OpenAI, Gemini, PostgreSQL, Linear |
| LLM Models | 42 | Full catalog (OpenAI, Anthropic, Gemini, DeepSeek, xAI, etc.) |
| MCP Tools | 6 | Builtin, datasource, remote, API types |
| Guardrails | 1 | PII redaction example |
| Memories | 4 | Fact, procedure, semantic types |
## Supported Endpoints
| Resource | CRUD | Execute | Notes |
|---|---|---|---|
| Agents | Yes | Yes (simulated) | + versions, promote, preview |
| Prompts | Yes | Yes (simulated) | + versions, promote, preview, run |
| Executions | Read | Auto-created | Created by agent execute |
| Data Sources | Yes | Mock results | + versions |
| Credentials | Yes | — | No real validation |
| Chat Sessions | Yes | Simulated replies | Multi-turn tracking |
| LLM Models | Read | — | From seed catalog |
| Traces | Read | Auto-created | Created by executions |
| Scores | Yes | — | + score configs |
| Approvals | Yes | — | Approve/reject flow |
| Webhook Triggers | Yes | Yes | Token-based hook endpoint |
| MCP Tools | Yes | — | + templates |
| Guardrails | Yes | — | |
| Memories | Yes | — | + search |
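Endpoints not yet wrapped by your SDK version can be hit over plain HTTP. Below is a minimal standard-library helper; the `/api/v1` base path comes from the Quick Start section, but the individual resource paths in the comments (e.g. `/api/v1/executions`) are assumptions following common REST conventions, so check the interactive docs at `/docs` for the real routes:

```python
import json
import urllib.request


def emulator_api(method, path, body=None,
                 base_url="http://localhost:8080", api_key="test-key"):
    """Send a JSON request to the emulator and decode the JSON response."""
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(
        base_url + path, data=data, method=method,
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())


# Executing an agent auto-creates an execution and a trace, which can
# then be read back (paths assumed; verify against /docs):
#   emulator_api("POST", "/api/v1/agents/<agent-id>/execute",
#                {"input": {"topic": "AI agents"}})
#   emulator_api("GET", "/api/v1/executions")
```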
## Custom Fixtures
Load your own test data from a directory:
```shell
# Default seed + your custom data
docker run -p 8080:8080 \
  -v ./my-fixtures:/fixtures \
  -e FIXTURES=/fixtures \
  bahattincinic/promptrails-local

# Only your data (no default seed)
docker run -p 8080:8080 \
  -v ./my-fixtures:/fixtures \
  -e SEED=false \
  -e FIXTURES=/fixtures \
  bahattincinic/promptrails-local
```

Place JSON files in your fixtures directory — all files are optional:
```
my-fixtures/
  agents.json
  agent_versions.json
  prompts.json
  prompt_versions.json
  credentials.json
  llm_models.json
  data_sources.json
  mcp_tools.json
  guardrails.json
  memories.json
```
See the fixtures documentation for the complete file format reference.
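If you generate fixtures programmatically, a short script can scaffold the directory. The field names below (`id`, `name`, `type`, `provider`) are illustrative assumptions only; the fixtures documentation defines the authoritative schema:

```python
import json
from pathlib import Path

# Hypothetical fixture content; real field names are defined
# by the fixtures documentation, not by this sketch.
FIXTURES = {
    "agents.json": [
        {"id": "agent-1", "name": "Summarizer", "type": "simple"},
    ],
    "credentials.json": [
        {"id": "cred-1", "name": "OpenAI", "provider": "openai"},
    ],
}


def write_fixtures(directory="my-fixtures"):
    """Write each fixture file as pretty-printed JSON and list what was written."""
    out = Path(directory)
    out.mkdir(exist_ok=True)
    for filename, records in FIXTURES.items():
        (out / filename).write_text(json.dumps(records, indent=2))
    return sorted(FIXTURES)


print(write_fixtures())  # ['agents.json', 'credentials.json']
```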
## Configuration

| Flag | Env | Default | Description |
|---|---|---|---|
| --port | PORT | 8080 | Server port |
| --seed | SEED | true | Load built-in seed data |
| --fixtures | FIXTURES | — | Load fixtures from directory |
| --log-level | LOG_LEVEL | info | debug, info, warn, error |
| --cors-origins | CORS_ORIGINS | * | CORS allowed origins |
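The same settings map onto the Docker Compose option mentioned under Installation. The compose file below is a sketch, not the project's actual docker-compose.yml; adjust the image tag, environment variables, and volume paths to your setup:

```yaml
services:
  promptrails:
    image: bahattincinic/promptrails-local
    ports:
      - "8080:8080"
    environment:
      LOG_LEVEL: debug
      FIXTURES: /fixtures
    volumes:
      - ./my-fixtures:/fixtures
```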
## Admin Endpoints

Reset state between test runs:

```shell
# Reset all data and reload seed
curl -X POST http://localhost:8080/admin/reset

# View store statistics
curl http://localhost:8080/admin/store/stats
```

## Use in CI/CD
```yaml
# GitHub Actions example
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      promptrails:
        image: bahattincinic/promptrails-local
        ports:
          - 8080:8080
    steps:
      - uses: actions/checkout@v4
      - run: pytest tests/ -v
        env:
          PROMPTRAILS_BASE_URL: http://localhost:8080
```

## Authentication
The emulator accepts any value for the X-API-Key header — no real authentication is performed. All data lives in a single flat namespace with no workspace isolation.
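Because any API key is accepted and all state is in-memory, test suites can freely reset the emulator between runs via the `/admin/reset` endpoint. A minimal setup helper (standard library only) might look like:

```python
import urllib.request


def reset_emulator(base_url="http://localhost:8080"):
    """POST /admin/reset to wipe all state and reload the seed data."""
    req = urllib.request.Request(f"{base_url}/admin/reset", method="POST")
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status == 200

# Call this from your test setup (for example, a pytest autouse
# fixture) so every test starts from the same seed data.
```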
## Installation Options
| Method | Command |
|---|---|
| Docker | docker run -p 8080:8080 bahattincinic/promptrails-local |
| Docker Compose | See docker-compose.yml |
| Go Install | go install github.com/promptrails/promptrails-local@latest |
| Binary | GitHub Releases |