AI Router
Route your LLM calls through the AI Router with a single base URL change. Zero vendor lock-in: always run on the best model at the lowest cost for your use case.
Observability
Instrument your code with OpenTelemetry to capture traces, logs, and metrics for every LLM call, agent step, and tool use.
Mastra is a TypeScript framework for building AI-powered applications with pipelines, agents, and workflows. By connecting Mastra to Orq.ai's AI Router, you route every model call through a single managed endpoint, adding production-grade model routing, cost control, and observability to experimental AI applications.
Here’s a complete example of creating and running a Mastra agent through Orq.ai:
```typescript
import { Agent } from "@mastra/core/agent";

// Create agent
export const assistantAgent = new Agent({
  id: "assistant",
  name: "Assistant",
  instructions: "You are a helpful assistant that explains complex concepts simply.",
  model: {
    id: "openai/gpt-4o",
    url: "https://api.orq.ai/v3/router",
    apiKey: process.env.ORQ_API_KEY,
  },
});

// Run the agent
async function main() {
  const result = await assistantAgent.generate("Explain quantum computing in simple terms");
  console.log(result.text);
}

main().catch(console.error);
```
Tool Calling Limitation: Mastra’s tool call format in the Responses API currently has schema incompatibilities with the AI Router. Basic agent usage works, but tool-based workflows require using the Observability integration or direct provider access.
Integrate Mastra with Orq.ai’s observability to gain complete insights into pipeline execution, agent performance, workflow orchestration, and system reliability using OpenTelemetry.
In your Mastra server configuration, enable telemetry export. The OTLP endpoint and headers are read from the environment variables set earlier.
```typescript
import { Mastra } from "@mastra/core";

export const mastra = new Mastra({
  // ... other config
  telemetry: {
    serviceName: "my-app",
    enabled: true,
    export: {
      type: "otlp",
      // endpoint and headers are picked up from environment variables
    },
  },
});
```
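For reference, these are the environment variables the OTLP exporter typically reads. The variable names are standard OpenTelemetry settings; the endpoint and header values shown are placeholders, not confirmed Orq.ai values — use the values from your Orq.ai workspace:

```shell
# Standard OTLP exporter settings picked up by Mastra's telemetry export.
# Both values below are placeholders for illustration only.
export OTEL_EXPORTER_OTLP_ENDPOINT="<your-orq-otlp-endpoint>"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <your-orq-api-key>"
```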