| Total Context | Max Output | Input Price | Output Price | Cache Read | Cache Write | Input Audio | Input Audio Cache |
|---|---|---|---|---|---|---|---|
| 200K | 131K | -- | -- | -- | -- | -- | -- |
Pony Alpha
openrouter/pony-alpha
Pony is a cutting-edge foundation model with strong performance in coding, agentic workflows, reasoning, and roleplay, making it well suited for hands-on development and real-world use.
Note: All prompts and completions for this model are logged by the provider and may be used to improve the model.
Providers for Pony Alpha
OpenRouter routes requests to the best providers that are able to handle your prompt size and parameters, with fallbacks to maximize uptime.
When an error occurs in an upstream provider, we can recover by routing to another healthy provider, if your request filters allow it. You can access uptime data programmatically through the Endpoints API.
Learn more about our load balancing and customization options.
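As a sketch of consuming that uptime data client-side, the helper below filters an endpoint list down to healthy providers. The response shape used here (`data.endpoints`, `provider_name`, a numeric `status`) is an assumption about the Endpoints API schema, not a verified contract; check the actual API reference before relying on these field names.

```typescript
// Sketch: pick usable endpoints from an Endpoints API response.
// The field names below (data.endpoints, provider_name, status) are
// ASSUMPTIONS about the response schema, not a verified contract.
interface EndpointInfo {
  provider_name: string;
  status: number; // assumed: >= 0 healthy, negative degraded
}

interface EndpointsResponse {
  data: { endpoints: EndpointInfo[] };
}

// Keep only endpoints reported as healthy, so a client could
// restrict its own fallback list to them.
function healthyProviders(res: EndpointsResponse): string[] {
  return res.data.endpoints
    .filter((e) => e.status >= 0)
    .map((e) => e.provider_name);
}

// Example with mock data (a real call would GET the model's
// endpoints route with your API key):
const mock: EndpointsResponse = {
  data: {
    endpoints: [
      { provider_name: "alpha-cloud", status: 0 },
      { provider_name: "beta-host", status: -3 },
    ],
  },
};
```

A client-side filter like this complements, rather than replaces, OpenRouter's own routing: the provider still handles fallbacks automatically.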
Sample code and API for Pony Alpha
OpenRouter normalizes requests and responses across providers for you.
OpenRouter supports reasoning-enabled models that can show their step-by-step thinking process. Use the reasoning parameter in your request to enable reasoning, and access the reasoning_details array in the response to see the model's internal reasoning before the final answer. When continuing a conversation, preserve the complete reasoning_details when passing messages back to the model so it can continue reasoning from where it left off. Learn more about reasoning tokens.
In the examples below, the OpenRouter-specific headers are optional. Setting them allows your app to appear on the OpenRouter leaderboards.
import { OpenRouter } from "@openrouter/sdk";

const openrouter = new OpenRouter({
  apiKey: "<OPENROUTER_API_KEY>",
});

// Stream the response to get reasoning tokens in usage
const stream = await openrouter.chat.send({
  model: "openrouter/pony-alpha",
  messages: [
    {
      role: "user",
      content: "How many r's are in the word 'strawberry'?",
    },
  ],
  stream: true,
});

let response = "";
for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) {
    response += content;
    process.stdout.write(content);
  }
  // Usage information comes in the final chunk
  if (chunk.usage) {
    console.log("\nReasoning tokens:", chunk.usage.reasoningTokens);
  }
}

Using third-party SDKs
For information about using third-party SDKs and frameworks with OpenRouter, please see our frameworks documentation.
See the Request docs for all possible fields, and Parameters for explanations of specific sampling parameters.