• Skew Protection now supports prebuilt deployments

    Skew Protection can now be used with vercel deploy --prebuilt deployments.

    If your team builds locally and uploads with --prebuilt, you can now set a custom deploymentId in your next.config.js:

    next.config.js

    module.exports = {
      deploymentId: process.env.GIT_SHA || 'my-deployment-id',
    }

    This ID is written to routes-manifest.json and used by Vercel for Skew Protection routing. You control the ID lifecycle: reuse the same ID across multiple prebuilt deployments, or update it when deploying a new version.
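
    If your build environment does not set GIT_SHA, a minimal sketch (assuming git is available at build time) could derive the ID from the current commit instead:

    // next.config.js — a sketch assuming `git` is available at build time
    const { execSync } = require('node:child_process')

    let commitId = 'my-deployment-id' // fallback when no git metadata is available
    try {
      commitId = execSync('git rev-parse --short HEAD').toString().trim()
    } catch {}

    module.exports = {
      deploymentId: process.env.GIT_SHA || commitId,
    }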

    This feature enables Skew Protection support for the specific workflow of building applications locally and then uploading them to Vercel.

    Learn more about Skew Protection.

    Brooke Mosby

  • Vercel Agent investigations now available in Slack

    Anomaly alerts proactively monitor your application for usage or error anomalies. When we detect an issue, we send an alert by email, Slack, or webhook. Vercel Agent investigates anomaly alerts, digging into your logs and metrics to help you identify the root cause.

    With our updated Slack integration, investigations now appear directly in Slack alert messages as a threaded response. This eliminates the need to click into the Vercel dashboard and gives you context to triage the alert directly in Slack.

    This feature is available for teams using Observability Plus, with 10 investigations included at no additional cost.

    Learn more about Vercel Agent investigations.

    Julia S, Fabio B, Timo L, Malavika T

  • Tag-based cache invalidation now available for all responses

    Vercel's CDN now supports tag-based cache invalidation, giving you granular control over cached content across all frameworks and backends.

    Responses can now be tagged using the Vercel-Cache-Tag header with a comma-separated list of tags. Tags act as a cache organization mechanism: you can group related content and invalidate it together, rather than purging your entire cache when content changes.

    This complements the existing headers that cache responses on Vercel's CDN, such as Cache-Control, CDN-Cache-Control, and Vercel-CDN-Cache-Control, and exposes the same underlying technology that powers Next.js Incremental Static Regeneration (ISR) to any framework or backend.

    We recommend that Next.js applications continue to use Incremental Static Regeneration (ISR), which provides built-in cache tagging and invalidation without managing cache headers manually.
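
    For other frameworks and backends, a minimal sketch of a Vercel function with the Web-standard handler signature could tag its response like this (the file path, tag names, and cache lifetime are illustrative):

    // api/products.ts — illustrative file path, tag names, and cache lifetime
    export async function GET() {
      // Illustrative payload; in practice this comes from your data source.
      const products = [{ id: 1, name: 'Widget' }]

      return Response.json(products, {
        headers: {
          // Cache on Vercel's CDN for up to an hour...
          'Vercel-CDN-Cache-Control': 'max-age=3600',
          // ...and group the response under tags so related content can be
          // invalidated together instead of purging the entire cache.
          'Vercel-Cache-Tag': 'products,catalog',
        },
      })
    }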

    How it works

    After a response has a cache tag, you can invalidate it through dashboard settings, the Vercel CLI, the Function API, or the REST API.

    Vercel's CDN reads Vercel-Cache-Tag and strips it before sending the response to the client. If you apply cache tags via rewrites from a parent to a child project, and both projects belong to the same team, cached responses on the parent project also include the corresponding tags from the child project.

    This is available starting today on all plans at no additional cost. Read the cache invalidation documentation to learn more.

  • Introducing the vercel api CLI command

    vercel@50.5.1 adds a new api command, giving direct access to the full suite of Vercel APIs from your terminal.

    The api command provides a direct access point for AI agents to interact with Vercel through the CLI. Agents like Claude Code can access Vercel directly with no additional configuration required. If an agent has access to the environment and the Vercel CLI, it inherits the user's access permissions automatically.

    List available APIs with vercel api ls, build requests interactively with vercel api, or send requests directly with vercel api [endpoint] [options].
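
    As a hedged sketch of what this enables, a script (or an agent with shell access) could call the CLI and parse the response. This assumes you are already authenticated via vercel login, that the endpoint path maps to the REST API route for listing projects, and that the command prints the raw JSON response:

    import { execFileSync } from 'node:child_process'

    // Assumptions: `vercel login` has already been run, `vercel api` forwards
    // the request to the REST API (GET /v9/projects lists projects), and the
    // raw JSON response is printed to stdout.
    const output = execFileSync('vercel', ['api', '/v9/projects'], { encoding: 'utf8' })
    const { projects } = JSON.parse(output)
    console.log(projects.map((p: { name: string }) => p.name))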

    Get started with npx vercel@latest api --help.

  • Trinity Large Preview is on AI Gateway

    You can now access Trinity Large Preview via AI Gateway with no other provider accounts required.

    Trinity Large Preview is optimized for reasoning-intensive workloads, including math, coding tasks, and complex multi-step agent workflows. It is designed to handle extended multi-turn interactions efficiently while maintaining high inference throughput.

    To use this model, set model to arcee-ai/trinity-large-preview in the AI SDK:

    import { streamText } from 'ai'

    const result = streamText({
      model: 'arcee-ai/trinity-large-preview',
      prompt: `Implement a long-context reasoning benchmark with ingested documents,
        multi-step analysis, and generate conclusions.`,
    })

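    The returned result exposes the generated text as a stream. A minimal follow-up sketch, using the AI SDK's textStream iterable, prints the output as it arrives:

    // Stream the model's output to stdout as it is generated
    for await (const chunk of result.textStream) {
      process.stdout.write(chunk)
    }
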
    AI Gateway provides a unified API for calling models, tracking usage and cost, and configuring retries, failover, and performance optimizations for higher-than-provider uptime. It includes built-in observability, Bring Your Own Key support, and intelligent provider routing with automatic retries.

    Learn more about AI Gateway, view the AI Gateway model leaderboard or try it in our model playground.

  • Kimi K2.5 is live on AI Gateway

    You can now access Kimi K2.5 via AI Gateway with no other provider accounts required.

    Kimi K2.5 is Moonshot AI's most intelligent and versatile model yet, achieving open-source state-of-the-art performance across agent tasks, coding, visual understanding, and general intelligence. It has more advanced coding abilities compared to previous iterations, especially with frontend code quality and design expressiveness. This enables the creation of fully functional interactive user interfaces with dynamic layouts and animations.

    To use this model, set model to moonshotai/kimi-k2.5 in the AI SDK:

    import { streamText } from 'ai'

    const result = streamText({
      model: 'moonshotai/kimi-k2.5',
      prompt: `Build a playful task dashboard with animations, drag-and-drop chaos,
        infinite scroll, theme toggles, and production-ready frontend code.`,
    })

    AI Gateway provides a unified API for calling models, tracking usage and cost, and configuring retries, failover, and performance optimizations for higher-than-provider uptime. It includes built-in observability, Bring Your Own Key support, and intelligent provider routing with automatic retries.

    Learn more about AI Gateway, view the AI Gateway model leaderboard or try it in our model playground.

  • Qwen 3 Max Thinking now available on AI Gateway

    You can now access Qwen 3 Max Thinking via AI Gateway with no other provider accounts required.

    Qwen 3 Max Thinking integrates thinking and non-thinking modes for improved performance on complex reasoning tasks. The model autonomously selects and uses its built-in search, memory, and code interpreter tools during conversations without requiring manual tool selection. The tools reduce hallucinations and provide real-time information.

    To use this model, set model to alibaba/qwen3-max-thinking in the AI SDK:

    import { streamText } from 'ai'

    const { textStream } = await streamText({
      model: 'alibaba/qwen3-max-thinking',
      prompt: `Research a current topic, verify facts, remember a user preference,
        and include a short code snippet to support the explanation.`,
    })

    AI Gateway provides a unified API for calling models, tracking usage and cost, and configuring retries, failover, and performance optimizations for higher-than-provider uptime. It includes built-in observability, Bring Your Own Key support, and intelligent provider routing with automatic retries.

    Learn more about AI Gateway, view the AI Gateway model leaderboard or try it in our model playground.

  • Claude Code Max via AI Gateway, available now for Claude Code

    AI Gateway now supports the Claude Code Max subscription for the Claude Code CLI. This lets developers use their existing subscription with Anthropic models at no additional cost, while getting unified observability, usage tracking, and monitoring through Vercel's platform.

    Setup

    Set up your environment variables in your shell configuration file (~/.zshrc or ~/.bashrc):

    export ANTHROPIC_BASE_URL="https://ai-gateway.vercel.sh"
    export ANTHROPIC_CUSTOM_HEADERS="x-ai-gateway-api-key: Bearer your-ai-gateway-api-key"

    Replace your-ai-gateway-api-key with your actual AI Gateway API key.

    Start Claude Code

    claude

    Log in with your Claude subscription

    If you're not already logged in, Claude Code will prompt you to authenticate. Choose Option 1 - Claude account with subscription and log in with your Anthropic account.

    If you encounter issues, try logging out with claude /logout and logging in again.

    Your Claude Code requests now route through AI Gateway, giving you full visibility into usage patterns and costs while using your Max subscription.

    How it works

    When you configure Claude Code to use AI Gateway, Claude Code continues to authenticate with Anthropic: it still sends its Authorization header, and AI Gateway acts either as a passthrough proxy to Anthropic or, when it needs to fall back, as a router to other providers.

    Since the Authorization header is reserved for Claude subscription credentials, AI Gateway uses a separate header x-ai-gateway-api-key for its own authentication. This allows both auth mechanisms to coexist.
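
    Conceptually, both credentials travel on every request. The sketch below is illustrative only, not something you would normally write by hand: Claude Code constructs these requests itself, the /v1/messages path is assumed to mirror Anthropic's Messages API, and the model ID is a placeholder.

    // Illustrative only: Claude Code builds these requests for you.
    const claudeToken = process.env.CLAUDE_OAUTH_TOKEN!  // Claude subscription credential
    const gatewayKey = process.env.AI_GATEWAY_API_KEY!   // AI Gateway credential

    const res = await fetch('https://ai-gateway.vercel.sh/v1/messages', {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${claudeToken}`,          // reserved for Claude subscription auth
        'x-ai-gateway-api-key': `Bearer ${gatewayKey}`,  // AI Gateway's own auth header
        'content-type': 'application/json',
      },
      body: JSON.stringify({
        model: 'claude-sonnet-4-5',                      // placeholder model ID
        max_tokens: 256,
        messages: [{ role: 'user', content: 'Hello' }],
      }),
    })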

    Read more about how to configure Claude Code Max with AI Gateway in the docs.