When I started Vercel, my singular obsession was "time to deploy URL" in the fewest commands and the least config (`now`). Our next frontier: time to intelligence.

By building on the @aisdk, all you have to do is install the dependency:

`npm i ai@alpha @vercel/ai-sdk-gateway`

Then run `vc pull` for local development support. That's it.

To try or switch models, just swap a string, like `deepseek/deepseek-r1` → `xai/grok-3-beta`. No API tokens. No new accounts. Unified usage billing with observability.

Our AI Gateway comes from the lessons of building @v0 and powering many of the largest AI apps on Vercel's infra today. The realities today:

▫ AI labs move at breakneck speed. Your best play is to minimize switching costs and preserve option value without lock-in.
▫ Demand for AI is so extreme that even the largest clouds struggle with capacity issues that result in frequent rate-limiting errors.
▫ The infrastructure is not yet as mature, stable, and predictable as "software 1.0" cloud services. Backups and failovers are even more essential.
▫ As open models progress, the competition for best price/performance intensifies, because many vendors can serve the same weights. You will be able to name the open model ID and delegate the optimal vendor choice to the gateway.

We're super excited about the AI Gateway solving these challenges, and many more as it evolves. Please try it out and share your feedback!
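The failover point above can be sketched as a simple fallback chain. This is a hypothetical illustration of the idea, not the AI Gateway's actual API: `withFallback` and the `Call` type are made up for this sketch, and the model IDs are just the "provider/model" strings from the post.

```typescript
// Hypothetical sketch of gateway-style failover: try models in preference
// order and fall back when a provider is rate-limited or down. Model IDs
// are plain "provider/model" strings, so switching vendors is a one-string
// change. Nothing here is a real AI SDK function.

type Call = (prompt: string) => Promise<string>;

async function withFallback(
  models: Record<string, Call>, // model ID -> provider call
  order: string[],              // preferred model IDs, first healthy one wins
  prompt: string
): Promise<string> {
  let lastError: unknown;
  for (const id of order) {
    try {
      return await models[id](prompt);
    } catch (err) {
      lastError = err; // e.g. an HTTP 429 rate limit; try the next vendor
    }
  }
  throw lastError; // every vendor failed
}

// Demo with fake providers: the first is "rate-limited", the second succeeds.
const demo: Record<string, Call> = {
  "deepseek/deepseek-r1": async () => { throw new Error("429 rate limited"); },
  "xai/grok-3-beta": async (p) => `grok says: ${p}`,
};

withFallback(demo, ["deepseek/deepseek-r1", "xai/grok-3-beta"], "hi")
  .then((out) => console.log(out)); // → "grok says: hi"
```

The point of the sketch: because every model is addressed by a uniform string ID, the fallback order is just data, which is what lets a gateway make the vendor choice for you.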
Vercel AI Gateway (alpha):
• Built on the @aisdk 5 alpha
• Switch between ~100 AI models without API keys
• Handles auth, usage tracking, and more

vercel.com/blog/ai-gateway