Tweet by @cramforce

This is based on `resumable-stream`, which you can find on npm! A core feature is that while it uses Redis, the actual Redis usage is minimal unless you actually need to resume a stream (which should be rare in practice). It supports:

- Clients resuming streams on network interruption
- Multiple browser tabs following the same underlying stream
- Multiple users following the same stream

import { createResumableStreamContext } from "resumable-stream";
import { after } from "next/server";
import type { NextRequest } from "next/server";

const streamContext = createResumableStreamContext({
  waitUntil: after,
  // Optionally pass in your own Redis publisher and subscriber
});

export async function GET(req: NextRequest, { params }: { params: Promise<{ streamId: string }> }) {
  const { streamId } = await params;
  const resumeAt = req.nextUrl.searchParams.get("resumeAt");
  // makeTestStream (not shown in the snippet) is the factory that
  // creates the underlying stream when it does not already exist.
  const stream = await streamContext.resumableStream(
    streamId,
    makeTestStream,
    resumeAt ? parseInt(resumeAt, 10) : undefined
  );
  if (!stream) {
    return new Response("Stream is already done", {
      status: 422,
    });
  }
  return new Response(stream, {
    headers: {
      "Content-Type": "text/event-stream",
    },
  });
}
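The route above only covers the producing side; a client still has to notice a dropped connection and reconnect with `resumeAt`. Below is a minimal client-side sketch, assuming the route is mounted at `/api/stream/[streamId]` and that `resumeAt` counts characters already received — both are assumptions for illustration, not confirmed by the snippet.

```typescript
// Hypothetical client-side resume loop for the route above.
// ASSUMPTIONS: the route lives at /api/stream/[streamId] and
// `resumeAt` is a character offset; neither is confirmed by the snippet.

// Pure helper: build the request URL, adding resumeAt only when
// resuming (i.e. some characters were already received).
function buildStreamUrl(base: string, streamId: string, received: number): string {
  const url = new URL(`/api/stream/${streamId}`, base);
  if (received > 0) url.searchParams.set("resumeAt", String(received));
  return url.toString();
}

// Read the stream, reconnecting from the last received offset on failure.
async function followStream(
  base: string,
  streamId: string,
  onChunk: (text: string) => void,
  maxRetries = 3
): Promise<void> {
  let received = 0;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      const res = await fetch(buildStreamUrl(base, streamId, received));
      if (res.status === 422 || !res.body) return; // stream is already done
      const reader = res.body.pipeThrough(new TextDecoderStream()).getReader();
      for (;;) {
        const { done, value } = await reader.read();
        if (done) return;
        received += value.length;
        onChunk(value);
      }
    } catch {
      // Network hiccup: loop around and resume from `received`.
    }
  }
}
```

Because the server replays from the requested offset, the client only needs to track how much it has consumed; no sticky sessions or proprietary protocol are involved.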
Guillermo Rauch
@rauchg

The @vercel Chat SDK now features stream resumption. This makes AI conversations resilient to network hiccups and reloading or sharing a chat mid-generation. This is especially valuable for long responses (e.g.: Deep Research). No proprietary APIs, no sticky load balancing, just
