When Cloudflare says "94% test coverage," it's important to understand what that actually means. Their own README says "94% of the API has full or partial support" (load-bearing "partial" there). In other words, they are not passing 94% of the entire Next.js test suite (13,708 test cases). "94% of the API surface" means they wrote a 52-item checklist and gave themselves a score. This is "we have a function with that name." It's a cherry-picked vanity metric, and the checklist hides what's actually broken.

Take parallel routes: vinext tests 15 server-render cases. We test 90 across 27 directories, because the hard part is client-side: slot state retention, catch-all specificity, scoped revalidation, back/forward history. vinext implements none of these. The feature just straight up does not work. We've found the same pattern across the feature surface. On the real Next.js test suite, the numbers are 13% dev, 20% e2e, 10% production.

You can evidently throw an agent at this, but it's a good reminder that if you don't understand what you're building in the first place, you're going to have a hard time. Does the team shipping this actually understand what they've built? If you're parading "94% coverage" to the world while features are fundamentally broken past the happy path, either you know and you're being deliberately misleading, or you don't know, which is scary.
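As background on the parallel-routes point: in the Next.js App Router, each `@folder` defines a named slot that the parent layout receives as a prop and renders alongside `children`. A minimal layout (slot names here are illustrative, not vinext's or our test fixtures) looks something like:

```
app/
├── layout.tsx        // receives each @slot as a prop alongside children
├── page.tsx
├── @analytics/
│   ├── page.tsx
│   └── default.tsx   // fallback rendered when the slot has no match on a full page load
└── @team/
    └── page.tsx
```

Server-rendering this tree once is the easy half. The client-side behaviors listed above, like keeping each slot's state as the user navigates and restoring the right slot combination on back/forward, are where the real complexity lives, and where a checklist-style "supported" claim tells you nothing.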