SaaS • Developer tools • E-commerce
Pull Request Validation
Test every pull request against a live preview so reviewers know the change works in a browser — not just in the diff.
Challenge
Code review catches logic errors in diffs, but it cannot verify that a change actually works in the browser. Without E2E tests on pull requests, teams merge code that looks correct but breaks real user journeys — and the breakage is only discovered downstream in staging or production. Running the full test suite on every PR is often too slow or flaky to be practical, so most teams skip it entirely and rely on post-merge testing to catch issues.
Stably approach
Stably runs your test suite against every pull request using preview deployment URLs. The CLI can also generate new tests directly from PR context and git diffs using `stably create`, so test coverage grows alongside the feature being built. Tests execute in parallel on cloud infrastructure, and results are posted back to the PR as status checks, with detailed failure reports that include screenshots. Reviewers see exactly what works and what broke before they approve.
What changes
Every PR gets E2E results before the reviewer even opens the diff
Stably's GitHub Action triggers on `pull_request` events. By the time the reviewer opens the PR, test results are already posted — with a green checkmark or a failure report showing exactly which user flow broke, including a screenshot of the broken state.
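For illustration, a PR-triggered workflow could look like the minimal sketch below. The action name (`stably/run-tests-action`), its inputs, and the secret name are placeholders for this sketch, not Stably's documented interface.

```yaml
# Minimal sketch of a PR-triggered E2E workflow.
# Action name, inputs, and secret name are placeholders.
name: pr-e2e
on:
  pull_request:

jobs:
  e2e:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Hypothetical step: run the suite and post results back to the PR
      # as a status check, with screenshots attached on failure.
      - uses: stably/run-tests-action@v1   # placeholder name
        with:
          api-key: ${{ secrets.STABLY_API_KEY }}
```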
New feature? `stably create` generates tests from the diff
Run `stably create` on a PR that adds a settings page. The CLI analyzes the diff, understands the new routes and components, and generates a test that navigates to the settings page, fills in the form, saves, and verifies the changes persisted. Coverage grows with each PR.
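As a sketch, assuming the Stably CLI is already installed on the runner (the page describes the `stably create` command but not its installation method or flags), a CI step that generates tests from the PR's diff might look like this:

```yaml
# Illustrative only: assumes the Stably CLI is on the runner's PATH.
- uses: actions/checkout@v4
  with:
    fetch-depth: 0   # full history so the diff against the base branch is available
- name: Generate tests from the PR diff
  run: stably create   # analyzes the diff and proposes new E2E tests
```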
Preview deploys are tested automatically, not manually clicked through
Your Vercel or Netlify preview URL is passed as `BASE_URL` to the Stably action. The same tests that validate staging now validate the exact build artifact from this PR — no manual "let me open the preview and click around."
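A hedged sketch of the wiring: a hypothetical earlier step (here called `preview`) resolves the Vercel or Netlify preview URL for this commit, and the test step receives it as `BASE_URL`. The action name and step output are again placeholders.

```yaml
# Sketch: "preview" is a hypothetical prior step whose "url" output holds
# the Vercel/Netlify preview URL for this PR's build.
- uses: stably/run-tests-action@v1   # placeholder name
  env:
    BASE_URL: ${{ steps.preview.outputs.url }}   # point tests at this PR's exact build
  with:
    api-key: ${{ secrets.STABLY_API_KEY }}
```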
Post-merge surprises drop dramatically
When tests run on every PR, regressions are caught in the branch that introduced them — not three merges later when the staging build breaks and nobody knows which PR caused it.
When this is the right fit
- Bugs are found after merging, requiring follow-up PRs to fix
- E2E tests only run on the main branch or nightly, not per PR
- Reviewers approve PRs without verifying behavior in a browser
- Preview deployments exist but are not tested automatically