Feature Spec Template
Use this Markdown starter to align product, engineering, QA, and AI coding output. The sections below explain how to fill it out in a way that survives real implementation pressure.
```markdown
# Feature Spec Template

## Goal
- ...

## Non-goals
- ...

## Acceptance Criteria
- [ ] Given ...
- [ ] When ...
- [ ] Then ...

## Edge Cases
- null/empty:
- duplicates/idempotency:
- concurrency/race:
- permissions/visibility:

## Output / Deliverables
- API / DB / UI changes:
- Test checklist:
- Rollback notes:
```
Template usage path
- Copy the Markdown template into the ticket, pull request, or repo.
- Fill owner, non-goals, acceptance criteria, evidence, and rollback notes.
- Ask product, engineering, and QA to review different failure modes.
- Update the spec in the same change when implementation behavior shifts.
What a filled template should look like
A template becomes useful only after it carries a real decision, owner, and evidence. Use the blank template above for structure, then aim for this level of specificity before implementation starts.
```markdown
## Refund retry behavior

Owner: Billing platform

Decision:
- retries reuse the same idempotency key

Evidence:
- duplicate request test
- provider-timeout fixture
- support UI screenshot
```
If the copied template still has empty owner, evidence, or rollback fields, keep it in review.
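The idempotency decision in the refund example ("retries reuse the same idempotency key") can be sketched in a few lines. This is an illustrative sketch only: `RefundClient`, the in-memory store, and the key format are assumptions, not part of any real billing provider's API.

```python
import uuid

class RefundClient:
    """Hypothetical provider client that deduplicates by idempotency key."""

    def __init__(self):
        self._processed = {}  # idempotency_key -> original refund result

    def refund(self, charge_id, amount_cents, idempotency_key):
        # A retry with the same key returns the original result
        # instead of issuing a second refund.
        if idempotency_key in self._processed:
            return self._processed[idempotency_key]
        result = {
            "charge_id": charge_id,
            "amount_cents": amount_cents,
            "refund_id": str(uuid.uuid4()),
        }
        self._processed[idempotency_key] = result
        return result

client = RefundClient()
key = "refund:ch_123:2026-05-06"  # hypothetical key format

first = client.refund("ch_123", 500, key)
retry = client.refund("ch_123", 500, key)  # e.g. after a provider timeout
assert first["refund_id"] == retry["refund_id"]  # no duplicate refund issued
```

Note how the "duplicate request test" evidence item maps directly onto the final assertion: the retry path is the thing the spec asks you to prove.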
When to use this template
- New feature delivery with unclear acceptance boundaries.
- Refactors that may change user-visible behavior.
- Cross-team changes where API, DB, and UI must stay aligned.
- AI-assisted coding tasks that need strict scope control.
How to fill it correctly
The Goal section should be one sentence describing the user or business outcome. Non-goals set explicit boundaries to prevent scope drift. Acceptance Criteria are binary Given/When/Then statements. The Edge Cases section covers null, duplicate, permission, and concurrency paths. Deliverables list the exact code areas, tests, and rollback notes.
Write so a reviewer can derive test cases without asking clarifying questions.
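As a sketch of what "derive test cases" means in practice, a criterion like "Given no filter, When the list view loads, Then the last 30 days are shown" translates mechanically into a check. The function name `default_date_range` is illustrative, not from a real codebase.

```python
from datetime import date, timedelta

def default_date_range(today):
    """Illustrative: with no explicit filter, default to the last 30 days."""
    return today - timedelta(days=30), today

# Given: no filter supplied
# When: the list view computes its date range
start, end = default_date_range(date(2026, 5, 6))
# Then: exactly the last 30 days are selected
assert (end - start).days == 30
```

If a criterion cannot be rewritten this way, it is usually a sign the wording is not yet binary.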
Common mistakes
- Using vague language like "handle properly" or "optimize experience".
- Listing implementation tasks but skipping acceptance outcomes.
- Not documenting error and permission behavior.
- Shipping state changes without rollback instructions.
Pre-implementation review checklist
- Every criterion maps to a test case.
- Out-of-scope items are explicit and agreed.
- API/DB impact is documented and backward-compatible.
- Rollout and rollback conditions are actionable.
Need examples? See spec template examples and spec review checklist.
Weak vs strong feature spec
Weak spec
"Users can filter orders, the page should be easy to use, and performance should be reasonable." This sounds clear, but it leaves reviewers guessing about fields, default sorting, empty states, permissions, and performance thresholds.
Strong spec
"The order list supports filters for status, created date, and customer email. It defaults to the last 30 days sorted by created date descending. Empty results show a neutral empty state. Users without orders.read receive 403. P95 query time stays below 300ms."
The stronger version is not just longer. It turns review, QA, and implementation into the same testable set of decisions.
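The permission line in the strong spec ("Users without orders.read receive 403") is precise enough to test directly. The sketch below assumes a hypothetical `list_orders` handler and a permission set passed in as plain strings; neither comes from a real framework.

```python
def list_orders(user_permissions):
    """Illustrative handler enforcing the orders.read permission from the spec."""
    if "orders.read" not in user_permissions:
        return {"status": 403, "body": None}
    # Empty result -> the spec's neutral empty state, not an error.
    return {"status": 200, "body": []}

assert list_orders({"orders.read"})["status"] == 200
assert list_orders(set())["status"] == 403
```

The weak spec supports no equivalent test, which is the practical difference between the two versions.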
How teams use this template
- Product fills in the goal, non-goals, and user scenarios before implementation starts.
- Engineering adds API, data, state, and rollback impact, especially when the change crosses services or tables.
- QA turns each Given/When/Then item into a manual or automated test case.
- AI coding tools receive the template as context, with an explicit instruction not to implement behavior outside the non-goals and acceptance criteria.
Feature spec FAQ
Is this template useful for small changes?
Yes, when a small change still affects state, permissions, data, API behavior, or visible user outcomes. For a static copy correction, a lighter checklist is usually enough.
How many acceptance criteria should a feature spec have?
Most feature specs land between five and twelve criteria. Coverage matters more than count: include the happy path, failure path, edge cases, permissions, and rollout judgment.
How do I tell the spec is ready for implementation?
A feature spec is ready when a reviewer can identify the first shippable slice, the excluded work, the affected roles, the required test evidence, and the rollback trigger without joining another meeting. If any of those answers still depend on memory from a planning call, keep the spec in review.
For AI-assisted work, paste the spec with a clear instruction that generated code must map back to a listed goal, acceptance criterion, edge case, or deliverable. This makes code review less subjective because every addition either has a spec anchor or needs to be removed.
Real-world example: user notification preferences
Here is what a filled-in version of this template looks like for a real feature — letting users control which notifications they receive.
```markdown
## Goal

Allow users to opt in/out of individual notification channels (email, push,
in-app) per event type, reducing unsubscribe rates by giving granular control.

## Non-goals

- Notification delivery infrastructure changes
- Admin-level notification overrides
- Frequency capping or digest mode

## Acceptance Criteria

- [ ] Given a logged-in user on /settings/notifications
      When they toggle "Marketing emails" off and save
      Then no marketing emails are sent to that user
- [ ] Given a user with all channels disabled
      When a critical security alert fires
      Then the alert is still delivered (security is non-optional)
- [ ] Given an API consumer calling PATCH /users/:id/preferences
      When the payload omits a channel key
      Then that channel preference is unchanged (partial update)

## Edge Cases

- null/empty: user has no existing preferences → default all ON
- duplicates/idempotency: toggling the same setting twice is a no-op
- concurrency/race: two tabs saving simultaneously → last-write-wins
- permissions/visibility: users cannot modify other users' preferences

## Output / Deliverables

- API: PATCH /users/:id/notification-preferences
- DB: add notification_preferences JSONB column to users table
- UI: toggle grid component on settings page
- Test checklist: unit tests for partial update, integration test for
  security override, E2E test for toggle persistence
- Rollback: feature flag `notification_prefs_v2` → revert to global
  on/off toggle
```
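The partial-update criterion, the all-ON default, and the security override in this example can all be sketched together. This is a minimal illustration of the spec's semantics; `apply_patch`, `should_deliver`, and the channel names are assumptions, not a real API.

```python
# Edge case "null/empty": a user with no stored preferences defaults to all ON.
DEFAULT_PREFS = {"email": True, "push": True, "in_app": True}

def apply_patch(current, patch):
    """Partial update: channel keys omitted from the patch stay unchanged."""
    merged = dict(DEFAULT_PREFS if current is None else current)
    merged.update(patch)
    return merged

def should_deliver(prefs, channel, event_type):
    """Critical security alerts bypass preferences (security is non-optional)."""
    if event_type == "security_alert":
        return True
    return prefs.get(channel, True)

prefs = apply_patch(None, {"email": False})  # push and in_app untouched
assert prefs == {"email": False, "push": True, "in_app": True}
assert apply_patch(prefs, {"email": False}) == prefs  # repeat toggle is a no-op
assert should_deliver(prefs, "email", "security_alert")  # always delivered
assert not should_deliver(prefs, "email", "marketing")
```

Each assertion maps back to a listed acceptance criterion or edge case, which is exactly the spec-anchoring the template asks for.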
Related resources
- Acceptance Criteria Examples — reusable Given/When/Then patterns
- Edge Case Checklist — systematic edge case identification
- API Spec Template — when your feature includes API changes
- What Is Spec-First Development? — the complete methodology guide
Editorial note
This template covers feature specification for spec-first engineering teams. The filled example is an illustrative scenario.
- Author: Daniel Marsh
- Editorial policy: How we review and update content
- Corrections: Contact the editor
Tip: save it under /docs/specs/, review before implementation, and update after major requirement changes. Last updated: May 6, 2026.