Source: Dev.to
Process Overview
In a modern mid‑size startup (≈ 50–200 employees), the software development process balances the speed of a small team with the predictability required by investors and stakeholders. The workflow follows a Discovery → Planning → Execution loop rather than ad‑hoc coding.
Phases
| Phase | Key Activity | Output |
|---|---|---|
| Discovery | User interviews, feasibility spikes | PRD |
| Planning | Technical design and Figma mockups | RFC/TDD |
| Estimation | Pointing session with developers | Sprint Backlog |
| Execution | Coding, Code Reviews (PRs), CI/CD | Feature Branch |
| QA/UAT | Automated testing & stakeholder review | Release Candidate |
| Launch | Feature flags & phased rollout | Live Feature |
Initiation
- Triggered by data, customer feedback, or strategic goals.
- Catalysts: Product Management (user feedback/analytics), Leadership (strategic pivot), or Sales/Customer Success (high‑value client requests).
Discovery Phase
- A Discovery squad (Product Manager, Lead Designer, Tech Lead) validates the idea before any code is written.
- Activities: user interviews, technical feasibility spikes.
- Key output: Product Requirements Document (PRD) – a living document (e.g., Notion, Confluence) containing problem statement, target user, user stories, success metrics (KPIs), and out‑of‑scope items.
Planning
- Designers receive the PRD, analyze requirements, and produce high‑fidelity mockups (usually in Figma).
- The Tech Lead discusses the PRD and UI mockups with engineers, identifies edge cases, and drafts a Technical Design Document (RFC/TDD) covering system architecture, database changes, API designs, metric‑gathering plan, and release plan.
- The design is peer‑reviewed across the engineering team.
Backlog Grooming
- Product Manager and Engineering Lead break the PRD into tickets (tasks) in a tool like Jira.
- Each ticket receives Story Points for effort estimation.
- Teams typically use Scrum (time-boxed 2‑week sprints) or Kanban (continuous flow).
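Story points feed a simple forecast: divide the estimated backlog by the team's historical velocity to get a sprint count. A minimal sketch (the point values below are illustrative, not from the article):

```python
import math

def sprints_needed(total_points: int, velocity_per_sprint: int) -> int:
    """Rough forecast: how many sprints a pointed backlog will take,
    rounding up because a partial sprint still occupies the calendar."""
    return math.ceil(total_points / velocity_per_sprint)

# A hypothetical 55-point backlog at a velocity of 20 points/sprint
# forecasts 3 two-week sprints, i.e. about 6 weeks of execution.
print(sprints_needed(55, 20))
```

Velocity is a trailing average, so this is a planning aid, not a commitment; the Post‑Launch Review below compares it against reality.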
Execution
- Development proceeds in sprints: coding, pull‑request reviews, continuous integration/deployment.
- Daily stand‑ups (≈ 15 min) keep the team aligned.
QA / UAT
- Automated tests run continuously.
- Stakeholders perform User Acceptance Testing on a release candidate.
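"Automated tests run continuously" means every pull request executes a suite of small, fast checks before merge. A minimal sketch of the kind of unit test a CI pipeline runs (`format_kpi` is a hypothetical helper, not from the article):

```python
def format_kpi(value: float) -> str:
    """Render a ratio KPI (e.g. conversion rate) as a one-decimal percentage."""
    return f"{value:.1%}"

def test_format_kpi():
    # CI runs assertions like these on every push; a failure blocks the merge.
    assert format_kpi(0.1234) == "12.3%"
    assert format_kpi(1.0) == "100.0%"
```

In practice a runner such as pytest discovers and executes `test_` functions like this automatically on each commit.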
Launch
- Features are released behind feature flags for phased rollout.
- A “dark launch” (code deployed but hidden) precedes full release.
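A common way to implement a phased rollout is to hash a stable user ID into a bucket, so each user's experience stays consistent as the percentage grows. A minimal sketch, assuming a hash-bucket approach (the flag name and helper are illustrative):

```python
import hashlib

def in_rollout(user_id: str, flag: str, percent: int) -> bool:
    """Deterministically bucket a user into a phased rollout.

    Hashing (flag, user_id) yields a stable bucket in 0-99, so the same
    user keeps seeing the feature as `percent` is raised over time.
    """
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent

# percent=0 is the "dark launch": the code is deployed but no one sees it.
# Raising percent to 10, 50, then 100 widens the audience with no redeploy.
```

Because the bucket is derived from the ID rather than stored, this needs no database table and no per-user state.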
Supporting Practices
- Asynchronous Documentation: All decisions and updates are written down for asynchronous consumption.
- Product Roadmap: High‑level quarterly timeline (Now, Next, Later).
- RACI Matrix: Defines who is Responsible, Accountable, Consulted, and Informed.
- SLAs: Document expected uptime and performance for the new feature.
Tooling
- Project Tracking: Linear, Jira, Asana
- Documentation: Notion, Confluence, GitHub Wiki
- Design: Figma
- Communication: Slack (integrated with GitHub/Jira)
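An SLA uptime target translates directly into a downtime budget the team can reason about. A quick sketch of that arithmetic (the 99.9% figure is an example target, not one from the article):

```python
def downtime_budget_minutes(uptime_pct: float, days: int = 30) -> float:
    """Convert an SLA uptime percentage into allowed downtime per period."""
    total_minutes = days * 24 * 60
    return total_minutes * (1 - uptime_pct / 100)

# 99.9% uptime over a 30-day month allows roughly 43.2 minutes of downtime;
# 99.99% shrinks the budget to about 4.3 minutes.
print(downtime_budget_minutes(99.9))
```

Framing the SLA as a minutes-per-month budget makes trade-offs concrete when planning the rollout and on-call response.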
Example Timeline for a Medium‑Sized Feature (e.g., new dashboard)
| Week | Activity |
|---|---|
| 1‑2 | Discovery: research, PRD drafting, stakeholder alignment |
| 3 | Technical Design: RFC/TDD writing, peer review, Figma prototyping |
| 4‑7 | Development: 2‑3 sprints of coding, daily stand‑ups |
| 8 | Testing & Polish: bug fixing, User Acceptance Testing |
| 9 | Deployment: dark launch → full release |
Post‑Launch Activities
- User Feedback & Analytics: Monitor tools like Amplitude, Mixpanel, Google Analytics to assess impact against PRD goals.
- Observability & Monitoring
- Feature Flag Cleanup
- Technical Debt Management: revisit shortcuts taken to ship (“Nothing is more permanent than a temporary solution”)
- Performance Tuning
Post‑Launch Review (formal meeting 1–2 weeks after launch)
| Section | Description |
|---|---|
| What went well? | Example: “The RFC process caught a major security flaw early.” |
| What went wrong? | Example: “Figma designs missed a mobile view, causing a 2‑day delay.” |
| Timeline Reality vs. Estimate | Compare actual sprints to planned sprints and explain variances. |
| Action Items | Concrete steps to avoid repeat mistakes in future projects. |