Teams building software often have a hard time knowing when features are "done" or what a feature's progress is.
This gets harder to track the more asynchronous a team's planning is. Each team also has its own priorities, capabilities, and blockers:
- priorities - What a certain team values as important, or their assigned tasks/goals.
- capabilities - Can the team accomplish a feature? Does the language/framework/business unit/etc. support what's needed?
- blockers - Does the team understand what is requested? Are the requirements well understood and complete?
An engineering team might have priorities organized like the graph on the right for each feature/project. This graph shows how "complete"
a feature/project is. The essential parts (database, API endpoints, etc.) are listed lower, since without them nothing above could be completed.
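Concretely, you can read such a graph as an ordered stack of layers with the essential work at the bottom. A minimal sketch (the layer names are hypothetical, not from any specific team):

```python
# An engineering feature graph as an ordered stack: index 0 is the most
# essential layer, and each layer depends on everything below it.
feature_layers = [
    "database schema",   # nothing above works without storage
    "API endpoints",     # depend on the schema
    "unit tests",        # exercise the endpoints
    "UI",                # consumes the endpoints
    "bug fixes",         # require a deployed, testable product
]
```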
A product team might have priorities that encompass a larger view than feature implementation.
These graphs can be easy to understand for a single feature, but become more complex when teams have multiple concurrent features in progress.
Also, the "percent" complete may be misleading, but could represent time + effort while showing dependencies.
Unit tests require API endpoints (or event messaging, filesystem ops, etc) and fixing bugs typically requires deploying the software.
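One way to make that concrete: weight each layer by estimated effort, and only count a layer once everything below it is finished. A sketch, with invented effort numbers:

```python
# Weighted "percent complete": each layer carries an effort estimate, and a
# layer only counts once every layer below it is done (the dependency rule
# described above). Effort numbers here are invented for illustration.
def percent_complete(layers):
    total = sum(layer["effort"] for layer in layers)
    earned = 0
    for layer in layers:  # bottom-up: stop at the first unfinished layer
        if not layer["done"]:
            break
        earned += layer["effort"]
    return 100 * earned / total

feature = [
    {"layer": "database schema", "effort": 5, "done": True},
    {"layer": "API endpoints",   "effort": 8, "done": True},
    {"layer": "unit tests",      "effort": 3, "done": False},
    {"layer": "UI",              "effort": 8, "done": False},
    {"layer": "bug fixes",       "effort": 2, "done": False},
]

print(percent_complete(feature))  # 50.0 -- 13 of 26 effort points, in order
```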
We can track an individual team's progress and priorities, but how do we compare that across teams? Product teams feed Engineering with requirements, designs, user stories, etc.
Engineering teams feed Product with suggestions for improvement and known limitations. Support teams inform both Engineering and Product with common problems users encounter.
This implies that teams work ahead of others and depend on each other. We could show an aggregate view of a team's progress by showing how far along they are in their individual graph.
Doing so assumes they've completed every item in each step, so it's only an approximation, but a useful one.
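A sketch of that approximation: a team's aggregate progress is its "watermark", the highest contiguous run of fully completed steps in its graph. The team and item names below are made up:

```python
# Approximate a team's aggregate progress as a watermark: count steps from
# the bottom up until one isn't fully complete. This assumes each step is
# truly finished before the next starts, so it's an approximation.
def watermark(steps):
    completed = 0
    for step in steps:
        if not all(step["items"].values()):
            break
        completed += 1
    return completed / len(steps)

engineering = [
    {"name": "database", "items": {"schema": True, "migrations": True}},
    {"name": "API",      "items": {"endpoints": True, "auth": False}},
    {"name": "UI",       "items": {"pages": False}},
]

print(f"{watermark(engineering):.0%}")  # 33% -- only the database step is fully done
```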
Looking at this graph over time, we can see that Sales/Product would complete their work first. Sales gathers requirements from customers, and Product folks turn them into requirements for Design/Engineering.
A documentation team depends on Design/Engineering for screenshots and API examples. This flow makes intuitive sense, provided we can measure the work.
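That ordering can be made explicit: treat the teams as a dependency graph, and a topological sort recovers who should finish first. The edges below are my reading of the flow above:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Each team maps to the teams it depends on, per the flow described above.
team_deps = {
    "product":       {"sales"},                  # requirements come in via sales
    "design":        {"product"},
    "engineering":   {"product", "design"},
    "documentation": {"design", "engineering"},  # needs screenshots, API examples
}

print(list(TopologicalSorter(team_deps).static_order()))
# ['sales', 'product', 'design', 'engineering', 'documentation']
```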
In my experience, teams scatter these priorities differently from one another. Everyone has their own assumptions about what can be completed faster or what matters more. The result is a graph like the one below,
where each filled section represents the time + effort of one feature within a product, and the colors show the priority of each feature/request.
The lower a section sits, the more essential it is. When an item is understood to be essential, it's either a building block for other features or a core component of the product.
This scattering of priorities and effort should be organized. When teams are organized, handoffs are smoother and leadership gets more accurate estimates. An organized graph would show roughly identical color ordering across teams,
which implies work is streamlined for dependencies. The real world may not have such alignment, but it's a goal worth aiming for.
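One way to check for that alignment: read each team's graph bottom-to-top as a sequence of priorities (the "colors") and compare the sequences. A sketch with invented priorities:

```python
# Read each team's stacked graph bottom-to-top as a priority sequence.
# Handoffs are streamlined when every team shows the same ordering.
def color_ordering_aligned(team_orderings):
    reference = team_orderings[0]
    return all(ordering == reference for ordering in team_orderings[1:])

# Hypothetical sequences; 3 = highest priority / most essential.
engineering = [3, 3, 2, 1]
product     = [3, 3, 2, 1]
support     = [3, 2, 3, 1]  # a lower-priority item slipped beneath a higher one

print(color_ordering_aligned([engineering, product]))           # True
print(color_ordering_aligned([engineering, product, support]))  # False
```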
With all of these teams and priorities, how do we determine what "Done" is? Everyone asks when work will be "done", but answering that is much harder.
It appears each team has to internalize its own notion of "done" (a certain watermark on its graph) and communicate that to the others.
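In code terms, "internalize and communicate" might look like each team publishing the watermark it considers "done", so anyone can check progress against it. The thresholds here are invented:

```python
# Each team declares the fraction of its graph it treats as "done".
# The numbers are hypothetical; the point is that "done" is explicit.
DONE_WATERMARKS = {"engineering": 0.80, "product": 0.90, "documentation": 1.00}

def is_done(team, progress):
    return progress >= DONE_WATERMARKS[team]

print(is_done("engineering", 0.85))    # True: past the watermark it declared
print(is_done("documentation", 0.85))  # False: docs calls nothing done until 100%
```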