Understanding Project Forecasts: From Velocity to Delivery Dates
GoalPath calculates delivery forecasts using three inputs: your team's velocity, multitasking load, and the maturity of your estimates. This guide explains the math.
The Short Version
If you want the gist without the math:
- GoalPath measures how fast your team works (velocity = story points completed per week over the last 6 weeks)
- It penalizes multitasking — when people juggle too many milestones at once, forecasts account for the productivity loss
- It adjusts for uncertainty — the less defined the work, the wider the forecast range
- You get three dates: optimistic (things go well), expected (most likely), and pessimistic (things go badly)
- Forecasts update automatically as work completes and velocity changes
Want to improve your forecasts? Reduce multitasking, estimate all items, and break down ambiguous work. The math below explains exactly why each of these helps.
The Foundation: Work-Weeks
Before we dive in, let's clarify what we mean by a "work-week":
- 1 work-week = 5 business days (Monday through Friday)
- We automatically skip weekends in all our calculations
- This ensures our estimates reflect actual working time, not calendar time
If we used calendar weeks (7 days), 2 of every 7 days—about 28.6% of each week—would be non-working time baked into the schedule. By using work-weeks, we give you realistic timelines based on when work actually gets done.
The Three Pillars of Forecasting
Our forecasting system rests on three fundamental concepts that work together to give you realistic, actionable predictions:
1. Velocity: Your Team's Throughput
Velocity is the rate at which your team completes work, measured in story points per work-week.
How we calculate it:
- We look at the last 6 weeks of completed items
- We sum up the story points delivered each week
- We calculate the average: total points ÷ number of weeks
- We also track the standard deviation to understand consistency
Example:
Week 1: 15 points
Week 2: 18 points
Week 3: 12 points
Week 4: 20 points
Week 5: 16 points
Week 6: 14 points
Total: 95 points over 6 weeks
Average Velocity: 15.8 points/week
Standard Deviation: 2.9 points (measures consistency)
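The velocity calculation above can be sketched in a few lines of Python (a minimal illustration of the math, not GoalPath's actual implementation):

```python
# Reproduce the six-week velocity example: average throughput plus
# the sample standard deviation used as a consistency measure.
from statistics import mean, stdev  # stdev = sample standard deviation

weekly_points = [15, 18, 12, 20, 16, 14]  # points completed per work-week

velocity = mean(weekly_points)       # 95 / 6 ≈ 15.8 points/week
consistency = stdev(weekly_points)   # ≈ 2.9 points

print(f"Average velocity: {velocity:.1f} pts/week")
print(f"Standard deviation: {consistency:.1f} pts")
```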
Why it matters: Velocity tells us your team's baseline capacity—how much work they can realistically complete in a week.
2. Multitasking Penalty: The Hidden Cost
Context switching reduces productivity. When team members juggle multiple milestones, GoalPath adjusts their effective velocity downward.
Penalties for concurrent in-progress milestones:
- Working on 1 milestone: 0% penalty (100% effective velocity)
- Working on 2-3 milestones: 15% penalty (85% effective velocity)
- Working on 4-7 milestones: 30% penalty (70% effective velocity)
- Working on 8+ milestones: 50% penalty (50% effective velocity)
How we apply it:
We calculate each team member's penalty individually based on how many milestones they're currently working on simultaneously, then aggregate:
Example Team:
- Alice: 1 active milestone → 100% effective → 20 pts/week × 1.0 = 20 pts/week
- Bob: 2 active milestones → 85% effective → 18 pts/week × 0.85 = 15.3 pts/week
- Carol: 5 active milestones → 70% effective → 16 pts/week × 0.7 = 11.2 pts/week
Team Effective Velocity: 46.5 pts/week (down from 54 pts/week)
Average Multitasking Penalty: ~14% velocity reduction
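The tier lookup and team aggregation can be sketched as follows, assuming the penalty thresholds listed above; the function name and team data are illustrative:

```python
# Map each member's concurrent milestone count to an effectiveness
# factor, then aggregate into an effective team velocity.
def effective_factor(active_milestones: int) -> float:
    """Return the effectiveness multiplier for a milestone count."""
    if active_milestones <= 1:
        return 1.0   # focused: no penalty
    if active_milestones <= 3:
        return 0.85  # 2-3 milestones: 15% penalty
    if active_milestones <= 7:
        return 0.70  # 4-7 milestones: 30% penalty
    return 0.50      # 8+ milestones: 50% penalty

# (name, raw pts/week, active milestones) from the example team
team = [("Alice", 20, 1), ("Bob", 18, 2), ("Carol", 16, 5)]

raw = sum(v for _, v, _ in team)                              # 54 pts/week
effective = sum(v * effective_factor(m) for _, v, m in team)  # 46.5 pts/week
penalty = 1 - effective / raw                                 # ≈ 14%

print(f"Effective velocity: {effective:.1f} pts/week (penalty ≈ {penalty:.0%})")
```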
Why it matters: This prevents over-optimistic forecasts by accounting for the real-world cost of divided attention. A team member spreading work across many concurrent milestones will have longer delivery times, even with the same raw velocity.
3. Confidence Level: Accounting for the Unknown
Not all forecasts are created equal. Some projects have rock-solid estimates, while others are exploratory and uncertain. The confidence level captures this reality.
Three confidence levels:
- High (90%): Well-understood work, clear requirements, experienced team → 10% buffer
- Medium (70%): Some unknowns, moderate complexity → 25% buffer
- Low (50%): Exploratory work, many unknowns, new domain → 50% buffer
How we use it:
Confidence doesn't just add a fixed buffer—it acts as a multiplier on your total uncertainty, shaping your entire forecast range through uncertainty propagation (more on this below).
Why it matters: A low-confidence estimate acknowledges that surprises are likely. Instead of pretending we can predict the unpredictable, we give you a realistic range.
Putting It All Together: The Forecast Calculation
Now let's see how these three pillars combine to produce your forecast.
Step 1: Calculate Total Work
First, we estimate the total work remaining:
Estimated items: 25 stories × average 8 points = 200 points
Unestimated items: 5 stories × average 8 points (inferred) = 40 points
Total Work: 240 points
Step 2: Apply Multitasking Penalty to Velocity
Next, we adjust the team velocity for multitasking:
Raw Team Velocity: 54 points/week
Multitasking Penalty: 14% reduction
Effective Velocity: 54 × (1 - 0.14) = 46.4 points/week
This gives us the actual throughput we can expect given current multitasking load.
Step 3: Calculate Base Duration
Now we can calculate a baseline:
Base Duration = Total Work ÷ Effective Velocity
Base Duration = 240 points ÷ 46.4 points/week
Base Duration = 5.2 work-weeks
This is our starting point—but we're not done yet.
Step 4: Propagate Uncertainty
We don't just add a simple buffer. Instead, we propagate multiple sources of uncertainty through the calculation:
Uncertainty sources:
1. Estimation Uncertainty (from unestimated items)
   - 5 unestimated items out of 30 total = 17% unestimated
   - Uncertainty added: 17% × 30% factor = 5% additional buffer
2. Velocity Uncertainty (from standard deviation)
   - Standard deviation: 2.8 points
   - Velocity uncertainty: 2.8 ÷ 46.4 = 6% variance
   - Uncertainty added: 6% × 50% factor = 3% additional buffer
3. Confidence Level Buffer (based on project confidence)
   - Medium confidence → 25% buffer applied
4. Coordination Uncertainty (if team-based forecast)
   - Team-based forecast → 10% coordination buffer
Total uncertainty multiplier:
1.0 (base) + 0.05 (estimation) + 0.03 (velocity) = 1.08
1.08 × 1.25 (confidence) × 1.1 (coordination) = 1.49
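Step 4 can be sketched as a small function, assuming the weighting factors stated above (30% for estimation, 50% for velocity variance); the function and its signature are illustrative, not GoalPath's API:

```python
# Combine the four uncertainty sources into one multiplier.
CONFIDENCE_BUFFER = {"high": 0.10, "medium": 0.25, "low": 0.50}

def uncertainty_multiplier(unestimated, total_items, stdev,
                           effective_velocity, confidence, team_based=True):
    estimation = (unestimated / total_items) * 0.30   # e.g. 17% × 30% ≈ 5%
    velocity = (stdev / effective_velocity) * 0.50    # e.g. 6% × 50% ≈ 3%
    total = 1.0 + estimation + velocity               # additive sources ≈ 1.08
    total *= 1.0 + CONFIDENCE_BUFFER[confidence]      # × 1.25 for medium
    if team_based:
        total *= 1.10                                 # coordination buffer
    return total

# The example project: 5 of 30 items unestimated, σ = 2.8, medium confidence
print(uncertainty_multiplier(5, 30, 2.8, 46.4, "medium"))  # ≈ 1.49
```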
Step 5: Calculate the Three Scenarios
Finally, we use this uncertainty to create three estimates:
Optimistic (Best Case):
Minimum Duration = Base × 0.7
Minimum Duration = 5.2 weeks × 0.7 = 3.6 work-weeks
This assumes things go better than expected—fewer blockers, faster velocity.
Most Likely (Expected Case):
Most Likely = Base × (1 + (total uncertainty - 1) × 0.7)
Most Likely = 5.2 × (1 + (1.49 - 1) × 0.7)
Most Likely = 5.2 × 1.34 = 7.0 work-weeks
This is our best estimate—it includes 70% of the uncertainty buffer.
Pessimistic (Worst Case):
Maximum Duration = Base × Total Uncertainty × 1.2
Maximum Duration = 5.2 × 1.49 × 1.2 = 9.3 work-weeks
This assumes things go worse than expected—more blockers, scope creep, etc.
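The three scenario formulas above can be sketched together (a minimal illustration using the example's base duration and uncertainty multiplier):

```python
# Turn a base duration and total uncertainty into the three scenarios.
def forecast_range(base_weeks: float, uncertainty: float):
    optimistic = base_weeks * 0.7                          # best case
    most_likely = base_weeks * (1 + (uncertainty - 1) * 0.7)  # 70% of buffer
    pessimistic = base_weeks * uncertainty * 1.2           # worst case
    return optimistic, most_likely, pessimistic

opt, likely, pess = forecast_range(5.2, 1.49)
print(f"{opt:.1f} / {likely:.1f} / {pess:.1f} work-weeks")  # 3.6 / 7.0 / 9.3
```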
Step 6: Convert to Calendar Dates
Finally, we convert work-weeks to actual dates:
Start Date: October 5, 2025 (a Sunday, so counting starts Monday, October 6)
Optimistic End: October 6 + (3.6 weeks × 5 days = 18 business days) = October 29, 2025
Most Likely End: October 6 + (7.0 weeks × 5 days = 35 business days) = November 21, 2025
Pessimistic End: October 6 + (9.3 weeks × 5 days ≈ 47 business days) = December 9, 2025
Remember: We skip weekends automatically, so these are real delivery dates.
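The business-day counting can be sketched as below. This assumes counting starts on the first business day on or after the start date and ignores public holidays; GoalPath's internal calendar handling may differ:

```python
# Walk forward day by day, counting only Monday-Friday.
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    current = start
    while current.weekday() >= 5:      # roll a weekend start to Monday
        current += timedelta(days=1)
    remaining = days - 1               # the first business day counts as day 1
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:      # Mon=0 … Fri=4
            remaining -= 1
    return current

start = date(2025, 10, 5)              # a Sunday; work begins Monday, Oct 6
print(add_business_days(start, 35))    # 7.0 work-weeks → 2025-11-21
```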
What This Means for Your Project
Understanding the Range
When you see a forecast like "7.0 work-weeks (Nov 21)" with a range of "3.6 to 9.3 weeks," here's what it tells you:
- The range is not a mistake—it's honest uncertainty
- The most likely date (Nov 21) includes confidence buffers—it's not a best-case scenario
- The optimistic date is possible if everything goes smoothly
- The pessimistic date protects you from unpleasant surprises
How to Improve Your Forecast
Want tighter ranges and faster delivery? Here's how:
1. Reduce Multitasking
   - Focus team members on fewer concurrent milestones
   - A 30% multitasking penalty (70% effective velocity) means ~43% longer delivery times
   - Example: going from 5 active milestones to 2 raises effective velocity from 70% to 85%, delivering the same work ~18% sooner
2. Increase Confidence
   - Break down ambiguous stories into clearer tasks
   - Spike on unknowns before committing
   - Low → Medium confidence can reduce range by 20%
3. Estimate More Items
   - Each unestimated item adds uncertainty
   - Quick estimation sessions can tighten your forecast
   - Even rough estimates (S/M/L) are better than nothing
4. Build Consistent Velocity
   - Reduce WIP (work in progress) to smooth flow
   - Address bottlenecks that create variance
   - Consistent velocity = tighter forecast ranges
Real-World Example: Complete Walkthrough
Let's put this all together with a real project:
Project: Mobile App Redesign
Team Setup:
- 4 developers on the team
- 2 developers focused on this milestone (1 milestone each)
- 1 developer working on 2 concurrent milestones
- 1 developer working on 5 concurrent milestones
Work Breakdown:
- 45 estimated stories: 360 points
- 8 unestimated stories
- Average story size: 8 points
- Estimated total with unknowns: 424 points
Velocity Data (last 6 weeks):
- Team raw velocity: 68 points/week
- Standard deviation: 6.2 points (~10% of effective velocity—somewhat inconsistent)
Confidence Level: Medium (some unknowns in the UI framework)
Step-by-step calculation:
1. Calculate Multitasking Penalty
   - Dev 1: 20 pts/week × 1.0 (1 milestone) = 20 pts/week
   - Dev 2: 18 pts/week × 1.0 (1 milestone) = 18 pts/week
   - Dev 3: 16 pts/week × 0.85 (2 milestones) = 13.6 pts/week
   - Dev 4: 14 pts/week × 0.7 (5 milestones) = 9.8 pts/week
   - Effective Team Velocity: 61.4 pts/week (down from 68 pts/week)
   - Multitasking Penalty: 10% average reduction
2. Calculate Base Duration
   - 424 points ÷ 61.4 pts/week = 6.9 work-weeks
3. Propagate Uncertainty
   - Estimation uncertainty: 8/53 unestimated = 15% × 30% = 4.5%
   - Velocity uncertainty: 6.2/61.4 = 10% × 50% = 5%
   - Confidence buffer: Medium = 25%
   - Team coordination: 10%
   - Total multiplier: 1.0 + 0.045 + 0.05 = 1.095
   - With confidence: 1.095 × 1.25 × 1.1 = 1.51
4. Calculate Range
   - Optimistic: 6.9 × 0.7 = 4.8 work-weeks (24 business days)
   - Most Likely: 6.9 × 1.36 = 9.4 work-weeks (47 business days)
   - Pessimistic: 6.9 × 1.51 × 1.2 = 12.5 work-weeks (62 business days)
5. Convert to Dates (starting October 5, 2025; counting from Monday, October 6)
   - Optimistic: November 6, 2025
   - Most Likely: December 9, 2025
   - Pessimistic: December 30, 2025
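The whole walkthrough can be reproduced in a few lines (an illustrative sketch using the example's assumed per-developer velocities and tier factors):

```python
# (raw pts/week, effectiveness factor) for the four developers above
devs = [(20, 1.0), (18, 1.0), (16, 0.85), (14, 0.70)]

effective = sum(v * f for v, f in devs)           # 61.4 pts/week
base = 424 / effective                            # ≈ 6.9 work-weeks

# 8 of 53 items unestimated, σ = 6.2, medium confidence, team-based
multiplier = 1 + (8 / 53) * 0.30 + (6.2 / effective) * 0.50  # ≈ 1.095
multiplier *= 1.25 * 1.10                         # confidence × coordination ≈ 1.51

optimistic = base * 0.7                           # ≈ 4.8 weeks
most_likely = base * (1 + (multiplier - 1) * 0.7) # ≈ 9.4 weeks
pessimistic = base * multiplier * 1.2             # ≈ 12.5 weeks

print(f"{optimistic:.1f} / {most_likely:.1f} / {pessimistic:.1f} work-weeks")
```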
What the team can do:
- Focus the dev on 5 milestones → 2 milestones: Raises that dev from 70% to 85% effective, cutting the team penalty from 10% to ~7% and saving ~3% of total time (a couple of business days)
- Estimate those 8 stories: Would remove the 4.5% estimation uncertainty, tightening the range significantly
- Spike on UI framework unknowns: Could raise confidence from Medium → High, cutting that buffer from 25% to 10% (roughly a week off the most-likely date, and more off the pessimistic one)
Combined impact: these changes compound, pulling the most-likely date one to two weeks earlier.
Transparency and Continuous Improvement
Every forecast includes detailed metadata that captures:
- Base velocity before penalties
- Actual multitasking penalty applied
- Each uncertainty factor and its impact
- The complete calculation formula
This transparency serves two purposes:
- Trust: You can see exactly how we arrived at the estimate
- Learning: Over time, we can measure actual delivery against predictions and refine our penalty factors
This metadata lets you track forecast accuracy over time and make informed decisions about reducing multitasking or improving estimates.
Conclusion
Project forecasting combines your team's velocity, multitasking load, and estimation maturity to give you honest delivery ranges.
When you see your forecast:
- The multitasking penalty shows the real cost of divided attention
- The confidence level shapes your uncertainty buffers
- The velocity drives your baseline throughput
- The range acknowledges that the future is uncertain
- The work-weeks ensure we're counting actual working time
Delivery Probability Lines: Visualizing Delivery Risk
When prioritizing items in a milestone, you may notice colored lines appearing between items. These are Delivery Probability Lines—a visual tool that helps you understand delivery risk based on your team's velocity.
What Are Delivery Probability Lines?
Delivery Probability Lines show you where your delivery thresholds fall based on team velocity and variance. They answer the question: "If we work at our current pace, how much can we deliver in 1 week? 2 weeks? 3 weeks? 4 weeks?"
Each line represents a time horizon (1-6 weeks) and comes in three colors:
- Green (Best Case): Based on optimistic velocity (90th percentile)
- Amber (Probable): Based on median velocity (50th percentile)
- Red (Worst Case): Based on pessimistic velocity (10th percentile)
How It Works
The lines are calculated using your team's velocity data:
1. Calculate velocity bounds using statistical confidence intervals:
   - Optimistic velocity = Effective velocity + (1.28 × standard deviation)
   - Probable velocity = Effective velocity (the median)
   - Pessimistic velocity = Effective velocity - (1.28 × standard deviation)
2. Calculate capacity for each time horizon:
   - For each week (1 through 6), multiply velocity by weeks to get deliverable points
3. Position lines by calculating cumulative story points:
   - Lines appear after the last item whose cumulative total fits within the capacity
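The placement logic can be sketched as follows; the 1.28 z-score matches the 90th/10th-percentile bounds above, while the function name and return shape are illustrative:

```python
# For each scenario, find the index of the last item whose cumulative
# story points fit within that scenario's capacity for the horizon.
Z90 = 1.28  # z-score for the 90th/10th percentile

def line_positions(points, velocity, stdev, weeks):
    """Return {scenario: index of last item that fits} (-1 if none fit)."""
    scenarios = {
        "green": velocity + Z90 * stdev,   # best case
        "amber": velocity,                  # probable
        "red":   velocity - Z90 * stdev,    # worst case
    }
    result = {}
    for name, v in scenarios.items():
        capacity = v * weeks
        total, last_fit = 0, -1
        for i, p in enumerate(points):
            total += p
            if total > capacity:            # cumulative totals are monotonic
                break
            last_fit = i
        result[name] = last_fit
    return result

items = [3, 5, 4, 6, 8, 5]                  # story points in priority order
print(line_positions(items, velocity=12, stdev=3, weeks=1))
```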
Example
Imagine a milestone with these items in priority order, a team velocity of 12 pts/week, and a standard deviation of 3 pts. The 90th/10th-percentile bounds are 12 ± (1.28 × 3), i.e. roughly 15.8 and 8.2 pts/week:
Item 1: 3 points → Cumulative: 3 pts
Item 2: 5 points → Cumulative: 8 pts
───────────────────────────────────◀ 1 week (red: worst case, ~8.2 pts)
Item 3: 4 points → Cumulative: 12 pts
───────────────────────────────────◀ 1 week (amber: probable, 12 pts)
───────────────────────────────────◀ 1 week (green: best case, ~15.8 pts)
───────────────────────────────────◀ 2 weeks (red: worst case, ~16.3 pts)
Item 4: 6 points → Cumulative: 18 pts
───────────────────────────────────◀ 2 weeks (amber: probable, 24 pts)
Item 5: 8 points → Cumulative: 26 pts
Item 6: 5 points → Cumulative: 31 pts
───────────────────────────────────◀ 2 weeks (green: best case, ~31.7 pts)
Reading the 1-week lines:
- Best case 1 week: ~15.8 pts (Items 1-3 fit)
- Probable 1 week: 12 pts (Items 1-3 fit exactly)
- Worst case 1 week: ~8.2 pts (Items 1-2 fit)
Note that the lower-capacity lines sit higher in the list: for any given week, the red line always appears at or before the amber line, which appears at or before the green one.
Using Probability Lines for Prioritization
Probability lines help you make informed prioritization decisions:
1. Identify high-priority items: Anything above the first red (worst-case) line is highly likely to be delivered within a week, even at pessimistic velocity
2. Manage stakeholder expectations: Use the amber and red lines to communicate realistic delivery ranges
3. Assess risk: If a critical item falls below the red line for your target delivery date, you may need to:
   - Reprioritize to move it higher
   - Reduce scope of items above it
   - Accept the delivery risk
When Lines Appear
Delivery probability lines are shown when:
- The milestone is In Progress or Planned (not Inbox, Icebox, or Done)
- Forecast data is available with velocity > 0
- Items have story point estimates
Interpreting the Spread
The distance between green, amber, and red lines for the same week tells you about delivery uncertainty:
- Lines close together: Consistent velocity, predictable delivery
- Lines spread apart: High variance, less predictable delivery
Wide spreads suggest you might want to focus on improving velocity consistency (reduce WIP, address blockers, etc.) before committing to aggressive deadlines.