planning-fallacy · time-management · cognitive-bias · productivity · decision-making

Why Everything Takes Longer Than You Think (Even When You Know It Will)

You've been late on every deadline this year. You know you underestimate time. And yet you keep doing it. The planning fallacy isn't a knowledge problem. It's a design flaw in how your brain imagines the future.

Clarido Team · 9 min read

You already know this

You're going to underestimate how long this takes. Whatever "this" is. The project at work, the errand after lunch, the email you've been putting off because it requires actual thought. You'll look at it, think "twenty minutes," and surface an hour later wondering where the time went.

You know this about yourself. You've known it for years. And yet here you are, doing it again.

This is the strange part. The planning fallacy isn't something that goes away once you learn about it. Knowing you're bad at estimating time doesn't make you better at estimating time. It's one of the few cognitive biases that survives awareness completely intact, which tells you something important about what's actually going on.


The original discovery

In 1979, Daniel Kahneman and Amos Tversky gave it a name. They called it the planning fallacy: the tendency to underestimate how long tasks will take, even when you have extensive experience with similar tasks running late.

55.5 days: the actual average time psychology students took to finish their senior theses. Their prediction? 33.9 days. Only 30% finished when they said they would.

That finding comes from a landmark 1994 study by Roger Buehler, Dale Griffin, and Michael Ross, published in the Journal of Personality and Social Psychology. They asked students to predict when they'd finish their theses under three scenarios: best case, expected case, and worst case. The results were remarkable. Even the students' worst-case predictions were more optimistic than reality. They couldn't imagine the future being as slow as it actually turned out to be.

Is this just about lazy students who didn't care enough? Not even close. The pattern shows up everywhere.


It scales

Bent Flyvbjerg spent years studying what he calls the "Iron Law of Megaprojects." His research, beginning with a 2002 paper in the Journal of the American Planning Association, analyzed hundreds of large infrastructure projects across 104 countries and ninety years of data. The findings are bleak.

Rail projects run an average of 44.7% over budget. Roads come in at 20.4% over. Between 70% and 90% of megaprojects experience cost overruns, depending on the type. For the Olympics, the overrun rate is 100%. Every single one.

These aren't amateur operations run by people who don't know what they're doing. These are billion-dollar projects managed by experienced professionals with detailed plans, historical data, and entire teams dedicated to estimation. They still get it wrong. Consistently. In the same direction.

The planning fallacy doesn't care how smart you are, how experienced you are, or how many times you've been burned before. It operates beneath the level of conscious reasoning.

So what's going on? Why would a bias this well-documented, this widely experienced, resist correction so stubbornly?


The inside view

Kahneman and Tversky's explanation centers on what they called the "inside view" versus the "outside view." When you estimate how long something will take, you naturally imagine the specific task ahead of you. You picture the steps. You think about what you'll do first, then next, then after that. You're constructing a mental movie of the work.

The problem is that mental movies have terrible special effects. They tend to show the plot without the complications. You imagine writing the report, but not the twenty minutes you'll spend looking for that one statistic. You imagine driving to the store, but not the construction detour on Oak Street. You imagine the meeting ending on time.

What does your mental simulation leave out? Almost everything that matters. It captures the essential steps and skips the friction, the interruptions, the things that go sideways for reasons you can't predict because they haven't happened yet.

The inside view

You focus on the specific task and build a step-by-step plan. Feels detailed and reliable. Produces optimistic estimates.

"I'll draft the presentation, add the data slides, do a review pass. Maybe two hours."

The outside view

You look at how long similar tasks have taken in the past, regardless of how this one feels. Less satisfying. Far more accurate.

"The last three presentations took me four to six hours each. So probably around five."

The outside view is boring. It ignores everything that feels unique about your current situation. But that's precisely why it works. Your situation isn't as unique as it feels. The reasons things take longer are remarkably consistent: unexpected complexity, context switching, energy dips, dependencies on other people, and the simple fact that almost nothing goes exactly according to plan.


Why pessimism doesn't help either

Here's where it gets interesting. You might think the fix is simple: just imagine everything going wrong. Budget for disaster. Add buffer time.

In 2000, Newby-Clark, Ross, Buehler, and colleagues tested exactly this. They asked participants to generate pessimistic scenarios for their tasks before making predictions. It didn't work. Even after explicitly imagining things going badly, people still produced optimistic estimates. They rated their pessimistic scenarios as less plausible than their optimistic ones.

Think about what that means. You can look directly at the possibility of things going wrong, acknowledge it, and then set it aside because it doesn't feel real. The optimistic scenario isn't just your default. It's the one your brain finds believable.

This is why "just add 50% to your estimate" fails in practice. You add 50% to an already-optimistic number. Or you add it, then quietly shave it back down because the padded number "feels too high." The bias doesn't live in your arithmetic. It lives in your imagination.


The memory problem

There's another layer that makes this worse. Michael Roy, Nicholas Christenfeld, and Craig McKenzie published a provocative argument in Psychological Bulletin in 2005: people don't just underestimate the future. They misremember the past.

When you recall how long something took, your memory compresses it. The boring parts shrink. The waiting evaporates. The interruptions disappear. So when you use your past experience to estimate a future task (the thing you're supposed to do), you're calibrating against a distorted baseline.

What does your brain tell you? "Last time took about three hours." But last time actually took five. You just don't remember the two hours of friction because they weren't interesting enough to encode as distinct memories.

Why experience doesn't fix this

People with more experience at a task often aren't better at estimating it. They have more data points, yes, but they're all compressed the same way. Ten years of underestimating doesn't produce ten years of calibration. It produces ten years of confidently wrong estimates.

What actually helps

If knowing about the bias doesn't fix it, and imagining worst cases doesn't fix it, and experience doesn't fix it, what does?

The research points to a few things that genuinely move the needle.

Use the outside view deliberately. Don't ask "how long will this take?" Ask "how long did similar things take?" Buehler's research consistently shows that people are much better at predicting others' completion times than their own. When you estimate for someone else, you naturally default to base rates rather than imagining specific scenarios. You can do this for yourself by treating your own task as if someone else described it to you. How long would you tell them it will take?
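One way to make the outside view mechanical is to estimate from the distribution of your own past durations instead of from a mental simulation. A minimal sketch, with illustrative numbers (the history list and hour values are assumptions, not data from the studies above):

```python
import statistics

def outside_view_estimate(past_durations_hours):
    """Estimate a new task from the distribution of similar past
    tasks, ignoring how the current task 'feels'. Returns a typical
    (median) figure and the worst observed case."""
    typical = statistics.median(past_durations_hours)
    worst = max(past_durations_hours)
    return typical, worst

# Illustrative history: the last five similar presentations, in hours.
history = [4.0, 5.5, 4.5, 6.0, 5.0]
typical, worst = outside_view_estimate(history)
print(f"Plan for about {typical} hours; protect up to {worst}.")
# → Plan for about 5.0 hours; protect up to 6.0.
```

The median, not the best case, becomes your planning number, which is exactly the move the inside view resists.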

Break it down, then add up the pieces. Research on "unpacking" shows that people give longer (and more accurate) estimates when they list the subtasks involved. When you look at a project as a single unit, your brain generates one optimistic estimate. When you list twelve steps, each one gets its own small dose of optimism, but the total is closer to reality. The friction becomes visible.

Single estimate

"Redesigning the website, about two weeks."

Unpacked estimate

"Audit current pages (2 days) + wireframes (3 days) + design (4 days) + copy (3 days) + dev (5 days) + review (2 days) = about four weeks."
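The arithmetic of unpacking is simple enough to sketch. The subtask names and day counts below mirror the website example and are illustrative, not a real project plan:

```python
# Unpacking: estimate each subtask separately, then sum.
# Each value is estimated working days for that step.
subtasks = {
    "audit current pages": 2,
    "wireframes": 3,
    "design": 4,
    "copy": 3,
    "dev": 5,
    "review": 2,
}

total_days = sum(subtasks.values())
weeks = total_days / 5  # assuming a 5-day working week
print(f"{total_days} working days ≈ {weeks:.0f} weeks")
# → 19 working days ≈ 4 weeks
```

Listing the steps is the whole trick: the sum of twelve slightly optimistic numbers is far closer to reality than one optimistic number for the whole.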

Track your actuals. This sounds obvious, but almost nobody does it. Write down your estimate before you start, then record how long it actually took. After a few weeks, you'll have a personal correction factor. Most people find theirs is somewhere between 1.5x and 2.5x. It's not a flattering number, but it's a useful one.
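Computing a personal correction factor from a log of estimates and actuals can be this small. The log entries here are made up for illustration; the median of the actual-to-estimate ratios is one reasonable choice of summary (a mean would be pulled around by one disastrous task):

```python
import statistics

def correction_factor(records):
    """records: list of (estimated_hours, actual_hours) pairs.
    Returns the median ratio of actual to estimated time."""
    return statistics.median(actual / est for est, actual in records)

# Illustrative log: (what you predicted, what it actually took).
log = [(2, 4), (1, 1.5), (3, 6), (2, 3.5)]
factor = correction_factor(log)
# Multiply future gut estimates by this factor.
print(f"Correction factor: {factor:.2f}x")
# → Correction factor: 1.88x
```

A factor around 2x is not flattering, but it turns every future gut estimate into a usable one with a single multiplication.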

Offload the plan. One reason estimates go wrong is that your brain can't hold a complex project and all its contingencies in working memory simultaneously. Writing down every step, including the boring ones (setting up the environment, waiting for feedback, reformatting the output), forces you to account for time you'd otherwise forget.


The deeper pattern

The planning fallacy is part of a broader tendency: your brain is better at storytelling than statistics. When you think about the future, you don't calculate probabilities. You narrate scenes. And narratives have a structure that naturally omits the boring, the random, and the inconvenient.

This is useful in many contexts. Optimism about the future is what gets you to start things. If you knew in advance exactly how long and hard something would be, you might never begin. The planning fallacy, in a sense, is the price of ambition.

But it has costs. You overcommit. You disappoint people who were counting on your timeline. You disappoint yourself. You start to believe there's something wrong with you, that everyone else somehow finishes things on time and you're the only one who can't get it together.

You're not the only one. The 1994 thesis study? Seventy percent of students missed their own deadline. And those were students who were explicitly thinking about how long the work would take.

The problem isn't that you're bad at planning. The problem is that planning feels like knowing, and it isn't.

The gap between your plan and reality isn't a failure of discipline. It's a feature of cognition. Your brain builds a clean story about how things will go, and reality delivers the unedited version. The mess, the delays, the detours were always going to be there. Your plan just didn't have room for them.


What changes

Once you stop treating optimistic estimates as a character flaw and start treating them as a predictable cognitive pattern, something shifts. You stop asking "why am I always late?" and start asking "what's my correction factor?"

What does that look like in practice? You build in margins not because you're pessimistic, but because you're realistic about how your brain works. You write down your plans in enough detail that the hidden work becomes visible. You check your estimates against history instead of against your imagination.

And maybe most importantly, you get a little gentler with yourself on the days when things still take longer than you planned. Because they will. That's not a bug. That's just how brains work.


References

  • Kahneman, D. & Tversky, A. (1979). "Intuitive Prediction: Biases and Corrective Procedures." TIMS Studies in Management Science, 12, 313–327.
  • Buehler, R., Griffin, D., & Ross, M. (1994). "Exploring the 'Planning Fallacy': Why People Underestimate Their Task Completion Times." Journal of Personality and Social Psychology, 67(3), 366–381.
  • Newby-Clark, I.R., Ross, M., Buehler, R., Koehler, D.J., & Griffin, D. (2000). "People Focus on Optimistic Scenarios and Disregard Pessimistic Scenarios While Predicting Task Completion Times." Journal of Experimental Psychology: Applied, 6(3), 171–182.
  • Flyvbjerg, B., Holm, M.S., & Buhl, S. (2002). "Underestimating Costs in Public Works Projects: Error or Lie?" Journal of the American Planning Association, 68(3), 279–295.
  • Roy, M.M., Christenfeld, N.J.S., & McKenzie, C.R.M. (2005). "Underestimating the Duration of Future Events: Memory Incorrectly Used or Memory Bias?" Psychological Bulletin, 131(5), 738–756.
  • Buehler, R., Griffin, D., & Peetz, J. (2010). "The Planning Fallacy: Cognitive, Motivational, and Social Origins." Advances in Experimental Social Psychology, 43, 1–62.

Clarido is a quiet log for a loud mind. Speak or type what's on your mind. Clarido holds it for you so you can return when you have space. Join the beta →