I've gone through times when management would treat estimates as deadlines, and were deaf to any sort of reason about why it could be otherwise, like the usual thing of them changing the specification repeatedly.
So when those times have occurred I've (we've, more accurately) adopted what I refer to as the "deer in the headlights" response to just about anything non-trivial. "Hoo boy, that could be a doozy. I think someone on the team needs to take an hour or so and figure out what this is really going to take." Then you'll get asked to "ballpark it", because that's what managers do, and they get a number that makes them rise up in their chair, and yes, that is the number they remember. And then you do your hour of due diligence, try your best not to actually give any number other than the ballpark at any time, and then you get it done "ahead of time" and look good.
Now, I've had good managers who totally didn't need this strategy, and I loved 'em to death. But for the other numbnuts who can't be bothered to learn their career skills, they get the whites of my eyes.
Also, it just made meetings a lot more fun.
> I've gone through times when management would treat estimates as deadlines, and were deaf to any sort of reason about why it could be otherwise, like the usual thing of them changing the specification repeatedly.
I worked at a place where this management insanity was endemic, which led to everyone padding all estimates with enough contingency to account for it. Which led to the design team, and the front-end team, and the backend team, and the QA team, all padding out their estimates by 150-200% - to avoid the blame storms they'd seen for missing "deadlines".
Then the project managers added those all up and added another 150-200%. Then the account managers and sales teams added 150-200% to the estimated costs before adding margins and setting prices.
Which ended up costing literally around 1 million dollars a month to maintain a website which could _easily_ have been handled by a full-time team of 8 or 10 decent web and full stack devs. Hell, apart from the 24x7 support requirement, I reckon I know a few great Rails or Django devs who could have done all the work on their own, perhaps with a part-time or contracted graphic designer.
That all lasted a handful of years, until the client worked out what was going on, and my company's management flew the whole thing into the mountain, with ~100 people losing their jobs and their owed entitlements (I was out about $26K that day).
This is literally the endgame.
And the only cure is instead building a company that's tolerant of mistakes while still aspiring to excellence.
The one I've worked at which got the closest had a corporate culture where failures were attributable to processes, while successes were attributable to individuals/teams.
Of course that had its own negative side effects, but on the whole it made the company a lot more honest with itself. And consequently got better work out of everyone.
What helped me was to track Sprint Volatility in addition to Sprint Velocity. We had our overall capacity, let's say 40 points, and that would go up or down some based on people leaving the team, joining the team, etc. It's just an average of how much a team can get done in a given sprint. Velocity was gauged as points per person per day.
Volatility is how much the sprint changes. Sure you can pull one 5 pt ticket out and add in a 3 point and 2 point, but if you do that 12 times in a two week sprint, we will not finish the sprint even if total capacity stays under 40 points.
I would snapshot the sprint each day, so each day I could see how many tickets got removed/added. The end result being I could show my manager: look, when volatility is low, we almost always finish the sprint. When volatility is high, we don't - it doesn't matter if we are over/under on velocity, because we don't have the time to properly plan and get clarity on asks. Have our product team think more than two weeks out and we'll deliver. That worked, to a degree.
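A minimal sketch of that daily-snapshot idea, assuming each snapshot is just the set of ticket ids on the board that day (the ids below are invented for illustration):

```python
def sprint_volatility(snapshots):
    """Count sprint churn: for each day, the number of tickets
    added plus the number removed relative to the previous day."""
    churn = 0
    for prev, curr in zip(snapshots, snapshots[1:]):
        churn += len(curr - prev) + len(prev - curr)
    return churn

# One set of ticket ids per daily snapshot of the sprint board.
days = [{"A", "B", "C"}, {"A", "B", "D"}, {"A", "D", "E"}]
print(sprint_volatility(days))  # C->D swap, then B->E swap: prints 4
```

Plot that number next to points completed per sprint and the correlation tends to speak for itself.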
Jitter in agile; I love it.
In my experience, super large estimates don’t make you look good in the long run, they make you look incompetent. The engineers who are most likely to be under-performers are also those who give super inflated estimates for simple tasks.
Maybe this is a good strategy for dealing with people who aren’t going to judge you for delivering slowly, or for managers who don’t know what the fuck is going on. For managers who do, they will see right through this.
So many bold claims in this comment and little to no justification.
For what it's worth I've seen pretty much the opposite. I don't know about competent vs. incompetent engineers. But when it comes to experience, I've seen the inexperienced ones giving super low estimates and the experienced people giving larger estimates.
> I've seen the inexperienced ones giving super low estimates and the experienced people giving larger estimates
I have the same anecdotal experience with a possible explanation:
Inexperienced engineers often don't see the greater picture or the kind of edge cases that will probably need to be handled ahead of time. I've often had the following type of conversation:
Engineer: "I think that would be a day's work"
Me: "This will need to interact with team X's package. Have you accounted for time spent interacting with them?"
Engineer: "Oh no, I guess two days then"
Me: "Will this account for edge case 1 and 2?"
Engineer: "Ah yes, I guess it would be three days then"
Me: "Testing?"
Engineer: "Maybe let's say a week?"
On the other hand experienced devs might have their judgement clouded by past events:
"Last time we did something with X it blew out by 3 months" - Ignoring the fact that X is now a solved issue
Ime, as a junior dev/ops person, there is almost always scope creep, and adding padding grants you room to account for the new idea your supervisor/user thought of midway through development. As far as I can tell, my supervisor also assumes my estimates should be padded more, because sometimes you might need to wait on human i/o for longer than planned (holidays/sick leave/...).
I think the general problem on HN is that you can't say something "bold" without people going "nuts" - especially when it comes to estimating work.
In my experience (been hacking since the '90s, before it was cool) great developers are great at estimating things. And these are not outliers: all except one of the great developers I've had the pleasure of working with over the years have never been "off" on estimates by any statistically significant margin. But say anything like that here on HN and it is heresy.
My general opinion is that developers LOVE making everyone believe that software development is somehow "special" compared to many other "industries", for lack of a better word, and that we simply cannot give you accurate estimates that you can use to make "deadlines" (or, better said, project plans). And yet most developers (including the ones raising hell here and downvoting any comment contrary to "popular belief") are basically doing sh*t that's been done a million times before: CRUD here, form there, notification there, event there, etc. It is not like we are all going "oh sh*t, I wonder how long it'll take to create a sign-up form."
I think we have (so far) been successful at running this "scam" whereby "we just can't accurately estimate it", because of course it is super advantageous to us. And WFH has made this even worse in my opinion - given that we can't provide "accurate estimates", now we can simply pad them (who dares to question this? it is just an estimate, right? can't hold me to the estimate...) and then chill with our wives and dogs when the sh*t is done 6 weeks earlier :)
It really depends on what you are working on. When I did agency type of work, building things you have built before, from scratch, it's easy to estimate. E.g. some sort of e-commerce website from scratch. On the other hand, working in a large corp, with massive legacy systems, unknown domain knowledge and dependency on other teams, it becomes near impossible. What might have been a 2 hour task in agency, might either be completely impossible during our lifetime unless the whole company was built from scratch or take 2 years. You might need 6 other teams to agree to make some sort of change, and you first might have to convince those teams, and then 2 teams will initially agree to it, then pull out in the last moment, or realize they can't do it.
> But when it comes to experience, I've seen the inexperienced ones giving super low estimates and the experienced people giving larger estimates
is the essence of this thread
A big problem when you're a more experienced engineer is when you have your hands in a lot of things and know the relative priority of stuff and how likely it is that something else of importance will pop up. So you anticipate things getting sidetracked over time, and try to make a bit of a longer estimate, usually to give yourself the slack to do other important things without looking like you're falling behind in JIRA.
Giving an "if I had nothing else going on" estimate can be a big trap to fall into - they will only see the number and judge your performance based on that. This dovetails into the problem of untracked but still important work being thankless in low-trust environments - not all work can ever be tracked, or else the time to track that work would take as long as doing the work. Examples: literally any emotional labor, time to monitor, time to train, time to document when it's not explicitly required, time to solve little problems.
In the environment where none of this counts because it's not quantifiable, everyone with knowledge makes themselves into a silo in order to protect perceptions of their performance, and everyone else suffers. I'll go even a little further to say that companies that attempt to have no untracked work are by nature far more sociopathic - thus far there are basically no consequences for sociopathic organizations, but I hope one day there will be.
> Maybe this is a good strategy for dealing with people who aren’t going to judge you for delivering slowly, or for managers who don’t know what the fuck is going on. For managers who do, they will see right through this.
I think a manager who doesn't know the difference between an estimate and a deadline is one who "[doesn't] know what the fuck is going on," and that's the kind of manager the GP uses this strategy with.
I've been considered high performer everywhere I went, only when I was beginning I usually gave very low and naive estimates, experience has taught me otherwise. Of course it will also depend on who and why I'm giving those estimations to.
Usually there are just so many unknowns that a higher estimate is justified, to avoid having to explain why you didn't make it by a certain deadline. The estimates I give are not the median or average time I expect the task to take; they are set so that I can be 95% sure it's possible to do it, and then some.
It is, however, a logical follow-up to the manager's behaviour. If they try to hold up the estimate as a deadline, the estimate will be larger next time. If the manager doesn't see this coming, the manager needs to work on some people skills.
The managers are often just as trained in this by the organization. If I'm in a "commitment" organization, I'm sure as hell telling them all to pad their estimates.
The punishment for a commitment culture is inflated estimates.
It's exactly this kind of prideful, egocentric attitude that these managers rely on to get folks to commit to unrealistic estimates, then work nights and weekends (and cut corners) to fulfil them.
This can only hold water in reasonably small orgs. In large orgs you often have to coordinate with a large number of teams to get something delivered. Those teams have changing priorities that can impact when they complete their tasks. Your team's priorities can be shifted too, to fight some fire. So this small estimate has no value, because it has zero correlation to the overall delivery date higher-ups can commit to for the overall project.
> In my experience, super large estimates don’t make you look good in the long run, they make you look incompetent.
The manager needs to know how to make the estimate themselves to know it's bullshit.
If the manager knows how to make the estimate and what it represents, there is no need to inflate it for them.
The hallmark of a bad manager who doesn't know they're a bad manager: "Why can't you just give me a number?" Inexperienced managers or people backfilling for someone else I can completely understand, they're not comfortable with the uncertainty they're dealing with. However in any other circumstance I think it's inexcusable.
But you have to remember that the manager is going to be asked for an estimate by his boss. He can't just say some time between "1 day and 10 years." In the real world, you have to be able to give some sort of estimate and help the poor guy do his job.
And thus we get to the root of the problem. As a business executive, why not simply track how long your big projects tend to take, rather than try to dictate how long they should take?
How can you tell what is worth doing if you don't know how long it might take?
You make projections instead of estimates. You split the work that needs to be done into many tasks and project from past experiences.
You cannot rely 100% on any estimates either, and all you are doing by demanding estimates is creating stress and making people less productive. The meta work imposed by that in itself will make a project take more time, as everyone will be padding their estimates.
What is the difference between a projection and an estimate?
Always reminds me of this
Knew what this was without clicking on it.
Reminds me of Hofstadter's Law: It always takes longer than you think, even when you take into account Hofstadter's Law.
We could say, always say it will take longer than you think?
Though by this principle, it seems that "overestimates" are likely to be actually accurate?
Joel Spolsky wrote about his time estimation software which recorded the actual time required for completion, and then calculated for each person a factor by which their estimates were off, and this factor was consistent enough that it could be reliably used as a multiplier to improve estimation accuracy.
> Most estimators get the scale wrong but the relative estimates right. Everything takes longer than expected, because the estimate didn’t account for bug fixing, committee meetings, coffee breaks, and that crazy boss who interrupts all the time. This common estimator has very consistent velocities, but they’re below 1.0. For example, {0.6, 0.5, 0.6, 0.6, 0.5, 0.6, 0.7, 0.6}
https://www.joelonsoftware.com/2007/10/26/evidence-based-sch...
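The core of that scheme can be sketched in a few lines of Python. The velocities below are the ones quoted above; the remaining-task estimates are hypothetical:

```python
import random

def simulate_completions(estimates, velocities, n=1000, seed=1):
    """Evidence-based scheduling sketch: divide each task's estimate
    by a randomly drawn historical velocity (estimate / actual), sum
    for one simulated total, and repeat to build a distribution."""
    rng = random.Random(seed)
    totals = [sum(e / rng.choice(velocities) for e in estimates)
              for _ in range(n)]
    return sorted(totals)

velocities = [0.6, 0.5, 0.6, 0.6, 0.5, 0.6, 0.7, 0.6]  # from the quote
estimates = [4, 8, 2, 16]  # hypothetical remaining estimates, in hours
totals = simulate_completions(estimates, velocities)
p95 = totals[int(len(totals) * 0.95)]  # 95%-confidence completion time
```

Rather than one padded number, you get a whole distribution of plausible ship dates, and can quote whichever percentile the situation demands.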
Doesn't the article say that for experienced developers, the scaling factor tended to converge on an average for each individual, even if it was variable for any particular task?
And Joel sidesteps the unknown-unknowns problem in that piece, by discussing boiling down tasks to <1 day chunks.
But what if you need to build a prototype before you sufficiently understand the project and options to decide on an approach? Where does that time get estimated?
The more projects I work on, the bigger of a fan of spiral development [0] I become.
Because, at root, there are 2 independent variables that drive project scheduling -- remaining work and remaining risk.
This estimation problem would drastically simplify if it allowed for "high confidence, 30 days" and "low confidence, 5 days" estimates.
And critically, that could drive different development behavior! E.g. prototype out that unknown feature where most of the remaining technical risk is.
Trying to instead boil that down to an un-risk-quantified number produces all the weird behaviors we see discussed elsewhere in the comments.
Problem is anything non-trivial can't be estimated: deadlines are useful to timebox the process and keep it accountable with a follow-up to see if it's worth continuing. Otherwise it becomes "just trust me bro" which is equally unfair in the opposite direction.
Some of the best career advice I got was very early on at my first gig - I had a designer tell me, over a cup of sake, that I should just inflate all my estimates by 60%. 30% to cover the stuff I hadn’t thought of, 30% to cover what they hadn’t thought of.
That sounded insane to me… nearly two decades later, with plenty of remote freelance and full time onsite team experience under my belt… and I fully agree. It’s always going to take significantly longer, and if you pretend it’s not, it’s going to come down on your head, like it or not. Always better to underpromise and overdeliver than the other way around.
The article is about modernization projects, which have soft deadlines because ostensibly the legacy software is still running while you're developing the replacement. There's always budget pressure, and promises may have been made, and users may be hoping for new features and eliminated frustrations... but if the replacement is a day late, it really wouldn't matter much.
Conversely, if you're trying to launch a space probe and the planets are no longer in the right positions for the required gravity assist, your spacecraft will not get where it needs to go. Or if you're a little $100M/yr toolmaker, and Ford asks you for a die for the 2026 F150 production line, to be delivered by March, and the contract states you owe a $20,000 per MINUTE penalty if you're late...you don't wait until February to say something surprising happened and it's not going to be ready. You don't sign on that dotted line unless you know for certain that you can do it.
Ford or NASA won't bat an eye when you tell them that a quote is going to cost $XX,XXX. They won't be surprised when they give you an ECO and you say that it's going to take 3 weeks and $8,000 to deliver a part that everyone knows you can probably make by hand in 30 minutes, they know that you're hedging against those deadlines, and pricing in the acceptance phase and inspection phase and contingency plans and everything else that makes their deadline-heavy industry function.
But if you tell someone at OP's modernization group that due to incomplete information you think that the 30-minute task to change the text of that button will take "no more than 3 weeks and $8,000" they'll laugh you out the door. Optimistic estimates get rewarded, pessimistic estimates get discouraged, accurate estimates are irrelevant, and in the end you're constantly behind schedule and no one's really surprised.
One technique is to run the modernization project, but use maintenance of the legacy software to keep the business going. Such maintenance could be for keeping up with hardware changes, OS upgrades, new features, and so forth.
I've seen projects run in parallel like this for 10+ years.
In 1505 Michelangelo gave an estimate of five years to finish the tomb for Pope Julius II.
Due to small side projects like the painting of the Sistine Chapel ceiling it took around 40.
Failure to meet the deadline informed by the estimate meant that the scale of the project was massively reduced because: Pope Julius II had died prior to completion, there were changes requested by the customer (both Julius and his heirs), supply chain issues, contract renegotiations, labor disputes, shortages of qualified workers, and money running out due to the long duration of the project.
So, since 1505 at least?
The funny thing is that the pope isn't even interred there.
"Can you imagine if the insurance company started arguing with the repair shop, asking them—no—telling them that they would only pay the $18,000 and not the additional $20,000 because that was the original estimate? Does that sound ridiculous to you? It does to me, too. Thank heavens, reality does not operate like this."
That happens all the time with insurance. I'm surprised at the confident tone in "reality does not operate like this". Not just car/home insurance either...health insurance also. They do often negotiate to a reasonable place, but not always.
It's also why they will write off a car if the estimate is much more than 70% of the value of the car. Yes the write-off may cost them more, but it's a known upper bound and it closes the claim. Insurance companies don't like open claims.
It depends on whether we're talking about estimates or negotiated rates.
In the latter (also called "preferred rates" for auto insurance) there's a blanket agreement that all work of a certain type is to be billed at a fixed, negotiated rate.
That is VERY different from a binding estimate, which typically means a one-off estimate for a specific job where the estimator takes the risk and promises to complete the work at the rate, even if it's much more complicated than they bargained for.
There's also whether they will pay at all for some things. They have some discretion to claim some damage was existing, or unrelated to the claim, caused by the repair shop, not normally covered, etc. Or where they won't pay for OEM parts if some substitute is available. There's lots of ways they can shuffle around.
As much as I want to take a fair and balanced perspective to the article, the author works for NYTimes. They’re overpaid for what they do and their worldview is warped and pampered. It taints the article for me to approach it with both sides being biased, and the subject matter tone being, for the lack of a better word, insufferable.
I learned something important early in my career: the first number you put out will be remembered.
Unfortunately it’s often true. People keep saying: "but didn’t you initially say X?"
"Sure I did, but I have new knowledge" won't always work.
A nasty side-effect is that people who are aware of this shy away from giving you numbers.
> Kirk: Mr. Scott. Have you always multiplied your repair estimates by a factor of four?
> Scotty: Certainly, sir. How else can I keep my reputation as a miracle worker?
There’s a whole generation of developers who have internalized this.
There’s a whole generation of management who have caused this due to their own behavior.
> the first number you put out will be remembered
This is called anchoring effect, a psychological bias.
I think you meant "anchoring bias": https://www.scribbr.com/research-bias/anchoring-bias/
Anchoring effect is something related but subtly different: https://en.m.wikipedia.org/wiki/Anchoring_effect
Is it? If you visit
https://www.wikipedia.org/wiki/List_of_cognitive_biases#Anch...
and follow the first link labeled
Main article: Anchoring (cognitive bias)
you may be surprised. Actually, you may also read the first link you sent and be equally surprised:
Anchoring bias (also known as anchoring heuristic or anchoring effect)
https://www.scribbr.com/research-bias/anchoring-bias/#what
Of course businesses want to be able to de-risk by having highly accurate predictions of the future. Too bad those don't exist in any domain of business planning.
More often, the focus on estimating comes from management layers where incentives are not structured to reward anyone for accurate estimates, merely to punish them for missed deadlines.
Time to finish is only one dimension of estimation. With any unit of engineering work there may be code debt added or removed, complexity increased or decreased, morale increased or decreased, etc.
Focusing only on time, especially in a punitive way, surely negatively impacts the others.
After too many iterations of providing "some wild-ass guess" estimates and them turning into hard deadlines, I now try to champion a No Estimates[0][1] approach with stakeholders.
There is often understandable resistance to this at first. To address concerns, I find it helpful to share with stakeholders that a reasonably accurate estimate, one which could be legitimately used in planning concerns, is really only possible in one of two situations:
A) the outstanding work is a carbon-copy of a previous
effort, such as the second time provisioning a data
center for the same system.
B) the remaining new functional work is determined
by the team to be in the last quartile and is
well-defined, including remaining risks to
successful completion.
EDIT: Micro-estimates are the enabler of micro-management. A healthy team identifies the highest priority tasks to perform and does so in descending order, where priority is defined as risk to project success.
0 - https://www.youtube.com/watch?v=MhbT7EvYN0c
1 - https://www.goodreads.com/book/show/30650836-noestimates
This is a fun formula that caught my eye a while ago on HN; it looks flashy and very cool. Of course, just like other people's estimations, this one is just a made-up formula without any formal validity, apart from (supposedly) personal experience:
https://news.ycombinator.com/item?id=37965582
My estimate math:
R = t × [1.1^ln(n+p) + 1.3^X]
R - time it really takes.
t - shortest possible time it would take without need to communicate.
n - number of people working and involved during the process, both customers and developing organization.
p - longest communication distance network involved in the project (typically from the lowest level developer to the end user)
X - number of new tools, libraries, techniques, used in the process.
Example: a project involving one developer writing code. The project would take 2 weeks (t=2), but it has 5 people (n=5) involved in total, only 1 new tool (X=1), and the longest communication distance is 4 (p=4).
2×(1.1^ln(5+4) + 1.3^1) ≈ 5.1 weeks.
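For anyone who wants to play with it, the formula translates directly to code (and, as said, the constants are made up, so treat the output accordingly):

```python
import math

def really_takes(t, n, p, X):
    """R = t * (1.1**ln(n + p) + 1.3**X), per the formula above."""
    return t * (1.1 ** math.log(n + p) + 1.3 ** X)

# The worked example: t=2 weeks, n=5 people, p=4 hops, X=1 new tool.
print(round(really_takes(t=2, n=5, p=4, X=1), 1))  # prints 5.1
```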
The X factor is likely correct, but needs some additional notes.
Include additional X quantity for unknown unknowns, not just the known unknowns.
Estimates are tricky because different manager roles and different personalities bias toward totally different/incompatible concepts of what an estimate actually means. The author's article is conflating realistic and pessimistic estimates:
- Realistic, e.g. tech managers and people who favor agile/lean/XP/etc.
- Optimistic, e.g. sales managers and people who want to promote.
- Pessimistic, e.g. risk managers and people who need firm deadlines.
- Equilabristic, e.g. project managers and people doing critical chain.
The abbreviation is ROPE, and it turns out to work really well in practice to cover all four bases. My notes are below. Constructive criticism welcome.
Sounds like they need curves (probability distributions), not point estimates.
+1. I wrote myself a script that does this with distributions based on my personal time tracking data for doing certain tasks.
More concretely, I sample with replacement N times from the empirical distributions of each step, then sum the steps to get N “samples” from the total distribution.
This is called bootstrapping: https://en.m.wikipedia.org/wiki/Bootstrapping_(statistics)
It isn’t too hard to do, and I can confirm that it works reasonably well.
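A sketch of that resampling approach, assuming you have a few observed durations per step of a recurring task (the hours below are invented for illustration):

```python
import random

def bootstrap_totals(step_history, n=10000, seed=0):
    """Draw one observed duration per step (with replacement), sum
    them for one simulated total, and repeat n times."""
    rng = random.Random(seed)
    return sorted(sum(rng.choice(obs) for obs in step_history)
                  for _ in range(n))

# Past durations (hours) observed for each step of a recurring task.
history = [[2, 3, 2.5, 4], [1, 1, 2], [5, 8, 6, 7, 12]]
totals = bootstrap_totals(history)
p50 = totals[len(totals) // 2]          # median: a "typical" run
p90 = totals[int(len(totals) * 0.9)]    # what to quote to a manager
```

Quoting a high percentile rather than the median is what turns a point estimate into a hedge you can actually stand behind.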
One trick, if you can get away with it, is to ensure that you are always estimating for a fixed scope exclusive of unknown unknowns.
You should not provide an estimate for "feature X implemented", but rather for "feature X engine". If you discover additional work to be done, then you need to add "existing code refactor", "feature X+Y integration", etc. as discovered milestones.
Unfortunately, you need that nomenclature and understanding to go up the chain for this to work. If someone turns your "feature X engine" milestone into "feature X complete" with the same estimate, you are screwed.
------
There is a related problem that I've seen in my career: leadership thinks that deadlines are "motivating".
These are the same people that want to heat their home to a temperature of 72F, but set the thermostat to 80F "so it will do it faster".
I was once in a leadership meeting, where the other participants forgot that I, lowly engineer, was invited to this meeting. Someone asked if we should accept that deadline X was very unlikely to be met, and substitute a more realistic deadline. To which the senior PM responded that "we never move deadlines! Engineering will just take any time given to them!"
Engineering, in that case, gave the time back when I left that team.
The best estimation system I've seen is to actually BET on the completion date, closest gets a free lunch paid for by other members of the group.
Those dates were mostly informed guesses of what would actually happen or go wrong. Importantly this was between friends. Needless to say, they turned out very accurate.
In my direct personal experience, for me it was 1969 during my first major gig. I remember one project, called the March 1 system, that slowly came into being sometime that October, if my memory serves me correctly. There was tension, of course. This was exacerbated by having to work nights to get access to the system to continue development.
I eventually learned, when asked for a "quick" estimate, to give something drastically longer than I knew would be accepted, then say, "But I can give you a better estimate if you give me a few days to do a better plan." This always got me the extra time to provide a proper estimate.
My technique is to give an estimate with error bars. Something like 6 weeks plus or minus 2. That then leads into discussion as to what is unknown/undefined leading to the uncertainty. Sometimes the error bars are larger than the estimate because I know there will be endless feature creep and UI revisions for something like "analytics". And sometimes the number could be negative like 6 weeks plus or minus 8. That is when the functionality already exists (eg the data is already there - you can already load it into Excel and a pivot table) and could be sufficient.
At my very first job out of college,¹ the VP who led the division that I was part of invited me into his office (I think there was someone else from the team, but this is 34 years ago so who knows) and talked about a little game he had encountered about estimates. He gave us a list of 20 things to estimate with 95% confidence (the only one I remember was the weight of a 747), all given as a range. If we did it right, we would get 19 of the 20 correct. The point was to set the range large enough that you could be confident about the answer.
I kept that as a practice when I did freelancing work, always giving my estimates as a range rather than a single number. It would be nice if agile(ish) practices incorporated this in the estimation process.
⸻
1. Technically, my second job since I had a short temp job doing some cataloging work at a local used bookstore between leaving school and starting at that company.
I often see takes on this topic from the engineering side: "It's hard!" "Managers just don't understand."
It feels like as a community, it would be useful to get more articles seeing things from the other side and exploring functional approaches beyond provide-a-worst-case-scenario-estimate.
There's a reason this dynamic is so pervasive. In order for everyone in an organization to do their job well, people do often need a realistic set of estimates. Can sales promise the prospect this integration? Can marketing plan a launch for the new feature? Can the CPO make a reasonable bet on a new line of work?
In my experience, the nuance here is more about handling the mis-estimates. How do we discuss the risks up front? How much work should we put into contingency planning? How do we handle the weeks / months before a deadline when it is clear that we need to limit scope?
This is it. I'm a hardcore engineer at heart who has a lot of these sales, marketing, and product folks as friends, and can attest to the fact that they also have constraints.
The whole world runs on deadlines and timelines. Even a president is elected for a specific duration. If you're in a B2B setting, the customer demands (sometimes even contractually binding) at least the Quarter when something will be delivered.
Time is the only common denominator by which different activities can be coordinated. Without some heed to time, there will be no coherence.
Not only that: where I work, estimates are actively being pushed down in the name of efficiency, and no spillovers are allowed. I've been in crunch-like mode for over a year with no recuperation at all. I kept hoping it would get better, but it seems to be getting worse. On top of that, we're all rotated to different areas of the product all the time with no recourse. Though I'm still running along, I feel absolutely spent mentally...
> When did estimates turn into deadlines?
In my personal experience: the first time I gave what could be construed as an official estimate.
The deadline portion comes in when someone expects to pay for the work, or to pay the opportunity cost of the work not being completed...
I would say that the trend against actual agile and towards waterfall (PRDs are totally en vogue) suggests that deadlines are how most company management looks at this. Definitely not suggesting it is right, but "really aggressive 'estimates' (that are accurate beyond human ability) you can plan on", aka deadlines, are expected of any product/engineering org today.
Tell me a story I want to believe, even if it isn't true. Then you can make it the team's fault, because they said they would and didn't.
When I give an estimate, I always reiterate that it's just an estimate that is based on limited and incomplete information, and that the real number can be more or less. If they don't like it, they're free to put someone else on the project.
This is especially problematic in an early-stage business, when the team is small but managers act with the same policies and processes as at BigCo.
In this case, don't estimate the time to build a poorly defined $something - invert the problem and estimate the value of $something.
It's amazing how many of those managers asking for estimates push back when they have to put one out. With all the same reasoning that engineers have when estimating.
A good manager should start with the Value first and allocate a time budget that makes that Value pay off.
Working in smaller steps is how you should build software. Constantly get feedback and re-evaluate what you're working on with other members of the team. Instead of giving an estimate, use t-shirt size.
With constant feedback, the whole team is participating in the emergent complexity, instead of being passive and just annoying you with "is it done yet"?
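For what it's worth, a t-shirt scale is really just named ranges. Something like this illustrative mapping works (the day ranges here are my own assumption, not any standard):

```python
# Illustrative t-shirt sizing: each size is a wide duration range, not a date.
# These day ranges are an assumption for the example, not a standard.
TSHIRT_SIZES = {
    "S":  (1, 3),
    "M":  (3, 10),
    "L":  (10, 30),
    "XL": (30, 90),
}

def describe(size):
    low, high = TSHIRT_SIZES[size]
    return f"{size}: roughly {low}-{high} days"

print(describe("M"))  # M: roughly 3-10 days
```

The overlap-free jumps between sizes are the feature: they stop anyone from treating the label as a precise date.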
Agile pretty much throws out estimating anything bigger than a sprint. Even then, points don’t mean time and velocity can be wild for a mature team.
but what if I’m working on something meaningful?
I can’t MVP my way to a simulation physics engine, when each feature or partial feature requires weeks of planning, testing, iterating, and tweaking, privately, before anything can be delivered to be used.
Feedback implies a working widget.
Feedback does not need to be on an MVP; it can come earlier, from your fellow engineers. That said, there are tasks that really do take a lot of research before even fellow engineers can give feedback.
I definitely relate to the basic issue of estimates with my one-person consulting projects. The bit they didn't mention is: who pays for the cost of the inspections and analysis? In software, it can take a long time to analyse the requirements and the solution, in order to come up with an estimate (or fixed quote), and I find it awkward trying to get paid for that time.
Do a roadmap. Fixed scope small project that you get paid for where you define requirements. Deliverable is a functional requirements doc. It derisks the project for them, and gives you more confidence when quoting a fixed scope fixed price project. You just don’t guarantee timeline. I’ve been doing it for over a decade - it works wonders.
Best way but hard to find clients open to this as it requires more trust especially if it starts to drag on.
What is the total cost of these projects and how much are you typically charging for the requirements doc?
I'm skeptical that most of the clients I work with would go for something like this. Generally my projects are in the $7k-20k USD range (though with some clients the total spend is much greater with additional phases of work...)
Also, what do you mean "you just don't guarantee timeline"? Why not?
I normally only commit to "We will give an update in X (minutes/hours)" so we make sure we understand the problem first. Then we start giving estimated timelines as ranges, with specific call-outs for updates and possible changes to the timeline.
I've found that most management just want to be involved in the process and have definite times set for updates and can handle timeline changes as long as information is coming at regular intervals.
I learned how to manage client expectations from Scotty in Star Trek: https://youtu.be/L3jXhmr_o9A
Multiply your original estimate by 3, works most of the time
Heuristics I'd learned was "double the time and bump the unit".
So: 2 hours -> 4 days, 1 week -> 2 months, etc.
(I'm not sure where this turned up, but it's a long time ago, going on three decades.)
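The heuristic is simple enough to sketch. Here's a toy version (the unit ladder is my assumption about what "the next unit" means):

```python
# Toy version of "double the number, bump the unit".
UNITS = ["hours", "days", "weeks", "months", "years"]

def bump_estimate(amount, unit):
    """2 hours -> 4 days, 1 week -> 2 months, etc."""
    next_unit = UNITS[min(UNITS.index(unit) + 1, len(UNITS) - 1)]
    return 2 * amount, next_unit

print(bump_estimate(2, "hours"))  # (4, 'days')
print(bump_estimate(1, "weeks"))  # (2, 'months')
```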
The other option is to carefully track tasks, relevant dimensions, estimates, and actual performance, and see if there's any prediction modelling which can be derived from that. Problem is that this a classic instance of modelling in which the model itself affects the domain (as with economics and sociology, contrasted with astronomy and meterology where this isn't the case), such that estimates incorporating past performance data will incorporate the fact that the prediction model is being used.
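For the tracking option, the simplest possible prediction model is a single least-squares correction factor fitted to past estimate/actual pairs. A sketch, with invented sample data (and the same caveat applies: once people know the factor, it changes how they estimate):

```python
# Fit a single correction factor k to historical estimate/actual pairs,
# minimizing sum((actual - k * estimate)^2). Sample data is invented.
def correction_factor(estimates, actuals):
    num = sum(e * a for e, a in zip(estimates, actuals))
    den = sum(e * e for e in estimates)
    return num / den

past_estimates = [2.0, 5.0, 8.0, 3.0]   # days, as estimated
past_actuals   = [3.5, 9.0, 20.0, 4.0]  # days, as actually taken
k = correction_factor(past_estimates, past_actuals)
print(round(k, 2))  # corrected estimate = k * raw estimate
```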
> (I'm not sure where this turned up, but it's a long time ago, going on three decades.)
That means it was 1.5 years (wall clock) ago right?
I like this method.
So 1 year -> 2 decades?
Yes.
Note that this also comes out about the same if you're using months (12 months -> 24 years (2.4 decades)) and (at least within a factor of two-ish) weeks (52 weeks -> 104 months (0.867 decades)).
I'm not claiming this is accurate, I'm stating that it's a heuristic I'm familiar with.
This may have been in Arthur Bloch's Murphy's Law and Other Reasons Things Go Wrong, though I'm not finding it in comparable collections. Possibly from project planning literature I was reading in the 1990s (DeMarco & Lister, Brooks, Boehm, McConnell, etc.).
Then you get the dreaded "can you explain why it takes so long?", or "I asked engineer xyz and they gave a different estimate; where is the complexity?", etc.
If you already asked someone, I'm not needed here so I'm going back to work on project Y that already is late enough.
I prefer pi myself. For the sort of false precision that managers love.
I've heard the "multiply by 2, use next time unit" rule. So 1 hour -> 2 days, 2 weeks -> 4 months, etc.
Montgomery Scott recommends 4 instead.
A friend suggests doubling the number and increasing to the next unit — hours become days, days become weeks, etc.
I've certainly seen some environments — plural — where a task that should take 1 hour actually takes 2 days, and one that should take 2 days takes 4 weeks.
By the book admiral, hours could seem like days.
Multiply by π — it's more accurate.
Accurate and precise are different things :)
Of course you’re right, however:
1. If we’re talking about bamboozling people who are either ignorant or willfully incapable of understanding estimation, excess precision is a stupid but useful tool for getting a multiple of 3x-4x past them.
2. If the target audience understands excess precision, the excess precision is a nice little in-joke to flatter them and acknowledge their cleverness.
3. The absurdity of the precision illustrates the very imprecision of the estimate.
> Can you imagine if the insurance company started arguing with the repair shop, asking them—no—telling them that they would only pay the $18,000 and not the additional $20,000 because that was the original estimate? Does that sound ridiculous to you? It does to me, too.
No, actually, this sounds quite realistic and in many cases even reasonable. Medical insurance does this literally all the time. Insurance companies of many different kinds in general have to do this to prevent fraud and keep costs (and by extension premiums) low.
> There is no fixed path when modernizing a complex legacy system. There is no rulebook to follow.
Of course not. But as an engineer tasked with this kind of project, your estimate should reflect some kind of plan for accomplishing it, where the plan has more concrete and actionable steps that make estimation easier. And if you are good at your job, your estimate should account for known unknowns (e.g. this part is contingent on something partially out of our control and may be delayed by X months) and give enough slack to handle inevitable unknown unknowns without excessive padding. And uncertainty regarding any particular risks, or variance in how long something takes, should be explicitly noted and communicated.
My $0.02:
The unfixable estimate vs deadline problem ultimately boils down to money. A particular project might be worth it if it ties up 10 people for 3 months, but not worth it if it ties up 10 people for 12 months. And relatedly, businesses oftentimes find themselves needing something finished by a certain time to close a sale/keep a customer happy/catch up to a competitor, and the stakeholders on the other end (customers, investors, partners) need some kind of date for their own planning.
Good estimation is really about identifying the probability distribution of the completion date, rather than a single "expected" completion date, and identifying/communicating the related risks. For a big project this itself probably will take a few days to months to complete and communicate. If you do this properly, there should never be any ambiguity as to what is an estimate and what is a deadline. If you don't do this, you are basically assuming that your manager (and their managers and so on) can read your mind and have the exact same understanding as you do regarding how firm the estimate is + what the risks and possibly "actual completion times" are. You're also denying them the opportunity to help you derisk or accelerate the project by eg adding more people to work on it, reach out to some external stakeholders, or communicating the end date to eg investors/customers in a way that reflects risks. You also need to update the people that care about the completion semi regularly to tell them about any new risks or known unknowns for the same reasons.
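A minimal sketch of "estimate as a distribution": draw each task's duration from a skewed distribution and report percentiles of the total instead of one date. The lognormal parameters here are invented for illustration:

```python
import random

# Sum per-task durations drawn from lognormal distributions, then report
# percentiles of the total. Task parameters (mu, sigma) are invented.
def simulate_completion(tasks, trials=20_000, seed=42):
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.lognormvariate(mu, sigma) for mu, sigma in tasks)
        for _ in range(trials)
    )
    pct = lambda q: totals[int(q * (trials - 1))]
    return {"p50": pct(0.50), "p90": pct(0.90), "p99": pct(0.99)}

result = simulate_completion([(1.5, 0.5), (2.0, 0.7), (1.0, 0.4)])
print({k: round(v, 1) for k, v in result.items()})
```

Communicating the p50/p90/p99 spread (rather than one number) is exactly what removes the ambiguity between "estimate" and "deadline".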
If you are wildly off with your estimate even after spending the time to break down the project into a more actionable plan, identify risks, and come up with ranges/a distribution of outcomes - let's say you blow past your p99 time to completion - you are probably just bad at your job (or trying to operate above your skill level), and it would be reasonable for your manager to hold you to a deadline with the threat of firing you/canceling the project. Yes, the sunk cost fallacy dictates that even projects that run over should probably be completed in many cases (though since overall budget/personnel are usually constrained, I think the opportunity cost of doing other things becomes pretty important). But if people can't trust you to estimate properly, they probably can't trust you to lead a project or to have estimated the remaining time to finish the project.
This is a very good comment and does reflect my experience. As engineers, we're the only people who can estimate something close enough, and it becomes our job to do that while taking into account the risks.
Our bad assumption is in thinking that only the final output matters, regardless of when and where it is delivered. Like saying that it only matters if the train arrives at the station, regardless of when it does.
The problem is anyone depending on us downstream will get impacted. And yes, estimation is tough, requires foresight, and maybe a lot more things, but that's what being a professional means.
when an external / vendor team picks a project...
estimates = contract amount = project budget ...
it is hard for a project team to negotiate later
often the estimates need to be competitive, or the very lowest, for you to win the contract
no one acknowledges the unknown and risks at the beginning of the project
writing down more than 4-5 risks along with the bid amount is taken as a sign of a team that will not be easy to work with and fight over every bullet point whether something is in scope or a change request
and often the one who estimates and wins the project may not be on the actual project team with delivery accountability
Estimates turned into deadlines around the time someone came up with the concept of an estimate.
David Parnas wrote in A Rational Design Process (1985) [1] that accurate estimates are essentially unattainable, but we should try anyway. The point is, I think, that the ceremony of a structured process facilitates communication better than no ceremony.
[1] https://users.ece.utexas.edu/~perry/education/SE-Intro/fakei...
Estimates have always been deadlines; I've seen it since at least 1999. Either your estimate goes up a certain number of management levels and then nobody wants to lose face, so you are stuck with it. Or a salesperson has seen it and promised it to a customer, and nobody wants to be the one who loses the deal, so you are stuck with it.
A lot of ink has been spilled on this topic.
The solution is simple: get better at estimating.
Software engineers act as if they're the only ones in the world who are asked to estimate and then held accountable.
It's a skill issue.
> It's a skill issue.
Maybe. Sort of. Skill hints at the problem, but it's more of an experience issue. More experienced developers can give pretty reliable estimates for work they've already done. The catch is: how often do you ask an engineer to do something they've already done? The beauty of software is that you don't solve the same problem over and over, because if you did, you'd automate it.
So where does that leave us? Well, developers who've seen a lot of different problems recognize when a new problem rhymes with one they've solved before. That can allow for a very good estimate. But even then, engineers tend to think in terms of the best case, or maybe the typical case. I saw a study on this a few years ago: engineers' estimates tended to cluster around the median time to a solution. But since really long tasks with big error bars have a tendency to skew timelines, the average completion time landed much farther out than those median times.
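That median-vs-mean gap is easy to demonstrate with any right-skewed duration model, e.g. a lognormal (parameters invented):

```python
import random

# Draw 10,000 task durations from a right-skewed lognormal distribution
# (parameters invented) and compare the median to the mean.
rng = random.Random(0)
samples = sorted(rng.lognormvariate(2.0, 1.0) for _ in range(10_000))
median = samples[len(samples) // 2]
mean = sum(samples) / len(samples)
print(f"median ~ {median:.1f}, mean ~ {mean:.1f}")  # mean sits well above median
```

An engineer who quotes the median of their experience will, on average, undershoot, even though most individual tasks finish near that quote.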
Believe it or not, lawyers have this problem too. They are taught in law school to come up with an estimate based on expected effort, then apply a formula similar to: double the estimate, then increase the time units. So a 5-hour task becomes 10 days. A 2-week task becomes 4 months. Mind you, this isn't the amount of billable hours they're predicting; it's the amount of calendar time a given task will take until it's complete. This ends up accounting for a lot of variables that lawyers have no control over, like court scheduling or communication delay. I suspect it also accounts for blind spots we humans have around time estimates in isolation. One task, if you can focus on it, can be done in so many hours. But if you have to do 5 similarly sized tasks, it takes more than 5 times as long, simply because it's easy to deplete resources like brain power, and the estimate ignores the refractory periods necessary once a task is completed. (BTW, this was one of the problems with the very first stopwatch study in the rail yard, where the rail workers didn't work at their sustainable pace, but at their depletion pace.)
Yes I think you're right about it being based on experience.
It's still a "skill issue" but it's more about knowing how long it will take you or your team. If you don't have the relevant experience with the tech or with the team, you can't really gauge how long something will take.
Before I estimate larger projects I always ask who will be on my team. The estimate is very different depending on the answer.
You're correct to a degree, but IME it's a discipline issue, not a skill issue as such.
What makes software hard to estimate is not so much the work, but how much requirements, priorities and resourcing change on the fly. I've worked in places that did quite strict scrum: once the sprint was planned and agreed to, we pushed back hard on our business area if they wanted to change anything. For practical reasons we didn't outright ban it, but for example, if they wanted something new done "right now" they had to choose something of roughly the same scope to be removed. If they couldn't do that, we'd cancel and re-plan the sprint. They hated having to make that choice, so most things just ended up at the top of the backlog for the next sprint.
The other part was our own management wanting to take people out of the team mid-sprint to work on something else. We never said no, but did say "this means we have to cancel and re-plan the sprint (and tell our business area)".
Basically, we made the people making the changes take responsibility for the impact on deadlines. Our estimates in that scenario were quite good (within the bounds of the sprint, at least).
Of course, the longer the estimation window the less well this works. The only way to estimate accurately is to constrain scope and resources, which is not actually what management/business want.
If you're hiring architects and engineers to design and build your home, you might already have a pretty good idea of the home you want. The people you've hired provide estimates on cost and timing based on solidly known quantities. They've put in a basement like that before. They've worked with your weird materials. Their vendors report on material status daily.
Software development is not surrounded by this sort of infrastructure or codification. My discovery process establishes these lines of communication, and I have no idea when I'll uncover a black box or when one will be imposed upon me.
It's literally impossible to know what you don't know. Sometimes you discover things in the process of your work that couldn't have reasonably been accounted for on first estimate, and it is not good practice to always give an estimate that assumes a 5x difficulty multiplier for some unforeseen reason.
Yes, if this is an "every time experience" then there could be a skill issue or possibly some other undiagnosed or unrecognized systemic factor
No, it is a funding issue. Nobody wants to pay software engineers for actually doing prestudies to learn enough to do a proper estimate. They want a good estimate but do not want to pay for it.
Nonsense. Code is copy-pasteable; other things are not.
One can give very accurate estimates of how long it takes to build a brick wall because building brick walls takes time and labor. You can make highly accurate estimates of how long it takes based on how long it has taken to do the exact same task in the past.
But suppose I laid one brick and then could copy-paste it, then copy-paste that into four bricks, then eight, until I have a wall. Then I can copy-paste the entire wall. Once I have a building I can copy-paste the entire building in five seconds.
The ability to copy and paste an entire building is very valuable but how long does it take to create that copyable template? No one knows because it has never been done before.
Building brick walls is easy to estimate not because it is physical labor but because the company has done it many times before. Ask a construction company to estimate a unique one-off job and they will most likely fail. And that is despite them getting a lot of resources for investigating and planning which software engineers almost never get.
Yes exactly this. Perhaps when I said "skill issue" I should have said "experience" instead as another poster suggested.
Sometime around 1956 at a guess.
Sadly, estimates are a negotiation. Whoever provides a number first generally loses because of "The Wince".
"What's your estimate?" "I'm not sure." "Just ballpark it."
"Well, when would you want it by?"
This is the trap that new managers will fall into every time. If they give you an answer? Bingo. You give them "The Wince":
You suck air between your teeth, and with a frown say, "Oh, that's completely unrealistic. Where in the world did you get that number from??" Then you provide a number that's many multiples higher, or offer a reduced amount of work: "Oh, gosh. We'd only be able to get X feature done in that amount of time, and only if we got lucky and cut feature Y from the other project."
Regardless, whatever you do, don't be the first to provide a number. It takes a little bit to get the feel for it, like poker or buying a car.
Time is money, even in a big corporation. Treat it as such: It's a zero sum game.
> Can you imagine if the insurance company started arguing with the repair shop, asking them—no—telling them that they would only pay the $18,000 and not the additional $20,000 because that was the original estimate?
Well yeah, because there's not an inherent power imbalance like there is in employment.
Part of this imbalance is the ability of managers to impose Taylorization on their directs. Most of the time Taylorization hinders workers, but management loves it because it gives them more control over outcomes. What ends up happening, though, is that a shadow work plan gets established that management has even less control over, unless they want to drive out top talent by applying technocratic solutions to social problems.
Also, insurance does do stuff like that.
They do, I considered adding that, but felt like it detracted from the point.
Any passing glance at the healthcare system immediately reveals that doctors don't give patients cost estimates; instead they inflate costs for insurers and then settle at negotiated rates. Often they tack on procedures that weren't part of the initial plan, due to unforeseen circumstances, while the patient is unconscious and unable to consent, and you can go into a hospital that is ostensibly covered only to find out that a particular provider isn't. It's as if your treads were discovered to be bare, the shop decided to replace them, and then the guy who does wheels sent you his own bill a month later.
When marches were replaced with sprints.
These are all just tricks to reduce cost and make sure you get things done. You have competing constraints:
# You need to align the rest of your team on things: maybe you have marketing tied to real world events, you're hiring and training to a sales cycle. If you do it wrong you waste money
# You need to fight scope creep which happens with unbounded timelines
# You need to fight a minimal product that's not viable because it isn't done enough for your market
and more. The ideal is for the decision maker to constantly have full information on progress and make instantaneous minute adjustments to keep things going at full steam. But there's no way to have full information and there's no way to make instantaneous minute adjustments if you're not a one-man shop. So everything else comes from the organizational effort of trying to work with that.