Friday, December 28, 2007

Project Management and Politicians

About once a month, the papers in Wisconsin's state capital of Madison run an article on the latest scandal related to a state IT project that is over budget, late, mismanaged, and generally messed up. Over the last year or so, we've seen a driver's license system get bad press because it functioned worse than the system it replaced, scandals over e-mail systems and server consolidation projects at state agencies, a parting of ways with a consulting firm that was failing to deliver a voter registration system (http://wistechnology.com/article.php?id=4414), an audit report identifying numerous projects that were behind schedule and over budget, and now today's latest about a state Medicaid program that is exceeding its original budget and timeline (http://www.madison.com/wsj/home/local/index.php?ntid=264282&ntpid=3).

State politicians decry the mismanagement, announce investigations, agency directors resign and replacements are appointed with promises to do better, and the mess continues. One problem is that politicians and political appointees have no business overseeing projects, especially IT projects. It's one thing to build highways and bridges on time - these are known quantities with plenty of supporting historical data on which to build good budget and schedule estimates. (Boston's Big Dig, a project which was quietly closed recently, does not count; no one had ever attempted anything of that scale before. On the other hand, the Marquette Interchange project in downtown Milwaukee is faring pretty well.) IT projects, as I may have mentioned in this blog, are completely different animals.

When huge IT projects start to go awry, as they so often do, the political motivation is to cover up the problem rather than apply project management practices to it. Doubtless, on most if not all of these projects there is a state employee - a project manager - who knows the right things to do and wants to do them: report status accurately, hold vendors and project team members accountable, and find ways to recover the project, or at least update plans and status to reflect what is really happening. But at each reporting level above that person, there is someone whose cozy job depends on bad news not getting out for as long as possible.

Successful, professional project management depends on transparency and on the absence of negative consequences for project managers who perform their function properly. It's not breaking new ground to state that successful projects require an environment in which project managers and project teams are rewarded for honesty and collaboration, and held accountable for obscuring facts and real status, engaging in turf wars and empire-building, and generally failing to collaborate for the benefit of the project and its stakeholders.

A system in which political appointees and/or elected officials are responsible for overseeing project management of state IT projects is doomed to fail. Only an independent, professional project management office could provide the processes, leadership, and oversight needed to ensure that state IT projects are effectively - and, more importantly, honestly - managed. Wisconsin's state IT projects have been the subject of numerous "official" proposals for better oversight. None of the proposals or ideas I have seen calls for leadership and oversight by senior-level professional project managers.

I think this means that we will be reading about troubled Wisconsin IT projects for years to come.

Friday, December 21, 2007

Why can't we hit dates?

My boss asked a few months back why it is so hard to hit dates with big software development projects. It's almost a rhetorical question, and I think at a high level he, and most managers, know many of the answers.

But - here goes: In software development we're asked to provide a completion date with most of the work still unknown. Only when a PM or a project team is estimating a project they have done several times, and have absolute certainty about the scope, can they provide an accurate estimate of effort and schedule. With software development, every project is an unknown. The only way to hit a date is to set it and then accept whatever deliverables can be done by that date. That doesn't achieve the goal of delivering what stakeholders want by that date; it achieves a purely date-driven goal.

Not to get on the Agile bandwagon too much, but the idea of having stakeholders look at completed software at regular intervals and declare the project done enough when they have the functionality they want is a pretty good one. Gathering all requirements up front helps the planning process, but it by no means guarantees accurate effort or duration estimates unless the work very closely mirrors other projects and holds no major risks.

My boss expressed a desire to create a model where, in many cases, he could "turn the resource knob" and offer faster completion at higher cost when necessary. But there is a point at which you cannot add more resources and go faster. It is worth noting that more and more research shows offshoring does not mitigate this - offshoring or outsourcing runs into roadblocks around business requirements, quality issues, or, eventually, the same law of diminishing returns. This is not new or unique to my current employer - it is a global condition that IS managers and PMs beat their heads against consistently.
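As a back-of-the-envelope illustration (my own toy model, not anything from our actual planning), here is one way to see why turning the resource knob stops paying off: every person you add also adds communication paths, and each path costs a little coordination time.

```python
# A toy model (my own assumptions, not a real estimating tool) of why
# "turning the resource knob" stops paying off: each person added also
# adds communication paths, and each path costs some coordination time.

def effective_capacity(team_size, overhead_per_pair=0.02):
    """Person-weeks of real work per calendar week for a team of team_size.

    Assumes every pair of people spends overhead_per_pair of a person-week
    per week coordinating -- a made-up constant, purely for illustration.
    """
    pairs = team_size * (team_size - 1) / 2
    return max(team_size - overhead_per_pair * pairs, 0)

if __name__ == "__main__":
    for n in (2, 5, 10, 20, 40):
        print(f"{n:>2} people -> {effective_capacity(n):5.1f} effective person-weeks per week")
```

With these made-up numbers, the twentieth person adds only about 0.6 of a person-week of real work per week, and the fortieth adds about 0.2 - diminishing returns in action.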

Different situation: when I left a job as head of a custom solutions project group, my team and I could estimate video production projects to within 5%. Publications projects could be a little more variable, but were still pretty accurate. Software projects were complete unknowns. Even when there were basic similarities in content or subject matter, the programming side was unpredictable. The point here is that I had a team that developed estimates for a variety of project types on a daily basis - estimates that, if those projects went forward, would affect the team's performance bonuses each quarter. And despite that incentive and expertise, software projects remained incredibly difficult to estimate accurately.

I've had to do a lot of reading and research on software project estimating, and the best piece I have seen on the topic recently is Joel Spolsky's, which you can find here: http://www.joelonsoftware.com/items/2007/10/26.html

Read it - it's good stuff.
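If memory serves, the core of Joel's argument is evidence-based scheduling: rather than committing to a single date, you divide each remaining estimate by a velocity (estimate over actual) sampled from your own history, sum the results many times over, and read the answer as a distribution of outcomes. A minimal sketch of the idea, with made-up numbers:

```python
# A minimal sketch (my own simplification, with made-up numbers) of
# evidence-based scheduling: divide each remaining task estimate by a
# randomly sampled historical "velocity" (estimate / actual), sum the
# results, and look at a distribution of outcomes instead of one date.
import random

past_velocities = [0.9, 1.1, 0.5, 1.0, 0.7, 1.2, 0.6]  # estimate/actual ratios (hypothetical)
remaining_estimates = [16, 8, 24, 12, 40]               # remaining task estimates in hours (hypothetical)

def simulate_remaining_hours(trials=10_000):
    totals = []
    for _ in range(trials):
        totals.append(sum(est / random.choice(past_velocities)
                          for est in remaining_estimates))
    return sorted(totals)

if __name__ == "__main__":
    totals = simulate_remaining_hours()
    for pct in (50, 80, 95):
        idx = int(len(totals) * pct / 100) - 1
        print(f"{pct}% chance of finishing within {totals[idx]:.0f} hours of work")
```

The specific numbers don't matter; the point is that the date becomes a confidence statement drawn from your own track record instead of from optimism.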

Tuesday, December 18, 2007

The new strategy is working...

Replacing optimism with realism seems to be working. I never thought I would feel good about canceling planned UAT sessions, but I do. I know we held off on them for the right reasons - we're just not quite stable enough.

The realism-driven approach to completing the project is paying dividends for the team and me. As much as our Agile-influenced project approach over the last year was intended to keep everyone communicating and remove barriers to candid conversation, it's become clear that team members felt pressure to agree that they could meet desired dates, even when their gut told them it was unlikely. Now, people are speaking more freely about where they are with testing and whether we are ready to bring the users back in to test.

Since our sponsor (my boss) is bought into this as well, the overall pressure to push ahead with user testing has been replaced by the responsibility of not restarting testing prematurely. This is much easier to manage.

Thursday, December 6, 2007

Optimism is bad

It finally hit me - the lesson learned in 1998 that I should be taking into this project: Optimism is bad. It's bad in project managers and bad in developers.

I've been consistently too optimistic in scheduling and managing dates, and the team has been too optimistic in believing that all bugs will somehow be resolved at the last minute, in time for user testing. Great quotes from the daily scrums: "Two days is an eternity in developer time." And, "Are you seeing any reasons to think we won't be ready to test on Monday?" Optimism...

This is the same thing that happened to me in 1998 with the BellSouth interactive training software. I'd test the beta and find no problems, schedule testing in Atlanta with the sense that nothing would go wrong, and then hit problems with varying test platforms, missing video drivers and hardware, equipment damaged on arrival, or some other thing. Once we got past the optimistic dates and the hope that we would be ready for scheduled training deadlines, and instead assumed the worst, we got the project done, and everyone stayed employed to work another day.

So - we're adjusting our sights and approach to remove optimism and instead rely on the bug reports and test data to tell us when we are ready to bring our users back in. And, we are setting the expectation that we will need to do a few more rounds of this before we are ready to release. The date will slip into the new year, but everyone seems ready to accept that as the price of a quality project. Best of all, our sponsor and key stakeholder are in accord that this is the way to go.
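For what it's worth, "let the bug reports and test data tell us" boils down to a check along these lines - the severity thresholds and counts below are hypothetical placeholders, not our actual exit criteria:

```python
# A hedged sketch of a UAT readiness check driven by bug data rather than
# optimism. The severity thresholds and bug counts are hypothetical; real
# criteria would come from the team and the sponsor.

READINESS_THRESHOLDS = {
    "critical": 0,   # no open critical bugs
    "major": 2,      # at most two open major bugs, each with a workaround
    "minor": 10,     # minor bugs can be deferred past UAT
}

def ready_for_uat(open_bugs):
    """Return True only if every severity is at or under its threshold."""
    return all(open_bugs.get(severity, 0) <= limit
               for severity, limit in READINESS_THRESHOLDS.items())

if __name__ == "__main__":
    # Snapshots from a (made-up) nightly bug report.
    print(ready_for_uat({"critical": 1, "major": 4, "minor": 7}))   # False
    print(ready_for_uat({"critical": 0, "major": 1, "minor": 9}))   # True
```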

Wednesday, December 5, 2007

Reliving a past project

I realized today that my current project is unfolding a lot like a software project I managed nearly ten years ago. In 1997 and 1998, I led an interactive multimedia training software project for a client, BellSouth. As we got ready for usability testing, things started to go awry. We ended up firing a vendor that I know was doing the best they could but had bitten off more than they could handle. We had several testing misfires, missed due dates, and heated discussions, and finally delivered the project several months late.

I was struck today by the similarities between that project and my current project, although my current project is on a much larger scale. The thing I am struggling with now is: what lessons did I take away from the experience ten years ago that I could apply to this situation?

I'm not coming up with easy answers. Much of how things are playing out is influenced by our corporate culture, which puts quality, cost management, and work/life balance ahead of due dates. But with our second attempt at user acceptance testing fizzling even though my team was confident just last week, patterns are emerging that are not familiar to me, and I don't have a ready response from past experience to draw on, despite the similarities to that earlier project.

In the next couple of days, I'll have to determine how to move the team forward while working through the team's functional manager. I'll let you know how it goes.