DevSummit07:Case Study: Benetech's Miradi Project
Facilitated by Kevin Smith
Kevin will explain the agile development processes used to create the Miradi Adaptive Management Software for Conservation Projects. Topics will include client engagement, user requirements definition, architecture, coding, deployment, and updates. He will also describe his own multi-decade journey through the land of chaos, waterfalls, extreme methodologies, and agility. Questions and discussion will be strongly encouraged.
Kevin has been associated with Benetech (http://www.benetech.org) for six years, first as an employee and now as a contractor. He has been the technical lead on the Martus Human Rights tool since it began in 2001, and now is also the lead on Miradi. Both are user-friendly Java Swing desktop apps that communicate with servers.
Kevin attended Aspiration's Summer Source event in Croatia, and currently lives in Florida.
Session Notes:
Introductions
I am Kevin Smith, a developer with 30 years of experience across about 30 projects. Miradi is being developed by Benetech, a Silicon Valley non-profit technology organization, and the Conservation Measures Partnership (CMP), a consortium of conservation non-profit organizations. I am currently a contractor for Benetech, after being an employee there for several years. Miradi is a standalone desktop application that helps field practitioners plan and manage environmental conservation projects.
This has not been a perfect project, but by most measures it has been a great success. We delivered a 1.0 beta version this past January, after about 18 months of development. We were about one month later than scheduled, partly to avoid releasing during the holidays and partly to fit in one last major feature. We were about 25% over budget, although it is a little hard to tell whether we delivered more or less than was originally planned within that time period. Initial user feedback has been positive, and quality has been very high (few bugs).
Project Initiation
The CMP created a project proposal, and brought it to Benetech for evaluation. We concluded that the project had merit, was technically feasible, and that the proposed budget seemed reasonable. We also noted that the proposal was written in a "waterfall" style, and suggested that the CMP consider using an "agile" approach instead.
After learning about agile, the CMP folks saw that it was very similar to the Adaptive Management system that the app would facilitate, and they asked Benetech to take on the project. This was Benetech's first major project for an external customer, and it was agreed that the project would be run by Benetech and the CMP as partners.
The primary CMP representative ("Nick") is a domain expert in Adaptive Management for Conservation projects, and co-wrote a book on the subject. He is the visionary behind the project, and agreed to work closely with the development team. He outlined the initial requirements, and has continued to provide additional requirements and extensive feedback as the project has progressed.
Architecture Choices
The first milestone was the creation of a throwaway prototype, demonstrating the potential of the application to other CMP members and prospective funders. This prototype was written in Java using some of the third-party libraries we would go on to use in the real app. We learned a lot from the prototype, but as planned, we discarded all the code when we started creating the real application.
Early on, we realized that many of our users would not have fast or reliable internet connections, so this had to be a standalone desktop application. We needed to support Windows and Mac (and hoped to support Linux as well), and would need to support multiple language translations. Based on earlier positive experience with Benetech's Martus application, we decided to use Java and its Swing UI toolkit.
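To make that standalone, cross-platform choice concrete, here is a minimal sketch of a Swing desktop shell. It is illustrative only, assuming nothing about Miradi's actual code: the class name, window title, and layout are hypothetical.

 // Minimal cross-platform Swing shell: the same Java code runs on Windows, Mac, and Linux.
 import javax.swing.JFrame;
 import javax.swing.JLabel;
 import javax.swing.SwingUtilities;

 public class MainWindow {
     public static void main(String[] args) {
         // Swing components must be created on the event dispatch thread
         SwingUtilities.invokeLater(() -> {
             JFrame frame = new JFrame("Miradi (sketch)");
             frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
             frame.add(new JLabel("Standalone desktop app, no network connection required"));
             frame.setSize(640, 480);
             frame.setVisible(true);
         });
     }
 }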
Originally, I didn't want to use a relational database, largely due to my own personal bias against RDBMSs. I decided to yield to conventional wisdom and use one anyway. Within a couple of weeks, it was clear to me that an RDBMS was not the right tool for this application. Relational databases are very bad at managing user-ordered lists and hierarchies of objects that have a lot in common but significant differences, and this app was full of both cases. I ripped out the database and replaced it with a very simple home-grown "object database", where each object is written as a separate XML-like file. We actually used JSON instead of XML for technical reasons.
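As a rough illustration of that home-grown approach (not Miradi's actual code), the sketch below writes and reads each object as its own small JSON file on disk. The class name, file-naming scheme, and JSON fields are hypothetical.

 // Sketch of a "one object per JSON file" store, assuming hypothetical names.
 import java.io.IOException;
 import java.nio.charset.StandardCharsets;
 import java.nio.file.Files;
 import java.nio.file.Path;

 public class SimpleObjectStore {
     private final Path baseDirectory;

     public SimpleObjectStore(Path baseDirectory) throws IOException {
         this.baseDirectory = baseDirectory;
         Files.createDirectories(baseDirectory);
     }

     // Write one object as its own JSON file, e.g. objects/Task-17.json
     public void write(String type, int id, String jsonBody) throws IOException {
         Path file = baseDirectory.resolve(type + "-" + id + ".json");
         Files.writeString(file, jsonBody, StandardCharsets.UTF_8);
     }

     // Read a single object back by type and id
     public String read(String type, int id) throws IOException {
         Path file = baseDirectory.resolve(type + "-" + id + ".json");
         return Files.readString(file, StandardCharsets.UTF_8);
     }

     public static void main(String[] args) throws IOException {
         SimpleObjectStore store = new SimpleObjectStore(Path.of("objects"));
         // A user-ordered list can live as a plain array inside the owning object,
         // something relational schemas handle awkwardly.
         store.write("Task", 17, "{\"label\": \"Survey mangroves\", \"childIds\": [18, 19]}");
         System.out.println(store.read("Task", 17));
     }
 }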
From day one, we built in the ability to localize the app to other languages. But after creating the basic framework, we have not done a lot with it. It is important that a future potential feature like internationalization not create ongoing costs and delays for the developers. When we actually need to translate the app, we will be able to do so quickly. Again, we learned a lot from Martus, which has had its UI translated into Spanish, French, Russian, Thai, Arabic, and several other languages.
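A minimal sketch of that kind of localization hook appears below, using Java's standard ResourceBundle mechanism. The bundle name MiradiStrings and the keys are hypothetical, and a default MiradiStrings.properties file on the classpath is assumed; the point is simply that every user-visible string goes through one lookup, so translations can be dropped in later without touching application code.

 // All UI strings go through one lookup so translations can be added later.
 import java.util.Locale;
 import java.util.MissingResourceException;
 import java.util.ResourceBundle;

 public final class Translations {
     // Assumes a default MiradiStrings.properties file exists on the classpath
     private static ResourceBundle bundle =
             ResourceBundle.getBundle("MiradiStrings", Locale.getDefault());

     // Switch languages by reloading the bundle for a different locale
     public static void setLocale(Locale locale) {
         bundle = ResourceBundle.getBundle("MiradiStrings", locale);
     }

     // UI code calls this instead of embedding literal strings
     public static String get(String key) {
         try {
             return bundle.getString(key);
         } catch (MissingResourceException e) {
             return key; // fall back to the key so untranslated text is still visible
         }
     }
 }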
Ongoing Development
The team was initially just one part-time developer (me). Eventually it grew to 2.5 developers, a part-time tester/sysadmin, Nick serving in a Product Management/"Customer" role, and support from various CMP and Benetech folks as needed.
We decided to use weekly iterations. Each week, the team would choose what "stories" (features/tasks/defects) to work on. At the end of the week, we would build, test, and deliver working software. Nick and other interested folks would be able to install each week's build, and provide rapid feedback on recent changes. This short feedback cycle is critical for agile development.
We have unit tests, although not as many as I would like. We do a lot of refactoring to keep the code as clean and modular as possible. We do some pair programming, but probably only 10% or so. The developers work in the same room about 60% of the time, allowing sharing of project knowledge. As the lead, I review every line of code that gets checked in. However, code that I write does not necessarily get reviewed, and that is something I hope to change.
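To give a flavor of those fast, automated unit tests, here is a small JUnit 4 example written against the hypothetical SimpleObjectStore sketched earlier. It is illustrative only, not one of Miradi's actual tests.

 // A fast, self-contained unit test (JUnit 4 style) for the hypothetical store.
 import static org.junit.Assert.assertEquals;

 import java.nio.file.Files;
 import java.nio.file.Path;
 import org.junit.Test;

 public class SimpleObjectStoreTest {
     @Test
     public void writtenObjectCanBeReadBack() throws Exception {
         // Use a temporary directory so the test leaves no files behind
         Path tempDir = Files.createTempDirectory("objectStoreTest");
         SimpleObjectStore store = new SimpleObjectStore(tempDir);

         String json = "{\"label\": \"Protect coral\"}";
         store.write("Strategy", 1, json);

         assertEquals(json, store.read("Strategy", 1));
     }
 }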
Delivery
As we approached the first public release, there was no "death march" or panic. We put in some long hours completing the last big features, but we were not frantically fixing bugs as happens at the end of many projects. We delivered 1.0 beta at the end of January 2007, and decided to release monthly updates going forward. The first of those (1.0.1) came out in the middle of February. The application has been downloaded by users in 25 countries across 6 continents, and early feedback has been very positive. No major bugs in the public releases have been reported (or found by our team).
The development team and customer (Nick and other CMP folks) have established a great relationship. Everyone has adopted a win-win mentality, and has worked to build and maintain trust.
We held a Retrospective to learn from the past and try to improve in the future. (We prefer the term "retrospective" over the common "post-mortem" which makes it sound like the project died. Another good term is "sunset review".) There were a lot of good suggestions, and we will be incorporating many of them as we move forward.
It is interesting to look at how much has changed since the original requirements statement. Of the 10 views originally envisioned, two have been dropped because we realized they are not needed. Several other views have been added that we didn't imagine when the project started. As the application has come to life, Nick and others have had creative insights leading to features that would have been almost impossible to imagine without having a working app to play with.
The Budget portion of Miradi is particularly exciting. Although designed for conservation projects, it has a rare feature that is relevant for almost any project: the ability to view budget data broken down by objective, or by accounting code. For the folks running a project, the ability to know how much was spent on reducing damage to the mangroves vs. how much was spent saving the coral is much more valuable than knowing how much was spent on Fred and how much on boats. The Miradi team plans to start using Miradi itself to manage the Miradi project budget. It is great when a team can become its own customer, because any usability flaws in the application will become obvious, and are likely to be fixed very quickly.
Questions from the group:
• What is agile? See the notes from the Agile session earlier in the day
• Did you throw out the RFP? We used the content as a general roadmap, but essentially converted the contract to a “sort of” subscription model
• Are there standard practices for breaking it up into smaller chunks? Not really, but there are some good tricks. I happen to be pretty good at it. Often stories can be split "horizontally", implementing the core parts and omitting the fancy "bells and whistles". Another option is to split the stories "vertically", handling some cases but not others.
• What would be a typical project you would use Miradi for? I am not exactly sure, but I believe it would typically be a small group of 1-5 people who are affiliated with one of the larger conservation organizations. Probably a 1-5 million dollar project over 3-5 years. But the projects will vary greatly.
• Iteration length? 1 week
• How many staff were on the project? Kevin was the only technical person at first, plus some funding people; Nick of CMP was added as project manager on the domain side rather than the technical side.
• (not sure what the question was) With agile, the goal is to postpone decisions for as long as you can, since once a decision is made you are locked in. For this project, the app needed to be cross-platform, the users have poor connectivity so it needed to be a desktop application, and it needed to work with third-party components.
• Does agile cause a lot of reworking? Not as much as you might think, because you don’t go as far before correction. Some folks believe you should NEVER look ahead as you make technical decisions. Others think you should rely a lot on expected future stories to decide on implementations. I fall in the middle. I think you should use past experience and future feature expectations when designing and making choices, but the actual decisions should be strongly weighted toward today's requirements only. Usually there is a design that will allow for future development without committing to the direction. I try to avoid creating generic frameworks until they are needed, and definitely avoid creating hooks that are not yet required.
• When in the midst of a project, is it hard to spot opportunities to use design patterns? As a good coder, if you are reviewing and refactoring, you will generally end up with a good design pattern almost without realizing it. Patterns can be overused, so they shouldn't be forced where they are not appropriate.
• (about internationalization) We set up systems that would allow internationalization in the future but didn’t spend a lot of time on it at the beginning, and made sure that day-to-day development would not be slowed by worrying about other languages.
• What did you use for an installer? NSIS, mostly because it was used with Martus. It is ok, but not great.
• With a domain expert seeing weekly builds, is there a danger that he will focus on superficial problems with the app instead of deeper issues? Perhaps, but in this case Nick was a very practical, pragmatic guy who realized that little things like fonts or colors could be tweaked at any time, and that we should spend our early effort on key functionality.
• (unit tests) We have unit tests – several hundred tests that run automatically in less than a minute.
• (refactoring) We used refactoring tools from the Eclipse IDE to keep the code as clean as possible.
• (pair programming) We did some pair programming – two people sharing one computer. It is not "one person looking over the other's shoulder". Both are fully engaged. I have heard stories of some pairs where one person takes the keyboard and the other controls the mouse.
• (not sure what the question was) Sometimes Nick wasn’t as available as we would like, due to travel or other commitments. He was in Washington DC and the development team was in Florida. Weekly team meetings worked well. We always made sure that the app builds and the program runs. I strongly prefer 1-week iterations, partly because they help keep a sense of focus and a consistent working pace.
• Do you always deliver new UI functionality each week? We try to. There are very few iterations where there isn’t something for the client to see. It could be that the feature is too big to accomplish in a single week or some restructuring has to be done. Even the very first iteration delivered something that Nick could see and play with.
• How often did we integrate and build? Integration happened immediately, because each developer would pull the latest code several times a day, and Eclipse would immediately report any major breaks. Each week, we did a full build which creates the installer. Our tester would notify us of any run-time breakages within a couple of days.
• Were you close to the estimates from the beginning? Was this a point of stress with the customer? Fortunately, we all went in with a win-win cooperative attitude. We identified what was to be done and moved forward in a collaborative fashion. We basically hit the target release date – we chose to delay a month because it fell on December 20th and they wanted one more feature. We ended up close to the budget.
• (not sure about the question) Response from users has been very positive. Bug count has been low. We decided on monthly minor releases, with 3 to 6 months between major releases.
• What kind of user feedback have we gotten? We haven't yet received as much concrete feedback from users outside the team as I would like. [Note: The day after the session, we started getting good feedback] The domain expert has sort of a “no news is good news” attitude. We will be explicitly requesting feedback from registered users, and will be sending a survey of some sort.
• How did the Ruby on Rails part interact with the app? It didn’t. We built a rails web site to promote the app and allow users to download it. The app itself is pure Java.
• What format was used for the Team retrospective? What should we keep doing, what should we stop doing, what should we start doing?
• Did you end up having to modify a lot of code based on changing requirements? Not really. The requirements changed a lot for parts of the app that we hadn't yet worked on, but after we wrote certain features, those features remained fairly stable.
• How did you test the database portion? Basically through unit tests, plus a human tester testing the app as a whole.
• (question about using Miradi to generate reports for funders) Currently, each funder wants their reports in a different format. Miradi will be able to export its data in custom formats, so users could design a format for Ford, a format for MacArthur, etc. Hopefully, if Miradi becomes the “standard” in the domain, funders might start to accept a single standard Miradi report.
• When you worked on your first agile project, what resources were there? I read the original Extreme Programming book by Kent Beck. I realized that I had done each of the 12 practices identified in the book in the past, with success. I became an advocate/zealot for Extreme Programming for several years. Over time, I realized that Extreme Programming is very effective, but is also very difficult to do, so it is not appropriate for most teams. Now I have scaled back to a more moderate position: Extreme is great if you can do it, but Agile is better for most teams. The process we used for Miradi is about 95% of the process described in the book "Sustainable Software Development" (see the link at the bottom of the Agile session notes).