Monday, August 31, 2009

One of the "Something Old" Variety

Back in 2007, I started a blog called just “Software Qualities.” Not long after, I needed to change jobs and ended up in a rather restrictive IP situation, which caused me to stop blogging after hardly any posts – 2, to be specific. So while I collect myself after Agile 2009, having caught up on 4 days of meeting notes, I thought I’d repost at least one of the original posts. I've posted it as it was, to avoid being revisionist, since I don’t think that differently today. So here it is, over 2 years later.


There are a lot of places where people can, and do, write about quality and about software quality in particular. What I hope to do with "Software Qualities" is write about (and have people write back about) ideas on what quality (mostly related to software, but not exclusively) means from a product, process, personal, professional, whatever perspective.

Why yet another place to go on about quality? Because I think it's very important and not just some theoretical subject (though there's plenty of theory out there).

What (finally) prompted me to start this blog was a fortune cookie message that said "Give to the world the best you have and the best will come back to you." Though it sounds a bit 60's "instant karma"-ish, I think this is quite a practical idea, though not always easy to do on a day-to-day basis.

I wrote a letter to the ASQ's Quality Progress editor a few years ago (which I may someday expand into an actual article) suggesting that, in all my years of training, consulting, and speaking, I have never run into anyone who believes they consciously head to work intending to do a bad job. However, I've run into very few people who claim they go to work intending (or at least really knowing how, or expecting) to do an excellent job. I also suggested that most people end up doing "the best they can" to "get by" that day as various pressures begin to weigh on them.

Then, recently, I was at a local ASQ section talk about management and leadership, which was actually quite good. In the course of the talk, the speaker stated that -- influenced by the style of management and/or leadership -- people will largely do just enough on their job to "stay out of trouble" and that more than 50 percent of people are "disengaged" on their jobs. The speaker stated this was based on formal research and it seemed to match my experiences with people's answers during my training, consulting, and speaking.

Again, nobody intends to do a "bad" job, but whether they really feel they can do their "best," whatever that means to them, let alone do an "excellent" job, whatever that means to them, seems to be a real question.

On the other hand, Deming suggested people "doing their best" was insufficient to achieve quality as he meant it.[1] I've worked several places where that was, indeed, the performance value, and suggesting that it was insufficient would produce a reaction somewhat like, "Well, if we cannot rely on people doing their best to get the quality we want, what in the world can we rely on?" I would claim that this approach suggested that, if we were failing to get the quality we wanted, it was because, somewhere out there in the organization, there were people not "doing their best," and this pushed the pursuit of quality in a "moral" rather than "engineering" direction. Maybe this is why people just work to "get by": given a moral approach to quality, failure says something very disturbing to them personally that, I believe, it does not have to.

Just some initial thoughts to get this rolling, then.

Do you agree with any of this? Do you experience this yourself or do you see it in others? What do you think "doing your best" has to do with achieving "quality"?

[1] Deming's original statement was “It is not enough to just do your best or work hard. You must know what to work on.”

Sunday, August 30, 2009

Agile 2009 Notes - Thursday

The following material describes the sessions I attended.

Thursday AM

Dan Mezick – “Boundary, Authority, Role and Tasks”

As I noted in my Tuesday summary, Dan had said this session would be an expansion of some of the ideas from the more general Group Relations talk on Tuesday. Specifically, he spoke about four key elements in team/group performance and how “negotiation” about them (due to lack of clarity) creates, or is, waste. Vagueness about them causes anxiety and competition, two examples of wasted energy. I found this session more strongly rooted in ideas I feel could be used, and Dan made a point of discussing the clarity or vagueness around Scrum ideas as examples.

One by one, the four elements were addressed:

Boundary – defined as the “container for work,” a topic Jurgen Appelo also mentioned in his Tuesday talk on complexity. Boundaries can be defined in terms of time (e.g., deadlines), physical space, territory, roles, resources, responsibilities, tasks, and resource access. Dan stated that the last, resource access, can have a great deal to do with project success or failure. Waste can occur when boundaries remain fuzzy, though certainly flexibility in boundaries can be helpful. That is, rather than a hard line for a boundary, there may be an area of broader boundary definition that can be tolerated and used effectively. But ultimately there has to be a point which is “outside” the boundary in order to use the boundary to define the work parameters. There is also what Dan called a “boundary culture,” defined by how rigid or flexible the boundary is.

Authority – defined as “the right to do work,” which can be formal (i.e., delegated) or personal (i.e., how one steps into and takes on more formal authority). Lack of clarity about authority can result in postponement of decisions (and Alistair noted in his keynote that this causes waste as people wait to see who will decide or what will be decided). This lack of clarity was described as “what you think they think” and “what they think you think.”

Role – definition depends on boundary, authority and task definitions. However, as with authority, there are formal roles (e.g., job descriptions) and informal ones taken on to fill gaps that formal ones do not identify. What roles a person chooses to take depends a great deal, as one might imagine, on their personality and/or self-image.

Task – defined, as Dan said, by being unique to the current state of work, influenced as it is by time and personal differences. This is because, no matter how similar in name or general concept a task may be, it has likely never been done before exactly as it needs to be this time.

After this discussion, Dan went over Scrum’s roles, artifacts and ceremonies, and we discussed how clearly or vaguely they are defined. For example, there are 3 defined Scrum roles: Scrum Master, Product Owner, and Team. The first two have specifically defined responsibilities, though exactly how they carry them out (and what else they may do) is not defined. The Team is a collection of many people with specialties and knowledge (and probably job titles), but Scrum does not define any of these. Thus, the Team must come to some understanding of how more specific definition will be accomplished in terms of boundaries, authority and tasks. The same is true for Scrum artifacts and ceremonies: there are some specific ones identified, but a large part of what they contain and how they are done is not specifically defined in Scrum.

Dan used a good word throughout this discussion of Scrum in describing what the official Scrum literature says. He called the information “canonical” and avoided the word “pure.” I liked this since the former carries with it the sense of being authoritative or accepted from a particular source which can be defined. The term “pure,” on the other hand, implies being free from impurity, is somewhat restrictive, and even carries some moral tone. (And when one speaks of a person being an “agile purist,” there is a decidedly negative sense in which that is used, implying impracticality and unnecessary rigidity.)

Rod Claar and Doug Shimp – “May the Forces Be with You, Exploring the Forces Driving and Restraining Agile”

This was another workshop-style session addressing the forces that contribute to adoption and rejection of an agile approach. The room was divided into two teams: one to address the driving forces (in favor of adoption) and one to address the restraining forces (holding back adoption). There were also a pair of “judges” (not Rod or Doug) as we would be preparing presentations which they would rate on impact of the selected force and style of presentation.

I ended up on the team addressing the drivers for adoption. Each team spent some time coming up with a set of forces. Ours had 7-8, but we eventually had to choose the top 3 in our estimation and present each one, in turn, alternating with the restraining forces team. Forms of presentation included a limerick and then various forms of skits illustrating the forces. There were a number of clever ideas for representing the forces visually, almost allegorically in some ways.

A few comments made during the session, sometimes in response to presentations, were:

“Will any methodology work if it is not possible to find/identify the needs of the customer appropriately?”
“Is every force restraining Agile really about lack of/inability to collaborate” (which depends on communication)?
Regarding distributed teams: “Latitude hurts; longitude kills.”
“Sharks work alone; dolphins, in teams” and can take on sharks that way.
“Everything thrown into a shark tank dies.”

Thursday PM -

Christian Gruber & Lisa Moore – “Coach Aikido: Lessons and support for abused coaches in hostile environments”

This session was another workshop format, with 6-7 people in a group, and consisted of group discussions around scenarios we were given. While we were discussing, Christian and Lisa circulated among the groups, made notes on what they heard, and offered some suggestions on occasion. Then there was a bit of debriefing by each group around the scenarios; at the start, all groups worked the same one, but then each team selected another from the remaining set.

Before we began, however, Christian and Lisa described and demonstrated, in slow motion, basic aikido concepts, which are based on “control of the opponent’s violence for the sake of their spirit.” So it is an approach which does not involve combat, but rather a way to defuse/redirect force by “blending” with your opponent. From a coaching perspective, the goal was to come up with ideas for how to use this non-confrontational approach to overcome negativity and resistance in agile settings.

Aikido’s “approach” is to understand your attacker and try to “guide them into a new posture or situation…dissipating their violence (for their sake)…while also exiting cleanly and safely” yourself. This requires taking “the correct stance.” We were also asked to consider four “mental postures” a person may take: beginner, deeply rooted, leadable and no-mind.

The first scenario described a single, senior member of a team who criticizes all aspects of the agile approach and proceeds to do as he wishes irrespective of the team. The Scrum Master is not his manager, and he is “protected politically” by his actual manager. His knowledge is also needed by the team. We started our discussion in the abstract, but a couple of people had such issues (or had had them in the past), and we ended up focusing on the real-life situations, which had more details that could be brought to bear on how to handle such a situation. There was some discussion in our team about flattery when various ways to reach out to the person and bring them into the group were raised. One or two people felt the attention the person would get might be seen by others on the team as catering to possible attention-getting behavior. In one of our real-world situations, it was noted that others were wondering why they could not be allowed to behave as independently as this person. Some of the recommended ideas in the workshop notes involved eliciting information from the person about what they see as the usefulness of their work methods. It was also noted that a coach would need to look more broadly than just this person to see if there is some encouragement coming from elsewhere in the organization suggesting doubts about an agile approach.

The next scenario our team selected involved a team where several members felt negatively about pair programming and how it “invaded their space” and wasted individually productive time. These people did pair, but didn’t engage well and made those they paired with “miserable.” One idea offered within our team had to do with trying to provide information/evidence of how this approach had benefited others (hopefully in that organization but from other companies if need be). Another idea was simply not to push on this issue, offering other forms of pairing and cooperative work or, if all else failed, not requiring the people to pair. This approach was suggested as it was felt there were larger, more critical practices to worry about in all likelihood. Our real world examples suggested the latter was true and needed more attention than pairing.

Jesse Fewell – “Growing PMI Using Agile”

Jesse has been at the forefront of building an agile community within the Project Management Institute (PMI). This talk described how this was accomplished, using an agile approach to the community’s work with the PMI headquarters staff. His slides have notes which cover his talk, so I won’t try to summarize all of that. Indeed, as an example, it was interesting to listen to, but the real payoff is what lies ahead as this community can now begin to approach local PMI chapters and bring agile speakers/instructors to them. The one phrase Jesse used that has stuck with me is “contagious culture of commitment.” This describes the dedication and effort he and others have put forth to make this all happen and how that attracted support from PMI headquarters through example.

Jesse also spoke on Wednesday evening at the ThoughtWorks office not far from the Conference hotel where many people showed up to hear the formal announcement of the PMI-Agile effort being launched. There were other talks/discussions at this event and I actually spent a good bit of time in a room with 10-15 people and video hookups to ThoughtWorks office in India and China. We discussed various aspects of widely distributed teams, the challenges this presents, and ways people have worked to address those challenges.

Brian Marick – “4 Challenges, 5 Guiding Values”

This was a well-planned and practiced presentation, which Brian delivered clearly and succinctly.

His 4 “challenges” were:

Open Workspace – or rather the lack thereof, which represents an impediment to remove. The typical agile workspace may be viewed by others as messy, noisy, undisciplined, uncontrolled and, in general, “unprofessional.” Brian told a story of a coach who, when the organization refused to move cube walls, came in over a weekend and personally reorganized the space by disassembling and reassembling the walls. On Monday, the coach stated they would keep doing this, if the walls were put back, until they were fired.

Courage – The prior example Brian gave illustrates what it can take and the challenge presented in doing so. But Brian noted that as the transition to agile improves, less of this sort of courage is needed.

Naiveté – or, again, the lack of it. Brian noted the need to hold on to this “beginner’s mind” trait to fight objections to new ideas by withholding judgment about practices not yet tried (e.g., pairing, TDD).

Infrastructure Design – There is always some tension between building a robust base for functionality and getting to the functionality. If too little thought goes into the architecture initially, features may come in a way that ends up with the architecture being inadequate.

Regarding Working Software, Brian noted that if one can deliver this, people may be more willing to give you some slack in other ways to work as you wish. The goal is to deliver something better than the last time, i.e., “last week I could not do ‘X’, but this week I can.” In this way the value becomes clear and concrete.

His 5 values were:

Reactive vs Proactive – in this case, being “reactive” was more about adaptation than about allowing poor work to be done and then scrambling to “fix” it.

“Irritation” (leading to) Ease – something will drive the desire to improve, and Brian discussed a “ready-to-hand” approach (which focuses on the goal) compared to a “present-at-hand” one (which focuses on the tool). He used the example of a hammer, which we do not think about as we focus on the act of hammering, compared to a hammer with, for example, a loose head, which we have some concern over lest the head fly off while hammering. In this context, Brian also talked about the practice of “shaving yaks,” which represents starting out on a task only to find something else it depends on, and then something that depends on that, trying to find the perfect way when a good way right now will do. (The yak’s hair enters the picture as the last step one is driven to, though one started out with nothing like that in mind.)

Solidarity – Brian mentioned the well-known Golden Rule of “doing unto others as you would have them do unto you.” The problem, he said, is that other people are not you and that a Platinum Rule seems better where you “do unto others as they would have you do unto them.” At this point, however, he turned back to the story of the open workspace disassembly and noted that it would have been a far stronger example had the whole team joined in and stated they would quit. That would show solidarity.

Decency – Brian stated this quite simply as “treating others uncommonly well,” which ties back to the Platinum Rule.

Joy – Finally, Brian spoke of bringing true joy to work and noted how everyone can speak about the “one great project” they were on. These days, he said, what he hears more often is that they do agile because “at least my project doesn’t suck as much as it used to.” Hardly an encouraging sign for agile methods and their value.

One final idea that I noted was Brian’s statement that “if you are doing agile well, you can afford to be wrong” as the consequences of a decision are easily corrected.

[This ends my formal summary of the Agile 2009 sessions I attended. I may have more to say about my reaction to some of these ideas in other posts in the future, though.]

Agile 2009 Notes - Wednesday

The following material describes the sessions I attended.

Wednesday AM

Daryl Kulak (of Pillar Technology) – “5 Symptoms of Mechanical Agile and Being Change Ready”

Got up early to hear this talk since it was a vendor presentation scheduled for 7:30am in the breakfast area of the Hyatt, and I was staying about 8 blocks away, and it was raining and ….

In any event, Daryl identified “mechanistic” signs and, later, ideas to make an organization more change ready (which I have associated with the mechanistic practices below, though they were discussed separately):

Agile Expert Syndrome – people continue to do things because the expert brought in to help at one time did so. (Later in the talk Daryl said “If you see a Best Practice by the side of the road, kill it” and referred to Mary Poppendieck’s “good practices in context” quote.)

Separation of Decision-Making from Work – Daryl used an example of decisions made by a process group (SEPG) separated from the teams, which, not surprisingly, resulted in much overhead and in “seepage” as the pronunciation for the process group acronym. Daryl’s advice, of course, was to close the gap between decision-making and work.

Blame the [other people, team] Person, not the System – here, the advice was to remove boundaries and encourage connectedness between teams.

Just Tell Me What to do, Boss – the material on this topic basically recommended learning to value unstructuredness.

Competition Between People and/or Teams – I didn’t find a direct “answer” to this, as the last change-related point was to “avoid over-engineering your requirements, people and processes.”

David Hussman – “Coaching and Producing Agility”

This was an all-morning workshop on the role of the Agile coach and how to improve one’s coaching capability. Most of the time we worked in pairs (though “promiscuously,” since we changed partners for each exercise, as canonical pairing suggests). David divided the session into four (not necessarily equal) sections which, in case you wonder, used a music production theme, as David’s former life was as a record producer and musician:

Coaching Personas (Your Coaching Style) – In this section, we were all asked to create/identify a persona for ourselves with a short phrase summarizing our style, a possible graphic to illustrate it, a series of words describing ourselves and a series of words identifying values we hold. Mine ended up being Socratic, Story-Telling, Scorpio with a graphic of "? my blog logo ?". My descriptive elements were: (Talkative), Organized, Inquisitive, [Musical], Teacher. My values were: Synthesis, Experimentation, Openness, Learning, Self-Direction.

So, to explain this: my approach to teaching and consulting has been to ask questions and guide people to answers rather than tell, in as many cases as it seems reasonable to do so (a “Socratic” approach). I employ a lot of stories as examples since I’ve been around software development of various kinds and industries for over 37 years. And, though my astrological sign is Gemini, with Capricorn rising and moon in Aries, Scorpio is at my mid-heaven – I used to be in a band where one person’s wife was into astrology and she did my chart. So the Gemini represents the inquisitive, experimenting, open, change-ready aspect of my life. Capricorn represents my organizational inclination (which is often what people see over time). A person’s mid-heaven sign usually represents their success mode, so Scorpio represents my synthesis approach, where I try to take ideas from many places and produce some whole out of it that takes pieces from each. My logo represents this (see my first blog post where I describe it).

The other elements of the persona I’ve covered more or less. Talkative has parens around it since I see that as something I need to contain as a coach and Musical is bracketed because I have a band/recording background and it just popped in there but I’m not sure how it fits exactly other than the band metaphor working better for me than team sports ones so common in business similes/metaphors.

Preproduction (Getting Ready to Produce) – This section addressed interviewing, which we practiced with one another by using real-world project examples from our own experience. During this part of the session, David recommended a descriptive approach (“This is what I have seen work”) rather than a prescriptive one (“This is what you should do”) in doing our coaching. He also briefly touched on the Satir change model, then collected typical agile practices together to show valuable, related groupings:

For Community-Teams: Chartering, Common Workspace, Information Radiators, Iteration 0
For Iterative Delivery: Burnup / Velocity, Acceptance Tests, Test Driven / Refactoring, Continuous Integration
For Products-Planning: Product Backlogs, Personas, User Stories, Release / Iteration Planning
For Tuning-Improving: Stand Up Meetings, Product Reviews, Retrospectives, Continuous Feedback

We completed this part of the session with a coaching plan development exercise to (1) take practices (what do you want to do?), (2) identify the value of each (why?), and (3) give an example of each (how might it work?). The goal was to arrive at a base approach each of us could use for coaching.

Finding Your Groove (Getting Productive) – David started this part of the session recommending a couple books: The Black Swan and This is Your Brain on Music, quoting from the latter:

“Groove is that quality that moves the song forward”
“When a song has a good groove, it invites us into a sonic world that we don’t want to leave.”

Five things help build “groove” in agile coaching: iteration planning, story-telling, stand ups, acceptance tests/reviews, and retrospectives/indicators. We spent time telling one another, again in pairs, stories to illustrate situations from real projects. David emphasized that each story told should be about somebody doing something of value. He also commented on the traditional story-writing format for agile requirements: he doesn’t like it because, I believe, he sees it as constraining ideas about expressing value and as ritualistic if treated dogmatically.

(Regarding story estimation, he noted an idea I have heard before, which he said comes from ThoughtWorks: people use a paper-rock-scissors approach to sizing since you always have your materials with you.)

Keeping the Band Together (Staying Productive) – The final part of the session was about sustaining what is built through earlier coaching work. David mentioned the traditional “5 whys” used to do root cause analysis, but recommended a “5 whats” approach since, psychologically, “whys” can lead to blaming of individuals. Of course, retrospectives were noted in this part of the session, including doing retrospectives on one’s own coaching using the coaching plan format of practice-value-example. Finally, David mentioned the Beginner’s Mind and the need to maintain curiosity about the work.

Wednesday PM

Esther Derby and Diana Larsen – “Esther and Diana's Excellent Retrospective Adventures”

This was another long workshop, taking the entire afternoon. Most of it was based on Esther and Diana’s book Agile Retrospectives: Making Good Teams Great. The session began with an activity for teams of 5-7 people, of which there were ~7 in the session. We were all charged with building an “objet d’art” with materials such as sheets of paper, colored pipe cleaners, stickers, etc. (but no tape). The “specs” for the project were that the result needed height, stability, and beauty. We were allowed to ask Esther to come over and ask her questions, as our “customer.” When we asked about height, she pulled out a sewing tape measure and indicated about 24”. When we asked about beauty, she mentioned she liked “motion.” Stability we took for granted as understood (and perhaps lucked out there). Lots of “structures” resulted, but, as you might guess, the exercise was all about having something to use to practice retrospective technique.

Before we started doing so, however, we were provided with a proposed outline for a retrospective, beginning with Setting the Stage, which included (1) identifying the focus/goal of the retrospective, (2) congratulating the team on their iteration, (3) checking in by getting everyone to say a word or two describing how they felt about/reacted to the iteration, and (4) defining the agenda. The agenda would consist of: gathering data, generating insights, deciding priorities, and coming to a close. For each of the agenda elements, we performed an exercise.

For Gathering Data, each team generated a radar chart covering things like use of resources, customer satisfaction, quality of work life, etc. Each member of the team was asked to rate (0-10 scale along the arms of the chart) their experience with each element of the chart. We were then asked to identify the variances and commonalities among the results, discuss what we heard and saw, and recognize the high and low points for each element.

To Generate Insights, we were asked to identify what we felt were underlying causes for the major results (pro and con) using a four section matrix: what we felt good about, what we did not feel good about, what were ideas & insights, and what “bouquets” we wanted to give anyone. Each of us created stickies with things which could go in any of the portions of the matrix.

From this we moved to Deciding Priorities by listing actions in a table with the columns labeled: Action, Impact, Effort, Energy and Commitment. The last two deserve some clarification since they require people to indicate how dedicated they are to taking the action as a group and as individuals. The process was to list all the actions, then all the impacts for each action, then the effort expected for each action, followed by the energy people felt they had for the action. In case of apparent ties (since not all actions could reasonably be done in a single iteration), the energy column was to be used to break ties. However, people have to be willing to “sign up for” (commit to) doing any action, not just place votes and estimates on them. Of course, if people were not willing to ultimately commit to an action, what did their energy statement really mean? Finally, it was emphasized that we should end up being able to “take a card into the planning” for the next iteration that described the task to be performed (not as a story).

Coming to a Close involved doing a brief retrospective on the retrospective itself, revisiting commitments, and thanking the participants for their involvement.

Following this, there was some open discussion and question time. One question was who should facilitate retrospectives. A number of ideas were suggested: team coach, trained facilitator from outside the project, and rotation of team members. Esther and Diana recommended the latter and said it could help participation in the retrospectives: people would not want to withhold from participating, since they would want others to be active when they themselves facilitate. So this would help establish a more constructive retrospective “culture.”

One final idea about retrospectives and team interactions was that the absence of conflict is not harmony, but apathy.

To close, we were directed to a Yahoo group (mentioning this class, as it is a moderated group). Also, a book by Sam Kaner (and others) called Facilitator's Guide to Participatory Decision-Making was recommended.

Agile 2009 Notes - Tuesday

[Sorry I am behind on these, but I'll be catching up today and tomorrow.]

The following material describes the sessions I attended.

Tuesday AM

Alistair Cockburn’s Keynote, “I Come to Bury Agile, Not to Praise it”

Alistair’s presentation began with a recitation of a revised version of Marc Antony’s speech over Caesar’s dead body from Shakespeare’s Julius Caesar. He said he did not feel Agile was really dead, but that it had [begun to] become absorbed in the mainstream of how many groups do software. Like a large, melted iceberg, he said, Agile was now part of the ocean. As such, Agile has begun facing classic software development problems related to large team design efforts. To address this situation, Alistair made three main points related to (1) software development as a cooperative game (a classic theme of his), (2) craftsmanship (a recent “movement” in the Agile community), and (3) lean concepts.

Game – Alistair showed his matrix of competitive and cooperative game types along the scale of finite, open-ended and infinite time periods. Software development is a cooperative game which is finite (in length) and goal-directed. Alistair noted that software development has two goals which can conflict: delivering the software [on time and on budget with acceptable quality] and “setting up for the next game.” The latter means making sure the work done makes it possible to do the next work. This involves effort that will pay off in the future, but not necessarily now. Thus, there will be a cost added to the current work that may show no immediate value and, thus, be hard to justify in the short-term.

A significant problem in software as a game is that “positions don’t repeat” (often), so strategies for playing the game must be adaptive. Being adaptive, however, means being able to communicate and understand effectively. Alistair described another of his classic themes: how the best communication is face-to-face and how distance between people who need to communicate has a significant cost that is often not accounted for in project structures. Distance, he said, affects whether people can detect the need to communicate, care enough to do so, and be effective in making it happen.

Craft – Alistair listed 7 major crafts: deciding what to build, managing people and projects, modeling, designing an external view, large-scale (architecture) design, fine-scale (programming) design, and validation. At this point, Alistair reviewed another of his classic themes, the shu-ha-ri learning stages where one learns technique, collects more technique and finally invents/blends techniques.

Lean – Finally, Alistair talked about work flow and bottlenecks, noting that software’s “inventory” that moved through the development process is “the unvalidated decision.” Lean aims for “continuous flow” and people waiting on decisions create the bottlenecks. [One problem with such waiting is that, to move along the expected schedule, people will often “make up” decisions that either go unknown or are learned of late and require expensive rework.]

Unlike manufacturing, software development requires feedback/learning as design is knowledge acquisition. Alistair spoke about waterfall as a late-learning approach to developing software since, until a coherent system delivery for testing occurs, there is not much information about the actual state of what will be delivered. Agile methods, in contrast, incur cost earlier to learn earlier through continuous integration such that a good idea of the state of the work occurs early enough to make risk-reduction decisions. This allows business to “trim the tail,” that is, make a decision to deliver on time (or early) or delay delivery to get more (or better) functionality.

In closing, Alistair said that 21st century development will use the game, craft and lean ideas he noted.

Tamara Sulaiman – “Tips & Techniques for Implementing an Agile Program Across Distributed Teams”

Most of this talk was based on ideas from John P. Kotter’s book Leading Change in which he identifies 8 change concepts:

1) Establish a sense of urgency by “going for emotions” and being clear about the passion for change.

2) Create a guiding coalition of people from across the whole organization who are credible and committed to change. They should be willing to model agile behaviors for the rest of the organization, i.e., being cross-functional, running their meetings/work with one another in an agile fashion, being visible in the coaching/training given to other people in the organization, and maintaining a backlog of transition tasks with a Product Owner. [Audience member suggested forming, not a PMO but an AMO: Agile Mentoring Office.]

3) Develop a vision and strategy by showing what success in the new way would be/look like.

4) Communicate the change vision through slogans, graphics, posters, wiki pages, a glossary of new terms and how they relate to older ones. [There could clearly be some argument about this as it is how every “flavor of the month” program gets done in companies. Having point 2) clear would make this “marketing” acceptable and likely effective. Otherwise, it will be ridiculed.]

5) Empower broad-based action to allow change impact to go beyond development teams and IT as well as to provide an environment where it is okay to fail as long as the learning is clear and progress continues.

6) Generate short-term wins [to show progress] and use a wiki where teams can write about their experiences.

7) Consolidate gains and produce more change using lean, process flow, etc.

8) Anchor the change in the culture, producing a high-trust organization.

Overall, though the material was interesting, the talk was not that much about agile-specific issues in distributed teams. It was more a focus on organization-wide change management, which could be applied to any methodology.

Luke Hohmann – “Leveraging Collaborative Tools with Distributed Customer Teams”

Again, while this talk offered some interesting ideas and pointers to other resources, it did not address substantive distributed team issues. Some important concepts mentioned, however, were:

- Collaboration is always about the goals, so being clear about goals matters. [Collaboration is not simply ad hoc “cooperation” among people.]

- Effective goals are expressed in verbs, i.e., action.

- Understand how fast a customer can accept the output of agile teams. [Cannot collaborate well if the work cannot be shared effectively.]

Hohmann pointed the audience to a Harvard Business Review article entitled “In Praise of Hierarchy,” noting that there are many effective uses for hierarchy, authority not being one of them but being the one most often associated with hierarchy. Hierarchy can be very helpful in handling complexity, for example.

Tuesday PM

Jurgen Appelo – “What (Else) Can Agile Learn from Complexity?”

Jurgen has one of the most popular blogs in Europe (related to software development and management) and his talk was an interesting exploration of ideas related to complexity. Basically, Jurgen took a series of quotations about complexity and related them to various ideas in agile methods. He began, as might be expected, with some quotes from Ken Schwaber, Jeff Sutherland, and Jim Highsmith discussing very direct links with agile methods. He then displayed a detailed graphic of the history of complexity science from people like Norbert Wiener through work being done to explore web/e-science ideas.

During the talk, he referenced numerous books and articles, including one in particular on social complexity and management. Jurgen is, himself, working on a book related to management in an agile context as he is CIO of a software development company in the Netherlands. Though the article describes 4 types of complexity, Jurgen’s emphasis was on social complexity because of its complex, unordered, and human-centric focus. This led him to some discussion of self-organization which is key in agile development philosophy. Interestingly, he asked whether an agile team, being coached, is really self-organizing, pointing to the Wikipedia definition of self-organization as including the phrase “without being guided or managed by an outside source.”

Jurgen then described various complexity-related principles, including:

Darkness Principle – systems elements are ignorant of overall system behavior since all system complexity cannot be present in each element; thus no single person (e.g., project manager) can monitor and control a whole system since they cannot know everything [and should learn to make use of self-organization and self-management in teams to help manage the complexity].

Law of Requisite Variety – to have a stable system, the states of any control mechanism must equal or exceed those of the system under control; one (project) manager is less complex than the project being managed.

Boundaries and Conditions – self-organization requires that it operate within a boundary which defines the “self” being organized, thus agile management is an important part of agile development in setting such boundaries, i.e., teams do not get to make all decisions about everything.

Hierarchy Principle – “complex natural phenomena are organized into hierarchies wherein each level is made up of several integrated systems,” harkening back to Hohmann’s statement about the value of hierarchy and using a Scrum of Scrums as an example of one that can grow naturally rather than being imposed. Jurgen also noted a “patchwork” approach to Scrum relationships without hierarchy (and referenced Mike Cohn’s Mountain Goat Software site for further discussion of this).

Group Size – where it was noted that some research has pointed to ‘8’ as the most likely group size to lead to deadlock.

Specialization – many advantages accrue in complex systems when there is some level of specialization/division of labor.

Power Laws – which are related to the fact that there is a high chance of small issues and a low chance of large ones, suggesting that prediction of velocity into the future “includes an (impossible) estimate of the size of unknown problems.”
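This was not demonstrated in the talk, but a minimal sketch (in Python, with made-up parameters) can show why power-law issue sizes make velocity extrapolation unreliable: two equally sized samples of Pareto-distributed "problem sizes" can yield very different averages, because a single rare, huge issue dominates the total.

```python
import random

random.seed(1)

def issue_size():
    # Pareto-distributed "problem size": many small issues, rare huge ones.
    # alpha near 1 means a very heavy tail (infinite variance).
    return random.paretovariate(1.1)

# Two independent "sprints" of 20 issues each
sample_a = [issue_size() for _ in range(20)]
sample_b = [issue_size() for _ in range(20)]

mean_a = sum(sample_a) / len(sample_a)
mean_b = sum(sample_b) / len(sample_b)

# With a heavy tail, the two sample means can differ wildly, so
# extrapolating future velocity from either one is unreliable.
print(mean_a, mean_b)
```

The point of the sketch is only the shape of the distribution: most draws are near 1, but occasionally a draw is enormous, which is exactly the "unknown large problem" the quote says cannot be estimated in advance.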

Dependence on Context – “the method to manage the project is embedded in the context and one must allow the emergence of such a method through interaction between the actors and the environment,” which led Jurgen to say that “ScrumButs are natural and necessary.”

Fitness Landscapes – is about how we “create the environment,” it is not “separate from us,” or, quoting a Spanish phrase, “My friend, there is no road, You make the road as you walk.” (This all relates to how a released system will affect, and change, the environment into which it is released.)

Incomprehensibility – states that “there is no accurate representation of the system which is simpler than the system itself,” so any model of it will be wrong, though, as George Box has said, “all models are wrong, but some are useful.”

Stephen Palmer – “Working with Large Backlogs”

Palmer discussed a variety of methods for backlog “triage” and management, including epics, themes, hierarchies, pruning, demand management, and extended kanban. He directed the audience to further material on the last approach in particular, and mentioned some ideas from David Anderson on time-to-market psychology, e.g., on a 6-week project with 2-week deliveries, there is not much schedule concern, but on a 3-month project with 1-month deliveries, schedule will matter.

Dan Mezick – “Group Relations and Social Systems”

This was basically a talk about group relations theory and how it could be applied (to some extent) to agile teams. Dan noted that he had a talk Thursday AM covering the ideas (of boundaries, authority, roles and tasks) in more detail and I’ll cover that talk in Thursday’s summary. This talk began by noting work by Wilfred Bion and his book Experiences in Groups (1961) which covers the main theories in this area. In particular, Dan stated that true groups demonstrate “interdependence of tasks and fate” since survival is a key group driver. For example, he noted that wolves on their own must survive on meager food sources (i.e., small animals) while those in packs have richer sources (i.e., moose, caribou). A key issue in maintaining this survival and accomplishing tasks is avoidance of distractions, since distractions = waste. Unfortunately, group relations can produce a great deal of waste, but agile methods, like Scrum, employ various “ceremonies,” “artifacts,” and “practices” to “short circuit inattention and drift.” This is done by leveraging “inattentional blindness”: research has indicated that people with a clear focus on activities of immediate concern can block out other things which might place a claim on their attention. In a sense, this supports the idea that “we see and hear what we expect.”

Dan then discussed planning which he said is a prediction, but “prediction is a judgment” and “judgment is a belief” and “belief is a filter” and “filters can distort reality.” All belief demands attention, which can produce waste if the attention is not on the reality of what needs to be done at the moment. Impediments in Scrum are one kind of distraction.

Dan continued by noting several resources/references related to group relations, such as LeBon’s study (1895) of the loss of individual identity in crowds, which makes them more easily influenced. Crowds exhibit “system-level emotions that are inherently primitive” as well as seeking dependence on leadership. Indeed, a group will usually seek out leadership that “voices” its main desire. Work by McDougall (1920s) focused on smaller groups that were task-oriented (not the unorganized crowds LeBon studied). Work by Kurt Lewin (1940s) on field theory influenced Bion and was the precursor to modern group relations work. For Lewin, a field is “the totality of coexisting facts which are conceived of as mutually interdependent,” such as Scrum’s goal of co-location, its roles, its artifacts, and its “ceremonies” (e.g., stand-ups, planning meeting, review, retrospective). Dan also noted Tuckman’s (1965) “forming, storming, norming, performing” model of small group development, to which Tuckman added “adjourning” (1977). Tuckman’s work frequently mentioned Bion’s.

Dan next spoke about attention, comparing Scrum to more waterfall approaches. In Scrum, “you cannot drift very far from stated task[s].” Waterfall approaches are “not an ‘attention harness’ in any sense of that word” since “all sorts of attention is diverted away from stated task[s], lengthening” their duration. In Scrum all distractions are treated equally: “they are waste.”

Dan finished up by discussing group relations “conferences” (which sounded more like intensive week-long workshops). He also pointed us to the Washington-Baltimore Center for the Study of Group Relations which he said has a variety of good resources on group relations.

Monday, August 24, 2009

Agile 2009 Notes - Monday

The following material describes the sessions I attended.

Monday AM

Research Stage

Research reports started Monday at 9AM before other sessions, but were located out of the way of all other session areas. Also, some people at information areas claimed sessions didn’t start until 11AM.

I spent the morning at two research presentations because the format this year drew in a great deal of audience participation as opposed to just a series of 45-60 minute talks. Dr. Yael Dubinsky explained that a goal of the Research Stage this year was to solicit a lot of input from industry folks. I believe it worked for those industry people present, but, again, being out of the way (and mind) of people did diminish attendance.

The first presentation was by Rosalva E. Gallardo-Valencia on “How Agile Teams validate Requirements.” Team activity was observed and artifacts were collected for ~2 days, including stand-ups and Sprint Planning sessions. Overall, the study showed that, while standard validation approaches were not used (or even possible) in an agile environment, there was substantial validation performed at various points during team and team/PO interaction.
Interestingly, it was noted that the PO had written up a “checklist” of details about story information which was not shared with the teams. Instead, at the Planning Meeting, the PO would talk about this information and respond to team questions while the ScrumMaster worked to write the stories to match the discussion. I asked why, if all this material had already been written (even in brief form), people felt index cards had to be created. There were a variety of audience opinions, but the study offered no real answer as to why this occurred.
It was noted that the teams wrote JUnit tests before coding and that FitNesse was used for acceptance testing.

After the presentation, there was a period of feedback to the presenter on the work of the study. Then the audience was asked to form into groups and consider research topics that they might like to see pursued. Our group had the following ideas submitted by various members:

1) Classification of team sizes and recommendations for what team sizes/characteristics might be appropriate for different development situations. Dimensions to be examined were: trust, distribution of team, new vs “rewrite” type work, and degree of team integration, including senior management.

2) What concrete benefits does agile provide an organization, e.g., lower costs, higher customer satisfaction, improved time-to-market, market success (revenue).

3) Quantify an organization’s degree of “agileness.”

4) Effective approaches for introducing agile – path to adoption.

The next presentation was by Dr. Stuart Mitchell based on the work of Rashina Hoda regarding “Agile Undercover” where some organizations wanted to use agile but customers “couldn’t have cared less about what methods were used.”

Dr. Mitchell noted that a grounded theory approach (Glaser rather than Strauss style) was being used for data collection. Observation, without making conclusive statements, was a characteristic of how this study was done.

Some outsourcing teams in India were studied. They wanted to use an agile approach, but most of their customers did not care to adapt to such an approach. Ms. Hoda interviewed a person from each of ~8 teams. It seemed that many ended up with customer proxies to allow their teams to try to function in as agile a manner as they wanted. Some teams did do periodic demos to actual customers, but some did not.

I suggested that the most interesting research work related to this might be to observe how the proxy role made it easier for the team to behave as it wanted while the customer did not have to be aware of how the team was getting work done.

Monday PM

Elizabeth Hendrickson and Chris Sims on Design of Simulations and Games

This was an all-afternoon session that asked attendees, in teams, to design “games” of various kinds that could be used to help groups learn (presumably about agile methods, values, etc.). There were board games; some involving pure interaction among participants; one involving joint creation of sentences; etc. Each, however, was developed around a different goal, one being “truth and honesty in planning.” Once teams had been given some time to design a game, members of other teams would come by to “playtest” it and provide feedback. Teams could then revise their games and get further feedback.

Many fascinating observations occurred about what the games “taught,” as well as how much of that learning came from the participants themselves versus the games as originally envisioned. Some of the main ideas that came from the session as a whole (some offered directly and others arising out of the session itself) were:

1) The debriefing around playing the game(s) is where the main value occurs, not in the game(s) themselves.

2) Using the term “simulation” rather than “game” might make the experience of using games more acceptable in some organizations.

3) It’s better to layer complexity on a simple game than to try to remove it from a more complex one.

4) Consider the incorporation of obstacles, choices and randomness in games.

5) Interesting results come from players not sharing the same mental model of what the game is about. (Two examples were a sentence-building game and a balloon passing one.)

6) One good goal of a game is to see how participants achieve alignment, perhaps by evolving the game and its rules themselves to reach a goal they determine to be the point of the game.

7) Game debriefing is like a retrospective, for which some suggested questions were: what happened (in game play)? what did players do and feel? what surprised people? In this regard, Elizabeth and Chris suggested being very observant as game facilitators, noting what people did and how the game changed, then using these observations to ask what the participants thought about the things observed.

8) Ask participants to “map” what they learned/observed in game play to real-world situations and how they could take action in their environment to implement what was learned.

All in all, it was a very interesting day.

Saturday, August 22, 2009

What’s your agile methods “elevator speech”?

Sorry I missed a post yesterday, but I had a bit to do to finish getting ready to leave later today for Agile 2009.

Today’s post will be short, but deliberately so as it is about asking you to submit your favorite agile “elevator speech.”

In case you haven’t heard the term before, an “elevator speech” is a brief explanation of something intended to get the listener interested in hearing more, or reasonably informed about the subject, in a short time. The “elevator” part comes from someone asking something like, “What if you were on an elevator with your boss and they asked, ‘I’ve been hearing a lot about [topic]. What’s it about?’ What would you say in 60 seconds, so your boss would walk away with a reasonable idea?”

So what if you were in the elevator and your boss said, “I’ve been hearing a lot about agile methods. What are they all about?” What would you say to explain agile methods in 60 seconds?

Here’s one sample (but by no means great) idea:

"Agile methods are an approach to development and delivery of software in smaller, shorter increments. They do this by using techniques that improve communication bandwidth & frequency using direct contact among customers, developers & the product to get continuous feedback. They also employ development approaches that try to be open to and prepared for change by facilitating change rather than trying to prevent it and developing incrementally so change is easier to manage."

What would you say in about that many words?

Thursday, August 20, 2009

Quotes (from Twitter)

In the past 2-3 months that I've been on Twitter, I've been saving tweets that struck me in particular when they appeared. As the list was getting long, I thought I'd better post these now, before it got ridiculously long. Some are not quotes from Twitter users themselves, but quotes from authors, philosophers, etc. that other Twitter users found interesting. There are even a few here of my own. :-) [In some cases there are follow-ups combined with the original which are not, therefore, in alphabetical order by the author's name.]

Abraham Maslow (via ASQ) - "Secrecy, censorship, dishonesty, & blocking of communication threatens all the basic needs."

Alan Shalloway - Doing meta thinking is good (see my book design patterns explained). "going meta" means doing meta thinking w/o validation ;)

Alan Shalloway - practices followed as solutions are dangerous. Practices followed as examples of understood principles are a basis of learning.

Albert Einstein - Not everything that can be counted counts and not everything that counts can be counted.

Alistair Cockburn - The center of agile development is to deliver running, tested features to users and collect feedback.

Alistair Cockburn - We need shared understanding at this point, not shared common sense.

Alistair Cockburn: OH "a large dev team has a voracious appetite for requirements". Food for thought.

Andrew Clay Shafer - In a mechanical system, friction is the most common cause of lost work. Friction seems like a useful metaphor.

Ari Tikka - Standardized work... in Finnish I like word "vakioitu". Has the flavor of being constant instead of forced to standard.

August Turak - Maximum motivation emerges from the peer pressure of a team working toward a common mission.

Bas Vodde - I'm a positive person, but gradually losing hope that large companies can get rid of their self-destroying habits. Esther Derby - plus structures and policies reinforce current behavior, sometimes punish new behavior. Bas Vodde - Yes, and people reinforce each others behavior all the time. A nice system build to keep it in status quo.

Ben Simo - If you learn from /waste/ and /rework/, is it really waste?

Ben Simo - SW doesn't handle ambiguity like people but SW systems definitely can be complex. Plus, human users r part of those systems.

Benjamin Mitchell - A new team has set up a Kanban board with an explicit column for 'blocked'. Amazing how quickly kanban boards provide useful information.

Bill Graham - “It’s only work if there’s something else you’d rather be doing.”

Bob Marshall - Locating Sw Dev in IT seems dumb to me (these days). But the wider point: who owns Product Dev?

Bob Marshall - The one and only question you need to ask agile teams: "What measures are you using to understand and improve performance?

Bob Martin - Scrum+XP+Lean+Kanban+CSM+DSDM+TDD+BDD+CSP+... = murky Agilebet soup. Maybe it's time to rethink this.

Brian Marick - like to think of AR⊗TA as part of new wave: "economy of trust", "smaller co's w/ intense collaboration", products instead of finance, etc.

Brian Marick - Courage isn't needed for those things once you've constructed an environment where making mistakes isn't scary. So I'd put courage as a temporary value, a stepping stone.

Brian Marick - I detect a certain tendency for craftsmanship to become narcissistic, about the individual's heroic journey toward mastery.

Brian Marick - I think of arxta as a back to basics movement, where basics are pretty much XP attitude & the original scrappiness of Scrum.

Brian Marick - Interesting to think about changes required by "Done means it's in the user's hands. Nothing less."

Brian Marick - Much social thought today is about making sure no one gets something for nothing (welfare queens!) and bad ppl suffer (smokers: perish). (Esther Derby - Those seem almost like tribal values.) What gets lost is persuading/allowing ppl to give something for nothing (open source, social capital) so that ordinary people benefit. Being jealous of resources is human reproduction model (small # of children, huge investment in each). Alternative is our maple: huge # of seeds, waste cheerfully accepted. Maple doesn't care that some seeds wasted on undeserving ground.

Brian Marick - People who think they're on a hero's journey tend to disregard the ordinary schmucks around them.

Brian Marick - The idea that working with other people is risky, that it requires lowering some sort of barrier or thinking differently (becoming humble).

Brian Marick - Thing about waterfall for programmers: you only have to acknowledge your gross lack of skill every few months. With Agile, it's *every day*. Ron Jeffries - true but i can't do much harm in a single day.

C. Northcote Parkinson - “Delay is the deadliest form of denial.” (Not from Twitter)

Charles Kingsley - We act as though comfort and luxury were the chief requirements of life, when all that we need to make us happy is something to be enthusiastic about.

Chris Sims - People think agile introduces uncertainty when it is actually just uncovering the uncertainty that has always been there.

Classic #quote - A user is somebody who tells you what they want the day you give them what they asked for.

Courtney Benson - "People don’t fail because they make mistakes. People fail because they don’t learn from their mistakes.” Chuck Musciano

Dan North - talk on learning at Better Software conf: "Use metrics as indicators, not targets."

Dave Pembs - Do not argue with an idiot. He will drag you down to his level and beat you with experience.

David Alfaro - "Individuals and interactions over processes and tools" So true! Docs=Potential Knowledge [hardly unleashed], People=Kinetic Knowledge.

David Alfaro - You will be amazed how powerful is to switch the immediate reaction "I disagree" to "I don't see it, help me see it"

David Anderson - TPS yes! Lean (Womack, Jones, Daniels) No! A theory of variation is missing from core Lean literature. So not Deming.

David Hussman - "what do you call a stand-up meeting with too many people? A stand-there meeting"

David Platt (via Michael Bolton) - “Your user is not you. Most people don't want to DRIVE somewhere, they want to BE somewhere."

Derrick Bailey - if you don't learn fr waste/rework, it's not only waste/rework, its complete failure. i expect learning fr waste/rework.

Doug Shimp - A well formed team often can solve problems faster than the surrounding business is able to apply.

Doug Shimp - Build your team protocols one at a time. Pay attention and adapt based on realities encounter and don't assume what is needed.

Doug Shimp - Great individual talent is often not enough. We need teams that are talented & humble enough 2 work together.

Doug Shimp - If you think it's expensive to hire a professional to do the job, wait until you hire an amateur.

Doug Shimp - Scrum often reveals that the business is not able to figure out which problems to solve.

Ed Yourdon - David Stephenson makes key point (at #e2conf1): "make customers co-creators"

Elizabeth Hendrickson - Seems to me that prof testers do 4 things well: observe, notice what can be varied, predict risks, & explore methodically.

Elizabeth Hendrickson -Trying to test the depths of the code thru the UI is like peering through the shower head to examine pipes in the basement.

Esther Derby - primary work of managers: establish conditions for teams to work in. develop people. work on the work system.

G.M. Weinberg - CENTER means know your own agenda and motivations. Get control of yourself first, so you can genuinely be of service to someone else. ENTER means you must enter someone else's system to help them. You can't bash your way in, and you can't force them to see things as you do. TURN means that we think in terms of making a nudge here and there. We can't expect to transform someone. They self-transform, if at all.

G.M. Weinberg - Instead of thinking, 'irrational', think 'rational from the perspective of a different set of values.

George Dinwiddie - "Talk to the card" We found that focus on the card wall helped bring focus to the standup.

George Dinwiddie - Yes, it's a pity. With mechanical products, the customer can admire the design and workmanship. Estimate long term value.

Gia Lyons - Marketing = Matchmaker. Sales = Dating. Services = Marriage. Support = Marriage Counseling.

Greg Vaughn - Agile offers more control. But you have to give up the illusion of predictability.

Henrik Kniberg - In Scrum, the Scrum Master's role is to create a great Team, and the Product Owner's role is to use that team to create a great Product.

Herm Albright - A positive attitude may not solve all your problems but it will annoy enough people to make it worth the effort.

Igor Macaubus - "Rules and procedures can be an insurance policy against disaster, and they prevent disaster. But they also assure mediocrity."

Immanuel Kant - Immaturity is the inability to use one's own understanding without the guidance of another.

J.B.Rainsberger - I find that the more authority I have over my own habits, the more I care about outcomes over dogma.

J.B.Rainsberger - Perhaps if more craftspeople worked more closely together, the narcissism would subside.

James Bach - Testing under no circumstances shows that the product will work. So we test mainly to discover how it will not work.

James Bach - When you see someone "resist change", realize from their point of view they are just applying self-discipline to do things right.

Jason Yip - If you had to scale down Agile to a few core skills that can be learned and applied without a huge commitment. What would that look like?

Jason Yip - The problem is not concentration of power; the problem is thinking based on authority rather than responsibility.

Jeff Patton - popular quote at yesterday's agile round table [Colorado somewhere] on good management: "organize their goals, not their work." - can't remember where I got it.

Jens Ohlig - pizza with the radius z and thickness a has the volume pi*z*z*a [Life is good]

Jim Highsmith - Agile managers understand that demanding certainty in the face of uncertainty is dysfunctional.

Jim Highsmith - Calculating value points. If the product mgr has no time to calculate value, the dev team has no time to calculate cost.

Jim Highsmith - Documentation is often the solution to a communications problem that can't be corrected with documentation.

Jim Highsmith - The best way to get a project done faster is to start sooner.

Joshua Kerievsky - An improvement is something that "enhances value or excellence." Don't ship features. Ship improvements.

Joshua Kerievsky - To release frequently, discover and build "acceptably incomplete" features that can ship today and evolve tomorrow. re: "acceptably incomplete" - one Customer deferred a good number of stories (8-10) that would "flesh out" a feature. In the end, those stories were never implemented - Customer realized that they didn't really need them after all. Dave Rooney - So, "acceptably incomplete" was actually "perfectly acceptable". Joshua Kerievsky - Completeness is overrated. We get more done faster by finding what is acceptable in its incompleteness. The core issue may be that we don't know what "complete" really is all the time (perhaps 'most' of the time). That's why the hardest and perhaps most valuable skill to master is Evolutionary Design.

Keith Braithwaite - No. I'd call src ctrl a good practice so good that it's mandatory until something better comes along—and I hope for something better. Ron Jeffries - so your concern with "best" is that someday something might be better? Keith Braithwaite - My concern is that a stated belief in the current way being best will lessen the likelihood of a better way being recognised.

Kent Beck - by "aspirations" i mean "who we are trying to be": software will improve when we aspire to be accountable, transparent, reliable. Ron Jeffries - well, yes, that and when we act in accord with those aspirations.

Kent Beck - Disagree that xp1e glorified programmers. Talking code when addressing the industry built on code isn't glorifying. The kind of thing i notice is when i say "the team" in 1e i mean "the programmers". In 2e we meant everybody influencing dev. "respect" as a value hasn't been widely accepted. Michael Feathers - The thing that was key in XP1E was the emphatic message "with these constraints, this works.” Up to that point, everyone was trying to solve the "general problem" of software dev. Rachel Davies - for me - XP1E was about listening to customers without compromising code quality - a way to achieve balance.

Kent Beck - i would be very glad to see a dramatic increase in aspirations for outcomes accompanied by a dramatic decrease in dogma.

Kent Beck - not quite a definition but... "quality results in a steady flow of value" or "a steady flow of value indicates the presence of quality".

Kent Beck - shu ha ri is generally a power trip for the teacher. empathy, engagement, and modeling work much better if you can deliver them.

Kevlin Henney - Use of "general" and "flexible" are design meeting smells.

Linus Torvalds - Real men don’t use backups, they post their stuff on a public FTP server & let the rest of the world make copies.

Luke Hohmann - Surveys are about getting answers to questions. #innovationgames are about shared exploration of complex issues to gain insight.

Malte Foegen (via Hillel Glazer) - "Replace 'process' with 'work' everywhere you see it & ppl will not get so hung-up on process."

Mark Graban - Toyota people taught me to shift my mindset from "we can't do that" to "we haven't figured out how to do that YET."

Mark Twain - The best way to cheer yourself up is to try to cheer somebody else up.

Martin Fowler - I'd rather someone thoughtfully does something counter to my advice than someone blindly follows it.

Martin L. Shoemaker - If you want to get things done around here, you have to learn to think outside the boss...

Matt Podwysocki - @KentBeck now what about the anti-for campaign? I could get behind that one too. Composable functions over explicit loops.

Michael Bolton - Serendipity is the discovery of something in the absence of a goal to discover it.

Michael Bolton - Testing is not quality assurance. Testing provides information to those who make decisions about quality assurance (programmers, managers).

Michael Bolton - That which is ubiquitous without being influential is in obsolescence. (from Mark Federman @ #Agile 2008, and maybe from McLuhan).

Michael Bolton - There is no test that is guaranteed to be needed only once. True; as is the converse. A test of some kind will help us tell the difference.

Michael Bolton - To me, "going meta" means "going above the current level of conversation to try to figure out what we're really talking about".

Michael Bolton - We get into BIG problems when we confuse MEASUREMENT, which can be qua[l/nt]itative, with METRICS, which are functions, quantitative.

Michael Feathers - The test of a philosophy is the state the world would be in if it were followed fully. Most philosophies posit impossible worlds. Human nature is the one bit that many philosophies don't account for. Some pay lip service to it, but then conveniently ignore it.

Mike Wesely - "Statistics are often used as a drunken man uses a lamppost; for support rather than illumination."

Napoleon Bonaparte - Ten people who speak make more noise than ten thousand who are silent.

Nicole Radziwill - Alex just invented a new word: "for-gonna-get" - when you haven't forgotten about it yet, but you will forget something in the future.

Ohno - “Do not codify method" [because improvement is never-ending, and by writing it down, your process will ossify].

Paul Seibert - Are you agile? Look for adaptation in the face of things you did not expect?

Paul Seibert - The right people don't need to be managed. if you need to tightly manage someone, you've made a hiring mistake.

Randy Nelson (of Pixar) - Core skill of innovators is error recovery not failure avoidance. [Is agile more about the former compared to traditional approaches that may emphasize the latter?]

Ricardo Semler - "It's unfair to expect all employees to feel passionate about their work."

Rob England - Open content standards those I know of have failed. Open OK, but need money and editors. COBIT 5 might fly.

Ron Jeffries - an estimate is a guess in a clean shirt.

Ron Jeffries - OK. Here's the deal. The fact that you think you need tools to support your distributed development is a sign. Read the damn sign.

Ron Jeffries - There is a big difference between "I don't know a better way" and "there is no better way".

Ron Jeffries - @jwgrenning well put. TDD style finds mistakes, preventing defects.

Scott Bellware - That decay that plagues community efforts (agile, etc.) is only inevitable if the community fears principled community organization. With a values statement or code of conduct, a community becomes an *intentional community*. A community with defined values and protocols can afford more diversity than a community that eschews definition for the sake of diversity. A community's core values necessarily create a core group. As long as that group doesn't devolve into a clique, it strengthens the whole. On top of creating definitions that permit a core, the group needs values and protocols to recognize and address cliquishness of the core.

Scott Duncan - I think we should talk about (what I think is) DeMarco's main point: focusing on the goal of creating software that changes/transforms and the conception vs construction aspect of software. Note also that he does not reject the idea of engineering software, so understanding what that may mean still seems important, though control, predictability & consistency are not most important to him.

Scott Duncan - My reading of the article suggests the pt is to focus on building transformational sw, not expect cmd & ctrl will be the best way to structure doing that, and rethink what the engineering of sw needs to mean in doing so.

Scott Duncan - Perhaps why some at #agileroots felt "real" agile is about code & coding techniques, feeling social "stuff" is a distraction. Alistair noted that people saying this weren't around when the Manifesto was crafted. Kent Beck - how odd. the social stuff is the point. technique supports relationships. Ron Jeffries - Yes, many techies think "social stuff" is distraction. But no: valuable skill. At same time, one puts effort where one's heart is. George Dinwiddie - Many business people also think "social stuff" is distraction.

Scott Duncan - Stability seems relative: depends on how much change one can absorb/comprehend in some period of time. Could it be that acceptance/concern over an agile approach is about one's relative sense of stability?

Scott McKain - “What gets measured is what gets done” is true but also “What gets measured gets emphasized by management.”

Seiche Sanders (ASQ) - I've never been a fan of the shooting-a-mouse-with-a-shotgun problem-solving approach. Propagates more problems/costs.

Shigeo Shingo - What is the Toyota Production System? When asked this question most people (80 percent) will echo the view of the average consumer and say: “It’s a kanban system”; another 15 percent may actually know how it functions in the factory and say: “It’s a production system”; only a very few (5 percent) really understand its purpose and say: “It’s a system for the absolute elimination of waste.” Some people imagine that Toyota has put on a smart new set of clothes, the kanban system, so they go out and purchase the same outfit and try it on. They quickly discover that they are much too fat to wear it! They must eliminate waste and make fundamental improvements in their production systems before techniques like kanban can be of any help. The Toyota production system is 80 percent waste elimination, 15 percent production system, and only 5 percent kanban. This confusion stems from a misunderstanding of the relationship between basic principles of production at Toyota and kanban as a technique to help implement those principles. – Shigeo Shingo, A Study of the Toyota Production System

Thomas Cagley - The world is not rational and expecting people or even groups of people to act rationally is confirmation of the thesis.

Tobias G Mayer - If teams are not collocated, are they dislocated?

Tom Gilb - DeMarco is really saying we must control value not merely cost and time.

Vadim Zaytzev - “Formal rules for comments are difficult enough to be easily forgotten to be included in a language standard” Michael D. Hill - aren't comments for precisely the stuff we can't express formally? if we could, they'd be called "code".

William W. (Woody) Williams - Agile development requires continuous planning. Waterfall requires constant re-planning.

William W. (Woody) Williams - Despite rumors to the contrary, Scrum is not a project management practice, it is a software delivery method.

William W. (Woody) Williams - It is infinitely better to mentor and train people - risking they leave - than to do nothing and risk they stay.

Wednesday, August 19, 2009

Developing Standards with an Open Source Model - Possible? Any Interest?

Rob England (@theitskeptic) retweeted (and agreed with) a Tweet from Antonio Valle (@avallesalas) who said "Why not create an open-source alternative to ITIL?" I responded asking, "What about an entire open source standards movement?"

I have, in the past, been involved with both ISO and IEEE software/systems related standards work. Without getting into an entire explanation of how standards bodies work, one of the issues raised by people on the "outside" is that it is very costly to get standards and difficult to find out the real status of ongoing work. At a certain stage in standards work, drafts are no longer available to anyone outside the standards working groups and voting bodies.

To be involved in such groups costs money, though not because membership fees are high. There are 2-3 meetings of each per year, which means travel & living costs plus time away from work for 2-3 days each time. This cost dwarfs any fees associated with membership. Consequently, membership in such organizations usually means corporate support of some sort. Therefore, most of the people involved represent corporate interests. In the case of the software/systems areas in which I was involved (process related), most participants were either government bodies or contractors to the government (or consultant groups associated with government work).

The argument for the cost of standards is that the bodies overseeing the work have legal and administrative costs to cover (besides publishing ones) to ensure the fairness of the standards work. This fairness is very important since contracts and business often depend on meeting standards, so making sure the standards do not unnecessarily favor certain vendors or industry groups is critical. (In the case of ISO standards, the concern is to ensure countries or blocs of countries are not so favored.)

However, standards are made by those who show up, either to write or to vote upon them. The former carries a good bit of weight since voters rarely spend the time to re-write what comes out of a working committee, though comments can be extensive. Part of the fairness issue is to make sure all comments are addressed and dealt with appropriately and in an open manner.

It can take years for a standard to go from a proposal for new work to an actual accepted document, so the cost to participate, in money and time, can be significant.

Now one way to have an impact on this process is to come to the bodies with a draft of a standard already in hand. Of course, that is usually done by some member of the group or via some document "sponsored" by such a member. Once a document is created by or adopted by a standards body, the copyright belongs to that body, i.e., they "own" the document. From then on, they are the ones entitled to make changes (or not) to it.

My thought when I asked about an open source standards effort was to see if a standards document could be produced that could get industry support but be developed in an open source manner. I did some work on an agile-related standard for IEEE where I attempted to use email rather than face-to-face meetings to try to get broad international participation. But it was not open source, and consensus was difficult to impossible to achieve that way with the large number of people involved. An open source approach might overcome this, since the open source community has long experience developing software in a "community" fashion.

Do people who have had open source and/or standards experience feel this could work? One major question is how the results would be handled and how appropriate "stability" could be introduced, so the result can be reliably used as a reference that does not change too frequently, given that standards are used over many years and may become part of contractual considerations.

Monday, August 17, 2009

Engineering Practice and Bridge Design

In April of 1986, ACM’s CACM (Communications of the ACM) published an article entitled “A Computer Science Perspective on Bridge Design,” an interview of Gerald Fox (a bridge engineer) conducted by Alfred Spector and David Gifford.

Since it requires a subscription to the ACM Digital Library to get access to this article, I thought I would highlight some points in the article, specifically where it discusses ideas that have some relevance to software design practice. It should be noted that all the material quoted comes from Fox’s statements as Spector and Gifford acted as interviewers only, though they did control the kinds of questions asked. The rest of the commentary is my own take on the material.

The first point made in the article is the significant difference in approach used for small vs large bridges. Small bridge design can get by with “simple calculations or experience.” Large ones usually employ “several alternative designs” during a “preliminary design phase”. These alternatives are usually presented to a client with one recommended over the others. “There would also usually be hearings to get the public’s reaction.” In software, the “public” could be considered end-users.

The next interesting comment made is that long-term cost is not a major concern since “initial cost is the primary thing clients look at today.” This concern for initial cost has an important impact on the materials and design proposed and accepted. However, the costs for all aspects of design usually pale in comparison to actual construction cost. This is similar for most manufactured products. Thus, a good bit of effort over the years has been spent in ways to optimize and improve the costs for construction/manufacturing.

One telling remark made by Fox was that everyone involved with the design phases “have always understood that the quality of the design is more important than its cost,” again, because of the huge cost of construction. It is noted that design cost is usually “less than 6 percent of total cost, and even less for larger bridges.” This represents one major difference between developing software (which is almost completely a design activity) and creating physical products.

The next major point is that, “There’s usually not a lot of very complex analysis involved, unless the project represents a significant departure from experience.” I believe this is an area where software development has continued to struggle for decades. Ensuring that necessary (domain and technical) experience exists and that this body of experience grows in some organized fashion may be one of the great limiting factors software development currently faces. It contributes greatly to the perceived complexity in software projects. In this regard, complexity from lack of knowledge of existing experience may even be considered an “accidental” (to use Fred Brooks’ term) element of software development.

Related to this is discussion about the existence and use of “standard values for the allowable stresses and loadings” which exist in basic bridge design. Engineering also benefits from much knowledge about the characteristics of the materials it works with, which is where a good bit of basic science comes into engineering. In contrast, much software is built starting from atoms/molecules rather than whole materials, though reusable libraries, patterns, O-O design are/have been attempts to raise the level of design. Fox does note, however, that “additional criteria” are usually needed for larger bridges, especially with regard to “natural phenomena” where a good bit of risk analysis is performed “to establish acceptable bounds.”

There is, of course, considerable modeling done for larger bridges, both mathematical and physical, both to determine “where joints, pins, and other connections are to be placed” and to consider dead and live loads. In software, we can think of these as interface and performance design. A key in both bridge and software design, though, is determining “how many situations to account for.” In particular, both must be concerned for combinations of conditions. Using one of Brooks’ terms again, we have an “essential” aspect of complexity in this regard, both for bridges and software.

One factor crucial in engineering design for physical products is the effect of component wear. For example, in bridge design, there is “fatigue” brought about by the stresses the bridge undergoes both through use and the impacts of natural phenomena. Again, combinations of both are important to consider. Concern for fatigue and component failure drives assessment of safety limits as well as maintenance considerations. The former are a design concern while the latter is usually not factored into costs, though the needs of such maintenance are well understood. In this regard, software has little concern as it does not “wear out,” and repair maintenance of software is due to flaws in the original design, not flaws which develop “naturally” through use. Indeed, Fox points out how “inadequate inspections” of existing structures are more likely to be the cause of failure than design/construction flaws. However, many examples of catastrophic failure are due to design/construction flaws, e.g., the Tacoma Narrows Bridge and the Kansas City Hyatt Regency walkways.

The interviewers next ask about the numbers of people involved in design and the use of pre-made components. Fox notes that too many people involved in design can “get in each other’s way, and it becomes more difficult to keep everyone up-to-date with changes”. But design can be divided up between groups with “one engineer in charge who would ensure all the groups were designing compatible structures.” Regarding standard components, Fox states that there is “very little…except for small-span bridges. Most elements are built up out of steel plates.” Fox does say that there are some standard sizes for “angles and I beams,” but, overall, “standardization is not very great in terms of components” for large bridges. Fox continues by saying that “economies of scale with a large bridge may make it more logical to use nonstandard components.” (Again, however, “components” are not atoms and molecules.)

Quite telling is Fox’s statement that, “It’s difficult to get every nuance of the design into the drawings” to be used for guiding construction and that contractors “may add extra material to the structure to reduce the stresses.” Fox points out the “tendency for the consulting engineering firm not to be involved in the construction process.” He later states that he feels this is something that needs to change. In software, we might compare this to separation between groups doing architectural and high-level design and those involved in actual low-level design and coding. In both engineering for physical products and software, “on site” decisions are frequently made to get things to “fit” properly.

Continuing with this theme of change and design modification, Fox states that, even with computer design optimization, bridge design relies largely “on experience and trial and error” during the design phase. Fox does say computers allow engineers to “be more precise” about safety factors. Henry Petroski, who has written extensively on engineering practice and design, expressed concern that use of computer programs for safety optimizations posed some danger as they tended toward less rechecking of calculations, etc. as would have been done before their use. Indeed, he talks about how safety concerns ebb and flow relative to cost considerations. Cutting safety close (being more “precise”) combined with on-site modifications sometimes leads to failures which promote a new phase of over-engineering for safety until, after a while, cost concerns begin to drive the tolerances closer, until the next failure.

Fox says, regarding failure, that this usually occurs when “we extrapolate beyond our experience and models.” And as Petroski and Fox both note, it being a general principle of engineering, failure drives much of the learning, though not necessarily catastrophically so. Fox does note that, in general, bridge design is “conservative with our loading” since natural phenomena and ultimate use cannot be predicted exactly, e.g., wind and traffic. Fox believes that, though “understanding of materials is also quite good,” a safety factor of “1.8” is commonly used to account for “variability in loads or materials, as well as possible mistakes in the model.”

When it comes to design “correctness,” Fox points out that “good engineers check everything” and that this is “always done by an engineer who did not work on the original design.” When all design documentation is complete, another review occurs “by a senior engineer for completeness.” One of the tradeoffs made that is checked has to do with redundancy versus safety factors in the individual components. Redundancy can keep a structure from failing even if some element of it fails. But such redundancy is not used in certain bridge types because of the cost. This is where extra safety factors are used to “compensate for lack of redundancy.” Small bridges are going to be more redundant, due to cost, than larger ones. (It is interesting that Fox says how there are “perhaps 100 failures of old small bridges in any given year” though fatalities in such cases are rare.)

At the end of the article, there is a section entitled “Editors Conclusions” which lists the following as “specific differences in the disciplines”:

1) Bridge design “is much more structured than computer systems design.”

2) Standardized, conservative specifications exist for many aspects of bridge design and components.

3) In terms of person-years, bridge designs are not as complex as software systems.

4) There is greater attention to reliability in bridge design and considerable checking of design specifications.

5) Analytical models are used more in bridge design and are more advanced.

6) Bridge design produces specifications that are “explicit” and “comprehensible” by many outside of the design group(s).

7) Design and construction of bridges is separated far more than in software, where there can be no separation at all.

Now this is an old article, but many of the themes sound quite familiar, even today. Over the last 10 years or so, I have found it very informative and interesting to read about various forms of engineering practice and design. As noted above, books by Henry Petroski are often noted by others and a book entitled Discussion of the Method by Billy Vaughn Koen provides a combination of practical and philosophical examination of engineering problem-solving.

Agile Leadership - More about Agile or More About Leadership

This topic came up on Twitter earlier this morning, and its 140-character limit seemed to me not to allow proper treatment of the topic. It may be that everyone can actually agree, but the shortened communication form is getting in the way. In any event, here's the way the thread went:

Bob Marshall (@flowchainsensei) asked "Is Agile Leadership more about Agile, or Leadership?" I replied "Former. It's used as an adjective, subsetting the latter." Bob replied "Your answer is not congruent with my world view, not my experience."

To this I posted a set of three Tweets which, combined, said "Once you put 'Agile' in front of 'leadership,' by definition, you are specializing some aspects of leadership. Now leadership is still paramount, but once you qualify it, I believe the focus becomes those aspects of leadership congruent with Agile Values & Principles rather than all possible leadership ideas."

Anna Nachesa (@ashalynd) posted the following two thoughts: "is this not the case when sub-definitions undermine the original meaning?" and "sort of 'rectification of names' a la Confucius, w the purpose to give the people a new shiny symbol instead of old & beaten one."

But I think the original issue with me may have been my misunderstanding of Bob's intention. I was reading his question as whether leadership, generically, was what "Agile leadership" was about or whether it was about Agile ideas (applied to leadership). So I said it was the former.

But I believe, and tried to say, that leadership was the main point; however, "Agile leadership" was that general idea focused on leadership ideas that fit with the overall Agile Values and Principles. So I think Bob and I may agree. (I do not think "Agile Leadership" necessarily undermines the idea of leadership overall as I believe Anna suggested. Of course, now I may have misunderstood her!)

What do you think? How is adding the word "Agile" to the overall idea of leadership either helpful or harmful (in that it can distract from or diminish the idea of leadership)?

[PS: A follow up Tweet from Bob Marshall clarifies that he was "using the term to indicate 'Leadership of Agile initiatives'." And I Tweeted back that "I would agree then that, once in an Agile initiative, leadership matters most as, hopefully, Agile Vs&Ps are honored."

Bob further noted, "My intent in the question was to explore if Agile initiatives are best led by folks with agile dev exp., or leadership exp." I definitely agree it's the latter, given this either/or, but those leading need to understand Agile Vs&Ps well enough to communicate with others involved in the initiative.

Thoughts on what "Agile leadership" may mean are still welcomed.]

Saturday, August 15, 2009

On (Early) Failure (and Iteration Lengths)

“Fail Fast” is a commonly heard idea in Agile development: if there is some doubt about a design/implementation approach, try something and find out quickly whether it works. In this way, you don’t wait a long time to find out that there is going to be a problem.

Now at the coding level, this can be tried without too much resistance from others. But what about this approach in a larger context? On Twitter one ASQ (American Society for Quality) daily quality quote offered was from Charles Knight: “You need the ability to fail. You cannot innovate unless you are willing to accept mistakes." The question is, at what level can failure be accepted?

With a coding spike, you’d refactor or try something different and, hopefully, find an approach that would work. You may even leave yourself some slack in the iteration commitment for just such trial-and-error. A similar application of slack is usually recommended for accepting stories into the iteration. This is normally done by committing to about a 75-80% person/day level for everyone on the team as a contingency. Failure here may be a story (or two). The goal would be to discuss why this happened at the retrospective and work to eliminate the causes going forward.
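The slack arithmetic behind that 75-80% commitment level is simple enough to sketch. As a purely hypothetical illustration (the function name and the example team size are mine, not from any Agile method):

```python
# Hypothetical sketch of iteration planning with slack: commit to
# roughly 75-80% of raw team capacity and keep the rest as contingency
# for spikes, trial-and-error, and the unexpected.

def committed_capacity(person_days, commitment_level=0.8):
    """Person-days the team commits to; the remainder is slack."""
    return person_days * commitment_level

# Example: 5 people x 10 working days in a two-week iteration.
raw = 5 * 10                          # 50 person-days of raw capacity
planned = committed_capacity(raw)     # 40.0 person-days committed
slack = raw - planned                 # 10.0 person-days of contingency
print(planned, slack)
```

The point of the buffer is not padding estimates; it is acknowledging up front that some of the iteration will be spent learning rather than delivering.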

These kinds of failures could be absorbed and viewed as something to be expected. What about failure of an entire iteration, i.e., one in which few or no stories were completed? Could that sort of failure be absorbed? Probably not well into an organization’s release effort. But what about at the very beginning? What expectation for successful iteration performance exists when an organization first starts out?

I think it is very important to employ a “fail fast” approach to the entire adoption effort. Failing fast is about learning, of course, not failing in the sense of “nothing achieved.” So I usually recommend that an organization start with, at most, two-week iterations. In this way, teams gain experience with every aspect of iteration behavior and can repeat this a few times – three at least – to get used to the rhythm of an Agile approach.

This is not to say there would be acceptance for delivering no stories, but everyone involved should agree that the learning which goes on in these early iterations is as important as delivery of stories. Identification and elimination of impediments is very important at this early stage in adoption, so effective retrospectives are crucial.

The goal, of course, is to get better and better at iteration delivery capability, but this is achieved, not by quotas, “win one for the Gipper” attitudes, etc. It’s achieved by people on, and related to, the teams seeing how Agile Values and Principles, as well as a specific method’s practices and techniques, will function in the organization’s environment. This will lead, hopefully, to adjustments in that environment to allow all these Agile concepts to become understood, accepted, and practiced by everyone.

Small, co-located teams might be able to begin with one-week iterations. But I believe even larger (up to 10-15 people) teams that are distributed can benefit from sticking to no more than two weeks. As long as everyone involved understands the importance of the learning that will go on and works to apply that learning each iteration, I believe two weeks is advisable. After the teams have a good feeling for how individuals (and teams) interact (including with management), what estimation and delivery commitment makes sense, and what technology support they can reasonably expect, iteration length could be increased.

I do not believe these early iterations should be considered “practice” efforts, though. That may be what they are at one level, but the results should not be treated as throw-aways. That is, the delivered functionality should be treated with production-level quality expectations and expected to be the basis for future iteration work. That few production-level results may come out of the early iterations should not be the concern, however.

Finally, it’s important not to set these early iterations up as a “failure” cushion, i.e., to allow teams to think they can afford not to try to do the best they can. The same commitment and accountability should be expected of them as would be expected later on; however, besides evidence of delivered functionality, evidence of important lessons learned should be considered a valuable result at this point.

It is likely everyone associated with the Agile adoption will learn things, not just those delivering functionality. If everyone feels everyone else is, indeed, developing important experience in Agile concepts and behaviors, I believe early “failures” of functionality delivery can be handled positively and without “panic.”

What are your experiences with early team “failures” and the response to them?