7 Habits, Agile, Estimation, Experience, Improvement, Leadership, Learning Lessons, SAFe, SDLC, Technology, Vision

3 Tips For Your Next PI Planning

In April we kicked off our second Program Increment (PI) Planning event using the SAFe framework. Afterward, my team and I reflected on the differences between PI 1 Planning and PI 2 Planning.

Three topics came up that we believe helped us improve our PI Planning.

Clear Socialization Purpose and Different Meeting Types

  • Understanding the purpose of socialization helped the team see why it mattered and what needed to be accomplished during these meetings. This led the team to identify what they already knew and what gaps they still needed to fill, which produced better questions during each session.
  • Our Product Owner used a variety of socialization meeting types to produce new insights and different questions.
    1. High-level, first pass – we simply went over the features we intended to plan during PI Planning. This was to get our feet wet.
    2. In depth with shared services – This helped us think about our dependencies and ask questions we didn’t think of during our first session.
    3. Teach Back – Our Development Team taught our Product Owner the features we were going to work on. Our Product Owner filled any gaps in their teach-backs in order to finalize our preparations for PI Planning.
  • Side note: The teach back was inspired by a separate team that was planning on doing the same thing!

Begin With The End In Mind Strategy Towards Team PI Objectives

  • Identifying and clearly writing our PI objectives upfront helped the team envision our destination, so we spent the majority of time figuring out how to get there.
    1. The Development Team did an excellent job using the remaining time to work out how to accomplish the objectives, and they were able to take on unexpected work because it fit our destination.
    2. A fourth feature was brought to us, and because of our clear objectives, the Development Team knew they could commit to it.
  • Working closely with our business stakeholders and other train leaders to get feedback on our objectives helped speed up the process. This prevented us from getting stuck and verified we were headed in the right direction early in the process.
    1. We had great conversations about what should and shouldn’t be a stretch objective.
    2. These conversations helped us think about what we truly control and avoid committing to outcomes outside our control.

Shorter Feedback Loops From Shared Services and Business Stakeholders

  • Leveraging our shared services with more urgency and without hesitation helped us get answers quickly and prevented major bottlenecks in the planning process.
    1. Listening for key phrases like ‘I am not sure how that works’ or ‘I assume that is what was meant’ immediately triggered our Scrum Master to bring in stakeholders, shared services, or other leaders to help get answers.
  • Working hand-in-hand with business stakeholders helped us refine the objectives, avoid unclear wording, and reduce the time needed to get business buy-in when finalizing them.
    1. Also, because we were getting our business stakeholders involved earlier, they seemed more engaged and more a part of the team.

What helped you in your PI Planning or what do you think of these tips? Let me know!

Photo Credit: https://www.agileoctane.com/2018/03/31/the-magic-of-safe-pi-planning/

Agile, Experience, Leadership, Learning Lessons, SAFe, Team Building, Tools

SAFe Team Self-Assessment Experience

Our company recently adopted the SAFe framework. One pillar of SAFe’s House of Lean is Relentless Improvement. There are many ways to identify areas to improve – the retro, inspect and adapt, the iteration review, and the system demo, just to name a few.

Another tool that SAFe provides is Team Self-Assessment.

When I started looking into the tool, I noticed there wasn’t clear direction around how to use it, just a short description of the tool’s purpose. I started to think:  

  • Does each team member take this and then I average the scores?
  • Do we take this together, like sizing, where we all agree on a number?
  • Do I want to get a quick snapshot here, or have the team kind of think about each question (with a timebox either way)?
  • Am I over thinking this in general?
  • How often should this assessment be taken? Once a PI, multiple times? Once every other PI?

I started to do some research on implementation practices and I found an article with great ideas.

  • What I liked
    • They suggest doing this as a team activity, rather than having each person take the assessment and submit it back to one person
    • They used an external facilitator, so the entire team could focus
    • They used a planning poker cadence (everyone says their number at the same time) using post-its & time boxing each question
      • Also, they used posters to represent each category and placed sticky notes along the axis (see pics in the post)
  • Questions that remained
    • Does it make sense to do this more often than once a PI?
    • How does this work with remote team members?
    • How does the scoring work with a poker cadence? Discuss it to meet somewhere or just average the scores?

This article helped me think about implementation, so the team took the assessment last week and here are some more takeaways from our experience.

Our Implementation

  • We took the assessment after the second of six iterations in our first PI (after ~1 month)
    • I originally planned to do this each month, to get a good idea of how the team is working within the PI
  • We used Zoom to facilitate discussion (3 remote team members and 3 local team members)
    • Everyone had a working webcam
  • I shared my screen as we went through the Excel sheet, to be transparent and visual
  • We spent ~45 minutes going through the entire assessment (keep in mind, this was the first time we did this)
  • We posted individual scores to the webcam using our hands: fist = 0 through five fingers = 5
    • We counted down 3-2-1 before showing our scores
  • I quickly averaged all the scores and marked them on the Excel sheet
    • I then asked for any comments, especially at the extremes of scoring
    • I tried time boxing discussion 2-5 minutes so we could keep moving
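
If you want to skip the manual spreadsheet math, the scoring step above can be sketched in a few lines of Python. This is just an illustration; the question names and votes below are made up, and the spread threshold for flagging a discussion is my own assumption, not part of the SAFe tool:

```python
from statistics import mean

# Hypothetical fist-to-five votes (0-5) collected per assessment question
scores = {
    "Iteration planning": [4, 4, 3, 5, 4, 3],
    "Team collaboration": [2, 5, 3, 1, 4, 2],
}

for question, votes in scores.items():
    avg = mean(votes)                 # the number marked on the sheet
    spread = max(votes) - min(votes)  # wide spread => extremes worth discussing
    flag = " <- discuss extremes" if spread >= 3 else ""
    print(f"{question}: avg {avg:.1f}, spread {spread}{flag}")
```

Flagging a wide spread mirrors the practice of asking for comments at the extremes of scoring: a big gap between the lowest and highest hand is usually where the best discussion lives.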

Our Learning Lessons/Feedback

  • Some questions were not applicable because we hadn’t completed that SAFe event or the PI yet
    • Example: “Team reliably meets 80-100% of committed PI Objectives’ business value” – we had only completed two iterations, so we couldn’t answer this question
  • Some questions read more like true/false statements, so simplifying the scoring may be needed in the future
    • Example: “Team participates in System Demo every two weeks, illustrating real progress towards objectives” & “CI, build and test automation infrastructure is improving” could be true/false answers
    • A 0-5 scale is rather large for short questions in this format. It may make sense to simplify the scoring scale – maybe 1-3, like The Five Dysfunctions Of A Team assessment (1 = Rarely, 2 = Sometimes, 3 = Always)
  • The discussion after each question is more valuable than the actual answers
    • One of my team members made this comment, and it was insightful because everyone interprets the questions a little differently. Discussing each question after the answers helped everyone think about what it meant and how the team could improve

Overall, this assessment worked well after the first two iterations. It helped the team take a step back from the day-to-day and think about the bigger picture. I also think this tool is useful multiple times per PI, though some of the questions would need to be modified. Lastly, the next time my team is in town, I want to follow the other blog post’s idea of using posters and post-it notes to take the assessment.

If you have any questions, suggestions or other experiences, please reach out and let me know!