Performing a Business Impact and ROI Study of A Leadership Development Program: A Case Study – Part II

Read Part I, which covered Evaluation Planning, prior to reading Part II of this post. This part focuses on Phase 2 of the ROI study: Data Collection.

Case Study: Using the Phillips ROI Methodology™ to Evaluate the XYZ Widget Manufacturing Company Leadership Development Program

Phase 2: Data Collection

In Phase 2, data was collected from a variety of sources to measure the monetary value (ROI) of the leadership development program at XYZ Widget.  A variety of measures expected to impact the business were put in place; while some measures were common across the company, each functional area also had measures specific to that function.

Since significant data had to be collected, much of it from questionnaires sent to numerous individuals at various levels throughout the company, it was important that the company knew what was being done and why.  To that end, the communication discussed in Part I helped to ensure buy-in and cooperation from everyone involved.  A portal was set up for individuals to ask questions and get answers as a follow-up to the lunch-and-learn sessions and the emails that were sent. This enabled participants to digest the information, get answers to their questions, and have their concerns addressed.  There was also an area to ask questions confidentially of members of the learning and development group.

Level 1: A questionnaire was sent to all participants after each individual class in the program and again after the entire program ended.  The purpose of the individual class questionnaires was to gather participants’ thoughts on and satisfaction with the classes, what they learned, and the facilitators’ levels of expertise.  The questionnaire for the entire program addressed the program as a whole: the participants’ satisfaction with the program, the flow of the program, and whether they believed they learned anything from it.

The questions were on a 9-point scale, from agree (9) to disagree (1).

Questions for the Level 1 evaluation of the program as a whole included, but were not limited to:

  1. The way the program was structured and delivered was an effective way for me to learn.
  2. The program was effective in helping me to learn new skills and gain new knowledge.
  3. The program built on my current skills and knowledge.
  4. I will be able to apply my new skills and knowledge back on the job.
  5. The skills and knowledge I learned are critical to my success at XYZ Widget.
  6. The pre-work, activities, discussions, and post-work helped me to practice my new skills and understand the concepts presented.

Additionally, participants were asked to estimate what percentage (0% to 100%) of their total work time would allow them to apply their new skills and knowledge.
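To give a sense of how these Level 1 responses might be rolled up for reporting, here is a minimal sketch in Python; the field names and sample values are illustrative assumptions, not actual study data.

```python
# Minimal sketch: rolling up hypothetical Level 1 questionnaire responses.
# Field names and sample values are illustrative assumptions, not study data.

level1_responses = [
    # "ratings" are answers on the 9-point scale (agree = 9, disagree = 1);
    # "applies_pct" is the estimated % of work time where the skills apply.
    {"ratings": [9, 8, 7, 9, 8, 7], "applies_pct": 60},
    {"ratings": [6, 7, 5, 8, 6, 7], "applies_pct": 40},
    {"ratings": [8, 9, 9, 7, 8, 8], "applies_pct": 75},
]

def mean(values):
    return sum(values) / len(values)

avg_rating = mean([mean(r["ratings"]) for r in level1_responses])
avg_applies_pct = mean([r["applies_pct"] for r in level1_responses])

print(f"Average rating (1-9 scale): {avg_rating:.1f}")
print(f"Average % of work time where skills apply: {avg_applies_pct:.0f}%")
```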

Level 2: Level 2 data collection ensured that there was a plan in place to apply the learning from the program.  Learning would be demonstrated in three ways:

  1. Performance on multiple choice and essay exams
  2. Creation of an Action Plan
  3. Classroom participation – including all activities, pre-reading, post-work, etc.

Participants were given 8–10 multiple-choice questions per course, along with two essay questions per course. One essay question asked how they intended to use the skills back on the job; the other presented a scenario to problem-solve.  The problem-solving scenario was done in a team-based environment, with 2–3 participants per team.

Additionally, participants were asked to create an Action Plan based on what they learned in the program.  The action plan was created in conjunction with the participant’s mentor, the participant’s immediate manager, and the executive of the functional area.  Here is an example of one participant’s action plan (this is a partial action plan, not a complete one).  See the post Make the Training Stick for more details and a complete sample of an Action Plan.

Goal: Develop a formal internal communication process between manufacturing, planning and procurement, and materials management for all projects over $1.5 million in value.
Improvement Strategies:

  1. Learn how each group currently communicates across departments
  2. ……
  3. ……

Tasks/Action Steps: 1. Set up meetings with each specific department (manufacturing, planning and procurement, and materials management) to understand the processes for communication currently in place.
Support/Resources Needed: Support of executive management in each department area. Access to the current processes and procedures in place for communications.
Timeline (Complete by): Within 1 month
Implications for Professional Development: A key component of a leadership role within manufacturing is to communicate effectively and efficiently with other departments, sharing information and developing processes and procedures that work for the benefit of the company as a whole.
Success Criteria:  Building consensus among the departments on new processes and procedures for effective communication.  New processes and procedures that provide better and more effective communication among all departments so that project goals are met and/or exceeded.
Proof of Goal Reached:  Improved communication will reduce project re-work by 45% within 9–12 months and increase satisfaction to 95% among department members.  Progress would be measured at 3, 6, and 9 months and again at one year.
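The Proof of Goal Reached line sets quantified targets to check at 3, 6, and 9 months and again at one year. As a rough sketch of how progress against the 45% re-work reduction target could be tracked, assuming a pre-program re-work baseline is available (the figures below are hypothetical, not from the study):

```python
# Hypothetical sketch: tracking progress toward the 45% re-work reduction target.
# The baseline and follow-up re-work hours are illustrative assumptions only.

REWORK_REDUCTION_TARGET = 0.45  # 45% reduction within 9-12 months

baseline_rework_hours = 400  # re-work hours in the quarter before the program (assumed)
followup_rework_hours = {
    "3 months": 360,
    "6 months": 300,
    "9 months": 240,
    "12 months": 210,
}

for period, hours in followup_rework_hours.items():
    reduction = (baseline_rework_hours - hours) / baseline_rework_hours
    status = "target met" if reduction >= REWORK_REDUCTION_TARGET else "target not yet met"
    print(f"{period}: {reduction:.0%} reduction in re-work ({status})")
```

The 95% satisfaction target could be checked the same way, against satisfaction scores from the follow-up questionnaires.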

Participants were also evaluated on their level of participation in the classroom: team and individual activities, role plays, sharing information, stories, and knowledge with others, and discussion topics.  Participants who did not actively participate were spoken to, and expectations were reset that participation was a key component of successful completion of the program.  Additionally, managers and executives often “popped into” the classroom to observe the participants “in action.”

Level 3: Level 3 evaluations looked at whether the learning was actually applied after the program.  In Level 2, we put a plan in place to apply the learning and demonstrated the learning that had already taken place; in Level 3, we looked at progress toward the action plans and whether or not impact was being made on the process improvement initiatives.

The mentors, along with the managers and executives of the functional areas, evaluated progress toward the action plans over a variety of time periods (e.g., 3, 6, 9 months) depending on the action plan goals.

Additionally, questionnaires were sent to the executives, mentors, immediate managers, and participants to gather data on how successfully skills were being applied.  A questionnaire was sent initially at 3 months and again at 6 months.

The questions to the mentors, executives and immediate managers included, but were not limited to:

  1. How has the participant used his/her new skills and knowledge from the leadership development program?
  2. What cost reductions and/or cost avoidance was recognized as a result of the participant applying his/her new skills and knowledge?
  3. What benefits have you seen from this participant completing the leadership development program?
  4. How much improvement in the participant’s skills and knowledge was a direct result of the leadership development training program?
  5. How much improvement in the participant’s skills and knowledge was a direct result of the mentoring program?

The questions to the participants included, but were not limited to:

  1. What percentage of the new skills and knowledge learned from the leadership development program did you apply back on the job?
  2. What barriers prevented you from applying your new skills and knowledge?
  3. What enabled you to apply your new skills and knowledge?
  4. What specific actions did you take to implement your new skills and knowledge and who supported you in those efforts?
  5. Given all the other factors, including this leadership development training program, how much would you estimate your job performance related to leadership and general management will improve?

There was also a questionnaire to gauge the effectiveness of the formal mentoring process.  This questionnaire was sent to the executives who were not assigned as mentors. It asked the executives to what extent the mentoring process positively influenced the performance measures (such as productivity, quality, and efficiency) and where they saw the most impact from the mentoring process.  A separate questionnaire was sent to participants and mentors to determine the effectiveness of the mentoring program from their perspective.

Level 4: One measure for this particular group (as described in Part I) was to increase production by 10% within a 6-month period. Questionnaires were sent at 3 and 6 months to determine improvement in communication (which was a major component of increasing productivity), along with a review of production reports to compare widget production over the 6-month period.

The questionnaires sent to the manufacturing, planning and production, and materials management functions to determine improvement in communications included, but were not limited to, the following questions:

  1. What percentage of your time requires communication with the other functional areas?
  2. How effective are those communications? And specifically, what makes them effective?
  3. If the communications are ineffective, how can they be improved to ensure they meet your needs?
  4. On a scale of 0% to 100%, how effective has the communication process been in ensuring that you have what you need to perform your role and meet your goals?
  5. How did the program impact output, quality, cost, customer satisfaction and employee satisfaction?

Interviews were also conducted with customers to determine their satisfaction with the increase in production.
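For the production measure itself, comparing production reports comes down to simple arithmetic against a pre-program baseline. Here is a minimal sketch of checking the 10% target; the monthly widget counts and the choice of a 6-month pre-program baseline are assumptions for illustration, not actual XYZ Widget figures.

```python
# Illustrative sketch: checking the 10% production-increase target from production reports.
# Monthly widget counts are assumptions, not actual XYZ Widget data.

PRODUCTION_INCREASE_TARGET = 0.10  # 10% increase within 6 months

baseline_units = [10_200, 9_800, 10_050, 10_100, 9_950, 10_000]    # 6 months before the program
followup_units = [10_600, 10_900, 11_100, 11_300, 11_450, 11_500]  # 6 months after the program

baseline_total = sum(baseline_units)
followup_total = sum(followup_units)
increase = (followup_total - baseline_total) / baseline_total

print(f"Baseline production:  {baseline_total:,} widgets")
print(f"Follow-up production: {followup_total:,} widgets")
status = "meets" if increase >= PRODUCTION_INCREASE_TARGET else "falls short of"
print(f"Increase: {increase:.1%} ({status} the 10% target)")
```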

Summary

All of the data collected over the 6-month period would serve to evaluate the monetary value of the leadership development program at XYZ Widget in increasing widget production within the set time period.  The data collected later (at 9 and 12 months) would serve to evaluate other measures tied to the leadership development training program.

Next post: Phase 3: Data Analysis
