Lesson 10 Blog

Based on the change process that I have been sharing through my blogs, a major CRM technology upgrade, I have answered the following questions about what I would want to use for data collection and evaluation.

What do we want to know? Ultimately, we want to ensure our employees use the new platform. I would like to collect touchpoints on the platform and count which pages are being utilized the most. I would want this data daily if possible, and at both the employee and team level. Additionally, and most importantly, we want to ensure we have a strong VoC (Voice of Crew) platform to gather voices and ideas to improve the CRM platform.

How will we know we have been successful? Typically, more than 50% usage of the new platform would be a strong indication that we are heading toward success.
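To make that threshold concrete, below is a minimal Python sketch of how an adoption rate could be computed from touchpoint data and checked against the 50% mark. The record layout (an employee ID per touchpoint) and the roster are hypothetical examples, not our actual CRM data.

    # Minimal sketch: check whether platform usage has crossed the 50% mark.
    # The touchpoint records and roster below are hypothetical examples.
    def adoption_rate(touchpoints, roster):
        """Share of rostered employees with at least one touchpoint on the new platform."""
        active = {t["employee_id"] for t in touchpoints}
        return len(active & set(roster)) / len(roster)

    touchpoints = [
        {"employee_id": "e1", "page": "client_overview"},
        {"employee_id": "e2", "page": "book_management"},
    ]
    roster = ["e1", "e2", "e3", "e4"]

    rate = adoption_rate(touchpoints, roster)
    print(f"Adoption: {rate:.0%} - {'heading toward success' if rate > 0.5 else 'below target'}")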

What specific kinds of data will we need? Quantitative usage data and qualitative voices.

Do those data exist? If so, where? Who owns them? Can we get them? Currently, VoC does exist and is being utilized well. Multiple upgrades have been submitted and implemented. We were not able to get touchpoint data and, sadly, have given up. Ultimately this was an issue with data prioritization. VoC has a sole owner who brings together the necessary leaders each week to discuss and determine what moves forward.

How should we display the data? VoC is collected through a direct button on the CRM browser and funneled into a spreadsheet.

How often should data be collected? How often can they be collected? Who will collect them? VoC is collected at any time and submitted by any employee who uses the platform.

Do we need permission to access the data? Only the leaders and project specialists have initial access to the data, but they then share it.

What resources do we need? VoC program ownership was needed, and a proper database was built to gather and house the information.

Had we been able to gather more quantitative data, the dashboard I would have created would have included two pieces of data (a rough rollup sketch follows the list below):

  • Overall employee touchpoints to the CRM browser.
    • Daily, with a rollup for each team as well. If daily were not possible, weekly would suffice.
  • Which pages of the CRM browser are being used most? This will help with training and ensure that employees feel comfortable with the full range of the CRM browser's capabilities.
  • I would also continually gather voices through VoC and leadership, encouraging all to share during team time, huddles, and 1:1s.
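As a rough illustration of how those two dashboard pieces could be rolled up, here is a minimal Python sketch. The touchpoint record layout (date, team, employee, page) is an assumed export format, not the platform's actual schema.

    from collections import Counter

    # Hypothetical daily touchpoint export: one record per page visit.
    touchpoints = [
        {"date": "2024-05-01", "team": "Advice East", "employee_id": "e1", "page": "client_overview"},
        {"date": "2024-05-01", "team": "Advice East", "employee_id": "e2", "page": "book_management"},
        {"date": "2024-05-01", "team": "Advice West", "employee_id": "e3", "page": "client_overview"},
    ]

    # Dashboard piece 1: daily touchpoints rolled up by team.
    daily_by_team = Counter((t["date"], t["team"]) for t in touchpoints)

    # Dashboard piece 2: which pages of the CRM browser are used most.
    page_usage = Counter(t["page"] for t in touchpoints)

    for (date, team), count in sorted(daily_by_team.items()):
        print(f"{date}  {team}: {count} touchpoints")
    for page, count in page_usage.most_common():
        print(f"{page}: {count} visits")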

Lesson 6 Blog – Evaluation Design

Learning about Rog's "Evaluation Design" (2005) description of an evaluation design and the key elements it shares allowed me to reflect on past evaluations that I have been a part of in my organization. The elements shared all play an important part in the entire evaluation process, but as I deeply pondered my own experiences, I could not help but continually come back to the Units of Analysis as the element I believe to be most important.

First and foremost, it is critical to consider the type of organization that one works in and the type of work that each of us does within that organization. For me, most of my roles have been in leadership and coaching, so I am constantly surrounded by people and coaching them to be a little better each day. I live by the mantra – people over process. I have seen seasons where the process was valued more highly than the person completing it, and in my opinion, that does not sit well with the employees. They are people first, employees second.

In many of the evaluations I conduct in my division, voices are the key element, which would equate to a unit of analysis. It may be a single voice, or it could be voices gathered in a group. It could also be voices from multiple teams that are funneled together to gather broad perspectives. I have found that these measures of qualitative data are the most impactful and meaningful, and that they allow for progress and movement in the focus of that change process. On the other hand, senior leadership does not respond as positively to this type of data. A level of precision, with strong and accurate numbers (quantitative data), is greatly craved at the top of the leadership chain. What I find critical when using voices is to weave a few data points into the story of those voices and ensure that I am telling a clear and enticing story. When this is done well, senior leadership often loves it.

I am currently completing a Green Belt project within my workplace. With multiple statistical and data points available, the voices session is what stood out most to our sponsor and senior team. I believe we had such success because we communicated simply, provided a true story from the perspective of the employees, and then backed that story up with only two or three data points. Our audience was Senior Operations Leadership – all leaders who LOVE data. We were so pleasantly surprised by their reaction and appreciation for our storytelling method.

Lastly, I do not believe any concepts or elements are missing regarding evaluation; however, I think a conversation is warranted around mixing and matching some of these elements. So much depends on the organization being evaluated, the goals of the division, and the progress that is needed to take the company further. As global organizations face constant change, perhaps prioritization needs to be part of the conversation. Are companies taking on too much change, so that the evaluation process becomes too complex? This is a conversation that I often have with my senior leaders. Prioritization is a major roadblock for many, as there are multiple OKRs and initiatives that define success.

Lesson 5 Blog – Strategic Alignment

My organization, Vanguard, has been client-focused and mission-aligned since it introduced its first mutual fund back in the 1970s. Our mission is clear, and the core purpose is simple – to take a stand for all investors, to treat them fairly, and to give them the best chance for investment success. This mission is alive and discussed daily, and our crew (we call employees crew) are connected to their work; they live and breathe this mission. As an enterprise, each division has its own OKRs (Objectives and Key Results) as well as its own strategic vision, typically on a 3–5-year timeframe.

As I have discussed in my blog for this discussion, the change process that I have been focusing on is one in which I have partnered with multiple teams to try to turn a poorly integrated change into a better experience for our crew. As a recap, in our advice divisions, we introduced and are changing CRM (Client Relationship Management) platforms. This change is significant, and the new platform is quite different from the one the advisors have been using. The change was not introduced well, communication was lacking, and timelines have been vague.

My partnership with this change lives across multiple departments, so when it comes to OKRs, I cover many. Below are a few samples:

  • Support delivery of spinal capability enhancements (Dynamics) for advisors, relationship managers, and sales crew to drive crew productivity and implementations, as measured by project quality assessments and business readiness indicator scores.
  • Digital client experience – drives improved outcomes by bringing the power of advisor-led personalization to the digital experience.
  • Improve crew effectiveness through increased tool utilization, implementing an effective coverage and segmentation model, and institutionalizing book management best practices.

Connection to OKRs

If you asked an advisor which OKR this new platform falls under, they would not be able to provide an answer. OKRs at my organization have often stayed at the leader level, which has always puzzled me. Vanguard is all about connection, and we tend to rely on connecting to our mission and the client, as opposed to our OKRs. Leaders understand the underlying goals in the above OKRs; however, true success measures are often discussed as measures that are quite difficult to quantify. Additionally, many share that we overcomplicate our OKRs with too many words. I have been in conversations where fellow leaders and coaches crave simplicity, directness, and more measurable OKRs.

For this CRM change and OD project, the work is referenced as a spinal capability (a word choice I always found peculiar) and is woven throughout the OKRs vaguely and lightly. I have always found that connection, or lack thereof, quite interesting, as we have seen very light senior leader support through this entire change process.

I think the word connection is incredibly important. For a change like this one, being able to articulate the "why" and "purpose" of the change and connect it back to our mission is essential, especially since we do not openly discuss or share OKRs – they are mostly viewed and tracked at the senior leader level.

When I was a leader in our advice division, I would regularly reference our OKRs and do my best to take a lot of words and long sentences and simplify them back to the advisors' goals. I often had some success with this and would share the practice with my managers and peers. As a current Continuous Improvement Coach to the division, I continue to coach back to the basics and coach that words matter. What does this mean to an advisor? How does this change affect them or affect the client? How can we bring each other along together through this change? There are many ways to spotlight this change as a good one, and for me, word choice and connection matter greatly.

Lesson 4 Blog – Applying a Model to Your Evaluation Plan

I am such a fan of the Kirkpatrick Model and have used it a few times during my career, especially over the past few years. As I went through the Lesson 4 materials, I really appreciated the different viewpoints to use when applying the Kirkpatrick Model (as shared below). As I think about the change process that I have been sharing in my blog, a complete change of our CRM browser, this perspective, especially in hindsight, brings the change process alive and could have been incredibly helpful when the change was in its early stages.

  1. Change that occurs before the change effort
  2. Change that occurs during the change effort
  3. Change that occurs immediately following the change effort
  4. Change that occurs long after the change effort

As I think about the stages of the change process, applying the Kirkpatrick Model would have been insightful. During step 1 – Reaction, I was a manager in the department when the initial communications were shared, and the information provided was minimal. There was no measurement, nor were there any moments of gathering voices. In the early stages, I would have recommended that leaders gather voices and initial reactions and funnel them up to the senior leader leading the change, so that adjustments could be made as the change process continued. For step 1 – how well people like the change effort – gathering voices is critical, as is ensuring the employees understand the why behind the upcoming change.

In step 2 – Learning, the learning begins. For my change process, the learning began too early. The CRM browser had not been introduced yet; however, we pulled people into the classroom with hands-on modules, practice, and conversation. Surveys were provided afterward, in Kirkpatrick style, to find out how much people learned, but the strong feedback was that the learning happened too early. Until the browser was fully in their hands, there was a lot of fear that what was just learned would be forgotten, since there was no timeline for application. This, in fact, was the case. There were multiple IT issues with getting the CRM browser set up, so some of this difficulty in the timeline was out of the project planners' control. In hindsight, having multiple learning and practice sessions, right up until the true elevation of the new browser, would be the approach I would recommend. I would evaluate during the training, after the training, and then again just after the elevation. I would also encourage the leaders to continue gathering sentiment as well.

During step 3 – Behavior, we evaluate how much the employees changed their on-the-job behavior because of the change effort. This stage was perhaps the trickiest; as I have shared in the past, once the new CRM browser was elevated, no timeline was set to remove the old browser. As I write, this is still the case – both browsers are available; however, the employees are highly encouraged to use only the new one. If someone is more comfortable with the old browser, they are going to continue to use it, which is exactly what is happening in this situation. As one of my manager peers shared, "It's time to close the gate!" Because a date has not been provided, measuring this stage is very difficult. The more open advisors are willing to practice and explore, whereas the advisors who dislike change are waiting until they are forced by a removal date.

I would propose a few measures during this step.

  • Set a tentative removal date for the old browser. It can be flexible, but having a date on the calendar will help people accept the change.
  • Ensure SMEs/Change Agents are on each team. Have them gather weekly to share best practices with each other and then take those back to their teams. These folks are the change cheerleaders.
  • Measure with voices during 1:1s and team time. I would not run a survey during this stage, to ensure the employees do not get survey fatigue.
  • Recognize the employees who are utilizing the Voices button on the platform. As folks use the browser, they can offer suggestions to customize the platform. Celebrate those ideas and remind the employees that they are a part of shaping this change process.

Lastly, in step 4 – Results, we evaluate how much the change benefited the organization. I would ensure in step 1, especially in leadership communications, that the why behind this change is very clear and concise. As we rally around this final step, reconnecting the growth of the employees and the department back to that why is critical. This browser is expected to be a much better experience for both the client and the employee, but it is new and very different from the old one. Continuing to share that "why" and the connection is essential. I would recommend that the entire division complete a survey to evaluate the entire change process. I would also recommend that the senior team, especially the change program managers and sponsors, meet with the SMEs/Change Agents as a group and gather their voices on behalf of the teams they have been representing. I would also recommend a post-mortem with the sponsors and program leads. What did we learn? What would we do differently? These sessions are always valuable and critical for learning, as we know more change is right around the corner.