We developed an Operating Model during merger transition work in 2005. The “Op Model” became the bible of generating station management at the new company. It prescribed detailed operating, maintenance, and financial management methods.
Station management was expected to adhere to it: the lengthy manual captured what were judged to be the best practices of the merging companies and of the electric utility industry.
Two “Op Model Assessment Teams” were formed to measure adherence and to assist stations in achieving good performance. Each team consisted of an even mix of employees from the two legacy companies.
Baseline assessments were conducted at all stations in 2006, before they’d had time to implement the practices. Follow-up assessments were conducted in 2008 to measure implementation of the Op Model practices. The assessments and their results were high-profile, and station managers competed over them; the rivalry was mostly good-natured, but it carried obvious career implications because of the emphasis the Vice President of Operations placed on improvement.
Costs for each two-week assessment were substantial. Each of the two teams had ten members, half from the Midwest and half from North Carolina. Because of that makeup and the geographical separation, five members of each team incurred airfare expenses every week, and all team members required hotel, rental car, and meal reimbursement.
While the results of the assessments were helpful for improving management at the stations, the process was painful for team members. We called 2006 “the year without a spring or summer” because of the long hours and travel involved. Team documentation consisted of Excel and Word files stored on shared drives.
Assessments included dozens of interviews with all levels of management and craft employees, and dozens of observations of meetings and operations/maintenance work. Teams also reviewed management measures and process documentation. As this review went on, team members captured their key findings and notes in a Word document for each management area. Toward the end of the second week, team members shut themselves into a room, discussed all findings, reached consensus on scoring for nearly a hundred subject areas, and calculated a final score for the station being assessed.
The scoring document was an Excel workbook that, when finished, gave station management useful feedback: the gaps identified during the assessment and the action plans developed to close them.
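The text doesn’t describe the workbook’s actual formula, but the roll-up of nearly a hundred subject-area scores into one station score can be sketched as a weighted average. Everything below is illustrative: the function name, the area names, the 1–5 scale, and the equal default weights are assumptions, not the real scoring method.

```python
# Hypothetical sketch of rolling per-area consensus scores up into a
# single station score. The real Excel workbook's formula is not
# described in the text; a simple weighted average is assumed here.

def station_score(area_scores, weights=None):
    """Average per-area scores (assumed 1-5 scale) into one station score.

    area_scores: dict mapping subject-area name -> consensus score
    weights: optional dict mapping area name -> relative weight
    """
    if weights is None:
        weights = {area: 1.0 for area in area_scores}  # equal weighting
    total_weight = sum(weights[a] for a in area_scores)
    weighted = sum(area_scores[a] * weights[a] for a in area_scores)
    return weighted / total_weight

# Three invented areas standing in for the real hundred or so.
scores = {"Work Management": 4.2, "Safety": 4.8, "Financial Controls": 3.1}
print(round(station_score(scores), 2))  # 4.03
```

Keeping the roll-up as one small function makes it easy to rerun the final score whenever the team revises an area’s consensus during deliberation.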
A colleague and I recognized that our assessment process was inefficient, expensive, and painful, so we decided to improve it. We identified the following issues:
At that time I was just beginning to leverage SharePoint, which had only become available to me the previous year. Microsoft was then promoting SharePoint primarily as a collaboration tool, so we decided to move all assessment collaboration onto it. I built an assessment site template so that each station would have its own assessment site.
Each SharePoint site included:
We built another improvement that ended up unused in the next round of assessments, though not for lack of merit.
To eliminate the considerable time spent interviewing employees, and to free up time for coaching (which station managers had asked for), I built detailed surveys that station employees could take at their leisure in the weeks before the on-site assessment work. That would have let us gather information from all employees in advance, instead of interviewing only those handpicked by station management during the hectic two-week assessment periods.
Not only would that have gotten us more accurate information, it would have freed up on-site time to coach management on the finer points of the Op Model and on closing their gaps, as they had requested. Surveys would not allow follow-up questions, but they would have given us information good enough to show where stations struggled and where they were strong. We weren’t seeking exact scores; we were seeking to help stations improve.
Additionally, if we knew the weak areas at a station, we could request managers from stronger stations to help the assessment team coach the station in those areas.
The Senior Vice President liked the idea but felt he lacked the time to sell station managers on such a radical change. He expected them to push back hard at first, fearing that negative employees would skew scores downward. We could have won that debate, because we intended to score on a bell curve, discarding outlying scores from negative employees as well as from the overly optimistic, but he decided to put the improvement off until later.
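The outlier-discarding idea above amounts to a trimmed mean: sort the survey responses and drop the tails before averaging. This is only a sketch of that reasoning; the function name, the 1–10 scale, and the 10%-per-tail trim are assumptions, since the text says only that outlying scores would be eliminated.

```python
# Hypothetical sketch of bell-curve scoring: drop the most negative and
# most optimistic survey responses before averaging, so a few disgruntled
# (or rosy) employees cannot skew a station's score.

def trimmed_mean(responses, trim_fraction=0.10):
    """Mean of survey scores after dropping the top and bottom tails."""
    ordered = sorted(responses)
    k = int(len(ordered) * trim_fraction)  # responses to drop per tail
    kept = ordered[k:len(ordered) - k] if k else ordered
    return sum(kept) / len(kept)

# Assumed 1-10 scale: one disgruntled outlier (2), one rosy one (10).
responses = [2, 7, 7, 8, 7, 6, 7, 8, 7, 10]
print(trimmed_mean(responses))  # 7.125
```

With both extremes removed, the remaining eight responses settle near the group consensus, which is exactly the argument that could have answered the managers’ fear of score-skewing.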
I realize those details may be too granular for your interests, but my point is that we had a business process need, and with some creative thought SharePoint met it exactly. Remember, your Continuous Improvement Specialists are the ones who should apply SharePoint to your processes for efficiency improvements and financial savings.
Due to the improved documentation and collaborative processes, we cut assessment time from two weeks to one week and saved 50% of travel expenses. Strain on team members was also reduced by more than 50%. Not only was their time away from home cut in half, but the time spent working during that week was more fluid because SharePoint allowed smoother data entry, data retrieval, and group deliberation.
Station management appreciated the changes because all assessment information was neatly laid out on a single SharePoint site rather than being buried deep in a shared drive in nested folders.
This SharePoint innovation resulted in tens of thousands of dollars of savings and much improved final results. (We never got to see the survey innovation used because the assessment program was eliminated after the next merger.)