Management Observation Program Best Practice 13 – Metrics and Results Communication

Management observation programs serve to reinforce leadership expectations throughout the workforce. This reinforcement includes not only standards associated with the performance of day-to-day operational activities but also manager and supervisor performance of observations. One effective way to reinforce the importance of the program itself, the desired performance of management observations by managers and supervisors, and the behaviors expected of workers is through a publicly posted set of management observation program metrics.

Management Observation Program Performance Metric Sets

Metrics programs provide constant, indirect reinforcement of leadership's expectations when published in the highly trafficked areas frequented by those to whom they apply. For the management observation program, this reinforcement is best achieved through the routine publication of programmatic and findings metrics. These metrics commonly include:

Programmatic Performance Metrics (overall and by department, workgroup, etcetera)

  • Management Observations Performed
  • Management Observations Performed by Managers & Supervisors
  • Average Management Observation Time
  • Average Management Observation Grade
  • Management Observations of Training
  • Improvement Opportunities Identified Through Management Observations
  • Management Observations Identifying at Least One Improvement Opportunity
  • Management Observation Open Condition Reports or Corrective Actions

Findings Performance Metrics

Findings performance metrics are a logical collection of categorical performance metrics based on the observation card criteria graded during each observation (see the StrategyDriven Management Observation Program Best Practice article, Use of Standard Observation Forms). This collection of metrics is derived as follows:

  1. Observation card criteria are grouped into people, process, and technology categories (done for each observation card completed)
  2. The number (specific criteria count) and/or percent (specific criteria count divided by the total of all criteria counts) of criteria within each category receiving a particular grade/score is calculated for a defined period (typically one month)
  3. Subordinate metrics are created for each department, workgroup, etcetera (note that the aggregate ‘number’ from the subordinate metrics should equal that of the parent metric defined in Step 2)
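As an illustration only, the three derivation steps above can be sketched in Python. The record fields, department names, and grade labels below are hypothetical placeholders, not part of the program itself.

```python
from collections import Counter

# Hypothetical observation-card records: each record grades one criterion,
# tagged with its category (people/process/technology) and department.
observations = [
    {"category": "people", "department": "Operations", "grade": "Excellent"},
    {"category": "people", "department": "Operations", "grade": "Average"},
    {"category": "people", "department": "Maintenance", "grade": "Needs Improvement"},
    {"category": "process", "department": "Operations", "grade": "Above Average"},
]

def findings_metrics(records, category):
    """Step 2: number and percent of criteria in a category receiving each grade."""
    grades = Counter(r["grade"] for r in records if r["category"] == category)
    total = sum(grades.values())
    return {g: {"count": n, "percent": 100.0 * n / total} for g, n in grades.items()}

def subordinate_metrics(records, category):
    """Step 3: the same metric computed separately for each department."""
    departments = {r["department"] for r in records if r["category"] == category}
    return {d: findings_metrics([r for r in records if r["department"] == d], category)
            for d in departments}

parent = findings_metrics(observations, "people")
children = subordinate_metrics(observations, "people")

# Consistency check from Step 3: subordinate counts must sum to the parent's.
parent_total = sum(m["count"] for m in parent.values())
child_total = sum(m["count"] for d in children.values() for m in d.values())
assert parent_total == child_total
```

The assertion at the end enforces the Step 3 note: the aggregate count across department-level metrics equals the parent metric's count.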

Worker Behaviors (People) Management Observation Program Findings Metric Example

  • Category: Worker Behaviors (People)
  • Criteria: Procedure Use, Procedure Adherence, Self-Checking, Peer Checking, Independent Verification, etcetera (See the StrategyDriven Human Performance Management Forum)
  • Scoring: Excellent, Above Average, Average, Needs Improvement
  • Periodicity: Monthly
  • Worker Behaviors Number/Counts Metric: Stacked bar chart showing the monthly sum of the counts of the worker behaviors criteria receiving an excellent, above average, average, and needs improvement score
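A minimal sketch of how the counts behind such a stacked bar chart might be assembled, assuming hypothetical monthly findings data (the month labels and grades below are placeholders). Each grade's count series becomes one layer of the stack, e.g. when passed to a charting library's bar function with a running bottom offset.

```python
GRADES = ["Excellent", "Above Average", "Average", "Needs Improvement"]

# Hypothetical findings: one (month, grade) pair per graded worker-behavior criterion.
findings = [
    ("2024-01", "Excellent"), ("2024-01", "Average"), ("2024-01", "Average"),
    ("2024-02", "Needs Improvement"), ("2024-02", "Excellent"),
]

# Pivot into one count series per grade -- each series is one layer of the stack.
months = sorted({m for m, _ in findings})
layers = {
    g: [sum(1 for m, gr in findings if m == mo and gr == g) for mo in months]
    for g in GRADES
}
```

With the data in this shape, each month's bar is simply the four grade counts stacked in a fixed order, so month-to-month shifts in the grade mix are visible at a glance.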


Figure 1: Worker Behaviors (People) Management Observation Program Findings Metric

Management Observation Program Metrics System Structure

Management observation program performance measures should be well-constructed and horizontally shared (see the StrategyDriven Organizational Performance Measures Forum). To achieve these goals, each metric within the system should use consistent units of measure and scaling so as to enable comparison between workgroups. Such comparison fosters a healthy, camaraderie-based competition between workgroups that serves to further elevate performance.

Management Observation Program Metrics Communication

Each workgroup’s specific management observation program performance metric set should be posted in highly trafficked locations so as to serve as a frequent reminder of leadership-defined standards and the group’s compliance with them.

Results briefings should also be provided to workgroup leaders with the expectation that the information will be disseminated to subordinate personnel. These communications provide the additional underlying performance detail needed to improve performance.

Final Thoughts…

Mature management observation programs leverage software applications to capture observation data and automatically generate the associated performance metrics. These applications greatly reduce or eliminate the administrative overhead associated with maintaining the program.

As with other workgroup-level performance metrics, accountability for the measured results should be assigned to the associated workgroup leader and included in his or her performance goals. This alignment between management observation program outcomes and individual manager goals further reinforces the program and promotes its effective implementation.


About the Author

Nathan Ives is a StrategyDriven Principal and Host of the StrategyDriven Podcast. For over twenty years, he has served as a trusted advisor to executives and managers at dozens of Fortune 500 and smaller companies in the areas of management effectiveness, organizational development, and process improvement.
