Nov 30, 2017

Metrics

The ability to measure the effectiveness of your Knowledge Management (KM) program, and of the initiatives essential to its success, has been a challenge for every organization executing a KM program. Capturing the appropriate metrics is essential to measuring the right aspects of your KM program, and the right metrics will facilitate clear and accurate communication of the health of the KM program to your organization’s leadership. In this post, I will identify metrics (or measurements) for four key initiatives common to most KM programs: Communities of Practice, Search, Lessons Learned, and Knowledge Continuity.

Community of Practice (CoP) Metrics

Typical CoP metrics include:

  • Average posts per day
  • Unique contributors (people posting at least once)
  • Repeat contributors (people posting more than once)
  • Majority contributors (the minimum number of people accounting for more than 50% of posts)
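As an illustration, these contributor metrics can be computed directly from a log of posts. This is a minimal sketch; the authors, dates, and post counts are invented for the example:

```python
from collections import Counter
from datetime import date

# Hypothetical CoP post log: (author, post_date) pairs.
posts = [
    ("ana", date(2017, 11, 1)), ("ben", date(2017, 11, 1)),
    ("ana", date(2017, 11, 2)), ("ana", date(2017, 11, 3)),
    ("carl", date(2017, 11, 3)), ("ben", date(2017, 11, 4)),
]

# Average posts per day over the observed window.
days = (max(d for _, d in posts) - min(d for _, d in posts)).days + 1
avg_posts_per_day = len(posts) / days

counts = Counter(author for author, _ in posts)
unique_contributors = len(counts)                               # posted at least once
repeat_contributors = sum(1 for c in counts.values() if c > 1)  # posted more than once

# Majority contributors: minimum number of people accounting for > 50% of posts.
majority, running = 0, 0
for c in sorted(counts.values(), reverse=True):
    running += c
    majority += 1
    if running > len(posts) / 2:
        break

print(avg_posts_per_day, unique_contributors, repeat_contributors, majority)
```

The majority-contributor count is found greedily: sort people by post volume and take the most prolific contributors until their combined posts exceed half the total.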

Some points to consider:

  1. Recognize the diversity of interests in those participating in the group, and that this is a voluntary undertaking for all involved.
  2. Develop a stakeholder classification and perform a RACI assessment for each stakeholder group.
  3. Through a collaborative process, arrive at coherent goals, objectives, principles and strategies for the group.
  4. Develop a CoP plan with agreed upon moderator criteria and stakeholders that influence group behavior in ways that are congruent with the group’s goals and objectives.

Search Metrics

Search metrics are determined through tuning and optimization.

Site Owners/Administrators should constantly observe and evaluate the effectiveness of search results. They should be able to gather search results reports from the KMS administrator periodically (every two weeks). From these reports, they can analyze the keywords users are searching for and identify which sites most of the search queries come from. Based on this, Site Owners/Administrators can add synonyms for their sites. If a newly added metadata column needs to be available in the Advanced Search filters, the request must be sent to the KMS administrator.

Typical search metrics include:

  • Search engine usage – Search engine logs can be analyzed to produce a range of simple reports, showing usage, and a breakdown of search terms.
  • Number of Searches performed (within own area and across areas)
  • Number of highly rated searches performed
  • User rankings – This involves asking the readers themselves to rate the relevance and quality of the information being presented. Subject matter experts or other reviewers can directly assess the quality of material on the KM platform.
  • Information currency – This is a measure of how up-to-date the information stored within the system is. The importance of this measure will depend on the nature of the information being published and how it is used. A great way to track this is to use metadata such as publishing and review dates. With this metadata in place, automated reports covering a number of specific measures can be generated:
    • Average age of pages
    • Number of pages older than a specific age
    • Number of pages past their review date
    • Lists of pages due to be reviewed
    • Pages to be reviewed, broken down by content owner or business group
  • User feedback – A feedback mechanism is a clear indicator of whether staff are using the knowledge. While a large volume of feedback messages may point to poor-quality information, it also indicates strong staff use, and it shows that staff have sufficient trust in the system to commit the time needed to send in feedback.
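The information-currency reports listed above follow directly from the publishing and review-date metadata. Here is a minimal sketch that derives the average page age, the count of stale pages, and the past-due-review list broken down by owner; the page titles, dates, and owners are invented for the example:

```python
from datetime import date

today = date(2017, 11, 30)

# Hypothetical page metadata: (title, published, review_due, owner)
pages = [
    ("KM Charter",  date(2016, 1, 15), date(2017, 1, 15), "ops"),
    ("CoP Guide",   date(2017, 6, 1),  date(2018, 6, 1),  "hr"),
    ("Search Tips", date(2015, 3, 10), date(2016, 3, 10), "ops"),
]

# Average age of pages, and pages older than a specific age (two years here).
ages = [(today - pub).days for _, pub, _, _ in pages]
avg_age_days = sum(ages) / len(ages)
older_than_2y = sum(1 for a in ages if a > 730)

# Pages past their review date, and the same list broken down by content owner.
past_review = [title for title, _, due, _ in pages if due < today]
by_owner = {}
for title, _, due, owner in pages:
    if due < today:
        by_owner.setdefault(owner, []).append(title)

print(round(avg_age_days), older_than_2y, past_review, by_owner)
```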

Lessons Learned Metrics

Lessons Learned Basic Process: Identify – Document – Analyze – Store – Retrieve

Metrics are determined and organized by key fields from the lessons learned template and include responses gathered during the session. Lessons learned should be identified by the type of lesson captured (e.g., resource, time, budget, system, content). Summarize each lesson learned by creating a brief summary of the findings and providing recommendations for correcting them (i.e., Findings – a summary of the issues found during the review process; Recommendations – recommended actions to be taken to correct the findings). To provide accurate metrics, the approved actions should be documented and tracked to completion. In some cases an approved action may become a project in its own right due to the high level of resources required to address the finding. Some metrics include: impact analysis (time increased/decreased, improper resourcing, budget constraints, software/system limitations, lack of available content, etc.) and applying lessons learned (the percentage of problems/issues solved with a lesson learned, per category and overall).
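The "percentage of problems/issues solved with a lesson learned, per category and overall" metric can be sketched as follows, using invented records tagged with the lesson types mentioned above:

```python
from collections import defaultdict

# Hypothetical issue records: (lesson category, solved_with_lesson_learned)
issues = [
    ("resource", True), ("resource", False), ("time", True),
    ("budget", True), ("budget", True), ("system", False),
]

per_category = defaultdict(lambda: [0, 0])   # category -> [solved, total]
for category, solved in issues:
    per_category[category][1] += 1
    if solved:
        per_category[category][0] += 1

# Percentage solved with a lesson learned, per category and overall.
pct_by_category = {c: 100 * s / t for c, (s, t) in per_category.items()}
overall = 100 * sum(s for s, _ in per_category.values()) / len(issues)
print(pct_by_category, round(overall, 2))
```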

Knowledge Continuity

The keys at the heart of knowledge continuity include:

  • What constitutes mission-critical knowledge that should be preserved?
  • Where is the targeted mission-critical knowledge, and is it accessible and transferable?
  • What decisions and actions are required to stem the loss of valuable and, in many cases, irreplaceable knowledge?
  • How can the lessons learned and best practices of the most experienced and valuable workers be obtained, transferred, and stored in a knowledge base (or KM application) before those employees depart or retire?

Some metrics include:

  • Percentage of knowledge harvested and stored from key employees.
  • Percentage of knowledge transferred to successor employees.
  • Cost associated with preventing the loss of corporate mission-critical knowledge.
  • Availability of a structured framework and system to store, update, access, enrich, and transfer knowledge to employees in support of their work activities.
  • The ramp-up time of new hires – moving them rapidly up their learning curves and making them more productive sooner.
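The two percentage metrics above can be computed once mission-critical knowledge items are inventoried per departing key employee. A minimal sketch with hypothetical employees and item counts:

```python
# Hypothetical knowledge-continuity tracking per departing key employee:
identified  = {"alice": 12, "bob": 8}   # mission-critical items identified
harvested   = {"alice": 9,  "bob": 8}   # items captured in the KM application
transferred = {"alice": 6,  "bob": 7}   # items handed off to successor employees

# Percentage of knowledge harvested and stored from key employees.
pct_harvested = 100 * sum(harvested.values()) / sum(identified.values())
# Percentage of harvested knowledge transferred to successor employees.
pct_transferred = 100 * sum(transferred.values()) / sum(harvested.values())

print(round(pct_harvested, 1), round(pct_transferred, 1))
```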

Let me know if you agree with the metrics identified here and/or if you know of additional metrics within these key initiatives that must be captured. I look forward to your responses.

Jan 11, 2010

Army Enterprise Knowledge Management Competency Model

I recently read the discussion, and the associated comments, around KM education, which includes university courses (Masters programs), certification programs, and certificate programs.

This discussion is hosted by Art Schlussel in the CKO (Chief Knowledge Officer) forum in LinkedIn. It inspired me to elaborate on my thoughts concerning KM education. As I stated in my comments to Art, for any education to be effective it must be supported by practical application, including having experienced mentors work with participants who have recently completed any number of various KM training venues.

In the discussion, Art mentioned that a partnership between the US military and a well-known accredited university to build a comprehensive KM training program is in its preliminary stages. The major issue, however, is what this training does or will consist of, taking into account that the US military wants it to follow their KM Competency Model (see above).

I believe that the KM Training should have a holistic approach, which will cover the following:

  • The basics, and differences between data, information, and knowledge.
  • Establishing “your” definition of knowledge management.
  • Developing/executing knowledge management strategy (including knowledge audits, knowledge mapping, KM process.)
  • Identifying and addressing knowledge gaps (result from knowledge audit.)
  • Collaboration and knowledge sharing (Communities of Practice.)
  • Knowledge transfer planning (mentor/protégé programs, knowledge codification.)
  • Collecting and applying knowledge management metrics.
  • Identifying, planning, and executing KM projects/initiatives.
  • Knowledge management tools (wikis, blogs, search, KM systems.)

While this is not an exhaustive list, the approach must include the planning, strategy, and processes applied for knowledge management as well as the software that will enable and support the execution of the KM program initiatives.

The Army’s KM Competency Model serves as a foundation to how the Army will approach KM and forms the basis of what KM will address from the Army’s perspective. The Army’s Enterprise KM Competency Model represents one holistic approach to institutionalizing KM.

I believe that a holistic approach to KM is where we must begin in our training as well as our execution of KM at our organizations.

Jan 1, 2009

An important aspect of any knowledge management strategy is to establish an environment of continuous sharing, collaboration, and knowledge reuse. During the Democratic primary and the presidential campaign, the Obama team leveraged email lists gathered partly through their internet site and their push for campaign donations through the mail, including lists harvested during Obama’s run for the United States Senate. The emails were leveraged (and are still being leveraged) to push out information and knowledge to supporters, solicit donations, and solicit additional email lists of people who want to get involved, partly enticed by the possibility of winning certain promotional items identified in the Obama email (see example).

The Obama team would utilize these email lists, determine where in the country the supporters lived, and dispatch teams to those locations to mobilize these and other supporters to get out the vote for Barack. This process was repeated (reused) all over the country, creating a “grass roots” effort to gain support and votes for Barack Obama. The emails served as a vehicle to build organic Communities of Practice (CoPs) for Obama, to disseminate knowledge, and to build support for the Obama campaign and subsequent presidency. This strategy empowered supporters to hold their own functions (lunches, dinner parties, and other special events) to showcase Barack Obama’s message and to talk about the issues.

Targeted email marketing, the development of communities as vehicles to share knowledge, and the creation and execution of a repeatable process together established a foundation for a knowledge management strategy that was able to expand. I will post more about this iterative expansion of the Obama knowledge management strategy as we continue this dialog. I look forward to all thoughts and comments.

Happy New Year!