Or a Knowledge Management System for the Marketing Team
Someone, somewhere in the upper echelons of the company, says the words “knowledge management system” in a VIM (Very Important Meeting). Before long, you’re called into an office and tasked with finding said system. There is no budget (yet), no deadlines, few expectations beyond “fix our current silos and poor communication problems.”
So… you get started.
Part I: Research
First, determine what problems you are solving with this KMS, because a KMS isn’t the solution to all problems. If the problem is simply that folks can’t find things in Google Drive, maybe the right solution is a Digital Asset Management (DAM) tool or even a well-organized internal server.
For us, the problem was using Google Drive across 15 teams that operate similarly. Teams were starting to create projects in parallel without knowing it, or recreating projects from other teams the previous year with no guidance on how to go about it. When employees left, we lost their tacit (internal, personal, job-specific) knowledge (and their Google Drive records). And new employees struggled to get caught up quickly on the quirks and subtleties of our tech arrangements or business model.
A system where employees could write articles or record demos or share examples of their own expertise would be invaluable. So we decided to pursue a Knowledge Management System.
Interview the Stakeholders
Before I went too far down this road, I wanted user confirmation that there was in fact a problem and that a KMS would in fact solve it. I arranged hour-long meetings with every team in our Marketing department — for me, a stakeholder was anyone who would be asked to interact with the new solution.
I started every meeting by explaining, “There seems to be a problem” and outlining what I’d observed. I invited all the participants to agree or disagree with me, and then back up their case. From that point, I mostly listened and typed like a fiend. (Thank you, Miss Rose, for your 9th grade typing class where we learned to type without looking at our hands.)
At the halfway point, we shifted from bitch session to brainstorming what features or products or ideal world solutions could resolve the frustration and issues we’d just outlined. A lot of this part went something like, “If there was a … thing … that we could tag and label all the documents we upload to it, that’d be awesome.” And I just kept up the typing, asking clarifying questions, and offering encouragement to keep talking, that no idea was off the table.
[One note for stakeholder interviews: as you go, record who said what. It’s invaluable later to have specific people to talk to about specific features or functionalities. Or to remember who had game-ending concerns.]
Organize Your Notes
After weeks of interviews, I had approximately 47 pages of notes and was beginning to feel a little drowny. I blocked off a week (no deadlines, remember?) to sift through and organize my notes.
I started by assigning each team a new page in the Google Doc, giving them a header, and creating an outline of the notes so I could jump easily from one team’s page to another.
Then, I created a simple color-coded system of highlighting so I could reread the notes and mark important pieces. The code was super high level:
goals for the system (orange)
proposed contents for the system (green)
feature or functionality requirements (pink)
stakeholders’ names (yellow)
minimum viable product needs (blue)
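If you prefer to sort notes digitally rather than with highlighters, the same color code maps neatly onto plain categories. A minimal Python sketch, where the category names mirror the list above but the note snippets are invented for illustration:

```python
from collections import defaultdict

# Highlight colors mapped to the categories they marked in the notes.
CATEGORIES = {
    "orange": "goals for the system",
    "green": "proposed contents",
    "pink": "feature/functionality requirements",
    "yellow": "stakeholder names",
    "blue": "minimum viable product needs",
}

def sort_notes(tagged_notes):
    """Group (color, note) pairs into per-category buckets."""
    buckets = defaultdict(list)
    for color, note in tagged_notes:
        buckets[CATEGORIES[color]].append(note)
    return dict(buckets)

# Hypothetical note snippets tagged with highlight colors.
notes = [
    ("pink", "needs a tagging/labeling system"),
    ("green", "upload last year's campaign briefs"),
    ("pink", "smart search across everything"),
]

grouped = sort_notes(notes)
print(grouped["feature/functionality requirements"])
```

The advantage over highlighting alone is that each bucket can then be shipped off as its own document, exactly as the next step describes.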
Once everything was color-coded, I could pull all the green proposed contents into a separate doc, and ship that off to the stakeholders to tinker with. (Which they did — though they were often more interested in what did or didn’t live inside the system than with how it functioned.)
The feature requirements list (pink) I sorted and categorized in a separate doc around product parts:
User Interface
Compatibility
Internal Organization
Knowledge Sharing
Features
Tech Requirements
From that list and the blue list of MVP needs, I created yet another list of 10 must-haves for any solution to make it past the first round of scrutiny. I think it goes without saying, but this list will be different for every company, team, or organization. The MVP list for Clearlink was still rather high level:
Tagging/labeling system
Archive function (that’s still searchable; ideally with an expiration system)
Smart searching tool
Clear versioning system
Recommendations/related materials (functionality to make comparisons between assets)
Tracking/analytics on the tool itself (adoption, use rates, etc.)
Works with our internal SSO system
Slack/Gmail integration
Fast, accessible (mobile, slow WiFi networks, remote or VPN)
User-friendly publishing interface
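Treated as a gate, an MVP list like this is an all-or-nothing check: a candidate advances only if it covers every must-have. A quick Python sketch under that assumption (the feature labels and example tools are hypothetical stand-ins for the ten items above):

```python
# Shorthand labels for the 10 must-haves listed above.
MVP = {
    "tagging", "archive", "smart_search", "versioning",
    "recommendations", "analytics", "sso", "slack_gmail",
    "fast_accessible", "publishing_ui",
}

def passes_mvp(features):
    """A solution advances only if it covers every MVP need."""
    return MVP <= set(features)

# Hypothetical candidates: one complete, one missing SSO support.
tool_a = MVP | {"ai_summaries"}   # covers all MVPs, plus extras
tool_b = MVP - {"sso"}            # missing one must-have

print(passes_mvp(tool_a))  # True
print(passes_mvp(tool_b))  # False
```

Extras beyond the MVP set don’t help a tool pass the gate; they only matter once it’s through, which is exactly the nice-to-haves vs. needs distinction described next.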
The distinction for me between requirements and MVP fell along the lines of nice-to-haves vs. needs. The solution wouldn’t be right for us if it lacked one of the MVPs, but if it also had a bunch of the requirements, great!
[See the timeline at the end of this article that demonstrates how long this project took one person to accomplish.]
External Research
The notes were organized, I had an idea of the problem we were solving, and a fuzzy mental picture of what the solution might entail. The next step was seeing if the monster in my dreams existed in the real world.
I spent probably four straight days Googling. Open a fresh Excel or Google sheet, and just write down any possible product/URL that you find. Don’t spend time on the website — if it comes up in the SERPs, write it down. You’ll spend plenty of time vetting sites and products later. The goal at this stage is to find anything that calls itself a KMS and wrangle it onto your list.
I searched for everything from “knowledge management system” to “content sharing tools” to “internal storage systems” to “collaboration platforms.” By the time I was done, I had 164 possible solutions.

(Then someone reminded me of Forrester lists, so I scoured those, found a few more, then thought to check review and comparison sites, found a few more… At some point you just have to call it.)
Round 1
I started eliminating solutions with a really simple yes/maybe/no grid:

Once the list came down from 164 to a more manageable number (30, if you were curious), I started an in-depth competitive analysis. For this analysis, I built a grid with a column for each of:
the primary value proposition of each resource
my 10 MVPs
other notable features
availability of demos or walk-throughs
a 1–10 rating system
further notes
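The grid itself is easy to mirror in a spreadsheet or a small script: one row per tool, one column per criterion, plus the 1–10 rating. A sketch in Python, compressing the columns above to name, value proposition, MVPs met, and rating — the tool names and numbers are invented, only the column structure comes from the list:

```python
# One row per candidate:
# (name, value proposition, MVPs met out of 10, overall rating 1-10)
grid = [
    ("Tool A", "wiki-style knowledge base", 10, 8),
    ("Tool B", "document portal",            7, 6),
    ("Tool C", "enterprise search layer",   10, 9),
]

# Keep only candidates meeting all 10 MVPs, ranked by overall rating.
finalists = sorted(
    (row for row in grid if row[2] == 10),
    key=lambda row: row[3],
    reverse=True,
)

print([name for name, *_ in finalists])
```

The sort order gives a natural shortlist for scheduling demos, with the MVP count acting as the hard filter and the rating breaking ties among survivors.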
This in-depth analysis took several weeks of combing sites, sending emails requesting more information (and waiting), using live chat interfaces (and waiting), and of course — color-coding!
Finally, I had my list down to 12 highly viable tools that I believed could work. There were two more things left to do: plan for change management across the Marketing Team and schedule demos of those 12 tools.
A Timeline for One Person's KMS Implementation
