Did you know that, according to Bersin, fewer than half of the L&D organizations they survey feel their stakeholder leadership perceives them as a strategic business partner? One way to combat this is by measuring learning and development metrics and analytics.
That’s a pretty disheartening thing to hear. How is it that the Learning & Development organization—the group responsible for developing the company’s greatest asset, human capital—is often considered ‘overhead’? I talk to a lot of analysts in the space, and I think the answer lies in these challenging observations from one analyst who works regularly with HR Directors on both the Learning and Talent sides:
“I can’t emphasize enough how stressed out these [HR Directors] people are. They have fewer people than ever and they’re feeling the pressure of Corporate saying to them: ‘Every day we stare at your Learning & Development budget and wonder: what are we getting out of it? You can’t report anything. You can’t analyze anything. You can’t trend anything. You can’t tell me whether we’re moving the ball on people. Are you reducing turnover? Are you making people more effective sooner? Are you increasing sales performance?’”
You might feel this overstates the case, but it’s a perception that’s out there. And what it tells us is that, in order to be considered a strategic business partner, L&D must have a way of measuring learning and proving the effectiveness and efficiency of their organization. Today, L&D measures learning activities through metrics such as time-in-training, completion rates, and test scores. Unfortunately, this isn’t the language the rest of the business speaks, because these metrics don’t prove that the training is working. It’s clear that the business wants more. What it wants is granular, actionable learning and development metrics, and better ways of measuring learning.
So, what if L&D were able to provide a set of learning metrics that could be plugged into the corporate scorecard? And what if they could take immediate action on these learning and development metrics to make adjustments to learning at the speed of business? If L&D took a Google Analytics approach to learning, they could. To see why, let’s do a simple but revealing exercise. I’ve pulled together some of the most important Web analytics metrics that we regularly review and report to the business here at Xyleme, along with the reasons why they matter. To prove my point about learning and development analytics, I’ve done three things:
- Substituted the word ‘learners’ for the word ‘visitors’.
- Substituted the word ‘learning content’ for the word ‘site’.
- Substituted or added learning terms such as courses, videos, content nuggets, topics, etc. for the word ‘page’.

Let’s take a look at the results. The substitutions are highlighted.
Typical Google Analytics Metrics
Visitors: How many unique learners visited the learning content, and how many pages, topics, videos, content nuggets, courses etc. were visited?
Why: We want to determine how best to promote our learning content via paid, organic, email, and social channels in order to increase our audience of potential learners.
Traffic Sources: Where did learners initially come from, and where did they go after visiting the learning content?
Why: This information helps us to identify which channels perform best and where we should focus our promotional efforts.
Bounce Rate: What percentage of learners accessed our learning content or a particular page, topic, video, content nugget or course on our site and then left quickly?
Why: A lower percentage rate indicates greater interest in the learning content; a higher percentage means that the learning content proved less useful.
Conversion Rate: What percentage of learners who came to our site filled out a form (i.e. completed a learning activity), and which traffic source gave us the highest conversion?
Why: We can quantify the value of these conversions (from a revenue perspective).
Content/Keyword: What content or keyword was used to find our learning content? We measure the pages, topics, content nuggets etc. that performed best.
Why: It helps us to understand what our learners are looking for.
Time on Site: How much time did learners spend on our learning content?
Why: The more time learners spend on the learning content, the more engaged they are.
Device: What type of device is the learner using to access our learning content?
Why: By having a responsive Web site, we can enhance the learner experience by making sure our learning content is easy to view on whatever device they are using. This helps to increase our conversion rate, lower the bounce rate, and increase engagement with the learning content.
Social: Did our learning content get shared?
Why: It enables us to identify which networks and which content achieved the best rates of engagement.
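To make these definitions concrete, here is a minimal sketch of how a few of these metrics could be computed from raw learning-event data. The event schema, the field names, and the 30-second bounce threshold are all my own illustrative assumptions, not the output of Google Analytics or any real learning platform:

```python
# Hypothetical event log: (learner_id, content_id, seconds_spent, completed).
# In practice these records might come from an LMS or an xAPI-style event store.
events = [
    ("ana",  "course-101", 30,  False),
    ("ana",  "video-7",    240, True),
    ("ben",  "course-101", 12,  False),
    ("cara", "course-101", 600, True),
    ("cara", "topic-3",    90,  False),
]

def learning_metrics(events, bounce_threshold=30):
    """Compute Google-Analytics-style metrics over learning events."""
    visits = len(events)
    unique_learners = len({learner for learner, _, _, _ in events})
    # Bounce: a visit shorter than the threshold (learner left quickly).
    bounces = sum(1 for _, _, secs, _ in events if secs < bounce_threshold)
    # Conversion: the learner completed the learning activity.
    completions = sum(1 for _, _, _, done in events if done)
    total_time = sum(secs for _, _, secs, _ in events)
    return {
        "unique_learners": unique_learners,
        "bounce_rate": bounces / visits,
        "conversion_rate": completions / visits,
        "avg_time_on_content": total_time / visits,
    }

metrics = learning_metrics(events)
```

With the sample data above, one visit of five falls under the bounce threshold and two of five activities are completed, so the bounce rate is 0.2 and the conversion rate 0.4. The point is not the arithmetic but that the same rollups the marketing team already reports can be produced directly from learning-event data.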
Tying Metrics to Business Results
That’s a pretty telling exercise, but how do we connect these learning and development metrics to business results? Let’s look at a very common example, product certification, which is a significant source of revenue for many companies. What if, by applying the Google Analytics metrics above to your learning content, you could:
- identify which product training content is the most popular
- identify which product feature content has the highest engagement
And once you’ve identified this content, what if you could dig deep inside that content to:
- understand which product features are most used by customers
- understand which product features may require more training support
- understand which pieces of content may require review and updates