SCORM 2004 flopped. xAPI won’t.

eLearning courses have multiple standards to choose from for sending results.

  • The latest is xAPI, released in 2012.
  • Prior to that we had four editions of SCORM 2004, starting in 2004.
  • Before that was SCORM 1.2, which dates from 2001.

And despite that, the oldest of the standards above (SCORM 1.2) still dominates. A full 75% of course developers don’t bother with SCORM 2004 at all, and of those that do, 97% use SCORM 2004 only on a technicality – using SCORM 2004 to deliver only SCORM 1.2 results.

In summary, of all the courses created last year, fewer than 1% used SCORM 2004 in the way it was intended. Everyone else only needed what SCORM 1.2 was able to achieve.

In short, SCORM course development today continues to use thinking that dates from the year George W Bush was sworn in, Apple opened its first store and the latest PCs came with Windows XP.

You might think we’d have moved on! We certainly had the chance to. SCORM has had one major version and four releases since then, but it was hamstrung by the fact that the first release (SCORM 1.2) was actually pretty good at what it did. The things SCORM 2004 allowed us to do were:

  1. Ask the course to look at the learner’s results from earlier questions and adjust the course based on their responses.
  2. Tell the course to re-order its content based on a range of (quite complex) options.
  3. Remember whether a person actually completed the course, in addition to their result.

It allowed us to determine:

  • This person completed, but did they also pass?
  • Does the learner already know the next section and should we skip it?
  • Will we permit the learner to choose their own adventure as they traverse the course?

Nice, but hardly transformative.

Not surprisingly it was met with a collective “meh” and simply never took off. So if people were not tempted to move from SCORM 1.2 to 2004, what makes xAPI any different? Why am I so confident that it will be adopted?

The answer lies in the fact that the step from SCORM 1.2 to 2004 was so subtle, while the step from SCORM to xAPI is a game changer.

In 2001, it was impossible to imagine the way we use technology today. “CBT” (Computer Based Training – now termed eLearning) was mostly used to deliver automated tests. What little teaching there was was rudimentary, though it was a convenient way to complement our 20th-century learning. The very thought that a computer might simulate things, that we might accredit anyone using results collected automatically, or that a computer would play a central role in a person’s formal training was frankly absurd.

And at one level, that’s still the case. Whether we’re measuring a pilot’s ability to fly an aircraft, whether a paramedic can resuscitate a mannequin or how frequently loaves of bread emerge burned from the oven, vast sectors of the workforce can only be adequately assessed and trained in the real world. What’s changed is that we now want computers to play a role in all of these. xAPI is permitting us to use IT to administer training and assessment that is mission critical and cannot be done without it.

It is easy to imagine xAPI becoming mandatory for professions; but considering how few professions can be taught in a purely online way, mandating SCORM is all but unimaginable.

Indeed, the US Department of Defense has mandated xAPI for its distance learning, and it is why all nurses across the US are now trained using xAPI. Last year 2.5 million xAPI statements were sent to SCORM Cloud – a doubling in five years. ADL (the creators of SCORM and the group who commissioned xAPI) have confirmed that SCORM will have no further releases. Their support is behind xAPI.

Compare the list of questions SCORM 2004 enabled (above) with those that xAPI introduces:

  • When the learner watched that video, did they actually watch it, or did they scrub or fast-forward through it? If they did watch it, where did they get the most value?
  • How safely did the person drive today?
  • When people encounter my interface, how do they typically interact with my UI?
  • After they did my training, were they better at their jobs?
  • Looking at their use of the organisation’s systems, does this person require training and if so, which course(s)?
  • What training delivers the best ROI to the organisation? How much is that ROI?
  • How did the learner fare on a course on some other LMS we can’t access?
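Behind questions like these sits the xAPI statement: a small actor–verb–object record of who did what, sent to a Learning Record Store (LRS). Here is a minimal sketch in Python; the learner’s email, video URL and the LRS endpoint in the comment are placeholders, not real addresses.

```python
import json

def build_statement(actor_email, verb_id, verb_display, activity_id, activity_name):
    """Assemble a minimal xAPI statement: actor (who), verb (did what), object (to what)."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_display}},
        "object": {
            "objectType": "Activity",
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

# Example: record that a learner experienced (part of) a training video.
stmt = build_statement(
    "jane@example.com",
    "http://adlnet.gov/expapi/verbs/experienced",
    "experienced",
    "https://example.com/videos/safety-induction",
    "Safety induction video",
)
print(json.dumps(stmt, indent=2))

# In practice the statement is POSTed to the LRS, e.g. (placeholder endpoint):
# requests.post(lrs_url + "/statements", json=stmt,
#               headers={"X-Experience-API-Version": "1.0.3"},
#               auth=(key, secret))
```

Because statements are just JSON about any activity – driving a truck, using an intranet tool, watching a video – they are not confined to what happens inside a course, which is exactly why the questions above become answerable.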

xAPI is not setting out to improve on what we have now. It’s rethinking what we do entirely. That’s why its adoption will be explosive and not at all like its predecessors’.




Trust me, I’m a trainer

A recent three-nation survey on Learning Transfer conducted by a consortium led by the University of Sussex made for some salutary reading.

The survey looked at the US, the UK and Australia and found that Australian employers were the least likely to insist on staff undertaking learning. Australians were much more likely to report “regulatory compliance” as the primary goal of training and conversely, while one in eight American respondents listed the primary goal as to “provide engaging, relevant and well received learning opportunities”, not a single Australian did.

As for outcomes, while most US respondents and almost half of UK respondents viewed training investments as “generally beneficial”, only 28% of Australians felt the same way. Most alarming of all, well over a third of Australians can’t say whether the training offered to them improves their job performance at all, and are left wondering why they are made to do it in the first place.

Australians also reported that training is delivered mainly to comply with external regulations rather than to improve their performance. Even management questions its value, to the point that Australian management is the least likely to back training through policy.

Given those findings, it’s little surprise that Australians are far less enamoured of their trainers. While L&D staff in the UK and US are trusted to deliver effective learning experiences by 72% and 60% respectively, a mere 38% of Australians have that view.

If that does not point to a trust problem, I don’t know what does…

The benefits of building that trust are manifest; on average, employers outside Australia were prepared to devote $1,930 per staff member to training last year, compared to $735 in Australia. With a third of the resources available to deliver outcomes, it’s small wonder L&D are not able to build the trust they can elsewhere. It’s a catch-22.


The good news is that the same survey tells us this is much less the case elsewhere and, by implication, that it does not have to be this way. But the job is ours; no one else is going to break the cycle, so the task falls to us trainers.

We must work to change the perception of our work before we’ll get more resources with which to do it.

Impossible? – No. Just ask the British and Americans…

Regardless of where on the cycle you’re standing, the exit ramp is trust.

Fortunately, building trust doesn’t have to involve dollars. Time and effort maybe, but not budget initially. It starts with introspection; for trainees and management to better trust L&D to deliver real outcomes for them personally, L&D needs to be prepared to question what it does now:

  • Does your training prioritise the needs of the trainee, or the needs of external regulators?
  • Are the courses you deliver crafted to maximise results or are they the least expensive/most convenient third party offerings?
  • How often do you measure the knowledge gaps in your organisation?
  • Does the pace of change to your courses match the pace of change in your industry?
  • Do you undertake effective measurement of outcomes and continually improve the material in response?
  • Is the delivery mode driven by knowledge of the most effective format for the situation or is it driven by what is possible?
  • In the shoes of your trainee, how eagerly would you participate and why?
  • When was the last time you asked trainees and management for their goals and in particular for their training priorities?

I do get it – you have always wished you could do more of this but it’s not possible with the resources you have. That’s not a reason not to ask these questions. If others are expected to undergo a needs analysis, why not L&D? Am I right?

Trust me, solutions will emerge; for example:

If your original objectives no longer align with management’s priorities but recreating everything is absurd, look through the material to find what can be adjusted and drop anything that no longer serves the organisation’s objectives. Use the time you’d otherwise spend delivering the rest to survey people at all levels for their goals, identify the single most important unmet need and develop one item only that addresses it. Build trust by involving those who called for it to the extent you possibly can and, in so doing, demonstrate your commitment to fully meeting their needs.

If your material’s badly dated but you don’t have the resources to update it properly, why not learn to make video on a shoestring so you can interview some of your awesome in-house experts and turn it into microlearning? Whenever you deliver it, the recognition upon which it’s based will build trust.

If you are delivering primarily compliance training, it’s natural for people to see your work as an extension of a regulator, rather than as aimed at maximising their potential. No matter how vital that compliance training is, it is aimed at addressing problems you hope will never arise. Inevitably it is perceived (wrongly) as pointless unless it’s carefully contextualised by building more practical material alongside it. If you don’t have the resources to offer more structured training in areas that your team does value, offer unstructured options instead.

If your training is dry and boring but you don’t have the ability to redevelop it the way you’d like, read this article on crowdsourcing your engagement. Build on that over time until one day the case for it is so solid that the business case for a full Learning Experience Platform is self-evident.

Over to you….


Learning Cultures – an accountant’s view

Deloitte Global’s 2019 survey of Human Capital Trends was pretty stark – 86% of respondents believe that in the face of rapid change in the workplace, they must reinvent their ability to learn.

However, only 50% of those respondents reported that their L&D department was evolving rapidly, and 14% said it was evolving too slowly. Just 1 in 9 respondents rated their learning culture as excellent and 43% said it was “good”. Almost half were of the view that their workplace training culture was inadequate. Add to this the finding in an earlier edition that almost half of millennials were planning to resign due to inadequate training, and a clear concern for employers emerges.

Deloitte’s view is that organisations must create a learning culture that is broad in scope:

In a competitive external talent market, learning is vital to an organisation’s ability to obtain needed skills. But to achieve the goal of lifelong learning, it must be embedded into not only the flow of work but the flow of life.

That certainly sounds like an ambitious goal, but in fact, as I’ve written previously, it’s often surprising just how much impact you can get with relatively minor alterations or merely by elevating practices you undertake already.

Beyond that there are emerging technological solutions such as LXPs that are designed to create learning cultures (rather than merely deliver training). Meanwhile xAPI is helping organisations coordinate multiple forms of training experience throughout their workplaces – another key to creating a learning culture (and it is often free to do so).

Deloitte coined the term “DevWork” to highlight this trend of combining learning with work. The term is derived from DevOps, which refers to the way IT teams are now extending their work beyond development and into operations.

Another clue comes from Deloitte’s finding that there is a rapidly growing consensus that people want to be trained not only by L&D but also by the business itself, suggesting a desire for a more holistic approach to learning.

So, what might an organisation do to achieve this DevWork?

It’s actually not that hard – a learning culture is something that grows naturally if you nurture it. The answer I’ll offer is not very different from what I suggested when I was discussing 70:20:10 (which, after all, is a closely related concept).


Train in context:

  • Use technology such as LXPs to encourage peer sharing
  • Look at options for social learning
  • Encourage knowledge sharing and ensure that ideas are captured, shared and above all recognised
  • Build mentoring relationships through a formalised mentoring program
  • Empower (with resources) and encourage supervisors to take a more proactive training role
  • Create a repository of microlearning (perhaps crowdsourced from your workplace SMEs) and use xAPI to connect it to the points of need in the workplace


Train often:

  • Augmented and virtual reality are actually not expensive options these days – you can achieve more than you think with the tools you have to hand and a few tips (a topic for another edition).
  • If you have an xAPI system in place, connect it to the tools around your workplace to check for understanding on the job and trigger microlearning interventions when people use those tools
  • Develop a plan to ensure that training becomes ubiquitous – people receive training in small doses throughout the day until it becomes “the norm” to be learning.
  • Look at ways to better use smartphones for instantaneous training.


But there had to be a catch, right?

If you really want to nurture a learning culture, show you mean it. As an employer I always make training a core part of performance appraisals. Training and learning goals are regarded as KPIs. We measure their attainment and we regard them as no more or less important than any other set of agreed objectives. An employer with a genuine learning culture should be prepared to link performance incentives to training, for example.

As with many things, money is the real test of your level of resolve. If your organisation is genuine in its commitment it surely sees this as a core business driver that flows to its bottom line. It will incentivise people for their contributions and it will see well directed training expenditure as an investment rather than a cost.

I’ve dealt with organisations that use the language of “learning culture” on one hand and “training cost” on the other. Sorry, but these are incompatible, and unfortunately your staff know it. If they offer to devote their time to a worthwhile training event and all they hear about is the “budget approval process”, all your good work is lost.

Perhaps, then a prudent next step is to seek an accountant’s assessment of these strong recommendations Deloitte makes…

Oh… wait…

Design Thinking (and a puzzle)

A lot has been said about design thinking recently, and a lot of it’s really, really technical. That strikes me as ironic when design thinking is supposed to teach us to do the opposite of being geeks!

Like all fashionable terms, it’s nothing new. Like so many other exciting ideas, you will know you have got it when you stare at the screen and think something along the lines of ‘duhhh’.

It boils down to teaching in a way that starts and ends with the audience; we begin by empathising with them, we follow a process to determine how best to teach them, then we talk to them to find out how we can do better. Sure, there are things to think about and a process along the way, but much of that is best understood by taking a look at the tools that are available, such as The Learning Ecosystem Canvas, rather than wading through long (empathy-free) articles that essentially say the same thing.

Pause that thought…

I want to give you a puzzle. If you think you can solve it, great. Otherwise, feel free to skim over it.

According to Monty Python, it is part of King Arthur’s kingly duties to know off-hand the speed of a European swallow, both in the case where it has to carry a coconut and where it is unladen. The speed has been much debated online and a figure of 11 metres a second is now accepted.

So, our unladen swallow (let’s call her Jane) can cover 11 metres every second. If she flies a kilometre (one thousand metres), it will take her 1000/11 seconds or about 91 seconds.

Now, let’s suppose King Arthur releases Jane and starts walking towards Camelot. He is pretty fit from all those battles, but has some pretty heavy armour on, so let’s say he can stride along at 5 km an hour. Camelot is a kilometre away, so Jane gets there in just 91 seconds. She turns around and flies back to meet Arthur, who is now part way towards Camelot, but as soon as she reaches him, she turns back and flies to Camelot again.

Without a pause, trusty Jane turns back and flies to Arthur to make sure he is okay, and finding that he is, that he is now much closer but still walking at 5km/h, she returns to Camelot.

Jane repeatedly flies between the ever approaching Arthur and Camelot without pausing, and flying directly between the two.

How far does Jane fly?

A spoiler will be presented below, so if you want to try it, I would encourage you to have a go at solving this before reading on. If you do, I would be interested to read your solution (before you get to the spoiler that is!)

What does this have to do with design thinking I hear you ask?

The puzzle above is one that some of you may be able to solve instantly. Others will need to get out a pen and paper.

That’s because I used anti-design thinking™ … hey! I now have a catch-term of my own!

Let’s look at this same problem in a design thinker’s way… To do that, we must start with empathy.

Who are my learners?

  • Well, I know that readers of this blog come in many shapes and sizes.
  • Most are working in an area linked in some way to training.
  • Some will have done maths, many will not have done it for some time.
  • Some will get the Monty Python jokes, some will not.

It is safe to assume that the way I presented this puzzle will alienate people.

Fun as the Monty Python reference is for some, it is not relevant and simply adds complexity for others. Why add all that additional wording to a maths problem?

Don’t get me wrong, linking learning points to learner experience is often valuable, but not for its own sake!

With this case study, I can hardly claim to have been attempting to cast the situation as real world to allow me to train someone to deal with a situation they might need to deal with. Adding humour can aid recollection, but the only humour here is for that portion of my audience who is in on my Monty Python joke. Worse, the humour for them is in reliving the experience of seeing that film – my story is not in the least funny. If anything, the humour here will cause people to think about the film, not my puzzle and actually be a distraction from effective learning.

So in summary, I ignored my audience persona and embellished a problem for no gain whatsoever; I showed no empathy to you, dear reader!

What do they bring to the lesson?

Our learners will always learn in a way that has context. What they have seen, heard or done in the run-up to the lesson makes a world of difference, and it’s our job to consider what that might be. Equally, it is our job to frame up the material accordingly.

In this example, I created context that was guaranteed to get a lot of users onto the wrong track…

So, our unladen swallow (let’s call her Jane) can cover 11 metres every second. If she flies a thousand metres, it will take her 1000/11 seconds or about 91 seconds.

I tempted you to start thinking about a particular type of calculation… the time it takes for a bird to fly a particular distance, and relate it to the distance of one of the bird’s laps.

Not surprisingly, a number of you may have started with that first 1km flight and 91 seconds, then the time and distance back to meet Arthur, and gone on to work out more lap distances in ever more challenging calculations.

What if I had said this at the outset instead?

The bird can cover 11 metres every second. That means that if she flies for ten minutes (or 600 seconds) she will have flown 11 x 600 metres = 6600 metres.

I would have encouraged very different thinking. With this as our ‘previous experience’ we are led to think about how far the bird flies in a particular time. Immediately after that we hear about Arthur and his 1km walk at 5km an hour.

Perhaps some of you are now seeing a different approach to the problem?

More empathetic treatment 

Here is another puzzle…

A bird can fly 11 metres every second. That means that if she flies for ten minutes (or 600 seconds) she will have flown 11 x 600 metres = 6600 metres.

If her keeper releases her 1km from home and starts walking directly home at the same instant, and if the keeper walks at 5km an hour the bird will fly back and forth between him and home for the time it takes for the keeper to arrive home.

How far does the bird fly?

That’s a fair bit easier! Yet all I have done is change the wording a bit to be a little more empathetic. I have eliminated some red herrings and I have chosen a frame up sentence that will cause readers to approach the same problem differently.

In the classroom, that frame up will have come from experiences over which we have no control, such as the work environment, the policies of the organisation, traditions, practices, culture, personalities etc. Yet your choice of frame up governs how your participants apply it. They bring existing lines of thinking to your classroom that might seem applicable to them but might not represent the line of reasoning you expect, so the challenge is to anticipate and work with or adjust those lines of thinking.

As we see from my example, one’s preparation might alter dramatically what people learn, cause simple ideas to become complex and have people give up altogether (hands up those who gave up on the first version of this problem…).

Nobody should design training unless they know their audience, understand their context, and have gone further to use empathy to anticipate the practiced lines of thinking that might appear applicable to the challenges they pose.

Are we there yet?

Okay so I suspect more readers (but still not all) have a distance for the bird’s flight.

… but there has to be a ‘but’ right?

Earlier I said that design thinking does not only start with the audience; it must also end with it. Our job is to assess whether we have successfully taught people in the most effective way and then to iterate our design. (That’s why I would love your feedback.)

But for now let me start by saying that the best solution to the bird problem is not to work out how long it takes to walk a kilometre at 5 km/h (12 minutes) then ask how far a bird flies at 11 m/s over 12 minutes (720 seconds).

Sure, it gives the right answer (11 x 720 = 7920m), but there is a vastly better approach.
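The arithmetic behind that adequate-but-not-best answer can be checked in a few throwaway lines of Python (units in metres and seconds):

```python
# Arthur walks 1 km at 5 km/h; Jane flies at 11 m/s for exactly that long.
walk_speed_ms = 5 * 1000 / 3600      # 5 km/h in metres per second (~1.39)
walk_time_s = 1000 / walk_speed_ms   # time to walk 1 km (~720 s, i.e. 12 minutes)
bird_distance_m = 11 * walk_time_s   # distance = speed x time (~7920 m)

print(round(walk_time_s), "seconds;", round(bird_distance_m), "metres")
```

Notice that nothing here depends on the lengths of Jane’s individual laps; only the total flying time matters.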

This puzzle is still framed so as to prepare you to think in a way that overcomplicates it.

So here is my puzzle to you. See if you can see a better line of reasoning for the bird problem and send me your solution. When you have found the best approach, look at my wording to decode how it leads you away from the simple solution towards the less simple one. Knowing that, how should this puzzle be framed?

This is precisely what an instructional designer must do when creating learning empathetically.

Solution next week…

P.S. a really thoughtful quote on design thinkers from Don Norman :

“…the more I pondered the nature of design and reflected on my recent encounters with engineers, business people and others who blindly solved the problems they thought they were facing without question or further study, I realized that these people could benefit from a good dose of design thinking. Designers have developed a number of techniques to avoid being captured by too facile a solution. They take the original problem as a suggestion, not as a final statement, then think broadly about what the real issues underlying this might really be (for example by using the “Five Whys” approach to get at root causes). Most important of all, is that the process is iterative and expansive. Designers resist the temptation to jump immediately to a solution to the stated problem. Instead, they first spend time determining what the basic, fundamental (root) issue is that needs to be addressed. They don’t try to search for a solution until they have determined the real problem, and even then, instead of solving that problem, they stop to consider a wide range of potential solutions. Only then will they finally converge upon their proposal. This process is called “Design Thinking.”

Engagement – a case study (with tricks)

About fifteen years ago I was asked to help redesign a CPD program.

As you always should, we started by getting to know our audience, and we learned as much as we could about the profession and its workers. One thing that was clear was that the people involved were good. I mean really good. The level of knowledge and experience they possessed was clearly going to be both our best asset and our greatest challenge.

Lecturing to those with practical experience is a mug’s game. It’s not merely pointless but counterproductive when the audience knows your material’s assumptions and limitations. It’s those little simplifications we have to make in a theoretical context that lead those who have worked in the “real world” to offer wry smiles and then dismiss the program as not credible.

On the other hand, experts well…. um… do tend to like the sound of their own voice…

did I just say that???

Sorry I meant to say that they are eager to act as mentors. Including to other equally experienced experts. (My turn to wear a wry smile…)

More seriously… what an asset! How about not having to make all those little simplifications? How about drawing on practical experiences and applications from the audience to turn black and white into shades of grey? After all – our audience had finished with the black and white after their graduation ceremonies… when they began to learn what no teacher can offer a class.

One thing I learned is not to fight an adult audience and instead to harness them.

Building interactivity into the eLearning we created made it more rewarding and more valuable at the same time. It required us to concede we were not the smartest people in every room as regards the topic, and instead to be the smartest learning environment creators in the room (our job, after all!)

Multiple choice – the enemy of engagement

We never asked a question with a right and a wrong. We asked better questions with multiple valid answers depending on circumstances. These became decision points in a branching scenario where people could form their own conclusions as to what was the best decision.

These were interspersed with reflective questions… “In the following situation, what would you recommend?” is an invitation to offer wisdom. We would wait for a considered submission and then take it seriously in one of the following ways:

  1. asking the learner to compare what they had offered to something presented as “here is our response to the same situation. Please provide your thinking as regards the differences”
  2. submitting the response for review by peers – we took the module response and injected it into a discussion forum for peers
  3. modifying the comparison approach (1 above) to inject automatically into the module the last 3 responses given by others to the question, or
  4. inviting the respondent to categorise their approach into one of a set of options, then having the LMS keep track of the proportion of respondents who chose each category so that it could inject a real-time graph of the distribution of colleagues’ thinking.
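The fourth approach is straightforward to mechanise. Here is a minimal Python sketch of the tallying logic, assuming an in-memory store and invented category names; a real LMS would persist responses and render the graph:

```python
from collections import Counter

class ResponseTally:
    """Track which category each respondent chose and report the live distribution."""

    def __init__(self, categories):
        self.categories = list(categories)
        self.counts = Counter()

    def record(self, category):
        """Register one respondent's chosen category."""
        if category not in self.categories:
            raise ValueError(f"unknown category: {category}")
        self.counts[category] += 1

    def distribution(self):
        """Return each category's share of responses so far (0.0 to 1.0)."""
        total = sum(self.counts.values()) or 1
        return {c: self.counts[c] / total for c in self.categories}

# Hypothetical categories for a reflective question.
tally = ResponseTally(["escalate", "negotiate", "document and wait"])
for choice in ["negotiate", "negotiate", "escalate", "document and wait"]:
    tally.record(choice)
print(tally.distribution())
# {'escalate': 0.25, 'negotiate': 0.5, 'document and wait': 0.25}
```

Each new submission nudges the proportions, which is what made the type-4 graphs drift visibly as the sector’s thinking evolved.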

All this was live and automated. Better yet, it created new content – the responses were used to improve our modules and add more insightful text. The modules themselves served to inform us when we were making assumptions or exhibiting unconscious bias (our audience loved to pick us up!)

It was always fascinating to look at the way the overall population’s viewpoint evolved over time – the graphs from type 4 above would slowly change as new ideas were established in the sector and this in turn allowed us to update modules to inform our audience about changes to prevailing ideas.

Taking it forth… a community

We did not view our duty as ceasing on the last page of a module. It was critical that any momentum we’d built (often exhibited in the form of professional debate) continue, but that required us to consider how to build a sense of community online.

Clearly the obvious approach involves social media – discussion boards and the like – and these proved valuable, especially when the discussions were guided by a respected expert and built around a scenario.

Another really successful approach was inviting the former participants back to help us review and comment on the in-module offering of those who came after them.

A third approach was to draw on the collected contributions and use the ideas from our learners’ reflective responses to create new scenarios around the same topic – often shorter – and build them into new microlearning items to generate yet more responses.

Our approach was so successful that an entire sector’s training was transformed – you can still hear the differences in how its workers discuss their ideas to this day. Meanwhile, their peak body moved from a century-old membership-based funding model to a training-based funding model, with one of the country’s most profitable training subsidiaries, within eighteen months.

I remain pretty proud of that…

Training videos on a shoestring

Yesterday I had coffee with a fellow instructional designer. She spoke about how videos are perfect for some of the things she’s tasked with teaching before adding “… of course I don’t have the budget to do that”. It was the second time I’d heard exactly that line in the last week alone.

My company started life as a TV production crew. We hold many awards for video and while it was certainly true once that with video you could have quality or economy – not both – it equally surely is not true now. When GVM started life we bought three massive editing computers and spent half a million dollars; today we do far more on low end laptops.

We used to spend tens of thousands on fancy cameras and a great deal on tapes. It would cost a few thousand for an autocue, complete with a specialist to drive it. Many videos now are shot on a smartphone on a tripod. There are no tapes required, and the autocue has become an old iPad with a free app. Video production has become so easy for our clients to do that our role now is often simply to point them in the right direction and let them go.

So long as you have the basics right – your camera’s mounted (please! no queasy-cam!), your scene’s well lit and the audio is crisp (and no… that doesn’t have to be expensive either) – a few basic tips are all you’ll need for a DIY video that passes the watercooler test, so I thought I’d share a few…


Know what you want

We’ve all seen those painful internally-produced videos that just seem to try hard for authenticity but fail. They look amateurish with overacting, poor editing and ultimately the message is lost.

I’ll let you into a little secret – the difference between that and a professional job is often pretty small.

More often than not, the amateur director simply forgets what they are trying to achieve. Once we get a camera in our hands we forget everything we know about messaging and about why we’re filming and are led by the medium.

A professional is simply clearer about what they are doing, why they are doing it and above all what they are not doing… and they adhere to it.

My advice is to start with what you know – in fact, an instructional designer is not unlike a scriptwriter in many ways. Before you do anything else, complete the following:

My audience comprises members of the _____________ (single) persona. I need them to know ___________ so that they can __________ (single goal). I will do this by showing them ___________ because our analysis shows that other approaches don’t work well due to __________.

Use a minimalist approach from there. Nothing you shoot should be unscripted, and everything you script should be germane to the goal you have set above.


Know the medium

In the last section I suggested restricting video to situations where “other approaches don’t work well”. That might sound like I am advocating video only as a last resort. In fact my point is different: if you already have a good solution, use it. Like any medium, video should be used where it is the best approach. Those cringeworthy efforts we sometimes see are typically produced for their own sake – “Have camera, will shoot”.

A professional director would never commence work before knowing your alternative approaches. They go far beyond “what are we saying” to understand “and why do you want me to be the one to say it?” Armed with that, they add nuance. They study YouTube to find others who have grappled with the same issues. They think about how casting and setting decisions can deliver the messages that have proved most challenging to convey. They remain focussed on the challenges, not the decoration.

Never forget that you do have options. Just because your teaching point has not done well in a classroom, an animation or an eLearning course does not mean video is your only option. The options available to us are expanding all the time. (I’d like to offer a shout-out to Jennifer Gallegos and Matt Sparks here – they recently wrote a terrific article on the use of immersive audio – a form of VR – and talked about its capacity to draw the listener’s attention to key points using sound alone.)


Less is More

Your viewer has a superpower – the “scrub forward” slider.

You have a single defence – relevance. Never attempt to make more than a single significant point in a video before allowing the person to stop and reflect. You have reserved a space in a conversation – don’t get greedy. For the same reason, the moment your video runs over 90 seconds it’s doomed to be fast-forwarded.

Beyond adding carefully considered dot points, I would never use effects in a training video – for something that takes a lot of effort, you are almost certainly going to add something that you will love and others will smirk at. Again, the issue is relevance. Remember that a director has a budget too – they add effects because they need them, and only then. Your audience has been brought up expecting that an effect is there for a particularly important reason. If it’s purely aesthetic, it simply cuts against our intuitive grain.


Don’t tell, show

If you have chosen video as your medium, presumably your message isn’t well conveyed through telling. You chose video because you want to show something, so do that.

In film and TV, the term used to describe a situation where characters explain something to us is “exposition”. Writing these scenes is one of the most challenging tasks for any scriptwriter.

You know the one… the scene at the end of every B-grade detective show where the hero responds to that question “there’s just one thing I don’t get…”. I’m cringing just thinking about it! Unfortunately, unless you are a gifted screenwriter, your expositions will always look forced. If you need exposition, you probably shouldn’t have chosen this medium.

Show us! The best videos work with the sound off. That’s a great test and in fact, you may find that’s how they are played at times. After all, in an office setting it’s often impractical to turn sound on.

Finally, always ensure your scenes have motion in them – people moving naturally and providing an example of the point you’re making (rather than glued to a spot talking about it).



The economic case for knowledge sharing

After my post on 70:20:10 at the end of last year, I took a bit of a break from posting to read more research in this area, and I was struck by the findings of a number of studies showing just how crucial it is to create knowledge sharing cultures. A consensus is emerging that knowledge sharing is linked not only to business survival, but to the prospects for Australia as a whole. Our government’s economic research stresses that while innovation is vital throughout the OECD now more than ever, Australia’s circumstances make innovation our number one priority.


The decline of productivity growth in Australia


Australia’s productivity growth has declined steadily since 2012, and it is not alone. That said, according to Australia’s Treasury analysis, the issues giving rise to this problem are domestic:

“Australia’s recent productivity slowdown is not an isolated occurrence. Developed economies have experienced slowing productivity growth in recent decades, particularly during the 2000s. […] Australia’s recent productivity performance appears to be driven by domestic factors rather than factors common to developed economies.” (Australian Treasury)

The report points out that Australia has several specific challenges:

  1. Ours is an economy in transition from a mining and agricultural one to a more diverse economy, and
  2. Australia does not have the capacity to generate the level of new product innovation it would need to arrest the productivity decline on its own.

The second point underscores the degree to which Australia’s growth is welded to the level of innovation we can import from countries such as the USA.

Worryingly, the same research goes on to point to a general worldwide slowdown in innovation. For example, Gordon’s provocatively titled paper Is U.S. Economic Growth Over? Faltering Innovation Confronts the Six Headwinds concluded that the factors which saw product innovation underpin our growth for the last century are gone forever.


All is not lost

Despite this, the latest Australian Government Innovation Activity Survey shows clearly that the proportion of businesses here defined as “innovation-active” is actually rising and now stands at almost 50%. Even with our limitations, the good news is that Australian businesses now have new forms of innovation available to them beyond traditional pure R&D, and the Australian Government points to a trend for Australian businesses to innovate in new areas.

“… business innovation encompasses much more than these measures. Australian firms undertaking innovation are more likely to invest in purchasing new equipment, training and marketing than investment in R&D or acquiring intellectual property.” (Australian Treasury)


The number one barrier to becoming an innovative organisation

The same survey polled businesses to determine the greatest barriers to innovation, and this is where it all links back to training.

The number one barrier to further innovation was a lack of skills. By comparison, the cost of innovating was a barrier for fewer than half as many businesses.

More than a quarter of innovation-active businesses and almost a fifth of all businesses pointed to skills shortages as a concern, and it was also the number one barrier for those businesses that were not actively innovating.


The solution

The largest Australian survey relating to innovation, staff performance and skills was undertaken as part of a Ph.D. thesis for RMIT University, Melbourne by Peter Chomley, who reported that his research “confirmed the relationship between the dimensions of Knowledge Sharing Behavior and Workplace Innovation” – in other words, that there is a direct link between the use of knowledge sharing in the workplace as a means of skills development and innovation outcomes.

“Knowledge sharing links individual, team and organization by sharing knowledge and expertise from an individual to an organizational level where it is converted to competitive value and advantage for the organization.”

It is here that 70:20:10 becomes central to this picture. In fact, Chomley’s research prioritises a number of the same ideas I pointed to in my last article on that topic:

“the behaviors investigated in this thesis were not constrained to organization specific knowledge and include the sharing of knowledge gained outside the organization of employment, either through continued education, participation in professional bodies/Communities of Practice (CoP) or Communities of Interest (CoI), or in boundary spanning activities.”

It makes sense. While it’s obvious why creating a new product will lead to business outcomes, we often forget just how much we can gain from less visible innovation. For example, simply creating a knowledge sharing culture has been shown to foster innovation.

I wonder how often your organisation repeats its mistakes? How often has one project team found an error and addressed it, but without documenting or sharing their findings so that a second team repeats the same error completely unaware of the solution? Innovation will happen naturally if you let it (in this case the innovation is the rectification of a fault). If only the organisation had established a culture that saw it captured as an asset!

The challenge is to prepare the ground that turns individual innovation into team innovation and team innovation into organisational innovation.

Conversely, the absence of a knowledge sharing culture not only prevents the organisation capitalising on innovation but stifles innovation itself by conveying a message that it is undervalued.

So… how good is knowledge sharing and 70:20:10 generally?

  • 70:20:10 is the natural way humans learn – and we are being told by businesses that if only their staff had greater skills they would be able to innovate.
  • Meanwhile that very same culture is itself fertile ground in which innovations germinate and grow.
  • Best of all, as I’ve previously pointed out, 70:20:10 is easier and cheaper to do than you probably realise.

It’s a bit of a no-brainer really, but it will not happen by itself. 70:20:10 environments don’t just happen – they are fostered by supervisors and by training coordinators. Chomley’s work highlights this point too:

“From the managerial perspective, the managers and chief knowledge officers may create a knowledge-friendly environment through the encouragement and facilitation of teamwork, communities of practice, personal networks, strong and weak ties, and boundary-spanning.”

and finally he reiterates the key role that policy makers must play:

“Management policy is informed by these findings and can be further developed to encourage a sharing, learning and innovative growth mindset”