xAPI in the real world part 2

Last week I gave some practical examples of xAPI in use. It was really well received so I thought I’d offer some more this week. This one is by GVM’s xAPI collaborators and is timely – it’s launching this week to a national audience across the USA. (Go team!)

The USA’s Department of Health chose our collaborators and their tool, the Rustici Engine – the same engine we work with here, and a pretty good example of why we chose Rustici.

Side note: the Rustici Engine will shortly be embedded in our flagship LMS as both products are now owned by the same company and are part of the same suite – PeopleFluent (NetDimensions) LMS.

The scale and ambition of the new system is vast – it will be a national initiative of America’s Federal Department of Health, which has a critical need to reduce the perennial problem of staff turnover in the sector. At the moment the turnover rate for graduate nurses is a whopping 35%, with a direct cost of almost AUD $100,000 each time a nurse leaves the sector. It was described as an “epidemic” by Marshall University. That is set to change.

The entire initiative is (in its own words) “Not in the LMS Business”. The initiative is in the diversified content and analytics business. With quality data and multiple training options, the LMS can be smarter. It becomes part of a much bigger picture and is allowed to be used for what it does best.

… and you can get a sneak preview here 

xAPI’s genius is the ability to relate training to “doing” – to the coal face. We can measure people’s real-world skills and how they learn on the job, and where we deliver formal training, we can measure the outcomes directly! In a sector such as health this is critical. Not only that, it’s instantly engaging! Professionals dislike programs that focus exclusively on what they don’t know and simply teach, when they could also demonstrate what they do know and build on that (as well as learning formally). This is the secret to the USA’s new and exciting initiative, Transition to Practice.

In America, the national competency framework is owned by the health sector itself and until now has been linked only to formal training. As of now, that is set to change.

xAPI allows any learning experience or activity to be related back to the framework and, when combined with good analysis, experiences will be proactive and personalised. In making learning vastly more relevant and interesting, by creating a holistic learning culture, by using technologies to find out how best to support each person, the USA will finally drive the heavy cost of staff turnover down.

Q) So what is Learning Analytics and why is PeopleFluent doing so much in this space?

A) Measuring, collecting and analysing information about learners (and context) so that an organisation can:

  1. help them engage in their own learning (metacognition)
  2. help to optimise learning products for the way they are found to be used
  3. help develop and retain talent

xAPI will allow the Department to engage with practitioners from the time they leave college – helping them in a range of ways to continue developing, monitoring and acting on results, and mapping every experience to nationally accredited competencies.

The model

  • The system will provide course authors with a list of standardised and verified xAPI profiles to choose from when they design an experience.
  • The profile will then ensure the appropriate information is captured in a repeatable way to create meaningful learning experiences.
  • As learners undertake the experiences, the system will generate insightful visualisations of the xAPI data informed by the profile.
  • Armed with this, the platform will help authors continuously improve training and provide learners with the first holistic view of their progress – not just on exams, but in the field. Not just on clinical knowledge but on soft skills competencies.

So what are we capturing? Well, before I tell you that, I want to remind you what we are capturing it from. At first glance you will think “My LMS does that now!” – yes, but not when someone’s not logged in to the LMS it doesn’t! This is data collected while the person works with a patient, uses the VR system, reads a document on your intranet, operates equipment or meets with a supervisor too!

So imagine capturing the following in those contexts…

Phase 1:

  • States of Learning Progress (assigned, started, completed)
  • Score
  • Completion Date

Phase 2:

  • Events and interactions within learning experiences – what was clicked, what objects are present in a VR experience, etc.
  • Time spent in a lesson, section, activity, video, etc.
  • Learning gaps, individually and by group, based on results from individual questions
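To make that concrete, here is a minimal sketch of what one of those Phase 1 captures might look like as an xAPI statement. The actor/verb/object/result shape comes from the xAPI specification (and the verb IRI is a standard ADL one), but the helper function, email address and activity URL are invented for illustration – the Department’s actual profiles would define their own vocabulary.

```python
import json
import uuid
from datetime import datetime, timezone

def make_statement(actor_email, verb_id, verb_display, activity_id, score_scaled=None):
    """Build a minimal xAPI statement dict (hypothetical helper)."""
    stmt = {
        "id": str(uuid.uuid4()),
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_display}},
        "object": {"objectType": "Activity", "id": activity_id},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    if score_scaled is not None:
        # xAPI carries scores in 'result'; 'scaled' ranges from -1 to 1
        stmt["result"] = {"score": {"scaled": score_scaled}, "completion": True}
    return stmt

# Phase 1 style data: a completion with a score and (via timestamp) a completion date
completed = make_statement(
    "nurse@example.org",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "https://example.org/courses/patient-handover",
    score_scaled=0.85,
)
print(json.dumps(completed, indent=2))
```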

So now… I’m really keen to hear from you all – what could you do with that kind of information to solve a real problem in your workplace?

xAPI in the real world

This week I was asked why anyone would use xAPI.

I’d just finished explaining that:

  • xAPI statements aren’t generated by any old workplace tool – it has to be one that you can program and that is online
  • You can’t easily use it to measure human to human interactions without significant effort
  • It’s best applied to the coalface staff – the work performed at the executive level and even management level is often best monitored in other ways
  • While an LRS could be used to trigger actions automatically, the main intent is to create data and reports that are then subjected to analysis

At which point I’d lost my practically minded colleague in a fog of esoteric data analytics. In part for his benefit, not to mention that of my broader audience, I thought I’d better provide a case study or two showing that analysing people’s performance against training can have a real, practical impact on business.

Let’s start with Watershed – the reference LRS. Developed by Rustici (the people who created xAPI in the first place), it’s probably the most sophisticated xAPI analytics tool around. I am always fascinated at some of the applications that ingenious businesses have found for the data it can produce.

One such business is the fast food franchise KFC who were keen to learn whether employee training impacted store performance as quantified by an internal system called MERIT.

The question posed was: can we say that stores whose learners access materials more often score better in the MERIT system, and if so, which of MERIT’s statistics are best addressed through training?

Armed with the answers, KFC is now positioned to focus its efforts on training that targets the areas generating the best ROI, and to address specific shortcomings in performance at particular outlets cheaply using existing resources – knowing when this approach is the best one and when another is needed.

Simple, but then the best ideas often are…

Meanwhile, like all large organisations AT&T has both a financial and ethical obligation to minimise compliance risks. As part of an internal review, it explored options involving a video platform, an intranet, assessments, surveys and a sophisticated simulation tool. As any teacher knows, this approach is best practice – a range of learning options is far more likely to be successful than a single approach alone. Quite apart from differences in individual learning styles, our brains are wired to learn when related information is received in multiple ways.

Yet in a practical sense, it’s unconventional and the main problem is that the analyses these different approaches generate are equally disparate. AT&T wanted to know whether they were going to be forced to choose, or could they harness them all?

Is it possible that results from entirely different mechanisms might be collected into one place and compared in a scalable and cost-effective way to satisfy an auditor?

The easy option would be to turn all of these experiences into conventional eLearning and launch them from an LMS, but the company wanted to avoid adding all of this load to an already heavily loaded LMS, to avoid forcing people to log into a system before commencing, and to allow them to learn “on the go” despite owning an LMS that is not mobile friendly.

Until recently, the only solution would have been to force all of these training modes into the LMS, but xAPI does not require an LMS. It provides creative alternatives.

All of these different systems can use xAPI to send tracking data yet still operate as before. The portal does not need to be “moved into” the LMS – it can work as designed, but with xAPI added, it can also track people. A tool that has been carefully crafted for mobile compatibility does not need to compromise and run from a desktop-only LMS, but again xAPI lets it collect data.

Before xAPI, tracking learners involved moving things (to an LMS). xAPI doesn’t care where something launches.
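For the curious, “sending tracking” in xAPI terms is just an authenticated HTTP POST of a JSON statement to the LRS’s statements endpoint – which is why any connected system can do it, whether or not an LMS launched it. Here is a sketch with a placeholder endpoint and credentials; the request is built but deliberately not sent.

```python
import base64
import json
import urllib.request

LRS_ENDPOINT = "https://lrs.example.org/xapi/statements"  # placeholder, not a real LRS

statement = {
    "actor": {"mbox": "mailto:driver@example.org"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced",
             "display": {"en-US": "experienced"}},
    "object": {"id": "https://intranet.example.org/docs/safety-briefing"},
}

# Standard xAPI transport: HTTP POST with Basic auth and a version header
auth = base64.b64encode(b"key:secret").decode("ascii")  # placeholder credentials
req = urllib.request.Request(
    LRS_ENDPOINT,
    data=json.dumps(statement).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "X-Experience-API-Version": "1.0.3",
        "Authorization": f"Basic {auth}",
    },
    method="POST",
)
# urllib.request.urlopen(req) would actually send it; omitted here on purpose
```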

AT&T intends to retain its LMS because some things run well from it. But now the company has other options open to it. People can learn when they run simulation tools, when they use mobile devices, when they read the staff intranet – in short they can learn their own way.

As a business owner, I find the work done by KFC – proving that training can be deployed in a way that is known to generate a return – exciting. As a teacher, I’m inspired by the way AT&T showed that creative teaching and accountability are not inconsistent. Above all I am excited at the brave new xAPI world we’re entering.

 

Are you getting typecast?

I know I am not alone in that I take great care when I hire staff; I conduct multiple interviews, set practical tasks and sometimes undertake personality profiling. I will bring my best ‘people people’ to observe each candidate closely.

Like most employers, I have my favourite questions and also like most, they apply irrespective of the job title. Most are not aimed at eliciting role-specific or technical skills, but rather to examine character traits that are vital to a person’s future success in the role.

Asking people to recall their best and worst supervisors is a favourite of mine; I love to hear their view of how they should and should not be treated, together with what they did about it. The latter point is often a great indicator of their response to stress and as I drill into their responses I gain insight into people skills and communication.

I ask about their successes but I get even more value when I ask about their failures, and more specifically how they learned from them.

Another favourite is asking candidates to list their top three values, if only to see whether they are introspective, honest and likely to enjoy our culture. With a little digging, this question also sorts the clear thinkers from the rest.

In the second round comes a skills test where I am looking for nuances – the practical tasks tell me if the person can do the job, sure, but I am much more interested in hearing the questions they ask me as they perform them. I am studying faces. They won’t know it, but I will sometimes hire people who fail the test miserably but go down fighting. I know I can teach technical skills to people if their attitude is right.

In short, like all employers I prize soft skills highly.

So why is it that my company sells dozens of times more compliance courses than soft skills courses? Why are we so often asked whether we offer soft skills courses only to be told eventually that budget was not approved?

If soft skills are an organisational priority, it’s incongruous that they are not a training priority. Yet clients often tell us that the training they deliver is geared towards regulatory priorities to the exclusion of any other.

I am generalising somewhat and you only had to attend LearnX last week, or the AITD awards night to see some really innovative counter examples, but especially in smaller more financially constrained organisations there is a concerning trend. It is ironic that it is precisely these organisations with their small, closer-working teams that have the most to gain.

Wouldn’t it be wonderful to challenge the stereotype that training is about compliance? When management is already trying so hard to recruit soft skills, can it really be so hard to preach to the converted that they should train them too?

Next time you make a budget submission, perhaps focus a little less on ‘completion rates for mandatory training’, and a little more on alignment to culture and broader organisational goals and see how you go… I dare you!

Is xAPI right for me?

I am often asked whether our organisation has xAPI capability as part of a requirement for us to be engaged and when I confirm that it does, I’m then asked “so what is xAPI?”

It’s a slightly funny situation I must say… Do you have it? Oh good. Err… what is it?

With this blog series I’m trying to help explain xAPI (as well as other training matters), but this regular question got me thinking… If you don’t know what xAPI is, what makes you think you need it? Or, put another way, who should be using xAPI?

Tim Morton of Rustici has written several excellent articles on this topic, but what it boils down to is this…

  1. Is all your training online?
  2. Are you happy for everything to stay in your LMS?
  3. Are you simply tracking course completions?

If the answer to all three questions is “yes” then xAPI is not for you (at least not yet).

On the other hand if you are teaching train drivers and you would love to get data from the simulator and use it as part of the assessment, while also collecting data from the trains themselves about how a driver performs as they navigate the network then xAPI is a must.

What’s different here is that training and assessment are taking place away from an LMS and away from a classroom. We’re connecting things (a simulator and a train) to our training environment. What’s more, the information we’re collecting is complex – “How fast was the driver going?” “How long did it take to stop?” “What were the weather conditions?” “How did the driver react to a situation?” “Did the driver notice that signal?” These are all more specific and detailed than most eLearning is designed to capture, so you will probably then need another system to analyse that data and arrive at an overall assessment of the driver.
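As a rough sketch, that kind of rich simulator data travels in the statement’s result, with custom extensions for anything the core spec doesn’t cover. The extension IRIs, account details and activity URL below are invented for illustration – a real deployment would standardise them in an xAPI profile.

```python
import json

# Invented namespace for the illustrative extension keys
EXT = "https://example.org/xapi/extensions"

statement = {
    "actor": {"account": {"homePage": "https://rail.example.org",
                          "name": "driver-4410"}},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "https://rail.example.org/sim/emergency-stop-drill"},
    "result": {
        "success": True,
        "duration": "PT4M12S",  # ISO 8601 duration: time taken on the drill
        "extensions": {
            f"{EXT}/max-speed-kmh": 68,
            f"{EXT}/stopping-distance-m": 410,
            f"{EXT}/weather": "wet",
            f"{EXT}/signal-acknowledged": True,
        },
    },
}
print(json.dumps(statement, indent=2))
```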

As another example, if you want to create a system that pushes learning to people when they need it (at “the point of need”), that’s typically not going to happen during a training session. Instead you need some way for the tools of trade they are using to detect when someone is about to do something and could do with a helping hand. That tool could send a signal with xAPI that causes some other system (perhaps an LMS or a content repository) to send a learning snippet to the operator then and there.
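The listening side of that “point of need” flow can be sketched as a simple rule: when a statement with a particular verb arrives, reply with a snippet. Everything here – the verb IRI, the snippet URL, the matching rule – is hypothetical; a real system would sit behind an LRS and match on far richer criteria.

```python
# Hypothetical rule table: verb IRI -> learning snippet to push
SNIPPETS = {
    "https://example.org/xapi/verbs/attempted-first-time":
        "https://example.org/snippets/coupling-quick-guide",
}

def snippet_for(statement):
    """Return a snippet URL if the incoming statement matches a
    point-of-need rule, else None. Entirely illustrative."""
    return SNIPPETS.get(statement["verb"]["id"])

# A tool reports that an operator is attempting a task for the first time
incoming = {"actor": {"mbox": "mailto:op@example.org"},
            "verb": {"id": "https://example.org/xapi/verbs/attempted-first-time"},
            "object": {"id": "https://example.org/equipment/coupler"}}
print(snippet_for(incoming))
```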

So in summary, it’s worth looking at xAPI if:

  1. you have good opportunities to train or assess people at the coal face
  2. the tools they are using have the ability to send xAPI messages (or can be fitted to do so)
  3. you want more than one system to combine to take part in the training or assessment

xAPI is NOT the new SCORM

I was recently reviewing a tender for an LMS. In it was a question that does not have an answer. Pity the poor vendors attempting to respond!

That question was premised on a complete misunderstanding of xAPI: “Is the LMS both SCORM and xAPI compatible?”

Well… no.

I have heard this a bit and it comes from people thinking “xAPI is all the rage, it’s a standard. SCORM is a standard too… so… xAPI is probably a better SCORM?”

Nope. Chalk and cheese.

SCORM solves the problem of having eLearning courses work with any LMS. With SCORM, courses can send tracking data, results and more to your LMS. SCORM’s really good at this – in fact almost all courses use just a tiny fraction of what SCORM can do. It doesn’t need an update. If you want to deliver eLearning you will be doing it with SCORM well into the future whether or not you also have xAPI.

The reason for xAPI is an entirely different challenge that extends across an entire organisation (well beyond eLearning):

  1. How can we measure results from informal training – what people learn on the job?
  2. How can we link training to actual real world changes in behaviour?

Any solution to these challenges needs to work wherever your team learn or work. On the shop floor, while operating equipment, while driving the company car. xAPI makes no assumptions and so it is not an eLearning standard like SCORM. It doesn’t attempt to improve on SCORM, or do what SCORM does and it doesn’t centre on an LMS.

But what if I want eLearning in my big xAPI world?

Sure! eLearning can play a part in this big-picture environment. To do that, we just do exactly what we do now. Our LMS uses SCORM to launch and track eLearning modules. The LMS doesn’t “talk xAPI” at all and SCORM keeps doing what SCORM does.

What will change is your courses. Courses send results to LMSes in SCORM. That bit stays the same, so you can keep using your favourite LMS. In addition, those courses will send information in xAPI to an entirely different system – one that is listening not only to eLearning outcomes but to everything else too (and hopefully making sense of it).
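To illustrate the dual reporting, here is a sketch of one quiz result expressed twice: once as SCORM 2004 run-time data-model values (the kind of thing a course reports to its LMS) and once as an xAPI statement bound for an LRS. The data-model element names and the ADL verb IRIs are standard; the learner address and course URL are placeholders.

```python
def to_scorm_2004(score_scaled, passed):
    """The same result as SCORM 2004 run-time data-model values
    (element name -> value, as a course would set via the LMS API)."""
    return {
        "cmi.score.scaled": str(score_scaled),
        "cmi.completion_status": "completed",
        "cmi.success_status": "passed" if passed else "failed",
    }

def to_xapi(score_scaled, passed, learner_mbox, activity_id):
    """The same result as an xAPI statement, destined for an LRS instead."""
    verb = "passed" if passed else "failed"
    return {
        "actor": {"mbox": learner_mbox},
        "verb": {"id": f"http://adlnet.gov/expapi/verbs/{verb}",
                 "display": {"en-US": verb}},
        "object": {"id": activity_id},
        "result": {"score": {"scaled": score_scaled}, "success": passed},
    }

scorm = to_scorm_2004(0.9, True)
xapi = to_xapi(0.9, True, "mailto:learner@example.org",
               "https://example.org/courses/fire-safety")
```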

A system that does that is called a Learning Record Store (LRS) and, despite the similar name, it is nothing like an LMS. It can’t deliver a course (for a start); it is too busy listening to what’s going on around the place and analysing it.

So keep your LMS. Keep your SCORM. Make your courses xAPI ready and do the same for your equipment. Administer your eLearning and classrooms in your LMS, then go to your LRS to see how effective it was, what else people learned and how they applied it.

So after all that, what are better xAPI questions to ask in an eLearning or LMS tender?

Well it’s worth asking if any eLearning courses you buy are ready to send xAPI messages to Learning Record Stores.

Beyond that, not much.


Side note- You can actually buy an LMS that comes with a crude LRS. That’s simply because so many people make the mistake of thinking an LMS should “have xAPI” that some vendors got sick of correcting them and bundled an LRS in so they could simply say “yes”. It’s a bit like a fridge manufacturer despairing of saying “no” to people asking if it washes the dishes and bolting a dishwasher on the back. Probably not a great dishwasher. Probably not a great LRS.

When we invent names for the obvious…

I have been thinking about the term “70:20:10”.

The term itself is new and has its origins in a study in 1996 undertaken at the Center for Creative Leadership in North Carolina. In just over twenty years since then, that study has spawned a term to describe the way we learn and an increasing awareness that people learn by doing and that so much of what we retain is gained outside a formal training environment. That’s true, but it is interesting that the term (and recognition of this as an issue) is so new.

Humans have walked the planet for a million years or so and other hominids for several million before that. Throughout that time we have mastered learning in a way that sets us apart from any other species, yet as far as I know, not many million-year-old classrooms have been excavated. There is little evidence that early hominids sat patiently in their cohorts while a qualified trainer explained the intricacies of fire making and warned that it will be on the exam.

The oldest evidence of formalised training is a mere few thousand years old, workplace training younger still. In fact as recently as a generation ago, almost all on-the-job training took place on the shop floor. We were mentored, apprenticed, helped and shown.

We evolved to learn by doing; it’s worked since the dawn of human time and so practical training is not what is new. What’s new is that it’s become a problem (thus warranting a name). If it’s not the training itself that’s at issue, it follows that the problem we’re grappling with is related to a very recent change in our expectations. I believe that what has changed is an expectation that training must be standardised.

Recently we have grown accustomed to managing training, formalising it, quantifying it, measuring it, standardising it. We have created structured syllabi, certifications, competency frameworks and tests. We’ve loaded them into our LMSes so we can measure success and run reports and we gravitated towards training forms that are readily measurable. We looked to formalised training and increasingly placed on the job learning in the too hard basket. As we’ve built more eLearning, scheduled classes and set exams, we narrowed our options and sometimes lost sight of what works best.

It must be said that in many cases we had little choice and much of this shift has been imposed from outside. Today we’re externally audited, regulated and subject to CPD. Increasingly mobile workforces require us to recognise training conducted elsewhere. Centralised funding sources demand increasing levels of reporting and standardisation of skills has brought both portability and flexibility. These have been positives for the community.

However, the downsides are manifest too. Quite apart from the neurological challenges that go with expecting trainees’ brains to learn in a way they were not designed for, we’ve lost the ability to provide the confidence that comes with seeing oneself try it out. We have sacrificed context – the reality that performance on a formal training environment may not be a good measure of performance elsewhere.

Then there is the cost. We spend increasing amounts on tools such as VR that promise increasingly realistic virtualised training that remains tethered to our LMSes. We add social tools to platforms to attempt to recreate the interactions, sharing, validation and exchange of ideas that once were so integral to our on the job experiences.

The challenge we face is not in creating 70:20:10 learning experiences – that comes naturally. Rather it is to adjust the paradigm in which we deliver them to one that is sophisticated enough to accommodate training in traditional forms. We cannot walk back the need for measurement now that it’s woven into regulatory frameworks, so we have to think beyond the LMS (though the LMS will continue to be important for some aspects of training) but not beyond technology altogether.

What’s exciting is that a solution in the form of xAPI is at hand. The next generation of training will be liberated from any particular teaching style. We will be able to measure practical outcomes as easily as we currently measure test scores and with the xAPI model, an organisation might have an LMS or it might not (depending on whether it wants eLearning as part of the mix).

I wonder if the term 70:20:10 will still be doing the rounds in ten years’ time?

It’s not the online medium that’s the problem, it’s the attitude!

Ultimately all education and training is about knowledge transfer. Part of this is the movement of information from one party to another but knowledge is far more than information.

Knowledge is about curation, assimilation and application of information to enable appropriate action within a specific context.

The traditional art of knowledge transfer is teaching and the outcome of teaching is learning. In a school K – 12 context this is referred to as pedagogy and for an adult it is andragogy – the method and practice of teaching adult learners.

Online learning or eLearning has created great opportunities for scaling knowledge transfer and delivering just in time, targeted information to assist in situ application.

It is exciting times for educators as the digital revolution has enabled a whole new world of evidence based teaching methods and practices.

One example is that it overcomes the problem of timing. Nowadays we can train people at the moment they need the content, overcoming the perennial risk – “Will my student remember this at the coalface?” In an extreme example, real-time repairs to broken-down machinery are now realistic without prior training.

But does it introduce a new risk? We are all questioning whether eLearning is actually effective – the pedagogy and andragogy considerations. This concern should be taken seriously – after all, if training doesn’t train, why do it? The answer lies in not changing the approach when the medium changes. I see far too often that organisations replace expert external trainers with internally produced eLearning. The trainers had been chosen for their expertise in andragogy, while the eLearning was assembled by someone who is not primarily a trainer. It should surprise no-one that these eLearning initiatives fail.

It’s easy to see why it happens. Narrow-minded assessments view eLearning as a cost-cutting initiative rather than a cost substitution. Some organisations invest far less in training after adopting eLearning: out with the specialists and in with whoever in HR is lowest on the pecking order. They then seem surprised when they get a lower return. Others see it as a legitimate training option – one that can deliver better staff and better productivity – and the results are undeniable.

It’s not the online medium that’s the problem, it’s the attitude!

And no-one should expect staff to continue to tolerate it. The old, boring and disconnected eLearning that flooded the early internet is being replaced with evidence based approaches and the empathy that has always defined best practice teaching.

The internet has enabled an explosion of information – this information is now finding its way back into the hands of educators, who are utilising technology to deliver best-practice pedagogy/andragogy (old-fashioned teaching) to create highly effective knowledge transfer.

Or put another way – who do you want your staff relying on? A professional with knowledge of your industry or your payroll officer?