Are you getting typecast?

I know I am not alone in taking great care when I hire staff; I conduct multiple interviews, set practical tasks and sometimes undertake personality profiling. I bring my best ‘people people’ along to observe each candidate closely.

Like most employers, I have my favourite questions and, also like most, they apply irrespective of the job title. Most are not aimed at eliciting role-specific or technical skills, but rather at examining character traits that are vital to a person’s future success in the role.

Asking people to recall their best and worst supervisors is a favourite of mine; I love to hear their view of how they should and should not be treated, together with what they did about it. The latter point is often a great indicator of their response to stress and as I drill into their responses I gain insight into people skills and communication.

I ask about their successes but I get even more value when I ask about their failures, and more specifically how they learned from them.

Another is to ask candidates to list their top three values, if only to see whether they are introspective, honest and likely to enjoy our culture. With a little digging, this question also sorts the clear thinkers from the rest.

In the second round comes a skills test where I am looking for nuances – the practical tasks tell me if the person can do the job, sure, but I am much more interested in hearing the questions they ask me as they perform them. I am studying faces. They won’t know it, but I will sometimes hire people who fail the test miserably but go down fighting. I know I can teach technical skills to people if their attitude is right.

In short, like all employers I prize soft skills highly.

So why is it that my company sells dozens of times more compliance courses than soft skills courses? Why are we so often asked whether we offer soft skills courses, only to be told eventually that the budget was not approved?

If soft skills are an organisational priority, it’s incongruous that they are not a training priority. Yet clients often tell us that the training they deliver is geared towards regulatory priorities to the exclusion of any other.

I am generalising somewhat; you only had to attend LearnX last week, or the AITD awards night, to see some really innovative counter-examples. But especially in smaller, more financially constrained organisations there is a concerning trend. It is ironic that it is precisely these organisations, with their small, closer-working teams, that have the most to gain.

Wouldn’t it be wonderful to challenge the stereotype that training is about compliance? When management is already trying so hard to recruit soft skills, can it really be so hard to preach to the converted that they should train them too?

Next time you make a budget submission, perhaps focus a little less on ‘completion rates for mandatory training’, and a little more on alignment to culture and broader organisational goals and see how you go… I dare you!

Is xAPI right for me?

I am often asked, as part of a requirement for us to be engaged, whether our organisation has xAPI capability. When I confirm that it does, I’m then asked, “So what is xAPI?”

It’s a slightly funny situation, I must say… Do you have it? Oh good. Err… what is it?

With this blog series I’m trying to help explain xAPI (as well as other training matters), but this regular question got me thinking… If you don’t know what xAPI is, what makes you think you need it? Or, put another way, who should be using xAPI?

Tim Morton of Rustici has written several excellent articles on this topic, but what it boils down to is this…

  1. Is all your training online?
  2. Are you happy for everything to stay in your LMS?
  3. Are you simply tracking course completions?

If the answer to all three questions is “yes” then xAPI is not for you (at least not yet).

On the other hand, if you are teaching train drivers and you would love to get data from the simulator and use it as part of the assessment, while also collecting data from the trains themselves about how a driver performs as they navigate the network, then xAPI is a must.

What is different is that this example is one where training and assessment take place away from an LMS and away from a classroom. We’re connecting things (a simulator and a train) to our training environment. What’s more, the information we’re collecting is complex – “how fast was the driver going?”, “how long did it take to stop?”, “what were the weather conditions?”, “how did the driver react to a situation?”, “did the driver notice that signal?”. These are all more specific and detailed than most eLearning is designed to capture, so you will probably then need another system to perform analysis of that data to arrive at an overall assessment of the driver.
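
To make the train example concrete, here is a minimal sketch of what one such xAPI statement might look like, written in TypeScript for Node 18+ (which has a global fetch). The verb comes from the public ADL verb registry, but the activity IDs, extension IRIs, LRS endpoint and credentials are purely illustrative assumptions rather than anything the xAPI specification prescribes.

```typescript
// Hypothetical sketch: one xAPI statement describing a simulator run,
// sent to a Learning Record Store (LRS). IDs, IRIs and credentials are
// placeholders for illustration only.
const statement = {
  actor: { mbox: "mailto:driver@example.com", name: "A. Driver" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/completed",
    display: { "en-US": "completed" },
  },
  object: {
    id: "https://example.com/activities/sim-emergency-brake-scenario",
    definition: { name: { "en-US": "Emergency braking scenario (simulator)" } },
  },
  result: {
    success: true,
    extensions: {
      // Rich, domain-specific data that course-style tracking was never designed to carry.
      "https://example.com/xapi/ext/top-speed-kmh": 87,
      "https://example.com/xapi/ext/stopping-time-seconds": 14.2,
    },
  },
  context: {
    extensions: {
      "https://example.com/xapi/ext/weather": "wet rail, light rain",
    },
  },
};

// Statements are POSTed to the LRS's /statements resource with the
// xAPI version header and (commonly) basic auth.
await fetch("https://lrs.example.com/xapi/statements", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "X-Experience-API-Version": "1.0.3",
    Authorization: "Basic " + Buffer.from("lrs-key:lrs-secret").toString("base64"),
  },
  body: JSON.stringify(statement),
});
```

The point is less the syntax than the shape of the data: an actor, a verb, an object, and whatever results and context the simulator or train can report.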

As another example, if you want to create a system that pushes learning to people when they need it (at “the point of need”), that’s typically not going to happen during a training session. Instead you need some way for the tools of the trade that they are using to detect when someone is about to do something and could do with a helping hand. That tool could send a signal with xAPI that causes some other system (perhaps an LMS or a content repository) to send a learning snippet to the operator then and there.
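
One way the “listening” side of that could be wired up (and it is only one way; the verb, activity ID, polling approach and pushSnippet helper below are assumptions for illustration) is a small service that periodically queries the LRS’s statements resource for fresh activity and reacts by pushing a snippet:

```typescript
// Hypothetical sketch: poll an LRS for recent "attempted" statements about a
// particular tool, then push a learning snippet to that operator. A real
// deployment might use a different trigger entirely; this is illustrative only.
const LRS = "https://lrs.example.com/xapi";
const AUTH = "Basic " + Buffer.from("lrs-key:lrs-secret").toString("base64");

async function checkForPointOfNeed(since: string): Promise<void> {
  // The statements resource accepts verb, activity and since filters.
  const query = new URLSearchParams({
    verb: "http://adlnet.gov/expapi/verbs/attempted",
    activity: "https://example.com/activities/hydraulic-press-changeover",
    since, // ISO 8601 timestamp of the previous check
  });
  const res = await fetch(`${LRS}/statements?${query}`, {
    headers: { "X-Experience-API-Version": "1.0.3", Authorization: AUTH },
  });
  const { statements } = await res.json();
  for (const s of statements) {
    // pushSnippet() stands in for whatever delivery channel you have:
    // an LMS deep link, a content repository URL, a message to a tablet...
    await pushSnippet(s.actor, "https://example.com/snippets/changeover-checklist");
  }
}

// Placeholder for the delivery mechanism; entirely dependent on your stack.
async function pushSnippet(actor: unknown, snippetUrl: string): Promise<void> {
  console.log("Would send", snippetUrl, "to", JSON.stringify(actor));
}
```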

So in summary, it’s worth looking at xAPI if:

  1. you have good opportunities to train or assess people at the coal face
  2. the tools they are using can send xAPI messages (or can be fitted to do so)
  3. you want more than one system to take part in the training or assessment

xAPI is NOT the new SCORM

I was recently reviewing a tender for an LMS. In it was a question that does not have an answer. Pity the poor vendors attempting to respond!

That question was premised on a complete misunderstanding of xAPI: “Is the LMS both SCORM and xAPI compatible?”

Well… no.

I have heard this a bit and it comes from people thinking “xAPI is all the rage, it’s a standard. SCORM is a standard too… so… xAPI is probably a better SCORM?”

Nope. Chalk and cheese.

SCORM solves the problem of having eLearning courses work with any LMS. With SCORM, courses can send tracking data, results and more to your LMS. SCORM’s really good at this – in fact almost all courses use just a tiny fraction of what SCORM can do. It doesn’t need an update. If you want to deliver eLearning you will be doing it with SCORM well into the future whether or not you also have xAPI.
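
For contrast, here is roughly what that LMS-facing tracking looks like from inside a course: a handful of runtime calls to the API object the LMS exposes. This sketch uses SCORM 1.2 call names with illustrative values; error handling and the API discovery routine are omitted.

```typescript
// Rough sketch of a SCORM 1.2 course reporting to its LMS. The LMS provides
// the API object (found in a parent window or frame); the discovery algorithm
// and error handling are left out to keep the example short.
declare const API: {
  LMSInitialize(arg: ""): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: ""): string;
  LMSFinish(arg: ""): string;
};

API.LMSInitialize("");

// Typical tracking data a course sends: status, score and time spent.
API.LMSSetValue("cmi.core.lesson_status", "passed");
API.LMSSetValue("cmi.core.score.raw", "85");
API.LMSSetValue("cmi.core.session_time", "00:12:30");

API.LMSCommit("");
API.LMSFinish("");
```

Notice how little is actually said – a status, a score, some time – and that is exactly the tiny fraction most courses ever need.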

The reason for xAPI is an entirely different challenge that extends across an entire organisation (well beyond eLearning):

  1. How can we measure results from informal training – what people learn on the job?
  2. How can we link training to actual real-world changes in behaviour?

Any solution to these challenges needs to work wherever your team learns or works: on the shop floor, while operating equipment, while driving the company car. xAPI makes no assumptions, and so it is not an eLearning standard like SCORM. It doesn’t attempt to improve on SCORM, or do what SCORM does, and it doesn’t centre on an LMS.

But what if I want eLearning in my big xAPI world?

Sure! eLearning can play a part in this big-picture environment. To do that, we just do exactly what we do now. Our LMS uses SCORM to launch and track eLearning modules. The LMS doesn’t “talk xAPI” at all and SCORM keeps doing what SCORM does.

What will change is your courses. Courses send results to LMSes in SCORM. That bit stays the same, so you can keep using your favourite LMS. In addition, those courses will send information in xAPI to an entirely different system that is listening not only to eLearning outcomes but to everything else too (and hopefully making sense of it).
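
In practice, that dual reporting might look something like the sketch below: the SCORM calls from the earlier example stay exactly as they are, and one extra xAPI statement goes to the listening system (the Learning Record Store described next). The sendToLrs() helper, course ID and learner address are assumptions for illustration.

```typescript
// Sketch of dual reporting: the existing SCORM calls to the LMS are unchanged,
// plus one extra xAPI statement for the LRS. Names and IDs are illustrative.
declare const API: {
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: ""): string;
};
// Hypothetical helper that POSTs a statement to the LRS /statements resource,
// as in the earlier simulator example.
declare function sendToLrs(statement: object): Promise<void>;

function reportResult(score: number, passed: boolean): void {
  // 1. SCORM: exactly what the course already does today.
  API.LMSSetValue("cmi.core.score.raw", String(score));
  API.LMSSetValue("cmi.core.lesson_status", passed ? "passed" : "failed");
  API.LMSCommit("");

  // 2. xAPI: the same outcome, restated for the system listening to everything else.
  void sendToLrs({
    actor: { mbox: "mailto:learner@example.com" },
    verb: { id: "http://adlnet.gov/expapi/verbs/completed", display: { "en-US": "completed" } },
    object: { id: "https://example.com/courses/manual-handling-101" },
    result: { score: { raw: score }, success: passed },
  });
}
```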

A system that does that is called a Learning Record Store (LRS) and, despite the similar name, it is nothing like an LMS. It can’t deliver a course (for a start); it is too busy listening to what’s going on around the place and analysing it.

So keep your LMS. Keep your SCORM. Make your courses xAPI-ready and do the same for your equipment. Administer your eLearning and classrooms in your LMS, then go to your LRS to see how effective it was, what else people learned and how they applied it.

So after all that, what are better xAPI questions to ask in an eLearning or LMS tender?

Well, it’s worth asking whether any eLearning courses you buy are ready to send xAPI messages to Learning Record Stores.

Beyond that, not much.


Side note: you can actually buy an LMS that comes with a crude LRS. That’s simply because so many people make the mistake of thinking an LMS should “have xAPI” that some vendors got sick of correcting them and bundled an LRS in so they could simply say “yes”. It’s a bit like a fridge manufacturer despairing of saying “no” to people asking if it washes the dishes and bolting a dishwasher on the back. Probably not a great dishwasher. Probably not a great LRS.

When we invent names for the obvious…

I have been thinking about the term “70:20:10”.

The term itself is new and has its origins in a study in 1996 undertaken at the Center for Creative Leadership in North Carolina. In just over twenty years since then, that study has spawned a term to describe the way we learn and an increasing awareness that people learn by doing and that so much of what we retain is gained outside a formal training environment. That’s true, but it is interesting that the term (and recognition of this as an issue) is so new.

Humans have walked the planet for a million years or so and other hominids for several million before that. Throughout that time we have mastered learning in a way that sets us apart from any other species, yet as far as I know, not many million-year-old classrooms have been excavated. There is little evidence that early hominids sat patiently in their cohorts while a qualified trainer explained the intricacies of fire making and warned that it would be on the exam.

The oldest evidence of formalised training is a mere few thousand years old, and workplace training is younger still. In fact, as recently as a generation ago, almost all workplace training took place on the shop floor. We were mentored, apprenticed, helped and shown.

We evolved to learn by doing; it’s worked since the dawn of human time and so practical training is not what is new. What’s new is that it’s become a problem (thus warranting a name). If it’s not the training itself that’s at issue, it follows that the problem we’re grappling with is related to a very recent change in our expectations. I believe that what has changed is an expectation that training must be standardised.

Recently we have grown accustomed to managing training, formalising it, quantifying it, measuring it, standardising it. We have created structured syllabi, certifications, competency frameworks and tests. We’ve loaded them into our LMSes so we can measure success and run reports, and we have gravitated towards training forms that are readily measurable. We looked to formalised training and increasingly placed on-the-job learning in the too-hard basket. As we’ve built more eLearning, scheduled classes and set exams, we have narrowed our options and sometimes lost sight of what works best.

It must be said that in many cases we had little choice and much of this shift has been imposed from outside. Today we’re externally audited, regulated and subject to CPD. Increasingly mobile workforces require us to recognise training conducted elsewhere. Centralised funding sources demand increasing levels of reporting, and standardisation of skills has brought both portability and flexibility. These have been positives for the community.

However, the downsides are manifest too. Quite apart from the neurological challenges that go with expecting trainees’ brains to learn in a way they were not designed for, we’ve lost the ability to provide the confidence that comes with seeing oneself try it out. We have sacrificed context – the reality that performance in a formal training environment may not be a good measure of performance elsewhere.

Then there is the cost. We spend increasing amounts on tools such as VR that promise increasingly realistic virtualised training that remains tethered to our LMSes. We add social tools to platforms to attempt to recreate the interactions, sharing, validation and exchange of ideas that once were so integral to our on the job experiences.

The challenge we face is not in creating 70:20:10 learning experiences – that comes naturally. Rather it is to adjust the paradigm in which we deliver them to one that is sophisticated enough to accommodate training in traditional forms. We cannot walk back the need for measurement now that it’s woven into regulatory frameworks, so we have to think beyond the LMS (though the LMS will continue to be important for some aspects of training) but not beyond technology altogether.

What’s exciting is that a solution in the form of xAPI is at hand. The next generation of training will be liberated from any particular teaching style. We will be able to measure practical outcomes as easily as we currently measure test scores and with the xAPI model, an organisation might have an LMS or it might not (depending on whether it wants eLearning as part of the mix).

I wonder if the term 70:20:10 will still be doing the rounds in ten years’ time?

It’s not the online medium that’s the problem, it’s the attitude!

Ultimately all education and training is about knowledge transfer. Part of this is the movement of information from one party to another, but knowledge is far more than information.

Knowledge is about curation, assimilation and application of information to enable appropriate action within a specific context.

The traditional art of knowledge transfer is teaching, and the outcome of teaching is learning. In a school (K–12) context this is referred to as pedagogy; for adults it is andragogy – the method and practice of teaching adult learners.

Online learning, or eLearning, has created great opportunities for scaling knowledge transfer and delivering just-in-time, targeted information to assist in-situ application.

These are exciting times for educators, as the digital revolution has enabled a whole new world of evidence-based teaching methods and practices.

One example is that it overcomes the problem of timing. Nowadays we can train people at the moment they need the content, overcoming the perennial risk – “Will my student remember this at the coalface?”. In an extreme example, real-time repairs to broken-down machinery are now realistic without prior training.

But does it introduce a new risk? We are all questioning whether eLearning is actually effective – the pedagogy and andragogy considerations. This concern should be taken seriously – after all, if training doesn’t train, why do it? The answer lies in not changing the approach when the medium changes. I see far too often that organisations replace expert external trainers with internally produced eLearning. The trainers had been chosen for their expertise in andragogy, yet the eLearning was assembled by someone who is not primarily a trainer. It should surprise no-one that these eLearning initiatives fail.

It’s easy to see why it happens. Narrow-minded assessments view eLearning as a cost-cutting initiative rather than a cost substitution. Some organisations invest far less in training after adopting eLearning: out with the specialists and in with whoever in HR is lowest on the pecking order. They then seem surprised when they get a lower return. Others see it as a legitimate training option – one that can deliver better staff and better productivity – and the results are undeniable.

It’s not the online medium that’s the problem, it’s the attitude!

And no-one should expect staff to continue to tolerate it. The old, boring and disconnected eLearning that flooded the early internet is being replaced with evidence-based approaches and the empathy that has always defined best-practice teaching.

The internet has enabled an explosion of information, and this information is now finding its way back into the hands of educators who are utilising technology to deliver best-practice pedagogy and andragogy (old-fashioned teaching) to create highly effective knowledge transfer.

Or put another way – who do you want your staff relying on? A professional with knowledge of your industry or your payroll officer?

OHS Rules in Australia

It is more than five years since harmonised health and safety legislation was introduced across Australia, with the exception of Victoria and Western Australia. In other jurisdictions a variant of Safe Work Australia’s model legislation came into effect on 1 January 2012. It has subsequently been updated, with the most recent version released in late 2016, and it is supported by Model Regulations.

That is not to say it was implemented verbatim anywhere – subtle differences exist in each state and territory. For example, in New South Wales, Tasmania and Queensland, the words “reasonably practicable” were included to qualify the definition of duties. Queensland and South Australia applied the legislation’s duties of care and protections to volunteers while the ACT applied them to site visitors.

Safe Work Australia (SWA) is not an enforcement agency and each state retains responsibility for enforcing legislation.

In late 2014 the process of harmonisation commenced in WA, though it is still incomplete and a modified form of the model legislation is likely to be adopted. Western Australia’s objections included union right of entry, penalty levels and the reverse onus of proof for discrimination matters.

The Victorian Government has stated that it will not adopt the national model workplace health and safety laws in their current form, as it is of the view that they are less stringent than Victoria’s current legislation, the Occupational Health and Safety Act 2004 (summarised here). Injury claims statistics would appear to provide some support for this position – Victoria is second only to NSW in the number of serious accidents per capita and boasts the lowest injury claim cost per employee in Australia.

Victoria’s position has been criticised for the added compliance complexity it creates for employers operating across multiple jurisdictions.

Meanwhile, Victoria’s regulator (WorkSafe Victoria) has recently released new regulations regarding workplace safety and has issued a guide to assist with compliance. It has indicated that the new rules will improve standards while reducing red tape. Key changes relate to notifying homeowners of asbestos removal work and certain changes to high-risk work licences. Other changes affect hazardous facilities, including mines and construction sites.

Recruitment and Training – one coin, two sides

It is not always practical to recruit for skill, as we know, but I’d argue it’s not always desirable either. All organisations are attracted to the idea of recruiting those with the right certifications and skills, but with ever more understanding of subtler traits that are at least as important (such as management potential, emotional intelligence and cultural fit), pre-existing skills are increasingly seen as just one thing “in the mix”. In fact, many of the more sophisticated traits we now demand are more ingrained and harder to teach. Arguably we should view vocational skills as the easier ones to flag for future development in the right candidate and, to an extent, prioritise the character traits that are essential.

That’s one reason why vocational training is becoming ever more important. It was always true that workforce training had the potential to transform any organisation and make it far more nimble, responsive, effective and profitable. However, the more sophisticated our recruitment practices become, the more central training becomes in providing the freedom to recruit wisely. The more we can rely on staff development to fill gaps reliably and rapidly, the more freedom we have to build the best possible teams.

That said, just because training and development is important, it doesn’t make it an automatic win. Like anything, it requires planning and investment to work. Everyone wants to do it, but most of us are confronted by the question “can we afford to do it well enough to make it worthwhile?”. It’s a good question too – I would argue that subjecting people to a poorly planned, tedious and only partially relevant training program does more harm than good. If training is the first thing a new worker experiences, we do not want their first impressions to be of irrelevance, poor quality, cheapness and disrespect.

So where does that leave an organisation with limited resources and experience in training delivery? What measures might they take to ensure their first steps towards a training initiative actually deliver results?

The first thing to say is that workforce development takes place in every workplace; in other words you are doing it now whether you realise it or not. Contemporary thinking is that most learning is experiential – almost everything we learn we gain through experiences that just happen whenever we show up to work. The easiest and most effective starting point for any training program is to make those experiences as valuable as possible. Simple initiatives such as a staff rotation program and a mentoring program are arguably the most effective things an organisation can do to upskill its people. They are also some of the easiest things to do.

But what then?

If we assume your competition has most likely also achieved the basics, then to do better than them we have to consider structured learning. That might sound scary, but remember it’s intended as icing on the cake – your good experiential program is the cake itself.

Also, there are options, and your structured training does not necessarily involve an internal program; indeed many of the largest companies are partnered with RTOs or TAFEs. From a financial perspective, it is frequently cheaper to gain economies of scale by asking large external trainers to skill your team rather than replicating their efforts on a small scale internally.

However, a surprising number of organisations elect to run their learning and development programs completely internally, and many even go so far as to establish an enterprise RTO and claim funding for it. This is of course more ambitious, but don’t make the mistake of assuming it’s out of reach. Many people are surprised to learn just how many of these programs are operating very effectively in small and medium enterprises. The good news is that they also talk about it, so there is a wealth of information online and countless vibrant forums on social media for the new learning and development manager to read. Jump in and ask for help. What is there to lose?

Of course, running an internal program is not for everyone, so when is it the right way to go?

Internal implementation allows a program to be highly specialised, which is perfect for niche organisations. It’s also great when you seek to train people in the context of an organisation’s competitive advantages.

Another important point is that no training is ever as good as that delivered from the workplace itself. Only there can it truly capitalise on experiential learning for maximum relevance and interest. The extra effectiveness delivers cost and time savings in turn.

Another consideration is that when people are trained at work, their employer gets to see the outcomes directly, in a way that returning from an external course brandishing a certificate can never match. This helps maintain commitment to the program and allows it to be continually enhanced.

No matter what approach you take, a quality training program will foster a culture of collaboration, learning and continuous improvement, and deliver bottom-line outcomes in the process. But best of all, your training program will allow you more freedom to recruit from a wider pool of candidates and make your workplace more sought after.