What happens when IDers start asking too many questions…

This seems to be a hot topic right now – I’ve seen it in a number of places, and it’s one of my pet themes. It starts with the question:

  • Where does this learning project come from?

Does the Learning Director/Department manager simply hand the learning project to the Instructional Designer? And where do they get it from?

It’s our business to ask questions, and those questions soon turn to how the learning project meets the needs of the business. From there, it’s a short step to asking whether this particular learning project is really the best way to address the perceived needs.

A lesson learnt from my time in programming is that there are two targets:

  • Build the right solution
  • Build the solution right

If you haven’t identified the problem that you need to solve, you could easily waste a lot of time and money building something – which may be great in itself – but isn’t actually what was needed. And nobody wants to do that.

So, here’s the dilemma. The manager hands you a brief for a learning project. You know that you can get the same learning ‘off the shelf’ from one of the content libraries. Or you can see that making a course from the source material you have been given won’t fulfil the objectives. Do you carry on?

As a freelancer, the answer is simple. Yes. You’d be a fool to let on that you’ve seen a similar course on Coursera for $50 and deny yourself the contract. And you’ll be paid whether or not the organizational goals are met – by the time this is known, the project will be a vague memory. 

Working in a company, this may seem easier. In a previous life, I’d go back to the Learning Director and suggest that getting these courses from such-and-such a provider would mean the learners would benefit months sooner, the eLearning team would be free to work on projects we couldn’t buy, and we’d save the organization thousands by investing a small amount. Soon I found myself guiding the overall digital learning strategy.

But in doing this, I had shot myself in the foot. As the eLearning expert, I was no longer creating courses – I was too busy making contracts and agreements with external suppliers, sifting through libraries and evaluating other people’s courses. The output of our eLearning team had also dropped – the focus had shifted to sourcing eLearning, administration, gathering requirements from the various business areas, and finding eLearning providers who could meet those requirements.

And hey presto, I was no longer an Instructional Designer but a project administrator, forced into a role I didn’t want. Time to say goodbye. Kind words from the department head on leaving – “What I respect about Adam is that he won’t do something unless he believes what he’s doing is the right thing.” I think it’s nice to be that principled, but sometimes in business compromises need to be made; it’s only really worth entering into battles when you can see a positive outcome.

And, finally, a warning. As an Instructional Designer, yes, it is an important goal to understand how your project fits with the needs of the organization. But you need to know how the goalposts are fixed by many different stakeholders – both inside and outside the organization. And mess with them at your peril – the consequences might lead you places you don’t want to go!  

A bit of web archaeology…

 

Today’s novelty is what I found while searching for parts of the Enterprise Training courses I developed around 2012…

Well, I found evidence of intelligence and sophisticated tool-making dating even further back – from 2006, and my PGCE no less. And all the links still seem intact. I will need to study the remains further to see if I can find any usable DNA…

http://abayliss.edublogs.org/

I particularly like this – disability awareness, made using Flash. Things weren’t simple in those days; this took some programming. The idea is still sound!

http://abayliss.edublogs.org/2008/05/08/disability-awareness-unit/

On persuading the client that they need the full VR!

It’s a sad fact that 80% of eLearning is probably going to stay at levels 1 and 2 for some time to come. How many times do we hear from the SME, “Why can’t you just take my PowerPoint and put it online?”

Let’s face it, sometimes it is hard enough justifying any of our development work, so where do we even start when we want to include animations based on 3D environments?

I did some consultancy recently for a German manufacturing company that exports an innovative product worldwide. Understanding how a business works is important to seeing the full problem. There are ‘layers’ of training.

a. Impressing audiences (and potential customers) at industry fairs: Most companies attend industry events where potential buyers wander round looking at where the industry is. These are very high-value events: having a stand is expensive, but the potential sales are huge. This is where they can break out the VR headsets and show the best the company has to offer.

b. Supporting the sales force: OK, VR headsets do take some setting up, they need seriously powerful graphics cards, and they are not everyone’s cup of tea. The sales representatives need a set of great presentations that show the product in its best light. This is a lesson I learned working with Volvo – the people who produce this work are our friends. They often have huge budgets compared to the eLearning department and, in essence, the same mission – to educate people about the quality and benefits of the product.

c. Teaching the sales force and technicians: The next level down comes when the product’s construction becomes important. The sales force need the detail to answer customer questions; the technical team need it to know how to service or repair the machine.

d. The Customer User Training: Last, but by no means least, we need to train the customer on how the product should actually be used in daily work to get the best out of it. 

Take a look again, and consider the eLearning team. In a lot of cases, the eLearning team are given d to do. That is their job. c can be taken care of by the trainers, who are already running their classrooms in rooms full of spanners, screwdrivers and computer consoles. As for a and b, well, they are marketing’s job – what does eLearning have to do with sales?

Now look again – because you’re missing something. All the data and modelling for the nifty graphics in a 3D environment is already there – the people who made the marketing material have already done the 3D modelling. And once you have the 3D model, you can create VR scenarios from it in exactly the same way as shooting an animated scene – the camera changes, but the model is the same.

Now look down – consider using the same high-quality film and environment to help the sales force and technical team learn what is going on inside the machine. Not just film – it needs to be backed up with solid exercises, tests, practices and procedures, as with any learning – but now, instead of creating these expensive film segments for the eLearning, they are a by-product of the marketing drive. The whole set of courses fits together and follows the same branding and ‘look and feel’ because they use the same elements.

And last of all, the end customers – the users, the people who will be affected by the product every day, and who wouldn’t want to give them the best experience? Their training can be made fun and interactive: they do something and zoom! The camera whizzes inside the machine and sees the cogs and wheels at work before, whoosh! The machine delivers what it was asked for. And the eLearning department doesn’t need to petition to have its budget tripled to make the little filler animations that add this super touch of quality to the user training – no – because they use the same models that were created for the rest of the set. And even that is getting cheaper. It’s all in how we use it. One day all eLearning may be of the boom-bang-whizz variety, but not for a while. And it’s how we get from here to there that we need to be thinking about…

Risk-based Oversight & the Chemicals Industry

Risk-based Oversight Strategies were the hot topic when I was at EASA. I see the documents have now been published:

https://www.easa.europa.eu/document-library/general-publications/practices-risk-based-oversight

I have suddenly got rather excited about this topic and its application in the petrochemicals industry. If it isn’t a big thing there already, it soon will be once the savings and benefits are properly understood.

To understand risk-based approaches, it’s useful to compare them with a traditional procedure-based approach. Think of the procedure as a checklist. Before a plane takes off, the crew go through a list and tick items off. That list must contain everything that might conceivably go wrong, so the approach takes time. A risk-based approach concentrates on areas where failures are more likely – a more focused approach, where the same amount of resources is used more effectively. Think of a group of inspectors at an airport – they can’t check every plane, so they focus on the older models, or on operators with poorer safety records, and let the others through with more trust and less oversight (trust is a key issue in risk assessment).

In the real world, the two approaches complement each other. A team working at a plant need their procedures and checklists, and the oversight team need to focus their efforts on the areas of greatest risk. The oversight team take those risks into account when they revise the procedures to mitigate the areas they have identified, and the plant teams establish a base of trust with the oversight team that they are following the procedures correctly. And so it becomes a cycle.

Any risk analysis accepts that there is a risk of failure. The ‘allowable risk’ is calculated on the basis of probability vs. consequences, and is a figure representing cost. And safety is a cost. But it isn’t a linear relationship: 100% more money spent won’t mean 100% more safety, any more than spending twice as much in a restaurant will guarantee the food tastes twice as good. Any industry involving safety is finding a balance between safety and profitability – ultimately, it must be in everyone’s interest that the resources are used in the most effective ways possible.
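To make that concrete, here’s a minimal sketch (in Python) of how a risk-based ranking might work, assuming risk is scored as probability × consequence – the names, figures and units below are invented purely for illustration, not taken from any real oversight system:

```python
# A toy illustration of risk-based prioritisation: score each inspection
# target by probability x consequence, then spend the limited oversight
# effort on the highest scores first. Names and figures are invented.

inspection_targets = [
    # (target, probability of a failure this year, consequence in cost units)
    ("Unit A - ageing equipment",        0.20,  500),
    ("Unit B - recently overhauled",     0.02,  500),
    ("Unit C - poor inspection history", 0.10, 2000),
]

def risk_score(probability: float, consequence: float) -> float:
    """Expected loss: the probability of a failure times its consequence."""
    return probability * consequence

# Rank the targets so attention goes where the expected loss is greatest.
ranked = sorted(inspection_targets,
                key=lambda t: risk_score(t[1], t[2]),
                reverse=True)

for name, p, c in ranked:
    print(f"{name}: risk score {risk_score(p, c):.0f}")
```

The same scoring also shows why doubling the budget doesn’t double the safety: once the high scores are covered, each extra unit of effort goes on targets with ever smaller expected losses.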

The one thing still missing in this data-driven world – something the aviation and transport industries notably do have – is a common incident database. The aviation one is ECCAIRS ( http://www.aviationreporting.eu/AviationReporting/ ) – I wonder if something similar could exist for the chemicals industry? Perhaps industrial secrecy prevents it? I’m going to find out more…

ROI vs. ROE

Working on a chemical plant in Saudi Arabia’s Eastern Region has been educational, especially in the time I’ve been granted to consider things. And bearing in mind the plant contains all sorts of nasty chemicals, which could wipe out entire towns if it were ever to go bang, the effectiveness of the training being given is one of the things worth considering.

Anyone in training must be familiar with Return on Investment. It’s that notoriously difficult-to-pin-down figure that training managers use to justify the existence of their departments to management. It’s difficult to pin down because it is not always possible to show the effectiveness of a training programme in terms of business efficiency. It’s also folly to link gains or losses to a training programme when there are a thousand other factors that might influence how the business is running – new competitors in the marketplace, or an increased cost of raw materials, for instance.

Finally, returning to the plant, and safety issues, the ROI in training is hopefully an absence of significant accidents. The cost of even one such event could be staggering. But how to quantify the value of something not happening?  

The ‘Safety Culture’ places great value on ‘Near Miss’ incidents – where lessons can be learned from events that didn’t have any major consequences, but could have done. When they happen, investigations take place and systems and processes are modified. But even this approach relies on incidents in the first place to use as test cases. So what do we look at when we are aiming for total prevention?

This is where Expectations come in.

Let’s take proper use of gloves as an example. The training expectation is that, once they have done this training, employees not only know how to use the right gloves but will, from that point forward, always use them as intended. Spot checks at monthly intervals might be needed to confirm this. And, assuming the spot checks check out, we can confirm the ROE for the training has been met. There is no obvious cost saving attached to this success – but no one has been hurt, and that is the evaluation of this training.
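As a rough sketch of what confirming that expectation might look like in practice, here’s some illustrative Python – the monthly figures and the 95% target are made up, not taken from any real plant:

```python
# A sketch of checking an 'expectation' against monthly spot-check data.
# The expectation: observed glove compliance stays at or above a target rate.
# The figures and the 95% target are invented for illustration.

monthly_spot_checks = {
    # month: (workers seen wearing the correct gloves, workers observed)
    "Jan": (48, 50),
    "Feb": (50, 50),
    "Mar": (44, 50),
}

EXPECTED_COMPLIANCE = 0.95  # the behaviour we expect to see after the training

for month, (compliant, observed) in monthly_spot_checks.items():
    rate = compliant / observed
    verdict = "expectation met" if rate >= EXPECTED_COMPLIANCE else "follow-up needed"
    print(f"{month}: {rate:.0%} compliance - {verdict}")
```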

We can follow the path back to the training, too. It makes sense to define the outputs before we start. Kirkpatrick’s model helps here – the output we want is Level 3, Behaviour. The Level 4 goal – having no injuries in the organization – is linked, but it will follow from that Level 3 behaviour.

Knowing this, the Instructional Designer has a clear job. There’s the information level of the training: which gloves are needed for what. A tactile exercise would be nice – trying them on and performing tasks. You might drive the message home with pictures of hand injuries that resulted from bad choices. You could have classroom activities with pictures of work situations where everyone holds up the right glove. That’s half the story covered.

Of course, outside the classroom there are things that contribute too: making sure everyone has a proper set that fits, personalized with their own name, and somewhere to keep them so they are always available. Finally, notices warning employees of penalties for being glove-free may, in the end, have more effect than anything. This last part is arguably the most important, because it brings the knowledge out of the training and into the daily grind.

It’s a culture change that’s needed. In the right circumstances, culture change can be deceptively fast.  

Big Data and Learning Evaluations

AI and machine learning. Big data. Learning Evaluations, ROI, ROE (Return on Expectations).

All very exciting terms being thrown about at the moment, with ‘first movers’ trying to grab a bit of the action before anyone really knows the future applications of these technologies. So here’s my take on the future, and where big data is going to be really useful. But first, we need to look to the other side of the training – to the outcomes.

Various texts I’ve read complain about assessment results: candidates’ learning reduced to a simple pass/fail, or a percentage. Yet the collected assessment data, usually cast aside, is a mine of useful information, both about the candidate and about the training itself.

There is a minefield of issues around this data. From a purely deterministic point of view, we should pre-assess everyone, to measure improvement. Topics where a high percentage of candidates struggle raise questions about how those topics are presented – in assessing the learner, we are also assessing the training. And this is where big data and groups come in. Taken against the average, a learner becomes a known quantity; the deviations from the reference groups become the points of interest – strengths and weaknesses. The assessment needs to capture areas where further training may be needed, or flag that a learner has an affinity for a certain subject. It’s just that creating an assessment that measures all this is a logistical nightmare. Do we even have the tools?
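As a sketch of the kind of analysis I mean – assuming per-topic scores already exist – comparing one learner against the reference-group average might look something like this (the topic names, scores and ten-point threshold are all invented):

```python
# A sketch of surfacing a learner's deviations from a reference group,
# assuming per-topic scores (0-100) already exist. Topics, scores and the
# ten-point threshold are invented for illustration.

reference_group_avg = {"valve isolation": 72, "glove selection": 88, "permits to work": 65}
learner_scores      = {"valve isolation": 85, "glove selection": 60, "permits to work": 66}

THRESHOLD = 10  # how far from the group average counts as a point of interest

for topic, group_avg in reference_group_avg.items():
    diff = learner_scores[topic] - group_avg
    if diff >= THRESHOLD:
        print(f"{topic}: {diff:+d} vs the group - possible affinity")
    elif diff <= -THRESHOLD:
        print(f"{topic}: {diff:+d} vs the group - flag for further training")
    else:
        print(f"{topic}: {diff:+d} vs the group - in line with the group")
```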

Something not entirely new but which I’d be keen to see more of is 3D Assessment – where a candidate answers a question, but also gives a rating on how sure they are about the answer they have given. If a candidate selects the right answer and is 100% certain, that’s a full mark. If they select a wrong answer, but are only 20% sure of that answer, for example, the mark would be better than if they select a wrong answer but are sure that it is right. In other words, the candidate is rewarded for showing that they know what they know and what they don’t (this reminds me of the whole ‘known unknowns vs. stuff we don’t even know exists’ argument).

At an individual subject level, the data is pretty subjective. But if you start getting areas where learners come out confident in wrong answers, you have a problem with the training materials. Areas where learners are uncertain of their answers, whether those answers are right or not, suggest a need for further training.
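There’s no prescribed scoring rule for this, so here’s one plausible scheme sketched in Python: the mark depends on both correctness and stated confidence, and aggregating responses per topic flags the ‘confident but wrong’ pattern described above. The weighting, thresholds and data are assumptions for illustration only:

```python
from collections import defaultdict

def score_item(correct: bool, confidence: float) -> float:
    """One possible '3D' mark: right and sure scores 1.0, right but unsure
    earns partial credit, wrong but unsure loses little, and wrong but sure
    scores worst. The exact weighting is an assumption, not a standard."""
    return confidence if correct else -confidence

# Invented response data: (topic, answered correctly?, stated confidence 0-1).
responses = [
    ("valve isolation", False, 0.9),
    ("valve isolation", False, 0.8),
    ("valve isolation", True,  0.7),
    ("glove selection", True,  0.4),
    ("glove selection", True,  0.3),
]

# Aggregate per topic: lots of confident wrong answers points at the
# materials; lots of low confidence points at a need for further training.
confident_wrong = defaultdict(int)
low_confidence = defaultdict(int)
totals = defaultdict(int)

for topic, correct, confidence in responses:
    totals[topic] += 1
    if not correct and confidence >= 0.7:
        confident_wrong[topic] += 1
    if confidence <= 0.4:
        low_confidence[topic] += 1

for topic, n in totals.items():
    if confident_wrong[topic] / n > 0.5:
        print(f"{topic}: learners are confident and wrong - review the materials")
    if low_confidence[topic] / n > 0.5:
        print(f"{topic}: learners are unsure - consider further training")
```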

I see three issues currently stopping this from happening:

  • Actually creating the assessments
  • Lack of suitable tools
  • Lack of numbers – this is a real ‘more users, more useful’ scenario

And there is a danger for the designers of the training: the assessment could finish up as such a rigid and complex framework that it limits the options available when creating training to fit that frame – although when the outputs are so clearly defined, how can the trainer miss?

As always, I’m interested to hear any thoughts on this – has anyone used 3D assessment in this way?

Where serious games and eLearning might lead…

As part of the Serious Games MOOC from Erasmus University Rotterdam, I’ve been asked to look in my crystal ball and guess what’s coming in 5–10 years, so here’s my best answer yet…

Many studies have shown that technology is isolating us to a great extent – where we used to meet, we now communicate via virtual channels. There’s a lot of development, especially in Japan, around virtual companions. And there’s the Turing test, which tests whether people can tell a human and a ‘bot’ apart – I see developments in this field coming into gaming. This would allow virtual ‘agents’ to be placed in a game – as teachers, or merely as companions to the player – which would then open up another area of social learning possibilities.

Studies suggest we learn more from those around us than from anything else – so that must be a desirable effect to mimic.

Imagine a language course with an attractive virtual partner, who can take you through scenes – like a virtual date – while having natural conversations in the chosen language – at the same time, analysing your own conversation for weak areas, to return to in the next session. Scenes could be visiting a restaurant, museum, even walking around a virtual city…

That would push every psychological and affective button we have. The only problem I see is what happens to the relationship with the avatar once the lessons are learned. Perhaps the answer would be to have the ‘date’ change every few weeks? But then, would that disposability of partners cheapen the learners’ attitude to relationships generally?

This idea has got me excited about the possibility of bringing AI into eLearning courses, as virtual tutors. It might not happen yet, but it’s around the corner. More to the point, how elaborate does an AI tutor need to be to be useful?  

Getting answers ready…

…you know how sometimes you are so busy preparing that you’ve worked out what you want to say so far ahead, it’s just a shame not to write it down… And then, written down, let’s face it, it will never be said… but it seems a shame to waste it…

Positive traits…

Taking on complex tasks and seeing them through to the end:

The training for the EU–Brazil BASA agreement: I’m good with technical subjects, but this was training on aviation regulations – the procedures that need to be followed to sell a second-hand European helicopter in Brazil, for instance, or by a Brazilian company wanting to sell propellers in the EU, or for something as simple as an Airbus in São Paulo needing routine maintenance. As source material, I was given a 400-page book of regulations and a list of contacts, both at EASA and at ANAC.

I had to drill down into which groups of users would need to learn what, and take my ideas back to the experts to get their agreement on how to proceed. The final training covered 16 areas, and users could pick the areas they needed for their work. The work was approved by experts at EASA and ANAC, was translated into Portuguese, and (should) form the reference when Brazilian companies need to work with the EU (and vice versa).

My businesslike, sometimes direct, but always friendly and engaging way of dealing with people:

At the level I have been working at, everybody is a professional. When working with an SME – I’ll take the example of composite-bodied aircraft – we need mutual respect. My SME is one of the world’s leading experts on composites – he’s been working in aircraft design for 30 years – and it’s my job to tell him that the way he wants to design his training is wrong. Of course there is a challenge there, but by winning him over and building a friendly basis for trust and a common, clear way of working, these difficult challenges melt away.

It’s particularly important not to be seen to dictate to people – their training, at the end of the day, is their own, and they need a sense of ownership. It might seem surprising how much of Instructional Design involves working with people. You might imagine me sitting at a computer writing training, but what you are actually trying to do is capture the knowledge from the client or SME, inject ideas and experience of your own, and do it in such a way that the learner will understand and want to learn.

Negative traits…

I get bored easily. Although, working in the Gulf, I have learned to turn this to my advantage and use the time for my own projects and self-development.