Training design - The systems thinking and training blog - Bryan Hopkins


Why training is never a solution to a workplace performance issue (but then, nothing else is either)

Published in Training design
 
This blog item was originally published as a LinkedIn article.

 
Early in my schooling I was presented with the problem “2 + 2 = ?”. With the aid of various fingers, I solved that one, and in due course went on to complete an engineering degree, where I solved some much more complicated problems than that. Had I continued in engineering, I might have contributed to the mathematics which lands spaceships on Mars: even more complicated, but given equations, speeds and trajectories we can confidently work out how to get this job done. It is just rocket science, after all, with clear processes to follow … and solutions.

It is a bit different on the days when I am planning what to do when I look after my two-year old grandson. I have an idea about what we will do and know what he is allowed to eat and not eat. But he has his own ideas, and what actually happens on those days emerges out of the interactions of these different perspectives. Our relationship is beyond complicated: it is complex, a heady, undefinable and unpredictable mix of human behaviours. Quadratic equations and Laplace transforms do not help, and there are no solutions giving a plan for a perfect day.

This will not be new to many readers. But what we actually do when we design training programmes is pretend that human behaviour is predictable and treat the whole issue of performance improvement as if it were rocket science. We do this because we have been seduced by the charms of the Enlightenment, that period in history when rational thought started to replace mysticism. It was thought that we could understand anything by breaking it down into its constituent parts, seeing what each part did and adding it all back together. This does work well for rockets, but not for my grandson and me, nor for people working in organisations.

The starting point for training design is to work out what we would like people to be doing, define performance objectives and then, explicitly or implicitly, deconstruct these to identify the specific aspects of knowledge, skills and attitudes that are needed. We then have the bones of the training programme. This might be a good way to start the process of designing something to improve performance, but it has serious weaknesses if we start to use these same objectives to make judgements about how effective the programme is after it has been implemented. After all, as soon as we start training people, these simple pieces of knowledge, skill and attitude interact with human behaviour issues and start to take everyone involved in directions we may not expect.

Let’s think about these behaviour issues more closely. People interact with each other, interactions have consequences and create feedback loops, information comes in from outside the group, and there is a history which has moulded the group into what it is at any particular moment. Workplace groups can therefore be regarded as complex adaptive systems: systems which are constantly changing in response to internal and external dynamics. Of particular importance is that human interactions are non-linear: there is no direct, consistent connection between cause and effect. This means that when we train someone to do something better, they may not actually do it better, or doing it better may trigger negative feedback within the system (resentment, jealousy, infringement of implicit performance norms, people leaving the organisation and so on).
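To make the non-linearity point concrete, here is a toy illustration (my own, not from the original article) using the logistic map, one of the simplest non-linear systems: two almost identical starting points, iterated under exactly the same rule, soon produce completely different trajectories.

```python
# Toy illustration of non-linearity: the logistic map x -> r*x*(1-x).
# Nearly identical "causes" (starting points) do not produce nearly
# identical "effects" -- the trajectories diverge completely.

def logistic_trajectory(x0, r=3.9, steps=40):
    """Iterate x -> r * x * (1 - x) from x0, returning the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.500)
b = logistic_trajectory(0.501)  # an almost identical starting point

# The same rule, starting points differing by 0.001 -- yet somewhere
# along the way the two trajectories end up far apart.
print(max(abs(x - y) for x, y in zip(a, b)) > 0.1)  # True: they have diverged
```

The point is not the mathematics but the moral: in a non-linear system, knowing the rule and the starting state is not enough to predict the outcome, which is exactly the situation a trainer faces with a workplace group.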
 
We also know that when we look at problems in the workplace we can find it very difficult to describe exactly what the problem is: everyone will describe it in different ways, depending on their own view of what is happening. Because explanations of the problem are different, definitions of success will be different. Anything we do to change things within a problem situation changes the conditions, so the nature of the problem changes. We also find that the problems we are exploring are actually to some degree caused by other problems. So, as we saw before, because everything is connected we have a network of complex adaptive systems, all constantly evolving to generate situations which we cannot possibly predict in advance.

Given this complete mess, how do we start to make things better? The key is to stop thinking in terms of finding ‘solutions’. Complex, wicked problems[1] never come to an end; they just keep changing, and all we can do is try to make things better: we will never be able to ‘solve’ them. This has big implications for training design.

Firstly, training programmes are usually based around sets of static performance objectives or learning outcomes, defined at a specific point in time. But by the time a programme has been designed the problem has changed, so the objectives may have become irrelevant. We should therefore think more about trends: is the situation developing in a desirable direction? This also means that instead of an evaluation carried out some time after the event we need to do more ongoing monitoring. That helps to get around the problem of deciding when to carry out an evaluation, which is always difficult: too soon, and any initial enthusiasm colours the results; too late, and causality becomes far too indistinct for the evaluation to mean anything.
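The shift from a one-off evaluation to monitoring trends can be sketched in a few lines of code. This is a minimal illustration (the function and the monthly figures are invented for the example): instead of a single post-course score, we fit a simple least-squares slope to a stream of monitoring data and ask whether things are moving in a desirable direction.

```python
# Monitoring a trend rather than measuring an end state: fit a simple
# least-squares slope to ongoing monitoring data and ask whether it is
# heading the right way.

def trend(scores):
    """Least-squares slope of scores taken at equally spaced intervals."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Invented monthly error-rate figures gathered after a training intervention:
monthly_errors = [14, 13, 15, 11, 10, 9, 9, 7]
print(trend(monthly_errors) < 0)  # True: error rates are trending downwards
```

A negative slope on an error rate (or a positive one on a quality measure) answers the question “is the situation developing in a desirable direction?” without having to pick one arbitrary moment at which to pass final judgement.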

Secondly, objectives are usually expressed in the form “The learner will be able to: …” This focuses training on individuals and overlooks the fact that everyone works within a complex adaptive system. It means that the content of training tends to focus on individual knowledge and skills rather than collaborative or cooperative activities. Training initiatives should be more team-oriented, involving staff and supervisors, along with other teams with which they interact. Objectives should focus on positive change rather than on achieving an end state.

Thirdly, the constantly changing landscape within complex adaptive systems means that top-down didactic training can never hope to give people the knowledge and skill they need to be able to deal with all the evolving operational variety they face. So performance improvement strategies must create structures and space where people can exchange information and learn from each other.

So training can be a sort of solution, as long as we do not see it as providing a definitive result. Solution-oriented thinking also tends to create responses which are structured as projects, i.e., with a beginning, middle and an end. If we escape from that particular thinking box, we can conceive more easily of learning interventions which are ongoing strategies, constantly adapting and being adapted to help people continue to move in a desired direction.


 
   
 
[1] The term ‘wicked problem’ was coined in Rittel, H.W.J. & Webber, M.M., 1973. Dilemmas in a General Theory of Planning. Policy Sciences, 4(2), pp. 155–169.
 
 
 
 



How cybernetics can help improve the quality of training programmes

Published in Training design
 
This posting was originally published as a LinkedIn article (https://bit.ly/2kFr8W1)

Cybernetics. A word which evokes thoughts of robots, of Dr Who’s cybermen, or of ‘transhumanists’, people who are looking to improve human performance by integrating technology into their bodies. But that is only one aspect of cybernetics, and one which does not readily suggest how cybernetics can contribute to learning.

The Oxford English Dictionary defines cybernetics as “the science of communications and automatic control systems in both machines and living things”. Thinking about the ‘living things’ part of this definition, cybernetics therefore looks at how organisms interact with their environment, exchanging information and materials in feedback mechanisms which, if functioning correctly, ensure the organism’s survival.

Of course, organisations are organisms, being composed of living human beings. Cybernetic principles have therefore been used to analyse organisational behaviour, and one thread of thinking, sometimes called organisational cybernetics, is of interest to us here.
 
Within this perspective, each individual worker interacts with their operational environment, exchanging information and other resources. By extrapolation, so does the overall organisation (of course, in a one-person organisation, the individual is the organisation!), and it is therefore reasonable to assume that we can apply principles of cybernetics to how individuals and their parent organisations operate. Each person’s ‘environment’ includes both external entities (clients, suppliers and so on) and internal entities (colleagues, other departments and so on). We therefore have potentially a complex set of interacting feedback loops, which can make it somewhat difficult to understand what is happening.

However, there exists a very powerful tool called the Viable System Model (or VSM) which can help us to make sense of things. VSM is based around the interrelationship of five distinct but interconnected systems of information and resource exchange. Within the VSM literature, these are typically shown in a diagram like the one below.

[Diagram: the Viable System Model, showing Systems 1 to 5 and their environment]

The key concept in VSM is viability: being able to survive successfully in the face of whatever variety exists in the environment. Essentially, the organisation must be able to show enough variety in its own behaviour to match the variety it has to deal with. To explain this with an example, if we are looking at a healthcare organisation working with an environment of people who are old, young or have disabilities, its internal organisation must be structured so that it can look after people who are old, young or who have a disability. This may seem blindingly obvious, but it is all too common for training programmes to be limited in scope and inflexible in their messages, making it harder for people to learn how to work flexibly and function as a viable system. It is also very important to remember that environments are constantly changing, so each worker’s capacity for dealing with variety (and the training required to enable this) must also keep changing.
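The variety-matching idea can be expressed very simply. This sketch is a deliberately simplified coverage test of my own (the categories follow the healthcare example above but are otherwise invented): the organisation is only viable if every kind of demand its environment can present has a matching response in its repertoire.

```python
# A simplified coverage test for requisite variety: every category of
# environmental demand must be matched by a response the organisation
# (and its trained staff) can actually make.

environment_variety = {"elderly care", "paediatric care", "disability support"}
response_repertoire = {"elderly care", "paediatric care"}

def viable(environment, responses):
    """True if the response repertoire covers every demand category."""
    return environment <= responses  # subset test: no demand left unmatched

print(viable(environment_variety, response_repertoire))  # False: a gap exists

# The uncovered variety is precisely what training needs to add:
missing = environment_variety - response_repertoire
print(missing)  # the demand categories with no matching response
```

The set difference at the end is, in miniature, what a training needs analysis should produce: the variety in the environment that the organisation cannot yet absorb. And because the environment keeps changing, this check has to be rerun, not performed once.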

This VSM diagram showing how an organisation operates looks completely different to the classic organisation chart, structured by function. But it has a major advantage in that it shows how the organisation works (or should work), whereas the organisation chart simply shows a structure and says nothing about interactions or operation. The organisation chart is derived from a hierarchical, bureaucratic mindset, which goes a long way to explaining why people often complain about “working in silos”: if that is how we think about an organisation’s structure, then that is the way we behave.

So briefly, how does this VSM diagram work?
  • The various System 1s are the operational (or implementation) activities, what delivers value to customers or clients, such as sales, procurement, fulfilment and so on. Every individual System 1 must be viable, in that it can respond appropriately to changes in its environment.
  • System 2 is coordination between the operational activities, making sure that, for example, increased sales activity is matched by an increase in procurement of raw materials or other resources.
  • System 3 is the delivery (or control) function, making sure that the different System 1s and System 2 all have the resources that they need. It actually works in two directions, and what is often called System 3* is a monitoring function, where each System 1 and 2 reports back so that System 3 provides what is needed.
  • System 4 takes information from both the internal and external environment and makes sure that the organisation remains in tune with what its customers and clients want, passing this information on to Systems 3 and 5.
  • System 5 sets the policy for the whole organisation, making sure that organisational activity remains in line with its vision and goals and is appropriate for the environment.

Crucially, this structure is recursive, and we should be able to see this structure within each different System 1 throughout the organisation. So we could look at the sales function and break this down into a number of separate System 1s and corresponding Systems 2 to 5. We see then that, for example, at every level of analysis the organisation should be taking appropriate information from its environment and feeding this into what it does.
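The recursion described above can be sketched as a data type. This is my own construction, not a notation from the VSM literature: each viable system contains its own operational units (System 1s), which are themselves full viable systems with their own Systems 2 to 5, repeating the same pattern at every level.

```python
# A structural sketch of VSM recursion: every System 1 is itself a
# complete viable system, all the way down. Field names are illustrative.

from dataclasses import dataclass, field

@dataclass
class ViableSystem:
    name: str
    operations: list["ViableSystem"] = field(default_factory=list)  # System 1s
    coordination: str = "System 2"    # placeholders for the management systems
    control: str = "System 3 / 3*"
    intelligence: str = "System 4"
    policy: str = "System 5"

    def depth(self):
        """Number of recursion levels contained within this viable system."""
        return 1 + max((op.depth() for op in self.operations), default=0)

org = ViableSystem("whole organisation", operations=[
    ViableSystem("sales", operations=[ViableSystem("regional sales team")]),
    ViableSystem("procurement"),
])
print(org.depth())  # 3: organisation -> sales -> regional sales team
```

Walking such a structure level by level mirrors the analytical move in the text: at every recursion we can ask whether that unit is taking appropriate information from its environment and feeding it into what it does.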

If we use a VSM approach to look at how training is designed and delivered, we can identify principles which will make sure that training promotes viability.

Firstly, there is a major distinction between System 1, the operational activities, and the other four systems, which broadly represent what we would call ‘management’. Training for Systems 2 to 5 is often subsumed in what we call ‘management development’, so it is interesting to think about how traditional management development activities deal with cybernetically desirable activities. A key observation here is that traditional approaches to management development are often based around the hierarchical, bureaucratic model of organisations, with an emphasis on up and down relationships: for example, leadership, delegation, accountability and so on. Less importance may be attached to coordination and collaboration, monitoring or environmental awareness.

Operational training (System 1) needs to make sure that people can deal with all of the variety that they experience in everyday working life (being viable). This means that training should be learner-centred, practical and problem-based. This is well known empirically, being a core part of andragogical, adult learning principles, but here we can see how it follows from cybernetic first principles.

Training designers should also recognise what relationships there are between different primary functions and make sure that these are incorporated into the training (System 2). Training programmes which focus on strengthening a System 1 without taking into account its dynamic relationship with other operational systems can cause more problems than they solve. This may mean that the scope of training needs to be widened, with related training or information being provided for people in other functions. Existing protocols and standard operating procedures may need to be revised to reflect different patterns in primary functions. There is a particular role here for informal learning, with people being encouraged to exchange information within and across teams so that coordination improves. ‘Training’ often ignores the need to promote informal learning, but it is crucial if the overall organisation is to be viable.

Training itself is an example of a System 3 activity (provision of necessary knowledge and skills). However, the VSM shows that what this provision should be needs to be based on information provided by System 3* (internal) and System 4 (external), which is, of course, what a training needs analysis (TNA) should do. Of course, this process may show that there are weaknesses in other System 3 or 3* activities. If there are System 3* weaknesses, reporting systems may need to be strengthened (while not becoming disproportionate or onerous): this would subsequently form an important source of information for training evaluations.

Training should make sure that people have the skills and tools needed to gather information from relevant parts of their environment, about what the environment needs and how it is changing (System 4). They should also be able to use this information appropriately. Training management should also be constantly monitoring the environment to make sure that training remains appropriate to what will be constantly changing patterns of variety: TNAs should be ongoing.

Finally, training should always be related to the broader aims of the organisation or department (System 5). This means that people working in a System 5 role should make sure that TNAs are taking place and that what they recommend is consistent with strengthening overall viability.

Too often, training carried out in organisations is not planned from a systemic perspective. Training needs analyses may be perfunctory, with little thought given to the complex web of decisions and interactions which contribute to effective performance. Training programmes are often reductionist, focusing on one small area of knowledge, skills and attitudes which seems appropriate to that particular silo of activity. Thinking about training from a cybernetic perspective can help to avoid this, making sure that the training being delivered is closely integrated with all aspects of organisational activity so that the organisation continues to be viable in relation to its environment.



On unicorns and training needs analyses

Published in Training design
 
(This post was originally published as a LinkedIn article)

For centuries people were fascinated by the thought of finding a unicorn. Unicorns were credited with many qualities: they could make poisoned water drinkable and they could heal sickness. Unfortunately, they do not exist.

I think the same about training needs analyses. They are also wondrous things: they identify effective learning interventions, they explain how new knowledge and skills will overcome great obstacles, they provide a baseline for post-delivery evaluations, and so on. The problem is that, like unicorns, they do not seem to exist.

Well, maybe that’s being a bit dramatic. Some training needs analyses do take place here and there. But, really, not that many. On what evidence do I make this assertion?

First, anecdotal. People I talk to in the training world say, almost without exception, that proper training needs analyses just do not take place very often. I also asked the Learning, Education and Training Professionals group on LinkedIn whether training needs analyses happen, and most replies said that they were the exception rather than the norm.

Secondly, more empirical. Whenever I start a training evaluation project, my first question is about the needs analysis on which the training is based: to date, I have never seen a training needs analysis report. Instead, people explain that it was based on a decision made by someone a while back (who has often left the company) or that the decision-making involved was not documented.

Thirdly, and more rigorously, a 2003 meta-study of training design activities by Arthur et al. noted that, based on the information they were able to review, fewer than 10% of training projects were based on a thorough training needs analysis.

So given their wondrous qualities, why do we see so few proper analyses of workplace situations, leading logically and with justification to effective training interventions? There are a number of possible reasons.

A thorough training needs analysis will be a fairly complex undertaking, requiring the analyst to explore and develop an understanding of the various wicked problems which are contributing to a perceived performance problem. This will therefore take time, and training managers are often under considerable pressure to deliver quickly.

 
Training professionals may simply not have the breadth of skills needed to be able to understand reasons for performance problems. These problems may be due to poor organisational design, group psychology issues in the workplace, ergonomic design weaknesses or just a lack of understanding of the environmental variety that staff have to deal with when they are carrying out their jobs.
 
It may even be unclear as to what the actual problem is. One person may say that it is due to inadequate knowledge, another person due to weaknesses in operating procedures and so on.

 
There is a significant lack of integrated and comprehensive tools for training needs analysis. Ideally such an analysis should take place at three separate levels: organisational, to understand the wider picture affecting performance; operational, to look at what is happening at the functional level; and personal, to understand who the affected staff are and how training can best be integrated into their working lives (Goldstein and Ford, 2001). There are, it should be noted, various tools available for helping with these levels of analysis, but they are probably not as widely known or used as much as they should be. Examples include Gilbert’s Behaviour Engineering Model and Mager and Pipe’s performance flowchart for the functional level of analysis, and Holton’s Learning Transfer System Inventory for the personal level.

 
Finally, there is the assumption that, whatever the problem is, training will provide a solution. Paradoxically, this is seen to be a valid assumption at the inception of a project, leading to a decision to implement training without thoroughly analysing the problem, but possibly not valid after delivery, when there is a demand for an evaluation and perhaps an assessment of return on investment.
 
Detailed guidance on how to carry out a thorough training needs analysis is beyond the scope of a short article like this, but I have two suggestions.

 
Firstly, involve the people who will receive any training in the analysis process. One of the less well-implemented aspects of Malcolm Knowles’ work on andragogy (1977) is that the planning of any adult learning activity should be done in participation with the learner. The learner knows what operational variety they have to deal with and what gets in the way of satisfactory performance. They will also understand how informal learning channels operate, so that formal training can be designed to integrate with these. All too often the structure and content of training is decided by senior managers who feel that they know what people must know, leading to training which is content-based and trainer-centred.

Secondly, systems thinking provides a set of tools which offer an integrated approach to training needs analysis. Techniques such as the Viable System Model and Soft Systems Methodology make it possible to identify and engage with all the stakeholders in a performance problem, in a way which can lead to a more holistic solution.

To carry out a training needs analysis properly, the training professional has to overcome quite a few hurdles. But if it is done properly, it can have many benefits. There is greater confidence that any training solution will have positive benefits, because it will have been designed with the right content and delivered using appropriate modalities. There should be a better return on investment (if that can actually be measured) because the right tool is being used for the right job. And other, non-training interventions should have been identified which remove obstacles to improved performance and support the successful implementation of new knowledge and skills.

So let’s put an end to the mythical nature of training needs analyses, and try to make them a reality of organisational life.

 
References
 
Arthur Jr, W. et al., 2003. Effectiveness of Training In Organizations: A Meta-Analysis of Design and Evaluation Features. Journal of Applied Psychology, 88(2), p.242.
 
Goldstein, I.L. & Ford, J.K., 2001. Training in Organizations (4th Edition). Cengage Learning.
 
Knowles, M., 1977. Adult Learning Processes: Pedagogy and Andragogy. Religious Education, 72(2), pp.202–211.




Drawing boundaries around training content

Published in Training design
The only face-to-face training I do these days is in so-called 'train the trainer' workshops, where I look to improve participants' skills in designing and delivering training.

The people in these workshops are always experts in their own particular subjects (not training), and this expertise can range from security in high-risk environments to knowledge about drought-resistant seed varieties. What they have in common, as far as training is concerned, is that when asked to deliver a training programme they usually try to transfer all of their knowledge to the learner.

In my courses I always cover some of the key theories about cognition: Kolb, Vygotsky and, of course, Malcolm Knowles. Knowles introduced the concept of andragogy, adult learning, to Western thinking. His initial 1977 paper on adult learning compared pedagogical and andragogical approaches, in the process outlining a number of key principles to follow in adult learning. The one which is often of most interest to my participants is about planning: Knowles says that in a pedagogical approach the planning is “primarily by teacher”, whereas in an andragogical approach planning is participative.

This always comes as something of a shock to subject matter experts. How can people who know nothing about a subject participate in planning? My answer is to draw people’s attention to the idea of learner-centred outcomes: what do we want people to be able to do at the end of a training session? We then spend some time talking about Bloom’s taxonomy, observable actions, three-part training objectives and so on.

And this always seems to be a real light-bulb moment in the course. Having followed a learner-centred paradigm in my practice for quite a few decades, I tend to forget how revolutionary an idea this can be.

But it is very powerful. If we think about what the learner's outcomes need to be, then we can draw boundaries around what knowledge and skills need to be transferred. Rather than everything.



Learning styles: serious tool or parlour game?

Published in Training design
I have recently been involved in looking at several different training of trainers events. Although the events have been for different target groups and in different sectors, in all cases some time in each course was spent on analysing (and subsequently referring back to) learning styles of different types, in particular those based around Kolb's experiential learning cycle.

Now, I've often done similar activities in my own training, and know that participants seem to find this kind of self-analysis quite fun and interesting ...  but is it just a bit of fun or is it really of significance?

I've started to ask this question more since I have been looking at training and learning from a systems thinking perspective. Every system has to have its own environment with which it has some sort of relationship, and this relationship influences the functioning of the system in some way. What does this mean if 'learning' is the system?

Thinking particularly about Kolb, his work comes from a humanistic psychology perspective: he considers how a whole being behaves, not how that behaviour has come to be. This contrasts with more psychoanalytical approaches, which seek to understand how a person's history (i.e. their environment) has affected their behaviour. His cycle of experiential learning therefore describes how a free-floating individual makes sense of new information, which is fine so long as, when using the idea, we consider how what is going on around the individual might influence how it works.

However, quite a few writers have suggested that when we take Kolb's ideas further, saying that individuals have a preference for one or two of the stages in the learning cycle, the humanistic approach creates a problem by ignoring the effects of the environment. Learning styles questionnaires ask people to reflect on how they learn in different situations, and then summarise the responses as one or two 'preferred' learning styles. The contention is that this analysis is only valid for the situations considered in the questionnaire and at that moment in time: in different situations, or at another time, the individual might respond quite differently. This means there may be no such thing as a person's permanently preferred learning style, only a preference at a given moment. Which makes the questionnaire a bit pointless ...

For my own part, I know that I approach new learning situations differently depending on various factors: what the situation is, how familiar I am with it as a general class, how much time I have, how well I need to be able to respond, and so on.

So I'm left feeling that learning styles might be a bit of fun to talk about, but that pinning "Activist" or "Reflector" badges on people might be at best a bit of a waste of time, or worse, misleading and perpetuating one of training's great myths.



Bored doctors or well-trained doctors - your choice

Published in Training design
A few days ago a friend of mine who works as a doctor in a local hospital called by for a cup of tea. She was just on her way home from a training course where she had been all day. Always interested in other people's experiences of training, I asked her how it had been. "Really boring", she said, "Just listening to somebody reading off slides all day."

I always find comments like this somewhat depressing. How is it that organisations these days can still think it is cost-effective to take highly paid people, sit them in a room and make them listen to an expert talking at them all day long? The direct and opportunity costs of such an event must have been considerable, with the overall effect being to bore the participants.

My experience in quite a few organisations that rely on this type of training delivery is that they still think the information transfer model (or what I call 'information dumping') is the best or only way to communicate content. I guess a major reason why this happens is that people are familiar with lectures from their university days, where a person with all the knowledge attempts to transfer it to people with very little knowledge: like pouring water into an empty vessel, as the analogy goes.

The difference with experienced professionals is that they already know an awful lot, and really need to be able to integrate new information with what they already have, refining their existing mental models.

Probably many of the people who are called on to deliver training of this sort will never have studied ideas about cognition, so concepts such as Kolb's learning cycle will be unfamiliar to them. The diagram below is a representation of Kolb's theory, and its familiarity to learning professionals means that it needs no explanation.

[Diagram: Kolb's experiential learning cycle]

However, from a systems perspective, one thing I have noticed about how this is normally presented is that it is portrayed as an individual activity: each of the four stages is described as something which goes on inside one person's head. But that is not what really happens, because if it were, learning would soon stop: each of us would eventually run out of the energy and inspiration needed to reflect and develop new conceptualisations.

Instead, what happens is that as we work round and round the learning cycle we draw in ideas from the world around us, in particular from other people. This new energy coming into our learning cycle is what enables reflection, so that each iteration of the cycle improves our mental modelling. We can therefore represent what is happening as a network of learning cycles, as in this diagram.

[Diagram: a network of interacting learning cycles]

This is essentially the idea of the social construction of understanding, an idea attributed to the Russian psychologist Vygotsky. He reasoned that children learn by conversation and negotiation, which leads to a shared understanding of how to behave and how to do things.
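The networked learning cycles can be caricatured in code. This toy model is entirely my own construction, not Kolb's or Vygotsky's: each learner steps round the four stages, and at the reflection stage draws in ideas from the other learners, so understanding spreads through the group rather than staying inside one head.

```python
# A toy model of networked learning cycles: ideas held by individuals
# spread through the group at the reflection stage of each cycle.

KOLB_STAGES = ["experience", "reflect", "conceptualise", "experiment"]

def run_cycles(learners, iterations):
    """learners: dict mapping name -> set of ideas. Mutated in place."""
    for _ in range(iterations):
        for stage in KOLB_STAGES:
            if stage == "reflect":
                # Reflection is social: everyone draws on the pooled ideas.
                pooled = set().union(*learners.values())
                for ideas in learners.values():
                    ideas |= pooled
    return learners

group = {"ann": {"idea-a"}, "bo": {"idea-b"}, "cy": {"idea-c"}}
run_cycles(group, iterations=1)
print(group["ann"])  # ann now holds all three ideas, not just her own
```

Remove the pooling step and each learner is stuck with whatever they started with, which is the point made above: an individual cycling alone has no source of new energy or ideas.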

This, I think, is why trainers who can let go of the control of the PowerPoint presentation and let people talk will usually get better results. I'm sure my friend would have had a much better learning experience had she been able to engage with the trainer and other participants in discussions where she could talk about things she did not understand, listen to other explanations and so end up with a much better understanding of the subject.

I would certainly feel more comfortable lying on a hospital bed feeling that the medical staff looking after me have had the chance to really get to understand their subject, rather than having spent days being bored stiff by PowerPoint presentations.



Copyright 2015. All rights reserved.