This interview explores Dr. Ruth Clark’s views on four cognitive training methods for building and accelerating expertise in problem-solving skills.
Back in October 2014, I had the pleasure of interviewing Dr. Ruth Clark to explore her views on training strategies for building, and potentially accelerating, expertise in problem-solving skills. She emphasized worked examples and scenario-based e-learning as key techniques for building expertise. Here is an excerpt from the conversation.
About Dr. Ruth Clark: A recognized specialist in instructional design and technical training, Dr. Clark is the author of Scenario-based E-Learning and the newly released second edition of Evidence-based Training, two books on evidence-based instructional strategies to accelerate expertise. She is also the author of Building Expertise, a thorough account of research-based cognitive training strategies to improve performance. For over 25 years she has helped workforce learning practitioners apply practice guidelines grounded in valid research. Dr. Clark offers consulting services, including reviews of e-learning products and goals, course evaluations, and customized design-development consultation. She currently offers virtual courses through ATD and the e-Learning Guild. She is a past president of the International Society for Performance Improvement (ISPI), the author of seven books and numerous articles, and the 2006 recipient of ISPI’s Thomas F. Gilbert Distinguished Professional Achievement Award.
Your book ‘Scenario-based e-Learning’ discusses building expertise through problem-centered training. From my research, it appears that in very complex settings, meta-cognitive strategies (that is, how a person approaches a problem) play a very big role in how, and how quickly, someone gains expertise in problem solving. How do you see it?
Dr. Clark: In my book ‘Scenario-based e-Learning’ I gave an example of automotive troubleshooting. In that example, the instruction evaluates not only whether you identify the correct failure and repair, but also your problem-solving process. The program tracks all the tests the learner conducts during troubleshooting, and at the end the learner can compare their troubleshooting process to that of an expert. I think that, in a way, is getting at the meta-cognitive side of problem solving.
One point to mention is that a meta-cognitive strategy will need to be fairly domain-specific. I think the hardest part for designers is the analysis. To develop expertise, learners need to engage with a number of cases, analyze them, and go to the deeper level of examining the meta-cognitive strategy. This can be encouraged by embedding a reflection activity in the lesson.
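The track-and-compare mechanism described above can be sketched in a few lines. This is only an illustrative sketch, not Dr. Clark’s actual software: the function name, the session data, and the troubleshooting steps are all hypothetical.

```python
def compare_to_expert(learner_steps, expert_steps):
    """Compare a learner's logged troubleshooting tests with an expert's.

    Returns the tests the learner ran that the expert skipped, the
    expert's tests the learner missed, and the step counts.
    """
    learner, expert = set(learner_steps), set(expert_steps)
    return {
        "extra_tests": sorted(learner - expert),
        "missed_tests": sorted(expert - learner),
        "step_count": {"learner": len(learner_steps),
                       "expert": len(expert_steps)},
    }

# Hypothetical automotive-troubleshooting session
expert = ["check battery", "test alternator", "inspect wiring"]
learner = ["check battery", "replace fuse", "inspect wiring"]

report = compare_to_expert(learner, expert)
print(report["missed_tests"])  # tests the expert ran that the learner skipped
```

A real scenario-based e-learning program would also record the order and timing of tests, but even this simple set comparison gives the learner a concrete reflection prompt at the end of the exercise.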
Let me share my observations from a problem-solving skills training program at a large technology corporation. The program was designed to give participants the skills and knowledge needed to solve problems. An interesting pattern emerged: there was a wide variation in participants’ performance and in the speed with which they became proficient problem solvers once they returned to their jobs. One participant became a very effective problem solver who could solve problems quickly, while another, despite receiving the same level of training, took three to four months to come up to speed, understand the whole context, and become a successful problem solver. The participants had the same content, the same simulated environment and problems, and the same instruction. This large variation in speed to proficiency is both a puzzle and a challenge for training designers.
The missing piece I found in my research is that the “how” of problem solving makes all the difference. How a participant attacks a problem is driven by meta-cognitive strategies, and these play a decisive role in making someone an effective problem solver or troubleshooter. Although the metacognitive strategies associated with problem solving should be identified during task analysis, they are often, if not always, overlooked in the analysis and design phases because training design tends to be too task-focused.
The only way is to get experts to talk about their approaches (beyond documented guidelines) and how they exercise judgment. This has its own challenge, however. Once experts reach automaticity, they often find it difficult to explain their approaches or the rationale for why they did something a certain way. They can generally ‘make up’ a rationale after the fact, but the real challenge is to uncover how they applied their inherent meta-cognitive strategies in the first place to solve the problem. The general impression is that it is very difficult even to measure how experts solve problems in terms of meta-cognitive strategy.
Although metacognitive strategies appear to accelerate expertise in complex problem solving, I have not found many answers so far.
Which cognitive or training strategies would you suggest to build and accelerate expertise, especially in problem solving and other complex domains?
Dr. Clark: Perhaps provide resources that guide learners on how to do that. For example, in troubleshooting, depending on the nature of the issue, you could provide a flow chart of guidelines, which you might call ‘meta-cognitive guidelines’, on how to go about solving the issue.
‘Worked examples’ offer another approach, and there is considerable research behind them. With a worked example, you can train somebody to solve a problem. First, you would display the expert going through the troubleshooting or problem-solving process (using a thought bubble or a voice-over), which shows the thinking process as well. Then you could add a checklist of the main guidelines the expert followed, as a form of performance support.
There are two approaches to building problem-solving skills: directive and inductive.
I think a directive approach is more efficient and better for more novice learners, who may experience cognitive overload in a purely inductive environment. In contrast, inductive approaches are generally more effective for learners who already have a certain level of relevant experience.
I saw a very similar example. One organization created a structured record of how its experienced employees solve problems: the experts documented their rationale, the actions they took, and why they took them. In the training program, new employees are asked to solve a similar problem, but they are not given the rationale. As they work through the problem, a trainer asks them to explain the rationale for each action, probing what triggered them to think in that direction, why they eliminated certain possibilities, and why they decided to attack the problem the way they chose.
Part of the methodology is to make participants talk. After the session, the trainer presents the expert’s thought process for solving the same problem. This mechanism is inductive and appears to provide a good way to train metacognitive strategies.
In talking to several training experts, I have found that they often struggle to choose between whole-task and part-task methods for building skills in a given situation. You discuss this strategy in your book ‘Building Expertise’. What is your take, and how would you recommend using it?
Dr. Clark: As you know, van Merrienboer’s 4C/ID model is basically a whole-task approach, with part-task practice embedded as needed while solving a whole task. Let me explain with an example. When I work with a client, we break out the tasks and then ask which of them are basically procedural tasks that workers perform the same way each time, and which require judgment and critical thinking. For the latter type of task, we would want to use more of a whole-task approach; for the procedural tasks, we might use part-task training. In this approach, van Merrienboer uses the whole task as the main driver, with the procedures embedded into it.
One caution, though: sometimes a whole-task approach may lead to cognitive overload. If your learners have some background experience, it might work fine; but if you have more novice learners, I recommend teaching some of those procedures separately, using a different strategy or more traditional directive techniques. Then, when you get to the problem-solving element, you use the whole-task approach, and learners already have the procedural knowledge they can apply if needed.
At a large corporation delivering technical training to repair and troubleshooting engineers, I noticed a similar approach. The instructional designers segregated the tasks into a few tiers based on their nature. Some tiers contained straightforward procedural tasks, very well documented, which everyone is expected to perform the same way regardless of the situation or problem.
Another tier contained what they called problem-solving tasks. This is where they grouped the ‘fuzzy’ tasks that involve a lot of critical thinking, problem solving, and judgment. A participant’s judgment may vary from situation to situation, and from one individual to another. That makes building these ‘fuzzy’ problem-solving skills difficult, and teaching them in isolation is not possible.
This is made feasible by weaving both the part-tasks (knowledge and skills, mostly procedural) and the judgmental/critical-thinking problem-solving tasks into a case-scenario backbone (the whole task), and then using this ‘big picture’ case to build judgment and critical-thinking skills.
The key is how the procedural part-tasks are connected to the ‘big picture’ whole task, the case scenario. The trick is that knowledge and skills are given to participants piecemeal, at the right point in time. Once participants have built sufficient confidence in using the part skills and knowledge, the training flow switches back to big-picture mode. If a part-task is needed during the flow, it is delivered just-in-time. At the lower level, the part-tasks build the skills required for a particular part of the problem; afterwards, participants are taken back to big-picture mode. This helps build situational judgment, conditional critical thinking, and dynamic decision making.
You mentioned the challenge of assessment. What is your perspective on it?
Dr. Clark: We can use different models to develop people. I think the real challenge is how to actually evaluate whether or not those competencies have been built, especially the meta-cognitive ones. I think that is very hard to do.
For example, in the medical domain you typically find several levels of assessment. There might be multiple-choice exams or something of that sort to assess understanding and application of a concept. Then there is a structured clinical exam, where somebody plays the role of the patient, usually with a professional observing and asking probing questions along the way to be sure the learner understands the rationale; this is a little more fine-grained. Then, of course, there is the actual clinical test, where the patients are real and the learners are still being observed.
In the corporate world, organizations are probably not going to take the time for such elaborate assessment. In a research setting, however, one could use indicators to monitor time-to-proficiency, such as how long it took learners to solve the problem and whether they got the solution right. When they finish solving a problem, they could be asked to give the reasons or rationale for particular steps, to test overall problem solving (including the metacognitive side).
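The indicators mentioned above could be summarized as in the following sketch. This is my own illustrative assumption, not a model from the interview: the data structure, field names, and sample values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Attempt:
    minutes_taken: float
    solved_correctly: bool
    rationale_given: bool  # did the learner explain why they took each step?

def proficiency_indicators(attempts):
    """Summarize a learner's problem-solving attempts into the three
    indicators discussed: solve rate, speed when solved, and how often
    the learner could articulate a rationale."""
    solved = [a for a in attempts if a.solved_correctly]
    return {
        "solve_rate": len(solved) / len(attempts),
        "avg_minutes_when_solved": (
            sum(a.minutes_taken for a in solved) / len(solved) if solved else None
        ),
        "rationale_rate": sum(a.rationale_given for a in attempts) / len(attempts),
    }

# Hypothetical record of one learner across three practice problems
attempts = [
    Attempt(45.0, True, True),
    Attempt(90.0, False, False),
    Attempt(30.0, True, True),
]
print(proficiency_indicators(attempts))
```

Tracked over successive cohorts or practice cases, such indicators would give a rough, research-setting proxy for time-to-proficiency, though they still say little about the meta-cognitive process itself.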
As of now, I have not come across a very sound model of assessment; people seem to assess in several different ways. For example, in one troubleshooting scenario, at the end of the training course the trainers simulate a bug in the machine and have the participants solve it unsupported. That is the assessment. This is a big-picture kind of assessment: it does not test the training objectives but the end result, and it is more business-focused. In that situation, the end business result is that participants are able to fix the problem; if a participant cannot, he really did not gain anything from the training. The trainers then conduct a post-mortem to see which part of the problem solving the participant struggled with.
However, this assessment mode does not really answer how to test metacognitive expertise so as to build it faster. My view is that successful assessment of the meta-cognitive side requires an intervention: the trainer or assessor has to be part of the assessment when assessing participants on these judgment-type skills, even while focusing on the bigger picture.
What I think is that the effectiveness of assessment ties back to how the skills were analyzed in the first place when designing the training course. The assessment has to follow the analysis.
Acknowledgment: Thoughts expressed in this blog are credited to Dr. Ruth Clark (where indicated). I greatly appreciate Dr. Clark spending time with me to provide this insightful information on training strategies to build and accelerate expertise in the problem-solving domain.
Attri, RK (2018), ‘4 Cognitive Training Strategies To Accelerate Expertise of Complex Skills: Revelations by a Cognitive Scientist’, [Blog post], Speed To Proficiency Research: S2PRo©, Available online at <https://www.speedtoproficiency.com/blog/cognitive-training-methods-build-and-accelerate-expertise/>.
Image Credits: Pixabay, CC0 attribution