Designing learning, or, what would Elmore do?

February 7th, 2012

Okay, I confess.  Elmore Leonard does not have any advice for better training.  Not that I know of.  But a book review I read yesterday reminded me (as if that were necessary) how much I’ve enjoyed his writing.

I also enjoy the list of rules for writing he says he’s picked up along the way. An example:

5. Keep your exclamation points under control.

You are allowed no more than two or three per 100,000 words of prose.

One thing he strives for, he says, is “to remain invisible when I’m writing a book.”  So the rules are “to help me show rather than tell what’s taking place in a story.”

That’s a pretty decent starting point to take if you’re creating something that’s meant to help other people learn.  So I thought I’d see if you could adapt his rules to designing for learning in the workplace.  Or to supporting learning.  Or at least to keeping away from CEUs and the LMS.

1.  Never open with “how to take this course.”

Angry Birds is a software application in which you launch suicidal birds via an imaginary slingshot to retaliate against green pigs who’ve stolen eggs and are hiding in improbable shelters.  12 million people have purchased Angry Birds in the past two years, none of them because of the “how to play Angry Birds” module.

Honestly, there are only two groups of people who look for “how to take this course.”  In the first group are those who designed the course, along with the lowest-ranking members of the client team.  In the second group are folks who still have their Hall Monitor badge from junior high.

2.  Never begin with an overview.

I can’t do any better than Elmore Leonard on this one:

[Prologues] can be annoying, especially a prologue following an introduction that comes after a foreword.  But these are ordinarily found in nonfiction. A prologue in a novel is backstory, and you can drop it in anywhere you want.

Cathy Moore worked with Kinection to create the military decision-making scenario, Connect with Haji Kamal. If you haven’t seen it, click the link. It takes about 10 minutes.  And notice: the overview on that first page is 17 words long.

3.  Never use “we” when you mean “you.”

Maybe I was a grunt for too long.  Maybe I’m just contrary.  But anytime I run across some elearning that’s yapping about “now we’re going to see,” I think, “who’s this we?”

“We” is okay when you’re speaking in general about a group to which you and your intended audience both belong.  But especially in virtual mode, it wears out quickly.

4.  Don’t act like you’re the marketing department. Even if you are.

This is a first cousin to the “we” business.  Once at Amtrak, a group of ticket clerks was learning a marketing-oriented approach to questions about our service.  When a customer asks, “What time do you have trains to Chicago?” the proactive response is to fill in the formula, “We have ___ convenient departures at (time), (time), and (time).”

For several stations in my area, there was one train a day in each direction.  From Detroit to Chicago at the time, there were two.

It’s not only bombastic to talk like this, it also confuses a feature with a benefit, a distinction any good salesperson would explain if marketing just asked.  It doesn’t matter if you have sixteen departures a day if I, the customer, don’t find any of them convenient.

5.  Keep your ENTHUSIASM under control!!

I suppose there’s less of this around than there used to be.  I’m a staunch believer in the value of feedback.  I believe just as firmly that feedback needs to be appropriate to the context.  Shouting “That’s great!” for trivial performance mainly makes people feel like they’ve time-traveled back to an ineffective third grade.

6.  Don’t say “obviously.”

The thing about the obvious is: people recognize it.  That’s why so few of us are surprised when we press the button for the fifth floor and the elevator eventually stops there.  Except in a humorous tone (and remember, tragedy’s easy; comedy’s hard), words like “obviously” and “clearly” can sound maddeningly condescending.

7.  Use technobabble sparingly.

When does tech talk become babble?  When it doesn’t pertain to the people you’re talking with. If I’m discussing interface design with people who work in design-related areas, then “affordances” probably makes sense to them.  But, for example, if in a post here on my Whiteboard I say that Finnish has features of both fusional and agglutinative languages, I can think of perhaps one frequent reader who has any idea what that means.  Accurate as those terms might be for linguists, they’re a dead loss for a general audience.

8. Avoid detailed descriptions of things that don’t matter to people doing the work.

This goes with the backstory remarks above, but I’m also thinking of any number of computer-user training sessions I’ve seen. One GE executive told me that the typical Fortune 100 company has more than 50 mainframe-based computer systems, most of which don’t talk well to each other.

What does that have to do with training people to use them?

The typical worker could not possibly care less whether it’s a mainframe, whether it uses Linux, who built it, where it stores data. If he thinks about cloud computing at all (unlikely), he suspects the phrase is mostly puffery, the IT equivalent to “available at fine stores everywhere.”

The introductory course at a federal agency dealing with pensions was stuffed to stupefaction with that sort of data-processing narcissism. What the participants in the course needed to know was: What’s my job, and how do I do it?

The answer is never “the QED Compounder pre-sorts input from the Hefting database in order to facilitate the Rigwelting process.”  Even if these things are technically true (and who could tell?), they’re meaningless.

Not to say that a quick summary of the process is worthless.  It simply has to make sense:

“The Intake Group reviews personnel records from a new pension plan and makes sure they can go into our system so we can analyze them. Once the Intake Group finishes, we cross-check the new account in our system to uncover any conflicts with the data as it appeared in the original system.  The team at the Resolution Desk handles the conflicts that can’t be fixed quickly.”

9. Don’t go into great detail describing the wonderfulness of the business, the product, or the CEO.

As Bear Bryant said once about the motivational codswallop so beloved at alumni dinners, “People love to hear that shit. Winning inspires my boys.”

Certainly you want people to recognize the good qualities–what makes the company and its product valuable to the customer (and thus to the shareholder). Since people in structured organizational learning already work for the outfit, they’ve already got plenty of information, and most likely an opinion, about its world-class, paradigm-shifting splendor.

10. Try to leave out the parts people don’t learn.

What don’t people learn?

They don’t learn what’s trivial (except to get through the unavoidable Jeopardy-style quiz).  They don’t learn what doesn’t relate to their job (or to a job they’d like to have). They don’t learn what they don’t get a chance to practice. They don’t learn what they don’t need to learn because they already know where to look it up when they need to.

And they especially don’t learn what they knew before they got there.

If it sounds like “training,” redo it.

Training, learning, performance — these are all variations on a theme.  I believe if you talk too much about the process of how you train, or how you learn, people nod off quickly. This is especially true of the beloved rituals of the Stand-Up Instructor: icebreakers, going-around-the-room introductions, that creative nine-dot puzzle, and your expectations for this course.

I do think it’s good to find out what people want or hope or expect, but really: if this is a workshop on designing job aids, then assuming I could read the sign at the front, I’m not here for Assumptive-Close Selling for the INFP.

Posted by Dave Filed in Basic training
8 Comments »

Behavior and accomplishment, or, what did you do?

March 30th, 2011

Guy Wallace said this on Twitter a while back:

I always wanted the Client to own the analysis & design data rather than me. I don’t convert their words to mine – or to Noun-Verb patterns.

He’s summarized a lot of good ideas about the consulting process.  And his phrase about noun-verb patterns reminded me of  two principles I keep in mind when dealing with performance problems (and with their much-smaller subset, training problems):

  • Behavior you take with you; accomplishment you leave behind.
    (That’s Tom Gilbert talking.)
  • A result is a noun; doing is a verb.

“Doing” is a good example of a fuzzy label, and that’s part of why I’m writing this.  When someone starts talking about what people do at work, it seems to me it’s easy for their focus to shift.

Sometimes when someone’s talking about what people do at work, he might mean the actions they perform, the processes they go through.  That’s how a person works.  It’s the doing part of what they do.  It’s a verb.

If you need to repair the water damage in your basement, then what the contractor does at work–the verbs he carries out–are things like measuring dimensions, testing materials, inspecting damage, examining structures, considering costs,  and calculating square footage.

Alternatively, when someone’s talking about what people do at work, he might mean the things those people produce, what they accomplish–the result of their work.

A result is a thing.  It’s a noun. That’s the case even in so-called knowledge work and in service occupations. From this angle, what the contractor does–what he produces–is an estimate for the job.  Or a list of suggested approaches (on paper, or verbal), or a series of questions for you to ask yourself–a kind of contractor’s initial consultation.

That’s what’s left behind when the contractor leaves.

I think this behavior/accomplishment distinction is crucial when you’re talking about performance on the job.  Companies and organizations are crammed to the institutional rafters with ritualized behavior that continues in the absence of any real accomplishment except that the behavior got done: salespeople are required to make 30 cold calls a day–not because of any company data about the effectiveness of cold calling, but because Veronica, the sales director, had three early successes from cold calls.

Working from the other direction, Umberto, who handles accounting for your department, tells you to charge the new software under “training materials.”  Not that the software is going to be used for training–but your boss has authority for twice the expenditure under that category as he has under “computer resources.”  Changing the accounting codes is such a slow process that not only would the software be out of date before it happened, but Umberto and you would both be retired.  So Umberto’s helped accomplish a result (software deployed), but if caught, you and he will both be sentenced to Purchasing Refresher Training.

When you focus on results, you can work backward through the factors that influenced those results.  Sometimes (most of the time, actually) you’ll find that “lack of skill or knowledge” on the part of the performer is not the major hindrance to accomplishment.  That, of course, means you’ll likely be wasting your time (and the performer’s) if you try to resolve the situation through training.

As Guy Wallace said, you want the client to own the analysis and the design data.  I’ve said before that while I myself try not to use “understand” as a learning objective, I might go along with the client’s use of it, so long as the client and I can agree on what it looks like (what the results are) when someone “understands” how to perform a FEMA elevation certification.

Now, if the client is hell-bent on ignoring any data that doesn’t say “deliver training,” I’m pretty sure I’m not the right person to be working with that client.

Images adapted from this CC-licensed photo by Sebastian Werner / blackwing_de.

 

Posted by Dave Filed in Basic training, Improving performance
No Comments »

Traction or distraction: phones in class

November 10th, 2010

On LinkedIn’s Learning, Education, and Training Professionals group, two months ago, a member kicked off a discussion with this question:

Increasingly, we are finding that people bring their phones, computers and Blackberrys to class expecting that it will be OK to use them. How are you dealing with this issue?

As of this morning, there are 83 contributions to the discussion.  Although I’ve disagreed strongly with some of the opinions and suggestions, I’ve come to see this question as yet another example of a complex problem–in other words, one without a single, correct solution.

Here’s my paraphrase of what several participants said.  To minimize my biases, I chose every 8th comment.  Well, I left out one, which happened to be my own.  (Just coincidence that it fell into the every-eighth sequence.)

  • I display a slide with logistics (breaks, fire exits, etc.) that asks people to turn off phones or at least put them on vibrate.
  • I show a humorous YouTube video and say this is what I did with the last phone that rang during my presentation. I make everyone take out their phone and turn them off in front of everyone.  I include a 20-30 minute break several times a day.
  • Ask the class to set the rules.  You are there to learn.  If people were on vacation instead of training, why would they check email?  They can do that during lunch.
  • Sometimes people are using BlackBerries and other devices to take notes.
  • Lately I don’t even mention phones.  I trust adults to act like adults.  I do like (another person’s) suggestion of asking people to turn them on to integrate outside information.
  • Set your phone to ring 3 minutes into the session.  Pretend to talk with the president of the company, who wants to know if everyone’s turned their phones off. Exception: if you expect the president to call, or if someone’s seriously ill. I also believe people are adults who must make their own decision.
  • There’s no right or wrong answer.  Some teaching strategies are still focused on a society that no longer exists.  Use appropriate technology at the appropriate time.
  • Go with the flow.  I can get irritated if a phone rings, but if the class is good and people are engaged, they’ll take their own responsibility.
  • I like letting the learners decide how to deal with device interruptions.

I don’t do much formal instruction any more, by which I mean acting as the primary source (and predominant voice) in a scheduled learning event.  I’ve done quite a bit of that, but over time found that people seemed to learn best when I talked less and they did more.

Yes, when people are new to a topic, they generally need some grounding and some concepts.  Most of my experience has not been with people new to the organization and the industry, however.  That means they tend to need less “before we begin” than a lot of instructors (and instructional designers) seem to think.  Even for a topic as information-dense as Amtrak’s reservation system, I found that a lean approach (less talking, more doing) suited the goal of having people able to use the system.

The LinkedIn discussion does provide a glimpse at the many ways that people working in this field view cell phones, PDAs (does anyone say PDA any more?), and smartphones.  (Almost none of the comments address computers as such.) I see a kind of clustering around “they’re here to learn (from me),” and a smaller one around “I’m here to help them learn.”

My own phone, like my computer, is as basic a tool as pen and paper.  Yes, I take paper notes, but when I have the choice, I take electronic ones so I can tag, search, re-use, copy, paste–all of which are tougher to do with PowerPoint handouts or handwritten notes.

I don’t want someone else telling me how to capture or retrieve information.   If they say things that I find condescending or just plain silly (“enter the world of civilized people,”  “phones are an interruption to learning”), I’ll get the message–though it may not be the one intended.

CC-licensed images:
Retro phone photo by Robert Bonnin.
Classroom sign photo by Ben+Sam.

Posted by Dave Filed in Basic training, Learning
4 Comments »

Sheep dip and success metrics, or, don’t flock up

August 27th, 2010

Koreen Olbrish of Tandem Learning has a post about using games to assess learning, and she addresses both opportunities and problems.

Games are a natural environment for assessment…in essence, they are assessing your performance just by nature of the game structure itself. Unless, of course, there aren’t clear success metrics and you “win” by collecting more and more meaningless stuff (like Farmville)…but that’s a whole other topic.

So let’s assume there are success metrics built into the game and those metrics align with what your learning objectives are.

Koreen’s main topic is game design, but I want to talk about that last idea:  
the game’s success metrics need to align with your learning objectives.

This sounds like Instructional Design 101, since it is Instructional Design 101.  Even more fundamental — Instructional Design 100, maybe — are these questions:


  • What do you want people to do?
  • Why aren’t they doing that now?
  • How will this make things better?

No, the first question isn’t about instruction at all.  Nor is it about, “How do you want them to act?”

It’s about what you want people to get done.

When you can’t articulate what you want people to accomplish, it hardly matters what interventions you try.  You have no way to measure progress.  Might as well just run them all through whatever you feel like.

Making your goals less fuzzy

“Sheep dip” refers to a kind of chemical bath intended to prevent or combat infestations of parasites.  (Videos of older, plunge style and newer, spray style processing of sheep.)

Farmers dip or spray sheep because… well, I’m no farmer, but here are some guesses:

  • It’s more cost-effective than diagnosing the needs of each sheep.
  • A dip-tank of prevention is better than a barnful of cure.
  • Sheep on their own rarely propose new pest-management processes.


Ultimately, sheep farming has a few key outputs: leather, wool, mutton.  While the sheep play an essential role, I don’t think you can successfully argue that these are accomplishments for the sheep.  So what matters is the on-the-job performance of farm workers.

Speaking of on-the-job, many industries and organizations impose mandatory, formal training.  Even there, the accomplishment shouldn’t be “training completed.”

One client delivered “equal-employment awareness” training annually to every employee.  The original charter was full of “increase awareness” and “understand importance.”  Here’s what that looked like after a lot of “how can I tell they’re more aware?”

  • You can recognize examples of discriminatory behavior on the job.
  • You can state why the behavior is discriminatory.
  • You can describe steps for resolving the discrimination.

That’s not exhaustive (and the legal department would probably say you need to sprinkle “alleged” all over the place), but the three points are a first step toward a success metric that connects the individual and the organization.

Sometimes, it is a training problem

When people in an organization can articulate overall goals, it’s easier for them (as individuals and in groups) to think about how their activities and their results relate to those goals. They’re also likelier to be better problem-solvers, because they won’t corral every problem into a formal-training solution.

Even when a major cause of a performance problem is the lack of skill or knowledge, you benefit from revisiting those Design 100 questions:

  • What are the results you expect when people apply the skills they currently lack?
  • What could interfere with their applying them?
  • How will this approach help them learn and apply the skills?

Slightly more diplomatic language led that EEO-awareness client to decide that knowing the date of the Americans with Disabilities Act didn’t have much impact on deciding whether, in a job interview, you can ask an applicant, “Do you have a handicap?”

I’m no expert on workplace games, but I’m pretty sure I get what Koreen Olbrish is talking about.  It’s the workplace first, then the learning goal, and then the application of good design in pursuit of worthwhile results.


The same is true for any planned effort to support learning at work. You need to focus on what’s important, on how you know it’s important, on why you think training will help.

Then you use that information to guide your decisions about how to help people acquire and apply those skills when it matters.

Mindlessly grinding out courses (instructor-led, elearning, webinars, whatever) isn’t the answer, regardless of how many completion-hours people rack up.

It’s just…well, you know.

 

CC-licensed images:
Bigg’s Sheep Dip (Glenovis) adapted from this photo by  Riv / Martyn.
Bigg’s Dips (yellow/black) by Maurice Michael.
Quibell’s Sheep Dips by Peter Ashton aka peamasher.

Posted by Dave Filed in Basic training, On the job
3 Comments »

Novice learners: feeling safe to be dumb

August 15th, 2010

Are you in a corporate training environment? Dick Carlson in his mild-mannered way muses on how learners feel about training (Learner Feedback? You Can’t STAND Learner Feedback!).

Dick and I have some differences — I think dogs ought to have noses that they themselves can see — but not in this area.  The core of Dick’s post is the ultimate assessment: can you now accomplish whatever this training was supposed to equip you to accomplish?

(Yes, that does mean “that you couldn’t accomplish before due to a lack of skill or knowledge.”  Don’t be cute.)

Because — if we start with a true skill deficit that prevented you from producing worthwhile results — that’s vastly more important than whether the training fit your purported learning style, whether the ratio of graphics to text was in a given range, and whether the person helping you called herself a trainer, a teacher, a facilitator, a coach, or the Bluemantle Pursuivant.

If you need to learn how to recover from an airplane stall or how to control paragraph borders through a class in CSS, learning assessment comes down to two words: show me.

With all that, I do think that how the learner feels about what’s going on does  influence the learning situation.  I just want to make clear: that’s very different from saying that those feelings matter in terms of assessing the learning.

High profile?  You bet your assessment.

I was once in charge of instructor training and evaluation for an enormous, multi-year training project.  In the final phase, we trained over 2,000 sales reps to use a laptop-based, custom application.  Ninety percent of them had never used a personal computer.

Which was a drawback: the client decided that as long as the sales reps were coming for training on the custom application, we should “take advantage of the opportunity” to teach them email.

And word processing.  And spreadsheets.  And a presentation package.  And connection to two different mainframe applications using simple, friendly 3270 emulation software.

In a total of five days (one 3-day session, a 2-day follow-on one month later).

Our client training group was half a dozen people, so we hired some 30 contractors and trained them as instructors.  I mention the contractors because we needed a high degree of consistency in the training.  When a group of sales reps returned for Session 2, we needed to be confident that they’d mastered the skills in Session 1.

(If the informal learning zealots knew how we electrified the fences within which the instructors could improvise, they’d have more conniptions than a social media guru who discovered her iPhone is really a Palm Pre in drag.)

We used a relentlessly hands-on approach with lots of coaching, as well as “keep quiet and make them think” guidance for the instructor.  The skills focused on important real-world tasks, not power-user trivia: open an account.  Cancel an order.  Add a new contract.

We conducted nearly 600 classroom-days of training, and we had participants complete end-of-day feedback after 80% of them.  I never pretended this was a learning assessment.  I’m not sure it was an assessment at all, though we might have called the summary an assessment, because our client liked that kind of thing.  We had 10 or so questions with a 1-to-4 scale and a few Goldilocks questions (“too slow / too fast / just right”), as well as space for freeform comments.

Why bother?

I made the analogy with checking vital signs at the doctor’s or in the hospital.  Temperature, pulse rate, blood pressure, and respiration rate aren’t conclusive, but they help point the way to possible problems, so you can work on identifying causes and solutions.

So if we asked how well someone felt she could transmit her sales calls, we knew about the drawbacks of self-reported data.  And we had an instructor who observed the transmit exercise.  We were looking for an indication that on the whole, class by class, participants felt they could do this.

(Over time, we found that when this self-reporting fell below about 3 on the four-point scale, it was nearly always due to… let’s say, opportunity for the instructor to improve.)

When we asked the Goldilocks question about pace, it wasn’t because we believed they knew more about pacing than we did.  We wanted to hear how they felt about the pace.  And if the reported score drifted significantly toward “too fast” or “too slow,” we’d decide to check further.  (2,204 Session 1 evaluations, by the way, put pace at 3.2, where 1 was “too slow” and 5 was “too fast.”)
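The drift check comes down to simple arithmetic: average the class’s pace ratings and compare the average against the scale’s midpoint. Here’s a rough sketch in Python of that kind of rule — the function name, the sample ratings, and the drift threshold are all hypothetical, not the figures or tooling from the actual project:

```python
# Sketch of a "check further" rule for a Goldilocks pace question.
# Scale assumed: 1 = "too slow", 5 = "too fast", 3 = "just right" (midpoint).
# The 0.5 threshold is illustrative, not the project's actual cutoff.

def pace_drift(scores, midpoint=3.0, threshold=0.5):
    """Return (mean, flag): flag is True when the class average drifts
    far enough from the midpoint to warrant a closer look."""
    mean = sum(scores) / len(scores)
    return mean, abs(mean - midpoint) > threshold

# One class's end-of-day pace ratings (made-up numbers):
mean, check_further = pace_drift([3, 4, 3, 3, 2, 3, 4, 3])
print(round(mean, 2), check_further)
```

The point of the rule isn’t precision; like the vital-signs analogy above, a flagged class doesn’t diagnose anything by itself — it just tells you where to start asking questions.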

Naturally, to keep in good standing with the National Association for Smile-Sheet Usage, we had free-form comments as well.  We asked “What did you like best?” and “What did you like least?”  (In earlier phases of this project, we asked them to list three things they liked and three they didn’t.  Almost no one listed three.  When we let them decide for themselves what they wanted to list, the total number per 100 replies went up.)

Early in the project, our client services team sat around one evening, pulling out some of the comment sheets and reading them aloud.  It was my boss at the time who found this gem, under “what did you like best?”

My instructor made me feel
safe to be dumb.

Everybody laughed.  Then everybody smiled.  And then everybody realized we had a new vision of what success in our project would mean.

We wanted the learners to feel safe to be dumb.  Safe to ask questions about things they didn’t understand.  Safe to be puzzled.  Because if they felt safe, they felt comfortable in asking for help.  And if they felt comfortable asking, that meant they felt pretty sure that we could help them to learn what they needed to learn.

What about weaving their feedback into the instructional design?  In general, newcomers to a field don’t know much about that field, which means they’re not especially well equipped to figure out optimal ways to learn.

Please note: I am not at all saying newcomers can’t make decisions about their own learning.  In fact, I think they should make ‘em.  In a situation like this, though, my client wasn’t the individual learner.  It was (fictionally named) Caesar International, and it had thousands of people who needed to learn to apply a new sales-force system as efficiently as possible.

Mainly procedural skills.  Low familiarity with computers, let alone these particular applications.  High degree of apprehension.

(By the way, Ward Cunningham installed WikiWikiWeb online eight months after our project ended, so don’t go all social-media Monday-morning-quarterback on me.)

I felt, and still feel, that our design was good.  So did the Caesar brass: within six months of the end of the project, a nearly 25% increase in market share for Caesar’s #1 product, and the honchos said that resulted from successfully training the reps to use the new sales software on the job.

When you feel safe to be dumb, you don’t stay dumb long.

CC-licensed images:
Yes / no assessment by nidhug (Morten Wulff).
“Cover-the-content” adapted from this photo by antwerpenR (Roger Price).

 

Posted by Dave Filed in Basic training, On the job
2 Comments »