
Friday Finds — Getting it wrong is the whole point (Research Edition)


Friday Finds

Spend 10 minutes. Walk away with actionable ideas you can use Monday morning in your L&D program.


Curated ideas, practical tools, and marketing-inspired thinking for people who design learning.

Well, I called it. That warm spell we had was a tease — it snowed on Sunday, and the temperature is back below freezing again. (Nobody lives in Ohio for the weather.) This week's topic surfaced while I was teaching my graduate course. We got into a discussion about using scenarios in learning design, and I immediately thought of this research. I'd genuinely love to know: are you already doing something like this? And if not, does today's issue change that? Hit reply and let me know!

🎵 Today I'm listening to Simply Red
(I've been stuck in the 80s lately.)

Supported by iSpring

HR doesn’t need more noise. It needs better systems. This free HR Template Pack from iSpring gives you four practical, ready-to-use frameworks—a customer guide, compliance checklist, troubleshooting playbook, and full employee-lifecycle SOP—so you can replace chaos with clarity and stop reinventing the wheel. Simple. Useful. Done.

Get the free HR Template Pack →


Getting it wrong is the whole point

Quick question before you read another word.

During a typical 20-minute video lecture, what percentage of learners do you think are actively mind-wandering at any given moment?

Take a guess. Write it down if you want. You probably don't know the exact number — and that uncertainty you just felt? That's exactly what this issue is about.

Research consistently finds that a third to more than half of learners are mind-wandering during a video lecture — and the rate climbs as the session goes on. The moment you find that out, you'll remember it — because you tried to retrieve it first.

That's the Pretesting Effect. And it should change how you design training.


Failing is what makes it work

Here's the counterintuitive finding:

Students who attempt to answer questions before studying the material consistently outperform students who spend that same time just reading.

Not students who guess correctly. Students who guess wrong.

In a landmark 2009 study, Richland, Kornell, and Kao had participants answer questions about a passage before they ever read it. Most got nearly everything wrong. Then they read the passage. On the final test, the pretest group dramatically outperformed the group that had simply studied longer — even when researchers analyzed only the questions participants got wrong on the pretest.

Protecting learners from wrong answers might be exactly backwards.


What's actually happening in the brain

When someone guesses incorrectly, three things happen before they ever see the right answer.

First, the wrong guess activates every related concept already stored in memory. Those concepts become hooks. When the correct answer arrives, it has something to attach to — and it sticks.

Second, the mismatch between the guess and the correct answer registers as a prediction error. The brain treats that mismatch as a signal worth encoding. It's a stronger learning signal than reading a fact you never questioned in the first place.

Third — and this one is particularly useful for L&D — pretesting reduces mind-wandering during what follows. Pan, Sana, Schmitt, and Bjork (2020) measured learners' attention during online video lectures. The group that took pretests beforehand wandered significantly less. They had specific gaps to listen for, and they were hunting for answers. That changes how you watch a lecture.

The wrong answer primes the brain. The right answer completes the circuit.


Your learners are zoning out — and pretesting can fix it

Most training design implicitly assumes two things: learners arrive as blank slates, and wrong answers are a waste of time.

Both assumptions are wrong.

Pretesting works best when learners have loosely related experience to draw on — something in memory for the guess to activate. Even partial, imperfect prior knowledge is enough. A good pretest question taps into that. It gives learners a personal stake in finding out if they were right.

That activation is the point. The question you place before your content is doing more cognitive work than most of the content itself.


Try this before your next module goes live

Pick the single most important concept in an upcoming course or session: the one thing you most need learners to actually retain.

Write one question about it that most of them won't be able to answer correctly. Not a trick question. Not an unfair one. Just something specific enough that they have to make a real attempt.

Add it before the content starts. In the intro email, on the opening slide, at the top of the pre-work, wherever learners enter the experience. Tell them there's no pressure to get it right. Then make sure the content delivers a clear, satisfying answer.

Want to go further? Tools like Mentimeter or a simple Google Form let you collect responses and show the group's aggregate guesses before the reveal. That social element adds a second layer — now learners are curious about each other's answers, not just the right one.
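If your responses land in a spreadsheet export (Google Forms downloads one as a CSV), tallying the group's guesses for the big reveal takes only a few lines. Here's a minimal sketch, assuming a hypothetical export file with a `guess` column — the filename and column name are placeholders, not anything a specific tool guarantees:

```python
from collections import Counter
import csv

def summarize_guesses(csv_path, column="guess"):
    """Tally pretest guesses from a form export (e.g., a Google Form CSV).

    Assumes the CSV has a header row containing `column`.
    """
    with open(csv_path, newline="") as f:
        guesses = [row[column].strip().lower() for row in csv.DictReader(f)]
    counts = Counter(guesses)
    total = len(guesses)
    # Print the aggregate distribution to share before revealing the answer.
    for answer, n in counts.most_common():
        print(f"{answer:20s} {n:3d}  ({n / total:.0%})")
    return counts
```

Showing the distribution first (without naming the right answer) preserves the retrieval attempt while adding the social hook: learners see how split the room is before the content closes the loop.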

One rule: don't skip the answer.
The pretesting effect only fires when learners encounter the correct answer after attempting to retrieve it. The question opens the loop. The content has to close it.

Also supported by Neovation

Most teams have a shared drive full of PDFs nobody reads until there’s a problem. What if those same files could answer questions, guide decisions, and coach people in real time? Stop storing information. Start activating it.

Build your corporate brain →


Want to go deeper? Here are three great places to start.

Worth your attention

The study that started it all

Richland, Kornell, and Kao's foundational 2009 paper, free on PubMed. If you need to make the case for pretesting to a skeptical stakeholder, start here.

Read the original pretesting research

A researcher explains her own findings

Faria Sana — one of the scientists behind the mind-wandering research — wrote this plain-English breakdown for the Psychonomic Society. No paywall, no jargon.

Read the researcher's own explainer

Pretesting vs. retrieval practice: how they compare

A practical walkthrough of Pan and Sana's research comparing pretesting to post-study testing. Useful if you're trying to figure out where questions belong in your design.

Read the practical comparison

Did this resonate? Or miss entirely? Either way, hit reply—I’d love to hear your take.


If today’s issue was useful, my book Think Like a Marketer, Train Like an L&D Pro goes deeper on designing learning that earns attention and drives action. And if you’ve read it, a short review helps more than you might think.

Friday Finds is an independent publication that I produce in my free time. You can support my work by sharing it with the world, booking an advertising spot, or buying me a coffee.


