Training by Email: A Practical Guide to Sequenced Learning Delivery
Yesterday I attended an LDA webinar featuring Bianca Baumann and Mike Taylor discussing their book “Think Like a Marketer”. One thread of conversation explored using the tools that marketers use: email platforms, sequenced text messages, and the kind of tracking that accompanies them. It took me back to a project I ran a few years ago, where we delivered GDPR compliance training entirely through email sequences, and it reminded me that sometimes the most effective training interventions are the ones that feel least like “training” at all.
The client was a company selling products directly to people’s homes. Their salespeople and field employees would visit customers to discuss contracts and take payments, which meant they needed to understand GDPR inside and out. The problem was that past GDPR training had been, to put it generously, ineffective. Completion rates were abysmal, and the feedback was consistent: “there’s no time” and “it didn’t load on my device”. Field employees were rarely in the office, and expecting them to carve out time for a lengthy e-learning module between customer visits simply wasn’t realistic.
So we tried something different. We delivered the entire training programme as a sequence of carefully crafted emails.
The Science Behind Sequenced Delivery
Before diving into the practicalities, it’s worth understanding why this approach works, because it’s not merely a convenience play. The research supporting sequenced, distributed learning is some of the most robust in cognitive science.
Spaced practice, where learning is distributed across time rather than compressed into a single session, consistently outperforms massed practice for long-term retention. A 2024 review of spaced digital education for health professionals found that this approach was effective in improving knowledge, skills, confidence, and clinical behaviour change across a range of contexts (Lau et al., 2024). More strikingly, research with over 26,000 physicians demonstrated that spaced repetition was superior to massed practice for both learning (58% vs 43%) and knowledge transfer (58% vs 52%) (Price et al., 2025).
Chen et al. (2018) demonstrated that massed practice depletes working memory resources, whilst spacing allows those resources to refresh between learning episodes. When someone sits down to engage with content after a gap, their cognitive capacity is ready to work again. The feeling many of us have experienced during long training sessions, that sense of diminishing returns and mental fatigue, has a measurable neurological basis.
What This Looked Like in Practice
The business had twelve priority areas when it came to GDPR understanding and application. Rather than cramming all twelve into a single module, we spread them across a sequence of emails delivered over several weeks. Each email focused on a single consideration, which meant recipients could engage meaningfully with one concept before moving to the next.
These weren’t generic broadcasts. Email platforms like MailerLite, which we used for this project, allow for dynamic content. Each email used the recipient’s name and was specific to their role. Salespeople received information, questions, and prompts relevant to visiting customers’ homes. Back-office staff received content tailored to their context. The same principle applied throughout: same core topic, different examples and scenarios depending on the recipient’s work.
When employees receive role-specific training, they engage more meaningfully with the material because they can see its direct application to their daily work (Brown and Sitzmann, 2011). We weren’t asking field employees to mentally translate generic scenarios into their context; we did that translation for them.
Each email followed a consistent structure. It opened with a brief explanation of the topic, perhaps two or three paragraphs at most. Then it presented a scenario relevant to the recipient’s role: something they might realistically encounter. Finally, it asked reflective questions. Not quiz questions with right and wrong answers, but genuine prompts to think through how they would handle specific situations.
The beauty of this approach is that reflection doesn’t require infrastructure. Recipients could read an email while waiting for a customer to answer the door, and carry that reflective question with them through their next few interactions. Learning became distributed not just across time, but across the natural rhythm of work itself.
As a reader of the Instructional Design Tips Substack, you can get 25% off a ticket to the very first IDTX Evidence-Informed Practice Conference.
This one-day event is set for the 29th of May 2026 and will be held in Birmingham city centre, UK. The day will see us bring together researchers, scientists, and practitioners to discuss how we utilise the wealth of scientific understanding, research, and evidence to improve workplace training.
To claim your discounted ticket, head over to the IDTX website and use code CPDW25 at checkout.
The Practical Steps
If you’re considering something similar, here’s how to approach it.
Start by identifying the specific behaviours that are causing non-compliance. You’ll note I don’t say start with learning objectives because, quite frankly, in this context they are not particularly helpful. You want to tailor the content, the scenario, and everything else in each email towards the one specific behaviour you are trying to impact with that email. The key is ensuring each behaviour can be meaningfully explored in a single, focused communication.
Next, choose your delivery platform. I’ve used MailerLite, MailChimp, Brevo, and Kit for various projects. All of them support the core functionality you need: sequenced delivery, personalisation through dynamic content, and tracking of opens and engagement. The exact platform matters less than ensuring you can schedule sequences, personalise content based on recipient data, and monitor whether people are engaging.
Note: In this context, I’m using the technological meaning of the word engagement, i.e., what button did they click? What information did they enter? What choice did they make? All of which can be tracked through any of these email platforms.
With your platform in place, build your recipient segments. This is where the personalisation happens. You need to know who’s receiving what, which means having data about roles, locations, or whatever other variables will drive content differentiation. For our project, we had salespeople, customer service representatives, and administrative staff as our primary segments, each receiving contextually relevant versions of the same core content.
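In code terms, a segment can be as simple as a mapping from role to the content variants that role should receive. This is a minimal sketch in Python; the role names and scenario text are invented for illustration, not the client’s actual data, and real platforms hold this as recipient fields rather than a dictionary:

```python
# Illustrative only: role segments mapped to role-specific GDPR scenarios.
# Roles and scenario wording are invented examples, not real client content.
SEGMENTS = {
    "sales": {
        "role_context": "visiting customers at home",
        "scenario": (
            "A customer asks you to note their holiday dates so you can "
            "call back. Where do you record this, and who can see it?"
        ),
    },
    "customer_service": {
        "role_context": "handling inbound customer calls",
        "scenario": (
            "A caller asks you to read out the payment details you hold "
            "for their partner's account. What do you do?"
        ),
    },
    "admin": {
        "role_context": "processing contracts in the back office",
        "scenario": (
            "You find a spreadsheet of customer addresses on a shared "
            "drive with no obvious owner. What is your next step?"
        ),
    },
}


def content_for(role: str) -> dict:
    """Return the content variant for a role, falling back to the admin set."""
    return SEGMENTS.get(role, SEGMENTS["admin"])
```

The fallback matters in practice: recipient data is rarely perfect, so every record should resolve to *some* sensible variant rather than an empty email.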
Then draft your content with reflection in mind. Each communication should do three things: explain something, situate it in context, and prompt thinking. The explanation should be concise; remember, these are emails, not chapters. The context should be specific to the recipient’s role. The reflection prompts should ask questions that don’t have simple right or wrong answers, but require consideration of how principles apply to practice.
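That explain–situate–reflect structure can be captured in a simple template. The sketch below uses generic Python string formatting rather than any particular platform’s merge-tag syntax; the placeholder names are illustrative assumptions:

```python
# Generic sketch of the explain / situate / reflect email structure.
# Placeholder names ({name}, {explanation}, ...) are illustrative, not the
# merge-tag syntax of MailerLite or any other specific platform.
EMAIL_TEMPLATE = """Hi {name},

{explanation}

In your work {role_context}, you might face this:
{scenario}

Before your next customer interaction, consider: {reflection_prompt}
"""


def render_email(name, explanation, role_context, scenario, reflection_prompt):
    """Fill the three-part structure for one recipient."""
    return EMAIL_TEMPLATE.format(
        name=name,
        explanation=explanation,
        role_context=role_context,
        scenario=scenario,
        reflection_prompt=reflection_prompt,
    )
```

In a real platform you would store the template once and let the tool substitute recipient fields at send time; the point here is only that the structure is small enough to be held consistent across every email in the sequence.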
Once content is ready, set up your sequence and automations. Most email platforms allow you to create automations that trigger when someone is added to a list or reaches a certain date. You can control the interval between messages, which should be long enough to allow for spacing effects but short enough to maintain momentum. We used intervals of two to three days, which gave people time to engage without losing the thread of the programme.
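The spacing logic itself is trivial; most platforms let you express it directly in their automation builder, but as a sketch (the start date and interval here are arbitrary examples):

```python
from datetime import date, timedelta


def build_schedule(start, n_emails, interval_days=3):
    """Spread n_emails across the sequence, one every interval_days."""
    return [start + timedelta(days=i * interval_days) for i in range(n_emails)]


# Twelve topics at three-day intervals span just over five weeks.
schedule = build_schedule(date(2024, 3, 4), 12, interval_days=3)
```

Seeing the dates laid out like this is a useful sanity check before you build the automation: it makes visible how long the whole programme will run and whether the final emails collide with holidays or busy periods.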
Finally, track engagement and follow up where needed. One advantage of email platforms is visibility into who’s opening what. If someone hasn’t engaged with multiple emails, you can trigger a follow-up or flag them for a conversation.
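A follow-up rule can be equally simple. This is a hypothetical sketch, not our actual rule: flag anyone who has missed the last few emails for a personal conversation rather than yet another automated nudge.

```python
# Illustrative follow-up rule: flag a recipient who has not opened
# the last N emails in the sequence.
def needs_follow_up(opens, last_n=3):
    """opens: list of booleans, oldest first, one per email sent so far."""
    recent = opens[-last_n:]
    return len(recent) == last_n and not any(recent)
```

The thresholds are a judgement call. One missed email means nothing; a run of them is a signal worth a human conversation, which is exactly the kind of visibility traditional e-learning rarely gave us.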
Beyond Email
The principle here extends beyond email. Whatever primary channel of business communication your organisation uses, the same approach can work. I’ve seen variations delivered through WhatsApp for frontline retail workers, through Slack for distributed tech teams, and through Microsoft Teams for corporate functions. The platform matters less than the design principles: spaced delivery, focused topics, role-relevant context, and prompts toward reflection rather than passive consumption.
The key insight from that LDA webinar, the one that sent me down this particular memory lane, was about meeting people where they are. Marketers don’t expect their audiences to come to a dedicated “marketing consumption platform”; they reach people through the channels those people already use. We can learn from this. Field employees weren’t completing traditional e-learning because it required them to stop their work and engage with a separate system. Email arrived in the same inbox they checked anyway, on the same device they carried everywhere.
The Outcomes
I won’t pretend we ran a rigorous experimental study with control groups and statistical analysis. What I can tell you is that completion rates were dramatically higher than the organisation had ever achieved with traditional compliance training. People responded to the reflection prompts, sometimes starting conversations with colleagues about the scenarios we’d described. And when we surveyed participants afterwards, the feedback shifted from “no time” and “wouldn’t load” to something more like “manageable” and “relevant”.
Perhaps more importantly, this approach changed how the organisation thought about compliance training more broadly. Rather than treating it as an annual burden to be endured and forgotten, they began to see it as something that could be woven into the fabric of work. Subsequent compliance topics adopted similar sequenced approaches, and the resistance that had previously characterised training rollouts diminished considerably.
Note: I don’t have access to performance data for this intervention. But I was told anecdotally by leadership in the organisation that the number of flagged non-compliance instances did reduce in the six months after this intervention.
A Note on Technology
I want to be careful not to oversell the technology here. The platforms I mentioned are tools, and tools don’t create good learning design; they simply enable it. The real work was in understanding the audience, identifying what they needed to know and do, translating that into contextually relevant scenarios, and crafting questions that prompted reflection. That work would have been necessary regardless of the delivery mechanism.
What technology does provide is reach and consistency at scale. You can send personalised, sequenced communications to hundreds or thousands of people without the logistical nightmare of scheduling classroom sessions or ensuring everyone has LMS access. For a dispersed workforce that rarely sees the inside of an office, that’s rather handy.
Some Caveats
This approach works well for compliance and procedural knowledge where the goal is to ensure people understand principles and can apply them appropriately. It’s less suited to complex skill development that requires practice and feedback, or to knowledge that benefits from interactive discussion and debate. Not everything belongs in an email sequence, and it would be a mistake to treat this as a universal solution.
There’s also the question of email fatigue. If your organisation already bombards people with communications, adding another sequence might backfire. Context matters, and part of the early work should involve understanding what the communication landscape looks like for your target audience.
Finally, reflection prompts only work if people engage with them. We included them in every email, but we have no way of knowing how many recipients paused to think versus how many simply skimmed and moved on. The research suggests that even brief reflection offers benefits (Di Stefano et al., 2023), but we can’t force deep thinking through a one-way channel.
Closing Thoughts
Thinking like a marketer, as Bianca and Mike encourage, means recognising that the channel matters. Reaching people through familiar tools, respecting their time, and making engagement easy rather than effortful: these aren’t compromises on learning quality. When supported by evidence-informed design principles like spaced practice and prompted reflection, they can improve outcomes compared to more traditional approaches.
The compliance training that people complete and remember is more valuable than the beautifully designed e-learning module that sits unopened on the LMS. Sometimes the right answer isn’t the one that looks most like “proper” training.
References
Brown, K.G. and Sitzmann, T. (2011) ‘Training and employee development for improved performance’, in S. Zedeck (ed.) APA handbook of industrial and organizational psychology, Vol. 2. Selecting and developing members for the organization. Washington, D.C.: American Psychological Association, pp. 469-503.
Chen, O., Castro-Alonso, J.C., Paas, F. and Sweller, J. (2018) ‘Extending cognitive load theory to incorporate working memory resource depletion: Evidence from the spacing effect’, Educational Psychology Review, 30(2), pp. 483-501. (PAID)
Lau, Y., Nyoe, R.S.S., Wong, S.H., Ab Hamid, Z.B. and Car, L.T. (2024) ‘Spaced digital education for health professionals: Systematic review and meta-analysis’, Journal of Medical Internet Research, 26, e57760.
Price, D.W., Wang, T., O’Neill, T.R., Morgan, Z.J., Chodavarapu, P., Bazemore, A., Peterson, L.E. and Newton, W.P. (2025) ‘The effect of spaced repetition on learning and knowledge transfer in a large cohort of practicing physicians’, Academic Medicine, 100(1), pp. 94-102. (PAID)