During a webinar I delivered on Thursday called "Design for Humans," we touched briefly on Peter Morville's UX honeycomb and how it applies to training design. The response was immediate and enthusiastic. It's a topic that consistently generates interest when it comes up in our Better by Design course for L&D professionals wanting to use user experience principles in their design work. Given the level of curiosity it sparks, I thought I'd write a proper exploration of how this framework can transform our approach to training experiences.
The enthusiasm makes sense. Too often, we craft training experiences with meticulous attention to content accuracy and instructional design principles, yet somehow miss the fundamental question of whether these experiences actually work for the humans who need to use them. We've become so focused on the mechanics of learning that we've forgotten the experience of it.
Enter Peter Morville's User Experience Honeycomb, a framework that's been guiding digital designers since 2004 and offers precisely the lens we need. Originally conceived to help teams move "beyond usability" in web design, the honeycomb identifies seven facets that make experiences genuinely valuable (Morville, 2004). For our purposes in L&D, six of these considerations (all but the central "valuable" facet, which is arguably the outcome the other six combine to produce) provide a comprehensive checklist for creating learning experiences that don't just deliver content, but actually improve performance.
The Six Essential Considerations
Useful: Does the learning experience address a genuine performance need? This isn't about whether the content is accurate or comprehensive, but whether it solves an actual problem people face in their work. Research consistently shows that adults engage most effectively with learning when they can immediately see its relevance to their current challenges (Knowles, 1984). If learners can't answer "how will this help me do my job better?" within the first few minutes, you've likely failed the usefulness test.
Usable: Can people actually navigate through the experience without unnecessary friction? This goes beyond technical functionality to encompass cognitive load, clear navigation, and intuitive interaction patterns. A study of 50 corporate e-learning programmes found that poor usability was the primary factor in completion rates below 60% (Clark & Mayer, 2016). If learners spend more time figuring out how to use your learning platform than engaging with content, you've created a barrier rather than a bridge to performance improvement.
Desirable: Here's where things get interesting, and where many L&D professionals stumble into gamification rabbit holes. Desirable doesn't mean transforming everything into a video game or trying to compete with Netflix for attention. It means creating experiences that learners genuinely want to engage with because they recognise the value. Research by Schmidt et al. (2020) demonstrates that intrinsic motivation, the desire to engage because something feels worthwhile, consistently outperforms extrinsic motivators like badges and points.
Think about it practically: workplace learning isn't competing with TikTok or Facebook because if you're sitting at work scrolling social media all day, eventually your manager will appear and express their dissatisfaction. Desirable workplace learning means creating experiences that feel valuable enough that learners don't resist going through them within the working environment. It's about reducing reluctance, not creating addiction.
Findable: Can people locate the learning when they need it? This consideration extends beyond basic search functionality to encompass information architecture and just-in-time access. The most elegant training programme becomes worthless if employees can't find it six months later when they actually need to apply the knowledge. Research on workplace learning reveals that 74% of employees report difficulty finding relevant learning resources when they need them (Bersin, 2019). Your brilliant content might as well not exist if people can't discover it in the moment of need.
Credible: Do learners trust that what you're teaching them is correct and will actually work? This involves everything from the expertise of content creators to the currency of information and the credibility of examples used. A study of 200 corporate training programmes found that perceived credibility was the strongest predictor of knowledge transfer to workplace application (Bell et al., 2017). If learners doubt the expertise behind the learning or question whether the approaches will work in their context, engagement and application plummet.
Accessible: Can everyone who needs to engage with the learning actually do so? This means designing for diverse abilities, learning preferences, and technological constraints. Beyond legal compliance, accessibility is fundamentally about inclusion and ensuring that training experiences don't inadvertently exclude the people who need them most. Research indicates that accessible design principles benefit all learners, not just those with specific needs (Burgstahler, 2015).
Practical Application: The Honeycomb Diagnostic
Rather than treating these as abstract principles, use them as diagnostic questions for any training experience you're designing or evaluating. Walk through each consideration systematically:
Start with usefulness by conducting a needs analysis that goes beyond what stakeholders think people should learn to what performance challenges actually exist. If you can't articulate the specific performance problem you're solving, stop designing and start investigating.
Evaluate usability by testing your training experience with actual users, not just reviewing it internally. Watch where people get confused, where they hesitate, and where they give up. These friction points reveal usability problems that theoretical review misses.
Assess desirability by examining whether your approach respects learners' intelligence and acknowledges their existing expertise. The most desirable training experiences feel like valuable use of time rather than mandatory endurance tests. Focus on removing barriers to engagement rather than adding artificial incentives.
Check findability by testing whether people can locate and return to your resources using realistic search terms and scenarios. Consider how training fits within existing workflows and information systems rather than creating isolated repositories.
Verify credibility by ensuring content comes from recognised experts, includes current examples, and acknowledges the complexity of real-world application. Avoid oversimplifying to the point where experienced practitioners lose confidence in the approach.
Confirm accessibility by reviewing your design against established guidelines and testing with diverse users. Remember that accessibility isn't a separate consideration but should be integrated throughout the design process.
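If you want to run this diagnostic as a repeatable review rather than an ad hoc conversation, the six questions above can be sketched as a simple scoring checklist. Everything in this sketch is an illustrative assumption rather than an established tool: the 1–5 scale, the threshold, and the function name are all hypothetical choices you'd adapt to your own review process.

```python
# A hypothetical sketch of the honeycomb diagnostic as a scoring checklist.
# The six facet names come from the framework; the 1-5 scale and the
# threshold of 3 are illustrative assumptions, not a standard instrument.

FACETS = ["useful", "usable", "desirable", "findable", "credible", "accessible"]

def honeycomb_report(ratings, threshold=3):
    """Return the facets rated below `threshold`, i.e. the weak spots.

    `ratings` maps each facet name to a 1-5 score agreed by the review team.
    """
    missing = [facet for facet in FACETS if facet not in ratings]
    if missing:
        raise ValueError(f"Unrated facets: {missing}")
    # Keep only the facets that fall short, preserving the honeycomb order.
    return {facet: ratings[facet] for facet in FACETS
            if ratings[facet] < threshold}

if __name__ == "__main__":
    # Example review: a compliance module that is accurate and trusted,
    # but dull and hard to locate six months after launch.
    ratings = {
        "useful": 4, "usable": 3, "desirable": 2,
        "findable": 1, "credible": 5, "accessible": 4,
    }
    print(honeycomb_report(ratings))
```

The point of forcing a rating for every facet is the same as the prose version of the diagnostic: you can't skip the uncomfortable questions, and the output tells you where to focus design effort first rather than producing a vague overall verdict.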
Beyond the Training Department
The honeycomb's real power lies in its versatility. These same six considerations apply to any experience you design, whether it's a training programme, a team meeting, or a performance review process.
Consider your next team meeting through the honeycomb lens: Is it useful (does it address issues people need to resolve)? Is it usable (can everyone participate effectively)? Is it desirable (do people see value in attending)? Is the information findable (can people locate relevant documents and follow-up actions)? Is it credible (do people trust the information shared and decisions made)? Is it accessible (can everyone participate regardless of location, technology, or communication preferences)?
This framework helps move conversations beyond "engagement" metrics toward genuine experience quality. Instead of asking whether people completed your training programme, ask whether they found it useful enough to recommend to colleagues, usable enough to navigate without assistance, and credible enough to apply in high-stakes situations.
The Communication Advantage
One of the honeycomb's most practical benefits is providing a shared vocabulary for discussing experience quality with stakeholders. Rather than defending design decisions based on learning theory that others might not understand, you can frame discussions around user experience principles that most business leaders recognise from their own digital product experiences.
When someone asks why people aren't engaging with a programme, you can systematically work through each honeycomb consideration to identify specific improvements rather than making vague promises about "better content" or "more interaction." This diagnostic approach demonstrates professional expertise while keeping conversations focused on practical solutions.
The Reality Check
Applying the honeycomb rigorously will reveal uncomfortable truths about many existing training experiences. You'll discover programmes that are comprehensive but not useful, technically sophisticated but not usable, and entertaining but not credible. This isn't a reason to abandon the framework, but evidence of why so many well-intentioned training initiatives fail to improve performance.
The honeycomb doesn't guarantee success, but it does provide a systematic way to reduce predictable failures. By asking these six questions consistently, you'll create training experiences that people actually want to engage with, can successfully navigate, and trust enough to apply in their work.
Remember, the goal isn't perfection across all six considerations, but conscious design that acknowledges what learners need for successful performance improvement. Sometimes the most useful training experience is the one that helps people recognise they don't need training at all, but better tools, clearer expectations, or systemic changes to their work environment.
What would happen if you applied these six considerations to the last training experience you designed? Which areas would reveal the biggest opportunities for improvement?
Share your answers in the comments, and if you’re a subscriber, join us to discuss them in the subscriber chat.
References
Bell, B. S., Tannenbaum, S. I., Ford, J. K., Noe, R. A., & Kraiger, K. (2017). 100 years of training and development research: What we know and where we should go. Journal of Applied Psychology, 102(3), 305-323.
Bersin, J. (2019). The future of work: How to prepare for the skills revolution. Deloitte Insights.
Burgstahler, S. (2015). Universal design in higher education: From principles to practice. Harvard Education Press.
Clark, R. C., & Mayer, R. E. (2016). E-learning and the science of instruction: Proven guidelines for consumers and designers of multimedia learning. John Wiley & Sons.
Knowles, M. S. (1984). The adult learner: A neglected species. Gulf Publishing Company.
Morville, P. (2004). User experience design. Semantic Studios. Retrieved from https://semanticstudios.com/user_experience_design/
Schmidt, M., Earnshaw, Y., Tawfik, A. A., & Jahnke, I. (2020). Methods of user centered design and evaluation for learning designers. In M. Schmidt, A. A. Tawfik, I. Jahnke, & Y. Earnshaw (Eds.), Learner and user experience research: An introduction for the field of learning design & technology. EdTech Books.