Portfolio · 2026
Most training delivers information. I design systems where behavior actually changes — and you can measure the difference.
About Me
I'm Hannah Shambley, an Instructional Designer and Learning Experience Designer focused on the gap between what people learn in training and what they actually do on the job. That gap is a design problem. I treat it like one.
My work spans curriculum architecture, LMS-based modules, assessment strategy, and AI tools built to sharpen design judgment — not to produce content faster. I build systems that triage training requests before a single slide is made, stress-test designs against synthetic learner profiles, and measure whether behavior actually changed after deployment. I approach every project from the same starting point: what does the learner need to be able to do, and what's currently getting in the way?
Whether I'm designing a branching scenario, auditing legacy content, or building a certification module from scratch, the measure of success is the same: observable behavior change, not completion rates.
A needs analysis isn't a formality. It's how I find out whether training is actually the right solution. Objectives come from what the job demands, not what the subject-matter expert wants to cover.
Recognition isn't recall. Every activity I build is designed to make the learner do something with the knowledge, because retrieval practice is what moves information into long-term memory.
Working memory is limited. I sequence new concepts by managing extraneous load first, so learners have the cognitive bandwidth to build meaningful schema, not just absorb a content dump.
Training that doesn't transfer is expensive decoration. I design for near and far transfer explicitly, using context-rich scenarios, spaced practice, and feedback loops tied to real performance conditions.
The interesting AI use case in L&D isn't generating quiz questions or writing scripts faster. It's building tools that improve the quality of decisions: systems that triage training requests before a slide is made, stress-test designs against realistic learner profiles, and monitor whether learning actually changed behavior.
Design in Practice
A snapshot of how content gets rebuilt. Same objectives, stripped of what doesn't serve the learner.
The Solara Yoga All-Access Membership provides members with access to a comprehensive range of classes and studio amenities. Membership benefits include: unlimited access to all class formats including vinyasa, yin, restorative, hot yoga, and aerial yoga, access to all Solara studio locations, on-demand video library through the Solara mobile app, live stream class access, monthly workshop priority registration with 20% discount, complimentary mat and towel service, locker room access with showers, 15% retail discount on all studio merchandise, two guest passes per month, membership freeze options for up to 60 days per year, and a complimentary new member orientation session.
Know these three. The rest unfolds.
From catalog label to learner question.
Kept what matters. Removed cognitive overload.
Scannable structure supports in-the-moment recall.
Rewrote in second person. Speaks to the associate.
Outdated rev. date undermines trust.
How I Work
I start with the performance gap, not the content list. Stakeholder interviews, task analysis, and learner context research determine whether training is the right intervention.
Needs analysis · Task inventory · Learner profile
Objectives are written in behavioral terms. Assessments are designed before activities. Storyboards map every decision before development begins.
Behavioral objectives · Assessment blueprint · Storyboard
Activities are built with a specific cognitive purpose. Each interaction is chosen because it serves retrieval, schema building, or transfer — not because it looks interesting.
eLearning modules · Job aids · Facilitator guides
Content is deployed with SCORM or xAPI tracking configured to capture meaningful learner data. Facilitator guides are built in when the performance context requires it. A minimal statement sketch follows below.
LMS deployment · SCORM/xAPI setup · Rollout plan
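To ground the xAPI setup above: a minimal sketch of the kind of statement a module might emit to a Learning Record Store. The endpoint, credentials, and activity IDs are placeholders I've invented; the statement shape itself (actor, verb, object, result) and the version header follow the xAPI specification.

```python
import requests  # assumes the requests library is installed

# Hypothetical LRS endpoint and credentials -- placeholders, not a real service.
LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"
LRS_AUTH = ("lrs_user", "lrs_password")

# A minimal xAPI statement. Verb and activity-type IRIs come from the
# standard ADL vocabulary.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Sample Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/membership-essentials/module-1",
        "definition": {
            "name": {"en-US": "Membership Essentials: Module 1"},
            "type": "http://adlnet.gov/expapi/activities/module",
        },
    },
    "result": {"success": True, "score": {"scaled": 0.9}},
}

response = requests.post(
    LRS_ENDPOINT,
    json=statement,
    auth=LRS_AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},  # required by the xAPI spec
)
response.raise_for_status()
```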
Evaluation is scoped across Kirkpatrick Levels 1 through 4 based on what's measurable and meaningful. Learner data informs iteration. Did behavior actually change? A worked example of that question follows below.
Kirkpatrick evaluation plan · Analytics report · Revision cycle
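One way the Level 3 question can be answered: compare observed on-the-job performance before and after training. A minimal sketch with invented observation-checklist data; the numbers and the decision threshold are illustrative assumptions, not a fixed methodology.

```python
# Illustrative Level 3 (behavior) check: compare pre- and post-training
# observation-checklist scores. All data here is invented for the sketch.
from statistics import mean

# Fraction of checklist items performed correctly, per observed associate.
pre_training = [0.55, 0.60, 0.48, 0.62, 0.51]
post_training = [0.78, 0.85, 0.70, 0.88, 0.74]

def behavior_change(pre: list[float], post: list[float]) -> dict:
    """Summarize the shift in observed on-the-job performance."""
    delta = mean(post) - mean(pre)
    return {
        "pre_mean": round(mean(pre), 2),
        "post_mean": round(mean(post), 2),
        "delta": round(delta, 2),
        # Hypothetical decision rule: flag for revision if the lift is small.
        "meets_target": delta >= 0.15,
    }

print(behavior_change(pre_training, post_training))
# -> {'pre_mean': 0.55, 'post_mean': 0.79, 'delta': 0.24, 'meets_target': True}
```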
Expertise
Not using AI to produce content faster — using it to build better tools. Each phase of ADDIE gets its own AI leverage point: judgment, not automation. Sketches of how the tools below might be wired up follow the list.
Training request triager: routes "we need training" emails into training needed / non-training intervention / insufficient evidence.
Synthetic learner panel: generates 3 realistic learner profiles, walks each through the design, and surfaces where it breaks before development begins.
Objective writer with built-in rubric: follows your design standards, checks its own work, writes to your style guide — shareable across the team.
LMS evaluation agent: pulls data weekly, analyzes engagement and performance trends, and drops a stakeholder summary without manual effort.
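A sketch of how the triager's core routing step might look. The call_llm function is a stand-in for whatever model client you use, and the rubric wording is an illustrative assumption; the point is the decision structure: every request resolves to one of three routes, and "insufficient evidence" is a first-class outcome.

```python
# Sketch of the triage step. `call_llm` is a placeholder for any chat-model
# client; swap in your provider of choice. Categories mirror the three routes.
CATEGORIES = ("training_needed", "non_training_intervention", "insufficient_evidence")

TRIAGE_PROMPT = """\
You are screening a training request. Classify it into exactly one category:
- training_needed: a skill or knowledge gap is evidenced and training can close it.
- non_training_intervention: the gap is environmental (tools, process, incentives).
- insufficient_evidence: the request asserts a problem without performance data.
Respond with only the category name.

Request:
{request}
"""

def call_llm(prompt: str) -> str:
    """Placeholder: replace with a real model call."""
    raise NotImplementedError("wire up your LLM client here")

def triage(request_email: str) -> str:
    verdict = call_llm(TRIAGE_PROMPT.format(request=request_email)).strip().lower()
    # Fail safe: anything the model mangles routes to human review.
    return verdict if verdict in CATEGORIES else "insufficient_evidence"
```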
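The synthetic learner panel has a similar shape. The three profiles below are invented examples, and call_llm is the same placeholder; the structural point is that each profile critiques the design independently, so friction surfaces per audience instead of as one averaged review.

```python
# Sketch of the synthetic learner panel. Profiles are invented examples;
# `call_llm` is again a placeholder for your model client.
from dataclasses import dataclass

@dataclass
class LearnerProfile:
    name: str
    context: str  # prior knowledge, constraints, motivation

PANEL = [
    LearnerProfile("New hire", "no product knowledge, high motivation, mobile-only access"),
    LearnerProfile("Veteran associate", "deep tacit knowledge, skeptical of mandatory training"),
    LearnerProfile("Seasonal part-timer", "limited hours, needs just-in-time job aids"),
]

REVIEW_PROMPT = """\
Adopt this learner profile: {context}
Walk through the design below step by step. Report, as this learner:
1. Where you get confused or overloaded.
2. Which activities feel irrelevant to your job.
3. What you still can't do at the end.

Design:
{design}
"""

def call_llm(prompt: str) -> str:
    raise NotImplementedError("wire up your LLM client here")

def stress_test(design_doc: str) -> dict[str, str]:
    """Return one critique per panel member, keyed by profile name."""
    return {
        p.name: call_llm(REVIEW_PROMPT.format(context=p.context, design=design_doc))
        for p in PANEL
    }
```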
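The objective writer reuses the same prompt-plus-placeholder pattern; what distinguishes it is the embedded rubric the model must score its own draft against before returning anything. The rubric criteria here are illustrative assumptions, not my actual design standards.

```python
# Sketch of the self-checking objective writer. Rubric criteria are
# illustrative; `call_llm` is a placeholder for your model client.
RUBRIC = [
    "Names an observable behavior (no 'understand' or 'know').",
    "States the condition under which the behavior is performed.",
    "States the criterion for acceptable performance.",
]

WRITE_PROMPT = """\
Draft a behavioral learning objective for this task: {task}
Score your draft against each rubric item (pass/fail) and revise until
every item passes. Return only the final objective.

Rubric:
{rubric}
"""

def call_llm(prompt: str) -> str:
    raise NotImplementedError("wire up your LLM client here")

def write_objective(task: str) -> str:
    rubric_text = "\n".join(f"- {item}" for item in RUBRIC)
    return call_llm(WRITE_PROMPT.format(task=task, rubric=rubric_text)).strip()
```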
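Finally, the evaluation agent's skeleton. fetch_lms_report and its field names are hypothetical stand-ins for whatever your LMS actually exposes (a SCORM report export, an xAPI query); the pattern is a scheduled pull, a trend check, and a plain-language summary.

```python
# Skeleton of the weekly evaluation pull. `fetch_lms_report` and its field
# names are hypothetical; adapt them to your LMS's real export or API.
from statistics import mean

def fetch_lms_report(course_id: str) -> list[dict]:
    """Placeholder: replace with a real LMS export or xAPI query."""
    raise NotImplementedError("wire up your LMS data source here")

def weekly_summary(course_id: str) -> str:
    records = fetch_lms_report(course_id)
    completion = mean(r["completed"] for r in records)  # booleans average to a rate
    avg_score = mean(r["score"] for r in records if r["completed"])
    flag = "Flag: completion below 70%." if completion < 0.70 else "On track."
    return (
        f"Course {course_id}: {completion:.0%} completion, "
        f"mean assessment score {avg_score:.0%}. {flag}"
    )
```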
Let's Connect
I'm open to new projects, collaborations, and conversations about learning design — especially work where the goal is durable capability, not just a course that gets checked off.
Tell me about the performance gap you're trying to close. I'll tell you whether a learning design solution can actually address it, and if so, how.