Rise 360 · Onboarding

New Hire Success: Your First Two Weeks

A Rise 360 onboarding course designed to close the gap between knowing the job and doing it confidently — built after exit interviews revealed one pattern behind 30% new-hire turnover.

Gretchen Lautenschlager
Rise 360, Canva, Google Forms
15–20 minutes
Retail new hires (week 1–2)
Portfolio Demonstration
Kirkpatrick L1–2

Try the course

This is the published Rise 360 module. Click below to open it in a new tab — fully interactive, exactly as a learner would experience it.

New Hire Success: Your First Two Weeks

Rise 360 · 15–20 min · Opens in Articulate →

Launch Course

30% of new hires quit within 90 days. Exit interviews pointed to one thing.

RetailReady Co. was losing 30% of its new hires before their 90-day mark. They had an onboarding program — it covered the handbook, store policies, product knowledge, a welcome tour. Completion rates were strong. But turnover held steady.

With replacement costs estimated at 50–75% of an employee's annual salary, a 30% turnover rate among a 200-person frontline workforce translated to a recurring, preventable business loss. The Director of People Operations asked a simple question: "Can we fix this before it happens?"
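The scale of that loss is easy to sketch. Here is a minimal back-of-envelope calculation; the headcount, turnover rate, and 50–75% replacement-cost range come from the case above, while the $30,000 average salary is a hypothetical figure for illustration only:

```python
# Back-of-envelope cost of 30% turnover in a 200-person frontline workforce.
# Headcount, turnover rate, and replacement-cost range are from the case study;
# the average salary is an assumed placeholder, not a real figure.
headcount = 200
turnover_rate = 0.30
avg_salary = 30_000  # hypothetical

departures = round(headcount * turnover_rate)   # 60 departures per year
low_cost = departures * 0.50 * avg_salary       # replacement at 50% of salary
high_cost = departures * 0.75 * avg_salary      # replacement at 75% of salary
print(f"Estimated annual loss: ${low_cost:,.0f}-${high_cost:,.0f}")
```

Even at a modest assumed salary, the range lands near seven figures per year, which is why the Director of People Operations framed this as a business problem rather than a training formality.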

That question is where instructional design begins. The exit interviews finally asked the right one: "When did you first feel like you knew what you were doing?" Most employees said never. Or, if they stayed, around month three.

"I never knew what I was supposed to do or who to ask. I didn't want to look stupid, so I just guessed."

The existing training gave new hires information. It did not give them certainty. That's the difference between someone who stays and someone who leaves in week three.

I started with the performance gap, not the content.

Before opening Rise 360, I completed a full needs assessment to answer one critical question: Is this actually a training problem? I interviewed five stakeholders — two managers and three current employees who had survived the first 90 days — and built a gap analysis that separated "doesn't know" from "knows, but can't do it with confidence in front of a real customer."

That distinction drove every design decision.

An instructional designer who overpromises what training can fix loses credibility fast. Naming what's out of scope is as important as naming what's in scope.

Design decisions start with who's in the room.

I mapped the target audience as frontline employees aged 22–40 with mixed tech literacy, high smartphone comfort, and low motivation for "mandatory training." This profile drove three non-negotiable design constraints:

Mobile-First

Every content block was previewed in a mobile frame before finalizing. Rise 360's responsive design handles it — but only if you design for it intentionally.

Under 20 Minutes

Completable in a single break period. Respects the reality of frontline work schedules — not the fantasy of a quiet hour at a desk.

Conversational Tone

Not compliance language. The module opens: "This isn't an orientation video you sit through — it's a guide you'll actually use."

Stakeholder interview questions

These are the questions I ask before building anything. Good design starts with the right conversations — not with opening a tool.

  1. What does your exit interview data say about why new hires leave in the first 90 days?
  2. What does a "successful" new hire look like at Day 30? What are they doing that struggling hires are not?
  3. What training or onboarding currently exists? What format is it in?
  4. What is the biggest mistake new hires make in their first week — the one that generates the most supervisor interventions?
  5. Is there a budget or timeline constraint I should design around?

Root cause analysis

Before recommending training, I asked: Is this actually a training problem? Using the Performance Analysis Questions framework (adapted from Thomas Gilbert's Behavior Engineering Model), I sorted causes into three buckets.

3A — Environment (NOT fixed by training)

  - No structured "first day" checklist for shift leads
  - Priority board in back office inconsistently updated
  - Badge access to HR portal delayed 3–5 days

Flagged to stakeholder as process/environment fixes. Training cannot compensate for broken systems.

3B — Knowledge & Skill (CAN be addressed by training)

  - New hires not shown the GREET model or customer routing protocol
  - No formal introduction to the 5 core daily tasks — passed informally by coworkers
  - New hires don't know the distinction between shift lead, service desk, and manager roles

3C — Motivation & Confidence (addressed through scenario design)

  - New hires afraid to ask questions for fear of appearing incompetent
  - Lack of early wins creates disengagement by Day 7

Seven sections. Every one tied to a performance objective.

I built a seven-section storyboard in Google Docs before touching Rise 360. Each section maps to a specific performance objective — not a content topic. Rise 360's clean, responsive layout keeps cognitive load low for a mixed-literacy audience: no cluttered slides, no walls of text, no autoplay audio the learner can't control.

  1. Welcome: sets a performance tone, not a compliance tone
  2. Know Your Store: accordion interaction mapping the three store zones
  3. Your Top 5 Tasks: process block walking through the daily task sequence
  4. The GREET Model: customer service mnemonic for approach and engagement
  5. When Things Go Wrong (★ centerpiece): two branching scenarios with consequence-driven feedback — three response options each, drawn from the highest-frequency new hire mistakes
  6. Your People & Resources: tabs interaction mapping the four key contacts
  7. Quiz + Wrap-Up: three-question knowledge check with 80% passing threshold

Sample Deliverable: Week 1 Cheat Sheet

This job aid was designed to live in a new hire's apron pocket — a portable reference for the five core tasks, the GREET model, escalation paths, and scripted responses for the three most common difficult situations. Built in Canva as a companion to the Rise 360 module.


Your Week 1 Cheat Sheet

PDF · Module Companion · New Hire Success / Rise 360

A pocket-sized quick reference covering the GREET method, escalation contacts, and scripted responses for common floor situations.

View PDF

I measure success in behavior change, not completion rates.

No post-launch data exists for this portfolio project. Based on the gap analysis and research on onboarding as a retention driver, the projected outcomes are:

Metric | Baseline | Target | Timeline
90-day voluntary turnover | 30% | ≤ 15% | 2 quarters post-launch
Time to independent performance | Day 21 | Day 10 | Measured at 30 days
Week-one supervisor interventions | High (untracked) | 25% reduction | 30 days post-launch
Module knowledge check pass rate | N/A | ≥ 80% first attempt | Immediate
Learner satisfaction (L1) | N/A | ≥ 4.2 / 5.0 | Immediate

Kirkpatrick L1–4: built in from the start.

I designed a Level 1–2 evaluation plan for this project, with a Level 3 recommendation for real deployment. The Level 3 checklist is the artifact I'm proudest of — it shifts accountability for transfer from the learner to the environment, and gives managers a structured tool, not just an expectation.

L1 · Reaction

A five-question post-module survey in Google Forms asking learners to rate relevance, clarity, and confidence. Target: ≥ 4.2 / 5.0.

L2 · Learning

The built-in Rise 360 knowledge check (3 questions, 80% passing threshold). Scores tracked via LMS reporting.

L3 · Behavior (★ most proud of this one)

A 30-day manager observation checklist measuring whether new hires are independently completing their top 5 tasks without prompting. This is the real measure of whether the training worked — and it shifts accountability to the environment, not just the learner.

L4 · Results

90-day retention data pulled from HR and compared against a 6-month pre-launch baseline. The only number that actually matters to the business.

What I'd do differently.

If I were to iterate on this module, I would conduct real SME interviews with shift leads and 30-day employees before finalizing the scenarios — the fictional scenarios are grounded in common onboarding failure patterns, but actual incident data would sharpen the branching choices significantly. I would also test two versions of the scenario feedback language: one consequence-forward ("here's what went wrong") and one coaching-forward ("here's why the better choice works") — and use quiz retake data to determine which drives better second-attempt performance. Finally, I'd push for a Phase 2 Spanish translation given the demographics of most frontline retail workforces.

This isn't a teaching background. It's instructional design.

I spent more than 15 years diagnosing why learners disengage, structuring environments that support behavior change, and designing experiences that meet people where they are — not where we wish they were.

That's the foundation of every project in this portfolio. The tools change. The question doesn't: what does this person need to be able to do, and what's actually standing in their way?
