AMICUS

Duration
  • 3 Months

Project Type
  • Generative AI
  • Web Design

My Role
  • UX Lead
  • Front End Dev

Target Users
  • Internal Staff
  • 300+ Teachers

Contribution
  • User Research
  • Flow Design
  • High Fidelity Design
  • Front End Dev

Outcome
  • Reduced staff turnover
  • Increased staff satisfaction with support
    THE CHALLENGE

    Inspired Group faced a critical teacher retention problem: annual turnover had reached 65 staff members, largely due to burnout from excessive administrative workload. Lesson preparation (creating PowerPoints, handouts, curriculum-aligned materials) consumed hours that should have been spent on student interaction.

    The opportunity was to leverage generative AI to reduce this burden, but the challenge was multifaceted: teachers were skeptical of AI, worried about accuracy and reliability, and concerned that automation might undermine their professional expertise.

    My role: Lead UX strategy and execution from discovery through launch, balancing teacher needs, technical constraints, and organizational goals while introducing AI into an environment that had never used it before.

    UNDERSTANDING TEACHER NEEDS

    Working with the teaching team, I conducted discovery to understand where time was being wasted and what would actually help:

    What we learned from interviews and team discussions:

    • Teachers spent 2-3 hours per day on lesson prep, with much of it on formatting and finding resources
    • The biggest pain point wasn't creating content—it was tailoring existing resources to specific student needs and contexts
    • Teachers were concerned AI would produce "generic" or inaccurate content that didn't reflect our curriculum standards
    • There was significant anxiety about being replaced by automation
    • Teachers wanted to stay in control—they needed to review, edit, and adapt any AI-generated content

    The core insight: Teachers didn't need AI to replace their expertise—they needed it to handle the repetitive, time-consuming work so they could focus on the pedagogical decisions only they could make. The tool needed to feel like an assistant, not an autonomous system.

    KEY DESIGN DECISIONS

    Stripping Back Complexity

    Early wireframes included granular controls—temperature settings, token limits, model selection. This was based on assumptions about what "power users" might want.

    What we learned: When we tested these interfaces with teachers, they were overwhelmed. Most had never used AI tools before and didn't know what "temperature" or "tokens" meant. The complexity was creating a barrier to adoption.

    The pivot: We stripped it back to a simple three-step flow: Setup (who you are), Prompt (what you need), Creativity (how novel vs. standard). This meant losing some control, but it dramatically improved accessibility.

    Renaming "Temperature" to "Creativity"

    This seems small, but it was strategic. "Temperature" is AI jargon. "Creativity" is a concept teachers understand and use daily.

    Why this mattered: It signaled that Amicus was built for teachers, not AI enthusiasts. It reduced cognitive load and made the tool feel familiar rather than technical.
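
    Under the hood, a rename like this is just a thin mapping layer. Here's a minimal sketch of how a teacher-facing Creativity setting could translate to model temperature; the labels and values are illustrative assumptions, not Amicus's actual configuration:

    ```typescript
    // Hypothetical mapping from the teacher-facing "Creativity" setting
    // to the underlying model temperature. Labels and values are
    // illustrative, not Amicus's actual configuration.
    type CreativityLevel = "standard" | "balanced" | "exploratory";

    const TEMPERATURE_BY_CREATIVITY: Record<CreativityLevel, number> = {
      standard: 0.2,    // stick closely to curriculum-aligned phrasing
      balanced: 0.6,    // some variety, still predictable
      exploratory: 0.9, // more novel activities and framings
    };

    function toTemperature(level: CreativityLevel): number {
      return TEMPERATURE_BY_CREATIVITY[level];
    }
    ```

    The teacher only ever sees the three labels; the numeric parameter stays an implementation detail.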

    Grounding in Internal Resources

    One of the biggest concerns was accuracy—would Amicus produce content that contradicted our curriculum or used unreliable sources?

    The solution: We worked with engineering to ensure all outputs were grounded in our internal educational resources and curriculum standards. This wasn't just a technical choice—it was a trust-building strategy.

    The impact: Teachers felt confident that Amicus-generated content would align with what they were already teaching, reducing the review burden and increasing adoption.
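
    For illustration, grounding of this kind is commonly implemented by retrieving relevant internal material and constraining the model to it. A minimal sketch under that assumption, with a hypothetical retrievePassages helper standing in for whatever retrieval the engineering team built:

    ```typescript
    // Illustrative grounding sketch: fetch relevant passages from internal
    // curriculum resources, then instruct the model to use only those.
    // Names and the retrieval step are assumptions, not Amicus's pipeline.
    interface CurriculumPassage {
      source: string; // e.g. an internal scheme-of-work document
      text: string;
    }

    async function buildGroundedPrompt(
      request: string,
      retrievePassages: (query: string) => Promise<CurriculumPassage[]>,
    ): Promise<string> {
      const passages = await retrievePassages(request);
      const context = passages
        .map((p) => `[${p.source}]\n${p.text}`)
        .join("\n\n");
      return [
        "Use ONLY the curriculum material below. If it does not cover the request, say so.",
        context,
        `Teacher request: ${request}`,
      ].join("\n\n");
    }
    ```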

    Building "Amicus Together"

    Post-launch feedback revealed that some teachers were getting much better results than others. Rather than just improving the tool, we saw an opportunity to turn that gap into shared learning.

    The feature: Amicus Together allowed teachers to share effective prompts, setups, and generated materials with colleagues. This turned individual efficiency gains into collective learning.

    Why this worked: It created a cultural shift—teachers who were initially skeptical saw their peers succeeding and learned from them. It also reduced the support burden on our team.
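
    For a shared entry to be reusable, it has to carry not just the prompt but the setup it was written for. A hypothetical data shape (field names are illustrative, not the production schema):

    ```typescript
    // Hypothetical shape of an "Amicus Together" shared entry: a working
    // prompt plus the context it was authored in, so a colleague can
    // reuse both. Field names are illustrative.
    interface SharedPrompt {
      author: string;
      setup: { subject: string; yearGroup: string; curriculum: string };
      prompt: string;
      creativity: "standard" | "balanced" | "exploratory";
      notes?: string; // e.g. "works well for mixed-ability groups"
    }
    ```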

    THE SOLUTION

    Simple Three-Step Flow

    Step 1: Setup
    Define your context (Year 11 English teacher, GCSE curriculum)

    Step 2: Prompt
    Describe what you need (Lesson plan on Macbeth Act 1, PowerPoint on photosynthesis)

    Step 3: Creativity
    Adjust output style (Standard curriculum-aligned vs. Creative exploratory)
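
    Taken together, the three steps can be thought of as one structured generation request. A sketch of that shape, using hypothetical field names and the example values above:

    ```typescript
    // One way the three-step flow could map onto a single request.
    // Field names are illustrative, not the real Amicus API.
    interface AmicusRequest {
      setup: {
        role: string;       // "Year 11 English teacher"
        curriculum: string; // "GCSE"
      };
      prompt: string;       // "Lesson plan on Macbeth Act 1"
      creativity: "standard" | "balanced" | "exploratory";
    }

    const example: AmicusRequest = {
      setup: { role: "Year 11 English teacher", curriculum: "GCSE" },
      prompt: "Lesson plan on Macbeth Act 1",
      creativity: "standard",
    };
    ```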

    Additional features:

    • Image upload capability (reference existing materials)
    • Amicus Together (share effective prompts and materials)
    • In-product feedback (continuous improvement)
    • Training materials and always-available support

    Platform Considerations

    Designed for both desktop (primary lesson planning context) and mobile (quick reviews and on-the-go use). Teachers could start planning on desktop and review materials on mobile during commutes.

    IMPLEMENTATION & ITERATION

    Working with engineering: This was Inspired's first generative AI tool. We navigated API rate limits, response time optimization, and content safety filtering together. I worked closely with the development team to specify edge cases and ensure the experience matched design intent.
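
    As one example of the kind of work involved, provider rate limits are commonly absorbed in the client with retry-and-backoff. A generic sketch, assuming a JSON endpoint and the conventional 429 status; this is not Amicus's actual client:

    ```typescript
    // Minimal retry-with-exponential-backoff sketch for a rate-limited
    // generation endpoint. Endpoint and status handling are assumptions.
    async function generateWithRetry(
      url: string,
      body: unknown,
      maxAttempts = 4,
    ): Promise<Response> {
      for (let attempt = 0; attempt < maxAttempts; attempt++) {
        const res = await fetch(url, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify(body),
        });
        if (res.status !== 429) return res; // not rate-limited
        // Wait 1s, 2s, 4s, ... before retrying.
        await new Promise((r) => setTimeout(r, 1000 * 2 ** attempt));
      }
      throw new Error("Rate limited after maximum retries");
    }
    ```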

    Managing stakeholder expectations: Leadership wanted fast adoption. Teachers wanted reliability. Product wanted feature velocity. I facilitated alignment by establishing clear success metrics and running a phased rollout starting with early adopters.

    Post-launch optimization: We continued iterating based on usage analytics, support tickets, and direct teacher feedback. Key improvements included image upload capability, improved prompt suggestions, and mobile optimization.

    RESULTS & IMPACT

    Business Impact:

    • Annual teacher turnover dropped from 65 → 25 (62% reduction)
    • Staff satisfaction with "technology and processes" increased 17% year-over-year
    • Steady adoption across multiple Inspired campuses
    • Teachers reported saving 2-3 hours per week on lesson prep

    User Experience:

    • High engagement with Amicus Together (teachers actively sharing and collaborating)
    • Low support burden (simple interface reduced confusion)
    • Teachers reported saving 45 minutes per day on lesson preparation

    WHAT I LEARNED

    Simplicity over control: The decision to prioritize simplicity over advanced settings was validated—95% of teachers never requested more granular controls. Starting with the simplest possible interface and adding complexity only when needed is often the right approach.

    Trust through transparency: Grounding content in internal resources built trust faster than any amount of UI polish could have. When teachers knew outputs were based on our curriculum, they adopted the tool with confidence.

    Community accelerates adoption: Amicus Together (the collaborative feature) accelerated adoption more than we anticipated. We should have built it from day one rather than as a post-launch addition. Peer learning is more powerful than top-down training.

    Cultural change matters: Beyond metrics, Amicus changed how teachers viewed AI—from a threat to a tool. This cultural shift opened the door for other AI initiatives within Inspired and demonstrated that thoughtful UX can bridge the gap between technological capability and human acceptance.
