PIERCE   AUBREY   UX
DESIGNING CLARITY FROM COMPLEXITY

10-Point Summary

  1. Market Gap: Identified accessibility barriers in BSL learning where existing tools offered passive video watching with no feedback mechanism
  2. Technology Exploration: Conducted deep dive into how ML and AI could contribute directly towards accessibility and inclusive learning
  3. Self-Built Prototype: Developed working prototype independently, demonstrating camera-based sign recognition with real-time feedback mechanics
  4. Learning Psychology: Applied behavioural psychology principles to create encouraging feedback instead of pass/fail judgment
  5. Pedagogical Design: Created feedback loops that guide improvement ("You're close, adjust hand shape") rather than scoring performance
  6. Accessibility First: Designed camera-based interaction with minimal UI, serving Deaf users, hearing learners, and classroom environments
  7. Practice Modes: Mapped learning journey through Learn, Drill, and Conversational modes encouraging iteration over perfection
  8. Technical Exploration: Navigated camera accuracy and latency constraints whilst maintaining user expectations for seamless experience
  9. Ethical Approach: Created empowering tool for marginalised community without being extractive or tokenistic
  10. Validation: Secured interview opportunity on The Apprentice, demonstrating commercial viability of accessibility-focused AI learning tools
Contribution: Product Design · Prototype Development · ML Exploration

Project Type: EdTech Experiment · Accessibility · AI/ML
Exploring AI's Role in Accessibility

Identifying the Gap

I noticed a fundamental accessibility barrier in British Sign Language learning. Traditional methods required expensive classes, qualified tutors, and synchronous commitment, creating exclusion rather than inclusion. Existing digital tools relied entirely on passive video watching with no feedback mechanism, leaving learners with no way to know if they were signing correctly and no clear path from understanding to fluency.

This felt like an opportunity to explore how machine learning and AI could contribute directly towards accessibility. Could technology create an interactive learning experience that provides real-time feedback, works autonomously, and serves multiple user groups: Deaf users wanting to teach others, hearing learners discovering BSL, classroom environments, and self-teaching adults?

The challenge was more than technical. Designing for a marginalised community required approaching this as an ethical experiment, ensuring the tool felt supportive and empowering rather than clinical or extractive.

Building and Learning

I began a deep dive into how machine learning could enable camera-based sign recognition whilst creating an experience that actually helps people learn. Rather than just designing screens, I built a working prototype myself to understand the technical constraints and possibilities firsthand.

Pedagogical exploration: Instead of implementing pass/fail judgment or static scoring, I experimented with feedback patterns grounded in learning psychology:

  1. "You're close, adjust hand shape"
  2. "Try slowing down"
  3. "Good form, try again"

The goal was discovering feedback that remains helpful without discouraging, specific without overwhelming. This became a core design principle rather than an aesthetic choice.
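As an illustration of this principle, feedback selection can be sketched as a small mapping from recognition results to encouraging, actionable prompts. This is a hypothetical sketch, not the prototype's actual logic: the score threshold and the `mismatch` categories are assumptions for the example.

```python
from typing import Optional

def choose_feedback(score: float, mismatch: Optional[str]) -> str:
    """Map a recognition score (0-1) and the dominant mismatch to an
    encouraging, actionable prompt rather than a pass/fail verdict."""
    if score >= 0.9:
        return "Good form, try again"           # reinforce and invite repetition
    if mismatch == "hand_shape":
        return "You're close, adjust hand shape"  # specific, fixable guidance
    if mismatch == "speed":
        return "Try slowing down"
    return "Nearly there, keep practising"      # fallback stays positive, never "wrong"
```

For instance, `choose_feedback(0.7, "hand_shape")` guides improvement rather than scoring it; no branch ever returns a failure message.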

Accessibility as foundation: The system prioritises camera-based input with minimal UI clutter and clear visual hierarchy, deliberately avoiding reliance on heavy text. This approach makes the experience usable across diverse contexts: Deaf users teaching others, hearing learners discovering BSL, classroom environments, and self-teaching adults at home. Genuine inclusive design, not cosmetic accessibility.

Mapping the learning journey: I structured the experience around distinct practice modes:

  1. Learn mode (introduction to new signs)
  2. Repeat/drill mode (building muscle memory)
  3. Conversational practice (future state for contextual application)
  4. Feedback loops encouraging iteration rather than perfection
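The mode progression above could be modelled as a simple state machine that advances a learner only once a sign is produced reliably. A minimal sketch under assumed rules: the `mastery_threshold` and repetition counts are illustrative, not the prototype's tuned values.

```python
from enum import Enum

class Mode(Enum):
    LEARN = "learn"                      # introduction to new signs
    DRILL = "drill"                      # repetition for muscle memory
    CONVERSATIONAL = "conversational"    # future state: contextual application

def next_mode(mode: Mode, successful_reps: int, mastery_threshold: int = 5) -> Mode:
    """Advance only when the learner has signed successfully enough times;
    otherwise stay in the current mode and keep iterating."""
    if mode is Mode.LEARN and successful_reps >= 1:
        return Mode.DRILL
    if mode is Mode.DRILL and successful_reps >= mastery_threshold:
        return Mode.CONVERSATIONAL
    return mode
```

The key design choice mirrors the feedback loops above: there is no demotion path, so a shaky session never pushes a learner backwards.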

What I built: Complete design system establishing visual identity and interaction patterns. Comprehensive wireframes and user flows mapping the journey from onboarding through progressive skill development. Most importantly, a working prototype I developed myself, demonstrating camera-based sign recognition with real-time feedback mechanics. Building the prototype firsthand gave me direct understanding of technical feasibility and constraints.

Navigating constraints: Working with camera accuracy and latency limitations whilst managing user expectations ("it should just work") taught me about balancing technical reality with ideal user experience. Creating a tool for a marginalised community required constant consideration: this needed to feel supportive, human, and empowering. The ongoing challenge was bridging technical feasibility with inclusive design whilst maintaining ethical responsibility.
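One common way to reconcile noisy per-frame predictions with the "it should just work" expectation is to smooth recognition over a short sliding window, surfacing feedback only once recent frames agree. This is a generic sketch of that technique, not the prototype's implementation; the window size and agreement gate are assumptions.

```python
from collections import Counter, deque

class SignSmoother:
    """Majority-vote smoothing over recent frames: trades a little latency
    for stable, non-flickering feedback."""

    def __init__(self, window: int = 10, min_agreement: float = 0.7):
        self.frames = deque(maxlen=window)
        self.min_agreement = min_agreement

    def update(self, predicted_sign: str):
        """Feed one per-frame prediction; return a sign only when the window
        is full and agrees strongly enough, otherwise None (stay quiet)."""
        self.frames.append(predicted_sign)
        if len(self.frames) < self.frames.maxlen:
            return None
        sign, count = Counter(self.frames).most_common(1)[0]
        if count / len(self.frames) >= self.min_agreement:
            return sign
        return None
```

Returning `None` during disagreement matters pedagogically: the system holds back rather than flashing contradictory corrections at the learner.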

Platform thinking: Mobile and tablet-first design allowing hands-free interaction with the camera, designed for one-handed menu navigation whilst keeping both hands visible for signing.

[Gallery: Sign2Me interface, learning, feedback, practice, progress, and camera screens, plus prototype shots]

Key outcomes:

→ Explored how ML and AI can contribute directly towards accessibility and inclusive learning experiences
→ Built working prototype independently, gaining firsthand understanding of technical constraints and possibilities
→ Designed feedback loops grounded in learning psychology rather than performance scoring
→ Created accessibility-first interaction model serving Deaf users, hearing learners, and classroom environments
→ Mapped complete learning journey from introduction through conversational fluency
→ Demonstrated camera-based sign recognition with real-time feedback mechanics
→ Secured interview opportunity on The Apprentice, validating commercial potential of accessibility-focused AI tools
→ Developed ethical approach to designing for marginalised communities through supportive, empowering experiences
