Project Overview


ASL

ASL is a mobile app concept designed to translate sign language into text, built specifically for individuals with speech impairments.

iOS App Screens

Problem Definition


What’s the alternative to dictation?

People with speech impairments, such as muteness, face significant challenges in verbal communication. Many rely on sign language to interact with others, but with approximately 300 different sign languages worldwide, this case study will focus specifically on American Sign Language (ASL).

Assistive technology plays a crucial role in enhancing their ability to communicate, helping them navigate daily interactions without exclusion — especially on their phones. However, two key features remain out of reach: dictation and virtual assistants, both of which depend on voice recognition technology.

Dictation is a powerful tool for quickly replying to text messages, composing emails, or generating written content. Yet, for individuals who cannot speak, it becomes inaccessible. This raises an important question: How can we address this challenge and design more inclusive technology that accommodates people with speech impairments?

Process


Design thinking

Since this was a concept project and I didn’t conduct direct user research, I based my assumptions on existing research about people with this type of disability.

Discovery

• Personas

• Journey mapping

Ideation

• User flows

• Wireframes

• High-fidelity design

Prototyping

• Prototyping

Discovery


Persona

The aim of building a persona was to capture the needs of a person with a speech impairment (muteness) in situations where they are interacting with their phone but don't want to, or can't, type, and aren't relying on assistive technology.

Journey mapping

After clearly defining the persona, the next step was to get more granular: identifying the pain points, friction, goals, and every step a person with this disability has to go through to complete even simple tasks.

Ideation


Wireframes

Since the functionality of the app is built around translating hand gestures, the goal was to simplify the solution and its features as much as possible, putting the emphasis on functionality and usability.

The challenge was where and how to position the triggers (Cancel, Record/Stop, Camera, Send) and the input area for the translated text so they don't obscure the video and important hand gestures.

User flow

The user flow shows how straightforward the translation process is, focusing on the easiest way for users to complete the task.

Solution


Hand gesture translation

The ASL mobile app uses machine learning to instantly translate sign-language hand gestures into text through the phone's camera. The goal was to create a solution that gives people with speech impairments a communication platform for situations where they don't want to or can't type.

The translated text message could then be shared and used with other apps. Under the hood, the app would rely on a classification algorithm: in machine learning, that's the process of recognizing, understanding, and grouping objects and ideas into preset categories.
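To make the idea of classification concrete, here is a minimal, purely illustrative sketch: a toy nearest-centroid classifier standing in for the gesture-recognition model. A real ASL translator would use a trained neural network over camera frames; the feature vectors, gesture labels, and centroid values below are invented for demonstration only.

```python
from math import dist

# Hypothetical per-gesture "centroids": average hand-landmark feature
# vectors (e.g. normalized fingertip positions) learned during training.
# The labels and numbers are made up for illustration.
CENTROIDS = {
    "HELLO": (0.1, 0.8, 0.3),
    "THANK_YOU": (0.7, 0.2, 0.5),
    "YES": (0.4, 0.4, 0.9),
}

def classify_gesture(features):
    """Assign a feature vector to the nearest preset gesture category."""
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

# A camera frame whose features fall closest to the "HELLO" centroid:
print(classify_gesture((0.15, 0.75, 0.25)))  # HELLO
```

The point of the sketch is the shape of the problem, not the math: every incoming frame is reduced to features, compared against preset categories, and mapped to the closest one, which is exactly the "grouping into preset categories" described above.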

Outcome


Final design

A user interface free of visual distractions and unnecessary features, with a strong focus on task completion. User-friendly navigation, supported by eye-pleasing motion design that enriches the experience.

Translation

Project Details

Role: Product Design

Team: Product Designer

Tools (design): Figma, Miro

Year: 2021