A wearable smart interpreter that eliminates communication barriers for the deaf
Research, Ideation, UI/UX, Visual design, Prototyping
(October - January)
Figma, Principle, Adobe CC,
Cinema 4D, Blender, Premiere
“There are 466 million people in the world with disabling hearing loss. This is over 5% of the world’s population.”
- World Health Organization
In a majority-hearing world, communication already poses a significant challenge for the deaf. Amid Covid-19 and mask mandates, however, many essential communication methods used by deaf individuals have become practically impossible.
Ligo is a wearable smart device capable of interpreting sign language for everyday scenarios that deaf individuals experience. Using motion detection technology and artificial intelligence, Ligo acts as a personal interpreter for every situation.
Translating sign language to speech and text
Interpreting speech and audio input in real-time
Pairing with a companion app for more accessibility options and a calling feature
To create our product, we reached out to over a hundred deaf organizations around the world to better understand the challenges the deaf community experiences daily.
93% of respondents have experienced challenges communicating with hearing individuals. Their anecdotes show that it is not at all rare for hearing individuals to get frustrated, or to simply drop a conversation, because of these challenges.
View our full user research process here.
From the research, we gained insight into the daily challenges the deaf community faces and pinpointed 5 key pain points. Here is a summary of our findings:
With the pain points identified, we began to conduct market research to further explore areas of opportunities to address. Read our full benchmark analysis here.
To synthesize our research and findings, we created a PACT analysis and 2 user personas, which helped us shape our user scenarios.
* images taken from Unsplash
After this research, we created a list of questions to help navigate through the ideation process. Some of these questions include:
How does the product work in a group setting where multiple people are talking?
How will the interface work across all accessibility levels?
How will this product work when a conversation is not already engaged?
How can we make the 2-part system more efficient, and do we need it at all?
After finalizing our research, we began to explore concepts for what the device might look like. At first, we took the approach of two bands worn on the palms of the hands. However, after multiple rounds of iteration, referring back to our research, and feedback from our professor, we decided on a more inconspicuous, smartwatch-like look.
Final Product Sketch
Introducing our final product: Ligo. A wearable smart interpreter consisting of two nine-axis motion-detecting bands, a comfortable viewing screen, six speaker outputs, and a built-in AI. Ligo is intended for everyday wear, hence its sleek, inconspicuous, and comfortable design.
Walkthrough of Ligo's Key Features
We began by consolidating all of our ideas and research insights into a flowchart. This helped us map out the overall layout and features of the app.
Ligo's companion app is designed to provide a seamless onboarding experience, as well as more flexible viewing options and extended features.
4 Straightforward and Easy User Flows
Easy & friendly onboarding experience for first-time users pairing Ligo and navigating the app
Use Bluetooth to easily connect with others and have fully interpreted conversations
Use the calling feature to make interpreted calls anytime, anywhere
Fully customizable UI and interpreter to suit any accessibility need and preference
We created a promotional website to help explain how Ligo works and the technology behind it.
While navigating this 4-month-long project, we made it our priority to serve the deaf community as best we could. That process, however, came with a lot of learning and unlearning of our own assumptions. Initially, we were limited by our own narrow understanding of the deaf community. After reaching out to users in our target demographic, we gained a clearer understanding of our objectives and, in turn, could better empathize with their needs.
If we were to do this project again, we would definitely want to include a deaf consultant and a developer in the design process. Since we were a team of designers only, we had little knowledge of the technical feasibility of the features we were proposing. Those perspectives would help bring Ligo closer to being a realizable product. Additionally, we would want to conduct user testing with people from our target audience and iterate on any lacking areas.
Include a deaf consultant and developer in the design process
Conduct user tests with people from our target audience
Further develop the 3D interpreter model to include facial expressions
Explore more possibilities to elaborate on the app and its current features
Check out my other projects ↘