Wes - A conversational AI audio-navigation assistant that provides independent wayfinding for visually impaired guests at Century City Mall.

 

Academic Project

ROLE

User Research & Interviews, UX Design, Concept Design, User Testing

YEAR

Fall 2019

TEAM

Ami Kubota, Aaron Guhin, Chase Nguyen

Understanding

Onboarding screen for Wes. Designed by Ami Kubota


PROBLEM

Century City Mall is a beautiful, modern, and very large mall. Even sighted guests get lost in its size and complex layout, but they can use the interactive directories to find their desired stores and services. These directories are not accessible to vision-impaired guests, leaving them with no way to navigate the mall on their own.

INSIGHT

During our interviews we discovered that visually impaired people lack not only accessible wayfinding but also independence: “I can’t really go [to the mall] whenever I want because I need to ask someone to go with me and I have to work around their schedule.” Some interviewees also mentioned that completing a task independently boosts their confidence.

SOLUTION

Wes: an AI assistant integrated into the existing Westfield app. Wes combines the built-in GPS capability found in virtually every smartphone with a map of the mall to accurately pinpoint a guest’s location. Using that information, Wes provides conversational audio directions and environmental descriptions. Wes also adapts to the guest’s walking speed, so visitors don’t feel rushed, can easily find their destination, and stay informed about their surroundings, creating a much more accessible and enjoyable mall experience.

Design Challenge

How might we enable independent navigation for visually impaired guests at the mall?

Learn the difference

As we approached this project it was important to learn about our audience. One of the first things we learned was that visual impairment covers a wide range of designations, from mild impairment to total blindness. We tackled this design challenge by designing for the most impaired users: although only 15% of all visually impaired people are totally blind, designing for them would accommodate all other visual impairments.


Our Users


Katie was born blind. She has a YouTube channel that she uses to teach people what it is like to be blind. During our interview Katie mentioned the struggles of asking for help from strangers or depending on someone to go to the mall with her.


“I can’t really go [to the mall] whenever I want because I need to ask someone to go with me and I have to work around their schedule.”

 
 

This is Mark, one of our eight interviewees. Mark is legally blind with severe low vision, but he is still able to perceive light. He teaches orientation and mobility for visually impaired students at Wayfinder. Mark walked us through the several strategies he uses when he visits a mall.

“I can’t see details, so signs don’t work for me.”

 
 

This is Wyan! Wyan is legally blind with moderate visual impairment and can often get by without a cane. She doesn’t like to go to malls because they are complicated to navigate and she sometimes can’t tell what store she is in.


“[The mall] is unpleasant. The directory itself is not accessible and you can’t find where everything is.”

User Needs


The Idea

What if we could provide interactive navigation for visually impaired guests at the Mall?

But How?

My teammate Aaron was the first to volunteer to blindfold himself


 
 

In order to solve this problem, we had to walk a mile in our visually impaired guests’ shoes and understand the obstacles they face when visiting the mall. So we visited the mall again, but this time blindfolded, to explore how we could guide our visitors. We considered haptic feedback, but it didn’t give the user enough detail. We tried directional sound to guide the user, but that felt annoying and boring.

This experience raised key questions:

  • What happens if our guests get lost?

  • How do we make them feel independent and safe?

  • How do we provide feedback that is reassuring and doesn’t make them feel inept?

These questions led us to settle on voice guidance.

Getting feedback from Mark


 

Prototype

We went to the mall with Mark and observed him as he walked around. This was an eye-opening experience that provided many insights. We asked Mark to walk around the mall without Wes’s assistance, and he struggled to find the bathroom because of poorly located signs and bad directions from sighted people.

For this test I took on the role of Wes. We created a script for me to follow and instructed Mark to find the bathroom using Wes. To simulate the interaction, we asked Mark to wear earphones and called him on his phone. We later ran the test with a different visually impaired participant to evaluate an iterated version of the script.


Testing Insights

1 - Visually impaired people take longer to complete a task.

Mark enjoyed that he could get to the store he wanted much faster. He told us that whenever he visited a place he hadn’t been before, he would spend some time walking around to understand the layout of the mall.

2 - Measurement-based directions do not work for visually impaired people.

Giving directions in feet or steps was confusing. We then tried a different approach: “Make a right turn in 3, 2, 1, now.” Our users understood this immediately and preferred it.
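The countdown cue boils down to a simple timing calculation: given the guest’s measured walking speed and the distance to the next turn, the “now” should be spoken exactly when the guest reaches the turn point. A minimal sketch of that scheduling logic, assuming a hypothetical `countdown_schedule` helper (the function name and parameters are illustrative, not part of the actual Wes implementation):

```python
def countdown_schedule(distance_m: float, walking_speed_mps: float,
                       counts: int = 3) -> list[tuple[float, str]]:
    """Return (seconds_from_now, utterance) pairs so the final "now"
    is spoken exactly when the guest reaches the turn."""
    if walking_speed_mps <= 0:
        raise ValueError("walking speed must be positive")
    eta = distance_m / walking_speed_mps  # seconds until the turn
    cues = []
    # Speak one count per second leading up to the turn: "3" at eta-3, etc.
    for n in range(counts, 0, -1):
        cues.append((max(eta - n, 0.0), str(n)))
    cues.append((eta, "now"))
    return cues

# A guest walking at 1.5 m/s who is 6 m from the turn hears
# "3" at 1 s, "2" at 2 s, "1" at 3 s, and "now" at 4 s.
schedule = countdown_schedule(6.0, 1.5)
```

Because the schedule is derived from the guest’s own pace rather than a fixed distance, a slower walker simply hears the countdown start later, which is what kept our users from feeling rushed.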


3 - People want to engage with different levels of description.

Mark did not like the description of his surroundings. Christine, the second participant we tested with, really enjoyed the companion-like feeling it gave her, so we ensured that the level of description can be adjusted by voice command.

The Solution

Wes is an AI assistant integrated into the existing Westfield app. It combines the built-in GPS capability of virtually every smartphone with the mall’s existing map to accurately pinpoint a guest’s location. With that information, Wes provides conversational audio directions and environmental descriptions, adapting to the guest’s walking speed so visitors don’t feel rushed, can easily find their destination, and stay informed about their surroundings, creating a much more accessible and enjoyable mall experience.

A demonstration of how Wes works


User Interface and User Testing Insights

Wes was originally designed to be accessed through the current Westfield app, with a single screen dedicated to Wes. In testing, however, visually impaired guests found it difficult to locate the navigation button inside the Westfield app, while our legally blind users told us that large graphics and text gave them more freedom to navigate. We therefore created a separate app with large graphics and a high-contrast design to provide easier access and better accommodate our guests’ needs. The current prototype differs from the one in our concept video; it is the latest version, refined after testing.

UI designed by Ami Kubota


Reflections

Designers should consider every audience when approaching a design challenge, and that was the most valuable lesson I learned on this project. Designing for an extreme user makes our product suitable for the full range of visual impairments, and we even received feedback from sighted people that they could see themselves using Wes too.

I want to continue exploring how precisely we can pinpoint a guest’s location in the mall; this is a key element of the navigation system’s success. I would also like to investigate similar navigation systems for places like supermarkets and hospitals. I believe a tool like this could even change the way Mark (our interviewee) teaches his students to navigate.

I had an amazing experience on this project. I worked with a very talented and committed team and learned a great deal from the eight people we interviewed. Learning about the daily challenges visually impaired people face saddened me, but the wide range of design opportunities yet to be pursued for this group inspired me.
