WHERE I WORK

FXGear

DATE

2015-2017

TEAM

Yeonkyung Kim

Hyejin Ham

Yanghee Kim

PROJECT

UX/UI

FXMirror: 3D AR Mirror Kiosk

OVERVIEW

FXMirror provides a real-time 3D virtual fitting service through its AR mirror kiosk. This project improved the kiosk’s user interface.

Introduced in 2015, FXMirror allows shoppers to try on clothes virtually without physically changing. The solution consists of a 75” flat panel display, a console, a Kinect camera, and proprietary software with patented technologies. As the product was the first virtual mirror of its kind, there was no industry standard or widely recognized user interface. The design team focused on identifying and solving key interface issues reported by real-world users.

HOW IT WORKS

Customers first scan their bodies using FXMirror’s camera and can then virtually try on various items on the screen. They can buy in the store right away or use the information to purchase online.

The FXMirror system scans the user’s body with its camera and virtually reconstructs their skeleton from the visual data. With a 3D version of their body ready, customers can try on individual items by category (e.g. t-shirts) and also mix and match items across categories (e.g. tops with bottoms or jackets). They can download and share a picture of themselves wearing the clothes, along with the merchandise information.
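
To make the mix-and-match flow concrete, the sketch below shows one way a category-based outfit model could be represented; the category names, types, and the tryOn helper are illustrative assumptions, not FXMirror’s actual implementation.

// Minimal sketch (TypeScript) of the mix-and-match model described above:
// the outfit rendered on the 3D body holds at most one item per category.

type Category = "tops" | "bottoms" | "jackets" | "dresses"; // assumed categories

interface GarmentItem {
  id: string;
  category: Category;
  name: string;
}

// The current outfit: at most one item per category, all rendered together.
type Outfit = Partial<Record<Category, GarmentItem>>;

// Trying on an item replaces whatever was selected in its category,
// while selections in other categories stay on (mix and match).
function tryOn(outfit: Outfit, item: GarmentItem): Outfit {
  const next: Outfit = { ...outfit };
  next[item.category] = item;
  return next;
}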

SITUATION

The product needed detailed interface refinements in response to real-world use and customer feedback.

The initial product focused on proving the workability and functionality of the AR mirror kiosk. As more kiosks were deployed and used by a more diverse set of customers, previously unidentified problems surfaced. Shoppers and merchants alike had to learn how the mirror works and how to use it. The Sales and Customer Service teams were receiving feedback from both parties, and it was time to resolve the interface issues to improve the user experience.

WHAT I DID

I coordinated with other teams to identify key challenges and designed interface improvements.

I conducted A/B tests and user tests to understand and verify various interface improvements. In the process, I took charge of communication with other teams, such as the Sales and Customer Service teams. We identified key issues with the existing UI, and I worked on the input method, responsiveness to user height, and a visual nudge for posing. I also created graphic assets such as icons and micro-interactions, and I was in charge of integrating different brands into FXMirror’s interface, as we collaborated with more than 20 fashion and retail brands for the product.

CHALLENGE - OUTCOME

#1: Input method for selection (‘click’ for AR mirror)

For FXMirror, users had to make inputs without physically touching the screen or any other input device such as a keyboard or a remote. The following points were considered in designing an ideal input method for this product:

  • Using hands (either one or both), not just fingers: fingers are too small for accurate recognition

  • Resembling the ‘touch’ interaction as closely as possible: merchants cannot teach the input method to passing-by shoppers every time, and almost everyone is now accustomed to ‘touching’ a screen

  • Letting users clearly distinguish between hovering around and making a selection: hands are always on the display

  • Making the pointer/cursor distinguishable from a couple of meters away, even against a complicated background

IDEATION: cursor shape, cursor hover state, and cursor hover duration (0.5s / 0.8s / 1.1s)

The team agreed to put a cursor that follows each hand. For the shape of the cursor, I came up with a round design with a shadow. While an arrow is the shape most familiar to the public, the cursor had to be large enough to be recognized, and such a large pointed shape looked intimidating. We also did not require any precise input from the tip of an arrow.

I also designed a progress circle around the pointer so that the user can recognize that she has reached a selectable UI element, while giving her a short but sufficient time to hover over the element and decide whether she actually wants to make the input.
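
To illustrate the interaction, the sketch below shows how such dwell-based ("hover to click") selection could work; the class and member names and the per-frame update loop are illustrative assumptions rather than FXMirror’s actual code. The 0.8-second default matches one of the hover durations explored during ideation.

// Minimal sketch (TypeScript) of dwell-based selection: the cursor follows the
// tracked hand, and when it stays over a selectable element a progress ring
// fills up; the selection fires once the dwell threshold is reached.

interface SelectableElement {
  id: string;
  contains(x: number, y: number): boolean; // hit test in screen coordinates
  onSelect(): void;
}

class DwellSelector {
  private hovered: SelectableElement | null = null;
  private dwellTime = 0; // seconds spent over the current element

  constructor(
    private elements: SelectableElement[],
    private dwellThreshold = 0.8 // one of the tested durations (0.5 / 0.8 / 1.1 s)
  ) {}

  // Called every frame with the hand-cursor position and elapsed time (seconds).
  // Returns progress in [0, 1], used to draw the ring around the cursor.
  update(cursorX: number, cursorY: number, dt: number): number {
    const target = this.elements.find(e => e.contains(cursorX, cursorY)) ?? null;

    if (target !== this.hovered) {
      // Moved onto a different element (or off all elements): reset progress.
      this.hovered = target;
      this.dwellTime = 0;
    } else if (target) {
      this.dwellTime += dt;
      if (this.dwellTime >= this.dwellThreshold) {
        target.onSelect();  // treat the completed dwell as a "click"
        this.dwellTime = 0; // require a fresh dwell for the next selection
      }
    }

    return this.hovered ? Math.min(this.dwellTime / this.dwellThreshold, 1) : 0;
  }
}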

CHALLENGE - OUTCOME

#2: Graphical user interface responsive to the user’s height

A user’s height restricts the area of the screen they can comfortably reach. As the product was introduced to diverse fashion stores, brands with kids’ lines gave us feedback that shorter users were having trouble reaching the buttons. The UI elements had to be re-arranged according to the reachable area of different statures. I identified the reachable areas of adults and kids based on arm reach, so that kids (or people of short stature) do not have to stand on their tiptoes to reach key interface elements.
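
As a rough illustration, the sketch below switches UI placement based on the measured user height; the height threshold and zone ratios are purely illustrative assumptions, not the values used in FXMirror.

// Minimal sketch (TypeScript) of a height-responsive layout, assuming the
// sensor reports the user's height. Key buttons are placed inside the band
// of the screen the detected user can reach without tiptoeing.

type LayoutProfile = "adult" | "kid";

interface ReachBand {
  top: number;    // top of the easily reachable band (px from screen top)
  bottom: number; // bottom of the easily reachable band
}

// Pick a layout profile from the measured user height (cm); threshold is assumed.
function pickProfile(userHeightCm: number): LayoutProfile {
  return userHeightCm < 140 ? "kid" : "adult";
}

// Shorter users can comfortably reach only a lower, narrower band of the display.
function reachBandFor(profile: LayoutProfile, screenHeightPx: number): ReachBand {
  return profile === "kid"
    ? { top: screenHeightPx * 0.45, bottom: screenHeightPx * 0.85 }
    : { top: screenHeightPx * 0.25, bottom: screenHeightPx * 0.70 };
}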

Adults’ interface and kids’ interface: easily reachable, extended reachable, and hard-to-reach areas

CHALLENGE - OUTCOME

#3: Visual nudge for users to take the proper pose for accurate scanning

For FXMirror’s camera to scan and measure the limbs accurately, users have to stretch and spread their arms and legs slightly so that they are apart from the body and from each other. (The system simply proceeds with a less accurate measurement when limbs overlap, as it was more troublesome to raise an error and ask for re-measurement.) The previous version provided a text instruction with a silhouette, but it was not clear that the limbs had to be spread. I added a visual nudge with a simple animation showing a figure spreading its arms and legs one by one. This change dramatically increased the percentage of users posing correctly for accurate measurement and consequently improved the user experience, as customers get virtual clothes that fit their bodies better.
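
As a rough illustration of the kind of check behind this flow, the sketch below tests whether the tracked skeleton has its limbs spread apart; the joint names and thresholds are illustrative assumptions, not FXMirror’s actual measurement logic.

// Minimal sketch (TypeScript) of a "limbs spread apart" check on 2D skeleton
// joints from a Kinect-style tracker. While the pose is not spread, the UI
// keeps showing the nudge animation instead of raising an error.

interface Joint { x: number; y: number } // screen-space position

interface Skeleton {
  leftHand: Joint;  rightHand: Joint;
  leftHip: Joint;   rightHip: Joint;
  leftAnkle: Joint; rightAnkle: Joint;
}

// True when the hands are clear of the torso and the feet are apart,
// i.e. the pose is good enough for accurate limb measurement.
function isPoseSpread(s: Skeleton, shoulderWidthPx: number): boolean {
  const armsApart =
    s.leftHand.x < s.leftHip.x - 0.5 * shoulderWidthPx &&
    s.rightHand.x > s.rightHip.x + 0.5 * shoulderWidthPx;

  const legsApart =
    Math.abs(s.rightAnkle.x - s.leftAnkle.x) > 1.2 * shoulderWidthPx;

  return armsApart && legsApart;
}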

AS-IS

Not clear that the limbs have to be spread

TO BE

Added a visual nudge with a simple animation showing a figure spreading its arms and legs one by one before the body is measured

Thank you for watching :)

© 2019 Yeonkyung Kim