XFINITY
Peacock partnered with Xfinity to promote its streaming service. Ahead of launch, Xfinity wanted to refresh its in-store Peacock communications to focus on the content available through the service. As part of the store-wide refresh, they launched a touchless app on in-store mobile iOS devices. I helped architect the gesture experience for this app.
Role - UX Design for a mobile app
Team - 2 UI Designers, 3 Engineers, 1 PM
Client - Xfinity.com


After failed testing attempts, it became clear that the peace sign was no longer an option. Instead, we would use the hand-swipe and open-palm gestures in the left and right directions. Since we weren't able to use the peace sign, there was no longer a gesture to signal exit. So I mapped happy and non-happy paths to highlight what the new experience could look like with the chosen gestures.



Afterward, I worked closely with the engineers to define how this experience would materialize in code. I created a state machine to map the micro-interactions of the touchless experience. I was amazed to see how many states the app could pass through over a 2-3 minute interaction.
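A state machine like this boils down to a transition table keyed on the current state and the incoming gesture. The sketch below is a minimal illustration, not the shipped diagram: the state names and transitions are hypothetical, while the gesture vocabulary (left/right swipe, open palm) follows the final design described above.

```python
from enum import Enum, auto

class AppState(Enum):
    IDLE = auto()        # attract loop, waiting for a visitor
    ONBOARDING = auto()  # overlay teaching the gestures
    BROWSING = auto()    # expanded content card shown
    DETAIL = auto()      # condensed card / detail view

# Hypothetical transition table: (current state, gesture) -> next state.
TRANSITIONS = {
    (AppState.IDLE, "open_palm"): AppState.ONBOARDING,
    (AppState.ONBOARDING, "swipe_right"): AppState.BROWSING,
    (AppState.BROWSING, "swipe_right"): AppState.BROWSING,  # next card
    (AppState.BROWSING, "swipe_left"): AppState.BROWSING,   # previous card
    (AppState.BROWSING, "open_palm"): AppState.DETAIL,
    (AppState.DETAIL, "swipe_left"): AppState.BROWSING,
    (AppState.BROWSING, "timeout"): AppState.IDLE,          # back to attract loop
}

def next_state(state: AppState, gesture: str) -> AppState:
    # Unrecognized gestures leave the state unchanged (a "non-happy path").
    return TRANSITIONS.get((state, gesture), state)
```

Because every (state, gesture) pair resolves to exactly one next state, the table doubles as documentation of the non-happy paths: any pair missing from it is, by definition, a no-op.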

UI Design
Due to the highly visual nature of this project, the team settled on a design direction early on, and the design elements stayed consistent across the entire experience. Since we were working with a mobile device, the elements had to feel interactive, and a card-based design was the most functional choice.
Cards are a natural metaphor because they mimic physical cards. We discussed card shape, size, and corner radius before deciding on an expanded card that would collapse into a condensed card upon interaction. The card design was inspired by '80s baseball cards.




INFORMATION ARCHITECTURE

GESTURE EXPLORATION
Development gave us two libraries to test. The first library allowed us to test a handful of gestures:
- Hand (wave your hand)
- Palm (raise your palm)
- Head (tilt or turn your head)
- Nose (wiggle your nose)
- Voice (speak)
With the first library, a demo app was created to test the responsiveness of these gestures. The library supported the up, down, left, and right directions. I downloaded the app and studied how each gesture responded. Head and nose performed poorly, registering correctly less than 50% of the time. Hand and palm worked consistently, at about 80% in the intended direction. Voice also performed well, but stakeholders felt that voice activation could be intrusive to shoppers.
The second library had fewer gestures for us to test but was more responsive. It featured gestures such as:
- Open palm: forward-facing raised palm (L, R)
- Fist: forward-facing closed fist (L, R)
- Pointer: wave finger (L, R)
- Peace: forward-facing index and middle finger (L, R)
UX
COVID-19 and the resulting shift in consumer behavior were the driving factors behind creating a touchless app rather than a static one. I began the project by going down a rabbit hole of sketching gesture signals and cues.
I discovered that Android was easier to "hack" in terms of programming the phone to recognize more gestures; iOS was more constrained. However, the partnership led us to design for iOS.
We discussed where onboarding could live in the experience: it would either happen exclusively at the beginning with featured overlays, or the overlays would appear in between screens. We collectively decided that having the overlays appear in between screens would be more helpful in guiding the user through the experience.
Based on this feedback, I quickly started to map out gestures in a user flow that would include the palm, wave, and peace sign. I got creative with defining the gestures; for example, the peace sign could signal that the user is exiting the experience. The designers quickly visualized this by defining key screens, and I added a 'product tour' to help show users how the experience works.

LAST STEPS
After handing the diagrams off to the engineers, I began thinking of all the ways a visitor might come in contact with this app while in-store. I wanted to visualize their every move to highlight extra value between touchpoints. The journey map unveiled opportunities to improve the overall experience. For example, to call visitors to action, we could add QR codes to the table messaging that hosts the experience, directing them to the desired action.

FINAL THOUGHTS
This project was very insightful from a UX perspective. The touchless app experience launches mid-July in Xfinity stores nationwide. The Peacock and Xfinity relationship is strong, and it will be interesting to see how they creatively strengthen their partnership in the future.