Project Overview

Design a virtual assistant and multimodal interface that follows a specific car manufacturer's brand and market position. No third-party voice assistants.
Assume a development horizon of 2025, accounting for expected market trends and technologies.
Pay special attention to creating a fully integrated Conversational UI.

Team - Alex Heyison, Aashrita Indurti, Hongxun Jin, Yuchuan Shan
My Role - Interaction Design, Motion Design, UX/UI Design
Tools - Figma, Adobe After Effects, Illustrator
Context - CMU Interaction Design Studio Project
Timeframe - 8 weeks, Fall 2020
Choosing a Brand
From the outset, our team understood that the design direction of our Voice Assistant and Conversational UI would be heavily influenced by our choice of brand.

We wanted a brand that struck a good balance: a solid brand foundation for digital applications that, at the same time, wouldn't be too restrictive. After surveying a wide range of options, we settled on Tesla for a few key reasons.

     1. We believed Tesla had the technical expertise to develop and implement its own Voice Assistant software.
     2. With a 2025 time horizon, we projected that self-driving or assisted driving would become more common. Tesla's lead in autonomous miles driven would allow us to design for that future.
     3. We appreciated the company's history of taking creative risks to deliver memorable user experiences that surprise and delight.
     4. With the launch of the Model 3, we believed it was a crucial moment for Tesla's brand as the company seeks to move beyond its core demographic to a wider user base. A well-considered VA could help support that move.


The words we used to guide our brand consideration for Tesla were:

Innovative
Minimalist
Iconoclast
Understanding the Use Case
Diverge
Once we had our brand selection, we knew that we would have to zoom out to better understand the broad problem space that Voice Assistants address.
To map our potential points of intervention, we collaboratively explored the automotive context, detailing all the ways a voice assistant might shape a user's experience in the car.
User Interviews
Beyond our own experience and projections, we also wanted to better understand the needs of our users.
We had the opportunity to speak to two recent Tesla owners, which helped us focus specifically on the issue of learning curves in such a fully featured vehicle.
We also interviewed city commuters, daily VA users, and drivers of other vehicles with Voice Assistant packages.
Defining the VA Personality
To help us coordinate our vision for what personality our VA should communicate through its form, motion, language and vocal tone, we began with a simple affinity mapping exercise.
Our keyword choices were:
Empathetic
Culturally-Aware
Candid
Create a Visual Identity
Brand Integration
To allow the visual form of our voice assistant to integrate better into Tesla's existing brand imagery, we investigated both the current state of the brand and some of the original source material that inspired early Tesla designers.
Through sketching, early prototyping and group critique, we evolved the visual form over several iterations.
We decided to retain the Tesla logo as part of our conversational UI, given our observation that many Tesla owners are proud of their car's brand and already have a positive emotional relationship with it.
From there, we focused on three key visual forms as the basis for our VA.
VA in Motion
Once the basis for our form was decided, we began to explore the potential for the CUI to express personality and system information through motion.

The key issue we felt we needed to address while designing for motion was attention management. These animations had to convey meaning without drawing the driver's full focus away from the road. Additionally, the relative importance of each alert had to be considered carefully.

We began by identifying the key states that our VA would need to represent. 
Each state was plotted based on how static or dynamic, and how formal or informal, its animation should appear.
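As a rough illustration of how that plotting could be carried forward into implementation, the TypeScript sketch below records each state's position on the static–dynamic and formal–informal axes. The state names and values are placeholders for illustration, not the final set from our prototype.

    // Hypothetical sketch: each VA state positioned on the two plotting axes.
    // 0 = fully static / fully formal, 1 = fully dynamic / fully informal.
    // State names and values are illustrative placeholders, not final design values.
    type VAState = 'idle' | 'listening' | 'thinking' | 'responding' | 'alert';

    interface MotionProfile {
      dynamism: number;    // 0 = static, 1 = dynamic
      informality: number; // 0 = formal, 1 = informal
    }

    const motionMap: Record<VAState, MotionProfile> = {
      idle:       { dynamism: 0.1, informality: 0.3 },
      listening:  { dynamism: 0.4, informality: 0.5 },
      thinking:   { dynamism: 0.6, informality: 0.6 },
      responding: { dynamism: 0.7, informality: 0.7 },
      alert:      { dynamism: 0.9, informality: 0.1 }, // urgent alerts stay formal
    };

A mapping like this keeps motion decisions auditable: an animator can check at a glance whether a state's energy matches its intended formality.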
Iteration 1
Feedback
After presenting our first iteration, we focused on addressing several key reactions to our early design.

    1. Animating the logo itself might lead into dangerous territory with the branding department.
    2. The animations have very similar overall forms when viewed in peripheral vision. 
    3. The VA could use more personality.
Iteration 2
To modify our original forms, we decided to make our VA's forms more organic and less abstract. While maintaining the Circle x Tesla inspiration behind our original forms, we softened arcs and simplified our key forms into a low-attention avian form and a high-attention spinning-wheel form.
Voice Assistant in Motion (Round 2)
Our second round of animations aimed to be more expressive and playful without breaking our attention-management structure.

Since we were now working with only two basic forms, we focused on differences in animation energy, along with limited use of color, to further distinguish the states.
Beyond that, we reviewed keyframes of each animation laid out in series to ensure that a user could identify the VA state at a quick glance, since attention to the screen while driving is sporadic.
Integrate into Tesla UI
Analyzing the Current State
Tesla's UI carries a lot of responsibility.
The central console handles many of the tasks that would traditionally be distributed across the dashboard as knobs, switches, and other physical inputs.

Apart from the navigation map, the UI is divided into four sectors anchored to the bottom left of the screen, with an icon tray below.
Currently, the Voice Assistant is located in the bottom-left column.
In its active state, the CUI appears on a card UI element in the bottom left.

In our conversations with users, we discovered three main problems with this arrangement:
     1. The size of the CUI is too small for a driver to locate information while driving.
     2. The text is not clearly distinguished from the rest of the UI.
     3. The voice assistant is grouped seemingly at random, separate from other quick actions.
Exploring Intuitive Alternatives
We first moved the VA trigger to the bottom toolbar, to better align with our users' expectations.
As we discussed how to resolve these issues, we realized that we could craft a more intuitive UI layout by borrowing from the film industry: by using a familiar subtitle format, users could follow a conversation more intuitively without giving it their full attention.
In its idle state, the VA logo would live at the base of the screen. 

To accommodate the switch, we relocated the quick-access shortcut for the mobile app, since we assumed it did not belong with features that needed to be used while driving.
Upon trigger, a black bar slides up like a film subtitle.
The VA idle form appears as the VA becomes active...
...and text is displayed alongside.
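A minimal state-machine sketch of the flow above, assuming three illustrative CUI states and placeholder event names (not Tesla's actual software):

    // Illustrative sketch of the subtitle-style CUI flow; state and event names are assumptions.
    type CUIState = 'idle' | 'barSlidingUp' | 'active';
    type CUIEvent = 'trigger' | 'barShown' | 'sessionEnded';

    function nextState(state: CUIState, event: CUIEvent): CUIState {
      switch (state) {
        case 'idle':
          // Voice or touch trigger: the black bar slides up like a film subtitle.
          return event === 'trigger' ? 'barSlidingUp' : state;
        case 'barSlidingUp':
          // Once the bar is in place, the VA form appears and text is displayed alongside.
          return event === 'barShown' ? 'active' : state;
        case 'active':
          // When the conversation ends, collapse back to the idle logo at the base of the screen.
          return event === 'sessionEnded' ? 'idle' : state;
      }
    }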
Refining the Use Case
To best demonstrate the value, functionality, and feel of our Conversational UI, our team created storyboards for three scenarios. These scenarios had to demonstrate a wide range of the VA states we had developed while also showing the value the VA could provide both inside the vehicle and as part of Tesla's mobile app.

Scenario 1 - Smart Pickup
A user schedules a remote pickup through the Voice Assistant in Tesla's mobile app.
By integrating information from the user's mobile calendar, Tesla VA becomes even more helpful in making sure their day runs smoothly.
Scenario 2 - VA Assisted Mid-Trip Planning
Tesla has an incredible database of information about charging locations that it uses for its trip planner function. Unfortunately, that information is only truly accessible to a driver when the car is stopped. 
In this scenario, Tesla VA allows a user to access and act on that information while driving as their needs change on longer trips. 
Scenario 3 - Voice Command Debugging
Voice Assistants rely on relationships, and relationships are built on trust. Often, a user will become frustrated if a VA is unable to help and become distrustful if those errors are not resolved. 
This scenario outlines a new debugging user flow that allows users to help the VA learn from its mistakes.
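As a sketch only, the core of such a flow is capturing what the VA heard, what it did, and what the user actually wanted; the data shape and function below are hypothetical, not a description of Tesla's software.

    // Hypothetical record a debugging flow might capture.
    interface VACorrection {
      heardUtterance: string;   // what the VA transcribed
      attemptedAction: string;  // what the VA tried to do
      userCorrection: string;   // what the user says they actually wanted
      timestamp: number;
    }

    const corrections: VACorrection[] = [];

    function logCorrection(heard: string, attempted: string, wanted: string): void {
      // Store the mismatch so later requests can be disambiguated or the model retrained.
      corrections.push({
        heardUtterance: heard,
        attemptedAction: attempted,
        userCorrection: wanted,
        timestamp: Date.now(),
      });
    }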
Final Video