
Back to the future

The connected car experience for an autonomous vehicle

Phase 1: Summary

The aim of this project was to design an experience for the autonomous car of the future, for a time when people trust such cars enough to use them as daily drivers, and the infrastructure of our cities is smart enough to support them for the daily commute.

Phase 1 - Group work

Interaction design, team project

My role: Conceptualization, wireframes, and prototyping.

Class: Interaction design methods


Tools: Balsamiq, Photoshop, Sketch, Marvel

Storyboard depicting the future scenario and product use:
The Process:

User study


Task: Three individuals from different backgrounds (a car owner, a commuter, and an occasional driver) were invited to participate in the experiment. We provided them with a low-fidelity prototype of a driverless car interior and a virtual conversational assistant. Their task was to interact with the prototype, individually and in groups, and provide feedback.


Outcome: The experiment helped us understand:

- Users' expectations regarding the functionality of an intelligent vehicle, as well as its usability and modes of interaction.

- How user behavior and interaction with the system change when alone versus in company.

- Users' concerns about privacy while sharing and viewing data on the vehicle's system, and the effect of another individual's presence.

Case study


"I want an application on my phone/tab which helps me to interact with my vehicle. I wish it could replace my car keys."


We did a case study of the Tesla Model X and the Acura MDX to understand the current experience of autonomous driving and to gain insights into interaction design trends for such vehicles.

Creating personas

We created several personas to analyze the different scenarios and use cases of an automated vehicle. Finally, we used the persona of 'Matt' for our storyboard.

User Journey

Entice > Enter > Engage > Exit > Extend - Using the compelling experiences (5E) framework helped us identify and compare the user's activities and interactions with a vehicle in the current scenario, and how those interactions might change when performing the same activities in a smart vehicle of the future.

Persona development

5E framework exercise

Task analysis & Information architecture

Building on the earlier framework exercise, we did a task analysis of the current and future scenarios.

TA - Current scenario

TA - Future scenario

Information architecture

We also created the information architecture for the mobile application, which would tie the whole experience and services together.
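As a rough illustration only, the top-level information architecture described in this project could be modeled as a simple section tree. Section and item names here are placeholders inferred from the screens discussed later (Status, Upcoming, Schedule, and the "More" section), not the final design:

```java
import java.util.List;
import java.util.Map;

public class AppIA {
    // Hypothetical sketch of the app's top-level information
    // architecture; all names are illustrative assumptions.
    static final Map<String, List<String>> SECTIONS = Map.of(
            "Status",   List.of("Current location", "Lock/Unlock"),
            "Upcoming", List.of("Next trip", "Edit destination"),
            "Schedule", List.of("Plan trip", "Cancel trip"),
            "More",     List.of("Services", "Settings"));

    // True only for sections that appear on the main screen.
    static boolean isTopLevel(String section) {
        return SECTIONS.containsKey(section);
    }
}
```

Nesting everything else under "More" keeps the main screen to a handful of entries, which is the de-cluttering choice the design makes below.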

Application Prototype:



Prototype testing

Learnings so far:
  • Trusting the car to drive autonomously is a hurdle.

  • Even once trust is established, users will want more control over the car's driving as long as the vehicle remains personal.

  • Navigation and services can be organized into a hierarchy.

Phase 2 - Individual work

Interaction + Interface design

From the learnings of our group project, I went on to further develop the navigation aspect of the application for the connected autonomous car of the future.

I designed the new application for the Android platform, following Google's Material Design guidelines.

Dashboard: Idle

Current Status:

Shows the current location and status of the vehicle: Parked, Arriving, Picking up, etc.


This tab displays details of any upcoming trips.

Lock/unlock the vehicle remotely.


This button takes the user to the 'Schedule' section, where one can schedule trips in advance or cancel trips already scheduled.

Map View:

Status: For locating the vehicle in real time.


Upcoming: For viewing/editing the destination for the upcoming trip on the map.

Swipe for Action:

For actions that involve actual movement of the vehicle; this guards against accidental touches.
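One way to sketch this guard in code: a movement command fires only when the swipe covers most of the control's track, so a stray tap or short drag does nothing. The 80% threshold is an assumption for illustration, not a value from the original design:

```java
public class SwipeAction {
    // Assumed confirmation threshold: the swipe must cover at
    // least 80% of the track before the command is accepted.
    static final double CONFIRM_FRACTION = 0.8;

    // Returns true only for a deliberate, near-full-length swipe.
    static boolean confirms(double swipeDistancePx, double trackWidthPx) {
        return trackWidthPx > 0
                && swipeDistancePx / trackWidthPx >= CONFIRM_FRACTION;
    }
}
```

A tap registers as a near-zero swipe distance and is rejected, while an intentional drag across the track passes the check.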


All the services are nested under the 'More' section to cut down the clutter on the main screen.




Haptic feedback:

The device vibrates to confirm commands that involve actual vehicle movement.
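This can be framed as a small policy: only commands that physically move the car trigger the confirming vibration, while passive actions (viewing status, changing music) stay silent. The command names below are illustrative placeholders, not identifiers from the actual app:

```java
import java.util.Set;

public class HapticPolicy {
    // Hypothetical command names; only commands that move the
    // vehicle get a confirming vibration.
    static final Set<String> MOVEMENT_COMMANDS =
            Set.of("PICK_UP", "PARK", "START_TRIP", "STOP");

    // Decide whether a confirmed command should vibrate the device.
    static boolean vibrateOnConfirm(String command) {
        return MOVEMENT_COMMANDS.contains(command);
    }
}
```

On Android the vibration itself would then go through the platform haptics APIs; keeping the decision in one place makes the movement/non-movement distinction explicit.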

In-car environment:

Once the vehicle is on its way to pick up the user, they get controls such as temperature and music.

On the way to pick up.

Map View

Once the vehicle is at the pickup location, it updates the user with a 360-degree image of its current surroundings.

Cancel pick-up

Park the vehicle at the nearest available parking location in case the user doesn't want to ride immediately.

Share location of the vehicle with your family and




Once the car is at the pick-up location, the user gets the control to unlock it.

When the user enters the vehicle, the application screen changes, greeting the user and offering more options for navigation.

Users can select a route according to their preference.

When the user enters the vehicle


selection screen

Add stop to your current trip


Trip started

Add stop to your current trip

Swipe to Stop

Also available on your Android watch face during a trip.

The user can stop the vehicle mid-trip using this feature; the vehicle will slow down and find a safe place to stop.

The idea is to provide a control similar to a brake pedal.

"Break Paddle" watch-face,

active during a trip

Afterthoughts, for further exploration:

The interactions between the user and the vehicle will have to be multimodal: touch (device, vehicle), tactile (vehicle), and voice (the vehicle's AI?).


How will the vehicle/rider communicate with pedestrians?


Who will own the vehicle: the employer, an individual, a community, the manufacturer, or a third-party vehicle service?


How will human-vehicle interaction evolve as vehicles become autonomous?

Do users need to be informed about the vehicle-environment interaction? (A V2X display?)

Will there be parking tickets? 


